Commit aef9b9f

alexholdenmiller authored and kylegao91 committed
add contiguous call to tensor (#127)
When attention is turned off, PyTorch (0.4, at least) raises an error when .view() is called on a non-contiguous tensor.
1 parent 96c6033 commit aef9b9f

File tree

1 file changed

+1
-1
lines changed


seq2seq/models/DecoderRNN.py

Lines changed: 1 addition & 1 deletion
@@ -102,7 +102,7 @@ def forward_step(self, input_var, hidden, encoder_outputs, function):
         if self.use_attention:
             output, attn = self.attention(output, encoder_outputs)

-        predicted_softmax = function(self.out(output.view(-1, self.hidden_size))).view(batch_size, output_size, -1)
+        predicted_softmax = function(self.out(output.contiguous().view(-1, self.hidden_size))).view(batch_size, output_size, -1)
         return predicted_softmax, hidden, attn

     def forward(self, inputs=None, encoder_hidden=None, encoder_outputs=None,
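The failure mode this commit fixes can be reproduced in a few lines. This is a minimal sketch using a standalone transposed tensor (not the decoder's actual output tensor): .view() only reinterprets storage, so it refuses tensors whose memory layout is no longer row-major, and .contiguous() resolves this by copying the data into a fresh row-major buffer first.

```python
import torch

# A transpose produces a non-contiguous tensor: its strides no longer
# match a row-major layout, so .view() cannot reinterpret the storage.
t = torch.arange(6).reshape(2, 3).transpose(0, 1)  # shape (3, 2)
assert not t.is_contiguous()

# t.view(-1) would raise a RuntimeError here.  Calling .contiguous()
# first copies the data into row-major order, after which .view() works.
flat = t.contiguous().view(-1)
print(flat.tolist())  # -> [0, 3, 1, 4, 2, 5]
```

In the patched line, `.contiguous()` is a no-op (no copy) when the tensor is already contiguous, which is presumably why the attention-enabled path worked without it.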

0 commit comments