
Reshape in add_embeddings()


In add_embeddings(), after the call to tf.nn.embedding_lookup(), my tensor has shape (None, n, embed_size), as expected. Right after that, I call tf.reshape(), but the output tensor has shape (None, None) instead of (None, n * embed_size). After searching online, the only discussion I can find on the topic suggests passing tf.shape(x)[0] instead of None as the first element of the shape tuple to tf.reshape(), but that doesn’t help.

The (None, None) shape of, e.g., x_w subsequently leads tf.matmul(x_w, W_w) to output a tensor with shape equal to that of W_w, which is (n_word_features * embed_size, hidden_size), instead of the desired (None, hidden_size).

Finally, this all results in mismatched dimensions in the addition (x_w W_w) + (x_t W_t), since the two matmuls output matrices with wrong, mutually incompatible shapes.
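For reference, here is how the shapes should line up after a correct reshape, sketched in numpy (all sizes are made up for illustration; n here stands in for n_word_features):

```python
import numpy as np

# Hypothetical sizes, just to make the shapes concrete:
batch, n, embed_size, hidden_size = 8, 6, 50, 200

x_w = np.zeros((batch, n * embed_size))       # after a correct reshape
W_w = np.zeros((n * embed_size, hidden_size))
x_t = np.zeros((batch, n * embed_size))       # analogous tag features
W_t = np.zeros((n * embed_size, hidden_size))

# Each matmul yields (batch, hidden_size), so the addition is well-defined:
h = x_w @ W_w + x_t @ W_t
print(h.shape)  # (8, 200)
```

With a (None, None) x_w, the first matmul's output shape is wrong, and the addition above is where everything finally blows up.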

Has anyone else experienced this?


You’re right that your reshape() call is giving the wrong result. Shoot me an email with your reshape call and I’ll take a look.


Thanks, but I figured it out in the meantime. It turns out that tf.reshape() expects plain Python ints, not Dimension objects. Worse, it doesn't raise an error; it just silently does the wrong thing.
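For anyone who hits this later, here is a minimal sketch of the fix (variable names and sizes are illustrative, not the actual assignment code):

```python
import tensorflow as tf

vocab, n, embed_size = 1000, 6, 50
embeddings = tf.random.normal([vocab, embed_size])
ids = tf.zeros([8, n], dtype=tf.int32)

# Lookup gives shape (8, n, embed_size), as in the original post:
x = tf.nn.embedding_lookup(embeddings, ids)

# Buggy version: in TF1, the elements of x.get_shape() were Dimension
# objects, and passing them into tf.reshape silently produced a tensor
# with static shape (None, None):
#   shape = x.get_shape()
#   x_flat = tf.reshape(x, [-1, shape[1] * shape[2]])
#
# Fix: pass plain Python ints (here, the known constants), with -1
# letting TF infer the batch dimension:
x_flat = tf.reshape(x, [-1, n * embed_size])
print(x_flat.shape)  # (8, 300)
```

In current TF the shape tuple contains ints and either form works, but casting with int(...) before arithmetic on shape entries is a safe habit either way.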