Perhaps this is trivial, but perhaps it is not. I have spent far too much time trying to make this work. Here is the code:
import numpy as np
import tensorflow as tf

# batch x time x events
batch = 2
time = 3
events = 4
tensor = np.random.rand(batch, time, events)

# Zero out a handful of entries (the exact indices are illustrative):
tensor[0, 0, 0] = 0
tensor[0, 1, 1] = 0
tensor[0, 2, 2] = 0
tensor[1, 0, 3] = 0
tensor[1, 1, 0] = 0
tensor[1, 2, 1] = 0
tensor[0, 0, 2] = 0

non_zero = ~tf.equal(tensor, 0.)

s = tf.Session()
g = tf.global_variables_initializer()
s.run(g)
s.run(non_zero)
I am trying to apply tf.nn.softmax to the non-zero values along the time dimension. However, when I use tf.boolean_mask, it gathers all of the non-zero values into a single flat tensor. That is not what I want; I want to preserve the dimensions.
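To make the flattening concrete, NumPy's boolean indexing behaves the same way as tf.boolean_mask does here (a small illustration with made-up values, not my actual data):

```python
import numpy as np

x = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])

# Boolean indexing gathers the surviving values into a 1-D array,
# discarding the original (2, 3) structure:
flat = x[x != 0]
print(flat)        # [1. 2. 3.]
print(flat.shape)  # (3,)
```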
Here is the screenshot of what the tensor looks like:
tf.nn.softmax should be applied only to those groups, and the results should be "put back" into their original positions. Does anyone know how to do this?
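For reference, here is a NumPy sketch of the behavior I am after, treating zeros as masked-out entries and assuming the softmax runs over the last axis (the helper name and the axis choice are my own assumptions):

```python
import numpy as np

def masked_softmax(x, axis=-1):
    """Softmax over the non-zero entries of `x` along `axis`;
    zero entries stay zero and keep their positions."""
    mask = x != 0
    # Send masked entries to -inf so exp() maps them to 0.
    logits = np.where(mask, x, -np.inf)
    # Shift by the max for numerical stability (assumes every slice
    # along `axis` has at least one non-zero entry).
    shifted = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(shifted)
    out = e / e.sum(axis=axis, keepdims=True)
    return np.where(mask, out, 0.0)

x = np.array([[1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
print(masked_softmax(x))
# Each row sums to 1 over its non-zero entries; zeros stay in place.
```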