How can I make a 2D softmax layer?
I am making a network using the Keras library. Let's suppose that I have the following 2D matrix:
[ 0 0 1 2
  0 1 2 5
  1 0 0 1 ]
What I want to do is obtain the following matrix:
[ 0.00 0.00 0.02 0.10
  0.00 0.02 0.10 0.99
  0.02 0.00 0.00 0.02 ]
As shown, I want the layer to emphasize only the largest element of the 2D array.
How can I achieve this?
Can this be achieved simply by applying softmax twice?
Solution 1:[1]
You don't need to worry about the 2D shape; softmax will work fine (by default it is applied along the last axis, i.e. row-wise).
import tensorflow as tf

inputs = tf.random.normal(shape=(3, 3))
outputs = tf.keras.activations.softmax(inputs)  # normalizes along the last axis by default
print(inputs)
print(outputs)
Example output:
tf.Tensor(
[[-0.3471133  -0.8292573  -0.06646241]
 [-1.2869339  -0.52089226  0.3157407 ]
 [-0.8821394   0.16500719 -0.41590676]], shape=(3, 3), dtype=float32)
tf.Tensor(
[[0.33996844 0.2099163  0.4501153 ]
 [0.12319015 0.26501083 0.61179894]
 [0.18370579 0.52347124 0.29282293]], shape=(3, 3), dtype=float32)
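Each row of the output above sums to 1, because the softmax is computed along the last axis by default. As a minimal sketch of the same row-wise behaviour using the built-in tf.keras.layers.Softmax layer (the input values below are taken from the question; the layer choice is an illustration, not part of the original answer):

import tensorflow as tf

x = tf.constant([[0., 0., 1., 2.],
                 [0., 1., 2., 5.],
                 [1., 0., 0., 1.]])

row_softmax = tf.keras.layers.Softmax(axis=-1)  # axis=-1 is the default: one softmax per row
print(row_softmax(x))
print(tf.reduce_sum(row_softmax(x), axis=-1))   # each row sums to ~1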
Solution 2:[2]
If I understand correctly, you want to take the softmax over the entire 2D array. If so, note that applying the softmax directly to a 2D array will return the softmax over each row separately (it is computed along the last axis). E.g.:
import numpy as np
import tensorflow as tf

X = np.log([[1, 1, 2], [3, 3, 3]])
Y = tf.keras.layers.Activation('softmax')(X)  # one softmax per row
assert np.allclose(Y, [[0.25, 0.25, 0.5], [0.3333, 0.3333, 0.3333]], atol=1e-4)
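To make the axis behaviour explicit, here is a small sketch (assuming the tf.keras.layers.Softmax layer, which takes an axis argument; not part of the original answer) contrasting row-wise and column-wise normalization on the same input:

import numpy as np
import tensorflow as tf

X = np.log([[1, 1, 2], [3, 3, 3]])

print(tf.keras.layers.Softmax(axis=-1)(X))  # axis=-1 (default): each row sums to 1
print(tf.keras.layers.Softmax(axis=0)(X))   # axis=0: each column sums to 1 instead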
If you want the softmax over all elements of the 2D array, this should do:
X = np.log([[1, 1, 1], [1, 2, 4]])
X = np.expand_dims(X, axis=0)  # add a batch dim
X = tf.keras.layers.Reshape((-1,))(X)  # flattens each sample; the batch dim is preserved
# (the shape passed to Reshape does not include the batch dim).
# This is equivalent to X = X.reshape(m, -1), where m is the batch size,
# but a plain NumPy reshape would not keep track of gradients for backprop,
# which is why it's better to use a Reshape layer.
Y = tf.keras.layers.Activation('softmax')(X)
assert np.allclose(Y, [[0.1, 0.1, 0.1, 0.1, 0.2, 0.4]])
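If you need this as a single reusable layer that also restores the original 2D shape (which is what the question asks for), one possible sketch is a small subclassed layer; the name Softmax2D and the shape handling below are illustrative assumptions, not part of the original answer:

import tensorflow as tf

class Softmax2D(tf.keras.layers.Layer):
    """Softmax over all elements of each 2D sample, keeping the input shape."""
    def call(self, inputs):                        # inputs: (batch, rows, cols)
        shape = tf.shape(inputs)
        flat = tf.reshape(inputs, (shape[0], -1))  # flatten rows * cols per sample
        flat = tf.nn.softmax(flat, axis=-1)        # one softmax over the whole matrix
        return tf.reshape(flat, shape)             # restore the 2D shape

x = tf.constant([[[0., 0., 1., 2.],
                  [0., 1., 2., 5.],
                  [1., 0., 0., 1.]]])              # shape (1, 3, 4), values from the question
y = Softmax2D()(x)
print(y)                   # the largest entry (5) gets most of the probability mass
print(tf.reduce_sum(y))    # sums to ~1 over the whole matrix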
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Zabir Al Nazi |
| Solution 2 | Slifer |