Deep Learning
Implementing softmax cross entropy (TensorFlow)
ufris
2020. 11. 19. 15:49
If we implement softmax cross entropy ourselves using TensorFlow, it looks like this:
import numpy as np
import tensorflow as tf

output = np.array([[0.12, 0.546, 0.02], [0.054, 0.23, 0.0003]])  # raw logits
Y = np.array([[0., 1., 0.], [1., 0., 0.]])  # one-hot labels (float, to match the log's dtype)

hypothesis = tf.nn.softmax(output)  # [[0.29103963 0.44561682 0.26334355]
                                    #  [0.31845567 0.37973867 0.30180566]]

# Cross entropy cost/loss
cost = tf.reduce_mean(-tf.reduce_sum(Y * tf.math.log(hypothesis), axis=1))
# Y * tf.math.log(hypothesis) = [[-0.         -0.80829584 -0.        ]
#                                [-1.14427198 -0.         -0.        ]]
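Averaging the two per-example losses gives cost = (0.80829584 + 1.14427198) / 2 ≈ 0.9763. As a sanity check, the same value should come out of TensorFlow's built-in tf.nn.softmax_cross_entropy_with_logits, which takes the raw logits and fuses the softmax and log steps. Below is a minimal sketch, assuming TensorFlow 2.x eager execution:

# Built-in op: takes raw logits and applies softmax internally,
# avoiding the separate log(softmax(x)) composition above
per_example = tf.nn.softmax_cross_entropy_with_logits(labels=Y, logits=output)
cost_builtin = tf.reduce_mean(per_example)

print(cost.numpy())          # ~0.97628391
print(cost_builtin.numpy())  # same value as the manual version

In practice the built-in op is preferred: if a softmax probability underflows to 0, taking tf.math.log of it directly produces -inf, whereas the fused op computes the result in a numerically stable way.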