
Deep Learning

Implementing softmax cross entropy (TensorFlow)

ufris 2020. 11. 19. 15:49

 

The softmax cross entropy formula
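The original post shows the formula as an image; written out (assuming one-hot labels $y$, logits $z$ over $C$ classes, averaged over $N$ examples), it is:

```latex
\mathcal{L}
= -\frac{1}{N}\sum_{n=1}^{N}\sum_{i=1}^{C}
  y_{n,i}\,
  \log\!\left(\frac{e^{z_{n,i}}}{\sum_{j=1}^{C} e^{z_{n,j}}}\right)
```

The inner fraction is the softmax probability for class $i$, and the outer sums compute the cross entropy per example and then the mean over the batch, which is exactly what the code below does step by step.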

 

Implementing softmax cross entropy directly with TensorFlow:

 

 

import numpy as np
import tensorflow as tf

# Logits for two examples over three classes
output = np.array([[0.12, 0.546, 0.02],
                   [0.054, 0.23, 0.0003]])

# One-hot labels (must be float, so they can be multiplied
# with the float log-probability tensor below)
Y = np.array([[0., 1., 0.],
              [1., 0., 0.]])

# Softmax turns the logits into class probabilities
hypothesis = tf.nn.softmax(output)  # [[0.29103963 0.44561682 0.26334355]
                                    #  [0.31845567 0.37973867 0.30180566]]

# Cross entropy cost/loss: per-example sum over classes, then batch mean
cost = tf.reduce_mean(-tf.reduce_sum(Y * tf.math.log(hypothesis), axis=1))

# Y * tf.math.log(hypothesis) = [[-0.         -0.80829584 -0.        ]
#                                [-1.14427198 -0.         -0.        ]]
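As a sanity check, the manual version can be compared against TensorFlow's built-in `tf.nn.softmax_cross_entropy_with_logits`, which takes the raw logits directly (and is more numerically stable, since it fuses the softmax and the log). A minimal sketch using the same `output` and `Y` as above:

```python
import numpy as np
import tensorflow as tf

output = np.array([[0.12, 0.546, 0.02],
                   [0.054, 0.23, 0.0003]])
Y = np.array([[0., 1., 0.],
              [1., 0., 0.]])

# Manual version: explicit softmax, then cross entropy
hypothesis = tf.nn.softmax(output)
manual_cost = tf.reduce_mean(-tf.reduce_sum(Y * tf.math.log(hypothesis), axis=1))

# Built-in version: works on logits, no explicit softmax needed
builtin_cost = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=Y, logits=output))

print(float(manual_cost), float(builtin_cost))
```

Both should print the same value, the mean of the two per-example losses shown in the comments above: (0.80829584 + 1.14427198) / 2 ≈ 0.9763.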