Implementing Softmax Cross Entropy (TensorFlow)

Implementing softmax cross entropy directly with TensorFlow:
import numpy as np
import tensorflow as tf

# Raw logits from the network and one-hot labels
output = np.array([[0.12, 0.546, 0.02], [0.054, 0.23, 0.0003]])
Y = np.array([[0, 1, 0], [1, 0, 0]])

hypothesis = tf.nn.softmax(output)  # [[0.29103963 0.44561682 0.26334355]
                                    #  [0.31845567 0.37973867 0.30180566]]

# Cross entropy cost/loss: -sum(y * log(p)) per example, averaged over the batch
cost = tf.reduce_mean(-tf.reduce_sum(Y * tf.math.log(hypothesis), axis=1))
# Y * tf.math.log(hypothesis) = [[-0.         -0.80829584 -0.        ]
#                                [-1.14427198 -0.         -0.        ]]
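As a sanity check (not in the original post), the manual computation above can be compared against TensorFlow's built-in `tf.nn.softmax_cross_entropy_with_logits`, which takes the raw logits directly and fuses the softmax and log steps for better numerical stability. A minimal sketch:

```python
import numpy as np
import tensorflow as tf

output = np.array([[0.12, 0.546, 0.02], [0.054, 0.23, 0.0003]])
Y = np.array([[0, 1, 0], [1, 0, 0]], dtype=np.float64)  # labels must be float

# Manual: softmax, then -sum(y * log(p)) per row, then mean over the batch
manual = tf.reduce_mean(
    -tf.reduce_sum(Y * tf.math.log(tf.nn.softmax(output)), axis=1))

# Built-in: applies log-softmax to the logits internally
builtin = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=Y, logits=output))

print(float(manual), float(builtin))  # both ≈ 0.9763
```

In practice the built-in op is preferred: applying `log` after `softmax` can produce `-inf` when a probability underflows to 0, while the fused log-softmax formulation avoids this.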