A minor mistake in cross entropy loss (#357)
Use tf.reduce_mean(-tf.reduce_sum(y_true * tf.math.log(y_pred), 1)); without the axis argument, reduce_sum collapses the whole batch to a single scalar, so the outer reduce_mean simply returns that sum unchanged (a minimal sketch follows the diff below).
kilarinikhil committed Apr 12, 2020
1 parent 7aeb6cb commit c577281
Showing 1 changed file with 1 addition and 1 deletion.
@@ -109,7 +109,7 @@
 "    # Clip prediction values to avoid log(0) error.\n",
 "    y_pred = tf.clip_by_value(y_pred, 1e-9, 1.)\n",
 "    # Compute cross-entropy.\n",
-"    return tf.reduce_mean(-tf.reduce_sum(y_true * tf.math.log(y_pred)))\n",
+"    return tf.reduce_mean(-tf.reduce_sum(y_true * tf.math.log(y_pred),1))\n",
 "\n",
 "# Accuracy metric.\n",
 "def accuracy(y_pred, y_true):\n",
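For illustration, a minimal sketch of why the axis argument matters. It is not part of the commit; the two-example batch values below are made-up assumptions.

import tensorflow as tf

# Hypothetical batch of two one-hot labels and (already clipped) predictions.
y_true = tf.constant([[0., 1.], [1., 0.]])
y_pred = tf.constant([[0.2, 0.8], [0.6, 0.4]])

# Without an axis, reduce_sum collapses the whole batch to one scalar,
# and reduce_mean of a scalar just returns that scalar (the total, ~0.734).
total = tf.reduce_mean(-tf.reduce_sum(y_true * tf.math.log(y_pred)))

# With axis 1, reduce_sum yields one cross-entropy per example (~[0.223, 0.511]),
# and reduce_mean then averages over the batch (~0.367).
mean_ce = tf.reduce_mean(-tf.reduce_sum(y_true * tf.math.log(y_pred), 1))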
