This article is about 2,490 words and takes roughly 8 minutes to read.
import tensorflow as tf
y = tf.constant([1, 2, 3, 0, 2])
y = tf.one_hot(y, depth=4)  # max label is 3, so depth=4 classes
y = tf.cast(y, dtype=tf.float32)
out = tf.random.normal([5, 4])
out
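What `tf.one_hot` produces here can be sketched without TensorFlow at all; a minimal NumPy equivalent (names are my own):

```python
import numpy as np

labels = np.array([1, 2, 3, 0, 2])
depth = 4  # labels range over 0..3
one_hot = np.eye(depth, dtype=np.float32)[labels]  # shape (5, 4)
# Row i has a 1.0 in column labels[i] and 0.0 elsewhere,
# e.g. label 1 -> [0., 1., 0., 0.]
```

Each row is a valid one-hot vector, so every row sums to exactly 1.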
loss1 = tf.reduce_mean(tf.square(y - out))
loss1

loss2 = tf.square(tf.norm(y - out)) / (5 * 4)
loss2

loss3 = tf.reduce_mean(tf.losses.MSE(y, out))
loss3
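The three cells above are three ways of writing the same MSE; a NumPy sketch (my own variable names, fixed seed for reproducibility) makes the equivalence checkable:

```python
import numpy as np

rng = np.random.default_rng(0)
y = np.eye(4, dtype=np.float32)[[1, 2, 3, 0, 2]]   # one-hot targets, shape (5, 4)
out = rng.standard_normal((5, 4)).astype(np.float32)

# Three equivalent batch MSE computations:
loss1 = np.mean((y - out) ** 2)                   # mean of squared errors
loss2 = np.linalg.norm(y - out) ** 2 / (5 * 4)    # squared Frobenius norm / element count
loss3 = np.mean(np.mean((y - out) ** 2, axis=1))  # per-sample MSE (like tf.losses.MSE), then mean
```

All three agree up to floating-point rounding, which is why the TensorFlow cells print the same value.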
a = tf.fill([4], 0.25)
a * tf.math.log(a) / tf.math.log(2.)
-tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))
a = tf.constant([0.1, 0.1, 0.1, 0.7])
-tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))

a = tf.constant([0.01, 0.01, 0.01, 0.97])
-tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))
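The pattern in these cells is that the more concentrated the distribution, the lower its entropy. A small NumPy sketch of the same computation (helper name is my own):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)."""
    p = np.asarray(p, dtype=np.float64)
    return float(-np.sum(p * np.log2(p)))

h_uniform = entropy_bits([0.25, 0.25, 0.25, 0.25])  # maximal for 4 outcomes: 2 bits
h_peaked  = entropy_bits([0.1, 0.1, 0.1, 0.7])
h_sharp   = entropy_bits([0.01, 0.01, 0.01, 0.97])
```

The uniform distribution over 4 outcomes attains the maximum, log2(4) = 2 bits, and entropy falls monotonically as probability mass concentrates on one outcome.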
Minimum: since \(H(p,q) = H(p) + D_{KL}(p\|q)\) and \(D_{KL} \ge 0\), cross entropy is minimized when \(p = q\), where \(H(p,q) = H(p)\).
\(H(p=[0,1,0]) = -1\log 1 = 0\)
\(H([0,1,0],[q_0,q_1,q_2]) = 0 + D_{KL}(p\|q) = -1\log q_1\)  # if the true distribution p and the prediction q are equal, the cross entropy is 0

tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.25, 0.25, 0.25, 0.25])
tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.1, 0.1, 0.8, 0.1])
tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.1, 0.7, 0.1, 0.1])
tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.01, 0.97, 0.01, 0.01])
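For a one-hot target, the categorical cross entropy reduces to \(-\log q_{\text{true}}\), so the four calls above just read off the predicted probability of the true class. A NumPy sketch of the same formula (natural log, as TensorFlow uses; helper name is my own):

```python
import numpy as np

def categorical_ce(p_true, q_pred):
    """H(p, q) = -sum_i p_i * ln(q_i)."""
    p = np.asarray(p_true, dtype=np.float64)
    q = np.asarray(q_pred, dtype=np.float64)
    return float(-np.sum(p * np.log(q)))

p = [0, 1, 0, 0]
losses = [categorical_ce(p, q) for q in (
    [0.25, 0.25, 0.25, 0.25],   # uniform guess      -> -ln 0.25
    [0.1, 0.1, 0.8, 0.1],       # confidently wrong  -> -ln 0.1
    [0.1, 0.7, 0.1, 0.1],       # mostly right       -> -ln 0.7
    [0.01, 0.97, 0.01, 0.01],   # confidently right  -> -ln 0.97
)]
```

The loss is largest for the confidently wrong prediction and approaches 0 as the prediction concentrates on the true class.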
tf.losses.BinaryCrossentropy()([1],[0.1])
tf.losses.binary_crossentropy([1],[0.1])
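Both calls above implement the two-class special case \(-[y\ln p + (1-y)\ln(1-p)]\); a NumPy sketch (helper name is my own):

```python
import numpy as np

def binary_ce(y, p):
    """Binary cross entropy for one sample: -[y*ln(p) + (1-y)*ln(1-p)]."""
    return float(-(y * np.log(p) + (1 - y) * np.log(1 - p)))

loss = binary_ce(1, 0.1)   # true label 1, predicted probability 0.1
```

With a true label of 1 and a predicted probability of only 0.1, the loss is \(-\ln 0.1 \approx 2.303\), which is what the TensorFlow calls return; the same prediction under label 0 costs far less.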
Gradient vanishing: with a sigmoid output, the MSE gradient contains the factor \(\sigma'(z)\), which shrinks toward zero when the activation saturates; cross entropy cancels this factor, which is a key reason it is preferred for classification.
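This vanishing-gradient contrast can be checked numerically. A sketch, assuming a single sigmoid output unit with true label 1 and a badly wrong logit (all names are my own):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

y, z = 1.0, -10.0          # true label 1, logit very wrong
s = sigmoid(z)             # prediction near 0

# d/dz of MSE loss (s - y)^2: the sigmoid derivative s*(1-s) shrinks it to ~0.
grad_mse = 2 * (s - y) * s * (1 - s)
# d/dz of cross-entropy loss -ln(s): simplifies to s - y, which stays large.
grad_ce = s - y
```

Even though the prediction is maximally wrong, the MSE gradient is nearly zero while the cross-entropy gradient is close to -1, so cross entropy keeps learning where MSE stalls.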
MSE still has its uses for classification-style outputs in some settings, e.g. meta-learning.
Reposted from: http://gkgyz.baihongyu.com/