I am a student studying machine learning. For my study, we need to compute the second derivative of the loss function, and we use "chainer.functions.sigmoid_cross_entropy". A similar function is "chainer.functions.softmax_cross_entropy". That function has an argument "enable_double_backprop" to enable the second derivative, but "chainer.functions.sigmoid_cross_entropy" does not.
Is "chainer.functions.sigmoid_cross_entropy" a second-order differentiable function?
Please teach me!
chainer.functions.sigmoid_cross_entropy(x, t, normalize=True, reduce='mean')
chainer.functions.softmax_cross_entropy(x, t, normalize=True, cache_score=True,
class_weight=None, ignore_label=-1, reduce='mean', enable_double_backprop=False,
soft_target_loss='cross-entropy')
Yes, sigmoid_cross_entropy is second-order differentiable. For performance reasons, softmax_cross_entropy is not second-order differentiable unless enable_double_backprop=True is given. Functions that do not support higher-order derivatives are listed in https://github.com/chainer/chainer/issues/4449.
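For reference, here is a minimal sketch of taking the second derivative through sigmoid_cross_entropy using chainer.grad; the input shapes, variable names, and the choice of summing the gradient before differentiating again are just illustrative assumptions, not part of your setup.

import numpy as np
import chainer
import chainer.functions as F

# Illustrative logits and binary labels (shapes are assumptions).
x = chainer.Variable(np.random.randn(4, 3).astype(np.float32))
t = np.random.randint(0, 2, size=(4, 3)).astype(np.int32)

# First derivative, keeping the graph so it can be differentiated again.
loss = F.sigmoid_cross_entropy(x, t)
gx, = chainer.grad([loss], [x], enable_double_backprop=True)

# Second derivative of a scalar function of the gradient (here, its sum).
ggx, = chainer.grad([F.sum(gx)], [x])
print(ggx.shape)  # (4, 3)

With softmax_cross_entropy, the same pattern works only if enable_double_backprop=True is also passed to the loss function itself.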