The loss of both the training set and the validation set went down at the beginning and then went up again


I am trying to use a siamese network that combines two pretrained ResNet branches to solve a few-shot problem. The loss function is contrastive loss and the optimizer is Adam with a 0.001 learning rate.
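For context, here is a minimal sketch of the contrastive loss in the common Hadsell-style formulation, written in plain numpy; the label convention (here `y = 1` for similar pairs) and the margin value are assumptions and may differ from the asker's actual code:

```python
import numpy as np

def contrastive_loss(d, y, margin=1.0):
    """Contrastive loss over a batch of pairwise distances.

    d : array of Euclidean distances between embedding pairs
    y : array of pair labels, assumed 1 = similar, 0 = dissimilar
    """
    # Similar pairs are pulled together: penalize any nonzero distance.
    similar_term = y * d**2
    # Dissimilar pairs are pushed apart, but only while closer than the margin.
    dissimilar_term = (1 - y) * np.maximum(margin - d, 0.0)**2
    return float(np.mean(similar_term + dissimilar_term))

# Example: a similar pair at distance 0 incurs no loss,
# a dissimilar pair beyond the margin also incurs no loss.
print(contrastive_loss(np.array([0.0]), np.array([1])))              # similar, d = 0
print(contrastive_loss(np.array([2.0]), np.array([0]), margin=1.0))  # dissimilar, d > margin
```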

Both the training loss and the validation loss went down from the start of the experiment, and the model reached its best results at about 450 epochs.

However, as the experiment kept running, both numbers began to rise, eventually reaching the same level as at the beginning of the experiment.

I don't think it is an overfitting problem. Does anyone know what is happening? How can I fix it?

Thanks!
