How to resume training with a different learning rate for the optimizer in PyTorch?


Suppose I trained a model with the Adam optimizer at learning rate 0.001 and saved both the model and the optimizer state. Now I want to resume training with the same optimizer, but with a different learning rate (say 0.002). Can I do that?
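A minimal sketch of one common approach, assuming a checkpoint dictionary with hypothetical keys `"model_state"` and `"optimizer_state"` (the model, file name, and key names are illustrative, not from the question): load the saved optimizer state first, then override `lr` in each of the optimizer's `param_groups`, since `load_state_dict` restores the old learning rate.

```python
import torch
import torch.nn as nn

# Hypothetical tiny model for illustration.
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

# Save a checkpoint as usual.
torch.save({
    "model_state": model.state_dict(),
    "optimizer_state": optimizer.state_dict(),
}, "checkpoint.pt")

# ... later: resume from the checkpoint.
checkpoint = torch.load("checkpoint.pt")
model.load_state_dict(checkpoint["model_state"])
optimizer.load_state_dict(checkpoint["optimizer_state"])

# load_state_dict restored the saved lr (0.001),
# so set the new learning rate afterwards.
for param_group in optimizer.param_groups:
    param_group["lr"] = 0.002
```

This keeps Adam's accumulated moment estimates from the checkpoint while training continues at the new learning rate.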

Another thing to clarify: when resuming training, we usually load both the model and the optimizer from a saved checkpoint. But suppose this time I load only the model, not the optimizer, and continue training. What problems would that cause?
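To make the second question concrete, here is a sketch (model and step counts are illustrative) showing what a freshly constructed Adam optimizer loses compared to one that has been training: its per-parameter state (`exp_avg`, `exp_avg_sq`, and the step counter) starts empty, so the first updates after resuming behave as if optimization were starting from scratch.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

# Take a few steps so Adam accumulates per-parameter state.
for _ in range(3):
    loss = model(torch.randn(4, 10)).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The optimizer now holds state for each parameter tensor
# (moment estimates and step count).
print(len(optimizer.state))  # → 2 (weight and bias of the Linear layer)

# Resuming with only the model loaded means creating a new optimizer,
# which discards all of that accumulated state:
fresh_optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
print(len(fresh_optimizer.state))  # → 0
```

In practice this usually means a brief transient in training: the weights are correct, but Adam's adaptive per-parameter scaling has to be re-estimated from zero.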
