When using Lightning’s built-in LR finder:
```python
from pytorch_lightning.tuner import Tuner

# Create a Tuner
tuner = Tuner(trainer)

# finds the learning rate automatically and
# sets hparams.lr or hparams.learning_rate to that learning rate
tuner.lr_find(model)
```
a lot of checkpoints named `lr_find_XXX.ckpt` are created in the running directory, which creates clutter. How can I make sure that these checkpoints are not created, or keep them in a dedicated directory?
As defined in `lr_finder.py`, the initial model is saved as the checkpoint you are mentioning, `lr_find_XXX.ckpt`, in the directory `trainer.default_root_dir`. If no default directory is defined during the initialization of the trainer, the current working directory is assigned as the `default_root_dir`. After finding the ideal learning rate, `lr_find` restores the initial model from that checkpoint and then removes it. You are probably stopping the program before the checkpoint is restored and removed, so you have two options:
Pass `Trainer(default_root_dir='./NAME_OF_THE_DIR')`, but be aware that this is also the directory that the Lightning logs are saved to.
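For illustration, here is a minimal sketch of that option, assuming a recent Lightning 2.x install; `ToyModel`, the synthetic data, and the `./lr_finder_runs` directory name are made up for this example. The temporary checkpoint is written under the dedicated directory, and because `lr_find` is allowed to run to completion, it is removed again afterwards:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl
from pytorch_lightning.tuner import Tuner


class ToyModel(pl.LightningModule):
    """Minimal stand-in; any LightningModule with an `lr` attribute works."""

    def __init__(self, lr: float = 1e-3):
        super().__init__()
        self.lr = lr  # lr_find updates this attribute with the suggested value
        self.layer = torch.nn.Linear(16, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=self.lr)


def train_loader():
    # 100 batches, enough for the default num_training=100 LR-finder steps
    x = torch.randn(3200, 16)
    y = torch.randn(3200, 1)
    return DataLoader(TensorDataset(x, y), batch_size=32)


model = ToyModel()

# Point default_root_dir at a dedicated directory: the temporary lr_find
# checkpoint (and the Lightning logs) are written there instead of the
# current working directory.
trainer = pl.Trainer(default_root_dir="./lr_finder_runs", max_epochs=1)

tuner = Tuner(trainer)
lr_finder = tuner.lr_find(model, train_dataloaders=train_loader())

# Since lr_find ran to completion, it restored the initial weights from the
# temporary checkpoint and deleted the file, so no clutter is left behind.
print(lr_finder.suggestion())  # suggested learning rate
print(model.lr)                # attribute updated in place by lr_find
```

Even if the run is interrupted mid-search, the leftover checkpoint then sits under `./lr_finder_runs` rather than in your working directory, so the clutter is at least contained.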