Why does the TF Object Detection API sample config use the training hyperparameter "step" instead of "epoch"?


I'm just wondering why the sample config file uses "step" rather than "epoch".

Since the config file also sets `batch_size`, it might seem not to matter whether you specify training length in steps or epochs.

But when I use "step", the last epoch may not cover the full training dataset.
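To make the concern concrete, here is a minimal sketch of the step/epoch relationship. The function names and the example numbers are illustrative assumptions, not part of the TF Object Detection API:

```python
# Illustrative sketch: relating a fixed step budget to epochs.
# Nothing here is TF OD API code; names and numbers are assumptions.

import math

def epochs_to_steps(num_examples, batch_size, num_epochs):
    """Optimizer steps needed to pass over the data num_epochs times."""
    steps_per_epoch = math.ceil(num_examples / batch_size)
    return steps_per_epoch * num_epochs

def epochs_covered(num_steps, batch_size, num_examples):
    """Approximate number of epochs a fixed step budget covers."""
    return num_steps * batch_size / num_examples

# With 1000 examples and batch_size 24, a step budget of 200 does not
# line up with epoch boundaries: training stops partway through an epoch.
print(epochs_covered(num_steps=200, batch_size=24, num_examples=1000))  # 4.8
print(epochs_to_steps(num_examples=1000, batch_size=24, num_epochs=5))  # 210
```

So a step count chosen independently of the dataset size can indeed end mid-epoch, which is exactly the behavior described above; in practice this is usually harmless when the data is shuffled each epoch.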
