How can I find the inference time of a trained model in TensorFlow Object Detection API v2?


I used this repository to train my model: https://github.com/nicknochnack/TFODCourse/tree/main and this is the video link if you're interested: https://youtu.be/yqkISICHH-U

I would like to find the inference time of the trained model per image. Preferably, the average inference time on a set of images.

In Section 8 - Load Train Model From Checkpoint of 2. Training and Detection.ipynb, I tried calculating the inference time by taking the difference between the times recorded just before and just after the predict() call. Here is the code block for quick reference:

import os
import time

import tensorflow as tf
from object_detection.utils import config_util
from object_detection.builders import model_builder

# Load pipeline config and build a detection model
configs = config_util.get_configs_from_pipeline_file(files['PIPELINE_CONFIG'])
detection_model = model_builder.build(model_config=configs['model'], is_training=False)

# Restore checkpoint
ckpt = tf.compat.v2.train.Checkpoint(model=detection_model)
ckpt.restore(os.path.join(paths['CHECKPOINT_PATH'], 'ckpt-35')).expect_partial()

@tf.function
def detect_fn(image):
    image, shapes = detection_model.preprocess(image)
    start_time = time.time() # Start Time
    prediction_dict = detection_model.predict(image, shapes)
    end_time = time.time() # End Time
    print(f"Inf time: {end_time - start_time}") # Difference
    detections = detection_model.postprocess(prediction_dict, shapes)
    return detections

However, the measured times seem far too high. For instance, it reports 4.839 seconds to detect an object in a single image. I am not sure whether I am doing something wrong, since most object detection models are reported to have inference times of only milliseconds.

If I am measuring the inference time incorrectly, may I ask for suggestions on how I should do it? Thank you.
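To make it clearer what I am after, here is a rough sketch of how I imagined averaging the time over a set of images, timing the detect_fn call from the Python side rather than inside it. The test image folder under paths['IMAGE_PATH'] is just a placeholder for my own test set, and I am not sure whether skipping the first call is the right way to handle warm-up:

import glob
import os
import time

import cv2
import numpy as np
import tensorflow as tf

# Placeholder: my own folder of test images from the same workspace layout
test_image_paths = glob.glob(os.path.join(paths['IMAGE_PATH'], 'test', '*.jpg'))

times = []
for i, image_path in enumerate(test_image_paths):
    # Same preprocessing as in the notebook: load with OpenCV, add a batch dimension
    image_np = cv2.imread(image_path)
    input_tensor = tf.convert_to_tensor(np.expand_dims(image_np, 0), dtype=tf.float32)

    start = time.time()
    detections = detect_fn(input_tensor)
    elapsed = time.time() - start

    # Skip the first call, since it also includes tf.function tracing
    if i > 0:
        times.append(elapsed)

print(f"Average inference time per image: {sum(times) / len(times):.4f} s")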
