Differences in timing between timeit.timeit() and Timer.autorange()


I am trying to figure out how to use Python's timeit module, but I get vastly different timings between its timeit.timeit() function and timeit.Timer.autorange():

import timeit
setup = """
def f():
    x = "-".join(str(n) for n in range(100))
"""

def f():
    x = "-".join(str(n) for n in range(100))


t = timeit.timeit("f()", setup=setup, number=100)
print(t)

num, timing = timeit.Timer(stmt='f()', globals=globals()).autorange()
per_run = timing/num
print(per_run *1000)

results in numbers like

0.0025681090000944096  # timeit.timeit
0.014390230550020533   # timeit.Timer.autorange

so there is roughly an order of magnitude of difference between the two approaches.

I am probably doing something wrong but have no idea what. The autorange documentation is so sparse.

1 Answer

Answer by bugmenot123:

The result of timeit.timeit is the total runtime across all iterations, not the time per iteration. You need to divide it by the number of iterations!

This example, which reuses the number of iterations determined by autorange, shows it nicely:

import timeit

setup = """
def f():
    x = "-".join(str(n) for n in range(100))
"""

def f():
    x = "-".join(str(n) for n in range(100))


iterations, timing = timeit.Timer(stmt='f()', globals=globals()).autorange()
per_run = timing/iterations
print(per_run)

timing = timeit.timeit("f()", setup=setup, number=iterations)
per_run = timing/iterations
print(per_run)

Running this just now gave me:

2.2483807100024934e-05
2.0093059600003473e-05
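As an aside, per-run numbers from a single pass can still be noisy. A common pattern (a sketch going beyond the original answer, but using only documented timeit APIs) is to combine the iteration count picked by autorange with Timer.repeat and take the minimum of the rounds, which the timeit docs recommend as the least noisy estimate:

```python
import timeit

def f():
    x = "-".join(str(n) for n in range(100))

timer = timeit.Timer(stmt="f()", globals=globals())

# Let autorange pick an iteration count large enough to measure reliably.
iterations, _ = timer.autorange()

# repeat() returns one *total* time per round; divide by iterations
# and take the minimum for a stable per-call estimate.
timings = timer.repeat(repeat=5, number=iterations)
per_run = min(timings) / iterations
print(f"{per_run * 1e6:.2f} microseconds per call")
```

Note that each value in timings is again a total for all iterations in that round, so the division by iterations is still required.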