Modular Exponentiation Big O Notation

I'm taking an algorithms design and analysis class where we focus on the time and space complexity of common algorithms, but I'm having a hard time understanding Big O notation/time complexity. For this project we had to code up a Fermat primality test (based on Fermat's little theorem) to determine whether a number is prime, alongside the Miller-Rabin test to catch Carmichael numbers (like 561). I'd just appreciate a dumbed-down explanation of what the Big O notation is for my modular exponentiation function (mod_exp), and why, since it's used in both tests. That way I can better understand it and try to determine the time and space complexity of the other functions on my own.

def mod_exp(x, y, N):
    if y == 0:                      # Time: Space:
        return 1
    z = mod_exp(x, y // 2, N)       # Time: Space:
    if y % 2 == 0:
        return (z ** 2) % N         # Time: Space:
    else:
        return (x * (z ** 2)) % N   # Time: Space:
# Total Time:  Total Space:
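
For context, mod_exp gets called from the Fermat test roughly like this (a sketch of the idea only; the random base selection and the trial count k here are illustrative assumptions, not the assignment's exact code):

import random

def fermat_test(N, k=20):
    # Sketch: Fermat's little theorem says a^(N-1) mod N == 1 when N is
    # prime and a is not a multiple of N. Try k random bases; any failure
    # proves N composite. (Assumes N >= 4; k=20 is an arbitrary choice.)
    for _ in range(k):
        a = random.randrange(2, N - 1)
        if mod_exp(a, N - 1, N) != 1:
            return False    # definitely composite
    return True             # probably prime; Carmichael numbers like 561 can fool this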

About the only thing I understand is that it's recursively calling itself some number of times, but what about the return statements? Since it's recursively returning z^2 mod N or x * z^2 mod N, would those instances also be O(n), or is that an exponential instance so it will be O(log n) or O(2^n)?

1 Answer

Answer from lawnjittle:

This is a good explanation of Big O.

Your mod_exp function looks like it's O(log y) to me. The math in each of your return statements is O(1). Strictly speaking that isn't true for arbitrarily large integers (squaring a huge number costs more than squaring a small one), but in an algorithms analysis class it's generally a fair assumption that arithmetic on machine-sized numbers is O(1).

The parameters x and N are misleading because they don't change between recursive calls to the function, so they can be thought of as global variables or inline constants; they don't affect how many recursive calls are made.

The call to mod_exp(x, y//2, N) is the most interesting part. Since you're dividing y by 2 on each call and your function stops recursing when y == 0 (i.e. y == 0 is your base case), then you're expecting to make at most O(log y) recursive calls.
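
If you want to see that concretely, here's an instrumented variant (my own sketch, not from the original code) that returns the call count alongside the result:

def mod_exp_counted(x, y, N):
    # Same recursion as mod_exp, but also counts the calls so you can
    # watch the count grow like log2(y) instead of y.
    if y == 0:
        return 1, 1
    z, calls = mod_exp_counted(x, y // 2, N)
    if y % 2 == 0:
        return (z ** 2) % N, calls + 1
    return (x * (z ** 2)) % N, calls + 1

# mod_exp_counted(7, 560, 561) -> (1, 11): 560 is a 10-bit number, so the
# recursion bottoms out after about log2(560) ~ 9.1 halvings, not 560 calls.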

In computer science, the base of log is presumed to be 2 unless otherwise stated. (In much of math, log means base 10 or base e.) Inside Big O the base doesn't actually matter, since logs of different bases differ only by a constant factor.

If you don't immediately understand why y = y//2 can only happen O(log y) times before y == 0, then let me know; that's a knowledge gap you should be aggressive about closing, because it's a critical intuition for algorithms analysis. A quick experiment like the one below can make it concrete.
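
This throwaway experiment (my addition) shows that repeatedly halving y takes floor(log2(y)) + 1 steps to reach 0:

import math

def halvings(y):
    # How many times can y be floor-divided by 2 before it hits 0?
    count = 0
    while y > 0:
        y //= 2
        count += 1
    return count

for y in (1, 10, 560, 10**6):
    print(y, halvings(y), math.floor(math.log2(y)) + 1)
# The last two columns match for every y: 1, 4, 10, and 20 steps.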

You know there can be at most O(log y) calls to your mod_exp function, and you know that executing a single call (excluding the recursive call it makes) takes O(1) time. O(log y) calls * O(1) work per call means your final answer is O(log y).
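
As a sanity check (my suggestion, not from the original answer), Python's built-in three-argument pow(x, y, N) performs the same fast modular exponentiation, so you can verify mod_exp against it:

import random

for _ in range(1000):
    x = random.randrange(0, 1000)
    y = random.randrange(0, 1000)
    N = random.randrange(2, 1000)   # N >= 2 so 'mod N' is meaningful
    assert mod_exp(x, y, N) == pow(x, y, N)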