I can imagine only two cases. If 2^k is close to n (say, 2^k == n), then the number of steps required is n / n, or 1, so the run-time is O(1).
However, if 2^k is small (say, 2^k == 1), then n steps are required, so the run-time is O(n).
In essence, if 2^k can be written in terms of n, then the number of steps required is a constant, so the run-time is O(1). If 2^k cannot be written in terms of n, the run-time is O(n).
Both of these cases are worst-case; for any unknown and possibly large n, these run-times still hold. Is there any situation in which 2^k can't be classified as either much smaller than n or close to n?
Imagine you were to write a (possibly very long) table containing the run-times of the function for every value of k such that n / 2^k >= 1. As k increases from 0, the run-times approach O(1); as k decreases from a large value, they approach O(n). Is there a theoretical point at which the run-time switches from O(1) to O(n)?
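To make that table concrete, here's a minimal sketch (Python; the value of n and the range of k are arbitrary choices of mine for illustration) that prints the step count n / 2^k as k sweeps from 0 up to lg(n):

```python
# Sketch: tabulate the step count n / 2^k for a fixed n as k varies.
# n and the range of k are arbitrary illustrative choices.
n = 1 << 20  # n = 2^20

for k in range(0, 21):  # all k such that n / 2^k >= 1
    steps = n >> k      # n / 2^k, the number of steps required
    print(f"k = {k:2d}  ->  n / 2^k = {steps}")
```

The printed step counts fall smoothly from n down to 1; no single row stands out as the point where the behavior "switches".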
Or am I simply not grasping the "main idea" of Big-O analysis?
EDIT: A linear search is assumed
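For concreteness, the function I have in mind looks something like this (a hypothetical sketch; the name search_region and the exact setup are my own illustration):

```python
def search_region(items, k, target):
    """Linearly search the first n / 2^k elements of items (n = len(items)).

    Hypothetical sketch: in the worst case this examines n / 2^k elements,
    which is the step count discussed above.
    """
    n = len(items)
    limit = n >> k  # region size: n / 2^k
    for i in range(limit):
        if items[i] == target:
            return i
    return -1  # not found within the first n / 2^k elements
```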
You’re missing a main idea of big-O analysis. There’s always an assumption (usually unstated) that you’re looking at a limit as something goes to infinity. Usually, there’s only one variable, so it’s clear from context that that’s the variable going to infinity. (For example,
n^2 + 23n ∈ O(n^2) as n goes to infinity.) In your case, you have two variables, n and k. So in order to talk about big-O, you need to quantify what is going to infinity (it looks to be n), and what the other variable is doing in the meantime. This does lead to your two possibilities: if k is constant as n goes to infinity, then the answer is O(n); but if k is such that n / 2^k is constant (i.e., k = lg(n) - C), then the answer is O(1). But there are other possibilities: if n / 2^k = √n (i.e., k = lg(n)/2), then the answer is O(√n). So depending on how k grows with n, the answer is anything between O(1) and O(n).
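To see this numerically, here's a quick sketch (Python; the constant C = 3 and the sample values of n are arbitrary choices) that tracks n / 2^k under the three couplings of k to n just described:

```python
import math

# Sketch: watch the step count n / 2^k as n grows, under three couplings of k to n.
for n in (2**10, 2**14, 2**18, 2**22):
    lg = int(math.log2(n))
    k_const = 3        # k constant           -> steps = n / 8, grows like n
    k_offset = lg - 3  # k = lg(n) - C, C = 3 -> steps = 2^3 = 8, constant
    k_half = lg // 2   # k = lg(n) / 2        -> steps = sqrt(n)
    print(f"n = {n:>8}: {n >> k_const:>8} {n >> k_offset:>8} {n >> k_half:>8}")
```

The first column grows linearly, the second stays fixed at 8, and the third grows like √n, matching O(n), O(1), and O(√n) respectively.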