Recurrence Relations & Time Complexities
Recurrence Relations
Here’s how I work out the time complexity of recursive functions. First, you need to figure out the recurrence relation, which usually takes this form:
T(N) = X * T(Y) + Z
- X := The number of recursive calls in your function
- Y := The “input” into your recursive calls
- Z := The “work” done inside your function
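As a quick warm-up sketch (a toy example of mine, not part of the coin change problem below), take a plain factorial function:

def fact(n):
    # one recursive call (X = 1), the input shrinks by one (Y = n - 1),
    # and a constant amount of extra work (Z = O(1))
    if n <= 1: return 1
    return n * fact(n - 1)

Its recurrence relation would be T(N) = 1 * T(N - 1) + O(1).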
Let’s take the following code as an example, from the coin change problem.
# fewest coins needed to make up `target`; `coins` is a list defined elsewhere
def recurse(target):
    if target < 0: return float('inf')
    if target == 0: return 0
    return 1 + min([
        recurse(target - coin) for coin in coins
    ])
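The snippet leans on a coins list defined elsewhere, so here’s a minimal usage sketch (the specific values are just an assumption for illustration):

coins = [1, 2, 5]       # assumed example input
amount = 11             # assumed example input
print(recurse(amount))  # -> 3, i.e. 5 + 5 + 1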
Here’s what the recurrence relation would become:
T(N) = len(coins) * T(N - min(coins)) + O(len(coins))
This might seem kind of arbitrary, but let’s take a look:
- I define coins as the coins array
- I define N as the amount variable
- Thus, we call the recurse function len(coins) times, as shown by the code: for coin in coins
- We do recurse(target - coin), so really, if we’re talking worst case, it’s subtracting the minimum-valued coin from the target; that’s why we use N - min(coins)
- Finally, since we call min([...]) in our function, that’s O(len(coins)) work right there.
Since we’re talking “worst case” time complexity, we can approximate all of this into something simpler:
T(N) = N * T(N - 1) + O(N)
We can define N loosely as max(amount, len(coins)); for our purposes, it achieves our goal nicely.
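If you want to sanity-check the recurrence empirically, here’s a small counting sketch of mine (it assumes the coins and amount values from the usage example above):

call_count = 0

def recurse_counted(target):
    # same logic as recurse(), but tracks how many calls are made in total
    global call_count
    call_count += 1
    if target < 0: return float('inf')
    if target == 0: return 0
    return 1 + min([recurse_counted(target - coin) for coin in coins])

recurse_counted(amount)
print(call_count)  # grows very quickly as amount increases

The count blows up as amount grows, which is exactly what an exponential recurrence predicts.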
Recurrence Relation → Time Complexity
Now, how do we go from a recurrence relation to a time complexity? Let’s abstract this, starting with our previous definition:
T(N) = X * T(Y) + Z
Basically, we’re doing the following:
O(N) = X^(Y) * Z
Now, Y will typically be one of the following:
- n-1 or n-c, where c is a constant
- n / 2 or n / c, where c is a constant
Now, n-c is pretty much going to map to O(n), whilst n/c is going to map to log_c(n); in both cases, that’s roughly how many times the input can shrink before the recursion bottoms out.
That’s really it!
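To see the mapping in action, here are a couple of familiar recurrences of my own (not from the coin change problem):

T(N) = 2 * T(N - 1) + O(1) → 2^N * 1 = O(2^N)
T(N) = 2 * T(N / 2) + O(N) → 2^(log_2(N)) * N = N * N = O(N^2)

The first bound is tight; the second is a valid upper bound, but the tight bound for that recurrence (merge sort) is actually O(N log N), which is the trade-off with this shortcut.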
If we follow through with our previous example:
T(N) = N * T(N - 1) + O(N)
This is what we get:
N^(N-1) * N = O(N^N)
So, our time complexity from the previous recursive function has an upper bound of O(N^N), where N is max(amount, len(coins)).
In terms of the space complexity, I believe you can do exactly the same as previously mentioned, but just don’t multiply by the Z term. So we’d be talking about an O(N^(N-1)) space complexity in our previous example.
Memoization
What if we added memoization, perhaps through a decorator such as:
@functools.lru_cache(maxsize=None)
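Applied to the earlier snippet, a memoized version might look like this sketch (same assumption as before that coins is defined elsewhere):

import functools

@functools.lru_cache(maxsize=None)
def recurse(target):
    # identical logic to before; lru_cache stores the result for each distinct target
    if target < 0: return float('inf')
    if target == 0: return 0
    return 1 + min([recurse(target - coin) for coin in coins])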
My general approximation is this:
- Multiply the sizes of the inputs together, then multiply by the work done inside the function
Thus, in our previous example, we have one input, amount, with size N. The work done inside the function can also be approximated by N. So we simply get O(N*N) = O(N^2) as an upper bound with memoization.
In terms of space complexity, you can just multiply the sizes of the inputs together: O(N) in this case!
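If you want to double-check that in practice (my own suggestion, not part of the heuristic), the lru_cache wrapper exposes cache_info(), and its currsize is roughly the number of distinct subproblems it has stored:

recurse(amount)
print(recurse.cache_info())
# CacheInfo(hits=..., misses=..., maxsize=None, currsize=...)
# currsize stays close to amount (our N), matching the O(N) space estimate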
Conclusion
Hopefully, this article has shown how you can quickly approximate the time complexity of most recursive functions. These ideas give you an upper bound, but not always a tight upper bound. I like to use this method in technical interviews if I’m asked the time complexity of a recursive function. Overall, these methods are heuristics and not precise.