Master theorem

Analysing the complexity of recursive functions is not always easy, especially when the function does something fancy. Luckily, clever brains have provided a cookbook that makes determining the complexity of a recursive algorithm quite straightforward.

We may consider a recursive function as:
R(N) = A * R(N / B) + O(N^C)
Let me explain the symbols. R is a recursive function called on a problem of size N. Being recursive, it calls itself A times, each time on a smaller problem of size N/B. In many recursive implementations, especially divide and conquer techniques, there is also a “merge” step (think of mergesort). The complexity of that merge step is expressed by the last term of the formula, O(N^C).
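To make this shape concrete, here is a minimal Python sketch (the function name is made up for illustration) of a recursion that fits the formula:

```python
def total(xs):
    """Recursive sum shaped like R(N) = 2 * R(N/2) + O(N):
    two recursive calls on halves, plus O(N) extra work per call
    (the list slicing)."""
    n = len(xs)
    if n == 0:
        return 0
    if n == 1:
        return xs[0]
    mid = n // 2
    # A = 2 calls, each on a problem of size N / B with B = 2;
    # the slicing costs O(N), so C = 1
    return total(xs[:mid]) + total(xs[mid:])
```

Here A = 2, B = 2 and C = 1, the same parameters we will meet again with mergesort below.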

We can then distinguish three cases that describe the complexity of the call:
1. A = B^C, then R(N) = O(N^C log N)
2. A < B^C, then R(N) = O(N^C)
3. A > B^C, then R(N) = O(N^{log_B A})
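The three cases above can be mechanised in a few lines of Python; the helper name here is hypothetical, just a way to check the table:

```python
import math

def master_theorem(a, b, c):
    """Classify R(N) = a * R(N/b) + O(N^c) into one of the three cases
    and return the resulting big-O class as a string."""
    threshold = b ** c
    if a == threshold:                    # case 1: A = B^C
        return f"O(N^{c} log N)" if c else "O(log N)"
    if a < threshold:                     # case 2: A < B^C
        return f"O(N^{c})"
    return f"O(N^{math.log(a, b):.3g})"   # case 3: exponent is log_B(A)
```

For example, `master_theorem(2, 2, 1)` lands in the first case and reports O(N^1 log N), the mergesort result derived below.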

Do you remember mergesort? Mergesort makes two recursive calls, each on half of the problem. When the recursion is done, there is an extra “merge” step with O(N) complexity. If we plug this information into the formula we get: A = 2, B = 2, C = 1, which gives 2 = 2^1. This is the first case, so the complexity is O(N log N).
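For completeness, here is a standard mergesort in Python, with the pieces of the formula marked in the comments:

```python
def mergesort(xs):
    """Sort a list. Two recursive calls (A = 2) on halves (B = 2),
    followed by an O(N) merge step (C = 1)."""
    if len(xs) <= 1:
        return xs                      # base case
    mid = len(xs) // 2
    left = mergesort(xs[:mid])         # R(N / 2)
    right = mergesort(xs[mid:])        # R(N / 2)
    # O(N) "merge" step: interleave the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The two recursive calls and the linear merge are exactly the A = 2, B = 2, C = 1 we plugged into the formula.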
