Master Theorem for Recurrence Relations

The Master Theorem provides a quick method to analyse the asymptotic behaviour of recurrence relations.


Master Theorem


FOR DECREASING FUNCTIONS 

If a recurrence relation is of the form

`T(n)=aT(n-b)+f(n)`

then, as per the Master Theorem, we have three cases depending on the value of `a`:


| If | Answer |
| --- | --- |
| `a=1` | `O(n f(n))` |
| `a>1` | `O(a^{\frac{n}{b}}f(n))` |
| `a<1` | `O(f(n))` |

In any problem, our main task is to identify `a`, `b` and `f(n)`.

NOTE: Replace `f(n)` by its asymptotic equivalent.
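
For example, the Tower of Hanoi recurrence `T(n) = 2T(n-1) + 1` has `a=2`, `b=1` and `f(n) = O(1)`, so the case `a>1` applies:

`T(n) = O(a^{\frac{n}{b}} f(n)) = O(2^n)`

Similarly, `T(n) = T(n-1) + n` has `a=1`, `b=1` and `f(n) = n`, so the case `a=1` gives `T(n) = O(n \cdot f(n)) = O(n^2)`.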


FOR DIVIDING FUNCTIONS 

If a recurrence relation is of the form


`T(n)= aT ( \frac{n}{b} ) + n^k(\log(n))^p`

then, as per the Master Theorem, we have six cases depending on the values of `a`, `b`, `k` and `p`:

| If | Answer |
| --- | --- |
| `\log_b a > k` | `\Theta(n^{\log_b a})` |
| `\log_b a = k` and `p > -1` | `\Theta(n^k(\log(n))^{p+1})` |
| `\log_b a = k` and `p = -1` | `\Theta(n^k\log(\log(n)))` |
| `\log_b a = k` and `p < -1` | `\Theta(n^k)` |
| `\log_b a < k` and `p \geq 0` | `\Theta(n^k(\log(n))^p)` |
| `\log_b a < k` and `p < 0` | `O(n^k)` |

In any problem, our main task is to identify `a`, `b`, `k` and `p`.
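
For example, Merge Sort satisfies `T(n) = 2T(\frac{n}{2}) + n`, so `a=2`, `b=2`, `k=1`, `p=0`. Since `\log_2 2 = 1 = k` and `p = 0 > -1`, we get `T(n) = \Theta(n^k(\log(n))^{p+1}) = \Theta(n\log n)`. Likewise, Binary Search satisfies `T(n) = T(\frac{n}{2}) + 1`, so `a=1`, `b=2`, `k=0`, `p=0`; here `\log_2 1 = 0 = k` and `p > -1`, giving `T(n) = \Theta(\log n)`.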


But why does it work?

The approach was first presented by Dorothea Haken, James Benjamin Saxe and Jon Louis Bentley.
The full derivation is somewhat technical, but at a basic level the idea rests on the recursion tree: its height, the cost contributed at each level, and which of those costs dominates. Beyond a certain height, a few terms begin to dominate the total.
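
A rough sketch of that idea, restricted to the dividing form with `p = 0` for simplicity: unrolling `T(n) = aT(\frac{n}{b}) + n^k` gives a recursion tree of height `\log_b n`, where level `i` holds `a^i` subproblems of size `\frac{n}{b^i}`, contributing `a^i (\frac{n}{b^i})^k` work. Summing over all levels,

`T(n) \approx n^k \sum_{i=0}^{\log_b n} \left( \frac{a}{b^k} \right)^i`

which is a geometric series with ratio `\frac{a}{b^k}`. If the ratio exceeds `1` (i.e. `\log_b a > k`), the leaf level dominates and the sum is `\Theta(n^{\log_b a})`; if it equals `1` (i.e. `\log_b a = k`), every level contributes about `n^k` and the sum is `\Theta(n^k \log n)`; if it is below `1` (i.e. `\log_b a < k`), the root level dominates and the sum is `\Theta(n^k)`. Carrying the extra `(\log(n))^p` factor through the same sum yields the `p`-dependent cases in the table above.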

