Introduction
A recurrence relation can be defined in the following way.
1. T(0) or T(1) = c1;
2. T(n) = c2 + T(n-1); where
T(0) or T(1) = cost to solve a problem of size 0 or 1 respectively,
T(n) = cost to solve a problem of size n,
T(n-1) = cost to solve a problem of size n-1,
c1 and c2 are constants.
Here line 1 describes the basic step: it gives the cost of the problem for the trivial case(s). Line 2 describes the cost of the problem for input size n in terms of input size (n-1).
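The two lines above can be sanity-checked with a short sketch: unrolling T(n) = c2 + T(n-1) down to the base case gives the closed form T(n) = c1 + (n-1)*c2 (the constants c1 = 1, c2 = 5 below are arbitrary choices for the check):

```python
# Sanity check of the recurrence above: T(1) = c1, T(n) = c2 + T(n-1).
# Unrolling n-1 times gives the closed form T(n) = c1 + (n-1)*c2.
c1, c2 = 1, 5  # arbitrary constants chosen for this check

def T(n):
    if n == 1:
        return c1          # basic step (line 1)
    return c2 + T(n - 1)   # recursive step (line 2)

for n in range(1, 100):
    assert T(n) == c1 + (n - 1) * c2
```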
When an algorithm contains a recursive call to itself, its running time can often be described by a recurrence relation.
Solving a recurrence relation means eliminating the recursion from the function definition and finding a closed form of the function. There are several techniques for solving recurrence relations:
- Substitution Method
- Iterative Method or Recursion Tree Method
- Master Method
Substitution Method
It is one of the most powerful methods for solving recurrence relations. This method comprises two steps:
- Guess the form of the solution.
- Use mathematical induction to establish the correctness of the guess.
Let's see how it works with an example.
T(n) = 1, if n = 1;
T(n) = 2*T(floor(n/2)) + n, otherwise.
=> Guess T(n) = O(n*log(n)). Next we will prove it by induction.
To show the upper bound, we will try to find two constants c and n0 such that T(n) <= c*f(n) for all n >= n0. First we compute the values for small n (n = 2 and n = 3).
Base Case:
- T(2) = 4
- T(3) = 5
Our aim is to establish the result T(n) <= c*f(n), so now we focus on finding a value of c. One can easily see that c = 2 works for n = 2 and n = 3. But we need induction to establish the guess for all n.
Inductive Step: Assume that the guess is true for all n = 2, 3, ..., K.
For n = K+1, we have
T(n) = 2*T(floor(n/2)) + n
<= 2*c*floor(n/2)*log(floor(n/2)) + n
<= c*n*log(n/2) + n
= c*n*log(n) - c*n + n
<= c*n*log(n), for c >= 1.
Hence the inductive step holds. The complexity of the above recurrence relation is O(n*log(n)).
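As a numerical sanity check of the bound just proved (a small sketch, not part of the proof), one can compute T(n) directly and compare it with 2*n*log2(n), taking logs base 2 as in the derivation:

```python
import math
from functools import lru_cache

# T(1) = 1, T(n) = 2*T(floor(n/2)) + n, as in the example above.
@lru_cache(maxsize=None)
def T(n):
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Verify the proved bound T(n) <= c*n*log2(n) with c = 2 over a range of n.
for n in range(2, 5000):
    assert T(n) <= 2 * n * math.log2(n)
```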
Be careful: errors can occur when using asymptotic notation. For example, in the previous question, suppose one tries to establish T(n) = O(n) by taking the initial guess T(n) <= c*n and applying it to the recurrence in the following manner:
T(n) <= 2*(c*floor(n/2)) + n
<= c*n + n
= (1+c)*n
= O(n)
But this is not correct, because (1+c)*n is not the exact form of the inductive hypothesis; the actual form is T(n) <= c*n, and we must prove exactly that form to establish that the complexity is O(n).
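To see concretely that the O(n) guess must fail (a quick numerical check, not a proof): for powers of two, T(2^k) = 2^k * (k+1), so T(n)/n grows without bound:

```python
# T(n)/n is unbounded, so T(n) = O(n) cannot hold.
def T(n):
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

ratios = [T(2 ** k) / 2 ** k for k in range(0, 15)]
# For n = 2^k the ratio is exactly k + 1, i.e. strictly increasing.
assert ratios == [k + 1 for k in range(0, 15)]
```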
Next I will show that O(n^2) is also a valid solution of this recurrence relation. (Because if F(n) = n^2 and G(n) = n*log(n), then G(n) = O(F(n)); see the previous post.)
Guess T(n) <= c*n^2.
Applying it in the recurrence relation, we get
T(n) <= 2*c*(floor(n/2))^2 + n
<= (c*n^2)/2 + n
<= c*n^2, for c >= 2.
Hence the complexity is O(n^2).
Try to prove that T(n) != O(log(n)) by taking the initial guess T(n) <= c*log(n).
Also try to solve T(n) = T(n^(1/2)) + 1 (hint: its solution is O(log(log(n)))).
Iterative or Recursion Tree Method
The basic idea of this method is to expand the terms and then derive the base case from them. Generally we ignore floors and ceilings here. I'm not going into a formal definition, since you all know the concept of iteration. Just look at the examples I'm explaining; you can definitely follow them.
Ex1. Let T(n) = T(n/2) + c, T(1) = 1 be the given recurrence relation. Here we will substitute the recursive part of the R.H.S. using the definition of the recursive expression on the L.H.S.
So T(n) = T(n/2) + c can be written as
= [T((n/2)/2) + c] + c
= T(n/(2^2)) + c + c
= [T((n/(2^2))/2) + c] + c + c
= T(n/(2^3)) + c + c + c
= ...
= T(n/(2^k)) + c + c + ... + c (k times)   -- eq(1)
Now let 2^k = n, so that k = log(n). Substituting in eq(1), we get
T(n) = T(1) + k*c
= 1 + c*log(n)
= O(log(n))
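The closed form just derived can be verified for powers of two (a small sketch; c = 3 is an arbitrary constant chosen for the check):

```python
import math

# T(1) = 1, T(n) = T(n/2) + c; the derivation above gives T(n) = 1 + c*log2(n).
c = 3  # arbitrary constant for this check

def T(n):
    if n == 1:
        return 1
    return T(n // 2) + c

for k in range(0, 20):
    n = 2 ** k
    assert T(n) == 1 + c * math.log2(n)
```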
[Figure: recursion tree for T(n) = T(n-1) + 1]
Another Example: T(n) = n^2 + 2*T(n/2)
= n^2 + 2*[(n/2)^2 + 2*T(n/(2^2))]
= n^2 + 2*(n^2/4) + 4*T(n/4)
= n^2 + 2*(n^2/4) + 4*(n^2/16) + 8*T(n/8)
= ...
= Σ[k=0 to log(n)-1] n^2*(1/2)^k + n*T(1)
Proceeding in the same manner, we get complexity = O(n^2), since the geometric series Σ(1/2)^k is bounded by the constant 2.
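The summation above can be checked directly for powers of two (a quick numerical sketch): the expansion matches T(n) exactly, and the geometric factor keeps T(n) below 2*n^2 + n:

```python
# T(1) = 1, T(n) = n^2 + 2*T(n/2), as in the example above.
def T(n):
    if n == 1:
        return 1
    return n * n + 2 * T(n // 2)

for k in range(1, 14):
    n = 2 ** k
    # sum_{i=0}^{log2(n)-1} n^2*(1/2)^i  +  n*T(1)
    closed = sum(n * n * 0.5 ** i for i in range(k)) + n * 1
    assert T(n) == closed         # the expansion is exact
    assert T(n) <= 2 * n * n + n  # hence T(n) = O(n^2)
```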
[Figure: recursion tree for T(n) = n^2 + 2*T(n/2)]
Try to solve the previous question using this technique; it's quite simple. Also try to solve this one:
T(n) = 2*T(n^(1/2)) + log(n), T(1) = 1.
Master Method
This is a faster method for solving recurrence relations. Here we express the inductive step of the recurrence as T(n) = a*T(n/b) + f(n), where a >= 1, b > 1, and f(n) is some asymptotically positive function. The Master Theorem has the following three cases:
- Case 1: if f(n) = O(n^(log(a)/log(b) - e)) for some constant e > 0, then T(n) = Θ(n^(log(a)/log(b))).
- Case 2: if f(n) = Θ(n^(log(a)/log(b))), then T(n) = Θ(n^(log(a)/log(b)) * log(n)).
- Case 3: if f(n) = Ω(n^(log(a)/log(b) + e)) for some constant e > 0, and a*f(n/b) <= c*f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
The basic idea of the theorem is that it finds the larger of the two functions f(n) and n^(log(a)/log(b)). You can easily see that in case 1, n^(log(a)/log(b)) is larger, so the solution is Θ(n^(log(a)/log(b))); in case 3, f(n) is larger, so the solution is Θ(f(n)). In case 2 both have the same size, so we multiply by a logarithmic factor, and the solution is Θ(n^(log(a)/log(b)) * log(n)).
Now we will solve some problems.
Example 1: T(n) = 9*T(n/3) + n^2
=> a = 9, b = 3, log(a)/log(b) = 2, hence n^(log(a)/log(b)) = n^2. So it matches the second case, and the complexity is Θ(n^2*log(n)).
Example 2: T(n) = 3*T(n/4) + n*log(n)
=> Here log(a)/log(b) = log(3)/log(4) ≈ 0.79, and n*log(n) = Ω(n^(0.79 + e)); the regularity condition also holds, so it falls under case 3. Hence the complexity is Θ(n*log(n)).
Example 3: T(n) = 2*T(n/2) + n/log(n)
=> The Master Theorem is not applicable here: n^(log(a)/log(b)) = n, and the difference between n and f(n) = n/log(n) is not polynomial (it is only a logarithmic factor).
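For polynomial driving functions f(n) = n^p, the case analysis above can be sketched in a few lines (a simplification, not a full Master Theorem checker: it handles only f(n) = n^p and assumes the regularity condition of case 3 holds):

```python
import math

def master_case(a, b, p, eps=1e-9):
    """Return which Master Theorem case T(n) = a*T(n/b) + n^p falls into."""
    crit = math.log(a) / math.log(b)  # critical exponent log_b(a)
    if p < crit - eps:
        return 1  # Theta(n^crit)
    if abs(p - crit) <= eps:
        return 2  # Theta(n^crit * log n)
    return 3      # Theta(n^p)

# Example 1 from the text: T(n) = 9*T(n/3) + n^2  ->  case 2
assert master_case(9, 3, 2) == 2
```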
So that's all about techniques for solving recurrence relations. If you have any doubt regarding this article, drop me a mail at ernstbinoy@gmail.com. You can also follow these links.
- http://net.pku.edu.cn/~course/cs101/2007/resource/Intro2Algorithm/IntroductionToAlgorithm_v2_en.pdf
- http://www.csd.uwo.ca/~moreno//CS424/Ressources/master.pdf
- http://cs.anu.edu.au/student/comp3600/lec_print/alg_lec05_recurrences_print.pdf
- http://www.cs.nthu.edu.tw/~wkhon/algo09/lectures/lecture3.pdf
- http://www.myedupages.com/all-topics/165-fundamentals-of-algorithms/merge-sort/381-the-iteration-method-for-solving-recurrence-relations