
Analysis of Algorithms

Contents
• Analysis of algorithms: frequency count and its importance in the analysis of an algorithm
• Time complexity and space complexity of an algorithm; Big ‘O’, ‘Ω’ and ‘θ’ notations; best, worst and average case analysis of an algorithm
Algorithms
• An algorithm is simply a set of rules for carrying out some task, either by hand or, more usually, on a machine.
• Performance of software depends upon:
  • The algorithm chosen
  • The suitability and efficiency of the various layers of implementation
• An algorithm is independent of the programming language, computer hardware, or any other implementation aspects.
• An algorithm includes:
  • Input
  • Processing
  • Output
Algorithms
• Algorithm Design Tools:
  • Flowchart
  • Pseudo code
• Analysis of Algorithms:
  • The performance of an algorithm depends heavily on the organization of its data.
  • Analysis involves measuring the performance of an algorithm.
• Performance is measured in terms of the following parameters:
  • Time complexity
  • Space complexity
Algorithms
• Space complexity:
  • The amount of computer memory required during program execution, as a function of the input size.
  • Compile time – storage requirement fixed at compile time
    • Variable declarations
  • Run time – storage requirement determined at run time
    • Dynamic allocation
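For example, the contrast below is a minimal C sketch (the array size 100 and the names are only illustrative) of a compile-time requirement versus a run-time requirement:

#include <stdlib.h>

void example(int n)
{
    int fixed[100];                           /* compile-time: size is known before execution */
    int *dynamic = malloc(n * sizeof(int));   /* run-time: size depends on the input n        */
    fixed[0] = n;                             /* trivial uses so the arrays are not unused    */
    dynamic[0] = n;
    free(dynamic);
}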
Algorithms
• Time complexity:
  • The time complexity T(P) of a program P is the time taken by the program, that is, the sum of its compile and execution times.
  • This actual time is system dependent, so instead we count the number of algorithm steps.
Algorithms
Frequency Count:
• Frequency count: the number of times each statement is executed.
• For each statement, determine s/e (steps per execution).
• The frequency of a non-executable statement is zero.
• s/e × frequency = total steps for that statement.
• The summation of the total steps gives the step count for the entire function.
Statement                       s/e   Frequency   Total Steps
int sum (int a[ ], int n)        0        0            0
{                                0        0            0
  int sum = 0;                   1        1            1
  int i;                         0        0            0
  for (i = 0; i < n; i++)        1      n + 1        n + 1
    sum = sum + a[i];            1        n            n
  return sum;                    1        1            1
}                                0        0            0
Total                                               2n + 3

Step count table for addition of array elements


Statement                            s/e   Frequency   Total Steps
int sum (int a[ ], int n)             0        0            0
{                                     0        0            0
  if (n)                              1      n + 1        n + 1
    return sum(a, n-1) + a[n-1];      1        n            n
  return a[0];                        1        1            1
}                                     0        0            0
Total                                                    2n + 2

Step count table for recursive summing function
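The step counts in the first table can also be checked empirically by instrumenting the function with a counter (a minimal sketch; the counter variable and the test array are only illustrative):

#include <stdio.h>

int count = 0;                      /* global step counter                       */

int sum(int a[], int n)
{
    int s = 0, i;
    count++;                        /* int s = 0;            : 1 step            */
    for (i = 0; i < n; i++) {
        count++;                    /* loop test             : runs n times here */
        count++;                    /* s = s + a[i];         : n steps           */
        s = s + a[i];
    }
    count++;                        /* final (false) loop test: 1 more step      */
    count++;                        /* return s;             : 1 step            */
    return s;
}

int main(void)
{
    int a[] = {1, 2, 3, 4, 5};
    sum(a, 5);
    printf("steps = %d\n", count);  /* prints 13 = 2*5 + 3                       */
    return 0;
}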


Asymptotic Notations
• Asymptotic notation is a language that allows us to analyze an algorithm’s running time by describing its behaviour as the input size grows.
• This is also known as the algorithm’s growth rate.
• Does the algorithm suddenly become incredibly slow when the input size grows?
• Does it mostly maintain its quick run time as the input size increases?
Notations
Asymptotic notations describe the behaviour of the time or space complexity for large instance characteristics. There are five types:
1) Big Oh (O)
2) Little Oh (o)
3) Omega (Ω)
4) Little Omega (ω)
5) Theta (θ)
Representation of “Big Oh”:
A function f(n) = O(g(n)) iff there exist positive constants c and n0 such that
f(n) ≤ c * g(n) for all n ≥ n0.

Function or Term     Name
O(1)                 Constant
O(log n)             Logarithmic
O(n)                 Linear
O(n log n)           n log n
O(n²)                Quadratic
O(n³)                Cubic
O(2ⁿ)                Exponential
O(n!)                Factorial
Big Oh notation
• O(g(n)) is the set of all functions whose rate of growth is the same as or lower than that of g(n):
  f(n) ≤ c * g(n) for all n ≥ n0
• g(n) is an asymptotic upper bound for f(n).
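For example, f(n) = 3n + 2 is O(n), because 3n + 2 ≤ 4n for all n ≥ 2; the definition is satisfied with c = 4 and n0 = 2.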


Representation of “Little Oh”:
A function f(n) = o(g(n)) iff
  lim (n → ∞) f(n) / g(n) = 0
• g(n) is an upper bound for f(n) that is not asymptotically tight.
Representation of “Omega”:
A function f(n) = Ω(g(n)) iff there exist positive constants c and n0 such that
f(n) ≥ c * g(n) for all n ≥ n0.
• Ω(g(n)) is the set of all functions whose rate of growth is the same as or higher than that of g(n).
• g(n) is an asymptotic lower bound for f(n).
Representation of “Little Omega”:
A function f(n) = ω(g(n)) iff
  lim (n → ∞) g(n) / f(n) = 0
• g(n) is a lower bound for f(n) that is not asymptotically tight.
Representation of “Theta”:
A function f(n) = θ(g(n)) iff there exist positive constants c1, c2 and n0 such that
c1 * g(n) ≤ f(n) ≤ c2 * g(n) for all n ≥ n0.
• θ(g(n)) is the set of all functions that have the same rate of growth as g(n).
• g(n) is an asymptotically tight bound for f(n).
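For example, for f(n) = 3n + 2 we have 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2, so f(n) = Ω(n) and f(n) = O(n), and therefore f(n) = θ(n) with c1 = 3, c2 = 4 and n0 = 2.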
Relations Between Theta, Oh, Omega
Comparison of Functions
The relation between f and g is analogous to a relation between two numbers a and b:
f(n) = O(g(n))   ≈   a ≤ b
f(n) = Ω(g(n))   ≈   a ≥ b
f(n) = θ(g(n))   ≈   a = b
f(n) = o(g(n))   ≈   a < b
f(n) = ω(g(n))   ≈   a > b
3. Graphical Representation
Time Complexity
• Linear Loops:
  • The loop control variable is added to or subtracted from in each iteration.
  • for (i = 0; i < 1000; i++)
        application code
    f(n) = n
  • for (i = 0; i < 1000; i += 2)
        application code
    f(n) = n/2
  • Plotting the number of iterations against n gives a straight line.
Time Complexity
• Logarithmic Loops:
  • The loop control variable is multiplied or divided in each iteration.
  • for (i = 1; i < 1000; i *= 2)
        application code
    f(n) = log n
  • for (i = 1000; i >= 1; i /= 2)
        application code
    f(n) = log n
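A small C sketch (the bound 1000 comes from the loops above; the counter name is only illustrative) that counts the iterations of a linear loop and a logarithmic loop:

#include <stdio.h>

int main(void)
{
    int i, iters;

    iters = 0;
    for (i = 0; i < 1000; i++)      /* linear loop                          */
        iters++;
    printf("linear: %d\n", iters);  /* prints 1000: f(n) = n                */

    iters = 0;
    for (i = 1; i < 1000; i *= 2)   /* logarithmic loop                     */
        iters++;
    printf("log:    %d\n", iters);  /* prints 10 ≈ log2(1000): f(n) = log n */
    return 0;
}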
Time Complexity
• Nested Loops:
  • Loops inside loops
  • Total iterations = inner loop iterations × outer loop iterations
  • Types:
    • Linear Logarithmic
    • Quadratic
    • Dependent Quadratic
Time Complexity
• Nested Loops:
  • Linear Logarithmic:
    for (i = 0; i < 10; i++)
        for (j = 1; j < 10; j *= 2)
            application code
  • Inner loop = log 10 iterations
  • Outer loop = 10 iterations
  • The inner loop runs once for each iteration of the outer loop
  • Total = 10 log 10
  • f(n) = n log n
Time Complexity
• Nested Loops:
  • Quadratic:
    for (i = 0; i < 10; i++)
        for (j = 0; j < 10; j++)
            application code
  • Inner loop = 10 iterations
  • Outer loop = 10 iterations
  • The inner loop executes the same number of times as the outer loop
  • Total = 10 × 10
  • f(n) = n²
Time Complexity
• Nested Loops:
  • Dependent Quadratic:
    for (i = 0; i < 10; i++)
        for (j = 0; j <= i; j++)
            application code
  • The number of inner-loop iterations depends on the outer-loop index
  • Total = 1 + 2 + … + n = n(n+1)/2
  • f(n) = n(n+1)/2
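A small C sketch (n = 10 as on the slide; the counter is only illustrative) that confirms the dependent quadratic count:

#include <stdio.h>

int main(void)
{
    int i, j, n = 10, iters = 0;
    for (i = 0; i < n; i++)
        for (j = 0; j <= i; j++)    /* inner bound depends on i   */
            iters++;
    printf("%d\n", iters);          /* prints 55 = n(n+1)/2       */
    return 0;
}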
4. Analysis of Searching Algorithm
1. Sequential Search (Linear Search):
Function for Sequential Search:
#include <stdio.h>

void seq_search(int a[], int key, int n)
{
    int i, flag = 0, loc;
    for (i = 0; i < n; i++) {
        if (a[i] == key) {
            loc = i;                /* remember the position of the key */
            flag = 1;
            break;
        }
    }
    if (flag == 1)
        printf("Element found in array at index %d\n", loc);
    else
        printf("Element not present in array\n");
}
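A short driver (hypothetical values) showing how the function above might be called and which cases it exercises:

int main(void)
{
    int a[] = {12, 5, 7, 20, 9};
    seq_search(a, 12, 5);           /* key at index 0: best case, 1 comparison    */
    seq_search(a, 42, 5);           /* key not present: worst case, 5 comparisons */
    return 0;
}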
Time Complexity:
1) Best Case: O(1), the key is at the first position (1 comparison).
2) Worst Case: O(n), the key is at the last position or absent (n key comparisons).
3) Average Case: O(n), about (n+1)/2 key comparisons.
Let Pi = probability that the element being searched for is found at the i-th position;
finding the element at the i-th position requires i comparisons.
Therefore, total comparisons
C = 1*P1 + 2*P2 + 3*P3 + … + n*Pn.
As the element could be found at any location with equal probability, P1 = P2 = … = Pn = 1/n.
So, C = 1/n + 2/n + … + n/n = (1/n)(1 + 2 + … + n)
      = n(n+1)/(2n)
      = (n+1)/2 = O(n)
2. Binary Search: Function for Binary Search:
int search(int a[], int n, int x)
{
    int first = 0, last = n - 1, mid;
    while (first <= last) {
        mid = (first + last) / 2;
        if (a[mid] == x)
            return mid;             /* found: return the index */
        else if (x < a[mid])
            last = mid - 1;         /* search the left half    */
        else
            first = mid + 1;        /* search the right half   */
    }
    return -1;                      /* not found               */
}
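Binary search requires the array to be sorted in ascending order. A short driver (hypothetical values) for the function above:

#include <stdio.h>

int main(void)
{
    int a[] = {3, 9, 14, 21, 35, 48, 60};   /* must already be sorted */
    int pos = search(a, 7, 21);
    if (pos >= 0)
        printf("found at index %d\n", pos);  /* prints: found at index 3 */
    else
        printf("not found\n");
    return 0;
}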
2. Binary Search: Worst Case Behaviour (unsuccessful search):
• Assume the number of array elements is n.
• After 1 comparison, at most n/2 elements remain.
• After 2 comparisons, at most n/2^2 remain.
• After h comparisons, at most n/2^h remain.
• The search ends when at most one element is left:
  n/2^h = 1, or 2^h = n
  i.e. h = log2 n
  so the worst case is O(log2 n)
3. Binary Search: Best Case Behaviour
• The element is found in one comparison, i.e. it is at the middle position.
• Best case complexity = O(1)
3. Binary Search: Average Case Behaviour (successful search)
• A successful search may terminate after one comparison, i.e. success at level 0 (h = 0).
• The probability of a successful search ending at level 1 (h = 1) is twice that of level 0 (h = 0), and so on for deeper levels.
• Finally, since there are at most two comparisons per iteration (pass) of the algorithm, and at most k + 1 passes where k = ceil(log2 n), there are at most 2*(k+1) = 2*(ceil(log2 n) + 1) comparisons for the complete binary search. In other words, the search has a time complexity bound of O(log n).
Analysis of Sorting Algorithms
• Classification of sorting techniques.
• We have studied the following sorts:
1) Insertion Sort
2) Bubble Sort
3) Selection Sort
4) Radix Sort
5) Quick Sort
6) Merge Sort
1) Insertion Sort: Function
void insert(int a[], int n)
{
    int i, j, temp;
    for (i = 1; i < n; i++) {
        temp = a[i];                                     /* element to be inserted   */
        for (j = i - 1; (j >= 0) && (a[j] > temp); j--) {
            a[j + 1] = a[j];                             /* shift larger elements up */
        }
        a[j + 1] = temp;                                 /* insert at its place      */
    }
}
1) Insertion Sort: Analysis
• Best Case: O(n)
• Average Case: O(n²)
• Worst Case: O(n²)
2) Bubble Sort: Function
void bubble(int a[], int n)
{
    int i, j, temp;
    for (i = 1; i < n; i++) {             /* n - 1 passes            */
        for (j = 0; j < n - i; j++) {
            if (a[j] > a[j + 1]) {        /* swap adjacent elements  */
                temp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = temp;
            }
        }
    }
}
2) Bubble Sort: Analysis
• Best Case: O(n²)
• Average Case: O(n²)
• Worst Case: O(n²)
3) Selection Sort: Function
void select(int a[], int n)
{
    int i, j, temp;
    for (i = 0; i < n - 1; i++) {
        for (j = i + 1; j < n; j++) {
            if (a[i] > a[j]) {            /* keep the smallest element at a[i] */
                temp = a[i];
                a[i] = a[j];
                a[j] = temp;
            }
        }
    }
}
3) Selection Sort: Analysis
• Best Case: O(n²)
• Average Case: O(n²)
• Worst Case: O(n²)
Quick Sort: Function
• Pivot
• Partition
• Code:
void quick_sort(int a[], int left, int right)
{
    int i = left, j = right, temp;
    int pivot = a[(left + right) / 2];        /* middle element as pivot    */
    while (i <= j) {
        while (a[i] < pivot) i++;             /* scan from the left         */
        while (a[j] > pivot) j--;             /* scan from the right        */
        if (i <= j) {                         /* swap the out-of-place pair */
            temp = a[i]; a[i] = a[j]; a[j] = temp;
            i++; j--;
        }
    }
    if (left < j)  quick_sort(a, left, j);    /* sort the left partition    */
    if (i < right) quick_sort(a, i, right);   /* sort the right partition   */
}
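A typical call (hypothetical values) sorts the whole array by passing its first and last indices:

#include <stdio.h>

int main(void)
{
    int a[] = {9, 4, 7, 1, 8, 3};
    int i, n = 6;
    quick_sort(a, 0, n - 1);          /* sort positions 0 .. n-1 */
    for (i = 0; i < n; i++)
        printf("%d ", a[i]);          /* prints: 1 3 4 7 8 9     */
    return 0;
}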
Quick Sort: Analysis
T(n) = T(i) + T(n - i - 1) + c * n
• The time to sort the array equals
  • the time to sort the left partition with i elements, plus
  • the time to sort the right partition with n - i - 1 elements, plus
  • the time to build the partitions.

• Average Case: O(n log n)

• Best Case: O(n log n)
1) The pivot is the median of the array, so
2) the left and the right parts have the same size.
3) There are log n levels of partitioning, and to obtain each level we do n comparisons (and not more than n/2 swaps). Hence the complexity is O(n log n).

T(n) = 2T(n/2) + c * n
Divide by n:
T(n) / n = T(n/2) / (n/2) + c
Telescoping:
T(n/2) / (n/2) = T(n/4) / (n/4) + c
T(n/4) / (n/4) = T(n/8) / (n/8) + c
……
T(2) / 2 = T(1) / 1 + c
Add all equations:
T(n)/n + T(n/2)/(n/2) + T(n/4)/(n/4) + … + T(2)/2
  = T(n/2)/(n/2) + T(n/4)/(n/4) + … + T(1)/1 + c * log n
After crossing out the equal terms:
T(n)/n = T(1) + c * log n = 1 + c * log n
T(n) = n + c * n * log n
Therefore T(n) = O(n log n)
Worst case analysis
The pivot is the smallest element. Then one of the partitions is empty, and we repeat the procedure recursively for n - 1 elements.
T(n) = T(n-1) + c*n, n > 1
Telescoping:
T(n-1) = T(n-2) + c*(n-1)
T(n-2) = T(n-3) + c*(n-2)
T(n-3) = T(n-4) + c*(n-3)
……
T(2) = T(1) + c*2
Add all equations:
T(n) + T(n-1) + T(n-2) + … + T(2)
  = T(n-1) + T(n-2) + … + T(2) + T(1) + c*(n + (n-1) + (n-2) + … + 2)
T(n) = T(1) + c*(2 + 3 + … + n)
T(n) = 1 + c*(n(n+1)/2 - 1)
Therefore T(n) = O(n²)
Merge Sort: Function
• Consider separately the left half and the right half of the array.
• Sort them and then merge them.
Function:
void merge_sort(int a[], int left, int right)
{
    if (left < right) {
        int center = (left + right) / 2;
        merge_sort(a, left, center);          /* sort the left half   */
        merge_sort(a, center + 1, right);     /* sort the right half  */
        merge(a, left, center, right);        /* merge the two halves */
    }
}
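The merge step itself is not shown on the slide; a minimal sketch (assuming the signature merge(a, left, center, right) used above and a temporary array) could be:

#include <stdlib.h>

/* Merge the sorted ranges a[left..center] and a[center+1..right]. */
void merge(int a[], int left, int center, int right)
{
    int n = right - left + 1;
    int *b = malloc(n * sizeof(int));         /* temporary buffer          */
    int i = left, j = center + 1, k = 0;

    while (i <= center && j <= right)
        b[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= center) b[k++] = a[i++];      /* copy leftover left half   */
    while (j <= right)  b[k++] = a[j++];      /* copy leftover right half  */
    for (k = 0; k < n; k++)
        a[left + k] = b[k];                   /* copy back into the array  */
    free(b);
}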
Merge Sort: Analysis
Assumption: n is a power of two.
For n = 1 the time is constant (denoted by 1).
Otherwise:
  time to merge sort n elements =
    time to merge sort n/2 elements (done twice) +
    time to merge two arrays of n/2 elements each.
The time to merge two arrays of n/2 elements each is linear, i.e. O(n).
Thus we have:
(a) T(1) = 1
(b) T(n) = 2T(n/2) + n
T(n) = 2T(n/2) + n; divide by n:
1) T(n) / n = T(n/2) / (n/2) + 1
Telescoping: n is a power of two, so we can write
2) T(n/2) / (n/2) = T(n/4) / (n/4) + 1
3) T(n/4) / (n/4) = T(n/8) / (n/8) + 1
……
4) T(2) / 2 = T(1) / 1 + 1
The sum of the left-hand sides equals the sum of the right-hand sides:
T(n)/n + T(n/2)/(n/2) + T(n/4)/(n/4) + … + T(2)/2
  = T(n/2)/(n/2) + T(n/4)/(n/4) + … + T(2)/2 + T(1)/1 + log n
(log n is the sum of the 1’s on the right-hand sides)
After crossing out the equal terms, we get
T(n)/n = T(1)/1 + log n
T(1) is 1, hence we obtain
T(n) = n + n log n
     = O(n log n)

Hence the complexity of the Merge Sort algorithm is O(n log n).
• Best Case: O(n log n)
• Average Case: O(n log n)
• Worst Case: O(n log n)
