Chapter 3: Analysis Tools



Experimental Studies


Limitations of Experiments

Running Time

Theoretical Analysis

Pseudocode

A description of an algorithm that is independent of the hardware of any particular computer and must be translated into real code before it can be run. No strict syntax rules; designed for humans, not for computers.
Example: find max element of an array

Algorithm arrayMax(A, n)
  Input: array A of n integers
  Output: maximum element of A
  currentMax <- A[0]
  for i <- 1 to n−1 do
    if A[i] > currentMax then currentMax <- A[i]
  return currentMax
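As a sketch, the pseudocode above translates almost line for line into Python (the snake_case name is mine, not from the pseudocode):

```python
def array_max(A, n):
    """Return the maximum element among the first n entries of A.

    Direct translation of the arrayMax pseudocode.
    """
    current_max = A[0]            # currentMax <- A[0]
    for i in range(1, n):         # for i <- 1 to n-1 do
        if A[i] > current_max:    #   if A[i] > currentMax then
            current_max = A[i]    #     currentMax <- A[i]
    return current_max            # return currentMax
```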

Pseudocode Details

if…then…[else…]
while…do…
repeat…until…
for…do…

Indentation replaces braces

Algorithm method(arg[, arg…])
Input…
Output…
var.method (arg[, arg…])
return expression
<-  Assignment (like = in C++)
=   Equality testing (like == in C++)
n²  Superscripts and other mathematical formatting allowed

The Random Access Machine (RAM) Model

Primitive Operations

Examples: evaluating an expression, assigning a value to a variable, indexing into an array, calling a method, returning from a method

Counting Primitive Operations

Algorithm arrayMax(A, n)             Number of operations
  currentMax <- A[0]                 2
  for i <- 1 to n−1 do               2 + n
    if A[i] > currentMax then        2(n − 1)
      currentMax <- A[i]             2(n − 1)
    { increment counter i }          2(n − 1)
  return currentMax                  1

                                     Total: 7n − 1
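Summing the per-line counts above gives the closed form 7n − 1; a quick Python sketch (counts hard-coded from the table) confirms the arithmetic:

```python
def array_max_ops(n):
    """Worst-case primitive-operation count for arrayMax, per the table above."""
    return (2                # currentMax <- A[0]
            + (2 + n)        # for i <- 1 to n-1 do
            + 2 * (n - 1)    # if A[i] > currentMax
            + 2 * (n - 1)    # currentMax <- A[i]  (worst case: runs every iteration)
            + 2 * (n - 1)    # increment counter i
            + 1)             # return currentMax

# The sum matches 7n - 1 for every n:
assert all(array_max_ops(n) == 7 * n - 1 for n in range(1, 100))
```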

Estimating Running Time

a = time taken by the fastest primitive operation
b = time taken by the slowest primitive operation

a(7n − 1) ≤ T(n) ≤ b(7n − 1)

Growth Rate of Running Time

Changing the hardware/software environment affects T(n) by a constant factor, but does not alter its growth rate
The linear growth rate of the running time T(n) is an intrinsic property of algorithm arrayMax

Growth Rates

Function   n = 1   n = 2   n = 10    n = 100   n = 1000
5          5       5       5         5         5
log n      0       1       3.32      6.64      9.97
n          1       2       10        100       1000
n log n    0       2       33.2      664       9966
n²         1       4       100       10⁴       10⁶
n³         1       8       1000      10⁶       10⁹
2ⁿ         2       4       1024      10³⁰      10³⁰⁰
n!         1       2       3628800   10¹⁵⁷     10²⁵⁶⁷
nⁿ         1       4       10¹⁰      10²⁰⁰     10³⁰⁰⁰
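The table's entries can be regenerated numerically; a sketch in Python (using base-2 logarithms, as the table does; the function name is mine):

```python
import math

def growth(n):
    """Evaluate the growth-rate functions from the table at a given n."""
    return {
        "5": 5,
        "log n": math.log2(n),
        "n": n,
        "n log n": n * math.log2(n),
        "n^2": n ** 2,
        "n^3": n ** 3,
        "2^n": 2 ** n,
        "n!": math.factorial(n),
        "n^n": n ** n,
    }

row = growth(10)  # e.g. log 10 ≈ 3.32, n log n ≈ 33.2
```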

Constant Factors

The growth rate is not affected by constant factors or lower-order terms
Examples: 10²n + 10⁵ is a linear function; 10⁵n² + 10⁸n is a quadratic function


Asymptotic Notation

Big-Oh

Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c > 0 and N > 0 such that f(n) ≤ c g(n) for all n ≥ N.

Example 1: 2n + 10 is O(n)
2n + 10 ≤ cn; (c − 2)n ≥ 10; n ≥ 10/(c − 2)
Pick c = 3 and N = 10.

Example 2: the function n² is not O(n)
n² ≤ cn would require n ≤ c
The inequality cannot hold for all large n, since c must be a constant

Example 3: 7n − 2 is O(n)
need c > 0 and N ≥ 1 such that 7n − 2 ≤ cn for n ≥ N
this is true for c = 7 and N = 1

Example 4: 3n³ + 20n² + 5 is O(n³)
need c > 0 and N ≥ 1 such that 3n³ + 20n² + 5 ≤ cn³ for n ≥ N
this is true for c = 4 and N = 21

Example 5: 3 log n + log log n is O(log n)
need c > 0 and N ≥ 1 such that 3 log n + log log n ≤ c log n for n ≥ N
this is true for c = 4 and N = 2
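Each example above exhibits witnesses c and N. A small checker (a sketch; it tests finitely many n, so it gives numeric evidence rather than a proof, and the function name is mine) verifies the witnesses:

```python
import math

def check_big_oh(f, g, c, N, upto=10_000):
    """Check f(n) <= c*g(n) for all n in [N, upto)."""
    return all(f(n) <= c * g(n) for n in range(N, upto))

# Witnesses from Examples 1, 3, 4 and 5:
assert check_big_oh(lambda n: 2 * n + 10, lambda n: n, c=3, N=10)
assert check_big_oh(lambda n: 7 * n - 2, lambda n: n, c=7, N=1)
assert check_big_oh(lambda n: 3 * n**3 + 20 * n**2 + 5, lambda n: n**3, c=4, N=21)
assert check_big_oh(lambda n: 3 * math.log2(n) + math.log2(math.log2(n)),
                    lambda n: math.log2(n), c=4, N=2)
```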

Big-Oh Rules

1. Drop lower-order terms
2. Drop constant factors
Say “2n is O(n)” instead of “2n is O(n²)”
Say “3n + 5 is O(n)” instead of “3n + 5 is O(3n)”

Asymptotic Algorithm Analysis

1. We find the worst-case number of primitive operations executed as a function of the input size
2. We express this function with big-Oh notation
Example:
We determine that algorithm arrayMax executes at most 7n − 1 primitive operations
We say that algorithm arrayMax “runs in O(n) time”

Computing Prefix Averages

The i-th prefix average of array X is the average of its first (i + 1) elements:
        A[i] = (X[0] + X[1] + … + X[i])/(i + 1)
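Applying the definition directly, each A[i] is a prefix sum divided by i + 1; a minimal Python sketch (function name is mine):

```python
def prefix_average(X, i):
    """A[i] = (X[0] + X[1] + ... + X[i]) / (i + 1), straight from the definition."""
    return sum(X[: i + 1]) / (i + 1)

X = [2, 4, 6, 8]
averages = [prefix_average(X, i) for i in range(len(X))]  # [2.0, 3.0, 4.0, 5.0]
```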

Prefix Averages (Quadratic)

The following algorithm computes prefix averages in quadratic time by applying the definition:
Algorithm prefixAverages1(X, n)            # operations
  Input: array X of n integers
  Output: array A of prefix averages of X
  A <- new array of n integers             n
  for i <- 0 to n−1 do                     n
    s <- X[0]                              n
    for j <- 1 to i do                     1 + 2 + … + (n−1)
      s <- s + X[j]                        1 + 2 + … + (n−1)
    A[i] <- s/(i + 1)                      n
  return A                                 1

Arithmetic Progression

        1 + 2 + … + n = n(n + 1)/2, so the running time of prefixAverages1 is O(1 + 2 + … + n) = O(n²)
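A direct Python translation of prefixAverages1 (a sketch; the nested loop that recomputes each prefix sum from scratch is what makes it quadratic):

```python
def prefix_averages1(X, n):
    """Quadratic-time prefix averages, per the prefixAverages1 pseudocode."""
    A = [0.0] * n
    for i in range(n):                # for i <- 0 to n-1 do
        s = X[0]                      #   s <- X[0]
        for j in range(1, i + 1):     #   for j <- 1 to i do
            s = s + X[j]              #     s <- s + X[j]
        A[i] = s / (i + 1)            #   A[i] <- s/(i+1)
    return A
```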

Prefix Averages (Linear)

The following algorithm computes prefix averages in linear time by keeping a running sum

Algorithm prefixAverages2(X, n)            # operations
  Input: array X of n integers
  Output: array A of prefix averages of X
  A <- new array of n integers             n
  s <- 0                                   1
  for i <- 0 to n−1 do                     n
    s <- s + X[i]                          n
    A[i] <- s/(i + 1)                      n
  return A                                 1

Algorithm prefixAverages2 runs in O(n) time
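The corresponding Python sketch of prefixAverages2 maintains a single running sum, and agrees with the definition on every input:

```python
def prefix_averages2(X, n):
    """Linear-time prefix averages: keep a running sum s."""
    A = [0.0] * n
    s = 0                         # s <- 0
    for i in range(n):            # for i <- 0 to n-1 do
        s = s + X[i]              #   s <- s + X[i]
        A[i] = s / (i + 1)        #   A[i] <- s/(i+1)
    return A

assert prefix_averages2([2, 4, 6, 8], 4) == [2.0, 3.0, 4.0, 5.0]
```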


Relatives of Big-Oh

Big-Omega
f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant N > 0 such that f(n) ≥ c g(n) for n ≥ N.

Big-Theta
f(n) is Θ(g(n)) if there are constants c′ > 0 and c″ > 0 and an integer constant N > 0 such that c′g(n) ≤ f(n) ≤ c″g(n) for n ≥ N.

little-oh
f(n) is o(g(n)) if, for any constant c > 0, there is an integer constant N > 0 such that f(n) < c g(n) for n > N.

little-omega
f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant N > 0 such that f(n) > c g(n) for n > N.

Intuition for Asymptotic Notation

Big-Oh
f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)

big-Omega
f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)

big-Theta
f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)

little-oh
f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)

little-omega
f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)

Example Uses of the Relatives of Big-Oh

Example 1: 5n² is Ω(n²)
f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant N ≥ 1 such that f(n) ≥ c g(n) for n ≥ N
let c = 5 and N = 1

Example 2: 5n² is Ω(n)
f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant N ≥ 1 such that f(n) ≥ c g(n) for n ≥ N
let c = 1 and N = 1, since 5n² ≥ n for n ≥ 1

Example 3: 5n² is ω(n)
f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant N ≥ 1 such that f(n) ≥ c g(n) for n ≥ N
need 5N² ≥ cN; given c, any N ≥ c/5 satisfies the inequality
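Example 3's argument can be spot-checked numerically: for any c, taking N = ⌈c/5⌉ makes 5n² ≥ cn hold from N onward. A sketch (the helper name is mine; finitely many n checked, so evidence rather than proof):

```python
import math

def omega_witness_N(c):
    """For f(n) = 5n^2 and g(n) = n: an N with 5n^2 >= c*n for all n >= N."""
    return max(1, math.ceil(c / 5))

# 5n^2 >= c*n reduces to n >= c/5, so the chosen N works for any c:
for c in (1, 5, 17, 1000):
    N = omega_witness_N(c)
    assert all(5 * n * n >= c * n for n in range(N, N + 1000))
```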