• asymptotic notation:
    • O: Big Oh
    • Ω: Big Omega
  • asymptotic notation is used for:
    • assessing algorithm performance on large inputs
    • assessing an algorithm’s success probability
  • performance is assessed along the lines of
    • running time of algorithm (time complexity)
    • analogously, memory usage by algorithm (space complexity)
  • success probability of an algorithm is explored in terms of
    • negligible
    • noticeable
    • overwhelming

case study:

need for asymptotic notation


problem statement

  • consider the problem of reading integers from
    • a file whose number of lines is not known
    • a standard input stream
  • the goal is to read these integers into numpy arrays
    • numpy arrays are of fixed size, so growing one means allocating a new array and copying the old elements over
      • this constraint is used to illustrate the need for asymptotic notation

study process

  • two algorithms are studied
    1. incremental array size increase
    2. doubling array size as array spaces fill up

algorithm schemes
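A rough sketch of the two schemes in Python with numpy (the notes do not include the original code; the function names and int64 dtype are illustrative assumptions):

```python
import sys
import numpy as np

def read_incremental(stream):
    """Scheme 1: grow the array by one slot per line (quadratic total copying)."""
    arr = np.empty(0, dtype=np.int64)
    for line in stream:
        new = np.empty(arr.size + 1, dtype=np.int64)  # allocate one extra slot
        new[:arr.size] = arr                          # copy every element already read
        new[-1] = int(line)
        arr = new
    return arr

def read_doubling(stream):
    """Scheme 2: double the capacity whenever the array fills (linear total copying)."""
    arr = np.empty(1, dtype=np.int64)
    used = 0
    for line in stream:
        if used == arr.size:                          # out of space: double the capacity
            new = np.empty(2 * arr.size, dtype=np.int64)
            new[:used] = arr
            arr = new
        arr[used] = int(line)
        used += 1
    return arr[:used]                                 # trim the unused tail

# e.g. read_doubling(sys.stdin) for the standard-input variant
```

Across all reallocations the doubling scheme copies fewer than 2n elements in total (1 + 2 + 4 + … < 2n), which is where its linear bound comes from.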

analytical time assessment

  • for reading n lines of integers
    • with the remaining symbols denoting time constants for the other parts of each algorithm,
  • the time taken by each algorithm is analytically derived to be

algorithm 1: $T_1(n) = nd + \frac{n(n-1)}{2}c + e$

algorithm 2: $T_2(n) < (4C + D)\,n + A$

  • for a large enough number of lines of input, only the terms of the highest order matter, so
    • $T_1(n) \approx 0.00013\,n^2$ ms
    • $T_2(n) < 0.0021\,n$ ms
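Where the quadratic term comes from (a reconstruction from the formula above, reading d as the per-line read cost, c as the per-element copy cost, and e as a fixed overhead): appending the i-th integer in scheme 1 first copies the i − 1 integers already stored, so across n appends the total number of copies is

$$\sum_{i=1}^{n} (i - 1) = \frac{n(n-1)}{2}$$

which contributes the $\frac{n(n-1)}{2}c$ term.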

payoff noticed

  • even after conceding a sizable disparity in the constants, the second algorithm would outperform the first algorithm
    • roughly 2 seconds (2nd algorithm) versus 22 minutes (1st algorithm)
  • the first algorithm’s time complexity is quadratic (parabolic) for large inputs
    • while the second one’s is linear
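A quick sanity check on those figures (added; the input size is not stated in the notes, so assume roughly $n = 10^5$ lines and a tenfold constant penalty against the second algorithm):

$$T_1(10^5) \approx 0.00013 \times (10^5)^2 \text{ ms} = 1.3 \times 10^6 \text{ ms} \approx 22 \text{ minutes}$$

$$10 \times T_2(10^5) < 10 \times 0.0021 \times 10^5 \text{ ms} = 2100 \text{ ms} \approx 2 \text{ seconds}$$

which matches the quoted 2-seconds-versus-22-minutes gap.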

Big Oh notation

  • Big Oh, denoted by O, is a notation for understanding the time complexity of an algorithm
    • expresses an upper bound on an algorithm’s running time
    • as the number of inputs tends to infinity
    • hence the name asymptotic notation
  • in algorithm #2 above, the upper bound for the runtime is given by
    • $T_2(n) < 0.0021\,n$ ms
    • and this holds only when $n > 150$
  • this time upper bound of algorithm #2 for $n > 150$ is expressed in Big Oh notation as $T_2 \in O(n)$
    • this is the same as saying “for sufficiently large n, $T_2(n) < d \cdot n$”
    • where
      • d is some constant
      • $T_2(n)$ is the time taken by algorithm #2 to process n lines
  • Big Oh is a measure of how the time taken by an algorithm scales as the number of inputs increases, for large sets of inputs

formally

  • assume two positive functions f and g
    • g is the upper bound for f’s runtime
  • the notation $f \in O(g)$ (f belongs to Big Oh of g) means that there exist
    • a constant $c > 0$
    • a threshold $n_0$
    • such that $f(n) \leq c \cdot g(n)$ for all $n \geq n_0$
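A worked instance of the definition (added example): take $f(n) = 3n^2 + 5n$ and $g(n) = n^2$. Since $5n \leq n^2$ whenever $n \geq 5$,

$$3n^2 + 5n \leq 3n^2 + n^2 = 4n^2 \quad \text{for all } n \geq 5,$$

so the witnesses $c = 4$ and $n_0 = 5$ show $3n^2 + 5n \in O(n^2)$.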

mathematical assumptions for Big Oh

  • functions are positive
    • i.e. functions whose values are positive real numbers
  • function inputs are natural numbers
    • considered from some starting point $n_0$ onward
  • the function (f(n)) we are interested in is either
    • the amount of running time (of the algorithm for a given input size)
    • the amount of memory consumed (by the algorithm for a given input size)
  • g(n) specifies the upper limit

  • ignore values of f and g for small n

  • ignore constant factors in the bounding function in the notation

criteria for Big Oh

criterion 1

  • given positive functions f and g
    • $f \in O(g)$ if and only if the ratio $\frac{f(n)}{g(n)}$ is bounded for sufficiently large n
    • i.e. there exist $c$ and $n_0$ such that $\frac{f(n)}{g(n)} \leq c$ for all $n \geq n_0$

criterion 2

  • given positive functions f and g
    • if $\lim_{n \to \infty} \frac{f(n)}{g(n)}$ exists
    • then $f \in O(g)$ if and only if $\lim_{n \to \infty} \frac{f(n)}{g(n)} < \infty$


  • in some cases, criterion 2 cannot be applied since the limit doesn’t exist
    • in such cases, check directly that the ratio $\frac{f(n)}{g(n)}$ is bounded for sufficiently large n
      • i.e. fall back to criterion 1; a ratio can stay bounded even while oscillating, as shown below
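For instance (added example): with $f(n) = (2 + (-1)^n)\,n$ and $g(n) = n$, the ratio $\frac{f(n)}{g(n)} = 2 + (-1)^n$ oscillates between 1 and 3 forever, so the limit in criterion 2 does not exist; but the ratio is bounded by $c = 3$ for all $n$, so criterion 1 still gives $f \in O(g)$.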

Big Oh lemma

general lemma

  • if $f \in O(g)$ and $a > 0$
    • then $af \in O(g)$
  • if $f_1 \in O(g_1)$ and $f_2 \in O(g_2)$
    • then $f_1 + f_2 \in O(g_1 + g_2)$
    • also, $f_1 f_2 \in O(g_1 g_2)$
  • if $f_1, f_2 \in O(g)$
    • then $f_1 + f_2 \in O(g)$
  • if $f_1 \in O(f_2)$ and $f_2 \in O(g)$
    • then $f_1 \in O(g)$
  • if $g_1 \in O(g_2)$
    • then $g_1 + g_2 \in O(g_2)$
  • if $f \in O(g)$ and $\lim_{n \to \infty} b(n) = \infty$
    • then $\sum_{i=1}^{b(n)} f(i) \in O\left(\sum_{i=1}^{b(n)} g(i)\right)$
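A small worked application of these lemmas (added example): to show $3n^2 + 7n + 1 \in O(n^2)$, note $n \in O(n^2)$ and $1 \in O(n^2)$ (both ratios are bounded), hence $7n \in O(n^2)$ and $3n^2 \in O(n^2)$ by the scaling lemma, and then two applications of the $f_1, f_2 \in O(g) \Rightarrow f_1 + f_2 \in O(g)$ lemma give the result.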

common functions lemma

  • $p(n) \in O(n^k)$
    • if p is a polynomial of degree k
  • $n^k \in O(a^n)$ for any constant real numbers $a > 1$ and $k > 0$

  • $(\log_a n)^k \in O(n^l)$ for any constant real numbers $a > 1$, $k > 0$ and $l > 0$

  • $\log_a n \in O(\log_b n)$ for any constant real numbers $a > 1$ and $b > 1$

  • $a^n \in O(b^n)$ for any constant real numbers $b \geq a > 0$
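Why the base of the logarithm never matters (added derivation of the fourth item): by the change-of-base formula, $\log_a n = \frac{\log_b n}{\log_b a}$, i.e. a constant multiple of $\log_b n$, so $c = \frac{1}{\log_b a}$ witnesses $\log_a n \in O(\log_b n)$.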

Big Omega notation

  • Big Oh notation expresses the time taken by an algorithm with a
    • “not more than” philosophy
    • i.e. an upper bound for the time taken
    • used to assess and compare the speeds of algorithms
  • Big Omega notation expresses the time taken by an algorithm with an
    • “at least so much” philosophy
    • i.e. a lower bound for the time taken
    • used in security to ensure encryption algorithms take at least so much time to break

formally

  • assume two positive functions f and g
    • g is the lower bound for f’s runtime
    • f grows asymptotically no slower than g
  • the notation $f \in \Omega(g)$ (f belongs to Big Omega of g) means that there exist
    • a constant $c > 0$
    • a threshold $n_0$
    • such that $f(n) \geq c \cdot g(n)$ for all $n \geq n_0$
      • note the $\geq$ here, versus $f(n) \leq c \cdot g(n)$ for Big Oh notation
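A worked instance (added example, reusing the case study’s copy count): $f(n) = \frac{n(n-1)}{2}$ satisfies $f(n) \geq \frac{1}{4}n^2$ for all $n \geq 2$ (since $n - 1 \geq \frac{n}{2}$ there), so $c = \frac{1}{4}$ and $n_0 = 2$ witness $f \in \Omega(n^2)$.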

criteria for Big Omega

  • given positive functions f and g
    • if $\lim_{n \to \infty} \frac{f(n)}{g(n)}$ exists
    • then $f \in \Omega(g)$ if and only if $\lim_{n \to \infty} \frac{f(n)}{g(n)} > 0$
      • unlike for Big Oh notation:
        • $f \in O(g)$ if and only if $\lim_{n \to \infty} \frac{f(n)}{g(n)} < \infty$

Big Oh and Big Omega relationship

  • they are tightly related
    • $g \in \Omega(f)$ if and only if $f \in O(g)$
  • keeping this in mind, the lemmas of Big Oh can be extended to Big Omega
    • with the appropriate mathematical modifications
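A one-line proof sketch of the duality (added): if $f(n) \leq c \cdot g(n)$ for all $n \geq n_0$ with $c > 0$, then dividing by $c$ gives $g(n) \geq \frac{1}{c} f(n)$ for the same $n$, so the constant $\frac{1}{c} > 0$ and the same $n_0$ witness $g \in \Omega(f)$; the converse direction is symmetric.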
