
Let's Understand Big Omega, Big Theta, Small Oh and Small Omega


The efficiency of an algorithm is measured in terms of space usage and time consumption. An algorithm is considered efficient if it consumes less space and takes less time to execute, although in practice both goals usually cannot be achieved simultaneously. In this tutorial, we will discuss techniques of complexity analysis along with methods of measuring efficiency.



The Efficiency of an Algorithm


A problem may have many algorithmic solutions. Among them, the one that performs most efficiently, according to the metrics used for comparing algorithms, is chosen.


The complexity of an algorithm is represented by a function that describes its efficiency in terms of time or memory consumption. These measurements are known as time complexity and space complexity, and they are discussed below.


Time complexity

Time complexity does not represent the actual (wall-clock) running time of the algorithm. Rather, it gives a clear understanding of how the algorithm's time consumption grows with the size of the input. In practice, time complexity is measured by counting basic operations, such as comparisons of data and memory accesses, since these determine the actual running time of the algorithm.
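
To make this concrete, here is a minimal Python sketch (illustrative only, not taken from any particular library) that counts comparisons in a linear search. The comparison count, rather than wall-clock time, is the quantity that time complexity describes.

  def linear_search(data, target):
      """Return (index, comparisons); index is -1 if target is absent."""
      comparisons = 0
      for i, value in enumerate(data):
          comparisons += 1              # one data comparison per element visited
          if value == target:
              return i, comparisons
      return -1, comparisons

  # Worst case: the target is absent, so all n elements are compared.
  for n in (10, 100, 1000):
      _, count = linear_search(list(range(n)), -1)
      print(f"n = {n:4d} -> {count} comparisons")   # grows linearly with n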


Space complexity

Space complexity represents how much additional memory the algorithm requires as a function of the input size; it does not include the memory needed to store the input data itself. Although it could be expressed in natural units such as bytes, it is usually stated as a number of elements rather than as actual memory consumption.
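
For illustration, the two hypothetical functions below (a minimal sketch, not a library API) both reverse a list: the first builds a second list of n elements and therefore needs O(n) auxiliary space, while the second swaps elements within the input and needs only O(1) auxiliary space.

  def reverse_copy(data):
      # Builds a brand-new list of n elements: O(n) auxiliary space.
      result = []
      for value in reversed(data):
          result.append(value)
      return result

  def reverse_in_place(data):
      # Swaps within the input list itself: O(1) auxiliary space.
      left, right = 0, len(data) - 1
      while left < right:
          data[left], data[right] = data[right], data[left]
          left += 1
          right -= 1
      return data

  print(reverse_copy([1, 2, 3, 4]))      # [4, 3, 2, 1], extra list allocated
  print(reverse_in_place([1, 2, 3, 4]))  # [4, 3, 2, 1], input modified directly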


In most cases, developers do not consider space complexity, but it is no less important than time complexity. Sometimes it becomes so important that developers must compromise on time complexity to save space.


For both time and space, designers are interested in asymptotic complexity, which describes the performance of the algorithm when the input size is very large.



Asymptotic Notations


Instead of measuring actual time or memory consumption, asymptotic notation estimates the time or space consumption of an algorithm as a function of the size of the input data.


The asymptotic behaviour of a function f(n) describes how f(n) grows as n increases. One algorithm performs better than another if its complexity function has a slower asymptotic growth rate.
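
The short Python sketch below (with arbitrarily chosen example functions 100 * n, n * log2(n), and n²) shows why the growth rate, not the constant factor, decides the comparison for large inputs: 100 * n starts out larger than n², but n² overtakes it once n exceeds 100.

  import math

  for n in (10, 100, 1000, 10_000):
      print(f"n = {n:6d}   100n = {100 * n:10d}   "
            f"n*log2(n) = {n * math.log2(n):12.1f}   n^2 = {n * n:12d}")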


Big-Oh (O): Asymptotic upper bound

A function f(n) is in the order of g(n), i.e. f(n) ∈ O(g(n)), if there exist positive constants C and n0 such that:


f(n) ≤ C * g(n), for all n ≥ n0

Example


Say, f(n) = 3 * n + 2

  3 * n + 2 ≤ 4 * n, if n ≥ 2
  f(n) ≤ C * g(n)
  so f(n) ∈ O(n)

Here, f(n) = 3 * n + 2 satisfies the inequality 3 * n + 2 ≤ 4 * n
  with C = 4 and n0 = 2.
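
This bound can also be spot-checked numerically. The Python sketch below (illustrative only, over an arbitrary finite range of n) verifies the inequality; of course, a finite check merely illustrates the bound, while the algebra above proves it for all n ≥ n0.

  def f(n):
      return 3 * n + 2

  C, n0 = 4, 2
  # Check f(n) <= C * n for every n in a finite test range starting at n0.
  assert all(f(n) <= C * n for n in range(n0, 10_000))
  print("f(n) <= 4 * n holds for all tested n >= 2, so f(n) is in O(n)")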

Big-Omega (Ω): Asymptotic lower bound

A function f(n) is in the order of g(n), i.e. f(n) ∈ Ω(g(n)), if there exist positive constants C and n0 such that:


f(n) ≥ C * g(n) for all n ≥ n0

Example


Say, f(n) = 3 * n + 2
  3 * n + 2 ≥ 3 * n, if n ≥ 1
  f(n) ≥ C * g(n)
  so, f(n) ∈ Ω(n)

Here, f(n) = 3 * n + 2 satisfies the inequality 3 * n + 2 ≥ 3 * n
  with C = 3 and n0 = 1.
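
The same style of numerical spot-check (again an illustrative sketch over a finite range) works for the lower bound:

  def f(n):
      return 3 * n + 2

  C, n0 = 3, 1
  # Check f(n) >= C * n for every n in a finite test range starting at n0.
  assert all(f(n) >= C * n for n in range(n0, 10_000))
  print("f(n) >= 3 * n holds for all tested n >= 1, so f(n) is in Omega(n)")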

Big-Theta (Θ): Asymptotic tight bound

A function f(n) is in the order of g(n), i.e. f(n) ∈ Θ(g(n)), if there exist positive constants C1, C2, and n0 such that:


C1 * g(n) ≤ f(n) ≤ C2 * g(n), for all n ≥ n0

Example


Say, f(n) = 3 * n + 2
  3 * n ≤ 3 * n + 2 ≤ 4 * n, if n ≥ 2
  C1 * g(n) ≤ f(n) ≤ C2 * g(n)
  so, f(n) ∈ Θ(n)
   where C1 = 3, C2 = 4, and n0 = 2.
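
Combining the two previous checks gives a spot-check of the tight bound (again only an illustration over a finite range, not a proof):

  def f(n):
      return 3 * n + 2

  C1, C2, n0 = 3, 4, 2
  # Check C1 * n <= f(n) <= C2 * n for every n in a finite test range.
  assert all(C1 * n <= f(n) <= C2 * n for n in range(n0, 10_000))
  print("3 * n <= f(n) <= 4 * n holds for all tested n >= 2, so f(n) is in Theta(n)")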

Small-Oh (o)

The O-notation (asymptotic upper bound) may or may not be asymptotically tight. For example, the bound f(n) = 4n² = O(n²) is asymptotically tight, but f(n) = 4n = O(n²) is not.


The o-notation is used to denote an upper bound of a function that is not asymptotically tight.


Formally, o(g(n)) is defined as


o(g(n)) = {f(n) : for any positive constant c > 0,
  there exists a constant n0 > 0
  such that 0 ≤ f(n) < c * g(n) for all n ≥ n0}.

In o-notation, the function f(n) becomes insignificant relative to g(n) as n approaches infinity, i.e.



lim (n → ∞) f(n) / g(n) = 0
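
Using the non-tight example from above, f(n) = 4n and g(n) = n², a short sketch (illustrative only) shows the ratio f(n)/g(n) shrinking toward 0, which is exactly the o-notation condition:

  def f(n):
      return 4 * n       # the non-tight example from above

  def g(n):
      return n * n

  for n in (10, 100, 1000, 10_000):
      print(f"n = {n:6d}   f(n)/g(n) = {f(n) / g(n):.6f}")   # tends to 0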

Small-Omega (ω)

The ω-notation denotes a lower bound of a function that is not asymptotically tight. It can be defined as


f(n) ∈ ω(g(n)), if and only if g(n) ∈ o(f(n))

Formally, ω is defined as


ω(g(n)) = {f(n) : for any positive constant c > 0,
  there exists a constant n0 > 0 such that
  0 ≤ c * g(n) < f(n) for all n ≥ n0}

That is, f(n) becomes arbitrarily large with respect to g(n) as n approaches infinity.


lim (n → ∞) f(n) / g(n) = ∞
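
Mirroring the previous sketch with the roles reversed, taking f(n) = n² and g(n) = 4n (hypothetical example functions) shows the ratio growing without bound, so n² ∈ ω(n):

  def f(n):
      return n * n

  def g(n):
      return 4 * n

  for n in (10, 100, 1000, 10_000):
      print(f"n = {n:6d}   f(n)/g(n) = {f(n) / g(n):.1f}")   # tends to infinity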
