Asymptotic Analysis

The Mathematical Definition:

Asymptotic behavior is a mathematical concept that describes how a function behaves as the input (or independent variable) approaches infinity.

It sounds too nerdy because it is.

What does asymptotic analysis of an algorithm mean?

It simply provides a sweet spot for high-level reasoning about algorithms: coarse enough to ignore machine-specific details, yet precise enough to compare algorithms meaningfully.

High-Level Idea

Suppress constant factors and lower-order terms

  • constant factors - these are system-dependent, e.g., they vary with the computer's architecture

  • lower-order terms - these are irrelevant for large inputs

  • e.g., equate 6n·log₂n + 6n with just n·log n
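The suppression above can be sanity-checked numerically. This is a quick sketch (the function names `exact` and `simplified` are my own, not from the article): as n grows, the ratio of 6n·log₂n + 6n to n·log₂n settles toward the constant 6, so the two expressions differ only by a constant factor for large n.

```python
import math

def exact(n):
    # the full expression: 6n*log2(n) + 6n
    return 6 * n * math.log2(n) + 6 * n

def simplified(n):
    # the simplified form with constants and lower-order terms dropped
    return n * math.log2(n)

for n in [10, 1_000, 1_000_000]:
    # the ratio shrinks toward the constant 6 as n grows
    print(n, exact(n) / simplified(n))
```
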

This is where asymptotic notation comes in: symbols such as Big-O, Big-Omega (Ω), Big-Theta (Θ), and little-o formalize these bounds.

The Big O

An UPSHOT on Big O, What does it semantically mean?

  • Think of Big O as describing the runtime of an algorithm with the constants and lower-order terms suppressed

  • The above is just the general idea of Big-O notation. Let's deep dive:

Mathematical Definition:

T(n) is said to be Big-O of a function f, written T(n) = O(f(n)), if there exist two constants

c and n0 such that for all

$$n \ge n_0$$

$$T(n) \le cf(n)$$

Here, c is a constant factor, and the above inequality forms an upper bound

Graphical Representation

Example: let c = 2

Here we can use a simple game-like methodology: you try to show that your chosen c and n0 make c·f(n) an upper bound on T(n) for all n ≥ n0.
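The game above can be played numerically. A minimal sketch, assuming T(n) = 6n·log₂n + 6n and f(n) = n·log₂n from earlier (the helper name `is_upper_bound` is my own): pick c and n0, then check that T(n) ≤ c·f(n) holds on a range of sampled inputs.

```python
import math

def T(n):
    # the running time from the earlier example
    return 6 * n * math.log2(n) + 6 * n

def f(n):
    # the candidate bounding function
    return n * math.log2(n)

def is_upper_bound(c, n0, samples):
    # does T(n) <= c*f(n) hold for every sampled n >= n0?
    return all(T(n) <= c * f(n) for n in samples if n >= n0)

# c = 10, n0 = 4 wins the game on these samples:
print(is_upper_bound(10, 4, range(2, 10_000)))
```

Note that c = 6 would lose: T(n) = 6·f(n) + 6n always exceeds 6·f(n), which is exactly why the constant c must absorb the lower-order term.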

Big Omega(Ω)

An Upshot on Big Omega:

  • If a running time is Ω(f(n)), then for large enough n, the running time is at least c⋅f(n) for some constant c.

  • That is, for all inputs beyond some threshold n0, the running time is at least c·f(n); this is what Omega denotes.

Mathematically:

T(n) is said to be Big-Omega of a function f, written T(n) = Ω(f(n)), if there exist two constants

c and n0 such that for all:

$$n \ge n_0$$

$$ T(n) \ge cf(n)$$

Here, c is a constant factor, and the above inequality forms a lower bound

Graphical Representation:

Here we can use a similar game-like methodology: you try to show that your chosen c and n0 make c·f(n) a lower bound on T(n) for all n ≥ n0.
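The lower-bound game is the mirror image of the upper-bound one. A minimal sketch, again assuming T(n) = 6n·log₂n + 6n and f(n) = n·log₂n (the helper name `is_lower_bound` is my own):

```python
import math

def T(n):
    # the running time from the earlier example
    return 6 * n * math.log2(n) + 6 * n

def f(n):
    # the candidate bounding function
    return n * math.log2(n)

def is_lower_bound(c, n0, samples):
    # does T(n) >= c*f(n) hold for every sampled n >= n0?
    return all(T(n) >= c * f(n) for n in samples if n >= n0)

# c = 6, n0 = 2 works: T(n) = 6*f(n) + 6n is always at least 6*f(n)
print(is_lower_bound(6, 2, range(2, 10_000)))
```
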

Difference between Big-O and Big-Omega:

The difference between Big O and Big Ω notation is that Big O provides an upper bound on a running time, while Big Ω provides a lower bound. In practice, Big O is most often quoted for an algorithm's worst case and Big Ω for a guaranteed minimum, but strictly speaking both are bounds on a function and can be applied to the worst case, best case, or any other measure.

Big Theta(θ)

An Upshot on Big Theta

  • Big Theta pins the running time down to within constant factors: it applies when the upper and lower bounds match.

  • It sandwiches T(n) between the lower bound (Big Omega) and the upper bound (Big O).

Mathematically:

T(n) is said to be Big-Theta of f(n) precisely when:

$$T(n) = O(f(n)) \: and\: T(n) =\Omega(f(n))$$

i.e., there exist constants c1, c2, and n0

such that for all

$$n \ge n_0$$

$$ c_1f(n) \le T(n) \le c_2f(n)$$
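Combining the two games gives a Theta check. A minimal sketch, assuming the same T(n) = 6n·log₂n + 6n and f(n) = n·log₂n as before (the helper name `is_theta` is my own): one constant bounds T from below, another from above.

```python
import math

def T(n):
    # the running time from the earlier example
    return 6 * n * math.log2(n) + 6 * n

def f(n):
    # the candidate bounding function
    return n * math.log2(n)

def is_theta(c1, c2, n0, samples):
    # does c1*f(n) <= T(n) <= c2*f(n) hold for every sampled n >= n0?
    return all(c1 * f(n) <= T(n) <= c2 * f(n) for n in samples if n >= n0)

# c1 = 6 and c2 = 10 sandwich T(n) for n >= 4 on these samples,
# so T(n) = Theta(n log n):
print(is_theta(6, 10, 4, range(2, 10_000)))
```
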

Graphical representation:

We also have little-o, which forms a strict asymptotic upper bound (T(n) grows strictly slower than f(n)), but it is used less frequently.

Note: Try reading this article in dark mode, because most of the graphs consist of white outlines which might not be visible in light mode. I made it deliberately in this manner because we all know we are developers 😎👨‍💻.
