## What is Asymptotic Notation?

Whenever we want to analyze an algorithm, we need to calculate its complexity. But the exact complexity of an algorithm varies with the input, so instead of working with the exact amount of resources required, we represent that complexity in a general form (notation) that captures the basic nature of the algorithm.

We use that general form (Notation) for the analysis process.

Note – In asymptotic notation, when we represent the complexity of an algorithm, we use only the most significant term and ignore the least significant terms (here the complexity can be space complexity or time complexity).
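To see why the least significant terms can be ignored, consider a hypothetical complexity f(n) = 3n² + 2n + 1 (an assumed example, not taken from the article). A quick numeric sketch shows that as n grows, the most significant term contributes almost all of the value:

```python
# Assumed example complexity: f(n) = 3n^2 + 2n + 1.
def f(n):
    return 3 * n**2 + 2 * n + 1

# As n grows, the leading term 3n^2 accounts for nearly all of f(n),
# which is why asymptotic notation keeps only the most significant term.
for n in (10, 1_000, 100_000):
    share = (3 * n**2) / f(n)  # fraction of f(n) due to the leading term
    print(n, round(share, 4))
```

The printed fraction approaches 1 as n increases, so describing this algorithm as "order n²" loses almost nothing.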

We mainly use three types of asymptotic notations, which are as follows…

- Big – Oh (O)
- Big – Omega (Ω)
- Big-Theta (Θ)


## Big-Oh Notation (O)

Big-Oh notation defines the **upper bound** of an algorithm in terms of Time Complexity.

That means Big-Oh notation always indicates the maximum time an algorithm requires for all input values; in other words, it describes the worst case of an algorithm's time complexity. Big-Oh notation can be defined as follows.

**Consider function f(n) as the time complexity of an algorithm and g(n) as its most significant term.**

**If f(n) <= C g(n) for all n >= n₀, for some constants C > 0 and n₀ >= 1, then we can represent f(n) as O(g(n)).**
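The definition above can be spot-checked numerically. The helper below is a hypothetical illustration (the function name and parameters are assumptions, not from the article); note that sampling a finite range of n can refute a Big-Oh claim but can never prove it for all n:

```python
def satisfies_big_oh(f, g, C, n0, n_max=10_000):
    """Spot-check the Big-Oh condition f(n) <= C * g(n) for n0 <= n <= n_max.

    This only samples a finite range, so it can disprove a Big-Oh
    claim but not prove it; it merely illustrates the definition.
    """
    return all(f(n) <= C * g(n) for n in range(n0, n_max + 1))
```

For example, `satisfies_big_oh(lambda n: 3 * n + 2, lambda n: n, C=4, n0=2)` returns `True`, while `satisfies_big_oh(lambda n: n * n, lambda n: n, C=4, n0=1)` returns `False`, since no constant multiple of n bounds n² for all n.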

**f(n) = O(g(n))**

Consider the following f(n) and g(n):

**f(n) = 3n + 2**

**g(n) = n**

If we want to represent **f(n)** as **O(g(n))**, then it must satisfy **f(n) <= C g(n)** for some constants **C > 0** and **n₀ >= 1**, for all **n >= n₀**.

f(n) <= C g(n)

⇒ 3n + 2 <= C n

The above condition is TRUE for **C = 4** and all **n >= 2**.

By using Big-Oh notation, we can represent the time complexity as follows.

**3n + 2 = O(n)**
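The worked example above can be verified numerically. This sketch checks the inequality 3n + 2 <= 4n over a range of n (a finite check for illustration, not a proof for all n):

```python
# Check the worked example: 3n + 2 <= 4n holds exactly when n >= 2,
# so f(n) = 3n + 2 is O(n) with the constants C = 4 and n0 = 2.
C = 4
assert all(3 * n + 2 <= C * n for n in range(2, 1000))
assert not (3 * 1 + 2 <= C * 1)  # at n = 1: 5 > 4, so n0 must be at least 2
print("3n + 2 <= 4n for every n checked from 2 to 999")
```

The second assertion shows why the definition needs the threshold n₀: the bound fails at n = 1 but holds from n = 2 onward.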