Time Complexity and Space Complexity in Algorithms

A question always arises:

When does an algorithm provide a satisfactory solution to a problem?

One measure of efficiency is the time a computer takes to solve a problem using the algorithm when the input values are of a specified size.

A second measure is the amount of computer memory required to implement the algorithm when the input values are of a specified size.

Questions such as these involve the computational complexity of the algorithm. An analysis of the time required to solve a problem of a particular size involves the time complexity of the algorithm. An analysis of the computer memory required involves the space complexity of the algorithm.

Time complexity is commonly analyzed in three cases: best, average, and worst.

In simple words: if an algorithm produces the desired result in a single computational step (for example, on the first attempt), its time complexity is O(1), and that input falls into the best-case category.

If, on the other hand, the same algorithm has to perform many iterations or recursions, say n steps, before producing the result, that input describes its worst-case time complexity.

Below are some common time complexities with simple definitions. Feel free to check out Wikipedia, though, for more in-depth definitions.

- O(1), constant: the running time does not depend on the input size.
- O(log n), logarithmic: the running time grows with the logarithm of the input size, as in binary search.
- O(n), linear: the running time grows in proportion to the input size.
- O(n log n), linearithmic: typical of efficient comparison sorts such as merge sort.
- O(n²), quadratic: typical of nested loops over the input.
- O(2^n), exponential: the running time doubles with each additional input element.
- O(n!), factorial: typical of brute-force algorithms that try every permutation.

A simple example with code:
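Here is a minimal sketch, using a simple linear search written for illustration, that scans a list for a target value:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for index, value in enumerate(items):
        if value == target:  # one comparison per element examined
            return index
    return -1

numbers = [4, 8, 15, 16, 23, 42]
print(linear_search(numbers, 4))   # best case: match at the first position -> 0
print(linear_search(numbers, 99))  # worst case: all n elements are checked -> -1
```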

The time complexity scenario for the example above works out as follows: in the best case the target sits at the first position and the search finishes after a single comparison, giving O(1); in the worst case the target is at the end of the list or absent entirely, the loop examines all n elements, and the search takes O(n).

Asymptotic Notations

Asymptotic notations are a shorthand that lets us analyze an algorithm's running time by describing its behavior as the input size increases. This is also known as an algorithm's growth rate.

The following three asymptotic notations are most commonly used to represent the time complexity of algorithms:

Big Oh (O)

Big Oh describes an upper bound and is often used for the worst case of an algorithm. It is obtained by taking the highest-order term of the function and ignoring all constant factors, since they have little influence for sufficiently large input.
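As a rough sketch of why only the highest-order term matters, consider this illustrative function (the name and shape are assumptions made up for the example), which does one linear pass and one quadratic pass over its input:

```python
def count_pairs_and_total(items):
    """Illustrative only: one linear pass plus one quadratic pass."""
    n = len(items)
    total = 0
    for x in items:        # about n operations
        total += x
    pairs = 0
    for i in range(n):     # about n * n operations
        for j in range(n):
            pairs += 1
    # Roughly n + n^2 operations in total; for large n the n^2 term
    # dominates, so the function is O(n^2).
    return total, pairs

print(count_pairs_and_total([1, 2, 3]))  # (6, 9)
```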

Big Omega (Ω)

Big Omega is the opposite of Big Oh: where Big Oh describes the upper bound (worst case) of an asymptotic function, Big Omega describes the lower bound. In algorithm analysis, this notation is usually used to describe the complexity of an algorithm in the best case, meaning the algorithm will not perform better than its best case.

Big Theta (Θ)

When an algorithm's lower bound equals its upper bound, say it has complexity O(n log n) and Ω(n log n), it actually has the complexity Θ(n log n), which means its running time always grows on the order of n log n, in both the best case and the worst case.
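For instance, summing a list (a minimal sketch, assuming a plain loop-based implementation) touches every element exactly once regardless of the values, so its running time is both O(n) and Ω(n), and therefore Θ(n):

```python
def sum_list(items):
    """Touches every element exactly once, no matter what the values are."""
    total = 0
    for value in items:  # exactly n additions, in every case
        total += value
    return total

print(sum_list([2, 4, 6]))  # 12
```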

If you want to dive deep into time complexity, refer to Michael Olorunnisola's article:

or look at Shilpa Jain's article:

Space Complexity

Space complexity deals with how much extra space an algorithm requires as the input size changes. For example, it accounts for the auxiliary data structures the algorithm uses, such as an array or a linked list.
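A minimal sketch of the difference (both functions are hypothetical examples, assuming we want the squares of a list of numbers): one version allocates a new list, the other reuses the input in place:

```python
def squares_new_list(items):
    """Allocates a second n-element list: O(n) auxiliary space."""
    return [x * x for x in items]

def squares_in_place(items):
    """Overwrites the input list: O(1) auxiliary space."""
    for i in range(len(items)):
        items[i] = items[i] * items[i]
    return items

print(squares_new_list([1, 2, 3]))  # [1, 4, 9]
print(squares_in_place([1, 2, 3]))  # [1, 4, 9]
```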

How to calculate the space complexity of an algorithm: https://www.quora.com/How-do-we-calculate-space-time-complexity-of-an-algorithm