
Data structures and algorithms (DSA) are among the most fundamental concepts in computer science. Most big companies hire engineers largely on the basis of their DSA knowledge. By learning DSA you not only improve your chances of landing a job at a big company, you also learn how to write efficient code.

In this article, I will explain one of the most basic concepts of DSA, Big O notation, using JavaScript.

A data structure is a programmatic way to store data so that it can be used efficiently. There are several types of data structures, such as arrays, linked lists, stacks, queues, binary trees, heaps, hash tables, graphs, matrices, strings, and more advanced structures.

For example, we can store a list of items of the same data type using the array data structure.

Array Data Structure Example
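The original example is a screenshot; a minimal sketch of the same idea looks like this:

```javascript
// An array stores same-type items in indexed slots.
const primes = [2, 3, 5, 7, 11];

// Access by index is constant time: the position maps
// directly to the element, regardless of array length.
console.log(primes[0]); // 2
console.log(primes.length); // 5
```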

Algorithms? They are simply a finite sequence of well-defined instructions used to solve a problem or perform a computation.

For any given problem, there can be N possible solutions. This is true in general: if I have a problem and discuss it with all of my friends, each of them will suggest a different solution, and I am the one who has to decide which solution is best for the circumstances.

Similarly, for any problem that must be solved with a program, there can be countless solutions. Let’s take a simple example to understand this.

Problem: Write a function that adds the numbers from 1 to n

Solution 01:

Solution 01
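The original solution is a screenshot; a loop-based version along the same lines (the function name addUpToLoop is illustrative) might look like this:

```javascript
// Solution 01: sum the numbers 1..n with a loop.
// The loop body runs n times, so the work grows with n.
function addUpToLoop(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}

console.log(addUpToLoop(5)); // 15
```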

Solution 02:

Solution 02
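Again, the original is a screenshot; a sketch using the closed-form formula n(n + 1)/2 (the function name addUpToFormula is illustrative):

```javascript
// Solution 02: use the formula n(n + 1)/2.
// A single arithmetic expression, no matter how large n is.
function addUpToFormula(n) {
  return (n * (n + 1)) / 2;
}

console.log(addUpToFormula(5)); // 15
```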

Both of the above solutions produce the desired output, but which one is better for large values of n? How can we determine which solution is better?

To check this we can use performance.now(), which tells us how much time our solution takes to execute.

Let’s look into it.

First, for Solution 01:

Solution 01 performance time is 0.00680499 seconds

For Solution 02:

Solution 02 performance time is 0.000055000 seconds

From the two measurements above, you can see the difference in execution time between the two pieces of code.

The second solution is far faster than the first. Even for higher values of n, the second one remains reliable.

This example shows how DSA helps us make our code more efficient and faster. To analyse an algorithm, we use Big-O notation.

Big-O Notation

An algorithm is O(f(n)) if the number of simple operations the computer has to do is eventually less than a constant times f(n), as n increases.

f(n) could be linear: f(n) = n

f(n) could be quadratic: f(n) = n²

f(n) could be constant: f(n) = 1

f(n) could be something entirely different

Time Complexity

The time complexity of an algorithm is how much time our program, a finite piece of code, takes to complete the given task.

It is most commonly expressed using big O notation, an asymptotic notation for representing time complexity.

Rules Of Thumb:

Arithmetic operations are constant, O(1). Whether you are adding 2 + 2 or a million + a billion, the time taken is the same.

Variable assignment is constant.

Accessing an element in an array (by index) or in an object (by key) is constant.

In a loop, the complexity is the length of the loop times the complexity of whatever happens inside the loop.
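These rules of thumb can be seen side by side in a short sketch (the function names are illustrative):

```javascript
// O(1): arithmetic, assignment, and indexed access are constant time.
function firstPlusLast(arr) {
  return arr[0] + arr[arr.length - 1];
}

// O(n): a single loop of length n.
function sumAll(arr) {
  let total = 0;
  for (let i = 0; i < arr.length; i++) total += arr[i];
  return total;
}

// O(n²): a loop of length n nested inside another loop of length n,
// so the inner body runs n * n times.
function countPairs(n) {
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) count++;
  }
  return count;
}

console.log(firstPlusLast([1, 2, 3])); // 4
console.log(sumAll([1, 2, 3])); // 6
console.log(countPairs(4)); // 16
```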

An example of time complexity has already been worked through above: the problem of finding the sum of n numbers is a good way to understand the time complexity of an algorithm.

In solution 01, because we perform the computation with a for loop, the time complexity is O(n), whereas solution 02 returns the sum with a single arithmetic expression, whose time complexity is O(1), i.e. constant.

This is why, if we need the sum for a large n, solution 01 would not be the best choice.

Space Complexity

Space complexity, in simpler terms, is the additional memory we need to allocate in order to run the code in our algorithm.

Auxiliary space is sometimes confused with space complexity, but auxiliary space is only the extra, temporary space used by an algorithm.

Space Complexity = Auxiliary Space + Space used by an Input

Here is a list of input types alongside the memory they consume (sizes shown are for C-style primitive types):

bool, char, unsigned char, signed char, __int8: 1 byte

__int16, short, unsigned short, wchar_t, __wchar_t: 2 bytes

float, __int32, int, unsigned int, long, unsigned long: 4 bytes

double, __int64, long double, long long: 8 bytes

Rules Of Thumb:

Most primitive types (booleans, numbers, undefined, and null) occupy constant space, O(1).

Strings require O(n) space (where n is the string length).

Reference types are generally O(n), where n is the length (for arrays) or the number of keys (for objects).

To understand the orders of space complexity, look at the two examples shown below. You can easily identify that the first example has constant, O(1), space complexity and the second has O(n) space complexity.

Example:

1. Space complexity of O(1):

Space Complexity Of O(1)
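The original example is a screenshot; a typical O(1)-space sketch along the same lines: no matter how long the input array is, only two variables are allocated.

```javascript
// O(1) space: only `total` and `i` are allocated,
// regardless of the length of the input array.
function sum(arr) {
  let total = 0;
  for (let i = 0; i < arr.length; i++) {
    total += arr[i];
  }
  return total;
}

console.log(sum([1, 2, 3])); // 6
```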

2. Space complexity of O(n):
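A typical O(n)-space sketch (illustrative, since the original screenshot is not shown here): the memory used grows in proportion to the input length.

```javascript
// O(n) space: the result array grows with the input,
// holding one new element per input element.
function double(arr) {
  const result = [];
  for (let i = 0; i < arr.length; i++) {
    result.push(arr[i] * 2);
  }
  return result;
}

console.log(double([1, 2, 3])); // [2, 4, 6]
```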