3 Sorting Algorithms Every Developer Should Know
Understanding and analysing basic sorting algorithms
Sorting numbers is one of the most commonly used operations in computer science. Learning this topic will also help you prepare for your next coding interview.
Sorting is important in programming for the same reason it is important in everyday life. You should understand how the various sorting techniques operate and how efficient they are, so you can choose the right one for the job.
The 3 basic sorting algorithms that you should know are:
- Bubble Sort
- Selection Sort
- Insertion Sort
Let’s look at each one of them and their efficiencies. Understanding the pros and cons of each algorithm will help you decide which one to use in which scenario.
Bubble sort is one of the most basic algorithms. It works by repeatedly swapping adjacent elements until they are in the intended order.
It is called bubble sort because after each pass through the array, the largest remaining element bubbles up to its place at the end. After n-1 such passes through the entire array, it will be sorted.
The time complexity of bubble sort is O(n²).
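As a minimal sketch, an optimised bubble sort with an early exit could look like this in Python (the function name and details are illustrative):

```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After i passes, the last i elements are already in place,
        # so each pass can stop one element earlier.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            # No swaps in this pass means the array is sorted.
            break
    return arr
```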
The above code is an optimised bubble sort, which stops early if a pass didn’t make any swaps. If a pass didn’t make any swaps, the array is already sorted.
The time complexity O(n²) comes from the nested loop: we have n passes, and each pass has to traverse up to n elements, which results in n*n = n² operations.
Selection sort is a simple algorithm. It sorts an array by repeatedly finding the minimum element of the unsorted part and moving it to its correct position.
It is called selection sort because it repeatedly selects the next-smallest element and swaps it into place. The array is considered as two parts, a sorted part and an unsorted part. In every iteration, the minimum element of the unsorted part is selected and added to the sorted part. Repeating this process n-1 times sorts the whole array.
The time complexity of selection sort is O(n²).
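A minimal Python sketch of selection sort as described above (the function name is illustrative):

```python
def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        # Find the index of the minimum element in the unsorted part arr[i:].
        min_index = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_index]:
                min_index = j
        # Swap it into position i; arr[:i+1] is now sorted.
        arr[i], arr[min_index] = arr[min_index], arr[i]
    return arr
```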
In the above code we find the index of the minimum element on each iteration and swap it with the element at index i, building up the sorted part. The elements up to index i-1 are sorted at all times.
The time complexity O(n²) comes from the nested loop: we have n iterations, and on each iteration we traverse up to n elements to find the minimum, which results in n*n = n² operations.
Insertion sort is a very simple sorting algorithm. It works very similar to the way you would sort playing cards in your hands. The array is essentially split into a sorted and an unsorted part. Values from the unsorted part are placed at the correct position in the sorted part.
It is called insertion sort because we repeatedly take an element and insert it at the position where it belongs in the sorted part. By taking elements one by one from the unsorted part and placing them in their correct position in the sorted part, we end up sorting the whole array.
The time complexity of insertion sort is O(n²).
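A minimal Python sketch of insertion sort as described above (the function name is illustrative):

```python
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]  # the element to insert into the sorted part arr[:i]
        j = i - 1
        # Shift elements larger than key one position forward.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # insert key at its correct position
    return arr
```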
In the above code we store the element we are trying to place in a variable, then walk backwards from that position, shifting elements one step forward until we find the right spot, and insert the element there. This process continues until the array is sorted.
The time complexity O(n²) comes from the nested loop: we have n iterations, and on each iteration we may shift up to n elements to insert the chosen element at its correct position, which results in n*n = n² operations.
Awesome! You have learned about 3 basic sorting algorithms. Once you thoroughly understand each algorithm, I recommend practising it by coding it yourself from the description alone.
If you found this helpful, don’t forget to like, share, and follow. Do you enjoy learning algorithms? Let me know in the comments!