In this blog post I'm going to talk about a bunch of sorting algorithms and their time complexities. Sorting algorithms rearrange the elements of a list into ascending or descending order. There are many different sorting algorithms, such as selection sort, insertion sort, bubble sort, and quick sort. Let's look at each of them.
Selection sort:
This algorithm repeatedly scans the unsorted part of the list, finds the smallest element, and swaps it to the front. After each pass, one more element sits in its final position. The time complexity of this algorithm is O(n^2), which is not the best you can do.
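Here is a minimal Python sketch of the idea (the function name is my own choice):

```python
def selection_sort(items):
    """Sort a list in place by repeatedly selecting the smallest
    remaining element and swapping it to the front. O(n^2) comparisons."""
    n = len(items)
    for i in range(n - 1):
        # Find the index of the smallest element in items[i:].
        smallest = i
        for j in range(i + 1, n):
            if items[j] < items[smallest]:
                smallest = j
        # Swap it into position i; items[:i+1] is now fully sorted.
        items[i], items[smallest] = items[smallest], items[i]
    return items
```

Note that the number of comparisons is the same even when the list is already sorted, which is why selection sort has no fast best case.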
Insertion sort:
The insertion sort algorithm takes each element in the list and inserts it into its correct position among the elements already sorted, so the sorted pile grows one element at a time. It is similar to the method people use when they sort a hand of playing cards. The worst-case complexity of this algorithm is also O(n^2), though an already-sorted list takes only O(n).
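The card-sorting analogy translates almost directly into code. A short Python sketch (naming is mine):

```python
def insertion_sort(items):
    """Sort a list in place, like sorting a hand of playing cards:
    take each element and insert it among the already-sorted
    elements to its left."""
    for i in range(1, len(items)):
        card = items[i]
        j = i - 1
        # Shift larger elements one slot to the right
        # to open a gap for the new "card".
        while j >= 0 and items[j] > card:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = card
    return items
```

If the list is already sorted, the inner loop never runs, which is where the O(n) best case comes from.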
Bubble sort:
This one compares every two adjacent elements in the list and puts the smaller one at the smaller index. Then it repeats the same pass until the list is sorted. The time complexity varies for this algorithm. In the best case, in which the list is already sorted, it takes only O(n) time: a single pass with no swaps. In other cases, bubble sort has a time complexity of O(n^2).
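A small Python sketch; the early-exit flag is what gives the O(n) best case on an already-sorted list:

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent
    out-of-order pairs until a full pass makes no swaps."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are already in place.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps: the list is sorted, stop early
            break
    return items
```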
Quick sort:
This algorithm is the one I use the most. It picks a pivot and gathers all elements smaller than the pivot into one list and all elements bigger than the pivot into another. Then it sorts the sublists recursively and concatenates them. The average time complexity of this algorithm is O(n log n), which makes quicksort one of the best general-purpose sorting algorithms, although a consistently bad pivot choice degrades it to O(n^2).
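The partition-and-recurse scheme above can be sketched in a few lines of Python (this version builds new lists rather than sorting in place, which keeps it simple at the cost of extra memory):

```python
def quick_sort(items):
    """Return a new sorted list: pick a pivot, partition into
    smaller / equal / larger sublists, sort the outer two
    recursively, and concatenate. O(n log n) on average."""
    if len(items) <= 1:
        return items  # base case: nothing left to sort
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)
```

Keeping the elements equal to the pivot in their own bucket avoids infinite recursion when the list contains duplicates.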
There are a lot of algorithms other than the ones here, and each of them has its own best-case and worst-case runtime. On average, the algorithms with O(n log n) complexity are considered the fastest comparison-based sorts. If the range of values in the list is not too big, bucket sort can be used to get O(n) complexity, but it only pays off when the range is small.
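To illustrate how a small value range buys linear time, here is the simplest bucket-style variant (counting sort, where every bucket holds one value); the function name and the non-negative-integer restriction are my simplifying assumptions:

```python
def counting_sort(items, max_value):
    """O(n + k) sort for non-negative integers <= max_value,
    where k = max_value + 1 buckets are allocated. Practical
    only when the range of values is small."""
    counts = [0] * (max_value + 1)
    for x in items:
        counts[x] += 1  # tally each value into its bucket
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)  # emit buckets in order
    return result
```

Because it never compares elements to each other, it sidesteps the O(n log n) lower bound that applies to comparison-based sorting.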