
5 Important Sorting Algorithms



In computer science, a sorting algorithm is an algorithm that puts elements of a list in a certain order. The most frequently used orders are numerical order and lexicographical order. Efficient sorting is important for optimizing the efficiency of other algorithms (such as search and merge algorithms) that require input data to be in sorted lists.
Here we will discuss five well-known and important sorting techniques.
  1. Bubble Sort
  2. Insertion Sort
  3. Quick Sort
  4. Merge Sort
  5. Heap Sort

1. Bubble Sort




Bubble sort, sometimes referred to as sinking sort, is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements and swaps them if they are in the wrong order. The pass through the list is repeated until the list is sorted. The algorithm, which is a comparison sort, is named for the way smaller or larger elements "bubble" to the top of the list. Although the algorithm is simple, it is too slow and impractical for most problems.




Algorithm




Step 1 - Begin
Step 2 -    repeat
Step 3 -       for each adjacent pair of elements in list
Step 4 -          if list[i] > list[i+1]
Step 5 -             swap(list[i], list[i+1])
Step 6 -          end if
Step 7 -       end for
Step 8 -    until a pass completes with no swaps
Step 9 -    return list
Step 10 - End
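
As a concrete illustration (not part of the pseudocode above), here is a minimal Python sketch of bubble sort; the function name bubble_sort and the swapped flag that provides the early exit are my own choices.

def bubble_sort(arr):
    # Repeat passes over the list until a full pass makes no swap.
    n = len(arr)
    swapped = True
    while swapped:
        swapped = False
        for i in range(n - 1):
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]   # swap the adjacent pair
                swapped = True
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))   # [1, 2, 4, 5, 8]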

Example



Take the array of numbers "5 1 4 2 8" and sort it from lowest to greatest using bubble sort. Three passes will be required.




First Pass

( 5 1 4 2 8 ) → ( 1 5 4 2 8 ), Here, the algorithm compares the first two elements and swaps them since 5 > 1.
( 1 5 4 2 8 ) → ( 1 4 5 2 8 ), Swap since 5 > 4
( 1 4 5 2 8 ) → ( 1 4 2 5 8 ), Swap since 5 > 2
( 1 4 2 5 8 ) → ( 1 4 2 5 8 ), Now, since these elements are already in order (8 > 5), the algorithm does not swap them.



Second Pass

( 1 4 2 5 8 ) → ( 1 4 2 5 8 )
( 1 4 2 5 8 ) → ( 1 2 4 5 8 ), Swap since 4 > 2
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
Now, the array is already sorted, but the algorithm does not know if it is completed. The algorithm needs one whole pass without any swap to know it is sorted.



Third Pass

( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )



Analysis

Worst-case performance: O(n²) comparisons, O(n²) swaps
Best-case performance: O(n) comparisons, O(1) swaps
Average performance: O(n²) comparisons, O(n²) swaps
Worst-case space complexity: O(1) auxiliary

2. Insertion Sort

Insertion sort is a simple sorting algorithm that builds the final sorted array (or list) one item at a time. It is much less efficient on large lists than more advanced algorithms, but insertion sort is adaptive, stable, and in-place in nature.

Algorithm


Step 1 − Begin
Step 2 − If it is the first element, it is already sorted
Step 3 − Pick the next element
Step 4 − Compare it with all elements in the sorted sub-list
Step 5 − Shift all the elements in the sorted sub-list that are greater than the value to be sorted
Step 6 − Insert the value
Step 7 − Repeat until the list is sorted
Step 8 − End
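
For readers who prefer code, here is a short Python sketch of the same steps (the names insertion_sort and key are illustrative choices, not from the algorithm above):

def insertion_sort(arr):
    for i in range(1, len(arr)):          # the first element is already "sorted"
        key = arr[i]                      # pick the next element
        j = i - 1
        # shift elements of the sorted sub-list that are greater than key
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                  # insert the value
    return arr

print(insertion_sort([3, 7, 4, 9, 5, 2, 6, 1]))   # [1, 2, 3, 4, 5, 6, 7, 9]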

Example 

The following example shows the steps for sorting the sequence 
{3, 7, 4, 9, 5, 2, 6, 1}. In each step, the key that was moved (or left in place because it was the largest considered so far) in the previous step is marked with an asterisk.

3  7  4  9  5  2  6  1
3* 7  4  9  5  2  6  1
3  7* 4  9  5  2  6  1
3  4* 7  9  5  2  6  1
3  4  7  9* 5  2  6  1
3  4  5* 7  9  2  6  1
2* 3  4  5  7  9  6  1
2  3  4  5  6* 7  9  1
1* 2  3  4  5  6  7  9



Analysis

Worst-case performance: O(n²) comparisons and swaps
Best-case performance: O(n) comparisons, O(1) swaps
Average performance: O(n²) comparisons and swaps
Worst-case space complexity: O(n) total, O(1) auxiliary

3. Quick Sort

Quicksort (sometimes called partition-exchange sort) is an efficient sorting algorithm, serving as a systematic method for placing the elements of a random access file or an array in order. Developed by British computer scientist Tony Hoare in 1959 and published in 1961, it is still a commonly used algorithm for sorting. Quicksort is a comparison sort, meaning that it can sort items of any type for which a "less-than" relation (formally, a total order) is defined. Efficient implementations of quicksort are not stable sorts, meaning that the relative order of equal sort items is not preserved. Quicksort can operate in place on an array, requiring only small additional amounts of memory to perform the sorting.

Algorithm

Partition Algorithm

Step 1 − Choose the value at the highest index as the pivot
Step 2 − Take two variables to point to the left and right ends of the list, excluding the pivot
Step 3 − left points to the low index
Step 4 − right points to the high index
Step 5 − While the value at left is less than the pivot, move left to the right
Step 6 − While the value at right is greater than the pivot, move right to the left
Step 7 − If the conditions in step 5 and step 6 no longer hold, swap the values at left and right
Step 8 − If left ≥ right, the point where they meet is the new pivot position


Quick Sort Algorithm

Step 1 − Make the right-most index value the pivot
Step 2 − Partition the array around the pivot value using the partition algorithm
Step 3 − Quicksort the left partition recursively
Step 4 − Quicksort the right partition recursively
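
The sketch below is one possible Python rendering of these steps. It uses the common Lomuto partition scheme with the right-most element as pivot, which is a simplification of the two-pointer partition described above; the function names are my own.

def partition(arr, low, high):
    # Partition arr[low..high] around the right-most element; return the pivot's final index.
    pivot = arr[high]
    i = low - 1                              # boundary of the "less than or equal" region
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)        # partition the array around the pivot
        quick_sort(arr, low, p - 1)          # quicksort the left partition recursively
        quick_sort(arr, p + 1, high)         # quicksort the right partition recursively
    return arr

print(quick_sort([10, 80, 30, 90, 40, 50, 70]))   # [10, 30, 40, 50, 70, 80, 90]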



Example

The following example shows the steps for sorting the sequence 
{10, 80, 30, 90, 40, 50, 70}. In each step, the right-most element of the sub-array being partitioned is used as the pivot.

10 80 30 90 40 50 70 
10 30 40 50    90 80
10 30 40          90
10 30
10
10 30          80 90
10 30 40    70 80 90
10 30 40 50 70 80 90
Analysis


Worst-case performance: O(n²)
Best-case performance: O(n log n) (simple partition) or O(n) (three-way partition and equal keys)
Average performance: O(n log n)
Worst-case space complexity: O(n) auxiliary (naive), O(log n) auxiliary

4. Merge Sort

In computer science, merge sort (also commonly spelled mergesort) is an efficient, general-purpose, comparison-based sorting algorithm. Most implementations produce a stable sort, which means that the order of equal elements is the same in the input and output. Merge sort is a divide and conquer algorithm that was invented by John von Neumann in 1945.

Algorithm

Step 1 − If there is only one element in the list, it is already sorted; return.
Step 2 − Divide the list recursively into two halves until it can no longer be divided.
Step 3 − Merge the smaller lists into a new list in sorted order.
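
A compact Python sketch of these three steps might look like the following (this version returns a new list rather than sorting in place; the names are illustrative):

def merge_sort(arr):
    if len(arr) <= 1:                        # one element is already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])             # divide recursively into two halves
    right = merge_sort(arr[mid:])
    # merge the two sorted halves into a new sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))   # [3, 9, 10, 27, 38, 43, 82]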

Example
The following example shows the steps for sorting the sequence 
{38, 27, 43, 3, 9, 82, 10}. Parentheses (..) show the divisions of the list.

(38 27 43 3 9 82 10)
(38 27 43 3)(9 82 10)
(38 27)(43 3)(9 82)(10)
(38)(27)(43)(3)(9)(82)(10)
(27 38)(3 43)(9 82)(10)
(3 27 38 43)(9 10 82)
(3 9 10 27 38 43 82)

Analysis

Worst-case performance: O(n log n)
Best-case performance: O(n log n) typical, O(n) natural variant
Average performance: O(n log n)
Worst-case space complexity: O(n) total with O(n) auxiliary; O(1) auxiliary with linked lists

5. Heap Sort

In computer science, heapsort is a comparison-based sorting algorithm. Heapsort can be thought of as an improved selection sort: like that algorithm, it divides its input into a sorted and an unsorted region, and it iteratively shrinks the unsorted region by extracting the largest element and moving it to the sorted region. The improvement consists of the use of a heap data structure rather than a linear-time search to find the maximum. Heapsort was invented by J. W. J. Williams in 1964. This was also the birth of the heap, presented already by Williams as a useful data structure in its own right.

Algorithm



Step 1 - Build a max heap from the input data.

Step 2 - At this point, the largest item is stored at the root of the heap. Replace it with the last item of the heap followed by reducing the size of heap by 1. Finally, heapify the root of tree.

Step 3 - Repeat step 2 while the size of the heap is greater than 1.


How to build the heap?
The heapify procedure can be applied to a node only if its child nodes are already heapified, so heapification must be performed in bottom-up order.
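
As a rough Python sketch of this procedure (the function names heapify and heap_sort are my own; Python's standard heapq module could also be used), the heap is built bottom-up and then the root is repeatedly swapped into the sorted region:

def heapify(arr, n, i):
    # Sift arr[i] down so the subtree rooted at i satisfies the max-heap property
    # (assumes the subtrees below i are already heaps).
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):      # build a max heap, bottom up
        heapify(arr, n, i)
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]  # move the largest item to the sorted region
        heapify(arr, end, 0)                 # restore the heap on the reduced range
    return arr

print(heap_sort([4, 10, 3, 5, 1]))   # [1, 3, 4, 5, 10]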

Example
The following example shows the steps for sorting the sequence 
{4, 10, 3, 5, 1}. Numbers in parentheses (.) show the array index of each element.



4 10 3 5 1

0 1  2 3 4


        4(0)

        /   \
     10(1)   3(2)
    /   \
 5(3)    1(4)

The numbers in parentheses represent the indices in the array
representation of the data.

Applying the heapify procedure to index 1 (no change, since 10 is already greater than its children 5 and 1):

         4(0)
        /   \
    10(1)    3(2)
    /   \
5(3)    1(4)

Applying the heapify procedure to index 0:

        10(0)
        /  \
     5(1)  3(2)
    /   \
 4(3)    1(4)
Within each subtree, the heapify procedure calls itself recursively, sifting the value down in a top-down manner.

After swapping the root with the last element:
        1(0)
        /  \
     5(1)  3(2)
    /   \
 4(3)    10(4)

After reducing the heap size:
        1(0)
        /  \
     5(1)  3(2)
    /   
 4(3)    10(4)
Again call heapify at the root node:

       5(0)
        /  \
     4(1)  3(2)
    /   
 1(3)    10(4)

Do the same until the heap size is 1.

After repeating the process:

       1(0)
          
     3(1)  4(2)
       
 5(3)     10(4)

1 3 4 5 10
0 1 2 3 4

Analysis
Worst-case performance: O(n log n)
Best-case performance: O(n log n) (distinct keys) or O(n) (equal keys)
Average performance: O(n log n)
Worst-case space complexity: O(n) total, O(1) auxiliary

  

Conclusion

I hope this article helps you to understand the fundamental concepts of these sorting algorithms easily. If you liked it, please share it with your friends too.

