When I studied algorithms more than twenty years ago, I could see their general usefulness, but I never had to implement them in the real world: they were already built into the database management systems and the programming languages I used.
Interviewer: Can you sort this list by size?
Interviewee: Absolutely.
*types confidently...*
l.sort()
Then came a time when I had to remind myself of sorting algorithms. At 1st things 1st, I was about to implement a feature that lets users prioritize their options by comparing them two at a time, and the Bubble Sort algorithm seemed like a perfect fit.
So I built the UI with ReactJS, backed by Django, to let users compare options and sort their priorities.
However, I soon noticed a problem: for longer lists, the process became quite time-consuming for users.
One day, I remembered something about algorithmic complexity (described with Big O notation) and realized that Bubble Sort isn't the most efficient choice for this task.
In computer science, Big O notation describes how the performance of an algorithm grows relative to the size of the input. It's usually written like O(value), where value is a function of n, and n is the number of items the algorithm processes.
For example:
- Finding the first item in a list or a key in a dictionary of n items is O(1).
- Summing all items by iterating through a list is O(n).
- Using a triple nested loop over the list would result in O(n³).
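In Python terms, a minimal sketch of those three cases might look like this:

```python
def first_item(items):
    # O(1): a single lookup, no matter how long the list is
    return items[0]

def total(items):
    # O(n): one pass over the list
    result = 0
    for item in items:
        result += item
    return result

def zero_sum_triples(items):
    # O(n³): three nested loops over the same list
    count = 0
    for a in items:
        for b in items:
            for c in items:
                if a + b + c == 0:
                    count += 1
    return count
```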
Here is an overview of the most common complexities:

Complexity | Name | Meaning and Example |
---|---|---|
O(1) | Constant | Always the same time, no matter the input size, e.g. accessing an array item |
O(log n) | Logarithmic | Double the input, and you add just one more step, e.g. Binary Search |
O(n) | Linear | Time grows proportionally with size, e.g. a loop over a list |
O(n log n) | Linearithmic | "Almost linear", but a bit slower, e.g. Merge Sort |
O(n²) | Quadratic | Time explodes with size, e.g. nested loops as in Bubble Sort |
O(2ⁿ) | Exponential | Grows explosively fast, e.g. naive recursive Fibonacci |
O(n!) | Factorial | Nightmare mode, e.g. solving the Traveling Salesman Problem by brute force |
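To get a feel for how differently these classes grow, a small snippet can print rough step counts for a few input sizes:

```python
import math

# Rough "number of steps" for each complexity class at different input sizes.
for n in (10, 100, 1000):
    print(
        f"n={n}: log n ≈ {math.log2(n):.0f}, "
        f"n log n ≈ {n * math.log2(n):.0f}, "
        f"n² = {n ** 2}, "
        f"2ⁿ ≈ 10^{n * math.log10(2):.0f}"
    )
```

At n = 1000, the quadratic class already needs a million steps, and the exponential one has left the realm of practical computing entirely.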
Understanding Big O practically
Bubble Sort is an algorithm with a time complexity of O(n²). That means: if you have a list of items, the algorithm needs to compare each item against many others, using two nested loops.
- To sort a list of 5 items, the user might need up to 25 comparisons.
- To sort 10 items, up to 100 comparisons.
- To sort 50 items, up to 2,500 comparisons.
Not ideal.
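For reference, here is a minimal Bubble Sort sketch (with the common early-exit optimization) that makes the two nested loops visible:

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent items."""
    n = len(items)
    for i in range(n - 1):            # outer loop: up to n - 1 passes
        swapped = False
        for j in range(n - 1 - i):    # inner loop: compare neighbours
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:               # already sorted: stop early
            break
    return items
```

In my feature, each items[j] > items[j + 1] check was a question shown to the user, which is exactly why the quadratic growth hurt.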
I asked ChatGPT to help evaluate sorting algorithms based on their complexity, and Quick Sort came out as a much better choice. Its complexity is O(n log n), where log n (base 2) roughly means "how many times you can divide n by 2 until you reach 1."
- To sort 5 items, Quick Sort would need around 12 comparisons.
- To sort 10 items, around 34 comparisons.
- To sort 50 items, around 282 comparisons.
That's still some work, but much more manageable.
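For comparison, here is a minimal Quick Sort sketch. Real implementations usually partition in place and pick pivots more carefully, but this version shows the divide-and-conquer idea behind the O(n log n) average case:

```python
def quick_sort(items):
    """Return a sorted copy: pick a pivot, partition, recurse on both parts."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)
```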
From a usability perspective, though, it’s even better to let users drag and drop items manually or to offer AI-suggested orderings. So I implemented those options too.
Algorithm complexities in action
Below is a quick cheat sheet with complexities from the fastest to the slowest:
Sorting algorithms
Algorithm | Best Case | Average Case | Worst Case | Notes |
---|---|---|---|---|
Timsort | O(n) | O(n log n) | O(n log n) | Used in Python, Java |
Merge Sort | O(n log n) | O(n log n) | O(n log n) | Stable, reliable |
Heap Sort | O(n log n) | O(n log n) | O(n log n) | In-place |
Quick Sort | O(n log n) | O(n log n) | O(n²) | Fastest on average |
Shell Sort | O(n log n) | O(n¹·⁵) | O(n²) | Improved Insertion Sort |
Insertion Sort | O(n) | O(n²) | O(n²) | Good for small or mostly sorted data |
Selection Sort | O(n²) | O(n²) | O(n²) | Simple but slow |
Bubble Sort | O(n) | O(n²) | O(n²) | Only good for tiny lists |
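To see those numbers in practice, here is a small, hypothetical benchmark comparing a quadratic Bubble Sort with Python's built-in sorted() (Timsort); the exact timings will differ on your machine:

```python
import random
import timeit

def bubble_sort(items):
    # O(n²): two nested loops over the list
    n = len(items)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

data = [random.random() for _ in range(2000)]

# sorted() works on a copy, so pass bubble_sort a copy too, for fairness.
print("Timsort:    ", timeit.timeit(lambda: sorted(data), number=10))
print("Bubble Sort:", timeit.timeit(lambda: bubble_sort(data[:]), number=10))
```

On a typical machine the built-in sort finishes in a few milliseconds, while Bubble Sort is orders of magnitude slower on the same data.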
Search algorithms
Algorithm | Best Case | Average Case | Worst Case | Notes |
---|---|---|---|---|
Hash Table Search | O(1) | O(1) | O(n) | Worst case is rare (hash collisions) |
Binary Search | O(1) | O(log n) | O(log n) | Needs sorted data |
Jump Search | O(1) | O(√n) | O(√n) | For sorted arrays |
Ternary Search | O(1) | O(log n) | O(log n) | Splits the range into three parts |
Linear Search | O(1) | O(n) | O(n) | No sorting needed, but slow |
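Binary Search shows why the "needs sorted data" requirement pays off: every step halves the remaining range. A minimal sketch:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2          # halve the search range each step
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```

In everyday Python you would reach for the standard-library bisect module instead of writing this by hand.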
Encryption algorithms
Algorithm | Complexity | Notes |
---|---|---|
AES (Advanced Encryption Standard) | O(1) per block | Very fast, symmetric key |
ChaCha20 | O(1) per block | Modern stream cipher, very fast |
RSA Encryption | O(n³) | Public key, slow for big data |
Elliptic Curve Cryptography (ECC) | O(n²) | Faster than RSA at similar security |
Diffie-Hellman Key Exchange | O(n³) | Secure key exchange, slower |
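To make the "O(1) per block" rows concrete: with a stream cipher such as ChaCha20, the cost per block is constant, so total time grows linearly with message size. A minimal sketch, assuming the third-party cryptography package is installed:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

key = os.urandom(32)    # 256-bit key
nonce = os.urandom(16)  # 128-bit nonce; must never repeat for the same key

message = b"prioritize all the things"
ciphertext = Cipher(algorithms.ChaCha20(key, nonce), mode=None).encryptor().update(message)
decrypted = Cipher(algorithms.ChaCha20(key, nonce), mode=None).decryptor().update(ciphertext)
assert decrypted == message
```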
Just out of curiosity
Python, MySQL, and PostgreSQL use these algorithms behind the scenes:
System | Sorting Algorithm | Notes |
---|---|---|
Python sort() and sorted() | Timsort | Hybrid of Merge Sort and Insertion Sort |
MySQL ORDER BY | Quicksort (small datasets), Merge Sort or Filesort (large datasets) | Depends on the storage engine (InnoDB, etc.) |
PostgreSQL ORDER BY | Quicksort (small data), Heapsort + Mergesort for bigger sets | Intelligent memory-aware switching |
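A practical consequence of Python's choice of Timsort: sort() and sorted() are stable, so items that compare equal keep their original order, which makes multi-key sorting easy:

```python
people = [("alice", 30), ("bob", 25), ("carol", 30), ("dave", 25)]

# Sorting by age keeps the original (alphabetical) order within each age group.
by_age = sorted(people, key=lambda person: person[1])
print(by_age)  # [('bob', 25), ('dave', 25), ('alice', 30), ('carol', 30)]
```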
Final words
So if you ever have to solve a development issue algorithmically instead of using default libraries, or even when choosing an algorithm from a library, remember Big O notation. Sometimes the difference saves milliseconds or seconds, and sometimes, when user input is involved, it saves minutes or even hours.
Cover picture by Steve Johnson