Every partition step requires less work. The amount of work done is ~the number of items in the subarray
(this holds up better the larger the array is).
How many times does n have to be divided in half before the result is 1? log2(n) times.
Quicksort performs about n*log2(n) operations in the best case.
Incomparably better than insertion and selection sort!
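As a concrete reference, here is a minimal sketch of Quicksort with Lomuto partitioning (last element as pivot). The function names are my own, not from the notes:

```python
def quicksort(a, lo=0, hi=None):
    """Sort list a in place between indices lo and hi (inclusive)."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)   # puts one value in its correct final place
        quicksort(a, lo, p - 1)    # recurse on the smaller-values side
        quicksort(a, p + 1, hi)    # recurse on the larger-values side

def partition(a, lo, hi):
    pivot = a[hi]                  # simple choice: last element as pivot
    i = lo
    for j in range(lo, hi):        # one comparison per remaining item:
        if a[j] < pivot:           # work per partition ~ size of subarray
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]      # move pivot between the two halves
    return i
```

Each partition pass does one comparison per item in the subarray, which is where the "~number of items" work per level comes from.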
Worst case: the pivot is the smallest or largest value. The absolute worst is when this happens at basically every step ->
the partition doesn't divide the subarray in half at all, it just finds the correct place for one value at a time. Everything
ends up on one side (all smaller or all larger values), not split between both.
In this case, the array has to be partitioned (roughly) n times (actually n-1).
Worst case performance: around n^2 operations
(more precisely, about n*(n-1)/2 comparisons).
The worst case usually occurs when the array is nearly sorted in either direction (with a simple first- or last-element pivot).
In that case, insertion sort would actually be extremely fast.
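The gap between the two cases can be made visible by counting comparisons. This is a sketch (written iteratively, to avoid hitting recursion limits on the degenerate case); the exact counts are my own illustration and depend on the partition scheme, here Lomuto with a last-element pivot:

```python
import random

def quicksort_count(a):
    """Sort a copy of a and return the number of element comparisons made."""
    a = list(a)
    comparisons = 0
    stack = [(0, len(a) - 1)]      # iterative version of the recursion
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            comparisons += 1
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        stack.append((lo, i - 1))
        stack.append((i + 1, hi))
    return comparisons

n = 1000
shuffled = random.sample(range(n), n)
print(quicksort_count(shuffled))           # typically ~2*n*ln(n), i.e. on the order of 14,000
print(quicksort_count(list(range(n))))     # exactly n*(n-1)/2 = 499,500 for this scheme
```

On the already-sorted input every pivot is the largest value, so each partition peels off exactly one element, giving the (n-1) + (n-2) + ... + 1 = n*(n-1)/2 total from above.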
Typically we are much more concerned about the worst case than the best case, for example when there are very
specific time requirements. If you're unlucky, you may run into huge problems, so you choose a
different algorithm instead.
The worst case is very unlikely to happen, especially with larger arrays. The average case is much, much closer to the best case.
Think of randomly arranged data. What is the likelihood for it to be sorted? VERY, VERY SMALL.
Can fix a partially sorted array so it's ready for Quicksort. Multiple ways to do that.
Go one pass through the array and r
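The cut-off sentence above may be describing randomizing the array, though that is an assumption on my part. One standard way to do it in a single pass is the Fisher-Yates shuffle, which removes any pre-existing order before Quicksort runs:

```python
import random

def shuffle_in_place(a):
    """Fisher-Yates: one pass through the array, swapping each position
    with a uniformly random index at or before it. (Illustrative sketch;
    Python's stdlib already offers random.shuffle for this.)"""
    for i in range(len(a) - 1, 0, -1):
        j = random.randint(0, i)   # 0 <= j <= i
        a[i], a[j] = a[j], a[i]
```

After shuffling, a nearly sorted input behaves like random data, so the n^2 worst case becomes extremely unlikely regardless of the original order.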