Best Case, Average Case, Worst Case
How performance varies for different inputs of the same size, based on the organization of the data.
May be asked what an array should look like to be the best/average/worst case for an algorithm.
Although useful, it can be difficult to determine the exact number of operations performed by an algorithm.
An alternative is to focus on a barometer instruction:
the instruction executed the most times in an algorithm (there may be more than one). The number of times it
is executed is usually proportional to the running time of the algorithm.
Cost Functions for Searching
If the array isn't sorted, use linear search (it's the only way): start with the first item and compare each
item to the target; if it is found, return true; if the end is reached without finding it, return false.
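The search just described can be sketched as below; the comparison counter is added purely for illustration, and the function name is mine, not from the notes:

```python
def linear_search(arr, x):
    """Scan arr left to right; return (found, number of comparisons made)."""
    comparisons = 0
    for item in arr:
        comparisons += 1      # one execution of the arr[i] == x check
        if item == x:
            return True, comparisons
    return False, comparisons

# Target first: 1 comparison. Target last or absent: n comparisons.
```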
Linear search's barometer instruction: the equality check (comparison) arr[i] == x. The increment i++ or the loop-bound check would also serve.
Best case: 1. The target is the first item in the array, so the comparison arr[i] == x is made once.
Worst case: the target is the last item in the array, or not in the array at all. In both cases the entire array must be scanned, making n comparisons.
Average case: to calculate it, we need to know how often the worst cases arise. We can make assumptions about
this, although they might not hold for a particular use. Essentially: how often will you look for
something that isn't in the array? How often will the target be at the end of the array? How often at the
start?
Assume: the target is not in the array half the time.
Therefore, half the time the entire array has to be checked (probability 0.5).
Also assume: equal probability of the target being at any array location, i.e. probability 1/n that the target is at
some position i.
This costs i comparisons if the target is at the ith position.
Take the weighted average of these values to find the total expected number of comparisons:
E = 1*(1/n) + 2*(1/n) + ... + n*(1/n) = (n+1)/2
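A quick sanity check of this sum, using exact rational arithmetic so rounding can't hide an error (n = 100 is an arbitrary illustrative size):

```python
from fractions import Fraction

n = 100
# E = 1*(1/n) + 2*(1/n) + ... + n*(1/n)
expected = sum(i * Fraction(1, n) for i in range(1, n + 1))
print(expected == Fraction(n + 1, 2))   # True
```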
Target not in array: n comparisons
Target in array: (n+1)/2 comparisons
Take the weighted average: 0.5*n + 0.5*(n+1)/2 = (3n + 1)/4
So, under these assumptions, linear search performs (3n + 1)/4 comparisons on average.
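Combining the two cases can be checked the same way (again with an arbitrary n and exact fractions):

```python
from fractions import Fraction

n = 100
miss = Fraction(1, 2) * n                  # target absent: n comparisons
hit = Fraction(1, 2) * Fraction(n + 1, 2)  # target present: (n+1)/2 comparisons
overall = miss + hit
print(overall == Fraction(3 * n + 1, 4))   # True
```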
If the array were sorted, the average cost of an unsuccessful search could drop to about n/2 (stop as soon as we pass the position where the target would be). But then we shouldn't be using linear search at all; binary search would be faster.