
Best-case Complexity

Often when dealing with computer algorithms it is not only useful to know the worst-case complexity; it is also important to know the best case for an algorithm. This complexity can be split into two fields: one is time complexity, the other is space complexity. The definition using Ω(g(n)), read "Big Omega of g", can be applied to either time or space, so only one definition of best-case complexity is required. Informally, the best-case complexity is the largest g(n) such that T(n) = Ω(g(n)) (T(n) is in Ω(g(n))).
Best-case complexity b = sup(S), where S = { f | there exist c, B > 0 such that ∀ n ≥ B, g(n) ≥ c·f(n) } and g(n) is the running time of your algorithm.
In plain English, this means that the best-case complexity is the largest element of the set of all functions f such that g is in Ω(f(n)) for all n greater than the breaking point B.
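
To make the Ω definition above concrete, here is a minimal sketch in Python. The function g(n) = 3n + 5 and the witnesses c = 3, B = 1 are illustrative assumptions, not taken from the text; the sketch simply checks numerically that g(n) ≥ c·f(n) for every sampled n ≥ B, which is what g = Ω(f) requires.

# Illustrative only: suppose an algorithm's best-case running time is
# g(n) = 3n + 5 steps. Then g(n) is Omega(n): the witnesses c = 3 and
# B = 1 satisfy g(n) >= c*f(n) for all n >= B, with f(n) = n.

def g(n):
    return 3 * n + 5   # assumed best-case step count (hypothetical)

def f(n):
    return n           # candidate lower-bound function

c, B = 3, 1
if all(g(n) >= c * f(n) for n in range(B, 10_000)):
    print("g(n) >= c*f(n) for every sampled n >= B, consistent with g = Omega(f)")

Trying a bad pair of witnesses (say c = 4) makes the check fail for large n, which highlights that the definition only demands that some c and B exist, not every choice.
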
Best-case complexity is not always the best measure of how well an algorithm performs. More often, computer scientists care about the worst-case complexity and, after that, how well an algorithm will run on average. An algorithm that has a best case of Ω(1) is pointless if its worst case is O(2^n).
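
As a hedged illustration of that point, the sketch below uses linear search (my example, not the text's; its worst case is O(n) rather than O(2^n), but the lesson is the same). It runs in Ω(1) in the best case yet examines every element in the worst case, so quoting only its best case would be misleading.

from typing import Sequence

def linear_search(items: Sequence[int], target: int) -> int:
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:   # best case: found on the very first comparison
            return i
    return -1                 # worst case: every element was examined

data = list(range(1_000_000))
print(linear_search(data, 0))    # best case: one comparison, prints 0
print(linear_search(data, -1))   # worst case: one million comparisons, prints -1
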