Prove that the running time of an algorithm is \(\Theta(g(n))\) if and only if its worst-case running time is \(O(g(n))\) and its best-case running time is \(\Omega(g(n))\).
Proving this mathematically boils down to the same work as Exercise 3.1-5, since this statement also relies on Theorem 3.1. What follows is an informal explanation of why the statement is true.
Suppose \(f(n) = \Theta(g(n))\), and let \(f_w(n)\) and \(f_b(n)\) denote the worst-case and best-case running times of the algorithm, respectively. Based on Theorem 3.1 we know that \(f(n) = O(g(n))\) and \(f(n) = \Omega(g(n))\).
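For reference, the argument leans on the standard asymptotic definitions from Chapter 3; each statement below asserts that positive constants \(c_1\), \(c_2\) and \(n_0\) exist making the inequality hold:

\[
\begin{aligned}
f(n) = O(g(n)) &\iff 0 \le f(n) \le c_2\, g(n) \text{ for all } n \ge n_0, \\
f(n) = \Omega(g(n)) &\iff 0 \le c_1\, g(n) \le f(n) \text{ for all } n \ge n_0, \\
f(n) = \Theta(g(n)) &\iff 0 \le c_1\, g(n) \le f(n) \le c_2\, g(n) \text{ for all } n \ge n_0.
\end{aligned}
\]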
Saying that \(f(n)\) has an upper bound of \(O(g(n))\) is the same as saying that even in the worst case, \(f(n)\) will not exceed a constant multiple of \(g(n)\). This indicates that \(f_w(n) = O(g(n))\). Similar logic shows that \(f_b(n) = \Omega(g(n))\).
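To sketch this direction with the definitions above: the \(\Theta\)-bound applies to the running time on every input of size \(n \ge n_0\), including the best-case and worst-case inputs, and the running time on any input is sandwiched between the best and worst cases, so

\[ c_1\, g(n) \;\le\; f_b(n) \;\le\; f(n) \;\le\; f_w(n) \;\le\; c_2\, g(n). \]

Reading off the outer inequalities gives exactly \(f_w(n) = O(g(n))\) and \(f_b(n) = \Omega(g(n))\).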
The key here is that \(f(n)\) stands for the whole set of running times over all inputs, every one of which lies between \(f_b(n)\) and \(f_w(n)\) inclusive. The use of \(\Theta\)-, \(O\)- and \(\Omega\)-notation is a mathematically rigorous way of describing the region this set of functions is bounded by, with the worst-case running time defining the upper bound and the best-case running time defining the lower bound.
If we suppose instead that \(f(n) = \Theta(g(n))\) but that, say, \(f_w(n) \neq O(g(n))\), then we are saying that \(f_w(n)\) exceeds the upper bound represented by \(O(g(n))\). Since \(f_w(n)\) is a possible running time of \(f(n)\), this in turn implies that \(f(n)\) also exceeds the upper bound imposed by \(\Theta(g(n))\). This is clearly a contradiction, and the same argument applies to the claim that \(f_b(n) \neq \Omega(g(n))\).
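Written out with the constants, the contradiction is immediate. Suppose \(f(n) = \Theta(g(n))\) with witnesses \(c_2\) and \(n_0\), so that \(f(n) \le c_2\, g(n)\) for all \(n \ge n_0\). If \(f_w(n) \neq O(g(n))\), then no constant serves as a witness; in particular, for \(c_2\) itself there exist arbitrarily large \(n\) with

\[ f_w(n) > c_2\, g(n). \]

But \(f_w(n)\) is the running time on some actual input of size \(n\), so any such \(n \ge n_0\) violates \(f(n) \le c_2\, g(n)\). The symmetric argument with \(c_1\) refutes the claim \(f_b(n) \neq \Omega(g(n))\).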