Time
The time taken by an execution of an algorithm is the number of instructions in its execution trace. For an algorithm \(A\) and an input \(x\), we define \(t'_A(x)\) to be the time taken by the algorithm \(A\) on input \(x\).
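To make this concrete, here is a minimal Python sketch that counts the loop iterations of Euclid's algorithm, assuming that is the \(\texttt{gcd}\) referred to in the exercise below. Each iteration stands in for a constant number of instructions, so the count tracks \(t'_{\texttt{gcd}}(x)\) up to a constant factor.

```python
def gcd_steps(x):
    """Count the loop iterations of Euclid's algorithm on x = (a, b).

    Each iteration executes a fixed number of instructions, so this
    count is a proxy for t'_gcd(x) up to a constant factor.
    """
    a, b = x
    steps = 0
    while b != 0:
        a, b = b, a % b
        steps += 1
    return steps
```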
The worst-case time complexity of an algorithm \(A\) is defined as:
\begin{equation} t_A(n) = \max_{x : |x| = n} t'_A(x) \end{equation}
where \(|x|\) denotes the size of the instance \(x\).
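As a sketch of what taking the maximum over all instances of one size means in practice, the following brute force computes \(t_{\texttt{gcd}}(n)\) for small \(n\), using the step counter above. It assumes the size of an instance \((a, b)\) is the bit length of \(\max(a, b)\); a different size convention would change the numbers but not the idea.

```python
def t_gcd(n):
    """Brute-force the worst-case step count over all instances of size n.

    Assumes |(a, b)| is the bit length of max(a, b); this size
    convention is an assumption for illustration, not a fixed definition.
    """
    hi = 1 << n
    return max(
        gcd_steps((a, b))
        for a in range(1, hi)
        for b in range(1, hi)
        if max(a, b).bit_length() == n
    )

# For example, watch the worst case grow with the input size:
# print([t_gcd(n) for n in range(1, 8)])
```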
Why this definition? We want to be able to argue that our programs take at most \(t_A(n)\) time on any input of size \(n\). That is, when analyzing performance, we are not interested in the time taken on easy inputs, or even on specific input strings; we are only interested in how the running time grows with the input size. From this perspective, worst-case time complexity is the right measure of the performance of an algorithm.
Exercise
- Find two inputs \(x\) and \(y\) such that \(|x| = |y|\) and \(t'_{\texttt{gcd}}(x) \neq t'_{\texttt{gcd}}(y)\).
- What is \(t_{\texttt{gcd}}(n)\)?