Asymptotic complexity of algorithms
Asymptotic complexity vs. run time
- Roughly speaking, the run time of an algorithm is measured by the number of operations executed by the machine.
- However, in theoretical considerations, the run time is estimated by the algorithm's asymptotic complexity. In asymptotic considerations, constant multiplicative factors are completely disregarded. Consequently, it does not matter how fast the machine is or how cleverly the algorithm is implemented.
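The disregard of constant factors can be illustrated with a small sketch (the operation counts below are a simplified model, not real machine instructions): two implementations of the same task whose counts differ only by a constant factor, and which therefore have the same asymptotic complexity.

```python
def sum_ops_v1(values):
    """Sum the values; count roughly one operation per element."""
    total, ops = 0, 0
    for v in values:
        total += v  # one addition per element
        ops += 1
    return total, ops

def sum_ops_v2(values):
    """Same result, but a less smart implementation doing roughly
    three operations per element (comparison, index access, increment)."""
    total, ops, i = 0, 0, 0
    while i < len(values):
        total += values[i]
        i += 1
        ops += 3
    return total, ops

data = list(range(1000))
t1, o1 = sum_ops_v1(data)
t2, o2 = sum_ops_v2(data)
assert t1 == t2          # identical results
assert o2 == 3 * o1      # counts differ by a constant factor only,
                         # so both implementations are in Theta(n)
```

The ratio of the two counts is a constant (here 3) independent of the input size, which is exactly the kind of difference asymptotic analysis ignores.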
Characterizing parameters
For the inputs of an algorithmic problem, one or more (numerical) characterizing parameters are to be selected, from which the total size of an input can be computed (or, at least, quite tightly bounded from above and from below).
Examples:
- The number of elements in sets, maps, and sequences and the (maximum) size of an element in a set, map, or sequence.
- The number of nodes and the number of edges/arcs in a graph.
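For a graph, the two characterizing parameters can be read off directly from its representation. A minimal sketch, assuming a hypothetical undirected graph stored as an adjacency list:

```python
# Hypothetical undirected graph as an adjacency list;
# each edge appears in both endpoints' neighbor lists.
adjacency = {
    "a": ["b", "c"],
    "b": ["a"],
    "c": ["a"],
}

num_nodes = len(adjacency)
# Each undirected edge is stored twice, so halve the total neighbor count.
num_edges = sum(len(nbrs) for nbrs in adjacency.values()) // 2

assert (num_nodes, num_edges) == (3, 2)
```

Together, these two parameters bound the total size of the input quite tightly: an adjacency-list representation occupies space proportional to the number of nodes plus the number of edges.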
Asymptotic complexity of an algorithm
- For a selection of [math]\displaystyle{ k }[/math] characterizing parameters for an algorithmic problem, let [math]\displaystyle{ \mathcal{I}(n_1,\ldots,n_k) }[/math] denote the set of all instances for which these parameters assume the values [math]\displaystyle{ n_1,\ldots,n_k }[/math].
- For an algorithm for this algorithmic problem, the asymptotic complexity of the algorithm with respect to this selection of characterizing parameters is defined as follows.
- In the worst case: by the function [math]\displaystyle{ f_\max:\mathbb{N}^k\rightarrow\mathbb{R}^+ }[/math] such that [math]\displaystyle{ f_{\max} (n_1,\ldots,n_k) }[/math] is the maximum number of operations executed by the algorithm for any instance in [math]\displaystyle{ \mathcal{I}(n_1,\ldots,n_k) }[/math].
- In the best case: by the function [math]\displaystyle{ f_\min:\mathbb{N}^k\rightarrow\mathbb{R}^+ }[/math] such that [math]\displaystyle{ f_{\min}(n_1,\ldots,n_k) }[/math] is the minimum number of operations executed by the algorithm for any instance in [math]\displaystyle{ \mathcal{I}(n_1,\ldots,n_k) }[/math].
- In the average case: For this case, a probability distribution [math]\displaystyle{ P_{n_1,\ldots,n_k} }[/math] over each set [math]\displaystyle{ \mathcal{I}(n_1,\ldots,n_k) }[/math] is required. For short, let [math]\displaystyle{ P:=(P_{n_1,\ldots,n_k})_{n_1,\ldots,n_k\in\mathbb{N}} }[/math]. Then the asymptotic average complexity is the function [math]\displaystyle{ f_P:\mathbb{N}^k\rightarrow\mathbb{R}^+ }[/math] such that [math]\displaystyle{ f_P(n_1,\ldots,n_k) }[/math] is the expected number of operations executed by the algorithm, taken over all instances in [math]\displaystyle{ \mathcal{I}(n_1,\ldots,n_k) }[/math], with respect to the distribution [math]\displaystyle{ P_{n_1,\ldots,n_k} }[/math].
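The three cases can be made concrete with a standard textbook example, linear search, with a single characterizing parameter: the number of elements [math]\displaystyle{ n }[/math]. The sketch below counts comparisons; for the average case it assumes, as a hypothetical choice of [math]\displaystyle{ P }[/math], the uniform distribution over the [math]\displaystyle{ n }[/math] possible target positions.

```python
def comparisons_linear_search(seq, target):
    """Number of element comparisons a straightforward linear search performs."""
    ops = 0
    for x in seq:
        ops += 1
        if x == target:
            return ops
    return ops

n = 1000
instance = list(range(n))

best = comparisons_linear_search(instance, 0)       # target in front: f_min(n) = 1
worst = comparisons_linear_search(instance, n - 1)  # target at the end: f_max(n) = n

# Average case under the uniform distribution over the n target positions:
# (1 + 2 + ... + n) / n = (n + 1) / 2
avg = sum(comparisons_linear_search(instance, t) for t in range(n)) / n

assert best == 1 and worst == n and avg == (n + 1) / 2
```

Note that the average-case function depends on the chosen distribution [math]\displaystyle{ P }[/math]; a non-uniform distribution over the instances would generally yield a different [math]\displaystyle{ f_P }[/math].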
Bounds on the asymptotic complexity
Consider an algorithmic problem, a selection of [math]\displaystyle{ k }[/math] characterizing parameters, and an algorithm for this algorithmic problem. Moreover, let [math]\displaystyle{ f_\max }[/math], [math]\displaystyle{ f_\min }[/math], and [math]\displaystyle{ f_P }[/math] be defined as above.
For a function [math]\displaystyle{ g:\mathbb{N}^k\rightarrow\mathbb{R}^+ }[/math], we say that the algorithm is in [math]\displaystyle{ \oplus(g) }[/math] in the
- worst case, if [math]\displaystyle{ f_{\max}\in\oplus(g) }[/math],
- best case, if [math]\displaystyle{ f_{\min}\in\oplus(g) }[/math],
- average case with respect to [math]\displaystyle{ P }[/math], if [math]\displaystyle{ f_P\in\oplus(g) }[/math],
where "[math]\displaystyle{ \oplus }[/math]" is one of "[math]\displaystyle{ \mathcal{O} }[/math]", "[math]\displaystyle{ o }[/math]", "[math]\displaystyle{ \Omega }[/math]", and "[math]\displaystyle{ \Theta }[/math]".
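A minimal sketch of checking such a bound numerically (numeric evidence for a finite range, not a proof): take the hypothetical complexity functions of a linear search, [math]\displaystyle{ f_{\max}(n)=n }[/math], [math]\displaystyle{ f_{\min}(n)=1 }[/math], and [math]\displaystyle{ f_P(n)=(n+1)/2 }[/math] under the uniform distribution, and test the defining inequality of [math]\displaystyle{ \mathcal{O} }[/math] with a concrete constant [math]\displaystyle{ c }[/math] and threshold [math]\displaystyle{ n_0 }[/math].

```python
def holds_big_o(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all n0 <= n <= n_max.

    This is only numeric evidence over a finite range; membership in O(g)
    requires the inequality for ALL n >= n0."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Hypothetical complexity functions of a linear search on n elements:
f_max = lambda n: n             # worst case
f_min = lambda n: 1             # best case
f_P   = lambda n: (n + 1) / 2   # average case, uniform distribution

g = lambda n: n
assert holds_big_o(f_max, g, c=1, n0=1)            # evidence: f_max in O(n)
assert holds_big_o(f_P, g, c=1, n0=1)              # evidence: f_P in O(n)
assert holds_big_o(f_min, lambda n: 1, c=1, n0=1)  # evidence: f_min in O(1)
```

So with respect to worst case, this algorithm would be in [math]\displaystyle{ \mathcal{O}(n) }[/math], and with respect to best case in [math]\displaystyle{ \mathcal{O}(1) }[/math]; the [math]\displaystyle{ \Omega }[/math] and [math]\displaystyle{ \Theta }[/math] statements follow analogously from lower-bound inequalities.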