Algorithm Analysis
An algorithm is a finite sequence of step-by-step, unambiguous instructions for solving a given problem.
In computer science, multiple algorithms are often available for solving the same problem (for example, the sorting problem can be solved with insertion sort, selection sort, quick sort, and many more).
- Algorithm analysis helps us to determine which algorithm is most efficient in terms of time and space consumed.
- The goal of the analysis of algorithms is to compare algorithms (or solutions) mainly in terms of running time, but also in terms of other factors (e.g., memory, developer effort).
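As a concrete example of one of the sorting algorithms mentioned above, here is a minimal sketch of insertion sort in Python (the function name and exact structure are illustrative, not taken from the text):

```python
def insertion_sort(arr):
    """Sort a list in place by growing a sorted prefix one element at a time."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift elements of the sorted prefix that are greater than key
        # one slot to the right to make room for key.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
```

Other algorithms, such as selection sort or quick sort, solve the same problem with different amounts of work, which is exactly why we need a way to compare them.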
In the traditional study of algorithms, there are two main criteria for judging the merits of algorithms:
- Correctness: does the algorithm produce a solution to the problem in a finite number of steps?
- Efficiency: how many resources (in terms of memory and time) does executing the algorithm take?
To compare algorithms, we can define a few objective measures:
- Execution time: not a good measure, since execution times are specific to a particular computer (hardware dependent).
- Number of statements executed: not a good measure, since the count varies with the programming language as well as the style of the individual programmer.
- Rate of growth: suppose we express the running time of a given algorithm as a function of the input size n (i.e., f(n)) and compare these functions. This kind of comparison is independent of the machine, programming style, etc.
The rate at which the running time increases as a function of the input size is called the rate of growth.
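The idea of comparing functions rather than clock times can be sketched by counting basic operations instead of timing them (the function names below are illustrative assumptions, not from the text):

```python
def count_ops_linear(n):
    """A single loop over n items performs f(n) = n basic operations."""
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def count_ops_quadratic(n):
    """Nested loops over n items perform f(n) = n^2 basic operations,
    which grows far faster than n as the input size increases."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops
```

Doubling n doubles the work of the linear version but quadruples the work of the quadratic one, and this relationship holds on any machine and in any language.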
Asymptotic Notations
To analyze an algorithm we need a common mathematical notation for comparing running times, and that forms the basis for asymptotic analysis/notation.
Asymptotic notations are mathematical tools to represent the time complexity of algorithms for asymptotic analysis.
The following three asymptotic notations are most commonly used to represent the time complexity of algorithms:
- Θ notation: a tight bound on the running time
- Big O notation: an upper bound, often used to describe the worst case
- Ω notation: a lower bound, often used to describe the best case
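The standard formal definitions of these notations can be sketched as follows (constants c and n₀ are the usual witnesses from the textbook definitions):

```latex
% f(n) = O(g(n)): g is an asymptotic upper bound on f
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 : 0 \le f(n) \le c\, g(n) \quad \forall n \ge n_0

% f(n) = \Omega(g(n)): g is an asymptotic lower bound on f
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 : 0 \le c\, g(n) \le f(n) \quad \forall n \ge n_0

% f(n) = \Theta(g(n)): g bounds f from above and below (a tight bound)
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))
```

For example, if f(n) = 3n + 2, then f(n) = O(n) with c = 4 and n₀ = 2, since 3n + 2 ≤ 4n for all n ≥ 2.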