Different algorithms may complete the same task with different sets of instructions, taking more or less time, space, or effort than others.
The analysis and study of algorithms is a discipline in Computer Science with a strong mathematical background. It often relies on theoretical analysis of pseudocode.
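As a concrete illustration, here is a minimal Python sketch (the task and function names are chosen just for this example) of two algorithms that complete the same task, summing the integers 1 through n, with very different amounts of work:

    def sum_loop(n):
        # Performs n additions: work grows linearly with n.
        total = 0
        for i in range(1, n + 1):
            total += i
        return total

    def sum_formula(n):
        # Gauss's closed form: constant work regardless of n.
        return n * (n + 1) // 2

    assert sum_loop(1000) == sum_formula(1000)  # same task, same answer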
Complexity of Algorithms
A way to classify how efficient an algorithm is compared to alternative ones. The focus is on how execution time increases with the size of the data set to be processed.
Algorithmic complexity
A measure of how long an algorithm would take to complete given an input of size n.
If an algorithm has to scale, it should compute the result within a finite and practical time bound even for large values of n. For this reason, complexity is calculated asymptotically as n approaches infinity.
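To see why the asymptotic view matters, it helps to tabulate a few common growth rates side by side; a quick sketch (the input sizes are arbitrary):

    import math

    # Compare common growth rates as n increases.
    for n in (10, 100, 1000, 10000):
        print(f"n={n:>6}  log2(n)={math.log2(n):6.1f}  "
              f"n*log2(n)={n * math.log2(n):11.0f}  n^2={n**2:>12}")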
Time Factor
Time is measured by counting the number of key operations, such as comparisons in a sorting algorithm.
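For example, a simple sorting routine can be instrumented to count its key operation. A minimal sketch (insertion sort chosen arbitrarily, with a comparison counter added for illustration):

    def insertion_sort(items):
        comparisons = 0
        for i in range(1, len(items)):
            j = i
            # Each test of adjacent elements below is one key comparison.
            while j > 0:
                comparisons += 1
                if items[j - 1] <= items[j]:
                    break
                items[j - 1], items[j] = items[j], items[j - 1]
                j -= 1
        return comparisons

    # A reversed list of 100 elements forces 100*99/2 = 4950 comparisons.
    print(insertion_sort(list(range(100, 0, -1))))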
Space Factor
Space is measured by counting the maximum memory space required by the algorithm.
The complexity of an algorithm, f(n), gives the running time and/or the storage space required by the algorithm in terms of n, the size of the input data.
Complexity analysis doesn't concern itself with actual execution time, which depends on processor speed, instruction set, disk speed, compiler, etc. Complexity is about the algorithm itself, the way it processes the data to solve a given problem.
Space Complexity
A formula that predicts how much memory space is required for the successful execution of the algorithm.
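As an illustration, the peak memory of two approaches to the same task can be compared with Python's standard tracemalloc module; a sketch (summing squares is just a stand-in task):

    import tracemalloc

    def sum_squares_list(n):
        squares = [i * i for i in range(n)]  # stores n values: O(n) extra space
        return sum(squares)

    def sum_squares_running(n):
        total = 0
        for i in range(n):  # one running total: O(1) extra space
            total += i * i
        return total

    for fn in (sum_squares_list, sum_squares_running):
        tracemalloc.start()
        fn(100_000)
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        print(f"{fn.__name__}: peak {peak:,} bytes")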
Time Complexity
A formula for the total time required to execute the algorithm.
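For instance, the total operation count of a nested loop can be written down exactly; a small sketch (the counter tallies how often the inner statement runs):

    def count_pairs(items):
        # The inner statement runs once per pair (i, j) with i < j,
        # so T(n) = n*(n-1)/2, which is quadratic in n.
        ops = 0
        n = len(items)
        for i in range(n):
            for j in range(i + 1, n):
                ops += 1
        return ops

    n = 50
    assert count_pairs(list(range(n))) == n * (n - 1) // 2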
The most important properties of a good algorithm are correctness and generality.
Empirical Method
Measuring the efficiency of an algorithm by implementing it as a program in some programming language, running it on some computer, and measuring the elapsed time from start to completion.
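A minimal sketch of the empirical method using Python's standard timeit module (the built-in sorted is just a stand-in workload):

    import timeit

    # Time the same workload at two input sizes. The elapsed time depends
    # on the machine, interpreter, and system load, which is exactly the
    # weakness of the empirical method.
    for n in (10_000, 100_000):
        data = list(range(n, 0, -1))
        elapsed = timeit.timeit(lambda: sorted(data), number=10)
        print(f"n={n:>7}: {elapsed:.4f} s for 10 runs")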
Theoretical Method
Measuring the efficiency of an algorithm by detaching it from its runtime environment and simulating it using a theoretical one-processor machine.
Algorithms should be given the same input to compare them fairly.
Input size
The size of the input is a factor in the time and space needed by an algorithm to execute.
Three Different Algorithm Times
Worst case
Best case
Average case
The worst case analysis is often the most important consideration in algorithm design.
Average case
The average time needed to execute the algorithm over some finite set of inputs, all of size n.
The average-case time of an algorithm for input of size n is very useful, but treat it with care: what counts as "average"?
The worst-case analysis is often the most useful. It gives a performance guarantee: the algorithm will behave no worse than this bound on any input, because every other input performs as well as, or better than, the worst case.
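Linear search makes the three cases concrete; a sketch that counts comparisons (names chosen for the example):

    def linear_search(items, target):
        comparisons = 0
        for value in items:
            comparisons += 1
            if value == target:
                break
        return comparisons

    data = list(range(1000))
    print(linear_search(data, 0))    # best case: 1 comparison
    print(linear_search(data, -1))   # worst case: 1000 comparisons (absent)
    # Average case over targets present in the list: about n/2 comparisons.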
Asymptotic notation
Mathematical tools to represent the complexities of algorithms for asymptotic analysis.
Big Oh Notation (O(g(n)))
Provides an upper bound on the runtime of an algorithm. It is the most commonly used notation.
Big Omega Notation (Ω(g(n)))
Provides a lower bound on the runtime of an algorithm. It is the least commonly used notation.
Big Theta Notation (Θ(g(n)))
Provides a tight bound on the runtime of an algorithm. It bounds a function from above and below, so it characterizes the exact asymptotic growth rate.
Little Oh Notation (o(g(n)))
Denotes an upper bound that is not asymptotically tight: f(n) grows strictly more slowly than g(n).
Little Omega Notation (ω(g(n)))
Denotes a lower bound that is not asymptotically tight: f(n) grows strictly faster than g(n).
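For reference, one standard formulation of these definitions (f and g map positive integers to positive reals; c and n0 are constants):

    f(n) = O(g(n))  iff  there exist c > 0 and n0 with f(n) <= c*g(n) for all n >= n0
    f(n) = Ω(g(n))  iff  there exist c > 0 and n0 with f(n) >= c*g(n) for all n >= n0
    f(n) = Θ(g(n))  iff  f(n) = O(g(n)) and f(n) = Ω(g(n))
    f(n) = o(g(n))  iff  for every c > 0 there is an n0 with f(n) < c*g(n) for all n >= n0
    f(n) = ω(g(n))  iff  for every c > 0 there is an n0 with f(n) > c*g(n) for all n >= n0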