The total number of permutations of a set of n values is the number of ways they can be arranged. This is n factorial (n!), the product of all positive integers less than or equal to n
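A minimal sketch of this in Python, using the standard library to count and enumerate the arrangements of a small example set:

```python
# The number of orderings of n items is n!; itertools.permutations
# enumerates every arrangement explicitly.
import math
from itertools import permutations

items = ["a", "b", "c", "d"]
n = len(items)

assert math.factorial(n) == 4 * 3 * 2 * 1 == 24
assert len(list(permutations(items))) == math.factorial(n)
```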
We can compare algorithms by their complexity relative to the size of the problem. It is important to consider how this scales as the problem grows, not just one input size in isolation
Some algorithms will be more efficient time-wise: they will take less time to complete
Some algorithms will be more efficient space-wise: they will take up less memory
Algorithms should be designed to run as quickly as possible and to take up as little memory as possible, though the two goals are often in tension (a time-space trade-off)
A function maps a domain to a codomain, where the domain is the set of all possible inputs and the codomain is the set of all possible outputs.
The range is the specific set of outputs the function actually produces for a given set of inputs
Complexity refers to the number of steps taken (time complexity) and the amount of memory used (space complexity) by an algorithm
Constant complexity is represented as O(1) and means an algorithm will always take the same amount of time regardless of data set size. Likely has no loops.
Logarithmic complexity is represented with O(log n) and means the data set is halved with each pass.
Linear complexity is represented with O(n) and means performance is directly proportional to size of the data set. Likely has a while or for loop.
Polynomial complexity is represented with O(n^k) for some constant k, e.g. O(n^2) - performance is proportional to a power of the data set size, so efficiency falls off quickly as the data set grows. Likely has a nested loop.
Exponential complexity is represented with O(2^n) and means the running time doubles with each additional element in the data set. Very inefficient. Likely has recursion.
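The classes above can be sketched with toy functions, one per class. These are illustrative examples, not the only shapes each class can take:

```python
from typing import List

def first_item(data: List[int]) -> int:
    # O(1): one step regardless of len(data); no loops.
    return data[0]

def binary_search(data: List[int], target: int) -> bool:
    # O(log n): the (sorted) data set is halved on each pass.
    lo, hi = 0, len(data) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if data[mid] == target:
            return True
        if data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

def contains(data: List[int], target: int) -> bool:
    # O(n): a single loop over the data set.
    for item in data:
        if item == target:
            return True
    return False

def has_duplicate(data: List[int]) -> bool:
    # O(n^2): a nested loop compares every pair of elements.
    for i in range(len(data)):
        for j in range(i + 1, len(data)):
            if data[i] == data[j]:
                return True
    return False

def subsets(data: List[int]) -> List[List[int]]:
    # O(2^n): recursion where the result doubles with each added element.
    if not data:
        return [[]]
    rest = subsets(data[1:])
    return rest + [[data[0]] + s for s in rest]
```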
To express complexity in Big O notation, keep only the term with the highest order (largest exponent or fastest-growing factor) and drop all constants - for example, 3n^2 + 5n + 10 simplifies to O(n^2)
Generally, iteration will be faster than recursion, since each recursive call carries the overhead of a new stack frame
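A small sketch of the same computation written both ways. The recursion-limit comment reflects CPython's default behaviour:

```python
def factorial_recursive(n: int) -> int:
    # Each call adds a stack frame; very deep inputs can exceed
    # Python's default recursion limit (around 1000 frames).
    if n <= 1:
        return 1
    return n * factorial_recursive(n - 1)

def factorial_iterative(n: int) -> int:
    # A loop uses constant stack space and avoids call overhead.
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

assert factorial_recursive(10) == factorial_iterative(10) == 3628800
```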
Some problems require more memory than is feasible, rendering them effectively unsolvable in practice. Similarly, if a problem has exponential complexity, adding more or faster processors makes very little difference. These are hardware limitations.
A tractable problem is one that can be solved in polynomial time or better. These are practical and useful.
An intractable problem is one that cannot be solved in polynomial time or better. They are still possible to solve but it is impractical to do so in any reasonable time frame.
Heuristic methods sacrifice an optimal answer for one that is 'good enough', making intractable problems usable in practice
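A sketch of this idea using the travelling salesman problem: checking every tour is O(n!), but the nearest-neighbour heuristic finds a 'good enough' tour in O(n^2). The city coordinates here are made up for illustration:

```python
import math
from typing import Dict, List, Tuple

# Hypothetical cities as (x, y) coordinates.
cities: Dict[str, Tuple[float, float]] = {
    "A": (0, 0), "B": (1, 5), "C": (4, 1), "D": (6, 6),
}

def distance(p: Tuple[float, float], q: Tuple[float, float]) -> float:
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbour_tour(start: str) -> List[str]:
    # Greedy heuristic: always visit the closest unvisited city next.
    # O(n^2) instead of the O(n!) of checking every possible tour,
    # at the cost of possibly missing the optimal route.
    tour = [start]
    unvisited = set(cities) - {start}
    while unvisited:
        current = cities[tour[-1]]
        nearest = min(unvisited, key=lambda c: distance(current, cities[c]))
        tour.append(nearest)
        unvisited.remove(nearest)
    return tour
```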
There are some problems that cannot be solved algorithmically.
The Halting Problem asks whether it is possible to write a program that can determine, for any given program and input, whether that program will halt - without simply running it (which could loop forever). The answer is no.
The Halting Problem proves there are some problems which can never be solved by computers.
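The classic proof is by contradiction, and can be sketched in code. The `halts` function below is hypothetical - the whole point is that no such function can exist:

```python
def halts(program, argument) -> bool:
    # Assumed, for the sake of contradiction, to always answer
    # correctly whether program(argument) eventually halts.
    ...

def paradox(program):
    # Do the opposite of whatever `halts` predicts for a program
    # run on its own source.
    if halts(program, program):
        while True:      # predicted to halt -> loop forever
            pass
    else:
        return           # predicted to loop -> halt immediately

# paradox(paradox) would have to halt if and only if it does not
# halt, so the assumed `halts` cannot exist.
```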