2.3 Algorithms

Cards (10)

  • Big-O notation is a way of describing the time complexity of an algorithm. This relates to how the speed of the algorithm changes as the number of items in the input increases. For example, the bubble sort algorithm is O(n^2) (also known as polynomial time), which means that doubling the size of a list will quadruple the time it takes to be sorted, and a list ten times longer will take 100 times longer to sort (10^2).
  • O(1), or constant time, means that an algorithm will always take the same amount of time regardless of how many items are in the input.
  • O(log n), or logarithmic time, means that as the number of items given to an algorithm increases, the time it takes to run grows only very gradually. The binary search algorithm runs in logarithmic time: even with a million items to search, the worst case is only around 20 steps (see the binary search sketch after this list).
  • O(n), or linear time, means that the time an algorithm takes to run is directly proportional to the number of items in the input. An example of this kind of algorithm is linear search: searching a list of 100 items will, on average, take 10 times longer than searching a list of 10 items (see the linear search sketch after this list).
  • O(x^n), or exponential time, means that the time taken for an algorithm to run increases extremely quickly as the number of inputs increases. Alongside O(n^2) (polynomial time), these kinds of algorithms are too slow to use for anything other than small amounts of data.
  • Bubble sort is an algorithm which makes a series of passes through a list; during each pass it repeatedly compares adjacent pairs of elements (starting with the first two) and swaps them if they're in the wrong order. It runs in polynomial time (O(n^2)); see the bubble sort sketch after this list.
  • Insertion sort is an algorithm which steps through a list, looking at each element and moving that element as far to the left as necessary (by swapping) so that it's in the correct sorted position. As the algorithm continues, the sorted part of the list on the left grows and the unsorted part on the right shrinks, until the list is fully sorted. It runs in polynomial time (O(n^2)); see the insertion sort sketch after this list.
  • The merge sort algorithm sorts a list by first splitting it into two smaller lists. This process is repeated recursively until there is a series of lists each containing one or zero elements. The lists are then merged back together in sorted order by repeatedly taking the smaller of the two front elements. It runs in linearithmic time (O(n log n)); see the merge sort sketch after this list.
  • The quick sort algorithm sorts a list by first choosing an element to be the pivot; it then moves items in the list such that all values less than or equal to the pivot are to its left and all other values are to its right. The process is repeated recursively on the sub-lists either side of the pivot until the list is fully sorted. On average it runs in linearithmic time (O(n log n)); see the quick sort sketch after this list.
  • Space complexity is like time complexity (e.g. binary search runs in logarithmic time), but for memory usage. A search algorithm which always uses the same amount of memory, regardless of the size of the array to be searched, would have a space complexity of O(1) (constant space).
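
The following is a minimal Python sketch of the binary search mentioned in the O(log n) card. It assumes the list is already sorted in ascending order; the names binary_search and target, and the choice of returning None when the value is absent, are illustrative choices rather than anything from the cards. Each loop iteration halves the portion of the list still being searched, which is where the logarithmic behaviour comes from.

    def binary_search(items, target):
        """Return the index of target in a sorted list, or None if it is absent."""
        low, high = 0, len(items) - 1
        while low <= high:
            mid = (low + high) // 2       # middle of the current search range
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                low = mid + 1             # discard the left half
            else:
                high = mid - 1            # discard the right half
        return None

For example, binary_search([1, 3, 5, 7], 5) returns 2.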
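
For comparison, a minimal sketch of the linear search from the O(n) card: every element is checked in turn, so the number of comparisons grows in direct proportion to the length of the list. The function name and the None return value are again illustrative choices.

    def linear_search(items, target):
        """Check each element in turn; worst case is one comparison per element."""
        for index, value in enumerate(items):
            if value == target:
                return index
        return None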
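
A minimal sketch of bubble sort as described in the card: repeated passes over the list, swapping adjacent pairs that are out of order. Sorting in place and in ascending order are assumptions of this sketch, and the early exit when a pass makes no swaps is a common optimisation rather than something the card requires.

    def bubble_sort(items):
        """Sort a list in place with repeated passes, swapping adjacent out-of-order pairs."""
        n = len(items)
        for _ in range(n - 1):                # at most n - 1 passes are needed
            swapped = False
            for i in range(n - 1):
                if items[i] > items[i + 1]:   # adjacent pair in the wrong order
                    items[i], items[i + 1] = items[i + 1], items[i]
                    swapped = True
            if not swapped:                   # no swaps made: the list is already sorted
                break
        return items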
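
A minimal sketch of insertion sort matching the card's description: each element is swapped leftwards until it sits in the correct position within the sorted part of the list. In-place sorting and ascending order are assumptions of this sketch.

    def insertion_sort(items):
        """Sort a list in place by moving each element left into its sorted position."""
        for i in range(1, len(items)):        # items[:i] is already sorted
            j = i
            while j > 0 and items[j - 1] > items[j]:
                items[j - 1], items[j] = items[j], items[j - 1]   # swap leftwards
                j -= 1
        return items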
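
A minimal sketch of merge sort following the card: split recursively until the lists have one or zero elements, then merge sorted lists back together by repeatedly taking the smaller front element. Returning a new list rather than sorting in place is a choice of this sketch.

    def merge_sort(items):
        """Recursively split the list in half, then merge the sorted halves."""
        if len(items) <= 1:                   # lists of 0 or 1 elements are already sorted
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid])
        right = merge_sort(items[mid:])
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:           # take the smaller front element first
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])               # one half is exhausted; append the rest
        merged.extend(right[j:])
        return merged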
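
A minimal sketch of quick sort following the card: choose a pivot, move smaller-or-equal values to its left and larger values to its right, then recurse on each side. Picking the first element as the pivot and building new lists (rather than partitioning in place) are simplifying choices of this sketch.

    def quick_sort(items):
        """Pick a pivot, partition around it, then recursively sort each side."""
        if len(items) <= 1:
            return items
        pivot = items[0]                          # one common (simple) pivot choice
        rest = items[1:]
        left = [x for x in rest if x <= pivot]    # values no greater than the pivot
        right = [x for x in rest if x > pivot]    # values greater than the pivot
        return quick_sort(left) + [pivot] + quick_sort(right)

For example, quick_sort([3, 1, 2]) returns [1, 2, 3].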