OCR A-Level Computer Science
2.3 Algorithms to solve problems and standard algorithms
2.3.1 Analysis and design of algorithms
What does algorithm efficiency measure in terms of time and space requirements?
Time and space complexity
Match the key factor with its effect on algorithm efficiency:
Input size ↔️ Larger inputs require more time and space
Algorithm design ↔️ Some designs are inherently more efficient
Hardware environment ↔️ Faster processors improve efficiency
Space complexity is important for memory-constrained systems.
True
Space complexity refers to how the memory usage of an algorithm scales with the size of its input.
What is the Big O notation for an algorithm with logarithmic space complexity?
O(log n)
What is the Big O notation for an algorithm with linear-logarithmic space complexity?
O(n log n)
What are the three common algorithm design strategies?
Divide and Conquer, Greedy, Dynamic Programming
Space complexity measures how memory usage scales with input size.
The Merge Sort algorithm is an example of the Divide and Conquer strategy.
Space complexity considers the additional memory required by the algorithm beyond the input.
Match the time complexity with its Big O notation:
Constant ↔️ O(1)
Logarithmic ↔️ O(log n)
Linear ↔️ O(n)
Quadratic ↔️ O(n^2)
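To make these classes concrete, here is a minimal Python sketch (not taken from the cards); the function names and the data parameter are illustrative only, and each function shows the typical loop shape behind one complexity class.

```python
def constant_example(data):      # O(1): one step regardless of input size
    return data[0]

def logarithmic_example(data):   # O(log n): halve the remaining range each step
    steps = 0
    n = len(data)
    while n > 1:
        n //= 2
        steps += 1
    return steps

def linear_example(data):        # O(n): touch every element once
    total = 0
    for value in data:
        total += value
    return total

def quadratic_example(data):     # O(n^2): consider every pair of elements
    pairs = 0
    for a in data:
        for b in data:
            pairs += 1
    return pairs
```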
Which algorithm has logarithmic space complexity?
Binary search
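A recursive implementation makes this visible: each call halves the remaining range, so the recursion depth, and therefore the extra stack space, grows as O(log n). The sketch below is illustrative Python rather than OCR reference pseudocode; an iterative version would need only O(1) extra space.

```python
def binary_search(items, target, low=0, high=None):
    # Recursive binary search on a sorted list; recursion depth is about log2(n)
    if high is None:
        high = len(items) - 1
    if low > high:
        return -1                      # target not present
    mid = (low + high) // 2
    if items[mid] == target:
        return mid
    elif items[mid] < target:
        return binary_search(items, target, mid + 1, high)
    else:
        return binary_search(items, target, low, mid - 1)

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
```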
What is the main advantage of dynamic programming?
Finding optimal solutions
The Divide and Conquer strategy is efficient for large problems and can be
parallelized
.
True
Greedy algorithms always find the optimal solution.
False
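A standard way to see this is a change-making counterexample. The sketch below uses a hypothetical coin set {1, 3, 4} (chosen purely for illustration): the greedy rule of always taking the largest coin gives three coins for an amount of 6, while the optimal answer is two.

```python
def greedy_change(amount, coins):
    # Greedy change-making: always take the largest coin that still fits
    coins = sorted(coins, reverse=True)
    used = []
    for coin in coins:
        while amount >= coin:
            amount -= coin
            used.append(coin)
    return used

print(greedy_change(6, [1, 3, 4]))   # [4, 1, 1] -> 3 coins, but 3 + 3 uses only 2
```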
Dynamic Programming is highly efficient for specific kinds of problem, namely those with overlapping subproblems and optimal substructure.
True
Dynamic Programming solves overlapping subproblems only once.
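A memoised Fibonacci function is the usual illustration: the cache means each subproblem fib(k) is evaluated once and then reused rather than recomputed. This is a minimal Python sketch, not a prescribed implementation.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Each distinct fib(k) is computed once, cached, and reused;
    # without the cache the naive recursion repeats work exponentially.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(40))   # 102334155, with every subproblem evaluated only once
```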
Match the algorithm design strategy with its characteristic:
Divide and Conquer ↔️ Parallelization
Greedy ↔️ Fast execution
Dynamic Programming ↔️ Optimal solutions
Time complexity describes how the running time of an algorithm scales with the size of its input.
What does time complexity measure?
How running time scales
Arrange the time complexities from fastest to slowest:
1️⃣ O(1)
2️⃣ O(log n)
3️⃣ O(n)
4️⃣ O(n log n)
5️⃣ O(n^2)
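Plugging a concrete input size into each class shows why this ordering holds. The sketch below uses n = 1,000 purely for illustration; the printed values are rough operation counts, not measured running times.

```python
import math

n = 1_000
print("O(1)       :", 1)
print("O(log n)   :", round(math.log2(n)))        # ~10
print("O(n)       :", n)                          # 1,000
print("O(n log n) :", round(n * math.log2(n)))    # ~9,966
print("O(n^2)     :", n * n)                      # 1,000,000
```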
Considering space complexity is essential for resource optimization.
True
The memory usage of a logarithmic space complexity algorithm grows logarithmically with the input size.
The memory usage of a quadratic space complexity algorithm grows quadratically with the input size.
Dynamic programming solves overlapping subproblems only once to avoid redundant calculations.
True
Match the algorithm design strategy with its example:
Divide and Conquer ↔️ Merge Sort
Greedy ↔️ Dijkstra's Algorithm
Dynamic Programming ↔️ Fibonacci Sequence
Algorithm efficiency is measured through time and space complexity analysis.
True
What does worst-case analysis help ensure in algorithm design?
Handling even the most demanding (worst-case) inputs efficiently
Considering space complexity is crucial for resource optimization, especially with large datasets.
True
What are the three main algorithm design strategies?
Divide and Conquer, Greedy, Dynamic Programming
Which sorting algorithm is an example of Divide and Conquer?
Merge Sort
Which algorithm is an example of the Greedy strategy?
Dijkstra's Algorithm
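Dijkstra's algorithm is greedy because at each step it settles the unvisited node with the smallest known distance from the start, a choice that is safe only when edge weights are non-negative. The sketch below is a simplified Python version using a binary heap; the three-node graph is made up for illustration.

```python
import heapq

def dijkstra(graph, start):
    # graph: {node: [(neighbour, weight), ...]}
    distances = {node: float("inf") for node in graph}
    distances[start] = 0
    queue = [(0, start)]                   # (distance so far, node)
    while queue:
        dist, node = heapq.heappop(queue)  # greedy: closest unsettled node
        if dist > distances[node]:
            continue                       # stale entry, node already settled
        for neighbour, weight in graph[node]:
            new_dist = dist + weight
            if new_dist < distances[neighbour]:
                distances[neighbour] = new_dist
                heapq.heappush(queue, (new_dist, neighbour))
    return distances

graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 2)], "C": []}
print(dijkstra(graph, "A"))   # {'A': 0, 'B': 1, 'C': 3}
```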
What is a classic example of Dynamic Programming?
Fibonacci Sequence
What is a key aspect of Dynamic Programming?
Optimal Substructure
Which sequence is a classic example of Dynamic Programming?
Fibonacci Sequence
What is a limitation of the Greedy algorithm?
May not find the global optimum
Merge Sort has a time complexity of O(n log n) and a space complexity of O(n).
True
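A sketch of Merge Sort (assuming a straightforward top-down Python version, not OCR reference pseudocode) shows where both figures come from: the list is halved about log n times, each level merges n elements, and the merged sublists account for the O(n) auxiliary space.

```python
def merge_sort(items):
    # Divide: split until sublists have length 0 or 1 (already sorted)
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Combine: merge the two sorted halves into one sorted list (O(n) extra space)
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```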
Binary Search requires the input list to be sorted.
True
What is the time complexity of binary search?
O(log n)
What is the time complexity of merge sort?
O(n log n)