Summary
✔ Algorithm analysis helps evaluate performance using time and space complexity.
✔ Order of growth determines how an algorithm scales with input size.
✔ Asymptotic analysis generalizes efficiency across different hardware.
✔ Worst-case complexity (Big O) is the most important for real-world scenarios.
✔ Big O, Big Ω, and Big Θ describe different efficiency bounds.
✔ Mathematical foundations (asymptotic behavior, recurrences) help analyze complex algorithms.
An algorithm is a finite set of steps to solve a problem. Analysis is the process of comparing algorithms with respect to the time and space they require.
Algorithm analysis is a systematic approach to evaluating the efficiency of algorithms. It involves determining the computational complexity of an algorithm, which refers to the amount of time and space resources it requires to execute. This analysis helps us understand how an algorithm’s performance scales with the size of the input data.
Importance of Algorithm Analysis
- Predicting Performance: Helps estimate the algorithm’s behavior on large datasets.
- Comparing Algorithms: Enables choosing the most efficient algorithm for a given problem.
- Optimizing Algorithms: Identifies bottlenecks and areas for improvement.
- Theoretical Limits: Provides insights into the fundamental limitations of computation.
Order of Growth
- Let f(n) and g(n) be the times taken by two algorithms, where n >= 0 and f(n), g(n) >= 0.
- A function f(n) is said to grow faster than g(n) if \( \lim_{n \to \infty} \frac{g(n)}{f(n)} = 0 \) (equivalently, \( \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty \)).
- Example 1: \( 4n^2 + 3n + 100 \)
- After ignoring lower-order terms, we get: \( 4n^2 \)
- After ignoring constants, we get: \( n^2 \)
- Hence the order of growth is: \( n^2 \)
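The effect of dropping lower-order terms and constants can be checked numerically; a minimal sketch (the names f and g are illustrative):

```python
# Sketch: compare f(n) = 4n^2 + 3n + 100 against its order of
# growth g(n) = n^2. As n grows, the ratio f(n)/g(n) approaches
# the constant 4, so both functions grow at the same rate: n^2.
def f(n):
    return 4 * n**2 + 3 * n + 100

def g(n):
    return n**2

for n in [10, 100, 10_000]:
    print(n, f(n) / g(n))
```

Because the ratio tends to a nonzero constant rather than 0 or infinity, neither function grows strictly faster than the other.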
Worst, Best, and Average Cases
- Worst Case: The maximum time or space taken by an algorithm over all inputs of a given size. It guarantees the algorithm won't perform worse than this. It is denoted by Big-O.
- Best Case: The minimum time or space taken for the most favorable input. It is denoted by Big-Omega (Ω).
- Average Case: The expected time or space taken by the algorithm, averaged over all possible inputs. It is commonly expressed with Theta (Θ).
\(T_{avg} = \frac{\text {Total time for all cases}}{\text{Number of Cases}}\)
Example: Linear Search
Problem: Searching for a key k in an array of size n.
# Algorithm
def linear_search(arr, key):
    for i in range(len(arr)):
        if arr[i] == key:
            return i
    return -1
Analysis:
- Worst Case: O(n) (key is not in the array or is the last element).
- Best Case: O(1) (key is the first element).
- Average Case: Assuming the key is equally likely to be at any position, the expected number of comparisons is
\(T_{avg} = \frac{n+1}{2} = O(n) \)
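The average-case figure can be verified empirically; a small sketch that counts comparisons over every possible key position (the helper name is illustrative):

```python
# Sketch: check that linear search makes (n+1)/2 comparisons on
# average when the key is equally likely to be at any position.
def comparisons_to_find(arr, key):
    """Number of element comparisons linear search performs."""
    count = 0
    for value in arr:
        count += 1
        if value == key:
            break
    return count

n = 101
arr = list(range(n))
avg = sum(comparisons_to_find(arr, k) for k in arr) / n
print(avg)  # (n + 1) / 2 = 51.0
```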
Space and Time Complexities
Space Complexity: The total memory space required by an algorithm, including input, auxiliary, and output space.
Example: Merge Sort has a space complexity of \( O(n) \).
Time Complexity: The computational time an algorithm takes to execute as a function of the input size.
Example: Bubble Sort has \( O(n^2) \) time complexity in the worst case.
Example: Bubble Sort
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False  # early-exit flag; gives the O(n) best case
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # no swaps in a full pass: array is sorted
            break
    return arr
Analysis:
- Worst Case: \( O(n^2) \) (when the array is in reverse order).
- Best Case: O(n) (when the array is already sorted).
- Average Case: \( O(n^2) \).
- Space Complexity: O(1) (in-place sorting).
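The O(n) best case depends on stopping as soon as a full pass makes no swaps; a sketch that counts passes to make the difference visible (the helper name is illustrative):

```python
# Sketch: count full passes of bubble sort with an early-exit flag.
# An already-sorted array needs one pass (best case O(n)); a
# reverse-sorted array runs every pass (worst case O(n^2)).
def bubble_sort_passes(arr):
    n = len(arr)
    passes = 0
    for i in range(n):
        passes += 1
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return passes

print(bubble_sort_passes([1, 2, 3, 4, 5]))  # sorted input: 1 pass
print(bubble_sort_passes([5, 4, 3, 2, 1]))  # reversed input: 5 passes
```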
Mathematical Background
Asymptotic Behavior
Asymptotic notation is used to describe the growth of an algorithm’s runtime or space as the input size n approaches infinity.
Big O Notation
Upper bound: the algorithm takes at most this long (commonly used for the worst case).
Omega Notation
Lower bound: the algorithm takes at least this long (commonly used for the best case).
Theta Notation
Tight bound: both an upper and a lower bound (commonly used for the average case).
Example: Binary Search
Binary Search divides the search interval into halves repeatedly. Its time complexity is: \(T(n) = O (\log n) \)
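A minimal iterative sketch of binary search on a sorted array (names are illustrative):

```python
# Sketch: iterative binary search on a sorted array.
# Each step halves the search interval, giving O(log n) time.
def binary_search(arr, key):
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == key:
            return mid
        elif arr[mid] < key:
            lo = mid + 1   # key can only lie in the right half
        else:
            hi = mid - 1   # key can only lie in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))  # -1
```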
Solving Recurrences
Recurrences are equations that describe the running time of recursive algorithms.
Substitution Method
Guess the solution and prove using mathematical induction.
Example:
Given
\(T(n) = 2T (\frac{n}{2})+n \)
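The substitution proof can be carried through by guessing \( T(n) = O(n \log n) \); a sketch of the inductive step (logs base 2):

```latex
% Guess: T(n) \le c\, n \log n for some constant c \ge 1.
% Inductive step: assume the bound holds for n/2 and substitute:
\begin{aligned}
T(n) &= 2T\!\left(\tfrac{n}{2}\right) + n \\
     &\le 2 \cdot c \tfrac{n}{2} \log \tfrac{n}{2} + n \\
     &= c\, n \log n - c\, n + n \\
     &\le c\, n \log n \quad \text{for } c \ge 1,
\end{aligned}
% hence T(n) = O(n \log n).
```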
