Time Complexity:
Time complexity is a measure of how the running time of an algorithm increases with the size of the input. It helps us understand how an algorithm's performance scales as the input grows. Time complexity is typically expressed using big O notation, which provides an upper bound on the growth rate of the algorithm. Here are some common time complexity classifications:
- Constant Time (O(1)): An algorithm with constant time complexity always takes the same amount of time to execute, regardless of the input size. It is considered the most efficient.

```python
def print_first_element(arr):
    print(arr[0])
```

- Linear Time (O(n)): An algorithm with linear time complexity has a running time that grows linearly with the input size.

```python
def print_elements(arr):
    for element in arr:
        print(element)
```

- Quadratic Time (O(n^2)): An algorithm with quadratic time complexity has a running time that grows with the square of the input size.

```python
def print_all_pairs(arr):
    for i in range(len(arr)):
        for j in range(len(arr)):
            print(arr[i], arr[j])
```
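To make these growth rates concrete, here is a minimal sketch (an illustration added here, not part of the examples above) that mirrors the loop structure of print_elements and print_all_pairs but counts iterations instead of printing; the helper names linear_work and quadratic_work are hypothetical. Doubling the input size should roughly double the linear count and quadruple the quadratic count.

```python
def linear_work(n):
    # Same loop structure as print_elements: one pass over n items -> O(n)
    count = 0
    for _ in range(n):
        count += 1
    return count

def quadratic_work(n):
    # Same loop structure as print_all_pairs: nested passes -> O(n^2)
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

for n in (100, 200, 400):
    # Doubling n doubles the linear count but quadruples the quadratic count.
    print(n, linear_work(n), quadratic_work(n))
```

Counting iterations rather than measuring wall-clock time keeps the comparison deterministic and free of I/O overhead.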
Space Complexity:
Space complexity refers to the amount of memory an algorithm requires to solve a problem. It measures how the memory usage grows with the input size. Like time complexity, space complexity is also expressed using big O notation. Here are some common space complexity classifications:
- Constant Space (O(1)): An algorithm with constant space complexity uses a fixed amount of memory regardless of the input size.

```python
def sum_of_two_numbers(a, b):
    result = a + b
    return result
```

- Linear Space (O(n)): An algorithm with linear space complexity increases its memory usage linearly with the input size.

```python
def copy_array(arr):
    new_arr = []
    for element in arr:
        new_arr.append(element)
    return new_arr
```

- Quadratic Space (O(n^2)): An algorithm with quadratic space complexity uses memory that grows with the square of the input size.

```python
def create_matrix(n):
    matrix = [[0] * n for _ in range(n)]
    return matrix
```
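As a rough way to observe this growth, the sketch below (assuming copy_array and create_matrix from the examples above are defined in the same file; total_size is a hypothetical helper) uses sys.getsizeof to approximate the memory held by each result. Note that sys.getsizeof reports only the container's own size, not a full deep measurement, so the numbers are indicative rather than exact.

```python
import sys

def total_size(matrix):
    # Approximate size: the outer list plus each row's list object.
    # (The integer element objects themselves are not counted.)
    return sys.getsizeof(matrix) + sum(sys.getsizeof(row) for row in matrix)

for n in (100, 200, 400):
    copied = copy_array(list(range(n)))   # O(n) extra memory
    matrix = create_matrix(n)             # O(n^2) extra memory
    print(n, sys.getsizeof(copied), total_size(matrix))
```

As n doubles, the size reported for the copied list roughly doubles, while the approximate total for the matrix roughly quadruples, matching the O(n) and O(n^2) classifications above.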