What is Complexity?

Complexity in computer science refers to the measure of resources an algorithm requires to solve a problem. These resources are typically time (how long the algorithm takes to run) and space (how much memory the algorithm uses). Complexity analysis helps evaluate an algorithm's efficiency, especially as the input size grows, allowing developers to choose the most suitable algorithm for a task.

What is Time Complexity?

Time complexity describes the amount of time an algorithm takes to complete as a function of the input size n. It focuses on the number of operations executed, ignoring constant factors and hardware-specific details. Time complexity is usually expressed using Big O notation, which provides an upper bound on the growth rate in the worst-case scenario.

  • Key Points:
    • Measures how the runtime scales with input size.
    • Expressed as O(f(n)), e.g., O(n), O(n^2), etc.
    • Common when analyzing loops, recursive calls, or operations such as comparisons and assignments.
    • Example: A linear search checking each element in an array has O(n) time complexity because it may need to inspect all n elements.
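The contrast between linear and quadratic growth can be sketched by counting how often the innermost statement runs. A minimal sketch (class and method names are illustrative, not from the book):

```java
public class OperationCounting {
    // O(n) time: the loop body executes once per element.
    public static long sumArray(int[] array) {
        long sum = 0;
        for (int value : array) {
            sum += value; // runs n times
        }
        return sum;
    }

    // O(n^2) time: the inner comparison runs roughly n*(n-1)/2 times.
    public static int countDuplicatePairs(int[] array) {
        int pairs = 0;
        for (int i = 0; i < array.length; i++) {
            for (int j = i + 1; j < array.length; j++) {
                if (array[i] == array[j]) {
                    pairs++;
                }
            }
        }
        return pairs;
    }
}
```

Doubling the input roughly doubles the work in sumArray but roughly quadruples it in countDuplicatePairs, which is exactly the difference O(n) versus O(n^2) captures.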

What is Space Complexity?

Space complexity measures the amount of memory or storage an algorithm uses as a function of the input size n. It includes both the auxiliary space (extra memory allocated during execution, like temporary arrays or recursion stacks) and the input space (memory for the input data).

  • Key Points:
    • Focuses on memory usage, including variables, data structures, and recursion stacks.
    • Also expressed using Big O notation.
    • Some analyses count only auxiliary space and exclude the input itself, for instance when the algorithm modifies the input in place.
    • Example: A recursive algorithm like naive Fibonacci has O(n) space complexity because the recursion stack is at most n frames deep, even though its time complexity is O(2^n).
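The naive Fibonacci example above can be sketched as follows (class name is illustrative):

```java
public class NaiveFibonacci {
    // Time complexity: O(2^n) - each call spawns two further recursive calls.
    // Space complexity: O(n) - the deepest chain of pending calls
    // (fib(n) -> fib(n-1) -> ... -> fib(1)) is at most n stack frames.
    public static long fib(int n) {
        if (n <= 1) {
            return n; // base cases: fib(0) = 0, fib(1) = 1
        }
        return fib(n - 1) + fib(n - 2);
    }
}
```

Note that the two recursive calls do not exist on the stack at the same time; the stack only ever holds one root-to-leaf chain, which is why the space is O(n) rather than O(2^n).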

Examples with Java Code

Below are examples illustrating time and space complexity, with comments for clarity.

1. Linear Search (O(n) Time, O(1) Space)

public class LinearSearchExample {
    public static int linearSearch(int[] array, int target) {
        // Loop through each element
        for (int i = 0; i < array.length; i++) {
            if (array[i] == target) {
                return i; // Return index if found
            }
        }
        return -1; // Not found
    }
    
    public static void main(String[] args) {
        int[] array = {10, 20, 30, 40, 50};
        int target = 30;
        
        // Time complexity: O(n) - may need to check all n elements
        // Space complexity: O(1) - uses only the loop index i as extra memory
        int result = linearSearch(array, target);
        System.out.println("Index of " + target + ": " + result);
    }
}
  • Time Complexity: O(n), as it may iterate through all n elements.
  • Space Complexity: O(1), as it uses only a constant amount of extra memory (the loop index i).

2. Merge Sort (O(n log n) Time, O(n) Space)

public class MergeSortExample {
    private static void merge(int[] array, int left, int mid, int right) {
        int n1 = mid - left + 1;
        int n2 = right - mid;
        int[] leftArray = new int[n1]; // Temporary array
        int[] rightArray = new int[n2]; // Temporary array
        
        for (int i = 0; i < n1; i++) leftArray[i] = array[left + i];
        for (int j = 0; j < n2; j++) rightArray[j] = array[mid + 1 + j];
        
        int i = 0, j = 0, k = left;
        while (i < n1 && j < n2) {
            if (leftArray[i] <= rightArray[j]) {
                array[k++] = leftArray[i++];
            } else {
                array[k++] = rightArray[j++];
            }
        }
        while (i < n1) array[k++] = leftArray[i++];
        while (j < n2) array[k++] = rightArray[j++];
    }
    
    public static void mergeSort(int[] array, int left, int right) {
        if (left < right) {
            int mid = left + (right - left) / 2;
            mergeSort(array, left, mid); // Recurse on left half
            mergeSort(array, mid + 1, right); // Recurse on right half
            merge(array, left, mid, right); // Merge results
        }
    }
    
    public static void main(String[] args) {
        int[] array = {12, 11, 13, 5, 6, 7};
        
        // Time complexity: O(n log n) - log n levels of recursion, each doing O(n) work
        // Space complexity: O(n) - temporary arrays for merging
        mergeSort(array, 0, array.length - 1);
        
        System.out.print("Sorted array: ");
        for (int num : array) System.out.print(num + " ");
    }
}
  • Time Complexity: O(n log n), as the recursion divides the array into log n levels, and each level involves O(n) merging work.
  • Space Complexity: O(n), due to the temporary arrays (leftArray and rightArray) used during merging.

Key Differences Between Time and Space Complexity

  • Time Complexity:
    • Concerns the number of operations or runtime.
    • Affected by loops, recursion, or function calls.
    • Example: A nested loop over n elements results in O(n^2) time.
  • Space Complexity:
    • Concerns memory usage.
    • Affected by variables, data structures, or recursion stacks.
    • Example: Creating a new array of size n results in O(n) space.
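The two measures often trade off against each other. A classic illustration is memoizing the naive Fibonacci recursion: spending O(n) extra memory on a cache reduces the runtime from O(2^n) to O(n). A minimal sketch (class name is illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class MemoizedFibonacci {
    // Time complexity: O(n) - each value 0..n is computed only once.
    // Space complexity: O(n) - the memo table plus the recursion stack.
    private static long fib(int n, Map<Integer, Long> memo) {
        if (n <= 1) {
            return n;
        }
        Long cached = memo.get(n);
        if (cached != null) {
            return cached; // reuse the previously computed result
        }
        long result = fib(n - 1, memo) + fib(n - 2, memo);
        memo.put(n, result);
        return result;
    }

    public static long fib(int n) {
        return fib(n, new HashMap<>());
    }
}
```

With memoization, fib(50) completes almost instantly, whereas the naive O(2^n) version would make on the order of billions of calls; the price is the O(n) cache.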

Why Analyze Complexity?

  • Scalability: Predicts how an algorithm performs with large inputs.
  • Trade-offs: Helps balance time vs. space (e.g., a faster algorithm might use more memory).
  • Optimization: Guides developers to choose or design efficient algorithms.

Next: Common Algorithmic Complexities with Examples