Algorithm Efficiency
Algorithm efficiency is measured in two dimensions: time efficiency and space efficiency.
Big O Notation
Big O notation mathematically describes the asymptotic behavior of a function: it gives an estimate of an algorithm's growth rate rather than an exact operation count. The rules for deriving Big O are:
- Replace every additive constant in the runtime function with the constant 1.
- In the modified function, keep only the highest-order term.
- If the highest-order term's coefficient is not 1, drop the coefficient. The result is the Big O order.
For example, F(N) = 2N² + 3N + 10 reduces to O(N²).
Time Complexity
Time complexity is an estimate of the number of basic operations an algorithm performs as a function of its input size. Rather than the exact count, we keep the term with the greatest impact on growth and express it in Big O notation.
Example 1: Nested Loops
void ProcessMatrix(int dimension) {
    int total = 0;
    for (int row = 0; row < dimension; ++row) {
        for (int col = 0; col < dimension; ++col) {
            ++total;
        }
    }
    for (int idx = 0; idx < 2 * dimension; ++idx) {
        ++total;
    }
    int constant = 10;
    while (constant--) {
        ++total;
    }
    printf("%d\n", total);
}
The operation count function is F(N) = N² + 2N + 10. Applying Big O rules yields O(N²).
Example 2: Linear Loop
void LinearProcess(int size) {
    int counter = 0;
    for (int idx = 0; idx < 2 * size; ++idx) {
        ++counter;
    }
    int fixed = 10;
    while (fixed--) {
        ++counter;
    }
}
F(N) = 2N + 10. Complexity is O(N).
Example 3: Two Independent Loops
void DualLoop(int paramA, int paramB) {
    int tally = 0;
    for (int i = 0; i < paramA; ++i) {
        ++tally;
    }
    for (int j = 0; j < paramB; ++j) {
        ++tally;
    }
}
With M = paramA and N = paramB, the operation count is F = M + N. Since neither term dominates in general, the complexity is O(M + N).
Example 4: Constant Time Loop
void FixedOperation() {
    int result = 0;
    for (int iter = 0; iter < 100; ++iter) {
        ++result;
    }
    printf("%d\n", result);
}
F(N) = 100. Complexity is O(1). Any algorithm with a fixed, constant number of operations has O(1) time complexity.
Example 5: String Search (Linear Search)
const char* FindCharacter(const char* text, char target) {
    while (*text != '\0') {
        if (*text == target)
            return text;
        ++text;
    }
    return NULL;
}
For a string of length N, the worst case (target absent or at the end) requires N checks; the best case (target at the start) is O(1); the average case needs about N/2 checks, which is still O(N) after dropping the constant factor. By convention, we report the worst case: O(N).
Example 6: Bubble Sort
void BubbleSort(int* arr, int length) {
    assert(arr);  // requires <assert.h>
    for (int boundary = length; boundary > 0; --boundary) {
        int swapped = 0;
        for (int pos = 1; pos < boundary; ++pos) {
            if (arr[pos - 1] > arr[pos]) {
                Swap(&arr[pos - 1], &arr[pos]);  // Swap: helper that exchanges two ints
                swapped = 1;
            }
        }
        if (swapped == 0)  // a full pass with no swaps: already sorted
            break;
    }
}
The first pass does N-1 comparisons, the second N-2, and so on. This sum is (N-1) + (N-2) + ... + 1 = N(N-1)/2. Complexity is O(N²).
Example 7: Binary Search
int BinarySearch(int* sortedArr, int count, int value) {
    assert(sortedArr);
    int low = 0;
    int high = count;  // search the half-open interval [low, high)
    while (low < high) {
        int middle = low + ((high - low) >> 1);  // avoids overflow of (low + high) / 2
        if (sortedArr[middle] < value)
            low = middle + 1;
        else if (sortedArr[middle] > value)
            high = middle;
        else
            return middle;
    }
    return -1;
}
With each step, the search space halves. After X steps, N / 2^X = 1, so X = log₂ N. Complexity is O(log N).
Example 8: Recursive Factorial
long long ComputeFactorial(size_t num) {
    return num < 2 ? 1 : ComputeFactorial(num - 1) * num;  // base case covers 0! = 1! = 1
}
The function recurses N times, performing a constant-time operation each time. Complexity is O(N).
Space Complexity
Space complexity measures the extra storage space an algorithm uses relative to its input size, typically counting variables. It also uses Big O notation.
Example 1: Bubble Sort Space
// Same BubbleSort function as above
The extra variables (boundary, swapped, pos, plus the parameters arr and length) form a fixed set whose size does not depend on the input, so the space complexity is O(1).
Example 2: Fibonacci Array
long long* GenerateFibonacci(size_t n) {
    if (n == 0)
        return NULL;
    long long* sequence = (long long*)malloc((n + 1) * sizeof(long long));
    if (sequence == NULL)
        return NULL;  // allocation failure
    sequence[0] = 0;
    sequence[1] = 1;
    for (size_t idx = 2; idx <= n; ++idx) {
        sequence[idx] = sequence[idx - 1] + sequence[idx - 2];
    }
    return sequence;
}
The algorithm allocates an array of size n+1. The space complexity is O(n).
Example 3: Recursive Factorial Space
// Same ComputeFactorial function as above
Each recursive call creates a new stack frame using constant space. With N recursive calls, N stack frames exist simultaneously, leading to O(N) space complexity.
Common Time Complexities
Common orders, listed from slowest to fastest growth, include O(1), O(log N), O(N), and O(N²). O(1) is optimal; O(log N) is highly efficient and typical of divide-and-conquer algorithms such as binary search.
Key Principles
- Time complexity counts operations, not actual clock time.
- Space complexity counts additional variables or memory allocated, not total bytes.
- Time is cumulative (operations add up). Space can be reused (e.g., a loop variable uses the same memory each iteration).