Selection Sort
Selection sort finds the minimum element in the range 0 to N-1 and places it at the beginning, then repeats the process for the remaining unsorted portion.
public static void selectionSort(int[] data) {
    if (data == null || data.length < 2) {
        return;
    }
    for (int i = 0; i <= data.length - 2; i++) {
        int minPosition = i;
        for (int j = i + 1; j <= data.length - 1; j++) {
            if (data[minPosition] > data[j]) {
                minPosition = j;
            }
        }
        swap(data, i, minPosition);
    }
}

private static void swap(int[] data, int a, int b) {
    if (a == b) return; // required: XOR-swapping an index with itself would zero the element
    data[a] ^= data[b];
    data[b] ^= data[a];
    data[a] ^= data[b];
}
def selection_sort(data):
    if not data or len(data) < 2:
        return
    for i in range(len(data) - 1):
        min_pos = i
        for j in range(i + 1, len(data)):
            if data[min_pos] > data[j]:
                min_pos = j
        data[i], data[min_pos] = data[min_pos], data[i]
Bubble Sort
public static void bubbleSort(int[] data) {
    if (data == null || data.length < 2) {
        return;
    }
    for (int i = data.length - 1; i >= 1; i--) {
        for (int j = 0; j < i; j++) {
            if (data[j] > data[j + 1]) {
                swap(data, j, j + 1);
            }
        }
    }
}

private static void swap(int[] data, int a, int b) {
    if (a == b) return;
    data[a] ^= data[b];
    data[b] ^= data[a];
    data[a] ^= data[b];
}
def bubble_sort(data):
    if not data or len(data) < 2:
        return
    for i in range(len(data) - 1, 0, -1):
        for j in range(i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
Insertion Sort
Insertion sort builds the sorted portion incrementally—first making elements 0 through 1 sorted, then extending to include the next element, and so on.
public static void insertionSort(int[] data) {
    if (data == null || data.length < 2) {
        return;
    }
    for (int i = 1; i < data.length; i++) {
        for (int j = i - 1; j >= 0 && data[j] > data[j + 1]; j--) {
            swap(data, j, j + 1);
        }
    }
}

private static void swap(int[] data, int a, int b) {
    if (a == b) return;
    data[a] ^= data[b];
    data[b] ^= data[a];
    data[a] ^= data[b];
}
def insertion_sort(data):
    if not data or len(data) < 2:
        return
    for i in range(1, len(data)):
        key = data[i]
        j = i - 1
        while j >= 0 and data[j] > key:
            data[j + 1] = data[j]
            j -= 1
        data[j + 1] = key
Merge Sort
public static void mergeSort(int[] data) {
    if (data == null || data.length < 2) {
        return;
    }
    process(data, 0, data.length - 1);
}

private static void process(int[] data, int left, int right) {
    if (left == right) return;
    int middle = left + ((right - left) >> 1);
    process(data, left, middle);
    process(data, middle + 1, right);
    merge(data, left, middle, right);
}

private static void merge(int[] data, int left, int middle, int right) {
    int[] buffer = new int[right - left + 1];
    int idx = 0;
    int p1 = left;
    int p2 = middle + 1;
    while (p1 <= middle && p2 <= right) {
        buffer[idx++] = data[p1] <= data[p2] ? data[p1++] : data[p2++];
    }
    while (p1 <= middle) buffer[idx++] = data[p1++];
    while (p2 <= right) buffer[idx++] = data[p2++];
    for (idx = 0; idx < buffer.length; idx++) {
        data[left + idx] = buffer[idx];
    }
}
def merge_sort(data):
    if len(data) < 2:
        return data
    mid = len(data) // 2
    left = merge_sort(data[:mid])
    right = merge_sort(data[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
Time complexity: O(N·logN) via Master theorem. Space complexity: O(N).
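Concretely, halving the input and merging in linear time gives the recurrence

```latex
T(N) = 2\,T\!\left(\tfrac{N}{2}\right) + O(N)
```

which is case 2 of the Master theorem (a = b = 2, so f(N) = Θ(N) = Θ(N^{log_b a})), yielding T(N) = O(N·logN).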
Quick Sort
Version One
Partition the array around a pivot, with elements greater than or equal to the pivot on one side and smaller elements on the other. The code below applies this partition scheme to quickselect for the kth largest element, recursing only into the side that contains the answer.
import random

class Solution:
    def findKthLargest(self, nums, k):
        def partition(left, right):
            pivot_index = random.randint(left, right)
            nums[left], nums[pivot_index] = nums[pivot_index], nums[left]
            pivot_value = nums[left]
            while left < right:
                while left < right and nums[right] < pivot_value:
                    right -= 1
                nums[left] = nums[right]
                while left < right and nums[left] >= pivot_value:
                    left += 1
                nums[right] = nums[left]
            nums[left] = pivot_value
            return left

        def quickselect(left, right):
            if left == right:
                return nums[left]
            pos = partition(left, right)
            if pos + 1 == k:
                return nums[pos]
            elif pos + 1 < k:
                return quickselect(pos + 1, right)
            else:
                return quickselect(left, pos - 1)

        if len(nums) == 1:
            return nums[0]
        if len(set(nums)) == 1:
            return nums[0]
        return quickselect(0, len(nums) - 1)
Time complexity: a full quick sort averages O(N·logN); this quickselect variant averages O(N) because it recurses into only one partition, with an O(N²) worst case. Not stable.
Version Two (Recommended)
Partition into three regions: less than pivot, equal to pivot, and greater than pivot.
import random

class Solution:
    def findKthLargest(self, nums, k):
        def quickselect(arr, k):
            if not arr:
                return None
            pivot = random.choice(arr)
            left = [x for x in arr if x > pivot]
            middle = [x for x in arr if x == pivot]
            right = [x for x in arr if x < pivot]
            if k <= len(left):
                return quickselect(left, k)
            elif k > len(left) + len(middle):
                return quickselect(right, k - len(left) - len(middle))
            else:
                return pivot

        return quickselect(nums, k)
Three-way partitioning handles duplicate values cleanly, which is why this version needs no special case for all-equal input; the trade-off is O(N) extra space for the partition lists. Not stable.
Stability
Definition
A sorting algorithm is stable when elements with equal values maintain their relative order after sorting.
Consider an input sequence: 3 3 1. Marking them as 3① 3② 1, after sorting if 3① still precedes 3②, the algorithm exhibits stability.
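Python's built-in `sorted` is stable (it uses Timsort), which makes the definition easy to demonstrate. A minimal sketch, where the (value, tag) pairs are hypothetical:

```python
# Tag equal keys so their original order is observable.
items = [(3, "1"), (3, "2"), (1, "3")]

# Sort by value only; a stable sort must keep (3, "1") before (3, "2").
result = sorted(items, key=lambda pair: pair[0])
print(result)  # [(1, '3'), (3, '1'), (3, '2')]
```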
When Stability Matters
Multi-level sorting scenarios require stability to preserve results from previous sorting stages.
For example, ranking products by quality score (higher first), then by price (lower first when scores match). This requires two passes of comparison, and the second pass must not disrupt the ordering from the first pass.
When sorting complex custom objects in practice, a stable algorithm is typically required so that equal keys do not scramble a previously established order.
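With a stable sort, multi-level ordering falls out of two passes: sort by the secondary key first, then stably re-sort by the primary key. A minimal sketch of the product example (the product data is made up):

```python
products = [
    {"name": "A", "score": 5, "price": 30},
    {"name": "B", "score": 4, "price": 10},
    {"name": "C", "score": 5, "price": 20},
]

# Pass 1: secondary key -- lower price first.
products.sort(key=lambda p: p["price"])
# Pass 2: primary key -- higher score first. Stability keeps the
# price ordering intact among products with equal scores.
products.sort(key=lambda p: -p["score"])

print([p["name"] for p in products])  # ['C', 'A', 'B']
```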
Comparison Summary
No classic comparison-based algorithm achieves O(N·logN) time complexity, O(1) space complexity, and stability simultaneously.
| Algorithm | Time Complexity | Space Complexity | Stable |
|---|---|---|---|
| Selection | O(N²) | O(1) | No |
| Bubble | O(N²) | O(1) | Yes |
| Insertion | O(N²) | O(1) | Yes |
| Merge | O(N·logN) | O(N) | Yes |
| Quick | O(N·logN) | O(logN) | No |
| Heap | O(N·logN) | O(1) | No |
Stability Details
Algorithms that swap elements across non-adjacent positions typically lose stability.
Selection Sort
Locates the minimum across the entire unsorted region and swaps it to the sorted boundary. This wide-range exchange destroys relative ordering. Example: 3① 3② 1 3③ becomes 1 3② 3① 3③.
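Tagging the equal keys makes the long-range swap visible. A minimal sketch of selection sort over (value, tag) pairs, comparing only the value (the tags are just labels):

```python
def selection_sort_pairs(data):
    # Standard selection sort, comparing only the numeric key;
    # the tags reveal how the relative order of equal keys changes.
    for i in range(len(data) - 1):
        min_pos = i
        for j in range(i + 1, len(data)):
            if data[min_pos][0] > data[j][0]:
                min_pos = j
        data[i], data[min_pos] = data[min_pos], data[i]

pairs = [(3, "1"), (3, "2"), (1, ""), (3, "3")]
selection_sort_pairs(pairs)
print(pairs)  # [(1, ''), (3, '2'), (3, '1'), (3, '3')] -- 3(1) and 3(2) flipped
```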
Bubble Sort
Adjacent element swapping preserves stability.
Insertion Sort
Builds sorted regions incrementally, expanding from 0..1 to 0..2 to 0..3. Skipping the swap when encountering equal values maintains stability.
Merge Sort
Merging sorted subarrays. Taking elements from the left subarray first when values are equal preserves stability.
Quick Sort
Both versions (two-way and three-way partitioning) swap elements across non-adjacent positions, losing stability.
Heap Sort
No mechanism exists to preserve stability throughout the entire process.
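For reference, a minimal in-place heap sort sketch (max-heap with sift-down); the long-distance moves during heapification and extraction are exactly what destroys stability:

```python
def heap_sort(data):
    n = len(data)

    def sift_down(root, end):
        # Push data[root] down until the max-heap property holds in [0, end).
        while True:
            child = 2 * root + 1
            if child >= end:
                return
            if child + 1 < end and data[child + 1] > data[child]:
                child += 1
            if data[root] >= data[child]:
                return
            data[root], data[child] = data[child], data[root]
            root = child

    # Heapify: sift down every internal node, bottom up.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Repeatedly move the current maximum to the end and restore the heap.
    for end in range(n - 1, 0, -1):
        data[0], data[end] = data[end], data[0]
        sift_down(0, end)
```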
Optimization Considerations
Merge sort can achieve O(1) space complexity via in-place merging techniques, but the straightforward in-place merge degrades time complexity to O(N²). Implementing a stable quick sort is possible but involves complex code transformations.
Focus on standard implementations rather than exotic optimization variants.
Practical Engineering Considerations
Production systems evaluate two factors: dataset size and stability requirements.
For small datasets, O(N²) algorithms perform comparably to O(N·logN) variants thanks to lower constant factors, yet consume only O(1) extra space. Therefore, selection sort, bubble sort, and insertion sort are preferable for small inputs.
For large datasets requiring stability, merge sort dominates. When stability is unnecessary and space is constrained, heap sort or quick sort become the choices.
Quick sort's in-place partitioning makes it memory-efficient, while merge sort's predictable O(N·logN) guarantee and stability make it ideal for external sorting scenarios.
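These trade-offs are often combined: a divide-and-conquer sort that falls back to insertion sort below a small cutoff. A sketch of the idea (the cutoff of 16 is an arbitrary illustrative choice, not a tuned value):

```python
CUTOFF = 16  # below this size, insertion sort's low overhead wins

def hybrid_merge_sort(data):
    if len(data) <= CUTOFF:
        # Insertion sort: cheap on tiny inputs, and stable.
        for i in range(1, len(data)):
            key = data[i]
            j = i - 1
            while j >= 0 and data[j] > key:
                data[j + 1] = data[j]
                j -= 1
            data[j + 1] = key
        return data
    mid = len(data) // 2
    left = hybrid_merge_sort(data[:mid])
    right = hybrid_merge_sort(data[mid:])
    # Standard stable merge: take from the left run on ties.
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
```

Production libraries use the same idea: Python's Timsort and Java's dual-pivot quicksort both switch to insertion-style sorting on small runs.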