Welcome to the Algorithm Wiki
Explore foundational algorithms used in pathfinding, sorting, and decision-making.
Each section provides an overview, real-world applications, complexity analysis,
and clear pseudocode so you can implement or visualize them in your own projects.
Choose any algorithm from the sidebar to dive deeper!
Breadth-First Search (BFS)
BFS explores a graph layer by layer from the start node, guaranteeing the shortest path in
unweighted graphs. It employs a queue to track frontier nodes.
Typical uses include maze-solving, peer-to-peer broadcast, and finding the minimum number of moves in
puzzles such as the 15-puzzle.
Complexities — Time: O(V + E) | Space: O(V)
Pseudocode
function BFS(start, goal):
    queue ← new Queue()
    queue.enqueue(start)
    visited ← {start}
    parent ← map()
    while not queue.isEmpty():
        node ← queue.dequeue()
        if node == goal:
            return reconstructPath(parent, start, goal)
        for neighbor in node.neighbors():
            if neighbor not in visited:
                visited.add(neighbor)
                parent[neighbor] ← node
                queue.enqueue(neighbor)
    return []
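The pseudocode maps almost line-for-line onto Python. Below is a minimal sketch, assuming the graph is a plain adjacency dict (the `graph` value is a toy example, not part of the original text):

```python
from collections import deque

def bfs(graph, start, goal):
    """Shortest path in an unweighted graph given as {node: [neighbors]}."""
    queue = deque([start])
    parent = {start: None}            # doubles as the visited set
    while queue:
        node = queue.popleft()
        if node == goal:              # walk parent links back to the start
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for neighbor in graph.get(node, []):
            if neighbor not in parent:
                parent[neighbor] = node
                queue.append(neighbor)
    return []                         # goal unreachable

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(bfs(graph, "A", "E"))  # ['A', 'B', 'D', 'E']
```

Because every edge has implicit weight 1, the first time the goal is dequeued the reconstructed path is guaranteed shortest.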
Depth-First Search (DFS)
DFS dives deep along branches before backtracking, using recursion or an explicit
stack. It excels at exploring tree-like structures and detecting cycles.
Applications: generating mazes, topological sorting, and identifying connected components
in images or graphs.
Complexities — Time: O(V + E) | Space: O(V) (worst-case recursion/stack depth)
Pseudocode
function DFS(start, goal):
    stack ← new Stack()
    stack.push(start)
    visited ← {start}
    parent ← map()
    while not stack.isEmpty():
        node ← stack.pop()
        if node == goal:
            return reconstructPath(parent, start, goal)
        for neighbor in node.neighbors():
            if neighbor not in visited:
                visited.add(neighbor)
                parent[neighbor] ← node
                stack.push(neighbor)
    return []
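Swapping BFS's queue for a list used as a stack gives iterative DFS; everything else is identical. A sketch under the same adjacency-dict assumption:

```python
def dfs(graph, start, goal):
    """Finds a path (not necessarily the shortest) via iterative DFS."""
    stack = [start]
    parent = {start: None}            # marks nodes as seen when pushed
    while stack:
        node = stack.pop()            # LIFO: dive deep before backtracking
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for neighbor in graph.get(node, []):
            if neighbor not in parent:
                parent[neighbor] = node
                stack.append(neighbor)
    return []

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(dfs(graph, "A", "E"))  # ['A', 'C', 'D', 'E'] — a deep route, not the shortest
```

On the same graph, BFS returns the 4-node shortest path through B, while DFS happens to dive through C first: a concrete reminder that DFS gives *a* path, not the best one.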
Dijkstra’s Algorithm
Dijkstra's algorithm computes shortest paths in graphs with non-negative edge weights by repeatedly expanding
the node with the smallest tentative distance stored in a min-heap.
Used in GPS navigation and network routing (e.g., OSPF) where accurate distance metrics are
critical.
Complexities — Time: O((V + E) log V) | Space: O(V)
Pseudocode
function Dijkstra(start, goal):
    dist ← map(default = ∞)
    dist[start] ← 0
    parent ← map()
    pq ← new MinHeap()
    pq.insert((0, start))
    while not pq.isEmpty():
        (d, node) ← pq.extractMin()
        if d > dist[node]: continue   // stale entry: node already settled cheaper
        if node == goal: break
        for (neighbor, w) in node.neighbors():
            alt ← d + w
            if alt < dist[neighbor]:
                dist[neighbor] ← alt
                parent[neighbor] ← node
                pq.insert((alt, neighbor))
    return reconstructPath(parent, start, goal)
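In Python the min-heap is `heapq`. Since `heapq` has no decrease-key operation, the sketch below re-inserts nodes with improved distances and skips stale entries on pop; edges are assumed to be `(neighbor, weight)` pairs in an adjacency dict:

```python
import heapq

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbor, weight), ...]}, weights non-negative.
    Returns (path, total_cost); ([], inf) if the goal is unreachable."""
    dist = {start: 0}
    parent = {start: None}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if d > dist[node]:            # stale heap entry: already settled cheaper
            continue
        if node == goal:
            break
        for neighbor, w in graph.get(node, []):
            alt = d + w
            if alt < dist.get(neighbor, float("inf")):
                dist[neighbor] = alt
                parent[neighbor] = node
                heapq.heappush(pq, (alt, neighbor))
    if goal not in parent:
        return [], float("inf")
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = parent[node]
    return path[::-1], dist[goal]

graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)]}
print(dijkstra(graph, "A", "D"))  # (['A', 'B', 'C', 'D'], 4)
```

Note how the direct A→C edge (weight 4) loses to the A→B→C detour (weight 3): the heap ordering is what surfaces that cheaper route first.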
A* Search Algorithm
A* augments Dijkstra with a heuristic function h(n)
estimating remaining cost to
the goal, dramatically reducing explored nodes while preserving optimality if the heuristic is
admissible.
Standard in game AI and robotics path planning; choose Manhattan or Euclidean heuristics for grid
worlds, or domain-specific estimates for complex maps.
Complexities — Time: depends on heuristic (≤ Dijkstra) |
Space: O(V)
Pseudocode
function A*(start, goal, h):
    g ← map(default = ∞)
    g[start] ← 0
    f ← map(default = ∞)
    f[start] ← h(start)
    parent ← map()
    open ← new MinHeap()
    open.insert((f[start], start))
    while not open.isEmpty():
        (fScore, node) ← open.extractMin()
        if node == goal:
            return reconstructPath(parent, start, goal)
        for (neighbor, w) in node.neighbors():
            tentative ← g[node] + w
            if tentative < g[neighbor]:
                g[neighbor] ← tentative
                f[neighbor] ← tentative + h(neighbor)
                parent[neighbor] ← node
                open.insert((f[neighbor], neighbor))
    return []
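A concrete sketch on a 4-connected grid, using Manhattan distance as the admissible heuristic. The grid encoding (0 = free, 1 = wall) is an assumption for illustration:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid of 0 (free) / 1 (wall) cells.
    Manhattan distance never overestimates on such a grid, so it is admissible."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    g = {start: 0}
    parent = {start: None}
    open_heap = [(h(start), start)]   # entries are (f-score, cell)
    while open_heap:
        _, node = heapq.heappop(open_heap)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                tentative = g[node] + 1
                if tentative < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = tentative
                    parent[(nr, nc)] = node
                    heapq.heappush(open_heap, (tentative + h((nr, nc)), (nr, nc)))
    return []

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(grid, (0, 0), (2, 0))
print(path)  # 7-cell detour around the wall row
```

Setting `h` to zero degrades this to Dijkstra on a unit-weight grid; the heuristic only changes the exploration order, never the optimality of the result.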
Bubble Sort
Repeatedly swaps adjacent out-of-order elements, letting the largest “bubble” to the end each
pass. Simple but slow (quadratic time).
Good for teaching purposes or nearly-sorted lists where early-exit optimization makes it almost
linear.
Complexities — Time: O(n²) | Space: O(1)
Pseudocode
function bubbleSort(A):
    n ← length(A)
    for i in 0 to n-2:
        swapped ← false
        for j in 0 to n-2-i:
            if A[j] > A[j+1]:
                swap(A[j], A[j+1])
                swapped ← true
        if not swapped: break
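The same routine in Python, including the early-exit flag the pseudocode describes:

```python
def bubble_sort(a):
    """In-place bubble sort with the early-exit optimization."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):    # the tail beyond n-1-i is already sorted
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:               # a full pass with no swaps: already sorted
            break
    return a

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

On an already-sorted input the first pass makes no swaps and the function returns after O(n) comparisons, which is the near-linear best case mentioned above.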
Selection Sort
Finds the smallest element in the unsorted region and swaps it to the front. Performs
n passes regardless of initial order.
Makes only O(n) swaps, so it can be preferable when writes are expensive (e.g., EEPROM or flash
storage).
Complexities — Time: O(n²) | Space: O(1)
Pseudocode
function selectionSort(A):
    n ← length(A)
    for i in 0 to n-2:
        minIndex ← i
        for j in i+1 to n-1:
            if A[j] < A[minIndex]:
                minIndex ← j
        swap(A[i], A[minIndex])
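A Python rendering; the single swap per outer pass is what keeps total writes at O(n):

```python
def selection_sort(a):
    """In-place selection sort: at most n-1 swaps regardless of input order."""
    n = len(a)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):     # scan the unsorted region for the minimum
            if a[j] < a[min_index]:
                min_index = j
        if min_index != i:            # skip the no-op swap
            a[i], a[min_index] = a[min_index], a[i]
    return a

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]
```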
Insertion Sort
Builds the final sorted list one element at a time by inserting each new element into the correct
position among already-sorted elements.
Fast (O(n)) on nearly-sorted data; used inside hybrid sorts (e.g., TimSort) for small sub-arrays.
Complexities — Time: O(n²) worst / O(n) best | Space: O(1)
Pseudocode
function insertionSort(A):
    for i in 1 to length(A)-1:
        key ← A[i]
        j ← i-1
        while j ≥ 0 and A[j] > key:
            A[j+1] ← A[j]
            j ← j-1
        A[j+1] ← key
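In Python; the inner while-loop shifts larger elements right until the key's slot opens up:

```python
def insertion_sort(a):
    """In-place insertion sort; O(n) when the input is already nearly sorted."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                # drop the key into the opened slot
    return a

print(insertion_sort([12, 11, 13, 5, 6]))  # [5, 6, 11, 12, 13]
```

On nearly-sorted data the while-loop body rarely runs, which is why hybrid sorts hand small or mostly-ordered runs to insertion sort.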
Merge Sort
Divide-and-conquer: recursively splits the list, sorts each half, then merges sorted halves. It’s
stable and guarantees O(n log n).
Excellent for external sorting of huge files; it also underlies standard-library sorts in Java and
Python, whose TimSort is a hybrid of merge sort and insertion sort.
Complexities — Time: O(n log n) | Space: O(n)
Pseudocode
function mergeSort(A):
    if length(A) ≤ 1: return A
    mid ← floor(length(A)/2)
    left ← mergeSort(A[0..mid-1])
    right ← mergeSort(A[mid..])
    return merge(left, right)

function merge(L, R):
    result ← []
    while L and R not empty:
        if L[0] ≤ R[0]:
            result.append(L.popFront())
        else:
            result.append(R.popFront())
    return result + L + R
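A Python sketch combining the split and merge phases; the `<=` comparison is what makes the sort stable:

```python
def merge_sort(a):
    """Stable top-down merge sort; returns a new sorted list."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps equal elements in order (stability)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # at most one of these tails is non-empty
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```

Using indices instead of the pseudocode's `popFront` avoids O(n) list-shift costs, keeping the merge itself linear.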
Quick Sort
Picks a pivot, partitions elements into < pivot, = pivot, > pivot, then recursively sorts
partitions. Average O(n log n) time and in-place (O(log n) extra space).
The most widely used general-purpose in-memory sort (e.g., C stdlib), but worst-case O(n²) if
pivots are chosen poorly.
Complexities — Avg Time: O(n log n) / Worst: O(n²) | Space: O(log n)
Pseudocode
function quickSort(A):
    if length(A) ≤ 1: return A
    pivot ← choosePivot(A)
    L ← [x in A if x < pivot]
    E ← [x in A if x == pivot]
    G ← [x in A if x > pivot]
    return quickSort(L) + E + quickSort(G)
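A Python sketch of the three-way partition. A random pivot stands in for `choosePivot`, which makes the quadratic worst case vanishingly unlikely on any fixed input (note this out-of-place version trades the O(log n) space of in-place quicksort for clarity):

```python
import random

def quick_sort(a):
    """Out-of-place three-way quicksort with a random pivot."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]   # duplicates settled in one pass
    greater = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)

print(quick_sort([3, 6, 1, 8, 2, 9, 4]))  # [1, 2, 3, 4, 6, 8, 9]
```

The three-way split is what keeps inputs full of duplicate keys fast: every copy of the pivot leaves the recursion immediately.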
Greedy Assignment
Iteratively picks the locally best pair (highest score), removes them from the pool, and repeats
until empty. Fast (typically O(n²)) but may miss the global optimum.
Use cases: quick matching heuristics when speed is critical and optimality is
negotiable.
Pseudocode
function greedyMatch(leaders, followers):
    matches ← []
    while leaders and followers not empty:
        (bestL, bestF, bestScore) ← argmaxScorePair(leaders, followers)
        matches.append((bestL, bestF))
        remove bestL from leaders
        remove bestF from followers
    return matches
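A Python sketch. The pairwise `score` function is a caller-supplied assumption (any compatibility measure works), and `argmaxScorePair` becomes a `max` over the cross product of the remaining pools:

```python
def greedy_match(leaders, followers, score):
    """Repeatedly take the highest-scoring remaining (leader, follower) pair.
    score(l, f) is any caller-supplied compatibility function (assumed here)."""
    leaders, followers = list(leaders), list(followers)
    matches = []
    while leaders and followers:
        best_l, best_f = max(
            ((l, f) for l in leaders for f in followers),
            key=lambda pair: score(*pair),
        )
        matches.append((best_l, best_f))
        leaders.remove(best_l)        # each participant is matched at most once
        followers.remove(best_f)
    return matches

# Toy score for illustration: numerically closer pairs score higher.
score = lambda l, f: -abs(l - f)
print(greedy_match([1, 5], [4, 2], score))  # [(1, 2), (5, 4)]
```

Each round scans all remaining pairs, so the loop is O(n³) as written; caching scores or keeping a sorted pair list recovers the typical O(n²) behavior mentioned above.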
Monte Carlo Search
Generates many random assignments, scores each, and retains the best. Solution quality improves with
more iterations, but runtime grows linearly with the iteration count.
Use cases: approximating solutions when evaluation is cheap but the search
space is enormous.
Pseudocode
function monteCarloMatch(leaders, followers, iterations):
    best ← null
    bestScore ← -∞
    for i in 1..iterations:
        perm ← randomShuffle(followers)
        score ← evaluate(leaders, perm)
        if score > bestScore:
            bestScore ← score
            best ← zip(leaders, perm)
    return best
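A Python sketch; `evaluate` is assumed to score a list of (leader, follower) pairs, higher being better:

```python
import random

def monte_carlo_match(leaders, followers, evaluate, iterations=1000):
    """Keep the best of many random one-to-one assignments.
    evaluate(pairs) is a caller-supplied scoring function (assumed here)."""
    best, best_score = None, float("-inf")
    followers = list(followers)       # local copy; shuffled in place below
    for _ in range(iterations):
        random.shuffle(followers)
        candidate = list(zip(leaders, followers))
        score = evaluate(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

# Toy objective: minimize total numeric distance between matched values.
evaluate = lambda pairs: -sum(abs(l - f) for l, f in pairs)
best, score = monte_carlo_match([1, 5, 9], [8, 2, 6], evaluate)
print(best, score)
```

With only 3! = 6 possible permutations here, 1000 samples all but guarantee the optimum; in a real search space the method only approximates it.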
Simulated Annealing
Starts with a random solution, applies random swaps, and occasionally accepts worse moves with
probability exp(-Δ/T). Temperature T gradually cools, reducing uphill moves over time.
Use cases: escaping local optima in complex search spaces (e.g., job-shop
scheduling).
Pseudocode
function simulatedAnnealingMatch(initial, T0, α, iterations):
    current ← initial
    best ← current
    bestScore ← evaluate(current)
    T ← T0
    for i in 1..iterations:
        next ← randomSwap(current)
        Δ ← evaluate(next) - evaluate(current)
        if Δ > 0 or random() < exp(Δ / T):   // Δ < 0 for a worse move, so exp(Δ/T) = exp(-|Δ|/T)
            current ← next
            if evaluate(current) > bestScore:
                best ← current
                bestScore ← evaluate(current)
        T ← T * α
    return best
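A Python sketch for maximization over permutations. `randomSwap` becomes an index-pair swap, and a score drop of Δ < 0 is accepted with probability exp(Δ/T); the objective and parameter defaults are illustrative assumptions:

```python
import math
import random

def simulated_annealing(initial, evaluate, t0=1.0, alpha=0.995, iterations=2000):
    """Maximize evaluate() over permutations reachable by random swaps."""
    current = list(initial)
    best, best_score = list(current), evaluate(current)
    t = t0
    for _ in range(iterations):
        i, j = random.sample(range(len(current)), 2)
        candidate = list(current)
        candidate[i], candidate[j] = candidate[j], candidate[i]
        delta = evaluate(candidate) - evaluate(current)
        # Better moves always accepted; worse ones with probability exp(delta/t).
        if delta > 0 or random.random() < math.exp(delta / t):
            current = candidate
            score = evaluate(current)
            if score > best_score:    # remember the best solution ever seen
                best, best_score = list(current), score
        t *= alpha                    # geometric cooling schedule
    return best, best_score

# Toy objective: 0 when the permutation is fully sorted, negative otherwise.
evaluate = lambda p: -sum(abs(v - i) for i, v in enumerate(p))
best, best_score = simulated_annealing([4, 2, 0, 3, 1], evaluate)
print(best, best_score)
```

Tracking `best` separately from `current` matters: late in the run the walk may still accept a worse move, but the answer returned never regresses.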
Genetic Algorithm
Maintains a population of candidate solutions, selects parents by fitness, produces offspring via
crossover and mutation, and iterates across generations.
Use cases: optimizing combinatorial problems (e.g., traveling salesman, feature
selection).
Pseudocode
function geneticMatch(leaders, followers, popSize, generations):
    population ← initPopulation(leaders, followers, popSize)
    for g in 1..generations:
        fitness ← [evaluate(ind) for ind in population]
        matingPool ← selection(population, fitness)
        offspring ← crossoverAndMutate(matingPool)
        population ← offspring
    return argmax(population, evaluate)
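A compact Python sketch evolving permutations of `followers`. The order-based crossover, swap mutation, and tournament selection are illustrative choices (assumed, not prescribed by the text) that keep every individual a valid one-to-one assignment:

```python
import random

def genetic_match(leaders, followers, evaluate, pop_size=30, generations=60):
    """Evolve permutations of followers; needs at least 2 followers."""
    def crossover(p1, p2):
        # Order-based crossover: a prefix of p1, then p2's order for the rest.
        cut = random.randrange(1, len(p1))
        head = p1[:cut]
        return head + [f for f in p2 if f not in head]

    def mutate(perm, rate=0.2):
        perm = list(perm)
        if random.random() < rate:    # occasional swap keeps diversity
            i, j = random.sample(range(len(perm)), 2)
            perm[i], perm[j] = perm[j], perm[i]
        return perm

    score = lambda perm: evaluate(list(zip(leaders, perm)))
    population = [random.sample(followers, len(followers)) for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                   # tournament selection: fitter of two breeds
            a, b = random.sample(population, 2)
            return a if score(a) > score(b) else b
        population = [mutate(crossover(pick(), pick())) for _ in range(pop_size)]
    fittest = max(population, key=score)
    return list(zip(leaders, fittest)), score(fittest)

# Toy objective: minimize total numeric distance between matched values.
evaluate = lambda ps: -sum(abs(l - f) for l, f in ps)
pairs, score = genetic_match([1, 5, 9], [8, 2, 6], evaluate)
print(pairs, score)
```

Both operators preserve the permutation property, so no repair step is needed; a production version would typically also carry the best individual over unchanged (elitism).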
Tabu Search
Explores the neighborhood of the current solution while maintaining a short-term memory
(tabu list) of recent moves to prevent cycling.
Use cases: vehicle routing, graph coloring, and other optimization problems
requiring strategic exploration.
Pseudocode
function tabuMatch(initial, tabuLimit, iterations):
    current ← initial
    best ← current
    tabu ← Queue()
    for k in 1..iterations:
        neighborhood ← generateNeighbors(current)
        candidate ← argmax(neighborhood \ tabu, evaluate)
        move ← diff(current, candidate)
        tabu.enqueue(move)
        if tabu.size() > tabuLimit: tabu.dequeue()
        current ← candidate
        if evaluate(current) > evaluate(best):
            best ← current
    return best
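A Python sketch where solutions are permutations, neighbors are single swaps, and the tabu list stores recent swap index pairs; these concrete choices are illustrative assumptions, not the only way to instantiate the method:

```python
import random
from collections import deque

def tabu_match(initial, evaluate, tabu_limit=5, iterations=200):
    """Best-admissible-neighbor search; recent swaps are tabu for a few steps."""
    current = list(initial)
    best, best_score = list(current), evaluate(current)
    tabu = deque(maxlen=tabu_limit)   # maxlen auto-evicts the oldest move
    n = len(current)
    for _ in range(iterations):
        best_move, best_cand, best_cand_score = None, None, float("-inf")
        for i in range(n - 1):        # enumerate all single-swap neighbors
            for j in range(i + 1, n):
                if (i, j) in tabu:    # skip recently used moves to avoid cycling
                    continue
                cand = list(current)
                cand[i], cand[j] = cand[j], cand[i]
                s = evaluate(cand)
                if s > best_cand_score:
                    best_move, best_cand, best_cand_score = (i, j), cand, s
        if best_cand is None:         # entire neighborhood tabu: stop
            break
        tabu.append(best_move)
        current = best_cand           # move even if worse than current best
        if best_cand_score > best_score:
            best, best_score = list(current), best_cand_score
    return best, best_score

# Toy objective: 0 when the permutation is fully sorted, negative otherwise.
evaluate = lambda p: -sum(abs(v - i) for i, v in enumerate(p))
best, best_score = tabu_match([4, 2, 0, 3, 1], evaluate)
print(best, best_score)
```

Unlike hill climbing, the search always takes the best non-tabu neighbor even when it is worse than the current solution; the tabu list is what stops it from immediately undoing that move and oscillating.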