Introduction to Genetic Algorithms
Designing search under uncertainty: from evolutionary metaphors to optimization code.
1. Why Genetic Algorithms Still Matter
In a world dominated by deep learning, genetic algorithms seem almost retro at first glance. But when you work with messy real-world problems—ugly constraints, unknown gradients, conflicting objectives—you quickly rediscover why evolutionary computation still matters.
Genetic algorithms are less about finding “the” answer and more about structuring search when you do not fully understand the landscape you are exploring.
Instead of tracking a single best guess, GAs maintain a whole population of candidate solutions. This simple shift mirrors how robust teams, markets, and even societies operate under uncertainty: they keep multiple hypotheses alive and let feedback decide what survives.
2. From Evolution Metaphor to Algorithm
The inspiration for genetic algorithms is straightforward: evolution works. Nature continuously generates variation, filters it through selection, and preserves what works well enough to survive. GAs turn that story into an algorithm.
- Organism → Chromosome (a candidate solution encoded as bits, numbers, or symbols).
- Gene → Variable or feature that influences the solution’s behavior.
- Fitness → Objective function that measures how “good” a solution is.
- Environment → Problem constraints and data the solution must satisfy.
- Generations → Iterations of variation, evaluation, and selection over time.
Initialize population P with N random individuals
Evaluate fitness of each individual in P
repeat until termination_condition:
    Select parents from P based on fitness
    Apply crossover to parents to produce offspring
    Apply mutation to offspring
    Evaluate fitness of offspring
    Create new population P from offspring (and optionally some elites)
return best individual found

3. Encoding: How Solutions Become Genomes
The first serious design decision in any GA is representation. The same real-world problem—say, university course scheduling, portfolio optimization, or layout design—can be encoded as bit strings, integer vectors, or permutations. The encoding defines what “mutation” and “crossover” actually mean.
Good encodings obey a simple rule: small genetic changes should correspond to small semantic changes. If flipping a single bit completely destroys feasibility, your search will keep stumbling into nonsense.
# Example: simple binary encoding for a 1D optimization
# Maximize f(x) for x in [0, 1] using a 16-bit representation
BITS = 16

def encode(real_value: float) -> str:
    # Map [0, 1] -> [0, 2^BITS - 1]
    clipped = max(0.0, min(1.0, real_value))
    integer = int(clipped * (2**BITS - 1))
    return format(integer, f"0{BITS}b")  # e.g., '0100110010101010'

def decode(bitstring: str) -> float:
    integer = int(bitstring, 2)
    return integer / (2**BITS - 1)
4. Fitness Functions: Defining “Good”
Textbooks ask you to “define a fitness function” as if it were a purely technical step. In practice, it is where you encode your values, trade-offs, and blind spots. A fitness function is the contract between your algorithm and your beliefs about what should win.
Most real problems mix hard constraints—conditions that must never be violated—with softer goals like cost, comfort, or fairness. Genetic algorithms often combine these into a single scalar fitness by adding penalties for constraint violations.
def fitness(schedule) -> float:
    conflicts = count_conflicts(schedule)
    room_overloads = count_room_overloads(schedule)
    early_morning_penalty = count_8am_classes(schedule)
    return (
        -10 * conflicts              # Hard-ish: clashes are very costly
        - 5 * room_overloads         # Capacity issues
        - 1 * early_morning_penalty  # Softer preference
    )

Whatever you reward, you will get more of. A misdesigned fitness function will evolve precisely the kind of behavior you never wanted.
5. Selection: Who Gets to Reproduce
Selection is how the algorithm decides which individuals get to pass their genes forward. It is the computational equivalent of promotions and budgets in an organization: it shapes what survives and what disappears.
import random
from typing import Any, List

def tournament_selection(population: List[Any], fitnesses: List[float], k: int = 3):
    """Randomly pick k individuals and return the fittest as the parent."""
    candidates = random.sample(list(zip(population, fitnesses)), k)
    best_individual, _ = max(candidates, key=lambda x: x[1])
    return best_individual

Increase selection pressure too much and the population collapses around early winners, losing diversity and getting stuck in local optima. Make it too gentle and evolution drifts slowly, never really committing to anything.
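To see selection working inside a full loop, here is a minimal, self-contained GA on the classic OneMax toy problem (evolve a bitstring toward all ones). This is an illustrative sketch, not code from the article; the function name and parameters are made up for the example:

```python
import random

def onemax_ga(length=20, pop_size=30, generations=60, k=3, mutation_rate=0.02):
    """Minimal GA: tournament selection, one-point crossover, bit-flip mutation."""
    fitness = lambda ind: sum(ind)  # OneMax: fitness is the count of 1 bits
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # Tournament selection: best of k randomly sampled individuals
            p1 = max(random.sample(pop, k), key=fitness)
            p2 = max(random.sample(pop, k), key=fitness)
            # One-point crossover
            cut = random.randint(1, length - 1)
            child = p1[:cut] + p2[cut:]
            # Bit-flip mutation
            child = [b ^ 1 if random.random() < mutation_rate else b for b in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

Raising `k` sharpens selection pressure (the winner of a bigger tournament is more likely to be a top individual); on an easy landscape like OneMax that speeds convergence, but on deceptive landscapes it is exactly how populations get trapped.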
6. Exploration vs Exploitation
Every GA lives inside the tension between exploration and exploitation. Early on, you want broad exploration: high diversity, more mutation, lower selection pressure. Later, you want to exploit what you have learned and refine the best regions of the search space.
The most interesting designs borrow ideas from systems thinking and control: adapt mutation rates when diversity drops, preserve a few elites to stabilize learning, or hybridize GAs with local search for fine-tuning.
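One way to operationalize the "adapt mutation when diversity drops" idea is a diversity-triggered mutation boost. The sketch below is illustrative only; the function names, threshold, and boost factor are invented for the example:

```python
def diversity(population) -> float:
    """Fraction of gene positions that still vary across the population (0..1)."""
    length = len(population[0])
    varying = sum(
        1 for i in range(length)
        if len({ind[i] for ind in population}) > 1
    )
    return varying / length

def adapt_mutation_rate(base_rate: float, population,
                        floor: float = 0.2, boost: float = 5.0) -> float:
    """Boost mutation when diversity collapses, reopening exploration."""
    if diversity(population) < floor:
        return base_rate * boost
    return base_rate
```

Called once per generation, this acts as a simple feedback controller: converged populations get shaken up, diverse populations are left to exploit what they have.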
7. From Algorithms to Systems Thinking
Once you have lived with genetic algorithms for a while, you start seeing them everywhere. Organizations are populations of strategies. Markets are fitness landscapes. Leadership decisions change the “selection pressure” and mutation rate in subtle ways.
This is where GAs connect directly to system thinking frameworks and leadership psychology: they teach you that progress is emergent, not commanded; that diversity is not a luxury, but a hedge against unknown futures; and that defining “fitness” carefully is the most leveraged decision you make.
To explore the systems lens behind this, see “System Thinking Frameworks” and how feedback loops and incentives shape long-term behavior.