# 10 Mind-Bending Algorithms That Changed Computing
## Table of Contents
- Introduction
- Wave Function Collapse
- Diffusion Algorithm
- Simulated Annealing
- Sleep Sort
- Quantum Bogo Sort
- RSA Cryptosystem
- Marching Cubes Algorithm
- Byzantine Fault Tolerance
- Boids Artificial Life
- Boyer-Moore String Search
- Quick Reference
- Summary Table
- Key Takeaways
## Introduction
### What This Covers
This lecture explores 10 fascinating algorithms that range from practical to theoretical, from life-saving to completely absurd. These algorithms demonstrate how creative problem-solving in computer science can:
- Save lives (medical imaging)
- Secure the internet (cryptography)
- Generate infinite content (procedural generation)
- Simulate nature (artificial life)
- Optimize complex systems (distributed computing)
### Why This Matters
Algorithms are the building blocks of all software. Understanding how different algorithms solve problems gives you a deeper appreciation for the code running everything from your smartphone to hospital equipment.
Real-World Impact:
- Medical professionals visualize 3D scans using marching cubes
- Your online banking relies on RSA encryption
- Video games use wave function collapse for infinite worlds
- AI image generators use diffusion models
- Search tools use Boyer-Moore for lightning-fast text searching
## Wave Function Collapse
### What It Is
Wave Function Collapse (WFC) is a procedural generation algorithm inspired by quantum mechanics that creates coherent, rule-based random content.
Definition: An algorithm that generates infinite, consistent patterns by treating possibilities as a "wave function" that "collapses" into specific choices following predefined rules.
### The Quantum Physics Inspiration
The Double-Slit Experiment Analogy:
- In quantum physics, particles behave like waves when unobserved
- Upon observation, the wave function "collapses" to a specific state
- WFC applies this concept to content generation
Why This Concept Exists:
- Nature optimizes computation (the universe doesn't render what you're not looking at)
- Similarly, games need to generate content on-demand, not pre-compute everything
### How It Works
```
INITIAL STATE (Superposition)
        ↓
All possibilities exist simultaneously
        ↓
OBSERVATION (Collapse)
        ↓
Select one tile/element following rules
        ↓
PROPAGATE CONSTRAINTS
        ↓
Neighboring tiles adjust their possibilities
        ↓
REPEAT until entire map is generated
```
Step-by-Step Process:
- Start with a seed pattern (your initial map/tileset)
- Analyze relationships (which tiles can be neighbors)
- Create constraint rules (roads must connect, grass can't float)
- Begin with superposition (all tiles are possible everywhere)
- Collapse one tile (pick lowest entropy position)
- Propagate constraints (update neighboring possibilities)
- Repeat until complete
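The steps above can be sketched as a toy one-dimensional version. The tile names and adjacency rules below are invented for illustration (real WFC works in 2-D and adds backtracking on contradictions):

```python
import random

# Which tiles may sit next to each tile (symmetric, invented for the demo)
RULES = {
    "grass": {"grass", "road"},
    "road": {"grass", "road", "building"},
    "building": {"road", "building"},
}

def propagate(cells):
    """Narrow each cell to tiles compatible with its neighbors, to a fixpoint."""
    changed = True
    while changed:
        changed = False
        for i in range(len(cells) - 1):
            right_ok = cells[i + 1] & set().union(*(RULES[t] for t in cells[i]))
            left_ok = cells[i] & set().union(*(RULES[t] for t in cells[i + 1]))
            if right_ok != cells[i + 1] or left_ok != cells[i]:
                cells[i + 1], cells[i] = right_ok, left_ok
                changed = True

def collapse(length, seed=42):
    rng = random.Random(seed)
    cells = [set(RULES) for _ in range(length)]  # superposition: anything goes
    while any(len(c) > 1 for c in cells):
        # Pick the undecided cell with the fewest remaining options (lowest entropy)
        i = min((k for k, c in enumerate(cells) if len(c) > 1),
                key=lambda k: len(cells[k]))
        cells[i] = {rng.choice(sorted(cells[i]))}  # observe / collapse
        propagate(cells)  # neighboring cells adjust their possibilities
    return [next(iter(c)) for c in cells]

print(collapse(8))
```

Every adjacent pair in the output respects the adjacency rules, which is exactly the coherence guarantee WFC provides.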
### Practical Applications
| Use Case | Example | Why WFC Works |
|---|---|---|
| Infinite Games | Side-scrolling platformers | Generates consistent terrain forever |
| Level Design | Dungeon generators | Creates playable, logical layouts |
| Texture Synthesis | Seamless patterns | Maintains visual coherence |
| City Planning | Procedural cities | Roads, buildings follow urban logic |
### Real-World Example
Video Game Scenario:
Initial Map Tile: [Road-Grass-Building]
WFC Process:
- Next tile must connect road properly
- Can't place floating buildings
- Grass transitions naturally
- Result: Infinite coherent city
### Pro Tips
- Start with good samples: The quality of your seed pattern determines output quality
- Define clear rules: Ambiguous constraints lead to impossible situations
- Handle contradictions: Implement backtracking when the algorithm paints itself into a corner
- Optimize entropy calculation: Choose tiles with fewest possibilities first for faster generation
### Common Mistakes to Avoid
❌ Too many constraints → Algorithm gets stuck
❌ Too few constraints → Output looks random/incoherent
❌ No backtracking → Generation fails on contradictions
✅ Balanced rules with fallback strategies
## Diffusion Algorithm
### What It Is
The Diffusion Algorithm is a machine learning technique that generates images, audio, or video by reversing a noise-adding process.
Definition: A generative AI algorithm that learns to transform random noise into structured data by reversing a gradual corruption process.
### Why It Exists
The Problem: How do you teach a computer to create realistic images from scratch?
Traditional Approaches Failed Because:
- GANs (Generative Adversarial Networks) were unstable to train
- VAEs (Variational Autoencoders) produced blurry results
- Direct generation lacked control and quality
Diffusion's Solution: Learn the reverse process of destruction
### The Thermodynamics Connection
| Thermodynamics Concept | Diffusion Algorithm Equivalent |
|---|---|
| High entropy (disorder) | Random noise image |
| Low entropy (order) | Structured, coherent image |
| Diffusion (particles spread from high to low concentration) | Forward process: noise is gradually added |
| Reverse diffusion | Reverse process: the model reconstructs order from chaos |
### How It Works
Phase 1: Forward Diffusion (Training)

```python
# Pseudocode for forward diffusion
def forward_diffusion(clean_image, timesteps=1000):
    """Gradually add noise to the image."""
    noisy_images = []
    for t in range(timesteps):
        noise_level = calculate_noise_schedule(t)
        noisy_image = clean_image + noise_level * random_noise()
        noisy_images.append(noisy_image)
    return noisy_images  # From clean -> complete noise
```
Phase 2: Reverse Diffusion (Generation)

```python
# Pseudocode for reverse diffusion
def reverse_diffusion(random_noise, trained_model, timesteps=1000):
    """Gradually remove noise to create an image."""
    image = random_noise
    for t in reversed(range(timesteps)):
        # The model predicts what noise to remove at this step
        predicted_noise = trained_model(image, timestep=t)
        image = remove_noise(image, predicted_noise)
    return image  # From noise -> coherent image
```
Visual Process:

```
TRAINING PHASE:
Clean Image → +noise → +noise → +noise → Pure Noise
   Step 0      Step 1    Step 2    Step 3   Step 1000

GENERATION PHASE:
Pure Noise → -noise → -noise → -noise → New Image
 Step 1000     Step 3    Step 2    Step 1     Step 0
```
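The training phase can be made concrete with a tiny runnable sketch, using a 1-D signal in place of an image. This uses a DDPM-style closed-form noising step; the schedule values here are conventional defaults, not something specified in the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth 1-D signal stands in for a "clean image"
clean = np.sin(np.linspace(0, 2 * np.pi, 256))

def forward_diffusion(x0, timesteps=1000, beta_start=1e-4, beta_end=0.02):
    """Return the noisy sample at every timestep of the forward process."""
    betas = np.linspace(beta_start, beta_end, timesteps)
    alpha_bar = np.cumprod(1.0 - betas)  # how much signal survives at step t
    return [
        np.sqrt(a) * x0 + np.sqrt(1.0 - a) * rng.standard_normal(x0.shape)
        for a in alpha_bar
    ]

steps = forward_diffusion(clean)

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

# The signal is clearly visible early on and essentially gone at the end
print(f"step 0:   corr with clean = {corr(clean, steps[0]):.3f}")
print(f"step 999: corr with clean = {corr(clean, steps[-1]):.3f}")
```

The correlation with the clean signal decays toward zero as noise accumulates, which is exactly the "clean → pure noise" progression the model learns to reverse.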
### The Magic Behind It
Why This Works:
- Small steps are easier: Removing a tiny bit of noise is simpler than generating a full image
- Learned patterns: Model learns what "real images" look like at every noise level
- Gradual refinement: Each step slightly improves the image
- Conditioning: Can guide generation with text prompts, sketches, etc.
### Real-World Applications
| Application | Example Tools | What It Does |
|---|---|---|
| Image Generation | DALL-E, Stable Diffusion, Midjourney | Creates images from text descriptions |
| Image Editing | Photoshop AI, Canva Magic Edit | Inpainting, outpainting, style transfer |
| Audio Generation | AudioLDM, Riffusion | Generates music and sound effects |
| Video (Emerging) | Runway Gen-2, Pika Labs | Text-to-video generation |
| 3D Models | DreamFusion, Point-E | 3D object generation |
### Practical Example
Creating an Image with Stable Diffusion:
```python
# Using Stable Diffusion via the Hugging Face diffusers library
from diffusers import StableDiffusionPipeline

# Load a pre-trained model from the Hugging Face Hub
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Generate an image from text
prompt = "A serene mountain landscape at sunset, oil painting style"
image = pipe(
    prompt=prompt,
    num_inference_steps=50,  # More steps = higher quality, slower generation
    guidance_scale=7.5,      # How closely to follow the prompt
).images[0]
image.save("mountain_sunset.png")
```
### Pro Tips
- More steps = better quality but slower generation (50-100 steps is typical)
- Guidance scale controls creativity vs. prompt adherence (7-15 is sweet spot)
- Negative prompts tell the model what to avoid
- Seed values make generations reproducible
### The Controversial Side
Ethical Concerns:
- Can generate deepfakes and misinformation
- Trained on copyrighted images without permission
- Potential to replace human artists
- "Infinite army of OnlyFans models" (as mentioned in lecture)
### Memory Aid
Remember: Diffusion is like sculpting from fog. You start with chaos (random noise) and gradually reveal the statue (coherent image) by removing what doesn't belong.
## Simulated Annealing
### What It Is
Simulated Annealing is an optimization algorithm inspired by metallurgy that finds near-optimal solutions to complex problems by balancing exploration and exploitation.
Definition: A probabilistic optimization technique that mimics the physical process of heating and slowly cooling metal to remove defects and reach a low-energy state.
### Why It Exists
The Optimization Problem:
Many real-world problems have:
- Multiple solutions (not just one right answer)
- Local optima (good solutions that arenโt the best)
- Global optimum (the absolute best solution)
Example Problems:
- Amazon warehouse layout optimization
- Traveling salesman problem (shortest route through cities)
- Circuit board design
- Protein folding
- Job scheduling
### The Metallurgy Inspiration
Physical Annealing Process:
| Step | What Happens | Why It Matters |
|---|---|---|
| Heat Metal | Atoms gain energy, move freely | Allows exploration of configurations |
| Slow Cooling | Atoms settle into low-energy states | Finds stable, optimal structure |
| Rapid Cooling | Atoms freeze in suboptimal positions | Creates defects, brittleness |
| Controlled Cooling | Atoms find global minimum energy | Produces strong, defect-free metal |
### The Mountain Climbing Analogy
Problem: Find the highest peak in a mountain range

```
                               GLOBAL PEAK
      Local Peak                   /\
         /\                       /  \
        /  \                     /    \
       /    \      Valley       /      \
      /      \       /\        /        \
  ___/        \_____/  \______/          \____
```

Simple Hill Climbing: gets stuck at the local peak ❌
Simulated Annealing: can escape valleys to find the global peak ✅
Why Simple Hill Climbing Fails:
- Always moves to higher ground
- Gets trapped at first local maximum
- Never explores valleys that might lead to higher peaks
How Annealing Succeeds:
- Sometimes accepts "worse" solutions (going downhill)
- Probability of accepting worse solutions decreases over time
- Can escape local optima early, then settle on global optimum
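This escape behavior can be shown on a small runnable example. The two-peak objective function and all parameter values below are invented for the demo: plain hill climbing started on the local peak near x = -2 would stay there, while annealing's occasional downhill moves let it discover the taller peak near x = 2.

```python
import math
import random

rng = random.Random(0)

def f(x):
    # Local peak near x = -2 (height ~0.6), global peak near x = 2 (height ~1.0)
    return math.exp(-(x - 2) ** 2) + 0.6 * math.exp(-(x + 2) ** 2)

def anneal(start, iterations=5000, temperature=2.0, cooling=0.995):
    current, best = start, start
    for _ in range(iterations):
        candidate = current + rng.gauss(0, 2)  # propose a random step
        delta = f(candidate) - f(current)
        # Always accept improvements; accept worse moves with shrinking probability
        if delta > 0 or rng.random() < math.exp(delta / temperature):
            current = candidate
        if f(current) > f(best):
            best = current
        temperature *= cooling  # cool down each iteration
    return best

best = anneal(start=-2.0)  # start exactly on the local peak
print(f"best x = {best:.2f}, f(best) = {f(best):.3f}")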
### How It Works
Core Algorithm:
```python
from math import exp
from random import random

def simulated_annealing(initial_solution, max_iterations):
    # evaluate, generate_neighbor, cool_down, and INITIAL_TEMP are problem-specific
    current_solution = initial_solution
    current_energy = evaluate(current_solution)
    temperature = INITIAL_TEMP  # Start hot

    for iteration in range(max_iterations):
        # Generate a neighbor solution (small random change)
        neighbor = generate_neighbor(current_solution)
        neighbor_energy = evaluate(neighbor)

        # Energy difference (in this formulation, higher energy = better)
        delta_energy = neighbor_energy - current_energy

        # Acceptance criteria
        if delta_energy > 0:  # Better solution
            current_solution = neighbor
            current_energy = neighbor_energy
        else:  # Worse solution
            # Accept with a probability that shrinks as the temperature falls
            acceptance_probability = exp(delta_energy / temperature)
            if random() < acceptance_probability:
                current_solution = neighbor  # Accept anyway!
                current_energy = neighbor_energy

        # Cool down (reduce temperature)
        temperature = cool_down(temperature, iteration)

    return current_solution
```
Key Components:
- Temperature (T): Controls randomness
  - High T → accepts bad moves frequently (exploration)
  - Low T → rarely accepts bad moves (exploitation)
- Cooling Schedule: How the temperature decreases

  ```python
  # Common cooling schedules
  T = T_initial * (0.95 ** iteration)   # Exponential
  T = T_initial / (1 + iteration)       # Linear
  T = T_initial / log(1 + iteration)    # Logarithmic
  ```

- Acceptance Probability:

  ```
  P(accept worse) = e^(ΔE / T)

  Where:
  - ΔE = negative (worse solution)
  - T = current temperature
  - Higher T → higher probability
  ```
### Exploration vs. Exploitation
```
EARLY PHASE (High Temperature)
├─ Exploration: 80% | Exploitation: 20%
├─ Accepts many bad moves
├─ Searches widely across solution space
└─ Escapes local optima

MIDDLE PHASE (Medium Temperature)
├─ Exploration: 50% | Exploitation: 50%
├─ Balanced approach
└─ Refines promising areas

LATE PHASE (Low Temperature)
├─ Exploration: 20% | Exploitation: 80%
├─ Rarely accepts bad moves
├─ Fine-tunes best solution found
└─ Converges to optimum
```
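To put numbers on these phases, here is the acceptance rule evaluated for a move that worsens the solution by one unit (ΔE = -1) at three temperatures (the temperature values are just illustrative sample points):

```python
import math

delta_e = -1.0  # a move that worsens the objective by one unit
for temperature in (10.0, 1.0, 0.1):
    p = math.exp(delta_e / temperature)
    print(f"T = {temperature:>4}: P(accept worse move) = {p:.4f}")
```

Hot (T = 10) accepts this bad move about 90% of the time, while cold (T = 0.1) accepts it essentially never, which is the exploration-to-exploitation shift in miniature.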
### Real-World Example: Traveling Salesman
Problem: Visit 50 cities with shortest total distance
```python
# Simulated Annealing for TSP (cities are (x, y) coordinate tuples)
import random
from math import dist, exp

def calculate_total_distance(route):
    # Length of the closed tour (returns to the starting city)
    return sum(dist(route[i], route[(i + 1) % len(route)])
               for i in range(len(route)))

def swap_two_random_cities(route):
    i, j = random.sample(range(len(route)), 2)
    new_route = route[:]
    new_route[i], new_route[j] = new_route[j], new_route[i]
    return new_route

def tsp_simulated_annealing(cities):
    # Start with a random route
    current_route = random.sample(cities, len(cities))
    current_distance = calculate_total_distance(current_route)
    temperature = 10000  # High initial temperature
    cooling_rate = 0.995

    while temperature > 1:
        # Generate a neighbor by swapping two cities
        new_route = swap_two_random_cities(current_route)
        new_distance = calculate_total_distance(new_route)

        # Shorter distance is better (a negative delta is an improvement)
        delta = new_distance - current_distance
        if delta < 0 or random.random() < exp(-delta / temperature):
            current_route = new_route
            current_distance = new_distance

        temperature *= cooling_rate  # Cool down

    return current_route
```
### Practical Applications
| Domain | Problem | What SA Optimizes |
|---|---|---|
| Logistics | Delivery routes | Minimize travel time/fuel |
| Manufacturing | Production scheduling | Minimize downtime, maximize throughput |
| VLSI Design | Circuit layout | Minimize wire length, heat |
| Finance | Portfolio optimization | Maximize returns, minimize risk |
| Machine Learning | Hyperparameter tuning | Find best model parameters |
| Warehouse | Inventory placement | Minimize picking time |
### The Learning Analogy
Why This Relates to Learning to Code:
"Initially you start out exploring all kinds of different technologies and frameworks, then eventually you find one specific area to exploit and specialize in."
| Learning Phase | Temperature | Behavior |
|---|---|---|