Act II of VII
Machines
A computer is a clock ticking through a list of simple instructions, very fast.
On this page
The working table of contents.
- The transistor — a switch you control with electricity (not how it's built, just what it does).
- Switches to logic — AND, OR, NOT gates. How combining them lets you build an adder (the "aha" moment: math from switches).
- Logic to processor — registers, ALU, control unit, the fetch-decode-execute cycle. One instruction at a time, billions of times per second.
- Memory hierarchy — why not all memory is equal (registers → cache → RAM → disk), the speed-vs-cost trade-off, locality as the key insight.
- The clock — what clock speed actually means, why "faster clock = faster computer" stopped being true (power wall), and what happened next (more cores, specialized chips).
- Beyond the CPU — GPU (one instruction applied to many data elements at once), accelerators (TPU, NPU) as chips built for one job.
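The "math from switches" moment above can be sketched directly: treat AND, OR, NOT as the primitives, derive XOR from them, and wire up a ripple-carry adder. The gate names are standard; the specific 4-bit wiring here is just one illustrative choice.

```python
# Build XOR from AND/OR/NOT, then a half adder, then chain
# full adders into a 4-bit ripple-carry adder.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # XOR = (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    return XOR(a, b), AND(a, b)          # (sum, carry)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)                # (sum, carry_out)

def add4(x, y):
    """Add two 4-bit numbers, feeding each carry into the next stage."""
    carry, bits = 0, []
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        bits.append(bit)
    return sum(b << i for i, b in enumerate(bits)) + (carry << 4)

print(add4(6, 7))  # 13 — arithmetic, built from nothing but switches
```

Every stage is only gates; the carry chain is why this design is called "ripple-carry," and why real adders use cleverer carry logic.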
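The fetch-decode-execute cycle is small enough to model in a few lines. This is a made-up three-instruction machine with one register, invented purely for illustration — it is not any real ISA.

```python
# Fetch-decode-execute for a toy machine: one accumulator register,
# a program counter, and three invented instructions.

def run(program):
    acc = 0                        # the machine's single register
    pc = 0                         # program counter
    while True:
        op, arg = program[pc]      # fetch the next instruction
        pc += 1
        if op == "LOAD":           # decode + execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "HALT":
            return acc

program = [("LOAD", 2), ("ADD", 40), ("HALT", 0)]
print(run(program))  # 42
```

A real processor does exactly this loop in hardware, billions of times per second — the instructions are just more numerous and encoded as bits rather than tuples.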
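Locality can be made concrete with a toy cache model. The sizes below (16-byte lines, 64 lines, direct-mapped) are made up for the sketch; the point is only that sequential access reuses each fetched line while a large stride wastes it.

```python
# A direct-mapped cache model that counts hits and misses.

LINE = 16        # bytes per cache line
LINES = 64       # number of lines in the cache

def hit_rate(addresses):
    cache = {}   # line index -> tag of the line currently stored there
    hits = 0
    for addr in addresses:
        index = (addr // LINE) % LINES
        tag = addr // (LINE * LINES)
        if cache.get(index) == tag:
            hits += 1
        else:
            cache[index] = tag      # miss: fetch the whole line
    return hits / len(addresses)

sequential = list(range(4096))                       # walk memory byte by byte
strided = [(i * 1024) % 4096 for i in range(4096)]   # jump 1 KiB each access

print(f"sequential: {hit_rate(sequential):.2f}")  # 0.94 — 15 of 16 accesses hit
print(f"strided:    {hit_rate(strided):.2f}")     # 0.00 — every access misses
```

Same number of accesses, wildly different hit rates — that gap, multiplied by the cost of going out to RAM, is why access patterns matter as much as algorithms.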
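The GPU idea — one instruction, many data elements — can be sketched as a lane model. This is pure Python standing in for hardware: each inner group of `lanes` additions would happen in a single step on a real SIMD unit.

```python
# Scalar vs SIMD-style execution, modeled in plain Python.

def scalar_add(xs, ys):
    out = []
    for x, y in zip(xs, ys):       # one addition per step
        out.append(x + y)
    return out

def simd_add(xs, ys, lanes=4):
    out = []
    for i in range(0, len(xs), lanes):
        # on real hardware, all `lanes` additions execute in the same step
        out.extend(x + y for x, y in zip(xs[i:i+lanes], ys[i:i+lanes]))
    return out

print(simd_add([1, 2, 3, 4, 5], [10, 20, 30, 40, 50]))  # [11, 22, 33, 44, 55]
```

Both functions compute the same result; the difference is how many steps the hardware would take. A GPU pushes `lanes` into the thousands, which is why it excels exactly when every element needs the same operation.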
Going deeper
Branches that earn their own article.
- Transistor physics (CMOS, FinFET).
- Digital logic design (combinational, sequential, FSMs).
- ISA deep dives (x86, ARM, RISC-V).
- Pipelining, hazards, out-of-order execution.
- Cache coherence protocols.
- Branch prediction.
- GPU architecture internals.
- FPGA programming.
- Chip fabrication process.