Beautiful Science: The Man Who Revolutionized Computer Science with Math

Credit: Talia Herman for Quanta Magazine

You’ve probably never heard of Leslie Lamport, but his handiwork is everywhere in the world we live in today, from the internet to cloud computing to artificial intelligence. In essence, the fact that modern computers can coordinate effectively with one another owes much to the work of Lamport, who has now, at the age of 81, turned his fertile mind to making programming itself more efficient.

Distributed Systems

The work for which Lamport is best known focuses on a single question: How do you get computers to talk to one another efficiently and reliably? The question is harder than it sounds. Tens of thousands of computers exchanging millions of messages and executing tens of millions of interrelated functions is a recipe for chaos. Some kind of order must be imposed. The computers must operate in logical sequence.

A simple network consists of several computers calling in to a central server. But you wouldn’t want the computers of tens of thousands of World of Warcraft addicts calling in to the same server at once. One server wouldn’t be able to keep up with all the requests. To handle that load, you need a distributed network, the kind of network Lamport researches. A distributed network isn’t arrayed around a single server. Instead, multiple computers talk to one another. Although each computer operates independently, the overall network works coherently. This coherence is achieved through algorithms. Like mathematical theorems, algorithms make assertions that must be formally proved. Only when proven algorithms are executed across the network do computers coordinate with maximum efficiency and functionality.

A Spark from the Theory of Relativity

As early as the 1970s, Lamport realized that one of the biggest problems in distributed computing is getting the clocks right. When multiple devices operate in overlapping slices of time, ambiguity results. Which computer’s clock is the right one?

In a seminal 1978 paper [1], Lamport introduced the notion of “causality” to solve this issue. The ingenuity of his insight was to use an idea from physics: Einstein’s special relativity. According to Einstein, two observers may disagree on the order of events; they may have different notions of what “at the same time” means. But if one event causes another, that causality establishes a time sequence that eliminates the ambiguity. And sending or receiving a message can establish causality among multiple processes. Logical clocks, now also called Lamport clocks, provide a standard way to reason about concurrent systems. See note [2] at the bottom of this post for a more detailed explanation of how Lamport clocks work.

Algorithms Are Not the Same as Code

With this tool in hand, computer scientists next wondered how they could systematically scale these networks of connected computers without introducing bugs. Lamport came up with an elegant solution: Paxos, a “consensus algorithm” that allows multiple computers to agree on shared values and jointly execute complex tasks. Without Paxos and its family of algorithms, modern computing could not exist.
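The full Paxos protocol is designed to survive crashed machines, lost messages, and competing proposers; the following is only a minimal, single-value sketch in Python of its two-phase idea, not Lamport’s own formulation. The class names, the synchronous message delivery, and the five-acceptor setup are illustrative assumptions made for this post.

```python
class Acceptor:
    """One member of the group that must reach agreement (names are illustrative)."""
    def __init__(self):
        self.promised = 0          # highest proposal number this acceptor has promised to honor
        self.accepted_num = 0      # proposal number of the value it has accepted, if any
        self.accepted_val = None

    def prepare(self, n):
        # Phase 1: promise to ignore proposals numbered below n,
        # and report any value already accepted.
        if n > self.promised:
            self.promised = n
            return ("promise", self.accepted_num, self.accepted_val)
        return ("reject",)

    def accept(self, n, value):
        # Phase 2: accept the value unless a higher-numbered promise was made.
        if n >= self.promised:
            self.promised = n
            self.accepted_num = n
            self.accepted_val = value
            return True
        return False


def propose(acceptors, n, value):
    """Run one proposal round; returns the value chosen if a majority accepts."""
    majority = len(acceptors) // 2 + 1

    # Phase 1: gather promises from a majority of acceptors.
    promises = [r for r in (a.prepare(n) for a in acceptors) if r[0] == "promise"]
    if len(promises) < majority:
        return None

    # If some acceptor already accepted a value, the proposer must carry the
    # highest-numbered accepted value forward instead of its own. This rule is
    # what keeps the group safe when several proposers compete.
    highest = max(promises, key=lambda r: r[1])
    if highest[2] is not None:
        value = highest[2]

    # Phase 2: ask the acceptors to accept the (possibly inherited) value.
    accepts = sum(a.accept(n, value) for a in acceptors)
    return value if accepts >= majority else None


acceptors = [Acceptor() for _ in range(5)]
print(propose(acceptors, n=1, value="commit transaction 42"))  # "commit transaction 42"
print(propose(acceptors, n=2, value="something else"))         # still "commit transaction 42"
```

Even in this toy version, the key property is visible: once a majority has accepted a value, every later proposal is forced to carry that same value, so the group can never change its collective mind.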

Lamport’s tendency to see computer networks as mathematical problems waiting to be solved sets him apart from many computer scientists who focus on mastering specific computer languages, like Java or HTML. “Math is more fundamental,” Lamport says. It’s the edifice on which the languages stand. College computer-science majors need to learn math, he says, not programming languages that will be obsolete in a few years.

Martín Abadi, a principal scientist at Google who collaborated with Lamport during the 1980s and ’90s, says that Lamport’s algorithms are so powerful because he excels at understanding the large abstract principles at work in computer science, then finding a way to make the abstractions concrete enough to manipulate and apply.

Lamport’s work since the 1990s has focused on “formal verification,” the use of mathematical proofs to verify the correctness of software and hardware systems. Notably, he created a “specification language” called TLA+ (for Temporal Logic of Actions). A software specification is like a blueprint or a recipe for a program; it describes at a high level how the software should behave. It’s not always necessary, since coding a simple program is akin to just boiling an egg. But a more complicated task with higher stakes — the coding equivalent of a nine-course banquet — requires more precision. You need to prepare each component of each dish, combine them in a precise way, then serve them to every guest in the correct order. This requires exact recipes and instructions, written succinctly and unambiguously in the precise language of mathematics, to prevent bugs and avoid design flaws.

The way Lamport sees it, most programmers try to cobble together a system before formally proving whether that system will work, like a chef preparing a banquet without first knowing that their recipes will lead to the intended dishes.
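Real TLA+ specifications are written in their own mathematical notation and checked with tools such as the TLC model checker. As a rough illustration of the same idea in Python, here is a toy checker for a hypothetical two-account transfer design: the “specification” is the set of allowed steps plus an invariant, and the checker explores every reachable state before anyone writes implementation code. The scenario and numbers are invented for this example.

```python
# Toy "specification": two account balances and one invariant to preserve.
INITIAL = (5, 5)                 # hypothetical starting balances of accounts A and B
TOTAL = sum(INITIAL)             # invariant: money is neither created nor destroyed

def next_states(state):
    """The design: every state reachable in one step (move 1 unit between accounts)."""
    a, b = state
    moves = []
    if a > 0:
        moves.append((a - 1, b + 1))
    if b > 0:
        moves.append((a + 1, b - 1))
    return moves

def invariant(state):
    # The property the design must never violate, in any reachable state.
    return sum(state) == TOTAL and all(x >= 0 for x in state)

def check():
    """Exhaustively explore every reachable state, the way a model checker would."""
    seen, frontier = {INITIAL}, [INITIAL]
    while frontier:
        state = frontier.pop()
        if not invariant(state):
            return f"design flaw: invariant violated in state {state}"
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return f"invariant holds in all {len(seen)} reachable states"

# If next_states allowed, say, a transfer out of an empty account, the checker
# would report the violating state here, long before any real code was written.
print(check())
```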

WATCH: In Conversation with Leslie Lamport


Notes

[1] This post is adapted from “How to Write Software with Mathematical Precision,” by Sheon Han at Quanta Magazine. Lamport’s seminal 1978 paper, “Time, Clocks, and the Ordering of Events in a Distributed System,” Communications of the ACM, is available at https://dl.acm.org/doi/10.1145/359545.359563

[2] Here’s a more detailed description of Lamport’s idea of logical clocks. Each process keeps its own counter. It increments the counter before every local event, attaches the counter’s current value to every message it sends, and, on receiving a message, sets its counter to one more than the larger of its own value and the value carried by the message. As a result, if one event can causally affect another — for example, the sending of a message and its receipt — the first event is guaranteed to carry the smaller timestamp.
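To make those rules concrete, here is a minimal sketch of a Lamport clock in Python. It illustrates the mechanism described above rather than reproducing code from Lamport or the Quanta article; the Process class and the two-process scenario are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    """A single process carrying a Lamport logical clock."""
    name: str
    clock: int = 0
    log: list = field(default_factory=list)

    def local_event(self, description):
        # Rule 1: increment the clock before each local event.
        self.clock += 1
        self.log.append((self.clock, description))

    def send(self, description):
        # Sending is also an event: increment, then stamp the outgoing message.
        self.clock += 1
        self.log.append((self.clock, f"send {description}"))
        return self.clock, description  # the timestamp travels with the message

    def receive(self, message):
        # Rule 2: on receipt, advance past both the sender's stamp and our own clock.
        stamp, description = message
        self.clock = max(self.clock, stamp) + 1
        self.log.append((self.clock, f"recv {description}"))

# Two processes whose physical clocks never agree can still order related events.
a, b = Process("A"), Process("B")
a.local_event("write x")
msg = a.send("x updated")
b.local_event("unrelated work")
b.receive(msg)

# Sorting all events by (timestamp, process name) gives one consistent order
# that respects causality: the send of "x updated" always precedes its receipt.
events = sorted([(t, "A", e) for t, e in a.log] + [(t, "B", e) for t, e in b.log])
for t, proc, e in events:
    print(t, proc, e)
```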
