went by, in alliances that included powerful corporations and governments that were very pleased to run machines with software that did not come from the laboratories of Microsoft in Redmond, Washington, or of Apple in Cupertino, California. It was not that Moglen or his original long-haired clients had changed or compromised their views: the world simply had moved in their direction, attracted not necessarily by the soaring principles of “free as in free speech,” or even because it was “free as in free beer.” They liked it because it worked. And, yes, also because it was free.
The hackers had led an unarmed, unfunded revolution: to reap its rewards, all that the businesses, and anyone else, had to do was promise to share it. The success of that movement had changed the modern world.
It also filled the lecture hall on a Friday night. Yet Moglen, as he stood in the auditorium that night in February 2010, would not declare victory. It turned out that not only did free software not mean free beer, it didn’t necessarily mean freedom, either. In his work, Moglen had shifted his attention to what he saw as the burgeoning threats to the ability of individuals to communicate vigorously and, if they chose, privately.
“I can hardly begin by saying that we won,” Moglen said, “given that spying comes free with everything now. But we haven’t lost. We’ve just really bamboozled ourselves and we’re going to have to unbamboozle ourselves really quickly or we’re going to bamboozle other innocent people who didn’t know that we were throwing away their privacy for them forever.”
His subject was freedom not in computer software but in digital architecture. Taken one step at a time, his argument was not hard to follow.
In the early 1960s, far-flung computers at universities and government research facilities began communicating with one another, a network of peers. No central brain handled all the traffic. Every year, more universities, government agencies, and institutions with the heavy-duty hardware joined the network. A network of networks grew; it would be called the Internet.
The notion that these linked computers could form a vast, open library, pulsing with life from every place on earth, gripped some of the Internet’s earliest engineers. That became possible in 1989, when Tim Berners-Lee developed a system of indexing and links, allowing computer users to discover what was available elsewhere on the network. He called it the World Wide Web. By the time the public discovered the web in the mid-1990s, the personal computers that ordinary people used were not full-fledged members of the network; instead, they were adjuncts, or clients, of more centralized computers called servers.
“The software that came to occupy the network was built around a very clear idea that had nothing to do with peers. It was called server-client architecture,” Moglen said.
So for entry to the promise and spoils of the Internet, individual users had to route their inquiries and communications through these central servers. As the servers became more powerful, the equipment on the desktop became less and less important. The servers could provide functions that once had been built into personal computers, like word processing and spreadsheets. With each passing day, the autonomy of the users shrank. They were fully dependent on central servers.
“The idea that the network was a network of peers was hard to perceive after a while, particularly if you were a, let us say, an ordinary human being,” Moglen said. “That is, not a computer engineer, scientist, or researcher. Not a hacker, not a geek. If you were an ordinary human being, it was hard to perceive that the underlying architecture of the net was meant to be peerage.”
Then, he said, the problem became alarming, beginning with an innocent, and logical, decision made by naïve technologists. They