“For people who want to make sure the Web serves humanity, we have to concern ourselves with what people are building on top of it,” Tim Berners-Lee told me one morning in downtown Washington, D.C., about a half-mile from the White House. Berners-Lee was speaking about the future of the Internet, as he does often and fervently, with great animation and at a remarkable cadence. With an Oxonian wisp of hair framing his chiseled face, Berners-Lee appears the consummate academic—communicating rapidly, in a clipped London accent, occasionally skipping over words and eliding sentences as he stammers to convey a thought. His soliloquy mixed excitement with traces of melancholy. Nearly three decades earlier, Berners-Lee invented the World Wide Web. On this morning, he had come to Washington as part of his mission to save it.
At 63, Berners-Lee has thus far had a career more or less divided into two phases. In the first, he attended Oxford; worked at the European Organization for Nuclear Research (CERN); and then, in 1989, came up with the idea that eventually became the Web. Initially, Berners-Lee’s innovation was intended to help scientists share data across a then obscure platform called the Internet, a version of which the U.S. government had been using since the 1960s. But owing to his decision to release the source code for free—to make the Web an open and democratic platform for all—his brainchild quickly took on a life of its own. Berners-Lee’s life changed irrevocably, too. He was named one of the 20th century’s most important figures by Time, received the Turing Award (named after the famed code breaker Alan Turing) for his achievements in computer science, and was honored at the Olympics. He has been knighted by the Queen. “He is the Martin Luther King of our new digital world,” says Darren Walker, president of the Ford Foundation. (Berners-Lee is a former member of the foundation’s board of trustees.)