The firm had also become well known as a place whose hiring philosophy was to recruit MIT dropouts. The idea was that if they could get into MIT they were smart, and if they dropped out, you could get them cheaper.

In building new computers, said Barker, the operative assumption was that you design something you think will work, get the prototype ready, start testing, and then gradually fix the design errors until the machine passes the test. It would have been an engineering fluke if the machine had run perfectly straight away.

Lick hired people based not on their doctoral work or class standing but on a simple test he applied: the Miller Analogies Test. (The test covers every field from geology to history and the arts. It requires both good general knowledge and an ability to apply that knowledge to relationships.) “I had a kind of a rule,” he said. “Anybody who could do 85 or better on the Miller Analogies Test, hire him, because he’s going to be very good at something.”

“The hope,” Licklider wrote, “is that in not too many years, human brains and computing machines will be coupled . . . tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.”

“The process of technological development is like building a cathedral,” remarked Baran years later. “Over the course of several hundred years new people come along and each lays down a block on top of the old foundations, each saying, ‘I built a cathedral.’ Next month another block is placed atop the previous one. Then comes along an historian who asks, ‘Well, who built the cathedral?’ Peter added some stones here, and Paul added a few more. If you are not careful, you can con yourself into believing that you did the most important part. But the reality is that each contribution has to follow onto previous work. Everything is tied to everything else.”

Heart liked working with small, tightly knit groups composed of very bright people. He believed that individual productivity and talent varied not by factors of two or three, but by factors of ten or a hundred. Because Heart had a knack for spotting engineers who could make things happen, the groups he had supervised at Lincoln tended to be unusually productive.

The IMP had other self-sufficiency measures, one of which was called a “watchdog” timer. “If the program went wild,” Heart explained, a small timer in the machine would run down to zero (but a healthy program would continually reset it). If the timer reached zero and went off, the IMP was assumed to have a trashed program. It would then throw a relay to turn on the paper tape reader, give it a while to warm up, and then reload the program. BBN would ship each machine with a copy of a tape that had three copies of the entire IMP operating program in sequence, giving each IMP the chance to do three auto-reloads before someone had to go and rewind the tape.
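The watchdog-and-reload scheme can be sketched in a few lines of Python. This is a hypothetical illustration, not BBN's actual IMP code: a countdown timer that a healthy program continually resets, and a tape reader that serves up one of three program copies whenever the timer runs down to zero.

```python
class WatchdogIMP:
    """Toy model of the IMP's watchdog timer and tape auto-reload.

    All names and the timer length are invented for illustration;
    only the three-copies-per-tape detail comes from the account above.
    """

    TIMER_START = 5      # ticks before the watchdog fires (assumed value)
    COPIES_ON_TAPE = 3   # the shipped tape held three copies of the program

    def __init__(self):
        self.timer = self.TIMER_START
        self.tape_position = 0   # how many copies the reader has consumed
        self.reloads = 0

    def reset_watchdog(self):
        """A healthy program calls this continually, keeping the timer up."""
        self.timer = self.TIMER_START

    def tick(self):
        """One hardware clock tick; a trashed program never resets the timer."""
        self.timer -= 1
        if self.timer <= 0:
            self._reload_from_tape()

    def _reload_from_tape(self):
        if self.tape_position >= self.COPIES_ON_TAPE:
            raise RuntimeError("tape exhausted; someone must rewind it")
        self.tape_position += 1   # reader advances past one program copy
        self.reloads += 1
        self.timer = self.TIMER_START


imp = WatchdogIMP()
# Simulate a program that has "gone wild" three times in a row:
# no watchdog resets, so the timer runs down and triggers a reload each time.
for _ in range(3):
    for _ in range(WatchdogIMP.TIMER_START):
        imp.tick()
print(imp.reloads)   # three auto-reloads, one per tape copy
```

A fourth crash would exhaust the tape and require human intervention, which matches the economics of the design: automate the common failure, and page a person only for the rare one.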

The very word “protocol” found its way into the language of computer networking because of the need for collective agreement among network users. The word had long been used for the etiquette of diplomacy and for certain diplomatic agreements. But in ancient Greek, protokollon meant the first leaf of a volume: a flyleaf attached to the top of a papyrus scroll that carried a synopsis of the manuscript, its authentication, and the date. Fittingly, a word referring to the top of a scroll corresponded well to a packet’s header, the part of the packet containing address information.

Stefferud and others in the MsgGroup—the community with the most experience with e-mail—immediately saw the flaws in the U.S. Postal Service’s plan, which involved converting messages from digital electronic media to paper and then delivering them by hand as you would ordinary mail. Not only would this approach cost more than e-mail, but it would never be fast enough to compete with e-mail as long as it depended on USPS’s traditional foot power for those final steps to the mailbox. Desktop computers “will make the perfect mailbox,” Stefferud predicted, and would bypass the post office entirely. An analogy could be drawn to the once farcical notion of automated garbage collection, which was unthinkable until the invention of the “electric pig,” the early name given to the in-sink disposal. “The key is not in automating the bag/can/truck/person mechanism,” Stefferud said. “It is in bypassing them altogether.”

One of the MsgGroup’s eminent statesmen, Dave Crocker, sometimes probed the Net with a sociologist’s curiosity. One day, for example, he sent a note to approximately 130 people around the country at about five o’clock in the evening, just to see how fast people would get the message and reply. The response statistics, he reported, were “a little scary.” Seven people responded within ninety minutes. Within twenty-four hours he had received twenty-eight replies. Response times and numbers of that order may seem hardly noteworthy in a culture that has since squared and cubed its expectations about the speed, ease, and reach of information technology. But in the 1970s “it was an absolutely astonishing experience,” Crocker said, to have gotten so many replies, so quickly, so easily.

Perhaps what TCP/IP had to recommend it most was the fact that it was thoroughly “open.” Its entire design was an open process, following a path first blazed by Steve Crocker and the Network Working Group and continuing into the Internet. The ARPANET, and later the Internet, grew as much from the free availability of software and documentation as from anything else.

“Standards should be discovered, not decreed,” said one computer scientist in the TCP/IP faction. Seldom has it worked any other way.

Frank Heart’s pragmatic attitude toward technical invention—build it, throw it out on the Net, and fix it if it breaks—permeated Net sensibility for years afterward.