Intel's accidental revolution
By Michael Kanellos
Staff Writer, CNET News.com
November 14, 2001, 4:00 a.m. PT

The foundation of modern computing was something of an accident.

The Intel 4004 microprocessor, which debuted thirty years ago Thursday, sparked a technological revolution because it was the first product to fuse the essential elements of a programmable computer into a single chip.

Since then, processors have allowed manufacturers to embed intelligence into PCs, elevators, air bags, cameras, cell phones, beepers, key chains and farm equipment, among other devices.

But that's not the way the story was supposed to turn out.

The 4004 was designed to be a calculator component for a Japanese manufacturer, which initially owned all rights to the chip. At the time, most Intel executives saw little promise in the product.

The microprocessor's transformation of Intel and other PC-centric companies into titans of industry instead came through clever bargaining, some fortuitous design decisions and chance.

"I think it gave Intel its future, and for the first 15 years we didn't realize it," said Intel Chairman Andy Grove. "It has become Intel's defining business area. But for...maybe the first 10 years, we looked at it as a sideshow. It kind of makes you wonder how many sideshows there are that never become anything more."

In the past 30 years, of course, microprocessors and microcontrollers (embedded microprocessors with integrated components) have become ubiquitous. In 2000 alone, 385 million microprocessors were shipped and 6.4 billion microcontrollers went out factory doors, according to Mercury Research.

"It is not an exaggeration to say that the microprocessor has made a fundamental impact on everyone's life in this country," said Linley Gwennap, principal analyst at The Linley Group. "Before the microprocessor, computers were these huge things...that filled up a room or at least were file cabinet size."

The chip trio
The 4004 was essentially the brainchild of three engineers: Ted Hoff, Stan Mazor and Federico Faggin. In April 1969, Busicom, a Japanese calculator manufacturer, contracted with Intel, then specializing in memory, to develop a series of custom chips for five upcoming machines.

The concept had been considered inevitable; the difficulty lay in how to do it. Mazor, a former Fairchild Semiconductor engineer, joined Hoff to develop a design.

Economically, a single chip was imperative. Busicom's original specifications "would have taken about 16 different chips," recalled Les Vadasz, president of Intel Capital, the chipmaker's investment arm, and one of the managers of the 4004. "We said, 'Holy s**t. We don't have that kind of manpower.'"

Cost-conscious Intel also required that the calculator chip fit into the same 16-pin package the company used on its memory products. Pins, the metallic channels on a chip package, serve as conduits for electrical signals.

"We were very careful in being minimalistic," Mazor said. "Management wasn't too interested in (the 4004). We got into the computer business more or less by mistake."

After Hoff and Mazor completed the conceptual architecture, Intel's Vadasz lured Faggin from Fairchild in April 1970 to construct the chip. Like Hoff, Faggin had already established a reputation within the industry. He had developed silicon gate technology, which allowed designers to drop the aluminum transistor gates that were far larger and harder to control.

Silicon gate technology "was smaller, faster, more reliable, cheaper. What more do you want?" Faggin said.
To this day, disagreements swirl over who deserves the most credit for the 4004. The architecture guaranteed the chips would work, said Mazor, calling Faggin "the guy who stayed up all night and tested them to see if they worked."

For his part, Faggin said that "anybody with a college degree could design an instruction set," a fundamental part of Hoff and Mazor's work in 1969--an opinion shared by some analysts. Mazor even admits that he and Hoff borrowed liberally from IBM and Digital instruction sets. Vadasz, who had a bitter falling out with Faggin in the 1970s, credits Hoff because he came up with the necessary creative conceptual leaps.

In any event, deadlines had already become a crisis. On Faggin's second day on the job, Masatoshi Shima, a Busicom engineer, arrived to check on the project's progress. No work had been done since December. Shima hit the roof.

"It was very close" to falling apart, Faggin recalled. "It took me the best part of one week to calm him down."

Nonetheless, Busicom granted an extension to the contract. Fourteen-hour workdays for Faggin and three drafting assistants followed. Unlike current designers, who use high-end workstations to design circuits, Faggin's team laid out circuit patterns with razor-thin strips of rubylith, a masking film now considered archaic even by newspaper layout rooms.

While the 4004 became the first microprocessor, Intel's total package consisted of four chips: the 4004 itself plus the 4001, a read-only memory (ROM) chip for storing software; the 4002, a random access memory (RAM) chip for data storage; and the 4003, an input-output device. By October, working samples of the 4001 had been produced--a milestone.

"Before that time, I was under a lot of stress because I didn't know if there were any 'gotchas,'" Faggin said.

Despite early success, the first batch of 4004 chips didn't work--a quick look through a microscope showed the manufacturing team had forgotten a crucial step. The memory still prompts a big laugh from Faggin.

Although the delays angered Busicom, the extension handed Intel its first fortunate twist of fate. Some Intel insiders began to comprehend the power of the invention, assisted by pushing from the three inventors.

Intel founder Bob Noyce, for instance, started to question whether the 4004 had broader implications, recalled Vadasz.

Meanwhile, the calculator business had become more cutthroat. By the time Intel finished the 4004, Busicom wanted a discount. Intel made a counteroffer: It would drastically cut the contract price if Busicom would grant Intel a license to freely sell the chip outside the calculator market.

Busicom agreed. Whoops.

Mixed reaction
An article in Electronic News heralded the release of the 4004. It processed 4 bits of data at a time, ran at 108 kilohertz (about a tenth of 1 megahertz) and could perform mathematical calculations. It cost less than $100. Gordon Moore, Intel's CEO at the time, hailed it as "one of the most revolutionary products in the history of mankind."

Others were less excited. "It was interesting, but it certainly wasn't perceived as a threat," said Nathan Brookwood, a processor analyst who was at that point working at Digital Equipment, the then-reigning titan in mini-computers.

Years later, many still failed to grasp the concept. In 1975, a senior engineer at DEC told Brookwood that Intel would "never be a threat...That was the conventional wisdom in the mini-computer business in the mid-1970s to late 1970s."

In April 1972, Intel released the 8008, which could process data in 8-bit chunks. Negotiations once again worked to Intel's advantage.
The 8008 chip was designed for Datapoint, a terminal manufacturer in Texas that couldn't pay for it at the end of the contract. To settle, Datapoint granted Intel the rights to the chip, including the instruction set, which Datapoint developed. The instruction set eventually became part of the basis for the x86 architecture behind Intel chips today.

"The irony is that the original instruction set was theirs, and the original motivation was theirs," Mazor said.

The breakthrough moment for microprocessing came in 1974, according to many, with the 8080 processor. Not only did the chip feature a more complex instruction set, it came in a package with 40 pins, two innovations that greatly expanded its capabilities. "With 4-bit processors, the level of complexity is minimal," said Dean McCarron, principal analyst at Mercury Research. "The 8080 was a home run."

So why Intel?
By this time, though, competitors such as RCA, Honeywell and Fairchild had come out with microprocessors, many of which, such as Motorola's 6800 family, provided superior performance. Zilog, whose engineers included Faggin and former Busicom engineer Shima, received rave reviews for its Z80 processor.

So how did Intel emerge as the victor?

For one, the company strove to ensure that adoption was as easy as possible. Along with chips, Intel sold complete development systems to industrial designers to seed software development.

"In a way, through that project, we had the first PC, but we never capitalized on it," Vadasz said. "With the emergence of the PC, that business disappeared."

Competitors also miscalculated demand. National Semiconductor, for instance, marketed an expensive 16-bit chip in an 8-bit world, recalled Mazor. "Everybody did everything else wrong, and they did it with great effort," he said.

But most importantly, IBM selected the Intel 8088 for the first PC in 1981. IBM had two PC projects: one in Austin, Texas, and one in Florida. The Austin project relied on a Motorola processor, but delays made IBM favor the Florida project.

"You can't underestimate the importance of the IBM deal," McCarron said. "If it wasn't for that, we'd be talking about Motorola vs. AMD."

Or not. In a final twist in the early years, IBM required that Intel find a second source for the chip. The company turned to AMD, signing a licensing agreement that effectively helped create its lead competitor today.