The introduction of microprocessors in the 1970s made personal computers possible.

Over the last decade, advances in pricing and processing power have made the personal computer the largest consumer of microprocessors. At the same time, microprocessors have transformed the ubiquitous PC from a stand-alone office workhorse doing word processing and spreadsheets into a widely connected information machine that can send faxes and e-mail, access on-line services, or provide a video link for meetings. Today, Pentium processors and clones are driving the PC into untapped new frontiers of mass-market communications and interactive multimedia home computing. By the turn of the century, when high-volume chips are capable of executing more than a billion instructions per second, doors will open to brave new worlds we can only begin to imagine, such as holographic videoconferencing and personal digital assistants that beep your cardiologist when your stock portfolio slides.

The Pentium microprocessor [actual size]

Microprocessor

---A microprocessor, also called a CPU, is a tiny, enormously powerful, high-speed electronic brain etched on a single silicon semiconductor chip that contains the basic logic, storage and arithmetic functions of a computer. It thinks for the computer and, like a traffic cop, coordinates its operations. It receives and decodes instructions from input devices like keyboards, disks, or cassettes, then sends them over a bus system consisting of microscopic etched conductive "wiring" to be processed by its arithmetic and logic unit. The results are temporarily stored in memory cells and released in a timed sequence through the bus system to output devices such as CRT screens, networks, or printers.

The first microprocessor, the Intel 4004 4-bit [1971], measured just 1/8" by 1/16" yet was as powerful as the first electronic computer 25 years earlier [1946], which weighed 30 tons, used 18,000 vacuum tubes, and required so much electricity that the lights of West Philadelphia are said to have dimmed each time it was turned on. Today, DEC's 64-bit Alpha microprocessor is more than 550 times as powerful as the 4004, with speeds comparable to yesterday's mainframes.

Programmability

---CPUs can be programmed by the chip manufacturer, a distributor, or the computer manufacturer. Programs for a single-purpose product, like a calculator or a video game, are generally written by the OEM and entered into memory by the CPU manufacturer or distributor. For PCs, the CPU, which must perform a wide range of tasks, is generally programmed by the computer's manufacturer. The user merely inserts a prerecorded cassette tape, cartridge, or floppy disk containing instructions for each application into the computer and the CPU performs the instructions.

Key Components

---A microprocessor has five key components: an arithmetic and logic unit [ALU], which calculates and thinks logically; registers, which are memory cells that store information temporarily for the ALU; a control unit, which decodes input instructions and acts as a traffic cop; bus systems, which are submicron wiring routes connecting the entire system; and a clock, which times the sequential release of the processed data.
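To make the division of labor among these components concrete, here is a minimal sketch in Python of a hypothetical four-instruction machine. The instruction names, the single ACC register, and the one-instruction-per-clock-tick pacing are invented for illustration only and do not correspond to any real chip's design.

```python
# Illustrative toy processor: registers, an ALU, a control unit that decodes
# instructions, and a "clock" that steps through the program one tick at a time.

# Hypothetical program: each tuple is (operation, operand).
PROGRAM = [
    ("LOAD", 7),    # put the value 7 into the accumulator register
    ("ADD", 5),     # ALU adds 5 to the accumulator
    ("SUB", 2),     # ALU subtracts 2
    ("OUT", None),  # release the result to an output device
]

def alu(op, a, b):
    """Arithmetic and logic unit: performs the actual calculation."""
    if op == "ADD":
        return a + b
    if op == "SUB":
        return a - b
    raise ValueError(f"ALU cannot perform {op}")

def run(program):
    registers = {"ACC": 0}  # register: temporary storage for the ALU
    for tick, (op, operand) in enumerate(program):  # clock: one instruction per tick
        # Control unit: decode the instruction and route data to the right place.
        if op == "LOAD":
            registers["ACC"] = operand
        elif op in ("ADD", "SUB"):
            registers["ACC"] = alu(op, registers["ACC"], operand)
        elif op == "OUT":
            print(f"tick {tick}: output {registers['ACC']}")  # output device
        else:
            raise ValueError(f"unknown instruction {op}")

run(PROGRAM)  # prints: tick 3: output 10
```

In a real microprocessor the same roles are played by circuitry rather than software, and data moves between the units over the bus system described above.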

Computer Interface

---In addition to a CPU, a computer requires memory and ports for connecting it to input/output devices. A device that includes a CPU, memory and input/output ports on a single chip is called a microcontroller. The two basic types of memory are RAM [Random Access Memory] and ROM [Read Only Memory]. RAM stores modifiable programs and data that the user needs to perform immediate tasks. ROM stores unalterable programs that govern the computer's standard operations. Input devices like keyboards, mice, and programs on cassette tape, cartridges, disks, and CD-ROM enable us to communicate with the computer. Output devices like monitors and printers enable the computer to communicate with us, while modems enable computers to communicate with each other.

It is astonishing how electronic computer technology has transformed our world, considering it is only 50 years old and the microprocessor, which revolutionized computers, is less than 25 years old. Its development involved the convergence of three developing technologies: the calculator, the computer and the transistor.

The first "digital calculators"--fingers and thumbs--are still in use, as is the 5,000 year old abacus. Calculators go back to 17th century inventions, including Schickard's Calculating Clock [1623], Pascal's Pascaline [1642], and Leibriz's Stepped Reckoner [1673]--machine that used hand-made gears and wheels to do arithmetic.

The next breakthrough came as a result of the Industrial Revolution during the first half of the 19th century with Charles Babbage's steam powered mechanical computing and printing machines: The Difference Engine and the Analytical Engine. Although never completed, their designs included a "mill" to carry out arithmetic processes, a "store" to hold data, a "barrel" to control the operation and "punch cards" to provide the external program--all fundamentals of modern computers.

Business calculators appeared next. In 1884, Felt invented the key-driven mechanical calculator. In 1893, Steiger introduced a mass-produced, automated version of Leibniz's machine, widely used for scientific calculations.

Twentieth Century

---At the end of the 19th century, a flood of immigration created a major problem for US Census takers. To hand-sort the 1890 census information would have taken a decade, rendering the data virtually useless. But in 1889, Herman Hollerith came up with a solution: the first electromechanical machine for recording and tabulating digital material.

In the 1920's and 30's, the use of punch card equipment for data processing expanded. In 1933, Columbia University received an endowment of punch card and accounting machines from IBM's Thomas Watson, which led Wallace Eckert to create a mechanical program to link them together, closing the gap between calculators and future computers. Shannon and Stibitz's later discovery that relay circuits could perform binary math provided the basis for the rise of the electronic computer.

World War II

---During World War II, military requirements for gun trajectories and code breaking accelerated computer research. Though Germany showed little interest in computers, in 1941 Konrad Zuse, a German, built the Z3, the first electromechanical, general-purpose, program-controlled computer, later destroyed by Allied bombs. In 1943, the British built COLOSSUS, an electronic cryptanalysis machine used to crack high-level German ciphers--a crucial breakthrough in winning the war.

In 1946, ENIAC, the first general-purpose, program-controlled, all-electronic, vacuum-tube-based digital computer, was completed at the University of Pennsylvania to solve ballistic equations. ENIAC worked 500 times faster than the earlier electromechanical Harvard Mark I [1937-44], which doomed the electromechanical approach. Other electronic computers quickly followed, including the Cambridge University-built EDSAC [1946-49], the first full-scale stored-program computer, and the MIT Whirlwind [1945-50], the first interactive, parallel, real-time computer.

Vacuum Tubes

---Vacuum tubes became the standard circuitry element of the first generation of mass-produced computers in the 1950s. UNIVAC I, built in 1951 by Remington Rand's Eckert-Mauchly Computer Division, made Remington the early leader. In 1953 MIT's Whirlwind introduced magnetic core memory, which became the primary memory for most computers until the mid-'70s. In the mid-'50s, with UNIVAC I obsolete and UNIVAC II delayed, IBM's 705 [a vacuum tube, magnetic core memory machine] established IBM as the large-scale computer leader--a lead it never relinquished. But tube-based computers encountered problems of power, temperature, size and, especially, tube maintenance.

The Transistor

---The solution, the transistor, came from the world of telecommunications. Wartime research had produced new information on the normally non-conductive elements germanium and silicon, which become conductors when their atomic structure is altered and a very tiny current is applied. This led to a project begun at Bell Labs in 1946 to develop a "solid state" telephone signal amplifier to replace vacuum tube ones. Bell's John Bardeen and Walter Brattain, led by William Shockley, invented the point-contact transistor there in 1947, a turning point in electronics and computer technology.

Commercial viability began with Shockley's cheaper, more reliable junction transistor in 1950. In 1953, Sonotone marketed the first transistorized hearing aid. In 1954, TI produced the first portable transistor radio and Philco developed the surface barrier transistor, first used in 1956 in the MIT Lincoln Laboratory's TX-0 computer built by Ken and Stan Olsen and Harlan Anderson, who left MIT in 1957 to start up DEC. By 1959, tubes were out and transistors were in, ushering in a second generation of smaller, more powerful computers. These included the IBM 7090 series [1959]; the IBM 1401 [1960], which became the world's most widely used computer; and MIT's LINC [1962], the first personal computer.

The Integrated Circuit

---Early transistors also had a major drawback: like vacuum tubes, they were discrete components and had to be wired together to form a single circuit, a process that was expensive and inefficient. In 1958, newly hired Jack Kilby at Texas Instruments designed the first integrated circuit [IC]. The following year Jean Hoerni--one of the "Traitorous Eight" who, with Robert Noyce, Gordon Moore, and five others, had defected from Shockley Semiconductor to form Fairchild--invented the planar [flat] transistor, making viable the commercial production of ICs. That same year, Noyce filed a patent for an IC based on the planar process, which became the basis for subsequent IC production, even at TI.

In 1961 Fairchild marketed the first planar IC, which used deposited aluminum film rather than wire for internal connections. In 1962 RCA's Stephen Hofstein and Frederic Heiman developed a MOSFET [metal-oxide-semiconductor field-effect transistor] for use in ICs. The MOSFET used a single type of charge carrier, was cheaper, smaller and used less power than bipolar transistors, and would become the most commonly used transistor in the manufacture of microprocessors.

Because volume manufacturing of ICs was not quite ready, as an interim measure, hybrid circuits composed of tiny, discrete bare chips were used by IBM for its 1964 System/360, the first family of ROM-programmed computers, which ushered in the third generation of computers, rendering all previous commercial computers obsolete. The DEC PDP-8, the first true minicomputer, appeared in 1965, followed by Hewlett-Packard's 2116A in 1966 and Data General's NOVA in 1968.

Meanwhile, other advances in IC technology set the stage for the development of the microprocessor. These included Fairchild's production in 1967 of the Micromosaic, the first IC made using computer-aided design, and the development in 1970 of large-scale integration [LSI] using MOS transistors. At the start of the '60s, less than 1% of all solid state components manufactured were incorporated in ICs. In 1963, over 10% of all components were in ICs, practically all to satisfy military demand. But by 1970, nearly 90% of all components manufactured were in ICs. This boom triggered an epidemic of mass defections from Fairchild, ironically itself the product of a mass defection. Among these defectors were Charles Sporck, who left to become president of National Semiconductor; Robert Noyce and Gordon Moore, who resigned and joined with Andrew Grove to found Intel in 1968; and Jerry Sanders, who resigned to head a group of seven other Fairchild "graduates" in forming Advanced Micro Devices in 1969.

The First Microprocessor

---Within a year of its founding, Intel was marketing static RAM chips and Hamilton Electro had signed on as its first distributor. In 1970, Intel introduced the first dynamic RAM, which increased IC memory by a factor of four. These early products identified Intel as an innovative young company. However, its next product, the microprocessor, was more than successful: it did for ICs what the IC had done for transistors.

The project that produced the microprocessor originated in 1969, when Busicom, a Japanese calculator manufacturer, asked Intel to build a chip set for high-performance desktop calculators. Busicom's original design called for a dozen different logic and memory chips. Ted Hoff, the Intel engineer assigned to the project, believed the design was not cost effective. His solution was to simplify the design by producing a single programmable processor that could do the work of the set of complex special-purpose calculator chips.

Together with Federico Faggin, later the founder of Zilog, Hoff came up with a four-chip design: a ROM for custom application programs, a RAM for processing data, an I/O device, and an unnamed 4-bit central processing unit which would become known as a "microprocessor."

Busicom, initially skeptical, finally gave Intel the go-ahead. Nine months later, Intel successfully completed the project. Remarkably, the 4-bit 4004 microprocessor, composed of 2,300 transistors etched on a tiny chip, could execute 60,000 instructions per second, making it as powerful as the massive ENIAC. But despite the Busicom success, Intel itself was divided over the microprocessor's broader application. Some, with little idea of the new CPU's potential, opposed it, while advocates such as Hoff and Mazor believed it could handle a host of applications. After a heated debate, the advocates won. Intel negotiated a return of the Busicom design rights and committed to developing the microprocessor's potential.

The First EPROM

---As the 4004 project was being completed in early 1971, Intel introduced the 1701 EPROM [erasable programmable read-only memory], a memory device that greatly increased the microprocessor's practicality and versatility. Invented by Dov Frohman, the EPROM could be programmed electrically by Intel or its distributors. Then, whenever necessary, it could be erased by exposure to ultraviolet light and reprogrammed. Intel senior vice-president Ed Gelbach noted, "It made sense to be able to reprogram the microprocessor instead of buying fixed ROMs for it. You could change your system overnight or every five minutes with an EPROM."

The First 8-Bit: Intel's 8008

---While the 4004 microprocessor and the 1701 EPROM were still being developed, Datapoint approached both Intel and TI to design large-scale integrated circuits for a new intelligent terminal, but it later rejected both designs, choosing instead to build its terminals with standard logic ICs. Despite the rejection, both Intel and TI completed their projects. TI eventually patented its chip, while in 1972 Intel produced the 8008, the first 8-bit microprocessor. With the capability of processing alphabetical as well as numerical characters, the 8008 represented a huge advance over the arithmetic-only 4004.

Industry Skepticism

---It was not surprising that the use of the microprocessor would initially meet with the same reluctance that greeted the invention of the transistor and the IC. To overcome these initial objections, Intel created a customer education program and introduced development tools, software, and peripherals. These design aids proved crucial to the microprocessor's acceptance and the growth of its customer base.

Market Breakthrough

---The year 1974 brought a series of breakthroughs. Digital, then the leading manufacturer of minicomputers, announced the MVPs, a series of microprocessors built around the Intel 8008. DEC's endorsement of the 8008 meant that the microprocessor had truly arrived. At the same time, Motorola entered the market with a systems-oriented approach. Its 8-bit 6800 offered full support and was the first +5-volt single-power-supply CPU, an innovation that lowered cost. Also in 1974, TI further advanced microelectronic technology while expanding the microprocessor market by introducing the 4-bit TMS 1000, a self-contained computer on a single chip. With a bulk-order price of just $2 and the capability of being used in a broad range of consumer products, including calculators, appliances, toys and games, the TMS 1000 became the most popular microcontroller.

The First General-Purpose Microprocessor: Intel's 8080

---In April 1974, Intel introduced the 8-bit 8080, the first general-purpose microprocessor. With the ability to execute 290,000 instructions per second and to address 64K bytes of memory, the 8080 was the first microprocessor with the speed, power, and efficiency to become a key tool for designers. Development labs set up by Hamilton/Avnet, Intel's first microprocessor distributor, showcased the 8080 and provided a broad customer base, which contributed to its becoming the industry standard. A key factor in the 8080's success was its role in the introduction in January 1975 of the MITS Altair 8800, the first personal computer. It used the powerful 8080 microprocessor and established the precedent that personal computers must be easy to expand. With its increased sophistication, expandability, and an incredibly low price of $395, the Altair 8800 proved the viability of home computers.

The year 1975 also witnessed a dramatic lowering of microprocessor prices. At the September WESCON trade show, MOS Technology created a sensation by offering its 6502 8-bit microprocessor for just $25. At the time, Intel's 8080 and Motorola's 6800 had already come down from $360 to $179. Intel and Motorola immediately dropped their prices to $69.95. By 1980, improved versions of Intel's 8080 cost under $5.

Other Developments

---In 1975, Advanced Micro Devices introduced the 4-bit 2901 bit-slice processor. Instead of the standard unipolar MOS transistors, the 2901 employed much faster Schottky bipolar devices. At the same time, it reduced the uneconomical bipolar power requirements by using groups of so-called "bit-slice" chips that divide up the processor's work and generate less heat. In 1976, Intel introduced the 8048, the first 8-bit single-chip microcomputer. That same year, Zilog's Z80 entered the market. Designed by Masatoshi Shima, who had come over with Federico Faggin from Intel, the Z80 represented a significant advance over the Intel 8080. It had twice the speed, could execute more instructions, and had its own dynamic RAM refresh circuitry. The Z80 was used in Radio Shack's original TRS-80, while an enhanced version ran the video game PAC-MAN.

16-Bit Microprocessor

---In 1974 National Semiconductor introduced PACE, the first 16-bit microprocessor. It was not until the late '70s that fast and efficient 16-bit microprocessors arrived and began serving the high-performance needs of new, sophisticated PCs and industrial computers. These included Intel's 8086 [1978] and the powerful Motorola 68000 [1979], which anticipated the next generation of microprocessors by combining data into 32-bit chains. With the ability to address 16 million bytes of memory [16 megabytes], the 68000 was used in Apple's Macintosh computers and in the Radio Shack TRS-80.

Other important developments included the introduction of the Intel 8088 [1979], which combined an 8-bit external bus with a 16-bit internal architecture and was used in the IBM PC, and the Intel 80286 [1982], used in the IBM PC AT. In addition, in 1980 Intel introduced the 2816 E2PROM, an electrically erasable programmable read-only memory, which enabled the user to reprogram a microprocessor electrically in the field without removing the memory device.

The 32-Bit Microprocessor

---The 32-bit microprocessor provides increased speed and the power of a minicomputer in a microprocessor. In 1983, National Semiconductor shipped the NS32032--the first full 32-bit microprocessor offered on the merchant market. Texas Instruments second-sourced the 32000 family, which also included the 32332, introduced in 1985 and designed for use in high-performance systems. In 1984, Motorola introduced its advanced CMOS MC68020 and became the second company to ship a full 32-bit microprocessor.

In 1985 Intel entered the 32-bit market with the 80386, a 32-bit processor with 275,000 transistors, a more than 100-fold increase in transistor count over the 4004, a top operating speed of 4 MIPS, and the ability to run DOS software concurrently with UNIX programs. It was the only microprocessor that could make this claim. Intel's 386 pushed the microprocessor into a new realm of commercial applications. Before the 386, computer makers could build higher performing machines with other technologies, but they couldn't build them as small or as inexpensively.

Recession

---The 386 was a bright spot in an otherwise sorry era of microprocessor history. A combination of slumping demand and growing capacity saw the US semiconductor industry lose money every quarter from the second quarter of 1985 through the fourth quarter of 1986. At the same time, Japanese semiconductor manufacturers engaged in unfairly low pricing of DRAMs and EPROMs, forcing Intel to abandon the DRAM business it had created. The whole industry was beset by layoffs, salary cuts and plant closings.

Things began to improve in the fall of 1986, marked by the signing of an historic Japanese/American trade agreement that stopped DRAM dumping and required the Japanese to set aside 20% of their chip purchases for foreign firms by mid 1990.

Not So Riscy Business

---By 1987, 32-bit processor speed records were being broken by a streamlined approach to chip architecture known as RISC, or Reduced Instruction Set Computing. RISC eliminates extraneous instructions so a computer can execute common jobs faster, producing more powerful machines at a lower cost. At the International Solid State Circuits Conference in 1987, Hewlett-Packard Co. unveiled a RISC microprocessor that, at 15 MIPS, was three times faster than industry-leading chips from Intel and Motorola. Shortly afterward, AMD entered the RISC market with a 17-MIPS RISC chip. In 1989, these speedy RISC chips, along with Sun Microsystems' SPARC processor, propelled the market for high-powered scientific, engineering and commercial workstations that performed sophisticated and complex applications at one-twentieth the cost of a minicomputer.
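As a rough, purely illustrative sketch of that trade-off, the Python snippet below contrasts a single complex "CISC-style" memory-to-memory add with an equivalent sequence of simple "RISC-style" load/add/store steps. The instruction names and cycle counts are invented assumptions for the example, not measurements of any real processor.

```python
# Illustrative comparison of the CISC and RISC styles described above.
# All instruction mixes and cycle costs here are made up for the example.

# CISC style: one complex instruction adds two values straight from memory,
# but it hides many slow internal microsteps.
cisc_program = [
    ("ADD_MEM_TO_MEM A, B", 12),
]

# RISC style: the same job as several simple, uniform instructions,
# each of which the hardware can finish in a single short cycle.
risc_program = [
    ("LOAD  r1, A", 1),
    ("LOAD  r2, B", 1),
    ("ADD   r3, r1, r2", 1),
    ("STORE r3, A", 1),
]

def total_cycles(program):
    """Sum the (invented) cycle cost of each instruction."""
    return sum(cycles for _, cycles in program)

print("CISC:", len(cisc_program), "instruction, ", total_cycles(cisc_program), "cycles")
print("RISC:", len(risc_program), "instructions,", total_cycles(risc_program), "cycles")
# The RISC sequence is longer, yet under these assumptions it finishes in
# fewer cycles -- the trade-off behind the speed records of 1987-era RISC chips.
```

The catch, as the Pentium section below explains, is that the world's existing PC software was written for complex-instruction chips, which is why the industry did not simply abandon CISC.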

Mainframe On A Chip

---The nineties witnessed unprecedented improvement in chip pricing and performance, spurred by robust challenges to Intel's domination of the PC world from AMD, TI, Motorola, and newcomers Cyrix Corp., NexGen MicroSystems, Information Technology, Chips & Technologies, and MIPS Computer Systems.

Intel's 486 microprocessor was introduced in late 1989. Dubbed "the mainframe on a chip," the 486 boasted 1.2 million transistors, a 32-bit architecture, clock speeds of 25 MHz and data-crunching power of 15 MIPS. In January of 1990 Motorola parried with the 68040 chip, besting the 486 at 20 MIPS and 3.5 million floating-point operations per second. Along with RISC chips, these microprocessors radically altered the economics of the computer business, ushering in an era of "client-server" networks in which a group of desktop computers works together as if they were all one speedy and powerful mainframe, without temperature-controlled clean rooms and skilled technicians.

Mobile Computing

---While competition continued to drive down prices, space- and power-saving designs were making microprocessors available for a range of new uses in automobiles, cellular phones, pagers, and portable computers. In 1990, one of the hottest new markets was mobile computing, driven by chips such as AMD's Am286LX and Intel's 386SL and 82360SL, which extended a portable computer's battery life by 50% and reduced circuit-board size by 40%. By 1993, further advances in CPU circuitry that reduced power consumption and boosted performance turned the laptop business into the fastest-growing segment of the exploding computer market, with a new category called subnotebooks. Top entries included H-P's OmniBook, IBM's ThinkPad, and Toshiba's T3400, all weighing under 5 lbs. in a package smaller in length and width than a ream of typing paper.

Price Wars

---Competition among chip manufacturers was fast and fierce. In April of 1991, Intel unveiled its low-cost 486SX chip, priced at an affordable $400 and designed to stimulate demand in every type of computer, from tiny palmtop PCs to giant supercomputers. A year later, Cyrix's first offering, the Cx486SLC, mimicked Intel's top-of-the-line 486 for $119 per chip in quantity shipments--a full 40% less than Intel's cheapest design.

Intent on staving off cloners, Intel announced plans to give birth to new chip families every two years, marketing in 1992 nearly 30 new variants of its cutting-edge 486. Though price wars kept profit margins low, the end result was good news across the board. Total sales of microprocessors in 1993 skyrocketed to $8.3 billion, up an incredible 59% from 1992.

Pentium: Double Processing For The Buck

---On March 22, 1993 Intel launched the most complex microprocessor ever built. The Pentium was more than just a follow-on to the 486, however. It was the culmination of an internal battle over RISC technology and conventional CISC [complex instruction-set computer] chips. While RISC chips--including Intel's own--run faster than CISC microprocessors, they are slower when running the enormous library of existing PC software. Intel wanted to build a chip that could run available PC software at near-RISC speeds. The outcome of this effort was the Pentium, a microprocessor with 3 million transistors, clock speeds of more than 100 MHz, and the capacity to execute 100 MIPS to drive the cutting-edge multimedia and communication software of the burgeoning Information Superhighway.

That same year, on the RISC front, Digital released the world's first commercial 64-bit microprocessor, the super-fast Alpha, which, along with Silicon Graphics' R4400, serves as the building block for a new generation of high-end network servers and workstations. In the world of personal computers, RISC is also the basis for Motorola's PowerPC challenge to the dominant x86 class of Intel microprocessors. The end product of a three-year alliance between IBM, Apple Computer, and Motorola, the PowerPC chip allows personal computers to run both Mac and DOS-based software.

AMD Sends In The Clones

---But engineering labs weren't the only battleground in the microprocessor wars. During the late '80s and '90s another offensive was taking place in the courts. The issue was whether AMD had the right to clone Intel's 386 and 486 chips under a 1982 technology-sharing agreement between the two chip makers. Despite initial legal setbacks, AMD cloned the widely used chips, putting itself at risk for huge damage payments. The decision led to a startling rebound for the company in 1994, during which time sales of its 486 chips doubled and production got underway for very fast 100 MHz 486s priced significantly below Intel's. In December of 1994, the California Supreme Court ruled in favor of AMD, after which Intel settled the case rather than appeal the decision to a federal jurisdiction.

Microprocessors For The Masses

---By the end of 1994, the same $2,000 that one year earlier could buy a state-of-the-art 486 PC could now buy a Pentium-based machine with a fast PCI bus for revved-up performance. This led Americans for the first time to spend more money on home PCs and software than on color TVs. In 1995, Microsoft's long-awaited Windows 95 operating system and growing consumer interest in the Internet are expected to drive the PC market to new heights as first-time computer buyers and upgraders flock to the latest crop of multimedia Pentium PCs, outfitted with CD-ROMs, video and sound cards, and high-powered modems. Analysts predict a third year of 30-plus percent microprocessor growth, to $15 billion, up from a mere $1 billion less than a decade ago.

P6: The Next Generation

---In 1996, next-generation chips like Intel's P6 and equivalents from Cyrix, AMD and NexGen will enter the market with 5.5 million transistors and double the data-crunching speed of the fastest Pentiums, initially at 130 MHz and later at over 200 MHz. The P6 will give relatively inexpensive desktop systems the power to deal with voice and images as naturally as they do type fonts today, enabling them to take on even more tasks once reserved for supercomputers--like translating voice into text, conducting video conferences and using intelligent agents to fetch information from on-line networks.

In the commercial realm, the P6 will enhance the performance of network servers with its capacity to run as many as four CPUs at the same time.

Prime-Time PCs

---Computers are also about to take a giant step forward in the world of multimedia, where microprocessors are blurring the lines between PCs, telephones, TVs and video games. Telephone and broadcast companies are now gobbling up digital signal processing [DSP], data compression, and multimedia chips for video servers and TV set-top boxes that will speed the delivery of real-time video and graphic images, movies, and near-virtual-reality 3D holographic games to homes. In PCs, DSP chips will soon allow the computer to hear, see and talk, allowing it to double as a telephone, dictation machine, stereo and video conferencing system.

A Look Toward The Future

---What's in store? Chipmakers will continue finding ways to etch even tinier lines on silicon so that signals zip more quickly between transistors, increasing clock speed and boosting performance exponentially. Today's fastest fifth-generation Windows-compatible microprocessors are built on silicon wafers etched with lines 0.5 microns wide. Sixth-generation microprocessors like the P6 will use 0.35-micron technology, followed by 0.25-micron technology in 1997. Mainstream microprocessors for the x86 instruction set in 2001 will be manufactured in 0.18-micron technology with 20 million transistors. Going beyond 0.18 microns will require more sophisticated techniques such as electron-beam, ion-beam, or X-ray lithography, which can produce lines so small that if such transistors were silicon pearls, you would need a string of 2,500 of them to circle a human hair.

One can only speculate on what creations will be unleashed when a microchip with seven times the computing horsepower of a 100-MHz Pentium is cheap enough and small enough for a product aimed at the mass market. Technology already exists for comic-book fantasies like Dick Tracy watches that serve as phone, fax, stereo and television, and Star Trekian computers that use smart, context-sensitive voice recognition to respond to plain and simple human speech. But if the past is any indication, the microprocessor of the future will continue to surpass the wildest expectations of even its staunchest advocates.

Twenty-five years ago, the answer to the question, "But what can it do?" created an industry worth hundreds of billions of dollars, giving rise to everything from pocket calculators to personal computers that entertain, inform, and educate. Today, with cyberspace about to become a suburb of America, analyst Daniel T. Klesken at Robertson, Stephens & Co. in San Francisco has already raised his chip industry forecast for the year 2000 from $200 billion to $350 billion. Exactly where computer science and technology will take us in the 21st century is anyone's guess. But whatever the specifics, one thing is clear: the microprocessor will be there, generating a future of dazzling promise and unimaginable achievement.
