A Brief History of Personal Computers

The electronic computer is a relatively modern invention; the first fully operable computer was developed about 50 years ago, at the end of World War II, by a team at the University of Pennsylvania's Moore School of Engineering. This team was headed by John Mauchly and J. Presper Eckert, who named the new machine ENIAC, for Electronic Numerical Integrator and Computer. ENIAC was hardly a personal computer, occupying a large room and weighing about 30 tons. By today's standards, ENIAC was extremely slow, unreliable, and expensive to operate. In 1945, on the other hand, it was considered a marvel.

Over the next 30 years, computers became smaller, faster, and less expensive. However, most of these machines remained isolated in their own air-conditioned rooms, tended by specially trained personnel. By 1975, computers were in great demand at universities, government agencies, and large businesses, but relatively few people had ever come face-to-face with an actual computer. This all began to change in the late 1970s.

To understand why, let's take a closer look at the early computers. ENIAC and its immediate successors were large, slow, and unreliable primarily because they used thousands of large, slow, and unreliable vacuum tubes in their electronic circuits. The vacuum tubes were glass cylinders, typically about four inches high and an inch in diameter, which generated a lot of heat and thus could not be placed too close together. Then, in 1947, a momentous event occurred at Bell Labs - William Shockley, John Bardeen, and Walter Brattain announced the invention of the transistor. Only about an inch long and a quarter inch across, a transistor produced very little heat, and did the same job as a vacuum tube.

The downsizing of computers began in the 1950s as transistors replaced vacuum tubes, and continued into the 1960s with the introduction of the integrated circuit (IC) - an ice cube-sized package containing hundreds of transistors. By the late 1960s, microchips, consisting of thousands of electronic components residing on a piece of silicon the size of a postage stamp, had begun to replace ICs. At this time, some minicomputers occupied a space no larger than a small filing cabinet and cost less than $25,000. Then, in 1970, Marcian Hoff, Jr., working at Intel Corporation, invented the microprocessor, a central processing unit on a chip. The technological world was now ready for the personal computer.

The First Personal Computer

The first personal computer to be successfully marketed to the public was built in 1974. It was designed by Micro Instrumentation and Telemetry Systems (MITS), a small electronics firm located in New Mexico, which named it the Altair 8800. The Altair was a very primitive machine about the size of a bread box; it contained 256 bytes (not kilobytes) of RAM, had no ROM, and its input and output devices consisted of rows of toggle switches and lights, respectively. It was also quite inexpensive - $395 in kit form. Sales of the Altair took off after an article about it appeared in the January, 1975 issue of Popular Electronics magazine.

Although add-on products for the Altair 8800 (such as memory boards and paper tape readers) gradually appeared over the next couple of years, it's questionable whether anyone ever did any useful work on this machine. Nevertheless, the Altair is of major historical significance because it inspired thousands of computer hobbyists and professionals to become interested in personal computers. Two of those inspired by the Altair were Paul Allen and William Gates, both about twenty years old at the time. They joined together to write and sell a version of the BASIC programming language for the new computer. With the easy-to-use BASIC available, Altair owners no longer had to write programs in low-level, mind-numbing machine language. Soon thereafter, Gates and Allen formed Microsoft Corporation, which is now the world's largest software company (and the publisher of Windows).

Another Altair aficionado was Stephen Wozniak, who joined forces with his friend and fellow Californian, Steven Jobs, to form Apple Computer, Inc. In 1977, they brought the now legendary Apple II personal computer to market. The Apple II was an instant hit, and for the next few years Apple was the fastest-growing company in the United States.

The IBM PC

By 1980, there were dozens of companies manufacturing personal computers, but the major producers of the larger minicomputers and mainframes had not yet entered the fray. This changed dramatically in 1981, when IBM brought out its first personal computer (not so imaginatively named the IBM Personal Computer). Although it wasn't much more powerful than most other personal computers of the time, the IBM PC was a milestone in the history of personal computers for two basic reasons:

1. For many businesses, especially the larger ones, it 'legitimized' personal computers. If IBM was selling them, the reasoning went, then maybe PCs really could be a useful business tool. As a result, the IBM PC became wildly popular; IBM could not produce them fast enough to keep up with the demand.

2. It was built with generic parts and used an operating system (PC-DOS) that was developed by Microsoft and was virtually identical to the MS-DOS that Microsoft sold separately. The IBM PC also used open architecture - IBM published detailed specifications so that anyone could build circuit boards for it to expand its capabilities.

These features enabled enterprising companies to 'clone' the PC - to build their own IBM-compatible personal computers.

The Apple Macintosh

For the next few years, the personal computer industry evolved as a few IBM clones and dozens of non-IBM-compatible PCs were brought to market. Then, in 1984, Apple introduced the Macintosh, which it advertised, with a decidedly anti-IBM slant, as 'the computer for the rest of us.' With its small size and integral screen, the 'Mac' certainly looked different, but what really made it stand out was its easy-to-use, mouse-driven, graphical user interface. (This GUI is similar, broadly speaking, to the Windows interface developed later by Microsoft. Both interfaces, and the mouse as well, trace their roots back to work done in the 1970s at Xerox Corporation's Palo Alto Research Center.) The Apple Macintosh was not at all compatible with the IBM PC.

Nevertheless, after a slow start, it became increasingly popular. Today, the Macintosh and its descendants are the only reasonably popular alternatives to IBM-compatible personal computers. In 1984, IBM also introduced a new microcomputer, the IBM PC/AT, which used the more powerful 80286 processor. This event set off a flurry of activity by other manufacturers who quickly cloned the new machine and introduced improvements of their own.

Recent Developments

From this point on, the increased competition for the PC buyer's dollar brought forth more powerful computers at an ever-accelerating rate. This boom was fueled by Intel Corporation, which introduced new generations of microprocessors every few years. In each case, microcomputer manufacturers quickly brought out machines designed around the new chip, and software developers used the greater speed to create more sophisticated programs. Microsoft Windows was one of the major beneficiaries of the more powerful computers.

When it was first introduced in 1985, Windows ran sluggishly on the existing hardware (graphics-intensive programs require relatively fast computers), and it did not have much success. However, by the time the much-improved Version 3.0 was brought to market in 1990, hardware had caught up with the demands of the software, and this version of Windows was an immediate hit. Then, when an even better Version 3.1 was introduced in 1992 in the midst of a computer price war that made top-of-the-line machines affordable, Windows became the standard operating environment for IBM-compatible microcomputers. Windows 98, the version you will learn in this text, was rolled out in June, 1998.