Replacing Bulky and Unreliable Vacuum Tubes with Transistors
Edison's invention consisted of a conducting filament mounted in a glass bulb from which the air had been evacuated. Electricity passing through the filament caused it to heat up, and the vacuum prevented the glowing filament from oxidizing and burning up. In 1883, Edison detected a current flowing through the vacuum from the lighted filament to a metal plate mounted inside the bulb. This became known as the Edison Effect, but he did not develop this particular discovery any further. However, British physicist John Fleming found that a device exploiting the Edison Effect could detect radio waves and convert them to electricity.
In 1904, Fleming first demonstrated his two-element vacuum tube, known as the Fleming diode, which converted an alternating current (AC) signal into direct current (DC) (Kuphaldt). The Fleming diode consisted of an incandescent light bulb with an extra electrode, the plate, inside. As the filament became white-hot, electrons boiled off its surface into the vacuum inside the bulb. Current flowed through the vacuum only when the plate was more positive than the filament, so an AC signal applied to the tube emerged as one-way pulses, proving that AC signals could be converted into DC. One of the first uses of the Fleming diode was to detect the weak signals produced by the new wireless telegraph. Later, the diode vacuum tube was used to convert AC into DC for power supplies in electronic equipment (Kuphaldt).
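The principle at work is half-wave rectification: the diode conducts during the positive half of each AC cycle and blocks the negative half. As a minimal numerical sketch (an illustration of the idea, not a model of period hardware), the following Python snippet shows a sine wave turning into pulsating DC:

```python
import math

def half_wave_rectify(signal):
    """Pass only the positive half-cycles, as the Fleming diode
    conducts only when the plate is more positive than the filament."""
    return [max(0.0, v) for v in signal]

# A toy AC waveform: one full sine cycle in 20 samples.
ac = [math.sin(2 * math.pi * t / 20) for t in range(20)]
dc_pulses = half_wave_rectify(ac)
print([round(v, 2) for v in dc_pulses])  # zeros where the AC was negative
```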
Many inventors tried to improve the Fleming diode, but the only one who succeeded was American inventor Lee De Forest. In 1906, De Forest introduced a third electrode, called the grid, into the Fleming diode. (This grid was simply a bent wire between the plate and the filament.) When De Forest applied the signal from the wireless telegraph antenna to the grid instead of the filament, the grid changed (modulated) the current flowing from the filament to the plate. In 1907, he patented his bulb, which was identical to the Fleming diode except for the added electrode, and he is often credited with the invention of the vacuum tube. This triode, called the Audion, was the first successful electronic amplifier, and it could also be used as a switch. Long-distance communication became a reality as radio signals were amplified before they were transmitted, and receivers using the vacuum tubes amplified the incoming signal (Maxfield & Brown, 1997). The vacuum tube changed the way almost anything related to electricity or communications worked, especially in the area of digital computers.
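To make the grid's dual role concrete, here is a minimal sketch (a toy model with invented parameter values, not a physical tube equation): a sufficiently negative grid blocks all plate current (switching), while above that cutoff a small swing in grid voltage produces a proportionally larger swing in plate current (amplification):

```python
def plate_current(grid_voltage, gain=10.0, cutoff=-2.0, max_current=50.0):
    """Toy triode: at or below the cutoff voltage, the grid blocks all
    plate current (the tube acts as an open switch); above cutoff, the
    plate current grows with grid voltage until the tube saturates."""
    if grid_voltage <= cutoff:
        return 0.0
    return min(max_current, gain * (grid_voltage - cutoff))

print(plate_current(-3.0))  # 0.0  -- cut off: the tube is switched off
print(plate_current(-1.5))  # 5.0  -- conducting
print(plate_current(-1.0))  # 10.0 -- a 0.5 V grid swing moved the output by 5.0
```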
De Forest's invention could control the flow of electrons inside the tube. This made it possible for electronic digital computers to use the zeros and ones of binary arithmetic to perform their operations: a zero could be represented by the absence of an electron current to the plate, and a small but detectable current served as a one (Sullivan et al., 1988). It was the beginning of today's huge electronics industry.
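As a minimal sketch of that encoding idea (the function names here are invented for illustration), each tube's state can be modeled as a Boolean, current present or absent, and a row of such states spells out a binary number:

```python
def encode(n, width=8):
    """Model an integer as a row of tube states:
    True = detectable plate current (a one), False = no current (a zero)."""
    return [bool((n >> i) & 1) for i in reversed(range(width))]

def decode(states):
    """Read a row of tube states back as an integer."""
    value = 0
    for current_present in states:
        value = (value << 1) | int(current_present)
    return value

states = encode(5)   # [False, False, False, False, False, True, False, True]
assert decode(states) == 5
print(states)
```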
However, this revolutionary invention had its drawbacks. Tubes were bulky and generated a great deal of heat. Even simple computers generated so much heat that cooling systems were needed to prevent overheating, and the cooling systems consumed nearly as much power as the tubes themselves. The vacuum tubes were also unreliable, burning out like light bulbs. The practicality and reliability of computers built with vacuum tubes were directly related to how often the tubes failed.
A computer having thousands of tubes might operate for as little as one hour between malfunctions (Sullivan et al., 1988). If large computers were to become possible, a replacement for the vacuum tube was a necessity.

The Transistor

William Shockley joined Bell Telephone Laboratories in 1936. In 1939, Shockley began searching for a way to turn certain substances into an amplifying device, but the war interrupted his work. He resumed his research in 1945, when Bell Labs established a group to develop a semiconductor replacement for the vacuum tube. Shockley led the group of physicists, which included John Bardeen and Walter Brattain (IC Knowledge).
The team was about to give up when a last attempt to try a purer substance as a contact point led to the invention of the point-contact transistor amplifier. On December 16, 1947, Bardeen and Brattain fixed two electrode contacts onto a slab of germanium about half an inch long. The electrical power coming out of the germanium was 100 times stronger than what went in, successfully amplifying the signal, and the transistor was born. Bardeen and Brattain took out a patent for their transistor, and William Shockley received a patent for the transistor effect and a transistor amplifier (Riordan & Hoddeson, 1997). In 1956, the team received the Nobel Prize in Physics for the invention of the transistor (Lindberg et al.). The name "transistor" is a contraction of "transfer resistor": the device amplified a signal while also controlling electronic current. It was composed of semiconductor material that could both conduct and insulate.
Transistors transformed the world of electronics and greatly influenced computer design. Once transistors made of semiconductors replaced the bulky and unreliable vacuum tubes, computers could perform the same functions using less power and space (Sullivan et al., 1988).

Integrated Circuits

The transistor was amazing, but there was still the problem of wiring everything together. The computer's advanced circuits contained so many components and connections that they were difficult to build. Scientists, engineers, and inventors rushed to solve this problem, and two separate inventors created almost identical integrated circuits at almost the same time. In 1957, research engineer Robert Noyce co-founded the Fairchild Semiconductor Corporation. Jack Kilby, an engineer with a background in circuit boards and transistor-based hearing aids, joined Texas Instruments in 1958 (Bellis).
From 1958 to 1959, both worked on fitting more components into a smaller space. Each succeeded in making all of the components, and the chip itself, out of the same (monolithic) block of semiconductor material (Lindberg et al.); Kilby used germanium, and Noyce used silicon. The integrated circuit combined the previously separate transistors, resistors, capacitors, and all the connecting wiring onto a single chip.
In 1959, both applied for patents. Jack Kilby and Texas Instruments received a patent for miniaturized electronic circuits, and Robert Noyce and the Fairchild Semiconductor Corporation received a patent for a silicon-based integrated circuit (IC Knowledge). In 1961, the Fairchild Semiconductor Corporation produced the first commercially available integrated circuits. After several years of legal battles, the two companies decided to cross-license their technologies, creating a worldwide market now worth about $1 trillion a year (Riordan & Hoddeson, 1997). All computers began using integrated circuits instead of individual transistors and their accompanying parts.
Texas Instruments used the chips in Air Force computers and the Minuteman missile in 1962, and later used them to produce the first electronic portable calculators. The original integrated circuit had only one transistor, three resistors, and one capacitor, and was the size of an adult's pinkie finger. Today, an integrated circuit can hold 125 million transistors, yet it is smaller than a penny (Bellis).
Jack Kilby holds patents on over sixty inventions, and in 1970 he received the National Medal of Science. Robert Noyce holds sixteen patents and co-founded Intel in 1968, the company responsible for the invention of the microprocessor. For both men, however, the invention of the integrated circuit remains one of the most important in the history of technology. Almost all modern technological products use the integrated circuit, and it has greatly reduced the cost of electronic functions.

The Microprocessor

The integrated circuit completely changed computer design, and it seemed that the only thing left to do was reduce the size. The Intel 4004 chip accomplished this by placing all the parts that made a computer think (central processing unit, memory, input and output controls) on one small chip (Bellis).
In 1968, Robert Noyce and Gordon Moore decided to create their own company. Noyce convinced venture capitalist Art Rock to finance their plans, and Rock raised $2.5 million in less than two days. A hotel chain had already trademarked the name "Moore Noyce," so the two founders settled on the name "Intel" (short for "Integrated Electronics") for their new company (Bellis). In late 1969, Busicom, a potential client from Japan, asked to have twelve custom chips designed for keyboard scanning, display control, printer control, and other functions in a Busicom-manufactured calculator. Intel did not have the workforce for the job, but it did have the brainpower to come up with a solution. Intel engineer Ted Hoff decided that Intel could build one chip to do the work of twelve.
Intel and Busicom agreed to fund the new programmable, general-purpose logic chip (Hoff). Federico Faggin headed the design team along with Ted Hoff and Stan Mazor. Nine months later, the first microprocessor was born. The new chip had as much power as the ENIAC, which filled 3,000 cubic feet and used 18,000 vacuum tubes; it was approximately 1/8 inch wide by 1/16 inch long and consisted of 2,300 metal oxide semiconductor (MOS) transistors. Intel decided to buy back the design and marketing rights to the 4004 from Busicom for $60,000.
Busicom later went bankrupt and never produced a product using the 4004 (Hoff). Intel, however, created a marketing plan to encourage the development of applications for the 4004 chip, which led to its widespread use within a few months. Intel publicly introduced the world's first single-chip microprocessor, the Intel 4004, in November 1971. Intel engineers Federico Faggin, Ted Hoff, and Stan Mazor are recognized as the inventors of the chip (IC Knowledge). Today's microprocessor has more than 5 million transistors, performs hundreds of millions of calculations each second, and is responsible for the incredible acceleration of technological advancement over the last 30 years.
This tiny electronic miracle evolved over many years of research, experiments, and amazing discoveries into one of the most complex mass-produced products in technological history; and it all began in 1879 with the invention of a simple light bulb.
Bibliography
Bellis, M. The History of Computers. Retrieved March 3, 2005, from the About.com Web site.
Hoff, T. Fascinating Facts about the Invention of the Microprocessor. Retrieved February 28, 2005, from the Idea Finder Web site.
IC Knowledge. History of the Integrated Circuit. Retrieved February 28, 2005, from the IC Knowledge Web site.
Kuphaldt, T. R. Early Tube History. Retrieved February 24, 2005, from the All About Circuits Web site.
Lindberg, A. A. The History of the Integrated Circuit. Retrieved March 3, 2005, from the Nobelprize.org Web site.
Maxfield, C., & Brown, A. (1997). Retrieved February 24, 2005, from The History of Computers Web site.
Riordan, M., & Hoddeson, L. (1997). Crystal Fire: The Birth of the Information Age. New York, NY: W. W. Norton & Company, Inc.
Sullivan, J. (Ed.). (1988).