
Cassandra Newcomer

ENGL 2007 - Writing for Engineers

Professor Bubrow

11/13/2014

Foundation of the Digital Age: The Transistor

According to the PBS documentary Transistorized, the transistor “was probably the most

important invention of the 20th Century.” Displacing the bulky and fragile vacuum tubes that

preceded it, the invention of the transistor ushered in the age of digital computing and

transformed society in ways that only science fiction authors had dared to dream. The internet,

the personal computer, and the smart phone all owe their existence to the transistor and the integrated circuits built from them. Transistors put a man on the moon and coordinate air traffic

control. In our daily lives, we are surrounded by transistors from morning to evening—everything from our alarm clocks to our crosswalk signals uses them. Modern transistors

are so small as to be nearly invisible, and yet this now ubiquitous invention has arguably altered

the trajectory of human history more than any other engineering accomplishment. This paper

examines the problems encountered in the development of the transistor and the solutions

discovered in the invention process. It also explores the transistor’s economic, sociological, and

environmental impact.

William Gibson, a popular science fiction author, wrote that “something tends to happen

with new technologies, generally: the most interesting applications turn up on a battlefield, or in

a gallery” (86). So it was with semiconductors, as the bulk of the groundwork for the transistor


was laid over the course of World War II. Semiconductors are made by introducing a small amount of conductive impurity into an otherwise poorly conducting base material in a process called “doping” (Zeghbroeck). This allows enough electricity—but not too much—to flow through the lattice structure of the compound, carried by either positive or negative charge carriers as desired, a critical part of elementary circuit design (Zeghbroeck). Previously, semiconductor development had been limited by the

difficulties in dealing with the volatile compounds required to make them, such as copper oxide,

lead sulfide, and cadmium sulfide (Herring s336). In addition to difficulties purifying the

compounds, “slight differences in exact stoichiometric ratios of the elements involved…were

extremely difficult, if not impossible to determine and control at the required levels…

semiconductor research therefore remained more art than science until World War II intervened”

(s337). The war accelerated both the pace and the volume of technological research. Fledgling

radar technology demanded vast supplies of silicon, and by the end of the war, purification processes pioneered by DuPont could produce silicon that was 99.999% pure

(s337). However, even within this ultra-pure sampling, some silicon crystals would work and others would not, and no one knew why (s337). It was theorized that the cause was

trace impurities that yet remained within the silicon.
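To see why even five-nines purity left crystal behavior unpredictable, a rough back-of-the-envelope calculation helps. The sketch below is illustrative only; the atomic density and doping ranges are approximate textbook figures I am supplying here, not values drawn from the cited sources.

```python
# Rough arithmetic on why "99.999% pure" silicon could still misbehave.
SI_ATOMS_PER_CM3 = 5e22          # approximate atomic density of silicon
impurity_fraction = 1 - 0.99999  # five-nines purity leaves ~1 atom in 100,000

impurities_per_cm3 = SI_ATOMS_PER_CM3 * impurity_fraction
print(f"{impurities_per_cm3:.1e} stray atoms per cm^3")  # about 5.0e+17

# Intentional doping levels are often in the 1e15 to 1e18 per cm^3 range,
# so uncontrolled trace impurities could easily swamp any deliberate doping.
```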

In 1939 at Bell Labs, a scientist named Russell Ohl noticed a strange phenomenon with a

particular sample of silicon crystal: “the amount of current changed when the crystal was held

over a bowl of water. And a hot soldering iron. And an incandescent lamp on the desk…by early

afternoon, Ohl realized that it was the light shining on the crystal that caused this small current to

begin trucking through it” (Guercio). When a flashlight was turned on, the voltage jumped to

half a volt, more than ten times anything Ohl and his colleagues had ever seen before


(Guercio). The sample crystal had a crack right down the middle, and this proved to be the

reason for its strange behavior. When studied further, they found that:

…the crystal had different levels of purity on either side of the crack. Due to the subtle

traces of extra elements, one side had an excess of electrons, and the other side a deficit.

Since opposites attract, the electrons from one side had rushed over to the other -- but

they went only so far, creating a thin barrier of excess charges right at the central crack.

That barrier created a one way street -- electrons could now only travel in one direction

across it. When Ohl shined light on the rod, energy from the light kicked sluggish

electrons out of their resting places and gave them the boost they needed to travel around

the crystal. But due to the barrier, there was only one way they could travel. All those

electrons moving in a single direction became an electric current (Guercio).

P-type semiconductors have more holes than electrons, and n-type semiconductors have more electrons than holes (Zeghbroeck). For this reason, they called their discovery a p-n junction, for it was where materials of the two different conductivity types joined together (Guercio). Much later,

this discovery would form the basis of solar cell technology. But of more immediate importance,

the scientists at Bell Labs realized they had discovered something incredible: a potential way to

replace vacuum tubes.
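The one-way behavior Ohl observed in the cracked crystal is what engineers now describe with the ideal (Shockley) diode equation. The short Python sketch below is purely illustrative; the saturation current and ideality factor are assumed textbook-style values, not measurements from Ohl’s silicon rod.

```python
import math

def diode_current(v_volts, i_s=1e-12, n=1.0, v_t=0.02585):
    """Ideal (Shockley) diode equation: I = I_s * (exp(V / (n * V_t)) - 1).

    i_s: reverse saturation current in amperes (assumed illustrative value)
    n:   ideality factor (assumed to be 1)
    v_t: thermal voltage near room temperature, about 25.85 mV
    """
    return i_s * (math.exp(v_volts / (n * v_t)) - 1.0)

# Forward bias: current rises exponentially once carriers cross the junction.
print(diode_current(0.6))   # roughly 1e-2 A
# Reverse bias: only the tiny saturation current leaks the other way.
print(diode_current(-0.6))  # roughly -1e-12 A
```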

The point contact transistor was developed in 1947 by a Bell Labs team of three

American physicists: John Bardeen, Walter Brattain, and William Shockley (Hoddeson).

Bardeen calculated that they would need the two metal contacts to be within 0.002 inches to

work—but the smallest wires in the world were over 300% too big (Guercio). Luckily, Brattain

was an inventive sort: “Instead of bothering with tiny wires, [he] attached a single strip of gold


foil over the point of a plastic triangle. With a razor blade, he sliced through the gold right at the

tip of the triangle. Voila: two gold contacts just a hair-width apart” (Guercio). A germanium

crystal connected to a voltage source was then attached to the points of contact (see fig 1).

Building on their earlier discoveries about controlling minute electron drift, this contraption was able to amplify the power of a signal over a hundredfold (Guercio). The

demonstration the scientists made to their superiors on December 23, 1947, is often credited as

the birthdate of the transistor as we know it (Guercio). Nine years later, the three were awarded the Nobel Prize in Physics for their discovery (Rudberg). Bell Laboratories considered several possible names for the device, most of which were clearly thought up by engineers rather than marketers: suggestions included the “semiconductor triode,” “solid triode,” “surface states triode,” “crystal triode,” and “iotatron” (Hoddeson). Bell Labs distributed ballots to employees to vote on which name they thought was best, and when the election results came in, the winning name was, fortunately, “transistor.” According to the Nobel Foundation, employee John Pierce had submitted the winning entry: a combination of the words “transfer” and “resistor.” And so, the transistor was born.

Fig 1. The world’s first ever transistor. (Rubin)
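To put that “over a hundredfold” figure in the units engineers typically use, power gain is expressed in decibels. The snippet below is a simple back-of-the-envelope conversion, not a model of the actual 1947 device.

```python
import math

def power_gain_db(power_ratio):
    """Convert a power ratio P_out / P_in into decibels: 10 * log10(ratio)."""
    return 10.0 * math.log10(power_ratio)

# A hundredfold increase in signal power corresponds to 20 dB of gain.
print(power_gain_db(100))  # 20.0
```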


The infant transistor technology was, at first, slow to catch on. Throughout the 1950s,

“vacuum tubes were a $4 billion dollar industry…[they] dominated the consumer electronics

market. Ten years after the invention of the transistor, vacuum tubes were outselling transistors

by more than 13 to 1” (Staff). This was primarily due to cost constraints, as early transistors were

too expensive to use in items such as televisions, radios, and civilian computers. Once again,

however, the military-industrial complex came to the rescue of commercially unviable research

and development. The increasingly complicated circuitry needed to compete in the Cold War

arms race soon began to bump up against the limitations of vacuum tube technology. Circuits

weren’t just more complicated—they were also larger: “increasing complexity translated into

physically larger systems…higher energy demands and heat dissipation issues. There were limits

to the number of electronic components that one could stuff into an airplane or missile” (Staff).

Miniaturization became crucial, as cost was a much lesser priority than size and reliability.

Physically larger systems brought problems of their own aside from just size: “as the number of

components increased, the ‘mean time between failures’ of the entire system got shorter. The

more sophisticated the system, the more likely it would fail. To the military mind, the

implications were truly frightening” (Staff). Accidental detonation of a nuclear warhead would

simply not do. Devices using transistors, called solid-state devices because current flows through solid material rather than through a vacuum tube, were a far more appealing option (Herring s336). However, early transistors too

had their limitations, which military-funded research sought to remedy. The point contact

transistor was too fragile to use in most commercial applications, so Shockley had invented the

bipolar junction transistor “by eliminating the fragile point contacts and instead forming the

emitter, base, and collector as a single semiconductor sandwich with three different layers”

(Riordan). However, its frequency response was too limited for many applications. The


next jump forward in transistor technology would come not from Bell Labs, but from a different

source: Texas Instruments. At the time, Texas Instruments was primarily a defense contractor

which “focused on military markets for transistors as replacements for the bulkier and far more

fragile vacuum tubes. The U.S. armed services, among its biggest customers, were willing to pay

a big premium for transistors that performed uniformly and flawlessly over a wide range of

conditions” (Riordan). Bell Laboratories’ focus on telecommunications was about to become less

of a help and more of a hindrance.

Transistor research had mostly utilized germanium up until this point, despite silicon’s

early prevalence in semiconductor research. Silicon had begun to fall by the wayside after WWII due to its high melting point and chemical reactivity; anything hot enough to melt silicon at 1415 °C was hot enough to melt most potential crucibles as well (Riordan). Germanium began to replace it as the semiconductor material of choice, as it was an element with similar properties, far less reactive than silicon, and melted at a much cooler 937 °C (Hassion 1076). The best

purification techniques were “able to purify germanium to a level unattainable in

silicon…because silicon melted at a higher temperature than germanium” (Seidenberg).

However, this early benefit became a later disadvantage: “silicon’s intrinsically higher energy

gap meant that silicon devices could operate at a significantly higher temperature” (Seidenberg).

Early transistors were large and cumbersome enough that the limited heat handling capabilities

of germanium weren’t an issue, but as transistors shrank, silicon’s higher thermal conductivity meant heat flowed away from the junction three times faster than in germanium (Seidenberg). These traits made silicon better suited for applications with tight space constraints, as well as for extreme environments (a benefit for military and industrial purposes). In 1954, the scientists and engineers at Texas Instruments found the solution: the trick was not to remove every single impurity,


but to make the silicon base layer so thin that trace impurities would not affect its

functioning (Riordan). With this discovery, the silicon transistor became viable for manufacture,

and transistor engineering has focused chiefly on how to print silicon wafers ever thinner and smaller ever since (see fig 2). By the end of 1960, practically all of the semiconductor industry “had switched from germanium to silicon transistors and diodes…[the discovery] provided the industry with the capability of mass producing reliable miniaturized high performance silicon devices whose switching speeds, rectification efficiencies, breakdown voltages, and power-dissipation ratings were superior to germanium” (Seidenberg).

With this last critical development, personal computers and consumer electronics became not

just a fantasy—but a tangible reality.

Fig 2. Past transistor sizes and predicted future transistor sizes. (Fox)
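The germanium-versus-silicon trade-offs described above can be summarized with rough reference figures. The numbers below are approximate textbook values supplied here only for illustration; they are not drawn from the sources cited in this paper.

```python
# Approximate room-temperature reference values (illustrative only).
properties = {
    "germanium": {"melting_point_c": 937,  "band_gap_ev": 0.67, "thermal_cond_w_mk": 60},
    "silicon":   {"melting_point_c": 1415, "band_gap_ev": 1.12, "thermal_cond_w_mk": 150},
}

# Silicon's wider band gap lets devices run hotter, and its higher thermal
# conductivity carries heat away from the junction faster than germanium's,
# in the same ballpark as the roughly threefold figure cited above.
ratio = (properties["silicon"]["thermal_cond_w_mk"]
         / properties["germanium"]["thermal_cond_w_mk"])
print(f"Silicon conducts heat roughly {ratio:.1f}x better than germanium.")
```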

In 1964, Martin Greenberger, a professor at MIT’s School of Industrial Management,

wrote an article speculating about the potential of the transistor to revolutionize just about every

industry imaginable. Many of his observations as to future possible uses of the technology have

proved particularly prescient when viewed from a perspective of fifty years later. Among several



theorized adaptations are what would later become the credit card, electronic banking and bill pay, online tax return filing, digital medical charts, and the automation of the New York Stock Exchange—none of which existed at the time, and all of which are hard to imagine life without now. During the 1960s, the silicon transistor followed people home as the consumer electronics industry was born, with offerings ranging from televisions to car radios to rudimentary personal computers. As it became less a topic of highly specialized physics research and more frequently a topic on the news, it began to emerge into popular awareness via pop culture, too. The popular Marvel comic, Iron Man, portrayed transistor technology inaccurately but constantly,

and its use by the tech-enthused main character shows how cutting edge the transistor was

considered at the time (Fig 3). By the 2000s, the gross national product of the United States was $9.2 trillion, and of that amount, semiconductors accounted for $204 billion, or about 2.2% of the total; the semiconductor industry is now:

a bigger part of the US economy than mining, communications, utilities, or agriculture,

forestry, and fishing. The top 10 U.S. airlines put together made only half as much money

as semiconductor makers. Intel and Texas Instruments sold more than Coca-Cola and

Pepsi…every one of the 15 corporations receiving the most patents in 2000 was in the

semiconductor or computer business. (Turley)

Fig 3. Tony Stark recharges his armor via “transistor-powered roller wheels.” Tales of Suspense #54. June 1964.


In 1947, there was only one transistor in existence. By 2001, there were about 60 million

transistors built for every person on earth, and by 2010, the number had reached 1 billion

(Turley). The rate of change in processing power cannot be overstated; even a Nintendo 64 game console, now dated technology by 2014 standards, has more processing

power than NASA had to conduct a lunar landing (Turley).

The technologies created on the foundation of the transistor have given us the ability to

telecommute to work and run daily errands that would otherwise require consuming fossil fuels. Most people now use smart phone apps and websites to perform tasks that would once have necessitated a physical trip somewhere. The transistor has also been

critical to the development of green technology, like nuclear power plants, solar cells,

hydroelectric dams, and electric cars, all of which either directly contain or were developed on

computers using transistors. The sociological impacts of an increasingly social-media-obsessed culture are likely worthy of a research paper all their own—and even in papers that focus exclusively on that topic, whether social media does more harm or more good remains contentious. As with all technologies, whether it hurts or helps is more a function of who is using

a given technology than any innate characteristic of the technology being used. The increasing

automation enabled by transistor technology has demolished the highly paid manufacturing jobs once held by blue-collar workers, and similarly eliminated many of the secretarial and office jobs once held by white-collar workers. The additional profit from the reduced need for human labor has largely gone to the upper class who own the machines by which everything is now created. European countries have managed to offset the potentially harmful

side effects of the sharp increase in efficiency with strong social safety nets; Switzerland and

similar countries are now debating the merits of a universal basic income (Lowrey). Utopias


have historically been impossible primarily due to limited resources; too little food for too many

people. The potential now exists for a society where people are free to focus on what they love

instead of working to live—but only if we can bring ourselves to care about what happens to

others, instead of just ourselves.

For the first time in history, humans now face the bewildering prospect of too much free

time, rather than too little. But in corporate-controlled countries like America, transistor

technology has unfortunately been a method by which income inequality has sharply increased in

favor of the already socioeconomically privileged—while the cost of doing most things has

decreased due to automation, the labor force once employed in those roles now finds itself either

underemployed in minimum wage jobs that pay barely enough to survive, or unemployed

completely. As a result, income inequality is now worse in America than at any other time in

recorded history (Matthews). The transistor has been called the “nerve cell” of the Information

Age, and as per Moore's Law, the number of transistors packed into a given amount of silicon

doubles nearly every 18 months (Riordan). Scientific progress is ethically neutral; what a

technology is used for is determined by humanity as a whole, as scientists and engineers have

little say in what their discoveries are used for by others. Transistors amplify a current several

orders of magnitude beyond its initial value—it would seem that holds true whether that signal is

electrical or ethical. It is trivial to determine whether the current flow is in a positive or negative

direction in a circuit, but the direction with regard to humanity’s future is far more difficult to

determine.
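Moore’s Law, as stated above, amounts to compound doubling, which is easy to sketch. The starting count below is an arbitrary illustrative figure, and the 18-month doubling period is simply the one quoted in the paragraph above.

```python
def transistor_count(initial_count, months_elapsed, doubling_period_months=18):
    """Compound doubling: count = initial * 2 ** (elapsed / period)."""
    return initial_count * 2 ** (months_elapsed / doubling_period_months)

# Starting from an arbitrary 1,000-transistor chip, ten years of 18-month
# doublings works out to roughly a hundredfold increase.
print(round(transistor_count(1_000, 120)))  # about 101,594
```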


Works Cited

Fox, Will. Microchip transistor sizes. Digital image. Future Timeline. N.p., n.d. Web. 14 Dec.

2014.

Greenberger, Martin. "The Computers of Tomorrow." The Atlantic Monthly. May 1964: 63-67.

Print.

Guercio, Gino. Transistorized! PBS. 1999. Web. 13 Dec. 2014.

Hassion, F. K., D. C. Thurmond, and F. A. Trumbore. "On the Melting Point of Germanium."

The Journal of Physical Chemistry 59.10 (1955): 1076-78. Web.

Herring, Conyers. "The Invention of the Transistor." Reviews of Modern Physics 71.2 (1999):

S336-345. Web.

Hoddeson, Lillian. Crystal Fire: The Invention of the Transistor and the Birth of the Information

Age. New York: Norton, 1998. Print.

Lowrey, Annie. "Switzerland’s Proposal to Pay People for Being Alive." The New York Times.

The New York Times, 16 Nov. 2013. Web. 12 Dec. 2014.

Matthews, Chris. “Wealth Inequality in America: It’s Worse Than You Think.” Fortune, 31 Oct.

2014. Web. 11 Dec. 2014.

Riordan, Michael. "The Lost History of the Transistor." IEEE Spectrum, 30 Apr. 2004. Web. 13

Dec. 2014.

Rubin, Julian. The Point Contact Transistor. Digital image. The Invention of the Transistor. Web.

14 Dec. 2014.

Rudberg, E.G. "Nobel Prize in Physics 1956 - Presentation Speech." Nobelprize.org. Nobel

Media AB, 2014. Web. 16 Nov. 2014.

Seidenberg, Phillips. “From Germanium to Silicon, A History of Change in the Technology of

the Semiconductors.” IEEE Center for the History of Electrical Engineering. Web. 10 Dec. 2014.

Staff of the IEEE History Center. “Your Engineering Heritage: The U.S. Federal Government and Innovation, a Brief History (Part III).” IEEE-USA Today’s Engineer Online, Aug. 2011. Web. 14 Dec. 2014.

Turley, Jim. "The Business of Making Semiconductors." InformIT. Pearson. Web. 19 Nov. 2014.

Zeghbroeck, Bart. "Doped Semiconductors." Doped Semiconductors. Web. 14 Dec. 2014.