December 14, 2007 4:00 AM PST

Celebrating 60 years of transistors

On December 16, 1947, John Bardeen and Walter Brattain, two Bell Labs researchers, built the world's first transistor.

Their device, called a point contact transistor, conducted electricity and amplified signals, a job then handled by bulky, delicate vacuum tubes and other components.

Their colleague William Shockley followed soon after with junction transistors. Although Bardeen and Brattain were first, Shockley's device became the basis for a scientific and industrial juggernaut.

"It is the seminal device in terms of the way we think about information, and information is everything, from the music we listen to (to) the TV we watch," Intel CTO Justin Rattner said. "Modern communications is all based on theories of information, not on how many megawatts we can pump into the antenna. It is how clever we can be finding those few faint signals and putting them to use, which is a computing problem."

He added: "You couldn't have five tubes in your iPod."

Besides making it easier to store information and send signals, transistors had another, somewhat unanticipated, characteristic: they could be shrunk at a consistent rate over time, making transistors and electronic products steadily cheaper and faster.
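That consistent shrink rate is what makes the growth exponential. As a rough illustration (the two-year doubling period is the commonly cited 1975 formulation of Moore's Law, not a figure from this article):

```python
def moore_projection(start_count: float, years: float,
                     doubling_period_years: float = 2.0) -> float:
    """Project a transistor count forward under steady doubling."""
    return start_count * 2 ** (years / doubling_period_years)

# 60 years of doubling every 2 years is 2^30: roughly a billion-fold increase.
growth = moore_projection(1, 60)
print(round(growth))  # 1073741824
```

Even small changes to the doubling period swing the result enormously, which is one reason projections about the law's end vary so widely.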


The effect, ultimately expressed as Moore's Law, encouraged investors to pour money into high-tech outfits because people had at least some assurance that tomorrow's products would be noticeably better than the ones available today. That high tolerance for risk has become one of the defining traits of Silicon Valley.

For tech companies, Moore's Law also served as a threat. Companies that chose not to invest in the new manufacturing techniques or components would quickly fall behind. Thus, innovation has become a matter of simple survival.

Predicting the end of Moore's Law is a cottage industry. If it does end, the heady lifestyle could slow down. Consumers would simply stop replacing their computers or other devices as fast as they do now and resort to getting new stuff when it breaks, Dan Hutcheson, CEO of VLSI Research, has said.

To date, though, the naysayers have been wrong. Lithography, the technique used to draw circuits, was supposed to hit a wall at 1 micron, and then at 250 nanometers. That's because, some people theorized, it would be impossible to draw circuits smaller than the wavelength of light used by lithography machines. The industry blew past the 250-nanometer mark in the mid-1990s. (A micron is a millionth of a meter, and a nanometer is a billionth of a meter. The measurement refers to the average feature length of a chip.)
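The march from those supposed walls to today's chips can be sketched with the industry's rough rule of thumb that each process generation scales linear features by about 0.7x, halving the area of a transistor. The 0.7x factor is a general assumption, not a figure from this article:

```python
# Assumed rule of thumb: ~0.7x linear shrink per process generation.
node = 250.0  # nm, the mid-1990s node mentioned above
generations = 0
while node > 45:  # the 45 nm node in production as of this article
    node *= 0.7
    generations += 1
print(generations, round(node))  # 5 generations later, ~42 nm
```

Five generations in roughly a decade matches the industry's historical cadence of a new process node every 18 to 24 months.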

Chips now come out of factories with 45-nanometer features, thanks to the introduction of metal gates in transistors, a massive change.

Many believe that Moore's Law, as it applies to existing technologies, may peter out around 2020. The structures inside transistors--particularly an insulating layer called the gate oxide--will by that time consist of only a few atoms.

Optimists, however, say chip designers will respond by stacking transistors rather than shrinking them; Toshiba, for instance, has plans for 3D memory. The economic and performance benefits would continue to grow. Intel and IBM are also working on transistors with two or three gates, which would have a similar effect to going 3D.

Others believe chip designers will find a way to harness quantum effects--that is, to replace electronic signals with another physical phenomenon.

"We just can't turn the crank. If anything, it is becoming more difficult, and we will see many more dramatic changes than we've seen in the last 40 years," said Rattner. "Will we call it Moore's Law when the transistors don't use electrical charge? At some point we will make a transition from charge-based transistors to something else. As long as we preserve the basic tenets of Moore's Law, I think people will still call it that."



Join the conversation!
Add your comment
Transistors are important but
the modern world exists because Robert Noyce, who was at Fairchild Semiconductor, and Jack Kilby, who was at Texas Instruments, got rich and famous not because of their work with components but because they found a solution to the interconnection problem.

Posted by CharlesRovira (97 comments )
This was both a start and a continuation
I was in high school when the transistor's invention was announced. As a kid already playing with electronics, it set my career as a participant in the amazing technology that has followed. It has been immense fun.

While we honor many who followed, the key event was the invention of the transistor, pretty much all by Bardeen, Brattain, and Shockley. They accounted for all three major types of transistor, from the initial point contact to the junction transistor and the field effect transistor. They also provided the basis for understanding them.

In the aftermath there were many who changed the electronics world, improving cost and performance by about 100,000,000 times. While we properly honor those who achieved a first in some aspect of the technology, in most cases that same achievement would have happened at nearly the same time anyway. Few of them came out of the blue like the original transistor and the physics behind it.

Even though we will eventually stop advancing via Moore's Law, which says we will improve performance every year, there are centuries of design ahead to use these technologies for the good of mankind. We have hardly touched the things that these technologies can enable. What ideas do we have for using up to a billion transistors at a time? The future will show many that we can't even imagine today.
Posted by rdill (2 comments )
My dad worked for Westinghouse in Elmira, NY for many years designing vacuum tubes. They still get used in some audio equipment, but I won't turn away from my current tech. I sometimes wonder how I will see things in thirty years, when nanotechnology and quantum physics take us to places we can't imagine today - just like the generations before us.

I can easily envision technology that looks like "magic" if we, the human race, manage to survive.
Posted by Spork_This1 (11 comments )
Working with early transistors.
In 1951, after graduating from college with a degree in engineering/physics, I was employed by Westinghouse and worked at their transistor development lab before being assigned to the Atomic Power Div. I worked with point contact units and early junction units. I initially worked on trying to measure and identify characteristics. I also worked on oscillator circuitry. One big thing was trying to get matched pairs, and if we could get 2 units within 30 or so percent of each other, we had a matched pair!!
I worked on the oscillator circuitry until a red letter day when I was able to get a sustained and reasonably stable 1 megacycle oscillation going!! I can hardly digest the difference in such things today!! After transferring to the Atomic Power Div., I had the privilege of teaching Reactor Instrumentation & Control, specializing in control rod programming, to the crew of the Nautilus. Lee Robinson, W. Palm Beach, Fla.
Posted by robinaire (4 comments )
