Jack Kilby joined Texas Instruments in 1958, where he went on to create arguably the most important invention of the 20th century: the integrated circuit, also known as the microchip.
The integrated circuit allowed for the miniaturization of electronic devices, making most of our modern consumer electronics possible. Kilby's many honors include the Nobel Prize in Physics, a National Medal of Science, the IEEE Medal of Honor, the Charles Stark Draper Prize, a Computer Pioneer Award, and the Kyoto Prize; he also served as a professor of Electrical Engineering at Texas A&M.
And, for all of his achievements, his peers consistently remarked on Kilby's humility and kindness.
There's no way to cover everything about Jack Kilby in one article, so we'll focus on his development of the world's first integrated circuit.
The Tyranny of Numbers and the World's First Integrated Circuit
The concept of the integrated circuit was originally theorized by Geoffrey Dummer in the late 1940s. In 1952, Dummer presented his research at a conference, where Jack Kilby was in attendance.
Dummer had been working on solutions to what was called "The Tyranny of Numbers," a phrase coined by Jack Morton of Bell Labs.
"For some time now, electronic man has known how 'in principle' to extend greatly his visual, tactile, and mental abilities through the digital transmission and processing of all kinds of information. However, all these functions suffer from what has been called 'the tyranny of numbers.' Such systems, because of their complex digital nature, require hundreds, thousands, and sometimes tens of thousands of electron devices." -Jack Morton
Basically, computer hardware had hit a brick wall.
Although engineers were designing ever more complex computing devices, hand-soldering hundreds of discrete components onto individual modules was incredibly time-consuming and uneconomical. The results were also very fragile: a single bad component or solder connection could render an entire module useless. And then there were the wires, lots of wires.
Fortunately for the world of computing, Jack Kilby would finally solve this problem in 1958.
It doesn't look like much, but this 7/16 by 1/16 inch slab of germanium would change the world forever. Courtesy of Texas Instruments.
When Kilby joined Texas Instruments in 1958, he began working on the Micro-Module program. This was the US Army Signal Corps' attempt at solving the Tyranny of Numbers.
Micro-Modules were electronic components built in standardized sizes and shapes with the wiring built in. These modules could snap together to form various circuits, eliminating the need for hand-wired connections. Kilby felt that Micro-Modules were a half measure, however, because they did nothing to reduce the number of components.
Since Kilby had just joined TI, he had not yet accrued vacation time when most of the company left for its traditional two-week summer break. Kilby made the most of his time alone in the lab, testing his theory that passive components and transistors could be made from the same semiconductor material and combined into a complete circuit.
On September 12, 1958, he demonstrated his theory by powering the germanium slab and displaying its output on an oscilloscope, which showed a continuous sine wave.
From Germanium to Silicon
Kilby's germanium integrated circuit generated interest from the military but wasn't an immediate financial success. Robert Noyce, who would go on to co-found Fairchild Semiconductor and Intel, was working on his own integrated circuit independently of Kilby. Although Kilby got a prototype working months before Noyce did, Noyce's silicon IC ended up faring much better commercially. For this reason, Kilby and Noyce are often cited as co-inventors despite working separately.
Noyce's silicon ICs are the ones we're familiar with today, and they helped give "Silicon Valley" its name. Whether Noyce would have built a silicon IC had Kilby not first demonstrated the concept in germanium, however, is something historians can debate.