News

The First IC and the Y2K Scare: The January 2018 Hardware History Roundup

January 27, 2018 by Chantelle Dubois

From the invention of the integrated circuit to surviving the Y2K problem, January has a lot of anniversaries in EE history. Here is a roundup of a few pieces of hardware history still saved in memory.

1959 - The Integrated Circuit Conceptualized

1958 was an exciting year for semiconductors. Jack Kilby, working at then-small Texas Instruments, would come up with the “solid circuit,” in which all the components of a circuit could be made from a single piece of semiconductor material. This would reduce the size of components, including the transistor, and make manufacturing easier and cheaper. Kilby produced a prototype, and Texas Instruments applied for a patent in early 1959.

 

First IC prototype. Image courtesy of Texas Instruments.

 

Meanwhile, Robert Noyce, working for Fairchild Semiconductor (at the time also just a small company), would record his own idea for an integrated circuit in his lab notebook on January 23, 1959; the corresponding patent application followed later that year. What set his idea apart from Kilby’s was the concept of the “unitary circuit,” which provided a better way of connecting the components inside the integrated circuit.

Noyce’s patent would be awarded a few years before Kilby’s, and although both men are credited with inventing the integrated circuit, Kilby would be the only one recognized with a Nobel Prize for the invention, in 2000 (Noyce had died a decade earlier, and the prize is not awarded posthumously). And, of course, Texas Instruments and Fairchild Semiconductor would engage in court battles over ownership of the IC before eventually settling on sharing rights to it. The IC industry would grow to be worth billions by the mid-60s, and hundreds of billions today.

1983 - Apple Lisa Introduced

After development began in 1978, the Apple Lisa computer was finally released to the world on January 19, 1983. It featured a state-of-the-art graphical user interface (a novel feature at the time) and was targeted at business users.

The Apple Lisa featured a 5 MHz Motorola 68000 CPU, 1 MB of RAM, and two Apple FileWare 5.25” double-sided floppy disk drives; it ran the Lisa OS and sold for just under $10,000 USD (nearly $25,000 USD today).

 

An advertisement for the Apple Lisa, targeting the business market. Image courtesy of PBS.

 

However, the Apple Lisa would prove to be a failed product. Only 100,000 units were sold, partially because of its high price tag. 

Part of its failure, however, was due to overall unreliable performance. The machine was rife with quirks, including a “Y1995” problem in which the computer could only accept dates between 1980 and 1995. This was because the base year built into the computer was 1980 and the year was stored as a 4-bit offset on the real-time clock chip, which maxes out at 15 (binary 1111), i.e., 1995.
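To make the arithmetic concrete, here is a minimal sketch in C. This is a hypothetical illustration based on the description above, not actual Lisa firmware; the base year and mask names are assumptions for the example.

#include <stdio.h>

/* Illustrative sketch of a 4-bit year-offset field like the one described
 * above: the clock stores the year as an offset from a 1980 base, so only
 * 1980 through 1995 (offsets 0 through 15) are representable. */
#define BASE_YEAR 1980
#define YEAR_MASK 0x0F /* 4 bits: offsets 0..15 */

int main(void) {
    for (int year = 1994; year <= 1997; year++) {
        int stored  = (year - BASE_YEAR) & YEAR_MASK; /* what the chip keeps */
        int decoded = BASE_YEAR + stored;             /* what software reads back */
        printf("%d -> offset %2d -> reads back as %d\n", year, stored, decoded);
    }
    return 0;
}

Running this shows 1996 wrapping back around to 1980, the kind of rollover that explains why the Lisa could only accept dates between 1980 and 1995.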

A more successful Lisa 2 would eventually be released, and the entire Lisa line would later be replaced by the Macintosh.

1984 - First 1-Megabit Memory Chip

The year 1984 was an exciting one for the computing world in Japan: Hitachi announced the world’s first 1-megabit memory chip on January 6, 1984.

That same year, Fujio Masuoka, then working at Toshiba, invented flash memory (the NOR type was announced in 1984, with NAND following a few years later). Masuoka would eventually leave Toshiba and later sue the company over compensation for an invention that is now worth billions.

2000 - The World Survives Y2K

Anyone who was around for New Year’s Eve 1999 will likely remember the Y2K craze. Making computers “Y2K-proof” was discussed around the world, amid many concerns and warnings about what would happen when our calendars rolled over to 2000.

The problem stemmed from the way dates were stored: since memory was still a relatively costly commodity, many systems stored only the last two digits of the year, so that 1998 appeared as 98, 1999 as 99, and 2000 would be represented as 00.

The concern was that this would break the logic of a lot of programs, which would interpret 00 as 1900. The exact outcome would not be known until the “event horizon” was passed, and media and experts had different expectations, with the former leaning more toward the doomsday side of things.
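As a minimal sketch of the underlying bug (hypothetical code, not drawn from any particular affected system), consider what happens to simple year arithmetic when only two digits are stored:

#include <stdio.h>

/* Hypothetical two-digit-year logic of the kind Y2K remediation targeted:
 * with only the last two digits of the year stored, simple subtraction
 * misorders dates across the year 2000. */
int years_elapsed(int from_yy, int to_yy) {
    return to_yy - from_yy; /* 99 -> 00 yields -99 instead of 1 */
}

int main(void) {
    /* An account opened in 1999 ("99") and checked in 2000 ("00"): */
    printf("elapsed: %d years\n", years_elapsed(99, 0)); /* prints -99 */

    /* One common remediation, "windowing": treat 00..49 as 20xx and
     * 50..99 as 19xx when expanding a stored two-digit year. */
    int yy = 0;
    int full_year = (yy < 50) ? 2000 + yy : 1900 + yy;
    printf("windowed year: %d\n", full_year); /* prints 2000 */
    return 0;
}

Windowing was only a stopgap, of course; the durable fix was storing four-digit years outright.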

 

Best Buy's Y2K warning sticker. Image courtesy of Funny Junk.

 

However, patches were rolled out, and New Year’s Day 2000 arrived with little incident.

Want to see more hardware history? Let us know what your interests are in the comments below!

Feature image courtesy of Texas Instruments.

4 Comments
  • Heath Raftery February 01, 2018

    Funny to think that $US2.5B of sales was a failure.

  • adx February 02, 2018

    Or even selling 100k of the first model of anything.

  • MAlvis February 02, 2018

    History & Human Issues Question: Though I believe, after 10 years of working to understand (despite lots of media sour grapes stories which did not likely fit reality) & reading a portion of Bill Gates “The Road Ahead” (~1990), I finally got a handle on how the competition between CPM (DEC OS?). Gates group, etc, finally lead to IBM (hardware people, not so good at more complex software to control the hardware), finally ended up offering the DOS OS (IBM version vs. MS version) (~$40/PC, vs. CPM ~$240/PC or a programming language ~$840/PC) as software with which the first IBM PCs could be immediately used to put the machine into some use.

    More accurate history can be instructive to many on what drives what actually flies here on spaceship earth via Karmic Law.
