
A History of Touchscreens (and What’s Coming Next)

September 16, 2015 by Jennifer A. Diffley

Cypress has announced a waterproof, glove-friendly automotive touchscreen, and Apple has come out with innovative pressure-sensitive touch technology of its own. Here's a look at where we've been and where we're likely headed.

Back in 1994, when Friends dominated television and Ace of Base was the top musical group on the radio, IBM unveiled the Simon, a mobile phone with all the delicate aesthetics of a brick.

But the Simon was remarkable for one major reason besides its one-hour battery: it had a touchscreen--kind of. It used resistive touch, which meant the screen had to be physically pressed so that the conductive layers beneath it made contact and completed a circuit. The behemoth phone required a stylus to operate (check out this demonstration of the magnificent stylus action), but it was still a leap forward. The Simon predated the iPhone by 13 years, and in the meantime various manufacturers used resistive touch for PDAs and keyboards.

The first capacitive phone display--which uses the electrical charge of a human touch to detect commands--was released by LG in its Prada smartphone. The capacitive display meant phone screens didn't have to flex, and it was far more responsive than resistive touch. Though the iPhone wasn't the first to use capacitive touch, it did set the standard for mobile devices: if a smartphone was going to compete in the market, it needed a touchscreen.

Moving Forward

One interesting development came this year from Cypress Semiconductor, which has released an automotive touchscreen that is waterproof and can be operated even with gloved hands. That may not sound particularly fascinating, but it's remarkable that the screen can withstand the tremendous electromagnetic interference found in cars. Waterproof operation is a step forward as well, since very few touchscreens work when wet (an iPhone won't, but some Androids will).

And then of course came Apple's contribution to touchscreen innovation: pressure-sensitive capacitive touch. It first appeared as Force Touch in the Apple Watch and in MacBook trackpads. Force Touch pairs pressure sensing with a haptic feedback engine that makes it seem as though the surface is moving downward, when in reality the engine is simply pushing back at the user.

The innovation in pressure sensitivity appeared again in the new iPhone 6s, which features 3D Touch (a more sensitive version of Force Touch; no one's going to be able to keep the terms straight). And yet 3D Touch still isn't present in the new iPad Pro or in any of the new Macs. In fact, truly pressure-sensitive displays have yet to make their way into computers.
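For developers, this pressure data is exposed through iOS's standard touch APIs rather than a separate sensor interface. The sketch below is not Apple sample code: the view class and the pressure-to-opacity response are illustrative assumptions, while UITouch's force and maximumPossibleForce properties (available since iOS 9) are the real mechanism on 3D Touch hardware.

    import UIKit

    // Minimal sketch: reading 3D Touch pressure in a custom view (iOS 9+).
    // The class name and the pressure-to-opacity response are illustrative only.
    class PressureView: UIView {

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first,
                  traitCollection.forceTouchCapability == .available else { return }

            // force runs from 0 up to maximumPossibleForce on supported hardware;
            // normalizing keeps the value comparable across devices.
            let pressure = touch.force / touch.maximumPossibleForce

            // Illustrative response: the harder the press, the more the view fades.
            alpha = 1.0 - 0.5 * pressure
        }
    }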

Designers have unlimited possibilities when it comes to the future of touchscreens. Users want to feel the clothes they're shopping for online or transmit the textures in a picture across platforms like Instagram. Technology being developed to restore sensation to prosthetic limbs could be adapted to give mobile devices new tactile feedback, and faster processors would allow for real-time manipulation of synthetic touch. Then there's the virtual reality market: it's about to explode, but what would make it really sensational (pardon the pun) would be the addition of touch response. Touchscreens may go away altogether someday, but in the meantime, designers have an obligation to bridge the gap between a glass screen and the nuanced senses of the everyday world.

1 Comment
    vlgligor September 15, 2017

    A short comment on “Then there’s the virtual reality market: it’s about to explode, but what would make it really sensational (pardon the pun) would be the addition of touch response.”

    Touch response for virtual projections already exists in a few forms and is sometimes called gesture recognition. I'll list just a few examples:
    1. Projected laser virtual keyboards, a relatively cheap gadget that made its way to market a few years ago
    2. The Microsoft Kinect, which takes player input without any touch at all
    3. Leap Motion's sensor for touchless input on a computer screen, an impressive technology that I tested about three years ago
    4. VR controllers such as those for the HTC Vive and Sony PlayStation, which are (in my opinion) still rudimentary, but a first step.

    In other words, the hardware exists and sensitivity/accuracy may still be an issue, but future development will have to focus on gesture-recognition algorithms and on standardization.

    Cheers!
