Under pressure to create the best human-machine interfaces? Yes, we are.

October 24, 2016 // By EDN Europe
Jon Stark, CEO, Peratech
I remember my first flight on a Boeing 777. Each seat had a touch screen. It was revolutionary.

I loved that my economy seat, an otherwise lousy, skinny lounge chair—as remote as could be at 37,000 feet above the earth—became a technological wonder: over 300 of us packed into a (large) sardine can, each sitting and watching whatever the heck we wanted.

It sounds so ancient putting the memory into words, a distant recollection of bliss, scanning through listings of movies simply by touching our screens. We have moved so far from that point, not just from a technological perspective but from a social perspective as well. Yes, touch interfaces have transformed not only the most mundane objects in our lives, but also how we interact with content.

Today, touch displays that make up most of what we call human-machine interfaces (HMIs) crop up in some pretty creative places, offering the promise of advancing our capability to interact with whatever that particular HMI is controlling. Whether it’s a new oven at home or a tablet menu at an airport restaurant, the quality of that experience is determined by the level of interaction that the HMI provides. We laugh about “pocket-dialling” our mother-in-law over the weekend. When the HMI is our oven, and leaning against the display causes the oven to turn off and epically fail to bake the cupcakes for our daughter’s fourth birthday, we don’t laugh at all, and no, she will never understand.

At home, tears ensue. At work, those of us working to make the best HMI experiences get back to work. We’re under pressure to get it right: every device, every situation, every time.


Figure 1. Force touch sensors on oven controls give a much more intuitive user experience.

First we replaced buttons with resistive touch screens, and then, as customer bliss turned to frustration, we replaced resistive touch with capacitive touch, striving each time for something that works better. Then came the iPhone and iPad, ushering in an almost universal understanding of both the elegance and the limitations of capacitive touch controls. A toddler pinches a picture book to zoom in on a picture, and we marvel at the elegance of touch. A man on the train fights with his smartphone to peck out a text, and autocorrect has him offering to sell a body part he never intends to part with. Let’s face it, we love this newfound HMI, but it is far from Nirvana.

On the technical side, there are real challenges in getting capacitive touch to work. Capacitive sensing struggles to deliver precise touch force and location even under normal conditions. Add curved or flexible surfaces, moisture, heat and gloves, and capacitive touch has a very difficult time meeting today’s customers’ basic needs, let alone product designers’ next-generation HMI aspirations.
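To make that precision problem concrete: a controller that only sees a binary “touched / not touched” signal cannot tell a resting forearm from a deliberate press, whereas a signal that rises with applied force makes that distinction a few lines of firmware. The sketch below is purely illustrative; the read_force_counts() call and the threshold values are hypothetical and not taken from any particular sensor or product.

/* Illustrative only: poll a hypothetical force-sensing element through an
 * ADC and report a key press only for a firm, deliberate push. A light or
 * lingering contact (a forearm leaning on the panel) never fires the key. */
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical platform call: 12-bit ADC counts (0..4095) that rise
 * roughly in proportion to the force applied to the sensor. */
extern uint16_t read_force_counts(void);

#define PRESS_THRESHOLD    900u  /* counts needed to register a press */
#define RELEASE_THRESHOLD  400u  /* drop below this before re-arming  */

/* Call from the scan loop every few milliseconds.
 * Returns true exactly once per deliberate press. */
bool oven_key_pressed(void)
{
    static bool pressed = false;
    uint16_t counts = read_force_counts();

    if (!pressed && counts > PRESS_THRESHOLD) {
        pressed = true;          /* firm press detected        */
        return true;
    }
    if (pressed && counts < RELEASE_THRESHOLD) {
        pressed = false;         /* finger lifted, re-arm      */
    }
    return false;                /* incidental contact ignored */
}

The two thresholds add hysteresis, so a press that hovers near the trigger level does not chatter on and off. None of this is possible if the only thing the sensor can report is that something conductive is nearby.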

Until recently, adding force sensing to improve the HMI has been an elusive goal. The thought is simple: if you can produce a signal in proportion to the pressure applied by a finger or stylus, and do that in a precise, reliable and repeatable way, you have added a new dimension to the way in which devices can be controlled. A simple