FEATURE | 21 December 2016

At the touch of a non-existent button



One step on from gesture technology, Ultrahaptics is using ultrasound to simulate the feel of control knobs in thin air. By Jane Bainbridge

Home Appliances

For years, the turn of a knob or the press of a button gave us the physical satisfaction and intuitive understanding that an on/off, increase/decrease type action had taken place. Then we adapted to the computer mouse and the way in which a horizontal movement could translate into a vertical one. 

A further leap was required with the rise of gesture technology, and while it has only been adopted in a limited number of situations, people are generally at ease with the wave of a hand turning on a tap in public toilets, or guiding a computer game. 

But what if the mid-air movement of gesture technology was combined with the tactile sensation of the push of a button? 

That is precisely what UK start-up Ultrahaptics offers: a system that manipulates highly accurate puffs of air to give the user a tactile experience without a physical button, and without the need for gloves or attachments. 

It is the brainwave of Tom Carter, who in 2013 was studying computer science at the University of Bristol. While Carter was working in the Bristol Interaction and Graphics (BIG) Lab, his professor suggested that, for his final-year project, he might want to look at ultrasound and how to manipulate it, building on research first done in the 1970s. 

Steve Cliffe & Tom Carter

The project turned into a PhD and, in his third year, Carter won a prize awarding him a small amount of funding. This he promptly spent on flights to Las Vegas to attend the annual global consumer electronics show, CES. The result was £900,000 of seed funding from IP Group. 

The technology uses an array of 256 transducers, or speakers, emitting ultrasound waves at a frequency of 40kHz. The sound waves combine, and the resulting force vibrates the surface of your skin, which simulates the sense of touch. The focal points can be as fine as 4mm, and the transducers are the same as those found in rear parking sensors. 

Carter, now chief technology officer at Ultrahaptics, says: “We have an array of ultrasound speakers and what we do is very precisely control the timing between the speakers so that all of the sound waves arrive at the same point at the same time. This displaces the skin on the hand ever so slightly, simulating the sense of touch. We can target each fingertip individually and we can create a different texture on each different fingertip.”
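The timing trick Carter describes is classic phased-array focusing: each transducer fires slightly later or earlier so that every wavefront arrives at the chosen point at the same instant. A minimal sketch of that delay calculation is below; the array geometry, pitch and focal point are illustrative assumptions, not Ultrahaptics' actual parameters or implementation.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20°C
FREQ = 40_000.0         # 40 kHz carrier, as used by the array

def focus_delays(transducer_positions, focal_point):
    """Per-transducer emission delays (seconds) chosen so that all
    wavefronts arrive at the focal point simultaneously."""
    dists = [math.dist(p, focal_point) for p in transducer_positions]
    farthest = max(dists)
    # The farthest transducer fires first (zero delay); nearer ones wait
    # for the extra travel time of the farthest wavefront.
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]

# Hypothetical 16x16 grid (256 transducers) at 10.3 mm pitch in the
# z=0 plane, focusing on a point 20 cm above the centre of the array.
pitch = 0.0103
positions = [((i - 7.5) * pitch, (j - 7.5) * pitch, 0.0)
             for i in range(16) for j in range(16)]
delays = focus_delays(positions, (0.0, 0.0, 0.2))

# In hardware the delays become phase offsets of the 40 kHz drive
# signal: one full cycle of delay equals 1/FREQ = 25 microseconds.
phase_cycles = [d * FREQ for d in delays]
```

Steering the focal point, or maintaining several points at once (one per fingertip, as Carter describes), amounts to recomputing these delays for each target many times a second.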

So, for example, it can feel as if you're touching a ball or cube, as if your hand is passing through a force field, or as if you're popping bubbles. 

“We get that immediate click feeling and that’s something that you can’t get with gesture recognition on its own,” adds Carter. “The feeling of touch is essential if you want to control your devices quickly and efficiently.”

To push the technology out into the market and allow firms to incorporate it into their own products, Ultrahaptics created an ‘evaluation programme’.

It took this to the following year’s CES show, where it instantly sold out. In October 2015 the company raised £10m in funding.

At the time that Carter was starting out, people were still struggling to give up their BlackBerrys – with button keyboard – for the iPhone touchscreen. Typing was slower on the touchscreen keyboard as there was no haptic feedback. Ultimately, the iPhone introduced an element of haptic feedback so users sense when the device is responding to them. 

Automotive

The evaluation programme – which was officially launched in January 2015 – costs $20,000 to join, so it predominantly appeals to high-end manufacturers. Ultrahaptics' business model is to work with partners to develop custom sensations and switches on a non-recurring engineering (NRE) expense, a one-time cost to research, develop, design and test a new product. 

After that, a manufacturer pays for a licence to produce hardware using the software, and Ultrahaptics receives a royalty for every product sold. The company is currently exploring a more accessible developer platform, so that more companies can try to incorporate the technology into their products. 

So where is it most likely to appear first? Inevitably, there is much secrecy about who is testing this technology, with Jaguar Land Rover and the University of Tokyo the only organisations willing to be officially acknowledged. 

However, the company is confident it will appear in a product on the market by the end of the year, probably in the white label category.

It sees two areas where the tech is most likely to be applied: control – knowing when your gesture has actually worked (especially important in the car environment, so you don't take your eyes off the road) – and augmented reality/virtual reality (AR/VR), offering a further level of immersion in a virtual world. 

VR

Steve Cliffe, CEO of Ultrahaptics, says: “What’s really remarkable is that this technology can be used anywhere we interact with technology. Think about how many devices are now controlled with touchscreen: computers, car dashboards, TVs, sat navs, phones, and even the latest kitchen devices and parking meters.

“Our technology removes the need to touch devices physically, but still gives us that vital recognition that we are interacting with our devices, making that communication two way, and intuitive. And that’s just the control applications for the technology. We believe that touch is the missing piece to make virtual worlds more real. At the moment, despite sophisticated graphics and audio, the spell is still broken in VR once you reach out to touch something.” 
