But things are improving. The latest iPhone features Siri, a natural language user interface that answers questions, makes recommendations and performs actions.
Smartphones and tablet computers already use the touchscreen interface to great effect and if some of the many research projects underway succeed, touch technology – or haptics – will transform the way we use electronic devices.
One promising example of haptics is OmniTouch, a wearable projection system developed by Microsoft Research and Carnegie Mellon University (CMU) in the US. It enables users to turn pads of paper, walls or even their own hands, arms and legs into graphical, interactive surfaces.
A significant innovation of OmniTouch is its use of a depth-sensing camera, similar to Microsoft's Kinect, to track the user's fingers on everyday surfaces. This means users can control interactive applications by tapping or dragging their fingers, much as they would with conventional touchscreens. The projector can superimpose keyboards, keypads and other controls onto any surface, adjusting automatically for the surface's shape and orientation to minimise distortion of the projected images.
"It's conceivable that anything you can do on today's mobile devices, you will be able to do on your hand using OmniTouch," says Chris Harrison, of CMU's Human-Computer Interaction Institute. "The palm of the hand could be used as a phone keypad, or as a tablet for jotting down brief notes. Maps projected onto a wall could be panned and zoomed with the same finger motions that work with a conventional multitouch screen."
OmniTouch is a significant haptic advance not only because it uses a depth camera, but also because it incorporates a laser picoprojector. Currently it is mounted on a user's shoulder, but it could be reduced to the size of a matchbox, allowing it to be worn or integrated into future handheld devices.
"These sorts of technologies have only been recently enabled," Harrison says. "Computationally powerful smartphones have been around for a few years now. Mobile projectors are finally getting small and bright enough to be practical. And finally, in the case of OmniTouch, depth cameras offer sensing capabilities that were not possible just a few years ago, and open many new opportunities for interfaces."
"With OmniTouch, we wanted to capitalise on the tremendous surface area the real world provides," says Hrvoje Benko, a researcher in Microsoft Research's Adaptive Systems and Interaction group. "We see this work as an evolutionary step at Microsoft Research to investigate the unconventional use of touch and gesture in devices, to extend our vision of ubiquitous computing even further."
The optical sensing used in OmniTouch allows a range of interactions similar to those of a computer mouse or touchscreen. It can track 3D motion on the hand or other commonplace surfaces, and can sense whether fingers are 'clicked' or merely hovering. OmniTouch also requires no calibration – users can simply wear the device and immediately use its features.
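The click-versus-hover distinction can be sketched with simple depth thresholds. The values and function below are hypothetical, purely for illustration; OmniTouch's real finger-tracking pipeline is considerably more sophisticated.

```python
# Hypothetical sketch: classifying a tracked fingertip from depth data.
# A depth camera reports the distance (in mm) from the camera to both the
# fingertip and the surface behind it; the gap between them tells us
# whether the finger is touching, hovering, or away from the surface.

HOVER_MM = 20.0  # within 20 mm of the surface counts as hovering (assumed)
CLICK_MM = 5.0   # within 5 mm counts as a touch, i.e. 'clicked' (assumed)

def classify_finger(finger_depth_mm, surface_depth_mm):
    """Classify a fingertip relative to the surface it is projected on."""
    gap = surface_depth_mm - finger_depth_mm  # height above the surface
    if gap <= CLICK_MM:
        return "clicked"
    if gap <= HOVER_MM:
        return "hovering"
    return "away"
```

A finger 4 mm above the surface would register as "clicked", while one 15 mm above would merely be "hovering".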
Harrison previously worked with Microsoft Research on other haptic interfaces, such as Skinput, a technology that used bioacoustic sensors to detect finger taps on a person's hands or forearm. Skinput thus enabled users to control smartphones or other compact computing devices. One of his latest projects, TapSense, uses the different sounds made by a user's touch to enhance haptic interfaces with surfaces – it can distinguish between the fingertip, pad, nail and knuckle.
US company Novint Technologies is a leader in haptic interfaces for gaming with its Falcon and XIO products. Users hold the Falcon's grip and, as it moves, the computer tracks a 3D cursor. When the cursor touches a virtual object, the computer registers contact with that object and updates the currents to motors in the device, creating an appropriate force at the handle that the user feels.
"The computer updates the position of the device and the currents to the motors a thousand times a second, providing a very realistic sense of touch," the company says.
For example, when the 3D cursor touches a virtual sphere, there is a force perpendicular to the surface: the device pushes radially away from the centre of the sphere, in proportion to how hard the user pushes against it.
"The computer keeps track of the direction of the force, based on the position of the cursor and the amount of the force, which lets the user slide the 3d cursor across the surface of the sphere, giving it a consistent smooth feel. The effect is that the cursor, and therefore device, physically cannot move through the sphere, and it is actually a virtual solid object."
Although haptic devices have been available for commercial applications for many years, these devices have historically cost tens of thousands – even hundreds of thousands – of dollars. Novint says its devices are designed as affordable consumer controllers and claims it makes high fidelity 3d touch accessible to the consumer market for the first time.
Another US company – Sensable – produces a range of haptic devices for 3D modelling, including its Intellifit Digital Restoration System, which makes it possible to scan, design and fabricate common dental restorations using a 3D Virtual Touch stylus.
The company's Phantom range caters for varying modelling needs, all featuring haptic feedback, and it even provides an OpenHaptics Toolkit, enabling developers to add haptics and 3D navigation to applications. The toolkit handles complex calculations, provides low-level device control for advanced developers, and supports polygonal objects, material properties and force effects.
"It means designers can model complex, highly detailed organic shapes faster than with traditional cad tools, create multiple design variations, clean up, modify and stylise scan data and create detailed textures for prototyping, evaluation and manufacturing," it claims.
Another unusual haptic interface, also developed at CMU, is based on magnetic levitation. Invented by Ralph Hollis of CMU's Robotics Institute, the maglev haptic interface allows users to perceive textures, feel hard contacts and notice even slight changes in position. Researchers are working on applications for controlling remote robots, and on simulation technology for dental training and biopsy needle insertion.
The maglev haptic interface has a single moving part – a bowl-shaped device called a flotor – that is embedded with wire coils. Electric current flowing through the coils interacts with permanent magnets underneath, causing the flotor to levitate.
Users control the device with a handle attached to the flotor, moving it much like a computer mouse, but in three dimensions. Based on the interactions of the virtual objects being manipulated, corresponding signals are transmitted to the flotor's electrical coils, exerting forces and torques on the handle that the user can feel. A CMU spin-off, Butterfly Haptics, produces a commercial version of the device called the Maglev 200.
Surround Haptics, a new tactile technology developed at Disney Research, Pittsburgh (DRP), enables video game players and film viewers to feel a variety of sensations – from the smoothness of a finger being drawn against skin to the jolt of a collision.
It has been demonstrated enhancing a driving simulator game developed in collaboration with Disney's Black Rock Studio. Seated in a chair fitted with vibrating actuators, players can feel road imperfections and objects falling on the car, sense skidding, braking and acceleration, and experience ripples of sensation when cars collide or jump and land.
"Although we have only implemented Surround Haptics with a gaming chair to date, the technology can be embedded into clothing, gloves, sports equipment and mobile computing devices," says Ivan Poupyrev, a senior research scientist at DRP. "This technology has the capability of enhancing the perception of flying or falling, of shrinking or growing, of feeling bugs creeping on your skin. The possibilities are endless."
DRP researchers have designed an algorithm for controlling an array of vibrating actuators in such a way as to create 'virtual actuators' anywhere within the grid of actuators. Poupyrev says a virtual actuator can be created between any two physical actuators; the user has the illusion of feeling only the virtual actuator.
"As a result, users don't feel the buzzing or pulsing typical of most haptic devices today, but can feel discrete, continuous motions such as a finger tracing a pattern on skin."
The phenomenon of phantom sensations created by actuators has been known for more than 50 years, but its use in tactile displays has been limited because of an incomplete understanding of control mechanisms. DRP conducted a series of psychophysical experiments to work out how to achieve and manipulate these mechanisms.
In addition to enhancing user experiences with interactive games, movies and music, DRP says Surround Haptics' underlying technology can potentially provide new tactile means of communication for the blind, emergency workers, vehicle operators, athletes and others.
A host of other haptic projects are underway worldwide. Several involve developing touch-sensitive fabrics, which car makers such as BMW are considering installing in future models. Automotive companies are also exploring touch-based navigation systems in which devices mounted on the steering wheel pull the skin of the driver's fingertips left or right – research suggests drivers follow such 'instructions' more accurately than voice commands, especially while talking on a mobile phone. Tactile gaming vests have even been developed that turn playing a video game into a complete – sometimes painful – sensory experience.
Potential in medical applications
Many think there is huge potential for haptics in medicine: a system developed at the University of Leeds aims to give surgeons a hands-on feel when using keyhole techniques. The Palpatronix system combines a computer-generated environment for virtual surgery with a handheld device that applies pressure to the user's hand. What the user feels depends on how hard they are compressing the virtual tissue.
Two similar EU-funded projects provide touch feedback: Robocast is a robotic neurosurgery system, while Immersence aims to combine vision and touch to create virtual 'objects' that can be transmitted electronically.
Harrison believes the time is ripe for a haptic revolution.
"We haven't yet seen advanced haptic feedback technologies widely integrated because it requires 'buy in' – someone needs to take the risk. It's like Apple building a phone around a multitouch screen with no keyboard. It was a gutsy move and now is practically the norm. If someone adds an advanced haptic technology, others will follow.
"The real world is full of rich haptic feedback: we push a door, grab a toothbrush, grasp a bottle. So far computing has lacked much touch input, so we're mostly clicking buttons and poking touchscreens. But there is a huge opportunity for providing haptic feedback to the user, just as we get from real world actions."
The last few years have seen an explosion in social media, enabling people to communicate: maybe we are about to see yet another way of staying in touch.