Feeling the future of robotic control


Now that VR and AR are established in several vertical markets, the next stage is adding haptic feedback.

Immersive Reality may be struggling to penetrate the consumer world, but across industry it is becoming well established.

Manufacturers, for example, are finding uses for the technology that include helping employees diagnose potential problems, recording their progress during training, and reducing wastage while improving assembly quality, which in turn lengthens the life of machines.

Such use cases were established and detailed in ARC Advisory Group digital transformation analyst Will Hastings’ 2018 report on the use of AR in manufacturing, and now the focus of innovation has shifted from the ability to see to the ability to see and feel.

Sectors leading the way include defence, with bomb disposal robots; aerospace, for maintenance work; and ceramics, where the aim is to improve the quality of the end product.

Another area in which haptics and virtual reality are in demand is healthcare, where they allow surgeons to train before operating on patients. Richard Vincent, founder and CEO of FundamentalVR, said: “Where we seek to get our technology to is to create haptics which give the sensory clues that you need to change your behaviour in a surgical procedure.”

FundamentalVR replicates a number of surgical procedures through Microsoft, Vive or Oculus VR headsets and two Geomagic haptic arms, in order to give students a sense of the different forces needed to carry out an operation. The procedures are built on the Unity game engine, with FundamentalVR’s own R&D team incorporating the force feedback through the haptic arms.
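To illustrate the general principle at work here (not FundamentalVR’s actual code), a minimal penalty-based force model of the kind commonly used in haptic rendering might look like the sketch below: a spring-damper force is returned whenever the stylus tip penetrates a virtual surface. The device calls in the closing comment are hypothetical.

```python
# Illustrative only: a generic penalty-based force model of the kind used in
# haptic rendering, not FundamentalVR's implementation. The device calls at
# the bottom (read_tip_state, apply_force) are hypothetical.

import numpy as np

STIFFNESS = 800.0   # N/m   - how "hard" the virtual tissue feels
DAMPING = 2.5       # N*s/m - suppresses buzzing at the contact boundary
SURFACE_Y = 0.0     # height of a flat virtual surface (m)

def contact_force(tip_pos, tip_vel):
    """Force to send to the haptic arm for a flat virtual surface.

    Above the surface, no force is applied; once the stylus tip penetrates,
    a spring pushes it back out and a damper resists motion, which is what
    lets the user feel resistance change as they press harder.
    """
    penetration = SURFACE_Y - tip_pos[1]       # how far below the surface
    if penetration <= 0.0:
        return np.zeros(3)                     # free space: no feedback
    normal = np.array([0.0, 1.0, 0.0])         # surface normal, pointing up
    spring = STIFFNESS * penetration * normal  # push the tip back out
    damper = -DAMPING * tip_vel                # resist fast motion
    return spring + damper

# The haptic loop typically runs at around 1 kHz, far faster than the display:
# while running:
#     pos, vel = device.read_tip_state()
#     device.apply_force(contact_force(pos, vel))
```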

“Whether it’s reaming, where you can feel the surface that’s starting to change and so change your behaviour, or whether it’s a pedicle screw, where you have to feel the inside of the channel at the back, we can achieve those clues, changing behaviour.”

Elsewhere, in manufacturing, Unity Studios ApS chief design officer Thomas Fenger explained that companies are now interested in using haptic response systems and, in particular, VR and AR to make sure production runs smoothly.

“Right now, companies that deliver industrial robots are branching into using AR/VR for controls.

“Mostly I see this as a sort of pre-installation of the controls. We have already seen ABB using virtual reality to set up and control robots in a digital copy of the sort of environment that they come to use.

“The idea is to not interrupt production. The operator will set up the procedures that the robot will follow and can pre-simulate without actually interrupting processes. That’s probably where we see the most benefit right now.”
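As a loose illustration of that pre-simulation idea (not ABB’s or Unity Studios’ actual tooling), the sketch below checks a hypothetical robot program against an invented model of the cell before anything is sent to the real controller; the workspace limits, keep-out zone and waypoints are all made up for the example.

```python
# A minimal sketch of the "pre-simulate before deploying" idea described above.
# The reach limits, keep-out zone and waypoints are invented for illustration;
# real offline-programming and digital-twin tools do far more.

from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float
    y: float
    z: float

# Invented cell model: the robot's reachable box and a keep-out zone around a fixture.
REACH = {"x": (-1.0, 1.0), "y": (-1.0, 1.0), "z": (0.0, 1.5)}
KEEP_OUT = {"x": (0.4, 0.6), "y": (0.4, 0.6), "z": (0.0, 0.5)}

def inside(p: Waypoint, box) -> bool:
    """True if the waypoint lies within the axis-aligned box."""
    return all(box[a][0] <= getattr(p, a) <= box[a][1] for a in ("x", "y", "z"))

def validate_program(waypoints):
    """Flag waypoints that are unreachable or collide with the fixture,
    so problems are caught in the digital copy rather than on the line."""
    issues = []
    for i, p in enumerate(waypoints):
        if not inside(p, REACH):
            issues.append((i, "outside reachable workspace"))
        elif inside(p, KEEP_OUT):
            issues.append((i, "enters keep-out zone around fixture"))
    return issues

program = [Waypoint(0.2, 0.3, 0.8), Waypoint(0.5, 0.5, 0.2), Waypoint(1.2, 0.0, 0.5)]
for index, problem in validate_program(program):
    print(f"waypoint {index}: {problem}")
```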

True replication

The challenge that system developers now face is replicating true interaction through robotics. According to Professor Robert Stone, director of the Human Interface Technologies Team and Emeritus Professor of XR and Telepresence at the University of Birmingham, these technologies have not developed much beyond the systems of the 1990s.

“We saw a lot of early VR applications using master-slave manipulators from the nuclear industry, and for a long time people were asking if these things could be adapted. Even the original exoskeletons in the 1960s were being looked at as a means by which you could interact more intuitively with virtual worlds.

“It’s only recently that haptics has come to the fore because a lot of people are coming out with quite outlandish products but, when you look at these products, very little has changed since the 90s. We’re still looking at pneumatics and at electromechanical devices built into pulley-like systems that restrict or restrain the movements of the hand or arm; we’re still looking at exoskeletons.

“The big issue from my perspective, as a human factors specialist, is that we are still in the position today where there is not one system, be it robotics or VR, that is able to satisfy all the needs of the sensors that are built into our fingertips, the skin of the fingers, the muscles, the tendons. It’s still incredibly primitive.”

“The important thing with touch is that everyone feels things slightly differently,” added Vincent. “I feel a table surface and say it’s quite smooth, you might say it’s a bit rough. Neither of us is wrong, but the point is we can feel it and therefore we know it’s there, and that’s what we’re trying to achieve with our technology.

“Is it exactly the same as real life? No. But is it enough to create the learning experience? Yes. And that’s the difference.

“What haptics do is lift the cognitive load, to make it easier for people to learn because you’ve got more than one sensory experience going on. They accelerate the learning speed because, again, you’re learning through different behaviours, not just theoretical observation. You’re doing it.”

Good Vibrations?

Despite describing haptic response technologies as “primitive” when it comes to touch sensation, Stone said there is a place for certain haptic systems in non-mission-critical situations.

He pointed to piezoelectric transducers built into a joystick, which vibrate to give the user some form of touch sensation, as a good application of haptic technology.

“One of the best haptic feedback systems I’ve ever used for a robot, was developed by L3Harris over in the US for bomb disposal.

“They used a joystick to develop a haptic feedback controller and we put a power tool onto the end of the robot with a rotating piece of sandpaper to strip stuff away.

“Using this hand controller and a multi-axis platform we were able to strip the plastic sheath off a wire remotely without sanding or burning through the actual wire itself. You couldn’t have done that with a glove or any of these other techniques for that application.”

Fenger said that companies’ approaches to incorporating haptic feedback seem to follow two routes: a precise system that mimics real interaction, or ditching such complicated systems in favour of feedback designed simply to notify the user of their surroundings. The latter are similar to the systems outlined by Stone, in which “rumblings in an Xbox controller” signalled changes.

“Most VR headsets give different versions of vibrations as feedback. That’s kind of a more interesting way to go, because you learn to accept a certain type or speed or rumble as another kind of haptic feedback,” said Fenger. “So basically, it will be a suspension of disbelief. You accept the tools are different when you are inside of the VR helmet and you accept you touched something when you get a slight rumble.”
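A sketch of that second route might map a contact event in the simulation to a short vibration pulse; the code is hypothetical, with an invented set_rumble call standing in for whatever vibration API a given VR runtime exposes.

```python
# Hypothetical sketch of event-style haptics: translate a contact in the
# simulation into a short vibration pulse. The controller.set_rumble call is
# invented; real VR runtimes expose their own vibration APIs.

def rumble_for_contact(force_newtons, max_force=20.0):
    """Return (amplitude, duration_seconds) for a contact of the given force.

    A light brush gives a faint, short buzz; a hard collision gives a longer,
    stronger one. The user learns to read the rumble as "I touched something",
    even though it feels nothing like the real surface.
    """
    amplitude = min(force_newtons / max_force, 1.0)  # normalise to 0..1
    duration = 0.02 + 0.08 * amplitude               # 20-100 ms pulse
    return amplitude, duration

# Inside a simulation update:
# if collision_detected:
#     amp, dur = rumble_for_contact(collision.force)
#     controller.set_rumble(amp, dur)
```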

When it comes to realistic haptic feedback, Vincent accepts that his FundamentalVR solution is not quite the finished product. The aim is to become as realistic as cadaveric training, but he pointed out that on price and convenience the comparison weighs heavily in the technology’s favour.

As for the use of VR and haptics in surgery itself, much has been made, in the run-up to the rollout of 5G networks, of remote surgery: using new, faster connections to conduct operations from a different room, hospital, country or even continent.

Vincent warns that such systems still have issues to address, and that there are questions as to whether this is even the right technology.

“I’m not sure why you’d want to go there with VR because you really want AR rather than VR. You’d want to enhance the view rather than necessarily replace it.

“But the biggest issue is the weakest link, which is latency, because there is the possibility that the moment you’re just about to sever an artery and try and stitch it up, you get that one-millisecond blip in the signal. That could, quite literally, be terminal.
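In teleoperation systems that concern is typically handled with a watchdog that refuses to act on stale input, along the lines of the sketch below. The thresholds and the hold_position()/apply_command() calls are invented for illustration, and the freshness check assumes the two ends share a synchronised clock, which real systems usually replace with round-trip measurements.

```python
# A minimal watchdog sketch for teleoperation: never act on stale input.
# The thresholds, hold_position() and apply_command() are invented, and the
# freshness check assumes both ends share a synchronised clock; real systems
# usually measure round-trip delay by echoing timestamps instead.

import time

LATENCY_BUDGET_S = 0.050   # illustrative 50 ms budget for command age
TIMEOUT_S = 0.200          # treat 200 ms of silence as a dropped link

class TeleopWatchdog:
    def __init__(self):
        self.last_packet_time = time.monotonic()

    def command_fresh(self, sent_timestamp):
        """Record the arrival and report whether the command is fresh enough."""
        now = time.monotonic()
        self.last_packet_time = now
        return (now - sent_timestamp) <= LATENCY_BUDGET_S

    def link_alive(self):
        """True while commands are still arriving within the timeout window."""
        return (time.monotonic() - self.last_packet_time) <= TIMEOUT_S

# Hypothetical control loop on the robot side:
# if watchdog.link_alive() and watchdog.command_fresh(cmd.timestamp):
#     robot.apply_command(cmd)
# else:
#     robot.hold_position()   # fail safe rather than act on a stale command
```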

“We’ve had a number of conversations with manufacturers of surgical robots around how some of our haptic intelligence could be used in live robot activity. We’ve not deployed any of that yet, but that’s certainly an area of interest with the people who are specialising in that production.”