Introducing Zeno, the lifelike robot that is helping autistic children communicate

The autistic spectrum is sometimes, mistakenly, thought of as a gauge on which we all have our own place. Such a spectrum, you might assume, would have the behaviour of a typical person at one end, or bang in the middle. But this is not the case; it is a spectrum that deals with a wide range of behaviours and their excesses. An over-enthusiastic attitude towards Star Trek, for example, doesn't necessarily mean you have autism – perhaps you just enjoy a bit of escapism.

Autistic people can be loud and intelligent, or quiet and living with severe learning disabilities. There are about 700,000 people in the UK with autism – around 1 in 100 – and studies in the US suggest the figure is nearer 1 in 88. Males are five times more likely than females to have the disorder.

According to the website of the National Autistic Society: "Autism is a lifelong developmental disability that affects how a person communicates with, and relates to, other people. It also affects how they make sense of the world around them. It is a spectrum condition, which means that, while all people with autism share certain difficulties, their condition will affect them in different ways. Some people with autism are able to live relatively independent lives but others may have accompanying learning disabilities and need a lifetime of specialist support. People with autism may also experience over or under sensitivity to sounds, touch, tastes, smells, light or colours."

Communicating with autistic people, particularly children, is clearly a challenge, as communication difficulties are at the heart of the disorder. Speech development is often delayed in autistic children, making them less willing to use it as a form of communication and leaving them appearing shy or unresponsive. Alternative ways of communicating, such as gestures, can therefore be more effective.

This is where Zeno comes in. Zeno and his female friend Alice are 2ft tall robots that can communicate in ways that humans are not always able to. The robots, developed by RoboKind, are aimed at a variety of applications in education and health. Director of engineering and cofounder Richard Margolin commented: "Autism therapy was one of the applications that we were aware of when we started.
We had developed bigger robots in the past that were much more expensive, so our first goal was to take all of that technology and make it available to researchers in a smaller package that was much more affordable – reducing it by a factor of 10 from $250,000."

Initially, RoboKind's goal was to develop a robot as a tool for researchers and for internal use. Aside from the hardware, the company undertakes artificial intelligence development. It became clear there were two immediate areas where the robots could be particularly useful: firstly, in education, as the US has introduced new STEM standards which lend themselves to practical robotics; and, secondly, for autism therapy.

Autism has no cure, so the focus is on management and therapy. Margolin explained: "For autism, our focus is on social interaction therapies. In autism, one of the areas that tends to be deficient is natural social interaction. Our robots, with expressive faces and the ability to gesture and interact verbally, can teach in a repeatable way. Additionally, the robot doesn't get frustrated when kids act up, which some therapists do, and the robot doesn't get tired. Other robots that don't have faces are trying to do some of the same types of therapies but, instead of trying to show the kids what a smile looks like, they dance around and act happy. Here, you can really replicate facial expression, movement and gesture in a way that is not overwhelming to a child with autism – dealing with people directly for them can be overwhelming."

Zeno's face is remarkable. It is covered in a material created in house called Frubber. The patented silicone elastomer is designed specifically to look, move and feel like human flesh. "It is one of the enabling technologies that allows us to do facial expressions with quality and realism," said Margolin. "The facial expression aspect is mostly mechanical and is something we are expert in.
We think we are the best in the world with respect to facial mechanics, and being able to do it in a package like this doesn't require things like a pneumatic power generator."

More challenging is the artificial intelligence. Watching the emotional response to what the robot is doing and analysing it in real time enables researchers – and ultimately therapists – to see how effective the therapy is at that moment. Is the child getting bored or overwhelmed, and can the therapy be modified during the process to be more effective?

"It is actively individualising all the time," claimed Margolin. "We are using visual emotion recognition and movement cues from the individual we are doing the therapy with. So we are looking at facial expressions, we are looking at movement – is it jerky, for example? We are looking at opening the system up in a way that you can work with a number of internal and external sensors for the robot and we are looking at direct biofeedback – like heart rate, respiratory rate and skin response (amount of sweat) – to measure how the person is feeling. What we are trying to do, and we are getting better at it, is to develop a theory of mind about the individual we are working with so we can deliver the therapy effectively; deliver the information we are trying to get across in a way that works for that person. We build up a profile of that person as time goes on."

Zeno is loaded with sensors. He has a CMOS digital HD camera in each eye, each of which can turn independently, giving researchers the option of stereoscopic vision by adjusting the focal length. There are microphones in the ears, two IR distance sensors in the waist for obstacle detection and ultra-sensitive push button sensors in the feet – ideal for robot soccer! On the soles of the feet are two more sensors for 'ground detect', to let the robot know that when it thinks it has put its foot on the ground, it actually has.
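The adaptive loop Margolin describes – fuse biofeedback signals into an estimate of how the child is coping, then adjust the session accordingly – can be sketched in outline. The signals, weights and thresholds below are illustrative assumptions for the purpose of the sketch, not RoboKind's actual algorithm:

```python
# Hypothetical sketch of biofeedback-driven individualisation:
# combine heart rate, respiratory rate and skin conductance into a
# single arousal score, then map it to a coarse session adjustment.
# All resting baselines, weights and thresholds are assumed values.

def engagement_score(heart_rate, resp_rate, skin_conductance):
    """Normalise each signal against an assumed resting value and
    weight them; positive scores suggest arousal/overload, negative
    scores suggest disengagement."""
    hr_norm = (heart_rate - 80) / 40        # resting ~80 bpm (assumed)
    rr_norm = (resp_rate - 16) / 8          # resting ~16 breaths/min
    sc_norm = (skin_conductance - 5) / 5    # resting ~5 microsiemens
    return 0.4 * hr_norm + 0.3 * rr_norm + 0.3 * sc_norm

def adjust_session(score):
    """Map the fused score onto a coarse therapy adjustment."""
    if score > 0.5:
        return "slow down"      # child may be overwhelmed
    if score < -0.5:
        return "add stimulus"   # child may be bored or disengaged
    return "continue"
```

In a real system these decisions would also draw on the visual cues Margolin mentions – facial expression and jerkiness of movement – and would feed a longer-term per-child profile rather than reacting to instantaneous readings alone.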
There is also a three-axis gyroscope, a three-axis accelerometer and a three-axis compass. On the front of the feet there are flip detect sensors. "We don't want him to walk off the end of a table – he is a big, heavy, expensive piece of hardware," Margolin pointed out.

Josh Jach, director of electronics, said that, for him, a key challenge has been power management. "Each motor has a maximum current draw of 2A and, with 21 motors, that is more power than you want to deliver from a household ac/dc adaptor. We have to ensure each motor uses just what it needs and manage the system so that it will maintain the most critical motors at all times. The legs, for example – you can't cut power off there. Each motor has its own controller which feeds back how much current it is currently pulling and that allows us to do management through the software."

The motherboard is currently based on an x86 Intel Atom Z530 1.6GHz cpu with 1Gbyte of DDR2, 4Gbyte of flash and a 16Gbyte microSD card. Jach said: "In future, we are moving from the Atom to a more versatile TI OMAP chip. It is more power efficient and less costly; a more cost efficient product will need more power efficiency – more MIPS/W is how we look at it." The OMAP platform includes two ARM Cortex-A processors and two Cortex-M series cores, the former providing the processing power and offloading appropriate tasks to the more power efficient Cortex-M.

Jach continued: "The Intel Atom pulls 1A on standby and 2A when it is running all the software, and that is just because the Intel architecture is not that power efficient. It is equivalent to one or two motors. The ARM platform draws about 100mA, which is a lot lower. We also have three fans on the back of the Intel and a huge heatsink; there is only a tiny heatsink on the ARM and no fans. It just needs natural convective cooling. In a quiet room the fans are quite loud and off-putting for an autistic child."
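The current-management scheme Jach outlines – every controller reporting its present draw, with software shedding load from low-priority motors while critical ones such as the legs stay powered – might look something like this in outline. The supply budget, motor names and priorities here are assumptions for illustration:

```python
# Illustrative sketch of per-motor current management: each motor
# controller reports (name, draw in amps, critical flag), and software
# throttles the hungriest non-critical motors until the total draw
# fits the supply budget. Critical motors are never touched.
# The 10A budget is an assumed figure, well under the 21 x 2A worst case.

SUPPLY_LIMIT_A = 10.0

def shed_load(motors, limit=SUPPLY_LIMIT_A):
    """motors: list of (name, draw_amps, critical) tuples as reported
    by the motor controllers. Returns the names of motors to throttle."""
    total = sum(draw for _, draw, _ in motors)
    throttled = []
    # Consider motors with the biggest draw first.
    for name, draw, critical in sorted(motors, key=lambda m: -m[1]):
        if total <= limit:
            break
        if not critical:
            throttled.append(name)
            total -= draw
    return throttled
```

For example, with the legs flagged critical and the arms and jaw not, an over-budget condition would throttle an arm before ever touching a leg – matching Jach's point that "you can't cut power off there".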
The University of Texas at Arlington (UTA) is leading the research into this form of autism therapy, with partners that include RoboKind, the Dallas Autism Treatment Centre and Texas Instruments. Jach noted: "UTA has implemented Kinect based feedback to the robot to pick up the therapist's movements and relay them to the robot. So you can have a therapist in a remote location conduct a session, with video fed back through the eyes. We also have National Instruments' LabVIEW hooked up and accessing any function that the robot can do. The robot is sending animations via LabVIEW wirelessly over 802.11n; effectively, it is being used as the control software."

Margolin added: "We use LabVIEW in the build process of this robot; our robot calibration software is basically an automatic calibration suite that runs in LabVIEW, connects to the robot, runs it through a test procedure and then generates all the calibration files for each robot. Our model is to keep all of our software open source; it is only our content that we are keeping closed – like our programmes for high schools. But anyone who wants to download our software for conversational interaction, our motor and motion control, our vision – they can do that from the website."

It is early days, but Margolin believes Zeno and Alice may have their place in autism therapy. "We have thus far conducted a number of efficacy studies that have shown this is worth pursuing and we are now actively working directly with children with autism in a number of places. We are a Dallas based company and the Dallas Autism Treatment Centre has about 75 children with autism concentrated in one location, so we have been working with them."