Overcoming visual impairment

Electronics technology is helping to overcome visual impairment, but there’s still a lot of work to be done.

In 2015, more than 2 million people in the UK were living with some degree of sight loss. It affects people of all ages, but the older we get, the more likely it is that our vision will be impaired. According to statistics from RNIB, one in five people aged 75 and over are living with sight loss and nearly two-thirds of those living with sight loss are women.

The most common cause of blindness – almost complete loss of vision – is age-related macular degeneration. But other conditions that affect sight to some degree include glaucoma, cataracts and diabetic retinopathy.

Steve Tyler, head of solutions, strategy and planning for RNIB, said the number of visually impaired people in the UK is set to double by 2050 as the population ages. “The visually impaired tell us they want to be able to travel, to be able to look after their money. For example, they want to use ATMs, to go shopping.”

Tyler said these desires have one thing in common: access to information. “We’ve found,” he continued, “there’s a need to influence the industries which have most effect on how things work – and this includes operating systems and embedded systems.

“Android, iOS and the like used to be focused on smart devices, but are now finding their way into everyday products. A good strategy is to make sure that accessible APIs are built in to enable the use of screen reading technology or the addition of other functions. So, if you have specialised devices, you have half a chance of making them usable.”

Tyler pointed to what he called ‘good progress’ with operating systems, such as iOS. “These support voice, screen readers, zoom, contrast and font size, allowing them to be personalised. But they also allow the connection of a Braille display. It not only shows that our strategy works, but also that, if you can get companies engaged, they will deliver.”

And a particular benefit of devices such as the iPhone is they can be used by anyone. “Gone are the days when a specialised device was something that nobody wanted, but which they had to use.”

This highlights a particular problem with disability aids in general – they aren’t always attractive. And, because of this, the very people who might benefit are reluctant to use them. “If you can create a business case driven not only by visual improvement, but also by making a product desirable and building a case for its mainstream use – so there’s money in it – there’s half a chance it will be sustainable,” Tyler contended.

Entertainment systems are one of the target areas for the RNIB and its work is paying off. “Samsung’s TVs are fully accessible,” Tyler said. “You can talk to them and they can talk to you. The menus, the APIs; it’s all there. But the real challenge is domestic appliances; the things that used to be in smart devices – such as touchscreens and soft buttons – are now found in things like washing machines.

“Manufacturers either need to build in assistive technology or get on the connectivity agenda so their products can talk to other devices.”

Here, an issue common across many industries appears – standards. “For this to happen,” Tyler continued, “we need standards that take account of the need to integrate systems that bring a step change in accessibility. Although some manufacturers say they’re involved in connectivity, they want it to be based on their technology. We need to push standards so we can get information out of devices.”

Another challenge is to get manufacturers to understand the market. “Some will say ‘disability is niche’, but our response is ‘how niche?’,” Tyler said. “What they often mean is that they think it’s too complicated.

“As you get older, you lose the ability to do everyday things. But, while we’re less able to do things, we have more disposable income. Developing products those people want still seems to be a challenge.”

Around and about
Travel is an important issue to the visually impaired, according to RNIB. “Getting around is a problem and more than 50% of those with sight issues are socially isolated,” Tyler pointed out. “And older people aren’t always tech savvy.”

He noted that blindness cuts across the population and highlighted the fact that the visually impaired want to do what everyone else does, including meeting people where and when they want to, rather than having to use specialised transportation.

“We need to be able to take advantage of smart devices,” he said. “For example, electronic timetables could tell us where a bus is going and technology on the bus could tell you when you should get off.”
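The ‘tell you when you should get off’ idea Tyler describes amounts to a proximity alert: compare the vehicle’s GPS fix against the passenger’s stop and trigger a notification within some radius. A minimal sketch of that logic, using the standard haversine distance; the 150m alert radius is an illustrative assumption, not anything RNIB specifies:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def should_alert(bus_fix, stop, radius_m=150.0):
    """True once the bus is within radius_m of the passenger's stop."""
    return haversine_m(bus_fix[0], bus_fix[1], stop[0], stop[1]) <= radius_m
```

In practice the alert would feed a speech or haptic notification rather than a boolean, and the radius would need tuning against GPS accuracy – the limitation Tyler raises below.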

Using transport is one element of the problem; navigating your way around is another. “We now have GPS, which has given a boost, but GPS is still focused on car drivers – even in pedestrian mode, the device will still talk about roundabouts.

“The system could tell you that you have ‘arrived’, but what it might not take account of is that, while you’re close to where you want to be, there might be three lanes of traffic in the way. Systems don’t always know that and their accuracy is limited.”

Some developers have created systems which take clues from the environment. “You can boost what the system is doing through the use of visual cues – and that’s interesting,” Tyler said. “OrCam, for example, has built a system which can access written material, extract print from the environment and which can be activated by a gesture.”

OrCam’s MyEye product enables the user to read text in most formats – even on a computer or smartphone screen. It can also recognise previously stored faces.

“If you could link that with a GPS system, for example, you could merge data so that, when you get off the bus, you know you’re almost there.”

Understanding the environment
Interpreting the visual environment is a challenge which researchers are tackling in various ways. One area where there is a lot of activity is in automotive, where companies are developing the technology needed to enable autonomous vehicles.

“A lot of technology finding its way into the mainstream is being developed by the automotive sector,” Tyler said. “Some of that technology, and how the environment is interpreted, is useful for blind people.”

He pointed to Google’s need to develop the technology necessary to deliver driverless vehicles. “This gives you a sense of the way in which technology is going. But the big challenge is the ability to interpret what you’re seeing.

“The problem is that autonomous cars don’t always know what they’re looking at, but still have to make a decision. For example, consider a woman chasing a chicken – cars have never seen that. While a car might recognise the human, it may not recognise the chicken. How does it interpret that data and make sense of it in a way that’s useful?”

Tyler – who is totally blind – believes no ‘silver bullet’ technology will appear in the foreseeable future. Nevertheless, progress is being made, including haptics technology. “If you told me five years ago that my ‘go to’ device would be a tablet, I’d have said you were nuts – but that’s what it is.”

He also pointed to devices such as Amazon Echo. “It’s fully accessible from my point of view; you can put things in your basket, read newspapers and so on. The point is that if you design things properly, you’ll attract everyone.

“Yet, while technology is here,” he concluded, “the biggest challenge remains the interface – how do you make things make sense?”

Seeing in colour

Pursuing a different approach to assisting the visually impaired is Colorophone, based at the Norwegian University of Science and Technology in Trondheim.

According to Dominik Osinski, research has shown the visual cortex can be activated by sound – a process called sensory substitution. “The first device which translated light into sound was developed by Noiszewski in 1898,” he claimed. “Yet, after almost 120 years, sensory substitution devices are not widely used by the visually impaired. One of the reasons is the difficulty of encoding visual information via other senses.”

One of the pioneers of sensory substitution devices was Dr Paul Bach-y-Rita, whose work in the 1960s involved transmitting information from a camera to blind individuals via plates vibrating against their backs. “It was something like a dentist’s chair,” Osinski noted, “but, after a while, users could recognise faces.”

Colorophone is looking to make a system which is easier to understand, cheaper and more useful. It comprises glasses, a camera and an ultrasonic distance sensor. Users scan the scene in front of them by moving their head. Information from the camera and distance sensor is processed, converted into frequencies corresponding to a particular colour and delivered via bone conductive headphones so as not to block out surrounding sounds.
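The article doesn’t spell out Colorophone’s actual colour-to-frequency mapping, but the core idea – turning a pixel’s colour into a tone – can be sketched simply. In this illustrative version the dominant RGB channel picks the pitch and its intensity sets the loudness; the base frequencies are placeholder assumptions:

```python
def dominant_channel(r, g, b):
    """Pick the strongest of the R, G, B channels (values 0-255)."""
    return max((("red", r), ("green", g), ("blue", b)), key=lambda c: c[1])

def colour_to_tone(r, g, b):
    """Map a pixel colour to a (frequency_hz, amplitude) pair.

    The dominant channel sets the pitch; its intensity sets the
    loudness (0.0-1.0). Frequencies here are illustrative only.
    """
    base_hz = {"red": 440.0, "green": 550.0, "blue": 660.0}
    name, value = dominant_channel(r, g, b)
    return base_hz[name], value / 255.0
```

A real system would interpolate smoothly across hues and mix in the distance-sensor reading, but even this crude mapping shows why users can learn to separate colours quickly: each maps to a distinct, stable pitch.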

“After only two minutes, 98% of users could distinguish between 16 colours,” Osinski noted. “And, in one of our tests, a user could not only discern an orange and its size, but also the fact that it had a blue label. It’s even possible to identify stairs as parallel lines of different colours.”

Specs get smart

Oxford University researcher Dr Stephen Hicks started working on retinal implant technology, but realised retinal prosthetics were highly specialised and that current technology would only support a 40-pixel image.

Looking to build a better solution, he created OxSight, which is developing SmartSpecs. The augmented reality display system is intended to allow those with severe visual impairment to make sense of their surroundings by translating them into shapes and shades from which physical objects can be discerned.

“I thought that if you have a small number of pixels and did some processing, you could find important objects,” he said. “You could then estimate what the object is and see it moving. While it isn’t photographic, it’s useful.”

One of the challenges was creating a product which provided enough visual information, but which was acceptable enough to be worn in public. “We want to make something people can wear, rather than conduct a research project,” he said.

The system captures a scene using a camera, with data processed at high frame rates on a mobile processor. “If an object is more than 4m away,” Dr Hicks said, “then it’s blanked out, while the closer an object is, the brighter it appears.”
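The distance rule Dr Hicks describes – blank beyond 4m, brighter as objects get closer – is easy to express as a mapping from depth to display brightness. The linear falloff below is an assumption; the article gives only the cutoff and the direction of the relationship:

```python
def depth_to_brightness(distance_m, cutoff_m=4.0):
    """Map object distance to display brightness (0.0-1.0).

    Objects at or beyond the cutoff are blanked (0.0); nearer
    objects get progressively brighter, peaking at zero distance.
    """
    if distance_m >= cutoff_m:
        return 0.0
    return 1.0 - distance_m / cutoff_m
```

Applied per pixel of a depth map, this produces the sort of high-contrast, nearby-object-first rendering the glasses aim for.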

Data is projected onto a transparent display via an LCD projector in the side arms of the glasses. “While we need better battery technology, everything else is small enough and light enough,” he added.

SmartSpecs are aimed at people who retain some central vision, such as those with retinitis pigmentosa, diabetic retinopathy and some forms of glaucoma. “And if the user is totally blind,” Dr Hicks concluded, “we can turn the image into sound. While it’s not perfect yet, we’re getting closer.”

Graham Pitcher


This material is protected by MA Business copyright. See Terms and Conditions. One-off usage is permitted, but bulk copying is not. For multiple copies, contact the sales team.

