The impact of cells on electronics design

Carver Mead – a true electronics visionary – once created a field of research that he called neuromorphic electronics. The name reflected the fact that the whole approach was inspired by how brain cells, or neurons, work.

Now, one of his former students is trying to do something similar. But instead of brain cells, his strategy is to mimic cells in general – and the results could be revolutionary.

Rahul Sarpeshkar, Associate Professor of Electrical Engineering and Computer Science at MIT's Research Laboratory of Electronics (RLE), calls his approach cytomorphic electronics – 'cyto' pertains to cells as 'neuro' does to neurons.
"I am stepping back and looking at how the DNA protein networks function inside a cell – whether neuron, liver or any other kind of cell. In many ways, this is more fundamental – the basic units in biology are DNA-protein and protein-protein computation. Even neurons performing sophisticated learning operations work this way."

The potential is huge, not only for creating extraordinarily low powered, computationally powerful devices, but also for exploiting the combination of analogue and digital processing that biology seems to use.

"Biological systems are incredibly energy efficient, but use unreliable and noisy components to perform reliable, precise and complex computations," Sarpeshkar says. "For example, the brain computes with a staggering energy efficiency of approximately 0.2fJ per floating point operation. The average human cell is even more energy efficient, using only about 0.0001fJ per energy consuming biomolecular operation."

This means a cell in the human body is more than 1000 times more energy efficient than a nanoscale digital transistor. In one second, a cell performs about 10 million energy consuming chemical reactions, which altogether require about 1pW (10^-12W). Similarly impressive numbers for the energy efficiency of the eye, ear and other organs of the body are provided in a book written by Sarpeshkar*.
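Those figures can be sanity-checked with a little arithmetic. In the sketch below, the reaction rate and per-operation energy are the article's numbers; the 1fJ per transistor switching event is an assumed round figure for a nanoscale digital device.

```python
# Back-of-envelope check of the cell's power budget from the figures above.
reactions_per_second = 10e6        # ~10 million energy consuming reactions/s
energy_per_reaction = 0.0001e-15   # 0.0001 fJ = 1e-19 J per biomolecular operation

power_watts = reactions_per_second * energy_per_reaction
print(power_watts)                 # ~1e-12 W, i.e. about 1 pW

# Compare with a nanoscale digital transistor at an assumed 1 fJ per switch:
transistor_energy = 1e-15          # J, assumed illustrative value
print(transistor_energy / energy_per_reaction)  # well over 1000x
```

The product comes out at about a picowatt, matching the figure quoted, and the per-operation comparison lands comfortably above the 1000x efficiency gap the article cites.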

"The creation of circuits inspired by biology can lead to novel architectures and systems that have applications outside of biology," he says. "Such circuits can also be applied immediately to repairing biological systems, for example in neural prosthetics. The synergy between biological and electronic circuits has already led to ultra low power and noise robust systems for the deaf, blind and paralysed and to advanced ear inspired radio receivers."

Fundamental to how nature has created such efficient systems is the exploitation of digital and analogue electronics – a key theme of cytomorphic electronics. Sarpeshkar says one aim is to lay a rigorous foundation for an analogue circuits approach to 'systems and synthetic' biology, fields which will be highly important in the future of medicine and biological engineering.

"A comparison of the pros and cons of analogue versus digital computation reveals that analogue computation exploits freely available physical functions in the underlying technology that are not necessarily logical or linear to compute, making it more energy efficient than digital computation at low precision and vice versa.

"Biology exploits this insight to compute in a novel and highly energy efficient 'collective analogue' or hybrid fashion that is not purely digital or purely analogue, but an intimate combination of both. Engineering systems can take inspiration from biology to compute in this fashion and can improve energy efficiency by delaying digitisation after an optimal amount of analogue preprocessing."

In fact, any transistor exhibits analogue states. As it switches between its nonconductive and conductive states, the 0 and 1 of its digital aspect, a transistor passes through every state in between – slightly conductive; moderately conductive; fairly conductive – just as a vehicle accelerating from 0 to 60 has to go through every speed in between. Conventional systems are designed to perform binary logic, so those transitional states are ignored. But it is precisely these transitional states that Sarpeshkar and his colleagues are trying to exploit.
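The continuum of conductance states shows up clearly in the standard weak-inversion (subthreshold) model, where drain current grows exponentially with gate voltage. The sketch below uses assumed, illustrative values for the off-current scale I0 and slope factor n; only the thermal voltage is a physical constant.

```python
import math

# Illustrative subthreshold MOSFET model: current rises smoothly and
# exponentially with gate voltage, visiting every conductance in between.
I0 = 1e-15      # A, off-current scale (assumed)
n = 1.5         # subthreshold slope factor (assumed)
VT = 0.0258     # V, thermal voltage kT/q at room temperature

def subthreshold_current(vgs):
    """Drain current in weak inversion: exponential in gate-source voltage."""
    return I0 * math.exp(vgs / (n * VT))

for vgs in [0.0, 0.1, 0.2, 0.3, 0.4]:
    print(f"Vgs={vgs:.1f} V  Id={subthreshold_current(vgs):.3e} A")
```

Each 100mV step multiplies the current by the same factor, giving a smooth, graded range of 'slightly to fairly conductive' states rather than a binary jump.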

"Let's say the cell is a pancreatic cell making insulin," Sarpeshkar says. "When the glucose goes up, it wants to make more insulin. But it's not 'bang-bang'. If the glucose goes up more, it'll make more insulin. If the glucose goes down a little, it's going to make less insulin. It's graded, not a logic gate. Certain other computations in the cell actually cause variables to switch in a digital fashion via positive feedback as in a latch, but even these are probabilistic and better modelled by analogue circuits with noise."
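The graded response Sarpeshkar describes can be contrasted with a 'bang-bang' logic gate using a standard Hill-type model. This is a textbook sketch, not a model from his lab; the half-saturation constant, Hill coefficient and threshold are assumed values.

```python
# Graded (analogue) response versus an all-or-nothing (digital) caricature.

def graded_insulin(glucose, k=5.0, n_hill=2.0):
    """Hill-type response: insulin output rises smoothly with glucose."""
    return glucose**n_hill / (k**n_hill + glucose**n_hill)

def logic_gate_insulin(glucose, threshold=5.0):
    """Binary caricature: nothing below the threshold, full output above."""
    return 1.0 if glucose > threshold else 0.0

for g in [2, 4, 5, 6, 8]:
    print(g, round(graded_insulin(g), 3), logic_gate_insulin(g))
```

The graded model makes a little more insulin for a little more glucose at every level, while the gate version ignores everything except the threshold crossing, which is exactly the distinction the quote draws.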

Treated as an analogue device, a transistor has in practice a large range of possible conductivities, so it could model an equally large range of chemical concentrations. But when treated as a binary switch, it has only two possible states, so modelling a large range of concentrations demands many transistors. For large circuits that model sequences of reactions within cells, binary logic rapidly becomes unmanageably complex. But analogue circuits don't, since they can exploit the same kinds of physical phenomena that make cellular machinery so efficient in the first place.

"If you think about it, what is electronics?" Sarpeshkar says. "It's the motion of electrons. What is chemistry? Chemistry is about electrons moving from one atom or molecule to another atom or molecule. They must be deeply connected: they're both about electrons."

In fact, Sarpeshkar has shown the fundamental thermodynamic equations that govern chemical reactions are the same ones that govern electron flow in transistors in a particular regime known as subthreshold operation. Thus, there is a deep mathematical similarity between chemistry and electronics which his group is exploiting in the design of cytomorphic circuits.
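The shared mathematics is the Boltzmann exponential: Arrhenius-form reaction rates scale as exp(-E/kT) with barrier energy, and subthreshold currents scale as exp(V/(kT/q)) with gate voltage. The sketch below (with an ideal slope factor and assumed prefactors) shows that shifting either quantity by one thermal unit multiplies the output by the same factor e.

```python
import math

kT = 0.0258  # thermal energy at room temperature, expressed in eV (= kT/q in volts)

# Chemical kinetics: Arrhenius form, rate ~ exp(-barrier / kT).
def reaction_rate(barrier_ev, prefactor=1.0):
    return prefactor * math.exp(-barrier_ev / kT)

# Subthreshold transistor (ideal slope assumed): current ~ exp(Vgs / (kT/q)).
def subthreshold_current(vgs, i0=1e-15):
    return i0 * math.exp(vgs / kT)

# Lowering a chemical barrier by kT, or raising the gate by kT/q,
# scales the output by the same Boltzmann factor e.
print(reaction_rate(0.4) / reaction_rate(0.4 + kT))
print(subthreshold_current(0.1 + kT) / subthreshold_current(0.1))
```

Both ratios come out to e (about 2.718), which is the term-by-term correspondence that lets one physical system stand in for the other.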

Sarpeshkar's lab has built a steady stream of practical engineering projects inspired by biology. One project mimics the cochlea, or inner ear, to create an energy efficient, extremely fast rf spectrum analyser on a chip – the 'rf cochlea'.

"The rf cochlea chip exploits the fact that the ear's spectrum analysis architecture is the fastest and most hardware efficient known to man – faster than a digital fast Fourier Transform or an analogue filter bank," Sarpeshkar says. "It maps efficiently the partial differential equations that describe fluid membrane hair-cell interaction in the biological cochlea at kHz audio frequencies to inductor-capacitor-amplifier interaction in the rf cochlea at GHz frequencies."

He says the resulting broadband chip costs 20 times less than a traditional analogue filter bank and, compared with a system that digitises its rf input directly to perform spectrum analysis, consumes 100 times less power. Another project took inspiration from the ear to create a cochlear implant processor for the deaf, which Sarpeshkar developed in 2005. It was tested on a deaf person, who could understand speech with it on her first attempt.

The systems Sarpeshkar and his team are developing operate at close to the optimum energy efficiency allowed by the laws of physics. For example, one circuit in his cochlear implant processor extracts the energy in an audio signal with an efficiency set by the laws of thermodynamics – laws that are invariant with time, technology independent and far more limiting than any artificial trend such as Moore's Law. Even with a built in microphone sensor, Sarpeshkar's cochlear implant processor consumes 251µW, enabling it to operate for 30 years when powered by a 100mAh battery with 1000 wireless recharges.
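A rough calculation supports the 30 year figure. The article gives only the power draw, battery capacity and recharge count; the nominal cell voltage below is an assumption.

```python
# Rough lifetime check for the 251 uW cochlear implant processor figure.
capacity_ah = 0.1      # 100 mAh
recharges = 1000
voltage = 1.2          # V, assumed nominal cell voltage (not given in the article)
power = 251e-6         # W

total_energy_j = capacity_ah * recharges * voltage * 3600
lifetime_years = total_energy_j / power / (3600 * 24 * 365)
print(round(lifetime_years, 1))
```

At these assumptions the energy budget supports operation comfortably beyond 30 years, leaving headroom for battery ageing and conversion losses.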

Sarpeshkar says similar systems inspired by biology can provide the same benefits for nerve stimulation in many areas.

"Such work can be combined with state of the art micropower neural amplifiers," Sarpeshkar says. "The integration of several such ultra low power and bio-inspired innovations can enable highly miniature neural prosthetics for the deaf, blind, paralysed and other conditions to become a reality in the future."

The deep similarity that Sarpeshkar has shown to exist between the performance of circuits in biology and those in electronics could lead to dramatic increases in performance of biology inspired systems, thanks to the speed of electronics.

"The striking mathematical similarities between chemical reaction dynamics and electronic current flow in the subthreshold regime of transistor operation … imply that one can mimic and model large scale chemical processing systems in biological and artificial networks very efficiently on an electronic chip – at time scales that could potentially be a million times faster."

This holds out the possibility of simulating cells, organs and tissues with ultra fast, highly parallel analogue and hybrid analogue-digital circuits on large scale 'supercomputing' electronic chips. Such simulations are extremely computationally intensive, especially when the effects of noise, nonlinearity, network feedback and cell to cell variability are included.

"Stochastics (non deterministic, random events) and cell to cell variability are highly important factors for predicting a cell's response to drug treatment, like the response of tumour cells to chemotherapy treatments," Sarpeshkar says. "In turn, analogue circuit design techniques can be mapped to design and create synthetic biology circuits that have been shown to be in accord with biological data. They can impact the treatment of gene therapies in diseases like cancer and diabetes, or increase the understanding of how such circuits malfunction, thus leading to better drug therapies."

One project conducted by Sarpeshkar's team illustrates how potentially efficient cytomorphic techniques could be. The aim was to model the effects of increasing the concentrations of two different proteins within a cell gradually. Both proteins prompt the cell to start producing other proteins, but they do it in different ways: one binds to a strand of DNA and causes the cell to increase production of a particular protein; the other deactivates a protein that is suppressing protein production.

Sarpeshkar and his colleagues were able to model both processes using circuits featuring only eight transistors. Moreover, the circuits turned out to be mirror images of each other, representing the difference between activating protein production directly and deactivating a deactivator. In effect, a handful of transistors implemented a complex set of equations that, done digitally, would take several lines of code and millions of transistors.
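The mirror-image behaviour has a simple mathematical counterpart in the standard steady-state forms of activation and repression. The functions below are a textbook sketch, not the lab's actual circuit equations; the binding constant K and unit output scale are assumed.

```python
# The two gene-regulation behaviours described above, in standard
# steady-state (Michaelis-Menten-like) form.

def activation(x, K=1.0):
    """Production rate when a protein directly activates transcription."""
    return x / (K + x)

def repression(x, K=1.0):
    """Production rate when a protein represses transcription."""
    return K / (K + x)

# Mirror symmetry: the two curves sum to 1 at every concentration,
# echoing the mirror-image eight-transistor circuits in the text.
for x in [0.1, 1.0, 10.0]:
    print(x, round(activation(x), 3), round(repression(x), 3))
```

One curve rises where the other falls, and deactivating a repressor recovers the activating shape – the same symmetry the transistor circuits exhibited.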

Most recently, his group has created a fuel cell that runs on glucose: the same sugar that powers human cells. This idea goes back to the 1970s, when scientists considered powering pacemakers with a glucose fuel cell, but lithium batteries were preferred.

What is new is that the fuel cell is fabricated from silicon, using standard semiconductor techniques. It features a platinum catalyst that strips electrons from glucose, mimicking the activity of the cellular enzymes that break down glucose to generate ATP, the energy source for all cells. So far, the fuel cell can generate hundreds of microwatts – enough to power an ultra low power implant.

* Ultra Low Power Bioelectronics: Fundamentals, Biomedical Applications and Bio-inspired Systems.
ISBN-10: 0521857279
Cambridge University Press

David Boothroyd


This material is protected by MA Business copyright. See Terms and Conditions. One-off usage is permitted but bulk copying is not. For multiple copies contact the sales team.
