Right now, you're carrying around the most powerful computer in existence – the human brain. At tasks like learning and pattern recognition, this naturally efficient machine still outperforms anything humans have ever built, so it's not surprising that scientists are trying to reverse-engineer it. Rather than shuffling binary bits of information around, neuromorphic computers are built on networks of artificial neurons, and now an MIT team has developed a more lifelike synapse to better connect those neurons.
For simplicity's sake, computers process and store information in a binary manner – everything can be broken down into a series of ones and zeroes. This system has served us well for the better part of a century, but having access to a whole new world of analog "grey areas" in between could really give computing power a shot in the arm.
The brain is a perfect model for those kinds of systems. While we've barely scratched the surface of exactly how it works, what we do know is that the brain deals with both analog and digital signals, processes and stores information in the same regions, and performs many operations in parallel. This is thanks to around 100 billion neurons dynamically communicating with each other via some 100 trillion synapses.
While neural networks mimic human thinking on the software side, neuromorphic chips are much more brain-like in their hardware design. Their architecture is made up of artificial neurons that process data and communicate with each other through artificial synapses. IBM's TrueNorth chip is one of the most powerful neuromorphic systems yet built, and Intel recently unveiled a more modest, research-focused chip it calls Loihi.
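To make that architecture concrete, here's a minimal sketch of a leaky integrate-and-fire neuron – the kind of spiking model neuromorphic chips are typically built around – receiving input through weighted synapses. The threshold, leak rate and weights below are illustrative assumptions, not parameters from TrueNorth or Loihi.

```python
import numpy as np

class LIFNeuron:
    """Leaky integrate-and-fire neuron, a common neuromorphic abstraction."""
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0        # membrane potential
        self.threshold = threshold  # fire a spike when potential crosses this
        self.leak = leak            # fraction of potential retained each step

    def step(self, weighted_input):
        # Integrate the incoming current while some stored charge leaks away
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # spike
        return 0

# Two input neurons drive one output neuron through weighted "synapses"
weights = np.array([0.6, 0.3])  # synaptic strengths (illustrative)
out = LIFNeuron()
for spikes in ([1, 0], [1, 1], [0, 1], [1, 1]):
    fired = out.step(np.dot(weights, spikes))
    print("input spikes:", spikes, "-> output spike:", fired)
```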

In conventional neuromorphic chips, synapses are made of amorphous materials wedged in between the conductive layers of neighboring neurons. Ions flow through this material when a voltage is applied, transferring data between neurons. The problem is that these synapses can be unpredictable, with defects in the switching medium sending ions wandering off in different directions.
"Once you apply some voltage to represent some data with your artificial neuron, you have to erase and be able to write it again in the exact same way," says Jeehwan Kim, lead researcher on the project. "But in an amorphous solid, when you write again, the ions go in different directions because there are lots of defects. This stream is changing, and it's hard to control. That's the biggest problem – nonuniformity of the artificial synapse."
To combat the problem, the MIT researchers designed a new medium for the artificial synapse. They started with a wafer of single-crystalline silicon, then grew a layer of silicon germanium over the top. Both materials have a lattice-like atomic pattern, but silicon germanium's lattice is slightly larger, so where the two overlap the mismatch forms funnel-shaped channels that keep the ions on the straight and narrow.
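How much larger is "slightly larger"? A back-of-the-envelope estimate using Vegard's law – a standard linear interpolation between the silicon and germanium lattice constants – gives a feel for the mismatch. The germanium fractions below are assumptions for illustration; the alloy composition the team used isn't stated here.

```python
# Lattice mismatch between Si and a SiGe alloy, estimated via Vegard's law.
# Lattice constants are standard textbook values; the Ge fraction x is an
# illustrative assumption, not a figure from the MIT paper.
A_SI = 5.431  # silicon lattice constant, angstroms
A_GE = 5.658  # germanium lattice constant, angstroms

def sige_lattice(x):
    """Vegard's law: linear interpolation for a Si(1-x)Ge(x) alloy."""
    return (1 - x) * A_SI + x * A_GE

for x in (0.2, 0.5, 1.0):
    mismatch = (sige_lattice(x) - A_SI) / A_SI
    print(f"Ge fraction {x:.1f}: lattice {sige_lattice(x):.3f} Å, mismatch {mismatch:.2%}")
```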

The team built a neuromorphic chip using this technique, with silicon germanium synapses measuring about 25 nanometers wide. They then tested the synapses by applying a voltage to them, and found a variation of only about 4 percent in the current that passed through them. A single synapse, tested over 700 cycles, also kept a consistent current, varying by just 1 percent.
"This is the most uniform device we could achieve, which is the key to demonstrating artificial neural networks," says Kim.
The scientists then put the design through its paces in a simulated test. They built an artificial neural network that behaved as though it were made up of three layers of neurons connected by two layers of the new synapses, then fed it tens of thousands of handwriting samples for training. The simulated system went on to correctly recognize 95 percent of the new samples it was shown. That's not far off the 97 percent accuracy that more established systems can achieve.
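For a rough sense of what such a network looks like in conventional software, here's a minimal three-layer classifier: input, hidden and output neurons joined by two weight matrices, the software counterparts of two layers of synapses. This is a sketch, not the MIT team's hardware simulation; it uses scikit-learn's small built-in digits dataset as a stand-in for the full handwriting set, so its accuracy shouldn't be read against the figures above.

```python
# Three layers of neurons (input, hidden, output) joined by two layers of
# weights – the software analogue of two layers of synapses.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)  # 8x8 handwritten digit images
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One hidden layer means three neuron layers and two weight matrices in total
net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
net.fit(X_train, y_train)
print(f"recognition accuracy on held-out samples: {net.score(X_test, y_test):.1%}")
```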
Next, the team plans to develop a physical neuromorphic chip that can handle this task in the real world.
"Ultimately we want a chip as big as a fingernail to replace one big supercomputer," says Kim. "This opens a stepping stone to produce real artificial hardware."

The research was published in the journal Nature Materials.
Source: MIT