Brain-inspired computing could tackle big problems in a small way

Researchers at Pennsylvania State University have developed a 2D device that provides more than just yes-or-no answers and could be more brainlike than current computing architectures.
The solution is to create brain-inspired, analog, statistical neural networks that are not simply on or off but provide a range of probabilistic responses, which are then compared with the learned database in the machine.
To do this, the researchers developed a transistor made of 2D materials, molybdenum disulfide and black phosphorus. These devices are more energy efficient and produce less heat, which makes them ideal for scaling up systems.
According to the researchers, statistical neural network computing has applications in medicine, because diagnostic decisions are not always a clear-cut yes or no. They also recognize that, to have the greatest impact, medical diagnostic devices must be small, portable, and consume minimal energy.

Brain-inspired computing could tackle big problems in a small way

While computers have become smaller and more powerful and supercomputers and parallel computing have become the standard, we are about to hit a wall in energy and miniaturization. Now, Penn State researchers have designed a 2D device that can provide more than yes-or-no answers and could be more brainlike than current computing architectures.

«Complexity scaling is also in decline owing to the non-scalability of traditional von Neumann computing architecture and the impending ‹Dark Silicon› era that presents a severe threat to multi-core processor technology,» the researchers note in today’s (Sept 13) online issue of Nature Communications.

The Dark Silicon era is already upon us to some extent and refers to the inability to power up all or most of the devices on a computer chip at once, because doing so would generate too much heat on a single piece of silicon. Von Neumann architecture is the standard structure of most modern computers and relies on a digital approach of «yes» or «no» answers, in which program instructions and data are stored in the same memory and share the same communications channel.

«Because of this, data operations and instruction acquisition cannot be done at the same time,» said Saptarshi Das, assistant professor of engineering science and mechanics.

«For complex decision-making using neural networks, you might need a cluster of supercomputers trying to use parallel processors at the same time — a million laptops in parallel — that would take up a football field. Portable healthcare devices, for example, can’t work that way.»

The solution, according to Das, is to create brain-inspired, analog, statistical neural networks that do not rely on devices that are simply on or off, but provide a range of probabilistic responses that are then compared with the learned database in the machine. To do this, the researchers developed a Gaussian field-effect transistor that is made of 2D materials — molybdenum disulfide and black phosphorus. These devices are more energy efficient and produce less heat, which makes them ideal for scaling up systems.
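To make the contrast concrete, here is a minimal sketch, in Python, of a hard yes-or-no decision versus the kind of graded, Gaussian-shaped response described above. The threshold, mean, and width values are arbitrary illustrations, not parameters of the Penn State device.

```python
import numpy as np

# Hard, digital-style decision: a single threshold yields only yes (1) or no (0).
def binary_response(x, threshold=0.0):
    return 1.0 if x > threshold else 0.0

# Analog, Gaussian-style response: the same input yields a graded value between
# 0 and 1 that downstream layers can treat as a probability-like score.
# mu and sigma are illustrative values, not measured device parameters.
def gaussian_response(x, mu=0.0, sigma=1.0):
    return float(np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)))

for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"x = {x:+.1f}   binary = {binary_response(x):.0f}   gaussian = {gaussian_response(x):.3f}")
```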

«The human brain operates seamlessly on 20 watts of power,» said Das. «It is more energy efficient, containing 100 billion neurons, and it doesn’t use von Neumann architecture.»

The researchers note that it isn't just energy and heat that have become problems; it is also becoming difficult to fit more devices into ever-smaller spaces.

«Size scaling has stopped,» said Das. «We can only fit approximately 1 billion transistors on a chip. We need more complexity like the brain.»

The idea of probabilistic neural networks has been around since the 1980s, but it needed specific devices for implementation.

«Similar to the working of a human brain, key features are extracted from a set of training samples to help the neural network learn,» said Amritanand Sebastian, graduate student in engineering science and mechanics.

The researchers tested their neural network on human electroencephalograms (EEGs), graphical representations of brain waves. After feeding the network many example EEGs, it could take a new EEG signal, analyze it, and determine whether the subject was sleeping.
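The release does not spell out the network itself, but the behavior it describes, comparing graded responses against a database of learned examples, matches the classic probabilistic neural network introduced by Specht around 1990. The sketch below is a generic PNN of that kind, written in Python; the two-dimensional «EEG features» and the awake/asleep labels are invented for illustration and are not data from the study.

```python
import numpy as np

# Minimal probabilistic neural network (PNN): every training sample contributes
# a Gaussian kernel to its class's density estimate, and a new sample is
# assigned to the class with the highest (normalized) estimated density.

def pnn_class_scores(x, train_X, train_y, sigma=0.5):
    """Per-class average Gaussian-kernel densities for a single sample x."""
    scores = {}
    for label in np.unique(train_y):
        samples = train_X[train_y == label]
        sq_dist = np.sum((samples - x) ** 2, axis=1)     # squared distances to x
        kernels = np.exp(-sq_dist / (2.0 * sigma ** 2))  # Gaussian kernels
        scores[label] = kernels.mean()                   # Parzen-style estimate
    return scores

def pnn_classify(x, train_X, train_y, sigma=0.5):
    scores = pnn_class_scores(x, train_X, train_y, sigma)
    total = sum(scores.values())
    probs = {label: s / total for label, s in scores.items()}
    return max(probs, key=probs.get), probs

# Toy training set of made-up 2D "EEG features": label 1 = asleep, 0 = awake.
rng = np.random.default_rng(0)
asleep = rng.normal([2.0, 0.5], 0.3, size=(20, 2))
awake = rng.normal([0.5, 2.0], 0.3, size=(20, 2))
X = np.vstack([asleep, awake])
y = np.array([1] * 20 + [0] * 20)

label, probs = pnn_classify(np.array([1.8, 0.7]), X, y)
print("predicted:", "asleep" if label == 1 else "awake", probs)
```

Unlike a conventionally trained artificial neural network, this kind of classifier needs no iterative weight training: adding a labeled example to the database is the learning step, which is consistent with Das's remark below about the shorter training requirement.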

«We don’t need as extensive a training period or base of information for a probabilistic neural network as we need for an artificial neural network,» said Das.

The researchers see statistical neural network computing having applications in medicine, because diagnostic decisions are not always 100% yes or no. They also realize that for the best impact, medical diagnostic devices need to be small, portable and use minimal energy.

Das and colleagues call their device a Gaussian synapse. It is based on a two-transistor setup in which the molybdenum disulfide transistor conducts electrons, while the black phosphorus transistor conducts through missing electrons, or holes. The device is essentially two variable resistors in series, and the combination produces a response curve with two tails, which matches a Gaussian function.
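A rough way to see why two complementary channels in series trace a Gaussian-like curve: as the gate voltage sweeps, the n-type (MoS2-like) branch turns on while the p-type (black-phosphorus-like) branch turns off, so the series current is only large in the window where both conduct. The Python sketch below models each branch as a variable resistor with made-up on/off resistances and threshold voltages; it reproduces the bell shape only qualitatively and is not the device physics reported in the paper.

```python
import numpy as np

# Toy model of the two-transistor series stack: each channel is a variable
# resistor that interpolates smoothly between an off- and an on-resistance
# as the gate voltage sweeps. All values are illustrative, not measured.

def channel_resistance(vg, vth, slope, sign, r_on=1e4, r_off=1e9):
    """Gate-controlled resistance; sign=+1 for n-type, -1 for p-type."""
    gate = 1.0 / (1.0 + np.exp(-sign * slope * (vg - vth)))  # 0 = off, 1 = on
    return r_off * (1.0 - gate) + r_on * gate

vg = np.linspace(-3.0, 3.0, 13)
r_n = channel_resistance(vg, vth=-1.0, slope=3.0, sign=+1)  # MoS2-like, n-type
r_p = channel_resistance(vg, vth=+1.0, slope=3.0, sign=-1)  # BP-like, p-type

vdd = 1.0
current = vdd / (r_n + r_p)  # two variable resistors in series

for v, i in zip(vg, current):
    print(f"Vg = {v:+.1f} V   I = {i:.3e} A")
```

The printed current is small at both ends of the sweep, where one of the two channels is off, and peaks in the middle where both partially conduct, giving the two-tailed, bell-shaped response the researchers describe.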

Others working on this project were Andrew Pannone, undergraduate in engineering science and mechanics; and Shiva Subbulakshmi, student in electrical engineering at Amrita Vishwa Vidyapeetham, India, and a summer intern in the Das laboratory.

The Air Force Office of Scientific Research supported this work.

Image: Courtesy of Penn State
Source: Pennsylvania State University; A’ndrea Elyse Messer, September 13, 2019
https://news.psu.edu/story/587777/2019/09/13/research/brain-inspired-computing-could-tackle-big-problems-small-way