Physical systems perform machine-learning computations

Cornell researchers have successfully trained (from left to right) a computer speaker, a simple electronic circuit and a laser to perform machine-learning computations. Credit: Logan G. Wright et al / Cornell University

You may not be able to teach an old dog new tricks, but Cornell researchers have found a way to train physical systems, ranging from computer speakers and lasers to simple electronic circuits, to perform machine-learning computations, such as identifying handwritten numbers and spoken vowel sounds.

The experiment is no mere stunt or parlor trick. By turning these physical systems into the same kind of neural networks that drive services like Google Translate and online searches, the researchers have demonstrated an early but viable alternative to conventional electronic processors—one with the potential to be orders of magnitude faster and more energy efficient than the power-gobbling chips in data centers and server farms that support many artificial-intelligence applications.

“Many different physical systems have enough complexity in them that they can perform a large range of computations,” said Peter McMahon, assistant professor of applied and engineering physics in the College of Engineering, who led the project. “The systems we performed our demonstrations with look nothing like each other, and they seemingly have nothing to do with handwritten-digit recognition or vowel classification, and yet you can train them to do it.”

The team’s paper, “Deep Physical Neural Networks Trained with Backpropagation,” was published Jan. 26 in Nature. The paper’s co-lead authors are Logan Wright and Tatsuhiro Onodera, NTT Research postdoctoral fellows in McMahon’s lab.

The central research theme of McMahon’s group lies at the intersection of physics and computation: how to harness physical systems to perform computation faster or more efficiently than conventional computers.

For this project, they focused on one type of computation: machine learning. The goal was to find out how to use different physical systems to perform machine learning in a generic way that could be applied to any system. The researchers developed a training procedure that enabled demonstrations with three diverse types of physical systems—mechanical, optical and electrical. All it required was a bit of tweaking, and a suspension of disbelief.

“Artificial neural networks work mathematically by applying a series of parameterized functions to input data. The dynamics of a physical system can also be thought of as applying a function to data input to that physical system,” McMahon said. “This mathematical connection between neural networks and physics is, in some sense, what makes our approach possible, even though the notion of making neural networks out of unusual physical systems might at first sound really ridiculous.”
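To make this connection concrete, here is a minimal, hypothetical Python sketch (not taken from the paper or its released code) in which a conventional neural-network layer and a stand-in "physical" layer are both just parameterized functions, so they compose into a deep network in exactly the same way. The tanh and sine responses, the array shapes and every name here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A conventional neural-network layer: a parameterized function applied to data.
def dense_layer(x, W, b):
    return np.tanh(W @ x + b)

# A stand-in for a "physical" layer: the system's own dynamics act as the
# function, and controllable drive settings act as the trainable parameters.
# (Toy response only; a real experiment would send x to the hardware.)
def physical_layer(x, drive):
    return np.sin(drive @ x)

# Both kinds of layer compose the same way into a deep network.
x = rng.normal(size=16)                                     # encoded input data
h = dense_layer(x, rng.normal(size=(16, 16)), np.zeros(16))
y = physical_layer(h, rng.normal(size=(10, 16)))
print(y.shape)                                              # (10,): one score per class
```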

For the mechanical system, the researchers placed a titanium plate atop a commercially available speaker, creating what is known in physics as a driven multimode mechanical oscillator. The optical system consisted of a laser beamed through a nonlinear crystal that converted the colors of incoming light into new colors by combining pairs of photons. The third experiment used a small electronic circuit with just four components—a resistor, a capacitor, an inductor and a transistor—of the sort a middle-school student might assemble in science class.

In each experiment, the pixels of an image of a handwritten number were encoded in a pulse of light or an electrical voltage that was fed into the system. The system processed the information and gave its output in a similar type of optical pulse or voltage. Crucially, for the systems to perform the appropriate processing, they had to be trained. So the researchers changed specific input parameters and ran multiple samples—such as different numbers in different handwriting—through the physical system, then used a laptop computer to determine how the parameters should be adjusted to achieve the greatest accuracy for the task. This hybrid approach leveraged the standard training algorithm from conventional artificial neural networks, called backpropagation, in a way that is resilient to noise and experimental imperfections.
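As a rough illustration of that hybrid loop, the sketch below simulates the idea under stated assumptions: the forward pass uses a noisy "physical" response, while the parameter update is computed by backpropagating through a smooth digital model of the system. Every function, shape and constant is a hypothetical stand-in, not the authors' Physics-Aware Training implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins: in the real experiments the forward pass runs on
# hardware (a speaker, laser or circuit); here both passes are simulated.
def physical_forward(x, params, noise=0.01):
    # Noisy "hardware" response; the noise mimics experimental imperfections.
    return np.tanh(params @ x) + noise * rng.normal(size=params.shape[0])

def model_grad(x, params, upstream):
    # Gradient of the differentiable digital model tanh(params @ x) with
    # respect to params, given upstream = dLoss/dy from the physical output.
    pre = params @ x
    return np.outer(upstream * (1.0 - np.tanh(pre) ** 2), x)

params = 0.1 * rng.normal(size=(10, 16))   # controllable system settings
lr = 0.05
for step in range(200):
    x = rng.normal(size=16)                # stand-in for one encoded sample
    target = np.zeros(10)
    target[step % 10] = 1.0                # stand-in class label
    y = physical_forward(x, params)        # forward pass: the physical output
    upstream = 2.0 * (y - target)          # dLoss/dy for a squared-error loss
    params -= lr * model_grad(x, params, upstream)  # backward: digital model
```

Because the loss is always evaluated on the real system's output, training automatically compensates for mismatch between the digital model and the hardware, which is what makes the approach resilient to noise and experimental imperfections.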

The researchers were able to train the optical system to classify handwritten numbers with an accuracy of 97%. While this accuracy is below the state-of-the-art for conventional neural networks running on a standard electronic processor, the experiment shows that even a very simple physical system, with no obvious connection to conventional neural networks, can be taught to perform machine learning and could potentially do so much faster, and using far less power, than conventional electronic neural networks.

The optical system was also successfully trained to recognize spoken vowel sounds.

The researchers have posted their Physics-Aware-Training code online so that others can turn their own physical systems into neural networks. The training algorithm is generic enough that it can be applied to almost any such system, even fluids or exotic materials, and diverse systems can be chained together to harness the most useful processing capabilities of each one.

“It turns out you can turn pretty much any physical system into a neural network,” McMahon said. “However, not every physical system will be a good neural network for every task, so there is an important question of what physical systems work best for important machine-learning tasks. But now there is a way to try to find out—which is what my lab is currently pursuing.”

Co-authors include doctoral student Martin Stein, Mong Postdoctoral Fellow Tianyu Wang, Darren Schachter, and Zoey Hu.


More information:
Logan G. Wright et al, Deep physical neural networks trained with backpropagation, Nature (2022). DOI: 10.1038/s41586-021-04223-6

Provided by
Cornell University


Citation:
Physical systems perform machine-learning computations (2022, January 26)
retrieved 26 January 2022
from https://techxplore.com/news/2022-01-physical-machine-learning.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
