September 21, 2021 // Upgrade 2021: PHI LAB Speakers

How Applying the Backpropagation Algorithm Enables Deep Physical Neural Networks

Logan Wright, Research Scientist | NTT Research Physics & Informatics Lab

Summary

It’s possible to turn virtually any physical system into a deep neural network capable of performing computations far faster and more efficiently than traditional electronic computers – if you can find the right physical system and apply an effective algorithm.

That was the gist of Dr. Logan Wright’s talk at Upgrade 2021, the NTT Research Summit, during which he discussed a paper he co-authored with Dr. Tatsuhiro Onodera, “Deep physical neural networks trained with backpropagation.” Wright and Onodera are both Research Scientists in the NTT Research Physics and Informatics (PHI) Lab and visiting scientists in the School of Applied and Engineering Physics at Cornell.

Dr. Wright’s work demonstrated how virtually any physical system could be used to create a physical neural network, or PNN. He showed three examples: a mechanical system consisting of a speaker connected to an oscillating metal plate, a nonlinear analog electronic circuit, and an optical system.

“We’re taking a very broad perspective on all natural systems as performing some kind of computation, and we’re harnessing that with two key ingredients of deep learning, namely deep trained features, and the backpropagation algorithm,” Dr. Wright said.

Every physical system has controllable parameters: settings you can tune to change the way it behaves. “If you put a signal into that system, as you tune those parameters, you can change the way that the system affects that input signal,” he said.
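To make that idea concrete, here is a minimal sketch in Python of a physical system viewed as a tunable function. The names and the `tanh` transformation are hypothetical stand-ins, not the actual systems from the paper; the point is only that the parameters change how the system maps input to output.

```python
import numpy as np

def physical_layer(signal, params):
    """Stand-in for a real physical transformation (hypothetical).

    A real PNN would drive actual hardware here; the key point is
    that tuning `params` changes how the system transforms `signal`.
    """
    driven = signal * params        # controllable coupling to the input
    return np.tanh(driven)          # the system's intrinsic nonlinearity

rng = np.random.default_rng(0)
x = rng.standard_normal(8)          # input signal
theta = rng.standard_normal(8)      # tunable physical parameters
y = physical_layer(x, theta)        # different theta -> different behavior
```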

However, only a select few physical systems make for good PNNs. And his work demonstrated that even the most suitable physical systems must overcome what he calls the “simulation-reality gap.”

“Even if you have a really good model of a physical system, or any system, it’s only so good,” he said. “As you feed the output of that physical system into yet another simulation and yet another simulation, the gap between simulation and reality just grows.”

Fixing the simulation-reality gap involved a tweak to the backpropagation algorithm. That algorithm has been used for years to train deep neural networks built on digital electronics. But the computational requirements of deep learning models are growing faster than Moore’s Law, spurring the search for more efficient computing methods.
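In the paper, that tweak is what the authors call physics-aware training: the forward pass runs on the actual hardware, while a differentiable digital model is used only in the backward pass to estimate gradients, so simulation error never compounds across layers. Below is a minimal sketch of that hybrid loop, assuming hypothetical `run_physical_system` and `digital_model` stand-ins rather than real devices.

```python
import torch

def digital_model(x, theta):
    # Imperfect but differentiable simulation of the hardware (hypothetical)
    return torch.tanh(x * theta)

def run_physical_system(x, theta):
    # Stand-in for a real hardware measurement: the model plus a
    # small systematic mismatch (the "simulation-reality gap")
    return torch.tanh(x * theta) + 0.01 * x ** 2

class PhysicsAwareLayer(torch.autograd.Function):
    """Forward pass from the physical system; backward pass through
    the digital model -- a sketch of physics-aware training."""

    @staticmethod
    def forward(ctx, x, theta):
        ctx.save_for_backward(x, theta)
        return run_physical_system(x, theta)   # true hardware output

    @staticmethod
    def backward(ctx, grad_out):
        x, theta = ctx.saved_tensors
        with torch.enable_grad():  # differentiate the model, not the hardware
            x_ = x.detach().requires_grad_()
            t_ = theta.detach().requires_grad_()
            y = digital_model(x_, t_)
            return torch.autograd.grad(y, (x_, t_), grad_out)

x = torch.randn(8)
theta = torch.randn(8, requires_grad=True)
loss = PhysicsAwareLayer.apply(x, theta).pow(2).sum()
loss.backward()   # theta.grad now holds model-estimated gradients
```

Because the activations come from the real system while only the gradient estimate comes from the model, errors in the model perturb the training direction but never the network’s actual outputs.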

Dr. Wright’s work involves applying the backpropagation algorithm to unconventional hardware – namely optical, mechanical and electrical systems – in hopes of unleashing a dramatic increase in performance at acceptable levels of energy efficiency.

The paper on which his talk was based, “Deep physical neural networks trained with backpropagation,” has been published in the journal Nature, one of the world’s most cited scientific journals.

Logan Wright

Research Scientist | NTT Research Physics & Informatics Lab

Dr. Logan Wright joined NTT Research in 2018 after receiving his PhD in Applied Physics from Cornell University. At NTT Research, he studies the physics of computation and its application to new computing machines and paradigms.