Eli Yablonovitch

Professor, UC Berkeley

Eli Yablonovitch Outlines the Promise of Physics-Based Optimization Principles for the Future of Computing

In the physical world we constantly witness examples of optimization under constraints. Light passing through a glass window takes the route that requires the least time. A leaf falling from a forest canopy follows a path that generates the least entropy in its floating freefall. It is therefore useful to recognize where such optimization principles come into play and to exploit them to solve optimization problems across many fields, including electronic engineering.

In his talk at Upgrade 2020, the NTT Research Summit, Dr. Eli Yablonovitch, world-renowned physicist, engineer, and Director of UC Berkeley’s Center for Energy Efficient Electronics Science, discussed how the optimization principles of physics, specifically the principle of least entropy generation, can solve difficult engineering and computational problems. An equivalent statement of the entropy principle is that nature will dissipate the least possible heat, subject to the external driving terms.

Nature’s optimization principles can embody constraints in a way analogous to the Lagrange multiplier method taught in undergraduate classes, a mathematical strategy for finding the local maxima or minima of a function subject to constraints. Physical optimization can accommodate various constraints, and the Lagrange multiplier values have direct physical meaning, such as the voltage or current in a circuit, or the gain in an optical machine.
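
As a concrete illustration of that last point, consider a standard textbook example (not taken from the talk itself): minimizing the power dissipated by two resistors that must carry a fixed total current. The symbols R_1, R_2, and I_0 below are assumed for the sketch, and the multiplier that enforces the current constraint turns out to be twice the common node voltage, exactly the kind of direct physical meaning described above.

```latex
% Illustrative textbook example; R_1, R_2, I_0 are assumed symbols.
% Minimize the dissipated power P subject to a fixed total current I_0.
\begin{aligned}
&\text{minimize } P(I_1, I_2) = R_1 I_1^2 + R_2 I_2^2
  \quad \text{subject to } I_1 + I_2 = I_0,\\[4pt]
&\mathcal{L}(I_1, I_2, \lambda) = R_1 I_1^2 + R_2 I_2^2
  - \lambda\,(I_1 + I_2 - I_0),\\[4pt]
&\frac{\partial \mathcal{L}}{\partial I_1} = 2 R_1 I_1 - \lambda = 0,
\qquad
\frac{\partial \mathcal{L}}{\partial I_2} = 2 R_2 I_2 - \lambda = 0
\;\;\Longrightarrow\;\;
R_1 I_1 = R_2 I_2 = \tfrac{\lambda}{2}.
\end{aligned}
```

The stationary condition reproduces ordinary current division between parallel resistors, and the multiplier equals twice the voltage across the pair: the Lagrange multiplier is literally a circuit voltage.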

“Physics will try to go to the state of lowest power dissipation. Another way of saying it is we go from some initial state to some final state,” he said. “And physics does this for you at a modest cost, because it is always trying to minimize power dissipation.”

Dr. Yablonovitch reviewed a few experiments, each constrained to produce a digital answer while the system responded to its driving forces so as to minimize heat generation. In an experiment by one of his Berkeley colleagues, Prof. Roychowdhury, coupled harmonic oscillators phase-locked in a way that solved the Ising problem, indicating that the system settles into a state of least power dissipation.
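
To make the oscillator picture concrete, here is a minimal numerical sketch in the spirit of such phase-locked oscillator networks. It is a toy model built on assumptions (a simplified Kuramoto-style phase equation with a second-harmonic locking term, random ±1 couplings, and plain Euler integration), not Prof. Roychowdhury’s actual hardware or published code: the phases relax toward 0 or π, and the binarized result approximates a low-energy Ising configuration.

```python
import numpy as np

# Toy coupled-oscillator Ising solver (assumption: simplified Kuramoto-style
# model with a second-harmonic locking term, not the actual experiment).
#
# Each spin s_i = +/-1 is encoded in an oscillator phase theta_i, which the
# locking term pushes toward 0 or pi.  The gradient flow below descends a
# Lyapunov function that matches the Ising energy
#     H = -sum_{i<j} J_ij * s_i * s_j
# once the phases binarize, i.e. the network relaxes toward a low-energy
# ("least dissipation") state.

rng = np.random.default_rng(0)
N = 8
J = np.triu(rng.choice([-1.0, 1.0], size=(N, N)), 1)
J = J + J.T                               # symmetric couplings, zero diagonal

theta = rng.uniform(0.0, 2.0 * np.pi, N)  # random initial phases
K_s = 1.5                                 # strength of the binarizing term
dt, steps = 0.05, 4000

for _ in range(steps):
    diff = theta[:, None] - theta[None, :]                # theta_i - theta_j
    dtheta = -(J * np.sin(diff)).sum(axis=1) - K_s * np.sin(2.0 * theta)
    theta += dt * dtheta

spins = np.where(np.cos(theta) > 0.0, 1, -1)   # read out binarized +/-1 spins
energy = -0.5 * spins @ J @ spins              # Ising energy of the result

# Brute-force ground-state energy, feasible only because N is tiny.
best = min(
    -0.5 * s @ J @ s
    for k in range(2 ** N)
    for s in [np.array([1 if (k >> i) & 1 else -1 for i in range(N)])]
)

print("relaxed-network energy:", energy, " true minimum:", best)
```

Because the relaxation is a plain gradient flow, it can settle into a local minimum; the brute-force check is only there to show how close the relaxed state comes to the true ground state on a tiny problem.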

In a quantum example from Yamamoto at Stanford, the same principle comes into play. “It’s based upon minimum entropy generation, synonymous with minimizing the power dissipation,” Dr. Yablonovitch said. “It is a very beautiful system because it’s time multiplexed so the different optical pulses experience the same gain, while locking in a digital answer.”

This principle can also be applied to artificial intelligence, which is likewise an optimization problem. Dr. Yablonovitch described MIT researchers’ use of iterative algorithms on silicon photonics chips. “As you go round over and over again, through the silicon photonics, you end up minimizing the power dissipation.”
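
For comparison, the purely digital version of “going round over and over again” is an ordinary iterative optimization loop, such as the toy gradient-descent sketch below. This is an illustrative analogy only, assuming a simple least-squares fit; it has no connection to the MIT photonic hardware. Each pass reduces a loss function, playing the role that dissipated power plays in the physical systems above.

```python
import numpy as np

# Illustrative digital counterpart: iterative gradient descent on a
# least-squares loss.  Each pass around the loop lowers the loss, in loose
# analogy with a physical system lowering its power dissipation.
# (Toy example for the analogy only -- not the photonic implementation.)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))                 # inputs
w_true = np.array([2.0, -1.0, 0.5])           # weights to recover
y = X @ w_true + 0.01 * rng.normal(size=100)  # noisy targets

w = np.zeros(3)                               # initial guess
lr = 0.05
for step in range(200):
    grad = 2.0 * X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
    w -= lr * grad                            # one "pass around the loop"

print("recovered weights:", np.round(w, 3))
```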

If these examples take advantage of the same physics principle, what is the promise? “The promise is that the physics-based hardware will perform the same function at far greater speed and far less power dissipation than conventional digital algorithms,” Dr. Yablonovitch said. While the challenge of global optimization remains unsolved, “nonetheless there are terrific applications in deep learning and in neural network back-propagation, artificial intelligence, and control theory,” he said.

For the full transcript of Eli Yablonovitch’s presentation, click here.

Watch Eli Yablonovitch’s full presentation below.

Physics Successfully Implements Lagrange Multiplier Optimization
