Deep physical neural networks trained with backpropagation

Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability. Deep-learning accelerators aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far have been unable to apply the backpropagation algorithm to train unconventional hardware in situ. Because the advantages of backpropagation have made it the de facto training method for large-scale neural networks, this deficiency constitutes a major impediment.

Source: https://www.nature.com/articles/s41586-021-04223-6
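For context, below is a minimal sketch of the conventional backpropagation loop the abstract refers to. Everything in it (PyTorch as the framework, the toy two-layer network, the synthetic data) is an illustrative assumption; it shows standard in-silico backpropagation, not the paper's method for training physical systems.

import torch
import torch.nn as nn

# Toy network and data, chosen only to make the loop concrete.
model = nn.Sequential(nn.Linear(4, 16), nn.Tanh(), nn.Linear(16, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 4)            # synthetic inputs
y = torch.randint(0, 2, (32,))    # synthetic labels

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)   # forward pass through the network
    loss.backward()               # backpropagation: gradients via the chain rule
    optimizer.step()              # gradient-descent parameter update

The key step is loss.backward(), which requires a differentiable software model of every layer; the abstract's point is that this requirement is what has kept backpropagation from being applied in situ to unconventional physical hardware.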