A lexical approach for identifying behavioral action sequences was published in PLOS Computational Biology

Title: A lexical approach for identifying behavioral action sequences [PLOS Computational Biology / bioRxiv]
Authors: Gautam Reddy, Laura Desban, Hidenori Tanaka, Julian Roussel, Olivier Mirat, Claire Wyart
Published on 10 January 2022.
Abstract: Animals display characteristic behavioral patterns when performing a task, such as the spiraling of a soaring bird or the surge-and-cast of a male moth searching for a female. Identifying such conserved patterns occurring rarely […]

Beyond BatchNorm: Towards a General Understanding of Normalization in Deep Learning was accepted for NeurIPS 2021

Title: Beyond BatchNorm: Towards a General Understanding of Normalization in Deep Learning [arXiv, PDF]
Authors: Ekdeep Singh Lubana, Robert P. Dick, Hidenori Tanaka
Accepted for Neural Information Processing Systems (NeurIPS) 2021 in Sep 2021. Published on 10 June 2021.
Abstract: Inspired by BatchNorm, there has been an explosion of normalization layers in deep learning. Recent works […]

Noether’s Learning Dynamics: The Role of Kinetic Symmetry Breaking in Deep Learning was accepted for NeurIPS 2021

Title: Noether’s Learning Dynamics: The Role of Kinetic Symmetry Breaking in Deep Learning [arXiv, PDF]
Authors: Hidenori Tanaka and Daniel Kunin
Accepted for Neural Information Processing Systems (NeurIPS) 2021 in Sep 2021. Published on 6 May 2021.
Abstract: In nature, symmetry governs regularities, while symmetry breaking brings texture. Here, we reveal a novel role of symmetry […]

Rethinking the limiting dynamics of SGD: modified loss, phase space oscillations, and anomalous diffusion was submitted to arXiv

Title: Rethinking the limiting dynamics of SGD: modified loss, phase space oscillations, and anomalous diffusion [arXiv, PDF]
Authors: Daniel Kunin, Javier Sagastuy-Brena, Lauren Gillespie, Eshed Margalit, Hidenori Tanaka, Surya Ganguli, Daniel L. K. Yamins
Submitted on 19 July 2021. Revised on 5 October 2021.
Abstract: In this work we explore the limiting dynamics of deep neural networks trained with stochastic gradient descent […]

Neural Mechanics: Symmetry and Broken Conservation Laws in Deep Learning Dynamics was accepted by ICLR

Title: Neural Mechanics: Symmetry and Broken Conservation Laws in Deep Learning Dynamics [arXiv, PDF]
Authors: Daniel Kunin, Javier Sagastuy-Brena, Surya Ganguli, Daniel L. K. Yamins, Hidenori Tanaka
Submitted on 8 December 2020.
Abstract: Predicting the dynamics of neural network parameters during training is one of the key challenges in building a theoretical foundation for deep learning. A central obstacle is that […]

Pruning neural networks without any data by iteratively conserving synaptic flow was accepted by NeurIPS 2020

Title: Pruning neural networks without any data by iteratively conserving synaptic flow [arXiv, PDF]
Authors: Hidenori Tanaka, Daniel Kunin, Daniel L. K. Yamins, and Surya Ganguli
Submitted on 9 June 2020.
Abstract: Pruning the parameters of deep neural networks has generated intense interest due to potential savings in time, memory and energy both during training and […]

From deep learning to mechanistic understanding in neuroscience: the structure of retinal prediction was published at NeurIPS 2019

Title: From deep learning to mechanistic understanding in neuroscience: the structure of retinal prediction [NeurIPS 2019, arXiv, PDF]
Authors: Hidenori Tanaka, Aran Nayebi, Niru Maheswaranathan, Lane McIntosh, Stephen A. Baccus, and Surya Ganguli
Journal: Advances in Neural Information Processing Systems (NeurIPS) 2019. Published on 13 Feb 2020.
Abstract: Recently, deep feedforward neural networks have achieved considerable success […]

The dynamic neural code of the retina for natural scenes was submitted to bioRxiv

Title: The dynamic neural code of the retina for natural scenes [bioRxiv, PDF]
Authors: Niru Maheswaranathan, Lane T. McIntosh, Hidenori Tanaka, Satchel Grant, David B. Kastner, Josh B. Melander, Aran Nayebi, Luke Brezovec, Julia Wang, Surya Ganguli, Stephen A. Baccus
Submitted on 17 Dec 2019.
Abstract: Understanding how the visual system encodes natural scenes is a fundamental goal of sensory neuroscience. We show here that a three-layer network model predicts the retinal response to natural scenes with an […]
