We try to include links for all of our recent papers. Most links direct you to the journal’s site, where the publication is available for download. If you cannot access one of our papers, let us know. For a more complete list, see Michael’s Google Scholar profile.
Beyond Linear Response: Equivalence between Thermodynamic Geometry and Optimal Transport
Adrianne Zhong and Michael DeWeese
[arXiv]
More is Better in Modern Machine Learning: when Infinite Overparameterization is Optimal and Overfitting is Obligatory
James Simon, Dhruva Karkada, Nikhil Ghosh, Mikhail Belkin
[ICLR ’24] [arXiv]
The lazy (NTK) and rich (μP) regimes: a gentle tutorial
Dhruva Karkada
[arXiv]
The eigenlearning framework: A conservation law perspective on kernel ridge regression and wide neural networks
James Simon, Madeline Dickens, Dhruva Karkada, Michael DeWeese
[TMLR ’23] [arXiv] [code]
Shortcut engineering of active matter: run-and-tumble particles
Adam Frim and Michael DeWeese
[arXiv]
Time-Asymmetric Fluctuation Theorem and Efficient Free Energy Estimation
Adrianne Zhong, Benjamin Kuznets-Speck, Michael DeWeese
[arXiv]
A Spectral Condition for Feature Learning
Greg Yang, James Simon, Jeremy Bernstein
[arXiv]
On the Stepwise Nature of Self-Supervised Learning
James Simon, Maksis Knutins, Liu Ziyin, Daniel Geisz, Abraham Fetterman, Joshua Albrecht
[arXiv]
Reverse Engineering the Neural Tangent Kernel
James Simon, Sajant Anand, Michael DeWeese
[ICML ’22] [arXiv] [code]
Geometric Bound on the Efficiency of Irreversible Thermodynamic Cycles
Adam Frim and Michael DeWeese
[PRL ‘22]
Optimal Finite-Time Brownian Carnot Engine
Adam Frim and Michael DeWeese
[PRE ‘22]
Limited-control optimal protocols arbitrarily far from equilibrium
Adrianne Zhong and Michael DeWeese
[arXiv]
A Solution to the Fokker-Planck Equation for Slowly Driven Brownian Motion: Emergent Geometry and a Formula for the Corresponding Thermodynamic Metric
Neha Wadia, Ryan Zarcone, Michael DeWeese
[PRE]
Sparse coding models predict a spectral bias in the development of primary visual cortex (V1) receptive fields
Andrew Ligeralde and Michael DeWeese
[bioRxiv]
Engineered Swift Equilibration for Arbitrary Geometries
Adam Frim, Adrianne Zhong, Stephen Chen, Dibyendu Mandal, Michael R DeWeese
[PRE] [arXiv]
Critical Point-Finding Methods Reveal Gradient-Flat Regions of Deep Network Losses
Charles Frye, Jamie Simon, Neha Wadia, Andrew Ligeralde, Michael DeWeese, Kris Bouchard
[Neural Computation]
Efficient sensory coding of multidimensional stimuli
Thomas Yerxa, Eric Kee, Michael DeWeese, Emily Cooper
[PLOS Computational Biology]
Heterogeneous Synaptic Weighting Improves Neural Coding in the Presence of Common Noise
Pratik Sachdeva, Jesse Livezey, Michael DeWeese
[Neural Computation]
Spike-timing-dependent ensemble encoding by non-classically responsive cortical neurons
Michele Insanally, Ioana Carcea, Rachel Field, Chris Rodgers, Brian DePasquale, Kanaka Rajan, Michael DeWeese, Badr Albanna, Robert Froemke
[eLife]
On the sparse structure of natural sounds and natural images: similarities, differences, and implications for neural coding
Eric Dodds, Michael DeWeese
[Frontiers in Computational Neuroscience]
Replay as wavefronts and theta sequences as bump oscillations in a grid cell attractor network
Louis Kang, Michael DeWeese
[eLife]
Design of optical neural networks with component imprecisions
Michael Fang, Sasikanth Manipatruni, Casimir Wierzynski, Amir Khosrowshahi, Michael DeWeese
[Optics Express]
Numerically recovering the critical points of a deep linear autoencoder
Charles Frye, Neha Wadia, Michael DeWeese, Kris Bouchard
[arXiv]
Spatial whitening in the retina may be necessary for V1 to learn a sparse representation of natural scenes
Eric Dodds, Jesse Livezey, Michael DeWeese
[bioRxiv]