David G. Clark

dgclark@fas.harvard.edu

Research Fellow, Kempner Institute at Harvard University

Hello!

I am a theoretical neuroscientist and research fellow at the Kempner Institute for the Study of Natural and Artificial Intelligence at Harvard University.

A general theory of neural circuits must link three deeply coupled elements: synaptic connectivity, large-scale neuronal activity, and the tasks these circuits must perform. I develop theories relating these elements using methods from machine learning and statistical physics.

A common theme is the rich role of disorder present in large neural circuits; I draw on tools from the physics of disordered systems, treating this heterogeneity as a central ingredient rather than a nuisance.

A unifying goal is to connect high-dimensional nonlinear network models to the complex, heterogeneous data that modern experiments produce.

I earned my Ph.D. in Neurobiology and Behavior from Columbia University in the Center for Theoretical Neuroscience, where I was primarily advised by Larry Abbott and worked closely with Ashok Litwin-Kumar and Haim Sompolinsky. Before that, I studied physics and computer science at UC Berkeley.

My publications are listed below; they are also on Google Scholar. My CV is here.

Outside of science, I see a lot of Broadway.


Publications & preprints

(2026). Structure, disorder, and dynamics in task-trained recurrent neural circuits. bioRxiv.

bioRš›˜iv Code Kempner Blog Post

(2025). A theory of multi-task computation and task selection. bioRxiv.

bioRš›˜iv

(2025). Connectivity structure and dynamics of nonlinear recurrent neural networks. Physical Review X.

PDF Journal arXiv

(2025). Simplified derivations for high-dimensional convex learning problems. SciPost Physics Lecture Notes.

PDF Journal arXiv

(2025). Associative synaptic plasticity creates dynamic persistent activity. bioRxiv.

bioRš›˜iv

(2025). Symmetries and continuous attractors in disordered neural circuits. bioRxiv.

bioRš›˜iv

(2024). Theory of coupled neuronal-synaptic dynamics. Physical Review X.

PDF Journal Physics Magazine Viewpoint arXiv Code

(2023). Dimension of activity in random neural networks. Physical Review Letters.

PDF Journal arXiv

(2021). Olfactory landmarks and path integration converge to form a cognitive spatial map. Neuron.

PDF Journal Code Video

(2021). Credit assignment through broadcasting a global error vector. NeurIPS 2021.

arXiv Code

(2019). Unsupervised discovery of temporal structure in noisy data with dynamical components analysis. NeurIPS 2019.

arXiv Code

(2017). Neuromorphic Kalman filter implementation in IBM's TrueNorth. Journal of Physics: Conference Series.

PDF Journal

* Equal contribution