University of Washington
Papers and research supported in part or in whole by The Swartz Foundation
Fereshteh Lagzi and Adrienne L. Fairhall. Emergence of co-tuning in inhibitory neurons as a network phenomenon mediated by randomness, correlations, and homeostatic plasticity. Science Advances 10.12 (2024): eadi4350.
Matthew Farrell, Stefano Recanatesi, and Eric Shea-Brown. From lazy to rich to exclusive task representations in neural networks and neural codes. Current Opinion in Neurobiology 83 (2023): 102780.
Aditi Pophale, Kazumichi Shimizu, Tomoyuki Mano, Teresa L. Iglesias, Kerry Martin, Makoto Hiroi, Keishu Asada, Paulette García Andaluz, Thi Thu Van Dinh, Leenoy Meshulam, and Sam Reiter. Wake-like skin patterning and neural activity during octopus sleep. Nature 619.7968 (2023): 129-134.
Stefano Recanatesi, Serena Bradde, Vijay Balasubramanian, Nicholas A. Steinmetz, and Eric Shea-Brown. A scale-dependent measure of system dimensionality. Patterns 3.8 (2022).
Doris Voina, Stefano Recanatesi, Brian Hu, Eric Shea-Brown, and Stefan Mihalas. Single Circuit in V1 Capable of Switching Contexts During Movement Using an Inhibitory Population as a Switch. Neural Computation 34.3 (2022): 541-594.
Stefano Recanatesi, Ulises Pereira-Obilinovic, Masayoshi Murakami, Zachary Mainen, and Luca Mazzucato. Metastable attractors explain the variable timing of stable behavioral action sequences. Neuron 110.1 (2022): 139-153.
Matthew Farrell, Stefano Recanatesi, Guillaume Lajoie, and Eric Shea-Brown. Gradient-based learning drives robust representations in recurrent neural networks by balancing compression and expansion. Nature Machine Intelligence 4 (2022): 564-573.
Charles B. Delahunt, Pedro D. Maia, and J. Nathan Kutz. Built to last: functional and structural mechanisms in the moth olfactory network mitigate effects of neural injury. Brain Sciences 11.4 (2021): 462.
Leenoy Meshulam, Jeffrey L. Gauthier, Carlos D. Brody, David W. Tank, and William Bialek. Successes and failures of simplified models for a network of real neurons. arXiv preprint arXiv:2112.14735 (2021).
Stefano Recanatesi, Matthew Farrell, Guillaume Lajoie, Sophie Deneve, Mattia Rigotti, and Eric Shea-Brown. Predictive learning as a network mechanism for extracting low-dimensional latent space representations. Nature Communications 12, 1417 (2021).
Merav Stern, Eric Shea-Brown, and Daniela Witten. Inferring Neural Population Spiking Rate from Wide-Field Calcium Imaging. bioRxiv (2020).
Matthew Farrell, Stefano Recanatesi, R. Clay Reid, Stefan Mihalas, and Eric Shea-Brown. Autoencoder networks extract latent variables and encode these variables in their connectomes. arXiv (2020).
Merav Stern and Eric Shea-Brown. Network Dynamics Governed by Lyapunov Functions: From Memory to Classification. Spotlight in Trends in Neurosciences (2020).
David Dahmen, Stefano Recanatesi, Gabriel Ocker, Xiaoxuan Jia, Moritz Helias, and Eric Shea-Brown. Strong coupling and local control of dimensionality across brain areas. bioRxiv (2020).
Stefano Recanatesi, Gabriel Ocker, Michael Buice, and Eric Shea-Brown. Dimensionality in recurrent spiking networks: Global trends in activity and local origins in connectivity. PLOS Computational Biology 15.7 (2019): e1006446.
Matthew Farrell, Stefano Recanatesi, Guillaume Lajoie, and Eric Shea-Brown. Recurrent neural networks learn robust representations by dynamically balancing compression and expansion. bioRxiv (2019).
Charles B. Delahunt and J. Nathan Kutz. Putting a bug in ML: The moth olfactory network learns to read MNIST. Neural Networks 118 (2019): 54-64.
Stefano Recanatesi, Matthew Farrell, Madhu Advani, Timothy Moore, Guillaume Lajoie, and Eric Shea-Brown. Dimensionality compression and expansion in deep neural networks. arXiv:1906.00443 (2019).
Charles B. Delahunt, Jeffrey A. Riffell, and J. Nathan Kutz. Biological Mechanisms for Learning: A Computational Model of Olfactory Learning in the Manduca Sexta Moth, With Applications to Neural Nets. Frontiers in Computational Neuroscience 12 (2018): 102.
Return to main Research page