Neuromorphic Computing

Neuromorphic Computing investigates the computational principles that enable high-level sensory processing and cognition in the human brain, and attempts to implement those principles in large-scale, high-performance computer models. Even after several decades of exponential growth in processing power, computers still cannot match the brain's ability to interpret, respond to, and learn from natural sensory inputs. Rapid progress in neuroscience, however, is enabling an alternative strategy for achieving brain-like behavior: identifying the computational primitives that underlie processing in biological neural circuits. The enormous scale of biological neural systems means that neuromorphic computing research requires high-performance neural simulation tools in order to test complex scientific hypotheses at scale.


Synthetic cognition through petascale models of the primate visual cortex

Garrett Kenyon, Principal Investigator, NMC Affiliate Researcher, LANL Staff Scientist
Pete Schultz, NMC Research Scientist
Gerd Kunde, NMC Affiliate Researcher, LANL Staff Scientist
John George, NMC Affiliate Researcher, LANL Staff Scientist
Melanie Mitchell, Portland State University Professor
Dylan Paiton, LANL Student
Xinha Zhang, UNM Graduate Student, NMC Associate Research Scientist

The PetaVision project seeks to develop an open-source, high-performance neural simulation toolbox in the context of the active investigation of the computational principles underlying human sensory cognition. The ultimate goal of PetaVision is to create a synthetic cognition system that emulates the functional architecture of the primate visual cortex. Using petascale computational resources and a growing knowledge of the structure and function of biological neural systems, this research has begun to reproduce the information processing capabilities of cortical circuits in the brain. This research pushes the limits of supercomputers, and advances in this area are closely coupled with advanced computing and exascale computing research. Funding for this research comes from the National Science Foundation.
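To give a flavor of what a neural simulation at this scale iterates over, the sketch below integrates a single leaky integrate-and-fire (LIF) neuron, a standard simplified spiking-neuron model. This is a minimal, illustrative example only; it does not use PetaVision's actual API, and the function name and parameters are assumptions chosen for clarity.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=0.02,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Euler integration of the LIF dynamics
    dV/dt = (-(V - v_rest) + I) / tau, with a spike-and-reset
    rule when V crosses v_thresh. Illustrative sketch, not
    PetaVision code. Returns the membrane trace and spike indices."""
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:
            spikes.append(t)   # record the spike time step
            v = v_reset        # reset the membrane potential
        trace.append(v)
    return np.array(trace), spikes

# A constant suprathreshold drive produces regular spiking.
trace, spikes = simulate_lif(np.full(200, 1.5))
```

A petascale simulation repeats this kind of per-neuron update across billions of neurons and synapses per time step, which is why such models are tightly coupled to high-performance computing research.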
 

SALSA: Sparse Adaptive Learning for Sensing and Analytics

Garrett Kenyon, Principal Investigator, NMC Affiliate Researcher, LANL Staff Scientist
Wei Lu, University of Michigan Professor
Pete Schultz, NMC Research Scientist
Gerd Kunde, NMC Affiliate Researcher, LANL Staff Scientist
William Shainin, Northern Arizona University Undergraduate Student, NMC Student
Wesley Chavez, New Mexico Tech Postbaccalaureate Student, NMC Associate Research Scientist
Sheng Lundquist, New Mexico Tech Undergraduate Student, NMC Student
 
Inference Models (IMs) of the primate visual cortex learn their deep network structure directly from environmental sensory inputs. By representing the deep structure of the visual environment, SALSA seeks to enable more accurate performance of basic visual judgements, such as object detection and tracking. Because IMs learn to detect objects directly from the data, these models can compensate for internal structural damage and adapt to changing environmental conditions. By focusing on biologically inspired neural architectures, SALSA seeks to enable target detection and tracking in streaming video obtained by mobile, lightweight platforms. In particular, SALSA seeks to yield improved capabilities for intelligent video processing from mobile platforms such as drones, robots, and CubeSats by exploiting the ultra-lightweight, low-power, low-bandwidth characteristics of biological neural circuits. Along with collaborators at the University of Michigan, SALSA seeks to implement neurally inspired architectures in mixed-signal, memristor-based circuits for ultra-low-power operation. Funding for this joint project comes from the Defense Advanced Research Projects Agency (DARPA) through the DARPA UPSIDE Program.
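A core operation in sparse inference models of this kind is finding a small set of dictionary elements that explain an input. The sketch below uses ISTA (iterative shrinkage-thresholding), a standard algorithm for this sparse-coding problem; it is a hedged illustration under assumed names and a random dictionary, not SALSA's actual model, which learns its dictionary from sensory data.

```python
import numpy as np

def soft_threshold(u, lam):
    """Proximal operator of the L1 penalty: shrink toward zero."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def ista(x, D, lam=0.05, n_iters=200):
    """Minimize 0.5*||x - D a||^2 + lam*||a||_1 over coefficients a.
    Illustrative sketch; step size set from the spectral norm of D."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2
    a = np.zeros(D.shape[1])
    for _ in range(n_iters):
        grad = D.T @ (D @ a - x)              # gradient of the data term
        a = soft_threshold(a - step * grad, step * lam)
    return a

# Synthetic demo: a 3-sparse signal in a random unit-norm dictionary.
rng = np.random.default_rng(0)
D = rng.normal(size=(16, 64))
D /= np.linalg.norm(D, axis=0)
a_true = np.zeros(64)
a_true[[3, 17, 42]] = [1.0, -0.5, 0.8]
x = D @ a_true
a_hat = ista(x, D)
```

The soft-threshold step drives most coefficients exactly to zero, so the recovered code is sparse while still reconstructing the input; it is this sparsity that makes such representations attractive for low-power, low-bandwidth hardware.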

© 2016 New Mexico Consortium