Organizers | |
---|---|
Abstract | In this session, we will discuss theoretical frameworks for building principled neuromorphic algorithms. How can these frameworks contribute to a language for formalizing and optimizing neuromorphic computing algorithms? How can neuromorphic algorithms be systematized, modularized, combined, and reused? Due to time constraints, please ask only brief clarifying questions during and between talks, and save broader questions for the general discussion at the end. |
Speakers | 8:00-8:08: Friedrich Sommer: Taxonomy of conceptual frameworks and neuromorphic VSA<br>8:08-8:16: Paxon Frady: Frameworks for efficient coding and computation on neuromorphic hardware<br>8:16-8:24: Gregor Schöner: Dynamic Field Theory as a theoretical framework for neuromorphic embodied cognition<br>8:24-8:32: Christian Tetzlaff: The usage of on-chip learning for universal computing<br>8:32-8:40: Hyeryung Jang: Information-Theoretic Principles for Neuromorphic Computing: Information Bottleneck and Bayesian Learning<br>8:40-8:48: Gregor Lenz: Efficient ANN-SNN conversion<br>8:48-9:00: General Discussion and Questions |
Reference material | https://www.pnas.org/content/pnas/116/36/18050.full.pdf<br>https://redwood.berkeley.edu/wp-content/uploads/2020/08/resonator1.pdf |
Recording | Not yet available |
Link to Presentation | Not yet available |