Organizers:

Paxon Frady, Friedrich Sommer

Abstract:

In this session, we will discuss theoretical frameworks for building principled neuromorphic algorithms and the features of an ideal framework for Loihi. We will examine efforts to develop robust, flexible systems that go beyond networks that perform just a single task, and we will also cover how the proposed frameworks might be synthesized into a high-level abstraction paradigm for building general-purpose algorithms.

Speakers

Friedrich Sommer

Paxon Frady

Gregor Schöner

Christian Tetzlaff

Hyeryung Jang

Gregor Lenz

Pre-requisites/co-requisites

Link to pre-recorded material coming soon

Discussion questions

How can these frameworks contribute to a language for formalizing and optimizing neuromorphic computing algorithms? How can neuromorphic algorithms be systematized, modularized, combined, and reused?

Due to time constraints, please ask only brief clarifying questions during and between talks, and save longer questions for the general discussion at the end.

Schedule

8:00-8:10: Friedrich Sommer: Sparse VSA as conceptual framework for neuromorphic computing

8:10-8:20: Paxon Frady: Frameworks for efficient coding and computation on neuromorphic hardware

8:20-8:30: Gregor Schöner: Dynamic Field Theory as a theoretical framework for neuromorphic embodied cognition

8:30-8:40: Christian Tetzlaff: The usage of on-chip learning for universal computing

8:40-8:50: Hyeryung Jang: Information-Theoretic Principles for Neuromorphic Computing: Information Bottleneck and Bayesian Learning

8:50-9:00: General Discussion and Questions

Reference material

Videos:

Neuromorphic nearest neighbors:

https://intel-ncl.atlassian.net/wiki/spaces/forum/blog/2020/05/20/398983184/INRC+Forum+May+27%2C+2020%3A+Paxon+Frady%2C+Garrick+Orchard%2C+Mike+Davies%2C+Intel?atlOrigin=eyJpIjoiZGU4ZWIzMTJjNjM3NDY5NmFjYjc0NDJhZjRjNmM1ZWEiLCJwIjoiYyJ9

Paxon’s talk on resonator networks at VSA workshop: https://youtu.be/T0mqBCpDqwk

https://intel-ncl.atlassian.net/wiki/spaces/forum/blog/2020/08/21/601489420?atlOrigin=eyJpIjoiZGU4ZWIzMTJjNjM3NDY5NmFjYjc0NDJhZjRjNmM1ZWEiLCJwIjoiYyJ9

https://intel-ncl.atlassian.net/wiki/spaces/forum/blog/2020/05/06/369885188?atlOrigin=eyJpIjoiZGU4ZWIzMTJjNjM3NDY5NmFjYjc0NDJhZjRjNmM1ZWEiLCJwIjoiYyJ9

Papers:

Frady, E.P., Sommer, F.T. (2019). Robust computation with rhythmic spike patterns. https://www.pnas.org/content/pnas/116/36/18050.full.pdf

Frady, E.P., Kent, S., Sommer, F.T., Olshausen, B. (2020). Resonator Networks, 1: An Efficient Solution for Factoring High-Dimensional, Distributed Representations. https://redwood.berkeley.edu/wp-content/uploads/2020/08/resonator1.pdf (a toy sketch of the resonator iteration follows this paper list)

Frady, E.P., Orchard, G., Florey, D., Imam, N., Liu, R., Mishra, J., Tse, J., Wild, A., Sommer, F.T. and Davies, M. (2020). Neuromorphic Nearest Neighbor Search Using Intel's Pohoiki Springs. https://dl.acm.org/doi/10.1145/3381755.3398695

Frady, E.P., Kleyko, D., Sommer, F.T. (2020). Variable Binding for Sparse Distributed Representations: Theory and Applications. https://arxiv.org/pdf/2009.06734.pdf

Jang, H., Skatchkovsky, N., Simeone, O. (2020). BiSNN: Training Spiking Neural Networks with Binary Weights via Bayesian Learning. https://arxiv.org/pdf/2012.08300.pdf

Sandamirskaya, Y. (2014). Dynamic Neural Fields as a Step Toward Cognitive Neuromorphic Architectures. https://www.frontiersin.org/articles/10.3389/fnins.2013.00276/full
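For readers new to resonator networks, here is a minimal sketch of the iteration described in the two Frady et al. resonator papers above. It is not code from the papers: the dimension, codebook sizes, and the two-factor Hadamard-binding setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D, K = 2000, 25                        # hypervector dimension, codebook size
X = rng.choice([-1, 1], size=(K, D))   # codebook of candidate first factors
Y = rng.choice([-1, 1], size=(K, D))   # codebook of candidate second factors
s = X[3] * Y[7]                        # composite: elementwise (Hadamard) binding

# Start each estimate as the superposition of its entire codebook.
a_hat, b_hat = np.sign(X.sum(0)), np.sign(Y.sum(0))
for _ in range(10):
    # Unbind the other factor's estimate (x * x = 1 elementwise), then
    # project onto the codebook and back: a cleanup toward codewords.
    a_hat = np.sign(X.T @ (X @ (s * b_hat)))
    b_hat = np.sign(Y.T @ (Y @ (s * a_hat)))

print((X @ a_hat).argmax(), (Y @ b_hat).argmax())  # recovers indices 3 and 7
```

Each update unbinds the current estimate of the other factor from the composite and cleans up the result against its codebook; the estimates settle on the correct codewords.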

Websites:

VSA (Vector Symbolic Architectures)

https://www.hd-computing.com/

https://sites.google.com/ltu.se/vsaonline
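For orientation, a minimal NumPy sketch of the two core VSA operations, binding and bundling. The bipolar (MAP-style) encoding and the symbol names are illustrative choices, not taken from the sites above.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000                                     # high dimensionality is the point
apple, red = rng.choice([-1, 1], size=(2, D))  # random bipolar atomic symbols

# Binding (elementwise multiply): the result is dissimilar to both inputs.
pair = apple * red
# Unbinding recovers the partner exactly, since x * x = 1 elementwise.
print(np.dot(pair * red, apple) / D)           # 1.0

# Bundling (sum + sign): the result stays similar to each input.
filler = rng.choice([-1, 1], size=D)           # a third symbol to break ties
bundle = np.sign(apple + red + filler)
print(np.dot(bundle, apple) / D)               # ~0.5
```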

NEF (Neural Engineering Framework)

https://www.nengo.ai/nengo/examples/advanced/nef-algorithm.html
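The Nengo page above walks through the full NEF algorithm with leaky integrate-and-fire neurons; the sketch below shows only the core principle, substituting rectified-linear tuning curves for brevity. All parameter choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons = 50

# Each neuron gets a random encoder (preferred direction), gain, and bias.
encoders = rng.choice([-1.0, 1.0], size=n_neurons)
gains = rng.uniform(0.5, 2.0, size=n_neurons)
biases = rng.uniform(-1.0, 1.0, size=n_neurons)

def rates(x):
    """Tuning curves: a_i(x) = max(0, gain_i * encoder_i * x + bias_i)."""
    return np.maximum(0.0, gains * encoders * x[:, None] + biases)

# Solve for linear decoders by least squares over sampled inputs.
xs = np.linspace(-1, 1, 200)
A = rates(xs)                                  # activities, shape (200, 50)
decoders, *_ = np.linalg.lstsq(A, xs, rcond=None)

print(rates(np.array([0.3])) @ decoders)       # decoded estimate, ~0.3
```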

DNF (Dynamic Neural Fields)

https://dynamicfieldtheory.org/
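A minimal sketch of the one-dimensional dynamic neural field equation at the heart of DFT, tau * du/dt = -u + h + s(x) + (w * f(u))(x), with a Mexican-hat interaction kernel w. All parameters here are arbitrary choices made only so that a self-stabilized peak forms over the input.

```python
import numpy as np

N, tau, h, dt = 101, 10.0, -5.0, 1.0    # field size, time constant, resting level
x = np.arange(N)

# Mexican-hat kernel: local excitation, broader surround inhibition.
d = x - N // 2
w = 15.0 * np.exp(-d**2 / (2 * 5.0**2)) - 7.5 * np.exp(-d**2 / (2 * 12.5**2))

f = lambda u: 1.0 / (1.0 + np.exp(-u))         # sigmoidal output nonlinearity
s = 6.0 * np.exp(-(x - 30)**2 / (2 * 4.0**2))  # localized external input at x = 30

u = np.full(N, h)                              # field starts at resting level
for _ in range(200):                           # Euler integration
    interaction = np.convolve(f(u), w, mode="same")
    u += dt / tau * (-u + h + s + interaction)

print(x[u.argmax()])                           # a self-stabilized peak near x = 30
```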

Recording

https://intel.webex.com/recordingservice/sites/intel/recording/9a81723d9f324e3aa758784f451f896c/playback

Link to Presentation

Not yet available

...