Unlocking the Potential of Event-based Vision with Loihi

Organizer:

Garrick Orchard

Time:

10:15am–11:15am PST, February 10, 2021

Abstract:

This session will highlight recent works that process live streaming data from an event-based vision sensor in real time on Loihi. The works demonstrate the value Loihi adds to an event-based vision processing pipeline, whether through low power consumption, low latency for control, or the ability to learn online and on-chip, thereby helping to unlock the full potential of event-based vision sensors.

Through the talks, participants will get a feel for both the strengths and weaknesses of using Loihi for visual processing tasks.

Talks:

Online few-shot gesture learning with an event-based sensor on Loihi
Emre Neftci
Neuromorphic Machine Intelligence Lab, UC Irvine

A PID controller for UAVs using an event-based sensor and Loihi
Yulia Sandamirskaya
Intel Labs

Neuromorphic Space Imaging: Tracking Satellites and Space Junk
Gregory Cohen
International Centre for Neuromorphic Engineering, Western Sydney University

Session Recording:

Unlocking the potential of event-based vision with Loihi

Online Content:

https://intel-ncl.atlassian.net/wiki/spaces/INRC/pages/1080328193/Tutorials+and+Related+Presentations?atlOrigin=eyJpIjoiZjNkNTg4ODFiOGVkNDJiOTkzODkzNzg3ZjQ1ZDU5ODUiLCJwIjoiYyJ9

Please use the comment section on this page to ask questions or comment about this specific presentation.