INRC Forum Nov 15: Maryam Parsa and Lava Open-Door Q&A
Join us Tuesday, November 15, 2022 @ 9:00-10:30am PST / 18:00-19:30 CET for another great INRC Forum, this time featuring an update on optimization at Intel and George Mason University!
Agenda
Intel Labs introduction and update on lava-optimization.
Research Talk by INRC member Maryam Parsa of George Mason University.
Sumedh Risbud hosts Lava Open-Door Q&A. Ask questions and get feedback from Lava developers.
Eventifying and Applying Bayesian Optimization in Lava
Abstract: Despite the monumental success of deep learning over the past few decades, researchers have yet to find a solution that overcomes the inherent fragility of machine learning models when facing adversarial attacks and stochastic environments. A variety of online, continual learning methods have been proposed to address these limitations, with varying levels of success and cohesion. We introduce the first phase of the Eventified Bayesian Optimization (EBO) project implemented in Lava-Optimization, which aims to provide the neuromorphic research community with a generalized framework for eventified hyper-parameter optimization and lifelong continual learning applications. Here, we present an outline of the project over the next few years and highlight the completed first phase, which implements Bayesian optimization in Lava.
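For context on the core loop the talk builds on, below is a minimal, generic Bayesian optimization sketch: a Gaussian-process surrogate is fit to the evaluations so far, and a confidence-bound acquisition rule picks the next hyper-parameter to try. It uses plain NumPy and scikit-learn and is not the Lava / lava-optimization API; the objective noisy_objective is a hypothetical stand-in for an expensive tuning target.

```python
# Generic Bayesian optimization sketch (NumPy + scikit-learn).
# NOT the Lava / lava-optimization API; it only illustrates the
# surrogate-model + acquisition-function loop that EBO eventifies.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def noisy_objective(x):
    """Hypothetical stand-in for an expensive hyper-parameter evaluation."""
    return np.sin(3 * x) + 0.1 * x**2 + 0.05 * rng.normal()

# Search space: one hyper-parameter on [0, 5], discretized for simplicity.
candidates = np.linspace(0.0, 5.0, 500).reshape(-1, 1)

# Seed the loop with a few random evaluations.
X = rng.uniform(0.0, 5.0, size=(3, 1))
y = np.array([noisy_objective(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(15):
    gp.fit(X, y)                                     # surrogate model of f
    mu, sigma = gp.predict(candidates, return_std=True)
    lcb = mu - 2.0 * sigma                           # lower confidence bound (minimizing)
    x_next = candidates[np.argmin(lcb)]              # most promising candidate
    y_next = noisy_objective(x_next[0])              # expensive evaluation
    X = np.vstack([X, x_next.reshape(1, -1)])
    y = np.append(y, y_next)

best = X[np.argmin(y)]
print(f"best x = {best[0]:.3f}, best f(x) = {y.min():.3f}")
```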
Bio: Maryam Parsa is an Assistant Professor in the Electrical and Computer Engineering (ECE) department at George Mason University (GMU). Prior to joining GMU, she was a Postdoctoral Researcher at Oak Ridge National Laboratory (ORNL). She received her PhD in ECE from the Center for Brain-Inspired Computing (C-BRIC) at Purdue University in 2020. She was the recipient of several prestigious awards, including a four-year Intel/SRC PhD fellowship, the ORNL ASTRO fellowship, the Purdue University Ross fellowship, the TECHCON'18 student presenter award, and the ICONS'21 best paper award. She also worked at Intel Corporation in 2014 and 2016 as a research intern. Dr. Parsa's research interests are in the areas of neuromorphic computing, neural architecture search, and Bayesian optimization/learning across the full stack of materials, devices, circuits, algorithms, and applications.
How to join:
INRC members can find the meeting link on the INRC Fall Forum Schedule. If you are interested in becoming a member, join the INRC.