Math Data Science

Title: Nonlocal Analysis Motivated by Machine Learning

Speaker: Jimmy Scott, University of Tennessee

Abstract: In a variety of situations, the processes used to train neural networks can be viewed as optimization problems constrained by a discretization of a partial differential equation. Meanwhile, integro-differential equations can be thought of as nonlocal relaxations that generalize local differential equations. This talk will focus on expanding the established theory for nonlocal equations in order to study broader classes of neural networks. To this end, we will present several nonlocal integro-differential equations and optimization problems motivated by artificial neural networks. Discussion topics will include building connections between neural networks and continuum models, as well as challenges concerning the well- or ill-posedness of the associated initial value and optimization problems. Asymptotic regimes will also be presented that (we hope) will help shed some light on the efficacy of existing neural networks, as well as serve as effective approximations in their own right.
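As a minimal illustration of the first point (a sketch of the standard residual-network observation, not a construction from the talk itself; all names and parameters below are illustrative), the forward pass of a residual network, x_{k+1} = x_k + h f(x_k, θ_k), is exactly a forward-Euler discretization of the ODE dx/dt = f(x(t), θ(t)):

```python
import numpy as np

def f(x, W, b):
    """An illustrative parameterized vector field: a single tanh layer."""
    return np.tanh(W @ x + b)

def resnet_forward(x0, Ws, bs, h=0.1):
    """Residual recursion x_{k+1} = x_k + h * f(x_k, W_k, b_k).

    Each layer is one forward-Euler step of size h for the ODE
    dx/dt = f(x(t), theta(t)); the layer index plays the role of time.
    """
    x = x0
    for W, b in zip(Ws, bs):
        x = x + h * f(x, W, b)  # one Euler step per layer
    return x

# Illustrative random weights for a 3-dimensional state and 50 layers.
rng = np.random.default_rng(0)
d, layers = 3, 50
Ws = [0.1 * rng.standard_normal((d, d)) for _ in range(layers)]
bs = [np.zeros(d) for _ in range(layers)]
x = resnet_forward(np.ones(d), Ws, bs)
```

Sending h → 0 while the number of layers grows gives the continuum (ODE/PDE) model that the training problem is then constrained by.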
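To make the second point concrete (again a sketch under standard assumptions, not material from the talk): a prototypical nonlocal relaxation of the 1-D Laplacian replaces u''(x) with an integral of differences over a finite interaction horizon δ, L_δ u(x) = (3/δ³) ∫_{|s|<δ} (u(x+s) − u(x)) ds, which recovers u'' as δ → 0:

```python
import numpy as np

def nonlocal_laplacian_1d(u, x0, delta, n=2000):
    """Midpoint-rule quadrature of L_delta u(x0) over [-delta, delta].

    The scaling 3/delta**3 is chosen so that, by Taylor expansion,
    L_delta u(x0) = u''(x0) + O(delta**2).
    """
    ds = 2 * delta / n
    s = -delta + (np.arange(n) + 0.5) * ds  # midpoint sample points
    c = 3.0 / delta**3
    return c * np.sum(u(x0 + s) - u(x0)) * ds

# For u = sin, the local limit is u''(x) = -sin(x).
val = nonlocal_laplacian_1d(np.sin, 1.0, delta=0.1)  # ≈ -sin(1) ≈ -0.84
```

Operators of this form are the simplest instances of the integro-differential equations the abstract refers to; keeping δ fixed gives a genuinely nonlocal model, while the δ → 0 regime is one example of the asymptotic limits mentioned above.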

Thursday, March 5, 2020 at 3:40pm to 4:55pm

Ayres Hall, 112
1403 Circle Drive, Knoxville, TN 37996



Contact Name

Vasileios Maroulas

