
Dear all,

Our next AI seminar is scheduled for February 23rd, 2-3 PM. It will be followed by a 30-minute Q&A session with the graduate students.

Location: KEC 1001
Zoom link: https://oregonstate.zoom.us/j/96491555190?pwd=azJHSXZ0TFQwTFFJdkZCWFhnTW04UT09

Gradient-free bilevel programming with applications in machine learning

Alireza Aghasi
Assistant Professor
Electrical and Computer Engineering
Oregon State University

Abstract: This talk is mainly about a scalable and simple way of addressing bilevel optimization problems when we only have access to noisy evaluations of the objective functions, and no access to their gradients or Hessians. The talk starts with some motivating applications of bilevel programming in machine learning. We then briefly overview zeroth-order methods for general optimization problems, focusing on a more recent technique called Gaussian smoothing, which estimates the first- and second-order derivatives of an objective function solely through random queries to the function. Finally, we propose a computational algorithm for general bilevel programs that does not require access to the Jacobian or Hessian of the underlying objectives. We present theoretical guarantees on the performance of the algorithm, including iteration-complexity (sample-complexity) results. To the best of our knowledge, the proposed algorithm is the first fully zeroth-order scheme in the literature for general bilevel programs.
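As a concrete illustration of the Gaussian smoothing idea mentioned in the abstract, the sketch below estimates a gradient using only function evaluations. This is a minimal, generic version of the standard zeroth-order estimator, not the speaker's algorithm; the function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def gs_gradient(f, x, mu=1e-3, n_samples=1000, rng=None):
    """Zeroth-order gradient estimate of f at x via Gaussian smoothing.

    Uses only (possibly noisy) function queries, no derivatives:
        grad f(x) ~= E_u[ (f(x + mu*u) - f(x)) / mu * u ],  u ~ N(0, I).
    (Illustrative sketch; not the algorithm from the talk.)
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    fx = f(x)  # base evaluation, reused for every random query
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape)        # random Gaussian direction
        g += (f(x + mu * u) - fx) / mu * u      # finite-difference along u
    return g / n_samples

# Example: f(x) = ||x||^2 has true gradient 2x.
f = lambda x: float(np.dot(x, x))
x0 = np.array([1.0, -2.0])
est = gs_gradient(f, x0, mu=1e-3, n_samples=5000, rng=0)
```

With enough samples, `est` concentrates around the true gradient `2*x0`; the smoothing radius `mu` trades bias (large `mu`) against noise amplification (small `mu`).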
Speaker Bio: Alireza Aghasi joined the School of Electrical Engineering and Computer Science at Oregon State University in Fall 2022. Between 2017 and 2022, he was an assistant professor in the Department of Data Science and Analytics at the School of Business, Georgia State University. Prior to that, he was a research scientist with the Department of Mathematical Sciences at the IBM T.J. Watson Research Center, Yorktown Heights. From 2015 to 2016 he was a postdoctoral associate with the computational imaging group at the Massachusetts Institute of Technology, and between 2012 and 2015 he was a postdoctoral research scientist with the compressed sensing group at Georgia Tech. His research focuses on optimization theory, probability theory, and statistical analysis, with applications to data science, artificial intelligence, and modern signal processing.

Please watch this space for future AI seminars: https://engineering.oregonstate.edu/EECS/research/AI

Rajesh Mangannavar, Graduate Student
Oregon State University

----
AI Seminar Important Reminders:
-> For graduate students in the AI program, attendance is strongly encouraged