
Hi Everyone,

I think many of you might find the EECS Tech Talk tomorrow (Tuesday) at 11am interesting. It is from one of our new faculty at the intersection of security and ML. Details below.

-Alan

From: Tina Batten <tina.batten@oregonstate.edu>
Sent: Wednesday, November 17, 2021 8:15 AM
To: Fern, Alan Paul <Alan.Fern@oregonstate.edu>
Subject: Tech Talk Tuesday: Building Secure and Reliable Deep Learning Systems from a Systems Security Perspective

TECH TALK TUESDAY

Building Secure and Reliable Deep Learning Systems from a Systems Security Perspective

Sanghyun Hong
Assistant Professor, Computer Science, Oregon State University

Tuesday, November 23, 2021
Talk: 11:00-11:30 a.m. PST | Q&A: 11:30-11:45 a.m.
Zoom link: https://beav.es/tech-talk
Info: https://eecs.oregonstate.edu/tech-talk-tuesday

Abstract

How can we build secure and reliable deep learning systems, e.g., autonomous cars or AI-assisted robotic surgery, for tomorrow? We cannot answer this question without understanding the worst-case behaviors of deep neural networks (DNNs), the core component of those systems. Recent work has studied such worst-case behaviors, e.g., mispredictions caused by adversarial examples or models altered by data poisoning. However, most prior work narrowly treats DNNs as an isolated mathematical abstraction and overlooks the holistic picture, e.g., leaving out threats posed by hardware-level attacks. In this talk, I will discuss my work studying the computational properties of DNNs from a systems security perspective, which has exposed critical security threats and steered industrial practice.
First, I will present my work exposing a false sense of security: DNNs are not resilient to parameter perturbations. An adversary can inflict an accuracy drop of up to 100% with a single bit-flip in a model's memory representation. Second, I will show how brittle the computational savings of efficient deep learning techniques are in adversarial settings: by adding human-imperceptible input perturbations, an attacker can completely offset a multi-exit network's computational savings on an input. Third, I will show how privacy-protection mechanisms offered without this holistic picture can expose millions of users to serious privacy threats; these mechanisms leave no room for an arms race and eventually hand the upper hand to the adversary. Finally, I will conclude by discussing how these results have opened up new research directions and steered industrial practice.

Speaker Bio

Sanghyun Hong is an Assistant Professor of Computer Science at Oregon State University. His research lies at the intersection of computer security, privacy, and machine learning. His current focus is studying the computational properties of DNNs from a systems security perspective. He also works on identifying distinct internal behaviors of DNNs, such as network confusion or gradient-level disparity, whose quantification has led to defenses against backdooring and data poisoning. You can find more about Sanghyun at https://sanghyun-hong.com.

Copyright (c) 2021 Oregon State University, All rights reserved.
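An aside for anyone curious why a single bit-flip can be so devastating: flipping a high exponent bit of an IEEE-754 float32 weight turns a small value into an astronomically large one, which can wreck a network's predictions. A minimal sketch of the effect (my illustration, not material from the talk):

```python
import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit in the float32 representation of `value`."""
    # Reinterpret the float32 as its raw 32-bit pattern...
    (bits,) = struct.unpack("<I", struct.pack("<f", value))
    # ...flip the requested bit, and reinterpret back as a float32.
    (out,) = struct.unpack("<f", struct.pack("<I", bits ^ (1 << bit)))
    return out

# Flipping the most significant exponent bit (bit 30) of a modest
# weight value 0.5 yields 2**127, about 1.7e38.
w = flip_bit(0.5, 30)
print(w)  # 1.7014118346046923e+38
```

A weight of this magnitude propagates through subsequent layers and saturates the network's outputs, which is the intuition behind the up-to-100% accuracy drop from one flip.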
School of Electrical Engineering and Computer Science
Oregon State University
1148 Kelley Engineering Center
110 SW Park Terrace
Corvallis, OR 97331-5501
Phone: (541) 737-3617 | eecs.oregonstate.edu