ACM e-Energy Keynotes
Vijay Janapa Reddi
June 29, 2022 - 3:00pm CEST / 9:00am EDT
The Future of Smart Cities is Tiny and Bright
Abstract: Tiny machine learning (TinyML) is a rapidly expanding discipline that combines ML techniques with low-cost embedded hardware. TinyML enables on-device analysis of sensor data (vision, audio, IMU, etc.) while consuming very little power. Processing data near the sensor enables a wide range of novel, always-on ML use cases that save bandwidth and energy and reduce latency, while enhancing responsiveness and privacy. This talk presents the TinyML vision and highlights some of the fascinating applications that TinyML is enabling in energy, water, transportation, and public health and safety. Despite the excitement, there are several hardware and software obstacles, along with data privacy concerns, that we must overcome. On-device ML constraints such as limited memory and storage, communication barriers, extreme hardware heterogeneity, software fragmentation, and a lack of relevant, commercially viable large-scale TinyML datasets present a significant barrier to unlocking the full potential of TinyML for a more innovative and more sustainable ecosystem. The talk therefore also discusses the prospects for addressing these concerns and ushering in a new age for smart cities, one that rests on the bright future of tiny machine learning devices.
Biography: Vijay Janapa Reddi is an Associate Professor at Harvard University, as well as VP and a founding member of MLCommons (mlcommons.org), a nonprofit organization dedicated to accelerating machine learning (ML) innovation for all. He also serves on the MLCommons board of directors and is Co-Chair of the MLCommons Research organization. He was the driving force behind the MLPerf Inference ML benchmark for datacenter, edge, mobile, and IoT systems. Prior to joining Harvard, he was an Associate Professor in the Electrical and Computer Engineering department at The University of Texas at Austin. His research focuses on machine learning, computer architecture, and runtime software, with an emphasis on computing systems for tiny Internet of Things devices, as well as mobile and edge computing. Dr. Janapa Reddi has received numerous honors and awards, including the Gilbreth Lecturer Honor (2016) from the National Academy of Engineering (NAE), the IEEE TCCA Young Computer Architect Award (2016), the Intel Early Career Award (2013), Google Faculty Research Awards (2012, 2013, 2015, 2017, 2020), and Best Paper Awards at the 2020 Design Automation Conference (DAC), the 2005 International Symposium on Microarchitecture (MICRO), and the 2009 International Symposium on High-Performance Computer Architecture (HPCA). He is a member of the MICRO and HPCA Halls of Fame (inducted in 2018 and 2019, respectively). He is deeply committed to widening access to applied machine learning, to diversity in STEM, and to the use of AI for social benefit. He created the Tiny Machine Learning (TinyML) series on edX, a massive open online course (MOOC) that combines embedded systems with machine learning and that thousands of learners worldwide can access and audit for free. He also led the Austin Hands-on Computer Science (HaCS) program, which was implemented in the Austin Independent School District for K-12 CS instruction. Dr. Janapa Reddi holds a Ph.D. in computer science from Harvard University, an M.S. in computer science from the University of Colorado at Boulder, and a B.S. in computer science from Santa Clara University.
Carole-Jean Wu, Meta AI
June 30, 2022 - 3:00pm CEST / 9:00am EDT
Sustainable AI: Environmental Implications, Challenges and Opportunities
Abstract: The past decade has witnessed an orders-of-magnitude increase in the amount of compute used for AI. Modern natural language processing models are fueled by over a trillion parameters, while the memory needs of deep learning recommendation and ranking models have grown from hundreds of gigabytes to the terabyte scale. We will explore the environmental implications of this super-linear growth trend for AI from a holistic perspective, spanning data, algorithms, and system hardware. I will discuss the carbon footprint of AI computing by examining the model development cycle across industry-scale use cases while also considering the life cycle of system hardware. The talk will capture both the operational and the manufacturing carbon footprint of computing. I will present an end-to-end analysis of what and how hardware-software design and at-scale optimization can help reduce the overall carbon footprint of AI and computing. Based on industry experience and lessons learned, I will share the key challenges across the many dimensions of AI. The talk will conclude with important development and research directions for advancing the field of AI in an environmentally responsible and sustainable manner.
Biography: Carole-Jean Wu is currently a Research Scientist Manager at Meta. Her research sits at the intersection of computer architecture and machine learning, with an emphasis on developing energy- and memory-efficient systems and microarchitectures, optimizing systems for machine learning execution at scale, and designing learning-based approaches for system design and optimization. She is passionate about pathfinding and tackling system challenges to enable efficient, responsible AI execution. Carole-Jean chairs the MLPerf Recommendation Benchmark Advisory Board, co-chaired MLPerf Inference, and serves on the MLCommons Board as a Director. Prior to Meta, she was a tenured Associate Professor at Arizona State University. Carole-Jean received her M.A. and Ph.D. from Princeton and her B.Sc. from Cornell.
Carole-Jean is the technical program co-chair of the 2022 Conference on Machine Learning and Systems (MLSys) and was the program chair of the 2018 IEEE International Symposium on Workload Characterization (IISWC). She has served on the technical program committees of ACM/IEEE ISCA, MICRO, and HPCA, and on the editorial boards of IEEE Computer Architecture Letters and IEEE Micro. She is the recipient of the NSF CAREER Award, an Honorable Mention for the CRA Anita Borg Early Career Award, the IEEE Young Engineer of the Year Award, the Science Foundation Arizona Bisgrove Early Career Scholarship, and the Intel PhD Fellowship, as well as a number of IEEE Micro Top Picks and IEEE Best Paper Awards.
Giorgio Cortiana, E.ON Quantum Computing
July 1, 2022 - 3:00pm CEST / 9:00am EDT
Quantum Computing: An Outlook on Future Energy Systems
Abstract: The power of quantum computing has long been advertised: with their unprecedented computational approach, quantum computers promise to tackle new kinds of problems and to solve certain existing challenges faster and more efficiently. Quantum computing (QC) can complement and enhance state-of-the-art machine learning, advanced simulation techniques, and optimization approaches to better address the increasing complexity and dynamics of decentralized, decarbonized energy systems. Through selected examples, we will explore the scope of the opportunity for quantum computing to shape the energy world of tomorrow and advance the green energy transition.
Biography: Giorgio Cortiana holds a PhD in particle physics and has several years of experience at the forefront of research and data science (Fermi National Accelerator Laboratory, US; CERN, Switzerland). At E.ON Digital Technology, Giorgio heads the Centre of Excellence Analytics – Energy Intelligence, part of E.ON’s Global Data and IoT department. He leads several data science projects on intelligent asset management, smart power and heating grids, and energy markets. Together with his team, he applies state-of-the-art machine-learning techniques and quantum technologies to support the company’s sustainable digital transformation, as well as the green energy transition.