
Formal Analysis of Autonomous Systems: A Runtime Assurance Perspective

Date and time: 25 October 2023, 13:00 – 14:00 CEST (UTC +2)
Speaker: Hazem Torfah, Chalmers University of Technology
Title: Formal Analysis of Autonomous Systems: A Runtime Assurance Perspective

Where: Digital Futures hub, Osquars Backe 5, floor 2 at KTH main campus
Directions: https://www.digitalfutures.kth.se/contact/how-to-get-here/

OR

Zoom: https://kth-se.zoom.us/j/69560887455
Meeting ID: 695 6088 7455
Password: 755440

Moderator: Jana Tumova, tumova@kth.se
Administrator: Alva Kosasih, kosasih@kth.se

Abstract:
In recent years, there has been an increase in autonomous systems operating in complex environments and relying on machine learning (ML) components to perform challenging decision-making tasks. However, ML models are brittle and susceptible to failures that can compromise the safety of autonomous systems. There is a pressing need for a systematic methodology that can identify, at design time, the conditions under which components in the autonomy pipeline can fail, and that can detect such failures at runtime.

This talk proposes a new safety assurance approach for autonomous systems based on runtime verification. Runtime verification is an analysis technique that extracts information from a system at runtime and evaluates this information to determine whether an execution of the system satisfies or violates a given property. Specifically, we will report on recent results, presenting runtime verification methods for capturing operational design domains, i.e., the conditions under which the system or a component thereof is designed to operate safely, and for evaluating the safety of a system in noisy and unpredictable environments.
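
To make the runtime-verification idea in the abstract concrete, here is a minimal, illustrative sketch in Python. It is not the speaker's method or the RTLola framework; the property, the DEADLINE parameter, and the trace format are all hypothetical. A monitor observes a stream of system outputs and flags an execution that violates a simple bounded-response property ("whenever an obstacle is detected, a brake command must follow within a fixed number of steps").

# Minimal runtime-monitor sketch (illustrative only; not RTLola or the speaker's method).
# Property checked: whenever "obstacle" is observed, "brake" must be observed
# within DEADLINE monitoring steps.

DEADLINE = 3  # hypothetical response deadline, in monitoring steps

def monitor(trace):
    """Return the index of the first violation, or None if the trace satisfies the property.

    `trace` is a sequence of per-cycle observations, e.g.
    {"obstacle": True, "brake": False}.
    """
    pending_since = None  # step at which an unanswered obstacle was first seen
    for step, obs in enumerate(trace):
        if obs.get("brake"):
            pending_since = None        # obligation discharged by braking
        elif obs.get("obstacle") and pending_since is None:
            pending_since = step        # new obligation to brake
        if pending_since is not None and step - pending_since >= DEADLINE:
            return step                 # deadline missed: property violated
    return None

# Example: obstacle at step 1, no brake until step 5 -> violation reported at step 4.
trace = [{"obstacle": False, "brake": False},
         {"obstacle": True,  "brake": False},
         {"obstacle": False, "brake": False},
         {"obstacle": False, "brake": False},
         {"obstacle": False, "brake": False},
         {"obstacle": False, "brake": True}]
print(monitor(trace))

In practice, such monitors are synthesized from formal specifications and run alongside the system, which is the setting the talk addresses.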

Bio: Hazem Torfah is an Assistant Professor at Chalmers University of Technology in Gothenburg, Sweden. He leads the lab on Safe and Trustworthy Autonomous Reasoning, supported by the Wallenberg AI, Autonomous Systems and Software Program (WASP). Previously, he was a postdoctoral researcher in the EECS Department at UC Berkeley. He received his doctoral degree in Computer Science in December 2019 from Saarland University, Germany. His research interests are the formal specification, verification, and synthesis of cyber-physical systems. He is one of the developers of the RTLola monitoring framework, which has been integrated into the ARTIS fleet of unmanned aerial vehicles in close collaboration with the German Aerospace Center (DLR). Hazem’s current focus is the development of quantitative methods for the explainability and runtime assurance of AI-based autonomous systems.
