University of Minnesota
Software Engineering Center

Michael Whalen

Director of the Software Engineering Center
Director of Graduate Studies
Phone Number: 612-624-5130
Office Location: Kenneth H Keller Hall, Room 6-254
Biography: 

Dr. Michael Whalen is the Program Director at the University of Minnesota Software Engineering Center. He has 15 years of experience in software development and analysis, including 10 years of experience in Model-Based Development and safety-critical systems. Dr. Whalen has developed simulation, translation, testing, and formal analysis tools for Model-Based Development languages including Simulink, Stateflow, Lustre, and RSML-e. He has led successful formal verification projects on large industrial avionics models, including displays (Rockwell Collins ADGS-2100 Window Manager), redundancy management and control allocation (AFRL CerTA FCS program), and autoland (AFRL CerTA CPD program). Dr. Whalen was the lead developer of the Rockwell Collins Gryphon tool suite, which supports compilation, test-case generation, and formal analysis of Simulink/Stateflow models. This tool suite has been used in both academic research and industrial verification projects.

Dr. Whalen is a frequent speaker and author on the use of formal methods, with 10 invited presentations, 5 journal publications, 1 book chapter, 19 conference papers, and 7 contractor and technical reports published. His PhD dissertation used higher-order abstract syntax as the basis for a provably correct code generation tool from the RSML-e specification language into a subset of C. His interests include novel uses of model checking, test generation, theorem proving, and random-search simulation tools to reduce the cost and manual effort required for systems and software validation of critical systems.

Recent Publications

Resolute: An Assurance Case Language for Architecture Models

Arguments about the safety, security, and correctness of a complex system are often made in the form of an assurance case. An assurance case is a structured argument, often represented in a graphical notation, that presents and supports claims about a system's behavior. The argument may combine different kinds of evidence to justify its top-level claim. While assurance cases deliver some level of guarantee of a system's correctness, they lack the rigor that proofs from formal methods typically provide.
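To make the structure concrete, the following is a minimal sketch of an assurance case as a tree of claims, each supported by sub-claims and/or direct evidence. The class, field, and evidence names here are invented for illustration and do not reflect Resolute's actual syntax or semantics.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Claim:
    statement: str
    evidence: List[str] = field(default_factory=list)   # e.g., test reports, proofs
    subclaims: List["Claim"] = field(default_factory=list)

    def supported(self) -> bool:
        # A claim is supported if it cites direct evidence, or if every
        # sub-claim in its argument is itself supported.
        if self.evidence:
            return True
        return bool(self.subclaims) and all(c.supported() for c in self.subclaims)

# Hypothetical example: a top-level safety claim decomposed into two
# leaf claims, each backed by a piece of evidence.
top = Claim(
    "The system is acceptably safe",
    subclaims=[
        Claim("Hazard H1 is mitigated", evidence=["fault-tree analysis FTA-1"]),
        Claim("Software satisfies its requirements",
              evidence=["model-checking proof of REQ-3"]),
    ],
)
print(top.supported())  # True: every leaf claim cites evidence
```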

Automated Oracle Data Selection Support

The choice of test oracle—the artifact that determines whether an application under test executes correctly—can significantly impact the effectiveness of the testing process. However, despite the prevalence of tools that support test input selection, little work exists for supporting oracle creation. We propose a method of supporting test oracle creation that automatically selects the oracle data—the set of variables monitored during testing—for expected value test oracles.
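As a rough illustration of the idea (not the paper's actual algorithm), the sketch below models an expected-value oracle over a set of named variables. The "oracle data" is the subset of variables the oracle actually compares against expected values; all variable names are hypothetical.

```python
def expected_value_oracle(observed: dict, expected: dict, oracle_data: set) -> bool:
    """Pass iff every monitored variable matches its expected value."""
    return all(observed[v] == expected[v] for v in oracle_data)

# Hypothetical post-execution snapshot of the system under test.
observed = {"output": 5, "internal_sum": 12, "mode": "NORMAL"}
expected = {"output": 5, "internal_sum": 13, "mode": "NORMAL"}

# An output-only oracle misses the discrepancy in internal state...
print(expected_value_oracle(observed, expected, {"output"}))                  # True
# ...while an oracle whose data includes internal_sum reveals the fault.
print(expected_value_oracle(observed, expected, {"output", "internal_sum"}))  # False
```

Selecting which internal variables to include in the oracle data is exactly the trade-off the proposed method automates: monitoring more variables catches more faults but raises the cost of defining expected values.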

Efficient Observability-based Test Generation by Dynamic Symbolic Execution

Structural coverage metrics have been widely used to measure test suite adequacy as well as to generate test cases. In previous investigations, we have found that the fault-finding effectiveness of tests satisfying structural coverage criteria is highly dependent on program syntax: even if the faulty code is exercised, its effect may not be observable at the output. To address this problem, observability-based coverage metrics have been defined.
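The following toy Python example, written only to illustrate the masking problem that motivates observability-based coverage, shows a seeded fault that is exercised by the tests yet invisible at the output. The function names and the fault are invented for this sketch.

```python
def faulty_component(x: int) -> int:
    # Seeded fault: should be x + 1, but computes x - 1.
    return x - 1

def system(x: int) -> int:
    v = faulty_component(x)   # the faulty expression is exercised for every input...
    return max(v, 0)          # ...but clamping masks it when both the faulty and
                              # correct values are non-positive

# Statement coverage of faulty_component is achieved by these inputs, yet the
# output is 0 in every case, exactly as the correct program would produce,
# so an output-only oracle never observes the fault.
for x in (-1, -3):
    print(system(x))  # prints 0 both times
```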
