University of Minnesota
Software Engineering Center

Gregory Gay

Student/Research Assistant
Office Location: 
6-248 Keller Hall
Education: 
Ph.D. Computer Science, University of Minnesota, 2015.
Advisor: Dr. Mats Heimdahl.
Thesis title: Steering Model-Based Oracles to Admit Real Program Behaviors.

M.S. Computer Science, West Virginia University, 2010.
Advisor: Dr. Tim Menzies.
Thesis title: Robust Optimization of Non-Linear Requirements Models.

B.S. Computer Science, West Virginia University, 2008.
Biography: 

Greg is an assistant professor of Computer Science & Engineering at the University of South Carolina. He was previously a PhD student and research assistant at the University of Minnesota, supported by an NSF Graduate Research Fellowship and working with the Critical Systems research group. He received his BS and MS in Computer Science from West Virginia University.

Greg has also interned at NASA's Ames Research Center and Independent Verification & Validation Center, and spent time as a visiting academic at the Chinese Academy of Sciences in Beijing.

Research: 

Greg's research is primarily in the areas of search-based software engineering and automated software testing and analysis, with an emphasis on the test oracle problem. His current focus is the construction of effective test oracles for real-time and safety-critical systems, including methods for selecting oracle data and making behavioral comparisons.

He takes a data-centric approach to research problems, working at the intersection of search, optimization, data mining, and artificial intelligence. He strives to harness the information content of software development artifacts to improve the efficiency and quality of the testing process, and to automate tasks in order to lessen the burden on human testers.

His past research has largely focused on the application of search, optimization, and information retrieval techniques to various software engineering tasks, including model optimization, requirements engineering, effort estimation, defect detection, and the traceability between source code and defect reports.

Recent Publications

Steering Model-Based Oracles to Admit Real Program Behaviors

The oracle, an arbiter of correctness of the system under test (SUT), is a major component of the testing process. Specifying oracles is challenging for real-time embedded systems, where small changes in time or sensor inputs may cause large differences in behavior. Behavioral models of such systems, often built for analysis and simulation, are appealing for reuse as oracles. However, these models typically provide an idealized view of the system.

Moving the Goalposts: Coverage Satisfaction is Not Enough

Structural coverage criteria have been proposed to measure the adequacy of testing efforts. Indeed, in some domains (e.g., critical systems), structural coverage criteria must be satisfied to achieve certification. The advent of powerful search-based test generation tools has given us the ability to generate test inputs to satisfy these structural coverage criteria. While tempting, recent empirical evidence indicates these tools should be used with caution, as…

Observable Modified Condition/Decision Coverage

In many critical systems domains, test suite adequacy is currently measured using structural coverage metrics over the source code. Of particular interest is the modified condition/decision coverage (MC/DC) criterion required for, e.g., critical avionics systems. In previous investigations we have found that the efficacy of such test suites is highly dependent on the structure of the program under test and the choice of variables monitored by the oracle.
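As a brief aside, the MC/DC criterion mentioned above requires that every atomic condition in a decision be shown to independently affect that decision's outcome. A minimal sketch of that idea in Python (the decision function and names here are invented for illustration, not taken from the paper): for a toy decision `(a or b) and c`, it enumerates "independence pairs", pairs of test inputs that differ only in one condition and flip the decision.

```python
from itertools import product

def decision(a, b, c):
    # Hypothetical decision, used only to illustrate the criterion.
    return (a or b) and c

COND_NAMES = ["a", "b", "c"]
ALL_TESTS = list(product([False, True], repeat=3))

def independence_pairs(cond_index):
    """Pairs of tests differing only in one condition that flip the decision.

    MC/DC requires at least one such pair per condition, demonstrating
    that the condition independently affects the decision's outcome.
    """
    pairs = []
    for t1 in ALL_TESTS:
        t2 = list(t1)
        t2[cond_index] = not t2[cond_index]  # toggle only this condition
        t2 = tuple(t2)
        if decision(*t1) != decision(*t2):
            pairs.append((t1, t2))
    return pairs

if __name__ == "__main__":
    for i, name in enumerate(COND_NAMES):
        print(f"{name}: {len(independence_pairs(i))} independence pair(s)")
```

For this decision, flipping `a` matters only when `b` is false and `c` is true, while flipping `c` matters whenever `(a or b)` holds, which hints at the paper's observation that program structure strongly shapes what MC/DC actually exercises.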
