University of Minnesota
Software Engineering Center

Gregory Gay
Student/Research Assistant
Office Location: 
6-248 Keller Hall
Education: 
Ph.D. Computer Science, University of Minnesota, 2015.
Advisor: Dr. Mats Heimdahl.
Thesis title: Steering Model-Based Oracles to Admit Real Program Behaviors.

M.S. Computer Science, West Virginia University, 2010.
Advisor: Dr. Tim Menzies.
Thesis title: Robust Optimization of Non-Linear Requirements Models.

B.S. Computer Science, West Virginia University, 2008.
Biography: 

Greg is an assistant professor of Computer Science & Engineering at the University of South Carolina. He was previously a Ph.D. student and research assistant at the University of Minnesota, supported by an NSF Graduate Research Fellowship and working with the Critical Systems research group. He received his B.S. and M.S. in Computer Science from West Virginia University.

Greg has also interned at NASA's Ames Research Center and Independent Verification & Validation Center, and spent time as a visiting academic at the Chinese Academy of Sciences in Beijing.

Research: 

Greg's research is primarily in the areas of search-based software engineering and automated software testing and analysis, with an emphasis on the test oracle problem. His current focus is the construction of effective test oracles for real-time and safety-critical systems, including methods for selecting the oracle data and for comparing expected and observed behavior.

He takes a data-centric approach to research problems, working at the intersection of search, optimization, data mining, and artificial intelligence. He strives to harness the information content of software development artifacts to improve the efficiency and quality of the testing process, and to automate tasks in order to lessen the burden on human testers.

His past research has largely focused on applying search, optimization, and information retrieval techniques to software engineering tasks such as model optimization, requirements engineering, effort estimation, defect detection, and tracing defect reports to source code.

Recent Publications

Automated Oracle Data Selection Support

The choice of test oracle—the artifact that determines whether an application under test executes correctly—can significantly impact the effectiveness of the testing process. However, despite the prevalence of tools that support test input selection, little work exists for supporting oracle creation. We propose a method of supporting test oracle creation that automatically selects the oracle data—the set of variables monitored during testing—for expected value test oracles. This approach is based on the use of…
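
As a quick illustration of the concept (a hypothetical sketch, not the method proposed in the paper, whose summary above is truncated), the Python fragment below shows an expected-value oracle that renders a verdict using only the selected oracle data, ignoring the rest of the program state; all variable names and values are invented.

# Hypothetical sketch: an expected-value test oracle that checks only the
# "oracle data" -- the subset of program variables selected for monitoring.
def expected_value_oracle(oracle_data, expected, observed):
    """Pass iff every monitored variable matches its expected value."""
    return all(observed[var] == expected[var] for var in oracle_data)

# Monitoring {altitude, throttle} masks the fault in "mode"; monitoring
# {altitude, mode} exposes it -- hence the choice of oracle data matters.
expected = {"altitude": 1000, "throttle": 0.75, "mode": "CRUISE"}
observed = {"altitude": 1000, "throttle": 0.75, "mode": "CLIMB"}
print(expected_value_oracle(["altitude", "throttle"], expected, observed))  # True
print(expected_value_oracle(["altitude", "mode"], expected, observed))      # False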

Automated Steering of Model-Based Test Oracles to Admit Real Program Behaviors

The test oracle—a judge of the correctness of the system under test (SUT)—is a major component of the testing process. Specifying test oracles is challenging for some domains, such as real-time embedded systems, where small changes in timing or sensory input may cause large behavioral differences. Models of such systems, often built for analysis and simulation, are appealing for reuse as test oracles. These models, however, typically represent an idealized system, abstracting away certain issues such as non-deterministic timing behavior and sensor…

Improving the Accuracy of Oracle Verdicts Through Automated Model Steering

The oracle—a judge of the correctness of the system under test (SUT)—is a major component of the testing process. Specifying test oracles is challenging for some domains, such as real-time embedded systems, where small changes in timing or sensory input may cause large behavioral differences. Models of such systems, often built for analysis and simulation, are appealing for reuse as oracles. These models, however, typically represent an idealized system, abstracting away certain issues such as non-deterministic timing behavior and sensor noise.
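
To make the steering idea concrete, here is a deliberately simplified Python sketch (invented for illustration, not the algorithm from the paper): before reporting a failure, the oracle searches for a permitted perturbation of the model's input, e.g., sensor noise within a stated tolerance, under which the idealized model reproduces the SUT's observed output.

# Hypothetical sketch of oracle steering; the model, tolerance, and search
# strategy are all invented for the example.
def model(sensor_reading):
    # Idealized oracle model: raise an alarm when the reading exceeds 100.
    return "ALARM" if sensor_reading > 100 else "OK"

def steered_verdict(sut_output, sensor_reading, tolerance=2.0, step=0.5):
    """Pass if some permitted perturbation reconciles model and SUT."""
    delta = -tolerance
    while delta <= tolerance:
        if model(sensor_reading + delta) == sut_output:
            return "pass"  # a legal deviation explains the SUT's behavior
        delta += step
    return "fail"  # no perturbation within tolerance matches

# A reading of 99.5 with up to 2.0 of sensor noise may legitimately alarm,
# so steering accepts the behavior; a reading of 90.0 cannot, so it fails.
print(steered_verdict("ALARM", 99.5))  # pass
print(steered_verdict("ALARM", 90.0))  # fail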
