University of Minnesota
Software Engineering Center

Gregory Gay
Student/Research Assistant
Office Location: 
6-248 Keller Hall
Education: 
Ph.D. Computer Science, University of Minnesota, 2015.
Advisor: Dr. Mats Heimdahl.
Thesis title: Steering Model-Based Oracles to Admit Real Program Behaviors.

M.S. Computer Science, West Virginia University, 2010.
Advisor: Dr. Tim Menzies.
Thesis title: Robust Optimization of Non-Linear Requirements Models.

B.S. Computer Science, West Virginia University, 2008.
Biography: 

Greg is an assistant professor of Computer Science & Engineering at the University of South Carolina. He was previously a Ph.D. student and research assistant at the University of Minnesota under an NSF Graduate Research Fellowship, working with the Critical Systems research group. He received his B.S. and M.S. in Computer Science from West Virginia University.

Greg has also interned at NASA's Ames Research Center and Independent Verification & Validation Center, and spent time as a visiting academic at the Chinese Academy of Sciences in Beijing.

Research: 

Greg's research is primarily in the areas of search-based software engineering and automated software testing and analysis, with an emphasis on the test oracle problem. His current focus is the construction of effective test oracles for real-time and safety-critical systems, including methods for selecting oracle data and making comparisons.
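
As a rough illustration of the "comparison" side of the oracle problem, the sketch below checks an observed execution trace against expected values, using a tolerance so that small numeric deviations in real-time code do not trigger spurious failures. This is a minimal sketch only; the function and data layout (per-step dictionaries of variable values) are hypothetical and not drawn from Greg's work.

def check_trace(expected, actual, tolerance=1e-6):
    """Compare an expected output trace against an observed one.

    expected, actual: lists of {variable_name: value} dicts, one per test step.
    Returns a list of (step, variable, expected_value, actual_value) mismatches.
    """
    failures = []
    for step, (exp_vars, act_vars) in enumerate(zip(expected, actual)):
        for name, exp_val in exp_vars.items():
            act_val = act_vars.get(name)
            # A tolerance admits small numeric deviations (e.g., floating-point
            # drift in control code) rather than demanding exact equality.
            if act_val is None or abs(exp_val - act_val) > tolerance:
                failures.append((step, name, exp_val, act_val))
    return failures

For example, check_trace([{"altitude": 100.0}], [{"altitude": 100.0000001}]) returns an empty list, since the deviation falls within the tolerance.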

He takes a data-centric approach to research problems, working at the intersection of search, optimization, data mining, and artificial intelligence. He strives to harness the information content of software development artifacts to improve the efficiency and quality of the testing process, and to automate tasks in order to lessen the burden on human testers.

His past research has largely focused on the application of search, optimization, and information retrieval techniques to various software engineering tasks, including model optimization, requirements engineering, effort estimation, defect detection, and traceability between source code and defect reports.

Recent Publications

Automatically Finding the Control Variables for Complex System Behavior

Testing large-scale systems is expensive in terms of both time and money. Running simulations early in the process is a proven method of finding the design faults likely to lead to critical system failures, but determining the exact cause of those faults is still time-consuming and requires access to a limited pool of domain experts. An automated method is needed that explores the large space of variable combinations and isolates likely fault points.
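
As a hedged illustration only (the paper's actual technique may differ), the sketch below shows one simple automated strategy: randomly sample variable settings, run a user-supplied simulator, and report which settings are over-represented in failing runs. The simulate callable and all names here are hypothetical.

import random
from collections import Counter

def rank_fault_candidates(variables, simulate, trials=1000):
    """Randomly sample configurations and count which variable settings
    appear most often in failing simulated runs.

    variables: dict mapping variable name -> list of candidate values.
    simulate:  callable(config) -> True if the simulated run fails.
    """
    counts = Counter()
    failures = 0
    for _ in range(trials):
        config = {name: random.choice(values) for name, values in variables.items()}
        if simulate(config):
            failures += 1
            counts.update(config.items())
    if failures == 0:
        return []
    # Settings present in a large fraction of failing runs are candidate
    # fault points worth a domain expert's limited attention.
    return [(setting, n / failures) for setting, n in counts.most_common()]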

When to Use Data from Other Projects for Effort Estimation

Collecting the data required for quality prediction within a development team is time-consuming and expensive. An alternative is to make predictions using data imported from other projects, or even other companies. We show that, with relevancy filtering, imported data performs as well as local data, while without filtering it performs worse. We therefore recommend relevancy filtering whenever generating estimates from another project's data.
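
One common form of relevancy filtering, not necessarily the exact method used in this paper, is nearest-neighbor selection: keep only the imported rows that lie close to the local data. A minimal sketch, assuming a hypothetical row format of (feature_vector, effort) pairs:

import math

def relevancy_filter(local_rows, imported_rows, k=5):
    """For each local row, keep its k nearest imported rows (by Euclidean
    distance over the feature vectors); return the union of those rows."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    kept = set()
    for local_feats, _ in local_rows:
        ranked = sorted(range(len(imported_rows)),
                        key=lambda i: dist(imported_rows[i][0], local_feats))
        kept.update(ranked[:k])
    return [imported_rows[i] for i in sorted(kept)]

An estimator trained on the filtered rows then sees only cross-project data that resembles the local context, which is the intuition behind the recommendation above.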

A Baseline Method For Search-Based Software Engineering

Background: Search-based software engineering (SBSE) uses a variety of techniques, such as evolutionary algorithms and meta-heuristic searches, but lacks a standard baseline method. Aims: The KEYS2 algorithm meets the criteria for such a baseline: it is fast, stable, easy to understand, and produces results competitive with standard techniques.
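
The abstract names KEYS2, whose details are not given here; as a stand-in, the sketch below implements the simplest conventional SBSE baseline, plain random search, against which more sophisticated methods are usually compared. All names are illustrative.

import random

def random_search(sample, score, budget=1000):
    """Minimal random-search baseline: draw `budget` random candidates
    and return the best-scoring one (higher is better)."""
    best = sample()
    best_score = score(best)
    for _ in range(budget - 1):
        cand = sample()
        s = score(cand)
        if s > best_score:
            best, best_score = cand, s
    return best, best_score

# Example: maximize a toy fitness over random 20-bit strings.
sample = lambda: [random.randint(0, 1) for _ in range(20)]
score = sum  # count of 1s stands in for a real SBSE fitness function
print(random_search(sample, score))

A baseline like this is fast and stable by construction, which is exactly the yardstick the paper argues any proposed baseline must meet.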
