University of Minnesota
Software Engineering Center

Automated Testing - Lessons Learned

Date of Event: 
Thursday, November 7, 1996 - 12:45pm

We had about 40 people attend last month's gathering. After introductions, the only major business to report was the result of our discussion concerning a topic for the January meeting: we decided to focus on software metrics. A special email list has been set up to allow interested members to work toward narrowing the topic. You can sign up for the list by emailing Gordon Dosher at: gordon.dosher@network.com, or you can send any suggestions to Jesse Freese at: Jesse_Freese@Fissure.com.

Test Automation - Lessons Learned
Presenters: Steve Gitelis, Mitchel Krause, and Carlos Vincens
(Presentation Summary - compliments of Gail Bertossi)

Each of the presenters provided insights from their unique perspectives:

Steve Gitelis - testing consultant
Mitchel Krause - BTREE sales manager
Carlos Vincens - BTREE manager of testing services group

Part I. Test Automation Experiences & Recommendations - Steve Gitelis

Project Definition

Steve shared his automated testing experiences from a project that developed an aircraft stall-prevention system. The system detects when an airplane APPROACHES a stall condition and triggers a series of events to prevent the stall from occurring.

The system had:

250 requirements
3500 Test Items (pass/fail)
2500 Paths
300 Status Variables
4 level nesting of status logic
FAA's standard required the vendor to demonstrate path coverage without test hooks (passive testing): 2500 paths × 2 = 5000 tests

SPECIFIC TOOLS USED - EXPERIENCES & BENEFITS

  • McCabe BattleMap (Path coverage, data & control flow coverage)
  • BTREE Verification System (Scripting, i/o control, device monitoring, PC and control)
  • Nohau Emulator
  • 429 Simulator - home grown
  • 422 Cross Data Monitor - home grown
MCCABE BATTLEMAP
 
Experiences:
  • training: 1.5 months, including a 3-day class
  • slow tool
  • couldn't run timing dependent code
  • $25K single user license
Benefits:
  • 85% code coverage
  • Satisfies FAA requirement of passive testing
  • Windows based
  • Traces control flow & data flow
BTREE IMPLEMENTATION
 
Experiences:
  • Needs start-up time in the schedule
  • Listed 9 specific problems encountered, including PC compatibility and the technical interface between the boxes
  • Learning curve of 2 person-months
  • 6 weeks of initialization
  • Sales staff support
  • $70K - high cost
Benefits:
  • If they didn't buy, they would need to build (a time, resource & $ consuming proposition)
  • Provided passive probe interface as required by FAA
  • Windows interface
  • Visual tools for snapping memory
  • Supported importing Excel spreadsheet data as tool parameter input (see the sketch below)
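
The spreadsheet-import point deserves a note: keeping tool parameters in a spreadsheet lets non-programmers maintain test data. Here is a rough sketch of that pattern in Python, reading parameters exported from a spreadsheet as CSV (illustrative only; BTREE's actual import mechanism isn't documented here, and the file and column names are made up):

```python
import csv

def load_parameters(path):
    """Load tool parameters from a spreadsheet exported as CSV.
    Expects hypothetical columns "name" and "value"."""
    params = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            params[row["name"]] = float(row["value"])
    return params

# Example use: load_parameters("stall_limits.csv") might yield
# {"max_aoa_deg": 15.0, "min_airspeed_kts": 110.0}
```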
NOHAU EMULATOR
 
Experiences:
  • 6 person-months to become an expert
  • 1.5 person-months to get going
  • Requires separate version of code
  • technical challenge
  • $10K
Benefits:
  • Active probe
  • Supported both hardware & software breakpoints
  • Memory tracing
  • Windows based
  • Controllable by BTREE
  • Tracks to C source code level
429 SIMULATOR & 422 MONITOR
 
Experiences:
  • Developed in-house for system debugging (no commercial product available)
  • DOS based
  • BTREE controllable
  • Budgeted at $10K; cost as much as $100K to develop
  • No scripting capability
BUILD vs. BUY RECOMMENDATION

Steve's recommendation was to BUY, BUY, and BUY. Tool development usually incurs cost and schedule overruns; development doesn't follow a formal process; and the wrong people are chosen to develop the tool.

But, if you have to build, here are 5 recommendations to follow:

  1. Employ formal specifications for the tool
  2. Employ software configuration management
  3. Insist on formal budget planning & tracking
  4. Employ good staff
  5. Insist on a due date 1 month before the tool is needed.
PAYBACK ON USE OF AUTOMATED TOOL SUITE

Steve's opinion is that the use of these specific tools saved 60 person-months of schedule and $600K in cost.

The cost of the tools comprised 5 person-months for setup, $125K for the tool set, and a 6 person-month learning curve, for a total investment of about $250K.
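
A plausible reconciliation of these figures, assuming a fully loaded rate of about $10K per person-month (a rate the presentation did not state):

\[
\text{Investment} \approx (5 + 6)\ \text{person-months} \times \$10\text{K} + \$125\text{K} \approx \$235\text{K} \approx \$250\text{K}
\]
\[
\text{Savings} \approx 60\ \text{person-months} \times \$10\text{K} = \$600\text{K} \;\Rightarrow\; \text{payback} \approx 600 / 250 \approx 2.4:1
\]

Under that assumption, the $600K cost saving is the monetized value of the 60 person-months, and the tool suite returned roughly 2.4 times its cost on this one project.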

Part II - When Is an Automated Tool Right for You? - Mitchel Krause

Vendors typically need to 1) educate their clients about their tools and uses, and 2) evaluate the client's problem, possible solutions, and environment to determine if the vendor's tools and services meet the client's needs and abilities.

During the proposal stage, the client's corporate culture is evaluated. Is there management backing? Is the staff adequate to apply the tool? What is the product being tested? How is testing organized? What testing processes are currently in use?

What is the testing problem/solution? Regression testing, time-to-market, non-intrusive testing, and improved testing (fault simulation, traceability, stress testing, system timings) are examples of the choices available.

Failures to implement automated testing successfully are usually caused by a lack of time and/or people. Use of automated tools is costly the first time around; payback is realized when the automated tests are reused.

The cost is dependent upon the level of support provided by the vendor. Cost increases as each of the following options is used:

  • Standard software & hardware
  • Custom software and hardware
  • Services
  • System definition
  • Training
  • Test Planning/Test Scripting
Success relies on four factors:
  1. Corporate Commitment - Management understands the amount of testing required and the functions that comprise testing. Development and Testing groups work together, not in isolation.
  2. Resources (time, people) - Time is provided to build and maintain a regression test base, and time is provided to overcome the learning curve associated with a test tool and methodology. Dedicated resources, including some software engineers, are provided for the testing effort. Additional people are provided to facilitate overcoming the learning curve.
  3. Phasing of Implementation - Boring and repetitive tasks are automated first. Test activities are planned using processes for planning, tracking and oversight, and configuration management.
  4. Vendor Services - Use of vendor services such as training or additional support for the initial phase of implementation can ease the learning curve.

Part III - Test Automation Case Study - Carlos Vincens

Key Success Factors in Automating Tests
  1. Management Support
  2. Test involvement early in the development process
  3. Defects were found & removed early in the development cycle.
  4. Detailed test data is helpful in getting development support to address the problems.
Conclusions of the case study:
  1. Automated testing enables testers to do better testing.
  2. The test team should include software engineers.
Carlos presented a five-step case study of test automation. The steps are applicable to automating tests independent of the specific tools selected.
  1. Learning Phase - The vendor learns the customer's system, test process, and goals & objectives. The customer learns about the vendor's proposed solution and automated test methodologies.
  2. Test Planning - The overall plan is developed, including schedule, test objectives, resource (people & equipment) availability, and the scope of the work.
  3. Test Design - The customer designs the tests. Together the vendor & the customer design test scripts. Good software development practices are applied to the designs, including consideration of reuse in future projects.
  4. Automated Script Development - The case study used a data-driven instead of a procedure-driven testing approach (see the sketch after this list). Test scripts were placed under source code control, reviewed, and unit tested. The net result was that most PRODUCT errors were found during the test development activities.
  5. Test Evaluation Results - Because the tests were automated, regression tests could be executed after each new product build at 'no cost'. Stress tests were derived from the functional tests. Errors were found during regression testing. Traceability was automatic. The data captured during the tests was used by the developers for debugging.
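To make the data-driven vs. procedure-driven distinction concrete: rather than coding one procedure per test case, the test inputs and expected outputs live in a data table, and a single driver runs every row. Below is a minimal sketch in Python; it is illustrative only, since the case study used BTREE's own scripting environment, and the system-under-test function and its thresholds here are hypothetical.

```python
# Data-driven testing sketch: test cases are rows of data, not procedures.

def check_stall_warning(airspeed_knots, angle_of_attack_deg):
    """Hypothetical stand-in for the system under test:
    warn when the airplane approaches a stall condition."""
    return angle_of_attack_deg >= 15 or airspeed_knots <= 110

# Each row: (test id, airspeed, angle of attack, expected warning).
# In practice, rows like these could be imported from a spreadsheet,
# as the BTREE system supported.
TEST_TABLE = [
    ("T001", 250, 5,  False),   # normal cruise: no warning
    ("T002", 120, 14, False),   # near both limits, inside envelope
    ("T003", 100, 12, True),    # airspeed below threshold
    ("T004", 130, 16, True),    # angle of attack above threshold
]

def run_table(table):
    """Single driver: executes every row and reports pass/fail."""
    failures = []
    for test_id, airspeed, aoa, expected in table:
        actual = check_stall_warning(airspeed, aoa)
        print(f"{test_id}: {'PASS' if actual == expected else 'FAIL'}")
        if actual != expected:
            failures.append(test_id)
    return failures

if __name__ == "__main__":
    failed = run_table(TEST_TABLE)
    print(f"{len(TEST_TABLE) - len(failed)} of {len(TEST_TABLE)} passed")
```

Adding a regression test is then a one-row change, which is why rerunning the entire suite after each new build came at essentially no incremental cost.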
Question from audience: What is the ratio of developers to testers?
Answer: The ratio depends on the application domain. In mission-critical applications, a 1-to-1 ratio is not unusual. In commercial applications with a low risk of recall, the ratio may be 3 to 1.

Steve Gitelis - President, GB Lumina
Mr. Gitelis has ten years of experience as a real-time software engineer and 20 years of experience as a Software Test & QA engineer and manager. He is currently president of GB Lumina, a local contract software and network engineering firm.

Mitch Krause - B-Tree Verification Systems
As head of the Advanced Services Division at B-Tree Verification Systems, Mitch Krause directs design and development of custom embedded software and manages all internal/external testing services.

Mitch has eleven years of experience in the software testing field since earning his Electrical Engineering degree from North Dakota State University in 1985. Prior to joining B-Tree, Mitch applied his software test skills in both the medical industry and the field of aeronautics. He began his career in the software test group at Boeing Military Airplanes, followed by 3 years as a software test lead in embedded software test at Cardiac Pacemakers Inc.