Praise for Experiences of Test Automation
“What you hold in your hands is a treasure trove of hard-won knowledge about what works and what doesn’t in test automation. It can save you untold hours and costs by steering you away from paths that lead nowhere and guiding you towards those that lead to success.”
—Linda Hayes
“From tools to methodology, Dorothy Graham and Mark Fewster weave a compelling set of stories that provide a learning experience in automation. This comprehensive tome is the first of its kind to take the reader deep into the world of automated testing, as depicted by case studies that show the realities of what happened across a multitude of projects spanning a wide variety of industries and technology environments. By identifying similarities and repeated themes, the authors help the reader focus on the essential learning lessons and pitfalls to avoid. Read this book cover to cover for inspiration and a realization of what it takes to ultimately succeed in test automation.”
—Andrew L. Pollner, President & CEO of ALP International Corporation
“Many years after their best-seller Software Test Automation, Mark Fewster and Dorothy Graham have done it again. Agile methodologies have given test automation a dominant presence in today’s testing practices. This is an excellent, highly practical book with many well-documented case studies from a wide range of perspectives. Highly recommended to all those involved, or thinking about getting involved, in test automation.”
—Erik van Veenendaal, Founder of Improve Quality Services and vice-chair of TMMi Foundation
“This book is like having a testing conference in your hand, with a wealth of case studies and insights. Except that this book is much cheaper than a conference, and you don’t have to travel for it. What impressed me in particular was that it is all tied together in a concise ‘chapter zero’ that efficiently addresses the various aspects I can think of for automation success. And that is something you will not get in a conference.”
—Hans Buwalda
“An exciting, well-written, and wide-ranging collection of case studies with valuable real-world experiences, tips, lessons learned, and points to remember from real automation projects. This is a very useful book for anyone who needs the evidence to show managers and colleagues what works—and what does not work—on the automation journey.”
—Isabel Evans, FBCS CITP, Quality Manager, Dolphin Computer Access
“Experiences of Test Automation first describes the essence of effective automated testing. It proceeds to provide many lifetimes’ worth of experience in this field, from a wide variety of situations. It will help you use automated testing for the right reasons, in a way that suits your organization and project, while avoiding the various pitfalls. It is of great value to anyone involved in testing—management, testers, and automators alike.”
—Martin Gijsen, Independent Test Automation Architect
“This offering by Fewster and Graham is a highly significant bridge between test automation theory and reality. Test automation framework design and implementation is an inexact science begging for a reusable set of standards that can only be derived from a growing body of precedence; this book helps to establish such precedence. Much like predecessor court cases are cited to support subsequent legal decisions in a judicial system, the diverse case studies in this book may be used for making contemporary decisions regarding engagement in, support of, and educating others on software test automation framework design and implementation.”
—Dion Johnson, Software Test Consultant and Principal Adviser to the Automated Testing Institute (ATI)
“Even with my long-established ‘test automation won’t work’ stance, this book did make me pause and ponder. It opened my mind and gave me a few ‘oh, I hadn’t thought of that’ moments. I would recommend this book as an initial reference for any organization wanting to introduce test automation.”
—Audrey Leng
“This book is a stunning achievement. I believe that it is one of the best books ever written in test automation. Dot and Mark’s approach presenting 28 case studies is a totally new concept including eye-catching tips, good points, and lessons learned. The case studies are coming from life experiences, successes and failures, including several aspects of automation, different environments, and a mixture of solutions. Books are ‘the’ source of wisdom, and what a good idea for using storytelling to increase our learning through triggering our memories. This book is a must for everyone who is thinking of or involved in test automation at all levels. It is truly unique in its kind.”
—Mieke Gevers
Experiences of Test Automation
Experiences of Test Automation
Case Studies of Software Test Automation
Dorothy Graham
Mark Fewster
Upper Saddle River, NJ • Boston • Indianapolis • San Francisco
New York • Toronto • Montreal • London • Munich • Paris • Madrid
Capetown • Sydney • Tokyo • Singapore • Mexico City
Many of the designations used by manufacturers and sellers to distinguish their products are claimed as trademarks. Where those designations appear in this book, and the publisher was aware of a trademark claim, the designations have been printed with initial capital letters or in all capitals.
The authors and publisher have taken care in the preparation of this book, but make no expressed or implied
warranty of any kind and assume no responsibility for errors or omissions. No liability is assumed for incidental
or consequential damages in connection with or arising out of the use of the information or programs contained
herein.
The publisher offers excellent discounts on this book when ordered in quantity for bulk purchases or special sales,
which may include electronic versions and/or custom covers and content particular to your business, training
goals, marketing focus, and branding interests. For more information, please contact:
U.S. Corporate and Government Sales
(800) 382-3419
corpsales@pearsontechgroup.com
For sales outside the United States, please contact:
International Sales
international@pearson.com
Visit us on the Web: informit.com/aw
Library of Congress Cataloging-in-Publication Data
Graham, Dorothy, 1944-
Experiences of test automation : case studies of software test automation / Dorothy Graham, Mark Fewster.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-321-75406-6 (pbk. : alk. paper)
1. Computer software—Testing—Automation—Case studies. I. Fewster, Mark, 1959- II. Title.
QA76.76.T48G73 2011
005.3028’7—dc23
2011040994
Copyright © 2012 Pearson Education, Inc.
All rights reserved. Printed in the United States of America. This publication is protected by copyright, and permission must be obtained from the publisher prior to any prohibited reproduction, storage in a retrieval system, or transmission in any form or by any means, electronic, mechanical, photocopying, recording, or likewise. To obtain permission to use material from this work, please submit a written request to Pearson Education, Inc., Permissions Department, One Lake Street, Upper Saddle River, New Jersey 07458, or you may fax your request to (201) 236-3290.
ISBN-13: 978-0-321-75406-6
ISBN-10: 0-321-75406-9
Text printed in the United States on recycled paper at RR Donnelley in Crawfordsville, Indiana.
Second printing, August 2012
To my husband, Roger, for your love and support,
your good ideas, and for making the tea!
And to Sarah and James, our wonderful children.
—Dot Graham
To my wife, Barbara, for the good times we’ve shared.
And to my terrific son, Rhys, for the good times you bring.
—Mark Fewster
Contents

Foreword
Preface

Reflections on the Case Studies (Dorothy Graham, Mark Fewster)
  A Management Issues
    A.1 Objectives for Automation
    A.2 Management Support
    A.3 Return on Investment and Metrics
    A.4 Automation in Agile Development
    A.5 Skills
    A.6 Planning, Scope, and Expectations
    A.7 Relationship with Developers
    A.8 Triggers for Change and Getting Started
    A.9 Tools and Training
    A.10 Political Factors
  B Technical Issues
    B.1 Abstraction, Abstraction, Abstraction: The Testware Architecture
    B.2 Automation Standards
    B.3 Reusability, Documentation, and Flexibility
    B.4 Test Results and Reporting
    B.5 Testing the Tests: Reviews, Static Analysis, Testing the Testware
    B.6 What Tests to Automate
    B.7 Test Automation Is More than Execution
    B.8 Failure Analysis
    B.9 Automation Finding Bugs?
    B.10 Tools and Technical Aspects
  C Conclusion

Chapter 1 An Agile Team’s Test Automation Journey: The First Year (Lisa Crispin)
  1.1 Background for the Case Study
    1.1.1 The Problem
    1.1.2 Our Goals
  1.2 Whole Team Commitment
  1.3 Setting Up the Automation Strategy
    1.3.1 A Testable Architecture
    1.3.2 Setting Up the Build Process
    1.3.3 Getting a Base for Testing: GUI Smoke Tests
    1.3.4 Driving Development at the Unit Level
  1.4 Applying Acceptance Test-Driven Development (ATDD) to Test behind the GUI Using FitNesse
    1.4.1 In-Memory Tests
    1.4.2 Tests Using the Database
    1.4.3 Benefits of FitNesse Tests
  1.5 Use an Incremental Approach
  1.6 The Right Metrics
  1.7 Celebrate Successes
  1.8 Incorporate Engineering Sprints
  1.9 Team Success
  1.10 Continuous Improvement
  1.11 Conclusion

Chapter 2 The Ultimate Database Automation (Henri van de Scheur)
  2.1 Background for the Case Study
  2.2 Software under Test
  2.3 Objectives for Test Automation
  2.4 Developing Our Inhouse Test Tool
    2.4.1 Configuration
    2.4.2 Resource Optimization
    2.4.3 Reporting
    2.4.4 Failure Analysis
  2.5 Our Results
  2.6 Managing Our Automated Tests
  2.7 Test Suites and Types
  2.8 Today’s Situation
  2.9 Pitfalls Encountered and Lessons Learned (the Hard Way)
  2.10 How We Applied Advice from the Test Automation Book
  2.11 Conclusion
  2.12 Acknowledgments

Chapter 3 Moving to the Cloud: The Evolution of TiP, Continuous Regression Testing in Production (Ken Johnston, Felix Deschamps)
  3.1 Background for the Case Study
  3.2 Moving Our Testing into the Cloud
    3.2.1 What We Wanted to Get Out of Our TiP Test Strategy
    3.2.2 Guiding Principles
  3.3 How We Implemented TiP
  3.4 Sample of Monthly Service Review Scorecards
    3.4.1 Reading Our Scorecard
    3.4.2 What We Did with the Incident and Escalation Report
  3.5 Exchange TiP v2—Migrating TiP to the Windows Azure Cloud
  3.6 What We Learned
    3.6.1 Issues Related to Partner Services
    3.6.2 Challenges in Monitoring Across the Cloud
    3.6.3 Sample Issues Found in Production by TiP Tests
    3.6.4 Aggregating to Deal with “Noise” in the Results
    3.6.5 Pitfalls
  3.7 Conclusion
  3.8 Acknowledgments

Chapter 4 The Automator Becomes the Automated (Bo Roop)
  4.1 Background for the Case Study: My First Job
    4.1.1 My First Role: Technical Support
    4.1.2 Joining the QA Team
  4.2 My Great Idea . . .
    4.2.1 But Would It Be Short-Lived?
  4.3 A Breakthrough
    4.3.1 Getting into the Job
    4.3.2 Validating Checkpoints
    4.3.3 Then Things Got Difficult
    4.3.4 The Beginning of the End
  4.4 Conclusion

Chapter 5 Autobiography of an Automator: From Mainframe to Framework Automation (John Kent)
  5.1 Background for the Case Study
    5.1.1 Early Test Automation: My First Encounter with a Testing Tool
    5.1.2 Overcoming Problems to Use the Tool to Replay Tests
    5.1.3 How Mainframe Dumb Terminal Systems Worked and Why Capture/Replay Was a Good Idea
    5.1.4 Attended Replay and Its Advantages
  5.2 A Mainframe Green-Screen Automation Project
    5.2.1 Scaling Up
    5.2.2 Using the Tool on the Tool Scripts
    5.2.3 Success
    5.2.4 Who Wants It Now?
  5.3 Difference between Mainframe and Script-Based Tools
    5.3.1 Level of Interaction
  5.4 Using the New Script-Based Tools
    5.4.1 Trying to Use New Tools the Old Way
    5.4.2 Programming the Tools
    5.4.3 Building the Framework
    5.4.4 Other Features of the Framework
    5.4.5 The Software Test Automation Paradox: Testing the Tests
  5.5 Automating Tests for IBM Maximo
    5.5.1 Moving on to 2010
    5.5.2 The Liberation Framework
    5.5.3 Technical Challenges
    5.5.4 Results of the Test Automation
    5.5.5 Rolling Out Automation to the Rest of the Organization
  5.6 Conclusion
  5.7 Additional Reading

Chapter 6 Project 1: Failure!, Project 2: Success! (Ane Clausen)
  6.1 Background for the Case Study
  6.2 Project 1: Failure!
  6.3 Project 2: Success!
    6.3.1 Our Estimated Return on Investment
    6.3.2 The Start
    6.3.3 Pilot Project Goals
    6.3.4 The First Month: Understanding the Task and the Tools
    6.3.5 The Strategy and Plan
    6.3.6 The Agile Testing Method
    6.3.7 Result of the First Period
  6.4 The Next Time Period: Testing for Real
    6.4.1 What to Automate
    6.4.2 Stakeholders’ Involvement
    6.4.3 Uniform Solution
    6.4.4 Application Structure and Test Case Structure in QC
    6.4.5 Go/Do Not Go after 3 Months
    6.4.6 Real Project after the Pilot Project
    6.4.7 The First Automated Test Used in Real Releases to Production
    6.4.8 The Whole Automated Test in Production
  6.5 Conclusion

Chapter 7 Automating the Testing of Complex Government Systems (Elfriede Dustin)
  7.1 Background for the Case Study
  7.2 Our Requirements for Automation
  7.3 Automated Test and Re-Test (ATRT), Our Automated Testing Solution—What Is It?
    7.3.1 Can’t Be Intrusive to SUT
    7.3.2 Must Be OS Independent (Compatible with Windows, Linux, Solaris, etc.)
    7.3.3 Must Be Independent of the GUI
    7.3.4 Must Automate Tests for Both Display-Centric and Non-Display-Centric Interfaces
    7.3.5 Must Be Able to Handle a Networked Multicomputer Environment
    7.3.6 Nondevelopers Should Be Able to Use the Tool
    7.3.7 Must Support an Automated Requirements Traceability Matrix
  7.4 Automated Testing Solution Applied
  7.5 Conclusion

Chapter 8 Device Simulation Framework (Alan Page)
  8.1 Background for the Case Study
  8.2 The Birth of Device Simulation Framework (DSF)
  8.3 Building the DSF
  8.4 Automation Goals
  8.5 Case Studies
    8.5.1 USB Firmware
    8.5.2 USB Storage
    8.5.3 Video Capture
    8.5.4 Other Applications of DSF
  8.6 No Silver Bullets
  8.7 Conclusion
  8.8 Acknowledgments

Chapter 9 Model-Based Test-Case Generation in ESA Projects (Stefan Mohacsi, Armin Beer)
  9.1 Background for the Case Study
  9.2 Model-Based Testing and Test-Case Generation
    9.2.1 Model-Based Testing with IDATG
  9.3 Our Application: ESA Multi-Mission User Services
    9.3.1 Testing Approach for MMUS
  9.4 Experience and Lessons Learned
    9.4.1 Benefits
    9.4.2 ROI of Model-Based Testing
    9.4.3 Problems and Lessons Learned
  9.5 Conclusion
    9.5.1 Summary
    9.5.2 Outlook
  9.6 References
  9.7 Acknowledgments

Chapter 10 Ten Years On and Still Going (Simon Mills)
  10.1 Background for the Case Study: “Before”
  10.2 Insurance Quotation Systems Tested Automatically Every Month
    10.2.1 Background: The UK Insurance Industry
    10.2.2 The Brief, Or How I Became Involved
    10.2.3 Why Automation?
    10.2.4 Our Testing Strategy
    10.2.5 Selecting a Test Automation Tool
    10.2.6 Some Decisions About our Test Automation Plans
    10.2.7 The Test Plan
    10.2.8 Some Additional Issues We Encountered
    10.2.9 A Telling Tale: Tester Versus Automator
    10.2.10 Summary
    10.2.11 Acknowledgments
  10.3 What Happened Next?
  10.4 Conclusion
    10.4.1 Separation of Tester and Automator?
    10.4.2 Management Expectations
    10.4.3 Independence from Particular Tools and Vendors

Chapter 11 A Rising Phoenix from the Ashes (Jason Weden)
  11.1 Background for the Case Study
    11.1.1 The Organizational Structure
    11.1.2 The Business Domain
  11.2 The Birth of the Phoenix
  11.3 The Death of the Phoenix
  11.4 The Rebirth of the Phoenix
    11.4.1 (Re)Starting the Automation Project
    11.4.2 Increasing Ease of Use
    11.4.3 Increasing the Visibility of Our Automation Efforts
    11.4.4 Implementing Better Testing Methods
    11.4.5 Realizing Benefits: Return on Investment
  11.5 The New Life of the Phoenix
    11.5.1 Focusing on Knowledge-Sharing
    11.5.2 Tracking of Automation Framework Test Run Results
    11.5.3 Designing for Speed and Ease of Use
  11.6 Conclusion
    11.6.1 Use Time to Automate
    11.6.2 Enhance Automation Skills and Share Knowledge
    11.6.3 Acquire Formal Education
    11.6.4 Track Progress
    11.6.5 Assess Usability
    11.6.6 Tailor Your Automation to Suit Your Organization
    11.6.7 And Finally . . .

Chapter 12 Automating the Wheels of Bureaucracy (Damon Yerg, a pseudonym)
  12.1 Background for the Case Study
    12.1.1 The Organization
    12.1.2 The Agency Testing
  12.2 The Agency Automation
    12.2.1 Enhancing Record and Playback
    12.2.2 Health Checks and Smokers
    12.2.3 Challenges and Lessons Learned
  12.3 From 2000 to 2008
    12.3.1 Benefits from the Mainframe Tool
    12.3.2 The Web Way
    12.3.3 Our KASA
    12.3.4 More Challenges and Lessons Learned
    12.3.5 Selling Automation
  12.4 An Alignment of Planets
    12.4.1 Gershon Review
    12.4.2 Independent Testing Project
    12.4.3 Core Regression Library Management Methodology
    12.4.4 Our Current Location on the Journey
  12.5 Building Capability within Test Teams
    12.5.1 The Concept: Combine Script Development and Business Knowledge
    12.5.2 The Tools: KASA Meets DExTA
  12.6 Future Directions: The Journey Continues
    12.6.1 MBT Solutions
    12.6.2 Embedded Automation Engineers
    12.6.3 Organizational Approach to Regression Testing
    12.6.4 Automate Early
  12.7 Conclusion

Chapter 13 Automated Reliability Testing Using Hardware Interfaces (Bryan Bakker)
  13.1 Background for the Case Study
  13.2 The Need for Action
  13.3 Test Automation Startup (Incremental Approach)
  13.4 Buy-In from Management
  13.5 Further Development of Test Framework
    13.5.1 Increment 2: A Better Language for the Testers
    13.5.2 Increment 3: Log File Interpretation
  13.6 Deployment and Improved Reporting
  13.7 Conclusion

Chapter 14 Model-Based GUI Testing of Android Applications (Antti Jääskeläinen, Tommi Takala, Mika Katara)
  14.1 Background for the Case Study
    14.1.1 About MBT
    14.1.2 Our Experience: Using TEMA on Android Applications
    14.1.3 The Rest of This Chapter
  14.2 MBT with TEMA Toolset
    14.2.1 Domain-Specific Tools
    14.2.2 Roles in TEMA
    14.2.3 What the TEMA Toolset Does
    14.2.4 Action Machines and Refinement Machines
    14.2.5 Defining Test Data
    14.2.6 Test Configuration: Test Model and Testing Modes
    14.2.7 Test Generation from the Model
    14.2.8 An Example: Sending an SMS Message
  14.3 Modeling Application Behavior
    14.3.1 Modeling with TEMA Model Designer
    14.3.2 Modeling with ATS4 AppModel
  14.4 Generation of Tests
    14.4.1 Function and Choice of Guidance Algorithms
    14.4.2 Guidance Algorithm Trade-Offs
  14.5 Connectivity and Adaptation
    14.5.1 The Adapter Executes Keywords
    14.5.2 Action Keywords and Verification Keywords
    14.5.3 Challenges in Implementing Verification
    14.5.4 Using A-Tool and Changes Needed to Use It
    14.5.5 Additional Problems
  14.6 Results
  14.7 Conclusion
  14.8 Acknowledgments
  14.9 References

Chapter 15 Test Automation of SAP Business Processes (Christoph Mecke, Melanie Reinwarth, Armin Gienger)
  15.1 Background for the Case Study
    15.1.1 Special Requirements of SAP as a Software Company
    15.1.2 Test Automation Tools at SAP
  15.2 Standards and Best Practices
    15.2.1 Regression Test Process
    15.2.2 Specification and Design
    15.2.3 Coding Guidelines
    15.2.4 Code Inspections
    15.2.5 Reuse Guideline
    15.2.6 Checkman for eCATT
  15.3 eCATT Usage Examples
    15.3.1 Data-Driven Automation Framework for Health Care Processes
    15.3.2 Test Automation Framework for Banking Scenarios
  15.4 Conclusion
  15.5 Acknowledgments

Chapter 16 Test Automation of a SAP Implementation (Björn Boisschot)
  16.1 Background for the Case Study
  16.2 Project Overview
  16.3 Phase 1: Proof of Concept
    16.3.1 Define the Scope of the Project
    16.3.2 Set the Expectations
    16.3.3 Start Scripting the Test Cases
  16.4 Phase 2: Project Start
    16.4.1 Approval
    16.4.2 Code Conventions and Documentation
    16.4.3 Structured Approach
    16.4.4 Data-Driving Test Cases
    16.4.5 Multilingual
    16.4.6 Security Role Testing
  16.5 Conclusion

Chapter 17 Choosing the Wrong Tool (Michael Williamson)
  17.1 Background for the Case Study
    17.1.1 The Product
    17.1.2 Development Team
    17.1.3 Overview of Development at Google
    17.1.4 Overview of Release Cycles
  17.2 Our Preexisting Automation (or Lack Thereof)
    17.2.1 Manual Testing and the Need for More Automation
  17.3 Decision Needed: New Tool or Major Maintenance Effort?
    17.3.1 What We Had and Why It Had to Change
    17.3.2 Overview of eggPlant
  17.4 Moving Forward with eggPlant
    17.4.1 Development Experience
    17.4.2 Using the Tool Language
    17.4.3 Problems with Image-Based Comparison
    17.4.4 Test Maintenance Had to Be Done by Testers
    17.4.5 Submitting Code Using Continuous Integration
    17.4.6 What Would the Submit Queue Have Done for Us?
    17.4.7 How Did Our eggPlant Automation Adventure Turn Out?
    17.4.8 A Surprising Twist to Earlier Assumptions!
  17.5 What Did We Do after eggPlant?
  17.6 Conclusion
    17.6.1 eggPlant as a Tool
    17.6.2 Test Automation in General: Our Lessons Learned
    17.6.3 Current Problems with Automation

Chapter 18 Automated Tests for Marketplace Systems: Ten Years and Three Frameworks (Lars Wahlberg)
  18.1 Background for the Case Study
  18.2 Automated Test Frameworks
    18.2.1 Framework A
    18.2.2 Framework B
    18.2.3 Framework C
  18.3 Test Roles
    18.3.1 Test Engineer
    18.3.2 Test Automation Architect
    18.3.3 Daily Build Engineer
  18.4 Abstraction Layer
  18.5 Configuration
  18.6 Cost and ROI
  18.7 Conclusion

Chapter 19 There’s More to Automation Than Regression Testing: Thinking Outside the Box (Jonathan Kohl)
  19.1 Background for the Case Study
  19.2 Two Tales of Task Automation
    19.2.1 Automation Failed, and How Did That Tester Suddenly Improve?
    19.2.2 Automating Around the Testing, Not Automating the Testing
  19.3 Automation to Support Manual Exploratory Testing
  19.4 Automating Data Interactions
  19.5 Automation and Monitoring
    19.5.1 The Tests That Passed Too Quickly
    19.5.2 If Nothing Is Wrong, the Test Must Have Passed, Right?
  19.6 Simulating Real-World Loads by Combining Simple Tools
  19.7 Conclusion
  19.8 References

Chapter 20 Software for Medical Devices and Our Need for Good Software Test Automation (Albert Farré Benet, Christian Ekiza Lujua, Helena Soldevila Grau, Manel Moreno Jáimez, Fernando Monferrer Pérez, Celestina Bianco)
  20.1 Background for the Case Study
    20.1.1 Medical Devices Background
    20.1.2 Company Background
    20.1.3 Medical Device Constraints and Specifics Pertaining to STA
    20.1.4 Different Projects, Different Scenarios
  20.2 Comparison of the Different Approaches to Each Project
    20.2.1 Defining Test Objectives: Focusing on Critical Features
    20.2.2 Test Flow
  20.3 Project hamlet
  20.4 Project phoenix
    20.4.1 The Tools
    20.4.2 Maintenance and Migration Issues
  20.5 Project doityourself
    20.5.1 The Tools
    20.5.2 Maintenance and Tool Validation Issues
    20.5.3 Techniques
    20.5.4 Unexpected Problems and Applied Solutions
  20.6 Project miniweb
    20.6.1 Tools
    20.6.2 Maintenance
    20.6.3 Techniques
    20.6.4 Unexpected Problems and Applied Solutions
  20.7 Test Execution
  20.8 Result Reporting
  20.9 Conclusion
    20.9.1 Looking Back at the Projects
    20.9.2 What Would We Do Differently?
    20.9.3 Plans for the Future

Chapter 21 Automation through the Back Door (by Supporting Manual Testing) (Seretta Gamba)
  21.1 Background for the Case Study
  21.2 Our Technical Solution
    21.2.1 Command-Driven Testing
    21.2.2 ISS Test Station
  21.3 Implementing Test Automation with ISS Test Station
    21.3.1 The Automation Process
  21.4 Implementing Test Automation
    21.4.1 Original Procedure
    21.4.2 Weaknesses
  21.5 Supporting Manual Testing
    21.5.1 Available Features
    21.5.2 Features Not Currently Available in Our Framework
  21.6 The New Manual Test Process
    21.6.1 Migration to the Framework
    21.6.2 Manual Testing with the Framework
    21.6.3 Automating the Manual Tests
  21.7 Conclusion
    21.7.1 Starting Phase
    21.7.2 Status in 2010
    21.7.3 Next Steps
  21.8 References

Chapter 22 Test Automation as an Approach to Adding Value to Portability Testing (Wim Demey)
  22.1 Background for the Case Study
  22.2 Portability Testing: Love or Hate It
  22.3 Combination of Both Worlds as a Solution
    22.3.1 LA-PORTA
    22.3.2 Virtualization Product
    22.3.3 VixCOM
    22.3.4 Test Automation Tool
    22.3.5 File Structure
  22.4 Conclusion
  22.5 Acknowledgment

Chapter 23 Automated Testing in an Insurance Company: Feeling Our Way (Ursula Friede)
  23.1 Background for the Case Study
  23.2 The Application
  23.3 Objectives
  23.4 The Work
    23.4.1 Phase 1
    23.4.2 Phase 2
    23.4.3 Phase 3
    23.4.4 Phase 4
  23.5 Lessons
    23.5.1 Screen Resolution
    23.5.2 Less Is Sometimes More
  23.6 Conclusion
    23.6.1 Greatest Success
    23.6.2 Don’t Get Carried Away

Chapter 24 Adventures with Test Monkeys (John Fodeh)
  24.1 Background for the Case Study
  24.2 Limitations of Automated Regression Testing
    24.2.1 Automated Regression Tests Are Static
    24.2.2 Automated Regression Tests Are Simple
    24.2.3 Reinitialization of Automated Tests
    24.2.4 Synchronized with Application
    24.2.5 Vulnerable to Changes
  24.3 Test Monkeys
    24.3.1 Characteristics
    24.3.2 Basic Features
  24.4 Implementing Test Monkeys
  24.5 Using Test Monkeys
    24.5.1 Metrics
  24.6 Benefits and Limitations
  24.7 Conclusion
  24.8 Additional Reading

Chapter 25 System-of-Systems Test Automation at NATS (Mike Baxter, Nick Flynn, Christopher Wills, Michael Smith)
  25.1 Background for the Case Study
    25.1.1 System-of-Systems Operational Context
    25.1.2 Initial Objectives and Constraints for Test Automation
  25.2 Test Execution Tool Integration
  25.3 Pilot Project for the Tool
  25.4 In-Service Model
  25.5 Implementation
  25.6 Typical Script Template
  25.7 Lessons Learned
    25.7.1 General Lessons
    25.7.2 Technical Lessons
  25.8 Conclusion

Chapter 26 Automating Automotive Electronics Testing (Ross Timmerman, Joseph Stewart)
  26.1 Background for the Case Study
  26.2 Objectives for Automation Project
  26.3 Brief History of the Automation Project
    26.3.1 Our First Tools
    26.3.2 Limitations of the First Tool and Creation of the Next-Generation Tool
  26.4 Results of the Automation Project
  26.5 Conclusion

Chapter 27 BHAGs, Change, and Test Transformation (Ed Allen, Brian Newman)
  27.1 Background for the Case Study
  27.2 Buy-In
    27.2.1 The Executives
    27.2.2 The Developer “Why”
    27.2.3 Empowering QA
  27.3 The Story of Building the Automation Framework
    27.3.1 Creating Test Points
    27.3.2 The Beginning
    27.3.3 The Consultant
    27.3.4 Redoing the Framework
  27.4 Description of our Automation Framework
    27.4.1 Modules in Our Framework
    27.4.2 Considerations for Modules
    27.4.3 Script Execution
    27.4.4 Failure Capturing Method
  27.5 The Test Environment
    27.5.1 Multiple LANs
    27.5.2 Virtual Machines
  27.6 Metrics
    27.6.1 Benefits of Automation
    27.6.2 Effect on Customer-Found Defects
  27.7 Conclusion
    27.7.1 Lessons Learned
    27.7.2 Ongoing Challenges
    27.7.3 What’s Next

Chapter 28 Exploratory Test Automation: An Example Ahead of Its Time (Harry Robinson, Ann Gustafson Robinson)
  28.1 Background for the Case Study
  28.2 What’s a Trouble Manager?
  28.3 Testing a Trouble Manager Transaction
    28.3.1 Testing That a CreateTicket Transaction Succeeds When All Required Fields Are Present
    28.3.2 Testing That a CreateTicket Transaction Fails When a Required Field Is Missing
  28.4 Constructing Test Cases Programmatically
  28.5 New Ways to Think about Automated Tests
  28.6 Testing the Trouble Manager Workflow
  28.7 Test Generation in Action
  28.8 Home Stretch
  28.9 Post-Release
  28.10 Conclusion
  28.11 Acknowledgments

Chapter 29 Test Automation Anecdotes
  29.1 Three Grains of Rice (Randy Rice)
    29.1.1 Testware Reviews
    29.1.2 Missing Maintenance
    29.1.3 A Wildly Successful Proof-of-Concept
  29.2 Understanding Has to Grow (Molly Mahai)
  29.3 First Day Automated Testing (Jonathon Lee Wright)
    29.3.1 Initial Investment
    29.3.2 What Is to Be Automated?
    29.3.3 First Day Automated Testing
    29.3.4 Problems and Solutions
    29.3.5 Results of Our First Day Automation Approach
  29.4 Attempting to Get Automation Started (Tessa Benzie)
  29.5 Struggling with (against) Management (Kai Sann)
    29.5.1 The “It Must Be Good, I’ve Already Advertised It” Manager
    29.5.2 The “Testers Aren’t Programmers” Manager
    29.5.3 The “Automate Bugs” Manager
    29.5.4 The “Impress the Customers (the Wrong Way)” Manager
  29.6 Exploratory Test Automation: Database Record Locking (Douglas Hoffman)
    29.6.1 The Case Study
  29.7 Lessons Learned from Test Automation in an Embedded Hardware–Software Computer Environment (Jon Hagar)
    29.7.1 VV&T Process and Tools
    29.7.2 Lessons Learned
    29.7.3 Summary of Results
  29.8 The Contagious Clock (Jeffrey S. Miller)
    29.8.1 The Original Clock
    29.8.2 Increasing Usefulness
    29.8.3 Compelling Push
    29.8.4 Lessons Learned
  29.9 Flexibility of the Automation System (Mike Bartley)
  29.10 A Tale of Too Many Tools (and Not Enough Cross-Department Support) (Adrian Smith)
    29.10.1 Project 1: Simulation Using a DSTL
    29.10.2 Project 2: Testing a GUI Using TestComplete
    29.10.3 Project 3: Rational Robot
    29.10.4 Project 4: Final Python Project and QTP Proof-of-Concept
    29.10.5 Project 5: QTP2
    29.10.6 The End of the Story
  29.11 A Success with a Surprising End (George Wilkinson)
    29.11.1 Our Chosen Tool
    29.11.2 The Tool Infrastructure and an Interesting Issue as a Result
    29.11.3 Going toward Rollout
    29.11.4 The Unexpected Happens
  29.12 Cooperation Can Overcome Resource Limitations (Michael Albrecht)
  29.13 An Automation Process for Large-Scale Success (Michael Snyman)
    29.13.1 Where We Started
    29.13.2 The Key to Our Eventual Success: An Automation Process
    29.13.3 What We Learned
    29.13.4 Return on Investment
  29.14 Test Automation Isn’t Always What It Seems (Julian Harty)
    29.14.1 Just Catching Exceptions Does Not Make It a Good Test
    29.14.2 Sometimes the Failing Test Is the Test Worth Trusting
    29.14.3 Sometimes, Micro-Automation Delivers the Jackpot

Appendix: Tools
About the Case Study Authors
About the Book Authors
Index
Foreword
Automated testing—it’s the Holy Grail, the Fountain of Youth, and the Philosopher’s Stone all rolled into one. For decades, testers have looked to automated testing for relief from the drudgery of manual testing—constructing test cases and test data, setting system preconditions, executing tests, comparing actual with expected results, and reporting possible defects. Automated testing promises to simplify all these operations and more.
Unfortunately, successful, effective, and cost-effective automated testing is difficult to achieve. Automated testing projects are often initiated, only later to stumble, lose their way, and be thrown onto the ever-growing pile of failed projects.
Automation fails for many reasons—unachievable expectations is perhaps the
most common, followed by inadequate allocation of resources (time, people, and
money). Other factors include tools that are poorly matched to needs, the sheer
impatience for success that hinders quality work, and a lack of understanding that
automated testing is a different kind of software development, one that requires the
same professional approach as all other development efforts.
Dorothy and Mark’s previous book, Software Test Automation: Effective Use of Test Execution Tools, published in 1999, set the standard for books on this topic. The first part detailed practices found in most successful automation efforts—scripting techniques, automated comparison, testware architecture, and useful metrics. The second part described the experiences of a number of organizations as they implemented test automation efforts. Now, with an additional 10 years of industry knowledge behind them, Dorothy and Mark provide another set of organizational and personal experiences to guide our automation work. It brings us up to date, describing both the classical and most modern approaches to test automation. Each chapter tells a story of a unique automation effort—including both successes and failures—to give us guidance.
Certain themes recur in Experiences of Test Automation: reasonable and achievable objectives; management support; metrics, including return on investment; required skills; planning; setting expectations; building relationships; tools; training; and politics—all necessary to make test automation successful. However, these same themes are equally applicable at both the project and personal levels. One great benefit of this book comes from stepping outside the test automation realm and considering these themes in the larger context.
I first met Dorothy and Mark at the 1998 EuroStar conference in Munich. I
was impressed with both their knowledge of and passion for helping others do great
automated testing. I congratulate them for their outstanding accomplishment and
commend this book to you.
—Lee Copeland
December 2011
Preface
Test automation tools have been around for about 30 years, yet many automation attempts fail, or at least are only partially successful. Why is this?
We wanted to understand if the principles of effective automation, as published in our previous book, Software Test Automation, are still relevant and what other principles now apply, so we began gathering information about real-world test automation implementations. This led us to a rather pleasant discovery: Over the past 10 years, many people have had good success with software test automation, many of them using our book. Of course, we are not the only ones to have described or discovered good automation practices, yet successful and lasting automation still seems to be an elusive achievement today. We hope the stories in this book will help many more people to succeed in their test automation efforts.
This book brings together contemporary automation stories. The technology of
test automation has progressed significantly since our last book on automation was
published in 1999. We wanted to find out what approaches have been successful,
what types of applications are now being tested using test automation, and how test
automation has changed in recent years. Different people have solved automation
problems in different ways—we wanted to know what can be learned from their
experiences and where and how test automation is being applied in new ways.
The case studies in this book show some approaches that were successful and
some that were not. This book gives you the knowledge to help avoid the pitfalls and
learn from the successes achieved in real life. We designed this book to help you get
the most out of the real-life experiences of other professionals.
The case studies in this book cover mainly the automation of test execution, but
other types of automation are mentioned in some chapters. We focus primarily on
system-level automation (including user acceptance testing), although some chapters
also cover unit or integration testing. Test automation is described for many types
of applications, many environments and platforms; the chapters cover commercial,
open source, and inhouse tools in traditional and agile development projects. We are
surprised by the number of different tools being used—around 90 commercial and
open source tools are listed in the Appendix (which includes any tools used by the
chapter authors, not just testing tools).
The experiences described in this book are all true, even though in some cases
the author or company name is not revealed. We encouraged the case study authors
to describe what happened rather than offer general advice, so this book is very real!
In collecting this book’s stories, we were struck by the pervasiveness of test automation into every industry and application. We were also impressed with the dedication and persistence of those who have developed test automation within their companies. Unfortunately, we were also struck by the difficulties that many of them encountered, which sometimes resulted in failure. We are sure the experiences described in this book can help you to be more successful with your test automation.
Case Studies Plus (Our Added Value)
This book is more than a collection of essays; we worked closely with the authors of the chapters to produce a book with information that we felt would be most useful to you. Our review process was thorough; we asked questions and suggested changes in several rounds of reviewing (special thanks are due to the chapter authors for their patience and additional information). Our “old versions” folder contains over 500 documents, so each chapter has been carefully crafted.
We help you get the most from this book by offering Good Points, Lessons, and
Tips. Each chapter includes our own comments to highlight points we think should
stand out at a glance. Watch for these helpful notes:
■ Good Points, which are well worth noting (even if they are not necessarily new).
Good Point
Management support is critical, but expectations must be realistic.
■ Lessons, often learned the hard way—things it would have been better not to do.
Lesson
Automation development requires the same discipline as software development.
■ Tips on ways to solve particular problems in a way that seemed new or novel to us.
Tip
Use a “translation table” for things that may change, so the automation can use a standard constant term.
We picture these interjections as our way of looking over your shoulder as you read through the chapter and saying, “Pay attention here,” “Look at this,” and “This could be particularly useful.”
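To make the “translation table” tip above concrete, here is a minimal sketch in Python (our own invented illustration, not code from any chapter): test scripts refer to stable logical names, and a single lookup table absorbs label changes in the application under test.

    # Hypothetical translation table: the labels on the right may change between
    # releases of the application; scripts use only the logical names on the left.
    UI_LABELS = {
        "submit_button": "Save & Continue",   # was "Submit" in an earlier release
        "customer_field": "Client name",      # was "Customer" in an earlier release
    }

    def click(logical_name: str) -> None:
        """Resolve a logical control name to its current label and click it."""
        label = UI_LABELS[logical_name]
        print(f"clicking the control labelled {label!r}")  # stand-in for a real tool call

    # When a label changes, only UI_LABELS needs editing, not every test script.
    click("submit_button")
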
How to Read This Book
Each case study is a standalone account, so the chapters can be read in any order.
The arrangement of the chapters is designed to give you a variety of experiences if
you do read the book from front to back.
To decide which chapter you would like to read first or read next, look at
Table P.1, a “chapter selector” that summarizes characteristics of the various chapters. The table enables you to see at a glance which chapters cover a particular application, tool, development methodology, and so on, and helps you to quickly find the chapters most directly relevant to you. After Table P.1 are one-paragraph summaries of each case study chapter.
Following this Preface, the section titled “Reflections on the Case Studies” presents our overall perspective and summary of the management and technical issues discussed in the chapters along with our view and comments on those issues (and our diagram of testware architecture). In this section of the book, we distill the most important points of advice to those currently involved in, or about to embark on, their own automation. This is the “executive summary” of the book.
Chapters 1 to 28 are the case study chapters, each written by an author or authors
describing their experience in their specific context: what they did, what worked
well, what didn’t, and what they learned. Some of the chapters include very specific
information such as file structures and automation code; other chapters are more
general. One chapter (10) is an update from a case study presented in Software Test
Automation; the rest are new.
Chapter 29, “Test Automation Anecdotes,” is a mini-book in its own right—a collection of short experience stories from over a dozen different people, ranging from half a page to several pages, all with useful and interesting points to make.
Finally, the Appendix, “Tools,” covers the commercial and open source tools referred to in the chapters.
Table P.1 Case Study Characteristics

Each entry lists: application domain; location; lifecycle; number on the project; time span; tool type(s); and whether there was a pilot study, whether ROI was measured, whether the automation was successful, and whether it is still breathing (still in use).

Chapter 1, Lisa Crispin: financial, web; USA; agile; 9–12; 1 yr (report after 6 yr); open source. Pilot: no. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 2, Henri van de Scheur: database; Norway; lifecycle not given; 30–3; 5–6 yr; inhouse. Pilot: no. ROI measured: no, but 2,400 times improved efficiency. Successful: yes. Still breathing: yes.
Chapter 3, Ken Johnston, Felix Deschamps: enterprise server; USA; traditional with agile elements; >500; ~3 yr; commercial, inhouse. Pilot: no. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 4, Bo Roop: testing tool; USA; waterfall; 12–15; 1 yr 2 mo; commercial. Pilot: no. ROI measured: no. Successful: no. Still breathing: no.
Chapter 5, John Kent: mainframe to web-based; UK; traditional; 40; 23 yr; commercial. Pilot: yes. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 6, Ane Clausen: two projects, pensions and insurance; Denmark; none and agile; 3–5; 6 mo and 1 yr; commercial. Pilot: no, then yes. ROI measured: no, then yes. Successful: no, then yes. Still breathing: no, then yes (project 1, then project 2).
Chapter 7, Elfriede Dustin: government (Department of Defense); USA; agile; 100s; 4½ yr; commercial, open source, inhouse. Pilot: yes. ROI measured: yes. Successful: yes. Still breathing: yes.
Chapter 8, Alan Page: device drivers; USA; traditional; hundreds; 9 yr; commercial, inhouse. Pilot: no. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 9, Stefan Mohacsi, Armin Beer: European Space Agency services; Austria, Italy, Germany; traditional; >100; 6+ yr; commercial, open source, inhouse. Pilot: no. ROI measured: yes, projected payback after 4 cycles. Successful: yes. Still breathing: yes.
Chapter 10, Simon Mills: financial (insurance); UK; chaotic and variable; dozens; 15 yr; commercial. Pilot: no, but began small scale. ROI measured: no, but now running 5 million tests per month. Successful: yes. Still breathing: yes; client base still growing.
Chapter 11, Jason Weden: networking equipment; USA; traditional (waterfall); 25; 3 yr; inhouse. Pilot: no. ROI measured: no. Successful: ultimately, yes. Still breathing: yes.
Chapter 12, Damon Yerg (a pseudonym): government services; Australia; V model; hundreds; 11 yr; inhouse. Pilot: yes. ROI measured: no, but comparable manual effort calculated. Successful: yes (peaks and troughs). Still breathing: yes; thriving and forging ahead.
Chapter 13, Bryan Bakker: medical devices; Netherlands; V model; 50; 1.5 yr; commercial, open source, inhouse. Pilot: started small. ROI measured: yes. Successful: yes. Still breathing: yes.
Chapter 14, Antti Jääskeläinen, Tommi Takala, Mika Katara: smartphone applications in Android; Finland; lifecycle not given; 2; 6–8 mo; commercial, open source. Pilot: entire project is a pilot study. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 15, Christoph Mecke, Melanie Reinwarth, Armin Gienger: ERP systems (SAP), two projects, health care and banking; Germany, India; traditional; 10; 4 yr and 2 yr; commercial, inhouse. Pilot: no. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 16, Björn Boisschot: SAP applications in the energy sector; Belgium; traditional; 12; 6 mo; commercial. Pilot: yes. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 17, Michael Williamson: web-based, distributed; USA; agile; 15; 6 mo; commercial, open source. Pilot: yes. ROI measured: no. Successful: no. Still breathing: no.
Chapter 18, Lars Wahlberg: financial marketplace systems; Sweden; incremental to agile; 20 (typical); ~10 yr; open source. Pilot: yes. ROI measured: yes, projected payback for tests run daily, weekly, or monthly. Successful: yes. Still breathing: yes.
Chapter 19, Jonathan Kohl: various, web to embedded; Canada; agile and traditional; a few to 60; various; commercial, open source, inhouse. Pilot: yes, in some cases. ROI measured: no. Successful: yes. Still breathing: yes; some still in use.
Chapter 20, Albert Farré Benet, Christian Ekiza Lujua, Helena Soldevila Grau, Manel Moreno Jáimez, Fernando Monferrer Pérez, Celestina Bianco: four projects, all medical software; Spain, USA, Italy; spiral, prototyping, waterfall; 2–17; 5 yr, 2 yr, a few months, and 1 yr; commercial, inhouse, commercial, and commercial (per project). Pilot: no. ROI measured: no. Successful: yes, partly, no, and yes. Still breathing: yes, yes, no, and planned.
Chapter 21, Seretta Gamba: insurance; Germany; iterative; 27; 12 mo; commercial, inhouse. Pilot: yes. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 22, Wim Demey: customized software packages; Belgium; traditional V model; team size not given; 4 mo; commercial, open source. Pilot: yes. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 23, Ursula Friede: insurance; Germany; traditional (V model); 30; ~6 mo; commercial. Pilot: no. ROI measured: no, but quantified savings of €120,000 per release. Successful: yes. Still breathing: yes.
Chapter 24, John Fodeh: medical applications and devices; Denmark; traditional (V model), incremental; 30; 6 yr; commercial, inhouse. Pilot: yes. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 25, Mike Baxter, Nick Flynn, Christopher Wills, Michael Smith: air traffic control; UK; traditional; 15–20; cycles lasting 3–12 mo; commercial, open source, inhouse. Pilot: yes. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 26, Ross Timmerman, Joseph Stewart: embedded automotive systems; USA; phased waterfall; 8; 5 yr; inhouse with commercial hardware. Pilot: no. ROI measured: no. Successful: yes. Still breathing: yes.
Chapter 27, Ed Allen, Brian Newman: web-based, mobile, desktop, social channels (voice, chat, email); USA; traditional; 28; 1 yr; commercial, inhouse. Pilot: no. ROI measured: no, but benefits measured. Successful: yes. Still breathing: yes.
Chapter 28, Harry Robinson, Ann Gustafson Robinson: problem reporting for telephone systems; USA; waterfall; 30 overall, 4 on project; 1.5 yr; inhouse. Pilot: no. ROI measured: no. Successful: yes. Still breathing: no.
Chapter Summaries
Chapter 1, An Agile Team’s Test Automation Journey: The First Year
Lisa Crispin describes, in her very engaging style, what happened when an agile team decided to automate their testing. Given Lisa’s expertise in agile, you will not be surprised to see that this team really was agile in practice. One of the interesting things about this project is that everyone on the team (which was fairly small) was involved in the automation. Not only did they excel in agile development, they also developed the automation in an agile way—and they succeeded. Agile development was not the only component of this team’s success; other factors were equally important, including building a solid relationship with management through excellent communication and building the automation to help support creative manual testing. Another key factor was the team’s decision to build in process improvement along the way, including scheduling automation refactoring sprints twice a year. You are sure to agree that what Lisa and her team accomplished in their first year is remarkable. The project was done for a United States company in the finance sector.
Chapter 2, The Ultimate Database Automation
Henri van de Scheur tells a story that spans half a dozen years, relating what hap-
pened when he and his colleagues developed a tool for testing databases in multiple
environments. They set good objectives for their automation and a good architecture
for the tool. They automated so many tests that they developed a lifecycle for auto-
mated tests that included periodic weeding. Tests were run nightly, weekly, or with
special scheduling. Despite great success, a number of problems were encountered,
and Henri describes them honestly. The development of this database testing tool
(now open source) was done in Norway by a small team, over several years, and it
achieved a very impressive return on investment.
Chapter 3, Moving to the Cloud: The Evolution of TiP,
Continuous Regression Testing in Production
Ken Johnston and Felix Deschamps from Microsoft describe how they moved from
product-based to service-based automated testing by implementing the automation
in the cloud. Testing of Microsoft Exchange servers was already extensively auto-
mated, and much of the existing automation was reusable. Testing in production
seems a foreign concept to most testers, but this chapter explains why it was neces-
sary and beneficial to move to continuous monitoring and contains useful tips for
anyone considering a similar move. This experience takes place in the United States,
over three years, and unsurprisingly, Microsoft tools were used.
xlii
Preface
Chapter 4, The Automator Becomes the Automated
Bo Roop takes us on a guided tour of attempting to automate the testing of a test
automation tool. One of the first questions to ask a tool vendor is “Do you test the
tool using the tool?†But the answer isn’t as straightforward as you might think! With
his lively writing style, Bo gives an honest description of the difficulties and chal-
lenges he encountered, particularly in the verification of test results. It is a good idea
to find out what others have tried, and Bo shows the advantages of doing so. His sen-
sible approach to automation is to start by automating the easier components before
tackling the more complex. Unfortunately, this story does not have a happy ending.
It illustrates how presumably well-intentioned management actions can sabotage an
automation effort. For reasons that become obvious when you read this chapter, the
tool vendor is not identified: a fictitious company and tool name are used instead.
This experience takes place in the United States with one automator (the author)
and covers just over one year.
Chapter 5, Autobiography of an Automator: From
Mainframe to Framework Automation
John Kent tells us how and when test automation started and offers surprising infor-
mation about the origins of capture/replay technology. Understanding how automa-
tion worked on mainframes shows how some of the prevailing problems with test
automation have developed; approaches that worked well in that environment did
not work well with GUIs and the need to synchronize the test scripts with the soft-
ware under test. The principles John discovered and put into practice, such as good
error handling and reporting and the importance of testing the automation itself,
are still relevant and applicable today. John’s explanation of the economic benefits of
wrappers and levels of abstraction are compelling. He ends with some recent prob-
lem/solution examples of how web elements can trip up the automation. This United
Kingdom–based project involved mainly commercial tools.
Chapter 6, Project 1: Failure!, Project 2: Success!
Ane Clausen tells of two experiences with test automation, the first one unsuccess-
ful and the second one a solid success, largely due to what she learned from her first
experience. Lessons are not always so well learned—which is a lesson in itself for
everyone! Ane’s first story is told honestly and highlights the serious impact of insuf-
ficient management support and the importance of choosing the right area to auto-
mate. In her second story, Ane designed a three-month pilot study with clear objec-
tives and a good plan for achieving them. Many useful lessons are described in this
Preface
xliii
chapter, such as good communication (including using the walls), limited scope of the
early automation efforts, good use of standards in the automation, a good structure
(looking for common elements), and keeping things simple. The continuing automa-
tion was then built on the established foundation. Ane’s experience was with pension
and insurance applications in Denmark, using commercial tools.
Chapter 7, Automating the Testing of Complex Government
Systems
Elfriede Dustin, well known in the test automation world, shares her experience of
developing an automation framework for real-time, mission-critical systems for the
U.S. Department of Defense. Because of the particular type of software that was
being tested, there were specific requirements for a tool solution, and Elfriede and
her colleagues needed to spend some time searching for and experimenting with dif-
ferent tools. Their clear statement of requirements kept them on track for a success-
ful outcome, and their eventual solution used a mixture of commercial, open source,
and inhouse tools. They met with some unexpected resistance to what was techni-
cally a very good system. This story covers hundreds of testers and tens of automa-
tors, testing millions of lines of code, over a period of four and a half years.
Chapter 8, Device Simulation Framework
Alan Page from Microsoft tells a story of discovery: how to automate hardware device
testing. We all take for granted that our USB devices will work with our computers,
but the number of different devices that need to be tested is very large and growing,
and it was difficult to automate such actions as unplugging a device. However, a sim-
ulation framework was developed that has enabled much of this testing to be auto-
mated in a way that has found widespread use inside and outside of Microsoft. The
chapter includes numerous examples showing both the problems encountered and
the solutions implemented. This story is from the United States and was an inhouse
development now used by hundreds of testers.
Chapter 9, Model-Based Test-Case Generation in ESA
Projects
Stefan Mohacsi and Armin Beer describe their experience in using model-based
testing (MBT) for the European Space Agency (ESA). Their team developed a test
automation framework that took significant effort to set up but eventually was able
to generate automated tests very quickly when the application changed. This chapter
xliv
Preface
includes an excellent return-on-investment calculation applicable to other automation
efforts (not just MBT). The team estimated break-even at four iterations/releases.
The need for levels of abstraction in the testware architecture is well described. The
application being tested was ESA’s Multi-Mission User Information Services. The
multinational team met the challenges of automation in a large, complex system with
strict quality requirements (including maintainability and traceability) in a waterfall
development—yes, it can work! If you are thinking of using MBT, you will find much
useful advice in this chapter. A mixture of inhouse, commercial, and open source
tools were used by the team.
Chapter 10, Ten Years On and Still Going
Simon Mills updates his case study from our previous book, Software Test Automation
(Addison-Wesley, 1999). Still automating 10 years on is a significant achievement!
The original story is included in full and contains excellent lessons and ideas. The
success and continued growth of this automation is a testament to the sound founda-
tion on which it was built more than a decade ago. The case study describes many
lessons learned the hard way and some amusing observations on Simon and his team’s
first automation attempts. Their automation architecture separated their tests from
the specific tools they were using—a wise move as was proved later. They devised a
reliable way to document their tests that has stood the test of time. This story takes
place in the United Kingdom, uses commercial tools, and covers about 15 years.
Chapter 11, A Rising Phoenix from the Ashes
Jason Weden tells a story of initial failure leading to later success. The failure of the
first attempt at automation was not due to technical issues—the approach was sound.
However, it was a grassroots effort and was too dependent on its originator. When he
left, the automation fell into disuse. But the phoenix did rise from the ashes, thanks
to Jason and others who had the wisdom to build on what had gone before, making
many improvements to ensure that it was more widely used by business users as well
as technical people. Their “test selector†for choosing which tests to execute gave the
test engineers flexibility, and they ensured their legitimacy by keeping stakeholders
informed about bugs found by automated tests. The small team that implemented
automated testing for home networking equipment is based in the United States.
Chapter 12, Automating the Wheels of Bureaucracy
Damon Yerg (a pseudonym) tells of experiences in automating large systems for
a government agency, over more than 10 years, with hundreds of developers and
Preface
xlv
testers and more than a dozen automators. After some uncertain starts, external pres-
sure brought the right support to move the automation in the right way. The tests to
be automated covered diverse applications from web-based to mainframes, all with
environmental factors. This story brings home the need for automation standards
when many people are using the automation. Damon and his colleagues organized
the regression library into core and targeted tests to enable them to be selective
about which tests to run, and they related the automated tests to risk factors. The
basic design of the automation supported business expert testers and offered techni-
cal support as needed. One of the most powerful things they did to ensure continu-
ing management support was to develop a spreadsheet describing the be