
Agile Testing: No Mini-Waterfalls!

by Alan S. Koch, PMP

This column is the second in a series.

Part 1 What Is Agile Testing?

Part 2 Agile Testing: No Mini-Waterfalls!

Part 3 Agile Testing - Pair Testing

Part 4 Agile Testing - Technical Excellence

"In the first week of each Sprint, we testers don't have much to do. Then we go crazy for the last few days."

"The developers don't have much to do in the last few days of each Sprint, so they are usually getting started on the next Sprint."

"It's not unusual for the team to have nothing that is deliverable at the end of the Sprint. Everything's been coded, but the testing isn't done."

I hear these sorts of complaints often, and they clue me in to the fact that these Agile teams have fallen into a common trap: their Sprints are mini-waterfalls! This malady is common in organizations that are just becoming comfortable with an Agile approach.

The cause is predictable: development teams have used a waterfall approach for so long that they can't imagine working any other way. Although they can understand the idea of doing a small chunk of development in each Sprint, they envision doing that development work precisely the way they've always done it.

What is a Mini-Waterfall?

Although different teams may do it differently, the typical mini-waterfall looks like this:

  • The first few days of the Sprint are dedicated to clarifying the requirements details for the User Stories in that Sprint and making design decisions.
  • Then the developers get to work on coding and unit testing.
  • With any luck, the User Stories will be ready for functional testing several days before the Sprint ends, and the remainder of the Sprint is spent in testing and bug fixing.

Each Sprint is structured as a mini-waterfall, with the developers doing their part in the first two-thirds or so of the Sprint, then tossing the software over the wall to the testers for the final days of the Sprint. Of course, the coding sometimes takes longer than expected, which shortens the testing time to only a day or two, so when the Sprint is over, nothing is completely tested.

What's Wrong With Mini-Waterfalls?

Mini-Waterfalls cause a variety of problems, including the ones described in the quotes above.

  1. The mini-waterfall mode of operation makes poor use of the team members' time. Testers have little to keep them busy in the first part of the Sprint, and developers have little to keep them busy at the end.
  2. If time runs short in a Sprint, testing suffers. The result is either delivering poor-quality software or having nothing deliverable when the Sprint ends.
  3. Collaboration between developers and testers suffers. Agility is based on full and open collaboration among all of the team members. (We will discuss the many ways testers and developers can collaborate in a future installment of this series.)
  4. Teamwork is undermined. Maintaining the "toss-it-over-the-wall" relationship between developers and testers limits the degree to which they can act as a single team.
  5. Some developers believe that testing is not their job, and mini-waterfalls reinforce that belief. This is a predictable impact of the toss-it-over-the-wall mode of operation.

How Not to Fix Mini-Waterfalls

Some organizations have adopted an approach that eliminates the mini-waterfall within each Sprint by creating a slightly bigger mini-waterfall. During each Sprint, the testers are testing what the developers built in the prior Sprint.

Although this approach addresses the first two issues I listed above, it exacerbates the others. Even if you call the developers and testers one team, they are actually operating as two completely separate teams, just as they do in the traditional waterfall approach.

Working in this mode is taking a giant step backward -- away from Agility. It shuts down collaboration, and it ensures that the results of a Sprint are never production-quality. When the developers are "done," the software hasn't been tested; and when the testers are "done," there are bugs that still need to be fixed.

How to Avoid Mini-Waterfalls

Avoiding mini-waterfalls requires that the development team think about (and do) the work within each Sprint differently from the way they have historically worked. Instead of thinking "design, then develop, then test," they need to start thinking of design, development, and testing simply as the work the team does for each User Story.

Of course there is a natural order to software work. You can't test something that hasn't been coded, and you shouldn't code before considering the design. But we don't want those facts to constrain how we do our work.

The Agile methods are built on the idea that the User Story is our basic unit of work. Our backlog is made up of User Stories, we estimate and prioritize the User Stories, and we constitute our Sprints with User Stories. We need to continue that thought-process within our Sprints: The User Stories continue to be our basic unit of work, and we work them within each Sprint in priority order.

When we work in this way, it looks more like this:

After Sprint planning, the team divvies up the tasks for User Story 1 (the highest priority User Story in the Sprint) and gets to work on them right away. They strive to drive User Story 1 to their definition of done as quickly as possible, which of course includes testing.

In all likelihood, some team members will run out of User Story 1 tasks and move on to User Story 2 (the next highest priority User Story in the Sprint) before User Story 1 is completely done. But the focus remains on finishing User Story 1 ASAP. Then the focus moves to User Story 2, and so on.
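
To make this concrete, here is a minimal sketch (in Python; the story names and hours are hypothetical, not from the article) of working a Sprint story-by-story: a story counts as delivered only when all of its design, coding, and testing is done, and stories are worked strictly in priority order. Notice how a time crunch costs only the lowest-priority story.

```python
# Illustrative sketch (hypothetical numbers): a toy model of a Sprint worked
# in strict User Story priority order. A story is delivered only when its
# combined design + code + test effort is finished ("definition of done").

def run_sprint(stories, capacity):
    """Work stories in priority order until team capacity runs out.

    stories: list of (name, task_hours) in priority order, where task_hours
             covers design, coding, AND testing for that story.
    capacity: total task-hours the team has in the Sprint.
    Returns the names of the stories that met the definition of done.
    """
    done = []
    for name, hours in stories:
        if capacity >= hours:
            capacity -= hours
            done.append(name)
        else:
            break  # time crunch: only the lowest-priority stories suffer
    return done

backlog = [("Story 1", 30), ("Story 2", 30), ("Story 3", 30)]

# Full capacity: every story is designed, coded, and tested.
print(run_sprint(backlog, 90))   # ['Story 1', 'Story 2', 'Story 3']

# Ten hours lost to a crunch: only the lowest-priority story is in jeopardy;
# Stories 1 and 2 are still fully done, including their testing.
print(run_sprint(backlog, 80))   # ['Story 1', 'Story 2']
```

Contrast this with a mini-waterfall, where the same ten-hour crunch comes out of the testing phase at the end, leaving every story coded but none of them completely tested.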

The Benefits of Avoiding Mini-Waterfalls

This mode of operation corrects all of the problems of mini-waterfalls without adding any new problems to the mix.

  1. Every team member will be busy every day of the Sprint. The design work, development work and testing work are spread evenly throughout each Sprint.
  2. If time runs short in a Sprint, only the lowest-priority User Story in the Sprint is in jeopardy. All of the other User Stories will be completed according to the team's definition of done, including the necessary testing.
  3. Collaboration between developers and testers is enhanced because everyone is working on the same User Story (or at most, two of them). Developers are designing and coding while testers are preparing tests for the same User Story. Then everyone can run tests so they are completed ASAP.
  4. There is no more us-vs-them between the developers and the testers. Everyone is narrowly focused on one thing: completing each User Story ASAP.
  5. All of the developers end up being involved in the testing (and not just by fixing defects). Driving each User Story to "Done" leads the team members to collaborate on testing, making it clear that testing is everyone's job.
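
As a small illustration of this kind of collaboration (the `apply_discount` Story and its acceptance criteria below are invented for this sketch, not from the article), a tester can draft acceptance checks from a User Story's criteria in parallel with the developer writing the code; the Story meets its definition of done only when those checks pass:

```python
# Illustrative sketch (hypothetical Story): "As a member, I get 10% off
# orders of $100 or more." While a developer codes apply_discount, a tester
# drafts the acceptance checks for the same Story from its criteria.

def apply_discount(total, customer_is_member):
    """Developer's implementation of the hypothetical Story."""
    if customer_is_member and total >= 100:
        return round(total * 0.90, 2)
    return total

# Tester-authored checks, prepared while the code above was being written.
def test_member_at_or_over_threshold():
    assert apply_discount(120.00, customer_is_member=True) == 108.00

def test_member_under_threshold():
    assert apply_discount(99.99, customer_is_member=True) == 99.99

def test_non_member():
    assert apply_discount(150.00, customer_is_member=False) == 150.00

if __name__ == "__main__":
    test_member_at_or_over_threshold()
    test_member_under_threshold()
    test_non_member()
    print("Story meets its definition of done: all checks pass")
```

Because the checks and the code are written side by side from the same Story, anyone on the team -- developer or tester -- can run them, and testing is visibly everyone's job.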

If your Agile team is experiencing the pitfalls of mini-waterfalls, you have much to gain by following a different approach -- a more Agile approach! Waterfalls have no place on an Agile project; not even mini-waterfalls!
