Like many testers, I struggled in my first year on a Scrum team. How do I write tests without a spec? How do I know when testing is done when there is no test plan? This article shows the important contributions software testers on Scrum teams can make, starting when features and user stories are created.

Author: Dave McNulla, http://dmcnulla.wordpress.com/

Agile is not a process. It is a set of values intended to help people develop software more quickly and with better quality. If we value people and how we work together over tools and processes, we can figure out how to get the job done better. If we value working software over comprehensive documentation, we have a better chance of delivering working software. If we work with the customer instead of negotiating a rigid contract, the software will be closer to what the customer can actually use. If we value making the right changes over following a plan, we have a better chance of getting past what the plan failed to account for. Twelve key principles support these values. Even with those values and the somewhat specific principles, the door is left wide open on how a team practices agile.

Scrum is a way to develop software that applies the agile values and principles with some guidelines to help. The saying "it is hard by the yard but a cinch by the inch" sums up Scrum for me: small teams building small things quickly and well. I work on a Scrum team. I am not the only tester, but I am the tester. That means I bring a special set of skills to the team.

When you join a Scrum team as a tester, you will help that team succeed by bringing your skills. There are three things you can do to help yourself help them, and two of them will sound familiar: make sure the right product is getting created, make sure the product is being made right, and stay flexible to maximize how you help the team.

An Agile Testing Approach. Source: http://pathfindersoftware.com/2011/05/functional-test-vs-technical-testing/

Before story implementation

First, you can help the team by validating the product. Validation asks whether the right product is being designed. Talk through the feature to make sure it makes sense. See it through to the end and clear up all the ambiguities. It is easier to tie up loose ends in conversation than in code.

Identify product risks and system risks before they become issues. This may require planning special tests. For instance, if your server supports load balancing, you may need to test many of the features in a load-balanced environment, maybe even under traffic. I had a group of system testers come after me to verify the system, but fixing the bugs they found was substantially more expensive.

Make sure the feature will support test automation. If test automation support needs to be developed, include it in the estimate and design. A server capability may need to be implemented differently to support mocks, which are helpful in automated tests.
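As a minimal sketch of why this matters, consider a feature that depends on an external service. The names here (a `PriceService` interface, a `Cart` feature) are invented for illustration; the point is that designing the feature to accept its dependency by injection is what makes a mock substitutable in automated tests:

```ruby
# Hypothetical mock that honors the same interface as a real price
# service, but returns canned data instead of making a network call.
class MockPriceService
  def initialize(quotes)
    @quotes = quotes # sku => price
  end

  def quote(sku)
    @quotes.fetch(sku)
  end
end

# The feature under test accepts the dependency via injection, so the
# test can pass in the mock where production code passes the real service.
class Cart
  def initialize(price_service)
    @price_service = price_service
    @skus = []
  end

  def add(sku)
    @skus << sku
  end

  def total
    @skus.sum { |sku| @price_service.quote(sku) }
  end
end

cart = Cart.new(MockPriceService.new("A1" => 5, "B2" => 7))
cart.add("A1")
cart.add("B2")
puts cart.total # 12
```

If the server had been designed to construct its price client internally, this substitution would be impossible without code changes, which is exactly the kind of cost worth raising during estimation.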

This requires you to be there from the beginning. Do not miss the initial meetings for designing and estimating the features. If you aren’t being invited, crash the party or make sure you get invited. That is when you can give this initial feedback to make sure the right product is being made.

During implementation

Second, you can help by making sure the tests are getting written from the start. The best way to do that is to meet with the developer and story author before any work is done. As Janet Gregory said in the book she authored with Lisa Crispin, Agile Testing: A Practical Guide for Testers and Agile Teams [1], "We found great success with the 'Power of Three.'" When the tester, developer, and customer (or customer advocate) meet to discuss the feature from the beginning, and again at each clarification, the tester provides input on what will be tested. If your team practices Behavior Driven Development (BDD), write the scenarios yourself, or at least steer them to make sure they are not light on verification.

If you do not write the scenarios, review them. They should explain the purpose of the feature. You should be able to tell the state of the system at the beginning, the trigger for the test, and the criteria for passing. I take it upon myself to balance keeping scenarios unrepeated (following the DRY principle) with including enough detail to know which business rules are being tested. Of course, all the acceptance criteria must be verified.
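That anatomy is easiest to check against a concrete scenario. Here is a hypothetical Gherkin example (the feature, code, and amounts are invented for illustration): the starting state is the Given, the trigger is the When, and the pass criteria is the Then.

```gherkin
Feature: Discount codes
  # Hypothetical example for illustration only

  Scenario: Valid code reduces the order total
    Given a cart containing items worth $50
    When the customer applies the discount code "SAVE10"
    Then the order total is $45
```

A scenario missing any one of these three parts is a signal that the feature's behavior is still ambiguous.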

Before completion

This will sound crazy to any team that practices BDD, but you will not reach sufficient quality without exploratory testing during and after feature implementation. For some reason, I have not figured out how to explore before the feature exists. Keep notes on how you test your feature, including key parameters and settings. A few tools that record web activity can make tracking your work easier in case you find issues.

Even if your team uses a secure architecture, you can verify the interfaces for vulnerabilities. SQL injection and cross-site scripting are the most obvious tests, but investigating the information that gets passed using an intercepting proxy like OWASP ZAP [2] can expose greater dangers to the privacy of users.
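One simple cross-site scripting check is to submit a script payload and look at whether the response echoes it back unescaped. This is a rough sketch of that idea (the helper name and response bodies are invented, and a real check would run against your application through the proxy):

```ruby
# A canonical reflected-XSS probe payload.
XSS_PAYLOAD = "<script>alert(1)</script>"

# Hypothetical helper: given a response body from a request that submitted
# XSS_PAYLOAD, flag a possible reflected XSS if the payload comes back raw
# instead of HTML-escaped.
def reflected_xss?(response_body)
  response_body.include?(XSS_PAYLOAD)
end

# An escaped echo is safe; a raw echo is a finding worth investigating.
safe_body   = "You searched for: &lt;script&gt;alert(1)&lt;/script&gt;"
unsafe_body = "You searched for: <script>alert(1)</script>"

puts reflected_xss?(safe_body)   # false
puts reflected_xss?(unsafe_body) # true
```

A string match like this only catches the crudest cases, which is why a purpose-built scanner such as ZAP, with its library of payloads and response analysis, finds much more.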

Most development teams find it hard to run a full system performance test for each feature. Single-feature performance testing is much more viable. Create support for the current feature, such as a JMeter script with parameters for inputs like usage levels.
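Even without JMeter, a single feature can be timed at several usage levels with a few lines of code. This sketch uses Ruby's standard Benchmark module; `feature_under_test` is a stand-in for whatever call your feature exposes:

```ruby
require "benchmark"

# Stand-in for the feature call you want to measure; replace with a
# request to your own feature's entry point.
def feature_under_test(n)
  Array.new(n) { |i| i * i }.sum
end

# Time the feature at increasing usage levels to spot non-linear growth.
[100, 1_000, 10_000].each do |level|
  time = Benchmark.realtime { feature_under_test(level) }
  puts format("level %-6d %.6fs", level, time)
end
```

Recording these numbers sprint over sprint gives you a per-feature baseline, so a regression shows up long before a full system performance run would catch it.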

All the time

As stated, Scrum teams are intentionally small, and most have a high ratio of developers to testers. Even though testing can account for most of your time, successful teams have flexible members. You can help by being willing to step outside your comfort zone: setting up servers and tools, solving build issues, and so on.

Writing the code behind the tests, such as step definitions for Cucumber scenarios, helps the team get to 'done' and also helps you understand what happens when each step runs. You may find that a step or function name is doing less than it implies.
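The mechanism is easy to see in miniature. This is not Cucumber's actual implementation, just a toy registry showing the idea: a step definition pairs a pattern with a block, and running a step means matching the step text and calling the block with the captured values (the cart steps are invented for illustration):

```ruby
# Toy step registry illustrating how Cucumber-style step definitions work.
STEPS = []

def Given(pattern, &block)
  STEPS << [pattern, block]
end

def When(pattern, &block)
  STEPS << [pattern, block]
end

def Then(pattern, &block)
  STEPS << [pattern, block]
end

# Find the first pattern matching the step text and call its block with
# the regex captures, just as a BDD runner would.
def run_step(text)
  STEPS.each do |pattern, block|
    if (m = pattern.match(text))
      return block.call(*m.captures)
    end
  end
  raise "Undefined step: #{text}"
end

# Hypothetical step definitions for a cart feature.
Given(/^a cart with (\d+) items?$/) { |n| @count = Integer(n) }
When(/^I add (\d+) more$/)          { |n| @count += Integer(n) }
Then(/^the cart has (\d+) items$/)  { |n| raise "expected #{n}, got #{@count}" unless @count == Integer(n) }

run_step("a cart with 2 items")
run_step("I add 3 more")
run_step("the cart has 5 items")
puts "scenario passed"
```

Tracing real step definitions this way is how you discover, for example, that a step named "the order is confirmed" merely checks for a page title rather than verifying the order itself.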

Pairing on development and testing strengthens both team members. When people cross disciplines, they improve their understanding of the product, the code, and what other stakeholders find important.

So much to do!

As with any good audit, there is so much to do. It is amazing that stories get done, that features are completed, and that we find a way to keep a sustainable pace. To keep my sanity, there are a few things I try to remember. Slower is faster: we maintain a better pace by not pushing too hard. The whole team is responsible: whether it is building test support into features or having more people test, it is not done until it is done. There is a backlog: if a test is not possible because of a dependency, such as acquiring a skill, schedule it on another card and make sure it is not forgotten.

Scrum is a fun way to work because it reflects my values. If you value your team, keep the software working, solve for the customer, and embrace change, you and your team will have fun, too.

Reference

1. Lisa Crispin and Janet Gregory, Agile Testing: A Practical Guide for Testers and Agile Teams, Addison-Wesley, ISBN 978-0-321-53446-0

2. https://www.owasp.org/index.php/OWASP_Zed_Attack_Proxy_Project

About the Author

Dave McNulla has been testing software since 1993. He has used Microsoft Test for pre-web applications, WinRunner, TestPartner, and QuickTest Pro. In 2008, he started using Watir to build a test framework that was used for years. He also contributes to support forums such as Stack Overflow and the ‘Watir General’ group at Google Groups. Visit his blog at http://dmcnulla.wordpress.com/