Testing is critical for digital services - here are some of our approaches

Our testing approach has evolved carefully over the years, taking on board Agile and Lean practices.

Quality is a key value of every successful Agile team and testing lies at the heart of quality. Agile testing is therefore critical to the success of an Agile team. Best practice in terms of testing is always evolving. We are always learning new techniques and strategies and incorporating them into our testing approach. Let's look at the story of an Agile testing approach currently in use by our teams.

Testing Is The Alpha And Omega

In our experience, user stories are initiated by the customer/product owner. That then triggers a conversation between the customer, tester and coder about the nature of the story. A tester may also cause a user story to be written by spotting a gap in the requirements - which is fairly common.

The first question we should ask when starting work on a new User Story is: "Under what circumstances will this software be complete?". The second question should be: "How am I going to test this?". The two questions go hand in hand: a large proportion of the answer to the first is based around testing. Our first job should be to create a Test Design (AKA Test Plan) for the User Story. The Test Design should include techniques and approaches for a reasonably standard group of items, for example:

  1. Function Testing
  2. User Testing
  3. Risk Testing
  4. Automatic Checking
  5. Stress Testing
  6. Scenario Testing

This is just an example list, not definitive law! In fact, there are many types of testing not included on this list. Take a look at the testing quadrants to see the other types of testing that you might consider.

Testing Poster

Our Test Design is a rough statement of strategy and maybe highlights a couple of areas that will need special attention. We would normally expect the Test Design to be pretty similar across most User Stories. The Test Design is the strategy for testing rather than a list of the tests themselves.

The Alpha

Our testing begins with Acceptance Tests. The tester(s) at this point are working with the Product Owner to define a set of scenarios that must be satisfied in order for the software to be functionally accepted. A team should have a reasonably consistent approach to this. For example, a team may choose to use a BDD style of syntax for scenarios such as:

  • Given that my stomach is empty
  • When I eat a bunch of apples
  • Then my stomach is full

This style of expression is very useful in an environment where a testing tool such as Behat is used as the scenarios can be translated into automated tests.
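To make the translation concrete, here is a minimal, tool-agnostic sketch of how a BDD tool wires scenario lines to step functions. This is not Behat's actual API (Behat maps lines to PHP step definitions via annotations); all names here are illustrative.

```python
# Minimal sketch of how a BDD tool maps scenario lines to step functions.
# Behat/behave provide this wiring for you; this just shows the idea.

steps = {}

def step(pattern):
    """Register a function as the handler for a scenario line."""
    def register(fn):
        steps[pattern] = fn
        return fn
    return register

@step("my stomach is empty")
def stomach_empty(ctx):
    ctx["fullness"] = 0

@step("I eat a bunch of apples")
def eat_apples(ctx):
    ctx["fullness"] += 5

@step("my stomach is full")
def stomach_full(ctx):
    assert ctx["fullness"] >= 5, "expected a full stomach"

def run_scenario(lines):
    ctx = {}
    for line in lines:
        # Strip the Given/When/Then keyword, then dispatch to a step.
        keyword, _, text = line.partition(" ")
        steps[text.replace("that ", "", 1)](ctx)
    return ctx

run_scenario([
    "Given that my stomach is empty",
    "When I eat a bunch of apples",
    "Then my stomach is full",
])
```

If any step's assertion fails, the scenario fails - which is exactly the behaviour we want from an automated acceptance test.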

We can see here that testing is The Alpha. We start the process of development by defining tests. In effect, our specification is our tests.

Working

At this point, the rest of the team can get involved in creating a great piece of software. Our clients can write acceptance tests in plain English, our SPACECRAFT UX team can start creating the User Experience, our developers can start making the Acceptance Tests pass, users can test the solution for usability, etc. The process of working is primarily a testing-based task! At this point we would expect all work to be focused around these kinds of testing:

Role: Programmer

  • Make Acceptance Tests pass
  • Make GUI Tests pass
  • Write Unit Tests and make them pass
  • Write Integration Tests and make them pass

Role: User Experience

  • Liaise with Users, Testers and Developers to ensure GUI Tests pass

Role: Tester

  • Write and run Acceptance Tests
  • Write and run GUI Tests
  • Write and run Manual Test Scripts
  • Exploratory Testing
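As an illustration of the Programmer's "Write Unit Tests and make them pass" activity, here is a small sketch using Python's unittest module. The function under test is invented for this example; any xUnit-style framework in your team's language works the same way.

```python
import unittest

def full_after_eating(apples_eaten):
    """Hypothetical function under test: a stomach is 'full' after 5 apples."""
    return apples_eaten >= 5

class TestStomach(unittest.TestCase):
    def test_empty_stomach_is_not_full(self):
        self.assertFalse(full_after_eating(0))

    def test_bunch_of_apples_fills_stomach(self):
        self.assertTrue(full_after_eating(5))

# Run the tests without exiting the interpreter.
unittest.main(argv=["unit-tests"], exit=False)
```

Tests like these are cheap to run, so they become the backbone of the automated check-in builds described below.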

Most teams have a high degree of automation in these areas of testing. It is normal for Unit and Integration Tests to be executed automatically on check-in. Depending on the length of the GUI and Acceptance Test cycles, these may be executed on check-in, nightly, or on a per-User Story basis.

Continuous Communication is the key to testing during development of a User Story. It is normal for testers to receive periodic, incomplete releases of a User Story so that they can explore and refine tests based on the work in progress. Many User Stories experience specification change during development as the underlying requirements become clearer. Continuous Communication keeps testers in the loop on changes. The tester is also able to feed back to the developer before the User Story is complete, reducing time wasted on erroneous development.

The Omega

All User Stories end with a final run-through of the tests that have been written. This activity is co-ordinated by the testers. There should also be a final exploratory test run to catch any odd bugs that may have crept in that are not covered by the testing. Of course, we would expect tests to be written for any bugs found during exploratory testing!
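Writing a test for a bug found during exploratory testing pins it down so it cannot silently return. A hedged sketch of such a regression test (the bug and the fixed function are invented for illustration):

```python
import unittest

def apples_remaining(bunch_size, eaten):
    """Hypothetical function that had a bug: it previously returned a
    negative count when more apples were eaten than the bunch held."""
    return max(bunch_size - eaten, 0)

class TestExploratoryBugRegression(unittest.TestCase):
    def test_eating_more_than_the_bunch_never_goes_negative(self):
        # Written the moment exploratory testing surfaced the bug.
        self.assertEqual(apples_remaining(5, 7), 0)

    def test_normal_case_still_works(self):
        self.assertEqual(apples_remaining(5, 2), 3)

unittest.main(argv=["regression-tests"], exit=False)
```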

There are some other testing activities that may be expensive to execute on a continuous basis as work on a User Story progresses. The cost of these activities is generally time and skill related. Either the activities require a highly-skilled team member (e.g. security tester) or a large amount of time (e.g. hallway testing for UX) that must be planned in. These kinds of activities may only be undertaken at the end of each User Story or at fixed time intervals (each week, each Sprint, each iteration, etc.). The kinds of testing activities addressed here are:

  • Cross-browser Testing
  • Hallway UX Testing
  • Device Testing
  • Security Testing
  • Load Testing
  • Stability Testing
  • and many more...

Where possible the team should seek to find ways to optimise these processes so that they can be executed as frequently as possible.

Aftermath

The future is always in motion, and so should our testing strategy be. We are never satisfied with our work and we always seek out improvement. Ours is a tireless quest for perfection!

A great example of this testing approach was the Ministry of Justice Employment Tribunals Digital Service. Check out our talk on Agile approaches and the case study!

--

If you are interested in approaches to Agile for your project, we would love to hear from you!


