Sunday, January 25, 2009

It's cheaper to spend on testing!

We all hear about the value of testing. The arguments sound so convincing. Write automated tests and you can find your bugs before QA, your customers will be happy, and the cost of quality will go down. But then we close the blog page or put down the book on testing, go back to work, and face the time pressures of getting a release out. We tell ourselves tomorrow we will create the tests. When we finally get the time, our boss says, "Don't waste your time coding tests, just focus on getting your code right!" She goes on to say, "I don't have the money or the time to let you waste on automated testing. We can't maintain the code we have, so how will we also manage a huge base of test scripts?"

Well, let me tell you a story about how testing early could have saved the day on a certain project. The software was encountering sporadic problems, and normal single-user QA could not find their source.

Whenever the product was deployed to production, the customers would report all sorts of weird messages. The system admin and product owner kept saying these error messages were attributable to the vendor's product failing under load.

So JMeter was brought out of the closet, and the project team proceeded to record a couple of performance test scripts. While JMeter is neither a regression nor a functional testing tool, it does have some rudimentary functional testing features, and those were used to verify that the correct page was getting displayed. Sure enough, within a couple of hours, using response assertions and duration assertions, the team found a file handle leak and re-created all of those weird error messages. They even identified a couple of functional defects which no one else had caught yet; it turned out these were attributable to resource synchronization. By using JMeter to put the system under load, instead of the normal manual or automated processes, the team was able to see where and why the application was failing in production.
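JMeter test plans themselves are recorded as XML, but the idea behind the two assertions the team leaned on is easy to sketch in plain Python: fire concurrent requests at the application, then assert on the response body (a response assertion) and on the elapsed time (a duration assertion). Everything below — the throwaway local server, the page title, the thread counts, and the two-second threshold — is made up for illustration; it is not the project's actual test plan.

```python
# Illustrative sketch of JMeter-style response and duration assertions.
# A throwaway local HTTP server stands in for the real application.
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><title>Order Status</title></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the run quiet
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

def hit(_):
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=5) as resp:
        page = resp.read().decode()
    elapsed = time.monotonic() - start
    # Response assertion: the *correct* page came back, not an error page.
    assert "Order Status" in page, "wrong page returned"
    # Duration assertion: the request finished within the (made-up) SLA.
    assert elapsed < 2.0, f"too slow: {elapsed:.2f}s"
    return elapsed

# 20 concurrent "users" issuing 100 requests in total puts mild load
# on the server -- enough to surface load-only failures like leaks.
with ThreadPoolExecutor(max_workers=20) as pool:
    timings = list(pool.map(hit, range(100)))

print(f"{len(timings)} requests passed both assertions")
server.shutdown()
```

The point of the sketch is the shape of the check, not the tooling: a functional assertion run under concurrency will catch failures (like a file handle leak) that the same assertion run one request at a time never will.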

In just a couple of hours, the team was able to get moving again after being stalled for months. Everything except the original code had been blamed: operating systems, integration servers, mainframes, and so on. To be fair, the project team did find problems in all of these infrastructure areas, but the biggest problem was in the code.

The lost opportunity, staffing costs, and vendor support costs could have paid for a lot of test scripts. A lot of time and money was wasted. But the good news is that there is now real support for improved testing, both functional and performance. The product owner is excited to get more tests created and to get a copy of JMeter on his desktop. He is even intrigued by continuous integration. So maybe the hassle and delays were worthwhile: our product's future can be improved.

While the saying "Test early, test often" is true, what is also true is "some testing late is better than no testing."

1 comment:

Edward said...

This is a great posting - I find myself fighting this battle (showing value of tests and the costs associated with not testing).

I recently started using Sonar, and there's a plugin that calculates technical debt. If management doesn't understand code coverage or complexity figures, it should be able to comprehend a dollar figure (and how it's calculated).

Another thing to consider: unit tests are NOT enough. A full testing suite would include unit tests, then move on to integration, smoke, load, etc. And it's not always the dev team's responsibility to provide all these degrees of testing.