In an effort to get our teams to a shared understanding of ‘Done Done’, we’ve started looking at acceptance testing. Over the last 18 months or so we’ve moved from a traditional waterfall process to a full-on Agile one; our final hurdle is the QA process, which is still very waterfall-esque.

We’ve done a lot to get QA involved early on and to make environments available for the different stages of testing. But we still have a major problem whereby nearly all of our QA happens only after the developers yell ‘FINISHED!’.

The problem is, we developers tend to lie. Not knowingly, more out of a desire to move on to something more interesting. So even though we may be making our burn-down chart look a little neater, we’re in fact delaying the inevitable. QA returns our shiny code to us, complete with bugs we never thought possible.

So recently we’ve started looking at FitNesse, both to bolster our QA efforts and to shed more light on the QA process itself. If more people can see it, we can all help improve it by offering new tests and finding ways to improve how the existing tests work.

I’ve known about FitNesse for quite some time, since CITCON made it to London, my first open space conference. Even then, the general consensus was that it was great but quickly turned into a beast to maintain. That problem still remains: at the recent alt.net conference, Gojko himself said it ain’t great, but it’s all we’ve got.

Having waited too long for someone to come up with a better alternative (whilst keeping an eye on J Miller), we’ve started to give FitNesse a go. I’m doing my best to ensure we don’t fall into the pitfalls others have hit before us. My main gripe would have been the need to build fixtures: I’ve heard many a nightmare about fixtures replicating ‘value’ already given by unit testing, being brittle, and requiring as much effort as production code to keep going. These are all things I’m trying to steer us away from.

So recently my new team showed me our second good acceptance test; the first, the one referred to in the title that inspired this post, came from another team I used to experiment with. The approach I’m taking is to point FIT at the parts of the system that have to handle all the ‘scenarios’. We’re following a flavour of SOA that I like (not full-on SOA, which still makes me uncomfortable), and we’re also cleaning up some of the more terrible examples of misplaced responsibility we have. This gives us an opportunity to move domain logic from the ASP.NET front end (urgh) to the places it belongs.
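To make that concrete, here’s a made-up example of the kind of move I mean; the discount rule and the IDiscountService/DiscountService names are purely illustrative, not our actual code. Logic that used to sit in a button-click handler on a page gets pulled down into a service class, where an acceptance test (or anything else) can reach it without a browser:

    // Purely illustrative: the sort of rule we're pulling out of the
    // ASP.NET code-behind and into the service layer.
    public interface IDiscountService
    {
        decimal CalculateDiscount(string customerType, decimal orderValue);
    }

    public class DiscountService : IDiscountService
    {
        public decimal CalculateDiscount(string customerType, decimal orderValue)
        {
            // This used to live in a button-click handler on an ASPX page.
            return (customerType == "Gold" && orderValue > 50m)
                ? orderValue * 0.1m
                : 0m;
        }
    }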

The SOA gives us a nice potential for isolation. That means we’re able to take our examples from the business (or indeed from the beast we’re rewriting) and feed them in, table-like, to check that the code we wrote was the right code! One thing I really don’t want to get hung up on is driving Selenium or other slow front-end tests from FitNesse. That would just give us the illusion of acceptance testing, when all we’d really be doing is moving our old fragile, brittle, slow Selenium tests into a fancy wiki that no one would read.
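To show what I mean by ‘table-like’, here’s roughly the shape of a FitNesse page for the made-up discount rule above. Each row is an example, and the column ending in ‘?’ is what the code under test gives back, checked against the expected value in the cell:

    !|DiscountRules|
    |customer type|order value|discount?|
    |Gold         |100.00     |10.00    |
    |Gold         |20.00      |0.00     |
    |Standard     |100.00     |0.00     |

The nice thing is that the business can read and argue about those rows without ever seeing the fixture behind them.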

But I’ll come back to that later. One decision we’ve made, and you might disagree with this, is to tie the fixtures directly beneath the WCF contract we have for the service. This means the fixture talks to a purer, less noisy version of what the service offers. So far this has allowed our tests to read quite clearly in terms of what makes up a table. It has also hidden quite a bit of the fluff we use on all our services, which would give our fixtures another reason to be brittle.
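To be clearer about what ‘beneath the WCF contract’ means, here’s a minimal sketch of the fixture behind that table, using the same made-up names as above and assuming the ColumnFixture base class from the .NET port of FIT. The key line is that we construct the service implementation directly rather than going through a WCF proxy, so no endpoints, bindings or message wrappers get anywhere near the test:

    // Hypothetical fixture backing the DiscountRules table above.
    public class DiscountRules : fit.ColumnFixture
    {
        // Input columns, set by FitNesse for each row of the table.
        public string CustomerType;
        public decimal OrderValue;

        // The same interface our WCF service contract exposes, but newed up
        // directly rather than resolved through a ChannelFactory or proxy.
        private readonly IDiscountService service = new DiscountService();

        // Output column ("discount?"), compared against the expected cell.
        public decimal Discount()
        {
            return service.CalculateDiscount(CustomerType, OrderValue);
        }
    }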

It does, however, mean that we won’t pick up any problems with the WCF layer we put on top. I’m treating that as an OK trade-off, considering how good it felt to see 400 test cases passing and 100 failing. I never thought I’d enjoy seeing red tests so much!

Methinks I need to be more specific (and more frequent!) in my blogging; maybe if I don’t let posts sit in drafts for two weeks they’ll be better. Hopefully there’s more on acceptance testing to follow!