Proactive testing: Tips from Michael Bolton

An opportunity arises

Attending Rapid Software Testing in London presented me with a unique opportunity: to corner one of the great testing minds, Michael Bolton, for a quick chat about my ideas around proactive testing & to gather whatever insight he had in this area.

The specificity problem

So I went on to tell him about the two techniques I'd been trying out to prevent defects going into our product's code base. He seemed happy enough with the idea of a show and tell [1]. What Michael did notice, though, was a potential "specificity" problem with my up front test cases [2], which is essentially when you skew a developer's mindset by providing them with a large list of test conditions, or worse still, bloated traditional scripted test cases. Whilst my proactive "up front test cases" were very lean, they were also very extensive, in that I tried to cover as many conditions as I could with them. So I could see Michael's point here: perhaps you are putting the developer in a comfort zone by providing these extensive checks to conform to prior to committing any code they'd produced. Would they be interested in thinking of other test conditions for themselves, or would they just proceed to execute my conditions?
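
For anyone who hasn't read my earlier posts, here's a rough illustration of the kind of lean condition list I mean. The feature and the conditions below are entirely invented for the example; the point is that each line is a one-line prompt for the developer to check before committing, not a scripted test case.

    # Invented example: lean up front test conditions for a hypothetical "CSV import" story.
    # Each entry is a one-line prompt to check before committing code, not a detailed script.
    up_front_conditions = [
        "Import a well-formed CSV & confirm every row appears in the grid",
        "Import an empty file: expect a friendly message, not a stack trace",
        "Import a file with a mandatory column missing",
        "Import a 100k-row file & note how long it takes",
        "Import the same file twice: are duplicates created or rejected?",
    ]

    # Print a simple checklist the developer can tick off (and hopefully add to).
    for number, condition in enumerate(up_front_conditions, start=1):
        print(f"[ ] {number}. {condition}")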

I’d talked to James Bach over email not so long ago about the same thing, trying to obtain some proactive tips. Like Michael, he had the same worry: the "specificity" aspect. I'm not too sure what to think myself. I think both James & Michael are correct in that there is a glaring danger: depending on the skill of the tester in providing risk-based coverage in these test conditions, you could be in danger of the developer falling into this "specificity" trap. Likewise, a tester can only cover so much himself; experience from working on something similar as a group, or from comparing and reviewing others' tests, will show you that each person generally has something new to contribute.

Certainly for myself this has not been a problem so far. Perhaps I'm extremely lucky to work with such talented people, who will not only take on board the test conditions I give them but also continue to think about other conditions I might not have thought of myself. It's interesting, & I think if anything it's an awareness aspect, like so many others in testing, that we have to constantly remind ourselves of.

Michael didn't stop there; he suggested generating test ideas based upon risk & coverage as opposed to extensive conditions. This would reduce the "specificity" factor, that's for sure, but I personally see a lot of value in some of the more detailed test conditions I generate. After all, these aren't just checks of requirements & business rules; I use a very thought-provoking technique, covering all aspects of testing, to generate them. It's essentially testing, not checking: whilst I produce lots of checks, the invaluable testing done to generate them reaps so many rewards, often calling for major changes to the requirements themselves. If you haven't read my post on "lean test case design" yet, please do, & you'll see exactly what I mean.

Both Michael and James are correct though: it is dangerous giving developers any illusion of a test safety net. Likewise, ditching the benefits of providing these conditions at all is also dangerous, so I guess in the end it comes down to what will work best for you on your own project. This works so successfully for me not just because I work beside talented people, but also due to our company's mindset towards shipping a successful end product, so the team will happily work together thinking of new test ideas and not just leave it to myself.

Rapid planning

We went off track a little and Michael talked about mapping out the functional areas of a product and applying a heuristic test strategy model to those areas. You would then join up the interaction points on your map and begin considering what test sessions you could apply, given your time constraints and capacity for test sessions. This was interesting! I was intrigued! I'd highlighted that a mind map could prove useful for such a thing, only for Michael to respond that the benefits of collaboration would be lost with a mind map. He went on to talk about the benefits of doing such a task with your team & with nothing more than some post it notes, a whiteboard & some pens! Genius, and how true: by mind mapping you're killing that collaborative, valuable thought process.

In terms of the map's capacity for change, he went on to highlight the fact that we can easily drop stuff in and out of scope whilst having a group discussion about it, & once we were happy with the map, or plan if you'd prefer to call it that, we could take a picture of it for storage.

So a very rapid, lean, collaborative & effective technique. I'm sure you've all done something similar before; I know I have, but I've never considered the benefits of taking such an approach before, I've just done it. Now I'm aware of those benefits I'll more actively seek to use this technique in future. Likewise, looking at it from a Session Based Test Management (SBTM) aspect sounds very useful, with SBTM being something our team has planned to try out.
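
As a thought experiment (this is my own sketch, not something Michael described), once the whiteboard map is photographed you could jot the agreed areas down and let a few lines of code trim the candidate sessions to whatever capacity you actually have. The areas, risk scores & session budget below are all invented for the example.

    # Invented example: turning an agreed map of functional areas into a session plan.
    from dataclasses import dataclass

    @dataclass
    class Area:
        name: str
        risk: int              # rough 1-5 risk score agreed by the group
        sessions_wanted: int   # how many sessions we'd ideally spend here

    areas = [
        Area("Login & permissions", risk=5, sessions_wanted=3),
        Area("Data import", risk=4, sessions_wanted=3),
        Area("Reporting", risk=3, sessions_wanted=2),
        Area("Settings", risk=1, sessions_wanted=1),
    ]

    session_budget = 6  # our capacity for this release

    # Spend the budget on the riskiest areas first; the rest drops out of scope for now.
    plan, remaining = [], session_budget
    for area in sorted(areas, key=lambda a: a.risk, reverse=True):
        take = min(area.sessions_wanted, remaining)
        if take:
            plan.append((area.name, take))
            remaining -= take

    for name, count in plan:
        print(f"{count} session(s) planned for {name}")

The point isn't the code, of course; it's that the map the group built already contains everything you need to make those scoping decisions together.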

Requirements tips

So back to proactive testing. Michael highlighted the importance of requirements feedback. I'd like to think every tester reading this provides feedback on the requirements they receive; if not, then please start doing it :-) He highlighted that by simply annotating your requirements documentation you'd save time & retain your focus. He also suggested decorating paragraphs or pages of requirements with what they're about, from which you could generate a test matrix or map of the testing types you'd apply to those sections. If at a glance you then noticed that one type of testing didn't appear to have a lot of coverage, whether from looking at your map or matrix or just from the annotations placed on the requirements themselves, you could at least ask yourself "is this a problem?"
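
To make that a bit more concrete, here's a small sketch of my own (not something Michael prescribed): if you jot down which testing types you've decorated each requirement section with, a few lines can tally the matrix & flag the thin spots. The section names & testing types are invented for the example.

    # Invented example: requirement sections decorated with the testing types that apply to them.
    from collections import Counter

    annotations = {
        "3.1 User registration": ["functional", "security", "usability"],
        "3.2 Password reset":    ["functional", "security"],
        "4.1 Report export":     ["functional", "performance"],
        "4.2 Audit log":         ["functional"],
    }

    # Tally the matrix: testing type -> how many sections mention it.
    coverage = Counter(t for types in annotations.values() for t in types)

    testing_types = ["functional", "security", "usability", "performance", "accessibility"]
    for testing_type in testing_types:
        count = coverage.get(testing_type, 0)
        flag = "  <- is this a problem?" if count <= 1 else ""
        print(f"{testing_type:<14} covered by {count} section(s){flag}")

Seeing "accessibility covered by 0 sections" doesn't necessarily mean something is wrong; it just prompts the question.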

Another tip Michael mentioned, which had also been mentioned previously in our Rapid Software Testing class by another very clever guy called Anders Dinsen, was to simply pull developers, BAs, even team leads or management over & ask them questions about the requirements. From this you could find out whether there were differences of opinion about how these should be done, highlighting key risks to your project early. Excellent! Usually I'd just discuss my feedback with a stakeholder, BA or TA. Sometimes I'd even just gather it all up and send it to them in an email, losing that benefit of verbal communication, which often reveals so much more. Getting the developers & management involved? Sure, why not? After all, you could argue they're the most key to making it successful. Even better, speaking to more than one person about the requirements. What a great idea! Why didn't I think of that?

Importance of pairing

Michael also highlighted the importance of cycling people through a feature and not creating knowledge silos. Pairing, if possible, could also be looked at as a proactive technique; better still, pairing testers with developers. I think reducing knowledge silos through pairing has been a challenge in our development team for a while. A lot of people want to be doing it, however the cost is often hard to sell to management. That being said, pairing often happens on more difficult tasks or near the start of a project anyway.

Summary

I think it’s always good to summarise longer posts when possible, so here goes:

Proactive testing

  • Specificity problem
    • Skewing developers’ mindsets, losing tests they might otherwise have run which are not documented.
    • Can arise with any tests you hand to a developer to run.
    • The more tests, the bigger the problem can be.
    • Don’t be scared to provide tests to developers, just be more aware of tests they might not run.
  • Requirements
    • Annotating documentation saves time & retains focus
    • Decorate paragraphs or pages with what they’re about
      • Can allow generation of a test matrix or map of types of testing for feature areas
      • Allows you to see if a type has low coverage & ask “is this a problem?”
    • Collaborate on feedback
      • Involve developers, BAs, TAs & management
      • Ask the same questions of different people & look for inconsistencies in their responses

Rapid planning

  • Mapping initial plans
    • Mapping functional areas of a product or a new feature on a whiteboard
    • Use sticky notes; this increases the map/plan’s flexibility
    • Easy to see interaction points
    • Allows for group collaboration, more test ideas!
  • Session based test management
    • Post it notes become your charters for test sessions
    • Flexible: generate as many sessions as you like, then keep only the most important ones for your plan’s time constraints
    • Easy reporting: “for the release we plan to do this number of sessions in these areas”

So that was a quick breakdown of a thirty-minute chat with the great Michael Bolton. I’m very glad I took the chance to steal some of his time. I’ve come away with some excellent ideas to try out, & I hope you picked up a few as well. Thanks for reading :-)

  1. Show and tell: If you haven’t read my previous post about these then let me explain. A show and tell is when a developer gets the chance to demonstrate their completed work to a tester, & the tester gets a chance to ask questions & try testing techniques on the functionality, with the hope of finding avoidable defects or uncovering issues that require further discussion, all prior to any code being committed into the product’s code base. I explain them in more detail here, along with the benefits of using this technique.
  2. Up front test cases: Test conditions created up front, prior to any code being written. Internally I call these “up front test cases”; some of you may refer to them as acceptance tests, and whatever you want to call them is fine. The main point is that you create a list of test conditions for developers to check prior to completing a piece of work. I explain them in more detail here, along with the benefits of using this technique. They also tie into my lean test case design technique, which I talk about here.

    What I love about writing these up front is that it’s much like testing your application. Your mind goes crazy thinking of ideas for new test conditions, & while you’re writing these you’re also doing a form of requirements validation, as you’ll spot gaps & potential issues. No doubt you’ll have previously reviewed & fed back on these requirements anyway, but you’ll be surprised how many extra things you notice once your brain is in full-blown test mode.

Related posts:

  1. Tales from the trenches: Proactive testing
  2. Tales from the trenches: Lean test case design