Tuesday, June 21, 2011

Dev Box testing is a mindset shift for QAs

I never thought this needed a blog entry of its own, since it has been common practice in all the Agile projects I have done at TW. But apparently it isn't so easy in other organizations.

A “Dev Box” is simply a developer machine on which active development happens. The idea of “Dev Box” testing is to have the QA on the team do a quick sanity test of the story on the developer machine before the final check-in, i.e. before the developer moves the card to Development Complete or Ready for Test.

It is as informal as a developer pair shouting out to a QA on the team, “Hey, we think we are done with the story, can you do a quick round of Dev Box testing before we call it dev complete?”, and the QA coming over to the dev pair's station and doing a quick test. This usually takes no more than 15 minutes.

Even though this sounds very basic, it has the following advantages:

* It reduces the wait time to find defects: the QA need not wait for a build to be churned out and deployed to an environment, so the developers get quick feedback.

* It gives developers insight into how a QA tests the application, and vice versa.

* It also aligns developers and QAs towards building a better quality product, by having quality discussions much earlier in the cycle with a tangible story at hand. Sometimes the QA has useful input on how a widget behaves on the web page, and it may be a quick-fix enhancement the developers can take up during the testing session itself.

Apparently this is not quite as easy as it sounds in organizations where cross-functional teams are not common practice. I have worked with clients whose QAs report to a separate quality assurance department and refuse to test on a developer machine because policy does not allow it, even though they personally agree with the benefits of cutting down the feedback loop.

Another client I was talking to remarked that QAs in their organization were actually driven by wanting to log a huge number of defects in their defect tracking system, not by wanting to deliver a quality product. This went so far that their yearly appraisals were affected partly by what was logged in the defect tracking system, as their managers would only look at those reports.

By having a QA as an integral part of the development team and adopting practices like Dev Box testing, the team goes through a mindset shift after which everyone is focused on one goal: delivering business value by building a quality product.

Wednesday, June 15, 2011

Release plan checklist

When I build release plans, or review the release plans of other projects, I end up running through a checklist in my mind to determine whether the plan is good enough. If you are an Agile PM trying to build a plan, this could be useful for you.

Iterations

* Is the length of the iteration enough to complete a medium-sized story within it?

* Does the number of iterations fit within the acceptable timeline?

* Can we assume a production-quality build after every iteration?

Estimation

* Are the stories sized relative to one another?

* Does the team understand the estimation unit across all roles?

Velocity

* Is the planned velocity the average velocity of the last 3 iterations?

* If it's a new project, are we planning based on a raw velocity exercise?

* Are team members across roles involved in planning the velocity?

Resource ramp up

* Is there time factored in for new people to ramp up on the team?

Ordering of stories

* Are the stories ordered around the critical path functionality? (Always remember: the critical path determines the schedule.)

* Are the higher priority stories slotted for earlier iterations?

* Are the stories ordered so that they meet any functional or technical dependencies?

Negotiable scope

* Are there some “nice to have” stories in the plan which can be negotiated later, if need be, to bring the project back on track?

Spikes / Proofs of concept

* For technical unknowns, are there spike stories which allow the team to explore technical solutions?

Non-functional requirements

* Is there clarity on the requirements for performance, security, and scalability, and on how they are going to be addressed?

Functional automation

* Will developers do functional automation as part of a story, or will this be done as part of QA?

Regression/Stabilization

* Is there a need for a separate regression/stabilization iteration once development is complete?

User acceptance testing

* How much time is required to UAT the set of stories the team will deliver?

Risks

* Does the team understand how much risk there is in the plan?

* Are these risks shared with the customer?