
Tuesday, June 23, 2009

What should we measure on a project?

While reading "The Goal", the author clearly states that there are 3 important measurements in a system ( a manufacturing plant, which can be extended to a software delivery team)
  1. Throughput - Which in a software project would be the No. of story points actually delivered at the end of the cycle (Iteration or Release)
  2. Operational Expense - Which in a software project would be the cost incurred (eg: billable consultants, development kit etc..) to actually deliver features (or story points)
  3. Inventory - Which again in a software world would translate to the number of stories sitting in transition states and not yet customer signed off (Eg: In Development, In QA, On Hold).
The more important point is that one cannot improve on one of these measures in isolation.

I say this is important because teams delivering software often tend to focus on "increasing the team velocity" without looking at how to reduce the number of hangover/in-flight stories.

It is also common for teams to sign up for more story points in an iteration or release in order to show an improvement in throughput, which only results in more stories hanging over because of some unresolved bottleneck.
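
To make this a little more concrete, here is a minimal sketch (in Python, with hypothetical story names and numbers) of an iteration report that puts throughput, inventory, and operational expense side by side, instead of reporting velocity alone.

```python
from dataclasses import dataclass

# Transition states that count as "inventory": started but not yet signed off.
IN_FLIGHT_STATES = {"In Development", "In QA", "On Hold"}

@dataclass
class Story:
    name: str
    points: int
    state: str  # e.g. "Signed Off", "In QA", "On Hold"

def iteration_report(stories, operational_expense):
    """Summarise an iteration with all three measures, not just velocity."""
    throughput = sum(s.points for s in stories if s.state == "Signed Off")
    inventory = sum(s.points for s in stories if s.state in IN_FLIGHT_STATES)
    return {
        "throughput_points": throughput,          # accepted by the customer
        "inventory_points": inventory,            # hangover / in-flight work
        "operational_expense": operational_expense,
        "cost_per_point": operational_expense / throughput if throughput else None,
    }

# Hypothetical numbers, purely for illustration.
stories = [
    Story("Search by name", 3, "Signed Off"),
    Story("Export to CSV", 5, "In QA"),
    Story("Audit trail", 8, "In Development"),
]
print(iteration_report(stories, operational_expense=20000))
```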




The Goal ... is a refreshing read

I have started reading "The Goal". I am just halfway through it and am really impressed by the writing style the author has adopted, and by how effectively the essence of Lean and the Theory of Constraints is delivered.

The fact that the book is written in the style of a fast-paced thriller novel makes it superb. If you are planning to read a book on Lean, this is one of the books I would highly recommend.

Sunday, May 17, 2009

Building accountability in Agile teams - Someone needs to ask the tough questions

Agile teams often find themselves in a difficult position when it comes to accountability for the work done. Agile rightly promotes collective ownership of the codebase, the build environment, and even the story wall. While it is extremely useful to have poly-skilled people in the team, it should not get in the way of accountability for the delivery of a story, a feature, and a release.

One way to build this accountability is to ask the critical questions when there are issues in delivery, and to tie them to feedback so that the issue doesn't manifest itself again. This can also be coupled with a bit of coaching.

Examples

Scenario 1: The customer's testing team finds a bug in a story which was delivered to them.

Question (to team QA): Is this bug breaking the acceptance criteria of the story?
QA: Yes

Question (to team QA): Why was this bug not caught when we tested this story?
QA: Well, this was working fine on the build I used. It broke in the build given to the customer for testing.

Question (to team QA): Why did we not catch this when we ran our automation/regression suite?
QA: That is because we have some QA backlog on the automation front. This test hasn't yet been added to the automation suite.

Feedback (to team QA): Well, that means we need to catch up on our automation first, before developing any more stories. Now you understand how bad it looks when a basic bug is found during customer testing. Please take this as a challenge that from now on, for any story you test and give to the customer, the customer should not be able to find any bugs in it. We should do all that is required so that a story does not fail at any point again.


Scenario 2: QAs find a basic bug during testing of a story, one which breaks a happy-path scenario.

Question (to the Dev pair): Why was such a basic bug not caught during development?
Dev: That's because we had to refactor as part of another story, which ended up breaking this piece of code.

Question (to the Dev pair): Why were there no unit tests around this piece of code then?
Dev: Well... the unit tests were missing because I found it difficult to write tests for this piece.

Feedback (to the Dev pair): We now know that having unit tests around this would have ensured the code did not break during a refactoring session. Simply put, if we had test-driven the whole piece, we would not have ended up in this situation. I think we need to practice TDD more religiously in the team. Let us have a huddle to see how we can improve this. (Maybe look at some coaching on TDD.)
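
As an illustration of the kind of safety net this feedback is about (a hedged sketch only; the function and the numbers are made up), a couple of plain unit tests pinning down the existing behaviour would have flagged the regression the moment the refactoring broke it:

```python
import unittest

# Hypothetical piece of code the story touched; names are made up for illustration.
def shipping_cost(weight_kg, express=False):
    base = 5.0 + 1.5 * weight_kg
    return base * 2 if express else base

class ShippingCostTest(unittest.TestCase):
    """Pin the current behaviour down before anyone refactors around it."""

    def test_standard_shipping(self):
        self.assertAlmostEqual(shipping_cost(2), 8.0)

    def test_express_doubles_the_base_rate(self):
        self.assertAlmostEqual(shipping_cost(2, express=True), 16.0)

if __name__ == "__main__":
    unittest.main()
```

The point is simply that the tests exist before the refactoring starts, so the breakage shows up on the developer's machine rather than in QA.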

It is the Iteration Manager's role to ask these questions and make the team retrospect on their delivery, time and again, and learn from mistakes. This will allow the team to build more accountability for the work that is delivered.

Wednesday, May 13, 2009

How do we do both SOA and Agile in a big enterprise?

The project I am working on currently is a big enterprise application, mostly structured around the SOA model. Since we evolved from a legacy code base, the service boundaries have only just started to become crisp and clear. We are slowly starting to think of certain teams in the project owning a set of services, and of agreeing on service contracts as a norm.

I am now starting to wonder how this model is going to work with Agile teams.

* Will this force us to do Big Design Upfront (BDUF)?
* Will this force us to have service-level teams?
* How can we write good user stories (which are testable and showcaseable) for service-level teams?
* How can we structure the teams when we have multiple consumers (teams?) for one service?
* Will this force developers to communicate using service contracts and documents more than unit tests?
* If we try to encourage cross-functional teams, will it put a lot of stress on resourcing, with the need to shuffle people around a lot?

I do not have answers to any of these yet, and a quick look at the Agile community does not give me any concrete answers either.
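
To make the service-contract question a little more concrete, here is a minimal sketch (plain Python, with a hypothetical service and made-up field names) of a contract captured as an executable consumer-side test rather than only a document. It illustrates the question; it is not an answer.

```python
import unittest

# Hypothetical contract: the fields the consuming team's code actually relies on.
CUSTOMER_LOOKUP_CONTRACT = {
    "id": int,
    "name": str,
    "status": str,
}

def check_contract(response, contract):
    """Return a list of problems if the response is missing fields or changed their types."""
    problems = []
    for field, expected_type in contract.items():
        if field not in response:
            problems.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            problems.append(f"wrong type for {field}: {type(response[field]).__name__}")
    return problems

class CustomerServiceContractTest(unittest.TestCase):
    def test_lookup_response_honours_the_contract(self):
        # In a real run this would come from the providing team's test environment;
        # here it is a canned example so the sketch is self-contained.
        response = {"id": 42, "name": "Acme Corp", "status": "ACTIVE"}
        self.assertEqual(check_contract(response, CUSTOMER_LOOKUP_CONTRACT), [])

if __name__ == "__main__":
    unittest.main()
```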

Friday, May 1, 2009

Using FreeMind in Retrospectives



So in the last few weeks I attended a couple of retrospectives for my team, and towards the end I facilitated one for another team.

To facilitate, I used FreeMind to jot down the points that came up during the retro, mapping them as a mind map. As people were discussing "What could the team do better?", I collated the points on the map, which was projected onto a whiteboard. When it came to voting on the top 3 points to discuss, we simply asked people to put their votes (read: color dots) on the whiteboard against the projected map.

Worked nicely. No markers, no dirty handwriting, and the team has their mental map right in front of them. The best part is that to send the retrospective notes around as a soft copy, all you need to do is export the map as an HTML or Word doc with one click. You can even hit print and give each person a copy before they leave the meeting room.




Worth a try, guys!