Saturday, November 28, 2009

Silver Bullets and Simple Solutions

Recently I was reading the Boston Globe and saw a letter to the editor about problems with H1N1 vaccine distribution, lamenting that "We’re eight months into a pandemic, and it seems that all the government can do is tell us to wash our hands!" While I understand the writer's frustration about the availability of vaccines and the technology used to produce them, I was struck by the writer's attitude that a simple solution couldn't possibly be effective. I'm not a medical professional, but from what I've read, hand washing, while mundane sounding, is effective in preventing the spread of disease. I am a software development professional, though, and it struck me that the attitude that the more exotic solution is always better is common in software development as well.

When people ask me about improving their release management process to be more agile, they are often disappointed when I don't focus on the latest SCM tool, but instead talk about approaches like unit testing, small and frequent commits and updates, continuous integration, and the like. All of these things get dismissed as simple, even mundane. Yet if they are simple, why not do them? The truth is that the best way to make your release management process more agile is to have fewer codelines and to keep the codeline you want to release working. There is no real magic to that, other than the magic that happens organically when a team is disciplined and committed to keeping the quality of the code high.

Ever since Fred Brooks wrote No Silver Bullet (which appears in The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition (2nd Edition)), software developers have been looking for technological ways to make projects go more smoothly. Technology helps, to be sure. A good diff tool can help you merge code when you really have to, and an SCM tool that supports staged integration well can help you get to the point where you can trust your team to keep the codeline working. But the best approach is small, incremental changes and small, comprehensible modules.

In the same vein as what the Pragmatic Programmers tell us in The Pragmatic Programmer: From Journeyman to Master, and as Bob Martin offers further guidance on in Clean Code: A Handbook of Agile Software Craftsmanship, there is a lot you can do to make your team more agile through small changes, keeping things working along the way, and iterating. Think of tools as a way to support how people work, not the other way around. Tools help you work effectively, but a good tool can't replace good technique.

Sunday, November 22, 2009

Planning Time and Sprint Duration

Over lunch recently, a friend mentioned that his team had frequently changing priorities and had tried short (1-week) sprints to be able to adapt to business changes. The team felt that the overhead of planning for a 1-week sprint was too high, so they decided to abandon the sprint model. This conversation reminded me that this kind of question comes up a lot, especially with teams transitioning to agile.

It makes a lot of sense to tune your sprint length to the rate at which requirements change and the rate at which the team can deliver functionality. Adding work as you go makes it difficult to make commitments and to measure progress, and new "high-priority" work can disrupt flow. If your sprints are 4 weeks long, then there is a greater temptation to add work mid-stream. If a sprint is 1 week long, then it's easier for a Product Owner to be comfortable slotting work into the next sprint.

A sprint isn't just the time spent coding; the planning and review are also important. So what's a good ratio of planning to "coding" time in a short sprint? In a canonical 4-week sprint, such as the one described in Agile Project Management with Scrum, the team spends 1 day on planning and about 1 day on review and retrospective. That adds up to 2 days out of 20, or 10%. For a 1-week sprint, the same ratio gives us 1/2 day for planning and review.
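
As a rough sketch of that arithmetic (the numbers are illustrative; only the 4-week figures come from the book), scaling the 10% ceremony ratio to other sprint lengths looks something like this:

```python
# Illustrative sketch: scale the canonical "2 ceremony days out of 20"
# (1 day planning + 1 day review/retrospective in a 4-week sprint) to shorter sprints.

CEREMONY_RATIO = 2 / 20  # 10% of working days spent on planning, review, and retrospective

def ceremony_days(sprint_weeks, working_days_per_week=5):
    """Return (total working days, ceremony days implied by a 10% overhead)."""
    total_days = sprint_weeks * working_days_per_week
    return total_days, total_days * CEREMONY_RATIO

for weeks in (4, 2, 1):
    total, ceremony = ceremony_days(weeks)
    print(f"{weeks}-week sprint: {total} working days, ~{ceremony:.1f} days of planning/review")
```

For a 1-week sprint that works out to the half day mentioned above, though, as the next paragraph notes, the scaling is unlikely to be this clean in practice.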

Given the overhead of getting people together, and the dynamics of meetings, the calculation probably isn't linear. But I have worked on teams where we could do a reasonable job planning and reviewing in 1/2 day. This seems like reasonable overhead if:
  • The backlog is well defined by the product owners in advance of the planning meeting so that we can quickly estimate.  
  • Daily scrums start on time, stay focused, and fit within the time box that the team expects (typically 15 minutes)
  • The number of features is small enough that it is possible to have a focused review meeting in an hour or so, with 30 minutes allocated to "retrospective" discussions.
  • There is adequate interaction with a product owner during the sprint so that small issues can be resolved quickly and outside of the review.
This is my experience with planning for 1-week sprints. What are your experiences? How long do you spend in planning and reviewing? Is it enough? What are the prerequisites for an effective 1-week sprint? Please comment!

Tuesday, November 10, 2009

Doing Less to Get Things Done

Have you ever been in a situation where someone walks into the room and announces that they just got off the phone with a customer, and that you need to add some functionality, described in very specific terms? As described, the feature could take a lot of work, so you bounce around some ideas about how to do what the customer asked for. Along the way you realize that perhaps there is another way to add a similar feature that meets the need at a much lower cost. But no one asked the customer what problem they wanted to solve. So what do you do now?

Some options are:
  1. Say, sorry, we don't really know what the requirement is, so come back when you have more to say.
  2. Spend the next couple of hours discussing how to implement all of the options you think of, and planning in detail how to get them done.
  3. List some options for what the customer might really mean, then delegate someone to find out more, using your options as a basis for conversation.
Option 1 sounds appealing, but doesn't actually help you solve the problem of efficiently building (eventually) what the customer wants. While option 2 has you thinking about the problem and solutions, at some point you're making the solution more expensive than the customer probably wants it to be. This is an easy scenario to fall into, since people, and engineers in particular, want to solve problems. But a long conversation without data doesn't solve this problem, and it keeps you away from making progress on other problems that you know enough to solve.

The third option is a good compromise. Spend some time discussing what problems the customer might want to solve, focusing on the problem, not the solution (implementation). Then spend a few minutes figuring out how you might implement each proposed option so that you can attach a cost to each. Then delegate someone to have a follow-up conversation with the customer, using your options as a starting point. Three options is a good rule of thumb.

It's very easy to get caught up in solving problems without asking whether you're solving the right problem. Whenever you're asked to build something very specific, ask yourself if you really understand the problem. By taking a step back you can save time, and in the end have happier customers.

(For more on figuring out what the problem really is see the appropriately named book: Are Your Lights On?: How to Figure Out What the Problem Really Is)

Sunday, November 8, 2009

Fail, To Succeed

I was listening to a commentary on NPR about a pre-school graduation, which mentioned a comment from education expert Leon Botstein that "we should be rewarding: Curiosity. Creativity. Taking risks. Taking the subjects that you're afraid you might fail. Working hard in those subjects, even if you do fail. We should reward children when they show joy in learning."

This got me thinking about one reason that some teams struggle with being agile. Agile teams are good at making corrections based on feedback. For this to work you need to be willing to honestly evaluate your progress against a plan, and to revise the plan (and how you work) based on that feedback. This is a hard thing to do if you're used to the idea that any feedback other than "you're doing OK" is bad. (I have more to say about this in a contribution to the 97 Things Every Programmer Should Know project.)

Agile methods help you create an environment where it's safer to try things, by providing feedback at intervals short enough that things can't go too far wrong. By making small steps and evaluating the results, you can take small risks that you believe will be changes for the better. And if something doesn't work out, you haven't lost that much.

This process is visible at all scales in an agile project.

Sprint Reviews give the team a chance to evaluate features every sprint. It happens that, based on review feedback, features are removed as well as added or enhanced. And that's fine, because the team only spent a week or two on them. In other environments such decisions might not be made until it was far too late to change course and either implement something new or avoid embarrassment.

Integration builds give the team rapid feedback when a code change causes an integration problem (even when the developer thought that it was adequately tested).

Unit Tests give you a chance to understand the impact of a refactor before you commit a change. You can see a problem before anyone else is affected. And you can decide to abort a change with unintended consequences.
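
As a minimal illustration (the parse_version function and its tests below are hypothetical, not taken from any particular project), even a couple of small tests can flag a behavior change introduced by a refactoring before it is ever committed:

```python
import unittest

# Hypothetical example: a tiny function and the tests that protect it during refactoring.

def parse_version(text):
    """Parse a 'major.minor' version string into a tuple of ints."""
    major, minor = text.strip().split(".")
    return int(major), int(minor)

class ParseVersionTest(unittest.TestCase):
    def test_parses_major_and_minor(self):
        self.assertEqual(parse_version("2.11"), (2, 11))

    def test_tolerates_surrounding_whitespace(self):
        self.assertEqual(parse_version(" 1.0 \n"), (1, 0))

if __name__ == "__main__":
    unittest.main()
```

If a rewrite of parse_version accidentally dropped the whitespace handling, the second test would fail on your machine, and you could fix or abort the change before anyone else was affected.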

Frequent commits to a version management system allow you to recover from changes that become more involved than you thought.

Being willing to fail allows you to improve as long as the failures are small and easily identified. Being agile means being willing to take small risks.
