Wednesday, October 19, 2011

Agile or Not: How to Get Things Done

Agile software development always felt intuitive to me. Developing software incrementally, in close collaboration with the customer, is the obvious way to deal with the uncertainty inherent in both software requirements and implementation. The technical practices of automating necessary but time-consuming tests, and of deploying early and often, are the obvious ways to give a team the ability to evaluate the functionality it has and to decide whether the software works as expected. It's also important to decide whether what you built still makes sense given the current environment. Agile isn't the only way that people build software, and it may not be a perfect approach, but it's one of the best ways of dealing with a system that has unknowns.

Agile software development acknowledges uncertainty. The ability of agile methods to make it very obvious very quickly when a project, or even a process, is failing makes people uncomfortable. The visibility of the failure leads to a desire to return to more "traditional" processes. For example, a common complaint is that agile organizations don't create "good enough" specifications. Someone might blame an unforeseen problem on the lack of a spec that mentioned it. This is possible, but it's only true if the person writing the spec could have foreseen the problem. 

The desire to point back to a lack of specification also points to a lack of buy-in to a fundamental premise of agile: it's a collaboration between the business and the engineering team. Some other possible causes of the problem could be:
  • The development team didn't test thoroughly enough. 
  • The code was not agile enough, so the bad assumptions were embedded into the code so deeply they were difficult to address.
  • Communication was bad.
More complete specifications that address all of the issues a system might encounter are one way to build software. But the best that people can do is to write specifications that address what they know. Quite often no one has complete advance knowledge of what a system needs to do.

There are projects where things are known with enough certainty that a waterfall process can work. People can write good specs, the team can implement to those specs, and the end result is exactly what everyone wanted. I've worked on projects where this was true, and in those cases the specifications were reviewed and tested as much as code might be.

Even projects that seem suited to waterfall fail. For any method to be successful:
  • Everyone involved needs to be committed to the approach, and
  • There needs to be a feedback loop to correct errors in time. 
The second point is more important than the first, since people will make mistakes. But being committed to the process is what lets people accept and offer constructive feedback. The reason that waterfall projects have such a bad reputation is that many are implemented in a way that results in problems surfacing late.

Agile methods, when done well, have the advantage of built-in feedback loops, so that customers and teams have ways of identifying problems early. When agile projects fail, it's often because people ignore the feedback and let things degrade for longer than necessary. (Failing fast can be considered success!)

So, Agile or not, your process will only work for you if everyone works within the framework to the best of their abilities, and if you have mechanisms in place to help people do the right things. Otherwise you can blame your process, but odds are, that's not where (most of) the fault lies.

Sunday, October 16, 2011

More on Being Done

Continuing the conversation from last week, Andy Singleton followed up on my post on being done with this post, which is good, as this is one of those questions that sounds simple in theory but in practice contains some subtlety.

While I was advocating a good definition of "Done" to enable you to measure progress along a path, Andy's point seems to be that many teams don't establish enough of a path. He says:
In my opinion, most agile teams aren't doing "Test Driven Development", and they aren't doing scrum iterations where they plan everything in advance. Instead, they are doing "Release Driven Development." They focus on assembling releases, and they do a lot of planning inside the release cycle.
This is probably true in more cases than not, though one could argue that if you are not doing iterations or TDD, you are not, in fact, doing agile. But even if I concede (and I'm not sure that I do) that you can do agile without planning and acceptance criteria of some sort, what Andy describes above still sounds like it has an element of plan, execute, adjust, though perhaps in a more chaotic way than a "textbook" scrum process, and perhaps at a smaller scale. So having a clean definition of what you want to do is still important. In The Indivisible Task I discussed how teams get stuck in planning because it's difficult to think of tasks that are small, discrete, and which produce useful work. It is possible to do so, but it is hard.

Perhaps the disagreement here is more of a misunderstanding. I consider being able to measure progress in terms of completed work items an essential part of gathering the data you need to improve your process and understand how to be more agile. While doing it at a macro (sprint) level is good, doing it on a day-to-day or hour-to-hour basis is essential. If you do this at a fine-grained enough level, and have a good testing infrastructure, you can release software more frequently. So, at the limit, perhaps Andy and I are talking about the same thing.
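To make "measuring progress in terms of completed work items" concrete, here is a minimal sketch of the arithmetic behind fine-grained tracking. All task names, point estimates, and done flags are invented for illustration; real teams would pull this data from their tracking tool.

```python
# Hypothetical sketch: computing remaining work and percent complete
# from a list of work items with explicit done/not-done status.

def remaining_points(tasks):
    """Sum the estimates of tasks that are not yet done."""
    return sum(t["points"] for t in tasks if not t["done"])

def percent_complete(tasks):
    """Fraction of total estimated work that is done, as a percentage."""
    total = sum(t["points"] for t in tasks)
    done = total - remaining_points(tasks)
    return 100.0 * done / total if total else 0.0

# Invented sprint data for illustration.
sprint = [
    {"name": "story A: vertical slice", "points": 3, "done": True},
    {"name": "story B: acceptance tests", "points": 2, "done": True},
    {"name": "story C: deploy script", "points": 5, "done": False},
]

print(remaining_points(sprint))         # 5
print(round(percent_complete(sprint)))  # 50
```

The point of the sketch is that this calculation only works if "done" is a crisp yes/no for each item; run it hour-to-hour and you get the fine-grained feedback described above.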

Since I only summarized, I encourage you to read what Andy has to say for yourself. And I look forward to hearing comments both from Andy and other readers.

Wednesday, October 12, 2011

Being Done

Agile New England (which used to be called the New England Agile Bazaar, and which was started by Ken Schwaber) has this wonderful activity before the main event each month: they host Agile 101 sessions, where people who know something about agile lead a short (30-minute), small (about 10 people) class on agile basics for those who want to learn more about some aspect of agile. From time to time I lead a session on Agile Execution, where the goal is to help people understand how to address the following questions:
  • How can software and other project elements be designed and delivered incrementally? What set of management and technical practices would enable this?
  • How do you know whether your Agile project will complete on schedule?
When I lead the sessions, I tend to focus on tracking, on defining stories in terms of vertical slices, and on the importance of continuous integration and testing to making your estimates trackable. Since the classes are so small, and since the attendees have diverse experiences, the classes are sometimes more of a conversation than a lecture, and I find that I learn a lot, sometimes rethinking what I know (or at least exploring things that I thought I understood well enough).

During the October 2011 meeting I found myself reconsidering the value of defining "done" when writing User Stories. I have always thought that defining done is essential to tracking progress. But what done means is still a difficult question. Andy Singleton of Assembla suggested that
The only useful definition of done is that you approved it to release, in whatever form
While the goal of agile methods is releasing software, I find that this definition, while appealing in its simplicity, misses some things:

  • Agile methods have a goal of continuously shippable code. Of course, "shippable" might not mean "ready to release" and can mean something closer to "runnable," but you can get there by doing no work since the end of the last release. That isn't the goal of agile.
  • With that large-scale definition of "done" you have no way of tracking progress within a sprint.
  • Without an agreement on what you're going to do, it's hard to know when you are changing direction. And acknowledging change is an important part of being agile.
The last point about acknowledging change isn't just about "blame" for things not working out. It's about evaluating how well you understand both the business and technical aspects of your project, and it forms the basis for improvement.

True, having incremental definitions of done that you can use to track progress does help manage budgets. But that really is the least important aspect of having a good definition of done. Even if I were on a project with an infinite time and money budget, I'd want to have a sense of what our goals are. 

Having an agreement among all of the stakeholders on what being "done" means lets me:
  • Improve communication among team members and between team members and business stakeholders.
  • Evaluate my understanding of the problem and help me identify what I can improve.
  • Set expectations so that it's easier to develop trust between stakeholders and the engineering team that the team can, and will, deliver.
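As a hypothetical illustration of such an agreement, here is a minimal sketch of a story's definition of "done" expressed as an explicit, shared checklist rather than an implicit "approved to release." The criteria and story details are all invented; a real team would negotiate its own list with its stakeholders.

```python
# Hypothetical sketch: "done" as an explicit checklist shared by the
# team and the business stakeholders. All criteria are invented examples.

DEFINITION_OF_DONE = (
    "acceptance tests pass",
    "code reviewed",
    "deployed to staging",
    "product owner accepted",
)

def is_done(story):
    """A story is done only when every agreed-upon criterion is met."""
    return all(story["criteria"].get(c, False) for c in DEFINITION_OF_DONE)

def missing_criteria(story):
    """What still stands between this story and done."""
    return [c for c in DEFINITION_OF_DONE
            if not story["criteria"].get(c, False)]

# An invented story, partway through the checklist.
story = {
    "name": "user can reset password",
    "criteria": {
        "acceptance tests pass": True,
        "code reviewed": True,
        "deployed to staging": False,
        "product owner accepted": False,
    },
}

print(is_done(story))           # False
print(missing_criteria(story))  # ['deployed to staging', 'product owner accepted']
```

The design choice worth noting is that the checklist is data the whole team can see and query, which is what makes it useful for communication and expectation-setting, not just for bookkeeping.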
"Ready for Release" is a key component of "done" and and essential part of being agile. But it's not enough.



See Andy's response, and read more in Part 2 of this conversation.

Monday, October 10, 2011

Continuous Learning, Coaching, and Learning from Others

There was an article in the Boston Globe this week by Scott Kirshner, Staying Competitive in the Workplace, that emphasized the importance of keeping your skills up to date.

It's a short article and worth a read. Some of the activities Kirshner suggests are similar to the suggestions Atul Gawande makes in the appendix of his book Better: A Surgeon's Notes on Performance.

Related to this theme is a New Yorker article by Gawande, Personal Best, on the advantages and challenges of engaging someone to coach you in your profession. I continue to be amazed at how much I'm learning from Gawande, a surgeon, about how to be a better software engineer. I suspect that I first realized this when I started learning about Patterns. (The short post The Pattern Technology of Christopher Alexander, by Michael Mehaffy and Nikos Salingaros, discusses how an architect influences the software development community.)

Maybe the common theme of all these writings is that it's important to be ready to learn, and you can learn from the people you least expect to.

Sunday, October 9, 2011

SSQ Article on SCM and Tools

I recently was interviewed for an article on SCM and Tools that Crystal Bedell wrote for Search Software Quality. Updating tools and processes key to overcoming SCM challenges is brief, and makes some good points about the relative value of tools compared to understanding what you are trying to accomplish with your process.


The best SCM tool, from a day-to-day perspective, is the one that is the most invisible to developers, and the best tool really can't help you much if you have a process that makes it hard to collaborate. This article was also validation that trying to be tool-agnostic when Brad Appleton and I wrote the SCM Patterns book was a good idea. While some tools make some practices easier, a tool can't replace an understanding of the reasons to use (or, more important, not use) particular techniques.

Read the article, and if you want to learn some basic SCM practices, read the book. For those who want a very comprehensive discussion of branching (which seems to be, for better or worse, the topic that people most struggle with when using SCM tools), Streamed Lines is a good place to start.