Motivation, Visibility, and Unit Testing

I've always been interested in organizational patterns (such as those in Organizational Patterns of Agile Software Development). I've recently found myself thinking a lot about motivation: I'm now reading Drive: The Surprising Truth About What Motivates Us and just finished Rob Austin's book on performance measurement. Being the parent of a three-year-old, I'm finding more and more that "because I said so, and I'm right" isn't too effective at home. My interest in motivation is closely related to my interest in writing software effectively. Writing software is partially a technical problem about frameworks, coding, and the like, but the harder (and perhaps more interesting) problem is how to get a group of people working together toward a common goal. Agile practices, both technical and organizational, build a framework that makes having the right amount of collaboration and feedback possible. But there's a bootstrapping problem: how do you get people to start doing the practices, especially technical ones such as unit testing?

In an ideal world, everyone would know how to write unit tests, understand their value, and want to write them. In an organization transitioning to agile, having all three parts in place is not a given. Most people understand why unit tests are useful, in principle. The problem is the execution. In my experience, people who claim that they want to write unit tests give two reasons for not writing them:

  • They are too hard. The overhead of the test can make the cost of developing a feature excessive. 
  • They are too easy. Some functionality appears to be trivial, so why test it?

Both of these reasons can have merit at times. Testing getters and setters, trivial calls to a system library, and other simple coding constructs really doesn't add value. And writing a complicated test with a hard-to-use framework to verify something non-business-critical that could be quickly validated by visual inspection ("make the background color of the home screen orange") may well not be worth the effort. The problem is that most people have bad intuitions about where the lines lie until they start practicing the skill of unit testing. Most teams need experience with testing to get a true feel for what is a trivial test, and what is a seemingly trivial test that can unmask a major problem.

Rather than frame the testing challenge with the default being the old way of not testing:
Write a test when it makes sense.

Change your perspective to the default being to test:
Write a test unless you can explain why you did not.

The key to this approach is to encourage people to think through their rationale for not testing. There are a few ways to do this, but one approach is to tie the explanation mechanism into something every developer works with every day: your source code repository. Have the team agree that, in addition to the source files, every commit will include either a change to a test or a rationale for why you didn't write one. For example, if I do a commit without a test I could write:
ISSUE-23: Fixed the spelling of the company name. NO TEST: it was a typo. 
ISSUE-26: Fixed the rendering mechanism. NO TEST: we don't have a good framework for testing this sort of thing
or even
ISSUE-28: Fixed a serious logic issue. NO TEST: I didn't feel like writing one.
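The article proposes this as a team agreement rather than tooling, but if you did want to automate the convention, a git commit-msg hook could enforce it. Here is a minimal sketch, assuming the convention above: the hook name, the "any path containing 'test'" heuristic, and the exact wording are my own, not from the article.

```python
# Hypothetical commit-msg hook sketch. Save as .git/hooks/commit-msg and make
# it executable; git invokes it with the commit message file path as argv[1].
import subprocess
import sys

def commit_allowed(message, staged_paths):
    """Allow a commit that touches a test file or carries a 'NO TEST:' rationale."""
    touches_test = any("test" in path.lower() for path in staged_paths)
    return touches_test or "NO TEST:" in message

def main(message_file):
    with open(message_file) as f:
        message = f.read()
    # List the files staged for this commit.
    staged = subprocess.run(
        ["git", "diff", "--cached", "--name-only"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    if commit_allowed(message, staged):
        return 0
    sys.stderr.write("Rejected: add a test or a 'NO TEST: <reason>' rationale.\n")
    return 1

if __name__ == "__main__" and len(sys.argv) > 1:
    sys.exit(main(sys.argv[1]))
```

Whether you actually want the hard stop is a team decision; the social agreement alone may be enough, and a hook that rejects commits can feel heavy-handed during a transition.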

By developing a team agreement to add tests or explain why not, you are starting with a small change of behavior that paves the way for a greater change based on understanding. Even if a message like the last one is acceptable, many people will be uncomfortable admitting to laziness and will think harder for a real reason. By reviewing the commit messages later on, you can get a sense of the impediments to testing (technological, organizational, or attitudinal), and use that data in a retrospective to decide how to improve.
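That later review can be as simple as tallying the rationales. A sketch, assuming you feed it one message per commit (for example, from `git log --pretty=%B` split on commit boundaries); the function name is my own.

```python
from collections import Counter

def tally_no_test_reasons(commit_messages):
    """Count the rationale given after each 'NO TEST:' marker.

    Commits without the marker (i.e., commits that included a test,
    or that ignored the agreement) are simply skipped.
    """
    reasons = Counter()
    marker = "NO TEST:"
    for message in commit_messages:
        if marker in message:
            reasons[message.split(marker, 1)[1].strip()] += 1
    return reasons
```

Sorting the result with `reasons.most_common()` before a retrospective makes the dominant impediment obvious: a pile of "no framework for this" entries points at a tooling gap, while a pile of "didn't feel like it" entries points at something else entirely.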

By being creative you can help people on your team understand the value of process changes, and start a conversation about how to evolve practices to suit your team.

