
Really Dumb Tests

When I mention the idea of automated unit testing (or developer testing, as J.B. Rainsberger calls it in his book JUnit Recipes), people either love it or, more likely, are put off because all the tests that they want to write are too hard, and the tests that they can write seem too simple to be of much value. With the exception of tests that "test the framework" you are using, I don't think one should worry prematurely about tests that are too simple to be useful, as I've seen people (myself included) spin their wheels over a coding error that could have been caught by one of these "simple" tests.

A common example is the Java programmer who accidentally overrides hashCode() and equals() in such a way that two items which are equal do not have the same hash code. This causes mysterious behavior when you add items to a collection and try to find them later (and you don't get a "HashCodeNotImplementedCorrectly" exception). True, you can generate hashCode() and equals() with your IDE, but it's trivial to test for this, even without a framework that does the hard work for you.
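Here's a minimal sketch of that kind of test using JUnit, with a hypothetical Account class standing in for whatever class you own; the point is simply to assert the java.lang.Object contract that equal objects report equal hash codes:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class AccountHashCodeTest {

    // Stand-in for the production class; in real code this lives elsewhere.
    static class Account {
        private final String number;
        Account(String number) { this.number = number; }
        @Override public boolean equals(Object o) {
            return o instanceof Account && number.equals(((Account) o).number);
        }
        @Override public int hashCode() { return number.hashCode(); }
    }

    @Test
    public void equalAccountsHaveEqualHashCodes() {
        Account first = new Account("12345");
        Account second = new Account("12345");
        // If two objects are equal, their hash codes must also be equal,
        // or hash-based collections will quietly misbehave.
        assertEquals(first, second);
        assertEquals(first.hashCode(), second.hashCode());
    }
}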

So when I start working with a system that needs tests, I try not to worry too much about how "trivial" the tests seem initially. Once I was working on an Interactive Voice Response (IVR) application that we would test by dialing in to a special phone number. We'd go through the prompts, and about half-way into our test we'd hear "syntax error." While I really wanted to write unit tests around the voice logic, we didn't have the right tools at the time, so I tried to see if I could make our manual testing more effective.

This IVR system was basically a web application with a phone instead of a web browser as the client, and VoiceXML (VXML) instead of HTML as the "page description" language. A voice node processed the touch-tone and voice input and sent a request to a J2EE server, which sent back VoiceXML telling the voice node what to do next. During our testing the app server was generating the VXML dynamically, and we'd sometimes generate invalid XML. This is a really simple problem, and one that was both easy to detect and costly when we didn't find it until integration testing.

I wrote a series of tests that made requests with various request parameter combinations and checked whether the generated VoiceXML validated against the VXML DTD. Basically:

// Log in through the app server and capture the VoiceXML it returns.
String xmlResponse = appServer.login("username", "password");
try {
    // Validate the response against the DTD; any parse error fails the test.
    XMLUtils.validate(xmlResponse);
} catch (Exception e) {
    fail("error " + e);
}
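For reference, here's one way a validate() helper like that could be written with the standard JAXP DOM parser and DTD validation turned on; the real XMLUtils on that project may well have looked different:

import java.io.StringReader;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.xml.sax.ErrorHandler;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.SAXParseException;

public class XMLUtils {

    public static void validate(String xml) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        // Validate against the DTD declared in the document's DOCTYPE.
        factory.setValidating(true);
        DocumentBuilder builder = factory.newDocumentBuilder();
        // Surface warnings and errors as exceptions instead of writing to stderr.
        builder.setErrorHandler(new ErrorHandler() {
            public void warning(SAXParseException e) throws SAXException { throw e; }
            public void error(SAXParseException e) throws SAXException { throw e; }
            public void fatalError(SAXParseException e) throws SAXException { throw e; }
        });
        builder.parse(new InputSource(new StringReader(xml)));
    }
}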

Initially people on the team dismissed this test as useless: they believed that they could write code that generated valid XML. Once the test started to pick up errors as it ran during the Integration Build, the team realized its value in making the manual testing process more effective.

Even though people are careful when they write code, it's easy for a mistake to happen in a complex system. If you can write a test that catches an error before an integration test starts, you'll save everyone a lot of time.

I've since found a lot of value in writing tests that simply parse a configuration file and fail if there is an error. Without a test like this, the problem manifests in some seemingly unrelated way at runtime. With a test, you know the problem is in the parsing, and you also know what you just changed. Also, once a broken configuration is committed to your version control system, it slows down the whole team.
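As a sketch, such a sanity test could be as dumb as loading the file and checking for the keys the code depends on; here I'm assuming a hypothetical config/app.properties file with a database.url entry:

import static org.junit.Assert.assertNotNull;

import java.io.FileInputStream;
import java.util.Properties;
import org.junit.Test;

public class ConfigurationSanityTest {

    @Test
    public void configurationParsesAndHasRequiredKeys() throws Exception {
        Properties config = new Properties();
        // Fails the test with an exception if the file is missing or malformed.
        config.load(new FileInputStream("config/app.properties"));

        // Fail fast if a key the application needs at runtime is absent.
        assertNotNull("database.url is missing from app.properties",
                config.getProperty("database.url"));
    }
}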

If your application relies on a configuration resource that changes, take the time to write some sanity tests to ensure that the app will load correctly. These are, in essence, Really Dumb smoke tests. Having written these dumb tests, you'll feel smart when you catch an error at build time instead of when someone else is trying to run the code.
