
The Sensitivity Problem


Software is a very sensitive domain. If a single bit of a 100MB executable is wrong, the entire application can be brought to its knees. Very few other domains suffer such extreme sensitivity to error. But one very important domain does: Accounting. A single-digit error in a massive pile of spreadsheets and financial statements can cost millions and bankrupt an organization.

Accountants solved this problem long ago. They use a set of practices and disciplines that reduce the probability that errors can go undetected. One of these practices is Dual Entry Bookkeeping. Every transaction is entered twice: once in the credit books, and once in the debit books. The two entries participate in very different calculations but eventually produce a final result of zero. That zero means that all the entries balance. The strong implication is that there are no single-digit errors.
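For the programmers in the audience, here is a rough sketch of that balancing act in code. The Ledger class and its method names are my own invention for illustration, not anything from a real accounting system; the point is only that two independently maintained books are summed separately, and a trial balance of zero is what tells you they agree.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative sketch of dual entry bookkeeping: every transaction is
    // written down twice, independently, and a trial balance of zero says
    // the two books agree.
    public class Ledger {
        private final List<Long> debits = new ArrayList<Long>();
        private final List<Long> credits = new ArrayList<Long>();

        public void recordDebit(long amountInCents)  { debits.add(amountInCents); }
        public void recordCredit(long amountInCents) { credits.add(amountInCents); }

        // The two books are summed separately; if a single entry was mistyped
        // in one book but not the other, the difference will not be zero.
        public long trialBalance() {
            long totalDebits = 0;
            for (long d : debits) totalDebits += d;
            long totalCredits = 0;
            for (long c : credits) totalCredits += c;
            return totalDebits - totalCredits;
        }

        public static void main(String[] args) {
            Ledger ledger = new Ledger();
            ledger.recordDebit(12500);   // transaction entered in the debit book...
            ledger.recordCredit(12500);  // ...and again in the credit book
            ledger.recordDebit(9900);
            ledger.recordCredit(9000);   // single-digit error: 9900 entered as 9000
            System.out.println("Trial balance: " + ledger.trialBalance()); // 900, not 0
        }
    }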

Now this technique is not perfect, and other controls are necessary. However, it makes a very powerful first line of defense for the accuracy of the entries. Accountants generally feel that this practice is worth the effort, even though the effort required is effectively double. After all, you have to maintain two sets of books instead of one. In this day and age of electronic spreadsheets, this may not seem an onerous burden. But imagine the cost of dual entry bookkeeping back in the days when books were kept manually.

We in software have a similar mechanism that provides a first line of defense: Test Driven Development (TDD). Every intention is entered in two places: once in a unit test, and once in the production code. These two entries follow very different pathways, but eventually sum to a green bar. That green bar means that the two intents balance, i.e. the production code agrees with the tests. This is not perfect, and other controls are necessary; but there can be little doubt that TDD vastly decreases the defect load in software projects.
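As a concrete (and entirely made-up) illustration of the two entries, here is a minimal sketch in JUnit 3 style; the Order class and its test exist only for this example.

    // OrderTest.java -- the intent entered once, in the test "book".
    import junit.framework.TestCase;

    public class OrderTest extends TestCase {
        public void testTotalIsSumOfLineItems() {
            Order order = new Order();
            order.addLineItem(300);
            order.addLineItem(200);
            assertEquals(500, order.total());
        }
    }

    // Order.java -- the same intent entered again, in the production "book".
    import java.util.ArrayList;
    import java.util.List;

    public class Order {
        private final List<Integer> lineItems = new ArrayList<Integer>();

        public void addLineItem(int amount) { lineItems.add(amount); }

        public int total() {
            int sum = 0;
            for (int amount : lineItems) sum += amount;
            return sum;
        }
    }

When both entries express the same intent, the bar goes green; a slip in either one shows up as a red bar, just as a transcription error shows up as a non-zero trial balance.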

I have been consulting for a number of teams that have adopted Agile Methods, including TDD. One common issue I have found is that developers drop the discipline of TDD in the face of schedule pressure. "We don't have time to write tests," I hear them say. Before I comment on the absurdity of this attitude, let me draw the parallel. Can you imagine an accounting department dropping dual entry bookkeeping because they've got to close the books on time? Even before SARBOX, such a decision would be so huge a violation of professional ethics as to be unconscionable. No accountant who respected his profession, or himself, would drop the controls in order to make a date.

That is not to say that accountants don't take shortcuts in the face of schedule pressure. They might not break down all the categories. They might not do all the what-if scenarios. They might bundle some things together that should be split. In other words, they might reduce scope. But would they drop the dual entry practice?

Consider a surgeon who, under schedule pressure, decides not to scrub. Consider a pilot who, under schedule pressure, decides not to go through the checklist. There might sometimes be emergency situations that justify a decision like that; but they had better be extremely rare, life-or-death situations. Even then, the decision is more likely to make things worse than better.

Now, back to the absurdity of not having enough time for TDD. TDD is one of the best ways we know of to ensure that programmers understand what they write, and know that it works. To drop TDD is equivalent to deciding that it's not important for the program to work right. If schedule is more important than accuracy, then I can always be on time. If it doesn't matter whether my code works or not, then I'm done now! If it doesn't work, then I suppose it doesn't have to compile either. I could just ship empty files and claim that I was done.

OK, this is an extreme position; but it makes a point. It is perfectly fair for the business to say: "I don't need it to be perfect on the date." However, this does not mean "I'll accept crap on the date." It does not mean, "Just ship whatever you've got on the date." What it really means is this: We need to decide what we will deliver on the date, and then make sure that it works as expected. In other words, we reduce scope, not accuracy.

Make no mistake about this. No business person wants you to ship code with undefined behavior. What they may want is attenuated behavior. But, believe me, even if they want the behavior attenuated, they still want that attenuated behavior to work.

I want you to think of TDD the way accountants think of dual entry bookkeeping. I want you to consider it as an essential part of your profession. I don't want you to think of it as optional. I don't want you to think of it as a luxury. I want you to think of it as absolutely necessary to your professional and personal self respect.

Under schedule pressure, we do not drop our disciplines. Under schedule pressure we increase our disciplines and reduce scope. No matter how tight the schedule is, we will know what we are shipping, and we will know, to the best of our ability, that it works.


Comments
 Thu, 6 Oct 2005 13:21:45, Tanton Gibbs, TDD and DRY
Yep, I blogged on this very same thing a while back. Many people think TDD goes against DRY. I pointed out that it should be DRYN (Don't Repeat Yourself Needlessly). Dual entry bookkeeping, via unit tests, is an excellent example of repetition that works. Steve McConnell in Code Complete also talks about embedding your unit tests into your code in some circumstances. For instance, if you are doing a mathematical computation and you have a fast and slow version, put both in your code and have them check each other during debug runs. This is similar in spirit to both unit tests and dual entry bookkeeping and shows a time when repetition is worth it.
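A rough sketch of the kind of redundant check described above might look like this (my own example, not McConnell's code; the fast and slow sums are hypothetical, and the assert only fires when assertions are enabled, e.g. java -ea):

    // A fast, clever implementation and a slow, obviously correct one
    // check each other during debug runs; in a normal run the assert costs nothing.
    public class SumOfFirstN {
        // Fast version: closed-form formula.
        static long fastSum(long n) {
            return n * (n + 1) / 2;
        }

        // Slow version: brute-force loop that is easy to get right.
        static long slowSum(long n) {
            long sum = 0;
            for (long i = 1; i <= n; i++) sum += i;
            return sum;
        }

        public static long sum(long n) {
            long result = fastSum(n);
            assert result == slowSum(n) : "fast and slow versions disagree for n=" + n;
            return result;
        }
    }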
 Fri, 7 Oct 2005 08:08:47, Craig Demyanovich, Good metaphor when working with accountants
I like this metaphor. My current project is for a system that sits in front of an accounting system and must interface with it. Our customer is familiar with the double entry practice in accounting. Now, I don't recall using this metaphor when our team was teaching the customer about TDD, but I bet it would've been very useful. Thanks!
 Mon, 10 Oct 2005 02:34:24, Tim Haughton (timhaughto@gmail.com), No time for tests....
I don't know if I've become more sensitive to these kinds of stories since reading Dave Astels' bits on Behaviour Driven Development, but it does seem like one needs a very good understanding of Agile techniques to view TDD as a specification process rather than a verification process. Even experienced developers can be heard uttering phrases like "we don't need to test this bit, it's a piece of cake". I'm becoming more and more confident that a shift to a specification-oriented language and tool set will help in this process.
 Wed, 12 Oct 2005 03:09:38, Sebastian Kübeck,
I personally doubt that writing code without tests is really faster. You usually pay the price during debugging and in the problems that occur when running an untested system in production. I can imagine two reasons why people stop writing tests under pressure:

1. TDD is a discipline that takes some time to adopt. People don't learn programming that way, so under pressure they tend to step back to their former procedure. If they had learned it as a natural part of software development, they wouldn't even think of not writing tests.

2. The time spent debugging and troubleshooting may be perceived as shorter than the time spent writing tests, although the opposite is true in my experience. People don't even think of the fact that debugging is a waste of time.
 Wed, 11 Oct 2006 11:55:34, Kerry Buckley, Shipping crap on time
"If it doesn't matter whether my code works or not, then I'm done now! If it doesn't work, then I suppose it doesn't have to compile either."

You may laugh. We had some code developed by an outsourcing company a while back, and were assured that it was all complete (including unit tests), and tested. We didn't need to use the classes for a while (we weren't too agile back then), but when we did, it turned out that most of them didn't compile. Not just missing libraries and stuff like that, but basic syntax errors like missing parentheses or semicolons.

On the plus side, that episode was enough to support our case for doing the development in-house, at least until recently.