ArticleS. PaulPagel.
FreshTesting

Keeping Tests Fresh.

Sometimes I sit down to write a test for something I haven't worked on before or don't know intimately, and I just can't write the first test. I need context about the system and the state of the existing code. Then I generally do one of two things: look at how things are implemented in the code, or look at other tests that exercise similar behavior. Using either of these as a strict model for writing a test disrupts the flow of TDD.

By using the implementation code as a model, I am limiting one of the great things about TDD from the start: the fact that the design should evolve as a byproduct of making the test pass. This complacent approach to TDD tends to ingrain the existing design in my mind. These strict models surrender to the existing design, even when my story/test/problem isn't exactly the same, just similar enough to convince me that the model fits.

Copy/pasting a similar test and editing it is a mistake I commit sometimes when it looks like a freebie is being tossed at me: duplicate, then abstract. It is very tempting, yet I have found it painfully regressive, especially when the tests themselves have begun to rot and their ability to act as developer documentation has deteriorated. It plants a lie in the logic, which is always painful: either the debugger gets fired up or the test gets scrapped and handwritten anyway. Copy/pasting something that is merely similar means starting from a false expectation most of the time. It is more important to me to have faith in the integrity of my tests.

The most powerful TDD I see is at the beginning of a project, when the tabula rasa lets you move laterally without constraint. The time before tests deteriorate provides the best model for TDD. Test deterioration is unavoidable: as the design changes, the tests are changed regularly to accommodate the new structures. The reason for having the tests is to provide a safety net that makes the code easy to change. Once a design is already in place, it becomes important to keep the tests "fresh"; otherwise, good design appears to degrade through overuse and misuse.

Handwriting a test from scratch can seem like an extra step, like reinventing the wheel. Most of the time the extra step is exactly that, an extra step, but in those cases where you are following a false model it is very expensive. It introduces the worst type of design into the system: the kind with little to no smell, but built on false premises. Introducing bad design into tests, or failing to maintain test code, ends up introducing bad design into production code.

 Thu, 5 Jan 2006 22:16:48, David Chelimsky, copy/paste
Copy/paste can be a real problem, yet it takes an awful lot of discipline to abandon it completely. For example, if you're test-driving something new: you write the first test, which has some setup in it. You're not ready to move stuff to a setup method because there's no duplication to warrant it yet. So for the second test you copy/paste the first test, modify what you will, get it to pass, and then refactor out the duplication. With just a few lines of testing code that's probably acceptable, but there's a point where you get yourself in trouble. I guess that point is different for everyone, and perhaps different on different days (depending on the coffee/sleep ratio). Maybe committing to a day of absolutely zero copy/paste, to see if it really slows me down, would be worthwhile.
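A minimal sketch of the cycle David describes, in Python's unittest (the `Stack` class is a hypothetical unit under test, not from the original discussion): the setup the first two tests briefly shared via copy/paste has been refactored into `setUp` once the duplication appeared.

```python
import unittest

# Hypothetical class under test; any small unit works the same way.
class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def size(self):
        return len(self._items)

class StackTest(unittest.TestCase):
    # This setup started life duplicated in both tests (the copy/paste
    # step); once the second test passed, it was moved here.
    def setUp(self):
        self.stack = Stack()
        self.stack.push("first")

    def test_push_increases_size(self):
        self.assertEqual(1, self.stack.size())

    def test_pop_returns_last_pushed(self):
        self.assertEqual("first", self.stack.pop())
```

Run with `python -m unittest` from the file's directory. The point is the timing: the copy/paste is tolerable only because the refactoring happens within the same short cycle.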
 Thu, 5 Jan 2006 22:53:53, Tim Ottinger, What he said
I hate copying from any existing tests, always feeling like I've cheated somehow. It is a way to get by (or try to get by) when you don't really know what you're doing. When I talk about whether I really know something or not, I consider whether I can work from the blank slate. If not, then I don't really know my topic. Not only is working from tabula rasa liberating, it's a good indicator that you've done enough backgrounding. It's good on so many levels.

I just wrote the other side of this blog entry: that naive tests don't help, so you have to know the software to write tests for new parts. I think it would be good if we could work out the pros and cons of this and give guidance on navigating between Scylla and Charybdis. If you don't know enough, you can't blue-sky the tests. If you are copying the tests, then you only reinforce the implementation in the testing (for better or worse). There has got to be a middle way, and some heuristics to help us get there: something more useful than "pairing should fix that."
 Fri, 6 Jan 2006 15:50:00, Thomas Eyde, Copy/paste is for professionals -- don't do that at home
I think the trick is to write the tests for the truly new requirements, then find reusable code, if any, or refactor existing code to be reusable. Not easy, but that's what I strive for.

I also think there's nothing wrong with copy/paste followed by refactoring. It's when you skip the refactoring that things start to get dangerous.
 Sat, 7 Jan 2006 12:28:24, Matisse Enzer, Copy and paste as an iterative action
Yesterday I was adding tests to some "legacy" code.
After I got the library to compile under the test harness, I picked one subroutine to test and created a test to run it. Of course I got a run-time exception, because the subroutine I was testing called a subroutine defined in some other library. So I created a stub version of that subroutine in my test harness, ran the test again, and hit another missing external subroutine. I copied and pasted the stub I had created, changed its name, and reran the test. Several times, each time adding a single "fake" subroutine. (Sometimes I had to edit the stubs to return some mock data.)

I also copied and pasted my first test, renamed it, and changed the arguments it passes to the subroutine I'm testing. I did this three times.
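The stub-and-rerun cycle Matisse describes might look like this in Python (his code was in another language; all names here are hypothetical stand-ins): each stub replaces a helper that, in the real system, lives in a library unavailable under the test harness.

```python
import unittest

# --- stand-ins for the "legacy" situation ---
# In the real system these two helpers live in another library. Each was
# added as a stub after a run-time failure: the second was copy/pasted
# from the first, renamed, and edited to return different mock data.
def lookup_customer(customer_id):
    return "ACME Corp"          # canned data instead of a real lookup

def format_currency(amount):
    return f"${amount:.2f}"     # simplistic fake of the real formatter

# The subroutine originally picked for testing; it depends on the two
# external helpers stubbed above.
def format_report(record):
    customer = lookup_customer(record["customer_id"])
    total = format_currency(record["total"])
    return f"{customer}: {total}"

class FormatReportTest(unittest.TestCase):
    def test_report_uses_stubbed_helpers(self):
        record = {"customer_id": 42, "total": 19.5}
        self.assertEqual("ACME Corp: $19.50", format_report(record))
```

Each missing dependency surfaces as a failure, gets a one-off stub, and the test is rerun: small, mechanical copy/paste steps in service of getting legacy code under test at all.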

 Mon, 30 Jan 2006 22:46:53, Paul Pagel, the copy/paste problem
There are many ways to introduce code debt into a system, and duplicate code is one of the worst. Copy/pasting leaves the payment of this debt to the memory of the developer. So if it is copy/pasting a few lines at a time, then refactoring at the end of small cycles, I'll go for it. But if it means copying a file or a series of tests/methods, I won't.