I was reading the blog post by Joel Spolsky about 12 steps to better code. The absence of Test-Driven Development really surprised me, so I want to throw the question to the gurus: is TDD not really worth the effort?
Test-driven development was virtually unknown before Kent Beck's book came out in 2002, two years after Joel wrote that post. The question then becomes: why hasn't Joel updated his test since, and if TDD had been better known in 2000, would he have included it among his criteria? I believe he wouldn't have, for the simple reason that the important thing is that you have a well-defined process, not the specific details of that process. It's the same reason he recommends version control without specifying a particular version control system, and recommends having a bug database without recommending a specific product. Good teams continually improve and adapt, and use tools and processes that are a good fit for their particular situation at that particular time. For some teams, that definitely means TDD. For other teams, not so much. If you do adopt TDD, make sure it's not out of a cargo-cult mentality.
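For anyone unfamiliar with the mechanics being debated, here is a minimal sketch of the red-green-refactor cycle TDD prescribes, using a hypothetical `slugify` helper (not from Joel's post or any particular codebase):

```python
# A minimal red-green-refactor sketch:
# 1. Red: write a failing test first.
# 2. Green: write just enough code to make it pass.
# 3. Refactor: clean up, with the tests as a safety net.

def slugify(title):
    """Turn a post title into a URL slug (just enough to pass the tests)."""
    return title.strip().lower().replace(" ", "-")

def test_lowercases_and_hyphenates():
    assert slugify("Joel Test") == "joel-test"

def test_strips_surrounding_whitespace():
    assert slugify("  TDD  ") == "tdd"

# In real use a test runner would collect these; here we just call them.
test_lowercases_and_hyphenates()
test_strips_surrounding_whitespace()
```

The point of the ritual is that each test exists before the code it exercises, which is exactly the discipline some teams find valuable and others find ceremonial.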
Joel Spolsky himself answered this question back in 2009.
Joel has actually addressed this specifically in a few places. He's explained that automated tests are not capable of catching a lot of important issues, particularly subjective ones such as "does this software's user interface suck?" According to him, over-reliance on automated tests at Microsoft is how we ended up with Windows Vista.

He's written that, in his experience, the kinds of bugs users actually find tend to fall into two categories: 1) the ones that show up in common usage, which the programmers would have found themselves had they run their own code, or 2) edge cases so obscure that no one would have thought to write tests covering them in the first place. He's stated that only a very small percentage of the bugs he and his team fix in FogBugz are the sort of thing unit testing would have caught. (I can't find that article now, but if anyone knows which one I mean, feel free to edit the link in here.)

And he's written about how unit testing can be more trouble than it's worth, especially when your project grows very large with many unit tests, and then you change something (intentionally) and end up with a very large number of broken unit tests. He specifically cites the problems unit testing can cause as the reason he has not added it as a 13th point to the Joel Test, even when people suggest that he ought to.
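That last point, about intentional changes breaking many tests at once, usually comes down to tests that pin implementation details rather than observable behavior. A minimal sketch in Python (all names hypothetical, not drawn from any of Joel's articles):

```python
# A brittle test pins the exact wording of the output, so an intentional
# copy change breaks it even though users would see nothing wrong.
# A behavioral test checks only what actually matters.

def greeting(name):
    return "Hello, {}!".format(name)

def brittle_test():
    # Breaks the moment anyone rewords the copy, even deliberately.
    assert greeting("Ada") == "Hello, Ada!"

def robust_test():
    # Survives intentional wording changes, as long as the name appears.
    assert "Ada" in greeting("Ada")

brittle_test()
robust_test()
```

Multiply the brittle style across thousands of tests and a single deliberate design change produces the wall of red that Joel describes.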
No one but Joel could answer that for sure, but we can try some reasons and observations.

First of all, testing is not absent from the Joel Test ("Do you have testers?").

Secondly, the whole idea of the Joel Test (as I understand it) is to have quick yes/no questions. "Do you do TDD?" would not fit: the answer could be "some of us", "for that part of the code", or "we do unit tests".

Thirdly, I don't think anyone (not even Joel) claimed those points were the only ones worth checking (by the way, "do you program?" is not on it), just that they are good quick questions to ask when coming into contact with a software team, whether as a future team member or even as a customer. This is a list I gave to some non-technical people around me who were looking for clues about how good or bad their own IT department was. It is not everything, but it is really hard to beat in three minutes.