This myth that "testing means no more debugging" needs to die.
Just as we are finally getting good IDEs, good debuggers and, more importantly, an increasingly widespread conviction in the developer community that these tools are part of a healthy software engineering process, a new vanguard of smug programmers comes out of the woodwork with their superior attitude to resurrect the old mantra that only bad developers use debuggers.
Yes, testing is important, and promoting testing is laudable, but not at the expense of equally useful tools and practices that it took us decades to hone and perfect.
Ever since I became serious about testing, my usage of debuggers has increased, if only because I write more code now than before (I didn’t use to write any tests), and this extra code needs to be debugged like any other. Or do these people assume that just because the code you write is called "tests", it has suddenly become bug-free?
But here is my real secret: most of the time, I don’t use a debugger to debug. I use it to verify that my code works as I think it does.
That’s right: I launch my debugger even before a bug has manifested itself. Even before my code is working at all!
I launch it to inspect all the variables, verify my assumptions, and stare at my code for what it really is, not for what my biased view tells me it is. I also use the debugger to modify variables and try to trip my code: forcing errors that could happen, or should never happen, and making sure it reacts accordingly. Of course, eventually, I capture all of this in tests, but these approaches are complementary.
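To make this concrete, here is a minimal sketch of the practice using Python and its standard `pdb` debugger. The function and file names (`parse_price`, `prices.py`) are hypothetical, invented for illustration; the point is tripping the error path by hand before any test exists.

```python
# prices.py -- a hypothetical function whose error handling we want to
# exercise under the debugger before a bug ever manifests:
#
#   $ python -m pdb prices.py
#   (Pdb) break parse_price          # stop at the function
#   (Pdb) continue
#   (Pdb) p raw                      # inspect the variable as it really is
#   (Pdb) !raw = "not-a-number"      # modify it to trip the code on purpose
#   (Pdb) continue                   # confirm the error path actually fires

def parse_price(raw):
    """Parse a price string like '19.99'; reject anything else."""
    try:
        value = float(raw)
    except ValueError:
        raise ValueError(f"malformed price: {raw!r}")
    if value < 0:
        raise ValueError(f"negative price: {value}")
    return value

if __name__ == "__main__":
    print(parse_price("19.99"))
```

Once the behavior observed in the debugger matches expectations, the same scenario (the malformed and negative inputs) can be captured in a regular test.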
Don’t throw away your debugger, or the quality of your code will suffer.