I come from a consulting background, having worked with companies similar to TW. Naturally, TDD was drilled into our brains. I'm well-versed in it.
For those unfamiliar, Test-Driven Development (TDD) flips the usual process: instead of writing code first and tests later, you write the test first, watch it fail, write just enough code to make it pass, then refactor. Rinse, repeat.
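A minimal sketch of one red-green-refactor cycle, in case that's abstract. The `slugify` function and the test name here are made up purely for illustration; they're not from any real codebase:

```python
# Step 1 (red): the test is written before the implementation exists,
# so running it at this point fails with a NameError.
def test_slugify_replaces_spaces_with_hyphens():
    assert slugify("Hello World") == "hello-world"

# Step 2 (green): write just enough code to make the test pass.
def slugify(text: str) -> str:
    return text.strip().lower().replace(" ", "-")

# Step 3 (refactor): clean up with the passing test as a safety net,
# then repeat the cycle for the next behavior (e.g. punctuation).

if __name__ == "__main__":
    test_slugify_replaces_spaces_with_hyphens()
    print("test passed")
```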
There are some solid advantages:
- It forces you to focus on the input/output contract of functions.
- You tend to write only the necessary code to pass the test—promoting cleaner, leaner implementations.
- You think through edge cases before jumping into implementation.
Sounds clean, right?
But even after trying it seriously, I find myself not loving it. Here's why:
- When clarity is missing, it's hard to TDD. In the early stages of development I often don't know the exact inputs and outputs; I'm still exploring the shape of the problem.
- It demands iteration. Write a test. Make it pass. Refactor. Repeat. But in real-world scenarios with time pressure and legacy systems, this ideal workflow can feel like a luxury.
- For known problems, where I already know what to build, TDD sometimes feels like a ceremony for ceremony’s sake. When you already have clarity, writing the code first is simply faster and more intuitive.
- Mild OCD here. I hate seeing red tests or compilation errors, even temporarily. It bugs me more than it should, but it does affect my focus.
Recently, in an interview, I mentioned that I prefer writing code first and testing later—and that it's a controversial take. The interviewer laughed, said “Yeah, that’s controversial,” and shut down the conversation without hearing me out.
That rubbed me the wrong way. Not because I wasn’t challenged—I'm open to being wrong—but because it felt dismissive.
So, fellow devs: Am I missing something here? Are these real drawbacks, or am I just bad at sticking to the process?
Would love to hear what other folks think.