Cognitive Biases in Software Engineering: How They Impact Developers and Managers
A practical guide to recognizing and mitigating the mental shortcuts that undermine technical decision-making
Let’s face it
We developers and tech managers love to think we’re the most logical people on the planet. After all, we spend our days working with machines that operate on pure logic. But here’s the uncomfortable truth: our brains are just as messy and biased as everyone else’s, maybe even more so because we don’t expect it.
I’ve spent many years both writing code and managing teams, and I’ve seen firsthand how cognitive biases can derail technical projects. So let me break down the most problematic mental shortcuts I’ve witnessed (often in myself, but hey, the first step is admitting you have a problem), with a particular focus on how they manifest differently depending on whether you’re the one writing the code or the one managing the people who write it.
The 10 Cognitive Biases That Are Probably Screwing Up Your Code or Team Right Now
1. Recency Bias
Ever noticed how your team suddenly becomes obsessed with fixing that one bug a customer complained about yesterday, while ignoring the critical issue affecting hundreds of users for months?
For Developers:
You’re ready to rewrite the entire codebase using that shiny framework you read about on Hacker News this morning
You focus all your energy on that bug Karen from Marketing reported yesterday while ignoring the performance issue affecting thousands of users
You’re convinced Pattern X is the best approach to every problem because it worked well in the last project (which, by the way, was completely different)
For Engineering Managers:
“Our last retrospective revealed X, so clearly X is our biggest problem company-wide!”
You prioritize whatever feature was mentioned in the last executive meeting
That developer who did great in the last sprint must be your star player (never mind the six months of mediocre performance before that)
2. Confirmation Bias
When you know you’re right and spend three hours googling just to find that one random forum post from 2011 that agrees with you.
For Developers:
You write tests designed to confirm your code works rather than trying to break it (that would be, you know, the actual point of testing)
“Users are just using it wrong” is your go-to explanation for negative feedback
You somehow always find documentation that supports your approach while missing all contradictory evidence
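A concrete antidote to the testing version of this bias: write at least one test whose job is to break your code, not to confirm it. Here’s a minimal sketch, built around a hypothetical `parse_age` helper (the function and its inputs are invented for illustration):

```python
def parse_age(raw: str) -> int:
    """Hypothetical helper: parse a user-supplied age string."""
    value = int(raw.strip())  # raises ValueError for non-numeric input
    if not 0 <= value <= 150:
        raise ValueError(f"age out of range: {value}")
    return value

# Confirmation-biased test: only checks the input you expected all along.
def test_happy_path():
    assert parse_age("42") == 42

# Adversarial test: actively tries to break the function with hostile input.
def test_hostile_inputs():
    for bad in ["", "  ", "-5", "999", "forty-two", "42.5"]:
        try:
            parse_age(bad)
        except ValueError:
            continue  # good: the function rejected the bad input
        raise AssertionError(f"parse_age accepted bad input: {bad!r}")
```

The happy-path test alone would pass even if `parse_age` silently accepted negative ages; the hostile-input test is what actually earns its keep.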
For Engineering Managers:
You hear what you want to hear in status updates, filtering out warning signs
You keep pushing a technical direction despite mounting evidence it’s turning into a disaster
The team members who agree with you are “insightful” while those who raise concerns are “not seeing the big picture”
3. Availability Heuristic
That one time the production database crashed and became the stuff of company legend? Now you’ve got three engineers dedicated to database optimization while your frontend is held together with duct tape and prayers.
For Developers:
You’re certain SQL injection attacks are the biggest security threat because you read an article about one last week
You build elaborate protection against edge cases you’ve personally encountered while ignoring more common scenarios
You keep solving problems from your last job even though your current company faces completely different challenges
For Engineering Managers:
The bug that caused the CEO’s demo to fail gets five engineers assigned to it immediately
You completely restructure support processes because one customer with a dramatic story complained loudly
“We’re never hiring junior developers again” after one bad experience
4. Anchoring Bias
“It’s just a simple CRUD app, right? Should take a couple of days!” — Words that have launched a thousand death marches and countless weekend deployments. Anchoring ties closely to the Dunning-Kruger effect, and often feeds directly into the sunk cost fallacy.
For Developers:
You committed to the “couple of days” estimate and now you’re working nights and weekends to avoid admitting it was wildly unrealistic
Your first solution becomes THE solution because you’ve already mentally committed to it
You “optimize” code until it matches some arbitrary benchmark you set before understanding the actual requirements
For Engineering Managers:
You promise delivery dates based on when marketing wants the feature, not on what engineering says is possible
You lowball salary offers based on what candidates made at their previous jobs rather than what they’re worth
You judge team performance based on their first few sprints, ignoring all evidence of improvement or changing circumstances
5. Sunk Cost Fallacy
Sometimes it’s better to cut your losses and start over with an updated approach. The same logic applies to a failing startup: if it’s failing, close it down. Don’t stay simply because you’ve already invested years in it.
For Developers:
“I’ve already spent two weeks on this approach, I can’t switch now” (even though a better solution would take days instead of weeks)
You keep using a problematic library because “we’ve already invested time in learning it”
You continue debugging using the same approach for days, despite making zero progress
For Engineering Managers:
You keep funding projects that no longer align with business goals because “we’ve already spent so much”
You maintain ancient systems requiring expensive specialists rather than migrating to something maintainable
You stick with failing processes because admitting they don’t work feels like admitting failure
6. Dunning-Kruger Effect
This effect is especially dangerous because it follows a predictable pattern: beginners with minimal knowledge often have maximum confidence. As engineers gain experience, they realize how much they don’t know (the “valley of despair”). Truly senior engineers tend to have measured confidence — they know what they know, but also recognize the boundaries of their expertise.
What makes Dunning-Kruger particularly toxic in software teams is that those most affected are least equipped to recognize it in themselves. The same knowledge gaps causing overconfidence prevent accurate self-assessment. This is why peer review processes are essential — they provide external reality checks that help calibrate your internal confidence meter.
For Developers:
You roll your own solution for everything instead of using established libraries, because “how hard can it be?”
You dismiss industry best practices as “overkill” because you don’t understand the problems they solve
You write “clever” code without comments because it seems obvious to you now (future you and your colleagues will hate you)
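To make the “clever code” point concrete, here’s a hedged illustration: both functions below deduplicate a list while preserving order, but only one will still be readable in six months (both are toy examples, not code from any real project):

```python
# "Clever" version: works, but relies on the opaque trick that
# set.add() returns None, making the `or` expression falsy.
def dedupe_clever(items):
    seen = set()
    return [x for x in items if not (x in seen or seen.add(x))]

# Boring version: same behavior, intent stated outright.
def dedupe_readable(items):
    """Return items with duplicates removed, keeping first occurrences."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:  # only keep the first time we see a value
            seen.add(item)
            result.append(item)
    return result
```

The clever version feels satisfying to write; the readable one is satisfying to maintain.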
For Engineering Managers:
You override technical decisions made by your specialists because “how complicated could it be?”
You commit to deadlines without consulting the people who’ll do the work
You evaluate engineers’ technical skills in areas where your own expertise is limited, often favoring those who speak confidently over those who are actually competent
7. Self-serving Bias
Come on, you know this one describes your favorite co-worker.
For Developers:
“The deployment succeeded because of my brilliant code, but it failed because ops screwed up the configuration”
You take credit for all the features users love while blaming “unclear requirements” for the parts they hate
When your code has bugs, it’s because of “edge cases,” but when others’ code has bugs, it’s because of incompetence
For Engineering Managers:
Your team succeeded because of your amazing leadership, but they failed because they didn’t execute properly
You claim credit for on-time deliveries but blame “unforeseen technical challenges” when deadlines slip
You view successful quarters as proof of your strategy and unsuccessful ones as market anomalies
8. Survivorship Bias
Who hasn’t heard of a failed startup building the most glamorous architecture, instead of something simpler and easier to maintain, just because all the bigger companies are doing it?
For Developers:
“We should use microservices because that’s what Netflix does!” (ignoring that Netflix evolved to that architecture after years of learning and has hundreds of engineers to support it)
You learn programming by following success stories without studying the far more common failure patterns
You adopt complex architectures used by tech giants without considering whether they solve problems you actually have
For Engineering Managers:
You build development processes based only on successful projects, ignoring what went wrong in failures
Your hiring process favors candidates from successful companies without evaluating their actual contributions
You apply management techniques that worked at your last company without adapting them to your current team’s context
9. Status Quo Bias
Who hasn’t joined a company and suggested an improvement, only to be shot down with “we’re used to doing it this way”?
For Developers:
“We’ve always used this approach” becomes the single most annoying phrase you’ll ever hear
You resist learning new tools or techniques because your current workflow is comfortable
You defend problematic legacy code because rewriting it would disrupt your routine
For Engineering Managers:
You maintain inefficient processes because changing them would require effort
You block organizational improvements because they’d disrupt established team dynamics
You stick with obsolete technical approaches because everyone already knows them
10. Bandwagon Effect
Who hasn’t had a colleague who wants to use something so cutting-edge it has only a handful of stars and forks on GitHub, instead of something tried and tested?
For Developers:
You choose technologies based on Twitter hype rather than project requirements
You implement patterns because “everyone is doing it” without understanding why
You follow trendy opinions on best practices without critically evaluating them
For Engineering Managers:
“We need to be doing agile!” (without understanding what that actually means for your specific context)
You let the loudest team members dictate decisions regardless of merit
You prioritize features because competitors have them, without asking if your users need them
How to Not Be a Walking Cognitive Bias Disaster
For Developers
**Document your assumptions**
— Write down what you’re taking for granted before you start coding
— Future you will thank present you when those assumptions change

**Actively seek contradiction**
— Don’t ask colleagues “Does this look good?” Ask “What am I missing?”
— The goal isn’t validation; it’s finding problems before your users do

**Set time limits for explorations**
— “I’ll spend exactly two hours evaluating this new tech” prevents rabbit holes
— Your excitement about a new framework is a terrible decision-making tool

**Break down tasks to expose complexity**
— If you can’t explain exactly how you’ll implement something, you don’t understand it well enough to estimate it
— Details reveal the dragons hiding in seemingly simple tasks
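One lightweight way to apply the “document your assumptions” tip is to encode assumptions as executable checks rather than tribal knowledge. A minimal sketch, with invented example values (the batch limit and column names are hypothetical):

```python
# Assumptions we are consciously making for an import job.
# If any of these stops holding, we want a loud failure, not silent corruption.
MAX_BATCH_SIZE = 500  # assumption: upstream API rejects batches over 500 rows
EXPECTED_COLUMNS = {"id", "email", "created_at"}  # assumption: feed schema

def validate_batch(rows):
    """Fail fast if our documented assumptions no longer hold."""
    assert len(rows) <= MAX_BATCH_SIZE, (
        f"batch of {len(rows)} exceeds assumed API limit of {MAX_BATCH_SIZE}"
    )
    for row in rows:
        missing = EXPECTED_COLUMNS - row.keys()
        assert not missing, f"feed schema changed, missing columns: {missing}"
```

When the upstream feed inevitably changes, the assumption fails loudly at the boundary instead of surfacing as a mystery bug three layers deep.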
For Engineering Managers
**Require written proposals for major decisions**
— Force articulation of alternatives and trade-offs
— Make implicit assumptions explicit where everyone can see (and question) them

**Assign devil’s advocates**
— Make it someone’s explicit job to argue against the prevailing view
— This lowers the social cost of dissent and surfaces important counterarguments

**Track estimate accuracy**
— Compare estimates vs. actuals as a learning exercise (not a blame game)
— Look for patterns in what types of tasks consistently fool everyone

**Diversify your information diet**
— Seek input from people with different backgrounds and experience levels
— Build teams with complementary biases to counterbalance each other
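Tracking estimate accuracy doesn’t need special tooling; even a spreadsheet-grade script works. A minimal sketch with made-up task data (every task name and number below is invented):

```python
# Hypothetical record of past tasks: (name, estimated hours, actual hours).
HISTORY = [
    ("login form", 8, 10),
    ("CSV export", 4, 12),
    ("simple CRUD app", 16, 58),
    ("copy change", 1, 1),
]

def estimate_report(history):
    """Print each task's actual/estimate ratio and return the average overrun."""
    ratios = []
    for name, estimated, actual in history:
        ratio = actual / estimated
        ratios.append(ratio)
        print(f"{name:20s} estimated {estimated:3d}h, actual {actual:3d}h, x{ratio:.1f}")
    average = sum(ratios) / len(ratios)
    print(f"average overrun factor: x{average:.1f}")
    return average
```

Run over a quarter of real data, a report like this turns “we’re bad at estimating” from a vague feeling into a calibration factor you can actually apply to future estimates.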
The Bottom Line
Cognitive biases aren’t character flaws — they’re part of the standard human operating system. The difference between average and exceptional engineers isn’t the absence of bias, but the awareness of it.
By recognizing these patterns in ourselves and our teams, we can build guardrails that compensate for our predictable irrationality. The most powerful tool against bias isn’t more technical knowledge — it’s metacognition, the practice of thinking about how we think.
I’d love to hear what biases you’ve seen derail technical projects and what strategies you’ve found to combat them. Drop your experiences in the comments!