I spent four years studying how humans think, make decisions, and behave in groups. Then I spent twenty years writing software.

For a long time, I kept those two things in separate mental compartments. The degree was the degree. The code was the code. I was a developer who happened to have an unusual educational background — a curiosity, a conversation starter, maybe a liability depending on who was doing the hiring.

Then I hit a bug I couldn't find. And something shifted.


The Bug I Couldn't See

I had been staring at the same section of code for three hours. I'd read it a dozen times. I'd added logging everywhere. The output didn't make sense and the code looked correct — which meant I was wrong about what "correct" meant, but I couldn't see where.

I stepped away. Came back. Read it again. Still nothing.

Then I handed it to someone who had never seen the codebase. They found it in four minutes.

Any developer reading this knows exactly what happened. It's one of the most common and frustrating experiences in the field. But I remember standing there thinking — I know exactly why this happens. I studied this.

What happened is a combination of two well-documented effects: confirmation bias and inattentional blindness. When you write code, you encode your intent into it. When you re-read it, you're not actually reading what's there; you're reading what you meant to write. Your brain fills in the gaps. It corrects the errors before they reach conscious awareness, because you already know what it's supposed to say.
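To make that concrete, here's a hypothetical snippet, invented for this post rather than taken from the actual bug in the story, showing the kind of error that survives a dozen re-readings: the condition scans as the range check you intended, so your eye supplies the || you meant instead of the && you typed.

```js
// Hypothetical example: a port validator with an invisible logic bug.
// Intent: return true when the port is out of range.
function isInvalidPort(port) {
  // As written, no number is both below 1 AND above 65535, so this
  // expression is always false and every port "validates". On
  // re-reading, the author tends to see the || they meant to write.
  return port < 1 && port > 65535;
}

console.log(isInvalidPort(99999)); // false: the bad port sails through
```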

The person who found my bug in four minutes wasn't smarter than me. They just didn't have my intent in their head. They read the code as it actually was, not as I meant it to be.

That was the moment I stopped treating my psychology degree as a footnote.


What Developers Get Wrong About Users

The second place my psychology background shows up is in interface design and user behavior. This one is harder to internalize, because it requires you to accept something uncomfortable:

Users are not irrational. They're rational about different things than you are.

This distinction matters enormously. When you build a system and watch users interact with it in ways that seem obviously wrong, the instinct is to blame the user. They didn't read the documentation. They're not technical. They're doing it wrong.

But the user isn't doing it wrong relative to their mental model. They're doing it exactly right. The problem is that their mental model and your mental model are different, and you built the system for your mental model.

Psychology — specifically the study of schema theory, cognitive load, and how people construct categories — gives you a framework for understanding this gap before you build, not after. It teaches you to ask: what does this user already believe is true about how systems like this work? And how do I either meet that expectation or explicitly break it in a way they'll notice and understand?

Most developers learn this eventually through painful iteration. Having the vocabulary for it from the start is a different kind of advantage.


What Sociology Taught Me About Codebases

The sociology half of my degree is the one people find most surprising in this context. What does the study of group behavior and social structures have to do with software?

More than you'd think.

Codebases are social artifacts. They're not just logical structures — they're records of decisions made by groups of people under particular constraints at particular moments in time. The architecture reflects the team structure. The naming conventions reflect the communication patterns. The technical debt maps almost perfectly to the organizational debt that accumulated alongside it.

There's a principle in software engineering called Conway's Law that formalizes this: organizations design systems that mirror their own communication structures. Melvin Conway observed it in 1968, but I didn't learn it from a programming book. I learned the underlying concept, that social structures shape the outputs of groups, in a sociology classroom. The application to software came naturally.

Understanding this has practical implications. When I inherit a codebase, I'm not just reading code — I'm reading the archaeology of a team. I can tell where decisions were made under time pressure, where there was disagreement that never got resolved, where someone left mid-project and their replacement didn't fully understand the context. This is useful information. It tells me where the risk is and where the assumptions are buried.
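To make that less abstract, here's a hypothetical fragment, invented for illustration rather than drawn from any real project, of the kind of signals I'm describing: two naming conventions in one file usually means two eras or two teams, and a stale TODO often marks context that walked out the door with its author.

```js
// Hypothetical "team archaeology": every name here is invented.

// snake_case helper from an earlier era, before the conventions changed.
function get_user_record(id) {
  return { id, name: "legacy-" + id };
}

// camelCase addition from a later team, wrapping the old path rather than
// replacing it. The stale TODO is the trace of an unresolved handoff.
function fetchUserProfile(userId) {
  // TODO: drop the legacy lookup once records are migrated
  return get_user_record(userId);
}

console.log(fetchUserProfile(42)); // { id: 42, name: 'legacy-42' }
```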


The Advantage Nobody Talks About

The tech industry has a complicated relationship with non-traditional backgrounds. On one hand, there's genuine openness to autodidacts and career changers. On the other hand, there's an implicit hierarchy that places CS degrees at the top and treats everything else as a weakness to be compensated for.

I've been in both positions — treated as less credible because of my background, and chosen specifically because of it.

Here's what I've come to believe: the developers who build the best systems are almost always the ones who understand both machines and people. The machine part you can learn from documentation, from practice, from twenty years of writing Node.js. The people part — why users behave the way they do, why teams make the decisions they make, why your own brain lies to you when you're debugging — that's harder to pick up from a technical curriculum.

My degree didn't teach me to code. But it taught me things that made me a better engineer than I would have been without it. The ability to step outside my own perspective and ask what someone else is actually experiencing. The vocabulary to name the cognitive patterns that show up in debugging, in user research, in team dynamics. The habit of treating human behavior as a system with its own logic, rather than as noise to be filtered out.

That's not a liability. That's leverage.


What This Means If You're Hiring

If you're a founder or CTO evaluating technical candidates, here's my honest take:

The developer who studied computer science exclusively has deep expertise in how machines work. That's valuable. But the developer who came to engineering from psychology, philosophy, history, design, or any other discipline that required sustained study of human behavior — they've been trained to think about problems from a fundamentally different angle.

The best technical teams I've been part of had both. The worst were monocultures: everyone trained the same way, thinking the same way, sharing the same blind spots.

Non-traditional backgrounds aren't a risk to be managed. They're a signal that someone learned to think carefully about something hard — and that skill transfers.

