December 18, 2025 · Steve Macfarlane
AI in the Real World

Rick and Morty and the Human Cost of Artificial Intelligence


Rick Sanchez is probably the most accurate portrait of unchecked technological power ever put on television. The fact that he’s a cartoon drunk seems to be the only thing stopping people from taking that more seriously.

Because here’s the thing about Rick and Morty that doesn’t get said enough. It’s not really science fiction. It’s not interested in just imagining better technology or a more advanced future. What it’s actually interested in is examining what happens when the usual constraints disappear. When someone is smart enough, powerful enough, and untouchable enough that human nature has nowhere left to hide.

That’s not a new storytelling trick. Make a character rich or powerful or beyond consequence and their worst instincts get a lot harder to disguise (think Succession or House of Cards). Rick and Morty just applies that logic specifically to technology, and with a precision that gets more uncomfortable the longer you work in AI.

[Image: Rick and Morty future car]

The best satire doesn’t actually predict the future. It just watches the present more carefully than everyone else, especially the parts that make us squirm a little. The writers aren’t psychic, they’re just paying extremely close attention to emergent behaviour and following the logic wherever it leads.

And if you’ve spent any time lately watching how AI is actually being built and deployed, some of it lands a bit too close to home.

For example, Glootie’s manic insistence on building an app at any cost now reads like a pretty sharp parody of vibe coding. Zero friction creation, instant shipping, and a complete lack of interest in what happens after. The goal is the build, and everything else is someone else’s problem.

Then there’s the Meeseeks, which feel like early stateless agents. Spun up on demand, laser focused on completing a task, and prone to total unravelling the moment that task turns out to be poorly defined, context dependent, or worst of all, Jerry shaped and messily human. Which, in practice, most real tasks are.
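For anyone who wants the engineering analogy made concrete, here is a minimal sketch of the stateless-worker pattern the Meeseeks embody. All names here are illustrative, not taken from any real agent framework: each worker is created for one task, carries no memory between runs, and fails hard the moment the task turns out to be under-specified.

```python
# A minimal sketch of the "Meeseeks" pattern: a stateless worker that
# exists only to complete one well-defined task, then disappears.
# All names are illustrative, not from any real agent framework.

class AmbiguousTaskError(Exception):
    """Raised when the task is too fuzzy for a stateless worker."""

class Meeseeks:
    def __init__(self, task: str):
        self.task = task  # the only context this worker will ever have

    def run(self) -> str:
        # A stateless worker has no history to fall back on, so an
        # under-specified task is unrecoverable rather than negotiable.
        if len(self.task.split()) < 3:
            raise AmbiguousTaskError(f"Task too vague: {self.task!r}")
        return f"Done: {self.task}"

# Spun up on demand, used once, thrown away.
print(Meeseeks("open the mayonnaise jar").run())
```

The point of the sketch is the failure mode: there is no graceful path for a vague or human-shaped task, only an error, which is roughly what happens to a Meeseeks handed a Jerry.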

And my personal favourite, the butter robot. Hyper capable, narrowly scoped, and just self-aware enough to ask the one question nobody wants to answer: what is my purpose?

Pass the butter.

It’s funny until it isn’t.

[Image: Butter-passing robot]

These aren’t really jokes about gadgets or sci-fi scenarios. They’re jokes about responsibility. Specifically about what happens once the technology is already running.

Rick builds things and moves on. That’s his whole pattern. The system works, the problem is technically solved, and whatever comes next is someone else’s concern. He’s not malicious about it; he’s just completely uninterested in maintenance, consequences, or the gap between what he intended and what actually happens.

That gap is usually Morty’s problem.

And if you spend enough time watching how AI systems actually get built and deployed right now, that dynamic feels less like satire and more like a reasonably accurate job description.

Roll with that, Rick

Morty isn’t intelligent the way Rick is intelligent. He doesn’t build anything himself, or even really conceptualise anything on his own. What he does is stay. He exists inside the systems Rick creates and moves on from. He’s there for the aftermath, the edge cases, and the parts that can’t be abstracted away or solved with another layer of cleverness.

In AI terms, Morty is the human in the loop. Except he’s not there to provide oversight; he’s just there to absorb damage.

He’s the cost centre nobody really models for.
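The difference between those two roles can be sketched in a few lines. This is a hypothetical illustration, not any real system: real oversight gates an action before it runs, while damage absorption hands the human whatever already happened.

```python
# Two readings of "human in the loop". All names are illustrative.

def oversight(action, human_approves):
    # The human gates the action BEFORE it runs: genuine oversight.
    if not human_approves(action):
        return "blocked"
    return f"executed: {action}"

def damage_absorption(action, human_cleans_up):
    # The system acts first; the human inherits whatever comes next.
    result = f"executed: {action}"
    human_cleans_up(result)  # Morty's actual job description
    return result

print(oversight("delete all records", human_approves=lambda a: False))
```

Both functions put a human "in the loop"; only one of them gives the human any say before the system acts.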

And here’s where the show gets uncomfortable in a way that feels very real. As AI systems become more autonomous, the human role doesn’t disappear, it shifts. Fewer people are needed to produce, and more are needed to clean up, review, moderate, and carry the moral weight of decisions the system made but couldn’t actually own.

Morty isn’t a flaw in the systems. His presence, messy and disruptive and stubbornly human, is usually what makes them survivable, and that’s no small thing.

Do you think this is gonna work?

We laugh at Morty because he’s neurotic and emotional and keeps asking uncomfortable questions at the worst possible times. But he’s also the only character who actually deals with what happens after the clever idea has moved on.

Rick gets the breakthrough and Morty just gets the aftermath.

In a lot of ways, that’s less a comedy trope than a typical organisational dynamic. And it’s playing out in real companies right now in ways that would feel very familiar to anyone who’s spent time around serious AI deployment.

The show doesn’t try to forecast specific models or platforms. It watches incentives and follows behaviour to its most uncomfortable conclusion and then lets you laugh, not because it’s absurd, but because it’s so recognisable.

There’s a line Rick throws out early in the series, usually right before something goes catastrophically wrong.

“Don’t think about it.”

It’s a joke, but it’s also a philosophy. Move fast, build faster, and trust that if something breaks there’s always another universe to escape to.

The problem, for most businesses anyway, is that that’s not actually an option.

[Image: Rick drinking whiskey]

That posture made sense when intelligence was rare and effort was the challenge, but it makes a lot less sense when intelligence is everywhere and the real constraint is no longer capability but responsibility.

AI didn’t create that tension, it just made it harder to ignore.

Morty doesn’t ever really resist what’s happening. He just feels it. He says the quiet part out loud.

“I’m not okay with this.”

And then he keeps going anyway.

[Image: Morty kissing Rick]

The future shaped by AI won’t be defined by how smart our systems get. It’ll be defined by what’s left for humans to do once those systems are running.

As intelligence scales, responsibility doesn’t disappear. It concentrates. It pools in whoever is left in the room once the clever work is done.

That’s what Rick and Morty keeps showing us, often without appearing to. Rick’s inventions almost always work. The damage happens downstream, offscreen, or to someone who didn’t sign up for it. Responsibility doesn’t vanish when intelligence grows. It just relocates.

And in this version of things, it usually ends up with Morty.

The question AI is actually asking of businesses right now isn’t whether you can build increasingly powerful systems. It’s whether you’re prepared to own the aftermath. The edge cases, the unintended consequences, the moments where someone still has to say “I’m not okay with this” and mean it.

Rick never sticks around for those moments.

But the reality is that someone always has to.

[Image: Morty questions]

That’s the part most AI conversations still underplay. Not whether systems will become more capable (they will) but where the human weight of those systems lands once they do. Who absorbs the confusion. Who feels it when optimisation meets reality.

Rick and Morty as a show doesn’t argue for slowing down or stopping. It just reveals what happens when nobody designs deliberately for the Mortys of the world. When someone ends up in that role by accident rather than intention, the results tend to be messy in ways that were entirely predictable and almost entirely ignored.

It’s what happens when human judgment, care, and accountability get treated as afterthoughts rather than load bearing parts of the system.

Morty’s role isn’t to resist progress. It’s to make it survivable.

The question AI is actually leaving us with isn’t whether we can build systems that think. It’s whether we’re willing to design for the humans who have to stay once they do.

[Image: Get schwifty dancing]
Ready to unlock the power of AI but wondering where to start?
Syfre's AI Roadmap Workshops can guide your business on what to focus on.