Rich Hickey's AI Critique: What Developers Need to Know in 2026

Alex Thompson

January 13, 2026

10 min read

Clojure creator Rich Hickey's controversial 'Thanks AI!' critique sparked intense debate about AI's role in programming. In 2026, his insights about AI's limitations and the enduring value of programming fundamentals remain more relevant than ever for developers navigating the AI landscape.

Introduction: The AI Programming Debate That Won't Go Away

Back in 2024, when Clojure creator Rich Hickey dropped his "Thanks AI!" critique, the programming world split right down the middle. Some called it a necessary reality check. Others dismissed it as old-school resistance to progress. Now, in 2026, with AI tools embedded in nearly every developer's workflow, Hickey's arguments feel less like criticism and more like prophecy.

I've been programming professionally for fifteen years, and I've watched more "revolutionary" technologies come and go than I can count. What struck me about Hickey's perspective wasn't the skepticism—we've all got that—but the specific, technical concerns he raised about how AI actually interacts with the craft of programming. He wasn't just saying "AI is bad." He was asking questions most of us were too busy to consider.

This article isn't about whether you should use AI tools. You probably already do. It's about understanding their limitations, preserving what makes programming valuable, and integrating these tools without losing your edge as a developer. Because in 2026, that's the real challenge.

The Core of Hickey's Argument: AI as Amplifier, Not Replacement

Hickey's central point, which many people missed in the initial reaction, wasn't that AI tools are useless. Far from it. His concern was about what happens when we treat AI as a replacement for understanding rather than an amplifier of capability.

"Programming is not about typing," he emphasized repeatedly. "It's about thinking." And this is where things get interesting. AI tools in 2026 are phenomenal at generating code. They can produce working solutions to common problems in seconds. But they can't—and this is crucial—they can't understand the why behind your decisions.

I've seen this play out in real projects. A junior developer uses an AI tool to generate a complex data transformation pipeline. The code works. Tests pass. But when requirements change six months later, nobody understands how to modify it. The original intent, the trade-offs considered, the alternative approaches rejected—all that context lives in the developer's head, not in the generated code.
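One mitigation is to write the missing context down where the code lives. The sketch below is a hypothetical, simplified version of such a pipeline (the domain, field names, and rules are invented, not from the article): the design notes that usually stay in the author's head are recorded as comments, so the next maintainer inherits the intent, not just the behavior.

```python
def normalize_orders(orders):
    """Deduplicate and normalize raw order records.

    Design notes (the context generated code usually omits):
    - We dedupe by (customer_id, sku) and keep the LATEST record,
      because upstream re-sends corrected orders under the same key.
    - We deliberately drop records missing a customer_id rather than
      guessing at identity; rejected rows are reported elsewhere.
    """
    latest = {}
    for order in orders:
        if order.get("customer_id") is None:
            continue  # intentional: never guess at identity
        key = (order["customer_id"], order["sku"])
        latest[key] = order  # later records win, by design
    return list(latest.values())
```

Six months later, a maintainer changing the dedupe rule can see which behaviors were decisions and which were accidents.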

Hickey's warning was about this exact scenario: when AI-generated code becomes a black box within your codebase. It creates what he called "synthetic complexity"—complexity that doesn't serve the problem but exists because the tool generated it that way.

The Understanding Gap: Why AI Can't Replace Design Thinking

Here's where Hickey's experience with Clojure's design philosophy becomes particularly relevant. Clojure emphasizes simplicity, immutability, and data transformation. These aren't just language features—they're design principles that require deep understanding to apply effectively.

AI tools in 2026 still struggle with design principles. They can implement patterns they've seen before, but they can't evaluate whether a pattern is appropriate for your specific context. I've tested this extensively with current AI coding assistants. Give them a well-defined function to write, and they'll nail it. Ask them to design a system with specific constraints around scalability, maintainability, and team skill levels? The results are... mixed at best.

One developer in the original discussion shared a perfect example. They asked an AI to generate a caching layer for their application. The AI produced a technically correct implementation using a popular caching library. What it didn't consider was that their application already had three different caching strategies in place, each serving different purposes. The "correct" code would have created consistency nightmares.
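The hazard is easy to reproduce in miniature. The following sketch is my own invented illustration of that consistency problem, not the developer's actual code: two caches sit over the same record, but only the pre-existing one is invalidated on writes, so the new layer serves stale data.

```python
db = {42: {"name": "Ada"}}

id_cache = {}    # pre-existing cache the team already invalidates
new_cache = {}   # the generated caching layer, unknown to old code paths

def get_by_id(user_id):
    if user_id not in id_cache:
        id_cache[user_id] = dict(db[user_id])
    return id_cache[user_id]

def get_via_new_layer(user_id):
    if user_id not in new_cache:
        new_cache[user_id] = dict(db[user_id])
    return new_cache[user_id]

def rename(user_id, new_name):
    db[user_id]["name"] = new_name
    id_cache.pop(user_id, None)  # old code paths invalidate here...
    # ...but nothing invalidates new_cache: stale reads follow.

get_by_id(42); get_via_new_layer(42)   # warm both caches
rename(42, "Grace")
print(get_by_id(42)["name"])           # "Grace" (fresh)
print(get_via_new_layer(42)["name"])   # "Ada" (stale)
```

Each layer is "correct" in isolation; the inconsistency only exists at the system level, which is exactly the level the AI never saw.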

This is what Hickey meant when he talked about AI lacking "judgment." Judgment comes from experience, from understanding the broader context, from making mistakes and learning from them. No amount of training data can replicate that.

The Tooling Problem: When AI Creates Its Own Dependencies

Another concern Hickey raised that's become increasingly relevant in 2026 is tooling complexity. AI coding assistants don't just generate code—they often generate dependencies, configurations, and entire toolchains.

I've worked on projects where AI-generated code introduced five new libraries to solve a problem that could have been handled with standard library functions. Each library comes with its own learning curve, maintenance burden, and potential security issues. The AI doesn't care about your team's capacity to maintain these dependencies. It just solves the immediate problem.

This creates what I call "invisible technical debt." The code works today, but the maintenance costs compound over time. One team I consulted with discovered their AI assistant had generated code using three different HTTP client libraries across their codebase. Each had slightly different error handling, retry logic, and configuration patterns. Untangling this took weeks.
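The usual fix is to put one thin wrapper in charge of the behavior that had drifted apart. This is a minimal sketch of that idea, with an invented retry policy (the function name, attempt count, and backoff are my assumptions, not details from the consulting story): every call site shares one retry and error-handling path, regardless of which client library sits underneath.

```python
import time

def with_retries(fn, attempts=3, backoff=0.01):
    """Call fn(), retrying on exception with simple exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the real error
            time.sleep(backoff * (2 ** attempt))

# Simulate a transient failure instead of a real HTTP call.
calls = {"n": 0}
def flaky_request():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky_request))  # "ok", after two retried failures
```

The point isn't this particular policy; it's that retry behavior becomes one reviewed decision instead of three accidental ones.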

Hickey's perspective here is practical, not philosophical. He's asking: "What are we actually getting, and what are we paying for it?" In 2026, with AI tools more deeply integrated than ever, that question has only grown sharper.

Practical Integration: Using AI Without Losing Your Edge

So, if AI tools have these limitations, should we avoid them entirely? Absolutely not. The key—and this is where I've developed my own approach after years of experimentation—is strategic integration.

First, use AI for what it's good at: boilerplate reduction, documentation generation, and exploring alternative implementations. I regularly use AI tools to generate initial drafts of repetitive code, then refine them with human judgment. The difference is that I treat the AI output as a starting point, not a finished product.

Second, maintain what I call "conceptual ownership." Even when using AI-generated code, you need to understand it thoroughly before integrating it. This means reviewing it line by line, testing edge cases, and ensuring it aligns with your project's architectural principles. Yes, this takes time. But it prevents the black box problem Hickey warned about.

Third, establish team guidelines. In 2026, this is non-negotiable. Your team should agree on when and how to use AI tools. Which problems are appropriate for AI assistance? Which require human-only design? What review process will you use for AI-generated code? Having these guidelines prevents inconsistent quality and maintenance nightmares.

The Human Skills That Matter More Than Ever in 2026

Here's the counterintuitive part: as AI tools become more capable, certain human skills become more valuable, not less. Hickey hinted at this, but let me make it explicit based on what I'm seeing in the industry right now.

Problem decomposition—breaking complex problems into simpler, composable parts—is a skill AI still can't replicate effectively. I've watched senior developers take a messy requirement and decompose it into clean, independent functions. The AI can implement each function, but it can't do the decomposition.
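Here's a toy illustration of what that decomposition looks like in practice (the billing domain is invented for the example): instead of one function that "sends the weekly overdue report," the requirement is split into small, independently testable pieces, and the human decision lives in how they compose.

```python
def overdue(invoices, today):
    """Filter: which invoices are past due?"""
    return [inv for inv in invoices if inv["due"] < today]

def by_customer(invoices):
    """Group: bucket invoices per customer."""
    groups = {}
    for inv in invoices:
        groups.setdefault(inv["customer"], []).append(inv)
    return groups

def summarize(groups):
    """Reduce: one outstanding total per customer."""
    return {c: sum(i["amount"] for i in invs) for c, invs in groups.items()}

def weekly_report(invoices, today):
    # Each step above could be AI-generated and reviewed on its own;
    # the composition is where the design judgment lives.
    return summarize(by_customer(overdue(invoices, today)))
```

An AI assistant can happily implement any one of these functions; choosing these boundaries in the first place is the part it can't do for you.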

Architectural thinking is another. Understanding how systems fit together, where boundaries should exist, how data should flow—these require holistic understanding that transcends individual code snippets. The best architects I know in 2026 use AI tools constantly, but they direct them with strong architectural vision.

And then there's what I call "requirements archaeology"—digging beneath surface requirements to understand the real problem. Clients and stakeholders often request solutions to symptoms rather than root causes. Human developers can ask "why" repeatedly to uncover deeper needs. AI tools take requirements at face value.

Developing these skills requires deliberate practice. It means sometimes solving problems without AI assistance, just to keep those muscles strong. It means code reviews that focus on design decisions, not just correctness. It means valuing understanding as much as output.

Common Mistakes and How to Avoid Them

Based on the original discussion and my own observations, here are the most common pitfalls developers face with AI tools in 2026—and how to steer clear of them.

Mistake 1: Treating AI as an oracle. Developers ask AI for "the best way" to solve a problem and implement the answer without critical evaluation. Solution: Always ask "why" about AI suggestions. If the AI can't provide a coherent rationale, be skeptical.

Mistake 2: Skill atrophy. Relying on AI for basic programming tasks until you can't solve simple problems without it. I've interviewed developers who couldn't explain basic algorithms they used daily because they always generated them with AI. Solution: Regular practice without AI. Solve katas, work on personal projects, or contribute to open source using only your brain sometimes.

Mistake 3: Consistency breakdown. Different team members using AI differently, leading to inconsistent code quality and patterns. Solution: Those team guidelines I mentioned earlier. Make them explicit, document them, and review them regularly.

Mistake 4: Missing the forest for the trees. Focusing on AI-generated code working in isolation while missing system-wide implications. Solution: Always evaluate AI suggestions in the broader context of your entire system.

One developer in the original thread put it perfectly: "AI is giving us sharper chisels, but we still need to know how to carve." That's even truer in 2026 than it was in 2024.

The Future: Where Hickey's Critique Points Us

Looking ahead, Hickey's critique suggests a direction for AI tool development that we're only beginning to see in 2026. Instead of tools that generate code autonomously, we need tools that enhance human understanding and decision-making.

Imagine AI that helps you visualize data flow through your system. Or tools that identify design inconsistencies across your codebase. Or assistants that suggest simpler approaches based on your actual requirements rather than generating complex solutions.

Some of this is already emerging. I've been testing tools that use AI to generate architectural diagrams from code, or to suggest where your implementation diverges from common patterns. These are augmentations of human judgment, not replacements for it.

The most exciting development I've seen recently is AI that helps with what Hickey calls "problem sensing"—identifying where complexity is creeping into your design before it becomes entrenched. These tools don't write code for you. They help you see your own code more clearly.

This aligns perfectly with Hickey's philosophy: tools should make us better thinkers, not just faster typists. In 2026, that distinction makes all the difference.

Conclusion: Embracing AI Without Surrendering Craft

Rich Hickey's "Thanks AI!" critique wasn't a rejection of technology. It was a defense of craft. In 2026, with AI tools more powerful and pervasive than ever, that defense feels increasingly urgent.

The developers who thrive in this new landscape won't be those who reject AI tools. Nor will they be those who blindly accept every AI suggestion. They'll be the developers who use AI strategically while cultivating the human skills that AI can't replicate.

They'll understand that programming is fundamentally about communication—with computers, with other developers, and with future maintainers. AI can help with that communication, but it can't replace the human judgment behind it.

So use the tools. Generate code, explore alternatives, reduce boilerplate. But never outsource your understanding. Never stop asking "why." And never forget that the most valuable thing you bring to programming isn't your ability to type—it's your ability to think.

That was Hickey's real message. And in 2026, it's more important than ever.

Alex Thompson

Tech journalist with 10+ years covering cybersecurity and privacy tools.