The Problem With Asking AI to Think Strategically

Strategy is a conviction problem, not an information problem. Most AI only solves the second one.

THE FORK · 2026

We're asking AI to do the thing it's worst at: make bets. And we've dressed it up as strategy.

Here's what's happening. A leader sits down with Claude or ChatGPT and asks: "What should our brand positioning be?" The AI returns five positioning statements, each backed by market research, competitive analysis, and consumer sentiment. Clean. Comprehensive. Full of options.

The leader feels informed. They feel like they've thought strategically. But they haven't. What they've done is outsource the thing strategy actually requires, a choice, to a system built to hedge every bet.

STRATEGY IS NOT AN INFORMATION PROBLEM

Strategy has a reputation problem. Everyone thinks it's about having more data. Better frameworks. More time in the research phase. The mythology of strategy is that it lives in information density.

It doesn't. Strategy lives in conviction.

Information is cheap now. A decent analyst can pull market sizing, competitor positioning, and historical trend data in an afternoon. A good AI can do it in minutes. The thing information doesn't create is the decision to narrow, to sacrifice, to stake a claim on what matters more than something else.

That's where strategy lives. In the moment you say no to the third option because the first two matter more. In the moment you decide that speed beats polish. That loyalty beats acquisition. That 30% of customers served exceptionally beats 100% served adequately.

Information can support those choices. It can't make them.

WHAT CONSENSUS AI DOES TO STRATEGIC THINKING

Large language models are consensus engines. They're trained on the internet, which means they're trained on the aggregate of human opinion weighted by traffic, citations, and visibility. They synthesize the middle. They find the overlap. They return what most people would say, expressed clearly.

That's valuable for a lot of things. It's terrible for strategy.

A consensus engine will never tell you to do the weird thing that works for your specific situation. It will never tell you that conventional positioning doesn't apply to you. It will never make you uncomfortable with your own thinking.

Because the moment it deviates from consensus, it's on shakier epistemic ground. It has less training data. Less validation. Less confidence in its own output. So it hedges. It qualifies. It returns options and lets the human decide.

Strategy isn't about having more information. It's about being willing to act on less information than your competitors, because you've made a stronger bet on what matters.

This is the fundamental mismatch. AI is built to reduce uncertainty by expanding options. Strategy requires embracing uncertainty by narrowing options. Every time AI hands you five positioning statements instead of one, it's moved you further from strategic thinking, not closer.

THE OPTIONS TRAP

More options feel good. They feel like due diligence. They feel like you're exploring the space thoroughly. But options are a strategic tax.

The more options you have, the longer you deliberate. The longer you deliberate, the more you waffle. The more you waffle, the more your strategy becomes a compromise of all five options instead of a commitment to one. You end up positioned in the middle of your own five statements: the one place where no one has conviction and where everyone else is fighting too.

Good strategy is a bet against the middle. It's the moment a company looks at what everyone else is doing and says: we're going the other direction. Not recklessly. Not without reason. But decisively.

Most AI will never suggest that move, because that move doesn't have the highest search volume or citation count. That move is unpopular with the consensus by definition, right up until it works.

WHAT IT ACTUALLY TAKES

Strategic thinking requires three things that AI is not equipped to provide.

First: conviction. A real belief about what matters in your market and why. Not a synthesized view of what everyone says matters. Your view. Your interpretation of what you know about your customers, your competitors, and yourself.

Second: sacrifice. A willingness to say what you're not doing, who you're not serving, which opportunities you're walking away from. AI optimizes for coverage and completeness. Strategy optimizes for focus and exclusion. Those are opposite directions.

Third: nerve. The comfort with being wrong, or at least being wrong alone. Most people don't like being alone in their opinions. AI lets you stay in consensus by offering you the comfort of five options. Real strategy means you made a choice and you own it, whether the market proves you right in six months or six years.

These aren't things you can get from a system trained on what most people think. They come from you. From your judgment. From the thinking you do before you ask the tool, and the thinking you do after you get the answer.

AI is useful for what comes before strategy and what comes after. It's excellent at research and synthesis. It's competent at implementation and execution. But in the middle, in that narrowing moment where you commit to a position, you're on your own.

That's the actual work of strategy. That's where value gets created. And that's the part no AI is going to do for you.
