
Why Outsourcing Leadership Thinking to AI Is Risky

  • Writer: Michael Rickwood
  • 6 days ago
  • 4 min read

By 2026, the real leadership risk won’t be using AI; nobody's stopping that train.

The real risk will be outsourcing thinking to it.


I say this as someone who actively works with AI every day. I love working with large language models. The strategies evolve constantly. It feels almost like a gold rush in innovation, learning, and entrepreneurial development.


For months now, I’ve used them as brainstorming and sparring partners to:


Explore ideas.

Stress-test strategies.

Remove tedious work.

Accelerate structure and organisation.

Learn new things.


Used well, it’s extraordinary, but there’s a quiet temptation that comes with working alongside something so fluent, so fast, and so confident.


Letting it decide for you. And some of these models (ChatGPT in particular) seem almost designed to take on that task.


This often shows up in moments of pressure, usually when self-doubt kicks in.


A difficult announcement.

A strategic fork in the road.

A message that needs to land cleanly and reassure people.


AI produces something that looks clear.

Polished.

Convincing.


And that’s where the risk begins, because AI does not hold full context, it doesn’t feel consequences, and it certainly doesn’t own outcomes.


When leaders allow AI-generated reasoning to stand in for their own judgment, something subtle happens. The message may sound good, but it’s no longer fully owned, and people know this instinctively.


There’s a reason credibility erodes when leaders can’t explain their own decisions without slides, scripts, or tools.


Clarity without ownership will lose you trust.


A useful illustration comes from a recent experiment in a US university, where an AI system was given control over vending machine operations. It optimised pricing and stocking decisions logically, but within weeks the operation was bankrupt. Not because the AI was “bad”, but because it optimised locally, without understanding the full system it sat inside.


Leadership works the same way.


AI is excellent at inquiry.

At surfacing options.

At revealing blind spots.


But judgment still requires pause, reflection, and responsibility.


This matters not just for organisations, but for individuals and solo entrepreneurs too.

Founders. Executives. Young leaders.


And let’s be honest: if you outsource strategic thinking entirely, you’re handing your decision-making to something that doesn’t carry the weight of the outcome. You can try to blame the LLM for sinking your project, business, or company, but it won’t get you anywhere.


The only guilty party is the one who abdicated their decision-making to an artificial intelligence that works by drawing words from an impossibly large language pool and matching them to requests through deep mathematics. These models are beautifully and lovingly built. They are designed to learn and grow, but they cannot, at this stage, lead you into battle.


So what kind of leader should we aspire to be? One who avoids AI altogether? Absolutely not.


The leaders who will endure aren’t the ones who avoid AI. They’re the ones who use it rigorously, then step back and ask:

“Can I get up and talk about this as my own?”

“Do I actually believe this?”

“Can I stand behind this when things get difficult?”

“Would I still say this without the tool?”


If anything, I’m finding AI sharpens my thinking.

As long as I resist leaning on it too much in the tough moments that push me towards quick or rushed decisions, or that let fear and laziness creep in.


I think we are treading a path of experimentation, and these lessons will reveal themselves more and more. Used well, AI accelerates clarity.

Used carelessly, it leaks credibility.


And in leadership, credibility is everything.

