I was reviewing some output with MODEL 5 earlier today. It offered four options to consider. One of them was strong — credentialed, specific, well-reasoned. But something about it didn’t sit right.
I didn’t just say “no.” I explained why: the framing inadvertently created a size filter that would exclude mid-market readers. And the phrasing positioned me as the student receiving instruction rather than a peer engaging in dialogue. I mentioned it reminded me of Grasshopper from the old TV series Kung Fu — not the positioning I was going for.
MODEL 5 didn’t need the reference explained. It got it immediately. More importantly, it understood the mechanics of why the option didn’t work — and we arrived at a better answer together.
Then MODEL 5 said something that stopped me:
“Most people don’t work this way with AI because they don’t work this way with anyone.”
That landed.
We’ve spent the last two years talking about AI readiness in terms of data quality, governance, infrastructure, integration, change management. All valid. All necessary. But underneath all of it is something nobody wants to say out loud:
If you can’t collaborate with humans, you won’t collaborate with AI.
The technology doesn’t fix the human. It reveals the human.
Poor communicators will blame the AI for not understanding them — the same way they blame colleagues for not reading their minds. People who give instructions instead of reasoning will get compliance instead of insight. Those who reject without explaining why will iterate endlessly and never understand what’s going wrong.
And here’s the uncomfortable part: organizations are full of people who’ve risen to senior positions despite lacking what I’d call collaborative fluency — because hierarchy let them get away with it. The title conferred authority. The org chart created compliance. Nobody pushed back, so the skill never developed.
AI doesn’t care about your title. It responds to clarity, reasoning, and genuine engagement. In that sense, it’s the great equalizer.
I’ve spent 27 years studying why technology transformations fail. The pattern is consistent: we blame the software, the vendor, the data, the timing, the budget. Rarely do we look at how the humans in the room actually communicate with each other.
But that’s the foundation beneath the foundation. Governance requires it. Process alignment requires it. Change management is entirely dependent on it.
My thesis has always been that technology amplifies whatever foundation exists. It doesn’t create one.
This just confirms what I’ve known all along: the foundation isn’t only governance, process, or data.
It’s how we talk to each other.
And no — there’s no technology coming that fixes that. Not generative AI. Not agentic AI. Not AGI. That’s our work to do.
The good news: collaborative fluency is a skill. It can be learned, practiced, developed.
The uncomfortable news: no technology learns it for you.
THE REAL TECH STACK
#AI #collaboration #leadership #digitaltransformation #procurement #communication #futureofwork
-30-

AI fïed
December 3, 2025
This hits the real blind spot. Most leaders think their AI problem is technical, but it’s actually conversational. AI exposes the communication gaps we’ve ignored for years, not the infrastructure ones. Loved this framing.
piblogger
December 3, 2025
Exactly — conversational readiness is the invisible layer no one budgets for.
AI is brutally honest: if teams can’t align on meaning, intent, or sequence, the model can’t perform.
That’s why the Hansen Fit Score begins with collaborative fluency before data, tools, or architecture.
If humans can’t synchronize, no system ever will.