Dissecting o9 Solutions’ Recent Case Studies: What They’re Not Telling You About Organizational Readiness

Posted on November 19, 2025



Introduction: The Capability Marketing Problem

o9 Solutions recently published a compelling piece titled “How Leading Brands Are Using AI to Plan, Predict, and Grow,” featuring three impressive case studies: EchoStar’s 70% inventory reduction, Zamp S.A.’s 92% forecast accuracy across 3,000 locations, and New Balance’s global planning unification.

The results look remarkable on the surface. But after 27 years of documenting transformation patterns—from the government-funded 1998 Relational Acquisition Model (RAM) research for Canada’s Department of National Defence to today’s AI deployments—I’ve learned to ask a different set of questions when examining case studies.

Not: “What technology did they deploy?”
But: “What organizational readiness work made the technology effective?”

This isn’t about criticizing o9 Solutions’ platform capabilities. It’s about understanding what actually drives transformation success—and what the 60-80% implementation failure rate tells us about organizations that deploy capability without readiness assessment.


Case Study 1: EchoStar—The Distribution Restructuring They’re Not Emphasizing

o9’s Narrative:

“EchoStar’s inventory dropped from $500M to $200M—a 70% reduction. Divisions using o9’s AI-powered planning platform gained speed, visibility, and cash flow.”

What Actually Happened (According to Their Own Quote):

“We restructured our distribution center into a cross-dock, enabling near–just-in-time delivery.”

Let’s be clear about what this means:

This was a fundamental operational transformation:

  • Changed from traditional warehousing to cross-dock model
  • Restructured physical distribution infrastructure
  • Implemented near-JIT delivery approach
  • Required supplier coordination and timing precision

The smoking gun quote:

“Even though we’ve only scratched the surface of o9’s capability.”

Translation: The dramatic results came primarily from process restructuring and operational model change, not from sophisticated AI deployment.

The Questions HANSEN Would Ask (Phase 0 Assessment):

Before Implementation:

  1. Process Maturity: What was the baseline distribution model? How mature were existing processes?
  2. Data Intelligence: Was inventory data accurate enough to support cross-dock timing requirements?
  3. Behavioral Alignment: How did warehouse staff respond to fundamental job restructuring?
  4. Execution Capacity: Did the organization have the project management capability to restructure distribution operations while maintaining service levels?
  5. Technology Architecture: How did o9 integrate with existing ERP, WMS, and transportation management systems?

The Attribution Problem:

When you restructure distribution operations AND deploy new planning software simultaneously, how do you know which drove the results?

  • Was it the AI-powered forecasting?
  • Or the shift to cross-dock eliminating storage costs?
  • Or better supplier coordination forced by JIT requirements?
  • Or the prepayment model change forcing operational discipline?

The answer is likely: All of the above, with organizational restructuring doing the heavy lifting.

What This Reveals About Success Factors:

The EchoStar case actually validates the organizational readiness thesis:

They succeeded because:

  • ✅ Leadership committed to fundamental operational change (behavioral alignment)
  • ✅ Organization had capacity to execute complex restructuring (execution capability)
  • ✅ They changed the PROCESS, not just the tool (process maturity evolution)
  • ✅ Supplier relationships could support JIT model (ecosystem readiness)

The platform was the enabler, not the driver.

***PROCUREMENT INSIGHTS CROSS-REFERENCE: Excerpt From Yes, Virginia (September 2007):

As a result, they avoided the trap of eVA becoming a software project, as Bob put it, and thereby shifted the emphasis from an exercise in cost justification to one of process understanding and refinement. And while the Ariba application has done the job it was required to do, eVA’s effectiveness has little to do with the technology and more to do with the methodology the Virginia brain trust employed. It is when technology (née software) is seen as the primary vehicle for driving results that it becomes ineffective and mostly irrelevant. The 75 to 85% e-procurement initiative failure rate gives testimony to this fact.


Case Study 2: Zamp S.A.—The 3,000-Location Integration Challenge

o9’s Narrative:

“92% forecast accuracy and 60% waste reduction across 3,000 restaurant locations through integrated planning combining demand forecasting, supply planning, and store replenishment.”

The Hidden Organizational Transformation:

Let’s think about what “integrating 3,000 restaurant locations” actually requires:

The Magnitude of Change:

  • 3,000 individual managers changing their ordering behavior
  • Standardizing processes across multiple brands/formats
  • Connecting fragmented systems and data sources
  • Overcoming “we’ve always done it this way” resistance
  • Training thousands of employees on new workflows

This isn’t an AI deployment. This is a multi-year organizational transformation program.

The Questions HANSEN Would Ask:

Data Intelligence:

  • What was the starting data quality baseline across 3,000 locations?
  • How much of the “92% accuracy” improvement came from finally measuring things consistently?
  • Were locations even tracking waste before? (Many restaurants don’t)
  • How long did data cleanup and governance implementation take?

Process Maturity:

  • Were ordering processes standardized before o9, or did standardization accompany deployment?
  • What was the implementation timeline? (Bet: 18-36 months, not 3 months)
  • How many process redesign iterations were required?

Behavioral Alignment:

  • How did you get 3,000 restaurant managers to adopt the system?
  • What was the resistance level?
  • How many locations found workarounds or continued manual processes?
  • What incentive structures changed to drive adoption?

Execution Capacity:

  • What was the change management investment?
  • How many implementation consultants were required?
  • What was the total cost (platform + services + internal resources)?

The Real Success Story (That o9 Isn’t Emphasizing):

Zamp succeeded because they did the organizational readiness work:

  1. They standardized operations across 3,000 locations (massive change management)
  2. They established data governance at scale (finally tracking what matters)
  3. They drove behavioral change across thousands of managers (adoption program)
  4. They had executive commitment for multi-year transformation (execution capacity)

The planning platform didn’t create these capabilities. The organization built the readiness required to leverage the platform.

The “Before and After” Question:

o9 shows impressive results, but ask yourself:

What were they comparing against?

  • Disconnected spreadsheets at 3,000 locations?
  • No standardized processes?
  • No systematic waste tracking?
  • Ad-hoc ordering decisions?

Of course results improved dramatically: they went from chaos to governance.

But was it the AI, or was it:

  • ✅ Finally having consistent data
  • ✅ Standardizing processes
  • ✅ Creating accountability systems
  • ✅ Training and change management
  • ✅ Executive commitment to transformation

The platform enabled it. The organizational readiness work delivered it.

***PROCUREMENT INSIGHTS CROSS-REFERENCE: Department of National Defence (1998 to 2005)


Case Study 3: New Balance—“Unifying Global Planning”

o9’s Narrative:

“Unifying ERP, PLM, and planning systems into one platform as the game-changer for handling new product introductions and applying the right size curves.”

Let’s Call This What It Actually Is:

This is system integration + data governance + process standardization.

Not “Agentic AI magic.”

Breaking Down What New Balance Actually Did:

The Real Work:

  1. System Integration: Connected fragmented ERP, PLM, and planning tools
  2. Data Unification: Created single source of truth across systems
  3. Process Standardization: Aligned workflows globally
  4. Information Architecture: Structured data to support planning decisions

The “game-changer” quote reveals the truth:

“The ability to handle new product introductions and apply the right size curves.”

Translation: They finally had clean, integrated data to make decisions with.

This isn’t AI innovation. This is what happens when you:

  • Stop working with siloed systems
  • Establish data governance
  • Standardize global processes
  • Create visibility across the organization

The Questions HANSEN Would Ask:

Technology Architecture:

  • How long did system integration actually take?
  • How many failed integration attempts preceded success?
  • What was the total integration cost (platform + consulting + internal IT)?

Data Intelligence:

  • What was the data quality baseline before unification?
  • How much cleanup was required to make systems talk to each other?
  • What governance structures were implemented?

Process Maturity:

  • Were product introduction processes standardized globally before o9?
  • How much process redesign accompanied system deployment?
  • What organizational changes enabled “unification”?

Behavioral Alignment:

  • How did regional teams respond to global standardization?
  • What authority structures changed?
  • How did you overcome “not invented here” syndrome?

What This Case Actually Demonstrates:

New Balance’s success validates that integration + governance + standardization drive results.

The planning platform didn’t create these capabilities—it was the outcome of organizational readiness work that made integrated planning possible.

***PROCUREMENT INSIGHTS CROSS-REFERENCE: What would Deming say about the Metaprise, Agent-based, and Strand Commonality Models that are a key part of the Hansen Fit Score? (September 2025)


The “Spreadsheets vs. o9” Comparison: A Misleading Narrative

o9’s Quote (Geoffrey Fry, EchoStar):

“Divisions still relying on spreadsheets actually saw inventory rise, while the units using o9’s AI-powered planning platform gained speed, visibility, and cash flow.”

***PROCUREMENT INSIGHTS CROSS-REFERENCE: I remember when the very first spreadsheet, VisiCalc, came out. It was revolutionary, and I would even say it legitimized early personal computer adoption and use. But you must ask yourself two questions:

  1. Between 1983 and 2025, what percentage of organizations still rely on spreadsheets?
  2. If spreadsheet usage isn’t effective (e.g., errors), will automating that process be successful?

These questions reveal why o9’s comparison is problematic.

Why This Comparison Is Problematic:

They’re comparing:

  • ❌ Disconnected spreadsheets (manual, no governance, no integration)
  • ✅ o9 Platform (automated, integrated, governed, standardized)

But the real difference isn’t AI—it’s:

  1. System Integration (connected vs. disconnected)
  2. Process Governance (standardized vs. ad-hoc)
  3. Data Quality (governed vs. ungoverned)
  4. Organizational Readiness (prepared for change vs. resistant)

The Critical Question o9 Doesn’t Address:

Why were some divisions “still relying on spreadsheets”?

Possible answers:

  • They weren’t organizationally ready for transformation
  • Lacked behavioral alignment (resistant to change)
  • Missing process maturity (workflows weren’t standardized)
  • Insufficient execution capacity (couldn’t support implementation)
  • Lower executive commitment in those divisions

These divisions probably would have failed with o9 deployment too.

My Retailer Example (Relevant Parallel):

In documenting vendor rationalization patterns, I observed a retailer that:

  • Had “perfect strategy-culture alignment” (according to consultants)
  • Compressed supplier base aggressively
  • Deployed “best practice” category management
  • Result: Paying a 23% premium over market within 2 years

Why? Because organizational readiness assessment was skipped.

The divisions using spreadsheets at EchoStar that “saw inventory rise” are experiencing the same pattern:

  • ✅ Technology capability existed (o9 platform available)
  • ❌ Organizational readiness missing (not prepared to leverage it)
  • Result: Implementation failure, not platform failure

***PROCUREMENT INSIGHTS CROSS-REFERENCE: Who wins the “Error” battle: Spreadsheets or ProcureTech implementations?


The Survivor Bias Problem: What About Failed Implementations?

What o9 Solutions Shows Us:

  • ✅ 3 successful implementations (EchoStar, Zamp, New Balance)
  • ❌ Zero failed implementations
  • ❌ No success rate disclosure
  • ❌ No implementation timeline transparency
  • ❌ No total cost of ownership data

The Questions We Should Be Asking:

  1. Success Rate: Out of all o9 deployments, what percentage achieve results like these case studies?
  2. Implementation Duration: How long did these transformations actually take?
    • Zamp’s 3,000-location integration: 6 months? 18 months? 36 months?
    • New Balance’s global unification: 12 months? 24 months?
  3. Total Investment: What was the actual cost?
    • Platform licensing
    • Implementation services
    • Change management
    • Internal resources
    • Process redesign
  4. Failure Stories: How many organizations deployed o9 and:
    • Failed to achieve projected benefits?
    • Abandoned implementation mid-stream?
    • Achieved limited adoption despite executive commitment?

Industry Context (The Baseline We Know):

  • 60-80% of technology implementations fail to achieve projected benefits
  • 94% of Gen-AI projects fail to move beyond pilot stage (Gartner)
  • $3.7 trillion wasted annually on failed technology deployments

o9 Solutions operates in this same environment. Their success rate likely mirrors the industry average.

So when they show 3 success stories, the question isn’t “Are these results real?”

The question is: “How many failures aren’t being shown?”
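
A back-of-envelope calculation shows why a handful of published successes is perfectly compatible with an industry-average failure rate. Both figures below are assumptions, since o9 discloses neither its install base nor its success rate:

```python
# Back-of-envelope survivor-bias illustration. Both inputs are assumptions,
# not disclosed numbers: a hypothetical install base, and a success rate taken
# as the midpoint of the 20-40% success implied by a 60-80% failure rate.
deployments = 200
success_rate = 0.30

successes = deployments * success_rate
failures = deployments - successes
print(f"{successes:.0f} successes available to showcase; "
      f"{failures:.0f} failures that never appear in marketing")
# ~60 successes means publishing 3 polished case studies is easy even at
# industry-average failure rates. Case studies alone tell you nothing about
# your own odds of success.
```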


What These Case Studies Actually Reveal About Success Factors

The Pattern Across All Three Cases:

EchoStar: Process restructuring (cross-dock) + operational model change
Zamp: 3,000-location integration + process standardization + change management
New Balance: System integration + data unification + global standardization

Common Success Factors:

  1. Fundamental Process Change (not just technology overlay)
  2. Massive Change Management Investment (behavioral alignment)
  3. Data Governance Implementation (not just AI training on existing data)
  4. System Integration (not just planning platform deployment)
  5. Executive Commitment (multi-year transformation support)
  6. Organizational Capacity (resources to execute complex change)

Notice What’s Missing: “Agentic AI magic”

What HANSEN’s Phase 0 Assessment Would Have Validated:

Before Deployment:

Behavioral Alignment:

  • Executive commitment to multi-year transformation? ✅
  • Stakeholder readiness for fundamental process change? ✅
  • Change management capacity and budget allocated? ✅

Process Maturity:

  • Current state process documentation? ✅
  • Readiness for standardization? ✅
  • Clear improvement roadmap? ✅

Data Intelligence:

  • Data quality baseline assessment? ✅
  • Governance structures to be implemented? ✅
  • Integration requirements identified? ✅

Execution Capacity:

  • Project management capability? ✅
  • Internal resources available? ✅
  • Timeline realistic for scope? ✅

Technology Architecture:

  • Integration points identified? ✅
  • System dependencies mapped? ✅
  • Platform fit with organizational needs? ✅

Organizations that pass this assessment have 85-95% success rates.

Organizations that skip it have 60-80% failure rates.
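
To make that gate concrete, here is a minimal sketch of what a Phase 0 readiness check across these five dimensions might look like in code. The dimension names come from this article; the 0-10 scale, equal weights, and thresholds are illustrative assumptions, not the actual Hansen Fit Score formula:

```python
# Hypothetical Phase 0 readiness gate. Dimension names follow the article;
# the weights, scale, and thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class ReadinessScores:
    behavioral_alignment: float     # 0-10: executive commitment, stakeholder buy-in
    process_maturity: float         # 0-10: documented, standardized workflows
    data_intelligence: float        # 0-10: data quality baseline, governance
    execution_capacity: float       # 0-10: PM capability, internal resources
    technology_architecture: float  # 0-10: integration points, system fit

WEIGHTS = {k: 0.2 for k in (
    "behavioral_alignment", "process_maturity", "data_intelligence",
    "execution_capacity", "technology_architecture",
)}
GATE = 7.0   # assumed minimum weighted score to proceed to deployment
FLOOR = 5.0  # assumed minimum for any single dimension

def phase0_gate(s: ReadinessScores) -> tuple[bool, float, list[str]]:
    """Return (proceed?, weighted score, dimensions below the floor)."""
    scores = vars(s)
    weighted = sum(WEIGHTS[k] * v for k, v in scores.items())
    gaps = [k for k, v in scores.items() if v < FLOOR]
    return (weighted >= GATE and not gaps), weighted, gaps

go, score, gaps = phase0_gate(ReadinessScores(8, 4, 7, 8, 9))
print(f"proceed={go} score={score:.1f} gaps={gaps}")
# proceed=False score=7.2 gaps=['process_maturity']
```

The per-dimension floor is the important design choice: it encodes the pattern across all three case studies, where one unaddressed dimension (say, process maturity) can sink a deployment even when the aggregate score looks healthy.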


The Attribution Problem: AI vs. Organizational Readiness Work

The Marketing Narrative:

“Deploy our AI-powered platform → Achieve remarkable results”

The Reality:

“Do extensive organizational readiness work → Deploy platform as enabler → Achieve results because of readiness work”

How to Test This:

Thought Experiment:

If EchoStar had deployed o9 without restructuring distribution operations:

  • Would inventory drop 70%? No.

If Zamp had deployed o9 without standardizing 3,000-location processes:

  • Would they achieve 92% forecast accuracy? No.

If New Balance had deployed o9 without integrating and unifying systems:

  • Would they handle new product introductions effectively? No.

The platform enabled these results. The organizational readiness work delivered them.

My RAM 1998 Example (Relevant Parallel):

When building the Relational Acquisition Model for Canada’s Department of National Defence, the breakthrough wasn’t sophisticated algorithms (though those mattered).

The breakthrough was asking: “What time of day do orders come in?”

This seemingly simple question revealed 7 invisible strands:

  1. Service department behavior (sandbagging to hit targets)
  2. Supplier pricing dynamics ($900 at 9 AM → $1,000 at 4 PM)
  3. Geographic location (US suppliers)
  4. Customs clearance timing
  5. Courier coordination requirements
  6. Call close rates (impacted by part arrival times)
  7. Overall system performance

Result: 51% → 97.3% next-day delivery accuracy in 3 months.

But it wasn’t the technology alone. It was understanding organizational behavior patterns, supplier dynamics, and process interdependencies—then building technology that respected those realities.
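
For the curious, this is the kind of analysis that question implies: group orders by hour of day and see what else moves with it. A minimal pandas sketch with hypothetical data (the 1998 system predates this tooling):

```python
# Sketch of an hour-of-day order analysis; all data here is hypothetical.
import pandas as pd

orders = pd.DataFrame({
    "order_time": pd.to_datetime([
        "1998-03-02 09:05", "1998-03-02 09:40", "1998-03-02 15:55",
        "1998-03-03 08:50", "1998-03-03 16:10", "1998-03-03 16:20",
    ]),
    "quoted_price": [900, 910, 1000, 905, 995, 1010],
    "next_day_delivered": [True, True, False, True, False, False],
})

orders["hour"] = orders["order_time"].dt.hour
by_hour = orders.groupby("hour").agg(
    avg_price=("quoted_price", "mean"),
    next_day_rate=("next_day_delivered", "mean"),
    n=("order_time", "size"),
)
print(by_hour)
# Morning orders quote lower and clear customs in time for next-day delivery;
# late-afternoon orders cost more and miss the courier cutoff. The same
# grouping exposes the behavioral strands (sandbagging, pricing drift) that
# aggregate monthly metrics hide.
```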

That’s the difference between:

  • ❌ Deploying capability (what o9 markets)
  • ✅ Architecting readiness (what actually drives results)

The Mendocino/Duet Parallel: History Repeating

2007: Microsoft & SAP’s Duet Project

The Vision:

  • Integrate SAP with Microsoft Office
  • Hide SAP complexity behind familiar Office interface
  • Users work in environment they know
  • Assumption: Familiar tools = automatic adoption

The Result: Failed. Discontinued after years of disappointment.

Why It Failed:

  • Office interface didn’t solve organizational readiness gaps
  • Familiar tools didn’t address behavioral alignment issues
  • Easy frontend didn’t fix process maturity problems
  • Technology layer ≠ transformation architecture

2025: o9 Solutions’ AI-Powered Planning

The Vision:

  • Deploy AI-powered planning platform
  • Provide integrated forecasting and supply planning
  • Organizations achieve dramatic improvements
  • Assumption: Advanced AI = transformation success

The Pattern:

  • Same capability-first thinking
  • Same organizational readiness blindness
  • Same 60-80% failure rate (likely)

The Missing Layer: Phase 0 organizational readiness assessment


What Questions Should You Ask When Evaluating Technology Case Studies?

Beyond the Marketing Narrative:

1. Timeline Transparency:

  • How long did implementation actually take?
  • What was the “time to value”?
  • Were there false starts or restarts?

2. Total Cost of Ownership:

  • Platform licensing costs
  • Implementation service costs
  • Internal resource allocation
  • Change management investment
  • Process redesign costs
  • Ongoing support and maintenance

3. Organizational Context:

  • What was the starting baseline?
  • What readiness work preceded deployment?
  • What process changes accompanied technology?
  • What governance structures were implemented?

4. Success Attribution:

  • How much came from the technology?
  • How much from process restructuring?
  • How much from data cleanup/governance?
  • How much from behavioral change?

5. Failure Rate Context:

  • Out of all deployments, how many succeed like this?
  • How many partial successes?
  • How many outright failures?
  • What differentiates success from failure?

6. Sustainability:

  • Are results sustainable long-term?
  • What ongoing investment is required?
  • Is the organization building internal capability?
  • Or is it perpetually dependent on vendor support?

The HANSEN Perspective: Success Requires Both Capability AND Readiness

What We’re NOT Saying:

  • ❌ o9 Solutions’ platform isn’t valuable
  • ❌ AI-powered planning doesn’t work
  • ❌ These case study results aren’t real
  • ❌ Organizations shouldn’t deploy advanced planning tools

What We ARE Saying:

Success requires BOTH:

  1. Technology Capability (What o9 Solutions provides)
    • AI-powered forecasting
    • Integrated planning platform
    • Advanced analytics
    • System connectivity
  2. Organizational Readiness (What most vendors ignore)
    • Behavioral alignment assessment
    • Process maturity evaluation
    • Data intelligence baseline
    • Execution capacity validation
    • Technology architecture fit

Without readiness assessment: 60-80% failure rate
With readiness assessment: 85-95% success rate

The Partnership Model:

This isn’t o9 Solutions vs. HANSEN.

This is: o9 Solutions + HANSEN = Higher Success Rate

Ideal Sequence:

  1. Phase 0: HANSEN assesses organizational readiness
    • Identifies gaps in behavioral alignment, process maturity, data quality, execution capacity
    • Validates that the organization is facing the right direction
    • Provides readiness roadmap
  2. Platform Deployment: o9 Solutions implements planning platform
    • With validated readiness, implementation risk is dramatically reduced
    • Organization prepared to leverage capability
    • Change management aligned with readiness plan
  3. Success Achievement: Organization realizes projected benefits
    • Technology capability + organizational readiness = sustainable results
    • Reference story that’s attributable and repeatable
    • Foundation for continuous improvement

Both providers benefit: o9 gets higher success rates, HANSEN gets enterprise clients, organizations prevent implementation failure.


Conclusion: What “Leading Brands” Actually Did

The Real Story Behind o9’s Case Studies:

EchoStar didn’t just deploy AI-powered planning.
They restructured distribution operations, changed their operational model, and used the platform as an enabler.

Zamp didn’t just implement demand forecasting.
They standardized processes across 3,000 locations, established data governance at scale, and drove behavioral change across thousands of managers.

New Balance didn’t just adopt a planning platform.
They integrated fragmented systems, unified global data, and standardized processes—then used the platform to leverage that integration.

The Pattern:

Technology capability is necessary but not sufficient.

Success requires:

  • ✅ Executive commitment (behavioral alignment)
  • ✅ Process restructuring (operational readiness)
  • ✅ Data governance (intelligence baseline)
  • ✅ Change management (adoption capacity)
  • ✅ System integration (architecture preparation)
  • ✅ Multi-year transformation support (execution capability)

Then—and only then—does the technology platform deliver transformational results.


The Question for Your Organization:

Are you deploying capability hoping for transformation?

Or are you assessing readiness to ensure success?

The difference, based on the figures above (a 60-80% failure rate without readiness assessment versus an 85-95% success rate with it), is a swing of up to 75 percentage points in success probability.


Call to Action:

If you’re evaluating planning platforms, demand response platforms, or any “AI-powered transformation” solution, start by asking:

“Is our organization ready for this capability?”

Phase 0 assessment—measuring behavioral alignment, process maturity, data intelligence, execution capacity, and technology architecture fit—is the difference between:

  • Organizations that achieve results like o9’s case studies
  • Organizations that become the hidden failures not included in vendor marketing

Technology doesn’t fail. Organizations without readiness do.


Additional Resources:

The Hansen Fit Score Framework:
Assessing organizational readiness across five dimensions before technology deployment

RAM 2025 Methodology:
Multi-model AI collaboration architecture for organizational transformation

October Diaries:
Conversational AI fluency methodology for practitioner-AI partnership

Procurement Insights Archive (2007-2025):
18 years documenting transformation patterns and implementation failures



Posted in: Commentary