If Eric is explaining why so many ERP programs still fail inside the project, the Compounding Technology Shadow Wave™ work explains why they were set up to fail long before the project ever mobilized. Two analyses, different vantage points, the same answer.
Eric Kimberling published a piece on Sunday titled “The Hidden Cost of Agile in ERP Implementations” that should be read carefully by anyone responsible for enterprise technology decisions. Eric brings something the procurement-technology discourse has not previously had at this scale — an expert-witness practice that has examined ERP implementations that failed badly enough to end up in court, with legal-grade documentation of what actually went wrong rather than statistical patterns about how often things go wrong.
The argument is that agile, when applied indiscriminately to enterprise resource planning programs, is making implementation outcomes worse rather than better. The mechanism Eric identifies is precise: agile gives cover to skip the structured front-end work that ERP demands, produces undocumented design decisions and squishy requirements, paves the existing cow paths instead of redesigning them, and fragments the end-to-end coherence that ERP value depends on. The result is what he describes as a generation of ERP programs that move quickly in the early stages, accumulate enormous technical and organizational debt, and then stall when the business finally has to operate on the new platform.
In the LinkedIn dialogue following Eric’s piece, the convergence between his expert-witness diagnosis and the framework’s structural analysis surfaced explicitly when Eric asked whether agile accelerates the shadow technology gap. The framework’s answer — yes, when agile is used as a substitute for structured discovery rather than as execution after the operating backbone is understood — is what this post extends and develops. Agile does not eliminate shadow behavior. It can unintentionally legitimize it. That formulation is the load-bearing observation underneath what follows.
The framework’s structural diagnosis of this pattern is not new. The Procurement Insights archive has been documenting the substrate-versus-methodology question continuously since 2008, with the agent-based versus equation-based modeling distinction from that period directly anticipating the failure mode Eric documents in his expert-witness practice. The Compounding Technology Shadow Wave™ vocabulary that surfaced in May 2026 formalizes analytical positions the archive has documented for eighteen years.
The mechanism Eric is naming
Walk through what Eric’s piece actually identifies, because each observation maps cleanly onto a structural element the framework has been developing.
ERP systems were conceived and built to manage end-to-end business processes, with value coming from coherence across the enterprise rather than from any individual module. Agile, by contrast, is optimized to deliver small slices of functionality as quickly as possible. When the agile mindset is applied to ERP, teams begin carving the program into independent sprints that deliver narrow capabilities without first proving how those capabilities will fit together. The system breaks, not because any individual workstream was bad, but because each was deployed in isolation from the broader process it was supposed to enable.
That is the fragmentation mechanism. Local delivery speed produces incoherence at the enterprise level. Eric is observing the divergence at the single-program scale; the framework is observing the same divergence across the four-wave technology compounding pattern. The mechanism is the same. The scale is different.
Eric writes that agile makes things squishy: requirements are loosely tracked, design decisions are made in flight, and the artifacts that should serve as the project’s institutional memory are either thin or entirely absent. The configuration choices end up locked inside the head of a consultant who rolls off the project, and the organization inherits a system it does not fully understand and cannot confidently evolve.
That is provenance breakage. When AI outputs need to be defended to auditors, regulators, or the board, they cannot be defended if their inputs traversed unprovenanced shadow infrastructure. The chain breaks at whichever upstream wave’s shadow the inputs traversed. Eric is observing the same breakage at the ERP-project scale; the framework names it as a cross-wave structural condition. The fingerprint is identical.
Eric writes that agile, in practice, takes the pendulum and swings it too far the other way, toward being completely unstructured, valuing documentation a lot less, and valuing speed over all else. He describes how this becomes part of the disease it claimed to cure. The minimum-viability mindset gives cover to project teams, vendors, and system integrators to skip the rigorous planning that ERP demands. The unspoken logic is that if it does not work, the team can always pivot later.
This is where the agile-as-legitimization mechanism becomes visible. Local workarounds, spreadsheet logic, SaaS exceptions, informal approval paths, and undocumented process variants get captured sprint by sprint and hardened into the new platform before anyone has validated whether they belong in the future-state model. The agile cycle becomes a vehicle for institutionalizing the very shadows the transformation was supposed to resolve. Speed without substrate validation does not surface the inheritance — it operationalizes it. As the framework named the dynamic in March 2026: “AI doesn’t break broken processes. It perfects them.” That formulation operates in the same conceptual territory Eric’s expert-witness practice documents at the ERP-program scale.
The closest analytical alignment is in Eric’s paving the cow paths observation. He writes that agile reinforces existing local practices rather than challenging them — the new system is configured to mirror how employees already work, the new processes look suspiciously like the old processes, and the productivity and insight gains that justified the investment never quite materialize. The organization ends up with a more modern technology platform running essentially the same business it was running before.
That is, structurally, substrate inheritance (today’s programs landing on yesterday’s unresolved decisions) rendered as ERP-program-scale phenomenology. Each new platform lands on top of unresolved spreadsheet, BYOD, and SaaS-era operating decisions that were never designed to support what is now being deployed on top of them. Eric calls this paving the cow paths. The framework calls it substrate inheritance. Same phenomenon, two different names.
The litigation lens and the framework’s adjacent operational engagement
Eric’s piece introduces a vantage point the framework’s archive has been operating adjacent to for over a decade. The clearest documentation is the November 2015 Blog Talk Radio segment with Kelly Barner on the Oregon Supreme Court ruling that denied Oracle’s attempt to dismiss its top brass as defendants in the Cover Oregon Health Exchange lawsuit. That coverage demonstrates that the framework engaged the executive-liability dimension of ERP implementation failure, in alignment with the litigation lens Eric raises in his current piece.
The framework’s operational diagnostic posture is documented separately. In 2007, the framework was engaged by Multnomah County in Oregon to produce a confidential evaluation of SAP’s public-sector deployment record before the County committed to a similar engagement. The diagnostic work surfaced that SAP’s intended reference account (King County) had abandoned the platform after two years and considerable cost. That finding led Multnomah County to decline the SAP engagement. The work is operationally distinct from the Cover Oregon/Oracle situation but illustrates the same diagnostic posture: pre-commitment surfacing of operational reality rather than reliance on vendor-controlled reference cases. That posture is what Phase 0™ now formalizes as a commercial instrument.
Eric brings expert-witness evidence: projects that failed badly enough to end up in court, where the post-mortems are exhaustive, the documentation is preserved as legal record, and the findings are produced under adversarial scrutiny. That evidence base produces specific, defensible findings about what actually went wrong rather than statistical patterns about how often things go wrong. The framework’s archive shows how often the pattern appears across enterprise technology generations; Eric’s expert-witness cases show exactly how it plays out inside a single program under legal-grade examination. The two evidence bases are complementary, operating at adjacent layers; neither fills a gap in the other.
When Eric writes that requirements were never properly documented, design decisions were made informally and never validated, test scenarios were thin, accountability between the client, the integrator, and the software vendor was murky — that is what substrate inheritance looks like when it is examined under legal-grade scrutiny. The pattern is consistent enough across cases that he describes it as a common theme that emerges in expert witness case studies and identifies agile-as-rigor-replacement as the recurring root cause.
The Procurement Insights archive documented the same failure rate at the project scale in July 2011 — “85% of all eProcurement initiatives fail” — and traced the cause to the same mechanism Eric’s piece identifies: organizations attempting to adapt operations to an idealized workflow rather than designing technology to support real-world operational reality. The framework has the longitudinal evidence (Procurement Insights from 2007 onward, the four-wave pattern across thirty-five years, the 1998 Department of National Defence agent-based Metaprise R&D engagement as the foundational proof case). It has the empirical evidence at scale (Marijn’s data, Stanford HAI failure rates, MIT and BCG and McKinsey convergence on organizational-readiness as the determining variable). It has the structural diagnosis (the Compounding Technology Shadow Wave™ post). It has the measurement instruments (the Hansen Deflator Formula™, the forthcoming Hansen Optionality Loss Estimate™). Eric’s piece adds the legal-grade documentation across multiple cases — evidence that complements rather than duplicates the framework’s longitudinal archive.
What both analyses converge on
The clearest way to see the relationship between Eric’s piece and the framework’s body of work is to put the two side by side along the dimensions that matter most.
Each dimension names a real distinction, and none of them positions the two analyses as competitive. Eric maps the methodology mechanism inside a single ERP transformation; the framework names the structural condition every ERP transformation inherits regardless of methodology. Eric argues for hybrid program structure; the framework argues for substrate diagnosis as the upstream prerequisite for any program structure to succeed. Eric refuses to predict timing within a single program; the framework refuses to predict which technology wave arrives next — but documents the substrate condition that will determine whether the enterprise captures value from whichever one does.
The convergence gives the prescription a structural credibility that neither analysis could establish on its own, and together they produce a more complete strategic posture than either produces separately. The reader who absorbs Eric’s methodology critique and the framework’s substrate diagnosis is positioned to understand both the project-scale mechanism by which ERP transformations are currently failing and the multi-decade structural inheritance underneath those project-scale mechanisms.
The hybrid prescription is structurally identical to the resolution architecture
The most striking convergence is in Eric’s hybrid prescription. He writes that the front end of an ERP program, where the most consequential decisions are made, should look more like a structured waterfall effort. Requirements should be elicited, documented, and signed off. Future-state business processes should be designed at the enterprise level. The data strategy should be defined before configuration begins. Integration architecture should be mapped end-to-end. Organizational change management should be planned with the same rigor as the technical workstreams.
That is, in substance, Phase 0™. The framework’s diagnostic instrument is calibrated to surface the multi-wave substrate condition across all four waves and to produce a documented readiness profile before deployment begins. Eric is describing the same structural function calibrated to a single ERP program rather than to the cross-wave aggregate, but the analytical move is identical: do the hard, structured work up front to define enterprise truth, then let iterative execution operate against that documented backbone.
Eric writes that once the backbone is in place, agile and iterative methods can be introduced productively into the build, configuration, testing, and deployment phases. Sprints become a way to make steady progress against a clearly understood target rather than a substitute for having a target at all. That is structurally identical to the framework’s resolution architecture.
The relational governance positioning that anchors the framework’s resolution architecture has been formally in the published archive since 2013, when Andy Akrouche documented the 2002 Province of Ontario relational governance engagement and the subsequent CRA adoption of the same model, which a benchmark study showed had produced a 25% reduction in CRA acquisition costs one year after implementation. Phase 0™ produces the documented multi-wave exposure map. The relational governance layer provides the contractual frame within which operational disciplines do their wave-specific work. The execution layer is where humans (and increasingly agentic AI under delegated authority) operationalize the governance frame.
Two independent analyses, different starting points, the same structural answer. Eric arrived at the answer through expert-witness evidence on ERP transformation failures. The framework arrived at the answer through nineteen years of Procurement Insights archive evidence on the four-wave compounding pattern, building on operational proof cases that predate the contemporary vocabulary by decades.
The cost of leaving the substrate unaddressed
Eric’s piece argues that programs accumulating undocumented decisions during agile sprints are accumulating risk that will be paid back, with interest, during cutover and stabilization. The framework quantifies what the accumulated risk costs at the enterprise level.
The Hansen Deflator Formula™ measures realized AI and ERP value as the product of theoretical value and the substrate-quality coefficients across the prior technology waves. For a typical mid-cap enterprise, illustrative coefficients (Wave 1 ≈ 0.80, Wave 2 ≈ 0.70, Wave 3 ≈ 0.65) produce a deflator of approximately 0.36. An ERP or AI initiative that should produce $10 million in value realizes $3.6 million. The remaining $6.4 million is absorbed by the unresolved substrate from 1990, 2007, and 2012. For a five-thousand-employee enterprise, that translates to $7 million to $9 million per year of absorbed AI value alone, plus $25 million to $40 million per year of aggregate enterprise drag from the broader four-wave stack. The full development of the deflator math is in the financial impact piece linked above.
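The deflator arithmetic can be sketched in a few lines of Python. The `realized_value` helper and the coefficient values below are illustrative, taken from the example figures in this piece rather than from a published calibration:

```python
# Illustrative sketch of the Hansen Deflator Formula(TM) arithmetic:
# realized value = theoretical value x product of per-wave
# substrate-quality coefficients. The coefficients below are the
# example mid-cap figures from this piece, not a calibrated benchmark.
from math import prod

def realized_value(theoretical_value, wave_coefficients):
    """Return (deflator, realized value) after compounding the per-wave coefficients."""
    deflator = prod(wave_coefficients)
    return deflator, theoretical_value * deflator

# Illustrative coefficients for Waves 1-3
deflator, realized = realized_value(10_000_000, [0.80, 0.70, 0.65])
print(f"Deflator: {deflator:.2f}")                              # ~0.36
print(f"Realized value: ${realized:,.0f}")                      # ~$3.6M of $10M
print(f"Absorbed by substrate: ${10_000_000 - realized:,.0f}")  # the remainder
```

The point the sketch makes visible is that the coefficients multiply rather than add: each unresolved wave compounds the loss from the waves before it.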
That is the cost. It accumulates whether or not anyone is measuring it. The board does not see it directly because the substrate friction has been absorbed into other line items by the time it reaches them. The implementation team and the support team see it daily because they operate at the substrate layer where the friction lives. Eric’s expert-witness cases see it in retrospect, when the cumulative cost is large enough to produce litigation.
What this means for the senior buyer commissioning an ERP transformation
Eric’s piece closes with three practical recommendations for executives sponsoring ERP or broader digital transformation programs. The recommendations are sound at the program scale, and they translate cleanly into framework-scale guidance.
First, treat methodology choices as strategic decisions rather than as delivery details to be left to the integrator. The framework would add: methodology choices at the program scale are downstream of substrate condition at the enterprise scale. The right methodology cannot rescue an enterprise whose substrate forecloses the value the program is designed to capture.
Second, insist on independent oversight. Vendors and system integrators have legitimate roles to play, but their economic incentives are not always aligned with the long-term success of the program. The framework’s archive independence (zero vendor sponsorships across nineteen years) is structurally calibrated to exactly this need. Eric’s expert-witness practice operates from the same independence; the framework’s diagnostic and measurement work operates from the same independence. The convergence on independence as a prerequisite for credible advisory work is not coincidental.
Third, remember that speed is not the same as progress. The framework extends the observation: speed at the program scale is dwarfed by substrate inheritance at the enterprise scale. A program that moves quickly without surfacing the multi-wave substrate condition is producing project-level velocity at the cost of multi-decade structural drag.
In practice, that means commissioning a Phase 0™-style substrate diagnostic before signing the next ERP or AI contract, and using its findings to shape both the hybrid program structure Eric recommends and the commercial frameworks that will have to hold it. That is the first move. Everything downstream depends on it.
Where the two pieces of work go from here
Eric’s piece is one of the smartest engagements with the ERP transformation problem the discourse has produced in recent memory. The substrate question is the natural complement to it. Worth reading both.
The framework will extend this work into optionality loss (what strategic moves the substrate condition forecloses), convergence events (when inherited risk becomes acutely consequential), and non-linear loss (how a single missed window permanently changes the trajectory). Each of those forthcoming pieces will engage the kind of evidentiary base Eric’s expert-witness practice has been producing for two decades.
My hope is that this piece serves as a bridge: ERP leaders who follow Eric’s work can use the compounding-wave and deflator tools to extend his insights into AI and multi-wave substrate risk, and I would welcome ongoing dialogue with Eric and his community as this work evolves. The LinkedIn exchange following Eric’s original piece has already opened that territory. This post is what its substantive extension looks like in the framework’s published archive.
—30—
Compounding Technology Shadow Wave™ · Hansen Deflator Formula™ · Hansen Optionality Loss Estimate™ · Phase 0™ · Hansen Fit Score™ (HFS™) · RAM 2025™ · Real-World Condition Substrate™ · Strand Commonality™ · ARA™ · Implementation Physics™
Hansen Models™ · Founder: Jon W. Hansen · hansenprocurement.com
Eric Kimberling’s Hidden Cost Of Agile In ERP And The Substrate Underneath It
Posted on May 11, 2026
0
If Eric is explaining why so many ERP programs still fail inside the project, the Compounding Technology Shadow Wave™ work explains why they were set up to fail long before the project ever mobilized. Two analyses, different vantage points, the same answer.
Eric Kimberling published a piece on Sunday titled “The Hidden Cost of Agile in ERP Implementations” that should be read carefully by anyone responsible for enterprise technology decisions. Eric brings something the procurement-technology discourse has not previously had at this scale — an expert-witness practice that has examined ERP implementations that failed badly enough to end up in court, with legal-grade documentation of what actually went wrong rather than statistical patterns about how often things go wrong.
The argument is that agile, when applied indiscriminately to enterprise resource planning programs, is making implementation outcomes worse rather than better. The mechanism Eric identifies is precise: agile gives cover to skip the structured front-end work that ERP demands, produces undocumented design decisions and squishy requirements, paves the existing cow paths instead of redesigning them, and fragments the end-to-end coherence that ERP value depends on. The result is what he describes as a generation of ERP programs that move quickly in the early stages, accumulate enormous technical and organizational debt, and then stall when the business finally has to operate on the new platform.
In the LinkedIn dialogue following Eric’s piece, the convergence between his expert-witness diagnosis and the framework’s structural analysis surfaced explicitly when Eric asked whether agile accelerates the shadow technology gap. The framework’s answer — yes, when agile is used as a substitute for structured discovery rather than as execution after the operating backbone is understood — is what this post extends and develops. Agile does not eliminate shadow behavior. It can unintentionally legitimize it. That formulation is the load-bearing observation underneath what follows.
The framework’s structural diagnosis of this pattern is not contemporary. The Procurement Insights archive has been continuously documenting the substrate-versus-methodology question since 2008, with the agent-based versus equation-based modeling distinction from that period directly predicting the failure mode Eric documents in his expert-witness practice. The Compounding Technology Shadow Wave™ vocabulary that surfaced in May 2026 is the formalization of analytical positions that have been documented in the archive for eighteen years.
The mechanism Eric is naming
Walk through what Eric’s piece actually identifies, because each observation maps cleanly onto a structural element the framework has been developing.
ERP systems were conceived and built to manage end-to-end business processes, with value coming from coherence across the enterprise rather than from any individual module. Agile, by contrast, is optimized to deliver small slices of functionality as quickly as possible. When the agile mindset is applied to ERP, teams begin carving the program into independent sprints that deliver narrow capabilities without first proving how those capabilities will fit together. The system breaks, not because any individual workstream was bad, but because each was deployed in isolation from the broader process it was supposed to enable.
That is the fragmentation mechanism. Local delivery speed produces incoherence at the enterprise level. Eric is observing the divergence at the single-program scale; the framework is observing the same divergence across the four-wave technology compounding pattern. The mechanism is the same. The scale is different.
Eric writes that agile makes things squishy: requirements are loosely tracked, design decisions are made in flight, and the artifacts that should serve as the project’s institutional memory are either thin or entirely absent. The configuration choices end up locked inside the head of a consultant who rolls off the project, and the organization inherits a system it does not fully understand and cannot confidently evolve.
That is provenance breakage. When AI outputs need to be defended to auditors, regulators, or the board, they cannot be defended if their inputs traversed unprovenanced shadow infrastructure. The chain breaks at whichever upstream wave’s shadow the inputs traversed. Eric is observing the same breakage at the ERP-project scale; the framework names it as a cross-wave structural condition. The fingerprint is identical.
Eric writes that agile, in practice, takes the pendulum and swings it too far the other way, toward being completely unstructured, valuing documentation a lot less, and valuing speed over all else. He describes how this becomes part of the disease it claimed to cure. The minimum-viability mindset gives cover to project teams, vendors, and system integrators to skip the rigorous planning that ERP demands. The unspoken logic is that if it does not work, the team can always pivot later.
This is where the agile-as-legitimization mechanism becomes visible. Local workarounds, spreadsheet logic, SaaS exceptions, informal approval paths, and undocumented process variants get captured sprint by sprint and hardened into the new platform before anyone has validated whether they belong in the future-state model. The agile cycle becomes a vehicle for institutionalizing the very shadows the transformation was supposed to resolve. Speed without substrate validation does not surface the inheritance — it operationalizes it. As the framework named the dynamic in March 2026, AI doesn’t break broken processes. It perfects them. That formulation operates in the same conceptual territory Eric’s expert-witness practice documents at the ERP-program scale.
The closest analytical alignment is in Eric’s paving the cow paths observation. He writes that agile reinforces existing local practices rather than challenging them — the new system is configured to mirror how employees already work, the new processes look suspiciously like the old processes, and the productivity and insight gains that justified the investment never quite materialize. The organization ends up with a more modern technology platform running essentially the same business it was running before.
That is, structurally, substrate inheritance (today’s programs landing on yesterday’s unresolved decisions) rendered as ERP-program-scale phenomenology. Each new platform lands on top of unresolved spreadsheet, BYOD, and SaaS-era operating decisions that were never designed to support what is now being deployed on top of them. Eric calls this paving the cow paths. The framework calls it substrate inheritance. Same phenomenon, two different names.
The litigation lens and the framework’s adjacent operational engagement
Eric’s piece introduces a vantage point the framework’s archive has been operating adjacent to for over a decade. The clearest documentation is the November 2015 Blog Talk Radio segment with Kelly Barner on the Oregon Supreme Court ruling that denied Oracle’s attempt to dismiss its top brass as defendants in the Cover Oregon Health Exchange lawsuit. That coverage demonstrates that the framework engaged the executive-liability dimension of ERP implementation failure, in alignment with the litigation lens Eric raises in his current piece.
The framework’s operational diagnostic posture is documented separately. In 2007, the framework was engaged by Multnomah County in Oregon to produce a confidential evaluation of SAP’s public-sector deployment record before the County committed to a similar engagement. The diagnostic work surfaced that SAP’s intended reference account (King County) had abandoned the platform after two years and considerable cost. That finding led Multnomah County to decline the SAP engagement. The work is operationally distinct from the Cover Oregon/Oracle situation but illustrates the same diagnostic posture: pre-commitment surfacing of operational reality rather than reliance on vendor-controlled reference cases. That posture is what Phase 0™ now formalizes as a commercial instrument.
Eric brings expert-witness evidence. Projects that failed badly enough to end up in court, where the post-mortems are exhaustive, the documentation is preserved as legal record, and the findings are produced under adversarial scrutiny. That evidence base produces specific, defensible findings about what actually went wrong rather than statistical patterns about how often things go wrong. The framework’s archive shows how often the pattern appears across enterprise technology generations; Eric’s expert-witness cases show exactly how it plays out inside a single program under legal-grade examination. The two evidence bases are complementary at adjacent layers rather than the framework having a gap that Eric fills.
When Eric writes that requirements were never properly documented, design decisions were made informally and never validated, test scenarios were thin, accountability between the client, the integrator, and the software vendor was murky — that is what substrate inheritance looks like when it is examined under legal-grade scrutiny. The pattern is consistent enough across cases that he describes it as a common theme that emerges in expert witness case studies and identifies agile-as-rigor-replacement as the recurring root cause.
The Procurement Insights archive documented the same failure rate at the project scale in July 2011 — “85% of all eProcurement initiatives fail” — and traced the cause to the same mechanism Eric’s piece identifies: organizations attempting to adapt operations to an idealized workflow rather than designing technology to support real-world operational reality. The framework has the longitudinal evidence (Procurement Insights from 2007 onward, the four-wave pattern across thirty-five years, the 1998 Department of National Defence agent-based Metaprise R&D engagement as the foundational proof case). It has the empirical evidence at scale (Marijn’s data, Stanford HAI failure rates, MIT and BCG and McKinsey convergence on organizational-readiness as the determining variable). It has the structural diagnosis (the Compounding Technology Shadow Wave™ post). It has the measurement instruments (the Hansen Deflator Formula™, the forthcoming Hansen Optionality Loss Estimate™). Eric’s piece adds the legal-grade documentation across multiple cases — evidence that complements rather than duplicates the framework’s longitudinal archive.
What both analyses converge on
The clearest way to see the relationship between Eric’s piece and the framework’s body of work is to put the two side by side at the dimensions that matter most:
Each row names a real distinction. None of them position the two analyses as competitive. Eric maps the methodology mechanism inside a single ERP transformation; the framework names the structural condition every ERP transformation inherits regardless of methodology. Eric argues for hybrid program structure; the framework argues for substrate diagnosis as the upstream prerequisite for any program structure to succeed. Eric refuses to predict timing within a single program; the framework refuses to predict which technology wave arrives next — but documents the substrate condition that will determine whether the enterprise captures value from whichever one does.
The convergence gives the prescription a level of structural credibility that neither analysis could establish alone. Together they produce a more complete strategic posture than either one produces alone. The reader who absorbs Eric’s methodology critique and the framework’s substrate diagnosis is positioned to understand both the project-scale mechanism by which ERP transformations are currently failing and the multi-decade structural inheritance underneath those project-scale mechanisms.
The hybrid prescription is structurally identical to the resolution architecture
The most striking convergence is in Eric’s hybrid prescription. He writes that the front end of an ERP program, where the most consequential decisions are made, should look more like a structured waterfall effort. Requirements should be elicited, documented, and signed off. Future-state business processes should be designed at the enterprise level. The data strategy should be defined before configuration begins. Integration architecture should be mapped end-to-end. Organizational change management should be planned with the same rigor as the technical workstreams.
That is, in substance, Phase 0™. The framework’s diagnostic instrument is calibrated to surface the multi-wave substrate condition across all four waves and to produce a documented readiness profile before deployment begins. Eric is describing the same structural function calibrated to a single ERP program rather than to the cross-wave aggregate, but the analytical move is identical: do the hard, structured work up front to define enterprise truth, then let iterative execution operate against that documented backbone.
Eric writes that once the backbone is in place, agile and iterative methods can be introduced productively into the build, configuration, testing, and deployment phases. Sprints become a way to make steady progress against a clearly understood target rather than a substitute for having a target at all. That is structurally identical to the framework’s resolution architecture.
The relational governance positioning that anchors the framework’s resolution architecture has been part of the published record since 2013, when Andy Akrouche documented the 2002 Province of Ontario relational governance engagement and the subsequent adoption of the same model by the CRA; a benchmark study showed a 25% reduction in CRA acquisition costs one year after implementation. Phase 0™ produces the documented multi-wave exposure map. The relational governance layer provides the contractual frame within which operational disciplines do their wave-specific work. The execution layer is where humans (and increasingly agentic AI under delegated authority) operationalize the governance frame.
Two independent analyses, different starting points, the same structural answer. Eric arrived at the answer through expert-witness evidence on ERP transformation failures. The framework arrived at the answer through nineteen years of Procurement Insights archive evidence on the four-wave compounding pattern, building on operational proof cases that predate the contemporary vocabulary by decades.
The cost of leaving the substrate unaddressed
Eric’s piece argues that programs accumulating undocumented decisions during agile sprints are accumulating risk that will be paid back, with interest, during cutover and stabilization. The framework quantifies what the accumulated risk costs at the enterprise level.
The Hansen Deflator Formula™ measures realized AI and ERP value as the product of theoretical value and the substrate-quality coefficients across the prior technology waves. For a typical mid-cap enterprise, illustrative coefficients (Wave 1 ≈ 0.80, Wave 2 ≈ 0.70, Wave 3 ≈ 0.65) produce a deflator of approximately 0.36. An ERP or AI initiative that should produce $10 million in value realizes $3.6 million. The remaining $6.4 million is absorbed by the unresolved substrate from 1990, 2007, and 2012. For a five-thousand-employee enterprise, that translates to $7 million to $9 million per year of absorbed AI value alone, plus $25 million to $40 million per year of aggregate enterprise drag from the broader four-wave stack. The full development of the deflator math is in the financial impact piece linked above.
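The deflator arithmetic above can be sketched in a few lines of Python. This is a minimal illustration of the multiplication described in the text, not an implementation of the Hansen Deflator Formula™ itself; the coefficient values are the illustrative mid-cap figures quoted above, and the function names are my own.

```python
from math import prod


def hansen_deflator(coefficients):
    """Multiply the substrate-quality coefficients across prior technology waves.

    Each coefficient is the fraction of value that survives one wave's
    unresolved substrate (1.0 = no drag).
    """
    return prod(coefficients)


def realized_value(theoretical_value, coefficients):
    """Apply the deflator to a program's theoretical value."""
    return theoretical_value * hansen_deflator(coefficients)


# Illustrative mid-cap coefficients from the text (Waves 1-3).
coeffs = [0.80, 0.70, 0.65]

deflator = hansen_deflator(coeffs)                 # 0.364, i.e. ~0.36
realized = realized_value(10_000_000, coeffs)      # $3.64M, i.e. ~$3.6M
absorbed = 10_000_000 - realized                   # ~$6.4M absorbed by substrate

print(f"deflator = {deflator:.2f}")
print(f"realized = ${realized:,.0f}, absorbed = ${absorbed:,.0f}")
```

Running the sketch reproduces the figures in the paragraph above (0.80 × 0.70 × 0.65 ≈ 0.36, so a $10M initiative realizes roughly $3.6M and the remaining ~$6.4M is absorbed by the unresolved substrate).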
That is the cost. It accumulates whether or not anyone is measuring it. The board does not see it directly because the substrate friction has been absorbed into other line items by the time it reaches them. The implementation team and the support team see it daily because they operate at the substrate layer where the friction lives. Eric’s expert-witness cases see it in retrospect, when the cumulative cost is large enough to produce litigation.
What this means for the senior buyer commissioning an ERP transformation
Eric’s piece closes with three practical recommendations for executives sponsoring ERP or broader digital transformation programs. The recommendations are sound at the program scale, and they translate cleanly into framework-scale guidance.
First, treat methodology choices as strategic decisions rather than as delivery details to be left to the integrator. The framework would add: methodology choices at the program scale are downstream of substrate condition at the enterprise scale. The right methodology cannot rescue an enterprise whose substrate forecloses the value the program is designed to capture.
Second, insist on independent oversight. Vendors and system integrators have legitimate roles to play, but their economic incentives are not always aligned with the long-term success of the program. The framework’s archive independence (zero vendor sponsorships across nineteen years) is calibrated to exactly this need, and Eric’s expert-witness practice operates from the same footing. The shared insistence on independence as a prerequisite for credible advisory work is not coincidental.
Third, remember that speed is not the same as progress. The framework extends the observation: speed at the program scale is dwarfed by substrate inheritance at the enterprise scale. A program that moves quickly without surfacing the multi-wave substrate condition is producing project-level velocity at the cost of multi-decade structural drag.
In practice, that means commissioning a Phase 0™-style substrate diagnostic before signing the next ERP or AI contract, and using its findings to shape both the hybrid program structure Eric recommends and the commercial frameworks that will have to hold it. That is the first move. Everything downstream depends on it.
Where the two pieces of work go from here
Eric’s piece is one of the smartest engagements with the ERP transformation problem the discourse has produced in recent memory, and the substrate question is its natural complement. Both are worth reading.
The framework will extend this work into optionality loss (what strategic moves the substrate condition forecloses), convergence events (when inherited risk becomes acutely consequential), and non-linear loss (how a single missed window permanently changes the trajectory). Each of those forthcoming pieces will engage the kind of evidentiary base Eric’s expert-witness practice has been producing for two decades.
My hope is that this piece serves as a bridge: ERP leaders who follow Eric’s work can use the compounding-wave and deflator tools to extend his insights into AI and multi-wave substrate risk, and I would welcome ongoing dialogue with Eric and his community as this work evolves. The LinkedIn exchange following Eric’s original piece has already opened that territory; this post is its substantive extension in the framework’s published archive.
—30—
Compounding Technology Shadow Wave™ · Hansen Deflator Formula™ · Hansen Optionality Loss Estimate™ · Phase 0™ · Hansen Fit Score™ (HFS™) · RAM 2025™ · Real-World Condition Substrate™ · Strand Commonality™ · ARA™ · Implementation Physics™
Hansen Models™ · Founder: Jon W. Hansen · hansenprocurement.com