What is the saying about good intentions and understanding? I appreciate Robert’s sentiments as well as Stephany Lapierre’s comment, and agree there is a problem. However, I think they are focusing too much on the symptoms and not the underlying root cause. Today’s post digs deeper into the root cause of procurement transformation challenges.
MY TAKE ON ROBERT’S POST
I will be referencing three Procurement Insights posts as the basis for my thinking about S2P suite challenges, as they are connected in several important ways:
Analysis of the S2P Suite Post and Related Articles
Robert’s post about S2P suite analytics limitations reveals a fundamental procurement technology problem that the three Procurement Insights articles address from different angles.
Core Problem Alignment
The original post identifies the classic “80% spend on PO” promise versus the 20-30% reality. This perfectly illustrates what I call the “equation-based ProcureTech first” approach that creates an 80% failure rate. Organizations implement sophisticated suites like Coupa expecting technology to solve process and behavioral challenges, but success ultimately depends on the human capability layer, not the software features (see The road might be long, but the people-first, agent-based cause is justified – Interview Highlights).
The Friction and Workaround Dynamic
When the post notes that “processes create friction” and stakeholders “dodge your process like Neo in The Matrix,” this aligns with my agent-based thinking. The problem isn’t that procurement professionals don’t understand role definitions – it’s that role definitions become meaningless when the same agents must serve competing network demands (see KPMG’s Tanya Ward’s Recent Post Has It’s Origins In 2008 – But Are We Now Headed In The Right Direction?). The catalog gaps, long cycles, and bypass behaviors represent competing agent incentives that traditional implementations ignore.
Why Current Industry Analysis Fails
The commenter’s point about poor data structure is valid but incomplete. Major analysts depend on technology vendors for revenue – they don’t profit from documenting implementation failures (see The road might be long, but the people-first, agent-based cause is justified – Interview Highlights) – which means the industry systematically underanalyzes why these workarounds persist. Organizations continue to implement complex supply strategies using simplistic organizational frameworks that fail to account for human behavior (see KPMG’s Tanya Ward’s Recent Post Has It’s Origins In 2008 – But Are We Now Headed In The Right Direction?).
AI as a Collaborative Tool, Not Silver Bullet
The “My Dinner with Claude” post reinforces that effective human-AI collaboration requires domain expertise, critical thinking, and the willingness to question both human intuition and AI analysis (see My Dinner With Claude: Or Why AI Is Not A Threat But A Reliable Procurement Partner – Also, Why The Hansen Fit Score Works!). Simply adding AI analytics to failing S2P implementations won’t solve the underlying process archaeology and stakeholder alignment issues.
Strategic Implications
In my model, I focus on high-functioning practitioners and established solution providers who understand that technology alone doesn’t guarantee success. The Interview Highlights post (The road might be long, but the people-first, agent-based cause is justified) suggests the path forward isn’t better analytics dashboards but better implementation methodologies that address the human behavioral layer driving those workarounds.
Robert’s advice to “plan for significant off-system spend from day one” is understandable, but reactive. My approach would be proactive: understand why stakeholders create workarounds before designing the system, then architect processes that align with, rather than fight, natural behavioral patterns.
Bottom Line
The S2P suite analytics frustration stems from treating symptoms (poor visibility into off-system spend) rather than causes (misaligned incentives and inadequately designed processes). The three articles collectively argue for using process archaeology and agent-based analysis to understand why workarounds exist, and then for designing implementations that work with human behavior rather than against it.
DIGGING DEEPER INTO STEPHANY’S COMMENT
Analysis of Stephany Lapierre’s Comment
Stephany’s comment represents exactly the kind of well-intentioned but incomplete thinking that perpetuates the cycle described in my post. Her focus on data structure quality – “get your supplier data well structured with legal entities and hierarchy, classification etc.” – is technically correct but strategically misguided.
Why Her Comment Proves the Point
Stephany’s suggestion exemplifies what I call “equation-based thinking.” She’s essentially saying: better data inputs = better analytical outputs = better procurement outcomes. This is the same logic that leads organizations to believe Coupa will deliver 80% spend visibility if they just configure it properly.
The fundamental flaw is assuming that data quality issues exist in isolation. In reality, poor supplier data structure is often a symptom of the same behavioral and process problems causing the workarounds mentioned in the original post:
- Procurement cycles too long → Stakeholders bypass formal supplier registration processes
- Catalog doesn’t have what they need → They work with unregistered or improperly classified suppliers
- They know what they want and bypass the process → They don’t bother with proper data governance protocols
The Real Problem
Organizations continue to implement complex supply strategies using simplistic organizational frameworks that fail to account for human behavior and competing incentives (see KPMG’s Tanya Ward’s Recent Post Has It’s Origins In 2008 – But Are We Now Headed In The Right Direction?). Stephany’s recommendation treats data quality as a technical problem when it’s actually an organizational behavior problem.
“Although recognizing the benefits of creating the right culture to improve data quality resulting in better data access and utilization, 95% of all executives identify organizational and process challenges as the primary obstacles impeding the adoption of big data and AI initiatives. In short, the problem with accessing and using quality data to its full potential is a people and process issue – and ultimately a leadership issue.” – Getting Beyond the Twilight Zone of data uncertainty, Procurement Insights, May 5, 2021
Why This Approach Backfires
Even if you perfectly structure all supplier data upfront, you’ll still face the same cycle (a minimal sketch of it follows the list):
1. Deploy pristine data taxonomy
2. Stakeholders encounter friction in maintaining it
3. Workarounds emerge to bypass data requirements
4. Data quality degrades over time
5. Analytics become unreliable again
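To make the cycle concrete, here is a minimal, illustrative simulation – my own sketch, not anything drawn from Robert’s post or any vendor’s tooling – of how even a perfectly structured taxonomy erodes once the friction of maintaining it outweighs the governance pressure to keep it clean. Every parameter value is an assumption chosen purely for illustration.

```python
# Minimal, illustrative simulation of the five-step cycle above.
# All parameter values are assumptions chosen for illustration only;
# they are not measurements from any real S2P implementation.

def simulate_data_quality(periods=12, friction=0.6, governance_pressure=0.2):
    """Track supplier data quality when the effort of maintaining it
    (friction) outweighs the pressure to keep it clean."""
    quality = 1.0  # start with a "pristine" taxonomy (step 1 of the cycle)
    history = []
    for _ in range(periods):
        # Stakeholders skip updates in proportion to the net friction they feel
        # (steps 2-3); governance pressure offsets only part of that friction.
        workaround_rate = max(0.0, friction - governance_pressure)
        quality = max(0.0, quality * (1.0 - 0.1 * workaround_rate))  # step 4: gradual decay
        analytics_reliable = quality > 0.8  # step 5: crude "trustworthy reports" threshold
        history.append((round(quality, 3), analytics_reliable))
    return history

if __name__ == "__main__":
    for month, (quality, reliable) in enumerate(simulate_data_quality(), start=1):
        status = "reliable" if reliable else "unreliable"
        print(f"month {month:2d}: data quality {quality:.2f} ({status} analytics)")
```

In this toy model the quality score decays a little each period at a rate proportional to the gap between friction and governance pressure; tightening governance slows the decay but never stops it, which is exactly the pattern the cycle describes.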
The Hansen Method Alternative
Instead of starting with data structure, process archaeology would ask: Why do stakeholders resist proper supplier data management? Agent-based analysis would identify the competing incentives that make data quality maintenance feel like bureaucratic overhead rather than value-added work.
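As a purely illustrative sketch of what such an agent-based analysis might look like – the agent attributes, weights, and thresholds below are my assumptions, not part of any published Hansen Fit Score calculation – consider a handful of stakeholder “agents” who each weigh the time the formal process costs them against the value they place on compliance:

```python
# Toy agent-based sketch of competing incentives. Agent attributes, weights,
# and the scaling constant are illustrative assumptions, not a published method.

from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    deadline_pressure: float   # how much speed matters to this agent (0 to 1)
    compliance_weight: float   # how much following the formal process matters (0 to 1)

def chooses_workaround(agent: Stakeholder, formal_cycle_days: float,
                       workaround_cycle_days: float = 2.0) -> bool:
    """An agent bypasses the formal process when the time saved, weighted by
    deadline pressure, outweighs the value it places on compliance."""
    time_saved = formal_cycle_days - workaround_cycle_days
    perceived_benefit = agent.deadline_pressure * time_saved
    perceived_cost = agent.compliance_weight * 10.0  # arbitrary scaling assumption
    return perceived_benefit > perceived_cost

if __name__ == "__main__":
    agents = [
        Stakeholder("engineering lead", deadline_pressure=0.9, compliance_weight=0.3),
        Stakeholder("marketing manager", deadline_pressure=0.7, compliance_weight=0.4),
        Stakeholder("finance analyst", deadline_pressure=0.4, compliance_weight=0.8),
    ]
    # Shorten the formal cycle and watch how many agents stop bypassing it.
    for cycle_days in (5, 15, 30):
        bypassing = [a.name for a in agents if chooses_workaround(a, cycle_days)]
        print(f"formal cycle = {cycle_days:2d} days -> workarounds: {bypassing or 'none'}")
```

The point of the toy model is that compliance changes when the formal cycle time changes, not when another governance policy is layered on top – which is the design argument below.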
The solution isn’t better data governance policies – it’s designing processes where maintaining accurate supplier data becomes the natural, friction-free path rather than an additional burden.
Bottom Line
Stephany’s comment, while technically sound, represents the aspirin approach to procurement challenges. It addresses a visible symptom (poor data quality) while ignoring the underlying organizational dynamics that created the problem. This is precisely why so many procurement transformations achieve temporary improvements before reverting to old patterns.
BONUS COVERAGE
My next post, titled “Process archaeology and agent-based design work. Data quality aspirin doesn’t.”, will dig deeper into the challenges with Stephany’s comment.
In the meantime:
“Instead of Treating the Headache, Address the Root Cause: Why the Aspirin Approach Fails in ProcureTech Transformations”