New Zealand Public Sector eSourcing: Transparent Procurement encourages Competition & Investment

Posted on October 30, 2008

It is widely accepted that procurement transparency promotes competition, and that competition is a good thing for economic efficiency and growth. On a national level, it is also interesting to note the positive correlation between transparency and Foreign Direct Investment.

Background to Tenders & Sourcing

Transparency, in this context, requires that purchasing decisions be demonstrably fair and open to scrutiny. Tendering is the process whereby a purchaser announces their intention to purchase a good or service, and invites potential suppliers to submit bids.  The purchaser then assesses and compares the bids in order to inform their choice. Where the purchaser provides detailed information about their requirements, and about how they will compare bids, the process is referred to as a Request For Proposal (RFP). If the RFP is vague, and allows bidders to submit proposals in any format, the process of assessment tends to be more subjective, and thus less transparent. In contrast, if an RFP provides a highly detailed list of individually weighted questions or criteria, and each answer is scored according to predefined guidelines, then the decision making process can be very transparent.

This post considers the use of eSourcing software in public sector tenders. To clarify terminology, eSourcing software is generally regarded as consisting of eAuction software and eRFx software – software for managing request-for-proposal style tenders online. The terms eSourcing and eRFx will be used interchangeably.

Cost vs Benefit of Transparency

Whilst the benefits of transparency are clear, increasing attention is being paid to the costs incurred by an overly regulated and bureaucratic procurement process. Firstly, there is the cost to the purchaser of following a more time-consuming process. Secondly, there is the increased cost to the bidders, which may be a barrier to entry for smaller companies.

Administrative Cost

The cheapest way to choose a supplier is to pick one at random out of the phone book. Beyond this, the more rigorous the assessment process, the greater the cost.

  • More bidders result in greater competition and greater choice, but drive up the cost of assessment.
  • Defining more detailed evaluation criteria may reduce the risk of selecting the wrong supplier, but increases the evaluators’ workload, again driving up cost.
  • Multiple evaluators – by having the bids analysed by multiple people working in parallel, it may be possible to isolate and eliminate biases from individual scorers. However, multiple evaluators drive up cost, both by committing more man-hours to the evaluation and by increasing the administrative burden of distributing bids and collating scores.

Barriers to Entry

In principle, an RFP evaluation should be as fine grained and detailed as possible. This reduces the risk of overlooking a flaw in a given bid, and helps to tease out opaque but important differences. In so doing, a more detailed evaluation can favour a smaller, more specialist company over a big company with a larger sales budget. Unfortunately, a highly detailed RFP can actually discourage smaller companies from bidding because it drives up their cost of sale, and thus their financial risk. 

Balancing Cost vs Benefit

A balance must be struck between the benefits of transparent sourcing and the costs of increased workload. Where this balance lies depends on the degree of risk associated with a purchase and on what can be done to ameliorate the cost of tendering to both buyer and supplier. Risk is a function of the amount of money to be spent, the criticality of the good or service being tendered for, and the variation in quality among the possible providers of that good or service. Costs of tendering can be reduced by good management, and, in some cases, by the appropriate use of information technology.

Tender Evaluation in New Zealand’s Public Sector

This year’s (2008) report by Transparency International rates countries according to the perceived integrity of their public sectors. The three countries with the best scores were Denmark, New Zealand and Sweden – all equal with a score of 9.3. Transparency International’s ratings are determined by many factors, such as the freedom of the press and the spread of ownership of the media. But a significant part of the rating comes from the openness of government, one aspect of which is the transparency of its procurement decisions. This corresponds with our experience at SupplierSelect in working with a number of New Zealand public sector organisations in recent sourcing projects:

  • A financial organisation selecting a secure messaging system
  • A university selecting a student administration system
  • A public housing department evaluating technology providers

All these projects featured a very rigorous assessment process:

  1. The RFx questionnaire contained many hundreds of questions structured into sections and subsections.
  2. A large team of scorers was involved in the assessment, with each scorer creating their own set of scores. A final committee stage then produced agreed scores for each respondent’s answer to every question. Fine-grained permissions were used for different roles. For example, scorers were not initially permitted to view question weightings for fear that this would influence their scores.
  3. Weights were set for each question and section to determine the contribution of each assessment area to the total score. Multiple sets of such weightings were then created to reflect the priorities of different groups of stakeholders.

New Zealand public sector organisations are accountable to the Auditor-General’s office, which publishes some very clear guidelines for public sector procurement (“Procurement Guidance for Public Entities” – http://www.oag.govt.nz/2008/procurement-guide). A section of these guidelines is devoted to “Selecting an Evaluation Model”, and goes on to describe the “Weighted Attribute Model”:

“Under this model, the criteria are weighted to reflect their relative importance. Each criterion in the tender or proposal is scored, and each is multiplied by the relevant weighting to give a weighted score.”
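The Weighted Attribute Model the guidelines describe can be sketched in a few lines of Python. The criteria names, weights, and scores below are purely illustrative, not taken from the guidelines or any actual tender:

```python
# Weighted Attribute Model: each criterion is scored (here 0-10),
# multiplied by its weight, and the weighted scores are summed.
# Criteria names, weights, and scores are illustrative only.

weights = {"technical_fit": 0.40, "track_record": 0.25,
           "support": 0.15, "price": 0.20}

def weighted_score(scores, weights):
    """Sum of score x weight over all criteria."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(scores[c] * w for c, w in weights.items())

bid = {"technical_fit": 7, "track_record": 9, "support": 6, "price": 8}
# 7*0.40 + 9*0.25 + 6*0.15 + 8*0.20 = 7.55
print(weighted_score(bid, weights))
```

The same arithmetic extends naturally to multiple weighting sets: running `weighted_score` once per set shows how each stakeholder group would rank the same bid.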

This section goes on to advise that if price is included in the weighted attribute assessment, it is important to test the calculation model to determine what level of price sensitivity is being achieved:

“If weighting price, it is important to carry out some level of sensitivity analysis as part of the weighting process to ensure that the price weighting is appropriate. For example, if the price weighting is too high, the evaluation effectively becomes a lowest-price conforming model.”
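The sensitivity analysis the guidelines recommend can be done quite mechanically: hold the quality scores fixed, vary the price weight, and observe at what point the cheapest conforming bid wins regardless of quality. A hypothetical two-bid illustration:

```python
# Price-weighting sensitivity check (hypothetical bids and scores).
# As the price weight rises, at some point the cheap, lower-quality
# bid overtakes the expensive, higher-quality one.

bids = {
    "A": {"quality": 9, "price_score": 5},  # strong bid, expensive
    "B": {"quality": 5, "price_score": 9},  # weak bid, cheap
}

def total(bid, price_weight):
    quality_weight = 1.0 - price_weight
    return bid["quality"] * quality_weight + bid["price_score"] * price_weight

for pw in (0.2, 0.4, 0.6, 0.8):
    winner = max(bids, key=lambda name: total(bids[name], pw))
    print(f"price weight {pw:.1f}: winner {winner}")
```

With these figures the winner flips from A to B somewhere between a 0.4 and 0.6 price weight – exactly the kind of tipping point the guidelines suggest checking before the weighting is fixed.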

These guidelines make it plain that any public sector organisation involved in a significant purchase must be prepared to have their decisions scrutinised in detail.

The role of eSourcing

The evaluation models described above are far from revolutionary. They have been recognised as best practice for running tenders in many industries for decades, long before the advent of web-based eSourcing software. In common with many business software products, eSourcing software does not introduce radical new methods, but, by automating manual, time-consuming steps, it reduces the time and cost of execution. A comparison might be made with accounting software. It is perfectly possible to prepare a company’s annual accounts on spreadsheets, but the same job is much less time-consuming and error-prone with an accounting software package.

Running a tender to comply with strict regulations can be a very expensive process. Every step in the decision making process must be documented. Multiple people must be involved to ensure wide participation and avoid bias. A wide sample of potential suppliers must be considered. Working with documents, either electronic or paper, the process requires very careful management, and must be audited painstakingly. Using an eSourcing package, much of the administrative burden is shouldered by the software.  For example, 

“7.119 
A public entity should record all communications that seek clarification with a participant, including any clarifications presented by the participant, so that an audit trail is maintained.”

In practice, maintaining such an audit trail is difficult, especially in cases where a bidder picks up the phone to call the evaluation team. Even if communication is restricted to email, it is a laborious process to produce logs of email traffic.  In a good eSourcing system, messaging is built into the system, and audit events are logged for each action. 
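What “audit events are logged for each action” means in practice can be sketched simply: every clarification request and response is appended to a log with a timestamp, actor, and action, so the trail can be exported on demand. The structure below is a hypothetical illustration, not SupplierSelect’s actual schema:

```python
from datetime import datetime, timezone

# Minimal audit-trail sketch: every communication is appended to a
# log with actor, action, timestamp, and detail. This is an
# illustrative structure, not any vendor's actual schema.

audit_log = []

def record_event(actor, action, detail):
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "detail": detail,
    })

record_event("bidder_42", "clarification_request",
             "Does section 3 apply to subcontractors?")
record_event("evaluator_1", "clarification_response",
             "Yes, see clause 3.2.")

for event in audit_log:
    print(event["timestamp"], event["actor"], event["action"])
```

Because the log is produced as a side effect of normal use, the audit trail required by guideline 7.119 exists without anyone having to reconstruct it from phone notes or email folders after the fact.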

“7.82 
The process should require the public entity to carefully consider each tender or proposal, on an equal basis, against the evaluation criteria”

This is a vague rule, and as such provides a disgruntled bidder with many opportunities to complain. A common interpretation of this rule is to ensure that the same amount of time is dedicated to the consideration of each bid. Again, this may seem a trivial requirement, but it is difficult to implement in practice. In the EU, when dealing with paper bid documents sent by post, it is recommended that a witness stand in the same room as the evaluator and record the time that each bid is opened. With good eSourcing software, the time that each bid is submitted and made available to the evaluator is recorded automatically in the audit event log.

Accuracy

Assessing and scoring tender bid responses may not seem like a major mathematical exercise. However, for a large tender involving many scorers and multiple scoring sets, the number of calculations involved becomes daunting. Consider an RFP with 400 questions, where 8 bidders respond, and 5 staff are on the evaluation team, each scoring all questions. This produces 400 x 8 x 5 = 16,000 scores. Each question is weighted, and there are 3 weighting sets. Finding a total score for one respondent under one weighting set involves summing 400 x 5 = 2,000 multiplications. Do this for each of the 8 respondents, and for each of the 3 weighting sets. Then consider that questions are structured into sections and subsections, each of which has its own weighting. Scores within sections are normalized, so that each section’s weight caps its contribution to the total score.
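The aggregation arithmetic can be made concrete with a small sketch: average each question’s scores across the evaluators, apply the question weights within each section, then normalize so that a section contributes no more than its own weight to the total. The section names, weights, and scores are illustrative:

```python
# Aggregating one respondent's scores: average across scorers,
# weight each question within its section, then normalize so each
# section's contribution is capped by its own weight.
# All names, weights, and scores are illustrative.

sections = {
    "functionality": {
        "weight": 0.6,
        # question -> (question weight, scores from each evaluator, 0-10)
        "questions": {"q1": (0.7, [8, 7, 9]), "q2": (0.3, [6, 6, 5])},
    },
    "support": {
        "weight": 0.4,
        "questions": {"q3": (1.0, [9, 8, 10])},
    },
}

def respondent_total(sections):
    total = 0.0
    for sec in sections.values():
        sec_score = 0.0
        for q_weight, scores in sec["questions"].values():
            avg = sum(scores) / len(scores)        # consensus across scorers
            sec_score += q_weight * avg            # weighted within section
        total += sec["weight"] * (sec_score / 10)  # normalize to section weight
    return total  # overall score between 0.0 and 1.0

print(round(respondent_total(sections), 3))
```

Even this toy example involves a dozen multiplications for a single respondent; scale it to 400 questions, 8 respondents, 5 scorers and 3 weighting sets and the case for automating the arithmetic makes itself.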

It soon becomes apparent that a great deal of arithmetic is involved in a complex assessment. Of course a well structured spreadsheet can easily do these calculations, but to aggregate scores from multiple bidders and scorers into the same spreadsheet invariably involves copying and pasting, with considerable scope for error. 

With a good eRFx package, these calculations are managed automatically. Change logs are kept of every score altered or edited. Drill-down reports enable the evaluator to pinpoint differences between suppliers, or between scorers.

Summary

The crucial aspects of supplier or bid evaluation are defining criteria, communicating these effectively to the bidders, and interpreting the answers intelligently. eSourcing software doesn’t directly help with any of this. But making the evaluation process transparent, and involving a wide team of evaluators, can greatly increase administrative overhead and the risk of error. Thus full transparency has historically been feasible only for the biggest purchases. By easing the administrative burden of following a fully transparent process, eSourcing increases the range of purchases for which transparent sourcing is cost effective, and thus plays a positive role in increasing overall public sector transparency.

About SupplierSelect

SupplierSelect is an eTendering platform for managing sophisticated requests for proposal. It is used by public sector organisations in Canada, the UK and New Zealand, and by private sector companies around the world for supplier evaluation in a number of contexts including tenders, RFx and ratings.

© SupplierSelect Ltd 2008

Editor’s Note: As indicated in my October 22 post regarding this article, I would like to stress that neither SupplierSelect’s appearance in this blog, nor the positions presented by Patrick Dobbs in this article are to be construed as an endorsement by Procurement Insights.  That said I felt that it was a compelling piece that presents interesting concepts that are worthy of both consideration and discussion.  As always, I will leave it up to you the reader, to ultimately determine the veracity of the concepts presented as well as their viability in the public sector.

 
