Assessment Guidance – “Best Value for the Money”




Ulrich Norf

WW - Public Procurement support

CIO Summit, June 2015

Assessment Guidance –

“Best Value for the Money”

AGENDA

• Public sector IT Procurement Market view

• The Position of the European Commission

• Solution for Germany and beyond

• Background

• BKM


Market View

•The purchase of goods and services by the public sector accounts for a significant proportion of public expenditure, and of demand for goods and services in the economy. In some markets the public sector is likely to be by far the largest buyer, and is in a position to affect competition through its purchasing behaviour. (High volume, catalogue, shopping basket…)

•Governments are increasingly providing commercial services through “e-” or “ICT-enabled” government, driven by a citizen-centric approach shaped by expectations and experiences of the private sector and the internet.

•Simultaneously, there has been greater emphasis and pressure on government efficiency and transparency in decision-making and procurement, as well as a desire for access to the latest technology and services and reduced procurement cycle times.

•The result has been an increased emphasis on competitive pricing and, with particular respect to IT-related purchases, a large amount of confusion about how to specify the products needed by government so as to maximise public value over the entire lifetime of the product(s), not just the upfront cost alone.


Government Procurement Worldwide

LAR

There is some awareness of EU / US legislation as “best practice” for government tenders through the ‘Fair Open Competition Group’, as well as through NAFTA.

EMEA/EU

EU member states use the EU rules for public procurement specified in two directives (2004/18/EC & 2004/17/EC). With respect to IT equipment, any tender over €137,000 for central governments (€211,000 for non-central governments) requires the contracting authority to publish details on Tenders Electronic Daily (TED): http://ted.publications.eu.int/official/

APAC

In APAC, the PRC, Korea, SEA, India and ANZ have different government tender processes. In India, government tenders are published in newspapers, while in the PRC all tenders are notified to the public through government websites. With the exception of ANZ & Singapore, tender processes are country-specific. There is no specific, detailed legal guidance.

USA

Legal guidance is provided through the U.S. Federal Acquisition Regulation (FAR). Use of performance specifications or published standards is preferred, consistent with Office of Management and Budget (OMB) Circular A-119. In the USA, any tender over the value of $25,000 is required to be posted on FedBizOpps.gov


EC Directives 2004/17/EC & 2004/18/EC


Technical specifications can be formulated:

by reference to product features;

in terms of performance or functional requirements; OR

using a combination of both

Each reference to product features must be accompanied by the words “or equivalent”

EC Directives 2004/17/EC & 2004/18/EC


No reference may be made to a specific make, origin, process or brand, unless justified by the subject-matter of the contract, or where the contracted goods cannot otherwise be described

If such a reference is made, the “or equivalent” wording must be used

The Position of the European Commission


Use of brands in the technical specifications for microprocessors is not justified

Even if accompanied by the words “or equivalent”

Use of a minimum clock-rate is discriminatory

Microprocessors may be described using references such as the type of the microprocessor and its required performance

The performance of microprocessors should be evaluated using appropriate benchmarks developed by industry consortia or third-party independent benchmarking firms
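The EC position above replaces brand names and clock rates with performance requirements expressed through benchmarks. A tender check along those lines can be sketched as follows; the vendor names, benchmark scores and threshold are entirely hypothetical:

```python
# Hypothetical tender evaluation: a bid qualifies if its system meets a
# minimum score on a named, vendor-neutral benchmark -- not a brand name
# or a minimum clock rate (which the EC considers discriminatory).
MIN_SCORE = 1000  # threshold written into the technical specification (illustrative)

bids = [
    {"vendor": "A", "benchmark_score": 1250},
    {"vendor": "B", "benchmark_score": 980},
    {"vendor": "C", "benchmark_score": 1100},
]

# Select the bids whose measured performance meets the requirement.
qualifying = [b["vendor"] for b in bids if b["benchmark_score"] >= MIN_SCORE]
print(qualifying)  # vendors meeting the performance requirement
```

The point of the sketch is that the specification names a benchmark and a score, so any microprocessor that reaches the score qualifies, regardless of make.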


IT Procurement Guidelines for Germany since 2007, hosted by BITKOM and the central procurement agency

Supported / Awarded by:

OECD (Organisation for Economic Co-operation and Development)

DIGITALEUROPE (European Information & Communications Technology Industry Association, Brussels)

Why was this website created in 2007? To support official tender writers with guidance, as procurement law, driven by the EU, has changed and now requires benchmarks instead of technical features and brands in all Member States.

What was and is recommended? Application-based benchmarks from BAPCo have been the first choice, based on a deep technical analysis.

Policy “Recommendations for procurement of efficient and environmentally friendly desktop PCs, notebooks, servers and other IT equipment” -> developed for the public sector -> observed by all industries

Developed by:

• German ministries

• Ministry of the Interior’s Procurement Office

• Ministry of the Environment

• Federal Employment Agency

• Army (Federal Office for Information Management and Technology)

• Industry partners (Intel, AMD*, OEMs, LOEMs)

Languages supported: English / French / Spanish / Portuguese / Dutch / German

www.ICT-procurement.com


Recommendations


Creating a standard procedure to ensure accuracy, transparency and reproducibility, based on rules of:

• Installation guidance (OS and drivers)

• Dos and Don’ts

• Benchmarking tool

• Documentation


Measuring Performance

“Our position is that the only consistent and reliable measure of performance is the execution time of real programs, and that all proposed alternatives to time as the metric or to real programs as the items measured have eventually led to misleading claims or even mistakes in computer design.”

Mainstream Usages

“Perhaps the most important and pervasive principle of computer design is to focus on the common case: In making a design trade-off, favor the frequent case over the infrequent case. This principle applies when determining how to spend resources, since the impact of the improvement is higher if the occurrence is frequent.”

From Computer Architecture: A Quantitative Approach, Fifth Edition, Hennessy & Patterson
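The Hennessy & Patterson principle quoted above takes the execution time of real programs as the only reliable metric. A minimal timing sketch in Python, where the workload function is a hypothetical stand-in for a real application:

```python
import time

def workload(n: int) -> int:
    # Hypothetical stand-in for a real program: sum of squares.
    return sum(i * i for i in range(n))

def time_workload(n: int, runs: int = 5) -> float:
    # Measure wall-clock execution time -- the metric the quote insists on.
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        workload(n)
        timings.append(time.perf_counter() - start)
    # Best-of-N reduces noise from other processes on the machine.
    return min(timings)

elapsed = time_workload(100_000)
print(f"best of 5 runs: {elapsed:.6f} s")
```

Any real benchmark tool does essentially this, only with a genuine application workload in place of the toy function.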

Reporting Performance

“The guiding principle of reporting performance measurements should be reproducibility – list everything another experimenter would need to duplicate the results.”

Multiple Benchmarks

“To overcome the danger of placing too many eggs in one basket, collections of benchmark applications, called benchmark suites, are a popular measure of performance of processors with a variety of applications. Of course, such suites are only as good as the constituent individual benchmarks. Nonetheless, a key advantage of such suites is that the weakness of any one benchmark is lessened by the presence of the other benchmarks.” From Computer Architecture: A Quantitative Approach, Fifth Edition, Hennessy & Patterson
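A common way to combine a suite’s per-benchmark results into a single figure, used for example by the SPEC suites, is the geometric mean of speedup ratios against a reference machine. The benchmark names and scores below are hypothetical:

```python
import math

def geometric_mean(ratios):
    # Geometric mean of per-benchmark speedup ratios; unlike the arithmetic
    # mean, the ranking it produces does not depend on which machine is
    # chosen as the reference.
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

# Hypothetical suite: ratio of (system-under-test time / reference time)
# inverted to a speedup, one entry per constituent benchmark.
suite = {"office": 1.20, "web": 0.95, "media": 1.50, "compile": 1.10}
score = geometric_mean(suite.values())
print(f"suite score: {score:.3f}")
```

Note how the weak “web” result is damped rather than hidden: it still pulls the overall score down, which is exactly the one-basket safeguard the quote describes.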

Basic Benchmark Selection Criteria

1. The benchmark uses real applications, or benchmark applications executing real workloads, and is based on real-world scenarios and workflows.

2. The benchmark was designed with industry stakeholder input baked into the development process, guided by industry best practices and transparency.

From Computer Architecture: A Quantitative Approach, Fifth Edition, Hennessy & Patterson

Best Practice

Define – What is being measured? Identify and reduce all possible variables.

Follow the rules – Stick to the benchmark tool rules and standards.

Be consistent and fair – Keep things the same and always double-check. Perform multiple runs for statistical consistency. Compare fairly and consistently.

Document everything – Eliminate guesswork and ensure replicability.

From Computer Architecture: A Quantitative Approach, Fifth Edition, Hennessy & Patterson
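The best-practice steps above (multiple runs for statistical consistency, documenting everything for replicability) can be sketched as a small harness. The report fields and the sample workload are illustrative, not a prescribed disclosure format:

```python
import platform
import statistics
import time

def benchmark_report(fn, runs: int = 5) -> dict:
    # Run the benchmark several times and report the median plus spread,
    # alongside the system details another experimenter would need to
    # reproduce the result (a minimal sketch of a full disclosure).
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        timings.append(time.perf_counter() - start)
    return {
        "median_s": statistics.median(timings),
        "stdev_s": statistics.stdev(timings),   # run-to-run spread
        "runs": runs,
        "python": platform.python_version(),
        "machine": platform.machine(),
        "os": platform.platform(),
    }

# Hypothetical workload standing in for a real benchmark run.
report = benchmark_report(lambda: sum(range(200_000)))
print(report)
```

Reporting the median with the spread, rather than a single run, is what makes two labs’ numbers comparable; the platform fields are the start of the “list everything” disclosure quoted earlier.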


Benchmark Domains

From Computer Architecture: A Quantitative Approach, Fifth Edition, Hennessy & Patterson

Different benchmarks test different areas of platform performance. For comprehensive platform evaluations, it’s best to cover all three domains of performance:

• Mainstream usages – use standard OS libraries, APIs and services

• Web usages – use web browser technologies; inherently cross-OS

• Game & GPGPU usages – use graphics

Describing Performance

Performance benchmarks are generally an acceptable way of describing the performance of a system.

Need to choose benchmark(s) that are:

• independent, regulated and widely recognised;

• relevant;

• up-to-date;

• reflecting the usage model.

Need to set up and follow a rigorous methodology. Performance benchmarks do not fit all situations.

From Computer Architecture: A Quantitative Approach, Fifth Edition, Hennessy & Patterson

Describing Other Features Unrelated to Performance

• Performance benchmarks say nothing about other important CPU and computer system features unrelated to performance.

• Even when benchmarks are used to describe performance, contracting authorities should:

identify the CPU and computer system features unrelated to performance which may help lower their TCO and achieve “Best Value for Money”;

AND

describe these features in the technical specifications, in an objective and non-discriminatory way.
