1
Measurement metrics beyond the eGEP experience
eGovernment Monitor Network, kick-off meeting, Geneva, 29th-30th May 2008
Luca M. Caldarelli, eGovernment Public Policy Consultant
2
Governments as catalysers of growth
Since governments have a large share in the realisation of social and economic development in the European Union, it is worthwhile to revalue the role of public administration in the Lisbon process.

Source: Innovating Public Administration and the Lisbon Strategy, background document for the Ministerial Troika, 4th November 2004

  • Improvements in government productivity could
    unleash billions of euro and boost growth by
    reducing administrative bottlenecks
  • Such gains do not necessarily entail lay-offs and
    cuts: resources could simply be redeployed to
    maintain and improve the level of services
    ("do more with the same")
  • Achieving these gains, also by harnessing the
    potential of ICT, is a must, given the increasing
    costs deriving from demographic conditions that
    the public budget will have to cope with in the
    near future

3
The need to demonstrate impacts
  • While there is a global consensus that ICT can
    contribute to the objectives of economic growth
    and stability
  • yet there is a growing recognition that such
    outcomes and the associated benefits cannot
    simply be assumed
  • Hence, national and international policy-makers
    are under increasing pressure to demonstrate the
    benefits and impacts of ICT investments in the
    public sector

"It is only possible to be sure that change has
worked if we can measure the delivery of the
benefits it is supposed to bring."
Source: UK Cabinet Office, Successful IT:
Modernising Government in Action
"Considerable advances have been achieved in the
rollout of ICT-based public services; however,
much remains to be done to demonstrate economic
impact and social acceptance."
Source: European Commission, i2010: A European
Information Society for growth and employment
4
Back to 2005: the eGovernment Economics Project (eGEP)
Expenditure Study
  • eGOV costs monitoring methodology
  • Expenditure estimate for EU25
  • Total ICT: 36.5 billion euro (2004)
  • eGOV only: 11.9 billion euro (2004)

Measurement Framework
  • About 90 indicators
  • Implementation methodology

Economic Model
5
The eGEP measurement framework
Efficiency (financial and organisational value)
  • Cashable financial gains
  • Better empowered employees
  • Better organisational and IT architectures
  • Inter-institutional cooperation
Democracy (political value)
  • Openness and participation
  • Transparency and accountability
Effectiveness (constituency value)
  • Reduced administrative burden
  • Increased user value and satisfaction
  • More inclusive public services
Net costs
  • Set-up, provision, maintenance
6
How could it be used?
EU25 Benchmarking (very simple, fully comparable indicators)
  • EU must agree with Member States
  • Indicators
  • Methodology for new benchmarking

National level monitoring of eGOV (less simple indicators, some comparability problems)
  • A national level unit can
  • Impose indicators top down
  • Build consensus on the most comparable ones

Micro-level business case measurement (sophisticated indicators, no comparability problems)
  • A public agency can
  • Select any of the eGEP indicators
  • First use them for ex ante business cases
  • Then for steady and continuous measurement
  • Use the eGEP implementation tools
7
Lessons from eGEP: impact measurement difficulties

Level                                   Type                             Cooperation  Comparability  Feasibility  Overall
International (EU25)                    Policy system benchmark          4            4              4            HIGH
Member State (holistic)                 Public policy benchmark          3            3              3            MEDIUM-HIGH
Member State (within vertical/region)   Organisational benchmark         2            1              3            MEDIUM
Individual public agency (voluntary)    Measurement/internal benchmark   1            0              2            LOW

Difficulty scores: 4 = high, 3 = medium-high, 2 = medium, 1 = low, 0 = null
8
A view on the players: decision-making virtuous cycles
9
A view on the players: different stakeholders for different evaluations
10
Evaluation methodologies vs. objectives
Evaluation methodologies: internal evaluation, external evaluation, participatory evaluation
Evaluation objectives: input, output, outcome
11
Main rankings' approach: a blurred picture
12
Literature review: new measurement drivers
  • Magoutas et al. (2007)
  • Papadomichelaki et al. (2006)
  • Castelnovo and Simonetta (2007)
  • Economist (2007)
  • Accenture (2007)
  • United Nations (2008)
  • Petricek et al. (2006)
  • UK National Audit Office (2007)
  • Baldrige Criteria (2008)
13
Models oriented at citizen centricity
  • Castelnovo and Simonetta
  • Quality of services defined in terms of service
    availability, satisfaction levels with services,
    importance of services offered, fairness of
    service provision, and cost.
  • Magoutas et al.
  • Citizen centricity to be measured by access
    possibilities, user skills and user expectations.
  • Papadomichelaki et al.
  • Quality dimensions of a customer-oriented
    approach to service delivery: availability,
    usability, service security, input from customer
    priorities.
  • Accenture
  • Citizen surveys as a component of the overall
    index (first time ever).
  • Economist Intelligence Unit
  • Consumer and business adoption category's weight
    boosted up to 25%, reflecting adoption behaviour
    and the G2C/G2B opportunities provided.
  • United Nations
  • eParticipation index: focus on institutional
    capacity, leadership role and willingness to
    engage citizens, and existing structures which
    facilitate citizens' access to public policy
    dialogue.

14
Models oriented at cooperation/integration
  • Castelnovo and Simonetta
  • Service availability, user satisfaction,
    institutional fairness and delivery costs
    directly linked to the perception of value by
    internal stakeholders interacting with each
    other.
  • Petricek et al.
  • External connectivity as a proxy of an
    institution's nodality, measured on the basis of
    the number of inlinks and the number of outlinks.
  • United Nations
  • Leveraging new infrastructures within the public
    sector in order to better share information and
    to bundle, integrate and deliver services through
    multiple delivery channels.
  • Accenture
  • Customer service maturity indicator: focus on
    co-operation and integration; measurement object:
    the extent to which government agencies manage
    relationships with citizens and businesses.
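Petricek et al.'s nodality proxy above, counting inlinks and outlinks, is simple enough to sketch directly. The link graph and site names below are invented for illustration; the presentation does not specify data sources.

```python
from collections import defaultdict

# Hypothetical link graph between government websites (illustrative data,
# not from the presentation): site -> list of sites it links to.
outlinks = {
    "agency.gov": ["ministry.gov", "portal.gov"],
    "ministry.gov": ["agency.gov"],
    "portal.gov": ["agency.gov", "ministry.gov"],
}

def nodality(graph):
    """Nodality proxy per site: number of outlinks and number of inlinks."""
    inlinks = defaultdict(int)
    for src, targets in graph.items():
        for dst in targets:
            inlinks[dst] += 1  # each link src -> dst counts as one inlink for dst
    return {site: {"out": len(targets), "in": inlinks[site]}
            for site, targets in graph.items()}
```

On this proxy, a site with many inlinks is more "nodal": other institutions point to it.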

15
Models oriented at process re-engineering
  • United Nations
  • Back-office reorganisation as the main driver of
    the TLC infrastructure index.
  • Economist Intelligence Unit
  • Legal environment indicator: focus on how the
    institutional and legal framework has been
    modified to better meet the ICT challenges.
  • Papadomichelaki et al.
  • Back-office procedures, leadership of the
    organisation and management's dedication to
    quality to be assessed and monitored in order to
    gain knowledge on the overall quality of the
    service delivery process.
  • Baldrige Criteria
  • Set-up of an Organisational Profile for each
    assessed institution, which keeps track of
    internal process re-engineering activities.
  • Petricek et al.
  • Website structure as a measure of internal
    processes' effectiveness
  • Indicators: size of connected components, average
    distance between randomly selected pairs of
    nodes, distribution of links within a site.
  • UK National Audit Office
  • BPR aiming at providing clearer, easily findable
    and joined-up information.
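Two of the Petricek et al. web-structure indicators listed above, size of connected components and average distance between pairs of pages, can be computed with plain breadth-first search. The page graph below is a made-up example; the link-distribution indicator is omitted for brevity.

```python
from collections import deque

# Hypothetical undirected page graph of one agency website (illustrative):
# page -> set of pages it is linked with.
site = {
    "home": {"services", "about"},
    "services": {"home", "forms"},
    "about": {"home"},
    "forms": {"services"},
    "orphan": set(),  # unreachable page: forms its own component
}

def bfs_distances(graph, start):
    """Shortest-path distance from start to every reachable page."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def component_sizes(graph):
    """Sizes of connected components, largest first."""
    seen, sizes = set(), []
    for node in graph:
        if node not in seen:
            component = bfs_distances(graph, node)
            seen |= component.keys()
            sizes.append(len(component))
    return sorted(sizes, reverse=True)

def average_distance(graph):
    """Average shortest-path distance over all reachable ordered pairs."""
    total = pairs = 0
    for u in graph:
        for v, d in bfs_distances(graph, u).items():
            if v != u:
                total += d
                pairs += 1
    return total / pairs if pairs else 0.0
```

A well-joined-up site, in these terms, has one large connected component and short average distances between pages.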

16
To sum up: impact measurement challenges
  • Policy vs. methodological imperatives
  • Policy needs quick backing of causal relations
  • Scientific evaluators are careful not to confuse
    correlation and causation, and would like natural
    experiments or longitudinal analyses.
  • On benchmarking
  • Reality distilled in manageable form for policy
    consumption?
  • Apples and oranges compared, context and
    processes overlooked, policy-learning and
    transfer obliterated?
  • Member States at the end of eGEP: comparability
    problems, lack of data.

Benchlearning
17
2008: the Benchlearning challenge
"A bottom-up collaborative benchmarking based on a
peer-to-peer experimental exchange among fairly
comparable public agencies from at least two
different EU Member States, designed as a
symmetric learning process, that (...) will
implement and calculate more sophisticated
indicators in a chosen area of impact of the
ICT-enabled services the selected agencies
provide, and in the process will build
transformative capacities."
18
Why benchlearn?
  • To benchmark only 10 eGEP indicators
  • The simplest and most comparable.
  • To boost the public sector's impact evaluation
    capabilities
  • Focus on more sophisticated impact indicators
  • Measurement capacities are built bottom-up.
  • To provide the involved agencies with tangible
    benefits
  • Opportunity to look at process complexity
  • Identify enabling and hindering factors.

19
Benchlearning: bottom-up benchmarking
  • Benchlearning is
  • Voluntary, bottom up and learning oriented
  • Flexible, with no need of uniform rigid
    indicators.
  • Gradually scalable from micro to meso and macro
  • Groups of similar organisations
  • Groups of similar verticals / regions
  • Groups of similar countries.
  • Provides insights and learnings on the eGOV value
    chain
  • Key drivers and success factors
  • Main barriers
  • Organisational processes and input.

20
What benchlearning aims to achieve
Aim: to extrapolate the promising areas where the EU can become a global leader
21
Benchlearning vs. benchmarking
TEST AND LEARNING vs. BEST IN CLASS
22
Benchlearning vs. benchmarking: outcomes
RANKING vs. CAPACITIES AND LESSONS
23
How benchlearning works (1/2)
First year measurement
  • Set-up
  • Letter of intent from the participating agencies
  • Preparation and running of a kick-off meeting
    with all participating agencies.
  • As-is and mainstreaming
  • Review of existing measurement systems and data
  • Analysis of organisational strategy and context
  • Draft report on operationalised indicators and
    preliminary measurement.
  • First full measurement (or "zero measurement")
  • Web-enabled data gathering template
  • Support to the pilot participating agencies to
    use the template
  • Support to the pilot participating agencies to
    gather the data
  • Validation of data and calculation of indicators.
24
How benchlearning works (2/2)
Second year measurement
  • Set-up (same as Y1)
  • As-is and mainstreaming (same as Y1)
  • Second full measurement (same as Y1).
Exchange activities
  • Continuous exchange of information
    (www.epractice.eu/community/benchlearning)
  • Inter-agency workshops.
Sustainability actions
  • Provision to the agencies of a measurement
    organisational model (processes and roles)
  • Inter-agency workshops.
25
Benchlearning groups: how to manage them
Generate ownership
  • Voluntary participation
  • Participants self-interested in capacity
    building/learning.
  • Clear mandate and leadership buy-in
  • Groups to be self-assembled, not assembled by the
    facilitator.
  • Multi-stakeholder but firm governance
  • Exchange and consensus
  • But with clear lines of accountability.
Start simple
  • Micro level only, single public organisations
  • 1 champion plus 3-4 learning organisations.
  • Groups assembled from similar countries
  • Leverage existing collaboration networks.
  • Third-party facilitators (EU contractors,
    governments)
  • Intense and in-depth work.

26
Expected project outcomes: eGEP 2.0
  • Pilot 1, EFFICIENCY: proxy indicator of Full
    Time Equivalent gains; agencies' impact
    indicators
  • Pilot 2, ADMINISTRATIVE BURDEN REDUCTION:
    standard cost model based indicator; work in
    progress on a number-of-data-fields indicator
  • Pilot 3, CITIZEN CENTRICITY: plurality of
    subjective and objective metrics; work in
    progress on a combined index
  • Simplified online version of the eGEP Measurement
    Framework: eGEP 2.0
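The slides do not specify the standard cost model based indicator further, but Standard Cost Model arithmetic generally values an administrative burden as price (tariff times time) multiplied by quantity (population times frequency). The sketch below follows that general formula; all figures are invented for illustration.

```python
def scm_burden(tariff_eur_per_hour, hours_per_event, population, events_per_year):
    """Annual administrative burden (euro) of one information obligation:
    price (tariff x time) x quantity (population x frequency)."""
    price = tariff_eur_per_hour * hours_per_event   # cost of one compliance event
    quantity = population * events_per_year         # compliance events per year
    return price * quantity

# Hypothetical burden reduction from moving a paper form online:
before = scm_burden(30.0, 2.0, 10_000, 1)  # 2 hours per paper filing
after = scm_burden(30.0, 0.5, 10_000, 1)   # 30 minutes per online filing
saving = before - after                    # 450,000 euro per year
```

The saving figure is the kind of "cashable" gain the eGEP efficiency driver is meant to capture, though real SCM exercises also distinguish burden from business-as-usual costs.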
27
Thank you for your attention!
28
References
  • Accenture (2007), Leadership in Customer Service:
    Delivering on the Promise, Ottawa. [online]
    Available at: http://nstore.accenture.com/acn_com/PDF/2007LCSReport_DeliveringPromiseFinal.pdf
  • Baldrige National Quality Program (2006),
    Criteria for Performance Excellence, USA. [online]
    Available at: http://www.quality.nist.gov/PDF_files/2006_Business_Criteria.pdf
  • Baldrige National Quality Program (2008),
    Criteria for Performance Excellence, USA. [online]
    Available at: http://www.baldrige.nist.gov/Criteria.htm
  • Capgemini (2007), The User Challenge: Benchmarking
    The Supply Of Online Public Services, 7th
    Measurement. [online] Available at:
    http://ec.europa.eu/information_society/eeurope/i2010/docs/benchmarking/egov_benchmark_2007.pdf
  • Castelnovo, W. and Simonetta, M. (2007), The
    Evaluation of e-Government Projects for Small
    Local Government Organisations. In The Electronic
    Journal of e-Government, Volume 5, Issue 1,
    pp. 21-28
  • Codagnone, C., Caldarelli, L., Cilli, V.,
    Galasso, G. and Zanchi, F. (2006), Compendium to
    the Measurement Framework, eGEP Project delivered
    by RSO for the European Commission, DG
    Information Society, Brussels. [online] Available
    at: http://82.187.13.175/egep/asp/E_Home.asp

29
  • Economist Intelligence Unit (2007), The 2007
    e-readiness rankings: Raising the bar. [online]
    Available at: http://graphics.eiu.com/files/ad_pdfs/2007Ereadiness_Ranking_WP.pdf
  • Magoutas, B., Halaris, C. and Mentzas, G. (2007),
    An Ontology for the Multi-perspective Evaluation
    of Quality in E-Government Services. In
    Proceedings of the 6th International Conference,
    EGOV 2007, Regensburg, Germany, September 3-7,
    2007, pp. 318-329. [online] Available at:
    http://www.springerlink.com/content/p78w21624g1k7213/
  • National Audit Office (2007), Government on the
    Internet: Progress in Delivering Information and
    Services Online, Research Report, London. [online]
    Available at: http://www.governmentontheweb.org/access_reports.asp
  • Papadomichelaki, X., Magoutas, B., Halaris, C.,
    Apostolou, D. and Mentzas, G. (2006), A Review of
    Quality Dimensions in eGovernment Services. In
    Wimmer, M.A., Scholl, H.J., Grönlund, Å. and
    Andersen, K.V. (eds.), EGOV 2006, LNCS, vol.
    4084, pp. 128-138. Springer, Heidelberg
  • Petricek, V., Escher, T., Cox, I.J. and Margetts,
    H. (2006), The web structure of e-government:
    developing a methodology for quantitative
    evaluation. In Proceedings of the 15th
    International Conference on World Wide Web,
    Edinburgh, Scotland, May 23-26, 2006, WWW '06,
    ACM Press, New York, NY, pp. 669-678. [online]
    Available at: http://doi.acm.org/10.1145/1135777.1135875

30
  • Picci, L. (2006), The quantitative evaluation of
    the economic impact of e-government: A structural
    modelling approach. In Information Economics and
    Policy, 18 (1), pp. 107-123. [online] Available
    at: http://www.sciencedirect.com/
  • United Nations, Department of Economic and
    Social Affairs, Division for Public
    Administration and Development Management (2008),
    UN e-Government Survey 2008: From e-Government to
    Connected Governance. [online] Available at:
    http://unpan1.un.org/intradoc/groups/public/documents/UN/UNPAN028607.pdf