Title: Monitoring
Monitoring and evaluating cluster initiatives in Flanders
- Bart De Caesemaeker, 22 February 2007
- www.iwt.be
Overview of this presentation
- Cluster initiatives in Flanders: a changing context
- Opportunities and need for monitoring and evaluation
- Approach
- Practical implementation
  - 4.1 Activity reporting
  - 4.2 Effect measurement
  - 4.3 Innovation profiles
  - 4.4 Impact on target groups
- First findings
- Lessons learned
1. Cluster initiatives in Flanders: a changing context
- Brief description of the Flanders innovation system
  - Flanders
  - Historically: Collective Centres (since the 1960s)
  - IWT
- Historical overview of cluster policy
  - (Economic) clusters (1994)
  - Technology Valleys (1998)
  - CIN: Cooperative Innovation Networks (2002)
  - Technological Competence Poles (2005)
- New context
1.1 Flanders: geographical situation
1.1 Flanders: some figures
- One of the 3 regions in Belgium
- 6 million inhabitants
- 60% of European Union purchasing power lies within 400 km
- 7 universities, of which 2 world-class
- Very open economy (80% export)
- Strong sectors: automobile, petrochemicals and plastics processing, life sciences
- Excellent ICT infrastructure
- Educated, productive, loyal, multilingual and flexible workforce
1.1 Flanders: the 'ancient' intermediary system
- Historically
  - Only a limited number of well-known big centres performed collective projects (until 2002)
  - Structural funding (since the sixties)
  - Additionally, 'collective' projects
  - No competition
  - Follow-up was based on reporting of activities
1.1 IWT: general survey
- History: established in 1991
- Mission: promotion of innovation
- Form: government agency
- Personnel: 60 scientific advisors
1.1 IWT: key figures
- Annual budget: 200 million euro
  - 60 million for R&D projects
  - 10 million for innovation projects of SMEs
  - 30 million for Strategic Basic Research
  - 7 million for HEI
  - 30 million for Flemish Cooperative Innovation Networks
  - measures of the Flemish Government
- Clients
  - 150 innovative enterprises / year
  - 500 SME projects / year
- Network of intermediaries
  - 250 advisors in the field
1.2 Historical overview of cluster policy
- 1994: bottom-up accreditation of organisations as (economic) clusters
- 1998: Technology Valleys as policy choices (mapping)
- 2002: CINs (Cooperative Innovation Networks): bottom-up competitive funding for projects (2 x 2 years) of animation/stimulation of innovation (semi-structural)
- 2005: Technological Competence Poles: special initiatives (picking winners by intermediaries) / start-up grants for specific infrastructure for collective research and innovation platforms
1.3 New VIS/CIN scheme (2002): Cooperative Innovation Networks
- VIS/CIN Decree philosophy
  - transparent scheme, legally sound (no authorisation anymore), open call
  - structural financing for 2 x 2 years
  - financing of projects and activities (not of organizations or institutes)
  - management and control: more follow-up and attention to the results and output than (financial) verification of input
1.3 CIN project types
- Projects of Thematic Innovation Stimulation (TIS)
  - target group: companies with a common technological need or opportunity
  - must cover Flanders
- Projects of Subregional Innovation Stimulation (RIS)
  - target group: companies in a geographical region
  - all (industrial) sectors
- Projects of Technological Services (TD)
  - to offer technological (innovative) solutions and opportunities
- Projects of Collective Research (CO)
  - from strategic long-term research to cooperative technology-transfer projects
1.3 Cluster initiatives in Flanders: a changing context of the intermediary system
- Historically
  - only a limited number of well-known big centres performed collective projects (until 2002)
  - follow-up was based on reporting of activities
- New context: CIN (Cooperative Innovation Networks) since 2002
  - bottom-up implementation; evaluation of proposals in a call, based on the advice of external experts
  - fast-growing network of smaller projects (approx. 70 cluster projects, approx. 200 collective projects in total)
  - dynamic network, with a periodicity of 4 years
  - intermediate project evaluation after 2 years
2. Opportunities and need for monitoring and evaluation
- The new context created:
  - for IWT, needs and opportunities:
    - efficient monitoring and evaluation of projects
    - uniformity of reports
    - possibility to aggregate results for policy makers
    - possibility to compare projects
    - quick detection of problems
    - evaluation of the effects of projects
  - a need for efficient reporting by the projects:
    - decrease of the administrative burden
3. Approach: overview of the follow-up scheme
4. Practical implementation
4.1 Activity reporting - RAP
- A general set of 16 RAP numbers (indicators) is defined that indicates how well a project is performing.
- Emphasis on the results of activities, less on project means and efforts.
- Each project defines target values for a subset of the indicators at the contract negotiation phase.
- All activity results are reported.
- Each project reports 3 times/year, on a fixed date.
- A report consists of:
  - indicators (e.g. seminars), incl. additional information (seminar title, date, place, number of attendees)
  - up to 4 success stories.
4.1 Activity reporting - RAP: what it is
- Web-based reporting of activities
- Predefined list of activities (with added value for the target group)
- 13 activities: no. of clients, no. of cooperative actions
  - information actions
  - publications
  - seminars
  - company visits
  - ad hoc services (e.g. by phone)
  - technology transfer
  - partner matching
  - advice
  - audits
  - innovation plans
  - feasibility studies
  - innovation projects
  - innovation coaching
- A well-defined and standard list of actions
4.1 Activity reporting - RAP: how it works
- RAP numbers are indicators of how well a project is performing.
- Each project defines its target values for a subset of RAP numbers (at the contract negotiation phase).
- All activities are reported 3 times/year, on a fixed date for all projects, by giving the amount, e.g. 15 seminars, 45 company visits, ...
- Most numbers require additional info, e.g. the RAP number 'Seminars' requires a title, date, place and number of attendees to be added for each reported seminar (consistency check).
- For each report, up to 4 success stories can be added (1 page each).
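The reporting rules above (counts per RAP number, per-item detail records as a consistency check, at most 4 success stories) can be sketched in code. This is a minimal illustration, not IWT's actual web tool; all class and field names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Seminar:
    # Additional info required for each reported seminar
    title: str
    date: str
    place: str
    attendees: int

@dataclass
class RapReport:
    # Counts reported per RAP number, e.g. {"seminars": 15, "company_visits": 45}
    counts: dict = field(default_factory=dict)
    seminars: list = field(default_factory=list)        # detail records
    success_stories: list = field(default_factory=list)

    def check(self):
        errors = []
        # Consistency check: the reported seminar count must match the detail records
        if self.counts.get("seminars", 0) != len(self.seminars):
            errors.append("seminar count does not match detail records")
        # At most 4 success stories per report
        if len(self.success_stories) > 4:
            errors.append("more than 4 success stories")
        return errors

report = RapReport(counts={"seminars": 2, "company_visits": 45},
                   seminars=[Seminar("Intro seminar", "2006-03-01", "Ghent", 40)],
                   success_stories=["story 1"])
print(report.check())  # one error: counts claim 2 seminars, only 1 detail record
```

The same pattern extends to the other RAP numbers that carry additional info (company visits, audits, ...).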
4.1 Activity reporting - RAP
- Highly automated web tool.
- Fully automatic notification of the project reporter.
- Fully automatic warning of reporting delays.
- Possibility to review 50 reports in a couple of hours.
- Easy to give feedback to project reporters.
- Basis for a detailed evaluation of projects, e.g. the intermediate evaluation after two years.
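The automatic warning of reporting delays could work along these lines. A sketch only: the fixed reporting dates and the grace period are invented, not taken from the slides.

```python
from datetime import date, timedelta

# Assumed fixed reporting dates, 3 times/year (hypothetical values)
REPORTING_DATES = [date(2006, 4, 1), date(2006, 8, 1), date(2006, 12, 1)]

def overdue_projects(submitted, today, grace_days=14):
    """submitted maps project id -> set of reporting dates already filed.
    Returns (project, missing dates) pairs for automatic warnings."""
    passed = [d for d in REPORTING_DATES
              if today > d + timedelta(days=grace_days)]
    warnings = []
    for project, filed in submitted.items():
        missing = [d for d in passed if d not in filed]
        if missing:
            warnings.append((project, missing))
    return warnings

print(overdue_projects({"TIS-01": {date(2006, 4, 1)}, "RIS-02": set()},
                       today=date(2006, 5, 1)))  # RIS-02 missed the April report
```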
4.1 Activity reporting - RAP: general conclusions
It works well and it is accepted by the project leaders.
- RAP fits in an evolution towards giving more responsibility to project leaders:
  - earlier funding programmes: agreement on means (if you spent enough manpower on the project, you got your funds)
  - RAP: less attention to the follow-up of manpower, focus on activities with added value
  - next step, effect measurements: we care less about how or what you do, as long as you get the required effects
- RAP reduces reporting efforts.
- RAP reduces follow-up time by IWT (50 projects can be screened in a couple of hours).
- Projects in 'trouble' are identified easily and early.
- Every 4 months a status is available for policy makers and the board (an advantage of standardized information).
Summary report RAP
4.2 Measurement of project 'effects'
- Still in the start-up phase.
- What it is: measurement of the direct response to an activity
  - e.g. after advice: does the company use the advice? Y/N
  - e.g. after a seminar: does the attendee ask for additional info? Y/N
  - → counting the number of responses
- What it is NOT: a measurement of the economic benefits for companies.
- 7 direct effects identified and described (direct response to an activity).
- 8 indirect effects identified (longer-term effects, where attribution to an action is not straightforward)
  - e.g. has the company increased its R&D budget? Y/N
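Counting direct responses, as described above, amounts to tallying yes/no registrations per effect. A minimal sketch; the effect names are illustrative, not the 7 effects actually defined by IWT.

```python
from collections import Counter

# Each registration: (effect name, yes/no response observed by the advisor)
responses = [
    ("advice_used", True),       # after advice: does the company use it?
    ("advice_used", False),
    ("seminar_followup", True),  # after a seminar: does the attendee ask for more?
    ("seminar_followup", True),
]

def count_effects(responses):
    # Tally only the positive responses per effect
    counts = Counter()
    for effect, observed in responses:
        if observed:
            counts[effect] += 1
    return counts

print(count_effects(responses))  # 1 positive advice response, 2 seminar follow-ups
```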
4.2 Measurement of project 'effects': practical issues
- System developed in close cooperation with project leaders (1 year of 'development' time, 10 test cases).
- Reporting of effects is an obligation for all projects.
- Each project:
  - selects a number of effects to follow up (from the list; additional specific effects can be defined), and
  - sets target values based on the project's objectives
  - reports prior to the mid-term review (go/no-go)
  - delivers a final report (in case of a continuation request: go/no-go)
- The advisor is responsible for the registration of effects (at a return visit, after a call, ...).
- Start: 2006, with the projects of the 2005 call.
- Evaluation criterion in the 2006 call: point of no return.
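The target-value mechanism can be illustrated as a simple comparison of achieved counts against the targets each project set. All figures and effect names here are invented for illustration.

```python
# Hypothetical targets a project set at contract negotiation,
# and the counts achieved by the mid-term review
targets = {"advice_used": 20, "seminar_followup": 50}
achieved = {"advice_used": 25, "seminar_followup": 30}

def below_target(targets, achieved):
    # Effects still under target: input to the go/no-go decision,
    # not an automatic verdict
    return {effect: (achieved.get(effect, 0), goal)
            for effect, goal in targets.items()
            if achieved.get(effect, 0) < goal}

print(below_target(targets, achieved))  # only 'seminar_followup' is under target
```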
4.3 Innovation profiles: who does what for what kind of company?
- The situation:
  - 250 FTE 'wandering' around in Flanders
  - decentralised steering (75 organizations)
  - → who knows what they are doing?
- The solution:
  - periodic (on-line) registration of all services delivered to companies
  - idem for the needs of companies
  - completion of the 'innovation profile' of the visited company
    - economic parameters (e.g. B2B, cost competition, ...)
    - innovation status
  - in a period of 3 months, each FTE has to register at least 10 company profiles
  - the registration period is repeated each year
Innovation profiles: learning about our clients
Chart legend (translated from Dutch): 1. None; 2. Customers and suppliers; 3. Local and international competition; 4. Employees' own initiative; 5. Strategy
Innovation profiles: what we learn from it
- What is the size of the visited companies?
- In which sectors are the visited companies active?
4.4 Effect measurements: impact on target groups (ongoing)
- An ongoing project investigation at IWT:
  - Are the offered services relevant?
  - What is the realised / perceived added value for the customers?
  - What is the impact on the commercial market?
  - Are proper / effective selection criteria applied for projects?
  - Is the segmentation of services over target groups optimal?
  - What are the characteristics of a successful project? Are they predictable?
  - What is the efficiency of the global network?
4.4 Effect measurements: impact on target groups (ongoing)
- Project:
  - IWT project leader
  - scientific steering committee
  - Flanders counselling committee
  - IWT participates in the EU project 'Effect measurements of intermediaries'
- Action plan:
  - inventory of the actual status of the CIN programmes
  - inventory of the actual functioning (case studies)
  - analysis and recommendations
- Report planned for June 2007.
5. Lessons learned: building up a system of monitoring and effect measurement
- Tools and standardization are important; develop these in cooperation with the 'players'.
- Focus on the benefits for the players.
- Consistent communication and behaviour (don't fool them).
- Step by step: it requires a change in culture.
- Use an external and independent consultant to guide the process.
- Give immediate feedback, to prove that the data are used.
- ASK = GIVE: do not ask for data that are not used; add value to the data and return them to the players.
- Attention: what you ask is what you get; look out for unwanted biases (e.g. if they have to report company visits, they will focus on doing company visits).
5. Lessons learned: start-up of cluster projects
- Most CIN cluster projects have been prepared for about a year before they get subsidies from IWT. Project monitoring with RAP reveals that it takes approximately one more year before an initiative gets a wide response in a sector.
- Projects that fail to move the companies during the first years fail to get the co-financing (the 20% non-subsidised part). The membership fee is raised and the initiative fades.
5. Conclusion
- Monitoring and/or evaluation of subsidized projects is always a must, but especially when project results are not easy to specify.
- Re-designing the follow-up of cluster projects offers opportunities for improving administrative efficiency.
- Re-designing project monitoring and evaluation can direct the initiatives towards a more goal-oriented approach: by setting the targets that are monitored and evaluated, the focus of the projects is defined.