L2C: Learning to Collaborate
Continuous Evaluation of the Outputs and Systems Developed (WP 6)
Chiara Frigerio, UCSC
Structure of the Presentation
1. The Evaluation Process
2. First Round Evaluation
   - The Knowledge Community Evaluation
   - The Prototypes Evaluation
3. Evaluation Process Next Steps
Evaluation Process Objectives
- Guide continuous improvements to the development of the project outputs, including testing and validating their effectiveness for users.
- Represent a good indicator of performance, whereby the identification of challenges and needs can be used to improve potential future opportunities for collaboration.
- Represent the criteria for quality assurance of the outputs, also verifying the accomplishment of goals.
- Provide an assessment of the partners' effort invested in the processes of innovation and new knowledge creation, for gauging the value and effectiveness of their efforts.
Outputs to Be Evaluated
The Evaluators
- L2C Project partners, who care about the innovation being introduced and about its effectiveness because of the efforts invested.
- Target users of the project outputs, who represent the intended beneficiaries and users of the research findings.
- Potential future buyers of the project's outputs, who have an interest in the project's success in delivering the intended results (for example, organizations or people whose opinion on the output is essential for its adoption in an organizational context).
- Strategic partners whose feedback in terms of knowledge is essential for project improvements (for example, practitioners or scientists with expert knowledge on a specific topic).
- The European Community, which will provide independent evaluators and reviewers to assess the project's outputs.
Evaluation Methodology
The main reference for this project is the Goal Question Metric (GQM) approach by Basili and Weiss (1984). This framework is based on the assumption that, to measure a project's effectiveness in a purposeful way, it is essential to first specify the goals to be accomplished.
Evaluation Framework/1
To illustrate, suppose one of the main objectives/goals of the Knowledge Community is to be usable. In this case, suitable criteria and metrics to be presented would be: "How intuitive is it for a user to find a contribution?", rated from 1 (very intuitive) to 5 (not intuitive at all), and "How understandable are the menus?", rated from 1 (very understandable) to 5 (not understandable at all).
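To make the goal-question-metric chain concrete, here is a minimal Python sketch of how such a hierarchy and its Likert-scale metrics could be represented and scored. It is illustrative only: the class names, fields and sample ratings are hypothetical, not part of any L2C deliverable.

from dataclasses import dataclass, field
from statistics import mean

# Illustrative GQM structures; all names and data are hypothetical,
# not taken from the L2C project itself.
@dataclass
class Metric:
    question: str                  # e.g. "How intuitive is it to find a contribution?"
    scale: tuple = (1, 5)          # 1 = best, 5 = worst, as in this slide's example
    responses: list = field(default_factory=list)

    def average(self):
        return mean(self.responses) if self.responses else None

@dataclass
class Goal:
    name: str                      # e.g. "The Knowledge Community is usable"
    metrics: list = field(default_factory=list)

    def summary(self):
        return {m.question: m.average() for m in self.metrics}

# Usage: the usability goal from this slide, with its two Likert metrics.
usability = Goal("Knowledge Community usability", metrics=[
    Metric("How intuitive is it for a user to find a contribution?"),
    Metric("How understandable are the menus?"),
])
usability.metrics[0].responses.extend([1, 2, 2])  # invented sample ratings
usability.metrics[1].responses.extend([2, 3])
print(usability.summary())
# {'How intuitive is it for a user to find a contribution?': 1.67,
#  'How understandable are the menus?': 2.5} (values rounded here for display)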
Evaluation Framework/2
Evaluation Perspectives
The evaluation of the IT-based tools will be conducted along two dimensions:
- A technical/technological perspective, which will investigate IT-related dimensions such as usability, functionalities, security and effectiveness of the tools.
- A pedagogical and social perspective, which focuses on factors such as value to, and level of acceptance by, the users.
Users represent relevant actors who will contribute to the continuous improvement of the outputs and will also provide final feedback on the quality and learning value of the tools.
Overall Evaluation Process
Overall Evaluation Approach
Deadlines
Expected Deliverables
Structure of the Presentation
1. The Evaluation Process
2. First Round Evaluation
   - The Knowledge Community Evaluation
   - The Prototypes Evaluation
3. Evaluation Process Next Steps
Deliverable 6.3
D6.3 provides the formative assessment of the first version of the following outputs:
- The ACDT Knowledge Management Tools, from a technical/technological perspective (from Workpackage 2). Period of evaluation: February 2006.
- The specification of the first version of the ACDT Simulation Games Prototypes, from a pedagogical point of view, as presented in D3.1 ACDT Framework, Simulation Scenarios and Design (from Workpackage 3). Period of evaluation: February 2006.
Structure of the Presentation
1. The Evaluation Process
2. First Round Evaluation
   - The Knowledge Community Evaluation
   - The Prototypes Evaluation
3. Evaluation Process Next Steps
17The Knowledge Community Evaluation/1
WHAT first prototype of the system (programming
errors, technical problems, usability and
navigation issues) WHEN August 2006 WHO pool
of experts comprised of FVA and INSEAD internal
usability experts and usability free-lance
experts hired by FVA HOW report of tickets,
usability protocol submission and think-aloud
method
The Knowledge Community Evaluation/2
WHAT: first prototype of the system after technical improvements
WHEN: September 2006, 2nd Consortium meeting in Milan
WHO: L2C Consortium partners
HOW: informal suggestions and brainstorming
The Knowledge Community Evaluation/3
WHAT: first formal assessment of the technical/technological features and functionalities
WHEN: February 2007, after a one-month period of usage
WHO: the Consortium partners
HOW: questionnaire composed of 44 questions in total, of which 5 were open questions, 26 were Likert-scale questions and 13 were yes/no questions
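As a rough illustration of how responses to a questionnaire of this shape might be tallied, the sketch below assumes a simple encoding of the three question types; the question labels and sample answers are invented, not actual L2C data.

from collections import Counter
from statistics import mean

# Hypothetical encoding of the 44-item questionnaire described above:
# 26 Likert items (1-5), 13 yes/no items, 5 open questions.
likert_answers = {"Q1": [2, 1, 3, 2], "Q2": [4, 3, 3, 2]}   # ratings per partner (invented)
yes_no_answers = {"Q27": ["yes", "yes", "no", "yes"]}        # invented sample data
open_answers = {"Q40": ["Left menu requires too much scrolling."]}

# Likert items: report the mean rating per question.
likert_summary = {q: round(mean(r), 2) for q, r in likert_answers.items()}

# Yes/no items: report the share of "yes" responses.
yes_no_summary = {q: Counter(r)["yes"] / len(r) for q, r in yes_no_answers.items()}

print(likert_summary)   # {'Q1': 2.0, 'Q2': 3.0}
print(yes_no_summary)   # {'Q27': 0.75}
# Open questions are typically coded qualitatively rather than averaged.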
System Features Evaluated
The Knowledge Community Qualitative Evaluation
A number of areas were identified for improvement, especially making the community functions easier to use for members who do not belong to the L2C network. The main features that need to be improved are:
- The left menu, which currently requires long scrolling to reach the various sources of information and makes the home or main page appear overloaded with information. A more selective way to filter and present relevant information is needed.
- Some online instructions for new users who will not be familiar with the purpose and logic of the L2C project. First impression management is important.
- The contributions editing functionality.
- A layout or interface which uses more contrasting and intense colors.
- Additional search and better organization functions, since the contents of the knowledge community will increase over time.
- Some additional collaboration tools, which could be inserted in the virtual community area to provide easy-to-use and accessible communication and collaboration opportunities.
Knowledge Community Overview
The assessment of the Knowledge Community shows that, in general, the partners are satisfied with the tool and its functionalities. In their view this is a good starting point, since the community offers the relevant collaboration tools and is intuitive and simple to use. The suggestions and feedback provided will be used to drive future improvements to the Knowledge Community from a technical point of view. They will be discussed among the partners to decide how best to address each specific issue and, in particular, to determine a list of priorities.
Structure of the Presentation
1. The Evaluation Process
2. First Round Evaluation
   - The Knowledge Community Evaluation
   - The Prototypes Evaluation
3. Evaluation Process Next Steps
The Prototypes Evaluation
WHAT: version 1 of each of the simulation game prototypes
WHEN: February 2007
WHO: the Consortium partners, based on their expertise and interest, as determined during the 3rd Consortium meeting in Athens in January 2007
HOW: questionnaire composed of 15 questions in total, of which 1 was an open-ended question and 14 used Likert-scale answers. Of the 15 questions, 6 evaluated the prototypes in general, while the other 9 assessed specific dimensions of each prototype.
EduSynergy
The overall assessment has shown that the EduSynergy prototype offers a learning experience which gives players an opportunity to come into contact with a number of collaboration challenges and dynamics at an organizational level. However, there are still some dimensions which need to be clarified and improved:
- Transferability of the EduSynergy scenario to a wider, non-University/academic audience
- The need to fine-tune the collaboration focus of the simulation
- Addressing the complexities of intra-organizational collaboration
- Realism of the player role
World Team
The overall assessment indicates that the World Team prototype presents detailed key learning points, which consider organisational, group and personal dynamics. Areas for improvement would be:
- Acquisition strategy familiarity
- Further development of the scenario
- Inclusion of a performance indicator to track progress
- Recommendations for controlling communication opportunities within the simulation
Pit Stop
The overall assessment has shown that the Pit Stop design specification is well described and detailed. It provides a good starting experience for discussing the distinctive individual and team behaviours and competences of high-performing teams, and for extending the discussion to team practices and performance within each participant's organization/division. The suggestions for improvement refer to:
- Emphasis on the time factor
- Qualification of key learning points
- More emphasis on theories of stress management
Eagle Racing
The overall assessment has shown that the Eagle Racing prototype is well described and addresses interesting challenges. Further improvements should consider the following suggestions:
- Prioritizing the long list of learning points
- Taking into account the complexity of the decision-making process and its goals
- Avoiding extreme stereotypes
Intermediary Agent
The overall assessment has shown that the Intermediary Agent prototype is quite well described, but there are still some opportunities for improvement concerning the following:
- Identity of the intermediary agent
- Information asymmetry
- Change and collaboration dynamics
Overall Assessment of the Five Prototypes
The prototypes address a broad spectrum of models, collaborative breakdowns, etc. However, in some places further specification is necessary, as suggested by the partners, concerning the following:
- Need for an overall model of learning objectives
- Identification of the top theories
All the improvements previously suggested will be validated with the partners during the piloting phase, Workpackage 4.
Structure of the Presentation
1. The Evaluation Process
2. First Round Evaluation
   - The Knowledge Community Evaluation
   - The Prototypes Evaluation
3. Evaluation Process Next Steps
Evaluation Process: Upcoming Activities
- Summative pedagogical evaluation of the Knowledge Community (final users)
- Summative technological evaluation of the Knowledge Community (after improvements from the first-round evaluation)
- Pedagogical evaluation of the final simulation prototypes (partners)
- Formative technical evaluation of the final simulation prototypes (technical experts and possibly partners)
- The feasibility of the proposed evaluation activities depends on the progress of the WPs.