WP7 Quality and evaluation

From POERUP - Policies for OER Uptake
Revision as of 15:55, 3 August 2011 by Pbacsich


Month 1 through Month 30 (30 months)

The Université Nancy 2 will be responsible for setting up the evaluation and quality framework. The project will use both internal evaluation (operational evaluation by the project partners, similar to that carried out in Re.ViCa and currently in VISCED) and external evaluation (e.g. by the experts of the International Advisory Committee, possibly augmented by other experts, since not all IAC members are experts).

Internal evaluation

Systems and tools will be set up to assess the project's progress and results and to improve its overall effectiveness. An evaluation plan will therefore be created at the very beginning of the project, in agreement with all partners, and implemented throughout the project lifecycle. The plan will consist of an evaluation strategy that defines the exact tools to be used and specifies how, when and by whom the evaluation activities will be arranged.

All partners involved in the POERUP project will contribute to the operational evaluation process (e.g. evaluation of workshops, of the project itself, and of partners' meetings): by completing specific questionnaires provided by the Evaluation WP leader, by drawing up and implementing action plans where necessary to address weaknesses highlighted by the evaluation, and, at the end of the project, by taking part in a final assessment. The WP leader will also provide a final evaluation report.

External evaluation

For the external evaluation and the validation of the project's research results, the International Advisory Committee members will provide feedback after each meeting, both through questionnaires and during the numerous discussion sessions held at IAC meetings. This work will be done in close collaboration with Sero (leader of the Exploitation WP) so that the evaluation tools set up for each key meeting are in line with:

  1. the goals of the event
  2. the event audience, and
  3. the requirements for the outcomes submitted to experts for assessment.