Author(s): Susan Harris-Hümmert
Publisher: VS Verlag
Year: 2011
Language: English
Pages: 301
Cover......Page 1
Evaluating Evaluators......Page 4
ISBN 9783531177830......Page 5
Acknowledgments......Page 6
Abstract......Page 8
Contents......Page 10
List of Figures......Page 16
List of Tables......Page 18
Glossary......Page 20
1 Introduction......Page 22
1.1 Thesis rationale......Page 23
1.2 Research questions......Page 24
1.3 Methodological and theoretical background......Page 26
1.4 Outline of the study......Page 27
1.5 Contribution of the study......Page 28
2 Evaluation and quality in German higher education......Page 30
2.1.2 Concepts of the university in brief......Page 31
2.2 Educational Science in German higher education......Page 34
2.3 Notions of ‘quality’ in higher education......Page 36
2.4 Stakeholders in HE and their attitudes towards accountability......Page 40
2.5 Definitions of evaluation......Page 43
2.6 The habitus of the evaluator......Page 46
2.7 Professors within the dichotomy of egalitarianism and elitism......Page 48
2.8 The influence of Dutch evaluation methodology in German HE......Page 50
2.9 An analysis of the German HE evaluation landscape......Page 51
2.10 Political factors in association with German HE evaluation......Page 53
2.11 The influence of PISA and the Bologna Process on German HE......Page 54
2.12 The evaluation landscape in Germany......Page 55
2.13 Evaluation agencies and the significance of their launch date......Page 57
2.14 The history of the......Page 58
2.15 Evalag’s evaluation methodology......Page 59
2.16 The impact of higher education evaluation in Germany 2000-2004......Page 60
2.17 Background to evaluation of......Page 61
2.18 Summary......Page 63
3.1.1 Phenomenology and decision-making theory......Page 64
3.2 Using qualitative methods......Page 68
3.2.1 Case study method......Page 69
3.3 Reason for not conducting pilot study......Page 70
3.4 Gaining overarching access......Page 71
3.5 Archival research......Page 72
3.5.1 Methods......Page 73
3.6 Reason for conducting retrospective analysis......Page 74
3.7 Method of interviewing agency employees......Page 75
3.7.1 Method of interviewing evaluators......Page 76
3.7.3 Choice of interview partner / sampling......Page 77
3.7.4 Interviewing elites: culture and the researcher......Page 78
3.7.6 Ethical considerations regarding interviews......Page 81
3.7.7 Method of interview data analysis......Page 83
3.8 Summary......Page 85
4.1 First steps into documentary research......Page 86
4.2 The historical background preceding evaluation 2003/2004......Page 88
4.3 Preparing the......Page 94
4.4 The Ministry’s criteria for selection of experts......Page 98
4.5 The context preceding the constitutional meeting 16 April 2003......Page 100
4.6 The constitutional meeting 16 April 2003......Page 103
4.7 The compilation of questionnaires A and B for the self-report......Page 107
4.7.1 Professor Phillips’s comments on the draft questionnaire of 10 April 2003......Page 108
4.8 Interim organisational details......Page 113
4.9 Preparations in advance of the plenary meeting 25 September 2003......Page 115
4.9.1 The plenary meeting 25 September 2003......Page 119
4.10 Towards determining evaluation criteria for......Page 122
4.11 Interim correspondence and proceedings prior to onsite visits......Page 124
4.12 The Ulm Declaration 2001......Page 126
4.13 The allocation of experts to HEIs for onsite visits......Page 127
4.14 Developing evaluation criteria for judging the quality of research......Page 130
4.15 A typical timetable for HEI visits; onsite reflections......Page 133
4.16 The post-onsite plenary meeting 10-11 December 2003......Page 136
4.17 The sub-committee meeting 16 January 2004......Page 139
4.18 The plenary meeting 27 February 2004......Page 141
4.19 The Achtenhagen/Schelten statement 23 March 2004......Page 144
4.20 The final meeting with HEI representatives 1-2 April 2004......Page 145
4.20.1 Discussions with representatives from PH Weingarten......Page 147
4.21 The editorial meeting 29 April 2004......Page 151
4.22.2 Productivity and the practical relevance of research......Page 153
4.23 Early career academic support......Page 154
4.25 Criteria for judging the structures of teaching and learning provision......Page 155
4.26 Summary......Page 156
5.1 Special areas of knowledge, age of experts, and other experience......Page 160
5.2 Role of non-specialist expert......Page 162
5.3 Prior knowledge of......Page 163
5.4 Familiarity with other colleagues / previous networks......Page 164
5.5 How experts viewed their selection......Page 168
5.6 Reasons for participating in the evaluation......Page 169
5.6.1 Knowledge of institutions......Page 170
5.6.3 Accommodation of evaluation into existing schedules......Page 171
5.6.4 Experience of being evaluated......Page 173
5.7 The significance of the first plenary meeting......Page 174
5.8 The extent of ministerial involvement......Page 176
5.9 Experts’ opinions on how HEIs perceived the evaluation......Page 181
5.10.1 The foreign experts......Page 183
5.10.2 The chairperson......Page 186
5.10.3 The sub-committee......Page 188
5.10.4 Jürgen Baumert......Page 189
5.11 Summary......Page 190
6.1 Working towards evaluation criteria......Page 192
6.1.1 Frameworks for quality......Page 195
6.1.2 Considerations for ‘first-time’ evaluations......Page 198
6.1.3 Camps and hierarchies: consequences for establishing evaluation criteria......Page 199
6.1.4 Influences from The Netherlands......Page 203
6.1.5 Publications and impact factor......Page 206
6.1.6 Research funding and notions of quality......Page 209
6.2 Summary......Page 212
7.1 Techniques for working with large amounts of information: the self-reports......Page 214
7.1.1 Self-reports and the art of seeing through façades......Page 216
7.2 Attitudes to gaps in the statistics......Page 219
7.3 Numbers of students at the onsite meetings......Page 221
7.4 Compiling reports......Page 223
7.5 Discrepancies between discussions and the final report......Page 225
7.6 How the experts thought they were perceived by the HEIs......Page 228
7.6.1 Self-concept of the evaluators......Page 230
7.6.2 Knowledge of the field and publishing experience......Page 231
7.6.3 Empathy with academic staff......Page 232
7.7 Gender issues......Page 234
7.8 Difficulties with HEIs......Page 236
7.9 What the experts learnt from the evaluation......Page 239
7.10 Summary......Page 244
8 Review of the evaluation, critical discussion, and conclusions......Page 246
8.1 Ministerial contract, impact on evaluation and appointed experts......Page 247
8.2 Connectivity of experts and differing paradigms......Page 248
8.4 Statistical problems......Page 250
8.5 Experiences onsite and strategies for dealing with specific situations......Page 251
8.6 Exploiting particular expertise......Page 252
8.7 Problems of size......Page 253
8.8 Attitudes to selection, evaluation experience, and status......Page 254
8.9.1 Evaluator types......Page 256
8.10 Evaluatory impact and perception of experts......Page 257
8.11 Conclusions......Page 258
8.13 Implications for policy and future research......Page 260
9.1 Introduction of evaluation standards and the ‘Exzellenzinitiative’......Page 262
9.2 Organising quality assurance in HE......Page 264
9.3 Evaluations in Erziehungswissenschaft......Page 265
9.4 Changing role of......Page 266
9.5 A vision for the future?......Page 267
References......Page 268
Additional Internet References......Page 279
Appendix......Page 282
B Questions for evaluators......Page 284
C Interview participants, place, and date of interview......Page 286
D Questionnaires A and B (final version)......Page 288
E The Ulm Declaration (Ulmer Erklärung zur berufsbezogenen Gymnasiallehrer-Ausbildung an den Landesuniversitäten von Baden-Württemberg)......Page 296
F The Schelten / Achtenhagen statement......Page 300