Formative Assessment in Ghana

Every day in Ghana we were woken at 6 am precisely – not by an alarm clock (we never needed one) but by the combination of sunrise, chickens crowing, and the neighbour’s radio. After breakfast we would walk downhill on a dirt track to Oblogo School, passing through the recycling yard and waving to the workers busy breaking up plastic chairs. At the school the children would be lining up to hand over the small amount of cash for that day’s teaching, and we would go up to the Omega Schools office on the first floor. We would work through till 4pm and then climb back up the hill to our flat, with local children calling out ‘Obruni!’ (white man) as we passed. Back home we would first jump in the shower and then relax on the balcony to watch the sunset, drink in hand. We would speculate about the probability of a power cut, hoping that tonight we might be able to cook in the light and maybe watch TV.

This was our life for a year (2011-12) while my wife and I were working as volunteers with a group of schools in Ghana. I was mainly developing formative assessment systems; Sandie was helping with this as well as quality assuring their teaching materials. There were a lot of positives for us – the people were lovely, the weather was warm and at weekends we could go to the local beach. The work itself, however, could be challenging, mainly due to problems with the lack of infrastructure and experienced staff.

Formative assessment

Formative assessment (also known as ‘assessment for learning’) is based on the idea that testing is not valuable unless it leads to improved teaching and learning. Omega Schools already had end-of-term tests, the results of which were reported to parents. When we arrived they were in the process of introducing mid-term tests in English, maths and science, using multiple choice questions. My task was to develop a system to make good use of these results to provide high-quality feedback to teachers and school managers (the term used for headteachers) and help to promote pupils’ learning.

All the tests were developed by subject specialists who were not assessment experts. Our first task was to improve the quality of those assessments, so that the results became more meaningful. As well as checking the questions and providing feedback to the subject specialists, we ran training courses in the principles of test development. Our colleagues showed great interest in these sessions, but some of the questions we saw later made us doubt that they had fully understood the points we had made.

Providing feedback

The main focus of our work, however, was collecting students’ results and providing detailed feedback to teachers and school managers in a way which was comprehensible and useful to them.

Feedback for teachers

For teachers we produced three different reports, based purely on the children’s test scores in each subject and not comparing their results to those from other schools. In the first report we showed the scores for each pupil in each subject, and whether they were significantly above (↑) or below (↓) the class average. Pupils above average in all three subjects were marked as ‘needing extra challenge’ and those below average in all three as ‘needing extra help’. Children who were above average in two subjects and below in the other were also identified, so that the reason for their poorer result in the third subject could be explored.

Figure 1: Example of Teacher Feedback 1
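For readers interested in the mechanics, the sketch below shows one way flags like these could be computed. It is illustrative only: the subject column names are hypothetical, and the threshold for counting a score as ‘significantly’ above or below the class average (here, one class standard deviation) is an assumption, since the criterion actually used is not documented in this article.

```python
# Illustrative sketch of the first teacher report's flags.
# Column names are hypothetical; the one-SD threshold for a
# "significant" difference from the class average is an assumption.
import pandas as pd

SUBJECTS = ["english", "maths", "science"]

def flag_pupils(scores: pd.DataFrame) -> pd.DataFrame:
    """scores: one row per pupil, one column per subject."""
    flags = pd.DataFrame(index=scores.index)
    for subj in SUBJECTS:
        mean, sd = scores[subj].mean(), scores[subj].std()
        flags[subj] = scores[subj].map(
            lambda x: "↑" if x > mean + sd else ("↓" if x < mean - sd else "")
        )
    n_up = (flags[SUBJECTS] == "↑").sum(axis=1)
    n_down = (flags[SUBJECTS] == "↓").sum(axis=1)
    flags["note"] = ""
    flags.loc[n_up == 3, "note"] = "needs extra challenge"
    flags.loc[n_down == 3, "note"] = "needs extra help"
    flags.loc[(n_up == 2) & (n_down == 1), "note"] = "explore weak subject"
    return flags
```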

In the second teacher report, we gave a table showing each child’s results on each item – either correct (C), wrong (X), or wrong with a question mark (?). The last indication was based on a model predicting item results from total test score. If the pupil got a wrong answer on an item the model predicted they should have got right, this was indicated with a question mark. The idea was that the teacher should explore why the pupil got the unexpected result, and whether there was a problem with their understanding of that particular aspect of the subject.

Figure 2: Example of Teacher Feedback 2
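The prediction model is not spelled out here, so the sketch below substitutes one plausible choice: a logistic regression of item correctness on total test score, fitted separately per item, with a wrong answer flagged ‘?’ whenever the predicted probability of success exceeded 0.5. Function and variable names are hypothetical.

```python
# Sketch of the "unexpected wrong answer" flag in the second report.
# The model form is assumed (per-item logistic regression on total score);
# the model actually used is not documented in this article.
import numpy as np
from sklearn.linear_model import LogisticRegression

def flag_items(responses: np.ndarray) -> list[list[str]]:
    """responses: pupils x items matrix of 0 (wrong) / 1 (correct)."""
    totals = responses.sum(axis=1, keepdims=True)
    flags = np.where(responses == 1, "C", "X").tolist()
    for j in range(responses.shape[1]):
        y = responses[:, j]
        if y.min() == y.max():  # every pupil right (or wrong): nothing to model
            continue
        p = LogisticRegression().fit(totals, y).predict_proba(totals)[:, 1]
        for i in range(responses.shape[0]):
            if responses[i, j] == 0 and p[i] > 0.5:  # expected right, got wrong
                flags[i][j] = "?"
    return flags
```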

The third teacher report showed the proportion of pupils in the class getting each item right, with a brief description of the item content. The intention was to give teachers an idea of which items, and hence areas of the curriculum, caused the most problems for their pupils and might require further work.

Figure 3: Example of Teacher Feedback 3

Feedback for school managers

To produce feedback for school managers, we created ‘Omega scores’ for each subject, which were standardised to a mean of 50 and standard deviation of 10 across all pupils. Reports showing average Omega scores for each year group and subject, and comparing them with the overall average, enabled school managers to see how their results stacked up against other schools’.

Figure 4: Example of School Manager Feedback
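The standardisation itself is a simple linear transform; a minimal sketch (function and array names are illustrative):

```python
# 'Omega scores': raw subject scores rescaled to mean 50, SD 10
# across all pupils in the chain.
import numpy as np

def omega_scores(raw: np.ndarray) -> np.ndarray:
    """raw: raw scores for one subject across all pupils in all schools."""
    return 50 + 10 * (raw - raw.mean()) / raw.std()
```

By construction, a year group or school averaging above 50 in a subject is above the all-pupil average, which is what made the comparisons in these reports meaningful.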

We even produced a ‘league table’ plot of all schools’ results in ascending order, with a large arrow pointing to their own school – this became known to us as the ‘Hand of God’ plot.

Figure 5: Example of ‘Hand of God’ Plot
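Only the general design of the plot is described above, so the following matplotlib sketch is an approximation; the data structure and labelling are assumptions.

```python
# Approximate sketch of the 'Hand of God' plot: all schools' mean
# Omega scores in ascending order, with an arrow singling out one school.
import matplotlib.pyplot as plt

def hand_of_god(school_means: dict[str, float], own_school: str) -> None:
    ordered = sorted(school_means.items(), key=lambda kv: kv[1])
    names, means = zip(*ordered)
    plt.bar(range(len(means)), means)
    i = names.index(own_school)
    plt.annotate("Your school", xy=(i, means[i]),
                 xytext=(i, max(means) + 5),
                 arrowprops=dict(arrowstyle="->", lw=2), ha="center")
    plt.xticks(range(len(names)), names, rotation=90)
    plt.ylabel("Mean Omega score")
    plt.tight_layout()
    plt.show()
```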

The limitation of this feedback will be immediately obvious. All of the scores computed were relative, and had to be, because we had no objective standard against which to measure pupils, classes or schools. Nevertheless, the feedback enabled school managers to see the strengths and weaknesses of particular classes in certain subjects, and (in theory at least) helped them identify teachers who needed support.

Use of feedback

Formative assessment needs two elements to work successfully: high-quality, relevant and timely feedback, and intelligent use by teachers to improve and focus their teaching. We believe the feedback we produced was both relevant and high quality – it was certainly more detailed and specific than anything we have seen elsewhere. It was as timely as we could make it, given that three stages were involved:

  • Collecting the pupil scores from schools
  • Analysing the data
  • Printing and collating the feedback.

Data analysis was relatively quick, but collecting the scores from schools was a slow and surprisingly complicated process. Printing and collating the feedback also took time, as each individual teacher received a bound booklet containing all of the reports relevant to him/herself, as well as a simply worded explanation of what the tables meant, and how the information could be used to direct pupils’ learning.

We did not rely entirely on written explanations. We ran training sessions for teachers, and for school managers, explaining the use and purpose of the feedback, and answering any questions they might have. Further, while the system was being developed, teachers and managers were interviewed and questioned about the usefulness of the feedback; some reports were dropped or modified on the basis of their comments. In general interviewees seemed pleased with the feedback and claimed that they found it helpful; but many struggled when asked to give specific examples of how it had been used.

Problems encountered

In operating the system, several problems were encountered, some of which have already been alluded to above.

  • It was very hard to get data recorded and entered to a standard that could be used without a great deal of checking and recoding. Getting school staff to use consistent pupil identifiers was a nightmare, not helped by pupils’ names being spelled differently each time, and by first and second names often being transposed.
  • It was emphasised many times (by ourselves, and by the senior managers of Omega Schools) that the purpose of the feedback was to help teachers to teach their classes, and individual pupils, more effectively. Nevertheless, many teachers remained convinced that the real purpose was to judge their work, and that they would be in trouble if their pupils did not obtain the highest scores. (As a result, there was some evidence of maladministration – I developed statistical methods of detection, but that is another story.)
  • Before we left I trained staff to run the system we had set up, but unfortunately they all left shortly afterwards. This highlights another issue in developing countries: staff mobility, allied with a critical lack of suitably skilled personnel to operate sophisticated systems.

Ideas for the future

When we joined Omega Schools in 2011, there were only ten schools in the chain, but the number expanded rapidly: by spring 2014 (when I did some analysis for them from home) it had increased to nearly 40. Although a larger number of schools and pupils may enhance the value of the analysis, at that scale it becomes impracticable to run a system like the one described above. The process of collecting data from schools, and producing feedback booklets, becomes far too cumbersome, expensive and time-consuming.

Ultimately, I believe the only feasible way to get accurate and timely formative data in this environment is to maximise the use of IT: pupils need to do the assessments on screen, the data needs to be sent to head office electronically, and the results communicated to teachers and managers the same way. However, this would require a large investment in tablets or equivalent devices to allow all children to do the tests in a reasonable period of time. Given that investment, a high-quality electronic data collection, analysis and feedback system could be developed to provide formative information for teachers in such schools.

We really enjoyed our year in Ghana, and met some wonderful people. I am convinced there is potential for good formative assessment systems to help improve teaching and learning.

Dougal Hutchison’s CV

EDUCATION

PhD, University of London Institute of Education, 1999

MSc (Statistics), London School of Economics, 1971

BSc (Mathematical Science), University of Edinburgh, 1967

LANGUAGES

French: spoken reasonable, written basic

Spanish: basic

German: basic

GENERAL QUALIFICATIONS:

As Chief Statistician, I was responsible for the statistical and psychometric procedures used by NFER, the organisation with the largest educational research turnover in the UK.  I have worked on the design and analysis of a large number of national and international studies of educational attainment. Examples of my experience are:

  • I was in charge of the statistical analysis of the National Child Development study (1958 Cohort), and was responsible for a number of statistical innovations, including the introduction of survival analysis techniques.
  • In addition, I have extensive familiarity with multilevel modelling/hierarchical linear modelling in the analysis of educational data.  This has involved mainly MlwiN, but also Mplus. This has included two Economic and Social Research Council-funded projects looking at applications and technical aspects of multilevel modelling.
  • I carried out the NFER’s first school effectiveness/value added analysis in 1991, on primary school reading using multilevel modelling.
  • I was a consultant (1999) to the Department for Education and Employment on the statistical and psychometric validity of the proposed International Study of Life Skills (ILSS).
  • I worked on the design and analysis of some large national and international studies, e.g. the 1986 Assessment of Performance Unit Language survey, the England and Wales sample for the 1990 International Association for the Assessment of Education Progress Mathematics and Science survey, the Trends in International Mathematics and Science Study (TIMSS), and, more recently, PISA and PIRLS.  The studies as implemented included complex sampling plans and it was necessary to estimate sampling errors, design effects and intracluster correlations for the data.  I have produced SPSS macros to calculate sampling errors for such studies, and these are in general use at the NFER for this purpose.
  • I was responsible for, and did most of, the analyses for the standardisation of the British Ability Scales, a battery of 21 basic scales extending over an age range of 2:6 to 17:11, involving over 3000 computer runs.
  • I have had very extensive experience of SPSS and MlwiN, and some experience of SAS, R and Mplus.
  • My current research interests include measurement error, computer marking of essays, non-response and randomised controlled trials.

Since retiring, I have worked at the National Centre for Research and Testing, Delhi, in a capacity-building role, analysing and reporting on the large-scale (100,000-plus) National Achievement Surveys (NAS), first in Year V, and later in Year VII/VIII.

I am also working on an RCT in social work practice at the University of Bedfordshire.

PROFESSIONAL

In addition to formal statistical qualifications, I have been a member of the Council of the Royal Statistical Society, and various attached sub-committees.

EXPERIENCE

1986-2010       Chief Statistician, NFER

1976-85         Principal Statistician, National Children’s Bureau

1974-1976       Lecturer in Economic and Social Statistics, University of Kent at Canterbury

1971-1974       Research Associate, Centre for Educational Sociology, University of Edinburgh

ADDITIONAL CONSULTANCY EXPERIENCE: UK AND INTERNATIONAL

1967-68         Volunteer teacher of Mathematics and Science, Botswana

1977            Member, BERA Task Force on Research Careers in Education

1979-82         Council Member, Royal Statistical Society (RSS)

1980-83         RSS Council representative on Local Groups committee

1980-84         RSS Council representative on Social Statistics committee

1983-84         RSS President Appointing committee

1988            Technical advisor: Elton Committee on Discipline in Schools

1989-90         RSS Working Party on Official Statistics

1993            ALCD Fellow, Multilevel Modelling Project, Institute of Education

1995-present   Fellow, Multilevel Modelling Project, Institute of Education and University of Bristol

1997            DfES Advisory Group on proposed International Survey of Life Skills

1998-9          TIMSS International Technical Advisory Group

2002-3          Consultant, National Literacy Study, University of Malta

2002-3          Training the National Centre for Education Examinations and Evaluation (NCEEE) in Item Response Theory

2003-5          Consultant: Metropolitan Police on staffing distribution formula

2003-8          Independent member of the Office for National Statistics Quality and Methodology Programme Board

2003-4          Consultant: Home Office on assessment of initiatives

2004            Consultant: Metropolitan Police on Value Added systems for assessing performance

2005            Teaching multilevel modelling to post-graduate students at the Norwegian University of Science and Technology, Trondheim

2006-7          Consultant: London Local Government Association on financial distribution formula

2006            Advisor to the Catholic University of Chile and the Chilean Government Education Department on Value Added

2007            Sampling Error Consultant to Vietnam 2007 World Bank Survey of Student Achievements in Mathematics and Vietnamese Reading

2008            Teaching longitudinal data analysis via multilevel modelling to post-graduate students at the Norwegian University of Science and Technology, Trondheim

2010-2012       Research Associate, Department for International Development, University of Oxford

2011-2012       Consultant, National Assessment Survey Year V and Year VIII, India

2012-date       Tilda Goldberg Research Centre, University of Bedfordshire

I have published over 60 books, chapters and journal articles.  A listing of some relevant recent ones is attached.

BOOKS

HUTCHISON, D. and STYLES, B. (2010) A guide to running randomised controlled trials for educational researchers. Slough: NFER.

HUTCHISON, D. and BENTON, T. (2009) Parallel Universes and Parallel Measures: Estimating the Reliability of Test Results. Ofqual: HMSO.

SCHAGEN, I, RUDD, P, RIDLEY, K., JUDKINS, M. HUTCHISON, D. and JONES, G. (2005) Statistics Commission Report No. 26: School Education Statistics: User Perspectives. London: Statistics Commission.

MIFSUD, C., GRECH, R., HUTCHISON, D. and MORRISON, J. (2004) Literacy for School Improvement: Value Added for Malta. Valletta: Agenda for NFER, Literacy Unit, University of Malta and The Education Division, Malta.

MIFSUD, C., HUTCHISON, D., GRECH, R. & MORRISON, J.  (2004) Improving Literacy in Malta.  University of Malta

BENTON, T., HUTCHISON, D., SCHAGEN, I. & SCOTT, E.  (2003) Study of the Performance of Maintained Secondary Schools in England.  Report for the National Audit Office, November 2003

MIFSUD, C., MILTON, J., BROOKS, G. and HUTCHISON, D. (2000). Literacy in Malta: the 1999 National Survey of the Attainment of Year 2 Pupils. Slough: NFER with the University of Malta.

BROOKS, G., FLANAGAN, N., HENKHUZENS, Z. and HUTCHISON, D. (1998) What Works for Slow Readers? The Effectiveness of Early Intervention Schemes, 100pp. Slough: NFER. ISBN: 0700514805.*

SCHAGEN, I. and HUTCHISON, D. (eds) (1994) How reliable is National Curriculum Assessment?  Slough: NFER.

FOXMAN, D., HUTCHISON, D. and BLOOMFIELD. B. (1991) The APU Experience. HMSO.

JOURNAL ARTICLES AND BOOK CHAPTERS

HUTCHISON, D. and YESHANEW, T. (2010) ‘Using Value Added and Contextualization Techniques in Education’, Proceedings of the First International Conference on Educational Research for Development, College of Education, Addis Ababa University, 224-240.

HUTCHISON, D. (2010)  ‘The Standard Error of Moving Average Smoothed Equipercentile Equating’ Quality & Quantity 44 (4), 783-791.

HUTCHISON, D. (2009)  ‘Automated essay scoring systems’ chapter 48 in Handbook of Research on New Media Literacy at the K-12 Level: Issues and Challenges Tan, L. and  Subramaniam, R. 777-793.  IGI Global.

HUTCHISON, D. and YESHANEW, T. (2009) ‘Augmenting the use of the Rasch model under time constraints’, Quality & Quantity, 43, 5, 717-729.

HUTCHISON, D. (2008) ‘On the conceptualisation of measurement error’ Oxford Review of  Education, 34, 4, 443-461.

SCHAGEN, I and HUTCHISON, D. (2008)  ‘Using Multilevel Modelling in Comparability Studies’, Chapter 9 in Techniques for monitoring the comparability of examination standards  Newton, P. and Goldstein. H. (eds).  London: QCA.

HUTCHISON, D. and SCHAGEN, I. (2007) ‘PISA and TIMSS – are we the man with two watches?’ Chapter 9 in Lessons Learned: What International Assessments Tell Us About Math Achievement. Washington, DC: Brookings Institution Press.

SCHAGEN, I., HUTCHISON, D and HAMMOND, P (2006) ‘League Tables and Health Checks: The Use of Statistical Data for School Accountability and Self-Evaluation’ CIDREE Yearbook 6.

HUTCHISON, D., MIFSUD, C. GRECH, R. and MORRISON, J. (2005) ‘Literacy in Malta: The National Literacy Survey of Year 5 pupils’ Research in Education 73, 36- 52

HUTCHISON, D., MIFSUD, C., GRECH, R., MORRISON, J. & HANSON, J. (2004) ‘The Malta Literacy Value-Added Project- a template for value added in small islands?’ Research Papers in Education 20, 3, 303–345

SCHAGEN, I. & HUTCHISON, D. (2003) “Adding value in educational research – the marriage of data and analytical power”, British Educational Research Journal, Vol. 29, No. 5, pp. 749-765.

HUTCHISON, D. (2003) ‘The effect of Group-Level Influences on Pupils’ Progress in Reading’, British Educational Research Journal, 29, 1, 25-40.

HUTCHISON, D., KENDALL, L., BARTHOLOMEW, D., KNOTT, M., GALBRAITH, J. and PICCOLI, M. (2000). ‘Reliability of reading assessment in three countries’, Quality and Quantity, 34, 353-65.

HUTCHISON, D. (1993) ‘The Assessment of School Effectiveness Using Administrative Data’ Educational Research

Lesley Kendall’s CV

Qualifications

1972                M.Sc Mathematical Statistics, University of Kent at Canterbury

1969                B.Sc Mathematics (first class), University of Kent at Canterbury

Employment history

1999 – 2007    Principal Research Officer (2002 – 2005 Deputy Head), Department of Professional and Curriculum Studies, NFER, Slough, UK

1996 – 1999    Principal Statistician and Deputy Head of Statistics, NFER, Slough, UK

1984 – 1996    Senior Statistician, NFER, Slough, UK

1973 – 1975    Senior Statistician, NFER, Slough, UK

1970 – 1973    Statistician, NFER, Slough, UK

1969 – 1970    Research statistician, BAT Research Establishment, Southampton, UK

Since leaving NFER in 2007, I have worked for NFER as a Research Associate, and carried out a range of research work for an educational charity.

I am a member of the Evaluation Advisory Board of the Education Endowment Foundation (EEF).

Selected publications

Brown, E., Kendall, L., Teeman, D. and Ridley, K. (2004). Evaluation of Excellence in Cities Primary Extension: a Report of the Transition Strand Study. Slough: NFER.

Brooks, G., Gorman, T., Kendall, L. and Tate, A. (1992). What teachers in training are taught about reading: The working papers. Slough: NFER.

Brooks, G., Gorman, T. and Kendall, L. (1993). Spelling it out: the spelling abilities of 11- and 15-year-olds. Slough: NFER.

Chamberlain, T., Lewis, K., Teeman, D. and Kendall, L. (2006). Schools’ Concerns and their Implications for Local Authorities: Annual Survey of Trends in Education 2006 (LGA Research Report 5/06). Slough: NFER.

Easton, C., Edmonds, S., Kendall, L., Lee, B., Pye, D. and Whitby K. (2003). Tracking the Progress of Investors in People in Schools (DfES Research Brief 406). London: DfES

Easton, C., Knight, S. and Kendall, L. (2005). Annual Survey of Trends in Primary Education: Survey of 2004 (LGA Research Report 8/05). Slough: NFER.

Eames, A., Benton, T., Sharp, C. and Kendall, L. (2006). The Impact of Creative Partnerships on the Attainment of Young People. Slough: NFER

Kendall, L. (1997). Something in the air: young people’s perceptions of asthma and air quality. Slough: NFER.

Kendall, L., Lee, B., Pye, D. and Wray, M. (2000). Investors in People in Schools (DfEE Research Brief 207). London: DfEE

Kendall, L., O’Donnell, L., Golden, S., Ridley, K., Machin S., Rutt, S., McNally, S., Schagen, I., Meghir, C., Stoney, S., Morris, M., West, A. and Noden, P. (2005). Excellence in Cities: the National Evaluation of a Policy to Raise Standards in Urban Schools 2000-2003 (DfES Research Report 675a). London: DfES.

Kendall, L., O’Donnell, L., Golden, S., Ridley, K., Machin S., Rutt, S., McNally, S., Schagen, I., Meghir, C., Stoney, S., Morris, M., West, A. and Noden, P. (2005). Excellence in Cities: the National Evaluation of a Policy to Raise Standards in Urban Schools 2000-2003. Report Summary (DfES Research Report 675b). London: DfES.

Kendall, L., Rutt, S. and Schagen, I. (2005). Minority Ethnic Pupils and Excellence in Cities: Final Report (DfES Research Report 703). London: DfES.

Kendall, L., Rutt, S. and Schagen, I. (2005). Minority Ethnic Pupils and Excellence in Cities: Final Report (DfES Research Brief 703). London: DfES.

Kendall, L., Morrison, J., Sharp C. and Yeshanew T. (2008). The Impact of Creative Partnerships on Pupil Behaviour. London: Creativity, Culture and Education.

Kendall, L., Morrison, J., Yeshanew, T. and Sharp, C. (2008). The Longer-Term Impact of Creative Partnerships on the Attainment of Young People: Results from 2005 and 2006. Final Report. London: Creativity, Culture and Education

Knight, S., Taggart, G. and Kendall, L. (2006). Annual Survey of Trends in Secondary Education: Report for 2005 (LGA Research Report 1/06). Slough: NFER.

Lewis, K., Kendall, L. and Teeman, D. (2005). Evaluation of the East Midlands Broadband Consortium: Connectivity in Schools. Findings from an Online Survey. Slough: NFER.

Lewis, K., Kendall, L. and Teeman, D. (2005). An Evaluation of the East Midlands Broadband Consortium: Summary of Findings from an Online Survey of Schools. Slough: NFER.

Morris, M., Rutt, S., Kendall, L. and Mehta, P. (2008). Narrowing the Gap in Outcomes for Vulnerable Groups: Overview and Analysis of Available Datasets on Vulnerable Groups and the Five ECM Outcomes. Slough: NFER.

Ridley, K. and Kendall, L. (2005). Evaluation of Excellence in Cities Primary Pilot 2001-2003 (DfES Research Report 675).London: DfES.

Sharp, C. and Kendall, L. (1996). Discretionary awards in dance and drama: a survey of local education authorities. Slough: NFER.

Sharp, C., Mawson, C., Pocklington, K., Kendall, L. and Morrison, J. (1999). Playing for Success: an Evaluation of the First Year (DfEE Research Report 167). London: DfEE.

Sharp, C., Kendall, L., Bhabra, S., Schagen, I. and Duff, J. (2001). Playing for Success: an Evaluation of the Second Year (DfES Research Report 291). London: DfES.

Sharp, C., Blackmore, J., Kendall, L., Schagen, I., Mason, K. and O’Connor, K. (2002). Playing for Success: an Evaluation of the Third Year (DfES Research Report 337). London: DfES.

Sharp, C., Blackmore, J., Kendall, L., Greene, K., Keys, W., Macauley, A., Schagen, I. and Yeshanew, T. (2003). Playing for Success: an Evaluation of the Fourth Year (DfES Research Report 402). London: DfES.

Tabberer, R., Saunders, L. and Kendall, L. (1997). Raising achievement in Newham schools: a review by NFER on behalf of Newham LEA. Slough: NFER.

Wilson, R., Schagen, S. and Kendall, L. (2007). Evaluation of Birmingham Local Authority’s Action Research Strategy (Research Summary). Slough: NFER.

Wilson, R., Sharp, C., Shuayb, M., Kendall, L., Wade, P. and Easton, C. (2007). Research into the Deployment and Impact of Support Staff Who have Achieved HLTA Status. London: TDA.

Wilson, S., Benton, T., Scott, E. and Kendall, L. (2007). London Challenge: Survey of Pupils and Teachers 2006 (DfES Research Report 823). London: DfES.

Wilson, S., Benton, T., Scott, E. and Kendall, L. (2007). London Challenge: Survey of Pupils and Teachers 2006 (DfES Research Brief 823). London: DfES.


Sandie Schagen’s CV

Curriculum vitae – Dr Sandie Schagen

Full name                    Sandra Helen Schagen

Date of birth                14 April 1948

Telephone                    0(044) 1494 571426

Mobile                         0(044) 7749 202337

E-mail                          sandie@schagen.co.uk

Qualifications

1985                PhD (Theology) Nottingham University.
(Thesis title: ‘Concepts of resurrection and immortality in Intertestamental Judaism
and in the New Testament’)

1979                PGCE English and religious education, Trent Polytechnic, Nottingham.

1973                BD first class, King’s College, University of London.

Key skills

My varied background has enabled me to develop a wide range of skills and personal attributes, including:

  • The ability to write (or edit) reports, proposals and other documents, using a style and language level appropriate for the intended audience
  • A high level of numeracy, and competence in understanding and interpreting statistical data
  • The ability to scope a research or evaluation project, decide on the most effective methodology and design instruments to achieve the desired outcome
  • Expertise in making research and evaluation findings accessible to a range of audiences, including policymakers, practitioners and the general public
  • A logical, analytical approach to project planning and problem solving
  • An eye for detail, good at identifying errors or inconsistencies
  • Excellent organisational and time management skills, able to prioritise tasks, manage multiple projects successfully and meet deadlines
  • Confidence when giving oral presentations to small groups or large audiences
  • A professional attitude in dealing with clients, discussing contractual matters and negotiating as necessary
  • Good interpersonal skills, effective as a team member or leader, and as a supervisor or line manager.

Posts held since graduation

2013 to date    Consultancy work as a member of ERA

2011-12           Voluntary work at Omega Schools, Accra, Ghana.

2009-10           Freelance work at Ministry of Education, Wellington, New Zealand.

2008-09           Visiting Chief Researcher at NZCER, Wellington, New Zealand.

2001-08           Principal Research Officer at NFER, Slough, UK.

1993-2001       Senior Research Officer at NFER, Slough, UK.

1990-93           Church Relations Officer at Christian Aid, London, UK.

1986-90           Commissioning Editor at Bible Society, Swindon, UK.

1985-86           Teacher at King Edward VII School, Coalville, Leicestershire, UK.

1982                Temporary lecturer in New Testament at Furman University, Greenville, South Carolina, USA.

1977-78           Analyst/programmer at Charnwood Borough Council, Loughborough, Leicestershire, UK.

1975-77           Arts-based program adviser at Westfield College, University of London, UK.

1973-75           Computer programmer at International Computers Ltd (ICL), London, UK.

ERA: Educational Research and Analysis

In 2013 I formed a partnership with three former colleagues from NFER.  Our consultancy is called ERA: Educational Research and Analysis.  We offer extensive experience in qualitative and quantitative evaluation, with an emphasis on statistical methodology.

With ERA colleagues, I have been involved in developing:

  • a Monitoring and Evaluation Framework in conjunction with a proposal for Department for International Development funding for girls’ high schools in Ghana
  • methods of detecting maladministration at key stage 2 for the Department for Education
  • a methodology for evaluating a new mathematics GCSE syllabus for Pearson.

We have recently completed a longitudinal evaluation of summer schools and academic apprenticeships for the Sutton Trust.

Earlier work

 At Omega Schools, Ghana

From November 2011 to September 2012 I worked on a voluntary basis with Omega Schools, a group of low-cost private schools to the west of Accra, Ghana.  My work involved:

  • helping to design analysis and feedback of test results for teachers, school managers and the Omega education team
  • training teachers and school managers in the use of feedback to improve the performance of schools, classes and individual pupils
  • running workshops for school managers and subject specialists on test development, report writing and time management
  • editing and quality assuring teacher guides with detailed lesson plans, chiefly in English and mathematics.

At the Ministry of Education, Wellington

I was employed on a freelance basis at the Ministry, working in Research Division and Schooling Division.  My principal tasks were:

  • producing a synthesis of research and evaluation relating to the implementation of the New Zealand Curriculum
  • the revision of a PISA 2006 science report, which required extensive editing to improve readability and highlight the messages arising for schools and policymakers
  • reviewing research reports, making recommendations for any necessary revision, and writing submissions for the Minister to approve their release
  • writing working papers, one on science achievement in international studies, and one on resilient students in New Zealand (building on findings from an OECD study)
  • scoping the evaluation of current and future ICT professional development initiatives, and recommending the most appropriate methodology.

At NZCER

In April 2008 I undertook a one-year contract as Visiting Chief Researcher at the New Zealand Council for Educational Research (NZCER).  During my time there, I was principal author of two lengthy reports, based on findings from national surveys of principals, teachers, trustees and parents in New Zealand schools.  I contributed two short literature reviews to another major project, one dealing with adult literacy and numeracy, the other with the youth transition to the labour market in New Zealand.

I also led or co-led three smaller projects:

  • analysing the extent of severe behaviour in secondary schools in the Wellington and Hutt Valley regions (based on a teacher survey) for the Post-primary Teachers Association (PPTA)
  • exploring (via focus groups held by videoconferencing, and an online survey) the experience of students learning in virtual classrooms, for the Ministry of Education
  • investigating the factors which encourage young people to undertake higher education courses in science, engineering or technology (again using an electronic questionnaire) for the Institute of Professional Engineers New Zealand (IPENZ).

In these projects I had main or sole responsibility for writing the proposal, dealing with the client, designing the research instruments and analysis, and writing the reports.

I was also a member of the Research Management Team at NZCER, and contributed to a number of discussions from the perspective of having managed similar work in a different organisation.

At NFER

 In 1993 I joined the staff of the National Foundation for Educational Research (NFER), the largest independent UK agency specialising in educational research.  During the next 15 years I gained extensive experience in both quantitative and qualitative research methods.  As a Principal Research Officer, I led and managed projects relating to a wide range of educational issues.  I also had two management roles:

  • I was responsible for managing staff allocations for the whole of the Slough Research Group, which comprised over 30 researchers and six administrative staff, engaged at any given time in some 30-40 projects.
  • I was a Research Staff Manager, with line management responsibility (direct or indirect) for a group of six researchers.

The rest of my time was spent as Project Director for a number of projects, which involved supervising the research teams, dealing with contractual matters, advising on research strategies and instruments, editing reports when necessary, and ensuring that quality assurance procedures were followed and deadlines met.

In 2001 I took over responsibility for NFER’s expanding health education portfolio.  We subsequently tendered successfully for a wide range of health education and promotion research projects, including:

  • a large-scale annual survey of smoking, drinking and drug use among young people, conducted jointly with the National Centre for Social Research
  • an evaluation of the Drug, Alcohol and Tobacco Training Package for teachers
  • an evaluation of the impact of the A PAUSE sex and relationships education programme
  • an evaluation of the National Healthy School Standard, a joint project with the Thomas Coram Research Unit
  • a longitudinal evaluation, and then a further evaluation, of the School Fruit and Vegetable Scheme, in collaboration with nutritionists from Leeds University.

Much of NFER’s work involves the evaluation of educational initiatives, often for government departments but also for local authorities, education-related agencies or corporate clients.   Projects in this category under my supervision included:

  • the evaluation of the accelerated Key Stage 3 initiative for the then DfES
  • a survey of the implementation of the national entitlement to language learning at Key Stage 2, also for the DfES
  • an evaluation of the Community Leadership Strategy, for the National College for School Leadership
  • a number of evaluation projects relating to the development and implementation of the Face 2 Face With Finance resources for NatWest Bank/The Royal Bank of Scotland
  • the evaluation of the Learning Money Matters programme, for the Personal Finance Education Group (pfeg).

I have a particular interest in school management and systems, and while at NFER I researched topics such as the organisation of school sixth forms, and primary-secondary continuity and progression.  I was also involved in a number of studies designed to assess the impact on attainment of different types of school and school systems.

Selected publications

Below I give the references for a small selection of my publications, to illustrate the range of my interests.

McMeeking, S., Smith, R., Lines, A., Dartnall, L. and Schagen, S. (2003). Evaluation of the Community Development Programme in Financial Literacy and Basic Skills (Summary Report). London: Basic Skills Agency.

Noden, P., Rutt, S., Schagen, S. and West, A. (2007). Evaluation of the Two Year Key Stage Three Project (DfES Research Report 836). London: DfES.

Page, M., Schagen, S., Fallus, K., Bron, J., De Coninck, C., Maes, B., Sleurs, W. and Van Woensel, C. (2005). Cross-Curricular Themes in Secondary Education. Report of a CIDREE Collaborative Project. Brussels: CIDREE.

Regan, C. and Schagen, S. (1998). Teaching Justice: a Research and Conference Report on Contemporary Social Issues in the Curriculum. Ireland: Network of Curriculum Development Units in Development Education; Slough: NFER.

Rudd, P., Lines, A., Schagen, S., Smith, R. and Reakes, A. (2004). Partnership Approaches to Sharing Best Practice (LGA Research Report 54). Slough: NFER.

Schagen, S. (2008, September 26). National standards — a good idea? New Zealand Education Review. p. 7.

Schagen, S., Davies, D., Rudd, P. and Schagen, I. (2002). The Impact of Specialist and Faith Schools on Performance (LGA Research Report 28). Slough: NFER.

Schagen, S. and Hipkins, R. (2008). Curriculum changes, priorities and issues: Findings from the NZCER secondary 2006 and primary 2007 national surveys. Wellington, New Zealand: NZCER.

Schagen, S., Johnson, F. and Simkin, C. (1996).  Sixth Form Options: Post-compulsory Education in Maintained Schools. Slough: NFER.

Schagen, S. and Kerr, D. (1999). Bridging the Gap? The National Curriculum and Progression from Primary to Secondary School. Slough: NFER.

Schagen, S. and Lines, A. (1996). Financial Literacy in Adult Life: a Report to the NatWest Group Charitable Trust. Slough: NFER.

Schagen, S. and Schagen, I. (2002). The Impact of the Structure of Secondary Education in Slough: Final Report. Slough: NFER.

Schagen, S. and Wylie, C. (2009). School resources, culture and relationships:  Findings from the NZCER secondary 2006 and primary 2007 national surveys.  Wellington, New Zealand: NZCER.

Spielhofer, T., Walker, M., Gagg. K., Schagen, S. and O’Donnell, S. (2007). Raising the Participation Age in Education and Training to 18: Review of Existing Evidence of the Benefits and Challenges (DCSF Research Report 012). London: DCSF.

Spielhofer, T., O’Donnell, L., Benton, T., Schagen, S. and Schagen, I. (2002). The Impact of School Size and Single-Sex Education on Performance (LGA Research Report 33). Slough: NFER.

Weston, P. and Schagen, S. (1995). The Post-16 Experience of Full-time Students (Cohort Study of TVEI Extension Students: Briefing No.8). Sheffield: ED.

Weston, P., Schagen, S., Lines, A. and MacDonald, A. with Hutchison, D., Hewitt, D. and Self, T. (1995). The Impact of TVEI on Young People. Final Report of the Cohort Study of TVEI Extension Students. Sheffield: ED.

Ian Schagen’s CV

Qualifications

  • BA in Natural Sciences (Class 2 division 1) from Cambridge University, 1968.
  • MSc in Statistics (Distinction) from Birkbeck College, University of London, 1974.
  • PhD as a member of academic staff from Loughborough University of Technology, 1982. Thesis title: ‘Theory and Applications of Multidimensional Stationary Stochastic Processes’.
  • Fellow of the Royal Statistical Society and Chartered Statistician.

Career History

1968-77 Petroleum Reservoir Engineer with British Petroleum Co. Ltd, in Great Yarmouth, Sunbury-on-Thames and London.
1977-86 Lecturer in Statistics and Operations Research in the Computer Studies Department at Loughborough University of Technology.
1982-83 Exchange Professor in the Mathematics Department of Furman University, Greenville, South Carolina, USA.
1986-96 Senior Statistician at the National Foundation for Educational Research (NFER).
1996-2008 Head of Statistics at NFER.
2008-2009 Chief Research Analyst, Research Division, Ministry of Education, New Zealand
2009-2010  Chief Research Analyst, Assessment & Qualifications Team, Schooling Group, Ministry of Education, New Zealand
2011-2012 Volunteer consultant with Omega Schools, Greater Accra, Ghana, helping to develop systems for analysing and reporting assessment data
Nov 2013-date Educational and statistical consultant with ERA Educational Research and Analysis

Details of Recent Positions

In April 2008 I retired as Head of Statistics and manager of the NFER’s Statistics Research and Analysis Group, responsible for fourteen qualified statisticians and for the analysis of all data connected to the Foundation’s research projects.

From April 2008 until April 2009 I worked with the Research Division of the New Zealand Ministry of Education, as Chief Research Analyst. My role involved capacity building for analysts in the division, helping with the analysis of international and other datasets, and advising on assessment and general educational issues.

In September 2009 I returned to New Zealand and the Ministry of Education, to work this time in the Assessment and Qualifications team of the Schooling Group, handling both technical and policy issues, mainly concerned with the introduction of the new National Standards.

In October 2011 I went to work as a volunteer with Omega Schools, a group of low-cost schools on the western outskirts of Accra. I worked with them for a year, developing systems to analyse the results of regular assessments and present them to teachers and school management in ways which would enhance teaching and learning. I worked with local staff to improve their capacity to carry out such work independently.

In November 2013 I joined up with three other highly experienced ex-colleagues from NFER to form ERA Educational Research and Analysis, a partnership of educational and statistical consultants. We have been engaged to provide expert assistance in proposal development for DfID and ESRC, as well as providing psychometric backup for NFER. I was also engaged by Omega Schools to produce feedback for teachers and school managers based on the mid-term tests taken in February 2014.

Selected Publications

SCHAGEN, I. (1990). ‘A method for the age standardisation of test scores’, Applied Psychological Measurement, 14, 4, 387–93.

SCHAGEN, I.  (1990). ‘Analysis of the effects of school variables using multilevel models’, Educational Studies, 16, 1, 61–73.

SCHAGEN, I.  (1991). ‘Beyond league tables. How modern statistical methods can give a truer picture of the effects of schools’, Educational Research, 33, 3, 216–22.

SCHAGEN, I.  (1993). ‘Problems in measuring the reliability of National Curriculum assessment in England and Wales’, Educational Studies, 19, 1, 41–54.

SCHAGEN, I. (1994). ‘Graphical representation of the reliability of National Curriculum assessment.’ In: HUTCHISON, D. and SCHAGEN, I. How Reliable is National Curriculum Assessment? Slough: NFER.

SCHAGEN, I.  (1994). ‘Multi–level analysis of the key stage 1 National Curriculum assessment data in 1991 and 1992’, Oxford Review of Education, 20, 2, 163–71.

HUTCHISON, D. and SCHAGEN, I. (1994). How Reliable is National Curriculum Assessment? Slough: NFER.

SCHAGEN, I. and HUTCHISON, D.  (1994). ‘Measuring the reliability of National Curriculum assessment’, Educational Research, 36, 3, 211–21.

SAINSBURY, M., SCHAGEN, I. and WHETTON, C. (1995). ‘Issues in the evaluation of standard assessment tasks and the reliability of National Curriculum assessment’, British Educational Research Journal, 21, 2, 237–40.

SCHAGEN, I. (1995). ‘In the premier league’, Education Guardian, 23 May.

SCHAGEN, I. and SAINSBURY, M.  (1996). ‘Multilevel analysis of the key stage 1 National Curriculum assessment data in 1995’, Oxford Review of Education, 22, 3, 265–72.

SCHAGEN, I. (1997). ‘Value added taxes the statisticians’, Times Educ. Suppl., 4210, 7 March, 14.

BROOKS, G., SCHAGEN, I. and NASTAT, P. (1997). Trends in Reading at Eight: a Report on the 1995 Survey of Reading Attainment in Year 3 in England and Wales. Slough: NFER.

SAINSBURY, M., WHETTON, C., MASON, K. and SCHAGEN, I.  (1998). ‘Fallback in attainment on transfer at age 11: evidence from the Summer Literacy Schools evaluation’, Educational Research, 40, 1, 73–81.

SCHAGEN, I. and MORRISON, J. (1998). QUASE: Quantitative Analysis for Self–Evaluation. Overview Report 1997: Analysis of GCSE Cohorts 1994 to 1996. Slough: NFER.

SCHAGEN, I. and WESTON, P.  (1998). ‘Insights into school effectiveness from analysis of OFSTED’s school inspection database’,  Oxford Review of Education, 24, 3, 337–44.

SCHAGEN, I.  (1998). ‘Statistics for dummies’, Managing Schools Today, 8, 3, 33–4.

SCHAGEN, I.  (1999). ‘An inexact science’, Managing Schools Today, 8, 5, 24–6.

SCHAGEN, I.  (1999). ‘Large can – not many worms: an evaluation of the role of age–standardised scores in the presentation of assessment data’, British Educational Research Journal, 25, 5, 691–702.

SCHAGEN, I.  (1999). ‘Testing, testing testing’, Managing Schools Today, 8, 4, 28–9.

SCHAGEN, I.  (1999). ‘True lies’, Managing Schools Today, 8, 6, 42–3.

SCHAGEN, I. and MORRISON, J.  (1999). ‘A methodology for judging departmental performance within schools’, Educational Research, 41, 1, 3–10.

SCHAGEN, I., SAINSBURY, M. and STRAND, S.  (1999). ‘Statistical aspects of baseline assessment and its relationship to end of key stage one assessment’, Oxford Review of Education, 25, 3, 359–67.

SAINSBURY, M., SCHAGEN, I., WHETTON, C. and CASPALL, L.  (1999). ‘An investigation of hierarchical relationships in children’s literacy attainments at baseline’, Journal of Research in Reading, 22, 1, 45–54.

SCHAGEN, I. (2000). Statistics for School Managers. Westley: Courseware Publications.

FELGATE, R., MINNIS, M. and SCHAGEN, I.  (2000). ‘Some results from the analysis of data from the National Numeracy Project’, Research Papers in Education, 15, 2, 163–84.

SCHAGEN, I.  (2002). ‘Attitudes to citizenship in England: multilevel statistical analysis of the IEA civics data’, Research Papers in Education, 17, 3, 229–59.

SCHAGEN, I. (2002). ‘Well, what do you know?’ Education Journal, 63, 27–8.

SCHAGEN, I., KENDALL, L. and SHARP, C.  (2002). ‘Measuring the success of “Playing for Success”’, Educational Research, 44, 3, 255–67.

SCHAGEN, I. and GOLDSTEIN, H. (2002). ‘Do specialist schools add value? Some methodological problems’, Research Intelligence, 80, 12–15.

SCHAGEN, I & HUTCHISON, D. (2003) “Adding value in educational research – the marriage of data and analytical power”, in British Educational Research Journal, Vol. 29, No. 5, pp. 749-765.

SCHAGEN, I. and SCHAGEN, S.  (2002). ‘A fair comparison: selective v. comprehensive education’, Education Journal, 60, 26–7, 30.

SCHAGEN, I. and SCHAGEN, S. (2002). ‘What kind of school is best?’ NFER News, Autumn, 10.

SCHAGEN, S., DAVIES, D., RUDD, P. and SCHAGEN, I. (2002). The Impact of Specialist and Faith Schools on Performance (LGA Research Report 28). Slough: NFER.

SHARP, C., KENDALL, L. & SCHAGEN, I. (2003) “Different for girls? An exploration of the impact of Playing for Success”, in Educational Research, Vol. 45, No. 3, pp.309-324.

SCHAGEN, I. and SCHAGEN, S.  (2003). ‘Analysis of national value–added datasets to estimate the impact of specialist schools on pupil performance’, Educational Studies, 29 , 1, 3–18.

SCHAGEN, I. & SCHAGEN, S. (2003) Evidence to House of Commons, in Secondary Education: Diversity of Provision, 4th Report of Select Committee on Education, http://www.publications.parliament.uk/pa/cm200203/cmselect/cmeduski/94/94.pdf

SCHAGEN, I. & SCHAGEN, S. (2003) “Analysis of national value-added datasets to assess the impact of selection on pupil performance”, in British Educational Research Journal, Vol.29, No.4, pp.561-582.

SCHAGEN, I. & SCHAGEN, S. (2003) “Analysis of national value-added datasets to estimate the impact of specialist schools on pupil performance”, in Educational Studies, Vol. 29, No. 1, pp.3-18.

SCHAGEN, I. (2004) “Data – does it really speak for itself?”, in TOPIC, Spring 2004, Issue 31.

SCHAGEN, I. (2004) “Multilevel analysis of PIRLS data for England”, in Proceedings of the IRC-2004 (IEA International Research Conference), Nicosia, Cyprus, Volume 3, pp.82-102.

SCHAGEN, I. (2004) “Further statistical analysis of the IEA Civics data to investigate attitudes to citizenship in England”, in Proceedings of the IRC-2004 (IEA International Research Conference), Nicosia, Cyprus, Volume 4, pp.115-125.

SCHAGEN, I. (2004) “Presenting the Results of Complex Models – Normalised Coefficients, Star Wars Plots, and Other Ideas”, in But What Does it Mean? The Use of Effect Sizes in Educational Research (editors Schagen, I. & Elliot, K.). Slough: NFER, pp.25-41.

SCHAGEN, I. & ELLIOT, K. (2004) But What Does it Mean? The Use of Effect Sizes in Educational Research (editors). Slough: NFER

SCHAGEN, I. and SCOTT, E. (2004). ‘Study of the performance of maintained secondary schools in England’, NFER News, Spring, 7.

SAINSBURY, M. & SCHAGEN, I. (2004) ‘Attitudes to reading at ages nine and eleven’, in Journal of Research in Reading, Vol. 27, Issue 4, pp.373-386.

TWIST, L., GNALDI, M., SCHAGEN, I. & MORRISON, J. (2004) ‘Good readers but at a cost? Attitudes to reading in England’ in Journal of Research in Reading, Vol. 27, Issue 4, pp.387-400.

GRAY, M., SCHAGEN, I., & CHARLES, M. (2004) ‘Tracking pupil progress from Key Stage 1 to Key Stage 2: how much do the ‘route’ taken and the primary school attended matter?’, Research Papers in Education, 19, 4, 389-413.

SHARP, C., SCHAGEN, I. & SCOTT, E. (2004) Playing for Success: The Longer Term Impact. A Multilevel Analysis. London: DfES (Research Report 593).

RUDDOCK, G., STURMAN, L., SCHAGEN, I., STYLES, B., GNALDI, M. & VAPPULA, H. (2004) Where England Stands in the Trends in International Mathematics and Science Study (TIMSS) 2003. London: DfES. See: http://www.nfer.ac.uk/research/timms/full_report.pdf

SCHAGEN, I. (2004) “Improving schools through data. Part 2: How is it going to work?”, in Managing Schools Today, November/December 2004, pp.49-51.

SCHAGEN, I., BENTON, T. & RUTT, S. (2004) Study of Attendance in England: Report for the National Audit Office [online].

SCHAGEN, I. & SCHAGEN, S. (2005) ‘The impact of faith schools on pupil performance’, in Faith Schools: Consensus or Conflict? (Gardner, J., Cairns, J. & Lawton, D. eds) London: RoutledgeFalmer.

GNALDI, M.; SCHAGEN, I., TWIST, L. & MORRISON, J. (2005) “Attitude items and low ability students: the need for a cautious approach to interpretation”, in Educational Studies, Vol. 31, No. 2, pp. 103-113.

SCHAGEN, I., RUDD, P., RIDLEY, K., JUDKINS, M., JONES, G. & HUTCHISON, D. (2005) Review of Schools Education Statistics. Report to Statistics Commission.

SCHAGEN, I & SCHAGEN, S. (2005) “Combining multilevel analysis with national value-added data sets – a case study to explore the effects of school diversity”, in British Educational Research Journal, Vol. 31, No. 3, pp.309-328.

NODEN, P. and SCHAGEN, I. (2006). ‘The Specialist Schools Programme: golden goose or conjuring trick?’ Oxford Review of Education, 32, 4, 431–448.

SCHAGEN, I. & SCHAGEN, S. (2006). ‘Using national value-added datasets to explore the effects of school diversity.’ In: Hewlett, M., Pring, R. and Tulloch, M. (Eds) Comprehensive Education: Evolution, Achievement and New Directions. Northampton: Centre for the Study of Comprehensive Schools.

SCHAGEN, I. (2006) ‘The use of standardized residuals to derive value-added measures of school performance’, in Educational Studies, Vol.32, No.2, June 2006, pp.119-132.

SCHAGEN, I. (2006). ‘Statistical literacy is the essential skill for educational managers’, Education Journal, 98, 21.

SCHAGEN, I.  & RIDLEY, K . (2006) ‘How good are our educational statistics?’, in Practical Research for Education, Issue 35, May 2006, pp. 62-67.

SCHAGEN, I. & SAVORY, C. (2006) “Post-16 Education: shining a light on the effects of different patterns of provision”, in Education Journal, Issue 95, June 2006, pp.33-35.

SCHAGEN, I. (2006). ‘The devil is in the detail: what can experience tell us about the use of data to facilitate the new relationship with schools?’ In: Department for Education and Skills Research Conference 2005: ‘Putting the Evidence into Education, Skills and Children’s Well-being’ (Research Report CR2005) [online]. Available: http://www.dfes.gov.uk/research/data/general/CR2005.pdf  [7 August, 2006].

SCHAGEN, I., LOPES, J., RUTT, S., SAVORY, C. and STYLES, B. (2006). Evaluating Patterns of Post-16 Provision [online]. Available: http://nfer.ac.uk/publications/pdfs/downloadable/esq.pdf  [5 July, 2006].

SCHAGEN, I. and BENTON, T. (2006). ‘Don’t judge schools purely on raw attendance data or examination results’, Education Journal, 96, 30–31.

SCHAGEN, I. (2006). Review of Data Systems Underpinning DfES SR2004 PSA Targets. London: DfES. [online]. Available:  http://www.dfes.gov.uk/rsgateway/DB/STA/t000700/SFR2004_PSA_TargetV1.pdf

SCHAGEN, I. and STYLES, B. (2006). ‘Value-added analysis is useful: a response to Gorard’, Research Intelligence, 96, 9.

SCHAGEN, I. and HUTCHISON, D. (2006) ‘Comparisons between PISA and TIMSS – Are we the Man with Two Watches?’, keynote address at IEA Research Conference, Washington DC, November 2006.

SCHAGEN, I., HUTCHISON, D. and HAMMOND, P. (2006). ‘League Tables and Health Checks: The Use of Statistical Data for School Accountability and Self-evaluation.’ In: DOBBELSTEIN, P. and NEIDHARDT, T. (Eds) Schools for Quality – What Data-based Approaches Can Contribute. Brussels: CIDREE/CVO.

Schagen, I., Lopes, J., Rutt, S., Savory, C. and Styles, B. (2006)  Evaluating patterns of post-16 provision. London: Learning and Skills Network.

Schagen, I. ‘National and international performance data and government policy’, presented in the symposium ‘Optimising the impact of research on policy’ at BERA 2007, November 2007, London University Institute of Education.

Schagen, I. and Hutchison, D. (2007). ‘Comparisons between PISA and TIMSS – we could be the man with two watches’, Education Journal, 101, 34-35.

Schagen, I. (2007). ‘Why ‘data’ is singular’, RSS News, 35, 3, 1-2.

Schagen, I. (2007). ‘Comments on ‘Cognitive ability and school improvement”, Practical Research for Education, 38, October 2007, 34-36.

Hutchison, D. and Schagen, I. (2007). ‘Comparisons between PISA and TIMSS – are we the man with two watches?’ In: Loveless, T. (Ed) Lessons Learned: What international assessments tell us about math achievement. Washington, D.C.: Brookings Institution Press.

Schagen, I. and Hutchison, D. (2008). ‘Multilevel modelling methods.’ In: Newton, P., Baird, J., Goldstein, H., Patrick, H. and Tymms, P. (Eds) Techniques for monitoring the comparability of examination standards. London: QCA.

Schagen, I. and Hutchison, D. (2008). ‘Response to commentaries on Chapter 10.’ In: Newton, P., Baird, J., Goldstein, H., Patrick, H. and Tymms, P. (Eds) Techniques for monitoring the comparability of examination standards. London: QCA.

Hutchison, D. and Schagen, I. (2008) ‘Concorde and discord: the art of multilevel modelling’ in International Journal of Research and Method in Education, 31, 1, 11-18.

Schagen, I. (2008) ‘What evidence is there from TIMSS about changes in the relationship between school decile and attainment?’, in in evidence, issue 16th July 2008, 1-3.

Schagen, I. and Lawes, E. (2009) ‘Well-being and educational level’, Education Counts Factsheet. Wellington: Ministry of Education.

Tooley, J., Dixon, P., Shamsan, Y. and Schagen, I. (2010). ‘The relative quality and cost-effectiveness of private and public schools for low-income families: a case study in a developing country.’ In School Effectiveness and School Improvement, 21, 2,  117-144.