The University of Cambridge’s examinations arm has been tasked with developing a standard test to measure the “learning gain” of students at English higher education institutions.
A large-scale pilot of the test of critical thinking and problem-solving skills, in which approximately 27,000 undergraduates at nine universities will be invited to participate, is due to get under way this autumn.
The Higher Education Funding Council for England, which announced plans for the test last September, said that Cambridge English Language Assessment, part of the university-owned Cambridge Assessment, had won a tender to develop the exam.
Hefce intends to use the results of the new test alongside survey data to judge the improvement in skills and competencies made by students during their time at university.
If the pilot is successful, and the test is rolled out across England, its results could also become a key measure of institutional performance in the teaching excellence framework (TEF).
Alongside the national pilot, Hefce is funding 12 projects at more than 70 universities and colleges to test measures of learning gain, drawing on data such as test scores, course grades and survey results.
The newly announced exercise is likely to draw upon Cambridge English Language Assessment’s existing test of critical thinking and problem solving, which is used to assess applicants to some courses at Cambridge, the University of Oxford and University College London.
A tender document published by Hefce reveals further details of how the national pilot is likely to operate, explaining that UK-domiciled school-leavers will be tested on four occasions: at the start of their degree this November, and then at the end of their first, second and third years of study.
The test will be required to assess undergraduates from all disciplines and will be a multiple-choice exam, delivered using online survey software.
The test, which should take no longer than 30 minutes to complete, will not include questions that measure aptitude through numerical reasoning.
“We are currently working to finalise the various elements of the study and expect to announce full details in November,” a Hefce spokeswoman said.
Interest in learning gain has been sparked by concerns that degree classification, even when considered alongside A-level scores, is an insufficient indicator of students’ progress.
It is hoped that the results of a nationally administered standardised test could be used by students to demonstrate their ability to employers, and by institutions to identify which teaching methods work best.
But the incorporation of the assessment into the TEF would be controversial, with doubts over how well a generic exam can measure the progress of students who are following discipline-specific courses.
Evidence from Brazil, one of the few countries to operate national standardised university exams, has highlighted significant difficulties in persuading students to take the test and to try their best at it.
Baroness Wolf of Dulwich, Sir Roy Griffiths professor of public sector management at King’s College London, has warned that standardised tests are “completely unable” to measure university performance, since some disciplines would be “far more closely related” to what the test was measuring than others.
But Chris Rust, emeritus professor of higher education at Oxford Brookes University, argued that the results were likely to be more instructive than data on graduate employment and satisfaction, which the TEF proposes to use to measure student progress.
“If you look at the ludicrous nature of the TEF measures and how appalling they are as indicators of excellent teaching, this has got to be a step in the right direction,” Professor Rust said. “This could be a very useful part of a triangulation of data.”