‘Learning gain’: did students bulk up in mental muscle?

Could US moves to measure undergraduates’ improvements in critical thinking, writing and other skills spread to the UK?

February 19, 2015

Source: Getty/Rex

Brain gain: there is interest in exploring ways of determining how much students have developed intellectually in the course of degree study

Four years ago, the book Academically Adrift: Limited Learning on College Campuses sent a shudder through the US academy.

According to its findings, nearly half of undergraduates showed no substantial improvement in critical thinking, reasoning or writing skills in their first two years of study and were only slightly better on graduation.

The dismal results presented by sociologists Richard Arum and Josipa Roksa caused a sensation, highlighting an apparent neglect of students ill-served by America’s costly higher education system.

That moral challenge to justify the fees charged to undergraduates now appears to have reached the UK, with policymakers taking an interest in the idea of “learning gain” at university.

As part of a wide-ranging review of the performance measures in use, the Higher Education Funding Council for England (Hefce) is considering whether indicators of learning gain – also known as “value-added” – could conceivably be collected from universities.

It has commissioned Rand Europe to carry out a study on the mechanisms available to track the “distance travelled” by students, with the education consultants due to deliver their report this spring.

Of course, many educators will wonder why the higher education sector’s main tool of assessment – the degree classification system – is not seen as a sufficient indicator of an individual’s progress, particularly if considered alongside that student’s A-level scores.

Several university rankings already do this to judge teaching quality, awarding “value-added” points to institutions that manage to help students with low Ucas tariff scores to achieve at least a 2:1.

But if a student with grades of BCC at A level later gains first-class honours, is that always a sign of great teaching or can it indicate “dumbing down”? Does awarding a third to a straight-A student have to reflect badly on a university’s teaching? Or might it suggest that the institution has more rigorous standards?

Degree inflation a spur to action

Many in the sector say that a new set of measures is indeed required to show learning gain because rampant degree inflation has destroyed any comparability of standards between institutions.

Standardised tests can show how much students have developed their key study skills, argued Roger Benjamin, president of the Council for Aid to Education, which provides Collegiate Learning Assessment tests to more than 200 higher education institutions in the US.

“If you do not have some way to compare yourself against another institution, how do you know if you’re doing well?” Dr Benjamin asked.

Speaking at a Hefce conference on learning gain at the Royal Society on 9 February, he added that “professors do not have a clear incentive” to prove that what they are doing in the classroom has been effective.

In the absence of a trusted comparative measure across the sector, employers are simply choosing graduates from elite universities that have their own tough admissions procedures, he said.

“Too many students that go to less selective institutions never get an interview for a job,” said Dr Benjamin, claiming that “badges” to indicate significant learning gain could help to unearth “hidden gems” from less prestigious universities.

He admitted, however, that academics are distrustful of standardised tests, in which students are asked to read a range of materials and write an essay on a real-life scenario, usually on a business-related subject, showing their reasoning.

Critics of the tests – which, controversially, were used in the Academically Adrift study – claim that they are a waste of time and resources as they are easily gamed by students.

Astute academics can quickly teach students how to produce the false dichotomies, straw-man arguments and nonsensical but plausible-sounding platitudes required to succeed in the exam, they contend.

Dr Benjamin defended the rigour of the assessments, which can be graded by computers, but pointed out that they are an alien concept to the UK, which does not have the “meta-domain critical thinking tests” commonly found in the US. University learning in the UK is more focused on deep subject knowledge, rather than the broader range of subjects taught at American liberal arts colleges.

“We should move into developing the tests for the [different] disciplines,” he said.

Another approach to measuring learning gain discussed at the Hefce conference was the use of student surveys.

Some educators contend that the National Survey of Student Engagement (NSSE) used in the US is a reliable proxy for learning gain, as it asks undergraduates to rate their own understanding of a subject as well as their own contributions to classroom discussions. Hefce piloted a modified set of elements from the NSSE in the UK in 2013.

However, Fiorella Kostoris, professor of economics at Sapienza University of Rome, told the conference that there was “no correlation between what students think they are learning and what they are”.

In a study of 6,000 students at Sapienza, 80 per cent thought that they could answer questions to a certain level, but far fewer actually managed to hit these predicted grade boundaries, she said.

Full-spectrum assessment

The “mixed methodologies” approach pioneered by academics from the Center of Inquiry in the Liberal Arts at Wabash College, in Indiana, might prove a more robust way to measure learning gain.

Between 2006 and 2012, researchers there used grades, surveys and standardised tests to examine how much 17,000 or so students had learned at 49 liberal arts colleges in the US.

With about 800 pieces of information on each student, including data on their intended career paths, their moral reasoning and psychological well-being, the Wabash National Study should have proved a treasure trove of information for understanding learning gain.

But the study’s director, Charles Blaich, told the Hefce conference that many institutions took only limited interest in its findings, with 37 per cent of institutions in the project’s first four-year phase making no response.

“Faculty do not like standardised tests as they do not feel they represent what their [teaching] is about,” said Professor Blaich.

He added that this testing process “also requires a lot of transparency that a lot of staff are not used to”.

Professor Blaich also cautioned against using learning gain measures as a way to rank institutions, despite the inevitable pressure to add them to the basket of student satisfaction and graduate outcome measures currently in use in many league tables.

The “differences between the highest [institution] and lowest are often very small, particularly compared to the difference within an institution [at subject level]”, he said of NSSE scores.

However, with universities always keen to show how they deserve taxpayers’ support, a nifty learning gain measure such as this could prove a vital weapon in the battle to hang on to their funding.

jack.grove@tesglobal.com
