
The Home Office has questions to answer about flawed survey on international students

The Migration Advisory Committee’s survey was so poorly framed that any data would have been unusable, so how was it even approved? asks Tanja Bueltmann

May 23, 2018

When I first saw screenshots of the Migration Advisory Committee’s (MAC) survey intended to assess the impact of international students, I assumed that they were fake. The nature of the framing and the use of loaded questions led me to conclude that this could not possibly be an official survey. Imagine my surprise – and deep concern – when I discovered that not only was the survey real, there were also flaws in its design so fundamental that any data gathered would be unusable. While concerns over questions are important, it was the survey’s design that worried me the most. After all, the survey was launched as part of work commissioned by the Home Office to provide evidence of the impact that international students have, and the results would, therefore, likely have been used to inform policy.

The survey invited students to examine their attitudes towards international students. In itself, there is not necessarily anything wrong with such a survey – although I would always have questioned the framing around “impact”. But from the outset, the MAC survey undermined itself by establishing a context of “us” and “them”, asking respondents whether they were born in the UK or were a British citizen. Such a question is not neutral in a survey such as this.

In light of that general concern, what made the survey so problematic was that it then proceeded to ask respondents to assess the impact of international students and the effects of “being at university alongside” them.

Respondents were asked to do so on a scale from positive to negative. Seeking responses using such a scale automatically forced respondents to problematise international students, and the experience of interacting with them, in a way they might never have done naturally.

Worse: subsequent questions then, in essence, invited respondents to make assumptions about international students, for instance about their number. But how would respondents know? It is not always clear who is an international student unless they self-identify as such. What, for example, if there are a number of British citizens in a class who have Asian heritage? Would a survey participant have assumed that they were international students and then based their response on that?

Yet while these are critical methodological concerns in terms of framing and questions posed, there are much more basic questions about the survey that must be answered. As a historian who carries out interdisciplinary research using methods from the social sciences, I have long relied on surveys in my work. Apart from general ethics clearance of my projects, I had to submit for detailed scrutiny any surveys that I hoped to use. It appears that there was no such scrutiny of the MAC survey – or, if there was, it was not thorough enough to identify the survey’s very basic flaws.

Why, for example, was the survey set up without any verification mechanism, enabling anyone with the URL to complete it? That alone renders the survey unusable because there is no way of knowing who supplied the responses – were they actually students or not? As a result, the survey was automatically open to manipulation should anyone have wanted to attempt it. To make matters worse, it was also possible to fill in the survey multiple times: on an iPhone, all that was required to fill it in again was to clear the cache. Clearly, some form of status checking – for instance, a process requiring respondents to log in with a student email address first – would have been needed to make the survey data usable and to protect their integrity.

In light of all these issues, I welcome the withdrawal of the MAC survey. However, that is not enough. The committee’s immediate response is disappointing because it suggests that there is no interest in a constructive assessment of what went wrong. Instead of apologising for the poorly designed survey and the distress the questions caused to many international students who saw them, the MAC’s response in essence lays the blame on those who identified the survey’s problems, noting that “the survey had potential to show a very positive view of international students…but cannot now be used to add to our evidence base”.

Let’s be clear: it cannot be used because the data are unusable. Consequently, the survey had no potential to show anything, certainly not anything positive. The MAC would do well to provide a more detailed response to what happened so that we can understand not only what went wrong with the survey approval and design, but also why it was deemed acceptable. There needs to be an opportunity for reflection. I would also like to know how other MAC data have been gathered and what the future plans are now for any further evidence-gathering. For us to have confidence in the MAC and the work it does, answers to these questions are essential.

But criticism should not be reserved for the MAC alone. I would like to understand why universities across the UK, and Universities UK specifically, so readily disseminated the survey. With its design flaws so clearly apparent, I am concerned that nobody saw a problem prior to sharing the survey. Let this serve as a warning to us all: at a time in which the hostile environment is alive and kicking, we need to be more alert to the messaging and implications of research focused on immigrants.

Tanja Bueltmann is a professor of history and faculty associate pro vice-chancellor (knowledge exchange) in the Faculty of Arts, Design and Social Sciences at Northumbria University.

Postscript

Print headline: How did a fundamentally flawed survey on international students even get approval?
