
My eureka moment: Braving the elements

It takes courage to stand against accepted dogma. Barbara Oakley has risked her life, literally and academically, but believes the search for truth will ultimately find support and succeed

May 20, 2010

When the wind is screaming and the waves are two storeys high, it is easy to die.

I dangled by one arm over the ice bath of the Bering Sea, trapped by a flipped rope ladder between the 1,000-ton weight of the Soviet ship Chasovoi - "The Watchman" - and the neighbouring American vessel, the Mary Jane. A massive rubber bumper kept the two ships apart - for the moment. If the sea rippled just so, my body would be crushed between the bumper and the Chasovoi's steel hull.

The salty spray across my mouth mixed with the sour bile of fear. "Don't think, don't think, don't think, just do it, grab that rope, don't look down."

Impending death has a gut-wrenching way of providing a eureka moment - one of near mystical clarity. For me, it was "Wow! Life can really be dangerous!"

We academics do not usually think about living a life of danger, but Thomas Kuhn, in his seminal The Structure of Scientific Revolutions (1962), subtly alluded to the possibility when he argued that scientific revolutions are generally sparked by the young or by outsiders. Such individuals, he noted, are more likely to be free of disciplinary dogma - the overly pat set of explanations any discipline can inadvertently develop. Scientific theory, Kuhn pointed out, does not evolve directly from an accumulation of facts, but rather from people's changing perceptions of reality.

But how could people perceive reality differently? And what does this have to do with danger?

In 1949, neurologist Antonio Egas Moniz won a Nobel prize for spearheading the frontal lobotomy as a cure for mental illness. His "research" was based on dubious experimentation, with reported results that amounted to wishful thinking. But contemporaries who tried to point out flaws in Moniz's work were silenced. Their criticisms were deemed unpublishable, but not because of their content. Moniz was a scientific lion of such towering repute that the gatekeepers of science - the editors of top journals - had little interest in even considering questions about his work.

Lobotomies thus continued to be performed for decades. Only gradually, after thousands of personalities were irreversibly flattened, did reports of the procedure's flaws begin to slip into print. Eventually, the lobotomy fell into profound disrepute. Researchers' perceptions changed too, but, contrary to Kuhn's hypothesis, that change was based on a gradual accumulation of dissenting facts.

Think this rarely happens? Think again.

Heeding the siren call of John Watson and B.F. Skinner, psychology fell under the cultish sway of radical behaviourism, which meant that only behaviours - not the brain itself - could be studied. George A. Miller described the underpinnings of psychology in the 1950s: "[Behaviorism] ... was the point of origin for scientific psychology in the United States. The chairmen of all the important departments would tell you that they were behaviorists. Membership in the elite Society of Experimental Psychology was limited to people of behavioristic persuasion; the election to the National Academy of Sciences was limited either to behaviorists or physiological psychologists ... The power, the honors, the authority, the textbooks, the money, everything in psychology was owned by the behavioristic school ... those of us who wanted to be scientific psychologists couldn't really oppose it. You just wouldn't get a job."

In a penetrating analysis of the effects of behaviourism, neuroscientist Bernard Baars concluded: "In the upshot, a small but vigorous minority [led by Watson and Skinner] purged psychology and brain physiology of its most central problems for most of the century. By keeping moderates on the defensive they made empirical progress nearly impossible." Criticising behaviourism was the career equivalent of dangling by one arm over icy, crashing seas.

Psychologist Lenore Walker's The Battered Woman (1979) laid out research findings that have shaped the study of violent relationships since the early 1980s. Her conclusion was that battered women are in essence innocent bystanders who play no role in their victimisation; any assertions to the contrary were vilified as a form of blaming the victim.

This led to simplistic policies that often worsen the problem of domestic abuse. As lawyer and professor of social work Linda Mills wrote in Violent Partners (2008): "If the feminist movement is really about empowering women and ending oppression for all, it should be asking whether or not the woman who has called 911 has played any role in the altercation with her husband - whether she, too, ever participates in acts of physical or psychological aggression against her partner."

Walker's methodology is deeply problematic. She draws many of her conclusions from a 30-year-old study of 400 abused women, yet argues that it is unnecessary to recruit a group of non-battered women as comparative controls. Instead, she uses as controls the "normal" relationships of the battered women themselves. This is laughable on the face of it, except to credulous journalists. Indeed, Walker's ability to insert herself into high-profile criminal cases is legendary.

Sporadic criticism of Walker's work has been published, but her standard response is a deflection: an accusation that critics are against battered women. Walker and those who have followed in her footsteps have helped to prevent research on the many different attributes of battered women that may have a bearing on the situation.

Academia is a relatively safe career - at least once you have tenure. Research the right things and it's smooth sailing. But step out of line and you're the lead fish in the deadliest catch.

In 2006, I fell foul of disciplinary dogma when I attempted to publish a book that brought hard science into our understanding of why bad people - from genocidal dictators and conniving business executives to manipulative child-abusing uncles - do what they do. The book, Evil Genes: Why Rome Fell, Hitler Rose, Enron Failed, and My Sister Stole My Mother's Boyfriend (2007), was eventually published to accolades from Steven Pinker, Terrence Deacon, Michael Stone and dozens of other world-renowned researchers. But it was initially turned down by 40 publishers despite its balanced portrayal of environmental and biological factors in shaping personalities. A common theme in the rejections was a resolute unwillingness to accept even a hint of biologically based explanation for horrific behaviour. One editor said: "Everything's relative - many people thought Hitler was a good person."

Hitler was a "good person"? Has everything become so "relative" that even the Holocaust has lost its horror?

Book editors are not alone in such thinking. Psychologists will practically knit themselves into sweaters before admitting that people can be bad - no matter what neuroscience may tell us about a psychopath's lack of empathy and sadistic pleasure in hurting others.

A key theory is that espoused by Philip Zimbardo. Drawing on the results of his famous 1971 Stanford prison experiment, propped up by similar morality plays masquerading as research, Zimbardo asserted that people themselves are not generally bad; instead, there are only bad situations and organisations. In other words, the atrocities perpetrated in Abu Ghraib prison in Iraq had nothing to do with a psychopath-like character being given free rein, but stemmed from powerful situational pressures. (Zimbardo did not go to Iraq to investigate these "powerful situational pressures", but he happily discussed their influence on Abu Ghraib's notorious prison guard, Ivan "Chip" Frederick.) Zimbardo has testified about his findings before Congress - he's the go-to guy for sound bites whenever a horrific mass murder occurs. Much like Moniz, Watson, Skinner and Walker, Zimbardo has a knack for inserting himself into high-profile media.

Burrowing into Zimbardo's work, I was appalled to discover that his research conclusions, which have influenced national policy, were based on a tiny experiment that has not been and could not be replicated. As sociologist Augustine Brannigan observed: "The procedure did not test any hypotheses, identify specific variables, or employ control groups or statistical tests to identify differences in treatment outcomes." The experiment was, in reality, a fiasco where one guard in particular revealed a shocking propensity for sadism. It had to be stopped after six days as it degenerated into a profoundly abusive environment.

Zimbardo is a lion of the psychology establishment. He has served as president of the American Psychological Association, and his popular introductory psychology textbooks have helped shape the attitudes of hundreds of thousands of college students, and thus the journalists whose attention he courts, and the book and journal editors who serve as intellectual gatekeepers. Indeed, through his public, dramatically staged mea culpas, Zimbardo has created an ironic sweet spot for himself where he can accuse anyone who attempts to criticise his work of being an unethical publicity seeker. This is not to say that organisational and situational factors are not important - they are. But by minimising the significance of individual factors in horrific situations, Zimbardo has helped set research in this area back by a generation.

What do these protracted periods of madness within a discipline have in common, aside from making things dangerous for dissenters?

First, the findings, no matter how spurious, are presented as solid science. The lobotomy was touted as a precise surgical procedure, even though it simply involved jabbing an ice pick behind someone's eye and wiggling it around. Similarly, by asserting the supremacy of organisation and situation in his research publications, Zimbardo could discount the importance of a sadistic guard. Much as with a beautifully garnished platter, presenting one's findings as "science" sells.

An apparently credible scientific platform is only part of the danger wrought by research demagogues. Another part is the media. Journalists often have little scientific background. All it takes is a simplistic solution that journalists and their readers can understand easily, touted by a media-savvy self-promoter with a solid academic platform, and voilà! Suddenly, any criticism is no longer simply criticism; instead, it is a tiresome, negativistic rebuttal of what everyone knows is true.

In this way, "research" becomes myth. As with every myth, there is always a hero - in this case, the researcher. And there is also a villain - any would-be critic. But there is a further factor: when the myth-maker also writes textbooks, the next generation of researchers can be led astray as new students are indoctrinated with what everyone thinks is true. The ne plus ultra of this process is when, using the credibility of their academic platform and appealing to "obvious" scientific truths, showboaters are able to drive through legislation that supports their findings. In the face of such crushing clout, who would dare put on the villain's cloak to try to rebut findings? Who but the young and outsiders, who are often unaware of the career dangers?

We like to think of modern science as a reputable affair. But when it becomes a vehicle for power, control, fame or money, it is not.

During the plenary session of the Toward a Science of Consciousness conference in April in Tucson, I asked about the relevance of studies on the brain's consumption of energy to our understanding of intelligence. "We stay away from any focus on intelligence," the speaker snapped, annoyed that I questioned the accepted ban. (Research on intelligence may, after all, be used for racist purposes.) But if we want to give disadvantaged kids a better shot at life, we had damned well better know how to nurture their "smarts" - so we need to understand what "smarts" are made of.

I am a female professor of engineering and I would like to learn why most engineers are male. But I know better than to apply for funding for research that may uncover a cause pointing towards possible differences between male and female brains. These and thousands of other important questions are untouchable as a result of devotion to disciplinary dogma or for reasons of political correctness.

This brings me back to my own story. Despite my current academic perch - I am a tenured engineering professor at Oakland University in Michigan - my approach to academic research is that of an outsider. I arrived in academia far later than most, after a career path that spanned work as a radio operator at the South Pole Station in Antarctica, service as both an enlisted soldier and officer in the US Army, and yes, the job that nearly left me crushed between two big ships - work as a translator on Soviet trawlers on the Bering Sea. My focus on psychological issues makes me a double outsider: I am a systems engineer who applies my knowledge to human systems. This helps me to see not only the disciplinary blinkers and silos that divide the social sciences from the hard sciences, but also the more massive silos that separate academia from the media and from government. I have learned that it is often impossible to understand scientific paradigm shifts without stepping away from academia to look more broadly at the crushing interplay of media and government, and then zooming back in to understand individual factors.

Kuhn was wrong when he claimed that scientific theory does not evolve directly from an accumulation of facts, but rather from people's changing perceptions of reality. In the social sciences, at least, people's perceptions of reality appear to change only with the accumulation of facts from those willing to be vilified for contradicting received wisdom. Scientific change is slow partly because research breakthroughs are genuinely difficult to make. But those breakthroughs are made all the more difficult by mediagenic self-seekers who marshal a variety of forces to make life dangerous for dissenters.

"Don't think, don't think, don't think, just do it, grab that rope, don't look down." My left hand connected with a knot on the edge of the ladder, my fingers tightened around it.

Deep breath. Voice within: Climb! CLIMB!

I was suddenly 30 feet up, at the top of the ladder, husky arms helping me over the bulwark. "Vsyo khorosho? Everything OK?"

My ultimate eureka moment is this - for those who would step outside the box, it can be nearly as dangerous in academic research as it is on the high seas. But even though the path is initially forged alone, in the end, there are helping hands for those who push on, determined to find their way to truth.
