But I Read It in the Papers

Whether it’s a boy gone missing after his parents took away his Xbox or a school shooting in Germany, videogames are often found in the company of tragedy. What begins as an off-hand quote about the victim or perpetrator’s gaming habits quickly spirals into a media-fueled crusade against the “true” cause of the misfortune. Letter-writers, opinion columnists, politicians and activists on all sides are quick to cite research to support their claims, but their arguments may be built on a shaky foundation if their information comes from newspaper reports.

Many people, including the researchers themselves, doubt whether the analysis presented in newspapers, which face huge budget cutbacks, shrinking readership and declining standards, is still reliable.

“I think they’re kind of giving people, their audience, what the audience wants,” says psychologist Dr. Christopher Ferguson. “Probably the majority of newspapers are sold to people who don’t play videogames, who don’t see the value in them and may be suspicious of them already. So newspapers have to kind of market to the audience they’re trying to sell to.”

Ferguson has been studying the effects of videogames on human behavior as an assistant professor at Texas A&M International University, arguing there is little or no evidence that gaming causes real-world violence. He says his research doesn’t get the same attention from newspapers as studies suggesting games have negative effects.

“Fear kind of sells,” he says. “Negative messages tend to get a lot more attention than the positive messages do.”

But psychologist Dr. Douglas Gentile, who studies videogames as an assistant professor at Iowa State University and as Director of Research for the National Institute on Media and the Family, says newspaper reporters are too worried about presenting both sides of a debate that, in his view, the “videogames cause violence” side has conclusively won.

“We haven’t trained reporters very well how to tell quality science from junk science,” Gentile says. “Where this matters is in the ‘get both sides of every story’ rule that reporters do seem to follow pretty darn well. … The joy of science is that at a certain point there aren’t two sides. The world isn’t both flat and round. We now know the right answer.”

Despite their opposing viewpoints on whether videogames cause violence, Ferguson and Gentile agree that reporters often lack the necessary scientific knowledge to properly evaluate research, ask good questions and write comprehensive stories. Ferguson points out that reporters don’t typically explain the procedures researchers use to measure behavior in videogame studies. “What people hear in the research is, ‘Videogames cause aggression,’ and they immediately start to think of little kids kicking each other or punching each other or shooting each other. You know, they think of all these horrible, violent acts, and what they don’t realize is that 99 percent of the time that’s not what the studies are looking at. They’re having people filling in the missing letters of words. They’re having people give little noise bursts to each other that are not painful, and it’s not anything that the majority of us would consider to be violent behavior.”

Ferguson also believes reporters aren’t aware of other factors he’s researched that could affect a study’s results, like exposure to domestic violence, personality and genetics. “The psychological community in general has done a very poor job of informing people about the limitations of our research,” he says, “and I think that psychological studies in general come out sounding a lot more sure of themselves than they really ought to be.”

It’s common practice for psychologists to identify the problems, limitations and possible errors of their own research in a discussion section at the end of a published study. “Any good scientist will tell you exactly where the flaws are in his own study,” says Gentile. “We just want to get the best scientific information out to the people who can use it.” While this section is an important part of the scientific process, it is rarely addressed in newspaper articles.

But if newspaper reporters aren’t putting research studies in the right context, or they’re leaving out important information, it might not be their fault. Smaller newsrooms and heavier workloads mean one of the biggest problems may be that reporters aren’t even writing the stories credited to them.

“I was very surprised how many news sources pulled directly from the university’s press release,” says Andrew Przybylski, a psychology graduate student at the University of Rochester who led a recent group of studies on whether violent content increased players’ enjoyment of a game. “I expected the articles to vary more widely based on different readings of the actual article we published.”

Susan Hagen, part of the University of Rochester’s communications department, says she has no problem with journalists poaching from her press releases, which she says are always checked by the researchers involved before they’re sent out. Communications staff at Texas A&M International and Iowa State said they take similar precautions.

A story for Canada’s financially troubled Canwest News Service about Przybylski’s studies appeared in numerous newspapers across the country with a byline given to David Wylie. Most of the article is cut and pasted from the University of Rochester’s press release, including direct quotes plus sentences and phrases used to describe the studies. The article never names the press release as a source, implying that Wylie spoke to Przybylski and his co-authors directly or quoted from the actual study. Attempts to contact Canwest for comment received no response.

Gentile also worries whether newspapers have become too reliant on press releases for their content. “There’s a serious problem with the lack of funding for good investigative journalism, and so many papers are now relegated to just running the press release,” he says. “They should make a few calls to make sure that it’s credible and that the study really does show the thing that we’re claiming it does. But that takes time, and that’s of course the problem.”

He also points out that game journalists are often guilty of the same behavior, uncritically printing positive stories about videogames based on similar press releases.

Przybylski says many of the media reports he read that were based on the press release skipped or oversimplified some important information from his research. “We were very careful to say that violence is frequently woven into the creative narratives of games,” he says. “We were testing if adding more violence added to enjoyment, and if violence was a consistent motivator.”

Przybylski’s results suggested fun came mostly from satisfying the psychological needs of competence and autonomy. Newspaper reports said things like “Violence is not a major reason why people play computer games” (Australian Associated Press), even though Przybylski’s studies focused on whether gamers got additional enjoyment from violent content, not why they were playing in the first place. A Daily Telegraph story’s lead said Przybylski’s studies showed “players like the adventure rather than the blood and gore,” even though Przybylski never uses the word “adventure” in his actual published report. Both of these stories also incorporate substantial portions of the Rochester press release.

There are other reasons why journalists shouldn’t rely on press releases as their sole sources of information. Schools choose which research will receive the press release treatment based on whether it will catch an editor’s eye, not necessarily by the scientific value of the research. “Many studies – I would probably say ‘most’ – are very specialized, [and] the findings are only of interest to specialists in the field,” says Hagen by email. “Fortunately for us, many of our psychological studies, however, are accessible, and so make good stories for the general public. As the largest form of entertainment in the world today, everyone knows what a videogame is and everyone is interested in psychological findings about them.” More frequent newspaper coverage of videogame studies than of other, less popular topics can make it seem like there is more research, and more consensus, than there really is.

Other factors affect newspaper coverage as well. Ferguson says a press release from a smaller school like Texas A&M International won’t get as much attention as one from a bigger university like Iowa State. Similarly, researchers who have been doing their work for many years, such as Dr. Craig Anderson, have become entrenched in reporters’ Rolodexes and will get called for comment much more often, which limits the range of opinions most stories express. Gentile wonders whether newspapers neglected to cover a recent study of his on how cooperative games can foster cooperative behavior simply because the press release was sent out later in the week, as opposed to a Monday or Tuesday when newspapers are typically more starved for content.

So, when newspaper readers unfold a page and settle their eyes on a story about videogame research, they may not be getting the best information possible. They may be getting a regurgitated press release that’s missing the important caveats and limitations that are an essential part of any scientific report. They may be getting an imprecise account of the research that oversimplifies the complicated methods and results the researchers described. And they may not receive the proper context or expert opinion to properly evaluate the study’s importance.

What’s the best course of action, then?

Ferguson advises people to read everything carefully. “It’s just too easy for values or opinions or that kind of stuff to infuse social sciences. Part of that is just because our standards of evidence and the statistics we use are very weak,” he says. “I think it would be great for the general populace to know that and be more cynical about all results they get, including me. I’m not saying I’m immune to any of this. If I make people more cynical about my results as much as they become more cynical about everybody else’s, that’s great. People should be much more cautious about interpreting results from the social sciences.”

Chris LaVigne studied psychology as an undergraduate at Simon Fraser University where he wrote a paper about Freudian interpretations of Star Wars called “Sometimes a Lightsaber is Just a Lightsaber.”

