(Note: Joe Newman is a lawyer who focuses on privacy and technology at the Future of Privacy Forum. This article is adapted from a larger piece with Joseph Jerome on video games and privacy entitled “Press Start To Track: Privacy And The New Questions Posed By Modern Videogame Technology.” The full version of the paper will be published by the American Intellectual Property Law Association’s Quarterly Journal later this year.)
“Are you a pervert?”
I stared at my TV screen, slightly dumbstruck. Sorry, what?
“Are you a pervert?”
- “I don’t think so.”
- “No use denying it.”
This was a question I was being asked by Catherine, a game about navigating tumultuous relationship drama. Vincent, the game’s protagonist, was struggling to choose between two women: his straight-laced girlfriend Katherine and the mysterious seductress Catherine. After pondering for a moment, I made my choice. I was then shown a pie chart documenting how all other Catherine players had tackled the same question. Pretty cool.
It wasn’t until much later, after I had finished Catherine, that I was struck by a quick and somewhat disconcerting revelation.
I had just told a complete stranger whether or not I believed I was a pervert.
All Catherine had to do was ask me a series of questions: “Are you more of a Sadist or a Masochist?” “Do you have to carefully choose which underwear to wear each day?” “Have you ever cheated before?” and I willingly submitted my information. Information which is now being stored on a random server somewhere, along with data from hundreds of thousands of other players.
What had I just done?
We tend not to think too much about what online connectivity has meant for our privacy as gamers. Before 1986’s The Legend of Zelda on the NES, it was rare for home console games to store any data about their players at all. Today, through a combination of data-gathering sensors and integrations with external data sources, our games are more advanced and connected than ever before. That also means they’re collecting tons of data about us as players. EA, for instance, generates more than 50 terabytes of player data from its games every day.
What types of data can games collect? We’re talking about things like a player’s physical characteristics (including facial features, body movements and voice data), location and nearby surroundings, biometrics, and information gleaned from their social networks, just to start. Additionally, within the game environment itself, data analysts monitor in-game behavior to learn a great deal about a gamer’s mind: their temperament, their leadership skills, their greatest fears, even their political leanings.
It’s not our intention to freak anyone out. If it’s used at all, this player data is generally used by developers to make their games better. Whether it’s eradicating glitches, balancing game mechanics, or delivering personalized storytelling (see the Mass Effect series, The Walking Dead, or even Catherine, as described above), “datafying” games can provide a huge benefit to developers and gamers alike. New and innovative game features, even entire genres of videogames (looking at you, MMORPGs), wouldn’t be possible without exchanging mass quantities of player data.
And yet, there’s the “creepy” problem. People freaked out when Microsoft announced that Kinect 2.0 would be always on, watching players “like a cross between fictional serial killer Buffalo Bill, an actual serial killer, and that robot from The Black Hole.” Gabe Newell had to fend off accusations that the Steam client scanned its players’ web browsing logs. Oh, and also – the NSA is apparently watching you when you play Angry Birds.
Now, perhaps you’re more like some of the people we see on message boards who say, “Privacy is dead. I’ve got nothing to hide – why should I care who has my information?” Privacy means different things to different people, and it’s easy to respond with a kind of resigned shrug when it comes to protecting your data. However, even if you don’t care about random strangers cataloguing your sexual preferences, you might care more about salespeople knowing your in-depth economic tendencies. One data mining company, DeltaDNA, uses a player’s in-game actions to predict their “revenue potential” as well as how likely they are to promote the game to others on social networks. Players who are “scored” as likely to spend money may receive very different offers than players who are at risk of leaving the game.
Targeted offers are one thing, but how would you feel if you found out that your game got artificially harder simply because an algorithm had deduced you might be willing to pay real money for some power-ups? It’s a very slippery and very dangerous slope.
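To make the idea concrete, here is a purely illustrative sketch of how that kind of player scoring might work. Every signal name, weight, threshold, and offer label below is invented for the sake of the example; this is not DeltaDNA’s actual model or any real studio’s code.

```python
# Hypothetical "revenue potential" scoring, for illustration only.
# All signals and weights are invented; no real vendor's model is described here.

def revenue_potential(player: dict) -> float:
    """Score a player from 0 to 1 using a few made-up in-game signals."""
    score = 0.0
    score += 0.4 if player.get("past_purchases", 0) > 0 else 0.0    # has spent before
    score += 0.3 * min(player.get("sessions_per_week", 0) / 10, 1)  # plays often
    score += 0.2 if player.get("opened_store_last_session") else 0.0
    score += 0.1 * min(player.get("friends_invited", 0) / 5, 1)     # promotes the game
    return min(score, 1.0)

def pick_offer(player: dict) -> str:
    """Likely spenders see different offers than players at risk of quitting."""
    score = revenue_potential(player)
    if score > 0.6:
        return "premium_powerup_bundle"   # predicted spender: upsell
    if score < 0.2:
        return "free_retention_gift"      # predicted to leave: win-back offer
    return "standard_offer"

if __name__ == "__main__":
    whale = {"past_purchases": 3, "sessions_per_week": 12, "opened_store_last_session": True}
    casual = {"sessions_per_week": 1}
    print(pick_offer(whale))   # premium_powerup_bundle
    print(pick_offer(casual))  # free_retention_gift
```

The same score that decides which offer you see could just as easily feed a difficulty-tuning system, which is exactly the scenario described above.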
Is there a solution to all this? Well, there are a host of laws that affect what game developers can and can’t do with your data. The Federal Trade Commission actively polices developers who engage in “unfair or deceptive trade practices” involving player data. (The FTC recently cracked down on a mobile flashlight app that shared users’ location and device IDs without their knowledge.) The Entertainment Software Rating Board (ESRB) and hardware providers are also reacting to player concerns about privacy; for instance, Microsoft responded to public criticism about Kinect 2.0 by reversing its policies about the whole “always on” thing.
Regardless, there’s plenty of legal gray area here, and coming up with rules that keep pace with constantly evolving tech is surprisingly tough. To illustrate the difficulty: imagine if there were a rule that games had to provide notice and obtain the player’s consent before collecting and using player data for any purpose. Now think about the famous fourth-wall-breaking moment in Metal Gear Solid when Psycho Mantis “read your mind” (or, more accurately, read the contents of your memory card) and commented on your gameplay habits. Had Metal Gear Solid announced it intended to scan the player’s data and asked for permission to do so, the surprise of Psycho Mantis’s “psychic demonstration” (an enjoyable and ultimately privacy-benign moment) would have been ruined.
So, what to do? Ultimately, there’s no single rule that game developers can follow to avoid being “creepy” with player data. There are certainly lines that shouldn’t be crossed (using psychographic data to secretly price-discriminate against players is pretty much a no-no), but this is still an emerging field. As people start to pay more attention to the very rich types of data that games can provide, new guidelines and player expectations will need to be developed and tweaked.
In the meantime, it’s worth taking the time to think about all the ways your games are tracking you and to take action to protect yourself if you don’t feel comfortable with it (including choosing to play offline). Remember: just because a game asks for your information doesn’t mean you always have to provide it.
Published: May 20, 2014 07:30 pm