Harmony Square is a free video game that teaches players to recognize political disinformation while trying to undermine democracy.


Misinformation has become one of the most pressing problems we face as a society. How can we find effective solutions when we cannot agree on the facts? Misinformation has invaded almost every area of our lives, from our health care to our choices as voters. How can we distinguish between true and false?

Politically, at least, researchers have found that simply playing a video game can help. In a new video game, players learn to spot misinformation by trying to undermine democracy themselves. In other words, the more people learn how to create political division in a game, the better they recognize those methods when they see them in real life.

The game is called Harmony Square, and it is free to play. I thought I would try it. At first it was amusing: I was hired as chief disinformation officer and had to choose my code name. Then my new employers explained more about my job: “We hired you to sow discord and chaos in Harmony Square.”

After that, the game took me through five major disinformation techniques, starting with posting an extreme opinion. Here’s what the game said about it: “See? Just by posting an extreme opinion, you got people to respond emotionally. You made them believe your opinion was representative of a wider group of Harmony Square residents. We call this ‘anger bait’.”

By the end of the game, I had learned to play both sides and deploy bots. Once I had destroyed Harmony Square, the game was no longer fun. But I also had a better sense of how to spot disinformation campaigns on social media.

Pre-bunking works better than debunking

And that is the aim of the game. Developed by psychologists at the University of Cambridge, the game is supported by the Department of State’s Global Engagement Center and the Cybersecurity and Infrastructure Security Agency (CISA) at the Department of Homeland Security.

The idea is this: as players learn how to use conspiracy theories, fake experts, bots, and other techniques to create a political split, they learn to spot those techniques in real life. It is a kind of “psychological vaccine” based on what is known as inoculation theory: when people are exposed to common fake-news techniques in advance, they are better able to notice and ignore misinformation when they encounter it.

It is related to a method called “pre-bunking”. “Trying to debunk misinformation after it has spread is like shutting the stable door after the horse has bolted. By pre-bunking, we aim to stop the spread of fake news in the first place,” said Dr. Sander van der Linden, director of the Cambridge Social Decision-Making Lab and senior author of the new study, in a press release.

Pre-bunking is becoming more common. Twitter, for example, started flagging tweets containing misinformation during the US presidential election. However, the researchers argue that flagging individual tweets only goes so far. Rather than refuting every single conspiracy theory, they believe it is better to build “general inoculation” by teaching people how misinformation techniques work. In other words, when people can see how they are being manipulated, there is less need to refute each false claim one by one.

The results of the study

For the study, researchers asked 681 people to rate messages and social media posts for reliability. Some were real, some were misinformation, and some were fake misinformation created just for the study. Then half of the participants played Harmony Square while the other half played Tetris. Afterwards, the participants rated a further series of posts.

The group that played the four levels of Harmony Square learned five common manipulation techniques. According to the study, these were:

  1. “Trolling people, that is, deliberately provoking people to react emotionally, thereby causing outrage.
  2. Exploiting emotional language, i.e., trying to make people anxious or angry about a certain topic.
  3. Artificially inflating the reach and popularity of certain messages, for example through social media bots or by buying fake followers.
  4. Creating and spreading conspiracy theories, that is, blaming a small, mysterious and vicious organization for world events.
  5. Polarizing the audience by deliberately highlighting and magnifying differences between groups.”

Study participants who played Harmony Square rated misinformation as 16% less reliable than before playing. They were also 11% less likely to share fake news. And this held regardless of their personal political affiliation.

When comparing the group that played Tetris with the group that played Harmony Square, the researchers found an effect size of 0.54. “The effect size suggests that if the population was split equally like the sample, 63% of the half that played the game would go on to find misinformation significantly less reliable, compared to just 37% of the half left to navigate online information without the inoculation of Harmony Square,” said van der Linden.
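As a rough illustration of how an effect size translates into percentages like these, one standard conversion is the common-language effect size: the probability that a randomly chosen player of the game rates misinformation as less reliable than a randomly chosen Tetris player. This is a minimal sketch of that conversion, assuming two normal distributions with equal variance; the study’s own estimator may differ, so the result only approximates the quoted figures.

```python
from math import erf, sqrt

def probability_of_superiority(d: float) -> float:
    """Common-language effect size for Cohen's d:
    the probability that a random member of the treatment group
    scores better than a random member of the control group,
    assuming two normal distributions with equal variance.
    CLES = Phi(d / sqrt(2)), with Phi the standard normal CDF."""
    z = d / sqrt(2)
    # Standard normal CDF expressed via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2)))

# Effect size reported in the study
print(round(probability_of_superiority(0.54), 2))  # → 0.65
```

Under these assumptions, d = 0.54 yields roughly a 65/35 split, in the same ballpark as the 63/37 figures quoted above.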

The project is part of a series by the Cybersecurity and Infrastructure Security Agency (CISA) at the Department of Homeland Security that examines how “foreign influencers” use disinformation. And it’s not the only game like this: while Harmony Square deals with political disinformation campaigns, Go Viral!, a five-minute game developed by the same team, “protects you from misinformation about Covid-19”.

Harmony Square itself takes about ten minutes to play and “is quick, easy and tongue-in-cheek, but the experiential learning that underpins it means people are more likely to spot misinformation, and less likely to share it, the next time they log on to Facebook or YouTube,” said Dr. Jon Roozenbeek, lead author of the study.