On Their Terms
- Written by Fran Panetta
- Published 03 November 2024
In a world riddled with disinformation, how do you get the disengaged to re-engage? How do you get sceptics to trust?
This is what our team started to discuss this week as we kicked off our new project On Their Terms.
With thanks to funding from the European Media and Information Fund (EMIF) we will be spending the next 18 months working with project partners in the UK and France - More in Common, Destin Commun, Fake Off and UMICC - figuring out how to enable the most disengaged individuals in society to better protect themselves from disinformation.
We're all familiar with audience segmentation, but usually it's done with demographics. More in Common (MiC), if you don't know them, are a brilliant Think and Do Tank who believe that what people have in common is stronger than what divides them. Underpinned by insights from psychology, sociology and political science, they segment audiences according to their values, beliefs and sense of group attachments. As the MiC team told us this week, the demographics of King Charles and Ozzy Osbourne are the same (both white, divorced, live in a castle), but they are very different people. By working to understand the forces driving people apart and to find common ground, More in Common aims to build more united, inclusive and resilient societies.
For our project, MiC has identified two groups particularly vulnerable to disinformation: the Disengaged Battlers and the Disengaged Traditionalists. Both groups have low trust in politics and in traditional media. Interestingly, they sit on opposite sides of the political spectrum: the Disengaged Battlers on the left, the Disengaged Traditionalists on the right.
So if your social aim is countering disinformation and supporting democracy (and having a clear social impact is important for us at the institute), then there are a few options. As More in Common puts it simply, you can look at the supply side (where the disinformation comes from, and the pathways by which it reaches people) or you can look at the demand side (the unfortunate people who receive it, which, as we know, is all of us). For this project we're looking at the demand side.
There are many commonalities between this project and the work we did last year on our fellowship programme. As we discussed, as storytellers there are a few interventions possible: inoculation (where the audience member receives a small dose of the disinformation, which then gives them some kind of protection), debunking (telling you when something is fake), labelling (putting a big fat label on something that is fake, or adding more subtle metadata to a piece of media saying how it was created, with AI tools or otherwise), and media literacy - showing people what to do when they encounter media, making them more aware of what's out there and how it's been manipulated.
The thing is, that most of the time, all of this (from inoculation to media literacy) is pretty didactic stuff.
And labelling's efficacy is being questioned. Further, you really have to care to engage with any of this. It's usually pretty boring.
That's where we come in. We're interested in how creative storytelling can play a part. Last year our fellows spent a year coming up with engaging stories that would do just that - use creative storytelling to counter disinformation. For this project we'll be looking at the role narrative games can play. According to More in Common, a high proportion of gamers are in these disengaged audience groups. Why don't we meet them "On Their Terms" and build projects in media they enjoy, and with storytelling that works for them?
We intend to bring together games designers, media literacy professionals and behavioural psychologists to help us explore how we can do that. We will build game concepts and test prototypes to understand their potential for impacting these disengaged audience groups. And we'll also see what sort of process works for our games creatives.
We'll learn what inputs and approaches best support game designers to imagine really impactful projects that don't compromise on creativity.
But there are challenges we already see ahead. As our fellow Daria Cybulska told us last year, many of the tropes of media literacy - "don't immediately trust what you see", "check other sources" - are also used by conspiracy theorists. How do we make sure we're not fuelling this? If a good way to protect yourself from disinformation is to check sources from "reliable" places, but our disengaged groups are sceptical of mainstream news, what does that mean? If we home in on the techniques of manipulation - images created by AI, texts written by GPT - are we just going to add to their scepticism of the truth? Will we be contributing to a society where anything can be plausibly denied?
Next steps are research and discussions with our games designers. The journey has begun...
The European Media and Information Fund is entrusted to address the phenomenon of online disinformation in Europe and promote a more resilient and fact-based digital information ecosystem. It is managed by the Calouste Gulbenkian Foundation.