How can users of social media and web services be kept from falling into the trap of disinformation? A peer-reviewed study by the universities of Cambridge and Bristol tested a new method, dubbed 'pre-bunking'. The researchers worked together with YouTube and Jigsaw (a Google unit that builds tools for journalists, activists and members of civil society).
What is pre-bunking? Scholars describe it as a form of "psychological inoculation". The experiment consisted of inserting short videos of about ninety seconds into YouTube advertising slots to inform the public about the most common manipulation techniques used to spread false or misleading news: emotional manipulation, false dichotomies and scapegoating. After watching these informative videos, similar to public service announcements, the 30,000 participants in the experiment showed a 5% higher ability to identify false information.
The positive effect of pre-bunking appears to hold across people with different political orientations, educational levels and personality traits. These encouraging results have led some experts to see pre-bunking as the most scalable way to combat disinformation. Instead of focusing on denying or correcting the avalanche of false news circulating in every corner of the Internet, a preventive approach, aimed at the broad categories of manipulation and the methods by which fake news spreads, may prove more effective.
Content moderation on platforms, as we know, is particularly delicate and demanding, requiring a great deal of resources and human labour. So can pre-bunking be the solution? Jigsaw is currently working on a project to localize the method, with the aim of countering anti-immigration rhetoric in Central and Eastern Europe. The approach, however, carries ethical and political implications. The obvious question is: who decides what falls into the category of "manipulation" or "distortion of reality"? Will the process be left in the hands of private companies like Google? Or will it be the prerogative of governments? How is democratic control over these processes ensured? Isn't such a delicate instrument in danger of falling into the wrong hands? These questions are similar to those asked about platform moderation policies, but they become even more pressing when psychological influence techniques are involved. Jon Roozenbeek, one of the researchers who led the experiment, insists that pre-bunking is only part of the solution to the disinformation problem, and must be accompanied by cross-cutting efforts in other areas as well.