We are doing science for policy
The Joint Research Centre (JRC) is the European Commission's science and knowledge service, which employs scientists to carry out research and provide independent scientific advice and support to EU policy.
The democratic foundations of our societies are under pressure from the influence that social media has on our political opinions and our behaviours, according to a new JRC report.
48% of Europeans use social media every day or almost every day. As well as being helpful tools to stay informed, be entertained, shop and stay close to our friends, these platforms have revolutionised the way we experience politics, by engaging more citizens in the political process and enabling minority voices to be heard.
But these platforms also allow polarising messages and unreliable information to be spread easily. This can limit our perspectives and hamper our ability to make informed political decisions. As the report authors find, this has a dangerous impact on our democratic societies.
In "Technology and Democracy: understanding the influence of online technologies on political behaviour and decision-making" an international team of experts takes a behavioural science approach to investigate the impact of online platforms on political behaviour.
The report identifies key "pressure points": challenges that emerge when we interact politically on online platforms that are not subject to much public oversight or democratic governance.
Co-Coordinating Lead author of the report, Professor Stephan Lewandowsky says: "Essential components of human behaviour are governed by relatively stable principles that remain largely static even as the technological environment changes rapidly. In the absence of behavioural reflections, policymakers may feel that they are constantly playing catch-up with technological advances. This report seeks to help policymakers regain the initiative."
As the Commission prepares a new European Democracy Action Plan, Digital Services Act and EU Citizenship Report 2020, this research is designed to help citizens, civil society and policymakers make sense of the impact the online world is having on our political decisions, and identify actions to safeguard a participatory, democratic European future.
The first pressure point identified in the report is the "attention economy". When we are online, our attention and engagement are sold as products to advertisers.
The private organisations running the online services we use have become very adept at capturing and keeping that attention, to the extent that our political views and actions can be shaped without us realising what is behind that influence.
For example, YouTube claims that their video recommender algorithm, which automatically selects videos it thinks a user will be interested in, is responsible for 70% of viewing time on the site. There is also evidence that YouTube's recommendations are drawing viewers into increasingly extremist content.
Facebook's algorithm, analysing only 300 likes, can predict a user's personality with greater accuracy than their own spouse can. This gives rise to concerns over "microtargeting": highly personalised advertisements directed at users based on their own personalities. If used politically, microtargeting has considerable potential to undermine democratic discourse, a foundation of democratic choice.
The second pressure point is "choice architectures". Social media platforms use several behavioural techniques to encourage people to constantly engage and share, with settings and options that make it much more complicated to leave a platform than to sign up to one.
Online users are generally unaware of what data they produce and provide to others when they perform basic tasks online, as well as how that data is collected and stored.
Thirdly, there is "algorithmic content curation". The algorithms that sort through and select the information we see online are so complex that even their developers have a hard time explaining them. This raises obvious problems for transparency and accountability.
This is especially problematic because these algorithms can encourage polarised discourse or stop us from receiving certain information.
On platforms like Twitter, Reddit, and Facebook, algorithms prioritise content that has, or is expected to have, a high level of engagement. The risk is overexposure to polarising and controversial content and underexposure to less emotive, but more informative, content.
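The mechanism can be caricatured in a few lines of Python. This is a toy sketch only: the field names and scores below are invented, and real platform rankers are vastly more complex, but it shows how ranking purely on predicted engagement pushes emotive content to the top regardless of informativeness.

```python
# Toy illustration of engagement-first feed ranking (all data hypothetical).

def rank_by_engagement(posts):
    """Sort posts by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

posts = [
    {"title": "Calm policy explainer",  "predicted_engagement": 0.2},
    {"title": "Outrage-bait hot take",  "predicted_engagement": 0.9},
    {"title": "Local news update",      "predicted_engagement": 0.4},
]

feed = rank_by_engagement(posts)
# The most emotive item surfaces first; the informative explainer sinks last.
print([p["title"] for p in feed])
```

Nothing in such a scoring rule weighs accuracy or informativeness, which is the asymmetry the report flags.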
The final pressure point identified is "misinformation and disinformation". A recent Eurobarometer survey in all EU countries revealed that over half of the population say they come across fake news online at least once a week.
Behavioural science shows that people have a predisposition to orient towards negative news. When coupled with algorithms that promote content with a high level of engagement, online platforms can easily amplify the reach of false and misleading information.
This is particularly concerning because false and misleading information has the potential to set the political agenda, incentivise extremism and ultimately lead to a "post-truth" world in which facts have less influence in shaping public opinion than emotion and personal belief.
The report notes that there is already substantial legislation that applies to the online world, and several regulatory initiatives are currently taking shape at the European level.
Examples include consumer protection rules which resulted in a large quantity of COVID-19 misinformation being removed from Facebook, and the EU's Audiovisual Media Services Directive that regulates content on YouTube and other sites.
In efforts to tackle extremism, the EU Internet Forum brings together governments, Europol, and the biggest technology and social media companies to ensure that illegal content, including terrorist propaganda, is identified and taken down as quickly as possible.
The report provides insights from behavioural science for policymakers that can be applied in several areas, from tackling misinformation and disinformation, to safeguarding electoral processes and facilitating public discussion.
Specific actions could include banning microtargeting for political ads, transparency rules so that users understand how an algorithm uses their data and to what effect, or requiring online platforms to provide reports to users showing when, how and which of their data is sold.
The authors argue that policymakers must pursue these kinds of actions in conjunction with broader efforts to meaningfully engage politically with citizens to understand their different values and perspectives and re-establish trust in political institutions.
Looking to the future, the report also employs strategic foresight to set out possible future scenarios for the European information space in 2035, to help policymakers envisage how choices made now could shape, and be shaped by, the future of our societies.
In "Technology and Democracy: understanding the influence of online technologies on political behaviour and decision-making" the JRC, together with a team of experts, synthesises state-of-the-art knowledge about digital technology, democracy, and human behaviour to enable policymakers to safeguard a participatory and democratic European future through legislation that aligns with human thinking and behaviour in a digital context.
This report is the second output from the JRC's Enlightenment 2.0 multi-annual research programme. Advances in behavioural, decision and social sciences show that we are not purely rational beings: Enlightenment 2.0 seeks to understand the other drivers that influence political decision-making.