The views presented in this factual summary report are not the views of the European Commission but of the stakeholders that participated in this open public consultation. It cannot in any circumstances be regarded as the official position of the Commission or its services.
Objectives of the consultation
Following the Commission’s Communication “Shaping Europe’s Digital Future” of 19 February 2020, the Commission ran a 14-week public consultation on the main challenges arising around the provision of digital services, and online platforms in particular.
The consultation was launched in preparation for the proposals for a Digital Services Act and a Digital Markets Act, as well as to explore emerging challenges in other areas related to online platforms and digital services, such as the situation of self-employed individuals offering their services through platforms. The consultation sought to gather views, evidence and data from people, businesses, online platforms, academics, civil society, and all interested parties to help shape the future rulebook for digital services.
Who replied to the consultation?
In total, 2,863 responses were submitted by a diverse group of stakeholders. Most feedback came from citizens (66% from EU citizens, 8% from non-EU citizens), companies/business organisations (7.4%), business associations (6%), and NGOs (5.6%). These were followed by public authorities (2.2%), others (1.9%), academic/research institutions (1.2%), trade unions (0.9%), consumer and environmental organisations (0.4%), and several international organisations.
The results of one independently run campaign were submitted, including replies from around 800 citizens. Additionally, around 300 position papers were received in the context of the open public consultation.
In terms of geographical distribution, most of the respondents were located in the EU. Respondents came from all Member States, with the largest shares of contributions received from Germany (27.8%), France (14.3%), and Belgium (9.3%). Outside the EU, the highest shares of respondents came from the UK (20.6%) and the US (2.8%).
Countries with ≤15 submissions include Czechia, Hungary, Norway, Luxembourg, Romania, Greece, Latvia, Slovakia, Canada, Lithuania, Australia, Cyprus, Malta, Japan, Slovenia, Bulgaria, Croatia, Estonia, Russia, China, Greenland, Iceland, India, Iran, Micronesia, Thailand, Ukraine, Åland Islands.
Zoom-in: the three largest respondent groups
Companies/Businesses organisations and business associations
Of the 211 participating companies/business organisations, 80.1% specified that they were established in the EU and 11.4% indicated that they were established outside of the EU.
26.5% described themselves as a conglomerate, offering a wide range of services online; 21.3% identified as a scale-up and 6.6% as a start-up. In terms of annual turnover, more than half of the participating companies/business organisations indicated a turnover of over EUR 50 million per year. 13.3% reported an annual turnover of EUR 2 million or less, 3.8% an annual turnover of EUR 10 million or less, and 6.2% an annual turnover of EUR 50 million or less. 28.4% of the responding companies/business organisations were online intermediaries, and 24.6% were providers of other types of digital services. 12.3% indicated that they were an association representing the interests of the types of businesses listed above. Of the 180 participating business associations, 15% indicated that they represented online intermediaries, 19.4% specified that they worked on behalf of digital service providers other than online intermediaries, and 40% indicated that they represented the interests of other businesses.
Non-governmental organisations (NGOs)
Of the 159 participating NGOs, almost half (49.7%) stated that they represented fundamental rights in the digital environment. 22.6% dealt with flagging illegal activities or information to online intermediaries for removal, and 22% represented consumer rights in the digital environment. Furthermore, 18.9% specified that they were fact-checking and/or cooperating with online platforms to tackle harmful (but not illegal) behaviours, and 13.2% represented the rights of victims of illegal activities online. 10.7% represented the interests of providers of services intermediated by online platforms, including trade unions, and 10.7% gave no answer. 30.8% of the responding NGOs indicated “other”.
59 public authorities participated in the open public consultation: 43 represented authorities at national level (72.9%), 8 at regional level (13.6%), 6 at international level (10.2%), and 2 at local level (3.4%). Among EU Member States, authorities replied from Austria, Belgium, the Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Ireland, Italy, Latvia, Luxembourg, and Poland. About half of the responding public authorities were governments, administrative or other public authorities other than law enforcement in a Member State of the EU (49.2%). 15.3% indicated that they were a law enforcement authority in an EU Member State, and 15.3% specified that they were another, independent authority in a Member State of the EU. These replies are complemented by a targeted consultation run by the Commission with Member States.
Summary of the Results
How to effectively keep users safer online?
A majority of respondents across all categories indicated that they had encountered both harmful and illegal content, goods or services online, and specifically noted a spike during the Covid-19 pandemic. More specifically, 46% of the respondents to the relevant question indicated that they had encountered illegal goods, and 67% stated that they had encountered illegal content online. A large share of respondents who said they had notified illegal content, goods or services to platforms expressed dissatisfaction with the platforms’ response and with the ineffectiveness of the reporting mechanisms.
There is broad consensus among respondents on the need to harmonise at EU level the obligations for online platforms to address illegal content they host. Among the most widely supported measures, including by online platforms, are simple, standardised, and transparent notice and action obligations harmonised across the single market. A large majority of stakeholders want all platforms to be transparent about their content policies, support notice and action mechanisms for reporting illegal activities, and require professional users to identify themselves clearly (90%, 85% and 86% respectively).
There is a general call, especially among citizens, for establishing more transparency in the content moderation processes and outcomes.
With regard to the use of automated tools in content moderation, several respondents pointed to the usefulness of such tools for addressing illegal content at scale, but there is also a strong call for caution in their use, owing to a series of risks, notably the over-removal of legal content.
Moreover, whilst there is a strong call for action, many categories of stakeholders, including citizens, online intermediaries, civil society organisations, academic institutions, NGOs and national authorities, emphasised that any new measure to tackle illegal content, goods or services online should not lead to unintentional, unjustified limitations on citizens’ freedom of expression or fundamental rights to personal data and privacy.
There is a general agreement among stakeholders that content which is not in itself illegal should not be defined in the Digital Services Act, as this is a delicate area with implications for freedom of expression. Respondents, in particular among academics and civil society, point however to the particular role of how algorithmic systems shape access to content on online platforms, and how platforms’ systems are used for reaching very wide audiences with content that might be inciting violence, hate speech or disinformation. Several stakeholders, amongst them citizens, civil rights associations, NGOs, academic institutions as well as media companies and telecommunication operators, pointed out the need for algorithmic accountability and transparency audits, especially with regard to how content is prioritised and targeted. In addition, especially in the context of addressing the spread of disinformation online, regulatory oversight and auditing competence over platforms’ actions and risk assessments was considered crucial (by 76% of all stakeholders responding to the relevant question).
At the same time, most stakeholder groups acknowledged that not all types of legal obligations should be put on all types of platforms. According to various stakeholder groups, especially business organisations and start-ups, enhanced obligations are especially needed for larger platforms, but these obligations might be disproportionate for smaller ones. Start-ups especially stressed the point that a “one-size-fits-all” approach would be most beneficial for very large platforms, but could have detrimental effects on medium-sized or smaller platforms and businesses at the core of the European digital ecosystem. They stress that their growth and evolution should not be hindered by disproportionate rules.
Respondents also generally agree that the territorial scope for these obligations should include all players offering goods, content or services, regardless of their place of establishment.
Reviewing the liability regime of digital services acting as intermediaries
A large majority of stakeholder groups broadly considered the principle of the conditional exemption from liability necessary for a fair balance between protecting fundamental rights online and preserving the ability of newcomers to innovate and scale. With regard to consumer protection, some organisations defending consumer rights supported changes to the liability regime in support of a faster resolution of damages for consumers.
In particular representatives of smaller service providers, but also some civil society organisations pointed to legal uncertainty and disincentives for service providers to act against illegal goods, services or content disseminated through their service. Start-ups strongly called for a legislative framework that reaffirms the principles of the E-Commerce Directive, while supporting the introduction of a clarification of the liability regime with regards to voluntary measures they might take.
There was also a broad convergence among all stakeholder categories around the need to preserve the prohibition of general monitoring obligations for online intermediaries in order to preserve a fair balance and protect fundamental rights, including the right to privacy and freedom of expression.
What issues derive from the gatekeeper power of digital platforms?
Among respondents to the question (1476 in total), the vast majority (90%), including among platforms (73%), agrees that there is a need to consider dedicated regulatory rules to address negative societal and economic effects of gatekeeper power of large platforms. Among businesses and business users who replied to the relevant question (155 in total), 88% encountered unfair trading conditions on large platforms. Examples of unfair practices by large gatekeeper platforms listed by respondents cover exclusionary conducts, exploitative conducts and transparency-related problems.
In general, most of the issues presented by respondents were perceived to be due to an imbalance in bargaining power between platforms and business users, which is considered to hamper competition, foster uncertainty in relation to contractual terms, and also result in lock-in of consumers. The majority of respondents, including among platforms, is of the opinion that there are structural (competition) issues that current regulation cannot address or cannot deal with effectively. A minority of respondents (mainly several large platforms and their trade associations and some research institutes and national authorities), argue that (extra) ex-ante regulation is unnecessary, especially in the light of the consultation on the New Competition Tool and the recent adoption of the Platform to Business Regulation.
The vast majority (84%) of respondents considers that dedicated rules on platforms should include prohibitions and obligations for gatekeeper platforms. Again, according to the majority of respondents, the proposed list of problematic practices should be targeted to clearly unfair and harmful practices of gatekeeper platforms; specific enough to avoid confusion of what is and is not permitted; adaptable to a dynamic, fast moving sector; and specific to certain gatekeepers.
Other respondents (among trade associations, research institutes and platforms) consider that a ‘blacklist’ of prohibited practices would require careful consideration in relation to a dynamic industry that has multiple business models, types of users, and types of business partners, and that it should result from an assessment of the market failures to be resolved. In addition, several large platforms are of the opinion that blanket bans on market behaviours risk being inefficient, negatively impacting consumers, and actually worsening competition by limiting the ways in which entrants can innovatively challenge incumbents.
In determining the gatekeeper role of large online platforms, respondents consider all the characteristics mentioned in the questionnaire relevant (i.e. large user base, wide geographical coverage, large share of total market revenue, impact on a certain sector, exploitation of strong network effects, leverage of assets to enter new areas of activity, raising of barriers to entry, accumulation of valuable and diverse data and information, lack of alternative services, lock-in of users). In general, stakeholders of all categories point out the need to ensure a high level of coherence and legal certainty; the criteria used should be transparent, objective and easily measurable, and a merely cumulative approach might not be sufficient. Some respondents from different stakeholder categories (including platforms, trade associations and telecom operators) state that a “one-size-fits-all” approach might be unfeasible, while others (mainly from trade associations) state that the new legislation should be general in nature, so that it may be applicable regardless of industry, sector, technology or business-model.
Regarding the enforcement of any prohibitions and obligations and case-by-case remedies imposed on gatekeeper platforms, there is a strong majority (70% and 80% respectively) among stakeholders that considers that there is a need for a specific regulatory authority. Furthermore, respondents generally consider that an effective coordination between EU bodies and the relevant national regulatory authorities is needed, especially in the light of the fact that issues related to gatekeepers are likely to have an important cross-border component. Platforms in particular point out the need to minimise fragmentation and allow for a pan-European approach.
Other emerging issues and opportunities, including online advertising and smart contracts
Regarding online advertising, stakeholder views echoed the broad concerns around the lack of user empowerment and lack of meaningful oversight and enforcement.
The issues most frequently identified as requiring action relate to greater transparency regarding the identity of the advertiser and how advertisements are personalised and targeted. Implementing features that explain why certain advertisements are shown to users was considered a good practice to build upon to empower users.
A large share of the general public responding to the consultation pointed to deceptive and misleading advertisements as a major concern in their online experience. Users, academic institutions and civil society organisations are particularly concerned about advertisements targeted at minors and about political advertising.
Academic institutions pointed to persistent difficulties when conducting research, and explained that observing emerging issues and phenomena online is difficult owing to inconsistent access to relevant data. Several pointed to the need for a publicly disclosed ad archive, as well as independent auditing of ad systems.
On the topic of smart contracts, a majority of stakeholder groups stated that there is not sufficient clarity with regards to validity, applicable law and jurisdiction. Further, more regulatory clarity is considered necessary in terms of standards and liability.
Addressing challenges around the situation of self-employed individuals offering services through online platforms
Individuals, but also public authorities, businesses and employers’ or workers’ organisations responded to this section. The variety of services offered through online platforms and covered by the responses included food delivery, household maintenance, ride-hailing, software development, translation, art and design, health counselling and training.
Most individuals and organisations highlighted the need for action to remove existing obstacles and improve the situation of individuals offering services online and offline. The most frequently mentioned obstacle was the lack of clarity concerning the employment status of individuals offering services, including the risk of infringing competition law. The main concerns of the individuals, supported by the views of social partners and trade unions, included the lack of social security coverage, work precariousness and uncertainty vis-à-vis working time, and risks of social dumping. A large majority of respondents indicated that they are not able to collectively negotiate their remuneration or other conditions vis-à-vis platforms. The public authorities also argued that EU measures addressing unjustified barriers to cross-border transactions should be considered.
The lack of transparency in online ratings, the lack of transparency on remuneration, and the lack of any possibility to organise collectively vis-à-vis the platform represented the three most pertinent challenges in the participants’ responses. A large majority of the respondents (both citizens and organisations) indicated that the possibility of collective bargaining would represent a significant improvement for individuals offering services in both the online and the offline economy. The platforms and the business associations highlighted the need for creating harmonised rules across Member States to ensure a level playing field among platforms, but also vis-à-vis the traditional sectors of the economy. They called for an agile way of establishing decent working conditions for platform workers without endangering competitiveness and creating the risk of misclassification.
Governance of digital services and aspects of enforcement
There is a broad alignment from all categories of stakeholders that the internal market principle enshrined in the E-Commerce Directive is crucial for the development of innovative services in the Union and should be preserved.
With regard to the burdens for companies in the single market, business associations and medium-sized companies in particular pointed out that the legal fragmentation around rules for tackling illegal content, goods and services, is limiting most businesses, but especially small and medium-sized enterprises (SMEs) and start-ups, from scaling up. More specifically, business associations pointed out that SMEs and start-ups are facing a competitive disadvantage, since they are affected in a disproportionate manner as opposed to larger companies. Start-ups and SMEs confirmed this observation, by pointing to the business risks of having to adapt their services to potentially 27 different sets of rules, which does not just inhibit their growth across the Union, but also globally.
At the same time, besides the need to address the fragmentation of rules, there is also a general understanding among stakeholders that cooperation between authorities should be improved in the cross-border supervision of digital services, and in particular of online platforms. 66% of the respondents to the relevant question in the open public consultation noted that a unified entity for EU-level oversight is very important. Many stakeholder groups, but especially business associations and companies, considered that the degree of oversight should vary depending on the services’ obligations and related risks.
Authorities and other respondents, in particular academic institutions and civil society organisations, consider that the supervision of such cross-border services requires access to appropriate data and adequate financial and human resources for the competent authorities tasked with supervising online platforms. Many groups of stakeholders, especially digital rights associations, identified the need for interdisciplinary skills in the oversight entity, particularly in-depth technical skills, including data processing and auditing capacities, which would allow for a reliable and thorough oversight and transparency of algorithmic decision-making processes.
The contributions can be downloaded from the Better Regulation Portal.