For many years, the regulation of digital markets has been a key priority in Europe. In 2015, the European Commission committed the EU to the creation of a digital single market – and that commitment spawned a series of diverse initiatives and regulatory changes affecting providers and users of digital goods and services. Even between EU Member States, the standard of digital regulation has varied—and continues to vary—with many issues being left to local implementation. And, in 2016, the UK voted to leave the EU (with Brexit finally happening in 2020) – a process that will lead to further digital regulatory divergence across Europe.
As a result of the continuing evolution of European digital regulation, digital compliance has become a much higher priority for any organization providing or consuming digital goods and services in Europe. As digital technology evolves, it is increasingly challenging for organizations to ensure continued regulatory compliance.
To help organizations track the main developments in European digital compliance, Morrison & Foerster’s European Digital Regulatory Compliance team reports below on some of the most topical developments in Digital Regulation & Compliance.
The dynamic double-team is arriving! In December 2020, the European Commission published two headline proposed pieces of legislation designed to implement the EU’s digital strategy. Together, the Digital Services Act (DSA) and Digital Markets Act (see more details below) are intended to create a safer digital space and to establish a level playing field to foster innovation and growth both in the EU and globally.
The DSA, as one half of this legislative package, will focus on the regulation of digital services providers, or “intermediaries.” It seeks to address the dominance of “very large” platforms (those reaching over 10% of the 450 million consumers in Europe) and the accountability of companies for third-party content. Breaches of the DSA may attract one-off fines of up to 6% of annual global turnover or periodic penalty payments of a maximum of 5% of average daily turnover.
Significantly, the DSA will apply to any provider offering their services to users in the EU. The intermediaries regulated by the DSA will have different obligations depending on their role, size, and impact on the online ecosystem. Digital platforms will be responsible for removing illegal content, meeting transparency obligations and completing additional due diligence. The DSA seeks to add to and build upon the EU’s e-Commerce Directive, while harmonizing regulation across the EU and clearing up issues such as liability for third-party content.
Once in force, the DSA and Digital Markets Act will be directly effective across the EU and will therefore not require national implementation by each Member State.
A long road lies ahead for both the DSA and Digital Markets Act, which will now need to be debated and (eventually) agreed by the EU institutions before adoption. Further amendments may well arise during the course of those debates. And, of course, there will no doubt be strong opinions and lobbying from digital service providers seeking to relax both the requirements on digital intermediaries and the size of the possible financial penalties that could be imposed.
The second half of the EU’s legislative duo described above is the Digital Markets Act (DMA). The DMA will target “gatekeepers”—the core platforms that act as a gateway between business users and customers. The proposed regulations express concern at the “entrenched and durable” positions of such platforms, which the EU believes result in unfair practices and a lack of contestability, leading to higher prices, lower quality, and less innovation in the digital economy. Only a cynic would draw a connection between the EU’s concern and the fact that these platforms are almost exclusively headquartered outside the EU.
A core platform provider will be presumed a gatekeeper if it meets certain quantitative thresholds set out in the draft DMA, relating to its turnover or market capitalization, its user numbers, and the durability of its position. While the draft DMA sets out these thresholds for presuming that a provider is a gatekeeper, it also allows the European Commission to designate gatekeeper status based on other assessments.
Gatekeepers will, among other things, be subject to a series of obligations and prohibitions under the draft DMA designed to keep digital markets open, fair, and contestable.
The DMA comes at a time when several EU Member States are debating or have already adopted new regulatory instruments also aimed at regulating gatekeepers or “large digital companies” more generally, with a view to ensuring that relevant markets remain open and competitive (e.g., the German rules on undertakings of paramount significance for competition across markets (UPSCAM) that entered into force in January 2021). Given that the DMA claims to be the only instrument providing such regulation across the EU, it remains to be seen how its relationship with these national regimes will ultimately play out.
As with the DSA, the DMA will now need to be debated and agreed amongst the EU institutions before adoption and, once adopted, it will be directly effective across the EU.
Post-Brexit, the UK will not adopt the DSA or DMA. But, the UK is also looking to change the way in which it regulates digital markets – which, in all likelihood, means trying to achieve the same goal as the EU in terms of regulating “big tech” companies.
A number of key UK regulatory agencies (the Competition and Markets Authority (CMA), the Information Commissioner’s Office, the Financial Conduct Authority, and Ofcom) have combined forces to advise on the UK’s strategy for regulating digital markets. Together, they make up the UK’s Digital Task Force, which has published its first advice document.
The regulatory regime proposed by the Digital Task Force entails a legally binding code of conduct (with different rules for different types of companies), pro-competitive interventions (including remedies such as personal data mobility and interoperability), and enhanced merger rules, all to be overseen by a new Digital Markets Unit (DMU) sitting within the CMA. The UK government established the DMU in April 2021 and has committed to consulting on the pro-competitive regime later in 2021, with the aim of being able to regulate the large global digital tech providers by 2022.
The Digital Task Force’s regime targets digital companies with a “strategic market status” (SMS), to be determined on an evidence-based assessment. This mirrors the EU’s strategy (as described above) in taking aim at those companies perceived as enjoying entrenched market power. But, in contrast to the EU approach, the UK Digital Task Force proposes that such an SMS assessment should be applied to a company’s specific activity, rather than to the company as a whole.
While the Digital Task Force envisages a proactive regime and open and productive relationships with SMS firms, it goes further than the EU in its proposed penalties. The task force recommends that the UK deters breaches of the regime with fines of up to 10% of worldwide turnover. It remains to be seen how big tech firms will swallow this “work with us or against us” mentality.
The main things to look out for in the coming months will be the bedding-in of the newly established DMU, as well as the upcoming consultation by the UK government on its pro-competitive regime. The UK Government has also committed to passing legislation that cements the DMU in statute, when parliamentary time allows, which may provide further details about the DMU’s powers in practice.
Over the past few months, the European Commission has ramped up its preparatory work for two upcoming, eye-catching legislative drafts in the context of emerging technologies such as artificial intelligence, machine learning, or autonomous robotics.
First, the Commission has published its legislative proposal for an AI regulatory framework. Building on the Commission’s preparatory work, most notably its White Paper on AI released in February 2020 and its Inception Impact Assessment issued in July 2020, the proposal entails a set of mandatory regulatory requirements, at least for certain high-risk types of AI, together with GDPR-style fines based on a percentage of turnover for non-compliance. The proposal aims to tackle three main areas of concern identified by the Commission: (i) the impact that AI‑based decision-making can have on fundamental rights (e.g., in the case of biased or discriminatory AI decisions); (ii) the safety risks arising from AI applications that currently cannot be captured by the EU product safety framework (e.g., due to the dynamic nature of AI); and (iii) the assignment of liability in the case of damages caused by AI. See our separate analysis of the proposal.
Second, and together with the AI proposal, the Commission also published its proposal to revise the EU Machinery Directive. The Machinery Directive regulates the essential health and safety requirements that machinery must comply with before being allowed to be placed on the EU single market, and the procedures to remedy related risks. The Commission’s proposal entails a number of amendments aimed at more explicitly taking into account that machinery increasingly implements emerging technologies such as AI, machine learning, or autonomous mobility, and that these technologies bring about new health and safety risks for persons and property exposed to such machinery. For example, the proposal addresses risks associated with a lack of cybersecurity and with the coexistence of humans and robots in the same working area.
The Commission published both drafts on April 21, 2021; the initiatives will now enter the ordinary legislative procedure at EU level and could be finalized in early 2022. Companies that expect to be affected by the new rules should analyze the published drafts and keep an eye out for opportunities to discuss with relevant stakeholders any concerns regarding the impact of these rules on their businesses.
Fines issued by EU Commission to Valve and others for geo-blocking
In January 2021, the EU Commission fined Valve, owner of the online PC gaming platform “Steam,” and five PC video game publishers a total of €7.8 million for breaching EU antitrust rules through geo-blocking practices (see the 20 January 2021 press release). Valve alone was fined over €1.6 million.
The Commission found that Valve and the PC video game publishers restricted cross-border sales of certain PC video games on the basis of the geographic location of users within the European Economic Area (EEA). Allegedly, they prevented consumers from activating and playing, in certain EU countries, PC video games that they had bought in other EU countries, thus restricting cross-border sales. This enforcement action demonstrates that compliance with geo-blocking rules is of major importance and remains a topic likely to attract the attention of regulators (particularly the EU Commission).
EU Commission report on Geo-Blocking Regulation
While this enforcement action was based on general principles of EU competition law, the EU Commission was also active regarding the EU’s specific anti-geo-blocking provisions in the Geo‑Blocking Regulation (Regulation (EU) 2018/302). The EU legislated to end the practice of geo-blocking in the EU market as part of its Digital Single Market Strategy.
The EU Commission has also published its first review of that Regulation, which entered into force in December 2018 (see “Commission publishes its short-term review of the Geo-blocking Regulation”). The report analyzed the effects of the implementation of the Geo‑Blocking Regulation, noting in particular that geo-blocking measures (i.e., preventing users from accessing websites cross-border) have decreased significantly. This has also resulted in an increase in cross‑border purchases.
The report further considered possible effects of a potential extension of the Geo-Blocking Regulation’s scope, particularly regarding copyright-protected content (such as audiovisual content, music, e-books and games).
Before considering any follow-up measures in relation to such extension, the EU Commission intends to launch a dialogue with stakeholders in the audiovisual sector in order to discuss specific proposals for improvement. Companies that could be affected by more restrictive geo-blocking rules should make sure to get involved in this dialogue. In addition, it is important to keep in mind that, under general competition law, geo-blocking restrictions may already apply even to services currently carved-out of the Geo-Blocking Regulations.
In addition to the proposed Digital Services Act package (see above), the EU has started to introduce other specific legislative initiatives to regulate online intermediaries.
The EU’s Regulation on Promoting Fairness and Transparency for Business Users of Online Intermediation Services (the Platform to Business Regulation or P2BR) (hailed by the EU as the “first ever rules” of their type) has applied since July 2020. Despite Brexit, these rules were also retained in the UK, as amended by the Online Intermediation Services for Business Users (Amendment) (EU Exit) Regulations 2020 under the EU Withdrawal Act 2018.
The P2BR applies to online intermediaries (i.e., online marketplaces that allow businesses to offer and sell goods to consumers) and search engines, and places a large emphasis on transparency, including a requirement for service providers to disclose how they rank goods and services on their sites.
More recently, the EU Commission has published Guidelines on this so-called “ranking transparency” under the P2BR.
As well as pushing transparency and predictability, the P2BR also restricts certain “unfair” practices, such as suspending or terminating a user’s account without providing a statement of reasons (coupled with a right for the user to appeal), or amending terms and conditions without giving at least 15 days’ notice. Online platforms also need to have an internal complaint‑handling procedure.
While the P2BR is largely limited to transparency aspects, some EU Member States are already one step ahead of EU law and have recently enacted more substantive requirements addressing online intermediaries. In Germany, the new Media State Treaty entered into force in late 2020 and now aims to ensure that, for example, search engines and video-sharing platforms apply their disclosed ranking criteria in a non-discriminatory manner. On that basis, the Media State Treaty may also provide helpful insights for the further development of the Digital Services Act.
Given that the P2BR is now in force, the key thing to watch will be the extent to which it is enforced by Member States (and, indeed, how much leeway intermediaries are afforded, given that the P2BR has only applied for eight months or so). It also remains to be seen how many Member States follow Germany’s lead and impose requirements on online intermediaries that go beyond transparency.
In February 2021, the UK government published a Report by John Penrose MP on how the UK’s regime around competition and consumer law issues should evolve post-Brexit.
In the long term, the Penrose Report calls for the UK CMA to be able to decide consumer infringement cases and impose fines in the same way that it currently does for competition law cases, without having to go to court. This is intended to give consumer law enforcement a greater priority than it currently enjoys.
In addition, the Penrose Report supports the UK government’s announced intention to establish a new digital markets unit (DMU) within the CMA to oversee digital firms with substantial, entrenched market power. However, the Report argues that the DMU should focus on individual firms that own and run new network and data monopolies, rather than the digital sector as a whole. The Penrose Report’s emphasis on “making markets work for people, not the other way around” suggests that an update to regulations and regulators’ powers is likely.
The UK government’s response will be released “in due course.” As noted above, the UK government established the DMU in April 2021.
Strengthening consumer rights is also hot on the EU’s agenda, with Member States continuing to implement the Omnibus Directive and Digital Content Directive. Implementation efforts in this area will ramp up across the EU in the next couple of months, given that the implementation deadlines for the Omnibus Directive and Digital Content Directive expire in Q3/Q4 2021. As the implementation of supposedly harmonized EU rules is always at risk of producing diverging legislation at Member State level, affected companies should keep an eye on these national implementation efforts in key markets.
In 2018, the European Union adopted the “European Electronic Communications Code” (the “EECC”)—a new Directive to revise and modernize the European regulatory framework for electronic communications (see our alert). The EECC was supposed to have been implemented into the national laws of the EU Member States by December 2020, but implementation efforts are still ongoing across the EU. Thus, in February 2021, the European Commission initiated infringement proceedings against 24 Member States for their failure to implement the EECC in time.
Two key factors behind the delay may be that not all aspects of electronic communications law are harmonized by the EU, and the EECC’s treatment of web-based (or “over-the-top,” OTT) communications services. While the EECC resolves the long-disputed question of whether OTT communications services are in scope of the EU framework by generally including all such services, it leaves it to the Member States to define the scope of the obligations actually applying to those OTT services that do not allow communication with traditional phone numbers.
In Germany, for example, there has been a lively debate regarding the extent to which OTT services will be included in the non-harmonized national security rules provided by the current Telecommunications Act. Following the adoption of the EECC implementation by the German parliament on April 23, 2021, many providers of web-based communications services will soon have to facilitate measures such as lawful interception, implement know-your-customer procedures, and comply with data access requests from law enforcement, just as more traditional services are already obliged to do. Similar discussions are taking place in other Member States as well.
The UK was also obliged to implement the EECC as part of the Brexit Transitional Period, and has done so via its own legislation. Communications providers will also need to comply with Ofcom’s General Conditions, which set out the requirements with which communications providers must comply in more detail. Ofcom issued a Statement on the implementation of the EECC in December 2020.
Given that important aspects of electronic communications law are not harmonized by the EECC—and also that the EECC does not provide for any country-of-origin or passporting privilege—providers should closely follow the national implementation efforts, particularly if they offer their services cross‑border across the EU.
The UK government has now published a full response to its consultation on the Online Harms White Paper. The response confirms plans to introduce a new regulatory framework to tackle harmful content online through the upcoming Online Safety Bill (OSB). The OSB will make in-scope companies (as summarized below) take responsibility for the safety of their users by introducing a statutory duty of care. This duty will apply to any in-scope companies which provide services to UK users, regardless of where they are based in the world. Examples of in-scope services include: social media platforms, consumer cloud storage sites, search engines, video sharing platforms, online instant messaging services, video games allowing interaction with other players, and online marketplaces. The framework is expected to come into effect in the second half of 2021.
The UK government is keen to ensure that policy initiatives in this sector are coordinated with similar legislation in the US and the EU, including the European Commission’s proposed Digital Services Act (see above) and the Audiovisual Media Services Directive, which created a new regulatory regime for video-sharing platform services.
In a similar vein, Germany has adopted a new version of its youth protection law, requiring all host providers to adopt a suite of measures aimed at protecting minors from harmful third-party content that may be distributed via their services. These requirements will enter into force during Q2 2021 and will apply to foreign providers as well. In parallel, Germany is still reviewing its more general hate speech legislation, which already regulates how providers of online services must handle certain illegal third-party content in cases of user complaints.
Companies whose services are caught by these regimes should take preparatory steps to ensure compliance, including preparing or revising existing policies, procedures and systems governing the use of their platforms.
In the UK, the Law Commission will produce recommendations on the reform of criminal offences relating to harmful online communications in early 2021. The UK government will use these recommendations and incorporate them into the OSB to the extent required, with the expectation that the OSB itself will be ready at some stage in 2021. In Germany, and as noted above, the requirements of the youth protection law kick in during Q2 2021, and we await the results of any review of the hate speech legislation.
In March 2021, the European Court of Justice (ECJ) added another piece to its complex case law on the “right of communication to the public” (VG Bild-Kunst v Stiftung Preußischer Kulturbesitz). It is established ECJ case law that the use of hyperlinks and deep links to copyright-protected content (including framing) does not constitute a so-called “copyright relevant action,” i.e., does not qualify as a “communication to the public.” However, this is only true provided that the content, in line with the right holder’s consent, is freely accessible online. Against this background, the ECJ has now specified that this condition is not met where framing measures circumvent effective technological protection measures adopted or imposed by the right holder. In such a case, framing qualifies as a communication to the public that requires prior authorization (and which would create liability where the authorization of the original rights holder is not obtained).
A decision as to whether certain online service providers perform copyright-relevant acts of communication to the public is currently pending before the ECJ. Nevertheless, the European legislator has already abandoned the safe harbor for such host providers. Article 17 of the 2019 Copyright Directive subjects “online content sharing service providers” to direct liability for user-uploaded, copyright-infringing content, requiring platforms to seek licenses and to remove and block automatically filtered content. Harmonized implementation across the Member States is not to be expected: some national implementation drafts, such as Germany’s, include special safeguards for user rights (e.g., permitting minor uses) or make content blocking contingent on a justified request by the right holder (as in the Finnish government’s draft). Others stick closely to the Directive’s wording or, in the case of France, emphasize a very strict upload-filter approach.
Providers and right holders alike should closely monitor the national implementations (to be finalized by June 7, 2021) in order to align their business operations and rights management with applicable law.
MoFo is hosting a webinar on 11 May 2021 that will address recent developments in the implementation of the EU’s 2019 Digital Single Market Copyright Directive, especially as they relate to online services storing and displaying user-provided content and to the protection of press publications against third party uses. The emerging law in Europe will be compared and contrasted with corresponding U.S. law principles. Register for the webinar.
We are grateful to the following members of MoFo’s European Digital Regulatory Compliance team for their contributions: Rayhaan Vankalwala, Patricia Ernst, Jens Hackl, Jacqueline Feigl, Mercedes Samavi, and trainee solicitors Georgia-Louise Kinsella and Georgia Wright.