European Digital Compliance: Key Digital Regulation & Compliance Developments
To help organisations stay on top of the main developments in European digital compliance, Morrison Foerster’s European Digital Regulatory Compliance team reports on some of the main topical digital regulatory and compliance developments that have taken place in the third quarter of 2022.
In this issue, we report on the start of the EU legislative journey for new proposals on media freedom and cyber-resilience, and the end of the process for others – in particular, the adoption of the EU’s flagship digital regulations, the Digital Markets Act and Digital Services Act. In the UK, the content moderation rules of the proposed Online Safety Bill are in limbo. And we report on the proposed regulatory framework in Europe for two emerging technologies – artificial intelligence and non-fungible tokens.
On 4 October 2022, the European Council gave its final approval to the Regulation on a Single Market for Digital Services, more commonly referred to as the “Digital Services Act” or DSA. This marks the final legislative step before the DSA enters into force, as the European Parliament had already adopted the legislation in July 2022.
The DSA’s main goal is to create a safe, predictable, and trustworthy online environment, and to protect the digital space against the spread of illegal content, online disinformation, and other societal risks. When adopting the DSA in July 2022, the EU Parliament praised it as “strong, ambitious regulation of online platforms” enabling “the protection of users’ rights online”. And, indeed, the DSA is more than just a refresh of the EU’s existing regulatory framework for the online sector. It introduces a comprehensive regime of content moderation rules for a wide range of online services across the EU.
See our DSA client alert for more details about the DSA.
The DSA will apply from 1 January 2024, meaning that in-scope companies now have little more than one year to bring their services and internal processes into compliance.
The DSA will apply to four categories of services that each build on one another. “Intermediary services” are services that transmit, cache, or store third-party information. “Hosting services” are intermediary services that store third-party information. “Online platforms” are hosting services that also disseminate the stored third-party information to the public. And “very large online platforms” are online platforms with more than 45 million average monthly active users in the EU that have been designated as such by the European Commission.
The DSA provides specific content moderation, advertising, interface design, and reporting obligations for all four categories of services, and the intensity of regulation increases from intermediary services to very large online platforms.
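Purely as an illustration, the tiered scope described above can be sketched as a simple classification function. The category names and the 45 million user threshold come from the DSA itself; the data model, field names, and function are hypothetical simplifications and not part of the legislation:

```python
# Illustrative sketch of the DSA's four nested service categories.
# The thresholds and category names follow the DSA; everything else
# (field names, the Service model, the function) is a hypothetical
# simplification for explanatory purposes only.

from dataclasses import dataclass

VLOP_THRESHOLD = 45_000_000  # DSA threshold for "very large online platform"

@dataclass
class Service:
    transmits_or_caches_third_party_info: bool = False
    stores_third_party_info: bool = False
    disseminates_stored_info: bool = False
    eu_user_count: int = 0              # users of the service in the EU
    designated_by_commission: bool = False

def dsa_category(s: Service) -> str:
    """Return the most demanding DSA category the service falls into."""
    if not (s.transmits_or_caches_third_party_info or s.stores_third_party_info):
        return "out of scope"
    if not s.stores_third_party_info:
        return "intermediary service"
    if not s.disseminates_stored_info:
        return "hosting service"
    if s.eu_user_count > VLOP_THRESHOLD and s.designated_by_commission:
        return "very large online platform"
    return "online platform"
```

Because the categories nest, a service in a higher tier also carries the obligations of every tier below it, which is why the function returns only the most demanding applicable category.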
Companies are not the only ones that must prepare for the DSA’s entry into force; the European Commission also has homework to do. Under the DSA, it must adopt implementing legislation setting out further details of how in-scope services will have to comply with the DSA’s rules.
Similarly, Member States will have to deal with national content moderation rules that will conflict with the DSA once it enters into force. For example, the German government has already announced that it will repeal the German Network Enforcement Act, which currently regulates the moderation of criminal content on social networks and video-sharing platforms.
In July 2022, the EU Council adopted the Digital Markets Act (DMA), the EU flagship law to regulate the competitive behavior of Big Tech. See our DMA client alert for more details.
Once the DMA is in force and fully applicable, the European Commission can designate companies as “gatekeepers” if they: (1) have a significant impact on the EU internal market; (2) offer certain “core platform services” as an important gateway for business users to reach end users; and (3) enjoy an entrenched and durable economic position.
After designation, a gatekeeper company is subject to 13 positive obligations and 9 prohibitions, which mainly deal with issues such as data access and data use, self-preferencing, bundling, and interoperability.
The DMA is likely to enter into force in early November 2022 and apply six months thereafter. Some likely gatekeepers have already changed their behavior to comply with the DMA, such as opening their ecosystems for third-party payment systems. Others are likely working on their DMA implementation plans as well, considering the significant changes that will be required to comply with some of the substantive obligations.
Any large company providing digital or technology services that fall within the “core platform services” definition should assess whether and how it will be affected.
The Commission has announced a DMA “implementing regulation” for early 2023 to provide more guidance on how to comply with the DMA’s substantive obligations. This marks a shift in focus towards enforcement of the DMA. While the Commission is looking to recruit 100 new staff and a Chief Technology Officer to enforce the DMA and the Digital Services Act, the Dutch Competition Authority recently suggested seconding national enforcers to the Commission.
But the key to effective enforcement will be cooperation with gatekeepers and other stakeholders. To this end, the EU opened its Silicon Valley Embassy to better coordinate with Big Tech and has suggested hosting workshops with consumers to discuss potential remedies for breaches of DMA requirements.
In September 2022, the European Commission published its new proposed European Media Freedom Act (EMFA). The EMFA aims to harmonise and enhance EU rules on media pluralism, increase cross-border cooperation between media regulators, and address public and private interference with media outlets.
If and when eventually adopted by the EU, the Commission’s proposed legislation will affect five different categories of addressees:
1. Providers of media services will enjoy additional protections against state interference and against unfair allocation of state advertising. This includes audiovisual and audio-only linear and on-demand offerings as well as press publications. However, media services with news and current affairs content will become subject to new obligations aimed at ensuring editorial independence of the relevant staff.
2. Manufacturers of devices and developers of user interfaces for audiovisual media services will have to implement functionalities so that users can change the default settings controlling or managing access to and use of such services.
3. Providers of very large online platforms as defined under the Digital Services Act (see above) will have to implement functionalities allowing users to self-declare that they are a media service under the EMFA. The very large online platform will then be subject to specific content moderation rules regarding content provided by declared media services.
4. Providers of audience measurement systems will be subject to general non-discrimination and transparency obligations, and they may have to disclose their methodologies upon request.
5. Providers of video-sharing platforms will not become subject to new substantive rules, but the EMFA facilitates cross-border enforcement of existing regulations for relevant services.
The Commission’s proposal will now be debated by the European Parliament and EU Council, followed by trilogue negotiations. On that basis, the EMFA is unlikely to be adopted before late 2023 and, given the currently proposed transitional period of six months, would most likely not take effect before mid-2024.
In September 2022, the EU Commission published a draft of its proposed “Cyber Resilience Act” or, as it’s now officially titled, the “Regulation on horizontal cybersecurity requirements for products with digital elements” (the “Regulation”).
The new Regulation will create detailed cybersecurity requirements for manufacturers, importers, and distributors of all “products with digital elements” (including hardware and software products), with extensive cross-references to the AI Act, the Machinery Regulation, and the NIS2 Directive. The most relevant obligations include the following:
Depending on the specific violation, non-compliance with the proposed obligations may be subject to fines of up to €15 million or up to 2.5% of the offender’s total worldwide annual turnover, whichever is higher.
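The “whichever is higher” mechanism can be shown with a short, purely illustrative calculation; the €15 million and 2.5% figures come from the proposal, while the function itself is our own sketch:

```python
def max_cra_fine(annual_worldwide_turnover_eur: float) -> float:
    """Illustrative upper bound of a fine under the proposed Cyber
    Resilience Act: the higher of EUR 15 million or 2.5% of the
    offender's total worldwide annual turnover."""
    return max(15_000_000, 0.025 * annual_worldwide_turnover_eur)

# For a company with EUR 2 billion in turnover, the percentage cap
# governs: 0.025 * 2_000_000_000 = EUR 50 million.
# For a company with EUR 100 million in turnover, the fixed EUR 15
# million floor is higher and therefore applies.
```

In practice, of course, any actual fine would depend on the specific violation and the enforcement authority’s assessment; the formula only shows the statutory ceiling.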
The Commission’s proposal will now be discussed in the EU Council and European Parliament. It’s currently not expected to be finalised before late 2023.
We previously reported that the EU Commission has been pushing forward laws that would bring various crypto-assets (including, potentially, NFTs) within the scope of EU regulation.
The Commission proposed the Regulation on Markets in Crypto-Assets (MiCA) because it believes that unregulated crypto-assets too often expose consumers and investors to significant risks. The purpose of MiCA is to create a harmonised European regulatory framework for the EU crypto ecosystem and to prevent fragmentation of the EU single market due to different national level regulation of crypto-assets (which has been emerging in some EU Member States, including Germany).
At the core of MiCA is a licensing requirement for certain crypto-activities. MiCA will govern the issuance of crypto-assets and the supervision of issuers, trading platforms, and certain service providers (e.g., wallet providers), as well as implement measures to prevent market abuse.
But one central issue has been the scope of crypto-assets to which MiCA will apply and whether NFTs will be covered.
In July 2022, the likely final text of MiCA was agreed; NFTs are excluded from MiCA’s scope unless they fall under existing crypto-asset categories. But within 18 months of MiCA becoming applicable, the Commission will be tasked with preparing a comprehensive assessment and, if necessary, creating a separate regulatory regime for NFTs that addresses the emerging risks of this new market.
MiCA defines “crypto-assets” as digital representations of value or rights that can be transferred and stored electronically using distributed ledger or similar technology. Whether a crypto-asset is accepted as a means of exchange or payment, or is used for investment purposes, is irrelevant for this definition. Similarly, it is irrelevant whether the value is issued or guaranteed by a central bank or public body or has the legal status of currency or money. MiCA further differentiates between certain crypto-assets and, in addition to crypto-assets in the narrower sense, creates three sub-categories: asset-referenced tokens, e-money tokens, and utility tokens.
NFTs (i.e., digital assets representing individual tangible or intangible assets, such as works of art, music, and videos, or ownership rights therein) are not covered by the current draft of MiCA unless they fall under existing crypto-asset categories. The legal and regulatory environment for NFTs is still evolving, and the extent to which NFTs may or should create ownership rights in certain assets remains under review. Notably, the mere attribution of a unique identifier to a crypto-asset is not sufficient to classify it as unique or non-fungible; the assets or rights represented must also be unique and non-fungible. And while NFTs are not currently subject to specific regulation at the EU level, they must comply with some existing legislation, including the Anti-Money Laundering Directive and, to the extent they are used to market and sell asset investments, investment regulations.
Finally, the current draft of MiCA has not explicitly addressed the question of how to reduce the high carbon footprint of mining cryptocurrencies. The draft does not include a ban on proof-of-work crypto mining, which would have effectively banned cryptocurrencies like Bitcoin that rely on a proof-of-work consensus mechanism.
The Commission will, within two years, report on the environmental impact of crypto-assets and consider introducing mandatory minimum sustainability standards for consensus mechanisms, including proof-of-work. The EU Sustainable Finance Taxonomy will likely be amended in due course (by January 2025) to include crypto mining among the economic activities that contribute substantially to climate change mitigation.
The UK’s ever-changing Online Safety Bill (OSB) is due to undergo even more reform. As covered in our previous articles (see European Digital Compliance: Key Digital Regulation & Compliance Developments and UK Publishes Draft Online Safety and Content Bill), the OSB was conceived as a mechanism for the UK government to regulate providers of user-to-user services and search services, with the aim of protecting children and adults from both illegal and so-called “legal but harmful” content online by imposing safeguarding and content duties on services companies and online platforms.
The OSB ran into choppy waters over summer 2022 due to political changes in the UK, with the administration of outgoing prime minister Boris Johnson eventually being replaced in early September 2022. One of the first orders of business for Michelle Donelan (the UK’s new Secretary of State for Digital, Culture, Media, and Sport) was to confirm that the OSB will see a number of changes, particularly in relation to legal but harmful content for adult users. The changes are said to better align with the UK government’s position on free speech.
Social media and large digital companies appear to be facing an easier task under the UK’s Online Safety Bill after the new minister in charge of the law confirmed on 6 October 2022 that the most controversial part of the law will be edited to protect free speech.
The OSB’s proposed provisions on legal but harmful content would have required large digital providers, such as major social media companies, to undertake risk assessments about content that’s legal but may be harmful to adults (e.g., in relation to anorexia and extreme weight loss), explain in their terms of service how they will handle that content, and apply those terms consistently.
The key potential changes to the OSB currently include the following:
1. Legal but Harmful Content: The definition of this content is set to be amended to protect adult users’ right to freedom of expression regarding their online activity; although it’s not yet known exactly how the OSB will be amended in this respect.
2. Illegal Content: The UK government has clarified companies’ illegal content duties by: (a) setting a ‘safety-by-design’ approach for companies to adopt when managing the risk of illegal content appearing on their platforms and (b) describing companies’ considerations for deciding whether content is illegal and should be removed.
3. Extended Ofcom Powers: The powers of the UK Office of Communications (“Ofcom”) may be extended to include the ability to order companies to: (a) make proportionate changes so they can use accredited technology to identify and remove certain illegal content (a departure from the previous position, which could have allowed companies to avoid using such technology because it doesn’t work with the functionality of their platform); (b) use their best endeavours to develop or source technology to identify and remove certain illegal content; and (c) use tools to prevent individuals from encountering certain content.
Ofcom’s proposed timeline (which anticipates the OSB receiving Royal Assent in early 2023) seems optimistic given the sheer number of expected amendments (not to mention, the delays invariably caused by the UK government’s recent Cabinet reshuffle). The OSB is currently due to undergo a third reading in the House of Commons before proceeding to the House of Lords. While the OSB’s overall legislative timeline is somewhat up in the air, Michelle Donelan seems just as keen as her predecessor to keep the OSB moving forward, having stated to BBC Radio 4: “We’ve got to get it back to the [House of Commons] and get it into law”.
Key sections of the UK’s Telecommunications (Security) Act 2021 (TSA) came into force on 1 October 2022 – also bringing into effect the Electronic Communications (Security Measures) Regulations 2022 (“EC Regulations”) on the same date. Together with the accompanying Code of Practice (the “Code”), this TSA framework regulates UK telecommunications networks and service providers.
The TSA is intended to strengthen the security of public telecommunications networks (including 5G and full fibre networks) and improve resilience against hostile cyber activity. It provides the UK government with the power to make regulations that impose specific security obligations on the providers of public telecommunications networks and services. The TSA aims to strengthen security by:
This follows a recent UK public consultation on the EC Regulations and the Code, which led to enhanced protection of data storage and monitoring/analytical tools, and further obligations on providers to identify, understand, and report on security risks internally.
Telecommunications networks and service providers must familiarise themselves with their new duties (under the TSA) and implement the minimum required security standards in accordance with the EC Regulations. The implementation deadlines vary depending on the tier of provider and the specific technical measures. For Tier 1 providers (i.e., those with £1 billion or more in turnover during the relevant year), some measures must be implemented by 31 March 2024 and key deadlines for all providers fall on 31 March in 2025, 2027, and 2028.
Non-compliance with these obligations could lead to Ofcom fines of up to 10% of the company’s annual turnover or, where a persistent violation occurs, £100,000 per day.
In September 2022, the UK’s communications regulator (Ofcom) published a policy paper setting out a new work programme. A key focus of Ofcom’s work will be understanding competition and consumer issues in broadcasting and communications markets in the light of digital disruption, which has fundamentally changed the ways in which consumers communicate and consume media. Ofcom recognises that its traditional focus on physical infrastructure and forms of communication, which are increasingly being replaced by digital platforms, needs to adapt to reflect current market realities.
Ofcom plans to undertake a programme of work to examine how digital technologies are working for consumers, and their impact on investment and innovation in the communications sectors. The programme will be undertaken in collaboration with other regulators, including the UK’s Competition and Markets Authority (CMA).
The programme will be pursued over the course of the next year and consists of four targeted workstreams:
1. A market study into cloud services in the UK. Ofcom will use its powers under the Enterprise Act 2002 to conduct a market study into cloud services and the position that the largest Tier 1 cloud providers have in the market. Ofcom will consider whether any identified competition issues have an adverse impact on consumer outcomes generally, including in Ofcom’s core markets of telecoms and broadcasting. Ofcom has extensive powers to request information and evidence from market participants in order to conduct its study and will likely follow the CMA’s approach of using this tool to conduct a general assessment of the state of the market, and any issues that may merit further action, including enforcement or a more detailed market investigation.
2. Net neutrality review. Ofcom will continue to examine the relationship between internet service and online content providers to determine its future approach to net neutrality. Ofcom will shortly consult on its proposed approach to this issue.
3. Competition in digital content gateways. Ofcom will examine the extent to which digital content gateways (e.g., connected televisions, smart speakers and digital assistants) are becoming essential routes to market such that they are distorting competition, in particular by impacting the range, pricing or quality of content available to consumers.
4. Online personal communications services. Ofcom will consider how developments in platform communication services (e.g., WhatsApp and Zoom) are impacting traditional calling and messaging markets. It will also look at how competition and innovation in these markets may evolve over the coming years, potential concerns arising from network effects and limited interoperability, and how these issues could impact its wider duties, such as securing access to essential services and end-to-end connectivity.
The market study into UK cloud services will be Ofcom’s key focus. The study formally kicked off on 6 October 2022, when Ofcom published its market study notice and “Call for Inputs”. These documents set out in detail the proposed scope of the study and seek input from stakeholders on a number of questions related to this. Interested third parties are invited to comment on the scope by 3 November 2022.
The Court of Justice of the European Union (CJEU) ruled in September 2022 that German rules requiring telecommunications providers to retain certain user data on every telecom connection for law enforcement purposes are not compatible with EU law. This is the second time that German data retention obligations have been struck down in court: their first incarnation was declared unconstitutional by the German Federal Constitutional Court in 2010, and the present rules were enacted in 2015 in reaction to that ruling.
The CJEU argued that the German regulations in the Telecommunications Act (TKG) which require telecommunications providers to store traffic and location data for ten and four weeks, respectively, allow very precise conclusions about an end user’s private life. For example, this data will contain information on habits in daily life, whereabouts, activities carried out, and social relationships, essentially allowing holders of this data to create a detailed profile of these individuals. The CJEU found that the infringement of privacy rights associated with the retention of this data outweighs the public interest in its storage – even more considering that this data would have to be stored for the connections of every telecom user in Germany, and regardless of any suspicion of a specific offence.
Following the CJEU decision, it’s likely that the referring German courts will follow suit and find that telecommunications providers in Germany do not have to comply with German data retention rules.
But in practice, this decision will cause very few changes. Enforcement of these rules has already been suspended since 2017, following preliminary summary findings by German courts that they likely violate EU laws.
However, in its ruling, the CJEU also provided guidance to Member States for a compliant data retention regime:
It remains to be seen whether Germany will follow these suggestions when it tries (again) to fix its data retention rules.
The German government’s new digital strategy classifies its goals into three broad categories:
1. Connected and digital sovereign society
2. Innovative economy, work environment, science, and research
3. A learning, digital state
The goals within each category cover infrastructure (e.g., 50% of households should be connected via fibre to the home), administration (e.g., 80% of public health insurance patients should use digital health records), and the institutional and legal framework (e.g., establishment of a “data institute” to administer a data trustee and licensing model). Some goals measure Germany’s progress vis-à-vis other EU countries, such as becoming a leading AI location in Europe. Other goals require EU-wide development and connect with the EU “Digital Decade”, such as creating an open ecosystem through the EU cloud project Gaia-X.
The German digital strategy focuses on three specific projects with “leveraging effect”, as these are especially important to facilitate other goals.
1. Creating modern, efficient, and sustainable gigabit networks and ensuring that access to data and data tools is available.
2. Establishing international norms and standards as a basis for interoperable services, such as autonomous driving.
3. Providing secure, user-friendly digital identities and modern registers for digital public administration.
The German government will present specific legislative proposals for these projects in the medium term.
While every German government ministry contributed to the digital strategy (and features at least one “beacon project”), the Federal Ministry for Digital and Transport coordinates and monitors the agenda’s progress.
It has been a busy summer for followers of the various European regulatory proposals to introduce a new regulatory framework for the use of artificial intelligence in Europe. The EU is trying to resolve internal differences in its proposed approach to regulation; while the proposals published by the UK overtly take a more light-touch, pro-innovation approach.
In April 2021, the EU Commission published a proposal for an EU Artificial Intelligence Act in the form of an AI Regulation which would be immediately enforceable throughout the EU. The proposal sparked a lively discussion among EU Member States and political parties in the EU Parliament, generating several thousand amendment proposals. On the basis of a joint draft report issued in April 2022 by various parliamentary committees examining the proposals, the EU Parliament is currently attempting to work out a text compromise. And the current Czech presidency of the EU Council has made new proposals to try to broker a compromise.
In addition to the AI Regulation, in September 2022, the EU Commission published its draft of a proposed AI Liability Directive which, in contrast to the AI Regulation, will have to be transposed into national law by the Member States within two years. The proposed AI Liability Directive is intended to establish the legal grounds for persons seeking non-contractual civil law compensation for damage caused by AI systems.
Meanwhile, outside the EU, the UK government has published an AI Regulation Policy Paper and AI Action Plan confirming that it intends to diverge from the EU’s regulatory regime. And, in June 2022, the UK made proposals on one key aspect of AI – the treatment of intellectual property rights. In both cases, the UK appears to be taking an approach that favours innovation over regulation.