Germany’s Draft Act Against Digital Violence: A Court-Centric Civil Enforcement Regime for Online Services
Germany is advancing a new legislative framework to strengthen civil and criminal protection against digital violence, responding to a sustained increase in online abuse and perceived enforcement deficits. These challenges are particularly acute where anonymous users impede the identification of perpetrators and where emerging technologies such as deepfakes challenge existing legal frameworks. On April 17, 2026, the German Federal Ministry of Justice published a draft of its new Digital Violence Act (Gesetz gegen digitale Gewalt).
While the draft introduces certain new criminal offenses, its core innovation lies elsewhere: it fundamentally reconfigures how rights are enforced against digital violence, placing courts, rather than platforms, at the center of enforcement through a dedicated civil procedure framework. For online platforms, hosting providers, and internet access providers, this translates into new procedural obligations, data handling requirements, and structured processes for judicial interaction and compliance.
Five Key Takeaways
1. Scope of Application
A central feature of the initiative is its relatively broad and technology-neutral scope. The legislative framework applies to online platforms and hosting services as defined under the Digital Services Act (DSA), as well as to internet access providers. It is therefore not confined to social media but extends to any service that can facilitate user-to-user harm within the categories of services defined in the DSA, acknowledging that digital violence can occur across a variety of technical environments. Notably, the draft introduces greater granularity within service categories compared to the DSA framework by distinguishing between “cloud hosting” and “web hosting” services within the broader DSA hosting category, and by providing a specific definition of “social media.” The latter is defined as an online platform within the meaning of the DSA with a primary purpose of enabling users to communicate and interact with one another, including through the sharing and dissemination of content.
For in-scope services, an important clarification lies in how the draft frames “digital violence.” Rather than introducing an autonomous legal category, the draft links its application to an exhaustive catalog of underlying criminal offenses, primarily involving infringements of personal rights committed online (e.g., defamation, hate speech, stalking, or image-based abuse).
2. New Procedural Obligations for Hosting Services, Online Platforms, and Internet Access Providers
Unlike earlier regulatory initiatives, the draft does not seek to expand substantive content moderation obligations. For providers, the main change is procedural, shifting from mere notice-based moderation to a court-driven enforcement model centered on judicial disclosure mechanisms. Key obligations include:
- Data Disclosure: Providers may be required, subject to prior judicial authorization, to disclose user data (including identity and IP-related information). Where the court is unable to directly identify or contact the affected user, providers may be required to inform the user of the proceedings and enable their participation, including the opportunity to submit statements anonymously or under a pseudonym and via the provider acting as intermediary. This mechanism is intended to address one of the most significant enforcement gaps in current practice: the difficulty of linking online conduct to identifiable individuals. At the same time, providers are effectively embedded into the procedure, assuming a quasi-intermediary role that requires internal workflows and standard operating procedures for user notification and the handling of user submissions.
- Evidence Preservation: Upon initiation of legal proceedings and a corresponding court order, providers must secure and retain relevant user data and content for the duration of the proceedings, transfer the data if ordered, and irreversibly delete it following a final decision.
- Temporary Account Suspensions: Courts may order the temporary suspension of user accounts on social networks in cases of severe personal rights violations where necessary to prevent recurrence. These measures are expressly framed as a last resort and are subject to strict proportionality and necessity requirements as well as procedural safeguards.
- Domestic Service Agent Requirement: Providers of social networks without an EU establishment must appoint a domestic service agent for legal proceedings. For EU-based providers, this requirement applies only upon specific court orders.
3. Expansion of Criminal Offenses
In addition to the procedural mechanisms aimed at strengthening civil enforcement, the draft Digital Violence Act introduces targeted amendments to the German Criminal Code (Strafgesetzbuch, StGB). While not directly linked to the EU content moderation regime, these changes may expand the range of conduct that qualifies as illegal content under national law (and therefore under the DSA where a German nexus exists).
The draft introduces the following three new offenses:
- First, the violation of intimate privacy through image recordings (Sec. 184k StGB) is broadened to capture a wide range of image-based abuse, including sexualized deepfakes that create the appearance of real persons in sexual contexts, even where such content is labeled as AI-generated;
- Second, the violation of personal rights through deceptive content (Sec. 201b StGB) specifically targets manipulated content capable of causing significant reputational harm; and
- Third, a new offense addressing unauthorized surveillance using information or communication technology (Sec. 202e StGB), covering forms of digital tracking and monitoring that interfere with personal autonomy.
4. Interaction with the Country-of-Origin (CoO) Principle and DSA
The draft explicitly addresses its interaction with EU law, in particular the country-of-origin (CoO) principle under the E-Commerce Directive and the DSA. As a general rule, this principle limits the ability of Member States to impose additional obligations on providers established in other EU jurisdictions.
The German legislator seeks to navigate these constraints by framing the core obligations under the draft, such as data disclosure and account suspension, as individual judicial orders rather than general regulatory requirements. Based on Recital 38 and Article 6(4) DSA, such case-specific measures fall outside the CoO restriction and remain permissible. For providers operating across the EU, this distinction is critical. Even where a provider is established in another Member State, German courts may still issue binding orders in individual cases requiring concrete action.
5. Compliance Considerations
The German government assumes that the additional compliance burden will remain limited in theory, given that the Digital Violence Act is framed as a case-by-case enforcement tool. In practice, however, platforms will need to establish and maintain robust and scalable internal workflows. These include processes to identify and retrieve relevant user data, to implement data preservation orders, and to execute account-related measures in line with court decisions, as well as mechanisms to inform users of proceedings, enable their participation (including the opportunity to submit statements anonymously or under a pseudonym), and relay user submissions to courts. The actual practical impact will depend significantly on the uptake of the new court procedures.
In addition, the introduction of new criminal offenses and the resulting expansion of the scope of potentially illegal content, particularly with regard to manipulated or synthetic media, will require corresponding updates to risk assessments and internal compliance frameworks. Platforms should also anticipate increased interaction with national courts and ensure that they are able to respond to judicial requests efficiently, regardless of their place of establishment.
What’s next?
The draft is currently subject to stakeholder consultation, with submissions invited until May 22, 2026. Following this phase, the proposal will be discussed within the German federal government before entering the legislative process in the German parliament and Federal Council. Entry into force is expected later in 2026.
We are grateful to our research assistant, Felicitas Lampe, for her contributions to this client alert.
Christoph Nüßing, Partner
Nina Graw, Associate