1 Introduction

Minors can be exposed to violence-glorifying, sexualized and racist content on video-sharing platforms. They can also be influenced by untrue and polarizing information and experience hate speech. Furthermore, this may lead to the infringement of personality rights guaranteed by the German Constitution.Footnote 1 Such infringements occurring on the internet rather than by ‘analogue’ means are more difficult to prosecute due to the principle of anonymity still in place on the internet—see § 13 (6) of the German Telemedia Act (Telemediengesetz; TMG)Footnote 2—making it less likely that perpetrators will be held accountable for their actions. The German Federal Ministry of Justice currently has no plans to implement ‘real name’ requirements,Footnote 3 and the information-providing obligations of the platforms and portals on which such infringing material is distributed are extremely limited. While information can be requested about stored data on users who have committed certain criminal offenses and/or personal sphere rights infringements pursuant to § 14 (3) TMG,Footnote 4 this right of information is useless if the platform does not have the infringer’s name but only their IP address. The victim in a given case must involve the public prosecutor’s office in the hope that it will be able to determine the identity of the perpetrator(s), but in many instances, such cases are dropped. It is abundantly clear that there is an urgent need for more effective protection of minors on the internet, in view of, for example, increasing press coverage of teen suicides prompted by infringements of personal sphere rights on the internet.Footnote 5

The Audiovisual Media Services Directive (AVMSD) of November 14, 2018 ((EU) 2018/1808)Footnote 6 is aimed at improving the protection of minors on video sharing platforms, partly in recognition of a changed risk situation for minors on the internet. During their developmental phase, minors are increasingly exposed to harmful content and must be especially protected from incitement to hatred, violence and terrorism, in particular when spread through misinformation. The principal provisions relevant to the protection of minors are set forth in the Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV),Footnote 7 the Protection of Minors Act (JuSchG),Footnote 8 and the Network Enforcement Act (NetzDG). The central aim is to hold platforms, as content intermediaries, more responsible than they traditionally have been. A similar trend is observable in other areas of law, including copyright law, under which platforms will in the future bear liability as perpetrators of infringement rather than as mere contributors.Footnote 9 Irrespective thereof, general monitoring obligations are prohibited pursuant to Article 15 of the E-Commerce Directive (transposed in § 7 (2) of the TMG), and this prohibition is to remain in place under the draft Digital Services Act. This paper examines what measures are imposed on platforms under the JMStV, JuSchG, and NetzDG, as well as how these may be structured going forward. In particular, the consequences for the protection of minors will be addressed.

2 Structure of this Paper

The paper begins with a detailed look at the changes brought about by a revision of the AVMS Directive (Sect. 3) before reviewing the corresponding measures used to implement these in national law (Sect. 4). These measures include amendments to the Interstate Treaty on the Protection of Minors in the Media (Sect. 4.1), the Protection of Minors Act (Sect. 4.2), and the Network Enforcement Act (Sect. 4.3). A summary of conclusions is provided in the final Sect. 5.

3 Amendments to the Audiovisual Media Services Directive

Video sharing platforms are subject to the regulatory regime established in the amended Audiovisual Media Services Directive (AVMSD; (EU) 2018/1808).Footnote 10 Requirements under the AVMSD and their implementation are discussed below.

3.1 Expanded Scope of Application

Amendments to the AVMSD have always involved expansions of scope. As technologies increasingly converged, the original scope covering classic television—thus the name “Television Directive”—was expanded via amendment to include non-linear services.Footnote 11 In the amended AVMSD, the scope of application, which is largely prescribed by the definitions of terms per Art. 1, has again been expanded.Footnote 12 An audiovisual media service within the meaning of Art. 1 a) i) AVMSD is now defined as a service whose main purpose, or a dissociable section of which, is to provide content via electronic communication networks to the general public for informational, entertainment, or educational purposes under the editorial responsibility of a media service provider. What is new is that the AVMSD now also applies to providers of discrete video content that is comprehensible on its own and has no direct connection to other content.Footnote 13 The scope of application has additionally been expanded to include video sharing platforms. The primary purpose or essential function of a video sharing platform is to make programming or user-generated video content electronically available to the general public, with the video sharing platform provider bearing no editorial responsibility. The operator of a video sharing platform determines only the organization of the platform, not what content is available on it, see Art. 1 aa) AVMSD. Thus, in addition to major providers like YouTube, the AVMSD will now also apply to audiovisual content distributed by users on social media platforms like Facebook, or in separate sections of newspaper websites.Footnote 14 The expansion of the AVMSD to include video sharing platforms has most recently led to the addition of Chapter IXa with Art. 28a and 28b AVMSD. The geographic scope of application is extended under Art. 28a (1) and (2) AVMSD to include providers which have, or effectively have, a branch operation within the territory of an EU Member State, or whose parent company, subsidiary, or other group affiliate is domiciled in a Member State.Footnote 15 Preventive protection measures are required under Art. 28b AVMSD, such as certain requirements for the protection of minors applicable to video sharing platform operators.Footnote 16

3.2 Amendments to Media Laws for the Protection of Minors

The former Art. 12Footnote 17 and Art. 27 AVMSD, which provided for separate regulation of television programming and on-demand services with graduated levels of protection for minors, were eliminated in the amended AVMSD. Art. 6a (1) AVMSD now requires all providers of audiovisual media services to establish barriers to the exposure to and consumption of content deemed deleterious to development. It provides that audiovisual services that could harm minors’ physical, mental, or moral development may only be provided in a manner that ensures that minors will generally not be exposed to or consume such audio and image/video content. The measures implemented to this end include broadcast scheduling, age verification procedures, and other technical measures.

Platform operators’ responsibilities were also specifically regulated to implement protections against content that poses a danger to minors or incites hatred or violence (Art. 28b (1) AVMSD), as well as to comply with advertising requirements as per Art. 28b (2) AVMSD.Footnote 18 In particular, Art. 28b (3) AVMSD provides that platforms must enable their users to designate content as unsuitable for minors as per Art. 28b (1) AVMSD. Platforms themselves must have reporting systems in place for unsuitable content in accordance with the aforementioned paragraph 1 and provide systems for parental control enabling parents to keep such content out of their children’s accounts. Additionally, the AVMSD provides for the “set-up and operation of age verification systems” in these clauses.Footnote 19

4 Measures Required Under National Law

4.1 The Interstate Treaty on the Protection of Minors in the Media

The amended provisions of the AVMSD have been implemented in the similarly amended Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV) in special regulations applicable to video sharing services. A key change is the new clause § 5a (1) of the amended JMStV, which now expressly obligates video sharing services to implement adequate measures to protect children and teens from content deleterious to their development, irrespective of the obligations per § 4 and § 5 JMStV.Footnote 20 While the necessity of this detailed clause has been questioned, it does make clear that platforms can be held liable for third-party content.Footnote 21 It must be noted that the obligations per the new § 5a JMStV “apply irrespective of the obligations per § 4 and § 5 JMStV”. These obligations therefore require commentary (Sects. 4.1.1 and 4.1.2) before the latest amendments to the JMStV are addressed (Sect. 4.1.3).

A general categorical distinction is made in the JMStV between “content endangering to minors” and “content deleterious to development”. The former is generally prohibited from distribution under § 4 JMStV and, pursuant to § 4 (2) 2nd st. JMStV, may only be made accessible in telemedia to a closed user group with robust mandatory age verification. The latter, by contrast, is fundamentally legal to distribute, although providers are obliged under § 5 JMStV to prevent minors from consuming such content “under normal circumstances”.Footnote 22

4.1.1 Protection Against Endangering Content for Minors

The list of “absolutely prohibited content” per § 4 (1) 1st st. JMStV remains unchanged. Such content is generally prohibited in both broadcasting and telemedia. The list is intended to ensure the upholding of human dignity and to prevent sexual abuse by banning child pornography and similar or related endangering content. In particular, the list of “absolutely prohibited content” serves to establish that making such content public in violation of legal norms for the protection of society constitutes a criminal act under media-specific protection of minors laws.Footnote 23 Corresponding administrative offenses are set out in § 24 (1) no. 1 a–k JMStV.Footnote 24

In contrast, the distribution of “content endangering to minors” per § 4 (2) 1st st. JMStV is only illegal under certain circumstances. The distribution of such content in telemedia is permitted, as per the 2nd st., if access is restricted to adult users.Footnote 25 Content endangering to minors, such as regular pornography in particular (i.e. pornography without relevance to the sexualization of children or other crime), may therefore be distributed via telemedia in exceptional cases despite the general prohibition.Footnote 26 The provider must ensure that such content is only accessible within “closed user groups”.Footnote 27 The existence of such a user group is ensured by having a reliable age verification system in place that requires personal identification, although under the JMStV there are no officially prescribed recognition rules.Footnote 28 There is the possibility, however, of the Commission on the Protection of Minors (Kommission für Jugendmedienschutz; KJM) assessing the merits of a given age verification system with reference to the set of criteria the Commission has outlined.Footnote 29 Solution concepts based thereupon primarily utilize video calls for identification purposes or draw upon successful identity verificationFootnote 30 carried out elsewhere, such as when opening a bank or savings account.Footnote 31 In a recent development, age verification providers have even integrated autoident technology into their systems.Footnote 32 This technology enables user identification by automatically cross-referencing a photo against biometric and other data stemming from an identifying document.Footnote 33
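For illustration only, the following minimal sketch (in Python, with hypothetical names; it does not represent any KJM-assessed system) shows the basic gating logic behind a closed user group: content endangering to minors is delivered only to users whose adult status has already been positively verified, e.g. via video identification or an autoident check.

```python
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    # Set to True only after a reliable identity/age check
    # (e.g. video identification, bank-account identification,
    # or autoident against an identifying document).
    adult_verified: bool = False

@dataclass
class Content:
    content_id: str
    endangering_to_minors: bool  # e.g. 'regular' pornography per Sect. 4.1.1

def may_deliver(content: Content, user: User) -> bool:
    """Closed user group gate: endangering content only for verified adults."""
    if content.endangering_to_minors:
        return user.adult_verified
    return True

# Usage example
if __name__ == "__main__":
    adult = User("u1", adult_verified=True)
    minor = User("u2", adult_verified=False)
    clip = Content("c1", endangering_to_minors=True)
    print(may_deliver(clip, adult))  # True
    print(may_deliver(clip, minor))  # False
```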

4.1.2 Protections Against Content Deleterious to Development

Conceptual distinction must be made between content “endangering to minors” and content “deleterious to the development of minors”. Fundamentally, the distribution of content deleterious to development is legal, but pursuant to § 5 (1) 1st st. JMStV, providers of such content must ensure that children and youth of the concerned age levels will generally not be exposed to it.Footnote 34 Content deleterious to the development of minors is content that can lead to dysfunction through overstimulation or other excessive stressors; lack of socio-ethical orientation, such as by confusing fiction and reality; or impairment of the maturation of children and youth into responsible adults.Footnote 35 In implementing the mandatory controls limiting access to such content, providers must take into account which age groups are concerned or would thereby be affected.Footnote 36 The provider’s obligations extend solely to ensuring that minors of the concerned age levels will “generally not be exposed” to content deleterious to their development; there is no requirement to render access to such content completely impossible.Footnote 37 Access need not be prevented entirely, but only made more difficult. On television, this is implemented by limiting the broadcast to certain times, e.g. late evening hours. Due to the ubiquitous access to content on video sharing platforms, finding technical solutions is increasingly challenging. Providers of such platforms can fulfill these requirements by labeling their content as relevant for the protection of minors so that it can be recognized by filtering software. Parents can install such software, which evaluates the labels and displays only content suitable for the child’s age.Footnote 38 In addition, providers may restrict availability to certain times and/or implement other technical measures.Footnote 39 Accordingly, the requirements per § 5 JMStV are less stringent than the requirement of enforcing a closed user group with an age verification procedure as per § 4 (2) 2nd st. JMStV.Footnote 40
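As a rough illustration of the labeling approach described above (hypothetical labels and function names; real deployments rely on standardized, machine-readable age labels read by recognized youth protection programs), a parental filter could compare a provider’s age label against the child’s age as follows:

```python
from typing import Dict, Optional

# Hypothetical provider-side age labels attached to content items.
CONTENT_LABELS: Dict[str, int] = {
    "news_clip": 0,
    "action_series_episode": 16,
    "horror_trailer": 18,
}

def is_suitable(content_id: str, child_age: int,
                labels: Dict[str, int] = CONTENT_LABELS) -> bool:
    """Parental filter: show content only if the child's age meets the label.

    Unlabeled content is treated conservatively and hidden.
    """
    min_age: Optional[int] = labels.get(content_id)
    if min_age is None:
        return False
    return child_age >= min_age

# Usage example: a filter configured for a 14-year-old child
visible = [cid for cid in CONTENT_LABELS if is_suitable(cid, child_age=14)]
print(visible)  # ['news_clip']
```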

4.1.3 Expanded Media Protection for Minors

The increasing popularity of video sharing services, among minors in particular, led to the amendments to § 5a and § 5b JMStV, implementing Art. 28b AVMSD,Footnote 41 which constitute the primary changes to the law. Art. 28b AVMSD is the central provision for protecting minors against harmful content and content that incites hatred or violence (see Sect. 3.2). The overall scope of the JMStV was also expanded, particularly its geographic scope of application, which is of key importance for platform operators.

4.1.3.1 Applicability to Foreign Providers

The new § 2 (1) 2nd st. of the amended JMStV clarifies that the provisions of the JMStV likewise apply to providers based outside of GermanyFootnote 42 if they host content intended for use in Germany.Footnote 43 This was implemented to enhance regulators’ ability to enforce the law against foreign providers, among other objectives.Footnote 44 Foreign providers are thus now required to appoint a domestic authorized recipient of correspondence under § 21 (2) of the amended JMStV. However, the expansion of the scope of the JMStV to include foreign providers is subject to the limits of the ‘country of origin’ principle. This became evident through the amendment of a provision to insert an express reference to compliance with the country of origin principle pursuant to a resolution adopted by the Conference of Minister Presidents (Ministerpräsidentenkonferenz; MPK).Footnote 45 The country of origin principle, proceeding from Art. 3 of the e-Commerce Directive (ECD) and implemented in § 3 (2) TMG, means that the free movement of telemedia services that are offered or provided in Germany by service providers domiciled in another state subject to the ECD may not be restricted. The effects of the exceptions per Art. 3 (4)–(6) ECD and § 3 (5) and (6) TMG must be specifically considered on a case-by-case basis. In actual practice, however, regulatory measures are only allowed in a few individual cases within the framework of these stringent exceptions, and the JMStV is virtually inapplicable to providers domiciled in other EU countries.Footnote 46

4.1.3.2 Expanded Measures for Video Sharing Service Providers

In addition to the familiar pre-existing methods, the suitable measures for protection against content deleterious to minors’ development proposed for providers of video sharing services under § 5a (2) of the amended JMStVFootnote 47 include implementing and operating age verification procedures and systems that allow parents to control access to content. Using the term “age verification” for these less stringent procedures, which concern content that is merely deleterious to development, can be confusing, as the same term also refers to the systems employed to create closed user groups for content endangering to minors per § 4 (2) 2nd st. JMStV, which are distinct in that they must be complex and non-circumventable.Footnote 48 The Official Unified Declaration of German States clarifies that the term “age verification procedures” per § 5a JMStV refers both to procedures that establish age group classification and to those that create closed user groups.Footnote 49 Pursuant to § 5a (2) 2nd st. JMStV as amended, video sharing services must also implement a user feedback/rating system for monitoring the effectiveness of the procedures in place.

4.1.3.3 Codified ‘Notice and Take Down’ Procedure

The newly inserted § 5b in the amended JMStV serves to determine when content is illegal within the meaning of §§ 10a ff. TMG. Pursuant to § 10a (1) TMG, video sharing platform providers are obligated to have reporting procedures in place that enable users to electronically file complaints about illegal content being made available on their respective platform. Whether content is illegal proceeds from § 4 and § 5 JMStV: content deleterious to the development of minors is only illegal if it is made available to the general public and the video sharing service provider has not fulfilled the obligations per § 5 (1), (3)–(5) JMStV, see § 5b nos. 1 and 2 JMStV as amended.

These provisions codify a ‘notice and take down’ procedureFootnote 50 for video sharing service providers. Established under § 10 TMG, the fundamental rule is that “host providers”,Footnote 51 a term that includes video sharing providers like YouTube,Footnote 52 are not liable for third-party data which they save on a user’s behalf. The prerequisites apply, however, that they must either have no knowledge of the illegal act/data—and further, if damages are claimed, be unaware of any facts or circumstances which render the illegal act/content obvious—or have taken action to remove or restrict access to such content immediately after becoming aware of it. Host providers do not in any case have preventive review obligations, i.e. obligations to monitor or investigate activities, pursuant to § 7 (2) 1st st. TMG. However, under § 10a and § 10b TMG in conjunction with § 5b JMStV as amended, video sharing platform providers are now expressly obligated to set up a reporting procedure which is easily recognizable as such, easy to use, directly accessible, and continuously available, as well as to review user reports to ascertain whether content violates media laws concerning the protection of minors. The competent state media authority is responsible for monitoring compliance with these newly created regulations under § 14 (1) of the amended JMStV.

4.1.3.4 Self-Regulation Mechanisms for Social Media

Self-regulation mechanisms supplement the systems/procedures for the protection of minors required by law. Pursuant to § 7 (1) JMStV, the telemedia provider must appoint a Protection of Minors Officer if the provider’s platform is publicly accessible and contains content that is endangering to minors or deleterious to their development. Additionally, voluntary self-regulation panels exist, which are recognized by the KJM in accordance with § 16 2nd st. no. 2 and § 19 JMStV. Lastly, the terms of use of social media networks also provide for the protection of minors, such as the Facebook ban on nudity and pornography.Footnote 53

4.2 Protection of Minors Act

Under § 24a of the Protection of Minors Act (Jugendschutzgesetz; JuSchG), service providers which save or provide third-party content for users with a profit motive must take appropriate and effective structural precautionary measures, irrespective of § 10 TMG, to ensure that the protective objectives per § 10a nos. 1–3 JuSchG are upheld. The protective objectives per § 10a JuSchG are as follows: (1) to afford protection from media likely deleterious to the development of minors and/or to their maturation into responsible, socially adequate adults (media deleterious to development); (2) to afford protection from media that endanger the development of minors and/or their maturation into responsible, socially adequate adults (media endangering to minors); (3) to protect the personal integrity of minors as media users; and (4) to provide orientation for children, youth, parents/guardians, and educators regarding media usage and literacy.

General monitoring obligations are prohibited per Art. 15 ECD and § 7 (2) TMG; “structural precautionary measures” therefore do not entail reviewing the content of media available on the platform. The draft Digital Services Act provides that reviewing content on a voluntary basis will not, going forward at any rate, cause platforms which merely make third-party content available to lose their exemption from liability. In the future, such host providers will only be liable for content made available, even if content is checked in advance, if they become aware of a specific legal violation (see Art. 5 and 6 DSA as proposed).

In the explanatory memorandum to the law, the term “structural precautionary measures” per § 24a JuSchG is outlined to mean:

“the structuring of a service/offering so as to facilitate the protection of the personal integrity of minors, their protection against exposure to content of a deleterious or endangering nature, and their ability to take steps accordingly on their own behalf.”Footnote 54

A list is provided specifying a range of measures which may be appropriate given the technical features and the terms of use of the respective service or offering, as well as the content and/or structuring thereof. However, measures cannot be imposed upon platforms that would create excessive hardship, for constitutionality reasons among others. Additionally, the ‘regulatory triangle’ concept applies to platform regulation, according to which the rights and interests of content providers, platform users (minors in this case), and platforms themselves are to be appropriately weighed.Footnote 55 It is thus proper and important that each case be considered individually, as the law prescribes. The listed measures include reporting and complaint systems, classification schemes for user-generated content, age verification procedures, information on where to obtain advice and assistance and report issues to an independent entity not affiliated with the provider, technical means enabling parents and guardians to control and monitor content usage, and terms of use suitable for the protection of minors.

These measures are indicative of a trend toward platform regulation predominantly through design obligations, i.e. requiring the operator to structure and organize the platform in a child-friendly manner (‘child protection by design’).Footnote 56 If it is found in a given case that these requirements are not met, a dialogue with regulators first takes place aimed at improving the content offered and the platform design. Only if the issues remain unresolved will specific prevention measures be ordered. Failure to comply with such orders is punishable by a fine of up to five million euros, see § 28 (3) no. 4 and (5) 1st st. JuSchG. The regulator is the Federal Review Board for Media Harmful to Minors, which is to be reorganized as the Federal Center for Media Protection for Minors. The law applies equally to service providers not domiciled in Germany pursuant to § 24a (4) JuSchG.

4.3 Network Enforcement Act

While the Network Enforcement Act (Netzwerkdurchsetzungsgesetz; NetzDG) is not explicitly focused on the protection of minors, it is intended to afford protections to affected parties, including minors. The regulations it sets forth originally applied solely to social media networks per § 1 (1) NetzDG, but the scope of the law is being expanded to include video sharing platforms as part of implementation of the AVMSD. An outline of the fundamental provisions of the NetzDG is first provided below before discussing the regulations governing video sharing platforms in regard to their applicability requirements and legal ramifications.

4.3.1 The Regulatory Framework of the Network Enforcement Act

Social media networks per § 1 (1) NetzDG that have more than two million registered users in Germany are subject to specific reporting obligations (§ 2) and erasure obligations (§ 3), under penalty of fines (§ 4), pursuant to the NetzDG.Footnote 57 Under § 3 NetzDG, social media network providers are required to have a process in place for filing complaints about illegal content that is “easily recognizable as such, directly accessible, and available at all times.”Footnote 58 Illegal content within the meaning of § 1 (3) NetzDG, to which the complaint procedure per § 3 (1) NetzDG applies, includes content of an insulting nature per §§ 185 ff. of the German Penal Code (StGB), of a threatening nature per § 241 StGB, or of a nature violating an individual’s intimate personal sphere per § 201a StGB. Whether an offense is culpably committed is legally irrelevant.Footnote 59

Social media network operators were already required to have complaint management processes in place for breach of privacy issues prior to enactment of the NetzDG, by virtue of the blog entry procedure established by the German Federal Court of Justice (BGH).Footnote 60 However, not every case in which the criteria of the above offenses are met involves a legal breach of privacy, nor is every breach of privacy criminally relevant. Even for the overlap between offenses covered by the NetzDG and illegal breaches of privacy, the process now has to ensure that the social media network provider promptly registers the complaint and reviews whether the content reported in the complaint is illegal and has to be removed or access to it restricted.Footnote 61 In some cases, other deadlines apply for breach of privacy incidents. However, § 10 TMG already requires, as a fundamental principle, that action be taken promptly upon receiving notification, and EU Member States are not able to attach further nuance to that principle.Footnote 62 Any illegal content has to be promptly removed or access to it restricted, generally within seven days of receipt of the complaint.Footnote 63 Furthermore, social media network providers must remove content which is obviously illegal, or restrict access to it, within 24 h of receiving a complaint; for obviously illegal content, a shorter period thus applies. This is conditional upon the complaint in question stating sufficiently specific information.Footnote 64 The applicable deadline is extended accordingly if the social media network provider forwards the matter to a recognized regulated self-regulation panel per § 3 (6)–(8) NetzDGFootnote 65 to decide on illegality, accepting the panel’s decision as binding. The legal position of such panels and the requirements they are subject to remain unclear, however.Footnote 66 Enactment of the NetzDG did not result in significant changes to the complaint management procedure employed by social media network providers. Social media networks still primarily review content with reference to their community standards; the specific review required under the NetzDG is implemented as a downstream step.Footnote 67 There is still no final clarity on whether community standards that differ from fundamental legal requirements are in any way valid.
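The deadlines described above can be summarized in a simplified sketch (illustrative only; statutory exceptions such as agreements with law enforcement are omitted, and referral to a self-regulation panel is reduced to a single flag):

```python
from datetime import datetime, timedelta
from typing import Optional

def removal_deadline(received: datetime,
                     obviously_illegal: bool,
                     referred_to_self_regulation: bool = False) -> Optional[datetime]:
    """Rough deadline logic per Sect. 4.3.1 (simplified).

    - obviously illegal content: removal/blocking within 24 hours of the complaint
    - other illegal content: generally within seven days of the complaint
    - referral to a recognized self-regulation panel suspends the seven-day
      deadline (returned as None here, since the panel's decision is awaited)
    """
    if obviously_illegal:
        return received + timedelta(hours=24)
    if referred_to_self_regulation:
        return None  # deadline extended until the panel decides
    return received + timedelta(days=7)

# Usage example
received = datetime(2021, 6, 1, 12, 0)
print(removal_deadline(received, obviously_illegal=True))    # 2021-06-02 12:00:00
print(removal_deadline(received, obviously_illegal=False))   # 2021-06-08 12:00:00
```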

Violations of the NetzDG are punishable by fine as per § 4 (1) no. 2. If the fine-imposing authority intends to base its decision on the illegality of content, a court decision on the illegality must first be obtained (see § 4 (5) NetzDG). Per § 68 (1) of the Administrative Offenses Act (Ordnungswidrigkeitengesetz; OWiG), the competent court is the court of jurisdiction in the district where the administrative authority (the Federal Office of Justice) is based, i.e. Bonn Local Court.Footnote 68 While the local court decision is binding and cannot be challenged,Footnote 69 the decision of the Federal Office of Justice as fine-imposing authority, which draws on the local court’s decision, can be contested by filing an objection.Footnote 70 Some consider this obligation to obtain a decision from Bonn Local Court regarding the illegality of specific content before a fine can even be imposed to be alien to the legal system.Footnote 71 In any event, there would likely be agreement that if Bonn Local Court were demonstrably overloaded, involving other courts in the matter should be considered.Footnote 72

Pursuant to § 2 (1) NetzDG, a semi-annual complaint-handling report must be published on the provider’s website and in the Federal Gazette. Pursuant to § 5 NetzDG, the provider must also name a permanent contact person in Germany who is easily identifiable as such, as well as an authorized recipient of served documents to facilitate legal enforcement.

4.3.2 Amendment of the Network Enforcement Act

The NetzDG is currently being altered in two ways: first, by the April 2021Footnote 73 enactment of the Act against Right-Wing Extremism and Hate Crimes,Footnote 74 and second, by an amendment to the NetzDG itself. The explanations outlined below, which particularly concern heightened obligations for platforms and their impact on platform design, relate mostly to the latter of these two pieces of legislation.Footnote 75

In addition to heightened reporting obligations, the amendment provides for supplementation of the complaint handling procedure per § 3 NetzDG as follows:

  • an obligation to promptly notify users when a complaint is received over content stored for them

  • an obligation to retain removed content for a period of ten weeks for evidentiary purposes

  • an obligation to inform the complainant and the user concerned of the corresponding decisions made

  • a legal basis for forwarding data to a recognized regulated self-regulation organization

  • provisions for establishing regulated self-regulation

Platform design is addressed as well, with the social media network provider being obligated, for example, to implement a process for reporting to the Federal Criminal Police Office, see § 3a NetzDG.Footnote 76 The provider must additionally ensure that an easily identifiable procedure is in place for contacting the provider (see § 3b (1) 3rd st. NetzDG). A remonstrance procedure is being introduced that enables users whose content has been deleted to protest its deletion (see § 3b NetzDG). An arbitration procedure is likewise introduced under § 3c NetzDG. There are also rudimentary supplemental provisions regarding fines under § 4 NetzDG, and a supervisory authority is established under § 4a NetzDG. The responsibilities of the named domestic authorized recipient of served documents are supplemented under § 5 NetzDG, and the transition period provisions are supplemented under § 6 NetzDG. Lastly, video sharing platforms are placed within the scope of the NetzDG under § 3d–§ 3f NetzDG, implementing Art. 28b AVMSD. Except as otherwise provided under § 3e (2) and (3), the NetzDG applies to video sharing platforms. The NetzDG does not address minors in particular and thus does not require specifically minor-friendly procedures to be designed; the procedures need only be recognizable to the average user. Still, the amendment of the NetzDG might have positive effects for minors, such as the deletion of harmful content and the comprehensive, practical implementation of the procedures mentioned.

4.3.3 Provisions Regulating Video Sharing Platforms

Video sharing platforms fall fundamentally and entirely within the framework of the NetzDG by virtue of § 3e (1) NetzDG, thereby obtaining the same status as social media networks. The term “video sharing platform service” is defined in § 3d (1) NetzDG. The conceptual content of this term and of the other definitions conforms with the requirements under the AVMSD.Footnote 77 The scope of application of the NetzDG is thus expanded with regard to the provisions of the AVMSD governing the removal of illegal content.Footnote 78 Certain video sharing platforms already met the definition of a social media network per § 1 (1) NetzDG, since that provision covers all forms of communication.Footnote 79 However, the expansion in scope matters for video sharing platforms that specialize exclusively in distributing specific content, such as the publication of scenes from computer games.Footnote 80 Such content is of particular interest to minors and may be harmful due to violent or sexualized elements in computer games. Thus the content regulated by the NetzDG, both in general and after the amendment, is of special relevance to children. Here, it must be kept in mind that a social media network under the definition per § 1 (1) NetzDG is a platform designed for any content, rather than for specific content. Because the provisions on video sharing platforms are oriented toward the requirements of the AVMSD, the NetzDG does not apply to social media networks and video sharing platforms in exactly the same way; there are differences in the details. The discussion below therefore focuses first on differences in how the NetzDG applies to social media networks versus video sharing platforms before turning to the legal ramifications in regard to the protection of minors.

4.3.3.1 Applicability of the Network Enforcement Act

The applicability of the NetzDG to video sharing platforms is limited under § 3e (2) and (3). These limitations may apply to smaller video sharing platforms with fewer than two million registered users in Germany (sub-Sect. 4.3.3.1.1) and to video sharing platforms domiciled outside Germany in another EU Member State (sub-Sect. 4.3.3.1.2).

4.3.3.1.1 Limited Applicability to Smaller Video Sharing Platforms in Germany

Pursuant to § 1 (2) NetzDG, social media networks below a certain size, i.e. with fewer than two million registered users in Germany, are exempt from all requirements under § 2–§ 3b NetzDG. Smaller platforms are to be spared the excessive burden that compliance with the NetzDG would impose.Footnote 81 For smaller video sharing platforms, in contrast, the requirements per the AVMSD still fundamentally apply,Footnote 82 i.e. video sharing platform providers domiciled in Germany with fewer than two million registered users in Germany are not exempt from all obligations under the NetzDG, pursuant to § 3e (2) 3rd st.

Instead, alleged illegal content on smaller video sharing platforms is viewed in relation to distinct categories. Certain content from the general list per § 1 (3) NetzDG is referenced under § 3e (2) 2nd st. NetzDG—namely user-generated videos and broadcasts that constitute criminal acts under §§ 111, 130 (1) and (2), 131, 140, 166, and 184b in conjunction with § 184d of the Criminal Code (Strafgesetzbuch; StGB). These are categories of content already covered by the NetzDG which are also regulated under the AVMSD. Where such specifically listed illegal content is concerned, requirements under the NetzDG still apply to smaller video sharing platforms as well, though to a lesser extent: they are exempt from reporting obligations per § 2 NetzDG and notification obligations per § 3a NetzDG. The complaint procedure per § 3 (1) NetzDG is essentially limited to the deletion of obviously illegal content. There is neither a review deadline for ‘regular’ illegal content, nor an obligation to retain removed content, nor any specific requirement regarding in-house monitoring of complaint handling. A remonstrance procedure pursuant to § 3b NetzDG is required, however.

Regarding content not relevant to the criminal offenses specified under § 3e (2) 2nd st. NetzDG, smaller video sharing platforms are likewise exempted from all obligations per § 2–§ 3b NetzDG, in accordance with § 1 (2) NetzDG. Such content, which for small video sharing platforms thus does not trigger any obligations under the NetzDG, includes, for example, content illegal for reasons concerning the preservation of the democratic and constitutional state (§§ 86, 86a StGB), preservation of public order (§§ 126, 129–129b StGB), and protection of personal dignity (§§ 185–187 StGB).

Thus, while smaller video sharing platforms enjoy privileges compared to larger video sharing platforms with respect to liability for illegal content and related obligations, such privileging falls short of the exemption for smaller social media networks.

4.3.3.1.2 Limited Applicability to Video Sharing Platforms in Other EU Countries

Because the requirements per the AVMSD are directly referenced, the geographical scope of applicability of the NetzDG to video sharing platforms is derived differently and with greater specificity. The AVMSD applies the country of origin principle.Footnote 83 The NetzDG retains this in § 3e (3); there is thus a fundamental reliance on the degree of protection required at the European level being ensured by the Member State in which the video sharing platform provider is domiciled, or is effectively domiciled pursuant to § 3d (2) and (3) NetzDG.Footnote 84 Therefore, video sharing platforms that are actually or effectively domiciled in Germany are fully subject to the NetzDG, whereas video sharing platforms domiciled in another EU Member State are only obligated to comply with the NetzDG to a limited extent. Here, too, alleged illegal content must be viewed in relation to distinct categories. There is no fundamental obligation under the NetzDG regarding content that falls within the above-referenced list per § 3e (2) 2nd st. NetzDG; such an obligation can only be triggered in specific cases by a special order of the Federal Office of Justice pursuant to § 3e (3) and § 4a NetzDG. In contrast, regarding all other content per § 1 (3) NetzDG, the obligations under the NetzDG apply in full. This concerns a substantial amount of illegal content, because criminal violations of dignity and personal sphere protections are a particularly significant focus within the framework of the NetzDG. After all, the most frequent reasons for content removal are violations of personal sphere rights or hate speech under community standards.Footnote 85 Violations of dignity and personal sphere protections do not only affect adults; they are of special relevance to minors because of the dangers that bullying and hate pose to their development.
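The category-based applicability described in this and the preceding sub-section can be condensed into a simplified decision sketch (illustrative only; the offense list is abbreviated, the names are hypothetical, the scope of a special order by the Federal Office of Justice is not differentiated, and the interaction between the size threshold and foreign domicile is not modeled):

```python
from enum import Enum, auto

class Scope(Enum):
    FULL = auto()      # full NetzDG obligations (§§ 2–3b)
    REDUCED = auto()   # reduced obligations for offenses listed in § 3e (2) 2nd st.
    NONE = auto()      # no NetzDG obligations

# Abbreviated stand-in for the offenses referenced in § 3e (2) 2nd st. NetzDG
LISTED_OFFENSES = {"§ 111 StGB", "§ 130 StGB", "§ 131 StGB",
                   "§ 140 StGB", "§ 166 StGB", "§ 184b StGB"}

def vsp_obligations(registered_users_de: int, domiciled_in_germany: bool,
                    offense: str, special_order: bool = False) -> Scope:
    """Simplified routing of NetzDG obligations for video sharing platforms."""
    listed = offense in LISTED_OFFENSES
    if not domiciled_in_germany:
        # Country-of-origin principle: for listed offenses, obligations arise
        # only upon a special order; for all other § 1 (3) content they apply in full.
        if listed:
            return Scope.REDUCED if special_order else Scope.NONE
        return Scope.FULL
    if registered_users_de < 2_000_000:
        # Smaller domestic platforms: reduced obligations for listed offenses only.
        return Scope.REDUCED if listed else Scope.NONE
    return Scope.FULL

# Usage example: a small German platform and a large platform domiciled elsewhere in the EU
print(vsp_obligations(500_000, True, "§ 131 StGB"))     # Scope.REDUCED
print(vsp_obligations(5_000_000, False, "§ 185 StGB"))  # Scope.FULL
```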

This nuanced view regarding geographical applicability to video sharing platforms is distinct from the geographical scope of application of the NetzDG to social media networks. Social media networks have to fully comply with requirements under the NetzDG even if domiciled in another EU Member State.Footnote 86 This deviation from the country-of-origin principle remains a subject of criticism.Footnote 87

Accordingly, the distinction between social media networks and video sharing platforms in the NetzDG is highly important, with special relevance to smaller platforms with fewer than two million registered users in Germany and to those domiciled in other EU countries. Platforms that represent a hybrid between a video sharing platform and a social media network, or feature elements of both, will have to evaluate and independently decide whether they need to set up a complaint process per the NetzDG, and for what platform content. There is thus some concern that the differing outlined obligations for social media networks and video sharing platforms respectively may complicate efforts to establish coherent and uniform complaint procedures and mechanisms for users.Footnote 88 This differentiation can lead to two types of issues and corresponding legal uncertainty. First, platforms must assess whether specific content relates primarily to the function of a social media network or to a video sharing platform. In the case of YouTube, for example, which is domiciled in Ireland, different conclusions can be reached.Footnote 89 In a second step, platforms may have to review whether obligations under the NetzDG apply to the entire list of illegal content or only parts of it. Smaller platforms will have to confront the same considerations.Footnote 90

4.3.3.2 Legal Ramifications

Once the admittedly complicated questions regarding the applicability of the NetzDG to video sharing platforms have been resolved, the legal ramifications are clear. Under the NetzDG, video sharing platforms have the same status as social media networks except for the differences described: they fall within the regulatory framework of the NetzDG (see Sect. 4.3.1) and have new obligations under the most recent amendment (see Sect. 4.3.2). Being only partially subject to these obligations is possible only for smaller video sharing platforms in exceptional cases (see Sect. 4.3.3.1.1).

Video sharing platforms are further characterized by two particularities. First, regarding illegal content per § 3e (2) 2nd st. NetzDG, video sharing platform providers are obligated to have platform users contractually agree not to use the service for the content in question (see § 3e (4) NetzDG) and to monitor and ensure compliance with that agreement. This does not, however, represent an obligation to proactively review content, as such mandatory general monitoring is prohibited under § 7 (2) TMG.Footnote 91 Again, there is no special regulation of a contractual agreement with minors. Whether and under what conditions minors can conclude a contract with the video sharing platform provider therefore depends on the contract law of the Member State. The second particularity is that video sharing platforms are subject to regulatory arbitration pursuant to § 3f NetzDG. The regulatory arbitration panel exists for the sole purpose of settling disputes with video sharing platform providers out of court. Social media network providers do not have this option for disputes, having instead only non-regulatory arbitration options organized under private law as per § 3c NetzDG. However, regulatory arbitration is only an option for disputes with video sharing platform providers if they do not already participate in private-sector arbitration or the recognized arbitration panel is not a private-sector organization (see § 3f (1) 3rd st. NetzDG). This possibility is thus subsidiary to arbitration organized under private law.

4.3.3.3 Legislative Overlap Between the Network Enforcement Act, the Interstate Treaty on the Protection of Minors in the Media, and the Protection of Minors Act

The discussed amendments mean that a ‘notice and take down’ procedure for video sharing platforms will be provided under both the JMStV and the NetzDG. There can be overlap between the JMStV and the NetzDG for much content,Footnote 92 in particular the prohibited content specifically illegal under § 4 (1) JMStV. For example, the content listed under § 4 (1) 1st st. nos. 1–6 and no. 10 JMStV also falls within the scope of § 1 (3) NetzDG.Footnote 93 Content deleterious to the development of minors per § 5 JMStV could also be concerned. Minors suffer particularly from violations of personal sphere rights on the internet, as such negative experiences can be deleterious to their development.Footnote 94 Deletion obligations may therefore exist in parallel under the JMStV in conjunction with the TMG on the one hand and under the NetzDG on the other.

In such cases, the procedure per the NetzDG is seen as fundamentally more specific in nature and thus takes precedence; this is evident from the wording of § 10a and § 10b TMG.Footnote 95 The NetzDG similarly takes precedence in case of overlap with § 24a JuSchG, see § 24a (4) JuSchG, although further requirements under the JuSchG relating to the protective objectives outlined above then apply. Regarding possible overlap, however, especially between the JMStV and the NetzDG, there are concerns that the different administrative structures may lead to intersecting competencies and redundant structures that could undermine efforts to implement practical and effective complaint mechanisms, thereby giving rise to conflict.Footnote 96 Additionally, in view of its rapidly growing importance as a regulatory authority, the Federal Office of Justice could raise constitutional concerns regarding the principle of the separation of the media from the state, which requires that media supervision be relatively independent of government authorities.Footnote 97

5 Conclusions

Legislators have formulated significantly less stringent requirements for telemedia than for broadcasting regarding the protection of minors. In practice, there is no concept in place that is absolutely effective in protecting minors, in part because ubiquitous access to social media content via smartphones and tablets renders technical solutions increasingly difficult.Footnote 98 Legislators have now, for the first time, explicitly addressed video sharing service providers, i.e. platforms where users post videos, with respect to the protection of minors, imposing concrete obligations on them.

In implementing the AVMSD, there had to date largely been a lack of transparent, user-friendly mechanisms enabling users to report illegal content, as well as of age verification systems restricting access to content deleterious to the physical, mental, or moral development of minors.Footnote 99 These deficits are now being addressed by the separate clauses §§ 5a ff. JMStV in conjunction with §§ 10a ff. TMG. On the one hand, it seems suboptimal that the term “age verification” is used in § 5a (2) no. 1 of the amended JMStV. On the other hand, its use does yield a mechanism for creating a detailed catalog of appropriate measures depending on the type of objectionable content, the harm it could cause, the defining characteristics of the category of persons to be protected, and the rights and legitimate interests concerned, as required under Art. 6a (1) AVMSD.

This tendency towards greater platform regulation is reflected in the development of the JuSchG and the planned amendments to the NetzDG. The JuSchG focuses on the creation of structural precautionary measures. As before, these are not to give rise to general review/monitoring obligations for video sharing platforms, with reference to the Digital Services Act among other considerations. Instead, a set of various measures is available that may include the implementation of registration, classification and age verification systems, minor-friendly terms and conditions, and advisories on external sources and contacts for information and advice. The ‘child protection by design’ obligations to be fulfilled are determined with a view to constitutionality considerations, applying the principle of appropriateness, possibly in coordination with the regulator within the framework of a regulatory dialogue procedure.

In addition to outlining specifics and fleshing out the existing complaint procedure, such as additional notification and retention requirements, the draft amendment of the NetzDG also addresses platform design. For example, platforms must have a remonstrance procedure in place and provide uncomplicated channels for contacting them that are easily identifiable as such. The Federal Office of Justice, which in the past functioned solely as fine-imposing authority, is also to play a greater supervisory role. Video sharing platforms will likewise be compelled to meet these requirements going forward, irrespective of whether any kind of content or only content of a specific nature may be shared on the platform. Smaller video sharing platforms and platforms domiciled in EU countries besides Germany are possible exceptions, and will have to consider in detail what content of theirs falls within the scope of these obligations.