DPA Digital Digest: Germany [2023 Edition]
A close-up of Germany’s regulatory approach to data governance, content moderation, competition and more.
Germany is headed towards a EUR 200 billion digital economy, according to digital association Bitkom. The digital economy’s yearly growth rate of 3.8 percent is expected to rise to 6.3 percent. This development is aided by government investment: Germany’s Recovery and Resilience Plan allocates over half of its EUR 26.5 billion budget to digitalisation. Germany’s Digital Strategy for 2022-2025 sets out 18 lighthouse projects to advance domestic digitalisation and strives for artificial intelligence, microchips and quantum computing “made in Germany”.
But what do Germany’s domestic digital policies stand for? The twelfth DPA Digital Digest provides a succinct overview of the latest policy and enforcement developments in major policy areas and Germany-specific points of emphasis.
In data governance, Germany has adopted rules for telecommunications privacy and cybersecurity, scrutinised data transfers to the United States, and conducted investigations into both novel and commonplace technologies.
In content moderation, Germany has introduced a landmark law and enforced rules for both content moderation and user speech rights.
In competition, Germany has introduced a unique regime for large digital firms and pursued a strict enforcement approach, especially regarding data combination.
Germany’s points of emphasis include artificial intelligence, the protection of minors and cloud computing.
Discover the details of Germany’s regulatory approach on our dedicated country page.
Remain up-to-date on new and upcoming developments with our free notification service.
Written by Tommaso Giardini and Maria Buza. Edited by Johannes Fritz.
The European Union’s General Data Protection Regulation (GDPR) applies in Germany. The amended Federal Data Protection Act aligns domestic law with the GDPR and specifies obligations for private bodies.
Since December 2021, the Telecommunications-Telemedia Data Protection Act (TTDSG) has imposed a duty of confidentiality on telecommunications and telemedia providers concerning communications, caller IDs and end-user directories. The TTDSG also imposes requirements for valid consent, which must be informed and unambiguously expressed, as specified in the compliance guidelines. The amended Telecommunications Act, implemented simultaneously, requires consent for the use of non-essential cookies and similar tracking technologies and extends the duty of confidentiality in communications to traffic and location data. In February 2023, the constitutional court invalidated a provision requiring the retention of traffic and location data for law enforcement purposes without cause.
Since May 2021, the amended IT Security Act has required providers of critical infrastructure to implement preventive cyber resilience measures. The government can prohibit the use of foreign-made critical infrastructure components that pose a risk to public security. In addition, the Federal Office for Information Security can demand organisational and technical changes to the cybersecurity systems of telecommunications and telemedia providers. Finally, the Act introduces a voluntary cybersecurity label.
Several German authorities issued opinions on the ongoing negotiations for a new EU-US Data Privacy Framework, following the invalidation of the Privacy Shield in 2020. In February 2023, the federal data protection authority echoed concerns voiced by the European Data Protection Board regarding mass data collection in the US. The regional data protection authority of Baden-Württemberg questioned EU citizens’ ability to pursue legal action under the US Executive Order On Enhancing Safeguards For United States Signals Intelligence Activities and scrutinised its complaint mechanism, citing unclear standards, limited information and concerns over judicial independence. Still, in February 2023, the German competition authority ruled that a procurement award to a German subsidiary of a US parent company could not be challenged on the basis of potential data transfers. Finally, regional authorities have issued guidelines on data transfers, most recently the Bavarian State Commissioner for Data Protection in May 2023.
Several government bodies and agencies issue secondary legislation and pursue enforcement. The federal government issues secondary legislation, e.g. on cyberattacks for critical infrastructure providers, energy providers and auditing firms, and preventive security requirements for healthcare applications. Enforcement is divided between the federal data protection authority, which covers public bodies, postal services and telecommunication providers, and the 16 regional authorities that oversee private entities. The federal and regional authorities convene as the Conference of the Data Protection Authorities (DSK) to conduct coordinated enforcement action and issue non-binding guidelines. The DSK has issued guidance on website subscription models, data collection practices in e-commerce and encryption requirements for emails, among others. Currently, the government is establishing a data institute to coordinate the availability and standardisation of data.
Enforcement action on salient issues is often coordinated. A coordinated action launched in April 2023 scrutinises the processing of personal data by OpenAI’s ChatGPT. Several regional authorities requested information, including Rhineland-Palatinate, Baden-Württemberg, Hesse and North Rhine-Westphalia, which referenced a DSK investigation, citing the importance of the application. Previously, the DSK investigated third-country access to companies’ personal data and Microsoft 365 products, raising concerns regarding transparency and data transfers. A coordinated inquiry into EU-US data transfers raised questions regarding transfer mechanisms, hosting, webtracking and internal data sharing.
At the regional level, several cases since 2022 have specified the reach of data protection rules over commonplace online functionalities. In Hesse, the data protection authority issued a notice on cloud-based writing support in web browsers, noting that such tools could illegally transfer personal data to the browser provider. In Bavaria, a court declared the use of Google Fonts on websites illegal because it transmits dynamic IP addresses without explicit consent or another legitimate basis. Hamburg’s data protection authority notified Google that its cookie banners were non-compliant because the acceptance button was larger and required one click fewer than the rejection option. A Munich court similarly ruled that user consent was not freely given because the cookie banner made opting out more burdensome. Finally, in April 2022, the Court of Justice of the European Union ruled on a preliminary reference that consumer protection associations can independently initiate legal proceedings for data protection violations under German domestic law, without being precluded by the GDPR.
Germany’s 2018 Network Enforcement Act (NetzDG) is a landmark content moderation law. The NetzDG requires user-content platforms with over 2 million users in Germany to implement a flagging mechanism for users to report “unlawful content”. Unlawful content is not defined in the NetzDG but rather determined by the Criminal Code, including propaganda, symbols of terrorist organisations, violence, and revilement of religion, among others. Platforms must remove or block access to unlawful content within 24 hours and notify both the flagging and the posting user. In addition, platforms must appoint a local representative, store data on content removals and, if they receive over 100 complaints per year, report their moderation activities to authorities (every 6 months). The Federal Office of Justice enforces the rules through investigations and can impose fines, following court confirmation that content is unlawful. In March 2022, a Cologne court ruled that several provisions of the NetzDG are in violation of EU law. The violations concern the "country of origin principle" (E-Commerce Directive), which subjects companies only to the domestic laws of their European headquarters, and the requirement for legal independence of media supervisory authorities (Directive for Audio-Visual Media Services), which the Federal Office of Justice does not fulfil.
In February 2022, two amendments to the NetzDG entered into force. Platforms must now establish a procedure enabling users to contest moderation decisions – both decisions to remove and not to remove flagged content. In addition, the Law against right-wing extremism and hate crime requires platforms to report unlawful content and the corresponding IP addresses to authorities. In April 2023, the Federal Ministry of Justice announced a Law against digital violence, which would require platforms, including messaging and telecommunication providers, to help identify perpetrators.
The German government’s enforcement focuses on both content moderation and user speech rights. In April 2023, the Federal Office of Justice upheld its EUR 5.1 million fine against messaging provider Telegram for violating the NetzDG by failing to implement a content flagging mechanism and appoint a local representative. The decision is under appeal. Also in April 2023, the Federal Office of Justice opened an investigation into Twitter for failing to moderate a series of similar defamatory tweets. Beyond NetzDG, the Federal Court of Justice ruled in 2022 that YouTube can be held liable for copyright violations if it is aware of repeat violations but does not take adequate preventive (rather than reactive) measures.
User speech rights contrast with moderation obligations. In July 2021, the Federal Court of Justice ruled that Facebook’s Terms and Conditions on post deletion and account blocking are invalid because they did not sufficiently inform users and enable them to respond. In October 2021, a Cologne court prohibited YouTube from deleting two videos of a COVID-19 policy transparency campaign because YouTube had failed to specify which of its guidelines the videos violated.
In January 2021, the tenth amendment to the Act against Restraints of Competition, focused on digitalisation, entered into force. The amendment introduces the concept of companies “of paramount significance” for competition across markets, subjecting them to enhanced scrutiny by the competition authority (BKA). The BKA designates companies as significant for five years, considering factors including access to data and intermediary functions. For significant companies, the BKA can take ex-ante action, a first in competition law, and prohibit specific types of conduct, such as combining data, creating information deficits or denying data portability. To date, the BKA has designated Google (January 2022), Meta (May 2022), Amazon (July 2022, appealed) and Apple (April 2023, appealed). Since March 2023, the BKA has been assessing whether Microsoft meets the criteria.
Regarding merger regulation, the amendment raises notification thresholds to annual local turnover over EUR 50 million for one party and annual local turnover over EUR 17.5 million for the other (previously EUR 25 and 5 million, respectively). The BKA can request notification of mergers below the thresholds in specific economic sectors, following a sectoral inquiry.
In September 2022, the government published a draft eleventh amendment to the Act against Restraints of Competition. The draft empowers the BKA to issue orders following sectoral inquiries and coordinates enforcement cooperation with the European Commission on the Digital Markets Act. In addition, the draft introduces a presumption that companies violating competition rules obtain an advantage of 1 percent of their domestic sales, which they must disgorge in addition to fines.
The BKA’s enforcement centres on unilateral conduct, with a special focus on data combination. In July 2023, the Court of Justice of the EU is scheduled to rule on the BKA’s seminal decision against Facebook’s data combination. In 2019, the BKA ruled that Facebook abused its dominant position by combining user data from its services (Facebook, WhatsApp and Instagram), as well as third-party websites and Facebook Analytics, without user consent. Since that decision, considered a landmark at the intersection of data protection and competition law, the BKA has initiated two similar investigations. In December 2022, the BKA issued a statement of objections in its investigation into Google’s data processing terms. The BKA stated that Google combines user data collected through its services (e.g. Google Search, Google Maps, YouTube) as well as third-party applications and webpages, without giving users a transparent choice to consent to or limit cross-service data combination. In June 2022, the BKA announced an investigation into Apple’s “App Tracking Transparency” framework for potential self-preferencing. The framework requires third-party applications to obtain user consent for cross-app data tracking, while Apple’s own apps can combine data across services without consent.
The BKA has conducted several other investigations into unilateral conduct in digital markets, leading to behavioural remedies. In December 2022, the BKA closed its investigation into Google News Showcase following commitments. The BKA scrutinised Google’s plan to integrate story panels, for which Google pays licensing fees, into its general search functions due to potential self-preferencing or unreasonable conditions for publishers. Google committed that participating publishers retain their ancillary copyrights, including their collective enforcement. In November 2022, the BKA partly closed the investigation into Meta’s virtual reality headsets after Meta stopped making the use of the headsets conditional on the creation of a Facebook account. Currently, the BKA is investigating Google for restricting the combination of Google Automotive Services and Google Maps with third-party services. Amazon is also the subject of two investigations, concerning its influence on third-party sellers through price control and algorithms, as well as exclusivity agreements to sell brand name merchandise (“brandgating”).
Regarding mergers, transactions with a “community dimension” are examined by the European Commission rather than the BKA. In February 2022, the BKA approved the acquisition of Kustomer by Meta, finding that the effects of the acquisition would not warrant a prohibition under competition law, with explicit reference to the European Commission’s approval.
Germany has repeatedly declared “AI made in Germany” a strategic objective, including in its Digital Strategy for 2022-2025. Germany’s AI strategy, updated in 2020, aims to strengthen national competitiveness in AI, with responsible AI as a trademark of “AI Made in Europe”. The strategy includes the training of AI specialists and the provision of cutting-edge infrastructure.
Simultaneously, Germany is developing AI guardrails. Beyond the abovementioned investigations into the data protection aspects of generative AI, the government has recently focused on autonomous vehicles. Germany adopted a Law on Autonomous Driving, which sets requirements for the construction, condition and equipment of autonomous vehicles to be eligible for licensing, and issued guidance on cybersecurity in the automotive sector, which requires early-stage demonstrations of measures to address cybersecurity risks. Finally, the Works Council Modernisation Act regulates the use of AI in operational contexts, entitling employee representatives to be informed about, and to request expert assessments of, firms’ AI use.
Germany pursues online minor protection through service access restrictions and content moderation. Since May 2022, the second amendment to the Youth Protection Act requires commercial online service providers with over 1 million users in Germany to ensure age-appropriate content through precautionary measures, parental controls and user reporting mechanisms. Further, the law requires providers to implement age verification for user-generated content self-rated as appropriate only for users over 18 years.
The Commission for the Protection of Minors in the Media has focused on pornographic sites, ordering several of them blocked for failing to implement age verification. Finally, in September 2021, an Act entered into force introducing criminal liability for the distribution and possession of child sexual abuse material.
Cloud computing is central to Germany’s pursuit of technological sovereignty. In May 2023, the Conference of the Data Protection Authorities issued criteria for “sovereign clouds”, linking “sovereignty” to independence, self-determination and safety, and introducing the criteria of traceability through transparency, data controllability, openness, predictability and reliability. In pursuit of cloud sovereignty, Germany collaborates with France on GAIA-X, a European data infrastructure initiative. In addition, the government issued minimum standards to reduce risks and ensure information security when relying on foreign cloud service providers.