With the recently launched legislative process, the Federal Council wants to oblige large online platforms and search engines such as Google, Facebook, TikTok, and X to ensure greater fairness, transparency, and user rights. Users are to be better protected in the future—from reporting and complaint procedures to clear information requirements and greater data and advertising transparency. This is a historic step toward regulating digital infrastructure in Switzerland.
Why a new law is necessary
Digital communication platforms and search engines have become key pillars of public opinion formation. Their algorithmically controlled selection and placement of content has a significant impact on what users see. The Federal Council has determined that the current regulation of these key players is insufficient, particularly when it comes to issues of fairness, transparency, and law enforcement. The services concerned operate largely according to their own rules, without clear legal requirements for content moderation, advertising labeling, or the traceability of algorithmic control mechanisms.
The revised Data Protection Act (DSG), which comprehensively regulates the processing of personal data, has been in force since September 1, 2023. However, there is still no specific legal basis that addresses very large digital platforms and search engines in their function as communication infrastructures. The new law on communication platforms and search engines aims to close this gap. The goal is to strengthen the rights of users in the digital space, increase the transparency of algorithmic systems, and enable effective supervision of particularly influential providers.
Legal framework and key provisions
The law is aimed at very large communication platforms and search engines – specifically, services that are used by at least 10% of Switzerland’s permanent resident population on average each month, which currently corresponds to around 900,000 users. The aim is to specifically regulate those providers that play a central role in the digital information space due to their reach.
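The threshold arithmetic above can be sketched in a few lines. The 10% reach criterion and the ~900,000-user figure come from the draft law as described here; the population figure of roughly 9 million is an assumption used only to show how the two numbers relate.

```python
# Illustrative sketch only: how the 10% reach threshold maps to the
# ~900,000-user figure cited above. The population value is an assumption
# (Switzerland's permanent resident population is roughly 9 million).

def reach_threshold(resident_population: int, share: float = 0.10) -> int:
    """Return the average monthly user count at which a service
    would fall under the draft law's scope."""
    return int(resident_population * share)

population = 9_000_000  # assumed permanent resident population
print(reach_threshold(population))  # → 900000
```

Because the threshold is tied to the resident population, the absolute user count that triggers regulation would shift automatically as the population changes.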
The central concern is the introduction of transparent procedures for reporting and reviewing suspected illegal content. Users should be able to easily report content that constitutes defamation (Art. 174 Swiss Criminal Code), insult (Art. 177 Swiss Criminal Code), or incitement to hatred (Art. 261bis Swiss Criminal Code). Platforms must not only process these reports, but also inform the persons concerned about the measures taken and justify them. An internal complaints system and participation in out-of-court dispute resolution are also planned.
In addition, the draft law contains requirements for the labeling and archiving of digital advertising. Advertising must be clearly recognizable as such, and platforms must maintain a publicly accessible advertising archive. Recommendation systems and the criteria according to which content is algorithmically displayed are also to be made more transparent. Services based abroad must appoint legal representation in Switzerland to ensure that legal requirements can be enforced.
The law is closely linked to existing data protection law. Platform operators who process personal data, for example for the personalized display of advertising or recommendations, must ensure that these processes also comply with the requirements of the DSG. This includes, in particular, transparency obligations, information rights of the persons concerned and, where applicable, regulations on data export.
Finally, the law also touches on issues of copyright: the obligation to set up a reporting procedure also covers copyright infringements, such as the unauthorized uploading of protected works. Platforms must coordinate their procedures accordingly.
Analysis and evaluation of regulatory approaches
The draft law takes a differentiated approach: only services with a particularly wide reach whose influence on public discourse can be classified as systemic are regulated. This means that no general platform regulation is being introduced, but rather targeted, risk-based regulation. This focus increases the proportionality of the intervention.
The focus is not on controlling content itself, but on improving the procedures by which platforms themselves moderate content. The law obliges providers to create comprehensible rules and implement them transparently. Freedom of expression and information are thus preserved. The link to the revised Data Protection Act ensures consistent requirements, particularly with regard to personalized systems that use user data. The obligation to appoint legal representation in Switzerland significantly increases enforceability vis-à-vis globally active companies.
Practical implications for stakeholders
Platform operators must expect considerable organizational and technical effort. Internal processes for handling reports and complaints must be established, documented, and regularly reviewed.
Companies in the digital advertising and communications market benefit from a clearer regulatory framework, but they also face the need to adapt. Target group addressing, tracking technologies, and advertising systems must be reviewed for compatibility with transparency and data protection requirements. Users gain more control over the content they are exposed to and over the decisions made by platforms, which can strengthen trust in digital services. Finally, authorities and research institutions gain access to a broader base of data, which facilitates analysis of the impact of digital platforms on social processes.
Platforms as players in journalism
In the context of platform regulation, the question of what role services such as Google, Facebook, or X play in the media ecosystem is becoming increasingly important. They no longer function merely as technical distributors, but also influence the content and visibility of journalistic contributions through algorithmic selection, reach prioritization, and monetization patterns. This shifts the balance of power in digital journalism to the detriment of traditional media players.
The issue is becoming even more pressing in connection with generative AI systems and automated content production: Who bears copyright responsibility when AI-generated content appears on platforms or journalistic content is processed without permission? And how does the law protect the integrity of journalistic work in the digital space? You can find our comments on this in the article on AI, copyright, and journalism in Switzerland.
Ancillary copyright and platform responsibility
The planned regulation of particularly large platforms also raises the question of how the relationship between platform operators and journalistic content should be structured. Platforms benefit economically from snippets, preview images, and headlines—i.e., short excerpts from editorial articles—without necessarily providing anything in return. In the EU, this issue has been addressed in the context of ancillary copyright. There are also intense discussions on this topic in Switzerland.
A comprehensive regulatory framework for platforms must therefore not only regulate issues of moderation and transparency, but also address the economic balance between platforms and media companies. This concerns both competition law and copyright-protected service elements. For more details, see our article: Ancillary copyright for media companies: Remuneration for online snippets planned in Switzerland.
Data protection issues
Even though the new Platform Act is not purely a data protection regulation, the interfaces with data protection law are central. Platforms regularly process large amounts of personal data, for example for profiling, targeted advertising, or personalizing recommendation systems. The draft law requires these services to disclose algorithmic systems and make them traceable in order to prevent discriminatory or non-transparent structures. The requirements of the revised Data Protection Act (DSG), particularly with regard to transparency, information, correction, and deletion, also apply without restriction to very large platforms. In addition, there are specific obligations when processing particularly sensitive personal data and when exporting data abroad. Platforms that use personal data for advertising or algorithmic control are obliged to design their systems and processes in such a way that they comply with the principles of lawfulness, purpose limitation, proportionality, and the duty to provide information.
Recommendations for practical action
Platform operators should develop an integrated compliance strategy at an early stage that covers all affected areas. Internal documentation, communication guidelines, and technical systems must be reviewed for compatibility with the new law and adapted if necessary. The appointment of legal representation in Switzerland should also be prepared if the company is based abroad.
Companies in the digital advertising environment would be well advised to critically examine their ad delivery mechanisms and redesign them where necessary. Legal and business advisors should inform their clients at an early stage and assist them with specific implementation steps. For users, the new law offers improved opportunities to exercise their rights, for example through reporting and complaint procedures. Users should make active use of these procedures and observe how well they work, so that abuse and a lack of transparency can be challenged more effectively in the future.
Conclusion and outlook
The law on communication platforms and search engines marks an important step in the further development of digital regulation in Switzerland. It supplements the revised Data Protection Act with clear procedural and transparency requirements for particularly influential services. Users are granted binding rights – from the ability to report problematic content and lodge internal complaints to the right to be informed about moderation decisions. At the same time, platforms are obliged to make their algorithmic systems transparent and to comply with legally established obligations.
Important questions remain unanswered with regard to concrete implementation: How far do the transparency obligations extend? What sanctions are provided for in the event of violations? How effective will the legal representation obligation be? The connection to existing regulations in youth and consumer protection, copyright, and media law will also need to be clarified further. In view of international developments – in particular the EU Digital Services Act – Switzerland should continuously evaluate its regulatory strategy and adapt it as necessary.
For companies, platforms, and advisory professions, now is the right time to set the course. Those who act in a timely manner can not only minimize risks, but also contribute to the establishment of a fair and transparent digital communications infrastructure.