
The Digital Services Act (DSA) entered into force in the EU in November 2022, although most of its provisions will not apply until early 2024. Even so, it has already caused a considerable stir, particularly in connection with disinformation about the war in Gaza. The Federal Council has apparently been swept up in this as well: the Federal Department of the Environment, Transport, Energy and Communications (DETEC) and the Federal Office of Justice (FOJ) have been instructed to prepare a consultation draft on platform regulation by the end of March 2024. So what provisions could a federal law on the regulation of communication platforms contain, and what might they look like?

Predefined framework

In its press release of 5 April 2023, the Federal Council remains largely silent on the details. However, it does make one important stipulation: where appropriate, the legislator should be guided by the rules of the EU’s Digital Services Act (DSA). This includes the following points:

  • Contact point and legal representative in Switzerland
  • Two-stage complaints procedure for deletion or blocking
  • Transparency in advertising
  • Establishment of a reporting centre for violence and hate

Based on these points, a rudimentary outline of this law on the regulation of communication platforms can be drawn. For the sake of simplicity, let’s call it the Swiss DSA.

Scope of a Swiss DSA

The law is intended to regulate communication platforms (so-called intermediaries). In particular, this should include search engines (Google), social networks (Facebook), multimedia platforms (YouTube) and micro-blogging services (X, formerly Twitter). The obvious approach will be to adopt the term “intermediary service” from the EU DSA. This covers all online providers that transmit user content or provide access to it, temporarily store such content for the purpose of onward transmission to other users, or store content permanently on behalf of a user.

However, a restriction of this very broad term under EU law can already be read between the lines. A Swiss DSA is only intended to regulate “large” communication platforms. What exactly is meant by this remains open, but it can probably be concluded that micro, small and medium-sized enterprises (SMEs) will not be covered by the law. This would be a first major difference from the EU DSA, which provides for exemptions only in certain areas and only for micro-enterprises, and which even imposes an enhanced set of obligations on very large platforms. It can therefore be assumed that future Swiss law will probably not provide for graduated obligations for small, medium-sized and large companies.

Since large communication platforms are generally not domiciled on the territory of the Swiss Confederation, the Swiss DSA will also have to have extraterritorial effect. It should therefore suffice for the law to apply that a platform carries out activities in one of the cantons.

Contact point and representative

The establishment of a central contact point and the appointment of a legal representative in Switzerland will be mandatory. This wording mirrors European law, which means that the contact point will serve not only authorities (see Art. 11 DSA) but also users of the services (Art. 12 DSA). In addition, a legal representative must be appointed. Since the EU DSA provides a sensible exception here, namely where the provider already has an establishment in a Member State, this obligation is likely to be waived where a corresponding subsidiary is already established in Switzerland.

Two-stage complaints procedure

The complaints procedure will presumably be modelled on the existing EU rules. At the very least, this is suggested by its planned two-stage structure: an internal complaints procedure on the one hand and a conciliation body on the other. Presumably via the contact point mentioned above, users whose profile is blocked or whose content is deleted will first be able to initiate an internal review by the platform itself. Only then should the user concerned be able to turn to an independent arbitration body.

The arbitration body should be independent. Since the press release also envisages that it will be jointly financed by the communication platforms, this must mean not only professional and substantive independence but also independence in terms of personnel. Here, too, the approach appears to differ from the European one: within the EU, dispute resolution is to be carried out by “out-of-court dispute settlement bodies” certified for a maximum of five years (Art. 21 para. 3 subpara. 1 DSA), whereas the Federal Council’s press release envisages the creation of a single, centralised “Swiss arbitration body”.

Specific (moderation) duties

While it is thus largely clear under which conditions a Swiss DSA will apply and which formal requirements a (large) communication platform must fulfil, the Federal Council remains largely silent on moderation and other duties. Nevertheless, one or two binding rules can already be read between the lines.

Transparency requirement & ban on personalised advertising?

Only the obligation to label advertising as such and to disclose the parameters used to personalise the advertising displayed is explicitly mentioned. This corresponds to Art. 26 and 27 DSA and should therefore include, in addition to the labelling itself, information about the party on whose behalf the advertising is presented. Since it will often be difficult for a platform operator to recognise that an apparently neutral post is in fact advertising, the labelling obligation in the Swiss DSA is likely, as in the EU DSA, to be accompanied by an obligation to provide a function allowing users to declare their own content as commercial communication.

The obligation to specify the parameters used for advertising personalisation will presumably also not be understood in Switzerland as disclosure of the underlying algorithms, but rather of the “criteria that are most significant in determining the information that is proposed to the user” (Art. 27 para. 2 lit. a DSA); this would include, for example, an indication that location, gender or followed channels influence the advertising displayed. Whether users should also be able to influence the advertising displayed, as under EU law, remains to be seen. However, since the processing of profile data is in any case relevant under data protection law and can even be objected to under the FADP, such an extension of the set of obligations would not constitute a significantly greater encroachment on the private autonomy of the companies concerned.

Erasure obligations and erasure rights?

A possible obligation to actively delete illegal content, as provided for in the DSA, is not mentioned. However, since companies will have to set up a reporting centre for content depicting violence or containing “hate speech”, and since this will hardly function merely as a collection point for potentially problematic content without any consequences, the Swiss DSA will probably also include an obligation to remove illegal or otherwise problematic content from the platform.

The Federal Council’s press release does not yet indicate whether a right to have content deleted can be derived from this. The EU DSA provides for deletion, for example, in the event of violations of a social network’s general terms and conditions. In view of the private autonomy of platform operators, deletion will be permissible within certain limits, subject at least to a prohibition of arbitrariness.

Conclusion

The Swiss DSA will have to walk a tightrope between strengthening the positive potential of communication platforms and containing their negative side effects. OFCOM already noted as much in its 2021 report on “Intermediaries and communication platforms”. Whether this balancing act succeeds will become apparent next year.
