The integration of AI chats, like ChatGPT, in games opens up exciting new possibilities for the medium. Possibilities include evolutions of text-based genres, more in-depth stories and quests, and a much more immersive world for players.

However, this form of implementing AI in video games also harbours legal risks. In the following, we will primarily look at the data protection concerns. With European data protection authorities imposing fines in the millions, protecting players' data should be a priority from the very beginning of development ("privacy by design"). The original (and since lifted) ban on ChatGPT in Italy, as well as further discussions on regulating AI, show that authorities have become even more wary of AI tools due to the rapid development of the technology.

The particular data protection risks in the use of generative AI models stem from the fact that players can disclose personal information, or information about third parties, when using an AI chat. This information can include, for example, real names, age and gender, but also far more sensitive details such as sexual orientation, ethnic origin, political opinions, or religious or ideological convictions. Such information falls under the protection of the GDPR: it is personal data according to Art. 4 No. 1 GDPR and may even constitute special categories of personal data according to Art. 9 (1) GDPR. Moreover, this information may be shared by minors.

The processing of this data by the game entails special risks and obligations for developers and publishers. First and foremost, personal data may not be processed without a legal basis. The legal bases in Art. 6 (1) GDPR are particularly relevant. These include the most common legal basis of consent (lit. a) and the legitimate interests of the controller (lit. f). However, since special categories of personal data may also be processed, it should be noted that such processing is only permissible in the cases covered by Art. 9 (2) GDPR. If personal data are processed by an AI provider located in a third country, further obligations arise.

The so-called data subject rights arising from Art. 15 et seq. GDPR pose a further problem. Individuals whose personal data are processed have the right to information about the purposes of processing, as well as to rectification and erasure of the data. Because of the complexity of LLMs, the so-called "black box" problem arises: even the developers of an AI are not able to fully disclose and understand how the data entered is stored and processed. Accordingly, it is not technically possible for them to delete or correct individual data.

Game manufacturers can counteract the data protection problems with targeted measures such as Data Processing Agreements (DPAs) with the AI providers, Privacy Impact Assessments (PIAs), Data Privacy Impact Assessments (DPIAs), Transfer Impact Assessments (TIAs), the use of standard clauses for processing in a third country (SCCs), as well as the implementation of technical and organisational measures (ToMs). However, complete security cannot be guaranteed at this stage of legal development.

The following should be noted in practice:

  • If an LLM-based AI chat is to be implemented in the game, developers should ensure that the information entered by players is encrypted in such a way that the developers themselves cannot access it and that it is not used to train the AI.
  • If the information entered is to be used to train the AI, the players' consent should be designed to include a clear reference to the possibility of revoking that consent.
  • Furthermore, developers and publishers should be aware of their obligations to provide information and delete data under the GDPR.
  • Developers and publishers should undertake all necessary data protection measures (DPA, PIA, DPIA, TIA, SCCs, ToMs, etc.).
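As a technical and organisational measure, the first two points above can be sketched in code: filtering obvious personal data out of a player's message before it is forwarded to a third-party AI provider, and tracking revocable training consent per player. This is a minimal illustrative sketch, not a complete PII filter or consent-management system; the regex patterns and the `ConsentStore` class are assumptions for the example, not any provider's API.

```python
import re

# Illustrative patterns for obvious personal data. A production filter
# would need far broader coverage (names, addresses, IDs, etc.).
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d\s/-]{7,}\d")

def redact(message: str) -> str:
    """Replace e-mail addresses and phone numbers with placeholders
    before the message leaves the game client."""
    message = EMAIL.sub("[email]", message)
    message = PHONE.sub("[phone]", message)
    return message

class ConsentStore:
    """Tracks whether a player has consented to their input being used
    for AI training. Consent is revocable at any time (hypothetical
    in-memory store; real systems would persist and log consent)."""

    def __init__(self) -> None:
        self._consented: set[str] = set()

    def grant(self, player_id: str) -> None:
        self._consented.add(player_id)

    def revoke(self, player_id: str) -> None:
        self._consented.discard(player_id)

    def may_train(self, player_id: str) -> bool:
        return player_id in self._consented

# Usage: redact a chat message and honour a consent revocation.
consent = ConsentStore()
consent.grant("player-42")
consent.revoke("player-42")

print(redact("Contact me at jane.doe@example.com or +49 170 1234567"))
print(consent.may_train("player-42"))  # False after revocation
```

The design point is that redaction happens before transmission, so the raw personal data never reaches the provider, and that `may_train` is checked at the moment data would be used, so a revocation takes immediate effect.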