
Reopening of ChatGPT - The conditions of the Guarantor

April 13, 2023

Amid ever-growing concern for the privacy of personal data, the Privacy Guarantor has issued a series of rules for the reopening of ChatGPT in Italy, which the authority had blocked on April 1.

These rules, designed to protect users' data, were drawn up after a careful evaluation of Italian privacy legislation.

Let's see what they are.

The conditions of the Guarantor for the reopening of ChatGPT

Information notice

The first rule imposed by the Guarantor concerns the collection of ChatGPT users' personal data. The collection of such information must be transparent, and users must be clearly informed about what data is being collected and how it will be used. Furthermore, users must be able to give their explicit consent to the collection of this information.

In practice, to obtain the reopening of ChatGPT in Italy, OpenAI will have to "prepare and make available on its site a transparent information notice, which explains the methods and logic underlying the processing of data necessary for the functioning of ChatGPT, as well as the rights attributed to users and non-user interested parties"1.

Legal basis

The second rule for the reopening of ChatGPT concerns the legal basis for processing users' personal data for algorithm training. In this regard, the Guarantor has ordered the American company to "eliminate any reference to the execution of a contract and instead indicate, on the basis of the principle of accountability, consent or legitimate interest as a prerequisite for using such data, without prejudice to the exercise of one's powers of verification and assessment subsequent to such choice"2.

In other words, the Italian Data Protection Authority has asked OpenAI to rely on consent or legitimate interest, rather than on a contract, as the legal basis for using user data to train its algorithms.

Exercise of rights

The third rule provides that users have the right to access their personal data and to request its erasure or rectification. In other words, ChatGPT users have the right to know what personal data has been collected about them and to request its deletion or correction in case of inaccuracies. The Guarantor has established that requests for access, erasure or rectification of users' personal data must be processed within a set period of time and with maximum transparency.

Protection of minors

The fourth rule set for the reopening of ChatGPT involves age verification to protect minors. The Guarantor has specified that, by 30 September 2023, OpenAI will have to implement a system that requests the user's age during registration, together with an age verification system capable of "excluding access to users under the age of thirteen and minors for whom parental consent is lacking"3.

Information campaign

The fifth and final rule imposes on OpenAI the obligation to run, by 15 May, an information campaign on the use of users' personal data. The campaign will have to take place on radio, television, in newspapers and on the web, and will aim to inform people about how their personal data is used to train the ChatGPT algorithm.

The Guarantor also reserves the right to continue its investigation into any violations and, should it find irregularities, to impose further sanctions.

OpenAI's response

For its part, OpenAI maintains that it complies with privacy rules, but said it was open and willing to collaborate with the Guarantor to resolve the critical issues and allow a rapid reopening of ChatGPT.

In a letter published on its website, the American company states that "OpenAI is committed to keeping powerful AI safe and widely beneficial. We know that our AI tools offer many benefits to people today. Our users around the world have told us that ChatGPT helps increase their productivity, enhance their creativity and deliver tailored learning experiences. We also recognize that, like any technology, these tools carry real risks, so we work to ensure that safety is built into our system at all levels"4.