Italy’s data protection agency has fined OpenAI €15 million (approximately $15.58 million) following an investigation into its processing of ChatGPT users’ personal data. According to Reuters, the fine was imposed after the agency found that OpenAI lacked a legal basis for processing users’ personal data to train ChatGPT. The agency also found that OpenAI failed to meet its transparency obligations by not providing users with adequate information about how their data would be used.
The investigation, which began in 2023, further revealed that OpenAI’s age verification system was inadequate, potentially exposing children under 13 to inappropriate content.
OpenAI described the fine as “exaggerated” and stated it would appeal the decision. The company argued, “The amount of this fine is nearly 20 times our revenue in Italy. It is hindering Italy’s efforts to develop AI.”
Italy’s regulatory authority, the Garante, has also ordered OpenAI to run a six-month public awareness campaign explaining how ChatGPT works and how it collects and uses user data.
It is worth noting that under the European Union’s General Data Protection Regulation (GDPR), organizations that breach its rules can face fines of up to €20 million or 4% of global annual turnover, whichever is higher.