The Italian Data Protection Authority has fined OpenAI €15 million for using personal data to train its artificial intelligence chatbot, ChatGPT, without a legal basis. The authority also found that OpenAI lacked an adequate age verification system to prevent users under 13 from being exposed to inappropriate content. OpenAI has been ordered to run a six-month campaign to raise awareness of its data collection practices.
OpenAI has criticized the decision as disproportionate and plans to appeal, claiming the fine is nearly 20 times its annual revenue in Italy. Despite this, the company says it remains committed to working with privacy authorities to ensure its AI systems respect privacy rights.
Regulators in the US and Europe have been closely monitoring companies like OpenAI that are at the forefront of the AI industry. The European Union’s AI Act, a comprehensive rulebook for artificial intelligence, aims to protect against risks posed by AI systems. OpenAI’s case serves as a reminder of the importance of complying with data protection regulations and ensuring transparency in the use of personal data in AI development.
Photo credit: www.euronews.com