Italy Fines OpenAI over ChatGPT Privacy Rules Breach

The Italian watchdog also ordered OpenAI to launch a six-month campaign on Italian media to raise public awareness about how ChatGPT works - Reuters

Italy's data protection agency said on Friday it fined ChatGPT maker OpenAI 15 million euros ($15.58 million) after closing an investigation into the generative artificial intelligence application's use of personal data.

The fine comes after the authority found OpenAI processed users' personal data to "train ChatGPT without having an adequate legal basis and violated the principle of transparency and the related information obligations towards users".

OpenAI said the decision was "disproportionate" and that the company will file an appeal against it.

The investigation, which started in 2023, also concluded that the US-based company did not have an adequate age verification system in place to prevent children under the age of 13 from being exposed to inappropriate AI-generated content, the authority said, according to Reuters.

The Italian watchdog also ordered OpenAI to launch a six-month campaign on Italian media to raise public awareness about how ChatGPT works, particularly regarding the collection of data from users and non-users to train its algorithms.

Italy's authority, known as Garante, is one of the European Union's most proactive regulators in assessing AI platform compliance with the bloc's data privacy regime.

Last year it briefly banned the use of ChatGPT in Italy over alleged breaches of EU privacy rules.

The service was reactivated after Microsoft-backed OpenAI addressed issues concerning, among other things, the right of users to refuse consent for the use of personal data to train the algorithms.

"They've since recognised our industry-leading approach to protecting privacy in AI, yet this fine is nearly twenty times the revenue we made in Italy during the relevant period," OpenAI said, adding the Garante's approach "undermines Italy's AI ambitions".

The regulator said the size of its 15-million-euro fine was calculated taking into account OpenAI's "cooperative stance", suggesting the fine could have been even bigger.

Under the EU's General Data Protection Regulation (GDPR), which took effect in 2018, any company found to have broken the rules faces fines of up to 20 million euros or 4% of its global turnover.



Facebook-Parent Meta Settles with Australia’s Privacy Watchdog over Cambridge Analytica Lawsuit

The logo of Meta Platforms' business group is seen in Brussels, Belgium December 6, 2022. (Reuters)

Meta Platforms has agreed to an A$50 million ($31.85 million) settlement, Australia's privacy watchdog said on Tuesday, closing long-running, costly legal proceedings for the Facebook parent over the Cambridge Analytica scandal.

The Office of the Australian Information Commissioner had alleged that the personal information of some users was disclosed to the Facebook personality quiz app This is Your Digital Life as part of the broader scandal.

The breaches were first reported by the Guardian in early 2018, and Facebook received fines from regulators in the United States and the UK in 2019.

Australia's privacy regulator had been locked in a legal battle with Meta since 2020. The personal data of 311,127 Australian Facebook users was "exposed to the risk of being disclosed" to consulting firm Cambridge Analytica and used for profiling purposes, according to the regulator's 2020 statement.

In March 2023, it persuaded the High Court not to hear an appeal by Meta, a win that allowed the watchdog to continue its prosecution.

In June 2023, the country's federal court ordered Meta and the privacy commissioner to enter mediation.

"Today's settlement represents the largest ever payment dedicated to addressing concerns about the privacy of individuals in Australia," the Australian Information Commissioner Elizabeth Tydd said.

Cambridge Analytica, a British consulting firm, harvested the personal data of millions of Facebook users without their permission and used it predominantly for political advertising, including assisting Donald Trump's campaign and the Brexit campaign in the UK.

A Meta spokesperson told Reuters that the company had settled the lawsuit in Australia on a no admission basis, closing a chapter on allegations regarding past practices of the firm.