
Meta must stop training its AI on Brazilian personal data

Brazil’s data protection authority (ANPD) has banned Meta from training its AI models on Brazilian personal data, citing “risks of serious harm and inconvenience to users.” The move follows an update to Meta’s privacy policy in May, in which the social media giant granted itself permission to use public data from Facebook, Messenger and Instagram in Brazil — including posts, images and captions — for AI training.

The move follows a report released by Human Rights Watch last month, which found that LAION-5B — one of the largest datasets of image captions used to train AI models — contains personal and identifiable photos of Brazilian children, putting them at risk of deepfakes and other forms of exploitation.

As reported by The Associated Press, the ANPD told the country’s official gazette that the policy posed “an imminent risk of serious and irreparable or difficult-to-repair damage to the fundamental rights” of Brazilian users. Brazil is one of Meta’s largest markets, with 102 million Brazilian user accounts on Facebook alone, according to the ANPD. The notice issued by the agency on Tuesday gives Meta five business days to comply with the order or face a daily fine of 50,000 reais (about $8,808).

Meta said in a statement to the AP that the policy update “is in line with Brazilian privacy laws and regulations” and that the decision is “a step backwards for innovation, competition in AI development and a further delay in making the benefits of AI available to Brazilians.” While Meta says users can choose not to have their data used to train AI, the ANPD says there are “excessive and unjustified obstacles” that make that choice difficult to exercise.

Meta has received similar backlash from EU regulators, leading the company to suspend plans to train its AI models on European Facebook and Instagram posts. However, Meta’s updated data collection policies are already in effect in the United States, where there are no comparable privacy protections for users.