Meta Faces Scrutiny Over AI Training on Private User Photos as Opt-In Feature Rolls Out

Meta is sparking fresh privacy concerns as it begins testing an opt-in feature on Facebook that prompts users to allow “cloud processing” of their private, unpublished camera roll photos. The feature generates AI-powered suggestions such as collages and recaps, but it also raises questions about Meta’s long-term intentions for using personal data to train its artificial intelligence models.
While Meta insists the current test does not use these private photos for AI model training, its updated AI usage terms, in effect since June 2024, offer little clarity on whether such data could be used that way in the future. The company has already acknowledged scraping public Facebook and Instagram content posted since 2007 for AI training, and critics worry this new feature marks a significant expansion into previously private user data.
Users who opt into cloud processing grant Meta broad rights to analyze their media, including facial features, dates, and objects within photos. Although Meta states it retrieves only 30 days of camera roll data at a time for suggestions, some thematic suggestions appear to draw on older media, hinting at longer data retention. Users can disable the feature in settings, after which their unpublished photos are removed from Meta’s cloud within 30 days.
The initiative is drawing comparisons to Google Photos, which also offers AI enhancements. But Meta’s lack of explicit assurances that private data will not be used for generative AI training sets it apart and fuels ongoing debates about data privacy in the age of advanced AI.