
For years, Meta has trained its AI on the vast collection of public images uploaded to Facebook and Instagram. Now, the company appears to be eyeing a much more private set of data: users’ unpublished photos. Although Meta says it isn’t currently using these private images for training, it has declined to rule out doing so in the future. And when asked what rights it claims over users’ camera roll content, Meta offered no clear answer.
Recently, Facebook users attempting to share a Story have encountered new pop-ups requesting access to “cloud processing.” This feature allows Facebook to regularly scan a user’s camera roll and upload media to Meta’s cloud. The goal, according to Meta, is to generate suggestions such as collages, themed recaps, or AI-restyled versions of photos. Accepting the feature means agreeing to Meta’s AI terms, which permit the analysis of facial features, dates, and the presence of people or objects in these private images. It also grants Meta the right to retain and use this information.
TechCrunch first reported on this development, highlighting the subtle shift in how Facebook interacts with unpublished content. While the feature is framed as helpful and entirely optional, the data access implications run deeper.
Mixed Messages from Meta on AI Usage
Meta insists the current test doesn’t use users’ unpublished photos to train AI. “This test doesn’t use people’s photos to improve or train our AI models,” said Meta public affairs manager Ryan Daniels. Similarly, communications manager Maria Cubeta described the feature as “very early,” saying it aims only to simplify content sharing. She emphasized that any suggestions remain visible only to the user unless they choose to post them, and that the feature can be turned off at any time.
Despite these assurances, there’s still ambiguity. Meta’s AI usage terms, effective since June 23, 2024, don’t specify whether unpublished photos accessed through “cloud processing” are protected from being used as training data. Google Photos, in contrast, clearly states it doesn’t use personal data for training generative AI models. Meta has not made such a commitment.
Further muddying the waters: although Daniels and Cubeta say the feature accesses only the last 30 days of a user’s unpublished content, some suggestions draw on older media. The company acknowledges that suggestions based on themes like pets or weddings may include images more than 30 days old.
Users Can Opt Out — But Is It Enough?
Users can disable the “cloud processing” feature in their settings, which also triggers the removal of their unpublished photos from Meta’s cloud after 30 days. However, that safeguard may not be enough to ease growing concerns.
This test marks a shift in how Meta approaches private content. Rather than waiting for users to share photos voluntarily, the company now asks for access to those photos before users have consciously decided to share them. That change has raised red flags for privacy advocates and everyday users alike.
Adding to the unease, some Facebook users have reported receiving AI-generated image suggestions without knowing they had opted in. One Reddit user claimed Facebook had applied a Studio Ghibli style to her wedding photos without her consent. Such examples highlight how quietly features like this can be introduced and how easily they may go unnoticed.