Mosseri: Instagram will make creators prove they're real

Prove You're Real
  • Adam Mosseri acknowledges AI-generated "synthetic everything" is filling Instagram feeds and detection is becoming harder.
  • Mosseri suggests it may be more practical to fingerprint and signpost real media than to chase fakes.
  • Provenance tools — C2PA/CAI metadata and Adobe Content Credentials — are potential solutions if platforms read and trust them.
  • Creators should adopt provenance tools such as Content Credentials, keep their originals, and be transparent about AI use to retain trust.

Why Mosseri says AI 'slop' has won

Adam Mosseri, head of Instagram, wrote on Threads that "Everything that made creators matter—the ability to be real, to connect, to have a voice that couldn’t be faked—is now suddenly accessible to anyone with the right tools." He warned feeds are "starting to fill up with synthetic everything."

Mosseri admits Meta's tools like the 'AI info' tag have struggled: many AI images go undetected, while legitimate photos with minor edits are sometimes flagged. He argues detection will only grow more difficult as generative models improve.

Fingerprinting real media as a practical approach

Rather than continually chasing better detectors, Mosseri floated a different idea: cryptographically sign images at capture to create a chain of custody. "All the major platforms will do good work identifying AI content, but they will get worse at it over time… it will be more practical to fingerprint real media than fake media," he wrote.

This shifts the burden toward proving authenticity — effectively asking creators and device makers to certify original capture and edits.
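The chain-of-custody idea can be sketched in miniature. The snippet below is an illustration only, not how C2PA actually works: real provenance systems use public-key certificates issued to devices, whereas this sketch substitutes a stdlib HMAC with a hypothetical shared device key. The shape is the same, though: the capture device signs a hash of the original file, each edit appends a new signed claim linked to the previous one, and a verifier walks the chain and checks that the final claim matches the file it accompanies.

```python
import hashlib
import hmac
import json

# Hypothetical device key. Real C2PA signing uses per-device public-key
# certificates; HMAC with a shared secret is a stdlib stand-in.
DEVICE_KEY = b"hypothetical-device-key"

def _sign(body: dict) -> str:
    """Sign a claim body (everything except the signature itself)."""
    payload = json.dumps(body, sort_keys=True).encode()
    return hmac.new(DEVICE_KEY, payload, "sha256").hexdigest()

def sign_capture(image_bytes: bytes) -> dict:
    """Create the first provenance claim at capture time."""
    claim = {"action": "capture",
             "sha256": hashlib.sha256(image_bytes).hexdigest(),
             "prev": None}
    claim["sig"] = _sign(claim)
    return claim

def sign_edit(prev_claim: dict, edited_bytes: bytes, action: str) -> dict:
    """Append an edit claim, linked back to the previous claim's signature."""
    claim = {"action": action,
             "sha256": hashlib.sha256(edited_bytes).hexdigest(),
             "prev": prev_claim["sig"]}
    claim["sig"] = _sign(claim)
    return claim

def verify_chain(chain: list, final_bytes: bytes) -> bool:
    """Check every signature, every back-link, and the final file hash."""
    prev_sig = None
    for claim in chain:
        body = {k: v for k, v in claim.items() if k != "sig"}
        if claim["sig"] != _sign(body) or claim["prev"] != prev_sig:
            return False
        prev_sig = claim["sig"]
    return chain[-1]["sha256"] == hashlib.sha256(final_bytes).hexdigest()
```

A verifier holding the chain can confirm that an image descends from a signed capture through declared edits; any tampering with the bytes or the claims breaks either a hash or a signature.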

Existing provenance standards

Standards and tools already exist to support this vision. Camera manufacturers and software vendors are rolling out tamper-evident metadata through initiatives such as the Content Authenticity Initiative (CAI) and the Coalition for Content Provenance and Authenticity (C2PA).

Adobe’s Content Credentials can embed provenance data inside files; the stumbling block is whether platforms like Instagram will read, honor, and surface that metadata reliably.
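In JPEG files, C2PA manifests are carried in APP11 marker segments as JUMBF boxes. A platform deciding whether to surface provenance first has to find that metadata. The sketch below is a deliberately simplified detector: it walks the JPEG marker segments and reports whether any APP11 segment mentions the "c2pa" label. A real reader (such as the official c2pa SDKs) would parse the JUMBF box structure and verify the signatures, not just spot the label.

```python
import struct

def has_c2pa_manifest(jpeg_bytes: bytes) -> bool:
    """Simplified check for a C2PA manifest in a JPEG.

    Walks marker segments looking for an APP11 (0xFFEB) segment that
    contains the ASCII label 'c2pa'. Does NOT parse JUMBF boxes or
    verify signatures -- detection only.
    """
    if not jpeg_bytes.startswith(b"\xff\xd8"):  # must begin with SOI
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:               # lost sync with markers
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:                      # SOS: image data follows
            break
        # Segment length is big-endian and includes its own two bytes.
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        segment = jpeg_bytes[i + 4:i + 2 + length]
        if marker == 0xEB and b"c2pa" in segment:
            return True
        i += 2 + length
    return False
```

Reading the metadata is the easy half; Mosseri's point is that surfacing it consistently, and treating its absence sensibly, is a product decision platforms have yet to make.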

What this means for creators

If platforms move to signpost "real" media, creators will need to adopt provenance workflows. Practical steps include using software that writes verified Content Credentials, preserving original RAW files with metadata, and declaring AI involvement when it’s used.

Artists worried about platform-level bias or false flags may also diversify distribution — maintaining direct channels (personal websites, newsletters) and exploring communities that prioritize human-made work.

Final take

Mosseri’s admission is both a recognition of reality and a call to action. AI will keep improving; detection alone won't scale forever. Provenance — cryptographic signing, standardized metadata, and industry cooperation (Meta, camera makers, Adobe) — appears to be the most realistic path for preserving trust between creators and their audiences.

For creators, the immediate move is clear: adopt provenance tools, keep originals, and be transparent about AI use to protect credibility on platforms that may soon ask you to "prove you're real."
