Mosseri: Fingerprint Real Media, Not Fake
- Key Takeaways:
- Adam Mosseri says AI content is becoming ubiquitous and may soon outnumber real photography on Instagram.
- He argues platforms should focus on fingerprinting authentic media (cryptographic signing at capture) rather than chasing increasingly convincing fakes.
- Mosseri warns watermarking and detection will weaken as generative AI improves, and urges camera makers to verify authenticity at capture.
- Creators should consider showing 'raw' or unpolished work to signal authenticity in a feed crowded with synthetic content.
Why Mosseri says fingerprinting real media is more practical
Instagram head Adam Mosseri wrote that social platforms will face rising pressure to identify AI-generated content. He predicts detection tools will degrade as generative models improve.
His proposed alternative: rather than endlessly chasing fakes, platforms could "fingerprint real media" — for example, by cryptographically signing images at the moment of capture to create a verifiable chain of custody.
Cryptographic signing at capture
Under Mosseri's proposal, camera and phone manufacturers would embed an authenticity signature into photos and videos when they are taken.
That signature would act like a tamper-evident seal, allowing platforms and viewers to confirm whether a file originated from a verified device and hasn't been altered since capture.
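The capture-and-verify flow described above can be sketched in a few lines. This is a simplified illustration, not any actual device or platform implementation: it uses a symmetric HMAC key for brevity, whereas a real scheme (such as C2PA-style content credentials) would use an asymmetric key held in the device's secure hardware so that verifiers never see the signing secret. All names here (`sign_at_capture`, `DEVICE_KEY`) are hypothetical.

```python
import hashlib
import hmac
import json

# Hypothetical device key. In a real design this would be the private half
# of an asymmetric keypair kept in the camera's secure enclave, never a
# shared secret like this one.
DEVICE_KEY = b"secret-held-in-secure-hardware"

def sign_at_capture(image_bytes: bytes, metadata: dict) -> dict:
    """Attach a tamper-evident record covering both pixels and metadata."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    signature = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"metadata": metadata, "digest": digest, "signature": signature}

def verify(image_bytes: bytes, record: dict) -> bool:
    """Re-derive the digest and check the signature; any edit breaks both."""
    payload = image_bytes + json.dumps(record["metadata"], sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    expected = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest == record["digest"] and hmac.compare_digest(expected, record["signature"])
```

A platform receiving the file would run `verify`: an unmodified image from a verified device passes, while any change to the pixels or metadata invalidates the record — the "tamper-evident seal" behavior.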
Why current detection approaches falter
Techniques such as watermarking and retroactive AI detection are already showing their limits. Watermarks can be stripped, and detection models struggle to keep up as generative AI grows more realistic.
Meta has acknowledged it cannot reliably detect all AI-generated or manipulated content, making Mosseri's argument for a capture-time approach more urgent.
Implications for camera makers and phone companies
If adopted at scale, cryptographic signing would require coordination across hardware makers, operating systems and platforms like Instagram (Meta).
That raises technical and privacy questions: who manages keys, how to preserve users' privacy, and how to handle legacy devices without signing capability.
What this means for creators
Mosseri argues the era of the polished, square photo is waning. He recommends creators use more candid, unflattering or raw imagery as signals of authenticity in feeds increasingly filled with AI-generated visuals.
Creators concerned about reach and discoverability may see this shift as both a threat and an opportunity: authenticity could become a new currency, but its value depends on industry-wide adoption of verification tools.
Next steps and open questions
Mosseri offered few implementation details, so adoption depends on camera makers, standard-setting bodies and platform cooperation. Key challenges include scalability, privacy safeguards, and incentives for hardware partners.
As AI-generated content becomes commonplace, Mosseri's pragmatic pivot — prioritizing verification at capture over perfecting after-the-fact detection — reframes where responsibility for authenticity might sit: with devices, not just platforms.