AI Image Obsession Triggered Manic Episode, Psychosis

Key Takeaways:

  • A UX lead at a generative AI image startup says repeated image-generation sessions triggered a manic bipolar episode and psychosis.
  • She spent up to nine hours a day prompting early 2023-era image models, then became obsessed with AI depictions of herself.
  • The experience illustrates a potential digital-addiction pathway where algorithmic imagery reshapes body perception and mental health.
  • After clinical intervention she left the startup and now works at PsyMed Ventures, focusing on mental and brain health investing.

What happened

Caitlin Ner, formerly head of user experience at an AI image generator startup, published an essay in Newsweek describing how daily work with image-generation systems escalated into a mental-health crisis. She says she spent as much as nine hours a day crafting prompts for early 2023-era models, initially thrilled by the results.

From “magic” to mania

Ner wrote that the early outputs felt like "magic," but the thrill shifted into obsession. "Within a few months, that magic turned manic," she said, as repeated exposure began to alter how she perceived her own body.

How AI images rewired perception

The systems’ outputs evolved from mangled anatomy to idealized, stylized figures. Seeing those images repeatedly, Ner says, "rewired my sense of normal." When she looked at her reflection she perceived flaws she felt compelled to correct.

Obsession and dopamine loops

Ner describes chasing AI-generated versions of herself—fashion-model images the company encouraged for product development—and losing sleep to create more. Each new image produced a brief dopamine spike, which she calls "addictive."

Triggering a manic bipolar episode

Although Ner had previously managed her bipolar disorder successfully, her immersion in image generation precipitated a manic episode. That escalation culminated in psychosis: after viewing an AI image of herself on a flying horse, she briefly believed she could fly and heard commanding voices encouraging dangerous behavior.

Intervention and recovery

Friends, family and a clinician helped Ner recognize the link between her work and the spiral. She left the startup and moved into a director role at PsyMed Ventures, a VC focused on mental and brain health, where she still uses AI tools but with more guarded boundaries.

Why it matters

Ner’s account highlights how immersive, personalized AI imagery can affect self-image and mental stability, especially for users with existing vulnerabilities. Mental-health clinicians and researchers are increasingly cautious about algorithm-driven content that repeatedly rewards appearance-focused behavior.

Broader implications

As generative models like those popularized in 2023 become more capable, companies and clinicians will need clearer safety practices and usage guidelines. Ner’s story is a cautionary case for designers, employers and users about unregulated, high-volume use of generative AI.
