The Firefly frenzy didn’t start with a court filing. It started with a pop-up. When Creative Cloud users opened Photoshop or Illustrator in early June 2024—a moment that typically passes like lint on a dark sweater—they encountered updated terms they had to accept in order to keep working. People didn’t read the line about Adobe accessing user content for “content review” via automated and manual methods the way lawyers do. They read it the way independent contractors do when a client slips a new provision into a contract at 11:58 p.m. on a Friday. The internet panicked, screenshotted, and started a bonfire—as it always does when it detects a rights grab.
In an attempt to quell the controversy quickly, Adobe posted clarifications and then a more thorough update a few days later, stating that it does not train generative AI on customer content and that Firefly’s models are trained on licensed content, such as Adobe Stock, and public domain material. The company’s reasoning was simple: without certain permissions to process data, cloud features, moderation, and collaboration cannot work. That might be accurate. Watching the response, though, made it clear that the real injury was relational rather than technical. People don’t uninstall Adobe apps on a whim. Many creatives pay for the software the way they pay rent. In a subscription relationship, when trust falters, it does not falter politely.
| Item | Details |
|---|---|
| Company | Adobe Inc. (Adobe) |
| Flagship suite | Creative Cloud (subscription-first rollout began in 2013) (Wikipedia) |
| Generative AI brand | Adobe Firefly (Adobe Newsroom) |
| Public positioning | Trained on licensed content (e.g., Adobe Stock) + public domain; pitched as safer for commercial use (Adobe Help Center) |
| Flashpoint #1 | June 2024 Terms of Use backlash over language about accessing user content; Adobe issued clarifications/updates (Adobe Blog) |
| Flashpoint #2 | Reporting that some AI-generated images (including some created with rival tools) entered training data via Adobe Stock submissions (Yahoo Finance) |
| Legal pressure | Proposed class action alleges Adobe used pirated books to train “SlimLM” small language models (separate from Firefly images) (Reuters) |
| CEO | Shantanu Narayen (Adobe) |
Firefly was meant to be the calming counterprogramming to the larger generative-AI chaos: a “commercially safe” substitute for systems accused of learning by sifting through the open internet. The promise succeeded because it aligned with the prevailing sentiment in agencies and studios: people wanted speed without the risk of legal action or the awkward sensation of stealing style from unapproved third parties. Adobe has repeatedly said its models are trained on licensed content, such as Adobe Stock, and public domain material.
Then grime got into the clean narrative. A 2024 report explained how AI-generated images entered Firefly’s training set: contributors were permitted to submit “generative AI” images to Adobe Stock, some created with rival tools, and those Stock assets became part of the training pool. According to Adobe, AI-generated content makes up only a small percentage of the dataset. This is where creators start using words like “laundering,” not because anyone proved a conspiracy, but because the pathway feels like a loophole: if a rival model was trained on scraped images, and its outputs enter your “licensed” marketplace, doesn’t that import the very controversy you were selling yourself as immune to?
The clear villains of the lawsuits looming over this entire industry are companies accused of training on copyrighted work without authorization. A federal judge has permitted important aspects of the artists’ claims against Stability AI and Midjourney to proceed, bringing the battle closer to discovery, where sensitive information frequently comes to light. Hollywood has also begun to swing, with Disney and Universal suing Midjourney over copyrighted characters. This escalation feels like a warning shot directed at any image model with growth aspirations.
Adobe’s stance is different: more corporate, more defensive, more “we planned for this.” Some enterprise clients have even been offered indemnity for Firefly outputs, a detail that suggests the company is marketing risk management as a feature. It’s still unclear, though, whether indemnity soothes people or merely reminds them that the legal risk is significant enough to write into a contract.
Adobe is now facing its own AI-training lawsuit, even though it has nothing to do with images. Author Elizabeth Lyon filed a proposed class action in December 2025, alleging Adobe trained its SlimLM small language models on pirated books. Adobe did not comment in the original reporting. For a working photographer or illustrator, the details might seem remote, but the themes rhyme: consent, compensation, data provenance, and the unsettling feeling that creative work has been reduced to raw material.
The Firefly freak-out is particularly flammable because of its proximity to Creative Cloud’s power. This is not an AI startup vying for attention; it is a business that already controls the workflow. Walk through a university lab or co-working space and you can see it: the blue “Ps” icon on the dock, the muscle memory of keyboard shortcuts, the resigned monthly payment that keeps a career’s worth of files editable. When Adobe’s terminology turns ambiguous, it feels less like a policy change and more like a shift in the foundation of an entire profession.
Adobe has also been expanding Firefly’s universe in a way that subtly shifts the debate. According to Reuters, Adobe plans to incorporate third-party models from Google and OpenAI into Firefly in 2025, letting users access a variety of generators through Adobe’s credit system. That is convenient—possibly unavoidable. It also complicates the narrative of “trust us, we’re the safe option.” Safety becomes a menu item. When an output causes a dispute, users begin to wonder which model did what, which terms apply, and who is accountable. Even with labels and controls, ambiguity tends to grow as ecosystems do.
So the future of Creative Cloud might depend less on Firefly’s “goodness” than on Adobe’s ability to make provenance seem boring. Creators, particularly Stock contributors whose work sits close to the center of this machine, prefer readable terms, clear boundaries, and predictable incentives over speeches. Firefly may yet become the default layer of contemporary design work, humming in the background while people actually think. That is plausible. But it’s hard to overlook a new reflex that has emerged over the past two years: creators no longer presume the platform is on their side. They want receipts.

