Adobe Says It Won’t Train AI Using Artists’ Work. Creatives Aren’t Convinced

Jun 19, 2024 1:59 PM

After a user backlash, Adobe has been forced to clarify how it will use creators’ work. But can it be trusted?

An entrance to an Adobe office in San Francisco, California. Photograph: Justin Sullivan/Getty Images

When users first found out about Adobe’s new terms of service (which were quietly updated in February), there was an uproar. Adobe told users it could access their content “through both automated and manual methods” and use “techniques such as machine learning in order to improve [Adobe’s] Services and Software.” Many understood the update as the company forcing users to grant unlimited access to their work, for purposes of training Adobe’s generative AI, known as Firefly.

Late on Tuesday, Adobe issued a clarification: In an updated version of its terms of service agreement, it pledged not to train AI on its users' content stored locally or in the cloud and gave users the option to opt out of content analytics.

Caught in the crossfire of intellectual property lawsuits, the ambiguous language of the earlier terms update exposed a climate of acute skepticism among artists, many of whom rely heavily on Adobe for their work. “They already broke our trust,” says Jon Lam, a senior storyboard artist at Riot Games, referring to how award-winning artist Brian Kesinger discovered generated images in the style of his art being sold under his name on Adobe's stock image site, without his consent. Earlier this month, the estate of late photographer Ansel Adams publicly scolded Adobe for allegedly selling generative AI imitations of his work.

Scott Belsky, Adobe’s chief strategy officer, had tried to assuage concerns when artists started protesting, clarifying that machine learning refers to the company’s non-generative AI tools—Photoshop’s “Content Aware Fill,” which lets users seamlessly remove objects from an image, is one of many features powered by machine learning. But while Adobe insists that the updated terms do not give the company ownership of users' content and that it will never use that content to train Firefly, the misunderstanding triggered a bigger discussion about the company’s market dominance and how a change like this could threaten artists’ livelihoods at any point. Lam is among the artists who still believe that, despite Adobe’s clarification, the company will use work created on its platform to train Firefly without creators’ consent.

The nervousness over nonconsensual use and monetization of copyrighted work by generative AI models is not new. Early last year, artist Karla Ortiz was able to prompt images of her work using her name on various generative AI models, an offense that gave rise to a class action lawsuit against Midjourney, DeviantArt, and Stability AI. Ortiz was not alone—Polish fantasy artist Greg Rutkowski found that his name was one of the most commonly used prompts in Stable Diffusion when the tool first launched in 2022.

As the owner of Photoshop and creator of the PDF format, Adobe has reigned as the industry standard for over 30 years, powering the majority of the creative class. Its attempted acquisition of product design company Figma, blocked and abandoned in 2023 over antitrust concerns, attests to its size.

Adobe specifies that Firefly is “ethically trained” on Adobe Stock, but Eric Urquhart, longtime stock image contributor, insists that “there was nothing ethical about how Adobe trained the AI for Firefly,” pointing out that Adobe does not own the rights to any images from individual contributors. Urquhart originally put his images up on Fotolia, a stock image site, where he agreed to licensing terms that did not specify any uses for generative AI. Fotolia was then acquired by Adobe in 2015, which rolled out silent terms-of-service updates that later allowed the company to train Firefly using Urquhart’s photos without his explicit consent: “The language in the current change of TOS, it’s very similar to what I saw in the Adobe Stock TOS.”

Since the introduction of Firefly, some artists have made the difficult (and arduous) decision to cancel their Adobe subscriptions and pivot to tools like Affinity and Clip Studio. Others feel bound to the software. “Professionally, I can’t quit Adobe,” says Urquhart.

Adobe has acknowledged its responsibility to the creative community in the past. In September 2023, the company announced the Federal Anti-Impersonation Right (FAIR) act, a legislative initiative that aims to protect artists from misappropriations of their work. The proposal only addresses intentional impersonations used for commercial purposes, raising questions around efficacy (the act would not protect works “accidentally generated” in the style of an artist) and privacy (proving intent would require storing and monitoring user prompts).

Outside of Adobe, organizations are finding new ways to help authenticate works and prevent intellectual property theft. A team of researchers at the University of Chicago developed Nightshade, a tool that “poisons” training data and damages iterations of image-generating AI models, and Glaze, a tool that helps artists “mask” their signature styles from AI companies. In terms of regulation, the Concept Art Association—an organization Lam is also a part of—advocates for artists’ rights with crowdfunded lobbying efforts.

Tiffany Ng is a culture and technology freelance writer based in Hong Kong. Her work has appeared in publications including Vox, MIT Tech Review, Vice, Insider and Vogue.
Credit: www.wired.com
