How tech companies are approaching generative AI copyright
As Valve rejects games made with AI-generated assets from its Steam platform over copyright concerns, other tech companies are turning their approach to copyright into a selling point.
Adobe and Shutterstock are both touting their "ethical" approach to generative AI / Pete Linforth
Generative AI is already being used at scale in asset creation. Companies such as Adobe have made the ability to outpaint, create new assets and entirely redesign visual creative at the click of a button a core part of their appeal. Shutterstock, too, has been quick to tout the flexibility and speed with which its users can apply AI to creative marketing.
Both Adobe and Shutterstock, however, are in the rarefied position of owning vast libraries of images and videos on which their AI models have been trained. Because they can be absolutely sure where their training assets come from, they can offer indemnification against any copyright lawsuits brought against their users.
It is an extremely smart marketing technique: both companies are highlighting a fundamental issue with AI while demonstrating that, by the nature of their tools, the issue will never arise for their users.
Gregor Pryor is Reed Smith’s EME managing partner. He explains: “The law relating to AI... is in its nascent stage, albeit evolving rapidly in response to mounting global concerns. AI is pushing existing legal concepts to their limits, inventing new ones and generally questioning the relationship between our legal systems and machines in an unprecedented manner.”
The potential danger for companies that publish content created through generative AI was brought home this month when American video game publisher Valve announced it would not be hosting games that use AI-generated assets on its Steam platform. Steam, which is the largest digital games store by user numbers, is seen as a bellwether for its competitors.
Valve clarified to Eurogamer that its decision to delist a solo developer’s game over its use of AI-generated assets was not simply its opinion on the technology, but a reflection of how it interprets current copyright law.
In addition, because it could be on the hook if it sold games that breached those copyright rules, Valve said the onus is on developers to ensure they own the rights to the assets in their games: “It is the developer’s responsibility to make sure they have the appropriate rights to ship their game.”
For both Adobe and Shutterstock, this is the moment in the widespread adoption of generative AI to plant their flags and position themselves among the technical and ethical leaders in the space. Announcing the integration of its Firefly AI tool across its product suite, Adobe chief executive Shantanu Narayen explained: “We believe that few companies can create the foundational models Adobe can with Firefly. This enables us to build these high-quality foundational models that are safe for commercial use. Our AI approach is built with transparency at the center… all content in Firefly will automatically be tagged with content credentials.”
That provenance is set to become as big a selling point of generative AI tools as their capabilities. There are already ongoing battles between publishers and platforms over what form recompense for the use of their content in training AI will take, while many of the changes individuals such as Elon Musk have made to their platforms are nominally in service of retaining ownership of that data.
A partial precedent has been set in Japan, where the government has indicated that publicly available data is fair game for training AI, but that users must still own the copyright to anything created using that data.
Kristen Sanger, vice-president of content at rapid video creation platform Storyblocks, argues: “The potential of properly harnessing generative AI models to ignite an explosion of digital creativity is exciting, and while I agree that copyright laws will need to be adjusted to this new reality, that alone is not enough. Explicit consent matters. It is the only way to make generative content commercially viable, especially in the wake of the Supreme Court’s Warhol v Goldsmith decision. Scraping the internet to train models will not serve anyone well.”
So just as Shutterstock and Adobe can tout the watertight indemnification of their tools (notably, both are simply extensions of existing indemnification rules), they can also claim to be pursuing generative AI ethically. For brands seeking to get in on the latest tech trend in marketing, that all adds up to an incredibly attractive pitch.