
The Balenciaga pope. A fake Pentagon explosion. It is becoming increasingly difficult to distinguish AI-generated images from real ones, sometimes with disastrous consequences.
A solution remains elusive. But Microsoft is making an effort with new media provenance features debuting at its annual Build conference.
Launching for Bing Image Creator and Designer, Microsoft’s Canva-like web app that can generate designs for presentations, posters and more to share on social media and other channels, the new media provenance capabilities will allow consumers to verify whether an image or video was generated by AI, Microsoft says. Rolling out in the coming months, the capabilities will use cryptographic methods to mark and sign AI-generated content with metadata about the origin of the image or video.
It’s not as simple as a visible watermark. To read the signature, sites must support the Coalition for Content Provenance and Authenticity (C2PA) Interoperable Specification, which was created with input from Adobe, Arm, Intel, Microsoft and the visual media platform Truepic. Only then can a site warn consumers that content was generated by AI or created or modified with Designer or Image Creator.
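Microsoft hasn’t published implementation details, but at a high level the approach resembles a standard sign-and-verify scheme: the generating service attaches provenance metadata to the content and signs it with a private key, and any site holding the matching public key can confirm the metadata is authentic and untampered. The sketch below is purely illustrative and is not C2PA-conformant code; the metadata fields, key handling and the choice of Ed25519 via Python’s cryptography package are assumptions made for the example.

```python
# Illustrative only: a minimal sign/verify flow for provenance metadata.
# This is NOT the C2PA spec; field names and the Ed25519 choice are assumptions.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical provenance record a generator might embed alongside an image.
manifest = {
    "generator": "example-image-tool",   # assumed field name
    "ai_generated": True,
    "created": "2023-05-23T00:00:00Z",
}
payload = json.dumps(manifest, sort_keys=True).encode()

# The generating service signs the metadata with its private key.
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(payload)

# A site that supports the scheme verifies the signature with the
# service's published public key before trusting the metadata.
public_key = private_key.public_key()
try:
    public_key.verify(signature, payload)
    print("Provenance metadata verified: content is marked AI-generated.")
except InvalidSignature:
    print("Metadata or signature has been tampered with.")
```

A real deployment would also bind the signed metadata to the image or video bytes (for example, by including a hash of the file) and distribute keys through a certificate chain, which is the kind of plumbing the C2PA specification standardizes.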
So the question is, will Microsoft’s efforts make much of a difference if so many image generation tools haven’t embraced similar media provenance standards? C2PA does have the backing of Adobe, which recently launched its own suite of generative AI tools, including an integration with Google’s Bard chatbot. But one of the more prominent players in the generative AI space, Stability AI, only very recently showed a willingness to embrace a specification like the one Microsoft is proposing.
Standards aside, Microsoft’s move to introduce a media provenance tracking mechanism is in line with broader industry trends as generative AI grows in popularity. In May, Google said it would use embedded metadata to signal visual media created by its generative AI models. Separately, Shutterstock and generative AI startup Midjourney have adopted guidelines to include a marker indicating that content was created by a generative AI tool.