Microsoft and Adobe push new symbol to label AI images

Microsoft, Adobe, and other big names this week pledged to add metadata to their AI-generated images so that future compatible apps will flag them up as machine-made using a special symbol.

You may have seen reports describing this as some kind of AI watermark. We took a closer look.

The symbol – described as an “icon of transparency” – depicts the lowercase letters “cr” inside a speech-bubble-like shape.

It was created by the Coalition for Content Provenance and Authenticity (C2PA), a group of organizations spanning industries from tech to journalism. The C2PA has been around for a couple of years; is driven by Adobe, Arm, Intel, Microsoft, and Truepic; and specifies in detail how metadata embedded in an image can securely and digitally certify that image’s source and edit history. Alternative approaches exist, as does traditional image metadata; C2PA is the one pushed by the above big names.

In fact, you can use C2PA’s Content Credentials metadata for any picture – it doesn’t have to be AI generated. The examples on the Content Credentials website were made using Adobe’s Photoshop and Firefly AI tools, and can be identified as such through their metadata. What Microsoft, Adobe, and others have promised is to ensure their AI generators will, at some point in the future, include this cryptographically signed metadata in their machine-crafted pictures. The goal is to give people a way to see whether a picture was made by a model or a human, and how.
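If you want to poke at that metadata yourself, the Content Authenticity Initiative publishes an open source command-line utility, c2patool, that dumps a file's manifest as JSON. Here's a minimal Python sketch wrapping it – it assumes c2patool is installed and on your PATH, and the exact layout of the JSON report can vary between tool versions:

import json
import subprocess
import sys

def read_content_credentials(path: str):
    """Return the parsed Content Credentials report for path, or None if absent."""
    # c2patool prints the manifest store as JSON when given just a file path
    result = subprocess.run(["c2patool", path], capture_output=True, text=True)
    if result.returncode != 0:
        return None  # no manifest found, or the file format isn't supported
    return json.loads(result.stdout)

if __name__ == "__main__":
    report = read_content_credentials(sys.argv[1])
    if report is None:
        print("No Content Credentials metadata found")
    else:
        print(json.dumps(report, indent=2))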

For instance, Microsoft said artwork produced by its text-to-picture Bing Image Creator and Bing AI chatbot will feature that metadata at some point.

Now here’s the tricky part, assuming the specification is secure and robust. It’s one thing to store that metadata in a picture. How does the user find out, without digging into the file contents?

Well, you will need a compatible application, one that understands the Content Credentials metadata. If an app recognizes that data in a file, it should superimpose the “cr” symbol over the image in a top corner. When you click on that symbol, a widget should appear describing the source of the pic and other details from the Content Credentials metadata – for example, if it was made via Bing or Photoshop.

That’s how people can easily inspect the origin of a snap. But of course, if the file is opened in an application that doesn’t support Content Credentials, the app won’t understand the data and no symbol will be shown.

You can see what we mean from the aforementioned Content Credentials website. Open it up and scroll down to the AI-made butterfly example. The webpage mocks up what that image should look like in an application that can parse the picture’s metadata: the symbol is displayed in the top corner and when you click or tap it, a label appears saying it’s AI generated. The picture looks like this on the page, without the label open:

[Image: the AI-made butterfly with the “cr” symbol overlaid in its top corner]

If you right-click over it, on the Content Credentials website, and save that butterfly image to disk, and then open it in something like Chrome today, it’ll look like this:

[Image: the same butterfly, with no “cr” symbol shown]

Just a plain image, no symbol. Chrome doesn’t know about the Content Credentials metadata that’s present in the AI-generated snap, and doesn’t have the icon to overlay anyway. If you go to the Content Credentials verification page, and drop in the downloaded butterfly image, it’ll tell you it was made by Adobe’s Firefly 1.0 AI suite.
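If you'd rather not eyeball the raw report, here's a small follow-up to the sketch above that digs out the claim_generator field – the C2PA name for the software that signed the image, which for the butterfly should identify Firefly. It deliberately walks the JSON rather than assuming a fixed layout, since the report structure differs between c2patool versions:

from typing import Any, Iterator

def find_key(node: Any, key: str) -> Iterator[Any]:
    """Yield every value stored under key anywhere in a nested JSON structure."""
    if isinstance(node, dict):
        for k, v in node.items():
            if k == key:
                yield v
            yield from find_key(v, key)
    elif isinstance(node, list):
        for item in node:
            yield from find_key(item, key)

# With the report returned by read_content_credentials() above:
# for generator in find_key(report, "claim_generator"):
#     print("Signed by:", generator)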

This technology – and the specs for it are detailed and impressive – relies on applications understanding and supporting the metadata; without that support, no symbol can or will be shown. Then there’s the fact that someone could strip out the metadata, export the file to another format without the metadata, or screenshot it from an application that doesn’t overlay the symbol, and then distribute that metadata-less image. If that stripped picture is later opened in an app that does understand Content Credentials, no symbol will be shown.
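To see how easily that happens, consider JPEGs: the C2PA spec stores the manifest in APP11 (JUMBF) marker segments, and a plain re-encode – here with Pillow – simply doesn't carry those segments across. A rough sketch with placeholder filenames, assuming a baseline JPEG:

from PIL import Image

APP11 = 0xEB  # JPEG APP11 marker, where C2PA/JUMBF payloads live

def has_app11_segment(path: str) -> bool:
    """Walk the JPEG marker segments and report whether any APP11 segment exists."""
    with open(path, "rb") as f:
        data = f.read()
    i = 2  # skip the SOI marker (FF D8)
    while i + 4 <= len(data) and data[i] == 0xFF:
        marker = data[i + 1]
        if marker == 0xDA:  # start of scan – entropy-coded image data follows
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")
        if marker == APP11:
            return True
        i += 2 + length
    return False

print(has_app11_segment("signed_original.jpg"))       # True for an image with credentials
Image.open("signed_original.jpg").save("resaved.jpg")  # naive re-encode
print(has_app11_segment("resaved.jpg"))                # False – the manifest didn't survive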

That means not only do you need apps, social networks, generators, and so on, that can add and read Content Credentials data, you also need people to be aware of the icon so that they can look out for it. If they receive or see a pic and it doesn’t have a Content Credentials symbol on it, they may be inclined to distrust it. That’s going to take a huge amount of brand awareness to work.

That all said, Adobe told us it has a Content Credentials cloud, and that seems to work like this: you upload your image files’ metadata to Adobe’s cloud; if one of your files is later shared by someone without its identifying metadata, whatever they are using to distribute the snap could run the image by Adobe’s cloud and recover the metadata if there is a visual match.

Think: something like Google reverse image search, except it returns the original metadata. That way, if someone posts your pic to the web or via an app and everything clicks into place, the origin of the picture should be clear even if the metadata was lost prior to posting – it would be recovered from the Content Credentials cloud.
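Adobe hasn't said how that visual matching works under the hood, but the general idea of recovering provenance by fingerprinting can be illustrated with perceptual hashes. This toy Python sketch – using the imagehash library, made-up filenames, and a plain dictionary standing in for Adobe's cloud – is not Adobe's algorithm, just the flavor of it:

from PIL import Image
import imagehash  # pip install imagehash pillow

# Pretend "cloud" index: perceptual hash of a registered image -> its stored credentials
registry = {
    imagehash.phash(Image.open("butterfly_original.jpg")): {"generator": "Adobe Firefly 1.0"},
}

def lookup(path: str, max_distance: int = 8):
    """Return the stored credentials of the closest registered image, if it's close enough."""
    candidate = imagehash.phash(Image.open(path))
    best = min(registry, key=lambda h: candidate - h)  # subtraction gives the Hamming distance
    return registry[best] if candidate - best <= max_distance else None

# A stripped or re-encoded copy hashes to (nearly) the same value, so the
# original credentials can be looked up and reattached:
print(lookup("butterfly_stripped.jpg"))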

“Once digital content is signed with Content Credentials (in a platform that leverages the C2PA open standards and Content Credentials), tamper-evident metadata is attached, so it travels with the content wherever it goes,” a spokesperson for Adobe told The Register.

“Even if that information has been maliciously or accidentally stripped off at any point in the content’s lifecycle, it can be recovered via Content Credentials Cloud.”

At least Adobe has thought of that. So now we need people to be aware of the symbol; for their applications to support the symbol; for artists and generators to provide the metadata to Adobe’s cloud; and for online publishers, social networks, and other content hosts to plug into Adobe’s cloud and recover any missing Content Credentials metadata so that metadata-less images can be identified. Phew!

We also suppose that you could use a compatible metadata viewer, or the above Content Credentials verify tool, to check an image yourself if you don’t yet have an app to hand that can handle it.

“All AI-generated images in Bing Image Creator are identified using Content Credentials,” a Microsoft spokesperson told The Register. “If Content Credentials appears next to an image, users can quickly confirm the time and date of the original image through an invisible digital watermark.

“This process happens automatically during the creation process and is based on C2PA technical requirements. The icon serves as a simple way for consumers to recognize the content. For Bing Image Creator, we believe the icon will be in place before the end of the year and appear next to the words ‘Content Credentials’.”

Other organizations are joining in, too: Publicis Groupe, a French PR and advertising company, is expected to preview how it will use Content Credentials metadata with Adobe in the future, while camera makers Leica Camera and Nikon will demonstrate how future equipment will use the spec to bake the origin of photographers’ snaps into the files themselves.

Good intentions

The coalition believes all this can help tackle AI-generated deepfakes that spread misinformation to trick netizens. In our view, that will only be achieved if the “cr” icon becomes as ubiquitous as the copyright symbol.

The metadata could at least make it easier for brands and businesses to be transparent about their use of synthetic images in advertising and marketing campaigns, the org said, if said businesses are willing to disclose that. Andy Parsons, senior director of the Content Authenticity Initiative (CAI) at Adobe, said the accompanying info widget acts like a “digital nutrition label” for users.

“Throughout history, visual iconography has acted as a powerful signifier for communication and culture,” he said.

“We are incredibly excited about the potential of the official Content Credentials ‘icon of transparency’ to become a universal standard and expectation across culture online – helping make trust a fundamental principle in this new digital world. We look forward to continuing to incorporate Content Credentials and the new icon across our Adobe products and solutions.”

Although it’s arguably a step in the right direction, the icon is a long way from being a catch-all solution to authenticating content on the internet and telling AI fakes from the real thing. Adoption so far is merely promised and limited to support from a few albeit very big names. And as we said, it requires many more platforms – including generative AI developers, social media, app makers, and publishers – to support it in order for it to be effective. ®
