Protecting Artwork from AI Harvesting

“Pokemon in the style of Pieter Bruegel the Elder’s The Hunters in the Snow.” This was one of the most memorable AI images a friend has sent me. A few months earlier, that friend had discovered DALL-E, an artificial intelligence tool that creates images from text prompts. Since then, we had been trading the results of the most creative prompts we could think of. “Art Nouveau” and “in the style of Francis Bacon” produced consistently intriguing results. 

What these prompts have in common is that they were mostly not drawing from the work of living contemporary artists. Many visual artists working today are concerned about the effects that AI-generated art could have on their work and their ability to make a living, especially since a fundamental reason these tools function in the first place is that they draw from the countless images available online. Artists who have put their work on the internet have unwittingly had it used for this purpose without permission or compensation. 

Currently there aren’t a lot of tools – legal or technological – to help artists who want to put their work online but protect it from being used to train AI. An academic research group at the University of Chicago has released a free tool to help artists with this. The tool, called Glaze, is available for Windows and macOS. It adds to images what the research team calls a “cloaking layer”: essentially, a set of changes to the image that are minimally detectable to the human eye but make it very difficult for AI models to discern what visual qualities comprise the style of that image. Glaze lets you choose a well-known “target style,” such as Van Gogh’s, so that AI tools perceive the visual characteristics of your work as being in line with that style instead of your own. 
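To make the idea of a “cloaking layer” concrete, here is a toy sketch in Python. This is not Glaze’s actual algorithm (which optimizes perturbations in the feature space of an image model); it only illustrates the core constraint the paragraph describes: a style-shifting perturbation is applied to an image, but every pixel’s change is capped by a small perceptibility budget so the result still looks essentially the same to a human viewer. The function name, the `epsilon` budget, and the random “style shift” are all hypothetical.

```python
import numpy as np

def cloak(image, style_shift, epsilon=0.05):
    """Toy illustration of style cloaking (not Glaze's real method):
    nudge the image toward a target style, but cap each pixel's
    change at `epsilon` so the edit stays hard for a human to see."""
    perturbation = np.clip(style_shift, -epsilon, epsilon)  # perceptibility budget
    return np.clip(image + perturbation, 0.0, 1.0)          # keep valid pixel range

# Hypothetical 4x4 grayscale "artwork" and a large raw style shift
rng = np.random.default_rng(0)
art = rng.random((4, 4))                       # pixel values in [0, 1]
shift = rng.normal(scale=0.5, size=(4, 4))     # would be visibly distorting if applied raw

cloaked = cloak(art, shift, epsilon=0.05)
# No pixel moved by more than the budget, so the change is subtle to a viewer
assert np.all(np.abs(cloaked - art) <= 0.05 + 1e-9)
```

In the real system, the perturbation is not random noise: it is optimized so that, within a similar perceptibility budget, a model’s internal representation of the image shifts toward the chosen target style.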

Glaze was originally designed for paintings and drawings, because the cloaking layer works best with artwork that is complex in terms of brushwork, color, and texture. Artwork characterized by flat color, simple shapes, and straight lines of consistent width can’t be cloaked as effectively: there are fewer visual elements for the tool to work with and disguise without significantly altering how the image looks to human eyes. However, an updated version of the tool was released in June 2023 “to work better on styles of art with flat-colors or gradients (e.g. anime or comic art).” There are now also examples of it being used on high-resolution photography. 

For an example of what these changes can look like, check out this example from the team’s research paper, Glaze: Protecting Artists from Style Mimicry by Text-to-Image Models (loads as a PDF). 

Sources

“Glaze: Protecting Artists from Generative AI.” Glaze. Accessed July 10, 2023. https://glaze.cs.uchicago.edu/ 

Hill, Kashmir. “This Tool Could Protect Artists from A.I.-Generated Art That Steals Their Style.” New York Times, February 13, 2023, https://www.nytimes.com/2023/02/13/technology/ai-art-generator-lensa-stable-diffusion.html

Lomas, Natasha. “Glaze protects art from prying AIs: Generative art’s style mimicry, interrupted.” TechCrunch. Last modified March 17, 2023. https://techcrunch.com/2023/03/17/glaze-generative-ai-art-style-mimicry-protection/ 

Shan, Shawn, Jenna Cryan, Emily Wenger, Haitao Zheng, Rana Hanocka, and Ben Y. Zhao. “Glaze: Protecting Artists from Style Mimicry by Text-to-Image Models.” http://people.cs.uchicago.edu/~ravenben/publications/pdf/glaze-usenix23.pdf