Glaze: Protecting Artists from Style Mimicry

Submitted by
Style Pass
2023-03-19 02:30:05

We are an academic research group of PhD students and CS professors interested in protecting Internet users from invasive uses of machine learning.

We are not motivated by profit or any political agenda, and our only goal is to explore how ethical security techniques can be utilized to develop practical solutions and (hopefully) help real users.

Downloads March 18: Glaze Beta2 is now available for download here. Glaze generates a cloaked version of each image you want to protect. During this process, none of your artwork ever leaves your own computer. Then, instead of posting the original artwork online, you can post the cloaked artwork to protect your style from AI art generators. We will not commercialize our protection tool in any way; it will be free to use upon release. It is solely for research purposes, with the goal of protecting artists. If you are interested in news and updates about the application release, please join this Glaze-announce mailing list.

Suppose we want to protect artist Karla Ortiz's artwork in her online portfolio from being taken by AI companies and used to train models that can imitate Karla's style. Our tool adds very small changes to Karla's original artwork before it is posted online. These changes are barely visible to the human eye, so the artwork still appears nearly identical to the original, while preventing AI models from copying Karla's style. We refer to these added changes as a "style cloak" and the changed artwork as "cloaked artwork."
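The core idea can be sketched in code. What follows is a deliberately simplified toy, not the actual Glaze algorithm: it stands in for the real system by using a fixed linear map `W` as a surrogate "style feature extractor" and projected gradient descent to find a small, per-pixel-bounded perturbation that shifts an image's features toward those of a different target style. The function and variable names (`cloak`, `W`, `target_feat`, `eps`) are illustrative assumptions, not part of the released tool.

```python
import numpy as np

def cloak(image, W, target_feat, eps=8 / 255, steps=100, lr=0.05):
    """Toy style-cloaking sketch (NOT the real Glaze algorithm).

    Finds a perturbation delta, clipped to [-eps, eps] per pixel so the
    change stays barely visible, that moves the surrogate features
    W @ x toward target_feat (the features of a different "style").
    """
    x = image.flatten()
    delta = np.zeros_like(x)
    for _ in range(steps):
        feat = W @ (x + delta)
        # Gradient of the feature-space loss ||W(x + delta) - target||^2
        grad = 2 * W.T @ (feat - target_feat)
        delta -= lr * grad
        # Keep the perturbation small (barely visible to the human eye)
        delta = np.clip(delta, -eps, eps)
        # Keep pixel values valid after perturbation
        delta = np.clip(x + delta, 0.0, 1.0) - x
    return (x + delta).reshape(image.shape)

# Example usage with random stand-ins for the artwork and feature map
rng = np.random.default_rng(0)
img = rng.random((8, 8))                      # stand-in for the artwork
W = rng.standard_normal((16, 64)) / 8         # toy "feature extractor"
target_feat = W @ rng.random(64)              # features of a target style
cloaked = cloak(img, W, target_feat)
```

In the real system, the surrogate feature extractor would be a deep image model and the visibility bound a perceptual metric rather than a simple per-pixel clip, but the structure is the same: an optimization that trades a tiny, bounded image change for a large shift in what a style-mimicry model would learn.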
