Ownership
To protect their work from being used in Generative AI training, respondents have adopted several approaches. Some rely on contractual restrictions, stating explicitly in agreements that their work may not be used for AI purposes. Others have made public declarations refusing permission, or have opted out of AI training on platforms such as Meta's, which use uploaded content to train models. Technical measures are also becoming more common: Nightshade "poisons" AI training data by subtly distorting images, while Glaze alters artworks so that AI models cannot mimic an artist's style. Some respondents have moved their work to AI-resistant platforms like Cara, which blocks AI scrapers by default. Despite these efforts, some remain skeptical that any of these measures will effectively prevent AI misuse.
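Beyond dedicated platforms, site owners who host their own portfolios can signal a similar opt-out themselves. One common (though advisory, not enforceable) mechanism is a `robots.txt` file that disallows known AI crawlers. The sketch below blocks two publicly documented crawler user agents, OpenAI's GPTBot and Common Crawl's CCBot; compliance depends entirely on the crawler honoring the file.

```
# robots.txt — placed at the site root, e.g. https://example.com/robots.txt
# Ask OpenAI's crawler not to index the site for training
User-agent: GPTBot
Disallow: /

# Ask Common Crawl's bot (a frequent source of AI training data) to stay out
User-agent: CCBot
Disallow: /

# Everything else may crawl normally
User-agent: *
Allow: /
```

This approach only works against crawlers that voluntarily respect `robots.txt`, which is one reason some respondents doubt that opt-out measures alone can prevent misuse.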
