Nightshade Alters Reality for AI Models

Researchers from the University of Chicago have introduced a new tool called Nightshade 1.0, aimed at combating the unauthorized use of data in training AI models. Nightshade works as a “data poisoning” tool that complements the team’s existing protective tool, Glaze.

Nightshade changes images in a way that makes them unsuitable for unauthorized use in model training. The tool is intended to push developers to respect content creators’ rights over how their work is used.

According to the developers, Nightshade keeps visible changes to the original image to a minimum while significantly altering how it is perceived by machine vision. For example, a shaded image of a cow in a green field may look virtually unchanged to the human eye, while an AI model might perceive it as a large leather bag lying in the grass.

Described in a research paper published in October 2023, Nightshade is a targeted data poisoning attack on model training. The technique poisons training pairs by coupling an image with a description that does not accurately match its content, thereby blurring the boundaries of the concept when the pair is used for training.
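To make the general idea concrete, the minimal sketch below shows one generic way a feature-space poisoning perturbation can be constructed: the image is nudged, within a small pixel budget, so that a vision encoder’s features drift toward a different concept. This is an illustration of the broad technique only, not Nightshade’s actual algorithm or code; the choice of encoder (torchvision’s ResNet-50), the perturbation budget, and the file names cow.jpg and handbag.jpg are assumptions made for the example.

```python
# Illustrative feature-space poisoning sketch (NOT Nightshade's implementation).
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Any pretrained vision encoder would do for the demonstration; ResNet-50 is
# used here simply because it ships with torchvision.
encoder = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).to(device).eval()
encoder.fc = torch.nn.Identity()          # use penultimate features as the "concept" embedding
for p in encoder.parameters():
    p.requires_grad_(False)

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),                # pixels in [0, 1]
])

def embed(x: torch.Tensor) -> torch.Tensor:
    # Normalize inside the function so gradients flow back to the raw pixels.
    mean = torch.tensor([0.485, 0.456, 0.406], device=x.device).view(1, 3, 1, 1)
    std = torch.tensor([0.229, 0.224, 0.225], device=x.device).view(1, 3, 1, 1)
    return encoder((x - mean) / std)

# Placeholder file names: the image being protected and an anchor from another concept.
source = preprocess(Image.open("cow.jpg").convert("RGB")).unsqueeze(0).to(device)
anchor = preprocess(Image.open("handbag.jpg").convert("RGB")).unsqueeze(0).to(device)

with torch.no_grad():
    target_feat = embed(anchor)

delta = torch.zeros_like(source, requires_grad=True)
eps = 8 / 255                             # assumed per-pixel budget keeping the change subtle
opt = torch.optim.Adam([delta], lr=1e-2)

for step in range(200):
    poisoned = (source + delta).clamp(0, 1)
    # Pull the poisoned image's features toward the anchor concept.
    loss = F.mse_loss(embed(poisoned), target_feat)
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)           # keep the perturbation visually minor

poisoned = (source + delta).detach().clamp(0, 1)
# To a person, `poisoned` still looks like the cow; to the encoder, its features
# now resemble the handbag anchor, which is the mismatch the article describes.
```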

Using “poisoned” images in model training can lead to unpredictable outcomes. For instance, when asked to generate an image of a cat, the model might instead produce an image of a dog or a fish. Such unpredictable results make the models significantly less useful, encouraging developers to train only on freely offered data.

The authors of the study highlight Nightshade’s potential as a powerful tool for protecting intellectual property from illegal use. They also acknowledge that images processed with Nightshade may deviate slightly from the original, particularly in the case of artwork with flat colors and smooth backgrounds.

In addition to Nightshade, the team recommends using version 1.1.1 of Glaze. Glaze, available as a free download, alters images to prevent models from replicating an artist’s visual style. This is especially important because replicating an artist’s style can lead to a loss of income and dilute the artist’s brand and reputation.

The developers compare the replication of an artist’s style to mimicry and argue that it can discourage aspiring artists from creating new works. At present, Nightshade and Glaze must be downloaded and installed separately, but a combined version of both tools is under development.

Sources: reports, release notes, official announcements.