• AutoTL;DR@lemmings.world · 1 year ago

    This is the best summary I could come up with:


    A new tool called Nightshade lets artists apply it to their creative work; any AI model trained on that art then ingests corrupted, or "poisoned," training data.

    Over time, this can degrade future models behind AI art platforms like DALL-E, Stable Diffusion, and Midjourney, undermining their ability to generate usable images.

    The MIT Technology Review reported that Ben Zhao, a professor at the University of Chicago and one of Nightshade's creators, hopes the tool will tilt the balance of power away from AI companies that have used copyrighted data to train their models.

    The Nightshade research paper says the training data used in text-to-image AI models is vulnerable to exactly the kind of attack the tool carries out.
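
    For the curious, here is a rough sketch of how this kind of embedding-space poisoning works in general. This is not Nightshade's published algorithm; the toy encoder, function names, and hyperparameters below are illustrative assumptions only. The idea is to add a visually small perturbation to an image that pulls its feature embedding toward a different concept, so a model trained on it learns the wrong association.

```python
# Minimal sketch of embedding-space data poisoning (NOT Nightshade's actual
# algorithm). A toy random conv net stands in for a real image encoder; a
# real attack would target the feature extractor used in training.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-in for an image encoder producing 32-dim embeddings.
encoder = nn.Sequential(
    nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(16 * 8 * 8, 32),
)
encoder.requires_grad_(False)  # only the perturbation is optimized

def poison(image, target_embedding, eps=8 / 255, steps=100, lr=0.01):
    """Find a small perturbation that pulls the image's embedding toward
    a different concept while staying visually close to the original."""
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        emb = encoder((image + delta).clamp(0, 1))
        loss = nn.functional.mse_loss(emb, target_embedding)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change visually subtle
    return (image + delta).detach().clamp(0, 1)

# Example: nudge a "dog" image's features toward a "cat" embedding.
dog = torch.rand(1, 3, 32, 32)
cat_embedding = encoder(torch.rand(1, 3, 32, 32))
poisoned = poison(dog, cat_embedding)
print(torch.abs(poisoned - dog).max())  # perturbation stays within eps
```

    A model trained on enough such mislabeled-in-feature-space images starts associating the wrong visual features with a concept, which matches the paper's claim that text-to-image training pipelines are vulnerable to this class of attack.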

    Artists vengeful enough to go this route can also upload their work to Glaze, a tool made by Nightshade's creators that masks their art style, for example making a normally realistic drawing look cubist.

    Nightshade will be integrated into Glaze, letting users choose whether to deploy the poison pill or settle for simply preventing models from mimicking their art style.


    The original article contains 476 words, the summary contains 174 words. Saved 63%. I’m a bot and I’m open source!