Salamendacious@lemmy.world to News@lemmy.world · 8 months ago
Meet Nightshade, the new tool allowing artists to 'poison' AI models with corrupted training data (venturebeat.com)
cross-posted to: technology@hexbear.net, hackernews@derp.foo, technews@radiation.party
aliteral@lemmy.world · 8 months ago
I understand where you are coming from, but most AI models are trained without the consent of those whose work is being used. The same goes for GitHub Copilot: its training violated the licensing terms of various software licenses.
SCB@lemmy.world · 8 months ago
Then the response to that is laws, not vigilantism.
aliteral@lemmy.world · 8 months ago
I agree, but those laws need to be enforced, and no one is doing that.