Speaking as a creative who has also been paid for creative work, I’m a bit flustered at how brazenly people just wax poetic about the need for copyright law, especially when the creator or artist themselves is never really considered in the first place.

It’s not like ye olde piracy, which can even be ethical (like when videogames go unpublished and are almost erased from history), but a new form whereby small companies get to join large publishers in screwing over the standalone creator - except this time it isn’t by way of predatory contracts, but by sidestepping the creator entirely and farming their data to recreate the same style and form, which could’ve taken years - even decades - to develop.

There’s also this idea that “all work is derivative anyways, nothing is original”, but that sidesteps the point of having worked for nigh on decades to form a style and make a living off it, only for someone to come along and undo all that with the press of a button.

If you’re libertarian and anarchist, be honest about that. Seems like there are a ton of tech bros who are libertarian and subversive about it to feel smort (the GPL is important, btw). But at the end of the day the hidden agenda is clear: someone wants to benefit from somebody else’s work without paying them, and to find the mental and emotional justification for doing so. This is bad, because they then justify taking food out of somebody’s mouth, which is par for the course in the current economic system.

It’s just more proof that the capitalist system doesn’t work and will always screw the labourer in some way. It’s quite possible that only the most famous artists will be making money directly off their work in the future, much like musicians.

As an aside, Jay-Z and Taylor Swift complaining about not getting enough money from Spotify is tone-deaf, because they know they get the bulk of that money anyway - even the money from some account that only ever plays the same small bands - because of Spotify’s payout model. So the big names will always, always be more “legitimate” than small artists, and in that case they’ve probably already paid their writers and such. But maybe not… looking at you, Jay-Z.

If the copyright cases get overturned by that litigious lot known as corporate lawyers, and they manage to poke holes into legislation to benefit both IP farmers and corporate interests - by way of models that train AI to be “far enough” away from the source material - we might see a lot of people lose their livelihoods.

Make it make sense, Beehaw =(

  • DavidGarcia · 9 months ago

    What do you think should be the alternative then?

    The way I see it, you could 1) not have any models at all, which I think is shortsighted, 2) hand over exclusive control of these models to big tech companies that have the money to pay these artists, 3) make creative commons models that will probably never be able to compete with the big tech models, or 4) perhaps ban anything except creative commons models for personal use?

    I’d much rather AI models were freely available to everyone equally. The best compromise I could see is developing some legally binding metric that determines whether the output you want to use commercially is similar enough to some artist’s work that you have to reimburse them.

    • taanegl@beehaw.orgOP · 9 months ago (edited)

      Can’t put the genie back in the bottle, I guess =\ Seems the only really protected form is modern art, because nobody understands that anyways ^^;

      I’m thinking the problem of AI has to be solved with AI - that those decades of practice get replaced with AI training - and, like you said, having it generally available.

      But that too leaves an outlier, people who don’t want to work with AI. Their only option is to never digitally publish and make all their work bounce light so that cameras can’t capture it. It’d be physical DRM in a sense.

      I don’t really want to work with AI, because it takes away the process I love, but in the end we’re sort of forced to =\ It’s like the industrial and digital revolutions all over again. Some people (like me) will be dragged kicking and screaming into the future.

      • DavidGarcia · 9 months ago

        I think there will always be a market for real physical artists. Yeah, you can buy boxed wine, but people pay to get the real artisanal stuff. Pretty sure real art will become a similarly sought-after luxury product. If you really like the process and keep at it, you probably won’t have that much competition, because there will be fewer and fewer people with that skillset. There’s mass-manufactured Ikea furniture, but people still buy handmade tables for ridiculous prices.

        And who knows, maybe AI will grow on you too.

        Or you’ll be highly sought after once we finally inevitably ban AI lol.

        So the future isn’t all doom and gloom, if you ask me.

    • taanegl@beehaw.orgOP · 9 months ago

      But yeah, I’m no lawyer. I have no idea how to legally solve this problem, but I suspect that eventually no law can solve it - once the generated works become good enough to pass as the real thing, dropping prices so low that doing the work manually becomes a non-starter, or a hobby more or less.

      Humans were meant to work with their hands and minds :( now it’s all keyboards and screens. I tried to get away from that, but they all just keep pulling me back in!

      We’re being enslaved by our computers :(

      • DavidGarcia · 9 months ago

        The way I see it, art will just take on a completely different scale, with your average independent artist making their own LOTR trilogy or their own Cyberpunk 2077, or just VR world-building entire parallel universes.

        I too hate the corpo version of the metaverse, but I think the idea in general is a sound one, if you can craft it analogously to the fediverse. Powered by FOSS software, built by real passionate people for other regular people.

        I’ve always wanted to get into art, but the scales I would like to achieve are completely unrealistic at the moment, except for like a handful of people that made it to be a creative director on the biggest projects. There’s maybe 100 people in the world that get to do that. But AI could enable anyone to work on those scales.

        Imagine a world where literally anyone can meticulously craft their own virtual worlds and you can literally visit them, akin to the Elder Scrolls universe but real planet sized.

        Imagine actually being able to see your characters come to life and meet them. Control everything, from the way the buildings look to what the food is like. That is why I’m excited for AI and why I think we shouldn’t just ban it. I 100% get why artists are concerned, but then again, imagine your favorite artist could build a world like that. How insanely cool would that be? You can’t do that without AI.

        In my opinion AI is just a very efficient brush. Yeah, you can lazily pass off AI-generated art as your own, or you can meticulously craft art with tools like InvokeAI. I think what counts in the end is whether the end product is high quality and original. Just because the technology is widely abused doesn’t make it inherently bad. Beethoven isn’t bad just because there are a million people out there trying to scam you into buying their shitty low-effort, mostly stolen mixtape.

        I think AI will hugely empower independent artists to produce more and at a higher quality, more closely fitting their vision.

        But even so, I can also see a future where AI is devastating to humanity. The saving grace is that the chips needed to run these models are made by only a handful of companies, which could easily be regulated or destroyed. I could envision something similar to how machine guns (fully automatic guns) are regulated, but with AI: every AI model has to be registered and hardcoded on a chip, and there is only a very limited number of them. Only licensed individuals can use them, and if you aren’t licensed, law enforcement will fuck you up. This system works extremely well in the US. Or you just ban AI overall, which also seems like a realistic future.

        When you get into the physics of it, AI has the potential to be up to 3 million times smarter than us. E.g. thinking 3 million times faster. So there is a real case that we can never compete and we HAVE TO outlaw it if we want to survive.

        But then again maybe AI enables fully automated luxury space communism. Who knows.

        So I wouldn’t despair about it, I think there are just as many likely positive scenarios as there are bad ones.

        I would much rather we foster a culture that supports independent artists for their work voluntarily. I think we are already going in the right direction, with Patreon, Buy Me a Coffee, Teespring, etc. making it infinitely easier for independent creators to make money. We should be working to make that even easier - e.g. when sharing a picture to a platform like Lemmy, it could automatically find the author, link to all their socials, and integrate a button to donate to them right in the interface. Increasing P2P support is more my vision of the future for independent artists.

    • frog 🐸@beehaw.org · 9 months ago

      Destroy all existing AI datasets, as they’re irreparably tainted. Require all AIs, regardless of whether they’re owned by a company or are open source, to build new datasets exclusively from work that is in the public domain or for which the copyright owner has been consulted and compensated. If the megacorporations want to keep the models they already have, they must compensate the creator of every single piece in the training data at market rates - if they can’t afford to do it, then they either go bankrupt or destroy the tainted dataset. If anyone, company or individual, is caught training an AI with content for which they don’t have a valid licence, issue fines starting with 10% of global revenue, to be distributed to the people whose copyright they violated. Higher fines for repeat offenders.

      • Schmeckinger@feddit.de · 9 months ago

        Wouldn’t that make large corporations, who own more copyright, much more powerful, and the small guys less powerful?

        • frog 🐸@beehaw.org · 9 months ago

          Yes, but the solution isn’t to allow everyone to rip off artists. Because that results in the small guy creators being even less powerful - instead of only having to be cautious in their dealings with large corporations, they now have to contend with every single person on the planet using their stuff without consent or compensation.

          Even the large corporations that own a lot of content do not own enough to make a viable AI. These things take billions of images in the dataset to result in a model that’s halfway usable. No company owns that many, and they’d bankrupt themselves trying to buy that many. That’s why forcing them to pay is actually a viable solution. No existing company has copyright over billions of images.

          Oh, and obviously the legislation would have to be written to explicitly not give the likes of Google the ability to claim that by using their services, you consent to them harvesting your content to train an AI. “Can’t pay, can’t use” would have to apply to all content, globally, in a way that can’t be signed away through a sneaky ToS.

      • DavidGarcia · 9 months ago

        That sounds feasible.

        To be more specific, I would require that models be copyleft - probably GNU GPLv3 - so that big tech companies don’t get a monopoly on good models.

        Basically you can do what you want except change the license.

        • frog 🐸@beehaw.org · 9 months ago

          That sounds reasonable. It also makes room for artists who feel so inclined to offer their works into a training dataset, which is fine when it’s something they’ve specifically opted into.