• donuts@kbin.social · 1 year ago

    They’re gonna be in even bigger trouble when it’s determined that AI training, especially for content generation, is not fair use and they have to pay each and every person whose data they’ve used.

    • Sir_Kevin@discuss.online · 1 year ago

      A) Most companies jumping on the AI bandwagon are training their own models.

      B) The music industry has been legally using samples to create new songs since the ’90s.

      AI is here to stay.

      • donuts@kbin.social · 1 year ago

        A) Not true. Many have been training models on all kinds of online data that doesn’t belong to them, has not been licensed, and has been used without the informed consent of the rights holders.

        B) Terrible comparison. Music sampling is a grey area that is much more complex and dubious than you’re suggesting. There are instances in which sampling has been considered fair use, but outside of that there are strict laws around sampling. Finally, human music creation and sampling have very little in common with generative AI.

        AI is here to stay. But the free ride of scraping every piece of information in human history without even a basic regard towards intellectual property or personality rights is unsustainable, unethical, and nowhere near the threshold for what can be considered fair use.

        Once people start needing to own or license their training data sets, the technology will be just fine, but costs will rise dramatically and the VC investment bubble is going to pop big time.

      • fidodo@lemm.ee · 1 year ago

        What’s legal changes. There will absolutely be new AI-focused laws enacted, just like there were internet-focused laws once the Internet became very impactful. We simply have no idea how this will play out. Whatever new laws are passed will definitely not kill AI, though, since it’s a big business and US lawmakers will want AI companies to thrive so those services can be exported. People acting like AI will die for legal reasons are completely off base.

    • Jaded@lemmy.dbzer0.com · 1 year ago

      Ignoring the fact that training an AI is insanely transformative and definitely fair use, people would not get any kind of pay. The data is owned by websites and corporations.

      If AI training were to be highly restricted, Microsoft and Google would just pay each other for the data and pay the few websites they don’t own (Stack, GitHub, Reddit, Shutterstock, etc.). A bit of money would go to publishing houses and record companies, not enough for the actual artists to get anything over a few dollars.

      And they would happily do it, since they would be the only players in the game and could easily overcharge for a product that is eventually going to replace 30% of our workforce.

      Your emotional, short-sighted response kills all open source and literally hands our economy to Google and Microsoft. They become the sole owners of AI tech. Don’t be stupid, please. They want you to be mad; it literally only helps them.

    • fidodo@lemm.ee · 1 year ago

      And payment sharing will most likely be a percentage of revenue. Right now their biggest hurdle is just scaling, and it’s incredibly rare that a startup with huge demand completely fails because of scaling challenges. Once they scale, their profit margins will be huge; they’d be able to do payouts and still profit. But don’t get excited about payouts: they’ll probably amount to pennies, like they do on Spotify.

    • Meowoem@sh.itjust.works · 1 year ago

      AI is too useful and too powerful for any of the major players in world politics to put serious restrictions on it. Do you really think they’re going to risk Chinese and Russian AI giving them the economic and scientific edge?

      Yes, selfish people want to stop progress that could help everyone in the world get access to education, medical care, legal advice, social care, etc., because they think they’re owed twenty cents for the text they wrote. But thankfully society isn’t going to take them seriously. There are money-grubbers and antisocial people everywhere, looking for any chance to ruin things that could help others, and we ignore those people.