Because you would have to prove that the AI only learned from your work, and it’s my understanding that there is no way to track what was used as training material, or to have an AI unlearn something.
The people who are stealing art designed their algorithms so they don’t contain proof that they stole art. If they are legally required to disclose what training data they used in order to get a copyright, then they will design the AI around that. That would immediately disqualify most of the current AIs, because they have all been fed stolen art, but I am sure they have the tech and capital to start over. And you know, fuck ’em.
What if I used an open source algo with my own photographs as a dataset 🤔
Then absolutely go ahead. That isn’t what the guy in the post did, though.
I don’t see why you wouldn’t be able to keep copyright then. Everything involved would have been owned by you.
That is a big difference to how other generative models work though, which do use other people’s work.