Japan's laws allow the use of copyrighted content to train AI models. Even though artists are worried about their futures, some of the country's cultural and social traits may make AI easier to accept.
The whole “they need to get permission!” thing makes no sense. When I watch an MKBHD video I don’t need to get his permission to learn from him. I don’t need his permission to learn from his style. I don’t need permission from an artist to learn from their art style, their brush stroke technique, their colour science. I just look at it, watch it, read about it, and I learn. I can then use what I learned to make new stuff, and there is nothing that they can, or should be able to, do about it.
The same applies to AI. AI isn’t recreating the material it was trained on - it’s “learning” from it. It doesn’t take the Mona Lisa as training material and then output the Mona Lisa.
It does recreate the training material. There are literally loads of examples of it spitting out a degraded copy of an original piece of art when given specific enough prompts.
But that is the fault of the person who gave the AI those instructions.
But you don’t know what it was trained on, so you can’t say that with any certainty. And if it was, why would it produce a degraded copy? If it had been trained on the real thing, shouldn’t it replicate the real thing perfectly?
Spot on.
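For what it’s worth, both observations are consistent with lossy compression: a model that can’t store its training set verbatim can only reproduce it approximately. Here’s a minimal sketch in Python, using a rank-k SVD truncation as a stand-in for limited model capacity (an analogy only, not how image generators actually work):

```python
# Toy sketch: a rank-k SVD truncation stands in for a "model" whose
# capacity is smaller than its training data. With limited capacity
# (small k) it can only emit a degraded copy of the training image;
# only at full rank is the reproduction exact.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))   # stand-in for one training image

# "Train": decompose the image so we can keep only the top-k components.
U, s, Vt = np.linalg.svd(image, full_matrices=False)

for k in (4, 16, 64):          # k plays the role of model capacity
    reconstruction = (U[:, :k] * s[:k]) @ Vt[:k]
    rel_error = np.linalg.norm(image - reconstruction) / np.linalg.norm(image)
    stored = k * (64 + 64 + 1)  # values kept at rank k, vs 64*64 in the original
    print(f"rank {k:2d}: {stored:5d} stored values, relative error {rel_error:.3f}")
```

At full rank (k = 64) the “model” has enough capacity to store the image exactly and the error drops to zero; at smaller k it can only produce a degraded copy, which is roughly why memorised outputs from generative models tend to be approximate rather than pixel-perfect.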