• 1 Post
  • 33 Comments
Joined 1 year ago
Cake day: June 11th, 2023


  • Not really, it’s just that the sheer quantity of hours has been found to be less important than the original study presented. Essentially, with good aptitude and quality practice, you don’t actually need 10,000 hours to reach the top percentile.

    The author of this article seems to have taken this in some weird directions. They had the personal experience of being pressured to practice long hours at something they struggled with. They find relief in the new study, which they apparently believe validates the idea that it was a hopeless endeavor. I’d argue that the fault didn’t lie with the 10,000-hour number, but rather with their family, who pushed the author too hard to succeed in a sport they probably weren’t improving at, rather than reevaluating their motivations or approach.

    Of course 10,000 hours is arbitrary. I’m just saying the study doesn’t assert that inherent talent even exists, let alone that it is the primary factor. It only contradicts the number of hours.




  • Imo, the true fallacy of using AI for journalism or general text lies not so much in generative AI’s fundamental unreliability, but rather in its existence as an affordable service.

    Why would I want to parse through AI-generated text on times.com when, for free, I could talk to some of the most advanced AI via bing.com, OpenAI’s ChatGPT, Google Bard, or a Meta product? These, after all, are the back ends that most journalistic or general written-content websites are using to generate text.

    To be clear, I’m asking: why not cut out the middleman if they’re just serving me AI content?

    I use AI products frequently, and I think they have quite a bit of value. However, when I want new, accurate information on current developments, or really anything more reliable or deeper than a Wikipedia article, I turn exclusively to human sources.

    The only justification a service has for serving me AI-generated text is perhaps the promise of a custom-trained model with highly specific training data. I can imagine, for example, weather.com developing highly specialized AI models that tie into an in-house LLM and provide me with up-to-date, accurate weather information. My question in that case would be: why am I reading an article rather than just being given access to the LLM for a nominal fee? At some point, they are no longer a regular website; they are a vendor for an in-house AI.









  • I’m not anti-AI; I use generative AI all the time, and I actually come from a family of professional artists myself (though I am not one). I agree that it’s a useful tool; however, I disagree that it is not destructive or harmful to artists simply because it is most effective in their hands.

    1. It concentrates the power of creativity into firms that can afford to produce and distribute AI tools. While AI models are getting smaller, these small models frequently come with licensing restrictions (not copyright, but limits on utilizing the tools for profit). We have no defined roadmap for the democratization of these tools, and most signs point toward large compute requirements.

    2. It enables artists to effectively steal the intellectual labor of other artists. Just because you create cool art with it doesn’t mean it’s right for you to scrape a book or portfolio to train your AI. This is purely for practical reasons: artists today work their asses off to make the very product AI stands to consolidate and distribute for pennies on the dollar.

    You fail to recognize the possibility that I support AI but oppose its output being copyrightable, purely because firms would immediately utilize this to evade licensing work. Why pay top dollar for a career concept artist’s vision when you can pay a starting liberal arts grad pennies to use the Adobe suite to generate images trained on said concept artist’s work?

    Yes, that liberal arts grad deserves to get paid, but they also deserve some potential for career advancement.

    Now imagine instead if new laws required generative AI to license its inputs in order to sell for profit. Sure, small generative AI projects would still scrape the Internet to produce art, but it would create a whole new avenue for artists to create and license art. Advanced generative AI may need smaller datasets, and small teams of artists may be able to utilize and license boutique models.


  • I disagree with this reductionist argument. The article essentially states that because AI generation is the “exploration of latent space,” and photography is also fundamentally the “exploration of latent space,” the two are equivalent.

    It disregards the intention of copyright. The point isn’t to protect the sanctity or spiritual core of art. The purpose is to protect the financial viability of art as a career. It is an acknowledgment that capitalism, if unregulated, would destroy art and make it impossible to pursue.

    AI stands to replace artists in a way that digital art and photography never really did. It’s not a medium; it is inference. As such, if copyright was ever good to begin with, it should oppose AI until compromises are made.