• baltakatei@sopuli.xyz · 1 year ago

    Unless anti-trust law changes, Google will just buy ChatGPT and Stability to reduce competition and form a new monopoly.

      • kakes@sh.itjust.works · 1 year ago

        Well that’s just straight up not true.

        OpenAI owns ChatGPT. Microsoft is a partner, but not an owner.

        • huginn@feddit.it · 1 year ago

          49% ownership means they dictate what OpenAI does. Don’t kid yourself.

          • kakes@sh.itjust.works · 1 year ago

            Sure, but to say Microsoft owns OpenAI is still disingenuous without that disclaimer.

            • huginn@feddit.it · 1 year ago

              It’s a distinction in legal terms only.

              At 49% ownership, and being 100x the value of OpenAI, that is effectively the same as full control. OpenAI can’t blink without Microsoft getting right of first refusal.

            • huginn@feddit.it · 1 year ago

              He’s being serious: that’s exactly how Bing with ChatGPT replies.

              Which further illustrates why LLMs are incredibly niche tools of limited utility… and I say that as someone who uses them in my job every day.

              • YouMayBeOntoSomethin@lemmynsfw.com · 1 year ago

                I mentioned to someone that I ask ChatGPT things all the time and they were like, “Don’t you know it doesn’t actually know facts? It just spews bullshit that sounds plausible.”

                The joyous thing for me is that’s exactly why I’m using it: to generate plausible-sounding nonsense for Dungeons & Dragons. That has been one of my biggest use cases. Name generation is fantastic through it: “List 10 suggestions for epic sounding names for a tavern built into a cliffside in a deep elven rain forest” and then workshopping it from there.

                As a programmer, I also make pretty consistent use of GitHub Copilot… because half of programming is boilerplate that LLMs are really good at generating. Super useful for describing what kind of statically defined array I want without having to type out the whole thing myself. Or, and I think this is my favorite use, any time I need to translate from one data format to another, just describing my input and my desired output gets me a great starting point that I can refine.
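                (A hypothetical sketch of the kind of thing I mean, not my actual code: a throwaway CSV-to-JSON converter in Python, with made-up file names, which is roughly the starting point a one-line description gets you to refine.)

                    # Hypothetical example: translate one data format to another (CSV -> JSON).
                    # File names are invented for illustration.
                    import csv
                    import json

                    def csv_to_json(csv_path, json_path):
                        """Read rows from a CSV file and write them back out as a JSON array of objects."""
                        with open(csv_path, newline="") as f:
                            rows = list(csv.DictReader(f))  # each row becomes a dict keyed by the header row
                        with open(json_path, "w") as f:
                            json.dump(rows, f, indent=2)

                    csv_to_json("tavern_patrons.csv", "tavern_patrons.json")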

                But asking them for facts? Nah lol

                • huginn@feddit.it · 1 year ago

                  I’m also a programmer. I’ve found it’s pretty useless except for code that is very repetitive (test cases) or for documentation… But even there it’s a coin flip as to whether I’ll have to go in and correct it.

                  And there’s no indication that it’ll ever be better than that tbh. No matter what articles on MSN say.

                • kakes@sh.itjust.works · 1 year ago

                  Agreed 100% on all points. It’s an incredible tool, but just not for factual information.