BDSM, LGBTQ+, and sugar dating apps have been found exposing users’ private images, with some of them even leaking photos shared in private messages.

  • Balder@lemmy.world (↑119 ↓1) · 2 days ago

    Brace yourselves, because this is only going to get worse with the current “vibe coding” trend.

      • Vendetta9076@sh.itjust.works (↑84) · 2 days ago

        Vibe coding is the current trend of having an LLM build your codebase for you, then shipping it without attempting to understand it.

        Most developers use LLMs to some extent to speed up their coding, since tools like Cursor and Claude are really good at removing toil. But vibe coders have the LLM build the entire thing and don’t even know how it works.

        • ElectroVagrant@lemmy.world (↑44) · 2 days ago

          In other words, vibe coders are today’s technologically accelerated script kiddie.

          That’s arguably worse, as the produced scripts may largely work while demanding even less understanding than a script kiddie’s cobbling together of code ever did.

          • TeddE@lemmy.world (↑21 ↓3) · 2 days ago

            Large language models (LLMs) are the product of neural networks, a relatively recent innovation in the field of machine intelligence.

            Since these systems are surprisingly adept at producing natural-sounding language, and good at creating answers that sound correct (and sometimes actually are), marketers have seized on this innovation, called it AI (a term with a complicated history), and started slapping it onto every product.

          • qaz@lemmy.world (↑4) · edited · 2 days ago

            A machine learning model that can generate text.

            It works by converting pieces of text into “tokens”, which are mapped to numbers in a way that reflects their association with other pieces of text. The model is fed input tokens and predicts the next tokens based on them, which are then converted back to text.
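            That prediction loop can be sketched with a toy vocabulary. Everything here is invented for illustration: real models use subword vocabularies with tens of thousands of entries and a neural network in place of predict_next.

```python
# Toy sketch of the tokenize -> predict -> detokenize loop.
# The vocabulary and "model" are made up; real LLMs use learned
# subword vocabularies and a neural network to score the next token.

vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4}
ids_to_text = {v: k for k, v in vocab.items()}

def tokenize(text):
    """Map each word to its numeric token ID."""
    return [vocab[word] for word in text.split()]

def predict_next(ids):
    """Stand-in for the neural network: just cycles through the vocab."""
    return (ids[-1] + 1) % len(vocab)

def generate(prompt, n_tokens):
    """Feed input tokens, repeatedly predict the next one, detokenize."""
    ids = tokenize(prompt)
    for _ in range(n_tokens):
        ids.append(predict_next(ids))
    return " ".join(ids_to_text[i] for i in ids)

print(generate("the cat", 3))  # → "the cat sat on mat"
```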

          • qyron@sopuli.xyz (↑4 ↓1) · edited · 2 days ago

            Large Language Model

            To the extent of my understanding, it is a slightly more sophisticated bot, as in an automated response algorithm, that is trained over a set of data in order to have it “understand” the mechanics that make that set cohesive to us humans.

            With that background, it is supposed to produce similar new outputs when given new raw data to run through the mechanics it acquired during training.

          • spooky2092@lemmy.blahaj.zone (↑8) · 2 days ago

            Boring/repetitive work. For example, I regularly use an AI coding assistant to stub out our basic loop templates with variables filled in, or have it quickly finish multiple case statements or assign values to an object with a bunch of properties.

            In little things like that, it’s great. But once you get past a medium-sized function, it goes off the rails. I’ve had it make up parameters to stock library functions based on what I asked it for.

      • Little8Lost@lemmy.world (↑7) · 2 days ago

        It’s going to be 1 GB of node_modules handled by garbage AI code.
        AI is only good at smaller scripts; it loses the connections and understanding in larger codebases. Combine that with people who can’t program well (not only coding, but debugging as well), also called vibe programmers, and it’s going to be a mess.

        If a product claims it was vibe coded: find an alternative!

        • msage@programming.dev (↑9) · 2 days ago

          I’m losing my will to live lately at an alarming rate.

          I used to love IT, way back at the start of the ’00s.

          Soon after the ’10s started, I noticed bullshit trends replacing one another… like crypto or cloud or SaaS… but now with AI I just feel alienated. Like we’re all going to hell, and I hate having front-row seats.

          • Balder@lemmy.world (↑4) · 2 days ago

            At this point, I think it’s necessary to have a sort of alternate identity online and to keep anything private, like photos of yourself and other personal information, offline. Except for government stuff, which requires your real identity.

            • msage@programming.dev (↑2) · 1 day ago

              I mean, yeah, I self-host everything, but I hate that I have to learn and support the most useless shit ever just to earn a living.

              It used to be fun being a dev; now I’m just repeating the same warning phrases about technologies.

  • MissGutsy@lemmy.blahaj.zone (↑169 ↓1) · edited · 3 days ago

    Cybernews researchers have found that BDSM People, CHICA, TRANSLOVE, PINK, and BRISH apps had publicly accessible secrets published together with the apps’ code.

    All of the affected apps are developed by M.A.D Mobile Apps Developers Limited. Their identical architecture explains why the same type of sensitive data was exposed.

    What secrets were leaked?

    • API Key
    • Client ID
    • Google App ID
    • Project ID
    • Reversed Client ID
    • Storage Bucket
    • GAD Application Identifier
    • Database URL

    […] threat actors can easily abuse them to gain access to systems. In this case, the most dangerous of leaked secrets granted access to user photos located in Google Cloud Storage buckets, which had no passwords set up.

    In total, nearly 1.5 million user-uploaded images, including profile photos, public posts, profile verification images, photos removed for rule violations, and private photos sent through direct messages, were left publicly accessible to anyone.

    So the devs were inexperienced in secure architecture and put a bunch of stuff on the client that should have been on the server side. This left anyone free to use their API to access every picture on their servers. They then built multiple dating apps on this faulty infrastructure by copy-pasting it everywhere.
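    The danger of a leaked bucket name is easy to see: for a bucket with no access rules, knowing the name alone is enough to construct working download URLs. A minimal sketch, with a hypothetical bucket name and object path:

```python
from urllib.parse import quote

def firebase_object_url(bucket: str, object_path: str) -> str:
    """Build the public Firebase Storage download URL for an object.
    If the bucket's rules allow unauthenticated reads, this URL works
    for anyone, with no password or token required."""
    return (
        "https://firebasestorage.googleapis.com/v0/b/"
        f"{bucket}/o/{quote(object_path, safe='')}?alt=media"
    )

# Hypothetical values of the kind found in a leaked client config:
print(firebase_object_url("example-app.appspot.com", "users/123/photo.jpg"))
```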

    I hope they are registered in a country with strong data privacy laws, so they have to face the consequences of their mismanagement.

      • sugar_in_your_tea@sh.itjust.works (↑11 ↓1) · 2 days ago

        No, it’s lack of experience. When I was a junior dev, I had a hard enough time understanding how things worked, much less understanding how they could be compromised by an attacker.

        Junior devs need senior devs to learn that kind of stuff.

        • PumaStoleMyBluff@lemmy.world (↑2) · 1 day ago

          It does help if services that generate or store secrets and keys display a large warning that they should be kept secret every time they’re viewed, no matter the experience level of the viewer. But yeah, understanding why and how isn’t something that should be assumed of new devs.

    • taiyang@lemmy.world (↑20) · 2 days ago

      I’ve met the type who run businesses like that, and they likely do deserve punishment for it. My own experience involved someone running gray-legality betting apps; the owner was a cheapskate who used unpaid interns and outsourced Filipino workers to build his app. The guy didn’t even pay ’em sometimes.

      Granted, you could also hire inexperienced people if you’re a good person with no financial investor, but I’ve mostly seen that with education apps and other low-profit endeavors. Sex stuff is definitely someone trying to score cash.

    • Flax@feddit.uk (↑29 ↓4) · 3 days ago

      Do you reckon this app could have been vibe coded / a product of AI? Or built with massive use of AI in development? I’d have known not to do this as a teenager just beginning to tinker with making apps, never mind as an actual business.

      • taladar@sh.itjust.works (↑69) · 2 days ago

        I know for a fact that a lot of applications made these mistakes before AI was around, so while AI is a possibility, it is absolutely not necessary.

        • yoshman@lemmy.world (↑40) · 2 days ago

          I had a test engineer demand an admin password be admin/admin in production. I said absolutely not and had one of my team members change it to a 64-character password generated in a password manager. Dumbass immediately logs in and changes it to admin again. We found out when part of the pipeline broke.

          So, we generated another new one, and he immediately changed it back to admin again. We were waiting for it the second time and immediately called him out on the next stand-up. He said he needs it to be admin so he doesn’t have to change his scripts. picard_facepalm.jpg
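          For reference, generating a 64-character password like that takes only a couple of lines with Python’s standard secrets module:

```python
import secrets
import string

# Draw 64 characters from letters, digits, and punctuation using a
# cryptographically secure random source (unlike the random module).
alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(64))
print(len(password))  # 64
```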

          • Serinus@lemmy.world (↑22) · 2 days ago

            How is he not fired? Incompetence and ignorance are one thing, but when you combine them with what is effectively insubordination… well, you had better be right. And he is not.

            • Pika@sh.itjust.works (↑8) · edited · 2 days ago

              Firmly agree. I don’t believe he should have had access to change these passwords in the first place, unless I’m misunderstanding their definition of “test engineer”. But if OP had the authority and permission to change the password, and that person deliberately changed it back to the insecure one, management would get involved and there would be some sort of reprimand, because that’s past ignorance; that’s negligence.

              • yoshman@lemmy.world (↑3) · 2 days ago

                It was an admin account used for regression testing of the admin interface and functions before prod releases.

                I had my guys enable/disable the account around the testing pipeline so people can’t log in outside of it.

                • sugar_in_your_tea@sh.itjust.works (↑1) · 2 days ago

                  Why would you run regression tests on prod? And why would you care what the password is on staging environments?

                  We have our lower environments (where all testing happens) on a VPN completely separated from prod, and testing engineers only ever touch those lower environments. Our Ops team manages all admin prod accounts, and those are completely separate from lower environment accounts.

                  So I guess I’m struggling to understand the issue here. Surely you could keep a crappy password for pre-prod testing? We even create a copy of prod as needed and change the admin accounts if there’s something prod-specific.

            • yoshman@lemmy.world (↑5) · 2 days ago

              He was a subcontractor, so technically, he’s not our employee.

              I bubbled it up the chain on our side, and it hasn’t happened since.

          • Pika@sh.itjust.works (↑6) · 2 days ago

            My main question in this is: why does a test engineer have the credentials to change an admin password in production? I get that he needs to test things, but I doubt he needs access to change profile/account settings.

            • yoshman@lemmy.world (↑2) · 2 days ago

              He had to do admin functionality regression tests before prod releases to make sure nothing broke.

              The system uses SSO for logins for everything else.

              He is a subcontractor who was reusing scripts across all his projects. I told him he really needs to use env vars for creds.
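              Reading creds from env vars instead of hard-coding them is a small change; a sketch, where the variable name is a made-up example:

```python
import os

def get_credential(name: str) -> str:
    """Read a credential from the environment instead of hard-coding it.
    Failing fast on a missing variable beats silently using a default."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"environment variable {name} is not set")
    return value

# Hypothetical usage; in practice the value is exported in the shell or CI,
# never assigned in the script itself (this line only stands in for that).
os.environ["ADMIN_PASSWORD"] = "example-only"
print(get_credential("ADMIN_PASSWORD"))  # example-only
```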

    • azalty@jlai.lu (↑6) · 2 days ago

      The illusion of choice

      A lot of “normal” dating apps are also owned by the same companies

    • Rexios@lemm.ee (↑1) · 2 days ago

      Every single one of those “secrets” is publicly available information for every single Firebase project. The real issue is the developers didn’t have proper access control checks.
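      Right: a Firebase client config is meant to ship with the app, so safety hinges entirely on access rules or server-side checks gating each read. A minimal sketch of the kind of ownership check that was evidently missing; the data model is invented for illustration:

```python
# Server-side sketch: the client never reads the bucket directly; it asks
# the server, which checks ownership before handing anything back.
# The photo-to-owner table and user IDs are invented for illustration.

photo_owners = {
    "photos/abc.jpg": "user_1",
    "photos/def.jpg": "user_2",
}

def can_read(requesting_user: str, photo_path: str) -> bool:
    """Only the photo's owner may read it; unknown photos are denied."""
    return photo_owners.get(photo_path) == requesting_user

print(can_read("user_1", "photos/abc.jpg"))  # True
print(can_read("user_1", "photos/def.jpg"))  # False
```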

  • CheeseToastie@lazysoci.al (↑70 ↓2) · 2 days ago

    This is devastating. People in the LGBT community are often hiding their true selves because of family, colleagues, culture, etc. People will be destroyed.

  • PumaStoleMyBluff@lemmy.world (↑13) · 2 days ago

    Anyone who uses Grindr, please be aware that any photos you send are cached and stored unencrypted in plain old folders on the receiver’s phone, regardless of whether they were expiring or in an album that you later revoked. It’s nearly trivial to grab any photo someone sends you, with no watermark or screenshot notification.

  • azalty@jlai.lu (↑17) · edited · 2 days ago

    Use Signal or SimpleX for more private stuff like this 👀

  • thatradomguy@lemmy.world (↑3 ↓7) · 2 days ago

    Just don’t send nudes… why do people think other people won’t figure out how to screenshot or just keep photos forever? Even if you trust the person, they could get hacked… the pwned guy got pwned, for Jehovah’s sake. Just stop sending that shit.