Tom Hanks has warned fans that an ad for a dental plan that appears to use his image is in fact fake and was created using artificial intelligence.

In a message posted to his 9.5 million Instagram followers, the actor said his image was used without his permission. “BEWARE!! There’s a video out there promoting some dental plan with an AI version of me. I have nothing to do with it,” Hanks wrote over a screenshot of a computer-generated image of himself from the clip.

The Oscar winner has expressed concerns in the past about the use of AI in film and TV, although he has not shied away from approving digitally altered versions of himself in film.

  • MTLion3@lemm.ee · 79 points · 10 months ago

    Aaaand it’s happening, just as we all predicted. Stealing likenesses in a whole new way.

    • MeccAnon@kbin.social · 24 points · edited · 10 months ago

      Right? I remember watching an AI-generated video a while back of an actress - I think it was Kristen Stewart - doing a monologue. It was eerily indistinguishable from reality. This is happening, and actors have every right to be upset about it until proper compensation rules are in place.

      • FigMcLargeHuge@sh.itjust.works · 23 points · 10 months ago

        I think this goes deeper than just actors’ compensation. This will take things to a new level when it hits courtrooms. Imagine sitting there watching a video of yourself doing something you never actually did, entered into evidence.

      • BolexForSoup@kbin.social · 17 up / 1 down · 10 months ago

        Even as a professional editor of over a decade who is actively looking for them, it is becoming increasingly difficult to tell what is AI-generated and what is real. I’m right most of the time, but “most of the time” is only about 75% of the time. And again, that’s when I am actively looking. And the tech is only getting better.

      • BraveSirZaphod@kbin.social · 5 up / 2 down · 10 months ago

        Society is going to have to adjust to actually demand some proof of authenticity when it comes to content like this.

        The good news is that techniques like public-private key cryptography do actually provide a way to do this, so at least on the technical side, this is a solvable problem. The harder part is getting people to question content that they want to be true, like political propaganda that affirms their own beliefs and biases.
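        The public-private key idea can be sketched in a few lines of Go (the ed25519 and sha256 packages are in the standard library; the publisher/viewer workflow here is an assumed deployment, not an existing standard):

        ```go
        // Sketch: a publisher signs a video's hash with Ed25519 so anyone can
        // verify it really came from them. The "publisher" flow is an assumption
        // about how such a scheme could work, not a description of a real system.
        package main

        import (
        	"crypto/ed25519"
        	"crypto/sha256"
        	"fmt"
        )

        func main() {
        	// Generated once by the publisher; the public key is shared
        	// out-of-band, e.g. on a verified profile.
        	pub, priv, err := ed25519.GenerateKey(nil) // nil reader = crypto/rand
        	if err != nil {
        		panic(err)
        	}

        	clip := []byte("raw bytes of the real video")
        	digest := sha256.Sum256(clip)

        	// The signature ships alongside the video.
        	sig := ed25519.Sign(priv, digest[:])

        	// Viewers verify against the publisher's known public key.
        	fmt.Println("authentic:", ed25519.Verify(pub, digest[:], sig)) // true

        	// Any altered or AI-generated byte stream fails the check.
        	fake := sha256.Sum256([]byte("deepfake bytes"))
        	fmt.Println("deepfake passes:", ed25519.Verify(pub, fake[:], sig)) // false
        }
        ```

        The crypto is the easy half; the hard half, as the comment says, is getting people to demand and check signatures at all.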

        Just imagine the mess we’ll be in when you can just generate an unlimited amount of videos of some disliked minority committing fake crimes and send them directly to people that you know will be receptive to radicalization, since you’ve already identified them through data brokers and targeted advertising.

        Maybe this is just me getting older - hell, it probably is - but I’m getting more and more detached from tech in general and trying to find more meaning and enjoyment in real-life interaction, community, friendships, and connection, as well as more physical hobbies. I’m not convinced that humans are really equipped to mentally handle the world we’re creating, and I’m finding myself not wanting much to do with it.

        • MTLion3@lemm.ee · 1 point · 9 months ago

          Finding meaning away from tech is always a good idea. It helps keep us from getting completely sucked into the vortex of Silicon Valley.

  • magnetosphere@kbin.social · 57 up / 2 down · edited · 10 months ago

    …although he has not shied away from approving digitally altered versions of himself in film.

    Besides being irrelevant, does this seem a little bit judgmental to anyone else?

  • flossdaily@lemmy.world · 51 points · 10 months ago

    Between voice cloning, GPT-4, and social media, the technology exists TODAY for scammers to call you at 4am with the voice and intimate knowledge of a loved one, and tell you they need you to send money for an emergency.

    You thought old people were easy to scam before? We’re about to enter a golden age of manipulation.

    • helio@sopuli.xyz · 12 points · 10 months ago

      My grandmother is a Greek immigrant and doesn’t speak English very well. Back in 2017, someone called her and told her that my mother and I were dead, and she spent something like three days freaking out and crying. I can only imagine what would happen if someone did that today while emulating a voice she knew…

    • drdalek@infosec.pub · 2 points · 10 months ago

      I read something recently about a study finding that teens are the group most susceptible to getting scammed online.

      • Laticauda@lemmy.ca · 1 point · 10 months ago

        I’m guessing that’s down to sheer numbers; there are going to be more teens online than any other demographic.

    • LedgeDrop@lemm.ee · 2 points · 10 months ago

      … Or you just have a word/phrase known between yourself and the other person.

    • camr_on@lemmy.world · 1 point · 10 months ago

      Didn’t this happen recently? I thought I saw an article about a guy in India who got an AI-faked video call from a “friend” who needed a quick money transfer to get out of a tough spot. He said it was sketchy but believable. It’s already here.

    • bobman@unilem.org · 1 up / 1 down · 10 months ago

      And the answer is to get off the internet and start trusting those close to you more than digital strangers.

    • DrM@feddit.de · 2 up / 7 down · 10 months ago

      Yes, the technology exists. No, it’s not a threat to your grandma. Scammers would first need to know which phone number is your grandma’s, then they’d need to find her relatives, obtain enough sample data of your voice, and train an AI model for at least a few hours to imitate it. That’s not a realistic amount of effort for a slim chance at a few thousand bucks. This kind of social engineering attack is only viable against very rich people and businesses.

      • flossdaily@lemmy.world · 12 points · 10 months ago

        I’m sorry, but your assessment of how difficult that would be is WAAAAAY off.

        Scammers are already doing stuff like this en masse with highly customized email scams.

        The way this scam would work is to start with YouTubers, where grabbing the voice data is easy. Then you find their Facebook profile… Very easy, since people use the same usernames, or they go out of their way to link their profiles.

        It’s a pretty easy step to send friend requests to those people, and then a very easy leap to find their relatives’ real names and towns through their Facebook connections.

        Now you take their connections and towns and do reverse phone number lookups.

        ALL of this can be automated. Every step.

        The voice cloning and gpt-powered phone calls can be automated now, too.

        The only reason this isn’t happening at scale is that scammers haven’t had enough time to adapt yet.

        • guacupado@lemmy.world · 1 up / 2 down · 10 months ago

          It’s weird that you talk about how easy it is, but your only example involves very public people, where all you need is a Google search to get their info.

          • deeroh@lemmy.sdf.org · 4 points · 10 months ago

            I’d guess that most people with public social media accounts would be susceptible to something like this. As long as there are videos available of the person speaking, which are plentiful by way of Instagram reels and TikToks, the rest of what the commenter described above sounds totally feasible.

          • flossdaily@lemmy.world · 1 point · 9 months ago

            You don’t need another example if you understand how many people are covered in that category.

            Do you have any idea how many people have at least 5 minutes of audio on YouTube? (That’s all you need for voice cloning.) Tens of millions? Hundreds of millions? And how many of them have a Facebook, Insta, TikTok, or Twitter account? Virtually all of them.

            If you wrote a script to do what I outlined, it would run FOREVER, because new users would be signing up and making videos faster than the script could ever hope to keep up.

      • Rai@lemmy.dbzer0.com · 0 points · 10 months ago

        People are reaaally downvoting you, but how would someone call my G-Ma and imitate my voice using AI?! My voice isn’t on the internet. That’s an insane thing for any regular person to fear.

  • float@feddit.de · 10 points · 10 months ago

    The good thing about this is that people might start questioning whether a product really is “better” just because their favorite actor says so - since he was paid to say it.

    Imho, he doesn’t need to warn his fans; they aren’t affected by this at all. Maybe the toothpaste is even a bit cheaper than one from a company that actually paid a (probably very pricey) Hollywood star for its ad. He’s the victim, not his fans.

  • AllonzeeLV@lemmy.world · 12 up / 2 down · edited · 10 months ago

    To be fair, if Skynet hunts humanity down, the terminators sounding like America’s Sweetheart Tom Hanks does come as some consolation.

    “Reach for the sky!”

    Anything for you, TH!

  • treefrog@lemm.ee · 6 up / 1 down · 10 months ago

    Is using an actor’s likeness without their permission copyright infringement?

    • anewbeginning@lemmy.world · 8 points · 10 months ago

      In Portugal, the law forbids taking pictures of a person in public places without consent if the person is the main subject (it’s okay if you’re filming something else and the person happens to pass by). I used to think it was too limiting a law, but now I think everyone will need this sort of legal protection.

    • Gradenko@lemm.ee · 2 points · 10 months ago

      There are rules in Hollywood that say they can’t use an actor’s likeness without permission, but obviously the people who made this ad don’t care about that. I think this is an area the law doesn’t cover yet, although it should.

      Although using fake Shemps was somewhat common throughout the 20th century, Screen Actors Guild contracts ban reproducing an actor’s likeness unless the original actor gave permission to do so, largely because of a lawsuit filed by Crispin Glover — following his replacement by Jeffrey Weissman in Back to the Future Part II — that determined that the method violates the original actor’s personality rights. The method continues to be used in cases, such as Shemp’s, where the original actor is deceased and permission from the deceased actor’s estate is granted.

      https://en.wikipedia.org/wiki/Fake_Shemp

    • BraveSirZaphod@kbin.social · 2 points · 10 months ago

      Copyright is very strictly for creative works, which your likeness is not.

      You might be able to stretch trademark law into applying here, but my understanding is that, at least at the federal level, there’s not really much of a legal framework for dealing with this sort of stuff yet. Hopefully we’ll get something soon.

      • FatCrab@lemmy.one · 1 point · 10 months ago

        Likeness rights are state-based and accordingly vary state to state. As usual with such things, you can just assume CA and NY are the “prevailing” law on it.

  • Cyborganism@lemmy.ca · 5 up / 1 down · 10 months ago

    Tom Hanks and some woman going

    DENTAL PLAN!

    Lisa needs braces!

    DENTAL PLAN!

    Lisa needs braces!

    DENTAL PLAN!

    Lisa needs braces!

  • Ddhuud@lemmy.world · 3 up / 1 down · 10 months ago

    It was a calculated risk, they just suck at math.

    It’s definitely gonna cost them a number greater than 1 multiplied by what they would have had to spend to actually contact Tom Hanks.

    • Ignisnex@lemmy.world · 2 points · 10 months ago

      I heard that while reading it. Like when someone says “Good news, everyone!” and it sounds like Professor Farnsworth. God, I haven’t thought about that episode in ages.