Fake videos of real people — and how to spot them | Supasorn Suwajanakorn

Look at these images. Now, tell me which Obama here is real.

(Video) Barack Obama: To help families refinance their homes, to invest in things like high-tech manufacturing, clean energy and the infrastructure that creates good new jobs.

Supasorn Suwajanakorn: Anyone? The answer is none of them. (Laughter) None of these is actually real. So let me tell you how we got here.

My inspiration for this work was a project meant to preserve our last chance for learning about the Holocaust from the survivors. It's called New Dimensions in Testimony, and it allows you to have interactive conversations with a hologram of a real Holocaust survivor.

(Video) Man: How did you survive the Holocaust?

(Video) Hologram: How did I survive? I survived, I believe, because providence watched over me.

SS: Turns out these answers were prerecorded in a studio. Yet the effect is astounding. You feel so connected to his story and to him as a person. I think there's something special about human interaction that makes it much more profound and personal than what books or lectures or movies could ever teach us.

So I saw this and began to wonder, can we create a model like this for anyone? A model that looks, talks and acts just like them? So I set out to see if this could be done and eventually came up with a new solution that can build a model of a person using nothing but these: existing photos and videos of a person. If you can leverage this kind of passive information, just photos and video that are out there, that's the key to scaling to anyone.

By the way, here's Richard Feynman, who in addition to being a Nobel Prize winner in physics was also known as a legendary teacher. Wouldn't it be great if we could bring him back to give his lectures and inspire millions of kids, perhaps not just in English but in any language? Or if you could ask our grandparents for advice and hear those comforting words even if they're no longer with us? Or maybe using this tool, book authors, alive or not, could read aloud all of their books for anyone interested. The creative possibilities here are endless, and to me, that's very exciting.

And here's how it's working so far. First, we introduce a new technique that can reconstruct a highly detailed 3D face model from any image without ever 3D-scanning the person. And here's the same output model from different views. This also works on videos, by running the same algorithm on each video frame and generating a moving 3D model. And here's the same output model from different angles.

It turns out this problem is very challenging, but the key trick is that we are going to analyze a large photo collection of the person beforehand. For George W. Bush, we can just search on Google, and from that, we are able to build an average model, an iteratively refined model, to recover the expression in fine details like creases and wrinkles. What's fascinating about this is that the photo collection can come from your typical photos. It doesn't really matter what expression you're making or where you took those photos. What matters is that there are a lot of them.

And we are still missing color here, so next, we develop a new blending technique that improves upon a single averaging method and produces sharp facial textures and colors. And this can be done for any expression.

Now we have a controllable model of a person, and the way it's controlled now is by a sequence of static photos. Notice how the wrinkles come and go, depending on the expression. We can also use a video to drive the model.

(Video) Daniel Craig: Right, but somehow, we've managed to attract some more amazing people.

SS: And here's another fun demo. So what you see here are controllable models of people I built from their internet photos. Now, if you transfer the motion from the input video, we can actually drive the entire party.

George W. Bush: It's a difficult bill to pass, because there's a lot of moving parts, and the legislative processes can be ugly.

(Applause)

SS: So coming back a little bit, our ultimate goal, rather, is to capture their mannerisms, or the unique way each of these people talks and smiles. So to do that, can we actually teach the computer to imitate the way someone talks by only showing it video footage of the person? And what I did exactly was, I let a computer watch 14 hours of pure Barack Obama giving addresses. And here's what we can produce given only his audio.

(Video) BO: The results are clear. America's businesses have created 14.5 million new jobs over 75 straight months.

SS: So what's being synthesized here is only the mouth region, and here's how we do it. Our pipeline uses a neural network to convert an input audio into these mouth points.

(Video) BO: We get it through our job or through Medicare or Medicaid.

SS: Then we synthesize the texture, enhance details and teeth, and blend it into the head and background from a source video.

(Video) BO: Women can get free checkups, and you can't get charged more just for being a woman. Young people can stay on a parent's plan until they turn 26.

SS: I think these results seem very realistic and intriguing, but at the same time frightening, even to me. Our goal was to build an accurate model of a person, not to misrepresent them. But one thing that concerns me is its potential for misuse. People have been thinking about this problem for a long time, since the days when Photoshop first hit the market.

As a researcher, I'm also working on countermeasure technology, and I'm part of an ongoing effort at AI Foundation, which uses a combination of machine learning and human moderators to detect fake images and videos, fighting against my own work. And one of the tools we plan to release is called Reality Defender, which is a web-browser plug-in that can flag potentially fake content automatically, right in the browser.

(Applause)

Despite all this, though, fake videos could do a lot of damage, even before anyone has a chance to verify, so it's very important that we make everyone aware of what's currently possible, so we can have the right assumptions and be critical about what we see.

There's still a long way to go before we can fully model individual people and before we can ensure the safety of this technology. But I'm excited and hopeful, because if we use it right and carefully, this tool can allow any individual's positive impact on the world to be massively scaled and really help shape our future the way we want it to be.

Thank you.

(Applause)
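To make the pipeline description above concrete, here is a deliberately simplified sketch of the "audio to mouth points" step. Everything in it is an assumption made for illustration: the data is synthetic, the dimensions are invented, and plain ridge regression stands in for the neural network the talk describes; a real system would use learned audio features and a sequence model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for real data (assumption: a real system would use MFCC-style
# audio features and tracked mouth landmarks; here both are random).
n_frames = 500
n_audio = 28            # audio feature dims per frame
n_points = 18           # 2D mouth landmarks -> 36 coordinates
context = 5             # audio frames of context on each side

audio = rng.normal(size=(n_frames, n_audio))
mouth = rng.normal(size=(n_frames, n_points * 2))

def windowed(features, k):
    """Stack a sliding window of 2k+1 frames so each prediction sees context."""
    pad = np.pad(features, ((k, k), (0, 0)), mode="edge")
    return np.hstack([pad[i : i + len(features)] for i in range(2 * k + 1)])

X = windowed(audio, context)              # (n_frames, n_audio * (2k+1))

# Compress mouth shapes with PCA so the model predicts a few coefficients.
mean = mouth.mean(axis=0)
_, _, Vt = np.linalg.svd(mouth - mean, full_matrices=False)
basis = Vt[:8]                            # keep 8 principal mouth shapes
Y = (mouth - mean) @ basis.T              # (n_frames, 8)

# Ridge regression: the simplest possible audio -> mouth-shape mapping.
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

pred_coeffs = X @ W
pred_mouth = pred_coeffs @ basis + mean   # back to landmark coordinates
print(pred_mouth.shape)                   # (500, 36)
```

The point of the sketch is only the shape of the problem: a window of audio features per video frame goes in, a low-dimensional mouth shape comes out, and that shape is decoded back to landmark coordinates before texture synthesis and blending.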


  1. Either a law should be made against this, or people should start not believing such kinds of videos. No reaction to such videos.

  2. How scary would it be if people were sending death threats like this: splicing a victim with murder footage, or faking deaths, or faking foreign affairs. GG

  3. The real criminals are sending their messages so we do not believe the real news and their acts. Acts against humanity. Enjoy the show. Pain is coming.

  4. His video is with Obama, almost to let us know, "Yes, I can take a political leader and make him say what I want."

  5. Hijacking one of your photos from the Internet with AI graphic-simulation software will do; no need to hire a lousy actor, actress, plus victim for fake major mainstream government headline news like CNN > see people do not exist < https://youtu.be/SFWcS1aNXZg

  6. Can anyone help me? I was in a video that was altered, I believe; the time of events was switched. Will anyone who has the expertise take a look at my video? Thank you, all who read this, and God bless. And for anyone who feels in their heart to assist, it will be highly appreciated.

  7. Yes, there are fake videos out there, but there are also real videos of political figures and elites that are being claimed as fake when in reality they are real.

  8. So with over half a million viewers, I don't get a reply? Really, I wanted to see how good this so-called technology is. Melvin

  9. Last thing: make a neural-network-based independent AI which will read the news like Eliza Cassan (from Deus Ex) – saves money on reporters 😀

  10. this technology will also help preserve the testimonies of Palestinian survivors for the future; the technology is amazing!

  11. all of his reasoning seems so naive… it's like watching the beginning of one of those movies where the tech destroys humanity at the end.

  12. I can't find any useful application for humanity with this technology, but I can find many potential threats with it. The future will be pretty fucked up.

  13. I'm tired of so much emphasis on Holocaust survivors, when 2 million Americans are being tortured right now every year in US police-state prison camps and nobody gives a damn. I know this, because I was one. I filed grievances on torture and appealed them all the way to the top, and they were all ignored. I won my appeal and walked out the front door as one of the few to escape. Maybe put a little more focus on people currently being tortured by statism, rather than people who were tortured by statism, while they are still alive and can be freed.

    It's not just the United States, either… many countries have vast underground prison systems and gulags and prison camps hidden out of view.

  14. You guys are so transparent. You've got the whole wrecking crew up there, Hillary and Obama, all the regular criminals. You're trying to put it in the public consciousness that when some damning videos of these criminals doing some shady s*** come out, they can all claim it's a deepfake. Come on, so transparent. People aren't as stupid as you think they are, but you might be a lot more stupid than we think you are.

  15. Why would you do this kind of evil work? It is only going to create problems. Great. Do you see money? Greed? It wouldn't be great.

  16. Notice that several videos have come out recently about "Deep Fake"
    videos and also Obama and Hillary are out warning people about them.
    Hmmm … why? Why now? What's coming out? What will make famous/powerful
    people cry "FAKE VIDEO!" ?

  18. Just think of all the photos you have uploaded to Google or iCloud. He said the quality or angle of the pic doesn't matter, as long as there are a lot of them.

  19. why not invest the money and time in clean and free energy for everyone, instead of in stuff that is 90% likely to lead to a third world war?

  20. It's great, but what's the direction of all of this? WTF, this is clearly going in a shady, wrong direction (the only positive side of this is a fun meme, a fun joke on social media, maybe, but all the rest is clearly for other ends). So it's way too risky for no actual benefit (but it will be built and used anyway, so holy shezzus).

  21. So we have been lied to all of our lives, but when certain subjects are questioned, are you labeled a logical thinking person? Nope, you are called a terrorist or a member of a hate group. Seems to me violence is the way we are being kept away from the old lies that may be easy to prove.

  22. the title is 50% fake. I did not learn how to spot fake videos. He only said, "oh yeah, we're working on a browser plug-in to spot fakes"

  23. Another stupid scientist that will have his invention used by governments for evil-doing and then will be shocked.

  24. Look at all these technophobes in the comments. Technology isn't good or bad, it's what people do with it. Think of all the great movies we're going to get in the future from this.

  25. Again with the Feynman Great Teacher crap. He said himself that his work had been "a failure." He couldn't teach, only lecture. He was a great professor, not a teacher.

  26. Do any of you guys really feel more "connected" with the recording
    when you're talking to it like an idiot instead of just pushing play?

    I don't just worry about how many innocents could go to jail,
    but also how many guilty psychos could be set free once we lose trust in recorded evidence.

    Also, a bunch of people will be getting fired because their tech savvy subordinates wanna be promoted, I imagine.

  27. How to spot them? Look at the lips; they were never perfectly in sync with the sound. I guess it's because some sounds, like in this case the "f" (in "infrastructure"), and probably longer words in general(?), are hard to fake. Well, at the moment at least.

  28. 4:10 I Do See Controllable puppets. 🙈🙉🙊🐒 bread crumbs for the narrative but also an easy list of the bad people. #HiddenInPlainSight #wwg1wga #Qanon #IMightBeASecretGenius #Q+

  29. The human being has never created or manufactured anything that other humans have not subverted and manipulated for evil. This guy is a perfect character for a real "Terminator" scene, where someone goes back in time and…….! What a pathetic bunch of excuses he came up with to justify this abomination.

  30. I think the range of people chosen to represent the technology in this video was rather telling. I didn't see one conservative voice in that group of actors or politicians. I guess we are just going to have to trust that the people developing the technology and its "safe-guards" have no political motivations of their own.

  31. a brief synopsis…

    "we are screwed".

    morons with too much education and absolutely no common-sense convincing themselves that ideas like this serve a purpose.

    digs a very deep hole. very deep

  32. "How did you survive the Nagasaki Nuclear bomb ?".
    "Oh I realised what it was as it fell out of the sky and quickly wrapped myself in Baking foil".

  33. This app is the biggest and stupidest app ever created, because it will be used by people with bad morals, especially politicians, for black campaigns.

  34. Fear your government when it's used against the people and to manipulate the citizens. Start questioning everything you see and hear before you believe.

  35. There are many methods to spot deepfake videos. You need to train on a large dataset of fake and real videos, then test it.

    I don't think any method has 100% accuracy; the larger the data collected, the better.

    Basically it's a competition, and deepfakes are still ahead.

  36. Just because you have a loved one's face imprinted on a moving image does not mean it is speaking the original thoughts or wisdom of the person represented! This is very dangerous socially! Reusing images in such a fashion should be globally illegal. The potential for wars, death, and political injustice is enormous!

  37. We used this video in our article A Simple Explanation of How DeepFake Videos Are Made https://www.urtech.ca/2019/12/solved-a-simple-explanation-of-how-deepfake-videos-are-made/ Thanks!

  38. Nothing on TV is real. Turn your tv off, delete the programming and wake your mind up. The trick to seeing the fake is the mouth. The way it moves is unnatural if you look closely.
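The recipe sketched in comment 35 (collect labeled real and fake examples, train a classifier, test it) can be illustrated in a few lines. This is a toy sketch, not a working detector: the "features" here are synthetic random vectors, and logistic regression stands in for whatever model a real detection system would use.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data. In practice, X would hold features extracted from
# video frames and y the labels (assumption: 1 = fake, 0 = real).
n, d = 400, 20
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.5 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Logistic regression by gradient descent: the simplest fake/real classifier.
w = np.zeros(d)
lr = 0.1
for _ in range(500):
    p = sigmoid(X @ w)            # predicted probability of "fake"
    w -= lr * X.T @ (p - y) / n   # gradient step on the log-loss

train_acc = ((sigmoid(X @ w) > 0.5) == y).mean()
print(round(train_acc, 2))
```

The competition the comment mentions comes from the other side of the loop: fake generators keep improving against whatever such a classifier learns, so no fixed detector stays accurate for long.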
