
Maybe I'm bad at spotting these things, but it looks and sounds super convincing to me. I really dread this technology being weaponized in politics.


Why would it have more impact than photoshop? We just don't trust an image unless we are confident of the source. The same will apply to video.


The amount of work needed to fake a photo vs the uncanny valley of animated human faces.


I think once deepfaking reaches a sufficient level of believability, people will stop believing video evidence.


A bunch of people will believe it for some amount of time, and given the very low cost of producing it, that'll be enough. If there is one thing we've learned from the recent increase in mainstream extremism fueled by social networks, it's that fake rumors don't have to be especially plausible to be effective; they just need to be generated and broadcast at a sufficient rate (selection will do the rest of the work). One interesting question: is one protected by freedom of speech when generating fake videos clearly intended to harm another person's image?


I dunno, I don’t think people need to be tricked into thinking certain things. It’s usually what they want. I don’t think promoting “the facts” or worrying about fake videos is going to change anything directionally.


I think that will be a problem of its own. E.g., in Atlanta, police recently shot and killed a protester; other protesters are saying it was murder, while the police are saying the protester shot first.

Right now, I hope the truth will be revealed by the police releasing body camera footage. In the near future, that footage will satisfy nobody and we'll all be left wondering what really happened.


Exactly, it's equally important to be able to trust things when they are genuine as it is to distrust them when they are fake.

If it's too easy and accepted to wave off real images and videos as fake, that's also problematic.


There have been videos of Nessie, Bigfoot and Aliens for decades. Are we still trusting video?!


Yeah but if you look carefully, those videos don't look like the real Nessie, Bigfoot or Aliens whereas Deep Tom Cruise is much more convincing.


Can you link those videos that are close up, high resolution and slowly move the camera around them?


Probably only in some corner cases. There's always context and a connection to real-world events that cameras capture, so a fake bank robbery, or a guy being in two places at once, or what have you doesn't make any sense. In a way, that's already how we deal with potentially staged or doctored video evidence: you always need to figure out whether a piece of evidence corresponds to the rest of your evidence.

To pick up the example from the other user: if video footage of a shooting matches neither the ballistics, nor the eyewitness accounts, nor other evidence on site, it'd be very easy to spot the fake without even technically analyzing the video itself.


It might be easier to invalidate celebrity/political deepfakes, since there would be no independent corroborating witnesses even though we'd expect there to be some.

But deepfakes used by governments, police or citizens to frame innocent civilians would be really scary. There wouldn’t be any witnesses, but we wouldn’t necessarily expect there to be any either.


My dystopian techno-political prediction is that the 2024 presidential election in the US will have major impact from deepfakes.

With current tech, deepfakes can be really good. The linked one is pretty much there; it's super convincing. Knowing it's fake, something seems a bit off about how "Cruise's" head sits in relation to his body when he walks through the doorframe, but that's it. If I weren't actively looking for anything fake, I wouldn't have registered that either.

Whoever runs for POTUS in '24, it's certain that both candidates will be well-known people with plenty of video material for training a model. It's also certain that many groups and individuals will have a strong vested interest in the outcome of the election, and some of them will have the resources to produce deepfakes at least as convincing as "Tom Cruise and Paris Hilton" here - this is no longer something that requires millions of dollars' worth of hardware running for two months straight.

There will be fake videos of the candidates doing/saying highly scandalous stuff. Going on a racist rant, expressing corrupt intent, promising illegal things, etc. And those videos, if somewhat intelligently produced, will have a big impact once they air, no matter if they can be definitively proven to be fake later.



