« Identifying Fake Images/Videos »

Swamprat
Gold Member
Identifying Fake Images/Videos
« Thread started on: Oct 9th, 2017, 2:36pm »

Our job is getting tougher.....



The scientist who spots fake images/videos

Hany Farid discusses how to detect image manipulations, and the increasing sophistication of forgers.

Elizabeth Gibney
06 October 2017

Hany Farid, a computer scientist at Dartmouth College in Hanover, New Hampshire, specialises in detecting manipulated images and videos. Farid, who provides his services to clients as varied as universities, media organizations, and law courts, says that image manipulation is becoming both more frequent and more sophisticated. He spoke to Nature about the arms race to stay ahead of the forgers.

Where do you start when trying to spot a fake image?
One simple but powerful technique is reverse image search. You give the image to a site such as Google Image Search or TinEye, and they show you all other instances of it. A project at Columbia University, in New York City, is taking this to the next level, and starting to find parts of images that have been repurposed from other images: http://www.ee.columbia.edu/ln/dvmm/memex/index.html#About
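As a rough, local illustration of the idea of matching an image against known instances (this is not how Google or TinEye index the web, and not the Columbia system), here is a perceptual "average hash" sketch in Python with Pillow and numpy; the file names are placeholders:

```python
# Minimal sketch: compare two images with an "average hash" so near-duplicates
# still match after resizing or recompression. Requires Pillow and numpy.
from PIL import Image
import numpy as np

def average_hash(path, hash_size=8):
    # Shrink to hash_size x hash_size greyscale and threshold at the mean.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float32)
    return (pixels > pixels.mean()).flatten()

def hamming(h1, h2):
    # Number of differing hash bits; small distances suggest the same picture.
    return int(np.count_nonzero(h1 != h2))

# print(hamming(average_hash("query.jpg"), average_hash("candidate.jpg")))
```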

"I've seen the technology get good enough that I'm now very concerned"
Generally, we think about which patterns, geometries, colours or structures are going to be disrupted when someone manipulates a photo. For example, when people add an object into a scene, we know that where they put the shadow is usually wrong. A viral video called Golden Eagle Snatches Kid from 2012 is one of my favourite examples. It took us only 15 minutes of analysis to show shadow inconsistencies: the eagle and baby were computer-generated: https://www.youtube.com/watch?v=CE0Q904gtMI
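For anyone curious what a shadow-consistency check can look like, here is a heavily simplified sketch (not Farid's published method): under a single light source, the lines from each shadow point through the object point casting it should meet near one common image point, so a large least-squares residual is a warning sign. The point pairs would be hand-marked in the image and are purely hypothetical here:

```python
# Hedged sketch of a shadow-consistency test. shadow_pts[i] and object_pts[i]
# are 2-D image coordinates of a shadow point and the object point casting it.
import numpy as np

def best_intersection(shadow_pts, object_pts):
    """Least-squares point closest to all shadow->object lines, plus the
    RMS perpendicular distance from that point to the lines."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    lines = []
    for s, o in zip(np.asarray(shadow_pts, float), np.asarray(object_pts, float)):
        d = (o - s) / np.linalg.norm(o - s)   # unit direction of the line
        P = np.eye(2) - np.outer(d, d)        # projection onto the line's normal
        A += P
        b += P @ s
        lines.append((s, P))
    p = np.linalg.solve(A, b)
    resid = np.sqrt(np.mean([(P @ (p - s)) @ (P @ (p - s)) for s, P in lines]))
    return p, resid

# A residual that is large relative to the image size hints that the shadows
# are not consistent with a single light source.
```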

What if a fake involves only slight tweaks?
There are a number of analyses we can do. In a colour picture, every pixel needs three values, corresponding to the amounts of red, green and blue at that point. But in most cameras, every pixel records just one colour, and the camera fills in the gaps by taking the average values of the pixels around it. This means that, for any given colour in an image, each missing pixel has a particular correlation with its neighbours, which will be destroyed if we add or airbrush something, and we can detect that.
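A rough sketch of that neighbour-correlation idea (a simplification, not the actual demosaicing-correlation algorithm): fit one global linear model that predicts each green pixel from its four neighbours and map where the prediction breaks down. Pillow and numpy assumed; the file name is a placeholder:

```python
# Regions that were airbrushed or pasted in often no longer follow the
# camera's interpolation pattern, so their prediction residual stands out.
import numpy as np
from PIL import Image

def interpolation_residual(path):
    g = np.asarray(Image.open(path), dtype=np.float64)[:, :, 1]   # green channel
    up, down = g[:-2, 1:-1], g[2:, 1:-1]
    left, right = g[1:-1, :-2], g[1:-1, 2:]
    centre = g[1:-1, 1:-1]
    X = np.stack([up, down, left, right], axis=-1).reshape(-1, 4)
    y = centre.reshape(-1)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)     # global interpolation weights
    resid = np.abs(X @ w - y).reshape(centre.shape)
    return resid   # visualise: bright blobs mark pixels that break the pattern

# res = interpolation_residual("photo.png")
```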

Another technique exploits JPEG compression. Almost every image is stored as a JPEG file, which throws away some information to save on storage, and there is a huge amount of variation in how each camera does that. If a JPEG is unpacked (opened in Photoshop, say) and then put back together, it is always repackaged slightly differently, and we can detect that. I wish you could just upload any image and we could tell you whether it's real or not, but it's still a very difficult process and requires expertise to understand the different components.
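A minimal illustration of the repackaging point is error level analysis, a crude relative of this kind of JPEG forensics rather than the technique described here: re-save the JPEG at a known quality and difference it against the original, so regions with a different compression history stand out.

```python
# Error level analysis sketch with Pillow; the file name is a placeholder.
from PIL import Image, ImageChops
import io

def error_levels(path, quality=90):
    original = Image.open(path).convert("RGB")
    buf = io.BytesIO()
    original.save(buf, "JPEG", quality=quality)   # repackage at a known quality
    resaved = Image.open(buf)
    # Bright regions in the difference image have a different compression
    # history from the rest of the picture, which can indicate editing.
    return ImageChops.difference(original, resaved)

# error_levels("suspect.jpg").save("ela.png")
```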

Who uses your digital forensic services?
I do analysis for organisations such as the Associated Press, Reuters, and The New York Times. There are only a handful of academics worldwide who are specialists in this, so it doesn't scale, and that means you can only do the analysis of really high-stakes images. But there are efforts under way to scale this up. Last year, the US Defense Advanced Research Projects Agency (DARPA) got into this game with a large project of which I'm part. Over the next five years they're trying to create a system that will allow you to analyse hundreds of thousands of images a day. It's a very ambitious programme: https://www.darpa.mil/program/media-forensics

I also do a lot of work in the courts. For example, here in the United States, child pornography is illegal, but computer-generated child pornography counts as 'protected speech' under the First Amendment. If someone's arrested they might say that the offending image isn't real, and I might have to prove that it is. I also get lots of e-mails from people about photo hoaxes, almost daily.

Do you apply your techniques to scientific papers?
I have worked on many cases of scientific misconduct, hired by universities conducting internal investigations. When I visited the US Office of Research Integrity recently, they asked me "how do we get our hands on automated tools?" The reality is we're still not there. But creating something that uses some of the tools, such as clone detection, which looks to see whether parts of an image have been copied and pasted from elsewhere, would be possible as a semi-automated process looking at dozens, not millions, of images a day. It's something my colleagues and I are thinking about, and it's a small but not insignificant part of the DARPA programme.
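A toy version of clone detection might look like the sketch below (real tools also match blocks after scaling, rotation and recompression; this only catches exact duplicates):

```python
# Hash every 16x16 block of the greyscale image and report any block content
# that appears at more than one position. Pillow and numpy assumed.
import numpy as np
from PIL import Image
from collections import defaultdict

def find_cloned_blocks(path, block=16):
    g = np.asarray(Image.open(path).convert("L"))
    seen = defaultdict(list)
    h, w = g.shape
    for y in range(0, h - block, block):
        for x in range(0, w - block, block):
            key = g[y:y+block, x:x+block].tobytes()   # exact block signature
            seen[key].append((x, y))
    return [locs for locs in seen.values() if len(locs) > 1]

# for group in find_cloned_blocks("figure.png"):
#     print("possible copy-paste between", group)
```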

How about fake videos?
Researchers are now able to splice together footage to create videos of famous people seeming to say things they never said, for instance this video of President Obama: http://futureoffakenews.com/ And they can create fake images or short videos using machine learning techniques: in particular, generative adversarial networks (GANs), which learn to generate fake content: https://www.nature.com/news/astronomers-explore-uses-for-ai-generated-images-1.21398 These pit a network that generates fake content against a 'classifier' network that attempts to discriminate between real and fake content, so that the faking network rapidly improves.
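The adversarial setup can be sketched on toy one-dimensional data (this assumes PyTorch and is nothing like the image-scale networks discussed above): a generator learns to mimic samples from a fixed Gaussian while a classifier learns to tell real samples from generated ones, and each update makes the other's job harder.

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))                 # generator
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())   # classifier
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.25 + 4.0   # "real" samples from the target distribution
    fake = G(torch.randn(64, 8))             # generated samples from random noise
    # Train the classifier to separate real from fake ...
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()
    # ... then train the generator to fool the just-updated classifier.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```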

I've seen the technology get good enough that I'm now very concerned. In 5 or 10 years, this is going to get really good. At some point we will reach a stage where we can generate realistic video, with audio, of a world leader, and that's going to be very disconcerting. I would say that the field of digital forensics is now behind in video.

How can you detect fake video?
JPEG compression has an analogous construct in video, which is a bit harder to detect because video uses a more sophisticated version. Another approach is to use machine learning for detection. But we're taking an approach similar to what we do with images, which is based on the observation that computer-generated content lacks the imperfections that are present in a recorded video; it's created in an almost too-perfect world. So one of the things we look at is whether the statistical and geometric patterns we'd expect to see in the physical world are missing.
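One crude stand-in for a "missing imperfection" cue (not the actual statistics used in this work) is sensor noise: real camera footage carries noise in every frame, while rendered content is often far cleaner. A sketch, assuming `frames` is an iterable of greyscale numpy arrays already decoded from the video:

```python
import numpy as np

def noise_energy(frames):
    energies = []
    for f in frames:
        f = f.astype(np.float64)
        # Crude high-pass: subtract a 3x3 box blur built from shifted copies.
        blur = sum(np.roll(np.roll(f, dy, 0), dx, 1)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
        energies.append(np.std(f - blur))
    # Suspiciously low or unnaturally flat values across frames are a red flag.
    return np.array(energies)
```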

Another technique is based on some beautiful work by William Freeman and colleagues at the Massachusetts Institute of Technology in Cambridge, who showed that if you magnify really small changes in a video of a person, you can see subtle changes in the colours of their face that correspond to their pulse rate. We showed that you can use this to distinguish real people from computer-generated people: http://people.csail.mit.edu/mrub/vidmag/#people
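A much-simplified sketch of that pulse cue (the MIT work does proper Eulerian video magnification; this only looks for a heart-rate peak in the mean green value of a face crop). `face_frames` is assumed to be a list of RGB numpy arrays already cropped to the face, and `fps` the frame rate:

```python
import numpy as np

def pulse_peak(face_frames, fps):
    signal = np.array([f[:, :, 1].mean() for f in face_frames])   # mean green per frame
    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs > 0.7) & (freqs < 3.0)   # plausible human pulse: roughly 40-180 bpm
    # A clear peak in this band is consistent with a live person; its absence
    # is one (weak) hint of computer-generated footage.
    return freqs[band][np.argmax(spectrum[band])]
```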

Couldn't machine learning algorithms learn to include these features?
Perhaps in principle. But in practice, these algorithms have limited time and training data, and there is little control over which features a neural network will pick up on to discriminate between real and fake videos. A GAN is only trying to fool the classifier it's trained on. That's no guarantee that it will learn all aspects of what makes an image or video real or fake, or that it will fool another classifier.

My adversary will have to implement all the forensic techniques that I use, so that the neural network can learn to circumvent these analyses: for example, by adding a pulse in. In that way, I've made their job a little harder.

It's an arms race. As we develop detection techniques faster, people are creating more sophisticated technology to manipulate audio, images and video. The way this is going to end is that you take the ability to create a perfect fake out of the hands of the amateur. You make it harder, so it takes more time and skill, and there's a greater risk of getting caught.

https://www.nature.com/news/the-scientist-who-spots-fake-videos-1.22784?WT.ec_id=NEWSDAILY-20171009


"Let's see what's over there."
travex
Full Member
Re: Identifying Fake Images/Videos
« Reply #1 on: Oct 13th, 2017, 10:32am »

on Oct 9th, 2017, 2:36pm, Swamprat wrote:
Our job is getting tougher.....



The scientist who spots fake images/videos

Hany Farid discusses how to detect image manipulations, and the increasing sophistication of forgers.

"I've seen the technology get good enough that I'm now very concerned"
Generally, we think about which patterns, geometries, colours or structures are going to be disrupted when someone manipulates a photo. For example, when people add an object into a scene, we know that where they put the shadow is usually wrong. A viral video called Golden Eagle Snatches Kid from 2012 is one of my favourite examples. It took us only 15 minutes of analysis to show shadow inconsistencies: the eagle and baby were computer-generated: https://www.youtube.com/watch?v=CE0Q904gtMI



Excellent... Take that footage some fifty years back in time and many would have considered it hard evidence - evidence that large birds really are capable of snatching little kids, as rumoured.

Why not take the video for what it shows?

Well, there are circumstances that call for further scrutiny.
http://www.slate.com/blogs/browbeat/2012/12/19/do_eagles_really_snatch_babies_like_in_the_youtube_video_not_really_but.html

The weavers of alien abduction stories are just as careless as those who attempt a hoax without getting acquainted with the aspects of the scenario known mostly to experts.

However, some visual evidence presented is almost fool-proof. Now analyze this:
https://i.pinimg.com/236x/44/f9/f8/44f9f8aecd8a43e328fffeb48f5f07d6--cow-art-ipad-case.jpg

sunny123
New Member
Re: Identifying Fake Images/Videos
« Reply #2 on: Oct 16th, 2017, 06:56am »

Nice Post G
