Access and Feeds

Deep Fakes: Detection Likely to be an Elusive Goal

By Dick Weisinger

Technology is making it increasingly easy to spoof the world around us. Fake videos of well-known people like Tom Cruise, Vladimir Putin, and Nancy Pelosi circulate widely. In some cases it's clear that a video is fake and the intent is to entertain, but it's becoming increasingly difficult to distinguish what's real from what's fake. That makes it easy to create misinformation that deceives and does real damage.

Alexa Koenig, executive director of the Human Rights Center at UC Berkeley, said that “deep fake videos are what’s often referred to as synthetic video. They’re events that are generated by algorithms, so essentially by two computers that are challenging each other to see how well they can trick human beings into thinking that what they are creating is a depiction of reality.”
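What Koenig is describing is, technically, a generative adversarial network (GAN): one model generates candidate media while a second model tries to tell the generated samples from real ones, and each improves by competing against the other. The toy sketch below is only a hedged illustration of that training loop on one-dimensional data, not any actual deep-fake system; the network sizes and the Gaussian "real" data are arbitrary choices made for brevity.

```python
# Minimal GAN sketch: a generator and a discriminator "challenging each other."
# Toy 1-D data stands in for video frames purely to keep the example runnable.
import torch
import torch.nn as nn

torch.manual_seed(0)

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(),
                              nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.5 + 4.0   # "real" data: samples from N(4, 1.5)
    fake = generator(torch.randn(64, 8))    # generator maps noise to candidates

    # Discriminator: learn to label real samples 1 and generated samples 0.
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(64, 1))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator: learn to make the discriminator label its output as real.
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# After training, generated samples cluster near the real mean (~4.0).
print(generator(torch.randn(1000, 8)).mean().item())
```

The same adversarial loop, scaled up to convolutional networks and video frames, is what makes synthetic output steadily harder to distinguish from footage of real events.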

Recorded Future reports that “within the next few years, both criminal and nation-state threat actors involved in disinformation and influence operations will likely gravitate towards deep fakes, as online media consumption shifts more into ‘seeing is believing’ and the bet that a proportion of the online community will continue to be susceptible to false or misleading information.”

Brian Foster, a cybersecurity expert, said that “overall, the more we can automate and use intelligence to accomplish verification processes, the better. This approach relies less on humans, who, let’s face it, make lots of mistakes, and more on innovative best practices and tools that can be implemented far faster and more successfully than any static corporate policy.”

Nina Schick, author of “Deep Fakes and the Coming Infocalypse”, said that “synthetic media is expected to become ubiquitous in about three to five years, so we need to develop detection tools going forward. However, as detection capabilities get better, so too will the generation capability.”
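As a rough sketch of what a detection tool can look like in practice, the snippet below trains a small convolutional classifier to label frames as real or synthetic. It is a hypothetical skeleton under assumed inputs (random tensors stand in for labelled frame batches), not a description of any existing detector; the arms race Schick describes plays out precisely because generators can be trained to defeat classifiers of this kind.

```python
# Hedged sketch of a frame-level deep-fake detector: a binary classifier
# over labelled real vs. synthetic frames. Dataset and labels are placeholders.
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),   # logit: > 0 means "predicted synthetic"
)

optimizer = torch.optim.Adam(detector.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(frames: torch.Tensor, labels: torch.Tensor) -> float:
    """frames: (N, 3, H, W) pixel tensors; labels: (N, 1) with 1 = synthetic."""
    optimizer.zero_grad()
    loss = loss_fn(detector(frames), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Smoke test on random tensors standing in for a labelled frame batch.
print(train_step(torch.randn(8, 3, 64, 64), torch.randint(0, 2, (8, 1)).float()))
```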
