
Fake News: You Haven’t Seen Anything Yet

On February 14, 2018, a young man walked into Marjory Stoneman Douglas High School in Parkland, Florida, and opened fire on students and faculty with a semi-automatic rifle. By the time he finished, 17 students and staff members were dead.

The 2018 Valentine’s Day massacre galvanized surviving students into political action in favor of gun control. One of them, Emma Gonzalez, wrote an article for Teen Vogue magazine that included a video. In the video, Gonzalez tore up a bullseye target.

An image of Gonzalez tearing up the bullseye quickly went viral on the internet. But almost immediately, another version of the image began to circulate: Gonzalez tearing up a copy of the US Constitution. The image with the Constitution was fake, but those circulating it claimed it was genuine. You can see the original and the modified image on Twitter.

Two months later, a video of Donald Trump addressing the citizens of Belgium began circulating online. “As you know, I had the balls to withdraw from the Paris climate agreement,” he said, “and so should you.” But the video was a fake, produced by a left-wing Belgian political party.

Welcome to the latest iteration of the “fake news” phenomenon. And it’s about to get a lot worse. Free online tools like Google’s TensorFlow allow anyone to use a form of artificial intelligence called machine learning to create what have been nicknamed “deep fakes.”
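
To see why the bar is now so low, it helps to look at the design behind the classic face-swap deep fakes: a single shared encoder that learns head pose and expression from both people, paired with two decoders that each learn to render one face. What follows is a minimal, hypothetical sketch of that architecture using TensorFlow’s Keras API; the layer sizes, the 64-pixel face crops, and names such as decoder_person_a are illustrative assumptions, not the code of any actual tool.

```python
# Minimal sketch of the shared-encoder / twin-decoder autoencoder design
# behind early face-swap deep fakes, using TensorFlow's Keras API.
# Layer sizes, resolution, and names are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

IMG = 64  # assumed face-crop resolution

def build_encoder():
    # One encoder shared by both faces: it learns pose and expression.
    inp = layers.Input(shape=(IMG, IMG, 3))
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    latent = layers.Dense(256, activation="relu")(x)
    return Model(inp, latent, name="shared_encoder")

def build_decoder(name):
    # One decoder per identity: it learns to render that person's face.
    inp = layers.Input(shape=(256,))
    x = layers.Dense(16 * 16 * 128, activation="relu")(inp)
    x = layers.Reshape((16, 16, 128))(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(inp, out, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_person_a")  # hypothetical names
decoder_b = build_decoder("decoder_person_b")

# Each autoencoder shares the encoder but has its own decoder.
face_in = layers.Input(shape=(IMG, IMG, 3))
auto_a = Model(face_in, decoder_a(encoder(face_in)))
auto_b = Model(face_in, decoder_b(encoder(face_in)))
auto_a.compile(optimizer="adam", loss="mae")
auto_b.compile(optimizer="adam", loss="mae")

# Training (not shown) fits auto_a on face crops of person A and auto_b
# on person B. The swap itself is one line:
# fake_frame = decoder_b(encoder(frame_of_person_a))
```

Once trained, the swap is a single asymmetric step: encode a frame of person A, then decode it with person B’s decoder, and B’s face appears wearing A’s pose and expression.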

Imagine a video released the day before the 2020 election in which Trump admits that he is a Russian spy, or in which the Democratic challenger confesses to misappropriating millions of dollars in government funds. As deep fakes become more widespread, they could change the political process profoundly.

The implications of deep fakes aren’t limited to the political process. Indeed, the most widespread use of the technology has been in pornography, often with the faces of female celebrities superimposed on a porn star’s body to depict fictitious sex acts.

Inevitably, as deep fakes proliferate, public skepticism of photographic and video evidence will grow. That skepticism could help criminals escape accountability. For instance, someone photographed committing a crime could claim that the images were manufactured by the real criminal, or even by a vindictive prosecutor.

One of the most worrisome possibilities is the use of deep fakes to incite violence. Just imagine the chaos this technology could cause in the Middle East if ISIS were to post a deep-faked video of an American soldier appearing to murder an unarmed Muslim in Iraq.  

Many of today’s deep fakes are unconvincing because of jerky movements or garbled audio. But as the technology becomes more sophisticated, they will become almost impossible to distinguish from the real thing. For instance, a refinement of machine learning algorithms designed to mimic facial expressions already produces some of the most realistic-looking fakes ever created. Computer scientist Christian Theobalt and his colleagues demonstrated the approach in a video called Deep Video Portraits that explains how it’s done.
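
Pipelines of this kind start by tracking the source actor’s facial geometry frame by frame before re-rendering the target. As a rough, hypothetical illustration of that first tracking stage only, the sketch below uses Google’s MediaPipe face-mesh model to pull facial landmarks from a single frame; the frame.jpg file name is a placeholder, and the rendering stage that actually produces the fake is far more involved than what’s shown here.

```python
# Hypothetical illustration of the tracking stage of an expression-transfer
# pipeline: extract facial landmarks from one video frame using Google's
# MediaPipe face-mesh model. "frame.jpg" is a placeholder file name.
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(
    static_image_mode=True,   # treat the input as a standalone image
    max_num_faces=1,
    refine_landmarks=True,    # adds iris landmarks for finer detail
)

frame_bgr = cv2.imread("frame.jpg")  # placeholder input frame
if frame_bgr is None:
    raise SystemExit("frame.jpg not found")

# MediaPipe expects RGB; OpenCV loads images as BGR.
results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    landmarks = results.multi_face_landmarks[0].landmark
    print(f"Tracked {len(landmarks)} facial landmarks")  # 478 with refinement
    # A full pipeline would map these landmarks onto the target actor's
    # face model and re-render the target video frame by frame.
```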

There are no easy solutions to the inevitable proliferation of deep fakes and the technology used to create them. Hany Farid, a pioneer in the use of forensic tools to identify child pornography, has warned: “We’re decades away from having forensic technology that … [could] conclusively tell a real from a fake.”

If technology can’t detect deep fakes, how can we best deal with them? Banning the machine learning tools used to create deep fakes isn’t the answer. Such an effort would not only lead to a black market for the technology, it would also run afoul of the First Amendment.

Nor is it likely that civil or criminal sanctions against those circulating deep fakes would be effective. Anyone with a good virtual private network (VPN) can disguise their location and identity. So even finding the person or entity that distributed deep fake content is likely to be difficult or impossible.

Ultimately, the deep fake phenomenon may return us to an era in which we can trust only what we personally witness. That prospect led a pair of researchers, in a forthcoming law journal article, to suggest that a technology straight out of the science fiction series Black Mirror might become popular.

An episode of the series entitled The Entire History of You depicts a future society in which most people have a device implanted that records everything they see and hear, 24 hours a day. The device, called a “grain,” creates an immutable life log. Recordings from the grain can be played back so that others can witness them.

An immutable life log could be useful to a politician or anyone else fearful of being victimized by a deep fake. But it has hugely negative implications for our privacy. The authors of the law journal article, Robert Chesney and Danielle Keats Citron, suggest that the proliferation of immutable life logs could lead to “the outright functional collapse of privacy via social consent.”

If you’d prefer not to buy an immutable life log once the technology becomes available, there is one technique you can use to protect yourself from blackmail (or worse) via deep fakes: limit the number of photographs and (especially) audio and video clips you post to social media. That will deter all but the most sophisticated parties from creating a deep fake with you as the star.

On another note, many clients first get to know us by accessing some of our well-researched courses and reports on important topics that affect you.

Like How to Go Offshore in 2024, for example. It tells the story of John and Kathy, a couple we helped from the heartland of America. You’ll learn how we helped them go offshore and protect their nest egg from ambulance chasers, government fiat and the decline of the US dollar… and access a whole new world of opportunities not available in the US. Simply click the button below to register for this free program.
