He Didn’t Really Say That: The Implications of AI ‘Fake News’
When Jordan Peele teamed up with BuzzFeed to produce an AI video of Barack Obama saying some very unlikely things, many people called it “the future of fake news.” All it took was Adobe After Effects and the AI face-swapping tool FakeApp, and while there are a few tells that it’s not the real deal – like jerky, repetitive hand movements – it’s pretty believable. The underlying technology first surfaced on Reddit as a tool for making fake celebrity porn, a violation in its own right, and it raises hard questions about how we can verify whether something actually happened. Adobe has also demonstrated a program described as “Photoshop for audio” that lets users edit recorded dialogue, and at least one service needs just a few minutes of someone’s recorded voice to generate convincing fake speech. If the technology has already come this far, where will it be five years from now?
Where Will DNA Databases Take Us?
Catching an infamous rapist and serial killer using DNA seems like a good thing, right? After all, the East Area Rapist – better known as the Golden State Killer, a name coined by the late sleuth and crime writer Michelle McNamara – terrorized California during the 1970s and 1980s, and the ripple effects of his crimes destroyed countless lives. Police finally identified Joseph James DeAngelo, 72, after uploading crime scene DNA to an online genealogy database and looking for partial matches in family trees. He has since been charged with 12 murders, and is believed to have committed at least 50 rapes and multiple home invasions. But the case sets a disturbing precedent for official use of our most personal and private data.
Millions of people have willingly forked over saliva samples to private companies in order to learn about their ancestry (with often dubious accuracy) or to find biological relatives they’ve never met. The biggest of these companies, including Ancestry.com and 23andMe, have generally promised not to let authorities search their DNA databases. But the site used to catch DeAngelo, GEDmatch, allows public searches of its user-submitted data. That’s the catch: you never have to submit your own DNA for it to be searchable in effect, because a close relative’s upload exposes much of yours. And given that law enforcement sometimes skirts the law (or wades into legal gray areas) to access other forms of private data, like what’s stored on your iPhone, it wouldn’t be too shocking if they did the same with DNA. Considering how recent battles like net neutrality have turned out, we also can’t rule out the government simply deciding that DNA-matching companies must open their databases to authorities.
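The matching logic behind this kind of search is simple to sketch. The toy version below ranks hypothetical database uploads by total shared DNA (measured in centimorgans) and maps each to a rough relationship guess; the kit names and numbers are invented, the thresholds are loose published averages, and real services compare individual chromosome segments rather than a single total.

```python
# Toy illustration of the kinship-matching idea used by genealogy databases.
# Thresholds are rough averages for total shared DNA in centimorgans (cM);
# real matching works segment-by-segment across chromosomes.

def classify_match(shared_cm: float) -> str:
    """Map total shared DNA (cM) to a rough relationship estimate."""
    if shared_cm >= 3300:
        return "parent/child"
    if shared_cm >= 2200:
        return "full sibling"
    if shared_cm >= 1300:
        return "grandparent/aunt/uncle/half-sibling"
    if shared_cm >= 575:
        return "first cousin"
    if shared_cm >= 100:
        return "second/third cousin"
    return "distant or no clear relation"

def rank_matches(database: dict) -> list:
    """Sort database kits by how much DNA they share with the query sample.

    `database` maps a kit name to its total shared cM with the query
    (in reality this number comes from comparing SNPs along each chromosome).
    """
    return sorted(
        ((kit, cm, classify_match(cm)) for kit, cm in database.items()),
        key=lambda row: row[1],
        reverse=True,
    )

# Hypothetical uploads: none belong to the suspect, but one cousin's kit
# is enough to narrow the search to a single branch of a family tree.
uploads = {"kit_A": 42.0, "kit_B": 610.5, "kit_C": 8.2}
for kit, cm, relation in rank_matches(uploads):
    print(f"{kit}: {cm} cM -> {relation}")
```

Even this crude version shows why one relative’s upload is enough: investigators only need a single strong partial match to start building a family tree around it.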
Using Brain-Scanning Helmets to Monitor Employees
Oh hi! Put on this hat, worker. It will read your brainwaves so we, your benevolent employers, can tell when you’re feeling stressed and decide when you need a break. It’s for your own good, we promise. It will help us redesign our workflows and make you happier and more comfortable. We definitely won’t be collecting that data, storing it and using it for nefarious purposes. We would never!
In many places around the world this is already real, but China has gone furthest in deploying it at scale. Hangzhou Zhongheng Electric is one of the biggest users of uniform caps that monitor workers’ brainwaves for exactly the reasons stated above. AI algorithms sift the signals for emotional spikes, flagging when employees appear depressed, anxious or enraged. The sensors are concealed within ordinary safety helmets or hats, lightweight and wireless, so employees can almost forget they’re wearing them.
While workers initially reacted with fear and suspicion, they got used to the caps fairly quickly. The program has been in place since 2014 and has reportedly boosted company profits by some $315 million since then. So far, the data the tech gleans is fairly crude; it’s nowhere near mind reading, or anything approaching it. But give it another few years.