
Deepfakes: Can You Spot a Phony Video? | Above the Noise

Check out Monstrum! A new show from PBS Digital Studios:

Co-produced with Data & Society Research Institute. @datasociety
Special thanks to Data & Society Researcher Britt Paris

Myles breaks down why deepfakes can cause so much damage, and talks to Jabril, the host of YouTube channel Jabrils, about how to spot them.

TEACHERS: Learn more about this topic and how you might teach it with your students via one of our free summer PD courses:

ABOVE THE NOISE is a show that cuts through the hype and investigates the research behind controversial and trending topics in the news. Hosted by Myles Bess.

*NEW VIDEOS EVERY OTHER WEDNESDAY*

SUBSCRIBE by clicking the RED BUTTON above.
Follow us on Instagram @kqedabovethenoise

Today's internet meme culture thrives on Photoshopping images for comic effect. But a newer form of image manipulation uses AI technology to create hyper-realistic alterations to video. These so-called "deepfakes" can be very difficult to detect or debunk.

**What are deepfakes?
Deepfakes are videos that have been manipulated using AI technology. This tech can scale, rotate, or splice videos and images together, altering footage so that it looks very different from the original and is tough to spot as fake.
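For a concrete sense of the building blocks, here is a minimal sketch of scaling, rotating, and splicing one image into another by hand. Python and OpenCV, plus the file names, are assumptions for illustration, not tools the video names; real deepfake software automates and refines these steps with neural networks trained on lots of footage of the target.

```python
# Minimal sketch of the basic operations named above: scale, rotate, and
# splice one image into another. Real deepfake tools automate this with
# neural networks; this is only hand-done image manipulation with OpenCV.
# Assumes two hypothetical local files, face.jpg and scene.jpg.
import cv2

face = cv2.imread("face.jpg")    # image to paste in (hypothetical file)
scene = cv2.imread("scene.jpg")  # image to paste into (hypothetical file)

# Scale the face down to a 100x100 patch.
patch = cv2.resize(face, (100, 100))

# Rotate the patch 15 degrees around its center.
center = (patch.shape[1] // 2, patch.shape[0] // 2)
rotation = cv2.getRotationMatrix2D(center, 15, 1.0)
patch = cv2.warpAffine(patch, rotation, (patch.shape[1], patch.shape[0]))

# Splice: overwrite a 100x100 region of the scene with the patch
# (assumes scene.jpg is at least that large).
x, y = 50, 50  # arbitrary target position
scene[y:y + 100, x:x + 100] = patch

cv2.imwrite("spliced.jpg", scene)
```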

**How do deepfakes spread?
As AI tools become cheaper and more accessible, deepfakes are getting easier to make and harder to detect. Social media platforms like YouTube, Facebook, and Twitter help spread them, partly because of the ease and speed of sharing.

**How do deepfakes contribute to misinformation online?
According to the Pew Research Center, 85% of teens use YouTube and two-thirds of all Americans get at least some of their news from social media platforms. Right now, these companies are doing little to stop the spread of deepfakes on their platforms. High-profile figures like politicians and celebrities, who have a ton of photos and videos on the internet, are the most common targets for deepfakes because of the huge volume of data available to manipulate. While they are often used for parody or satire, deepfakes can also be used for more sinister purposes. Imagine what could happen if somebody created a fake video of a leader inciting violence or falsely accusing another politician. Experts say deepfakes could become a serious threat to our security and democracy in the near future.

**If I'm not famous, what kind of threat do deepfakes pose to me?
Women are the most common victims of deepfakes. Usually, someone will splice the face of a woman they know into a pornographic or sexually provocative video to humiliate her. While good technical skills are required to create really convincing fakes, there are plenty of how-tos and other resources out there to get people started.

**How can we stop the spread of deepfakes?
People and organizations are developing detection algorithms to spot deepfakes, but these methods usually lag behind the technology used to create them. Often, you can spot a deepfake by looking closely at eye movements and facial expressions. If a video looks a little suspicious, try to find another version of it online using Google reverse image search. Also, the YouTube DataViewer shows you when a video was uploaded and provides thumbnails for reverse image searching. But the best way to stop the spread of deepfakes is to slow down your impulse to share, and to demand that social media companies do more to combat them.
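If you want to try the reverse-image-search step yourself, here is a minimal sketch that grabs a few evenly spaced frames from a video and saves them as thumbnails you can drop into a reverse image search, similar in spirit to the thumbnails the YouTube DataViewer provides. Python with the opencv-python package and the file name suspicious_video.mp4 are assumptions for illustration.

```python
# Minimal sketch: grab a few evenly spaced frames from a video so you can
# reverse-image-search them. Assumes opencv-python is installed and a
# hypothetical local file "suspicious_video.mp4" exists.
import cv2

VIDEO_PATH = "suspicious_video.mp4"  # hypothetical example file
NUM_FRAMES = 5                       # how many thumbnails to extract

cap = cv2.VideoCapture(VIDEO_PATH)
total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))

for i in range(NUM_FRAMES):
    # Jump to an evenly spaced position in the video.
    frame_index = int(i * total / NUM_FRAMES)
    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_index)
    ok, frame = cap.read()
    if ok:
        # Save the frame; upload these images to a reverse image search
        # to look for earlier or original versions of the footage.
        cv2.imwrite(f"thumbnail_{i}.jpg", frame)

cap.release()
```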

SOURCES:

"This PSA About Fake News From Barack Obama Is Not What It Appears"

"Fake-porn videos are being weaponized to harass and humiliate women"

"Does This Photograph Show President Bush Reading a Book Upside-Down?"

YouTube DataViewer

"Social Media Use in 2018"

"What to Watch for in the Coming Wave of 'Deep Fake' Videos"


FOR EDUCATORS
KQED Learn
KQED Teach
KQED Education



About KQED
KQED, an NPR and PBS affiliate in San Francisco, CA, serves Northern California and beyond with a public-supported alternative to commercial TV, radio, and web media. Funding for Above the Noise is provided in part by the Corporation for Public Broadcasting, Silver Giving Foundation, Stuart Foundation, and William and Flora Hewlett Foundation.

#deepfakes #misinformation

