High-profile incidents of malicious use of artificial intelligence (AI) on social media — such as a recent “deepfake” showing singer Taylor Swift endorsing Donald Trump for president — are often debunked quickly, but in other cases the damage mounts before the fake image is exposed.
Deepfakes are photographs that have been doctored, often using artificial intelligence, to create a different image. Frequently, the doctored images take the form of nude photos of victims.
In April, The New York Times documented what it called an “epidemic of deepfake nudes in schools,” detailing how students in middle and high schools fabricated explicit images of female classmates using AI.
In some cases, boys were reported to have shared the fake images in school lunchrooms, on school buses and through Snapchat and Instagram.
Deepfakes can result in the outright sabotage of reputations. After the doctored images spread quickly through social media, they can remain discoverable online forever – by friends, relatives and employers.
Apps and websites that allow people with little technical know-how to manipulate real photographs of people into realistic nude images are abundant online. One goes by the brazen name of Nudify, which even offers a free trial.
“It’s completely changed the social media landscape,” says Joanna Conrad, executive director of Essex County Youth Diversion and an instructor on the Canadian justice system at St. Clair College.
“You do not need to be tech savvy whatsoever to do it,” said Conrad, though she said there are guidebooks online for those who are inclined to embrace the world of deepfakes.
Children are using photos taken from online platforms, such as Facebook or Instagram, manipulating them into realistic-looking nudes using AI tools, and spreading them in their digital groups, she said.
“Sometimes it’s a prank, because everything is funny and nothing has repercussions. So they think it’s a joke sometimes and they’ll download (a picture) of a peer and then they nudify them and now they shame them, embarrass them.”
More ominous possibilities become evident with the devastating use of online sextortion, often by grooming young people to get them to send nudes. That time-consuming process may be no longer necessary — abusers can just use photos from the targeted individual’s social media platforms, manipulate them, and start the extortion process, Conrad said.
It’s then left to the victim to prove it isn’t a real photo, “but it’s almost impossible to discern which is actual imagery and which is not,” she said.
“They can do this with videos, they can do this with pictures. More than ever before it goes back to that digital footprint. The more pictures you have available of yourself online the more at risk you become because they have more stuff they can use.”
According to the federal Justice Department, Canada’s Online Harms Act — passed earlier this year but not yet in effect — includes provisions requiring platforms to “make inaccessible to all persons in Canada content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent, including deepfake images.”
Still, enforcing it requires reporting and investigation. All the while, the offending material can spread among social media networks and cellphones.
Legal opinions vary about whether deepfake nudes are illegal under current law, but at least one case has been prosecuted in Quebec and the defendant jailed for creating deepfake videos of children.
Cybertip.ca, a website run by the charity Canadian Centre for Child Protection, says deepfake images are sometimes used to take things even further, tricking young people into performing explicit acts online.
In the past year alone, Cybertip.ca has processed close to 4,000 sexually explicit deepfake images and videos of young people.
“Victimization can have a serious impact on youths’ mental health and well-being,” Cybertip says. “They may suffer ongoing anxiety and depression and become preoccupied with searching for their images. There may also be future or ongoing impacts if the material continues to be shared long after it has been created.”
Anyone who has been victimized through explicit images online can also seek help at needhelpnow.ca.
Digital Danger: In an investigative series, the Windsor Star looks at social media and young people — how they are being affected and how dangerous the content can be. Reporter Brian MacLeod’s six-part series concludes with suggestions for parents to play a key part in making social media safer for their children.