The dark side of artificial intelligence: teenage blackmail and deepfakes
The rapid development of artificial intelligence has negative consequences as well. The dark web is full of criminals who teach, step by step, how to create deepfake images and then use them to blackmail minors.
The rapid development of artificial intelligence and programmes capable of generating realistic images has led to an explosion of child pornography and blackmail attempts by criminals determined to abuse minors.
Yaron Litwin, chief marketing officer and digital safety expert at Canopy, told Fox News Digital that pedophiles are leveraging evolving AI tools in a variety of ways, often with the intent to produce and distribute images of child sexual exploitation across the internet.
The dark side of AI
One technique used by sex offenders involves editing a real photograph of a fully clothed teenager and transforming it into a nude image. Litwin described one such case.
A 15-year-old boy interested in fitness had joined an online group of gym and healthy-lifestyle enthusiasts. One day he shared a post-workout photo showing off his chest. Users of the group saved the photo and soon edited it to make the boy appear naked. The doctored image was then used to blackmail the teenager.
In 2022, major social networks reported as much as a 9 per cent increase in suspected child sexual abuse material (CSAM) on their platforms. 85 per cent of those reports came from Meta's platforms, such as Facebook, Instagram and WhatsApp.
Litwin said that editing existing images with artificial intelligence has become extremely easy and quick, often leading to horrific experiences for families. He added that it is now just as easy to create entirely fabricated images of child sexual abuse that are not based on any authentic photograph.
According to a recent analysis, AI-generated images of children involved in sexual acts have the potential to disrupt the central tracking system that blocks CSAM online.
As it stands, the system is designed to detect only known images of abuse, not newly generated ones. This new variable could force law enforcement to spend more time determining whether an image is real or generated by artificial intelligence.
Source: Fox News