The dark side of artificial intelligence. Teenage blackmail and deepfake

Image source: © Canva
Natalia Witulska,
29.06.2023 14:00

The rapid development of artificial intelligence has negative consequences as well. The dark web is full of criminals who teach, step by step, how to create deepfake images and then use them to blackmail minors.

The rapid development of artificial intelligence and programmes capable of generating realistic images has led to an explosion of child pornography and blackmail attempts by criminals determined to abuse minors.

Yaron Litwin, the CMO and Digital Safety Expert for Canopy, told Fox News Digital that pedophiles are leveraging the evolving tools in a variety of ways, often with the intent to produce and distribute images of child sexual exploitation across the internet.

The dark side of AI

One technique used by sex offenders involves editing a real photograph of a fully clothed teenager and transforming it into a nude image. Yaron Litwin described one specific example.

A 15-year-old boy interested in fitness joined an online group of other healthy-lifestyle and gym enthusiasts. One day he shared a photo of himself, proudly showing off his physique after a workout. Users of the group saved the photo and soon edited it to make the boy appear naked. The edited photo was then used to blackmail the teenager.

In 2022, major social networks reported as much as a 9 per cent increase in suspected child sexual abuse material (CSAM) on their platforms. 85 per cent of these reports came from Meta's digital platforms, such as Facebook, Instagram and WhatsApp.

Another negative consequence of the rapid development of AI (© Canva)

Litwin said that the process of editing existing images with artificial intelligence has become extremely easy and quick, often leading to horrific experiences for families. It has also become easy to create entirely fabricated images of child sexual abuse that are not based on any authentic photograph.

According to a recent analysis, AI-generated images of children involved in sexual acts have the potential to disrupt the central tracking system that blocks CSAM on the network.

As it stands, the system is only designed to detect known images of abuse, not generated ones. This new variable could prompt law enforcement to spend more time determining whether an image is real or generated by artificial intelligence.

Source: Fox News
