New Telegram Bot Violates Women's Privacy by Making Deep Fake Nudes From Ordinary Photos


A deep fake bot found on the Telegram messenger app has victimized what appears to be hundreds of thousands of women by turning ordinary photos into deep fake nudes.

By Aaron Kesel

More than 100,000 of these non-consensual sexual images have been posted publicly online, but the bot has produced hundreds of thousands more that haven't been traced, CNET reported.

The victims are mostly private individuals, women whose photos were taken off social media or pulled from a personal stash of pics, according to a research report about the bot published Tuesday, which traced more than 100,000 publicly posted images of the bot's victims. Some victims had originally been photographed in bathing suits or underwear. Some were wearing simple T-shirts and shorts.

"The innovation here is not necessarily the AI in any form," said Giorgio Patrini, CEO of deepfake-research company Sensity and coauthor of the report. "It's just the fact that it can reach a lot of people, and very easily."

This bot isn't just victimizing women; it has also created numerous deep fake nude images of children. All of this, combined with recent Telegram ransomware spam bots, has put Telegram in hot water, requiring the company to act and disable all bots on its network while it closes its API.

Telegram has been criticized over the years for hosting terrorist propaganda, facilitating piracy and copyright infringement, as well as hosting varieties of pornography. But the service has also taken actions to remove abuse on its platform, such as kicking off groups of violent extremists like neo-Nazis and ISIS.

The new bot is a fork of earlier software known as DeepNude; both used artificial intelligence to automatically generate non-consensual sexual images of the women and children in photos. However, the new bot is simpler and easier to use than the original desktop app. The Telegram bot lowers the barrier to some dangerous tools, enabling any stranger to snap a photo and create a deep fake nude.

Sensity found 104,852 images of women who were victimized by the bot and then shared publicly, as of the end of July. While each image may not be of a unique individual, Patrini said instances of the same woman being victimized, or the same photo being manipulated repeatedly, were rare.

The bot is built on a "freemium" model: free users get a basic level of functionality, while advanced features are available to those who pay for the app in cryptocurrency. Abusers can use the bot free by sending it photos one at a time, seemingly up to five a day. "Paid premium" features include sending multiple pics, skipping the line of free users, and removing watermarks from the pornographic images. The bot also offers a referral program in its own "coins," which are also payable in rubles.

Activist Post has previously covered deep fakes, including faked audio and video. We noted the dangers of deep fake technology, and how artificial intelligence will be needed to distinguish what's fake from what's real. The Pentagon also wants to police deep fake technology on social media, as well as fake news focused on politics. Using artificial intelligence for abuse, such as faking nude images of women and children, will only push the government further toward focusing on how it can counter deep fake technology.

By [@An0nkn0wledge](https://hive.blog/@an0nkn0wledge)

Aaron Kesel writes for Activist Post.

Image: Unsplash/Alexander Krivitskiy
