
Federal Bureau of Investigation (FBI) warns of scammers using AI for sextortion

The FBI has warned that scammers are using AI to create explicit deepfake images and videos as part of extortion schemes known as “sextortion.”

Scammers use advanced editing software to manipulate innocent photos and videos taken from public social media accounts. Victims, including minors and non-consenting adults, have reported having their images transformed into explicit content. These materials are then circulated on social media and porn websites to harass victims and carry out sextortion schemes.

In response, the FBI is warning the public about the risks of uploading personal images and videos online. While such media may look benign when posted or shared, it can give malicious actors a vast library of material to exploit for illegal purposes.

Although the FBI has not published the number of complaints received, the notice was issued in response to a significant surge in sextortion schemes targeting juveniles. These scams often involve online predators who create fake identities, frequently posing as attractive young women, to trick adolescent boys into sending explicit photographs. The fraudsters then threaten to release the compromising material unless a ransom is paid.

According to the FBI, victims have reported scammers using fabricated photos or videos created from content taken from their social media accounts, online posts, or video calls. In some cases, offenders use these deepfakes to coerce victims into sharing genuine sexually explicit photos or videos.

The sources for this piece include an article in PCMag.
