The Bank of Russia has warned of a growing number of cases in which fraudsters use deepfake videos. Attackers send neural-network-generated videos of a person to that person's relatives and friends and ask them to transfer money.

Scammers are using deepfakes more frequently: Central Bank

As the Central Bank points out, the short videos describe a problem that urgently requires money to solve, such as an accident, an illness, or the loss of a job.

Deepfake clones also "impersonate" the potential victim's employer or well-known figures in their field. Attackers obtain the material needed to imitate voices and faces by hacking social media accounts.

How to protect yourself

As a protective measure, the regulator recommends calling back the person who has supposedly found themselves in a difficult situation.

You can also examine the video itself for telltale signs, such as monotonous narration and unnatural facial expressions.

In the spring of 2024, the State Duma began considering penalties for the illegal use of Russians' voices, images, and biometric data. Before that, the Ministry of Digital Development was tasked with developing a platform for identifying deepfakes.

Author: Natalia Gormaleva

Source: RB

