The agency announced the approval of rules that allow it to sue fraudsters directly in federal court for impersonating government agencies and businesses. The FTC also intends to pursue the developers of AI tools used in such impersonation schemes.

However, the first draft of these rules does not address AI deepfakes that impersonate private individuals, which leaves a serious gap. That is why the agency has issued a supplemental notice asking the public to weigh in on whether "private" deepfakes should fall under the new rules.

“Scammers are using AI tools to impersonate people with uncanny accuracy and on a much larger scale. With voice cloning and other AI scams on the rise, it is more important than ever to protect Americans from impersonation scams,” said FTC Chair Lina Khan.

Source: Ferra

