NEWS


Deepfake technology is being used in scams: What you need to know

Deepfake technology is increasingly being used in scams: AI voice-cloning apps can mimic the voices of loved ones to deceive victims into sending money. Despite the growing sophistication of these scams, there are currently no federal laws preventing unauthorized voice cloning. Consumer Reports highlights the lack of consent safeguards in popular voice-cloning apps and recommends protective measures such as enabling two-factor authentication, treating suspicious calls and messages with caution, and applying common sense to guard against these scams.