AI Impersonation

Introduction

CEDAR RAPIDS, Iowa (KCRG) – An expert warns that AI technology is being used to impersonate individuals for scamming purposes.

Understanding Deepfakes

Scammers are using AI to create deepfakes: manipulated audio or video that superimposes one person's face or voice onto another's. According to DeepMedia, a deepfake detection service, more than 500,000 deepfake videos were shared on social media in 2023.

The Growing Threat

Dan Tuuri, a cybersecurity expert in eastern Iowa, says that as the technology becomes more accessible, these videos are becoming both more common and harder to detect.

“We’re seeing an evolution in AI technology where individuals can completely mimic another person’s voice,” Tuuri said. “It typically requires a sample of just 30 to 90 seconds to replicate that persona convincingly.”

Impact on Society

According to CPI OpenFox, a firm that provides data and technology to law enforcement, the number of deepfake videos is projected to double every six months.

Tuuri warns that the technology can be exploited to scam individuals or extract personal information.

“The financial incentive is significant,” he noted. “An attacker can persuade someone to take actions that lead to fund transfers or the sharing of sensitive corporate information.”

Protecting Yourself

He advises taking precautions against being deceived, even by messages that appear to come from family members. One safeguard is agreeing on a verification question in advance.

“If I receive something suspicious from my wife, who is gardening this summer, I might ask, ‘What did you pick out of the garden last night?’ That’s our cue,” Tuuri explained.

Conclusion

Deepfakes can also produce misleading videos of celebrities or politicians, which could pose challenges during this year’s contentious presidential election. Tuuri encourages everyone to verify information encountered on social media by consulting reliable news sources and a politician’s official channels.

Copyright 2024 KCRG. All rights reserved.