AI Voice Cloning is on the Rise
Artificial Intelligence (AI) is a tool that can be used for beneficial purposes, but it can also be used by scammers to cause harm. Approximately one in ten adults report having been targeted with a robocall claiming a family emergency and that a loved one needs money. These calls, often called AI scam calls, are set up through voice cloning: all a scammer needs is an audio clip of someone's voice, often found online, which they upload into a program that replicates the voice.
The Federal Trade Commission (FTC) recommends the following course of action if you receive a concerning call claiming that a loved one is in trouble.
– Call the person back at their regular phone number and verify the story.
– If you can’t reach the person, try to get in touch with them through family members or mutual friends.
– If the caller asks for money through channels that are hard to trace, such as cryptocurrency, gift cards, or wire transfers, recognize that as a sign of a scam and end the call immediately.
– If you spot a scam, report it to the FTC at ReportFraud.ftc.gov.
We suggest setting up a safe word with your loved ones that can be used in the event of a real emergency. Guard Well Identity Theft Solutions exists to protect you, your family, and your employees from the damages of identity theft. If you have any questions or concerns, please contact our Member Services team immediately. We are always available for you 24/7/365 at 888.966.4827 (GUARD).
Image courtesy credit: Israel Palacio via Unsplash.com