(WHTM) — Artificial intelligence is making a lot of news, and for good reason — it can do amazing things. But it can also trick your eyes and ears.

Here is what to listen for, as concerns about AI grow every day.

Members of Congress just got to hear how good it is at replicating voices, which opens a whole bunch of doors for scammers.

United States Senator Richard Blumenthal opened an AI hearing with a deepfake of his own voice.


“You might have thought the voice was mine and the words were from me. But in fact that voice was not mine,” said Blumenthal.

But how concerned should you be with all this?

If you have older parents who tend to answer their phones, consider this: a security consultant showed us last week how easily they could be convinced that it is you calling.

“With as little as three seconds of audio you can clone someone’s voice,” said Dave Hatter of Intrust IT Security, who recorded John Matarese reading a line of text.

Then he typed a common phrase you’d hear from a scammer and had Matarese’s voice say it!

“If I got a voicemail from you, John, I would assume it was you talking to me,” Hatter said.

“I answered and it was one of the kids crying,” said Natalie Bruser who explained how a scammer’s call sounded like her daughter in trouble two years ago.


“I said, ‘Nicole, please calm down I need to hear your voice,'” she added.

If that scammer had had access to AI voice cloning, he could have grabbed her daughter’s voice from social media and made the scam even more convincing.

That’s why so many people are so concerned.

Warn your older relatives to be very suspicious of unexpected calls, even ones that seem to come from a loved one, so they don’t lose money to a scam.