Fake voices 'help cyber-crooks steal cash'

A security firm says deepfaked audio is being used to steal millions of pounds.
Symantec said it had seen three cases of seemingly deepfaked audio of different chief executives used to trick senior financial controllers into transferring cash.
Deepfakes use artificial intelligence to create convincing fake footage.
The AI system could be trained using the "huge amount" of audio the average chief executive would have innocently made available, Symantec said.
Corporate videos, earnings calls and media appearances, as well as conference keynotes and presentations, would all be useful for fakers looking to build a model of someone's voice, chief technology officer Dr Hugh Thompson said.
"The model can probably be almost perfect," he said.

The attackers had also used background noise to cleverly mask the least convincing syllables and words.
"Really," said Dr Thompson, "who would not fall for something like that":[]}