
Attempted voice phishing against LastPass thwarted


LastPass has disclosed that it was unsuccessfully targeted by a voice phishing attack that used deepfake audio to spoof CEO Karim Toubba, according to BleepingComputer.

Attackers sent a LastPass employee several calls and text messages, as well as a voicemail containing the deepfake audio, which was potentially generated from publicly available recordings of Toubba. The messages arrived via WhatsApp, a channel the password manager firm does not use for business communications, said LastPass intelligence analyst Mike Kosak.

"…[O]ur employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally," Kosak added.

The disclosure comes days after the Department of Health and Human Services warned healthcare and public health organizations across the U.S. about an ongoing social engineering campaign that uses AI voice cloning against their IT help desks in an attempt to obtain multi-factor authentication codes and infiltrate their networks.
