Amazon Alexa can be hijacked via commands from its own speaker

Source: The Register – Security

This isn’t the artificial intelligence we were promised

Without a critical update, Amazon Alexa devices could wake themselves up and start executing audio commands issued by a remote attacker, according to infosec researchers at Royal Holloway, University of London.…
