And it is not good news.
So, it is looking like all the stuff in the sci-fi movies is coming to pass. The X-Files had a sort of funny, sort of scary episode about these things running amok. One projected use is that your little friend will hear you listening to music and proceed to order and download more tracks for you. I don't want to be the old lady naysayer, but as long as nefarious types are out there, caution might be prudent.
https://www.zerohedge.com/news/2018-05-14/ultrasonic-attacks-can-trigger-alexa-siri-hidden-commands-raise-serious-security
Over the last two years, academic researchers have identified various methods of transmitting hidden commands, undetectable by the human ear, to Apple’s Siri, Amazon’s Alexa, and Google’s Assistant.
According to a new report from The New York Times, researchers have been able “to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites.” This could, perhaps, allow cybercriminals to unlock smart-home doors, control a Tesla via its app, access users’ online bank accounts, load malicious browser-based cryptocurrency-mining websites, and/or access all sorts of personal information.
Recently, the ultrasonic attack technology showed up in the hands of Chinese researchers. Teams at Princeton University and China’s Zhejiang University conducted several experiments showing that inaudible commands can, in fact, trigger the voice-recognition system on an iPhone.
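For what it's worth, the published research behind these stories (the Zhejiang work is often called "DolphinAttack") generally relies on amplitude-modulating an ordinary voice command onto a carrier above roughly 20 kHz; nonlinearities in a phone's microphone hardware demodulate it back into the audible band, where the assistant's speech recognizer hears it as a normal command. Below is a rough NumPy/SciPy sketch of just that modulation step, for illustration only; the file names, sample rate, and 25 kHz carrier are assumptions, not values from the papers.
```python
import numpy as np
from scipy.io import wavfile

# Illustrative sketch only: amplitude-modulate a recorded voice command onto
# an ultrasonic carrier, the basic idea behind the "inaudible command" attacks.
# File names, output sample rate, and the 25 kHz carrier are assumptions.

CARRIER_HZ = 25_000      # above typical human hearing (~20 kHz)
TARGET_RATE = 192_000    # high sample rate needed to represent ultrasound

rate, command = wavfile.read("voice_command.wav")   # hypothetical input file
if command.ndim > 1:                                # keep one channel if stereo
    command = command[:, 0]
command = command.astype(np.float64)
command /= np.max(np.abs(command))                  # normalize to [-1, 1]

# Resample the command to the high output rate (simple linear interpolation).
duration = len(command) / rate
t = np.arange(int(duration * TARGET_RATE)) / TARGET_RATE
command_hi = np.interp(t, np.arange(len(command)) / rate, command)

# Classic AM: carrier * (1 + m * message). A nonlinear microphone front end
# effectively recreates the baseband voice signal from this waveform.
m = 0.8                                             # modulation depth
carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
ultrasonic = carrier * (1 + m * command_hi)
ultrasonic /= np.max(np.abs(ultrasonic))

wavfile.write("ultrasonic_out.wav", TARGET_RATE, (ultrasonic * 32767).astype(np.int16))
```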