Adding Features/Functionality Always Decreases Security. Never Underestimate the Power of the Dark Side: Siri, Alexa, et al. Exploits
When new functionality is added, it does not merely add its own features; it also couples with other areas of the system, introducing bugs and new security risks. A very good example of this is described in "Researchers Demonstrate Subliminal Smart Device Commands That Have Potential for Malicious Attacks":
Researchers in the United States and China have been performing tests in an effort to demonstrate that "hidden" commands, those undetectable to human ears, can reach AI assistants like Siri and force them to perform actions their owners never intended. The research was highlighted in a piece today by The New York Times, suggesting that these subliminal commands can dial phone numbers, open websites, and carry out other potentially malicious actions if the technique falls into the wrong hands.
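To make the attack concrete: one published technique of this kind (the so-called "DolphinAttack" research) amplitude-modulates an ordinary spoken command onto an ultrasonic carrier above human hearing. The nonlinearity of a phone's microphone hardware demodulates the baseband speech, which the assistant then happily transcribes. The sketch below is only an illustration of the signal-processing idea, not the researchers' actual code; the sample rate, carrier frequency, and the 300 Hz tone standing in for recorded speech are all illustrative assumptions.

```python
import numpy as np

SAMPLE_RATE = 192_000  # high rate, needed to represent an ultrasonic carrier
CARRIER_HZ = 25_000    # above typical human hearing (~20 kHz)

def ultrasonic_am(command_audio: np.ndarray, depth: float = 1.0) -> np.ndarray:
    """Amplitude-modulate a normalized voice-command waveform onto an
    ultrasonic carrier. Humans cannot hear the result, but microphone
    nonlinearity recovers the baseband command for the speech recognizer."""
    t = np.arange(len(command_audio)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    # Classic AM: (1 + depth * m(t)) * carrier, with m(t) in [-1, 1]
    return (1.0 + depth * command_audio) * carrier

# Toy "command": a 1-second 300 Hz tone standing in for recorded speech
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = np.sin(2 * np.pi * 300 * t)
signal = ultrasonic_am(command)

# All of the transmitted energy sits near 25 kHz, i.e. it is inaudible:
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / SAMPLE_RATE)
peak_hz = freqs[np.argmax(spectrum)]
print(round(peak_hz))  # → 25000
```

The point of the sketch is the spectrum: the audible command ends up as sidebands around 25 kHz, so a bystander hears nothing while the phone's microphone effectively hears speech.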
What if 10,000 people in Times Square carrying smartphones all dialed 911 at the same time, triggered by an exploit broadcast over some loudspeaker system? Chaos could result from that and any number of imaginable scenarios. It would also be a good way to make money by driving phone calls to premium-rate numbers.
Privacy is closely related to security; it demands strong security. It seems to me that a core premise of privacy is not breaking security, which makes Tim Cook’s comments on privacy nice for tea and crumpets and MSNBC interviews, but off target: there is no privacy if Apple breaks your security, nor does giving in to government pressure inspire confidence in his or Apple’s commitment and integrity to principle. To use an analogy: either you believe stealing is wrong each and every time, or you don’t. It’s that simple, though in this muddled day and age, rationalizations abound for attacking principled stances in all sorts of vicious ways.
I turn Apple’s Siri OFF, but Apple disrespects my choice (in my reality, off is OFF; I want the cord unplugged, so to speak). Yet Siri kicks in even when turned off. It did so just yesterday while the phone was in my pocket, for no apparent reason, and I had not touched it for five hours. Ditto for being out fishing high in the mountains and having music start to play at some random time.
A strong distrust of technology is your best first line of defense—disable every feature you don’t absolutely need, that is, if it is even possible to do so (not with Siri, not completely).
The foregoing should raise alarm bells given the article above, especially with smartphones starting to tie into locks, alarms, bank accounts, cryptocurrency, etc. Or... what if a subliminal command is set up to tell Siri (or Alexa et al.) to send email to a known terrorist or a child-porn site, say, in order to blackmail someone? That seems like something political operatives could find useful for smearing an opponent prior to an election.