What Your Smart Devices Know About You


With 1 in 6 Americans now owning voice-activated assistants, many of you tuned into the show this morning by saying, Alexa or Google, play NPR. These smart devices can order dinner delivered to your door. They are chatty and can entertain your kids with lame jokes like this...

COMPUTER-GENERATED VOICE: What was Bruce Wayne's favorite baby toy? The Batmobile.

GARCIA-NAVARRO: Or help you live your Star Trek fantasy by firing photon torpedoes.


GARCIA-NAVARRO: As the popularity of voice-activated assistants grows, so do concerns that Alexa and Google Home are doing more than just ordering takeout. Joining us from member station WBHM is Brian Barrett, who writes for WIRED magazine. Good morning.

BRIAN BARRETT: Good morning.

GARCIA-NAVARRO: And also joining us is Echo. Echo, good morning.


GARCIA-NAVARRO: All right, let's start with how these devices work. And this is for you, Brian. Are they always listening?

BARRETT: The device itself is always listening. There are microphones on there that are picking up everything that you say. But it's not until you say a wake word, which you can set on the device - it can be Alexa, or it could be Echo, as we just showed. For Google Home, it's usually, OK, Google or, hey, Google. And only when you say that wake word does the device actually send what you're saying back to those companies' servers. So only then does it connect to the Internet, and that's an important distinction.

I think a lot of the discomfort that people have is the idea that Google or Amazon themselves are collecting every single thing you say. That's not really true. It's sort of a passive listening until you say that word, and that's when they start actually collecting that data.
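Conceptually, the gating Barrett describes can be sketched as a simple loop: everything is heard locally, but nothing is forwarded until a wake word trips the switch. This is an illustrative sketch only - the names and logic here are hypothetical, not any vendor's actual implementation, and real devices match audio acoustically rather than as text.

```python
# Hypothetical sketch of wake-word gating: audio is processed locally,
# and only what follows a wake word would be sent to a vendor's servers.
# All names here are illustrative assumptions, not a real device API.

WAKE_WORDS = {"alexa", "echo", "ok google", "hey google"}

def forwarded_audio(frames):
    """Yield only the frames heard immediately after a wake word."""
    awake = False
    for frame in frames:
        if not awake:
            # Local-only matching: nothing leaves the device here.
            if frame.strip().lower() in WAKE_WORDS:
                awake = True
        else:
            # Only at this point would audio go to the cloud.
            yield frame
            awake = False  # simplified: one request per wake word

if __name__ == "__main__":
    heard = ["background chatter", "alexa", "play NPR", "more chatter"]
    print(list(forwarded_audio(heard)))  # ['play NPR']
```

The key property, and the "important distinction" Barrett draws, is that the ambient chatter before and after the wake word never enters the forwarded stream.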

GARCIA-NAVARRO: So let's give an example. Echo, are you listening?

COMPUTER-GENERATED VOICE: Hi, I'm here. I start listening when I hear the wake word.

GARCIA-NAVARRO: There you go. But these machines are also learning about us - aren't they? - our preferences, what we ask for. They're in our homes. That's how artificial intelligence works, and these devices are essentially AI-based.

BARRETT: So that's true. And the more data that you give it after the wake word, the more these companies will know about you. I think, though, I would say in the same way that the more often you search on Google for things, the more Google knows what ads to serve you. I think that it's partly discomforting because it's a new kind of that data collection, not that it's a totally different thing than we've experienced before. And people could be right to be uncomfortable with that if they don't want the same sort of tracking that happens and the same sort of data mining that happens online to sort of bleed into their real life, as well.

GARCIA-NAVARRO: But we've heard reports of Alexa suddenly starting to talk when the room was completely silent. It might flash a blue light, which means it's listening, even though I may not have said anything. What's that about?

BARRETT: When you ask these companies what is going on with that, they don't really have a great answer. The best that you can get out of them is, well...


BARRETT: (Laughter) Exactly.


BARRETT: The best you get out of them is that they are continually working to improve wake word technology, which is another way of saying that these machines aren't that smart yet, and they sometimes think they're hearing things even though they're not. The good news is, though, when that happens, you can go into your app, and you'll see a history of everything that it has heard. And they do give you an out - you can delete those recordings - if you really are uncomfortable with it, which a lot of people understandably might be, because otherwise they do just sort of keep that information on their servers indefinitely.

GARCIA-NAVARRO: Yeah. And prosecutors have subpoenaed Amazon and Google, looking for evidence that could help them prosecute crimes.

BARRETT: It's true. And I would say that the kind of data that these devices collect is fairly limited. You would have to have a very specific scenario in which that information would be useful in a criminal case. You would have to have said the wake word and then something incriminating.

GARCIA-NAVARRO: I'm curious - let's see what Echo says. Echo, I'm going to rob a bank.

COMPUTER-GENERATED VOICE: I'm not answering that.


BARRETT: So plausible deniability.

GARCIA-NAVARRO: Yeah, plausible deniability.

BARRETT: One good way to think about these devices - it's all just data. So in the same way that law enforcement often goes to Google or Amazon or Apple or anybody and says, I need to access what's on your cloud - I need to access this person's search history - those requests are often complied with. So it feels different because it's voice, but the fundamental principles are the same as in any other of these digital spheres.

GARCIA-NAVARRO: But couldn't hackers or law enforcement use these devices to eavesdrop on us?

BARRETT: Yes, someone could eventually hack a Google Home or an Alexa device. But it seems like there are easier ways to get that information. And even then, I think we sometimes imagine that what we're saying is of keen interest to a lot of people, when that's often not the case, at least in terms of the kinds of people who would be capable of accessing these devices. I think if the NSA is after you, you probably shouldn't keep a Google Home in your apartment. And you've probably got a lot of other things to worry about, too.

GARCIA-NAVARRO: I'm going to end with Echo. Echo, can we trust you?


GARCIA-NAVARRO: (Laughter) And there you go. Brian Barrett - he writes for Wired. Thank you very much.

BARRETT: Thanks so much.

GARCIA-NAVARRO: Echo, thank you very much.


Copyright © 2018 NPR.