Would you let a stranger eavesdrop in your home and keep the recordings? For most people, the answer is, "Are you crazy?"
Yet that's essentially what Amazon has been doing to millions of us with its assistant Alexa in microphone-equipped Echo speakers. And it's hardly alone: Bugging our homes is Silicon Valley's next frontier.
Many smart-speaker owners don't realize it, but Amazon keeps a copy of everything Alexa records after it hears its name. Apple's Siri, and until recently Google's Assistant, by default also keep recordings to help train their artificial intelligences.
So come with me on an unwelcome walk down memory lane. I listened to four years of my Alexa archive and found thousands of fragments of my life: spaghetti-timer requests, joking houseguests and random snippets of "Downton Abbey." There were even sensitive conversations that somehow triggered Alexa's "wake word" to start recording, including my family discussing medication and a friend conducting a business deal.
You can listen to your own Alexa archive here. Let me know what you unearth.
For as much as we fret about snooping apps on our computers and phones, our homes are where the rubber really hits the road for privacy. It's easy to rationalize away concerns by thinking a single smart speaker or appliance couldn't know enough to matter. But across the increasingly connected home, there's a brazen data grab going on, and there are few regulations, watchdogs or common-sense practices to keep it in check.
Let's not repeat the mistakes of Facebook in our smart homes. Any personal data that's collected can and will be used against us. An obvious place to begin: Alexa, stop recording us.
A sensitive word
"Eavesdropping" is a sensitive word for Amazon, which has battled lots of consumer confusion about when, how and even who is listening to us when we use an Alexa device. But much of this problem is of its own making.
Alexa keeps a record of what it hears every time an Echo speaker activates. It's supposed to record only after hearing its "wake word" - "Alexa!" - but anyone with one of these devices knows they go rogue. I counted dozens of times when mine recorded without a legitimate prompt. (Amazon says it has improved the accuracy of "Alexa" as a wake word by 50 percent over the past year.)
What can you do to stop Alexa from recording? Amazon's answer is straight out of the Facebook playbook: "Customers have control," it says - but the product's design clearly isn't meeting our needs. You can manually delete past recordings if you know exactly where to look and remember to keep going back. You cannot actually stop Amazon from making these recordings, aside from muting the Echo's microphone (defeating its main purpose) or unplugging the darn thing.
Amazon founder and chief executive Jeff Bezos owns The Washington Post, but I review all tech with the same critical eye.
Amazon says it keeps our recordings to improve products, not to sell them. (That's also a Facebook line.) But anytime personal data sticks around, it's at risk. Remember the family that had Alexa accidentally send a recording of a conversation to a random contact? We've also seen judges issue warrants for Alexa recordings.
Alexa's voice archive made headlines most recently when Bloomberg discovered Amazon employees listen to recordings to train its artificial intelligence. Amazon acknowledged some of those employees also have access to location information for the devices that made the recordings.
Saving our voices is not just an Amazon phenomenon. Apple, which is much more privacy-minded in other aspects of the smart home, also keeps copies of conversations with Siri. Apple says voice data is assigned a "random identifier and is not linked to individuals" - but exactly how anonymous can a recording of your voice be? I don't understand why Apple doesn't give us a way to tell it not to store our recordings.
The unexpected leader on this issue is Google. It also used to record all conversations with its Assistant, but last year quietly changed its defaults to not record what it hears after the prompt "Hey, Google." But if you're among the people who previously set up Assistant, you probably need to readjust your settings (check here) to "pause" recordings.
I'm not the only one who thinks saving recordings is too close to bugging. Last week, the California State Assembly's privacy committee advanced an Anti-Eavesdropping Act that would require makers of smart speakers to get consent from customers before storing recordings. The Illinois Senate recently passed a bill on the same issue. Neither is much of a stretch: Requiring permission to record someone in private is enshrined in many state laws.
"They are giving us false choices. We can have these devices and enjoy their functionality and how they enhance our lives without compromising our privacy," Assemblyman Jordan Cunningham, R, the bill's sponsor, told me. "Welcome to the age of surveillance capitalism."
Personal data isn't that personal
Inspired by what I found in my Alexa voice archive, I wondered: What other activities in my smart home are tech companies recording?
I found enough personal data to make even the East German secret police blush.
When I'm up for a midnight snack, Google knows. My Nest thermostat, made by Google, reports data back to its servers in 15-minute increments about not only the climate in my house, but also whether there's anyone moving around (as determined by a presence sensor used to trigger the heat). Nest saves that data indefinitely unless you delete your account.
Then there are lights, which can reveal what time you go to bed and when you do almost anything else. My Philips Hue-connected lights track every time they're switched on and off - data the company keeps forever if you connect to its cloud service (which is required to operate them with Alexa or Assistant).
Every kind of appliance is now becoming a data-collection device. My Chamberlain MyQ garage opener lets the company keep - again, indefinitely - a record of every time my door opens or closes. My Sonos speakers, by default, track what albums, playlists or stations I've listened to, and when I press play, pause, skip or pump up the volume. At least they only hold on to my sonic history for six months.
And now the craziest part: After quizzing these companies about data practices, I learned most are sharing what's happening in my home with Amazon, too. Our data is the price of entry for devices that want to integrate with Alexa. Amazon's not only eavesdropping - it's tracking everything happening in your home.
Amazon acknowledges it collects data about third-party devices even when you don't use Alexa to operate them. It says Alexa needs to know the "state" of your devices "to enable a great smart home experience." But keeping a record of this data is more useful to them than to us. (A feature called "hunches" lets you know when a connected device isn't in its usual state, such as a door that's not locked at bedtime, but I've never found it helpful.) You can tell Amazon to delete everything it has learned about your home, but you can't look at it or stop Amazon from continuing to collect it.
Google Assistant also collects data about the state of connected devices. But the company says it doesn't store the history of these devices, even though there doesn't seem to be much stopping it.
Apple does the most admirable job operating home devices by collecting as little data as possible. Its HomeKit software doesn't report to Apple any info about what's going on in your smart home. Instead, compatible devices talk directly, via encryption, with your iPhone, where the data stays.
Free for all
Why do tech companies want to hold on to information from our homes? Sometimes they do it just because there's little stopping them - and they hope it might be useful in the future.
Ask the companies why, and the answer usually involves AI.
"Any data that is saved is used to improve Siri," Apple said.
"Alexa is always getting smarter, which is only possible by training her with voice recordings to better understand requests, provide more accurate responses, and personalize the customer experience," Beatrice Geoffrin, director of Alexa privacy, said in a statement. The recordings also help Alexa learn different accents and understand queries about recurring events such as the Olympics, she said.
Noah Goodman, a computer science and psychology professor at Stanford University, told me it's true that AI needs data to get smarter.
"Technically, it is not unreasonable what they are saying," he said. Today's natural-language-processing systems need to rerun their algorithms over old data to learn. Without easy access to that data, their progress might slow - unless computer scientists make their systems more efficient.
But then he took his scientist hat off. "As a human, I agree with you. I don't have one of these speakers in my house," Goodman said.
We want to benefit from AI that can set a timer or save energy when we don't need the lights on. But that doesn't mean we should also open our homes to tech companies as a lucrative source of data to train their algorithms, mine our lives and maybe lose in the next big breach. This data should belong to us.
What we lack is a way to understand the transformation that data and AI are bringing to our homes.
Think of "Downton Abbey": In those days, rich families could have human helpers who were using their intelligence to observe and learn their habits, and make their lives easier. Breakfast was always served exactly at the specified time. But the residents knew to be careful about what they let the staff see and hear.
Fast-forward to today. We haven't come to terms with the fact that we're filling our homes with even nosier digital helpers. Said Goodman: "We don't think of Alexa or the Nest quite that way, but we should."