I’ll be honest with you. I have Alexa devices dotted around every room in my house, a Google Home setup in the kitchen, and an iPhone in my pocket. That’s before we even get into the AI apps I use daily for work. For years I’ve been vaguely aware that all of this stuff is “collecting data” in some way, but I’ve never properly sat down and worked out what that actually means. What are these devices hearing? What are they sending where? And more to the point, should I be worried about my kids’ conversations floating around in some server farm in Virginia?
So I did the homework. I read through the privacy policies, dug into the research, and pulled together everything that actually matters in plain English. No scaremongering, no tinfoil hats. Just a straight answer to the question: what are these things really doing with our data?
The short answer is: more than most people realise, but also less sinister than the worst headlines suggest. The long answer is below.
How Your Smart Assistant Actually Works (And What It’s Listening For)
Let’s start with the mechanics, because there’s a lot of confusion here.
Your Alexa, Google Home, or Siri device is not, according to the companies, actively recording every conversation in your home. What it’s doing is passively scanning audio for its wake word. “Alexa,” “Hey Google,” “Hey Siri.” It’s a bit like your name being called across a busy pub. You’re not listening to every conversation, but the moment you hear your name, your ears prick up.
The problem is that software is not perfect. These devices can mistake a similar-sounding word or phrase for a wake word, and when that happens, they start recording. Say something close enough to "Alexa" in normal conversation and you've triggered a recording you never intended to make. That's not a conspiracy. That's just imperfect voice recognition technology trying its best.
Once the wake word is detected, the audio is typically sent to the cloud, where the real processing happens. The command is analysed, a response is generated, and depending on which platform you’re using, various amounts of that data are retained. This is where the platforms differ quite significantly.
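To make the pipeline above concrete, here's a minimal conceptual sketch in Python. Everything in it is illustrative: `detect_wake_word` stands in for the on-device acoustic model, `send_to_cloud` stands in for the upload step, and real assistants work on raw audio frames rather than text. None of these names correspond to any actual Amazon, Google, or Apple API.

```python
# Conceptual sketch of a smart speaker's wake-word pipeline.
# All names here are hypothetical, for illustration only.

WAKE_WORD = "alexa"
THRESHOLD = 0.8  # confidence required to wake; set it lower and false triggers rise


def detect_wake_word(chunk: str) -> float:
    """Stand-in for the on-device model: a confidence score for this chunk."""
    if WAKE_WORD in chunk.lower():
        return 1.0
    if "alexis" in chunk.lower():  # similar-sounding word gets partial confidence
        return 0.6
    return 0.0


def send_to_cloud(chunks: list[str]) -> str:
    """Stand-in for cloud processing: only ever called after the wake word."""
    return f"uploaded {len(chunks)} chunk(s) for cloud processing"


def run_pipeline(stream: list[str]) -> list[str]:
    uploaded = []
    awake = False
    for chunk in stream:
        if not awake:
            # Passive scanning: audio is checked locally and then discarded.
            if detect_wake_word(chunk) >= THRESHOLD:
                awake = True  # ears prick up; recording starts
        else:
            # Everything after the wake word leaves the house.
            uploaded.append(chunk)
    if uploaded:
        print(send_to_cloud(uploaded))
    return uploaded


run_pipeline([
    "lovely weather today",       # scanned locally, never uploaded
    "alexa what's the forecast",  # wake word detected
    "thanks",                     # now being recorded and sent to the cloud
])
```

The key point the sketch illustrates: before the wake word, audio is examined and thrown away on the device; after it, audio is retained and shipped off. The false-trigger problem lives entirely in that `THRESHOLD` comparison, and the March 2025 Alexa change discussed below effectively removed the option to keep the post-wake-word stage on-device.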
What Alexa, Google, and Siri Are Actually Collecting
This is the part that surprised me the most when I actually read through the research properly.
A Surfshark Research Centre study analysed 290 apps connected to over 400 smart home devices and looked at how many data points each platform collects out of a possible 32. Amazon’s Alexa collected 28 out of 32. Twenty-eight. That includes precise location, contact details, health-related data, photos, videos, and audio recordings, all linked to individual user profiles. Google Home collected 22 out of 32, which is still a substantial amount. Beyond voice recordings, Amazon Echo devices also continuously transmit device telemetry, including network information, usage patterns, feature usage frequency, and error logs. Crucially, users have no control over the granularity of what gets sent versus what stays local.
And here’s the development that caught my attention. In March 2025, Amazon quietly removed the “Do Not Send Voice Recordings” option from select Echo devices, including the Echo Dot 4th Gen, Echo Show 10, and Echo Show 15. This feature had previously allowed audio commands to be processed entirely on the device, meaning your voice never left your home. Amazon removed it to power Alexa+, its generative AI upgrade, stating that the new AI features rely on cloud processing power. Affected devices were automatically switched to a setting that deletes recordings, which is something, but local processing is now gone entirely.
Apple’s Siri presents a noticeably different approach. Apple states it has never used Siri data to build marketing profiles, never made it available for advertising, and never sold it to anyone. Siri requests are not associated with your Apple Account. Instead, a random identifier is used during processing, which Apple describes as unique among digital assistants. By default, Apple does not retain audio recordings of Siri interactions. For more complex queries that require cloud processing, Apple uses something called Private Cloud Compute, where your data is used only to fulfil the request and is not stored or made accessible to Apple.
That said, a word of caution. Apple settled a class action lawsuit for $95 million related to allegations that Siri had secretly recorded users. No admission of wrongdoing was made, and it was a US case, but it does add a degree of nuance to Apple’s otherwise polished privacy narrative.
The Real-World Risks for Families
So what does all of this actually mean in practice? Let me break it down without overdramatising it.
The first risk is accidental recording. It’s more common than most people think. If you’ve ever had your Echo light up in the middle of a conversation, you’ve seen it happen. Most of these clips are mundane and meaningless. But the idea that fragments of family conversations could be retained in the cloud is, at minimum, worth being aware of.
The second risk is data linkage. It’s not just that Alexa records your voice. It’s that she also knows your shopping habits, your routines, which smart devices you control, what time you leave the house, and which third-party apps you’ve connected. Individually, none of those things are particularly sensitive. Put them together and they form a fairly detailed picture of your life.
The third risk is the wider ecosystem. Every time you grant a third-party Alexa Skill, Google Action, or iOS app access to your assistant, you’re potentially extending the data pipeline to another company with its own privacy policy. Many people never check those.
For families with kids, the concern is amplified. Survey data suggests that around 60% of smart assistant users are concerned about their voice recordings being listened to by someone else. And yet a majority use one anyway, and over half have never read the privacy policy or terms and conditions. I include myself in at least two of those categories, if I’m being honest.
Practical Steps to Actually Protect Your Family
The good news is there are sensible, straightforward things you can do. None of them require you to throw your Echo in the bin.
First, review and delete your voice history regularly. On Alexa, go into the Alexa app, select More, then Settings, then Alexa Privacy, and you can review and delete recordings by date or by category. You can also set automatic deletion. Google has a similar option in the Google Home app under your profile icon, then Assistant Settings, then Your Data in the Assistant. While you’re there, uncheck “Include audio recordings” to stop Google storing your voice clips.
Second, mute your devices when you don’t need them. All of the major smart speakers have a physical mute button. It feels almost too simple, but a physically muted microphone cannot record anything. I use this more than I used to, particularly in the evenings when we’re sitting around the dinner table or having conversations I’d rather keep in the room.
Third, audit your connected apps and Skills. Go through whatever you’ve connected to Alexa or Google and remove anything you’re not actively using. Every disconnected third party is one fewer company with access to your data.
Fourth, check your router’s privacy settings and keep your mesh network firmware updated. If you’re running a mesh system like the Deco range, make sure you know what telemetry it’s sending back and keep it current.
Finally, if privacy is a primary concern, lean into Siri where possible. It’s not perfect, as the lawsuit settlement shows, but the architecture is genuinely designed with more on-device processing in mind, particularly on newer Apple hardware.
Platform Privacy Comparison
| Platform | Data Points Collected (out of 32) | On-Device Processing | Audio Retained by Default | Linked to User Profile |
|---|---|---|---|---|
| Amazon Alexa | 28 | No (removed March 2025) | Yes (deletable) | Yes |
| Google Home | 22 | Partial | Yes (can opt out) | Yes |
| Apple Siri | Not ranked in study | Yes (where possible) | No | No (random ID used) |
Hype Cycle Check
LIKELY TO LAST: Privacy regulation will only tighten. The UK’s data protection framework, combined with growing consumer awareness, means platforms will be forced to offer clearer controls over time. The direction of travel is towards more transparency, even if progress is slow.
WATCH CLOSELY: Apple’s Private Cloud Compute model. If it genuinely delivers on its privacy promises at scale as Apple Intelligence expands, it could set a meaningful benchmark that pressures other platforms to follow. The proof will be in independent audits, not press releases.
VAPOURWARE RISK: The idea that “privacy mode” settings on any of these platforms give you complete protection. Opting out of audio retention is a positive step, but device telemetry, usage patterns, and third-party integrations are largely outside user control. Complete privacy on a connected device is not currently a realistic option.
What This Means for CES 2027
The trajectory is clear. Every major AI assistant is moving deeper into your home, your car, and your wearables. By CES 2027, we should expect the privacy conversation to be front and centre, not as a fringe concern, but as a selling point. Apple has already started marketing on-device AI as a feature. Expect Amazon and Google to follow with their own versions of privacy-forward processing, partly because of regulatory pressure and partly because consumer trust is genuinely at risk. The interesting question for CES 2027 is whether any platform will offer a credible, independently verified privacy guarantee. That would be genuinely new.
Recommended on Amazon
These are affiliate links — if you buy through them, Tech Dads Life earns a small commission at no extra cost to you.
What to Watch
Amazon’s Alexa+ rollout. Now that local voice processing has been removed, Alexa+ is entirely cloud-dependent. Watch how Amazon handles consent and transparency as this rolls out more widely in the UK.
UK data protection enforcement. The ICO has been sharpening its focus on smart device data practices. Any enforcement action against a major platform in the UK would be significant.
Apple Intelligence expansion. As Apple Intelligence features reach more devices and more countries, the real-world performance of Private Cloud Compute will come under greater scrutiny.
Third-party Skill and Action auditing. Regulators in the EU and UK are beginning to look more closely at the data practices of the third-party apps that plug into major smart assistants. This could result in significant changes to how Skills and Actions are permitted to operate.
If this kind of practical, no-nonsense tech guidance is useful to you, come and join the Tech Dads Life newsletter. It lands in your inbox regularly and it’s free. Sign up at techdadslife.beehiiv.com and I’ll make sure you stay ahead of this stuff without having to spend your evenings reading privacy policies.

