Companies DO analyze what you say to smart speakers, but only after you've said the wake word (“OK Google,” “Hey Siri,” “Alexa,” etc.), or if they mistake something like “OK to go” for “OK Google.” I am not aware of a single reputable source claiming smart speakers are always listening.
The reality is that analyzing a constant stream of audio is far less efficient and accurate than simply profiling users based on information such as internet usage, purchase history, political leanings, etc. If you’re interested in online privacy, device fingerprinting is a fascinating topic for understanding how companies can determine exactly who you are based solely on information about your device. From there they use web tracking to determine what your interests are, who you associate with, how you spend your time, what your beliefs are, how you can be influenced, and so on.
Your smart speaker isn’t constantly listening because it doesn’t need to. There are far easier ways to build a more accurate profile on you.
A recent study found these devices incorrectly activate roughly 80 times per day on average.
It’s literally impossible for them not to be “analyzing” all the sounds they (perhaps briefly) record.
[Sound] --> [Record] --> [Analyze for keyword] --> [Perform keyword action] OR [Delete recording]
Literally all sounds, literally all the time. And we just trust that they delete them and don’t send them “anonymized” to be used for training the audio recognition algorithms or LLMs.
The way “Hey Alexa” or “Hey Google” works is, like you said, by constantly analyzing the sounds the device hears. However, this analysis happens only locally, looking for the specific phrase, and the audio is stored in a circular buffer a few seconds long so the device can keep your whole request in memory. If the phrase is not detected, the buffer is continuously overwritten and nothing is sent to the server. If the phrase is detected, the whole request is sent to the server, where more advanced voice recognition can be done.
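A minimal sketch of that circular-buffer behavior (all names and parameters here are hypothetical; real devices do this in dedicated low-power DSP hardware, not Python):

```python
import collections

# Hypothetical parameters for illustration only.
SAMPLE_RATE = 16000      # audio samples per second
BUFFER_SECONDS = 3       # keep only the last few seconds of audio

# A deque with maxlen acts as a circular buffer: once full,
# appending new samples silently drops the oldest ones, so
# nothing older than BUFFER_SECONDS ever survives on-device.
ring = collections.deque(maxlen=SAMPLE_RATE * BUFFER_SECONDS)

def on_audio_chunk(chunk, wake_word_detected, send_to_server):
    """Process one incoming chunk of microphone samples.

    `wake_word_detected` stands in for the local keyword model;
    `send_to_server` is only ever called after a detection.
    """
    ring.extend(chunk)  # old audio is overwritten automatically
    if wake_word_detected(ring):
        # Only now does any audio leave the device.
        send_to_server(list(ring))
        ring.clear()
```

The key property is that `send_to_server` is unreachable unless the local detector fires; everything else stays in the ever-overwritten buffer.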
You can fairly easily monitor the traffic from your smart speaker to verify this. So far I’ve seen no evidence that this has stopped being the common practice, though I’ll admit I haven’t read the article, so maybe it has changed recently.
It’s been reported by multiple outlets at this point that this happens because of detected proximity. Basically, they know who you hang out with based on where your phones are, and they know the entire search history of everyone you interact with. Based on this, they can build models to predict how likely you are to be interested in something a friend has looked at before.
Imagine if there was some technology where you could search for things and it would show you information about it
Following an investigation by Bloomberg, the company admitted that it had been employing third-party contractors to transcribe the audio messages that users exchanged on its Messenger app.
So not your IRL conversations.
There is no indication that Facebook has used the information it collected to sell ads.
So not for ads.
It says the opposite of the things you claimed.