Apple Faces $95 Million Settlement Over Siri Privacy Violations
In a significant legal development, Apple Inc. has agreed to a $95 million settlement in a class-action lawsuit alleging that private conversations were inadvertently recorded by the digital assistant Siri and subsequently listened to by third-party contractors. The case has drawn considerable attention for its implications for user privacy and data security, underscoring the need for technology companies to safeguard sensitive information.
If U.S. District Judge Jeffrey White approves the proposed settlement, filed Tuesday in federal court in Oakland, California, affected users will be eligible to receive up to $20 per Siri-equipped Apple device, including popular devices like the iPhone and Apple Watch. The compensation responds to concerns users raised about the privacy of their interactions with Siri.
Details of the Lawsuit: Unintentional Activation of Siri and Privacy Breaches
The crux of the lawsuit revolves around user complaints about the unintentional activation of Siri. A 2019 whistleblower report published by The Guardian revealed that Apple contractors were listening to voice recordings during quality control checks. The recordings included deeply personal content, such as “confidential medical information, drug deals, and recordings of couples having sex,” raising serious ethical questions about user privacy. Although Siri is designed to activate only upon hearing the trigger phrase “Hey Siri,” there have been multiple reports of it being triggered by unintended sounds, including zippers and casual conversation.
Implications of Siri’s Recording Practices on User Trust
Numerous Apple users reported that their private conversations were recorded and used by third-party advertisers. Many saw targeted advertisements for products mentioned during private discussions, including medical treatments discussed with their healthcare providers. In light of these concerns, Apple issued a formal apology and pledged to stop saving voice recordings. The incident has sparked broader discussions about user trust and the ethical responsibilities of tech companies in managing personal data.
How to Claim Your Share of the Settlement: A Step-by-Step Guide
The lawsuit covers the period from September 17, 2014, to December 31, 2024. To participate in the settlement, Apple users must submit a claim for up to five Siri-equipped Apple devices, which can include the iPhone, iPad, Apple Watch, MacBook, iMac, HomePod, iPod touch, or Apple TV. Claimants must also swear under oath that they unintentionally activated Siri during a conversation they intended to keep confidential, a requirement designed to ensure only legitimate claims are processed.
Wider Privacy Concerns: The Voice Assistant Landscape
Apple is not alone in facing scrutiny over privacy violations related to voice assistants. Google is currently embroiled in a similar class-action lawsuit concerning the unintentional activation of Google Assistant without its designated wake words. This growing trend underscores the urgent need for companies to prioritize user privacy and transparency in their technologies. As voice assistants become increasingly integrated into daily life, users must remain vigilant about how their data is being handled, pushing for greater accountability from tech giants.