Inside the world of extreme-privacy consultants, who, for the right fee, will make you and your personal information very hard to find.
...interest from this group has risen since the killing of UnitedHealthcare CEO Brian Thompson last year.
Privacy has become a privilege of the rich.
A localhost-port abuse on Android allowed Meta and Yandex to attach persistent identifiers to detailed browsing histories.
https://news.ycombinator.com/item?id=44169115
Follow-up: Meta pauses mobile port tracking tech on Android after researchers cry foul:
https://www.theregister.com/2025/06/03/meta_pauses_android_tracking_tech/
Startups are deploying employee tracking tools in low-regulation markets with help from Silicon Valley venture capital.
Technologies that promise to track, manage, and supervise workers, increasingly using artificial intelligence, are getting entrenched in the developing world, according to a new report by Coworker.org, a labor rights nonprofit based in New York.
Audits of more than 150 startups and regional companies based in Kenya, Nigeria, Colombia, Brazil, Mexico, and India showed workplace surveillance is expanding in scale and sophistication, the researchers said.
FTC’s “entire” monopoly case rests on decade-old emails, Meta argued.
Zuckerberg suggested that Facebook could buy Instagram to "neutralize a potential competitor".
Boarding passes and check-in could be scrapped in air travel shake-up (The Guardian)
Facial recognition and a ‘journey pass’ stored on passengers’ phones are part of UN-backed plans to digitise air transport
A privacy nightmare: what about people who don't have smartphones? And facial recognition is being rolled out seemingly with little to no consideration for security and privacy.
...computer vision papers often refer to human beings as “objects,” a convention that both obfuscates how common surveillance of humans is in the field, and objectifies humans by definition.
“The studies presented in this paper ultimately reveal that the field of computer vision is not merely a neutral pursuit of knowledge; it is a foundational layer for a paradigm of surveillance...”
Exclusive: Algorithms allegedly being used to study data of thousands of people, in project critics say is ‘chilling and dystopian’
This gambit is called "predatory inclusion." Think of Spike Lee shilling cryptocurrency scams as a way to "build Black wealth" or Mary Kay promising to "empower women" by embroiling them in a bank-account-draining, multi-level marketing cult. Having your personal, intimate secrets sold, leaked, published or otherwise exploited is worse for your mental health than not getting therapy in the first place, in the same way that having your money stolen by a Bitcoin grifter or Mary Kay is worse than not being able to access investment opportunities in the first place.
But it's not just people struggling with their mental health who shouldn't be sharing sensitive data with chatbots – it's everyone. All those business applications that AI companies are pushing, the kind where you entrust an AI with your firm's most commercially sensitive data? Are you crazy? These companies will not only leak that data, they'll sell it to your competition. Hell, Microsoft already does this with Office365 analytics:
https://pluralistic.net/2021/02/24/gwb-rumsfeld-monsters/#bossware
These companies lie all the time about everything, but the thing they lie most about is how they handle sensitive data. It's wild that anyone has to be reminded of this. Letting AI companies handle your sensitive data is like turning arsonists loose in your library with a can of gasoline, a book of matches, and a pinky-promise that this time, they won't set anything on fire.