Boarding passes and check-in could be scrapped in air travel shake-up | Air transport | The Guardian
Facial recognition and a ‘journey pass’ stored on passengers’ phones are part of UN-backed plans to digitise air transport
A privacy nightmare: what about people who don't have smartphones? And facial recognition appears to be rolled out with little to no consideration for security or privacy.
By Emily. tl;dr: Every time you describe your work that involves statistical modeling as "AI", you are lending your power and credibility to Musk and DOGE.
Calling it "AI" is fast becoming yet another kind of anticipatory obedience.
If what you are doing is sensible and grounded science, there is undoubtedly a more precise way to describe it, one that bolsters rather than undermines the interest and value of your research. Statistical modeling of protein folding, weather patterns, hearing aid settings, etc. really has nothing in common with the large language models that are the primary focus of "AI".
Transcript of: https://www.youtube.com/watch?v=eK0md9tQ1KY
It’s a way to make certain kinds of automation sound sophisticated, powerful, or magical and as such it’s a way to dodge accountability by making the machines sound like autonomous thinking entities rather than tools that are created and used by people and companies.
With 7 key questions to ask of an automation technology.
Colossal Biosciences claims it created [extinct] dire wolves, but scientists outside the company are skeptical.
IMO, whether the debate is about whether the cloned animals are really dire wolves, about the VC funding, or about the impressive tech, these arguments risk distracting us from the underlying problems in biodiversity conservation.
See also:
https://assemblag.es/@theluddite/114302525871514316
Rest of World’s global tracker found that AI was used more for memes and campaign content than mass deception in the 2024 elections...
...Global elections saw artificial intelligence used for playful memes and serious misinformation, revealing a complex landscape where tech’s impact is nuanced, not catastrophic.
The focus on AI's impact on elections is distracting us from some deeper and longer-lasting threats to democracy.
This gambit is called "predatory inclusion." Think of Spike Lee shilling cryptocurrency scams as a way to "build Black wealth" or Mary Kay promising to "empower women" by embroiling them in a bank-account-draining, multi-level marketing cult. Having your personal, intimate secrets sold, leaked, published or otherwise exploited is worse for your mental health than not getting therapy in the first place, in the same way that having your money stolen by a Bitcoin grifter or Mary Kay is worse than not being able to access investment opportunities in the first place.
But it's not just people struggling with their mental health who shouldn't be sharing sensitive data with chatbots – it's everyone. All those business applications that AI companies are pushing, the kind where you entrust an AI with your firm's most commercially sensitive data? Are you crazy? These companies will not only leak that data, they'll sell it to your competition. Hell, Microsoft already does this with Office365 analytics:
https://pluralistic.net/2021/02/24/gwb-rumsfeld-monsters/#bossware

These companies lie all the time about everything, but the thing they lie most about is how they handle sensitive data. It's wild that anyone has to be reminded of this. Letting AI companies handle your sensitive data is like turning arsonists loose in your library with a can of gasoline, a book of matches, and a pinky-promise that this time, they won't set anything on fire.
Nature article about Replika AI companion:
https://www.nature.com/articles/s44184-023-00047-6