This gambit is called "predatory inclusion." Think of Spike Lee shilling cryptocurrency scams as a way to "build Black wealth" or Mary Kay promising to "empower women" by embroiling them in a bank-account-draining, multi-level marketing cult. Having your personal, intimate secrets sold, leaked, published or otherwise exploited is worse for your mental health than not getting therapy in the first place, in the same way that having your money stolen by a Bitcoin grifter or Mary Kay is worse than not being able to access investment opportunities in the first place.
But it's not just people struggling with their mental health who shouldn't be sharing sensitive data with chatbots – it's everyone. All those business applications that AI companies are pushing, the kind where you entrust an AI with your firm's most commercially sensitive data? Are you crazy? These companies will not only leak that data, they'll sell it to your competition. Hell, Microsoft already does this with Office365 analytics:
https://pluralistic.net/2021/02/24/gwb-rumsfeld-monsters/#bossware

These companies lie all the time about everything, but the thing they lie about most is how they handle sensitive data. It's wild that anyone has to be reminded of this. Letting AI companies handle your sensitive data is like turning arsonists loose in your library with a can of gasoline, a book of matches, and a pinky-promise that this time, they won't set anything on fire.