Woman divorces husband after ChatGPT reads his coffee grounds and predicts affair
(www.techspot.com)
This is about as psychotic as governments using AI to predict crimes.
Not necessarily. I think that predicting crime using public information can be beneficial. It just shouldn't be invasive or biased, and shouldn't be used in court to justify a warrant or arrest.
I agree with you, but:
Those are all things that will soon happen whether we like it or not.
Let's not kid ourselves: compiling "publicly available information" is itself invasive and a violation of privacy.
We have corporations that have effectively set up mass surveillance networks and call it "adtech".
There is an entire economy surrounding "publicly available information". There are corporations that act as data brokers, and people-search websites that compile far too much sensitive information about private individuals. Newspapers systematically report on events concerning private individuals that aren't really of public interest, e.g. arrest records, and those articles hang around forever even if the arrest never results in a conviction or the record is expunged.
If this were employed by the government or law enforcement, it would absolutely include data that extends far beyond the reach of publicly available information, and it's worth pointing out that the US already has a mass surveillance network in the form of the NSA/PRISM.
There is zero way you could convince me that AI, prone as it is to hallucination, is well suited to predicting crime or identifying criminals. Even if it didn't hallucinate, it still wouldn't be possible to predict crime, only to anticipate that one might occur. We aren't 2D characters following a script; anything can happen.
Law enforcement is already plenty unhinged. Let's not cheerlead adding tools to their arsenal that feed that psychosis.
You are aware that "it shouldn't be invasive or biased" automatically rules out every AI, right?
I'm speaking about the concept, not the currently available AI.