Talking with an AI model is like talking with that one friend who's always high and thinks they know everything. But they have a wide enough range of interests that they can actually piece together an idea, most of the time wrong, about any subject.
People Twitter
People tweeting stuff. We allow tweets from anyone.
I am sorry to say I can frequently be this friend...
Isn't this called "the Joe Rogan experience"?
One thing I have found it to be useful for is changing the tone of what I write.
I tend to write very clinically because my job involves a lot of that style of writing. I have started asking ChatGPT to rephrase what I write in a softer tone.
Not for everything, but for example when I'm texting my girlfriend who is feeling insecure. It has helped me a lot! I always read through it to make sure it did not change any of the meaning or add anything, but so far it has been pretty good at changing the tone.
I also use it to rephrase emails at work to make them sound more professional.
I use ChatGPT as a suggestion. Like an aid to whatever it is that I'm doing. It either helps me or it doesn't, but I always have my critical thinking hat on.
If the standard is replicating human level intelligence and behavior, making up shit just to get you to go away about 40% of the time kind of checks out. In fact, I bet it hallucinates less and is wrong less often than most people you work with
My kid sometimes makes up shit and completely presents it as facts. It made me realize how many made up facts I learned from other kids.
And it just keeps improving over time. People shit all over AI to make themselves feel better because scary shit is happening.
I did a Google search to find out how much I pay for water; the water department where I live bills by the MCF (1,000 cubic feet). The AI Overview told me an MCF was one million cubic feet. It's a unit of measurement. It's not subjective, not an opinion, and AI still got it wrong.
Shouldn't it be kcf? Or tcf if you're desperate to avoid standard prefixes?
Everywhere else in the world a big M means million.
Yeah, shouldn't that be Kcf, Kilo cubic foot?
Kilo is a small k, since the prefix isn't named after a person.
I think in this case it's Roman numeral M
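For anyone who wants to sanity-check their own bill, the conversion is easy to work out by hand; here's a quick Python sketch (assuming the standard figure of 7.48052 US gallons per cubic foot):

```python
CUBIC_FEET_PER_MCF = 1_000        # billing "M" is the Roman numeral thousand, not mega
GALLONS_PER_CUBIC_FOOT = 7.48052  # US gallons in one cubic foot

def mcf_to_gallons(mcf):
    """Convert water usage billed in MCF (thousand cubic feet) to US gallons."""
    return mcf * CUBIC_FEET_PER_MCF * GALLONS_PER_CUBIC_FOOT

# one MCF is roughly 7,481 gallons, nowhere near a million cubic feet
print(round(mcf_to_gallons(1)))  # 7481
```

If the AI Overview were right and M meant mega, one billing unit would be a thousand times larger, which is the kind of error that's obvious the moment you do the arithmetic.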
The only thing that would make more sense would be if the bill was in cuneiform.
Americans really using ANYTHING but metric, huh?
Yeah, that's an odd one. My city does water by the gallon, which is much more reasonable.
First off, the beauty of these two posts being beside each other is palpable.
Second, as you can see in the picture, it's more like 60%.
No, it's not. If you actually read the study, it's about AI search engines correctly finding and citing the source of a given quote, not about general correctness, and not about the plain model.
Read the study? Why would i do that when there's an infographic right there?
(thank you for the clarification, i actually appreciate it)
I love that this mirrors the experience of experts on social media like reddit, which was used for training chatgpt...
Also common in news. There’s an old saying along the lines of “everyone trusts the news until they talk about your job.” Basically, the news is focused on getting info out quickly. Every station is rushing to be the first to break a story. So the people writing the teleprompter usually only have a few minutes (at best) to research anything before it goes live in front of the anchor. This means that you’re only ever going to get the most surface level info, even when the talking heads claim to be doing deep dives on a topic. It also means they’re going to be misleading or blatantly wrong a lot of the time, because they’re basically just parroting the top google result regardless of accuracy.
One of my academic areas of expertise way back in the day (late '80s and early '90s) was the so-called "Mitochondrial Eve" and "Out of Africa" hypotheses. The absolute mangling of this shit by journalists even at the time was migraine-inducing, and it's gotten much worse in the decades since then. It hasn't helped that subsequent generations of scholars have mangled the whole deal even worse. The only advice I can offer people is that if the article (scholastic or popular) contains the word "Neanderthal" anywhere, just toss it.
I'm curious. Are you saying neanderthal didn't exist, or was just homo sapiens? Or did you mean in the context of mitochondrial Eve?
Scientists confirm it: we are living in a simulation!
Are you saying neanderthal didn’t exist, or was just homo sapiens? Or did you mean in the context of mitochondrial Eve?
All of these things, actually. The measured, physiological differences between "homo sapiens" and "neanderthal" (the air quotes here meaning "so-called") fossils are much smaller than the differences found among contemporary humans, so the premise that "neanderthals" represent(ed) a separate species - in the sense of a reproductively isolated gene pool since gone extinct - is unsupported by fossil evidence. Of course nobody actually makes that claim anymore, since it's now commonly reported that contemporary humans possess x% of neanderthal DNA (and thus cannot be said to be "extinct"). Of course nobody originally (when Mitochondrial Eve was first mooted) made any claims whatsoever about neanderthals: the term "neanderthal" was imported into the debate over the age and location of the last common mtDNA ancestor years later, after it was noticed that the age estimates of neanderthal remains happened to roughly match the age estimates of the genetic last common ancestor. And this was also after the term "neanderthal" had previously gone into the same general category in Anthropology as "Piltdown Man".
Most ironically, articles on the subject today now claim a correspondence between the fossil and genetic evidence, despite the fact that the very first articles (out of Allan Wilson's lab and published in Nature and Science in the mid-1980s) drew their entire impact and notoriety from the fact that the genetic evidence (which supposedly gave 100,000 years ago and then 200,000 years ago as the age of the last common ancestor) completely contradicted the fossil evidence (which shows upright bipedal hominids spreading out of Africa more than a million and a half years ago). To me, the weirdest thing is that academic articles on the subject now almost never cite these two seminal articles at all, and most authors seem genuinely unaware of them.
It's much older than Reddit: https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect
I was going to post this, too.
The Gell-Mann amnesia effect is a cognitive bias describing the tendency of individuals to critically assess media reports in a domain they are knowledgeable about, yet continue to trust reporting in other areas despite recognizing similar potential inaccuracies.
If you want an AI to be an expert, you should only feed it data from experts. But these are trained on so much more. So much garbage.
Most of my searches have to do with video games, and I have yet to see any of those AI generated answers be accurate. But I mean, when the source of the AI's info is coming from a Fandom wiki, it was already wading in shit before it ever generated a response.
I’ve tried it a few times with Dwarf Fortress, and it always gave me horribly wrong, hallucinated instructions on how to do something.
40% seems low
I just use it to write emails, so I declare the facts to the LLM and tell it to write an email based on that and the context of the email. Works pretty well, but it doesn't really sound like something I wrote; it adds too much emotion.
This is what LLMs should be used for. People treat them like search engines and encyclopedias, which they definitely aren't.
That sounds like more work than just writing the email to me
Yeah, that has been my experience so far. LLMs take as much or more work vs the way I normally do things.
I've been using o3-mini mostly for ffmpeg command lines, and a bit of sed. And it hasn't been terrible; it's a good way to learn stuff I can't decipher from the man pages. Not sure what else it's good for tbh, but at least I can test and understand what it's doing before running the code.
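To give a flavor of what "test before running" looks like in practice, here's a sketch; the filenames are placeholders I made up, and the ffmpeg flags are the common ones, but always verify against the man page before trusting a model's suggestion:

```shell
# The kind of one-liner I'd ask for: re-encode a video to H.264 at a sane
# quality, copying the audio stream untouched. (Placeholder filenames;
# commented out so nothing runs without real input media.)
# ffmpeg -i input.mp4 -c:v libx264 -crf 23 -preset medium -c:a copy output.mp4

# And a sed suggestion I'd try on throwaway input first:
# strip trailing whitespace from each line.
echo "trailing spaces   " | sed 's/[[:space:]]*$//'
```

Piping sample text through a suggested sed expression, like the echo above, is a cheap way to confirm it does what the model claims before pointing it at real files.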