this post was submitted on 16 Oct 2024
647 points (97.1% liked)

Science Memes

[–] selokichtli@lemmy.ml 24 points 13 hours ago* (last edited 12 hours ago)

Today I learned about AI agents from the news and all I can think is: Jesus. The example shown was of an AI agent using voice synthesis to haggle with a human agent over the fee for a night in some random hotel. The commenter talked about how people could use these agents to get rid of annoying, repetitive, unwanted phone calls. Then I remembered the night my in-laws were tricked into giving their car away to robbers because they ~~thought~~ were told my sister-in-law had been kidnapped, all over a phone call.

Yeah, AI agents will free us all from invasive megacorporations. /s

[–] BleatingZombie@lemmy.world 14 points 14 hours ago (4 children)

Why isn't anyone saying that AI and machine learning are (currently) the same thing? There's no such thing as "Artificial Intelligence" (yet)

[–] KingRandomGuy@lemmy.world 4 points 6 hours ago

I work in an ML-adjacent field (CV) and I thought I'd add that AI and ML aren't quite the same thing. You can have non-learning based methods that fall under the field of AI - for instance, tree search methods can be pretty effective algorithms to define an agent for relatively simple games like checkers, and they don't require any learning whatsoever.
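
To make that concrete, here's a minimal sketch of a non-learning game-tree agent (plain minimax). The `game` interface with `legal_moves`, `apply`, `is_over` and `score` is hypothetical, purely for illustration:

```python
# Plain minimax tree search: a game-playing "AI" with no learning involved.
# The `game` interface (legal_moves, apply, is_over, score) is hypothetical,
# just to illustrate the idea for a small two-player, zero-sum game.

def minimax(game, depth, maximizing):
    """Return the best score reachable from this position by exhaustive search."""
    if depth == 0 or game.is_over():
        return game.score()  # static evaluation from the maximizing player's view
    values = (minimax(game.apply(m), depth - 1, not maximizing)
              for m in game.legal_moves())
    return max(values) if maximizing else min(values)

def best_move(game, depth=4):
    """Pick the move whose subtree has the highest minimax value."""
    return max(game.legal_moves(),
               key=lambda m: minimax(game.apply(m), depth - 1, False))
```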

Normally, we say Deep Learning (the subfield of ML that relates to deep neural networks, including LLMs) is a subset of Machine Learning, which in turn is a subset of AI.

Like others have mentioned, AI is unfortunately just a poorly defined term, largely because intelligence isn't a well-defined term either. In my undergrad we defined an AI system as a programmed system that has the capacity to do tasks that are considered to require intelligence. Obviously, this definition gets flaky since not everyone agrees on which tasks require intelligence. It also has the problem that whenever the field solves a problem, people (including those in the field) tend to think "well, if we could solve it, surely it couldn't have really required intelligence" and then move the goalposts. We've seen that already with games like chess and Go, as well as CV tasks like image recognition and object detection at superhuman accuracy.

[–] finitebanjo@lemmy.world 8 points 14 hours ago

It's more that intelligence is very poorly defined, so a less controversial statement is that General Artificial Intelligence doesn't exist.

Also, generative AI such as LLMs is very, very far from it, and machine learning in general hasn't yielded much in the pursuit of sophonce and sapience.

Although they can technically pass a Turing test, as long as the Turing test has a very short time limit.

[–] nialv7@lemmy.world 4 points 14 hours ago* (last edited 14 hours ago) (1 children)

That heavily depends on how you define "intelligence". If you insist on "think, reason and behave like a human", then no, we don't have "Artificial Intelligence" yet (although there are plenty of people who would argue that we do). On the other hand, if you consider the ability to play chess or Go intelligence, the answer is different.

[–] minyakcurry@monyet.cc 2 points 13 hours ago (1 children)

Honestly I would consider BFS/DFS artificial intelligence (and I think most introductory AI courses agree). But yea it's a definition game and I don't think most people qualify intelligence as purely human-centric. Simple tasks like pattern recognition already count as a facet of intelligence.
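
For what it's worth, that kind of uninformed search fits in a dozen lines. A toy sketch (the example graph at the bottom is made up purely for illustration):

```python
from collections import deque

# Breadth-first search over an explicit graph: the sort of uninformed
# "search" algorithm introductory AI courses cover before any learning.
def bfs_path(graph, start, goal):
    """Return a shortest path from start to goal, or None if unreachable."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, ()):
            if neighbour not in visited:
                visited.add(neighbour)
                frontier.append(path + [neighbour])
    return None

# Toy example graph, not from anywhere in particular:
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs_path(graph, "A", "D"))  # ['A', 'B', 'D']
```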

[–] Adalast@lemmy.world 2 points 12 hours ago (1 children)

I forget the exact quote or who said it, but the gist is that a species cannot be considered sapient (intelligent) on an interplanetary/interstellar stage until they have discovered Calculus. I prefer to use that as my bar for the sapience of those around me as well.

[–] NikkiDimes@lemmy.world 1 points 6 hours ago

Weeeell, sheeiiiiit

[–] TriflingToad@lemmy.world 2 points 13 hours ago

It very much depends on what you consider AI, or even what you consider intelligence. I personally consider LLMs AI because they're artificial.

[–] IsoSpandy@lemm.ee 46 points 1 day ago (50 children)

I don't get the AI hate sentiment. In fact, I want AI to be so good that it steals all our jobs. Every single "worker" on the planet. The only job I don't think it can steal is middle management, because I don't think we have digitized data on how to suck your own dick. Once everybody is jobless, we would be free. We won't need the rich. They can be made into a fine broth.

Sarcasm aside, I really believe we should automate all menial jobs, crunch more data and make this world a better place, not steal creative content made by humans and make second rate copies.

[–] AbsoluteChicagoDog@lemm.ee 26 points 18 hours ago

I don't know if you've been paying attention to everything that's happened since the Industrial Revolution, but that's not how it's going to work.

[–] NicolaHaskell@lemmy.world 2 points 12 hours ago

Sure, Eli Whitney.

How about the machines automate the complicated jobs to make as many menial jobs for me as possible? Computers these days are all lazy. They could optimize scheduling so the neighbors and I all get time together and time apart for a hundred hours of kicking dirt down at the office each year; instead, they hang around doing vapes and abstract paintings of hands.

[–] frezik@midwest.social 45 points 23 hours ago (1 children)

The problem with AI isn't the tech itself. It's what capitalism is doing with it. Alongside what you say, using AI to achieve fully automated luxury gay space communism would be wonderful.

[–] uis@lemm.ee 17 points 22 hours ago (1 children)

Maybe the problem is, you know, capitalism?

[–] BallsandBayonets@lemmings.world 5 points 18 hours ago (1 children)

I would love AI. Still waiting for it. Probably 50 years away (if human society lasts that long).

What I hate is the term being yet another scientific term to get stolen and watered down by brainless capitalists so they can scam money out of other brainless capitalists.

[–] Randomguy@lemm.ee 1 points 17 hours ago (1 children)

What I hate is the term being yet another scientific term ~~to get stolen and watered down~~ created by ~~brainless capitalists~~ researchers and scientists so they can ~~scam money out of~~ describe ideas to other ~~brainless capitalists~~ researchers and scientists.

The term AI as we use it today has been in use in the field of computer science for more than 50 years.

The term that you describe as AI is what researchers in the field have called AGI for more than a decade.

The only place where AI is used to mean an artificial intelligence on the same level as humans is in fucking science fiction.

Is it hard to comprehend that when people say AI on the topic of something made by computer scientists they refer to the thing computer scientists call AI?

Do you go into gaming conversations and say: "Um... Akshually... it's not AI... it's just a behaviour heuristic 🤓"

[–] DragonTypeWyvern@midwest.social 2 points 15 hours ago (1 children)

"Brainless capitalists" weren't invented in 2019.

[–] Randomguy@lemm.ee 1 points 14 hours ago (1 children)

Oh yes, Alan Turing, such a famous capitalist.

[–] DragonTypeWyvern@midwest.social 1 points 13 hours ago* (last edited 13 hours ago)

I suppose you did say more than fifty years, which technically includes someone who died in 1954, but he also defined it in a way that even current models don't meet, so here we are, back at brainless capitalists.

[–] redwattlebird@lemmings.world 1 points 13 hours ago

For me, it's because "AI" is referring to an LLM, which is not AI. Also, these LLMs use a crapload of energy to do things that we can currently do ourselves for much less energy.

But actual AI? Yes, please!

[–] iAvicenna@lemmy.world 4 points 21 hours ago* (last edited 21 hours ago)

They will automate all menial jobs, fire 90% of the workers, and ask the remaining 10% to oversee the AI-automated tasks while also doing all the other tasks that cannot be automated. All so that shareholders can add some more billions on top of their existing stack of billions.

[–] Sas@beehaw.org 20 points 1 day ago (5 children)

The problem is that it will be the rich who own the AI that stole your job, so suddenly we peasants are no longer needed. We won't be free, we will be broth.

[–] nightwatch_admin@feddit.nl 10 points 1 day ago

Had me in the first half, not gonna lie
