Some argue that bots should be entitled to ingest any content they see, because people can.

amju_wolf@pawb.social 4 points 1 year ago

> I read an article about a subject. I will forget some of it. I will misunderstand some of it. I will not understand some of it. (These two are different: in misunderstanding, I think I understand but am wrong; in simply not understanding, I cannot make heads or tails of that portion.)

Just because you're worse at comprehension or have a worse memory doesn't make you any more "real". AIs also "forget" things and get details wrong, because they don't store any actual full-length texts. What they store is essentially individual words (more or less) and the likelihood of what should come next.
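
To make "separate words and the likelihood of what comes next" concrete, here is a toy bigram sketch of my own (vastly simpler than a real LLM, and not how any particular model is implemented): all it ever keeps are word-to-next-word statistics, never the source texts.

```python
# Toy sketch: a bigram "language model" that stores only
# word -> next-word counts, never the original sentences.
from collections import Counter, defaultdict

corpus = [
    "the dog barks at the mailman",
    "the dog chases the cat",
    "the cat sleeps all day",
]

# Count which word tends to follow which -- this is the model's entire "memory".
next_word_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for current, following in zip(words, words[1:]):
        next_word_counts[current][following] += 1

def next_word_probabilities(word):
    """Return the likelihood of each word that might come next."""
    counts = next_word_counts[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probabilities("dog"))  # {'barks': 0.5, 'chases': 0.5}
print(next_word_probabilities("the"))  # {'dog': 0.4, 'mailman': 0.2, 'cat': 0.4}
```

Once the counts are built, you could delete the corpus entirely and the model would behave the same; a real LLM does something far more sophisticated with learned weights, but the same point holds: it reproduces patterns, not stored copies.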

> Another issue: I, as a natural intelligence, know what I can quote and what I should not, due to copyright, social mores, and law. AI regurgitates everything that might match, regardless of source.

Except you don't, not perfectly. You can be absolutely sure that you often say something someone else has already said or written, which means they technically hold the copyright to it... but no one cares for the most part.

And it goes the other way too - you can quote something imperfectly.

Both of these already happen with AIs, though it would be great if we could train them with proper attribution - at least for the clear-cut cases.

> The third issue: the AI does not understand, even with copious training data. It does not know that dogs bark; it does not have a concept of a dog.

A sufficiently advanced artificial intelligence would be indistinguishable from natural intelligence. What sets them apart, then?

You can look at animals, too. They also have intelligence, and yet there are many concepts that are incomprehensible to them.


The thing is, though: how can you actually tell that you don't work the exact same way? Sure, the AI is more primitive and has fewer inputs - text only, no other outside stimuli - but the basis isn't all that different.