Traister101

joined 1 year ago
[–] Traister101 14 points 9 months ago (1 children)

Well communist views != tankie for one thing

[–] Traister101 2 points 9 months ago

You should refine your thoughts more instead of dumping a stream of consciousness on people.

Essentially what this stream of consciousness boils down to is "Wouldn't it be neat if AI generated all the content in the game you're playing on the fly?" Would it be neat? I guess so, but I find that incredibly unappealing, very similar to how AI art, stories, and now video are unappealing. There's no creativity involved. There's no meaning to any of it. Sentient AI could probably have creativity, but what people like you who get overly excited about this stuff don't seem to understand is how fundamentally limited our AI actually is currently. LLMs are basically one of the most advanced AI things rn, and yet all they do is predict text. They have no knowledge, no capacity for learning. It's very advanced autocorrect.

We've seen this kind of hype with crypto, with NFTs, and with Metaverse bullshit. You should take a step back and understand what we currently have, and how incredibly far away the thing that has you excited actually is.

[–] Traister101 4 points 9 months ago

It's called "prompting" and anybody doing it deserves mockery

[–] Traister101 7 points 9 months ago

Still waiting on the programmer part. In a nutshell, AI being say 90% perfect means you have 90% working code, i.e. 10% broken code. Images and video (but not sound) are way easier cause human eyes kinda just suck. A couple of the videos they've released pass even at a pretty long glance. You only notice the funny business once you look closer.

[–] Traister101 4 points 9 months ago

Ah yep that's what it was

[–] Traister101 24 points 9 months ago (4 children)

Yeah that was a whole thing like a couple years back. The sad reality iirc is that the litter was for something to do with school shootings, perhaps to help clean up the blood...

[–] Traister101 10 points 9 months ago

AI generated slop. Wouldn't really call her a loli. Doesn't look like a child

[–] Traister101 8 points 9 months ago* (last edited 9 months ago) (3 children)

Yeah so firstly Matsuri isn't actually a loli (though yes, I'll admit that's intended to be loli art). But I also happen to know of that artist, as they got me as well. For example NSFW Tatsumaki from One Punch Man or NSFW Yor from Spy x Family, but then this same person will draw Takagi as a middle schooler taking a dick.

Voosh clearly doesn't think very hard when downloading shit off Twitter cause the AI horse is really obvious. Pretty easy to believe that second Matsuri picture (the loli one) wasn't caught cause it's not obvious enough. Dude likes cock he's not paying much attention to anything else.

Edit: Clarification

[–] Traister101 25 points 9 months ago (7 children)

Vaush did not in fact have any loli porn. One (some?) of the art he downloaded happened to be drawn by somebody who for some reason chose not to draw lolicon that day, and Vaush innocently thought it looked hot. As somebody who spends way too much time on pixiv, it's happened to me enough that I've made a habit of thoroughly checking an artist's profile, cause some really freaky people love drawing normal stuff every once in a while. It's quite off-putting.

What we should really be talking about is the obvious AI art he had downloaded like an absolute troglodyte (and the horse stuff has been known for years, but it's still funny)

[–] Traister101 1 points 9 months ago

Rebasing shines for local commits, not remote commits. i.e. rebase your local commits onto the remote, or amend the previous commit (yes, that's effectively a rebase)
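A minimal sketch of that distinction, using a hypothetical throwaway repo (the file and commit messages are made up for illustration): amend rewrites the last local commit in place, which is safe precisely because nothing has been pushed yet.

```shell
set -e
# Throwaway repo so nothing real gets rewritten.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you

echo one > notes.txt
git add notes.txt
git commit -qm "add notes"

# Oops, forgot a line. Fix the previous commit in place --
# under the hood this replaces it, like a one-commit rebase.
echo two >> notes.txt
git add notes.txt
git commit -q --amend -m "add notes, both lines"

git log --oneline   # still exactly one commit, now with both lines
```

To replay local commits on top of the remote instead, the same logic applies before pushing: `git pull --rebase` (or `git fetch` followed by `git rebase origin/main`). Once commits are on the remote and others may have pulled them, rewriting stops being safe.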

[–] Traister101 3 points 9 months ago (1 children)

LLMs don't "learn", they literally don't have the capacity to "learn". We train them on an insane amount of text, and then the LLM's job is to produce output that looks like that text. That's why when you attempt to correct it, nothing happens. It can't learn; it doesn't have the capacity to.

Humans aren't "word guessing machines". Humans produce language with intent and meaning. This is why you and I can communicate. We use language to represent things. When I say "Tree" you know what that is, because it's the word we use to describe an object we all know about. LLMs don't know what a tree is. They can use "tree" in a sentence correctly, but they don't know what it means. They can even translate it to another language, but they still don't know what "tree" means. What they do know is how to generate text that looks like what they were trained on.
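To make the "predicts text" point concrete, here's a deliberately tiny sketch: a bigram word predictor. It is nothing like a real LLM in scale or mechanism (the toy corpus is invented for illustration), but it shows how you can place "tree" in plausible spots with zero notion of what a tree actually is.

```python
from collections import Counter, defaultdict

# Toy training text -- the "model" will only ever mimic these patterns.
corpus = "the tree is tall . the tree is green . the sky is blue .".split()

# Count which word follows which in the training text.
follows = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    follows[cur][nxt] += 1

def predict(word):
    # Emit the most frequent continuation seen in training:
    # pattern matching over text, not understanding of trees.
    return follows[word].most_common(1)[0][0]

print(predict("tree"))  # "is"
print(predict("the"))   # "tree"
```

The predictor has no concept to correct, either: telling it "trees aren't always green" changes nothing, because the counts it was trained on are fixed.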

Here's a well-made video by Kyle Hill that will teach you a lot better than I could

[–] Traister101 5 points 9 months ago

Yep. Been having trouble with mine recently, it's managed to learn my typos and it's getting quite frustrating
