this post was submitted on 16 Sep 2024

Technology

[–] HubertManne@moist.catsweat.com 3 points 2 months ago (2 children)

I really don't get how it's different from a search engine. Granted, it's surprising how often I have to give up in disgust and just go back to normal search, but pretty often they can find more relevant stuff faster.

[–] cypherpunks@lemmy.ml 21 points 2 months ago (1 children)

I really don't get how it's different from a search engine

Neither did this guy.

The difference is that LLM output is bullshit in the formal sense: text produced with no regard for whether it's true.

[–] HubertManne@moist.catsweat.com 1 points 2 months ago (1 children)

So is search. I mean, I wouldn't click the first link from a search and then copy and paste code from the site into my project, no questions asked. Similarly, you can look over what the AI comes up with and see if it makes sense, the same as you would with some dude's blog. You can also check the references it gives, or ask it to expand on some part: "hey, what does the function X do?" I really don't see it as being worse than search.

[–] moriquende@lemmy.world 9 points 2 months ago* (last edited 2 months ago) (1 children)

Not that you should be copy-pasting any significant amount of code, but at least when you do, you're required to understand it enough to fit it into your program. LLMs just straight up camouflage the shit code by producing something that already fits and has no squiggly red lines beneath it. Many people probably don't bother reading it at that point.

[–] HubertManne@moist.catsweat.com 0 points 2 months ago (1 children)

Yeah, I mean by that standard anything a person like that uses is going to be an issue. They can be useful, but I'm worried about the power they use, although I wonder how much power that is relative to searching different blogs for 10 or 20 minutes.

[–] Facebones@reddthat.com 3 points 2 months ago (1 children)

For a point of comparison, a ChatGPT request uses 2.9 watt-hours (and rising) versus 0.3 for a Google search (which, per your example, would only be run once, assuming you're checking different blogs from the same list of results).

https://timesofindia.indiatimes.com/technology/tech-news/chatgpt-google-search-need-power-to-run-heres-how-much-water-and-electricity-are-used-to-answer-questions/articleshow/111382705.cms
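Using the per-request figures cited above, the gap over a whole research session can be sketched with some back-of-envelope arithmetic. The session sizes here (5 LLM prompts vs. 3 refined searches) are assumptions for illustration, not measurements:

```python
# Per-request energy figures from the article above (approximate).
CHATGPT_WH_PER_REQUEST = 2.9  # watt-hours per ChatGPT request
SEARCH_WH_PER_QUERY = 0.3     # watt-hours per Google search

def session_energy_wh(llm_requests: int, searches: int) -> float:
    """Total energy in watt-hours for a mixed research session."""
    return llm_requests * CHATGPT_WH_PER_REQUEST + searches * SEARCH_WH_PER_QUERY

# Hypothetical sessions: 5 follow-up prompts to an LLM vs. 3 refined searches.
llm_session = session_energy_wh(llm_requests=5, searches=0)     # ≈ 14.5 Wh
search_session = session_energy_wh(llm_requests=0, searches=3)  # ≈ 0.9 Wh
print(f"LLM session: {llm_session:.1f} Wh")
print(f"Search session: {search_session:.1f} Wh")
print(f"Ratio: {llm_session / search_session:.1f}x")
```

Even granting several searches per session, the roughly 10x per-request difference means the LLM session comes out well ahead on energy, which is the point of the comparison above.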

[–] HubertManne@moist.catsweat.com 1 points 2 months ago

Generally I end up checking some results and often changing the search with new keywords, but all the same, I'm usually asking follow-up questions in much the same way. I'm betting that any energy the AI uses to check web destinations likely isn't included, which would be the same as me visiting a destination myself, or maybe less if it's more of a crawl or an API call. Any way you slice it, I think it's going to use more.