this post was submitted on 15 Oct 2023
61 points (93.0% liked)

Asklemmy

A loosely moderated place to ask open-ended questions


Here's some context for the question. When image-generating AIs became available, I tried them out and found that the results were often quite uncanny or even straight-up horrible. I ended up seeing my fair share of twisted fingers, scary faces and mutated abominations of all kinds.

Some of those pictures made me think: since the AI clearly loves to create horror-movie material, why not take advantage of that? I started asking it to make all sorts of nightmare monsters that could have escaped from movies such as The Thing. Oh boy, did it work! I think I've found the ideal way to use an image-generating AI. Obviously it can do other things too, but in this particular category the results are excellent nearly every time. Making other types of images usually requires some creative promptcrafting, editing, time and effort, whereas asking for a "mutated abomination from Hell" is pretty much guaranteed to work on the first try.

What about LLMs, though? Have you noticed that LLMs like ChatGPT tend to gravitate towards a specific style or genre? Is it long-winded business books with loads of unnecessary repetition, or pointless self-help books that struggle to squeeze even a single good idea into a hundred pages? Is it something even worse? What would be the ideal use for LLMs? What's the sort of thing where LLMs perform exceptionally well?

[–] dewritoninja@pawb.social 13 points 1 year ago (3 children)

With the proper documentation, LLMs are great at helping with code. Take Phind, which uses GPT-3.5 but with sources. It's great for small code snippets and pulls its answers from documentation and Stack Overflow.
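
As a purely illustrative sketch (not an actual Phind answer), this is the kind of small, documentation-driven snippet such tools tend to get right, since the correct answer is more or less a restatement of the strtol() man page:

```c
/* Hypothetical example, not taken from Phind: a parsing helper whose
 * correct form follows directly from the strtol() documentation. */
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

/* Parse a base-10 long from `str`; return 0 on success, -1 on failure. */
static int parse_long(const char *str, long *out)
{
    char *end = NULL;
    errno = 0;
    long val = strtol(str, &end, 10);

    if (end == str || *end != '\0') /* no digits, or trailing junk */
        return -1;
    if (errno == ERANGE)            /* value out of range for long */
        return -1;

    *out = val;
    return 0;
}

int main(void)
{
    long n;
    if (parse_long("42", &n) == 0)
        printf("parsed %ld\n", n);
    return 0;
}
```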

[–] ByGourou@sh.itjust.works 5 points 1 year ago

I've had free access to GitHub Copilot since the beta and it's great, especially when working with unfamiliar libraries or languages. I don't have to pull out the documentation and can get on with the logic. Of course it often hallucinates, and the code it spits out needs to be checked, but still, it saves a lot of time.

[–] colonial@lemmy.world 3 points 1 year ago (1 children)

... Eh, no. I've seen GPT generate some incredibly unsound C despite being given half a page of text on the problem.
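
To make that concrete, here is an entirely hypothetical illustration (not actual GPT output) of the kind of unsound C being described: it compiles cleanly and may even appear to work, but invokes undefined behaviour in two places.

```c
/* Hypothetical illustration, not actual model output. */
#include <stdio.h>

static char *greet(const char *name)
{
    char buf[16];
    /* Bug 1: unbounded format into a fixed-size buffer -- overflows the
     * stack for any name longer than 8 characters. */
    sprintf(buf, "Hello, %s", name);
    /* Bug 2: returns a pointer to a local array that stops existing as
     * soon as the function returns. */
    return buf;
}

int main(void)
{
    printf("%s\n", greet("world"));
    return 0;
}
```

Both problems are easy to miss in a quick skim, which is why generated C in particular needs a careful review before it's trusted.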

[–] dewritoninja@pawb.social 6 points 1 year ago

C is already incredibly unsound /hj

[–] Hamartiogonic@sopuli.xyz 1 points 1 year ago* (last edited 1 year ago)

I've had some good experiences with asking Bing to write a few lines of VBA or R. Normally, I'll just ask it to solve a specific problem, and then I'll modify the code to suit my needs.