this post was submitted on 15 Oct 2023
61 points (93.0% liked)

Asklemmy


Here's some context for the question. When image-generating AIs became available, I tried them out and found that the results were often quite uncanny or even straight-up horrible. I ended up seeing my fair share of twisted fingers, scary faces and mutated abominations of all kinds.

Some of those pictures made me think: since the AI really loves to create horror-movie material, why not take advantage of that property? I started asking it to make all sorts of nightmare monsters that could have escaped from movies such as The Thing. Oh boy, did it work! I think I've found the ideal way to use an image-generating AI. Obviously, it can do other stuff too, but in this particular category, the results are perfect nearly every time. Making other types of images usually requires some creative prompt crafting, editing, time and effort. When you ask for a "mutated abomination from Hell", though, it's pretty much guaranteed to work on the first try.

What about LLMs, though? Have you noticed that LLMs like ChatGPT tend to gravitate towards a specific style or genre? Is it long-winded business books with loads of unnecessary repetition, or pointless self-help books that struggle to squeeze even a single good idea into a hundred pages? Is it something even worse? What would be the ideal use for LLMs? What's the sort of thing where LLMs perform exceptionally well?

[–] Toribor@corndog.social 3 points 1 year ago (1 children)

I used it to learn Ansible and Terraform. It probably does 80% of the work, with me occasionally having to point out that it made up a module, used a deprecated format, or something like that. Still a huge time saver, though. In ten seconds it can output something at least as good as what I'd produce with 15 minutes of reading documentation and searching for examples.
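As an illustration of the kind of correction described above (my own hypothetical example, not from the comment): an LLM will sometimes produce an Ansible task using the older `with_items` loop style and a short module name, which you would clean up to use the recommended `loop` keyword and a fully qualified collection name:

```yaml
# What an LLM might suggest (older, discouraged style):
# - name: Install packages
#   apt:
#     name: "{{ item }}"
#   with_items:
#     - git
#     - curl

# Corrected task: fully qualified module name and the modern `loop` keyword
- name: Install packages
  ansible.builtin.apt:
    name: "{{ item }}"
    state: present
  loop:
    - git
    - curl
```

The old form still runs in current Ansible, which is exactly why it's easy to miss until you check the documentation.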

[–] Hamartiogonic@sopuli.xyz 2 points 1 year ago

That is a valid use for an LLM, especially in easy cases. With more complex cases, I usually get completely incorrect tech advice at first, but I've always managed to make things work eventually. It may take a few messages back and forth, but narrowing the problem down far enough to ask the right question finally gets me the right answer.