this post was submitted on 08 Jul 2023
352 points (100.0% liked)

Slow June, people voting with their feet amid this AI craze, or something else?

[–] Tiffany@lemm.ee 16 points 1 year ago (3 children)

Sure it can. Finish generating it server-side, then send it as one big chunk to the user.

To be honest though, ChatGPT is pretty fast at generating text these days compared to how it was at the beginning, so it doesn't bother me as much.
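
Something like this, as a rough sketch (assuming the OpenAI Python SDK v1+ with an API key in the environment; the model name and prompt are just placeholders):

```python
# Rough sketch: disable streaming so the full reply arrives as one chunk.
# Assumes the OpenAI Python SDK (openai >= 1.0) and OPENAI_API_KEY in the env;
# the model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[{"role": "user", "content": "Explain streaming vs. non-streaming."}],
    stream=False,   # generate server-side, then return everything at once
)

# Only printed once the whole completion has been generated.
print(response.choices[0].message.content)
```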

[–] Mereo@lemmy.ca 9 points 1 year ago

GPT-4 isn't fast yet, so it would frustrate people if they did that.

[–] maiskanzler@feddit.de 3 points 1 year ago (1 children)

What still bothers me is that it doesn't scroll smoothly while generating. It's tons of tiny jumps and hiccups, which makes it very hard to read. I tend to scroll up a little as soon as it has generated a few lines, then read at my own pace. Annoying default behaviour though.

[–] Tiffany@lemm.ee 3 points 1 year ago

Yeah, that's pretty much what I do if it's going to be a long block of text. If not, I usually just wait.

Having it just say "Generating text...", show a percentage, and then display the entire thing at once would be preferable to me. I'd like that as an option even if it weren't the default.
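
Roughly what I mean, as a sketch (again assuming the OpenAI Python SDK with streaming; the percentage is only a rough estimate that treats each streamed chunk as about one token out of max_tokens):

```python
# Sketch of "show progress, then display everything at once".
# Assumes the OpenAI Python SDK (openai >= 1.0); model name and prompt are
# placeholders, and the progress figure is only a rough estimate.
from openai import OpenAI

client = OpenAI()
MAX_TOKENS = 512

stream = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[{"role": "user", "content": "Write a short essay on streaming UIs."}],
    max_tokens=MAX_TOKENS,
    stream=True,
)

parts = []
for i, chunk in enumerate(stream, start=1):
    delta = chunk.choices[0].delta.content
    if delta:
        parts.append(delta)
    # Overwrite a single status line instead of rendering partial text.
    print(f"\rGenerating text... {min(100, i * 100 // MAX_TOKENS)}%", end="", flush=True)

# Show the entire reply in one go once generation is finished.
print("\n" + "".join(parts))
```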

[–] ayyndrew@lemmy.world 1 points 1 year ago

That's what Bard does.