488 points · submitted 2 months ago by mozz@mbin.grits.dev to c/technology@beehaw.org

Credit to @bontchev

[-] JackGreenEarth@lemm.ee 7 points 2 months ago

Yes, but what LLM has a large enough context length for a whole book?

[-] ninjan@lemmy.mildgrim.com 8 points 2 months ago

Gemini Ultra will, in developer mode, have a 1 million token context length, so that would fit at least a medium-length book. No word on what it will support in production mode, though.
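A quick back-of-envelope check of that claim. The sketch below uses a common rule of thumb (an assumption, not an exact tokenizer figure) that English prose averages roughly 0.75 words per token, and a hypothetical word count of 90,000 for a medium-length novel:

```python
# Rough estimate: does a 1M-token context window fit a medium-length book?
# Assumption: ~0.75 words per token, a widely quoted rule of thumb for
# English text (real tokenizers vary by model and content).

WORDS_PER_TOKEN = 0.75

def estimated_tokens(word_count: int) -> int:
    """Estimate the token count of a text from its word count."""
    return round(word_count / WORDS_PER_TOKEN)

medium_novel_words = 90_000  # hypothetical medium-length novel
tokens = estimated_tokens(medium_novel_words)

print(tokens)              # about 120,000 tokens
print(tokens < 1_000_000)  # fits comfortably inside a 1M-token window
```

By this estimate a 90,000-word book needs on the order of 120k tokens, well inside a 1M-token window and far beyond the 4k–8k windows mentioned below.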

[-] JackGreenEarth@lemm.ee 3 points 2 months ago

Cool! Are there any other models, ideally FOSS ones, with a context length longer than 4096 or 8192 tokens?

this post was submitted on 15 Apr 2024
