this post was submitted on 10 May 2024
30 points (96.9% liked)

LocalLLaMA

2249 readers

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago
[–] istanbullu@lemmy.ml 1 points 5 months ago (1 children)

Is llamafile better than ollama/llama.cpp?

[–] xcjs@programming.dev 2 points 4 months ago (1 children)

It's just a different use case: llamafile packages a model and an inference engine into a single executable file, and it automatically chooses sensible runtime parameters for your hardware. It uses llama.cpp under the hood.

The intent is to make it as easy as double-clicking a binary to get up and running.
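For context, the typical workflow looks roughly like this (the model file name below is just a placeholder; substitute whatever .llamafile you actually downloaded):

```shell
# Download a .llamafile from a model repo, then mark it executable.
# "model.llamafile" is a placeholder name, not a specific release.
chmod +x model.llamafile

# Run it directly -- it starts a local web UI for chatting.
./model.llamafile

# On Windows, rename the file with an .exe extension instead of chmod.
```

Because the weights and the runtime travel together in one file, there's nothing to install and no separate server to configure.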

[–] xcjs@programming.dev 1 points 4 months ago* (last edited 4 months ago)

I just wanted to update this to mention that llamafile also includes a lot of custom low-level performance improvements for CPU-based inference: https://justine.lol/matmul/
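To give a flavor of the kind of optimization that post is about: fast CPU matmul kernels restructure the loops so data stays in cache and registers longer. This toy sketch is not llamafile's actual code (that's vectorized C++), just a minimal illustration of loop tiling versus the naive triple loop:

```python
# Toy illustration of loop tiling (cache blocking) for matrix multiply.
# NOT llamafile's kernel -- just one classic technique behind fast CPU matmul.

def matmul_naive(A, B):
    """Straightforward triple loop: strides through B column-wise,
    which is cache-unfriendly for larger matrices."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += A[i][p] * B[p][j]
            C[i][j] = s
    return C

def matmul_tiled(A, B, tile=4):
    """Same result, but computed in small tile x tile blocks so the
    working set fits in cache; A[i][p] is also reused across the j loop."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for ii in range(0, n, tile):
        for pp in range(0, k, tile):
            for jj in range(0, m, tile):
                for i in range(ii, min(ii + tile, n)):
                    for p in range(pp, min(pp + tile, k)):
                        a = A[i][p]
                        for j in range(jj, min(jj + tile, m)):
                            C[i][j] += a * B[p][j]
    return C
```

In pure Python the tiled version won't actually be faster, but in C with SIMD and multithreading layered on top, this access pattern is a big part of why the kernels in that post beat generic BLAS calls for llama.cpp's workloads.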