this post was submitted on 26 Oct 2023
100 points (93.9% liked)

top 15 comments
[–] Knusper@feddit.de 37 points 2 years ago (3 children)
[–] cyd@lemmy.world 50 points 2 years ago (3 children)

That's not such a big deal. Their objective is to get people hooked on the system. After that, they'll jack up the price. Microsoft can easily afford to lose money for several years in pursuit of that target.

(One way this plan could fall through is if LLM tech progresses to the extent that free and open source copilots, run locally, can give results that are just as good.)

[–] Pechente@feddit.de 22 points 2 years ago

> One way this plan could fall through is if LLM tech progresses to the extent that free and open source copilots, run locally, can give results that are just as good.

MS might be in trouble then.

Performance isn't great yet, but apparently these local models haven't really been optimized at all so far.

[–] theterrasque@infosec.pub 11 points 2 years ago* (last edited 2 years ago)

There are already very impressive local models for coding. Some have compared favourably to Copilot in tests, iirc.

Edit: https://evalplus.github.io/leaderboard.html
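
For anyone curious what running one of those local code models looks like, here's a minimal sketch using Hugging Face transformers. The model ID, quantization settings, and prompt are illustrative assumptions (any model from the leaderboard above could be swapped in), and it assumes the accelerate and bitsandbytes packages are installed.

```python
# Minimal sketch: load a local code model with 4-bit quantization and
# generate a completion. Model ID and prompt are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-7b-hf"  # assumption: substitute any local code model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spreads layers across whatever GPU/CPU memory is available
    load_in_4bit=True,   # 4-bit quantization shrinks the memory footprint
)

prompt = 'def fibonacci(n: int) -> int:\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```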

[–] HidingCat@kbin.social 6 points 2 years ago (2 children)

Not familiar with the tech, but wouldn't server-side LLMs still have an advantage regardless, because of the greater computing power available on tap? Anything that improves local LLMs will also benefit server-side LLMs, wouldn't it?

[–] bamboo@lemm.ee 10 points 2 years ago

Possibly, but given the choice between paying $20/month for a marginally better version of something that's free and probably built into your editor at that point, most people would probably take the free thing. At that point paid LLMs will need to find new niches beyond simply existing.

[–] my_hat_stinks@programming.dev 10 points 2 years ago

Not necessarily; as it gets faster, the latency between your local and remote machines becomes a bigger fraction of the time taken to process anything. If your local machine processes a request in 50ms and the remote machine in 5ms, anything over 45ms of network latency would make your machine the faster option.

Running locally also cuts out a lot of potential security issues inherent to sending data over a network, and not sending your data to a third party is a bonus too.
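
To make the arithmetic above concrete, a quick back-of-envelope comparison (all numbers are illustrative assumptions, not measurements):

```python
# Illustrative comparison of local vs. remote completion time.
def total_time_ms(network_latency_ms: float, processing_ms: float) -> float:
    """Round-trip network latency plus time spent generating the completion."""
    return network_latency_ms + processing_ms

local = total_time_ms(network_latency_ms=0, processing_ms=50)   # runs on your own machine
remote = total_time_ms(network_latency_ms=45, processing_ms=5)  # faster server, but 45 ms away

print(f"local:  {local:.0f} ms")   # 50 ms
print(f"remote: {remote:.0f} ms")  # 50 ms -- any additional latency tips it in favour of local
```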

[–] worldsayshi@lemmy.world 12 points 2 years ago* (last edited 2 years ago) (2 children)

That seems so weird when you think about the pricing for the OpenAI API. It feels at least an order of magnitude cheaper than a ChatGPT Plus subscription, which in turn is $20/month. If Copilot is losing money, OpenAI must be burning money by the truckload.
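
As a rough sanity check on that feeling (the per-token price and usage figures below are assumptions for illustration, not actual OpenAI rates):

```python
# Back-of-envelope: what a month of moderate API usage might cost,
# using assumed numbers rather than real OpenAI pricing.
price_per_1k_tokens_usd = 0.002   # assumed blended price per 1K tokens
tokens_per_request = 1_000        # assumed prompt + completion size
requests_per_day = 50
days_per_month = 30

monthly_api_cost = (
    price_per_1k_tokens_usd
    * (tokens_per_request / 1_000)
    * requests_per_day
    * days_per_month
)
print(f"API: ${monthly_api_cost:.2f}/month vs. $20/month for ChatGPT Plus")
# -> $3.00/month under these assumptions, i.e. several times cheaper
```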

[–] txmyx@feddit.de 8 points 2 years ago* (last edited 2 years ago) (1 children)

OpenAI is losing money. I'm too lazy to find the article, but it's mind-blowing how much they're losing.

Edit: nvm, I found it https://medium.com/illumination/openai-lost-540m-in-2022-needs-100b-to-developing-artificial-generative-intelligence-20126721cd13

[–] Cloudkid@lemmus.org 1 points 2 years ago* (last edited 2 years ago) (1 children)

> A major percentage of these losses can be attributed to the outrageous expenses of training language models.

Those costs seem to be developmental rather than operational. After they reach AGI, I assume they'll go down.

[–] YIj54yALOJxEsY20eU@lemm.ee 2 points 2 years ago

After they reach AGI they will own the entire world

[–] axo@feddit.de 3 points 2 years ago

It sure is; what would make you think otherwise? It has enough VC money to burn.

[–] pineapple_pizza@lemmy.dexlit.xyz 7 points 2 years ago (1 children)

We're in the Sliceline era of generative AI. Enjoy it before prices get hiked.

[–] nicetriangle@kbin.social 6 points 2 years ago

Yep, everyone's trying to capture market share and stamp out any competitors with shorter funding runways until they achieve some degree of monopoly over the customer base. Then come the price hikes and other anti-consumer bullshit.

[–] Phanatik@kbin.social 7 points 2 years ago

Yeah, I'm sure Microsoft is happy with the theft of copyrighted works and people's personal information.