Lemmy Today


Welcome to lemmy.today!

About us

🤗 Thanks for joining our little instance here, located in Oregon. The idea is to have a fast, stable instance and allow users to subscribe to whatever content they want from here.

😎 We don't block any other instances. We will keep it that way unless it becomes a moderation problem.

🤠 We will be around for a very long time, so you don't have to worry about us shutting down the instance anytime soon. We like performance and stability in our servers, and will upgrade the instance when it's needed.

🥹 Make sure to join a lot of remote communities to get a good feed going. How to do that is explained here.

Lemmy mobile apps

You should start using one of these ASAP since the web browser user interface is quite ugly, even with themes.

Optional Lemmy web browser user interfaces

Rules

Contact the admin

[ comments | sourced from HackerNews ]

Archive link

Silicon Valley has bet big on generative AI, but it's not totally clear whether that bet will pay off. A new report from the Wall Street Journal claims that, despite the endless hype around large language models and the automated platforms they power, tech companies are struggling to turn a profit when it comes to AI.

Microsoft, which has bet big on the generative AI boom with billions invested in its partner OpenAI, has been losing money on one of its major AI platforms. GitHub Copilot, which launched in 2021, was designed to automate some parts of a coder's workflow and, while immensely popular with its user base, has been a huge "money loser," the Journal reports. The problem is that users pay a $10-a-month subscription fee for Copilot, but, according to a source interviewed by the Journal, Microsoft lost an average of $20 per user during the first few months of this year. Some users cost the company more than $80 per month, the source told the paper.
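To make those figures concrete, here is a rough back-of-envelope sketch in Python of what they would imply about the cost of serving each user. The $10 subscription price is public; the loss numbers are the Journal source's claims, and the "implied cost" is a simple inference that assumes "loss" means loss after collecting the subscription fee, not a disclosed figure.

```python
# Back-of-envelope estimate of GitHub Copilot's reported per-user economics,
# based on figures attributed to the WSJ's source. These are the report's
# claims, not confirmed numbers from Microsoft.

SUBSCRIPTION_PRICE = 10   # USD per user per month (Copilot individual plan)
AVERAGE_LOSS = 20         # USD per user per month, per the WSJ's source
HEAVY_USER_LOSS = 80      # USD per user per month for the heaviest users

# If the reported loss is net of the subscription fee, the implied cost to
# serve a user is the fee plus the loss.
average_cost = SUBSCRIPTION_PRICE + AVERAGE_LOSS        # ~$30 per user/month
heavy_user_cost = SUBSCRIPTION_PRICE + HEAVY_USER_LOSS  # ~$90 per user/month

print(f"Implied average cost to serve: about ${average_cost}/user/month")
print(f"Implied cost for heavy users: over ${heavy_user_cost}/user/month")
```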

OpenAI's ChatGPT, meanwhile, has seen an ever-declining user base even as its operating costs remain incredibly high. A report from the Washington Post in June claimed that chatbots like ChatGPT lose money pretty much every time a customer uses them.

AI platforms are notoriously expensive to operate. Services like ChatGPT and DALL-E burn through enormous amounts of computing power, and companies are still struggling to figure out how to shrink that footprint. The infrastructure needed to run AI systems, such as powerful, high-priced AI chips, is just as costly, and the cloud capacity necessary to train and run these models is expanding at a frightening rate. All of that energy consumption also means AI is about as environmentally unfriendly as you can get.
