this post was submitted on 22 Jan 2025
91 points (98.9% liked)


linky (known as griftgate lmao)

text of tweet:

The Stargate Project is a new company which intends to invest $500 billion over the next four years building new AI infrastructure for OpenAI in the United States. We will begin deploying $100 billion immediately. This infrastructure will secure American leadership in AI, create hundreds of thousands of American jobs, and generate massive economic benefit for the entire world. This project will not only support the re-industrialization of the United States but also provide a strategic capability to protect the national security of America and its allies.

The initial equity funders in Stargate are SoftBank, OpenAI, Oracle, and MGX. SoftBank and OpenAI are the lead partners for Stargate, with SoftBank having financial responsibility and OpenAI having operational responsibility. Masayoshi Son will be the chairman.

Arm, Microsoft, NVIDIA, Oracle, and OpenAI are the key initial technology partners. The buildout is currently underway, starting in Texas, and we are evaluating potential sites across the country for more campuses as we finalize definitive agreements.

As part of Stargate, Oracle, NVIDIA, and OpenAI will closely collaborate to build and operate this computing system. This builds on a deep collaboration between OpenAI and NVIDIA going back to 2016 and a newer partnership between OpenAI and Oracle.

This also builds on the existing OpenAI partnership with Microsoft. OpenAI will continue to increase its consumption of Azure as OpenAI continues its work with Microsoft with this additional compute to train leading models and deliver great products and services.

All of us look forward to continuing to build and develop AI—and in particular AGI—for the benefit of all of humanity. We believe that this new step is critical on the path, and will enable creative people to figure out how to use AI to elevate humanity.

At the current price of money, this project has to return $20 billion in profit a year, and likely operate on at least $50 billion in revenue to cover the energy costs
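The back-of-the-envelope math here can be sketched as follows (the ~4% cost of capital is my assumption to make the $20B figure work out, not a number from the post):

```python
# Rough cost-of-capital math for a $500B buildout.
# The 4% annual rate is an assumed figure, not from the post.
investment = 500e9
cost_of_capital = 0.04  # assumed annual rate on the invested capital

required_profit = investment * cost_of_capital
print(f"Required annual profit: ${required_profit / 1e9:.0f}B")  # → $20B
```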

[–] AtmosphericRiversCuomo@hexbear.net 3 points 21 hours ago (1 children)

That's mind-boggling output for an 8B model compared to last year. It's funny, you can def see how that first answer is a distillation of r1's answer here:

spoiler:

Okay, so the user is asking how dumb it is to buy $500 billion in Nvidia stock. Let me break this down. First, $500 billion is an astronomical amount of money. Nvidia's current market cap is around $2.2 trillion as of early 2023. If someone tried to buy $500 billion worth of their stock, that's like almost a quarter of the entire company. But wait, market cap is the total value of all outstanding shares. So buying $500 billion would require that many shares being available, which they might not be.

Liquidity is a big factor here. Even if the shares are available, purchasing such a huge volume would drive the price up rapidly because of the increased demand. This is known as slippage. The more you buy, the higher the price goes, so the actual cost could be way more than $500 billion. Plus, once the market notices such a massive buy, other investors might react, maybe causing a short-term spike but also potential volatility afterward.

Then there's diversification. Putting all that money into a single stock is super risky. Even if Nvidia is a strong company, any company-specific issue—like a product failure, regulatory problem, or market shift—could tank the stock. Diversification helps mitigate that risk, but this move is the opposite of diversified.
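The slippage effect r1 describes, where the price climbs as you buy, can be sketched with a toy linear price-impact model (all the numbers and the impact coefficient are illustrative, not real market estimates):

```python
# Toy linear price-impact model: each share bought pushes the price
# up by `impact` dollars, so the average fill price keeps rising.
def total_cost(start_price, shares, impact):
    # Integrating price(q) = start_price + impact * q from 0 to shares
    # gives the total spend including slippage.
    return start_price * shares + impact * shares ** 2 / 2

# Hypothetical order: 1 billion shares at $100, impact of $5e-8/share.
naive = 100 * 1e9                      # cost if the price never moved
actual = total_cost(100, 1e9, 5e-8)    # cost with linear slippage
print(f"naive: ${naive / 1e9:.0f}B, with slippage: ${actual / 1e9:.0f}B")
# → naive: $100B, with slippage: $125B
```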

[–] plinky@hexbear.net 2 points 21 hours ago (1 children)

That's at least aware of the concept of free float and slippage shrug-outta-hecks (I don't want to poo-poo a small model over something that might have been pruned, but outside of the fancy thonk-style posting, it's not that different from other small ones)

The paper gives more detail on how they got these things to train each other, so in theory you could train these tiny 1.5B or small 8B models for specific tasks.
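The distillation idea behind "training each other" can be sketched in miniature: the small student model is trained to match the big teacher's softened output distribution. This is a generic logit-distillation sketch, not the paper's exact recipe, and all the numbers are made up:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-softened softmax over a list of logits.
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) between the softened distributions;
    # the student is trained to drive this toward zero.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher = [4.0, 1.0, 0.5]   # big model's logits for one token (made up)
student = [3.0, 1.5, 0.2]   # small model's logits (made up)
print(f"KD loss: {distillation_loss(teacher, student):.4f}")
```

A perfect student match gives a loss of zero; in practice this term is mixed with the usual cross-entropy on the hard labels.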