Elon Musk says Colossus 2 is training 7 models simultaneously, the largest with 10 trillion parameters.
According to 1M AI News monitoring, SpaceX and xAI founder Elon Musk posted on X that the SpaceXAI supercomputing cluster Colossus 2 is currently training seven models at the same time:
He added the note, “Still some catching up to do.” Multiple media outlets had previously reported that xAI’s next-generation flagship model Grok 5 has a parameter scale of about 6 trillion, which matches the 6T entry in the list. The 10-trillion-parameter model had not been publicly reported before.
In the post, Musk refers to the combined entity of SpaceX and xAI as “SpaceXAI.” The two companies completed their merger in February this year, at a combined valuation of about $1.25 trillion. Colossus 2 went live on January 17; it is the world’s first gigawatt-class AI training cluster and is planned to be upgraded to 1.5 GW this month.