A Stanford lab focused on AI will use the decentralized cloud computing platform Theta EdgeCloud for its LLM research.
Decentralized cloud computing could help meet AI's vast compute needs. On April 17, Theta Labs announced that an AI lab at Stanford University will use Theta (THETA) EdgeCloud in its work on large language models. The lab, led by Assistant Professor Ellen Vitercik, will use the platform for research on discrete optimization and algorithmic reasoning with LLMs.
Stanford joins a growing list of academic institutions using the decentralized research platform. According to Theta Labs, other EdgeCloud adopters include Seoul National University, Korea University, the University of Oregon, and Michigan State University, among others.
Big tech companies have rapidly expanded their investment in computing infrastructure, particularly infrastructure intended to power AI. In 2024, Microsoft invested $3.3 billion in a data center in Wisconsin, with backing from the Biden administration.
At the same time, Amazon said it plans to spend $11 billion on data centers in Indiana. Google, for its part, is going global, investing $1.1 billion in its data center in Finland and building another in Malaysia for $2 billion.
However, the Big Tech model is not the only one competing for AI workloads. Unlike most traditional LLM services, Theta EdgeCloud operates as a decentralized cloud computing platform. Its infrastructure is geographically distributed, meaning it does not rely on massive centralized data centers to deliver computing power.
Instead, the platform uses blockchain technology to reward small GPU providers based on the revenue they generate from users. This allows Theta to operate with lower capital expenditures and scale faster, which in turn lets it offer more affordable infrastructure to users.
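To make the revenue-sharing idea concrete, here is a minimal Python sketch of a pro-rata reward split among GPU providers. It is purely illustrative: the class, function, addresses, and numbers are hypothetical and do not represent Theta's actual smart contracts or payout logic.

from dataclasses import dataclass

@dataclass
class GpuProvider:
    address: str              # provider's wallet address (illustrative)
    revenue_generated: float  # revenue attributed to this provider's jobs

def distribute_rewards(providers: list[GpuProvider], reward_pool: float) -> dict[str, float]:
    """Split a reward pool in proportion to the revenue each provider generated."""
    total_revenue = sum(p.revenue_generated for p in providers)
    if total_revenue == 0:
        return {p.address: 0.0 for p in providers}
    return {
        p.address: reward_pool * (p.revenue_generated / total_revenue)
        for p in providers
    }

# Example: three small GPU providers sharing a 1,000-token reward pool.
providers = [
    GpuProvider("0xA1", revenue_generated=500.0),
    GpuProvider("0xB2", revenue_generated=300.0),
    GpuProvider("0xC3", revenue_generated=200.0),
]
print(distribute_rewards(providers, reward_pool=1000.0))
# {'0xA1': 500.0, '0xB2': 300.0, '0xC3': 200.0}

In this toy model, a provider contributing half of the revenue receives half of the rewards, which captures the general incentive described above without any claim about how the network implements it.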
The Theta network is a blockchain protocol originally designed for decentralized video streaming. The network has since expanded to provide decentralized infrastructure for cloud computing, with a particular emphasis on AI applications.