Grok will be developed with a supercomputer
Musk said earlier this year that training Grok 2 would require 20,000 Nvidia H100 GPUs, and that future Grok models would need more than 100,000 H100s. By building this supercomputer, dubbed a "gigafactory of compute" after Tesla's gigafactories, xAI can develop more advanced language models.
Since OpenAI's ChatGPT dominates the AI world, you may not have tried or even heard of Grok. Grok can be used on X.com, but only with a paid monthly subscription. One of Grok's biggest advantages over other models is its direct access to all posts on X, which Musk has repeatedly promoted as a replacement for traditional news feeds.
Source link: https://www.donanimhaber.com/elon-musk-in-super-bilgisayar-projesi-ortaya-cikti–177710