What you need to know

  • ChatGPT will require as many as 30,000 NVIDIA GPUs to operate, according to a report by research firm TrendForce.
  • Those calculations are based on the processing power of NVIDIA’s A100, which costs between $10,000 and $15,000.
  • It’s unclear if gamers will be affected by so many GPUs being dedicated to running ChatGPT.

ChatGPT stormed onto the tech scene near the end of 2022. By January of this year, the tool reportedly had 100 million monthly users. It gained even more users the following month, when Microsoft announced a new version of Bing powered by ChatGPT. With the technology receiving so much interest and real-world use, it’s not surprising that companies stand to profit from ChatGPT’s popularity.

According to research firm TrendForce, NVIDIA could make as much as $300 million in revenue thanks to ChatGPT (via Tom’s Hardware). That figure is based on TrendForce’s analysis that ChatGPT needs 30,000 NVIDIA A100 GPUs to operate. The exact amount NVIDIA stands to make will depend on how many graphics cards OpenAI needs and whether NVIDIA gives the AI company a discount for such a large order.
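For a rough sense of where that figure comes from, multiplying the reported GPU count by the quoted A100 price range brackets the estimate. The short Python sketch below is only an illustrative back-of-the-envelope calculation using the numbers cited above, not TrendForce's actual methodology.

# Illustrative back-of-the-envelope estimate using the figures reported above.
# Exact deal terms (final unit count, volume discounts) are not public,
# so this is a rough bracket rather than TrendForce's methodology.
gpu_count = 30_000                         # A100 GPUs reportedly needed to run ChatGPT
price_low, price_high = 10_000, 15_000     # quoted A100 price range in USD

revenue_low = gpu_count * price_low        # $300 million
revenue_high = gpu_count * price_high      # $450 million

print(f"Estimated revenue range: ${revenue_low / 1e6:.0f}M to ${revenue_high / 1e6:.0f}M")

At list prices the math spans roughly $300 million to $450 million, so the reported figure sits at the lower end of that range.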

If NVIDIA prioritizes allocating components to graphics cards meant to run ChatGPT, it could affect the availability of other GPUs.


