ChatGPT has been a buzzing topic on the internet since its release. The platform has attracted millions of users in a few months and has brought business to several tech companies, including Microsoft.
In a recent report, Microsoft revealed that it built AI infrastructure on thousands of Nvidia Graphics Processing Units (GPUs) for OpenAI. OpenAI approached Microsoft around five years ago for help building this AI infrastructure on Nvidia GPUs. These GPUs now power OpenAI's products, including ChatGPT.
Let’s see how ChatGPT uses Nvidia.
Does OpenAI's ChatGPT use Nvidia?
OpenAI, the company behind ChatGPT, uses Nvidia GPUs in ChatGPT. The company reached out to Microsoft for assistance in building the infrastructure for its new AI-based platform, ChatGPT.
Microsoft says it linked thousands of Nvidia GPUs on its Azure cloud computing platform to build a supercomputer for OpenAI.
Microsoft has also invested billions of dollars in OpenAI to support the company and its work. A significant part of that investment goes toward the Nvidia GPUs used to train ChatGPT.
What hardware is ChatGPT running on?
ChatGPT was built in collaboration with Microsoft. The model was trained on Azure supercomputing infrastructure with Nvidia GPUs. That hardware comprises more than 285,000 CPU cores, 10,000 GPUs, and network connectivity of 400 gigabits per second per GPU server.
How much does ChatGPT's GPU usage cost?
Calculating the total GPU cost for ChatGPT is challenging, since several factors come into play. However, according to several reports, Azure cloud hosts ChatGPT, Microsoft charges around $3 per hour for a single A100 GPU, and each generated word costs roughly $0.0003.
A single ChatGPT response runs on eight GPUs and averages about 30 words. From these figures, ChatGPT's total running cost is estimated at $100,000 per day, or about $3 million per month.
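The figures above can be checked with quick back-of-the-envelope arithmetic. This sketch uses only the estimates quoted in the reports (cost per word, words per response, daily cost); none of these are official numbers:

```python
# Back-of-the-envelope check of the cost estimates quoted above.
# All inputs are the article's reported estimates, not official figures.
cost_per_word = 0.0003       # USD per generated word (reported estimate)
words_per_response = 30      # average response length (reported estimate)
daily_cost = 100_000         # USD per day (reported estimate)

cost_per_response = cost_per_word * words_per_response   # ≈ $0.009, under a cent
monthly_cost = daily_cost * 30                           # $3 million per month
responses_per_day = daily_cost / cost_per_response       # ≈ 11 million responses

print(f"Cost per response: ${cost_per_response:.4f}")
print(f"Monthly cost: ${monthly_cost:,}")
print(f"Implied responses per day: {responses_per_day:,.0f}")
```

So at the reported rates, a typical 30-word reply costs under a cent, and the $100,000-a-day estimate implies on the order of 11 million responses served daily.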
How much power does ChatGPT use?
OpenAI's ChatGPT is built on GPT-3, so the power consumed by ChatGPT and GPT-3 should be similar. Additionally, ChatGPT is fine-tuned with reinforcement learning, so a final estimate of the power ChatGPT consumes should combine both factors.
According to one report, training GPT-3 used 1,287 MWh of electricity and emitted 552 tons of CO2e. ChatGPT's day-to-day operation, meanwhile, is estimated at 23.04 kgCO2e per day, which over a year (365 × 23.04 kg) comes to roughly 8.4 tons of CO2e.
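The annual figure follows directly from the reported daily estimate. A minimal sketch of that arithmetic, assuming the 23.04 kgCO2e/day estimate quoted above:

```python
# Rough check of the carbon-footprint arithmetic above.
# 23.04 kgCO2e/day is the estimate quoted in the report, not an official figure.
daily_footprint_kg = 23.04
annual_footprint_kg = 365 * daily_footprint_kg     # 8,409.6 kg per year
annual_footprint_tons = annual_footprint_kg / 1000 # ≈ 8.4 tons per year

print(f"Annual footprint: {annual_footprint_kg:.1f} kg "
      f"≈ {annual_footprint_tons:.1f} t CO2e")
```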
Nvidia CEO Jensen Huang's big bet on AI as the company's core technology powers ChatGPT
Nvidia’s CEO Jensen Huang recently said in an interview that the company realized about a decade ago that artificial intelligence would revolutionize the software industry.
The company then reoriented its products around artificial intelligence and began making AI chips, of which the A100 is the best-selling.
Huang added that the company believed something new would happen someday, and the rise of AI chips has proved that bet right.
ChatGPT's popularity is giving Nvidia stock an unexpected boost
The news that ChatGPT uses Nvidia GPUs is official: OpenAI has confirmed that its products are trained on Nvidia GPUs, and Microsoft has revealed that it built AI infrastructure on Nvidia GPUs for OpenAI.
Hence, it should surprise no one if Nvidia's stock rises after ChatGPT's release. OpenAI is also working on new projects and has hinted that it will require more GPUs in the future.
Eventually, Nvidia's stock should get a boost and its value should rise. In short, Nvidia's partnership with OpenAI looks set to be highly profitable for both companies.