A chief analyst at a semiconductor research firm estimates that ChatGPT could cost OpenAI more than USD 700,000 a day in operating expenses.
SemiAnalysis's Dylan Patel told The Information that ChatGPT is that expensive because of the massive infrastructure it requires to compute responses to hundreds of millions of prompts.
The analyst said that most of the cost goes to the servers ChatGPT needs to deliver the necessary computing power amid growing demand for AI.
The cost could be even higher for the GPT-4 model, since Patel's estimate was based on GPT-3.
To bring those costs down, Microsoft is developing a chip called Athena aimed at making generative AI models cheaper to run.
The project started in 2019, after Microsoft's one-billion-dollar deal with OpenAI, which required OpenAI to run its AI models exclusively on Microsoft's Azure cloud servers.
Microsoft was trailing Google and Amazon in building in-house chips and wanted a cheaper alternative to the Nvidia graphics processing units its models currently run on.
The chip could be released for internal use by Microsoft and OpenAI as early as this year.