ChatGPT is free for the average user who turns to the bot to polish a Tinder profile or draft documents — but someone foots the bill, and that someone is Microsoft. The expenses are high: running OpenAI’s generative AI could cost up to $700,000 per day. So the tech giant is looking for ways to lower costs.

The main driver of these costs is server infrastructure. Running the chatbot requires a great deal of computing power to respond to users, and that hardware is expensive.

The figure was estimated by SemiAnalysis chief analyst Dylan Patel in an interview with The Information. After the service migrated to the newer and more powerful GPT-4 model, Patel told Business Insider, the expense could be even higher.

Models like this are already expensive to train, yet operating ChatGPT costs even more. That is why Microsoft is working on its own chip, codenamed Athena. The project has been in development for four years, beginning after the company’s multibillion-dollar deal with OpenAI, which required the startup to run on Azure servers.

With chips of its own, Microsoft would gain a cheaper alternative to the Nvidia GPUs it currently relies on. It would also keep pace with Google and Amazon, which already develop their own components.

More than 300 Microsoft employees reportedly work on the project. The chips are expected to enter internal use as early as 2024, in both Microsoft’s and OpenAI’s products — including the very expensive chatbot.

Source: Tec Mundo

I am a passionate and hardworking journalist with an eye for detail. I specialize in the field of news reporting, and have been writing for Gadget Onus, a renowned online news site, since 2019. As the author of their Hot News section, I’m proud to be at the forefront of today’s headlines and current affairs.
