The cost of using ChatGPT has been drastically reduced: by up to 90%.

Artificial intelligence is undoubtedly the hottest topic at the moment, and as the price of processing power for large language models (LLMs) falls, the technology will move more quickly into the mainstream.


The cost gap has narrowed to around 5% amid fierce competition between large language model companies OpenAI, Anthropic, and Cohere.

OpenAI has slashed the price of using the technology by 90% in order to stay competitive, a move that could be devastating for its rivals.

In addition, the steep price reduction for OpenAI's ChatGPT will bring in more users and encourage hardware manufacturers to design more powerful chips, such as NVIDIA's H100 GPU.

However, Ori Goshen, co-CEO of AI21, said that most companies will not use ChatGPT's generic models; instead, they will need models trained for industries such as finance or healthcare, or models trained on a company's own data.

ChatGPT's price reduction is just the beginning: the technology will enter mass commercialisation at a rapid pace, and the cut demonstrates OpenAI's determination to capture a market in which few companies will be able to compete.

Author: King
Copyright: PCSofter.COM
