Google's chatbot in trouble: it costs 10 times more to run than a normal search
Google recently launched Bard, an AI chatbot, in response to its formidable rival ChatGPT, but Bard has not performed well and has also saddled the company with high operating costs.
In a recent interview, John Hennessy, chairman of Google's parent company Alphabet, said that conversational AI built on large language models can cost more than 10 times as much per query as a traditional search engine, and he believes Google needs to bring these running costs down.
According to Morgan Stanley, Google handled 3.3 trillion searches last year at an average cost of 0.2 cents per search. If a ChatGPT-style AI were to answer half of those queries with 50-word responses, Google's costs could rise by $6 billion by 2024.
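The figures above allow a rough back-of-the-envelope check. The sketch below uses only the numbers quoted from the Morgan Stanley estimate (3.3 trillion searches, 0.2 cents each, a $6 billion increase); the "implied extra cost per AI-handled query" is derived here purely for illustration and is not a figure from the report itself.

```python
# Back-of-the-envelope arithmetic from the figures cited in the article.
searches_per_year = 3.3e12   # Google searches last year (Morgan Stanley)
cost_per_search = 0.002      # $0.002 = 0.2 cents per search
extra_annual_cost = 6e9      # projected increase if AI answers half of queries

# Baseline annual search cost implied by the quoted figures
baseline_cost = searches_per_year * cost_per_search   # ≈ $6.6B

# If half of all queries go to the AI, what extra cost does each one imply?
ai_handled = searches_per_year / 2                    # 1.65 trillion queries
extra_per_ai_query = extra_annual_cost / ai_handled   # ≈ $0.0036

print(f"Baseline search cost: ${baseline_cost / 1e9:.1f}B per year")
print(f"Implied extra cost per AI query: {extra_per_ai_query * 100:.2f} cents")
```

On these assumptions, each AI-handled query adds roughly 0.36 cents on top of the 0.2-cent baseline, which gives a sense of why a 10x per-query cost multiple translates into billions at Google's scale.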
The main reason AI is so expensive is that it requires more computing power. Analysts say such AI relies on billions of dollars' worth of chips, and spreading that cost over several years of use raises the cost of each individual query. In addition, higher power consumption not only drives up costs but also puts pressure on the company's carbon-emission targets.
Some experts believe that an effective way to reduce costs is to use smaller AI models and apply them to simpler tasks.
Another source revealed that OpenAI's computer scientists have found ways to optimize inference costs through code changes that make the chips run more efficiently.
In addition, OpenAI has launched a paid tier, ChatGPT Plus, to generate revenue. Priced at $20 per month, it offers access even during peak hours, faster responses, and priority access to new features.