- The internet contributes 1.6 billion tons of greenhouse gas emissions annually.
- Now Google and Microsoft want to add AI to their search engines.
- This would add to global carbon emissions, experts told Wired.
Ever since the partnership between Microsoft and OpenAI paved the way for a future where AI powers search engine results, there has been an all-out race between Bing and Google to implement the technology.
Google and Microsoft are trying to improve search engine results with large language models that they say will distill “complex” information while responding to queries in a more human, conversational way. Microsoft will build ChatGPT into its existing Bing search engine, while Google announced the launch of an “experimental conversational AI service” called Bard.
Behind the scenes, however, keeping the computer systems that power search engines running is already a labor-intensive operation, and it could soon demand even more resources.
The power used to train a single AI model can produce hundreds of thousands of pounds of carbon emissions, and internet use already accounts for nearly 4% of global greenhouse gas emissions.
Merging AI into the flow of search engine queries could increase the computing power companies like Google and Microsoft need by up to five times, experts told Wired. And as that computing footprint grows, so will greenhouse gas emissions.
“It requires processing power as well as storage and efficient searching,” Alan Woodward, a professor of cybersecurity at the University of Surrey, told Wired. “Anytime we see a step change in online processing, we see a significant increase in the power and cooling resources required by large processing centers. I think this could be one such step.”
The new search engines will also require more data centers to store data. Martin Bouchard, founder of data center company QScale, told Wired that AI would result in “at least four or five times more computing power per query.”
In a statement to Insider, Jane Park, a Google spokesperson, said the company would initially launch a “lighter” version of Bard that would require less computing power.
“We also published research describing the energy costs of state-of-the-art language models, including an earlier and larger version of LaMDA,” Park said in a statement. “Our findings show that combining efficient models, processors and data centers with clean energy sources can reduce the carbon footprint of an ML system by as much as 1000x.”
Environmental concerns aren’t the only criticism ChatGPT and Google’s AI have drawn. Google’s rollout of Bard was criticized by employees who described it as rushed and botched, according to a report from CNBC.
Insider senior tech correspondent Adam Rogers wrote about how AI-produced search engine responses can yield answers with misinformation or flawed logic that are harder for searchers to detect.
“They will omit the sources they draw from and the biases built into their databases behind the trappings of acceptable, almost-but-not-quite-human-sounding prose,” Rogers wrote. “No matter how wrong they are, they will make it sound right.”
OpenAI and Microsoft did not immediately respond to Insider’s request for comment.