Whenever you ask ChatGPT silly questions, you are unknowingly contributing to the degradation of the environment. Each conversation with the OpenAI chatbot is said to consume the equivalent of half a liter of water. Now imagine the impact of millions of users and billions of conversations.
ChatGPT is used more and more every day, even to answer trivial questions. It is crucial, however, to be aware of the consequences of our habits and of the resources required to run this revolutionary chatbot: large language models depend on energy-intensive computing infrastructure to perform the calculations needed to generate their answers.
The shocking impact of silly ChatGPT questions on the world's drinking water
According to an article published in Forbes, a conversation between ChatGPT and a user comprising 20 to 50 question-and-answer exchanges consumes roughly the equivalent of a 500 ml bottle of water. That amount may seem insignificant, but it takes on a whole new dimension once you consider that the bot has more than 100 million active users, each holding multiple conversations.
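To get a feel for how quickly that half-liter figure adds up, here is a back-of-envelope sketch in Python. The 500 ml per conversation and 100 million users come from the article; the assumption that every user holds exactly one conversation is hypothetical, chosen only to make the scaling visible.

```python
# Back-of-envelope scaling of the figures quoted from Forbes.
ML_PER_CONVERSATION = 500          # ~500 ml per 20-50 exchange conversation (article)
ACTIVE_USERS = 100_000_000         # "more than 100 million active users" (article)
CONVERSATIONS_PER_USER = 1         # hypothetical: one conversation per user

total_liters = ACTIVE_USERS * CONVERSATIONS_PER_USER * ML_PER_CONVERSATION / 1000
print(f"{total_liters:,.0f} liters")  # 50,000,000 liters if every user chats once
```

Fifty million liters from a single conversation per user, before anyone asks a follow-up question.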
To grasp the scale of the problem, the article notes that Google's data centers in the United States are estimated to consume 12.7 billion liters of fresh water for cooling alone, approximately 90% of which is drawn from drinking water supplies.
Yes, your eyes do not deceive you: clean, drinkable water. And this takes place in a nation where some communities periodically go without safe water, and in a world where “771 million people do not have a source of safe water” and where “women and young girls spend almost 200 million hours every day carrying water”.
How does using ChatGPT for silly questions consume energy?
According to a report, OpenAI runs huge data centers around the world to train its AI models, including the latest, GPT-4. These data centers require a considerable amount of electricity to power the servers used for training.
OpenAI has not yet released precise figures on ChatGPT's power consumption, but estimates have been made from approximate traffic data and model characteristics.
According to Ecoist Club founder and president Daria Marchenko, ChatGPT's power consumption could be around 0.6 watt-hours per query, which, at the service's scale, would correspond to approximately 7 to 15 tonnes of carbon dioxide (CO2) per day. In addition, analysts at SemiAnalysis estimate that the infrastructure behind ChatGPT could cost OpenAI approximately $700,000 per day.
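The 0.6 watt-hours per query figure is easy to turn into a daily energy estimate. The sketch below uses Marchenko's per-query number from the article; the daily query volume is a hypothetical round number chosen for illustration, not a published statistic.

```python
# Rough daily-energy sketch from the per-query estimate quoted in the article.
WH_PER_QUERY = 0.6                 # Daria Marchenko's estimate (article)
QUERIES_PER_DAY = 10_000_000       # hypothetical volume, for illustration only

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000
print(f"{daily_kwh:,.0f} kWh per day")  # 6,000 kWh/day at 10 million queries
```

Even at this conservative assumed volume, the model would draw thousands of kilowatt-hours a day; higher real-world traffic would push the total, and the associated CO2, far beyond that.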
When does it make sense to use chatbots?
When there is a high volume of requests or when queries are repetitive in nature, chatbots can provide an instant response, reducing user wait time. Plus, they are available 24/7, providing uninterrupted support.
Chatbots are particularly good at providing basic information, making reservations, answering common questions, or helping with minor technical issues. They can also help triage and direct requests to the appropriate department, improving overall customer support efficiency.
However, in complex situations requiring in-depth understanding, the intervention of a human agent is preferable. Chatbots are complementary tools, but cannot completely replace the human expertise and empathy needed in some cases.