Soham Joshi

GPT and Carbon Footprint



(Image source: The Register)

GPT, the Generative Pre-trained Transformer, is a high-performing machine learning model that gives elaborate and thoughtful responses to any human query. The chatbot built on it, ChatGPT, became exceedingly popular because its longer responses read like a comprehended, consolidated version of the results of a Google query.


The machine learning model demands enormous compute power and exorbitant storage. The processing itself takes a huge toll, both in the electricity used by the processors and in the electricity used by the cooling systems of the data centers that host them. The regular wear and tear of the hardware, and its replacement, produces huge amounts of e-waste. When there is no scarcity of human labor, is this technology serving or dis-serving larger populations?
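To put the cooling overhead in perspective, data center operators commonly report Power Usage Effectiveness (PUE): the ratio of total facility energy to the energy drawn by the IT equipment alone. Below is a minimal sketch of that arithmetic in Python; the PUE and workload figures are illustrative assumptions, not measurements from any specific facility.

```python
# Estimate total data center electricity from the IT load using PUE.
# Both numbers below are illustrative assumptions, not measured values.

def total_facility_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    """Total energy = IT energy * PUE; the excess goes largely to cooling."""
    return it_energy_kwh * pue

it_load_kwh = 190_000  # assumed energy for one large training run
pue = 1.5              # assumed facility PUE (1.0 would mean zero overhead)

total = total_facility_energy_kwh(it_load_kwh, pue)
print(f"IT load:              {it_load_kwh:,.0f} kWh")
print(f"Total facility use:   {total:,.0f} kWh")
print(f"Cooling and overhead: {total - it_load_kwh:,.0f} kWh")
```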


For ChatGPT's training alone, some digital scientists have calculated that the "training" required to build the model generates CO2 at the same level as a car traveling 700,000 km, which is about twice the distance between Earth and the Moon (source: The Register, 2020).
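A quick back-of-the-envelope check of that comparison, using an assumed fleet-average emission factor (the figure below is illustrative, not taken from the cited article):

```python
# Sanity-check the "car traveling 700,000 km" comparison.

CAR_EMISSIONS_KG_PER_KM = 0.120   # assumed average: ~120 g CO2 per km
EARTH_MOON_DISTANCE_KM = 384_400  # mean Earth-Moon distance

distance_km = 700_000
co2_tonnes = distance_km * CAR_EMISSIONS_KG_PER_KM / 1000

print(f"CO2 from driving {distance_km:,} km: ~{co2_tonnes:.0f} tonnes")
print(f"Distance vs Earth-Moon: ~{distance_km / EARTH_MOON_DISTANCE_KM:.1f}x")
# Prints ~84 tonnes and ~1.8x, consistent with "about twice the
# distance between Earth and the Moon".
```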


With climate change, more and more marginalized populations are becoming vulnerable to its impact. The carbon emissions from the prevalent use of ML can deteriorate this further. So, what are these technologies helping us with? Whom do they serve and whom do they dis-serve? How can the same technology help those at a disadvantage?


Writing compact models, implementing smarter data-processing pipelines, using fast-converging architectures such as spiking neural networks, and running the data centers on renewable energy could be the way forward.
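As one concrete example of a compact model: post-training dynamic quantization stores a network's weights as 8-bit integers instead of 32-bit floats, cutting memory use and often the energy per inference. A minimal sketch using PyTorch (the toy model is purely illustrative):

```python
import torch
import torch.nn as nn

# A toy fully connected network standing in for a much larger model.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Post-training dynamic quantization: weights of the Linear layers
# are stored as 8-bit integers (qint8) instead of 32-bit floats.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(quantized(x).shape)  # same interface, smaller footprint
```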


But how do we fix the water footprint?

Such a high-performing technology requires high-quality semiconductor chips to build fast-processing computers, and both chip fabrication and data center cooling consume large volumes of water.
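One way to reason about it: some operators report Water Usage Effectiveness (WUE), the liters of water consumed per kWh of IT energy. A minimal sketch of that estimate; both figures below are assumptions for illustration, not reported numbers.

```python
# Rough water-footprint estimate from energy use and WUE.

it_energy_kwh = 190_000  # assumed training energy, as in the PUE sketch
wue_l_per_kwh = 1.8      # assumed Water Usage Effectiveness (varies by site)

water_liters = it_energy_kwh * wue_l_per_kwh
print(f"Estimated cooling water: ~{water_liters:,.0f} liters")
```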
