The Achilles’ Heel Of AI Might Be Its Big Carbon Footprint

Context: Current AI models leave a significant carbon footprint in their development and operation. A possible solution is to turn to one of the most profound creations of nature: the human brain.

Environmental cost of AI:

  • While the average American generates 36,156 pounds of carbon dioxide emissions in a year, training a single deep-learning model can generate up to 626,155 pounds of emissions.
  • This is roughly equal to the carbon footprint of 125 round-trip flights between New York and Beijing (a quick arithmetic check follows this list).
  • Inherent contradiction: even as AI helps mitigate the climate crisis through applications such as smart grid design, it is itself a significant carbon emitter.
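A quick back-of-the-envelope check of these figures is sketched in Python below; the "about 17 years of an average American's emissions" ratio and the per-flight number are derived here for illustration and are not quoted in the article.

```python
# Back-of-the-envelope check of the emissions figures cited above.
MODEL_TRAINING_LBS = 626_155          # CO2 from training one large deep-learning model
AVG_AMERICAN_LBS_PER_YEAR = 36_156    # average American's annual CO2 emissions
ROUND_TRIPS_NY_BEIJING = 125          # round-trip flights quoted in the article

# Training one model emits roughly 17 years' worth of an average American's footprint.
print(MODEL_TRAINING_LBS / AVG_AMERICAN_LBS_PER_YEAR)   # ~17.3

# Implied emissions per New York-Beijing round trip.
print(MODEL_TRAINING_LBS / ROUND_TRIPS_NY_BEIJING)      # ~5,009 pounds
```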

Reasons for the high environmental costs of AI:

  • Booming computation demand for training AI models: it has been doubling every 3.4 months since 2012, a wild deviation from the 24-month doubling period of Moore's Law (the two growth rates are compared after this list).
    • Neural networks carry out a lengthy set of mathematical operations for each piece of training data. Larger datasets, therefore, translate to soaring computing and energy requirements.
  • Experimentation and tuning costs: Machine learning today remains largely an exercise in trial and error. Deploying AI models in real-world settings (a process known as inference) consumes even more energy.
    • It is estimated that 80-90% of the cost of a neural network comes from inference rather than training.
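To make the gap between the two growth rates concrete, the sketch below compounds each doubling period over a six-year window; the window length is an illustrative assumption, not a figure from the article.

```python
# Compare a 3.4-month doubling period with Moore's Law's 24-month period.
# The six-year window (2012-2018) is an illustrative assumption.
MONTHS = 6 * 12

ai_compute_growth = 2 ** (MONTHS / 3.4)   # doubling every 3.4 months
moores_law_growth = 2 ** (MONTHS / 24)    # doubling every 24 months

print(f"AI training compute: ~{ai_compute_growth:,.0f}x")  # ~2.4 million-fold
print(f"Moore's Law pace:    ~{moores_law_growth:.0f}x")    # 8-fold
```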

Way forward: Developing “Green AI”, or AI that yields novel results without increasing computational cost (and ideally reducing it)

  • Employing Neuromorphic Computing: a computing approach that studies the actual processes of the brain and uses this knowledge to make computers ‘think’ and process inputs more like human minds do.
    • For example, while the brain uses just 20 watts to execute multi-pronged activities, a supercomputer consumes more than 5 megawatts.
  • Researching more on Computational Neuroscience: a field of study in which mathematical tools and theories are used to investigate brain function at the level of individual neurons (a minimal neuron-level model is sketched after this list).
    • The mathematical language of Computational Neuroscience makes it readily accessible to AI practitioners.
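As a loose illustration of the kind of neuron-level model Computational Neuroscience works with, the sketch below simulates a leaky integrate-and-fire neuron, a standard textbook model and a common building block in neuromorphic hardware; every parameter value is an illustrative assumption rather than something taken from the article.

```python
# A minimal leaky integrate-and-fire (LIF) neuron: a textbook model from
# computational neuroscience and a common building block of neuromorphic chips.
# All parameter values are illustrative assumptions.

def simulate_lif(input_current=1.5, t_max=100.0, dt=0.1,
                 tau=10.0, v_rest=0.0, v_threshold=1.0, v_reset=0.0):
    """Return spike times (in ms) for a constant input current."""
    v = v_rest
    spike_times = []
    for step in range(int(t_max / dt)):
        # Membrane potential leaks back toward rest while being driven by the input.
        v += (-(v - v_rest) + input_current) * (dt / tau)
        if v >= v_threshold:          # threshold crossed: emit a spike, then reset
            spike_times.append(round(step * dt, 1))
            v = v_reset
    return spike_times

print(simulate_lif())  # sparse spike events rather than dense activations
```

Part of the appeal for energy efficiency is that such spiking models compute only when events occur, in contrast to the dense matrix multiplications that conventional deep learning performs for every input.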

Conclusion: To build AI systems that leave a far smaller carbon footprint, one must go back to one of the most profound creations of nature—the human brain.