Part 3/9:
The underlying principle of energy consumption in computing traces back to the physicist Rolf Landauer, who in 1961 posited that every logically irreversible operation in computing, notably the erasure of information, increases entropy and must therefore dissipate a minimum amount of energy as heat. This minimum may sound insignificant: at room temperature, erasing a single bit costs at least kT ln 2, roughly 2.9 x 10^-21 joules. But practical hardware dissipates many orders of magnitude more than this theoretical floor per operation, and modern computers process billions or trillions of bits per second, so the cumulative energy cost is monumental.
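As a quick sanity check on these figures, the following sketch computes the Landauer limit kT ln 2 and scales it to a workload of 10^12 bit erasures per second; the room-temperature value of 300 K and the workload size are illustrative assumptions, not values fixed by the text.

```python
import math

# Boltzmann constant in joules per kelvin (CODATA 2018 value).
K_B = 1.380649e-23

def landauer_limit(temperature_k: float = 300.0) -> float:
    """Minimum energy (J) to erase one bit at the given temperature: kT ln 2."""
    return K_B * temperature_k * math.log(2)

if __name__ == "__main__":
    e_bit = landauer_limit()  # ~2.87e-21 J per bit at ~300 K
    print(f"Landauer limit per bit: {e_bit:.2e} J")

    # Hypothetical workload: erasing 10^12 bits every second.
    bits_per_second = 1e12
    print(f"Theoretical floor for 1e12 erasures/s: {e_bit * bits_per_second:.2e} W")
```

The computed floor for a trillion erasures per second comes out to only a few nanowatts, which underscores the point above: the monumental energy bills of real machines come from operating many orders of magnitude above Landauer's theoretical minimum.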