Earlier this decade there was a lot of noise about how Bitcoin (crypto) mining was a huge draw on energy. There were sites dedicated to comparing the electricity consumed by Bitcoin miners to that of entire countries.
This led some to feel that mining should be banned, and some nations (and municipalities) did take that step. However, it was a nothing-burger compared to what we are seeing today.
In fact, this is minimal compared to what the draw will be in the future. AI is a race, but not only for the features and services. There is also a mad dash to secure enough electricity to power the data centers.
All of Big Tech is heavily invested in this. Data centers are popping up all over the world. As stated in other articles, the amount of inference compute required will be enormous. Jensen Huang, CEO of Nvidia, says it will increase one billion X over the next decade.
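For a rough sense of what "one billion X over a decade" implies, a quick back-of-envelope calculation (illustrative only; the billion-X figure is Huang's projection, not a measured trend) shows the compound annual growth that claim assumes:

```python
# Back-of-envelope: what annual growth factor compounds to 1 billion X in 10 years?
target = 1_000_000_000  # "one billion X" (Huang's projection)
years = 10

annual_factor = target ** (1 / years)  # compound annual growth factor
print(f"Implied growth: ~{annual_factor:.1f}x per year")  # ~7.9x per year
```

In other words, the claim amounts to inference compute multiplying roughly eightfold every single year for ten years straight.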
Areas such as Texas are already seeing a deficit of power. This will have to be addressed.
The Huge Demand On Energy
Texas is the second most populous state in the US. It is also one of the largest energy producers. Long known for its oil (and then gas), it is now one of the leading renewable energy generators.
Cheap electricity was a driver of crypto mining in the state. Over the last couple of years, many of those firms switched to AI, and that shift is altering the state's energy demand.
Texas is rapidly emerging as an epicenter of artificial intelligence-driven energy demand, with an unprecedented surge in large-load power requests, a wave now dominated by AI data centers rather than Bitcoin miners.
Of course, there are ramifications of this.
ERCOT, the Electric Reliability Council of Texas, which operates the state’s independent power grid and oversees reliable electric service for about 90% of Texans, reported that its large-load interconnection queue has ballooned to 226 gigawatts of new requests, roughly 73% tied to AI facilities.
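The queue figures above can be put in concrete terms. A minimal sketch, using only the numbers in this article:

```python
# Back-of-envelope: how much of ERCOT's large-load queue is tied to AI?
# Figures from the article: 226 GW of interconnection requests, ~73% AI-related.
queue_gw = 226
ai_share = 0.73

ai_gw = queue_gw * ai_share
print(f"AI-related requests: ~{ai_gw:.0f} GW")  # ~165 GW
```

That is roughly 165 gigawatts of requested connections attributable to AI facilities alone, an enormous figure for a single state's grid.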
It is a situation that is not going away. The likes of Google, Microsoft, OpenAI, and Amazon are investing billions in facilities, pushing the load far ahead of supply. Generation is ramping nowhere near fast enough to keep pace.
One challenge is that renewable energy does not provide round-the-clock production, while data centers depend on consistent power. One solution is battery storage, but the US is lagging in that area as well.
Global Problem
Many view AI as an arms race. This means, by definition, it is an energy race.
China and the US are the leading players in both. When it comes to energy, the US is lagging its counterpart in the East. China spent the last couple of decades adding massive amounts of power. Not only did it get involved in renewables, but it also built out nuclear, added coal plants while the rest of the world was shutting them down, and is heavily invested in gas.
This means that China has the ability to keep building data centers, at least from an energy standpoint. The bottleneck there is chips.
Elon Musk is addressing this problem by starting to talk about space. His long-term vision is to put data centers in orbit, powered by 24-hour exposure to the sun. Many figure the ongoing costs would be one-tenth of what they are here on Earth.
To achieve this end, a lot is required. Better rockets are still necessary. Solar panels are designed for the relatively stable environment on Earth, not the extremes of space.
Unfortunately, Musk is probably correct in his assessment. The only way to solve this problem is to go off-planet. We need to pull in resources not presently accessed.
He even goes as far as to propose data centers on the Moon and Mars.
Every country is facing the same dilemma. Some hold out hope that nuclear fusion is the answer. In theory, it is. However, for the last half century, the theory and real-world progress have not matched, and it is still unknown whether fusion can be made economical.
Whatever the solution, something is required. There is no way that we see a slowing of AI services. It is still early and the demand is high. It is only going to increase over time as more people are accessing a greater number of applications with AI components built in.
This means inference compute, which is powered by electricity.
And we haven't even started to talk about the power needs of robots.
Posted Using INLEO