When we ask most people where their electricity comes from, the typical reply will be, “from the wall” or “from the outlet.”

Occasionally we might hear, “from the power company.”

And oddly, it’s very rare to hear the correct answer, “from coal” or “from natural gas” – the top two sources of electricity worldwide, together making up almost 60% of global electricity production.

And coal still has the lead by a large margin, coming in at around 36% of global electricity production.

These are the sources of electricity that fuel our electric vehicles, homes, stoves, factories, data centers, and smartphones.

They’re also the sources of electricity powering the advanced computing systems behind artificial intelligence (AI).

Speeding Up, Not Slowing Down

The energy consumption of artificial intelligence has recently become a topic of widespread interest.

And ironically, there are no serious efforts being made to curtail electricity production for AI. 

Nor should there be. I’ll explain why in a bit.

Recent forecasts for AI-related electricity consumption show data center demand roughly doubling between 2023 and 2026, rising from about 500 terawatt-hours (TWh) to more than 1,000 TWh.

That’s just a three-year period, with the majority of demand being driven by the latest developments in AI. Pretty remarkable.

Source: International Energy Agency

In 2022, data centers accounted for 1.4-1.7% of global electricity consumption. And demand from large data centers has been increasing 20-40% annually.

And in countries that have large footprints of data center operations, the numbers are even more astounding. Data centers in Ireland made up 18% of the country’s electricity demand in 2022.
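The arithmetic behind these forecasts is easy to check. Here's a minimal sketch in Python, using the ~500 TWh 2023 starting point and the 20-40% annual growth range cited above (the fixed-rate compounding is a simplifying assumption, not a forecast):

```python
# Back-of-envelope check: compound annual growth of data center
# electricity demand, using the growth range cited above.

def project_demand(start_twh: float, annual_growth: float, years: int) -> float:
    """Project electricity demand forward at a fixed annual growth rate."""
    return start_twh * (1 + annual_growth) ** years

start = 500  # approximate 2023 data center demand in TWh

for growth in (0.20, 0.30, 0.40):
    projected = project_demand(start, growth, years=3)
    print(f"{growth:.0%} annual growth: ~{projected:,.0f} TWh by 2026")
```

At the midpoint of that range (~30% per year), demand passes 1,000 TWh by 2026, consistent with the doubling forecast above.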

This trend isn’t slowing down. In fact, it’s speeding up.

To provide a simple example of what’s driving this trend, and why it’s so important, let’s look at the power and costs required to train each successive generation of AI.

Source: Leopold Aschenbrenner

The above table might come as a shock.

Training OpenAI’s GPT-4, the technology behind ChatGPT, cost several hundred million dollars and required about 10 megawatts (MW) of power – enough electricity to supply roughly 10,000 homes.

It’s no secret that OpenAI is already training “GPT-5” (or whatever it will be called) and that the release of this new large language model (LLM) will happen in the coming months. 

The cost of training will be measured in billions of dollars. And the electricity required will be around 100 megawatts, enough to power about 100,000 homes.

And the ever-increasing power requirements – and the money behind it – won’t slow down.

The Money Driving the Acceleration

“OOMs” stands for orders of magnitude. Each successive release represents about a 10X increase in both cost and electricity.

“H100s” are NVIDIA’s H100 GPUs (or an equivalent product), the number of which also increases by an order of magnitude with each successive large AI model.

By 2026, a gigawatt of power – roughly the entire output of a large nuclear reactor – will be required to reach artificial general intelligence (AGI).

And by 2030, $1 trillion and more than 20% of current U.S. electricity production will be needed to develop an artificial superintelligence (ASI).
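The trajectory described above can be sketched numerically. This is an illustration of the trend, not an independent forecast: the ~10 MW GPT-4 starting point and the one-OOM-per-generation step come from the table, while the ~1 kW average household draw and ~4,200 TWh/yr of U.S. generation (~480 GW on average) are rough outside assumptions used only for scale:

```python
# Sketch of the order-of-magnitude-per-generation power trend.
# Assumptions: ~1 kW average continuous household draw, and
# ~4,200 TWh/yr of U.S. electricity generation (~480 GW average).

US_AVG_GW = 4200 / 8.76  # ~4,200 TWh/yr over 8,760 hours -> ~479 GW average
AVG_HOME_KW = 1.0        # assumed average continuous household draw

power_mw = 10.0  # approximate GPT-4 training power, from the table above
for label in ["GPT-4", "next LLM", "~2026 (AGI-scale)", "~2028", "~2030 (ASI-scale)"]:
    homes = power_mw * 1_000 / AVG_HOME_KW         # MW -> kW, one home per kW
    us_share = power_mw / 1_000 / US_AVG_GW * 100  # MW -> GW, as % of U.S. average
    print(f"{label}: {power_mw:>9,.0f} MW ~ {homes:>11,.0f} homes ~ {us_share:5.2f}% of U.S. generation")
    power_mw *= 10  # each successive generation: one order of magnitude
```

The last row lands at roughly 100 gigawatts, a bit over 20% of average U.S. generation – which is where the more-than-20% figure comes from.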

The numbers are hard to imagine, but they are a clear indication of what will be needed in terms of capital and electricity to achieve what will quickly become the largest productivity innovation in human history.

Don’t believe these numbers? How about this…

Microsoft and OpenAI have been planning a $100 billion data center facility, comprising millions of high-performance semiconductors and slated to be operational by 2028. The project’s name is Stargate, which is probably a good name, as we’re going to feel like we’ve stepped through a portal in time.

Computational power like that will get us beyond artificial general intelligence and a step closer to ASI.

For anyone who doesn’t think that’s within the realm of possibility, consider this. Microsoft (MSFT) currently has $80 billion in cash. It will generate $70 billion in free cash flow this fiscal year (ending June 30), and another $79 billion next fiscal year (ending June 30, 2025).

Microsoft wouldn’t even need to use debt if it didn’t want to.

But it’s not just Microsoft.

Alphabet (GOOGL) is sitting on $108 billion in cash and will generate $81 billion in free cash flow this year, and $92 billion next. Meta (META) has $58 billion in cash and $44 billion in free cash flow this year, growing to $50 billion next.

The money is there, within these companies alone, for each to make a run at being the first to AGI. And there is at least another $250 billion in private capital ready to be invested in these projects.
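A quick tally of the figures just cited makes the point concrete (a back-of-envelope sketch; all figures are from the text above, in billions of dollars, counting cash on hand plus two years of free cash flow):

```python
# Tally of available capital per the figures above: cash on hand
# plus this year's and next year's free cash flow, in $ billions.
companies = {
    "Microsoft": {"cash": 80, "fcf": [70, 79]},
    "Alphabet":  {"cash": 108, "fcf": [81, 92]},
    "Meta":      {"cash": 58, "fcf": [44, 50]},
}

total = 0
for name, c in companies.items():
    available = c["cash"] + sum(c["fcf"])
    total += available
    print(f"{name}: ~${available}B available")

print(f"Combined: ~${total}B, before any debt or the ~$250B in private capital")
```

That's over $650 billion across just three companies, without borrowing a dollar.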

Why the rush?

It’s pretty simple. There’s a multitrillion-dollar opportunity with AGI.

But even if we take the money out of the equation, there is an incredible implication of AGI. 

As I wrote in Tuesday’s Bleeding Edge – The Power of a Human Brain, there won’t just be one AGI that the world will depend upon. There will be millions of these “agents.”

These agents will be intelligent enough to perform autonomous research on just about any topic imaginable. And “they” will be able to work around the clock.

With this technology, research on battery technology, advanced materials, nuclear fusion, small modular reactors, genetics, drug development, robotics, aging, and so many other disciplines will be transformed almost overnight.

Efforts to curtail electricity production or hold back the development of AGI will only delay the breakthroughs in clean energy production that we’re so close to achieving with small modular reactors and nuclear fusion.

This isn’t a time to decelerate. Quite the opposite.

It’s time to accelerate. Time to push the envelope, like a jet going supersonic. It’s time to step through the portal.

Because on the other side of the shock wave is a world of “intelligence” and clean energy abundance.

That’s a world I want to see.