Why you should make room for data centers in your portfolio

Hello, Reader.

All eyes are on the AI arena, watching the tech titans battle it out to see which AI system will take the crown.

Nvidia Corp. (NVDA) stands out as a fan favorite. The giant remains undefeated as an AI processing powerhouse.

But look beyond the bright lights and roars of the AI battlefield, and you’ll spot another exciting campaign raging in the shadows…

The surprisingly ruthless data center war.

While AI soaks up all the glory with its flashy technological prowess, the unglamorous realm of data centers toils away behind the scenes.

Early cloud computing companies like Rackspace Technology Inc. (RXT) impressed with their ability to turn a small Dell laptop into a high-performance machine. As that initial magic faded, newer and bigger cloud providers took over the industry.

So in today’s Smart Money, we’ll shine a light on the data center wars, today’s cloud computing giants, and what it all means as an investment opportunity.

Let’s dive in…

Enterprise IT on the move

The first shift toward cloud computing occurred in the mid-2010s, when enterprise customers began migrating to the cloud. These large companies and organizations needed far more computing power than the ordinary user, and only large providers like Amazon.com Inc. (AMZN) could guarantee enough capacity.

The industry was never particularly profitable. Amazon itself saw its return on invested capital (ROIC) fall from the low-20% range into the single digits this decade as cloud services crowded out its lucrative e-commerce business.

Servers are expensive and need to be replaced every two years, so cloud computing companies are more like airlines and automakers: capital-intensive businesses with little chance of long-term profitability.

But in 2022, all that changed. And cloud computing companies can thank artificial intelligence for that.

AI has the strange property of working well in parallel. Current artificial intelligence models rely on huge multidimensional matrices (also called tensors), which can be processed simultaneously by multiple processors.

Indeed, the “neuron weights” of current neural networks are most naturally organized as matrices, and each entry in an output matrix can be calculated independently of all the others. This is why graphics processing units (GPUs) are better at AI-driven calculations than central processing units (CPUs), and why companies like Nvidia have done incredibly well. When it comes to linear algebra, many slower processors outperform a single ultra-fast chip.
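A toy NumPy sketch (illustrative only, not any vendor’s code) makes the independence point concrete: each entry of a matrix product depends on just one row of the left matrix and one column of the right, so every entry can be computed separately, and therefore in parallel.

```python
import numpy as np

# Each entry of the product C = A @ B depends only on one row of A and one
# column of B, so every entry can be computed independently of the others.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))

# Compute C entry by entry: each C[i, j] is its own independent dot product.
C = np.empty((4, 5))
for i in range(4):
    for j in range(5):
        C[i, j] = A[i, :] @ B[:, j]

# A GPU evaluates these independent dot products simultaneously across
# thousands of cores; the single vectorized call below stands in for that.
assert np.allclose(C, A @ B)
```

The 20 dot products in the loop share no intermediate results, which is exactly the shape of work a GPU’s many slow cores handle better than one fast CPU core.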

Tech giants quickly realized that this concept also applied to entire data centers. Amazon’s $10 billion “Project Atlas” data center campus being developed just north of Jackson, Mississippi, will span 1,700 acres when completed. The first building alone will cover 3 million square feet, or about 55 football fields. Alphabet Inc. (GOOG), too, is converting many of its existing data centers into similar giants.

These new data centers, called “hyperscale” facilities (their operators are the “hyperscalers”), now give Google, Amazon, and Microsoft Corp. (MSFT) their moats.

These specialized server systems require huge upfront investments, and the three-way oligopoly is turning a once barely profitable business into a lucrative operation. Analysts now expect the three companies to generate nearly $200 billion in profits next year, largely from this new business. Amazon itself is expected to see its return on invested capital rise to 26% this year.

What’s next for hyperscalers – and AI?

These three companies should do well in the medium term. Not only are these tech giants snapping up prime real estate, infrastructure, and chips ahead of the competition, but they are also investing billions in creating proprietary, cutting-edge AI models. That matters: once a customer invests in Google’s Vertex AI system, for example, moving to a competing platform becomes much more difficult, because code must be rewritten and processes rebuilt.

However, the long-term outcome is far from certain. A few weeks ago, The Economist rightly noted that today’s hyperscalers look a lot like other historical examples of irrational exuberance…

At first, railway tracks were laid for locomotives that were soon replaced by more powerful ones. As rolling stock became heavier, the lines had to be relaid with sturdier materials. During the 1990s, telecommunications companies increased their capital expenditure three-and-a-half-fold and laid 600 million km of cable… Tech giants’ assumptions about people’s willingness to pay for chatbots and other nifty “gen-AI” tools could turn out to be equally misplaced.

This is true in a sense. Amazon depreciates its servers over five years, while Google does so over six. Every billion dollars spent today on new equipment will cost another billion to replace by the end of the decade… and so on in perpetuity.
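The replacement-cycle math above can be sketched in a few lines. The $1 billion figure and the five- and six-year schedules come from this article; everything else is a simple illustration of straight-line depreciation, not company guidance.

```python
# Straight-line depreciation: an asset's cost is expensed evenly over its
# useful life, after which the fleet must be bought all over again.
capex = 1_000_000_000        # dollars spent on servers today (per the article)

amazon_annual = capex / 5    # Amazon's five-year depreciation schedule
google_annual = capex / 6    # Google's six-year schedule

# $200m per year on Amazon's schedule; roughly $167m on Google's.
assert amazon_annual == 200_000_000
assert round(google_annual) == 166_666_667
```

Either way, the entire outlay recurs within the decade, which is why capital intensity, not demand, is the long-run question for this business.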

There is also the uncertainty of innovation. Today’s tensor-based AI models run well on hyperscale data centers, but it is easy to take that for granted. Tomorrow’s AI could well run on quantum or even biological computers. There is no “law” that says ultra-large 1,700-acre data centers will have a place in the future.

Fortunately, we’re not there yet. Amazon and Alphabet are expected to report record profits this year, and rising analyst estimates are a strong sign that things are still going well. Both companies remain strong data center players.

As we see this explosion of artificial intelligence on all fronts… hundreds of millions of people could find themselves on the wrong side of a great flood of destruction unleashed by AI.

This is why I am issuing an AI Code Red.

I created a short presentation to help you prepare for what’s to come. Click here for more details.


Eric Fry