
How Intel lagged behind in the AI chip boom

In 2005, there was no inkling of the artificial intelligence explosion that would occur years later. But executives at Intel, whose chips served as the electronic brains in most computers, were faced with a decision that might have changed the way that transformative technology developed.

Mr. Paul Otellini, then CEO of Intel, presented the board with a surprising idea: buy Nvidia, a Silicon Valley startup known for chips used for computer graphics. The price tag: a whopping US$20 billion ($26.4 billion).

Some Intel executives believed that the underlying design of graphics chips could eventually take on important new roles in data centres, an approach that would come to dominate AI systems.

But the board resisted, according to two people familiar with the boardroom discussion who spoke only on condition of anonymity because the meeting was confidential. Intel had a poor track record of absorbing companies. And the deal would have been Intel’s most expensive acquisition by far.

Faced with skepticism from the board, Mr. Otellini, who died in 2017, backed down, and his proposal went no further. One participant later called the meeting "a fateful moment."

Today, Nvidia is the unrivaled AI chip king and one of the most valuable companies in the world, while Intel, once the semiconductor superpower, is faltering and has been left behind by the AI gold rush. Nvidia's stock market value, for years a fraction of Intel's, now exceeds US$3 trillion, roughly 30 times that of the struggling Silicon Valley icon, which has fallen below US$100 billion.

As the company’s valuation has fallen, some major technology companies and investment bankers have pondered what was once unthinkable: that Intel could be a potential takeover target.

Such scenarios increase the pressure facing Mr. Patrick Gelsinger, who was appointed CEO of Intel in 2021. He has focused on restoring the company's former lead in chip manufacturing technology, but industry watchers say Intel urgently needs hot products – such as AI chips – to revive sales, which fell more than 30 percent from 2021 to 2023.

“Pat Gelsinger is very focused on the production side,” says Professor Robert Burgelman of the Stanford Graduate School of Business. “But they missed AI, and it has now caught up with them.”

The story of how Intel, which recently cut 15,000 jobs, fell behind in AI is representative of the broader challenges the company now faces. Opportunities were missed, decisions were misguided and execution was poor, according to interviews with more than 20 former Intel executives, board members and industry analysts.

The trail of missteps was a byproduct of a corporate culture built on decades of success and high profits, dating back to the 1980s, when Intel chips and Microsoft software became the twin engines of the booming PC industry.

That culture was insular and focused on the company's franchise in personal computers and, later, in data centres. Intel executives half-jokingly described the company as “the world’s largest single-celled organism,” an isolated, self-contained world.

It was a business ethos that worked against the company when Intel repeatedly tried, and failed, to become a leader in chips for AI. Projects were initiated, continued for years, and then abruptly halted because Intel leadership lost patience or the technology fell short. Investments in newer chip designs consistently took a back seat to protecting and expanding the company’s money-making mainstay: generations of chips based on Intel’s PC-era blueprint, the so-called x86 architecture.

“That technology was Intel’s crown jewel – proprietary and very profitable – and they would do anything in their power to keep that going,” said Professor James D. Plummer, who teaches electrical engineering at Stanford University and is a former Intel director.

Intel leaders sometimes acknowledged the problem. Mr. Craig Barrett, former CEO of Intel, once compared the x86 chip business to a creosote bush – a plant that poisons competing plants around it. Yet profits remained high for so long that Intel didn’t really change course.

When Intel was considering a bid for Nvidia, the smaller company was widely seen as a niche player. Its specialized chips were mainly used in machines for computer gamers, but Nvidia had begun adapting them for other kinds of computing, such as oil and gas exploration.

While Intel’s microprocessor chips excelled at quickly performing calculations one at a time, Nvidia’s chips delivered superior graphics performance by splitting tasks and spreading them across hundreds or thousands of processors running in parallel – an approach that would pay off years later in AI.

After the Nvidia idea was rejected, Intel, with the support of the board, focused on an internal project, codenamed Larrabee, to get ahead of the competition in graphics. The project was led by Mr. Gelsinger, who joined Intel in 1979 and steadily rose to become a senior executive.