From the horse’s mouth

Recently, the BBC interviewed Sundar Pichai, the CEO of Alphabet, Google’s parent. With all the claims that the artificial intelligence (AI) boom is a bubble – something I have noted cannot be known until after the event – I wonder whether the BBC was motivated to go straight to the source. 

Describing the surge in AI funding as a remarkable period, yet one marked by certain unrealistic expectations, Pichai warned viewers of BBC News that no business would escape unscathed if the AI boom collapses.

While the BBC interview covered topics ranging from power demands and delays in environmental goals to investments in the UK, the reliability of AI systems, and how the AI shift might reshape employment, it was Pichai’s answer to the question of whether Google could shield itself from the fallout of a collapse that attracted some attention. 

Despite the recent investment in his company by Warren Buffett’s Berkshire Hathaway, Pichai suggested the company is equipped to handle a bubble bursting but noted, “I think no company is going to be immune, including us.”

Alphabet’s stock has doubled since April, hitting a US$3.6 trillion market capitalisation, partly on the back of its creation of custom AI processors that rival Nvidia’s.

Escalating market valuations and OpenAI’s status as a private company are potentially masking the industry’s losses, prompting some investors to publicly voice concerns about OpenAI’s US$1.4 trillion in spending commitments despite earnings that are a tiny fraction (1/1000th) of the anticipated spend.

I have several concerns. 

One is that OpenAI’s forecast losses of US$9 billion this year, rising to US$74 billion in 2028/29, are effectively making all the other listed players look unduly cheap on a price-to-earnings (P/E) basis – much of that loss-funded spending flows through as revenue, and therefore earnings, for the listed suppliers.

Another concern is that throughout history, we have seen the same cycle repeated whenever a new general purpose technology (GPT) has emerged. I am not talking only about the Dotcom bubble of 1999/2000. Going back to shipping, the automobile, electricity, commercial flight, the telephone, television, the computer and then the internet, and more recently, big data, machine learning, the internet of things (IoT) and now artificial intelligence (AI), what we see is a pattern of behaviour that inevitably leads to the same outcome. 

After the technology emerges, excitement about it builds. The hype attracts investors who compete to be involved, reducing the cost of capital for participants. Armed with cheap capital (initially equity capital), participants use it to build out and scale the technology in a fast-paced land grab. If the hype lasts long enough, equity raisings eventually run out, and debt is accumulated to further fund the build-out. Cheap funding and the consequent land grab create an oversupply. At the same time, the investment theme, long considered ‘structural’ in its impact on the course of human history, inevitably bumps up against customer demand that is not structural but ‘cyclical’. Oversupply meets a cyclical consumer. Then prices collapse, and hoped-for returns vaporise.

Of course, even though the bubble bursts, the technology will indeed change the course of human history. In order to do that, however, it needs to be widely adopted by billions of people. Thankfully, the period of ‘creative destruction’ delivers cheap acquisition prices for distressed asset buyers who pick up the pieces and can now deliver the technology to the masses affordably.

My next concern is that there isn’t enough money in customers’ pockets today to pay for AI tools at a level sufficient to justify the spend on the build-out. Some calculations suggest the world needs to spend 6-8 times its current software spend to deliver a sufficient return on AI participants’ capital expenditure. Others have suggested every one of the world’s 1.6 billion iPhone users would need to spend an additional US$35 per month on AI software to produce a return of 10 per cent – which, if true, is not a particularly exciting return.
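A quick back-of-the-envelope sketch of the iPhone figure cited above shows the scale involved. (The 10 per cent return calculation here is a simplification that treats the extra revenue as the return itself, ignoring operating costs and taxes, so the implied capital base is an illustration only.)

```python
# Back-of-the-envelope check of the figures quoted above.
IPHONE_USERS = 1.6e9          # the world's iPhone users, per the estimate cited
EXTRA_SPEND_PER_MONTH = 35.0  # additional US$ per user per month on AI software

# Implied extra annual revenue across all users.
implied_annual_revenue = IPHONE_USERS * EXTRA_SPEND_PER_MONTH * 12
print(f"Implied extra annual revenue: US${implied_annual_revenue / 1e9:.0f} billion")

# If that entire revenue stream were the return, a 10 per cent return target
# implies a capital base roughly ten times larger.
implied_capital_base = implied_annual_revenue / 0.10
print(f"Implied capital base at a 10% return: US${implied_capital_base / 1e12:.2f} trillion")
```

Even on these generous assumptions, the numbers run to hundreds of billions of dollars of new annual software spend supporting trillions of dollars of capital.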

Finally, I have concerns that the artificial intelligence (AI) dream can easily be ‘interrupted’ by many factors, including the inaccuracy of AI tools (Don’t blindly trust what AI tells you, says Google’s Pichai), the slowing marginal evolutionary improvements to existing AI models, likely delays to projects, and the costs and shortages of energy and water (Pichai pointed out the “immense” power requirements of AI, which accounted for 1.5 per cent of global electricity use last year, according to the International Energy Agency).

In comments echoing those of U.S. Federal Reserve chairman Alan Greenspan, who in 1996 warned of “irrational exuberance” in the market several years ahead of the dotcom crash, Mr Pichai said the industry can “overshoot” in investment cycles like this.

While some compare this boom to the internet bubble of 1999/2000, the argument that many of the AI players are established dominant monopolies rather than profitless startups is fair. Of course, because today’s companies aren’t profitless, the valuations in aggregate aren’t seen as ludicrous. But the aggregate price-to-earnings (P/E) ratio of today’s ten largest companies is higher than the same figure in early 2000, and OpenAI’s privatisation of losses could be masking much worse economics for the others, so who knows.

The historical patterns of invention, hype, low-cost capital, overcapacity, creative destruction, and distressed buying give Pichai’s warning some credence – sectors can overextend during these investment surges even though the technology is ultimately transformative.

“We can look back at the internet right now. There was clearly a lot of excess investment, but none of us would question whether the internet was profound,” he said.

Both can be true, and history is replete with examples where they were. The technology does change the world, but investors who rushed in too early were often wiped out.

Pichai notes, “I expect AI to be the same. So, I think it’s both rational and there are elements of irrationality through a moment like this.”

Pichai’s views come after another warning from JP Morgan’s chief Jamie Dimon, who recently told the BBC that while AI funding could yield returns, a portion of the capital might “probably be lost”.

As always, time will tell.

Roger Montgomery is the Founder and Chairman of Montgomery Investment Management. Roger has over three decades of experience in funds management and related activities, including equities analysis, equity and derivatives strategy, trading and stockbroking. Prior to establishing Montgomery, Roger held positions at Ord Minnett Jardine Fleming, BT (Australia) Limited and Merrill Lynch.

He is also the author of the best-selling investment guide for the stock market, Value.able – how to value the best stocks and buy them for less than they are worth.

Roger appears regularly on television and radio, and in the press, including ABC radio and TV, The Australian and Ausbiz. 

This post was contributed by a representative of Montgomery Investment Management Pty Limited (AFSL No. 354564). The principal purpose of this post is to provide factual information and not provide financial product advice. Additionally, the information provided is not intended to provide any recommendation or opinion about any financial product. Any commentary and statements of opinion however may contain general advice only that is prepared without taking into account your personal objectives, financial circumstances or needs. Because of this, before acting on any of the information provided, you should always consider its appropriateness in light of your personal objectives, financial circumstances and needs and should consider seeking independent advice from a financial advisor if necessary before making any decisions. This post specifically excludes personal advice.


