Big Tech is racing to claim its share of the generative AI market

How much of the booming generative AI market will disappear into the maws of the biggest tech companies? And what will be left over for the many other companies that are hoping to cash in on the tech world’s latest craze?

It may be less than five months since the launch of ChatGPT, but those questions already loom large as the biggest tech companies race to stake out large parts of the territory for themselves.

Amazon has been the latest to set out its generative AI stall, through its Amazon Web Services cloud computing arm. Along with its own large AI models, named Titan, AWS last week said it would offer access to several others on its platform, including the large language model from AI start-up Anthropic and the open-source Stable Diffusion image-generating system.

Hosting and delivering independent AI services such as these is all part of an attempt by AWS to put its cloud at the centre of the new generative AI market. AWS also supplies all the tools developers need to build, train and deploy their own generative AI models, and for good measure designs its own specialised chips for both training and running large machine-learning systems.

It is not alone. This month, Google boasted that supercomputers built with the latest generation of its own chips, called TPUs, have achieved breakthrough levels of performance in training large AI models. Microsoft has also joined the stampede among the biggest tech companies to develop its own specialised chips for AI, according to one senior figure (its plans were first reported by The Information).

Moves like these show just how far the big tech companies are going in their attempts to control all parts of AI’s new computing “stack” — that is, the layers of technology that are required to support and run demanding new computing workloads and turn them into useful services for customers.

At the bottom of this stack are chips designed to process the vast amounts of data needed to train large AI models. Other layers include the algorithms and other software required to train and deploy the systems; the large-scale language and vision models themselves, known as “foundation models” because they act as a base level of intelligence; and finally, the many applications and services that run on top of these models to shape the technology for specific markets and uses.

Amazon, Microsoft and Google are already staking their claim to most of the lower levels of this hierarchy of technology, making it hard for others to break into a market where operating at huge scale with the lowest unit costs will be essential.

Even Elon Musk, who claims his nascent AI company will be a “third force” in AI against Google and the Microsoft/OpenAI partnership, faces a steep climb. Tesla, his electric car company, has already built an AI computer to handle vision recognition. This week, the irrepressible Musk claimed selling this technology to others could one day be worth “hundreds of billions”. But catching up with the tech giants that have already spent years fine-tuning their technology for the world’s largest language and image models will not be easy.

The question now is how much further up the “stack” the cloud companies try to move, in the process claiming more of the value from the new technology for themselves.

For those who don’t already have it, control of their own large AI models (or, in Microsoft’s case, a close alliance with OpenAI) seems a likely goal. Foundation models cost a huge amount to develop and can be put to work on a wide range of applications, making them a natural first step for any big tech company with AI ambitions.

The centrality of these large models to their wider strategic goals means the companies are not likely to view them as profit centres in their own right. That is certainly how Emad Mostaque, head of Stability AI, the company behind Stable Diffusion, sees it. He warns of a “race to the bottom” in pricing as the big tech companies battle to establish their main AI systems, leaving little room for anyone else.

Mostaque is instead counting on two things. One is that Amazon will always be happy to make money hosting rival AI models in its cloud and will not try to supplant them with its own. The second is that there will still be room for differentiation between AI models, and that not all customers will want to rely on giant, opaque systems run by a handful of dominant tech companies. If he is wrong, generative AI’s early, competitive phase could prove very short-lived.

richard.waters@ft.com
