Money for UK supercomputer plan ‘needs to be in billions’, says Javid

Britain’s £900mn plan to build a supercomputer to boost research in artificial intelligence is far too modest and will leave the country lagging rivals, former chancellor Sajid Javid warned on Tuesday.

Javid told the London Tech Week conference that “billions” should be invested in the project to provide 10 times the capacity of the “exascale” machine promised by Jeremy Hunt, current chancellor, in his March Budget.

Hunt announced £900mn of funding for the machine to implement the recommendations of an independent “Future of Compute” review, saying that “AI needs computer horsepower”.

But Javid claimed the government’s plans would, by 2026, only produce a facility with the computing power that OpenAI’s ChatGPT had access to in 2022, leaving Britain trailing rivals such as the US.

“It’s nowhere near where we need to be,” he said. “The investment cannot be in the hundreds of millions; it’s going to have to be in the billions. I don’t think we’ve got a choice.”

Javid, a former adviser to the US artificial intelligence company C3 AI, said speed was of the essence and that the project needed to be brought on stream much more quickly.

He added that while Rishi Sunak understood the issues — the prime minister is convening the first global AI summit in Britain in the autumn — the Whitehall system was “woefully unprepared” for the AI revolution.

His comments echoed a warning from Sir Tony Blair, the former Labour prime minister, and Lord Hague, the former Tory leader, who this week wrote in a report that if the UK “does not up its game quickly, there is a risk of never catching up”.

Exascale computers require highly specialised equipment, including tens of thousands of processors and intensive cooling systems. They will be capable of radically reducing the time it takes to process vast bodies of information and of tackling more complex kinds of data, promising scientific breakthroughs in fields such as biology, climate change and security.

The sums promised by the UK government are comparable to supercomputer projects in the US and EU but pale in comparison to the multibillion-dollar resources available to Silicon Valley’s richest companies. 

OpenAI raised $1bn from Microsoft in 2019 to build the “large language model” (LLM) technology that underpins ChatGPT, its breakthrough chatbot that launched to great acclaim in November last year.

LLMs require huge computing resources to train on the massive amounts of data that power chatbots and other “generative AI” systems capable of creating humanlike text and images.

Earlier this year, Microsoft invested several billion dollars more in OpenAI, which in March released GPT-4, an upgraded version of its AI model that delivered another step change in the quality of ChatGPT’s responses.

In 2018, the US earmarked up to $1.8bn for two exascale supercomputers. Last year the Frontier supercomputer at the US Department of Energy’s Oak Ridge National Laboratory in Tennessee bore the first fruits of that investment, becoming the first machine to break the exascale barrier.

In doing so, it recorded performance levels of 1.1 exaflops, or more than 1 quintillion calculations per second. 

The European Commission announced a year ago that the Jülich Supercomputing Centre in Germany would host Jupiter, Europe’s first exascale computer, which is scheduled to launch later this year.

The commission allocated a maximum total budget of €500mn for Jupiter, jointly funded by Brussels and the German government.

The Department for Science, Innovation and Technology was contacted for comment.
