Mark Parsons sounds remarkably alert for a man just back from an exhausting tour of some of the world’s biggest supercomputing conferences and facilities.

In the previous fortnight he had been in the US, at Supercomputing 2018 – the industry’s mega-conference, attracting some 14,000 delegates – and in Japan, as a guest at the RIKEN Advanced Institute for Computational Science campus in Kobe.

“It was certainly a big trip, but incredibly interesting,” says Parsons, executive director of the UK’s national supercomputer, ‘ARCHER’, which sits in a rather dour industrial facility on the outskirts of Edinburgh.

I ask Parsons what the ‘big reveal’ was at this year’s event in Dallas, Texas, and he references the relatively new US supercomputer, appropriately named ‘Summit’; at 200 petaflops, it is a “massive” system, he says, capable of performing an unfathomable 200 million billion calculations per second.

Professor Parsons runs the Edinburgh Parallel Computing Centre (EPCC), which has oversight of ARCHER; at 2.5 petaflops, the five-year-old system has unfortunately fallen way down the global supercomputing charts. It is a fact that depresses Parsons, who freely admits he has stopped checking the rankings since he last found ARCHER languishing at a lowly 131st.

“It’s never been this low before, in terms of the big national system,” he says. “In the major European economies we are significantly lagging behind at the moment. The government, because of austerity, has simply not been investing.”

Fortunately for Parsons and his 110-strong staff, a fresh round of procurement for ARCHER II is under way, thanks to a £115m programme of investment that looks set to propel the EPCC back into the global top 20. The cash injection will come from the UK and Scottish Government-funded City Region Deal for Edinburgh and South East Scotland, of which just shy of £50m will go towards ARCHER II, a successor system with 30 petaflops of computing power.

Arguably there has never been a better time to secure that investment. Forbes reported this year that the world is currently producing 2.5 quintillion bytes of data every day, and the US and China are driving a supercomputing ‘arms race’ to try to make sense of it all. Both countries are expected to reach the 1,000-petaflop ‘exascale’ within the next couple of years – in what will be considered a ‘new age’ of computing performance (an exaflop is a billion billion calculations per second, for those keeping count).

For Parsons’ money, the US has the upper hand with its Oak Ridge, Tennessee, facility; although the Chinese surprised the world by developing the first 100-petaflop system, he says it has proved “very difficult” to program. “The Americans are certainly better at producing usable systems,” he explains. “But the Chinese have been extremely impressive in terms of creating this supercomputing industry.”

Perhaps the most inviting question, though, is what exactly the prize on offer is. In a fast-approaching age of automation, global mega-corporations are tantalisingly close to being able to use data in such a way that machines will soon perform far beyond the skill level of humans; in the US, algorithms have already outperformed cancer doctors at detecting tumours, and Google can now predict what you are going to write in search queries with terrifying accuracy (even if it struggles with gender pronouns).

Health, insurance, law and professional services generally, not to mention precision manufacturing and genomics, are all human fields that stand to be aided (replaced, even) by the rapid pace of technological development, which exascale computing will only serve to accelerate. Demand for the exascale is also coming from industry partners who use ARCHER, with Rolls-Royce and the Met Office among existing users who would benefit from even faster processing times, says Parsons.

Rolls-Royce is moving more of its extremely expensive engine certification tests into the ‘virtual’ space; instead of testing physical engines to the point of destruction (blades that shear off in the process have to be contained within the engine casing for an engine to be ‘certified’), the aerospace company will instead rely on supercomputer-generated modelling techniques. Steel pouring in foundries can be made more efficient by supercomputer simulations, and sails on a ship can be improved by modelling wind flow.

When people talk about moving data into the cloud, Parsons could quite legitimately raise the prospect of moving the ‘cloud into the data’; such is the advance expected in weather forecasting over the next few years that his supercomputer should be able to model the behaviour of individual clouds.

Government data, too, is likely to be pushed more and more into the supercomputer’s telescopic gaze. Much of the City Region Deal’s focus is on the creation of a ‘World Class Data Infrastructure’ (WCDI) for the region, and for Scotland more widely. In fact, Parsons stresses that computing power alone is not the only story of the government-backed funding programme: just as important is the ability to store and move around large data sets for use by the five data institutes that will benefit from the Deal – the Edinburgh Futures Institute, the new Bayes Centre, the Usher Institute, the Robotarium and the Easter Bush campus – all University of Edinburgh affiliates which will be supported by the wraparound WCDI facility.

Data-Driven Innovation (DDI) actually makes up £600m of the £1.3bn programme’s total spend, and is therefore a huge investment by government in the transformative power of data to stimulate the creation of new knowledge and future industries. Parsons has a role to play in education, too: the intention is that his facility will be accessible to “every child that goes to state school”, and that it will play its part in the training of 100,000 data scientists over the next 10 years.

“We’re going to go out and work with hundreds, if not thousands, of companies to help them adopt data-driven innovation in their business,” he says. “I think some of those projects will use my computers and some won’t, but the important thing is everything will be driven by the data that we hold; I think the biggest challenge we have in the world right now is dealing with the data deluge.

“One of the things that ARCHER has struggled with is moving large amounts of data around. The next ARCHER, and all the modern supercomputers, will be much better at moving data around; it has to become better at that. All of my problems as a supercomputing centre director relate to the data that we store, not the computer. The computer just works nowadays; it’s always the data that causes the headaches.”

And asked whether the DDI revolution is going to phase out all our jobs, Parsons refers back to the prescience of economists like JK Galbraith. “As the world became richer, things would become automated and we would work less [he predicted],” he says. “This is the challenge to policy-makers: if we get into the position with AI and robotics where we need to work less because we’ve got machines doing more for us, then we need to ensure that we have a fair society. I think it’s a very interesting discussion that society needs to have with itself.”

Prof. Mark Parsons will speak at Digital Cities & Regions, a FutureScot Policy and Technology Conference, at Edinburgh City Chambers on Thursday, 7 March, 2019 – visit www.futurescotevents.com