The Met Office is the UK’s national weather service and
underpins the country’s prediction capabilities, from short- and long-term
weather forecasts to climate-change projections. At the heart of the Met Office’s
weather forecasting is a computational model, which takes the current
and past state of the weather and simulates how it will evolve over the
next few hours, days and months.
As you may have guessed, this modelling program isn’t as
simple as an app on your phone; in fact, it requires a supercomputer –
but even supercomputers are constantly evolving and improving, at a
similar speed to our own personal devices. While the number of
transistors being packed onto a chip is still increasing, energy
restrictions mean that increases in processor clock speed have ground to a halt.
“Instead, manufacturers are using the additional
transistors to produce chips containing more than one processor core,”
says Hartree Centre computational scientist Rupert Ford. “It is now down
to the software to make use of these additional cores to achieve higher
performance.”
Supercomputers which might once have had just a handful of processor cores now routinely contain hundreds of thousands of them.
“Harnessing a vast number of cores to do something as
complex as a weather forecast is no mean feat,” says Rupert. “An analogy
that is often used is ploughing a field: it is much easier to get two
large horses to pull a plough than it would be to use 1,000 chickens!
“When you introduce different types of computer processor,
it may be even more complex – say, 500 chickens and 5,000 mice, which
must be programmed in different ways. Given that the large, complex
models used by organisations like the Met Office typically take 10-15
years to develop and have an operational life of approximately 30 years,
this presents a big challenge: how do you write a program for a computer
that does not yet exist?”
This is the basis of the GungHo project, a collaboration
between the Met Office, the Natural Environment Research Council (NERC) and
the Hartree Centre. We are working to develop the next-generation
weather and climate computer model, due to become operational in the
mid-2020s. The code-generation tool at the heart of this work is called
PSyclone, and it is hoped that it will be extended for use in other
domains in the future.
The construction of such a model requires expertise in both
meteorology and computational science, which presents another
challenge, as few scientists are experts in both. A key design principle
adopted within the GungHo project is therefore the 'separation of
concerns', whereby elements of the software dealing with the natural
science are kept apart from those dealing with the computational science
(i.e. what type of supercomputer the model is running on).
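To make the idea concrete, here is a minimal sketch in Python (the language PSyclone itself is written in). The names and the toy advection scheme are invented for illustration, not taken from the real LFRic or PSyclone code: the point is simply that the natural-science code is a pure kernel acting on one grid cell, while the loop over the grid lives in a separate layer that can be swapped for a different machine without touching the science.

```python
# Natural-science layer: a hypothetical kernel that updates a single
# grid cell. It knows nothing about how the grid is traversed or
# parallelised.
def advect_kernel(field, wind, dt, i):
    """Toy upwind advection update for cell i (illustrative only)."""
    return field[i] - dt * wind[i] * (field[i] - field[i - 1])


# Computational-science layer: the traversal of the grid. This is the
# part that would be regenerated or replaced for a different
# supercomputer (serial, threaded, GPU...), leaving the kernel intact.
def apply_kernel(kernel, field, wind, dt):
    return [field[0]] + [kernel(field, wind, dt, i)
                         for i in range(1, len(field))]


new_field = apply_kernel(advect_kernel, [1.0, 2.0, 4.0],
                         [0.5, 0.5, 0.5], 0.1)
```

Because the two concerns never mix, a meteorologist can change the kernel and a computational scientist can change the traversal independently.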
“One of the most novel aspects of the project is that those
parts of the model that adapt performance to suit the supercomputer it
is running on are automatically generated,” says Andrew Porter, another
of the scientists working on the project.
“This removes the need for programmers to hand-code
software for a specific supercomputer, greatly improving productivity.
Optimising the model for a different type of supercomputer is a matter
of applying different transformations using the PSyclone tool rather
than rewriting the code, simplifying what can be a costly and
time-consuming process.”
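The transformation idea can be sketched as follows. This is a hypothetical illustration of the approach, not the real PSyclone API (the class and function names here are invented): the model's loops are held in a simple in-memory schedule, and a short transformation script rewrites that schedule for a particular machine, so that retargeting means choosing a different script rather than rewriting the model.

```python
# Hypothetical sketch of transformation-driven code generation.
# Names are invented for illustration; the real PSyclone API differs.

class Loop:
    """One loop in the model's schedule, wrapping a science kernel."""
    def __init__(self, kernel_name):
        self.kernel_name = kernel_name
        self.directives = []


def openmp_parallelise(schedule):
    """A transformation targeting a many-core CPU: mark every loop
    in the schedule with an OpenMP-style directive."""
    for loop in schedule:
        loop.directives.append("!$omp parallel do")
    return schedule


def generate_code(schedule):
    """Emit (toy, Fortran-flavoured) source from the schedule."""
    lines = []
    for loop in schedule:
        lines.extend(loop.directives)
        lines.append(f"do i = 1, n: call {loop.kernel_name}(i)")
    return "\n".join(lines)


schedule = [Loop("advect_kernel"), Loop("diffuse_kernel")]
code = generate_code(openmp_parallelise(schedule))
print(code)
```

Targeting a different machine would mean writing a different transformation (say, one that inserts GPU offload directives) and regenerating, while the schedule and the kernels it calls stay unchanged.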