Doing Science in the Digital Age: If everyone is parallel processing, what’s the problem?
The phone I have in my pocket is more powerful than the first supercomputer I used, and my phone is 4 years old! As we head towards exascale and beyond, what is the future of parallel computing and, more importantly, what challenges to its use still remain?
When we think of massively parallel computers, we think of modelling and simulation in the physical sciences. But the same techniques can be applied to other disciplines, given the right tools and skills. So why isn't parallel programming ubiquitous in research? Do we need to change our definition of what using high performance computing means?
In this talk I will argue that parallel computing is not just about bigger and faster machines, but about supporting more people to get the best performance from them. I will discuss the work the Software Sustainability Institute has been doing to understand how researchers use computing resources, the role that software plays in modern research, and why Research Software Engineers are an important part of what comes next. I'll also cover how work funded by the UK's ExCALIBUR programme aims to provide people with the skills and knowledge to exploit exascale computing as we prepare to meet the challenges that the next decade of research will bring.
Funding
The UK Software Sustainability Institute: Phase 3 (Engineering and Physical Sciences Research Council)
Understanding and Nurturing an Integrated Vision for Education in RSE and HPC (UNIVERSE-HPC) (UK Research and Innovation)