25th February 2013 Washington DC, USA
The Biggest Computer I’ve Ever Seen, and Why Sometimes 0 and 1 Just Ain’t Enough
When you work in international scientific collaboration, you sometimes take inspiration from bizarre sources – like the 2008 Gonzales song “Working Together”.
Last week, I had a UK high-performance computing group with me in DC, Pittsburgh and Chicago, and couldn’t get the song out of my head. The US Science and Innovation Network runs a lot of workshops and sponsors many visits for and by the research community to facilitate collaboration. Much of this happens via our consulates. In DC, we do a lot more policy work and high-level visits. So last week was a great experience for me to get outside the Beltway and see some real live collaborations. The members of the group were here not only looking for their own academic collaborations, but also on behalf of the whole UK academic community – looking for new or missed opportunities where the UK could access US computing facilities and looking at innovative partnership models.
Personally, the singular image that will stick with me from the trip is walking into the two-story, department-store-sized building that houses Blue Waters. Even if you know nothing about high-performance computing, this machine is impressive. It’s powered by two substations – normally, that would be enough for a large subdivision or small town, but here it keeps the lights on for a very large series of computers. We toured the building with Bill Kramer, Project Manager for Blue Waters, went under the elevated floor (raised six feet to house the electrical and cooling equipment that keeps the systems powered and stops them from overheating), went up on the roof to see the cooling system, and generally just walked around constantly saying insightful, science-y things like “Whoa” and “This is unbelievable.”
So why were we on this journey across America, and why is HPC necessary? Well, the US has an abundance of HPC facilities. Many of them are used only a fraction of the time and require only a written request for access – then academics can use them to conduct their research. Most scientific disciplines have come to the point where the amount of data they generate requires HPC: space and planetary science in its searches through our own and other solar systems; bioinformatics for genome sequencing and other applications; and many more (Ralph Roskies and Michael Levine from the Pittsburgh Supercomputing Center, which we visited, put it far more eloquently in this 2012 op-ed). The UK is one of the scientific powerhouses of the world, but does not have anywhere near the number of facilities that the US does. So the UK scientific community needs additional resources, and the US has those resources. From a broader perspective, UK-US research collaboration strengthens science: our network speaks frequently about things like the number of Nobel laureates the two countries have produced and the joint research the UK-US relationship generates.
I’ll close with a shameless plug. Our monthly newsletter, available via Storify, contains additional details on the collaborative activities of our network – check it out! With any luck, we’ll have some HPC updates soon!
The difficulties surrounding data storage and movement will still plague you, perhaps even more so when moving data overseas. What about the requirements from UK research councils to keep all research data for 15 years _after it was last accessed_? This was one of several steps taken to avoid another Climategate.
At the moment, very few academics worry about doing this themselves; it is left to university IT departments, who will have no access to or say over data that is worked on across the pond. They may not even know about it.