Daniel Porter
Jun 23, 2012
Featured

Patexia Weekly 06/15-06/22: Scrutinizing supercomputers

The BlueGene/L supercomputer, precursor to the current record holder, at the Lawrence Livermore National Laboratory in Livermore, California, October 27, 2005.

Supercomputers are an interesting window through which to view our society’s intellectual status. Humans tend to differentiate themselves from other known living organisms through our mastery of technologies, and few machines are a more impressive demonstration of that mastery. They serve as more than a technological benchmark, though: supercomputers are also one of our greatest tools -- particularly for our uniquely human quest for knowledge. When a problem is so complex that we don’t even really know how to approach it, we turn to supercomputers. When their answers are not satisfying, we build better, faster supercomputers.

Every year, the Top500 list highlights which supercomputers are the best -- able to complete the largest number of operations per second -- and this year’s honor goes to the IBM Sequoia at the Lawrence Livermore lab. The numbers themselves are news (Sequoia clocks in at 16 Pflops -- comfortably within three or four orders of magnitude of estimates of the human brain’s computing capacity), but most coverage built stories around where the computers are and who made them.
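The "orders of magnitude" comparison is easy to check for yourself. The short Python sketch below compares Sequoia's 16 Pflops against a range of brain-capacity estimates; the specific estimate figures are illustrative assumptions (published estimates vary enormously), not measured values.

```python
import math

# Sequoia's Top500 performance, from the article: 16 petaflops.
sequoia_ops = 16e15  # operations per second

# Assumed brain-capacity estimates (operations per second), chosen only to
# span the commonly quoted range -- these are illustrative, not measurements.
brain_estimates = {"low": 1e16, "middle": 1e18, "high": 1e20}

for label, est in brain_estimates.items():
    gap = math.log10(est / sequoia_ops)
    print(f"{label:>6} estimate ({est:.0e} ops/s): "
          f"{gap:+.1f} orders of magnitude beyond Sequoia")
```

Under a high-end estimate the gap is indeed in the three-to-four range; under a low-end estimate, a machine like Sequoia has arguably already arrived -- which is exactly why such comparisons should be read loosely.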

Interestingly, only brief mention was given to the most important aspect of this story, and the first question that came to my mind upon seeing this year’s list: what complex problems are we using these tools to solve, anyway?

I had mixed emotions when the answer turned out to be “nuclear weapons simulations,” a potentially disturbing fact that got lost in all the “America is the best again” rhetoric of the past week. This is far better than regularly testing actual nuclear devices, don’t get me wrong. But as a physicist, I understand that simulation is inextricably linked to application, and it’s the application that worries me. A nuclear arms race fought silently with computers is still a nuclear arms race, and it implies a certain conclusion. Contributor G. Taylor explores the intricacies of the politics surrounding supercomputers this week, covering the topic thoroughly enough that I need not say more.

Researchers learn how to wire electric circuits more like our brains.

Thankfully, if you look at the collective human intellectual journey through the lens of how we allocate supercomputing power, you will see more than a singular focus on destructive nuclear technologies. We are curious about many other important problems as well. We strive to understand questions tied to our collective survival, like climate change, and -- most interestingly, in my opinion -- how our brains work. This week we learned that we can build circuits that are wired more like our brains, and also that our brains function not so differently from the vast inorganic information networks we’ve set up around the globe. This, in my opinion, is a much worthier way to spend our fastest and most efficient processor time.

It’s all about connections: consciousness remains a mystery, not because our brains are so fast, but because of the unique and complex way our brains are interconnected.

For millennia, we humans have used our brains to drive technological advancement, but an embarrassingly large amount of this development was motivated by violence and war. I would be hard pressed to say that we would be better off without these technologies, and therefore without the wars that motivated them. Still, I like to think that we need no longer rely on war to motivate technological development -- we already know lots of great ways to kill each other. Instead, I think it’s high time we start focusing on more interesting problems. Before the end of the decade, we may construct a computer that can complete the same number of computations per second as our brains. For better or for worse, are we going to be ready?