What is common to the forecasting of weather, the making of a Pixar movie, protein structure prediction and the design of mobile phone networks? Each one of them requires a massive amount of computational resources.
One of the things that fascinates me about this is how much longer it takes to simulate a physical event than the event itself takes. For instance, when your hair blows in the wind, the event lasts a few seconds. But when Pixar wants to simulate that for Elastigirl, it requires a complex model of the physics involved and a computation that probably runs for several hours to make it look realistic. They need a model of hair, a model of wind and a model of the dynamics of the interaction between the two. Here is a nice slide deck about some of the physics involved.
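To make the "model of hair, model of wind, model of dynamics" point concrete, here is a deliberately crude sketch (my own toy illustration, nothing like what Pixar actually runs): a single strand modelled as a chain of point masses joined by springs, with a constant force standing in for wind, integrated with explicit Euler steps. Every constant here is made up for illustration.

```python
import math

# Toy hair strand: N point masses on springs, pinned at the root,
# pushed by gravity plus a constant "wind" force.
# All parameters are invented for this sketch.
N = 10            # point masses along the strand
REST = 0.1        # rest length between neighbours (metres)
K = 20.0          # spring stiffness
MASS = 0.01       # mass of each point (kg)
DT = 0.001        # time step (seconds)
DAMP = 0.995      # velocity damping per step, to keep the toy stable
GRAVITY = (0.0, -9.8)
WIND = (0.02, 0.0)  # constant horizontal wind force (newtons)

# The strand starts hanging straight down from a fixed root at (0, 0).
pos = [(0.0, -i * REST) for i in range(N)]
vel = [(0.0, 0.0) for _ in range(N)]

def step(pos, vel):
    # External forces: gravity plus wind on every mass.
    forces = [[MASS * GRAVITY[0] + WIND[0], MASS * GRAVITY[1] + WIND[1]]
              for _ in range(N)]
    # Spring forces between neighbours (Hooke's law along the strand).
    for i in range(N - 1):
        dx = pos[i + 1][0] - pos[i][0]
        dy = pos[i + 1][1] - pos[i][1]
        dist = math.hypot(dx, dy) or 1e-9
        f = K * (dist - REST)
        fx, fy = f * dx / dist, f * dy / dist
        forces[i][0] += fx; forces[i][1] += fy
        forces[i + 1][0] -= fx; forces[i + 1][1] -= fy
    # Explicit Euler update; the root mass stays pinned.
    new_pos, new_vel = [pos[0]], [(0.0, 0.0)]
    for i in range(1, N):
        vx = (vel[i][0] + DT * forces[i][0] / MASS) * DAMP
        vy = (vel[i][1] + DT * forces[i][1] / MASS) * DAMP
        new_vel.append((vx, vy))
        new_pos.append((pos[i][0] + DT * vx, pos[i][1] + DT * vy))
    return new_pos, new_vel

# One second of "hair in the wind" costs a thousand tiny steps,
# each touching every mass and every spring.
for _ in range(1000):
    pos, vel = step(pos, vel)

print("tip of the strand after 1 s:", pos[-1])
```

Even this caricature needs a thousand steps over ten masses to cover one second of blowing hair; a production simulator tracks tens of thousands of strands, with collisions and far better integrators, which is where the hours go.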
Similarly, you may make a phone call on your mobile phone in a matter of seconds, but consider the physical manipulations involved. At the topmost level there is a sequence of events such as the following — the “network”, that almighty entity which already knows of your existence (the “bars” on your phone show its signal strength), suddenly responds to your need to talk to someone, finds out approximately where the other person is, and sends the message out; the other person (hopefully) responds, a connection is established, and it lasts for as long as you desire (well, not if it’s AT&T, but still). And this happens millions of times every hour. At the other extreme, you can study the most detailed level: the currents and voltages, the transistors and gates, the electrons and photons sitting inside your phone, inside the various pieces of equipment that make up the magical “network”, and inside the other person’s phone. And there are many, many intermediate levels that can be studied too. Can’t you just see all the electrons and photons scurrying around at the press of one button from your all-powerful finger?
But my point is not that all the above in fact occurs when you make a call. It does, and in a matter of seconds. My point is that in order to design and build the system you have to model all this physical reality and simulate it, and this is a vast computational exercise requiring many orders of magnitude more time than the actual call. And this is true even if you simulate only a very limited part of this reality. For instance, you can simulate only one call, not a million, and you can simulate the chip completely separately, in which case you are definitely not simulating down at the level of electrons but only abstracting out functional and logical blocks in the chip. But it still takes several hours, even days, to fully simulate even that highly restricted slice of reality, while the call itself lasts only a few seconds.
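The idea of abstracting out functional and logical blocks can be made concrete with a toy logic-level simulator (again my own illustration, nothing like a real chip simulator): a 4-bit adder built from full adders, which are in turn built from individual gates. Counting gate evaluations shows how one "instantaneous" hardware addition fans out into many simulated operations, even at this very coarse level of abstraction.

```python
# Toy logic-level simulation: model gates as functions and count how
# many gate evaluations one 4-bit addition costs. Real simulators work
# at this and many other levels, just vastly bigger and more carefully.
gate_evals = 0

def AND(a, b):
    global gate_evals; gate_evals += 1
    return a & b

def OR(a, b):
    global gate_evals; gate_evals += 1
    return a | b

def XOR(a, b):
    global gate_evals; gate_evals += 1
    return a ^ b

def full_adder(a, b, cin):
    """One full adder: five gates producing a sum bit and a carry bit."""
    s1 = XOR(a, b)
    total = XOR(s1, cin)
    carry = OR(AND(a, b), AND(s1, cin))
    return total, carry

def ripple_add(xs, ys):
    """Add two little-endian bit vectors with a ripple-carry chain."""
    out, carry = [], 0
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 6 + 3, as 4-bit little-endian vectors: 6 -> [0,1,1,0], 3 -> [1,1,0,0]
result = ripple_add([0, 1, 1, 0], [1, 1, 0, 0])
value = sum(bit << i for i, bit in enumerate(result))
print("sum =", value, "| gate evaluations =", gate_evals)
```

A single addition, which real silicon does in a fraction of a nanosecond, already costs twenty simulated gate evaluations here; scale that to billions of transistors and millions of clock cycles and the hours-to-days simulation times stop being surprising.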
And this general principle is true for studying protein folding, building particle accelerators, performing astronomy calculations, sending up satellites and several other topics besides. What you are doing is mapping some physical reality — whether it be hair blowing in the wind or the changing of the weather — into some approximate model, and then letting the model run on a computer, or, more likely, several hundred computers. In turn, what this means is that you are forcing the electrons of all these machines to create some approximation of the real physical electrons (and photons and atoms and molecules and forces and all the other stuff that constitutes “reality”), in order to study (and design (and build)) the system or process you are interested in. The inefficiency of this simulation process is what takes my breath away — something that “just happens” in the physical world can take a lot of computation to simulate.
Recently, for the first time, I actually got to see a massive high performance computing centre, with about 10,000 cores. It was basically a roomful of racks, stacked with machines, noisy with all the cooling that was required. If you put your hand in the aisles between racks, you immediately appreciated the energy that was required to run the machines, and the heat that was generated in the process and which needed to be dissipated. One hilarious and/or ironic fact is that the computations required to study global warming require so much energy that one might actually need a small dedicated power plant right next to the computing centre. I wonder if the effect of the power plant on global warming then gets accounted for in the computation. (Yeah, and then the starving serpent eats its tail.)
When I was first introduced to computers, we learnt about Charles Babbage and his difference engine and analytical engine. We learnt about the ENIAC, which basically took up a room. We’ve come a long way since then, but in some sense, the need and desire for computation has kept pace with technological developments and so we have remained at a point where lengthy computations on roomfuls of computational units are still the norm. That’s humans for you.
When I google “simulating hair” it gives about 1 million results and asks me if I mean “stimulating hair.” The number of results for the latter query is 7.8 million. (Both queries are without quotes.) More people are interested in stimulating hair growth than in simulating hair dynamics. No surprise, but oh you poor humans!