Standing Wave


Previous: Chapter 1: Models of circulation


Chapter 2: Détente

In which we learn about a Cold War-era scientific collaboration, and an odd result

The CIA FOIA Electronic Reading Room is a veritable treasure trove of documents. It’s amazing to see what those spooks had been squirrelling away. The communication between the University of Hawaii/NOAA Joint Climate Research Effort and the Computer Center of the Siberian Department of the Academy of Sciences of the USSR comprised only about a dozen documents. There was a telegram from Henry Kissinger himself, showing that this programme had been sanctioned at the highest levels. The introductory letters from Miller and his counterpart Alekseev were uninteresting, but the exchange between Lightman and Chinchuluun was full of fascinating technical details. It was clear that Lt. Dr. Lightman had done his homework: he must have read all the relevant papers by the Soviet scientists involved. As these were published only in Russian, the project must have had support from translators. More evidence that Kissinger considered it important.

Ahead of the visit, Lightman and Chinchuluun had quickly identified a potentially fruitful area for collaboration: the integration of a NOAA convection model for simulating cloud feedbacks into the Computer Center’s global circulation model. There was some discussion of initial test runs, but the most enlightening (and entertaining) document was a report by Lightman on his two-month visit. He complained that the work advanced slowly because of the system’s technical disadvantages and the total lack of punched-card debugging tools. (“To verify a punched card, one had to punch a second card as a check. If the two cards matched, one assumed they were correct; if not, one assumed one card was mispunched.”)

But he also praised the staff for being extremely helpful: “A. I. Chinchuluun and V. K. Gusiakov assisted me in learning the use of the BESM-6 and in utilizing the other facilities. A. I. Chinchuluun was instrumental in helping me work through the program changes. Without his dedication, I would definitely not have been able to make the models work.”

It was clear that in that short time, Lightman, Chinchuluun and Gusiakov had become close friends: “The people I worked with were very hospitable. I was invited into the homes of A. I. Chinchuluun and V. K. Gusiakov. We went on some excursions together, notably several days of camping and hillwalking near Luzhba in the mountains of the Abakan Range, and I enjoyed their friendship immensely. They took me to film festivals, art shows, ballets, and several banquets. A Russian picnic on an island in the Ob Sea was an unforgettable experience. A. I. Chinchuluun and V. K. Gusiakov helped me with problems of living in Novosibirsk. Their patience and efforts made my stay so much more enjoyable.”

Somehow the name of Gusiakov was familiar to me. Viacheslav K. Gusiakov, known as “Slava” to his friends, was an eminent expert on tsunami modelling, and in 2012 he had co-authored a damning report in the Bulletin of the Atomic Scientists, “Fukushima: The myth of safety, the reality of geoscience”. I remembered it well, as I had been planning a visit to a colleague at Sendai University in 2011 when the disaster struck, and I had therefore taken a more than casual interest in it.

In 1975, Gusiakov must have been in his late twenties, fresh from his PhD defence. His assigned role in the project was that of the Center’s official host — the report even mentions that he created a short film about Lightman’s visit — but clearly he had been a lot more hands-on than that.

Apparently, Lightman and Chinchuluun had completed the integration of the models in the FORTRAN code, but had not managed to run the simulations by the end of the visit. Follow-on letters from Chinchuluun in preparation for his return visit to Hawaii showed eventual progress, and the final letter even contained some very preliminary results.

Looking at those results and the assumptions behind the models, I felt there was something very odd: with the resolution given in the final letter by Chinchuluun, and the approach described in the famous 1980 article by Marchuk and Dymnikov, “A mathematical model of the general circulation of the atmosphere and ocean”, it would seem that on the BESM-6 the simulation should have taken impractically long, much longer than the two months between the visits. Both scientists appeared to be meticulous and had been embedded in professional teams, so I had to assume the results were correct. There had to be some deep magic going on here. Either the model, and in particular the convection kernel, was different from the published approach, or they had used some totally radical compiler optimisations. Both were unlikely: a novel model with that performance should surely have been published; and the Soviet BESM-6 compiler was an instruction-by-instruction translation of the binary of the compiler for the CDC 1604 computer, and highly sub-optimal as a result. But the facts were there: by my estimate, the model allowed an eight times higher resolution in every dimension, and even then was twice as fast as it should have been. Eight times in each of three dimensions is more than five hundred times the number of grid points; running twice as fast on top of that makes the run roughly a thousand times faster than could be explained by the existing assumptions.
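
For anyone who wants the arithmetic spelled out, here is a minimal sketch of that back-of-envelope estimate in Python. The numbers are just the round figures from the paragraph above, not values taken from Chinchuluun’s letters, and the cost model simply assumes the work grows with the number of grid points while ignoring any time-step refinement (which would only widen the gap).

```python
# Back-of-envelope estimate of the unexplained speed-up.
# All numbers are the round figures from the text, not values from the letters.

RESOLUTION_FACTOR = 8      # finer grid in every spatial dimension
SPATIAL_DIMS = 3           # longitude, latitude, vertical levels
SPEEDUP_VS_BASELINE = 2    # the run was twice as fast as the published baseline

# Cost model: work grows with the number of grid points.
# (Any accompanying reduction of the time step is ignored, so this is conservative.)
extra_work = RESOLUTION_FACTOR ** SPATIAL_DIMS          # 8 ** 3 = 512

unexplained_factor = extra_work * SPEEDUP_VS_BASELINE   # 512 * 2 = 1024
print(f"Unexplained speed-up: roughly {unexplained_factor}x")
```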

As I had seen incredible speed-ups before, in my own work and that of others, I didn’t get too excited: such results were usually not real, but came from unoptimised baselines or unjustified assumptions, or were just plain wrong. So I carefully double-checked everything I knew. I studied the BESM-6 architecture and instruction set and looked for ways a programmer might have handwritten the assembly code to improve the performance that much. There was definitely scope, easily for a ten-fold speed-up, but nowhere near a thousand. It had to be the model itself, some radically novel algorithm. But if so, how come it had never been published, not even in the internal reports of the Computer Center (which of course the CIA had managed to acquire)?
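
Continuing the same sketch with the same round numbers: even granting hand-written assembly the most generous plausible gain, most of the factor survives, which is where the “more than a hundred times” in the exchange below comes from.

```python
# Continuation of the sketch above: how much of the gap could hand-tuning explain?
UNEXPLAINED_FACTOR = 1024   # from the resolution/runtime estimate above
ASSEMBLY_HEADROOM = 10      # generous allowance for hand-written BESM-6 assembly

remaining_gap = UNEXPLAINED_FACTOR / ASSEMBLY_HEADROOM
print(f"Still unexplained after hand-tuning: roughly {remaining_gap:.0f}x")  # ~100x
```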

@lores:
@kagetsuko There’s something strange about those results in Chinchuluun’s last letter. It’s much too fast compared to the state of the art at the time. I am assuming they were using a model similar to what Marchuk et al. describe in their paper (link). Do you know of any algorithms or schemes that would make a dramatic difference and that were known at the time?

@kagetsuko:
@lores No, I had assumed they used an early version of the published model by Marchuk et al. I’m not familiar at all with the BESM-6 computer, all I know is that this is the machine they used for the simulations and that they could program it in FORTRAN. So you say their results were too fast? What do you mean?

@lores:
@kagetsuko If you look at the memory requirements for the model, and compare against how fast the BESM-6 could access that memory and compute on it, there is a huge performance gap. Even if they had hand-optimised the machine code, it would still be more than a hundred times faster than could be expected from those published models. The only plausible explanation is some radically new scheme or algorithm in the cloud simulation part of the model. So if you can find that code, there could be something genuinely novel in it.

@kagetsuko:
@lores That’s really exciting! Let’s hope we find it. お互いに頑張りましょう!

kagetsuko and I occasionally had exchanges in a mix of English and Japanese. Otagai ni gambarimashō is one of those quintessentially Japanese phrases laden with cultural subtext, but here it simply means “let’s work hard on this together, and persevere until the job is done”. Japanese is pithy.


Next: Chapter 3: The actors, Part I

