Nuclear winter – coding for fun?

If you peruse old copies of BYTE Magazine (archive.org), sometimes you can find an oddity. I was looking through some today, and in the December 1986 issue (p.143) there is a very interesting article – “Local Effects of Nuclear Weapons”, by John R. Fanchi. The article talked about how to calculate the effects of a nuclear blast, and provided some programs written in BASIC. Weird, you say? Well, consider the context: the Cold War was still going on, and people were starting to have home computers and program them in BASIC, so it’s not that weird really. Nobody thought that the USSR would essentially collapse in five years’ time, and the early 1980s were full of apocalyptic movies: WarGames (1983), The Day After (1983). But it was also the decade that heralded the birth of video games, punk fashion, and dance music, so go figure.

The article provides two programs: one to calculate detonation effects, such as thermal flux, overpressure, EMP range, and radiation dosage, and another to calculate nuclear fallout. These articles were always well written and researched, and included the relevant math. It’s interesting because we don’t see this sort of coding article anymore.

Anyway, for those interested, I have translated the code to Fortran. Many of these programs were fairly basic: lots of input variables, a bunch of calculations, a loop or two, and some decisions.

! Falout - A nuclear fallout calc program
! Written by: John R. Fanchi, July 1985 in BASIC
program falout
    implicit none
    real :: y, d
    real :: vpar, dpar, vtrans, dtrans, rems
    real :: alpha, beta, gamma, tfac, facmax, tymmax
    real :: tym, facper, facpar, disper, dispar
    real :: factor, coef, effrem, norm, tymhrs
    integer :: i

    write(*,*) 'Falout - Estimating the Distribution of Nuclear Fallout'
    write(*,*) 'Estimating radiation dosage'
    write(*,*) 'Enter the yield of the nuclear blast in megatons: '
    read(*,*) y
    write(*,*) 'Enter your distance from the blast in miles: '
    read(*,*) d

    ! dosage estimate: scales with yield, falls off with distance squared
    rems = 250.0*(1000.0*y)/(16.0*3.1416*d*d)
    write(*,*) 'Estimated radiation dosage in rems: ', rems
    write(*,*) 'Estimating fallout distribution'
    write(*,*) 'The distribution of fallout is treated as a'
    write(*,*) 'random walk process similar to Brownian motion.'
    write(*,*) 'Enter the line-of-sight wind speed (mph): '
    read(*,*) vpar
    vpar = vpar * 24.0   ! convert mph to miles/day
    write(*,*) 'Enter the line-of-sight dispersion (sq miles/day): '
    read(*,*) dpar
    write(*,*) 'Enter the transverse wind speed (mph): '
    read(*,*) vtrans
    vtrans = vtrans * 24.0   ! convert mph to miles/day
    write(*,*) 'Enter the transverse dispersion (sq miles/day): '
    read(*,*) dtrans
    write(*,*) 'Radiation dosage at your location as a function of time'
    write(*,*) '   Days    Rems    Norm    Hrs      '
    alpha = d*d/(4.0*dpar)
    beta = -2.0*d*vpar/(4.0*dpar)
    gamma = vpar*vpar/(4.0*dpar)+vtrans*vtrans/(4.0*dtrans)
    tfac = sqrt(1.0+4.0*alpha*gamma)
    tymmax = (1.0+tfac)/(2.0*gamma)   ! time of peak dosage (days)
    facmax = ((d-vpar*tymmax)**2)/(4.0*dpar*tymmax)
    facmax = facmax+vtrans*vtrans*tymmax/(4.0*dtrans)
    do i = 1,20   ! tabulate dosage at 20 times around the peak
       tym = tymmax*(1.0-0.02*(10.0-i))
       facpar = 0.0
       facper = 0.0
       dispar = d-vpar*tym
       disper = vtrans*tym
       facpar = dispar * dispar/(4.0*dpar*tym)
       if (dtrans /= 0.0) then
          facper = disper*disper/(4.0*dtrans*tym)
       end if
       factor = facpar+facper
       coef = tymmax/tym
       effrem = rems*coef*exp(facmax-factor)
       norm = effrem/rems
       tymhrs = tym*24.0
       write(*,10) tym, effrem, norm, tymhrs
    end do
    10 format(4f8.2)
end program falout
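Assuming the file is saved as falout.f90, it should build with any modern Fortran compiler, e.g.:

gfortran falout.f90 -o falout

The program prompts for the yield, distance, and wind parameters, then prints a 20-row table of dosage versus time.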

Image processing was once simple

The concept of image processing was once quite simple. Images taken in the 1960s were often photographs taken by spacecraft, or remote sensing/aerial photographs, that contained artifacts from the acquisition process. These artifacts needed to be removed to create a more aesthetically pleasing image for interpretation, devoid of things that could distract the viewer. An example is shown below: an image of the Moon’s surface taken by Lunar Orbiter 1. The original contains a series of (periodic) horizontal line artifacts, which can be magically removed using a Fast Fourier transform and a little effort.

Image of Taruntius Crater on the Moon, taken by Lunar Orbiter 1 (frame M-31), August 20, 1966. The original image (left) versus the de-striped image (right).
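Conceptually, the destriping works like this: transform each column of the image into the frequency domain, zero out (notch) the bins at the stripe frequency, and transform back. Below is a minimal sketch of the idea in Fortran, applied to a synthetic striped image. It uses a naive O(n²) DFT so it is self-contained (a real program would call an FFT library), and the stripe frequency kstripe is assumed to be known in advance – in practice you would find it by looking for a spike in the spectrum.

! Destripe - remove a periodic horizontal-stripe artifact by
! notch-filtering the DFT of each image column (illustrative sketch)
program destripe
    implicit none
    integer, parameter :: ny = 64, nx = 64, kstripe = 8
    real, parameter :: pi = 3.1415926536
    real :: img(ny, nx)
    complex :: spec(ny)
    integer :: x, y, k

    ! build a synthetic test image: a horizontal gradient plus
    ! periodic horizontal stripes at a known frequency
    do x = 1, nx
        do y = 1, ny
            img(y,x) = real(x)/real(nx) + &
                0.2*cos(2.0*pi*real(kstripe*(y-1))/real(ny))
        end do
    end do

    do x = 1, nx
        ! forward DFT of one column (naive O(n^2) version)
        do k = 1, ny
            spec(k) = (0.0, 0.0)
            do y = 1, ny
                spec(k) = spec(k) + img(y,x) * &
                    exp(cmplx(0.0, -2.0*pi*real((k-1)*(y-1))/real(ny)))
            end do
        end do
        ! notch: zero the stripe frequency and its mirror bin
        spec(kstripe+1) = (0.0, 0.0)
        spec(ny-kstripe+1) = (0.0, 0.0)
        ! inverse DFT back to the spatial domain
        do y = 1, ny
            img(y,x) = 0.0
            do k = 1, ny
                img(y,x) = img(y,x) + real(spec(k) * &
                    exp(cmplx(0.0, 2.0*pi*real((k-1)*(y-1))/real(ny)))) / real(ny)
            end do
        end do
    end do

    ! the stripes are gone: every column is now (nearly) constant
    write(*,*) 'column 1 range after destriping: ', &
        minval(img(:,1)), maxval(img(:,1))
end program destripe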

But periodic noise and artifacts are often easy to remove. Let’s not forget that many of the early image processing techniques evolved due to limitations in optics. Images needed to be processed to enhance detail, reduce noise, or improve acuity. These early algorithms are often the same ones we use now, things like unsharp masking to improve acuity (sketched below). Why have we not moved on to more modern algorithms? The reason is simple – newer algorithms often don’t work any better. They are often more complex, making use of some fancy new method of manipulating data, but they rarely produce results which one could call awe-inspiring. Hundreds of new image processing and analysis algorithms are created each year, but many are just incremental changes to some existing algorithm, and are often poorly tested.
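Unsharp masking is a case in point: blur a copy of the image, subtract it from the original, and add the difference back, scaled by some amount, which exaggerates the transitions at edges. A minimal one-dimensional sketch (the amount value here is arbitrary, chosen just to make the overshoot visible):

! Unsharp masking in one dimension (illustrative sketch)
program unsharp
    implicit none
    integer, parameter :: n = 7
    real, parameter :: amount = 1.5   ! sharpening strength (illustrative value)
    real :: img(n), blur(n), sharp(n)
    integer :: i

    ! a one-dimensional "edge": dark on the left, bright on the right
    img = (/ 0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0 /)

    ! blur with a 3-point moving average (end pixels copied unchanged)
    blur(1) = img(1)
    blur(n) = img(n)
    do i = 2, n-1
        blur(i) = (img(i-1) + img(i) + img(i+1)) / 3.0
    end do

    ! unsharp mask: add back the difference between the image and its
    ! blurred copy, which overshoots (and so exaggerates) the edge
    sharp = img + amount*(img - blur)

    write(*,'(a,7f7.2)') 'original:  ', img
    write(*,'(a,7f7.2)') 'sharpened: ', sharp
end program unsharp

The same idea extends to two dimensions by blurring with a small neighbourhood around each pixel instead of a three-point window.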

Part of the allure of historical algorithms is their simplicity. They are easy to program, and often quite efficient to run. It is hard for newer algorithms to justify their existence because they cannot sufficiently differentiate themselves.

The advent of flexible learning

When was the last time there was a sea change in post-secondary education? A long time ago, I imagine. Most institutions have done things in the same way for decades: lectures, labs, seminars. Nothing truly pivotal about how education is meted out. Until the pandemic, that is. All of a sudden the method of delivery had to pivot to either a hybrid model or a completely remote one, with asynchronous or synchronous (live) lectures. People who had likely never updated their material had to find a new way to deliver it – not a monumental change, but something at least.

Funerary relief found in Neumagen near Trier, showing a teacher with three discipuli, around 180-185 AD. Rheinisches Landesmuseum Trier, Germany. Photo: Wikipedia / Carole Raddato

Online distance education was supposed to be a big thing. Much like e-books, it hasn’t really worked out that way. Remember MOOCs? They just never took off. Students want in-person classes, which is odd, both because they belong to a generation that spends a lot of time online, and because a good proportion of students don’t turn up to classes anymore. Some are busy, some aren’t bothered, some take classes where the instructor is awful (and I get that, we all had mediocre instructors in one class or another).

The next step in higher education, of course, is the notion of flexible learning, or possibly hybrid learning, where we move away from in-person classes to a combination of in-person interaction by way of seminars, Q&A, or discussion groups, and online material (delivered asynchronously). Is there any need for old-fashioned lectures anymore? Done in this manner, courses could be taken by students anywhere, anytime. But such a change requires a shift in institutional ideology, something I think will be challenging. Flexibility in learning also means making courses far more experiential. For example, how does one rethink first-year chemistry labs done at home? Easy: reconfigure the experiments to make use of chemicals found in the home. This may be quite beneficial from a learning perspective, because it actually relates to everyday use. Sure, students don’t get the chance to do truly stupid things (like putting sodium in water, but that was first-year chemistry 30 years ago; maybe things are more constrained now), but that’s okay.

If chemistry can do it, so can everyone else. Sure, some people learn better face-to-face, but the reality is, not everything throughout life will be learnt this way… in fact, after university, most learning will be experiential. One could learn more about ecology by going out in the field. Imagine the discussions that could be had by people exploring the impact of climate change on their local communities – from their local communities. Imagine a class where people from all around the world are interacting. Look, I’m not advocating for a tidal change to online learning, but there has to be a happy medium where we are actually moving forward from a pedagogical viewpoint, rather than maintaining the status quo.

Things have to change, the world has to change.

Programming languages are like ancient script

“Some things in life are too complicated to explain in any language.”

Haruki Murakami

In human terms, languages are used to communicate, either through spoken or written words. They are the tools which help meld society together. Programming languages are also used to communicate, between a human and a machine. In many respects programming languages are more like ancient script than modern text. Modern writing is very much free-flow: the number of words, and how they are cobbled together, is almost limitless. Ancient written languages, like Sumerian, were written using cuneiform symbols, and as such were limited in context, i.e. what was written down was not an exact depiction of what was actually said.

Like modern programming languages, ancient script also evolved to become more complex over time. Consider the pictorial representations of the Sumerian word for head (sag). Early depictions were easy to understand. As time went on, the cuneiform became progressively more complex (and harder to interpret).

Sumerian “words” for head (sag) from pictograms to cuneiform

Programming languages are no different. They started out quite simple, but have become more complex over time as more and more features are added to them. Ancient languages have another thing in common with programming languages – explaining things. Take, for example, the cuneiform for bread (ninda) shown below, circa 1000 BC.

Ninda, circa 1000 BC

Reading the symbol for bread may tell us nothing about the type of bread – the cultures of that period in Mesopotamia made hundreds of different types of bread, from many different grains. So using cuneiform it may have been hard to describe something like a recipe in any great detail. Programming languages are similarly constrained by their “alphabet”, the series of structures used to form a program. It may therefore be very difficult to explain certain things, like AI, in a programming language, in part because they may simply be too complicated to express.