Psst… got an a.out problem?

My system is full of a.out files – leftovers from compiling my own C and Fortran programs, or from evaluating other people's. Going through each folder and deleting them by hand, whether in a file manager or on the command line, makes me feel nauseated. Since I know they are restricted to certain parts of the disk, how do I go about deleting them all? One way would be to write a shell script, but the easiest way is to use find.

First check the files:

find . -name "a.out" -type f

Then the files can be deleted in the following manner:

find . -name "a.out" -type f -delete

This will recursively delete all a.out files from the current directory down.


Debunking TV technology

There are lots of crime shows on television, and some of them – CSI, Crossing Lines, Law and Order, and Castle – use algorithms to find clues in images, usually those acquired from CCTV cameras. According to a survey by Leger* for Axis Communications, 68% of Canadians said they watch these crime dramas, and 71% of Canadians think recorded surveillance footage can be enhanced in a lab using software. However, here's the debunk: the majority of surveillance cameras sold worldwide today remain analog, which is why security video shown on the evening news is often grainy and of poor quality, making identification difficult.

TV has a magical way of taking a poor-quality image and improving it in many ways. Is the image too small? Then it might be possible to “zoom in”, or enlarge it. Image too blurry? It might be possible to improve the acuity by sharpening it. Is something happening out of view of the camera, while a person in view of the camera is looking at the scene? Then perhaps we could extract an image from a reflection in their cornea. Are any of these at all possible?

The short answer is no. But let us delve into why.

The basic premise is that these images have too little or missing data, and that it is possible to use “intelligent” algorithms to fill in the missing pieces. The problem is that you can’t recreate what wasn’t there in the first place. Here’s a simple example. When something in an image is too small, the natural tendency is to zoom in. When they do this on television they are effectively “enlarging” that information. Consider enlarging a 2×2 image into a 4×4 image:


But the reality is that 12 of the 16 pixels in the new image contain no information – the algorithm has to somehow predict what values these “empty” spaces should have. Simple algorithms do this by averaging the pixels around them. Below are examples of two simple algorithms applied to the task of enlarging a 4×4 image into a 200×200 image: a bilinear algorithm averages the four surrounding pixels, a bicubic algorithm sixteen. The result is less than optimal. The image is enlarged, and offers a kind of approximation, but there is no way to make it crisper – or indeed to add information that is not there.
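The effect is easy to demonstrate with a minimal sketch in plain Python (no image library assumed; the function names are mine). Nearest-neighbour enlargement just replicates pixels, and bilinear enlargement only ever averages the four neighbouring source values – neither can conjure up detail that was never captured:

```python
def upscale_nearest(img, factor):
    """Enlarge by replicating each source pixel factor x factor times."""
    return [[img[i // factor][j // factor]
             for j in range(len(img[0]) * factor)]
            for i in range(len(img) * factor)]

def upscale_bilinear(img, out_h, out_w):
    """Enlarge by interpolating the four surrounding source pixels.
    Assumes out_h, out_w > 1."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for i in range(out_h):
        y = i * (in_h - 1) / (out_h - 1)      # map output row to source row
        y0 = int(y); y1 = min(y0 + 1, in_h - 1); fy = y - y0
        row = []
        for j in range(out_w):
            x = j * (in_w - 1) / (out_w - 1)  # map output column to source column
            x0 = int(x); x1 = min(x0 + 1, in_w - 1); fx = x - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

small = [[10, 200],
         [50, 120]]
print(upscale_nearest(small, 2))      # 4x4: blocks of repeated values
print(upscale_bilinear(small, 4, 4))  # 4x4: smoothed averages, no new detail
```

Every output pixel is a weighted mix of the original four values – blockier or smoother, but never sharper.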



Programming is… not magic

Programming is a process by which an idea, expressed in the form of an algorithm, is translated from paper to a language which can be interpreted by a machine. The language used is somewhat immaterial – at the end the “code” is transmogrified into machine language. The machine doesn’t give a hoot about the language – it just follows whatever instructions are given – good or bad. But programming is more than just writing code – programming is a process that involves mental engagement. Programs are only as good as the people who program them. And so there is no “magic” to programming. Things don’t “just happen”. If you don’t understand how something works in the program you are writing, then you won’t end up with a robust program.



How jovial is Jovial?

JOVIAL [1] is a high-order computer programming language similar to ALGOL, but specialized for the development of embedded systems. JOVIAL is an acronym for “Jules’ Own Version of the International Algebraic Language”. It was developed by Jules Schwartz in 1959 to write software for the electronics of military aircraft. But is JOVIAL really cheerful and friendly?

JOVIAL is based on Algol-58, but adds features such as assembly-level inserts and records and arrays of records, and it has been used extensively in USAF applications. It is reasonably well structured, has rudimentary typing, is readable and simple, and produces fast code. JOVIAL was used extensively in the 1970s and 1980s to develop software for a broad range of military and aerospace systems, among them the B-52, B-1, and B-2 bombers; the C-130, C-141, and C-17 transport aircraft; the F-15, F-16, F-18, and F-117 fighter aircraft; the E-3 Sentry AWACS aircraft; the Navy’s Aegis cruisers; the Army’s Multiple Launch Rocket System (MLRS); and the Army’s UH-60 Black Hawk helicopters.

For many years it ran the air traffic control systems of many countries, but these have slowly been replaced with more agile systems – except, that is, in the United Kingdom. In December 2014, a disruption to air traffic control in the vicinity of London and southeast England was caused by problems with the National Airspace System, which is written in, you guessed it – JOVIAL. They are using software reminiscent of the 1960s.

Here’s an example of a JOVIAL function to calculate a factorial:

    PROC FACTORIAL(ARG) U;
    BEGIN
        ITEM TEMP U;
        TEMP = 1;
        FOR I:2 BY 1 WHILE I<=ARG;
            TEMP = TEMP*I;
        FACTORIAL = TEMP;
    END

Here the function has one parameter, ARG of type U (unsigned), and returns a value of type U. The return value is set when TEMP is assigned to FACTORIAL. (Note that earlier definitions of the language were messier.)
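For comparison, the same loop translates almost line for line into Python – a sketch of the semantics, not part of the original JOVIAL listing:

```python
def factorial(arg):
    # mirrors the JOVIAL version: TEMP starts at 1,
    # then I runs from 2 up to ARG inclusive
    temp = 1
    for i in range(2, arg + 1):
        temp = temp * i
    return temp

print(factorial(5))  # → 120
```

The only real difference is that Python returns a value explicitly, where JOVIAL assigns to the function name.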

NB: The language was originally supposed to be called OVIAL – Our Version of the International Algebraic Language. However, it was the late 1950s, and OVIAL suggested a relation to the birth process, which wasn’t deemed that acceptable!

[1] Cheatham, T.E., “A brief description of JOVIAL”, ACM SIGPLAN Notices, Vol. 14(4), 1979.

Unmastered complexity

“… most of our systems are much more complicated than can be considered healthy, and are too messy and chaotic to be used in comfort and confidence. The average customer of the computing industry has been served so poorly that he expects his system to crash all the time, and we witness a massive world-wide distribution of bug-ridden software for which we should be deeply ashamed.”

Edsger W. Dijkstra (November, 2000)

Languages for image processing

Over the years I have written image processing (and computer vision) algorithms in a number of languages – from C, to Matlab, and now Python. C is obviously the fastest of the pack, but in the early years, before the advent of OpenCV in 2000, it suffered from a lack of consistent and robust libraries. Sure, you could write them yourself, but when you’re trying to solve a problem, the last thing you really want to do is write standard libraries. C also suffers from a syntax that is not amenable to whole-array operations, unlike languages such as Python, and even Fortran. For example, there is no simple way of modifying the saturation component of the HSI (Hue-Saturation-Intensity) colour space. In Python (with NumPy) the code to modify saturation by taking a power of 0.65 would look like this (where imgS is the saturation component):

imgS = imgS**0.65

In C the same code would look like this:

for (i = 0; i < nRows; i++)
    for (j = 0; j < nColumns; j++)
        imgS[i][j] = pow(imgS[i][j], 0.65);

Not grievously more code, but in a large, complex algorithm this becomes quite tedious. Python also provides a large number of built-in functions for dealing with arrays, especially with NumPy. OpenCV solves some of these issues, and provides data structures and I/O for dealing with images. Standard C makes a mess of images, because large ones have to be stored in dynamically allocated arrays – or more likely dynamic structures, so that size information can be coupled with the array itself. But whilst Python provides easy-to-write code, it suffers from a lack of efficiency (I’ll talk about that some more in a future post). Sure, Python can be efficient if you are good at vectorization (but that can be a steep learning curve). C trumps most languages when it comes to raw processing speed. There is a solution, of course: use Python for ease of use, and couple it with C functions for the grunt work.

I also used Matlab for many years, and IDL before that. When you didn’t have time to re-invent the wheel, before OpenCV, these options were great. But Matlab is expensive, and it doesn’t make stand-alone programs easy to develop. I have tinkered a bit with Julia too, but it may be too early for serious image processing. In the end, a language that makes manipulation of arrays easy is ideal – from processing an image without the need for a nested loop, to extracting a sub-image with a single line of code.
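As a sketch of what I mean by one-line array manipulation, here is sub-image extraction in plain Python lists (NumPy’s slicing, img[2:5, 3:7], is terser still; the stand-in image here is invented for illustration):

```python
# a stand-in 10x10 "image": pixel value = 10*row + column
img = [[10 * r + c for c in range(10)] for r in range(10)]

# extract a 3x4 sub-image (rows 2-4, columns 3-6) in a single line
sub = [row[3:7] for row in img[2:5]]
print(sub)
```

One line, no nested loop – exactly the kind of expressiveness that makes a language pleasant for image work.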

Why technology isn’t always better

We live in a world where technology is ubiquitous. It’s hard to escape from it. E.F. Schumacher, British economist and author of the 1973 book “Small is Beautiful“, stated that “The system of nature, of which man is a part, tends to be self-balancing, self-adjusting, self-cleansing. Not so with technology.” For a while technology ensconced itself in our lives only through mobile devices, but increasingly it has started to assimilate our homes. A good example is lighting. In its simplest form, if you turn a switch on, a light goes on. Turn it off, the light goes off. Then we added the dimmer, which allows the brightness of the light to be modified (I’m convinced that very few people actually use dimmers, unless the bulb is über bright). Along the way we transitioned to more energy-efficient light bulbs – from incandescent, to CFL, then LED. I ♥ LED. But now we also have lights that can be controlled by an app. Have we gone too far?

For a while I thought that some of this technology would be useful, but these newfangled wireless devices often suffer from having too many [quirky] features, and not enough usability. If I need to read instructions on how to operate a light switch, it is too difficult. There is also the problem of privacy. If devices are wirelessly attached to your network, and perform “updates” automatically (see my post on the Nest Protect), then one has to wonder what else they could possibly be doing. Could an intelligent washing machine be transmitting information on my washing habits back to the manufacturer? Sure, this data might be useful in creating the next generation of washing machines, but wouldn’t it be an invasion of my privacy? Look, washing machines wash clothes, and the mere fact that most people only use 2–3 cycles means that they don’t have to be smart, or feature-ridden. The same goes for fridges. Adding chips and code to a device that does something simple just complicates things, and means that if something does go wrong it might be harder (and more costly) to fix.

I thought about a smart thermostat for a long while, but I just haven’t been able to break down and buy one. I know one thing – no Nest. Too many issues with software (remember the hand-wavy thing – it still doesn’t work). There seems to be a lack of confidence in their software. I still have one Nest Protect (the hard-wired one), which *seems* to work – if nothing else it makes a great motion-activated night-light. Yeah, sure, I like that the Nest thermostat looks cool. But in a small home an intelligent thermostat might not make much sense. There are probably better ways to be energy efficient too, like getting a furnace that is the right size – many home furnaces are oversized. I thought about the Honeywell Lyric as well – it doesn’t use a learning algorithm, but rather geo-fencing (i.e. proximity of cell phones). However, it also doesn’t have a web interface, doesn’t allow changes to be made remotely, and doesn’t allow a program to be scheduled manually. Nuts.

What does this mean? The evolution of over-technological devices for the home, designed by companies who will try to convince you that you need them. That it’s better to control your whole home from a smart mobile device. That it all needs to be done remotely.

Remember that technology (just like magic) comes at a price.