Are programming languages really languages?

The craft of creating software involves programming languages, the means by which humans communicate with machines, telling the machine what to do. But can programming languages really be considered in the same vein as natural languages? The two do have two things in common: semantics and syntax. Semantics convey the intention of a language, expressed within a set of rules – the syntax. But that’s really where the similarities end.

Human languages help tell a story. The story can be a piece of nonfiction, a folktale, or some form of tragedy. Programming languages do not convey a story, but rather a set of instructions, more akin to a recipe. Imagine if “The Three Little Pigs” were told from an algorithmic viewpoint:

  1. The three little pigs leave home.
  2. The first pig builds a house of straw.
  3. The wolf blows down the house of straw, and eats the pig.
  4. The second pig builds a house of sticks.
  5. The wolf blows down the house of sticks, and eats the pig.
  6. The third pig builds a house of bricks.
  7. The wolf can’t blow down the house of bricks, and comes down the chimney instead.
  8. The third pig puts a pot of boiling water at the bottom of the chimney.
  9. The wolf falls in the pot. The pig cooks and eats the wolf.

Hardly inspiring, right? (And maybe a tad harsh, but in the original tale the first two pigs were eaten by the wolf.) Programming languages are not prose.
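The recipe-like nature becomes even more obvious if the tale is written as actual code. Here is a toy sketch in Python – every name in it (wolf_can_blow_down and so on) is invented purely for illustration:

```python
# A toy rendering of "The Three Little Pigs" as literal instructions.
# All of the names here are invented for illustration.
materials = ["straw", "sticks", "bricks"]

def wolf_can_blow_down(material):
    # Only the brick house withstands the wolf.
    return material != "bricks"

surviving = []
for material in materials:
    if wolf_can_blow_down(material):
        pass                        # the wolf eats this pig
    else:
        surviving.append(material)  # the wolf resorts to the chimney... and the pot

print(surviving)
```

Exactly the same events as the numbered list, with none of the storytelling.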

Programming languages are artificial entities, designed from the ground up to have exact definitions and rules that do not change depending on context. A line of code has either exactly one meaning or none at all (the latter meaning it is incorrect) – that is the nature of machine logic. There are no synonyms, no analogies, nothing that alludes to historical or cultural significance. Natural languages also evolve over time, something programming languages do too, albeit in a far more restricted sense. Take English, for example – it evolved from Old English to Middle English to Early Modern English to what we know today. If you read a piece of Old English (Anglo-Saxon), you would not understand it at all; it might as well be Latin. The phrase “Where are you from?” in Old English is “Hwanan cymst þū?”. Programming languages never change in such a drastic way. In a programming language there is no room for improvisation, whereas human languages are fluid, and imperfect.

Human languages involve more than just spoken or written words. Spoken language carries body language, intonation, and volume. Even written languages have an aesthetic, through script or font, that programming languages lack (although, as fonts go, Courier is kind of boring – you see Courier, you think code). Programming languages have none of this, but then again, they were never designed to – they were designed to communicate with devices whose emotional state is limited to 1’s and 0’s. In human languages, word meanings also change over time: “awful” once meant “worthy of awe”, and a “clue” used to be a ball of yarn. The syntax of programming languages does not drift in a similar fashion, i.e. a while loop does not turn into a conditional statement.

Herein lies the core difference. Programming languages were designed to convey logic, to translate algorithms from theory into reality. Human languages convey both logical and emotional information, and are organic. Programming languages are therefore much more like Lego – building blocks of a sort. In this respect they are much simpler than their human brethren.


Is HTML a *real* programming language?

I recently taught myself HTML for a web design and development course I am building. I always said I would avoid HTML like the plague – I mean, it’s not real programming, is it? From the perspective of algorithmic programming languages, HTML is not one, for the simple reason that you cannot really implement algorithms with it. The “M” in HTML stands for “markup”, and that is exactly what it is used for: adding context and structure to content.

Some will disagree, and that’s okay, but HTML was never designed to be a traditional programming language – if it were, there would be little need for Javascript in webpages. Some will argue that HTML5 + CSS3 is Turing Complete, because you can use them to build a Rule 110 automaton… and that may be true, but does it really make HTML a programming language? The sample code provided is 12,200-odd lines long (you can find the code on github). The thing is, I could call a matchstick a building material and go on to build a house with it – but it wouldn’t necessarily make a lot of sense.
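For contrast, here is what Rule 110 looks like in a general-purpose language – a minimal sketch in Python (my own illustration, not the github code referenced above):

```python
RULE = 110  # 0b01101110: each bit gives the new state for one 3-cell pattern

def step(cells):
    """Apply one generation of the Rule 110 automaton (fixed boundary of 0s)."""
    padded = [0] + cells + [0]
    return [(RULE >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

# Start from a single live cell and print a few generations.
row = [0] * 10 + [1]
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

A dozen lines, versus 12,200-odd – which rather suggests where the logic naturally belongs.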

HTML is the modern-day equivalent of the typesetter in the printing industry. It doesn’t contain any programming logic: there are no control structures, it can’t perform math, variables can’t be declared, and there are no functions. HTML can’t take input and produce output… so it’s not really a programming language. Some would argue that the existence of tags like <ol> </ol>, used to create an ordered list, is a form of logic, and technically that is true. But word processors prior to WYSIWYG used the same sort of markup nomenclature, and by that logic word processors are also programming languages.
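One way to see the division of labour is to generate markup from a program. In this small Python sketch (the list items are made up), the loop and the string-building – the logic – live in Python, while the HTML that comes out merely describes structure:

```python
# The loop and the string-building (the logic) live in Python.
# The HTML output merely *describes* structure: an ordered list.
steps = ["Leave home", "Build a house", "Outwit the wolf"]

items = "\n".join(f"  <li>{s}</li>" for s in steps)
html = f"<ol>\n{items}\n</ol>"
print(html)
```

The markup says *what the content is*; the program decided *what to do*.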

Somewhere you have to draw the line.

Teaching – We’re not in Kansas anymore

I have been teaching in the postsecondary realm for nearly 20 years now. It used to be different. Students seemed more engaged (less distracted), and classes were more reasonably sized. It’s not like that anymore.

Last winter I taught a seminar class on the history of food, which was awesome. The difference? It was a first-year seminar with 18 students. Most classes just aren’t that small anymore. In larger classes it just feels like going through a process. 100, 200, 400 students? Is it even meaningful teaching that many people? Even 50 can be a struggle. Now imagine it from the student’s point-of-view – wouldn’t you feel lost in a class of 200? Likely. But the “efficiencies” of the modern university require it. Unless you are in a field with lower enrolments, or a “boutique” program, you won’t see those small classes… sometimes not even in upper-year courses.

Teaching should be an experiential and engaging endeavour, but the reality is that methods of teaching are predominantly the same as they were in the 1980s (or 70s, 60s?). Lecturing is, overall, quite boring. There was a time when it mattered more, when knowledge was harder to obtain. Now knowledge is ubiquitous: with resources on the net, and the likes of YouTube, you can teach yourself almost anything. Having said that, I taught myself to code in Pascal in first year because I couldn’t understand anything my instructor was waffling on about. Which points to the other problem – instructors who can’t teach. They exist: people more interested in research and writing hard-to-read articles. They may be brilliant (Sheldon-esque, need I say more?), but they don’t understand students, and that has a large role to play in the “feel” of the class environment. I would rather be one student of 200 listening to an engaging speaker than one of 30 listening to a boring, pedantic one (you just need to go to a STEM conference to hear those people).

Don’t all students deserve a better learning environment? (And I’m not talking about learning outcomes.) Learning is also a two-way street: as the teacher, you too are a learner. Don’t presume that young people don’t have any knowledge to give. I learned a lot from my food class, especially the benefits of integrating more plant-based foods into my diet. In a larger class it is harder to engage properly with students; you will never have the same sort of experiences, and that’s the problem. Sure, there are numerous “techniques” to engage large classes, and maybe some of them worked, once. But in the age of social media, people disengage quickly.

How can we fix higher education?

  • Transition towards smaller classes (20–30 students). This would be costly, but the cost could be offset by creating more experiential courses and requiring students to take fewer courses for their degrees. Maybe shorten degrees to three years (the norm in many places outside North America).
  • Create meaningful learning experiences. This requires a move away from the traditional lecture-based way of teaching, but with smaller classes that will be less of an issue. Incorporate experiential learning, and include more learning experiences *outside* the cloistered university environment. Some programs in some universities already do this.
  • Hire teachers. Hire researchers to research, and hire teachers to teach. There are people who can do both, but having dedicated teachers goes a long way towards making the system of education better.

A great example of this is the “Outdoor Adventure Leadership” program at Laurentian University: small class sizes (<30), experiential learning, real-life experiences. Students in this program assuredly learn far more than they would just sitting in a classroom. The same could be done in traditional STEM courses, if someone had the willpower to change the norm.

But maybe most people prefer to continue living in the Land of Oz.