The most evil programming language

After a recent talk on legacy software, a high-school student asked me what I thought the most evil programming language is. I have programmed in many languages over the years; is there one that stands out as evil, or at least slightly wicked? Some people would cite Cobol as evil, but that is only because it is so different from the norm experienced today. I would actually choose C. C has a Dr Jekyll and Mr Hyde quality about it. It excels at systems programming, and it is fast. But it will turn on you in a heartbeat and bleed memory. C will do just about anything you want it to, though it sometimes lacks elegance. That, and it is a hard language for novices to learn.

Here are Niklaus Wirth’s views on C¹:

“The widespread use of C effectively, if unintentionally, sabotaged the programming community’s attempt to raise the level of software engineering. This was true because C offers abstractions which it does not in fact support: arrays that remain without index checking, data types without a consistency check, pointers that are merely addresses where addition and subtraction are applicable. One might have classified C as being somewhere on a scale between misleading and (possibly) dangerous.”

“The trouble was that C’s rules could easily be broken, exactly what many programmers valued. C made it possible for programmers to manage access to all of a computer’s idiosyncrasies, even to those items that a high-level language would properly hide. C provided freedom, whereas high-level languages were considered straitjackets, enforcing unwanted discipline.”
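A small sketch of what Wirth means. Every line below compiles cleanly, yet each one breaks the abstraction C appears to offer: the out-of-bounds write and the wayward pointer arithmetic are undefined behaviour, and the type pun is accepted without complaint.

#include <stdio.h>

int main(void)
{
    int a[4] = {1, 2, 3, 4};

    a[7] = 99;             /* out-of-bounds write: no index check, compiles anyway */

    int *p = a;
    p = p + 10;            /* pointer arithmetic past the array: merely an address */

    float f = 1.5f;
    int i = *(int *)&f;    /* the bits of a float reinterpreted as an int: no consistency check */
    printf("%d\n", i);

    return 0;
}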

It is true: C lets you do things that other programming languages never would, and that freedom can lull programmers into a false sense of security. Consider a simple array of characters in C:

char word[20];

Yet use scanf to read a string into it, and nothing stops you from storing more than 19 characters in word. From a pedagogical perspective, C is already confusing here. In other languages, specifying 20 characters means 20 characters, and whatever terminates the string is handled transparently. Not so in C: word holds 19 characters plus one invisible end-of-string character, '\0'. With a plain scanf I could read in 30 characters and likely “store” them quite happily, silently overwriting whatever memory lies beyond the array. That should not be allowed to happen.
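Here is a minimal sketch of both the trap and the usual fix: a field width in the format string caps how many characters scanf will copy into the buffer.

#include <stdio.h>

int main(void)
{
    char word[20];    /* room for 19 characters plus the '\0' terminator */

    /* Dangerous: scanf("%s", word) copies as many characters as the user
       types, happily writing past the end of the 20-byte buffer. */

    /* Safer: the field width 19 stops scanf after 19 characters,
       leaving room for the terminating '\0'. */
    if (scanf("%19s", word) == 1)
        printf("read: %s\n", word);

    return 0;
}

The other common idiom is fgets(word, sizeof word, stdin), which at least makes the buffer size explicit at the call site.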

¹Wirth, N., “A brief history of software engineering”, IEEE Annals of the History of Computing, pp. 32–39 (2008)
