Sally sells C shells, but should she have?

C is an inherently complex language to learn, though not because of its size; it is a small language compared to the likes of Ada or C++. The complexity comes instead from its assembly-like heritage. The white paper linked below surveys some of the more challenging aspects of C from the perspective of the novice programmer, and offers alternatives from other languages where usability may be improved.

Syntactic sugar and obfuscated code: Why learning C is challenging for novice programmers

Learning recursion through visual patterns (ii)

Visual algorithms are a good way of learning recursion. As we have seen with recursive squares, simple is often better. What about circles? We could create a program in Processing to embed a circle within a circle, each time reducing the size of the circle. Here is the setup:

float dx, dy;      // centre of the circles
void setup() {
   size(400, 400);
   noLoop();       // draw() runs only once
}

void draw() {
   dx = width/2.0;
   dy = height/2.0;
   background(255);
   stroke(1);
   recCircle(200);
}

The call to the recursive function recCircle(200) starts with a circle diameter of 200 pixels. A circle is drawn at position (dx,dy) with a diameter of diam, and at each recursive call the diameter is reduced to 75% of its previous size. The recursion stops once diam is no longer greater than 5. Here is the recursive function:

void recCircle(float diam) {
  if (diam > 5) {            // stop once the circle gets too small
    circle(dx,dy,diam);      // draw the current circle
    recCircle(diam*0.75);    // recurse with 75% of the diameter
  }
}
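Since diam starts at 200 and is multiplied by 0.75 on each call, thirteen circles are drawn (diameters 200, 150, 112.5, and so on down to about 6.3) before the diameter falls below 5 and the recursion stops.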

 

Here is the output:

The dangers of technology in the home

It is very tempting to fill our homes with a plethora of technology, “smart homes” I think they call it. It is supposed to make our lives easier, but in reality that is probably not the case. The bigger issues, however, are throw-away technology and data privacy. The first issue is a pertinent one, because we create far too many technology products (which now include appliances) that have a limited lifespan. When the battery wanes, or a new product version appears, we ditch the old one. If we’re lucky it ends up being recycled; if not it ends up in a landfill, plastic components and all. No electronic technology lasts forever. Sometimes it’s the software that has simply reached the end of its lifespan, and the company doesn’t want to maintain it anymore. Technology companies need to rein in their throw-away ideologies, which put profits before the environment (and let’s face it, even appliance companies are technology companies).

Of course, having said all that, sometimes the technology itself is substandard. I’ve had a Nest Protect for a number of years. The first one wasn’t great; it just stopped working properly one day, basically shutting itself down. Google sent me a replacement, but honestly it hasn’t been any better. It turns blue and beeps when there is nothing happening anywhere. I’m not convinced it’s really functioning, and I will likely replace it with a dumber piece of technology. Even LED lightbulbs have a bunch of technology in them, and frankly that’s why their lifespans have been curtailed in recent years. “Internet of Things” technologies are also vulnerable to a bunch of different issues. Would I trust a smart lock? No. Good physical keys are much better. Part of the reason is pertinent to colder regions – cold saps battery life, and smart locks run on batteries. Smart locks with internet connections are also more susceptible to attack than, say, a keyed lock with a good keying system like Medeco. Check out this article.

TVs can be smart, but most other appliances don’t need to be. The more technology you add to a device, the greater the chance of errors, because no technology is perfect. Refrigerator problems used to be easy to diagnose; not so now. The more technology we add to the home, the harder it becomes to track what is functioning properly. Light bulbs provide light; they don’t need to be smart. Actually, maybe people should stop offloading things to machines and get a little smarter about how they run their homes. One of the biggest issues is trust. Could a smart door be hacked? Car systems have been hijacked and controlled remotely in the past. What about malware attacks? Does the software in these devices need to be regularly updated, and if so, is it done remotely? Can you trust that? Is your privacy being compromised by some home device that you talk to? Speakers that listen to you, cameras that can watch you, and technologies that control lights, locks, and climate control.

The question is, how much control do you want to give away?

A simple Julia example – The Yorkshire Estates

To illustrate how easy it is to make programming somewhat fun, consider the following puzzle, “The Yorkshire Estates“, described by Henry Ernest Dudeney in his book of puzzles “The Canterbury Puzzles”.

I was on a visit to one of the large towns of Yorkshire. While walking to the railway station on the day of my departure a man thrust a hand-bill upon me, and I took this into the railway carriage and read it at my leisure. It informed me that three Yorkshire neighbouring estates were to be offered for sale. Each estate was square in shape, and they joined one another at their corners, just as shown in the diagram. Estate A contains exactly 370 acres, B contains 116 acres, and C 74 acres.

Now, the little triangular bit of land enclosed by the three square estates was not offered for sale, and, for no reason in particular, I became curious as to the area of that piece. How many acres did it contain?

Yorkshire Estates puzzle diagram

The task is to derive a way of calculating the area of the triangular piece of land. I won’t go through the whole description of the problem and how to create the algorithm, because you can find that here. Basically, since each estate is square, the square root of its area gives the length of one side of the triangle; once the three sides are known, the area of the triangle can be calculated using Heron’s formula. Let’s just look at the Julia program.

println("What is the area of each estate?")
println("Estate 1: ")
area1 = parse(Float64, chomp(readline()))
println("Estate 2: ")
area2 = parse(Float64, chomp(readline()))
println("Estate 3: ")
area3 = parse(Float64, chomp(readline()))
side1 = sqrt(area1)
side2 = sqrt(area2)
side3 = sqrt(area3)
semiP = (side1 + side2 + side3)/2.0
areaTri = sqrt(semiP * (semiP-side1) * (semiP-side2) * (semiP-side3))
println("The area of the triangle of land is ", areaTri, " acres.")

This teaches the basics of problem solving, keyboard input, output, variables, and calculations. Nothing more is needed for an introductory problem.

 

You could think like a computer scientist, but do you want to?

There are many books on programming in the bibliosphere. Some are good, some are mediocre, and some are very technical, targeted towards a particular audience. Think Julia (How to Think Like a Computer Scientist) is a book in the latter category. Its title alludes to the fact that you could think like a computer scientist, but the question is: do you want to? One of the reasons people shy away from learning to program is the “tech” aspect of it. By proclaiming your book as CS-centric you automatically reduce your audience.

The book contains a good amount of information in its 276 pages – at least it is not a 1000-page behemoth. The authors do, however, slip into the same trap as many technical writers: a lack of thought for the abilities of their audience, despite assuming their readers are non-programmers (they do say “No formal prior knowledge is required.”). People from varied walks of life would likely like to learn how to program, and Julia is indeed a good language in which to learn. The problems include (i) the number of chapters dealing with higher-end language concepts and (ii) the lack of cohesive and relatable examples. It’s not alone; there are few good programming books out there. Part of the problem is the complexity of programming languages. There are no books that just teach the basics of programming using a series of languages as examples. They used to exist in the 1970s and 80s. Now programming books all seem to be the size of War and Peace, or are too esoteric to read.

I wanted to like this book. If its audience were anyone but novice programmers I would not have an issue. But the way the book reads just doesn’t make for an easy read for the novice, non-CS individual wanting to learn programming. First, maybe start with the concept of data – something that underpins all computing. Jensen and Wirth did it in their small book Pascal: User Manual and Report. It doesn’t take a lot, and Julia is *all* about data. After data and variables, I think books should cover control structures, because you can build programs without functions, but this book chooses the function route first. That’s not terrible, as long as only assignment statements are used to illustrate what’s inside a function (these functions don’t return values; that comes in a later chapter). The problem arises when one hits the “stack diagrams” or “void functions” sections. Really? This is exactly why this is a book for computing-centric people, and not for novices. People who write programs to solve problems don’t want to know what’s under the hood.

Describe a programming concept, then illustrate it using data people understand, and leave out things that aren’t relevant. Recursion is a good example. Many CS students don’t understand it, so don’t bewilder the novice programmer with it; it will seem like Harry Potter magic. The fact is, books on programming could be simpler than they are. Take the simple concept of making decisions. Strangely enough, in this book recursion is covered in a chapter entitled “Conditionals and Recursion” – why, I do not know – would it not be better associated with functions (or not included at all)? The same chapter covers “Keyboard Input”. Choose one topic and stick to it. In Ch.7 we finally hit iteration – arguably more important than recursion. The chapter starts off with variable reassignment and ends with algorithms, a topic that would have been better dealt with at the start of the book, not in a chapter on iteration. From there the book deals with higher-level topics for the remaining 176 pages – strings, arrays, dictionaries, tuples, files, etc.

Summing up, I think this book is perfect for someone who knows the basics of programming, but not for someone who has never programmed before. Julia is about data, and as such the examples used to illustrate concepts should reflect this. The examples used throughout the book are typically CS-centric as well – Factorial, Fibonacci and Ackermann to illustrate recursion (which pops up as a topic twice) – not exactly practical examples of how recursion is useful for solving problems. There are case studies, for example one that looks at word puzzles, but they seem to offer more exercises than any sort of methodology for solving a particular word puzzle. Maybe look at some financial or environmental data, or even image data, which isn’t that complex to work with given the tools available.

Recursive patterns – the Koch snowflake

The Koch curve can be extended by placing three copies of it outward around the three sides of an equilateral triangle, forming a simple closed curve that is the boundary of the Koch snowflake.

Once we have the code for the recursive Koch curve, implementing the snowflake is trivial. It basically involves generating a Koch curve, rotating 120°, generating another Koch curve, rotating another 120°, and then generating the final Koch curve.

Here is the Processing code, which can be used with the recursive functions described previously (a rough sketch of these is included at the end of this post).

void kochSnowflake(float lngth, int level) {
  kochCurve(lngth,level);   // first side of the snowflake
  rotation(120);
  kochCurve(lngth,level);   // second side
  rotation(120);
  kochCurve(lngth,level);   // third side
}

Here is the output for lngth=200 and level=4.

Koch snowflake

A 4-degree Koch snowflake
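For completeness, here is a rough sketch of what kochCurve() and the turtle-style helpers it relies on might look like. The position variables tx and ty, the heading, the forward() helper, and the sign of the turns are assumptions here, not the original code from the earlier post.

float tx, ty;        // current turtle position
float heading = 0;   // current heading, in degrees

void setup() {
  size(400, 400);
  noLoop();
}

void draw() {
  background(255);
  stroke(0);
  tx = 100;            // left end of the curve
  ty = 250;
  heading = 0;         // start off heading east
  kochCurve(200, 4);   // a single level-4 Koch curve
}

// Move forward, drawing a line segment of length lngth
void forward(float lngth) {
  float nx = tx + lngth * cos(radians(heading));
  float ny = ty + lngth * sin(radians(heading));
  line(tx, ty, nx, ny);
  tx = nx;
  ty = ny;
}

// Turn by the given angle (in degrees)
void rotation(float angle) {
  heading = heading + angle;
}

// Each segment is replaced by four segments one third the length,
// with turns of -60, 120 and -60 degrees in between.
void kochCurve(float lngth, int level) {
  if (level == 0) {
    forward(lngth);
  } else {
    kochCurve(lngth/3.0, level-1);
    rotation(-60);
    kochCurve(lngth/3.0, level-1);
    rotation(120);
    kochCurve(lngth/3.0, level-1);
    rotation(-60);
    kochCurve(lngth/3.0, level-1);
  }
}

To draw the snowflake instead, call kochSnowflake(200, 4) from draw(), moving the starting point up so the whole figure fits in the window.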

 

Is a CS education meaningful?

I sat in a cafe yesterday listening to two people, 20-somethings, talking about what I imagine was some software project at some company. It was hard not to hear what they were talking about. What struck me was that I have no clue about the modern mechanisms for building software in a real company. In fact I doubt many CS instructors do. It’s no different from other fields in academia. The difference, I think, is that academics in history are actually experts in their fields; I don’t know if I believe the same is true of technological fields. Why? Because academics don’t write large pieces of software. We write programming libraries, or tiny systems, but nothing groundbreaking. Students write bigger pieces of software than we do, and often have more real-world experience than most instructors – they go off to co-op placements and learn real things.

I once had a student who took a 4th year project course with me to develop an app… I really just supervised because I have no real knowledge of how to build apps. At the end he went off to the Apple Developers conference in the summer, and when he came back he said he learned more in one week there than he did in four years of school. I had to agree. I don’t think I necessarily learned much as an undergraduate. It was a means to an end. Sure, I learned to program, but I taught myself that. Like anything in life, you just need the will to do something in order to succeed.

I can teach programming, and I can teach the basics of program re-engineering (one of the few things most companies don’t delve into in any great depth – Cobol anyone?). I can teach the basic algorithms underpinning image processing, or how to look at photography from an aesthetic viewpoint. But I can’t teach you how to design and develop large pieces of software in a realistic manner. I don’t know how to build an app, and I can’t program in Swift. Those are skills most people will likely learn on the job – that’s just the way it is, and maybe the way it should be. But then again maybe CS education doesn’t belong in a degree. Maybe it belongs in an apprenticeship program supplemented by some pedagogical underpinnings (not the other way around).