The problem with (CS) grad school

There was a time when being in graduate school had some sort of meaning. There often weren’t that many students, and those who were there wanted to be there, to achieve something and broaden their knowledge. When I was doing my PhD at RMIT University in the late 1990s, this was the case: a small cohort of like-minded individuals from many places, all working in a small office. It was fun, but everyone was also dedicated to what they were doing. There was also a masters (by coursework) program in our department (Computer Systems Engineering) that was course based, and full of overseas students. It was a harbinger of what the Australian education system would become – a broken business model reliant on overseas students to make it function. It is a model that does not work, and one that is slowly being adopted by some universities in Canada as well. Sometimes not even educational institutions are good at learning the lessons others have endured.

The problem is that now some CS graduate programs seem to have a reasonable number of less-than-stellar students, some of whom are not really there for educational purposes, but rather as an easy way to immigrate. I noticed this a few years ago when I taught a grad course on programming languages. Few of the students taking the course seemed to have much understanding of programming languages, or indeed an interest in learning. To me, any degree in CS requires you to be proficient in programming, and I’m not talking about HTML. Every week I get bombarded with emails from students who want to work in my research group, or are looking for a supervisor. Most don’t even take the time to properly read anything; they just send emails saying how good their grades are, and how interested they are in doing research in topic “Z”. 99% of these emails I delete immediately, partially because I’m not interested in taking on students anymore. Many have never even programmed in C, or are interested in some “flavour-of-the-month” topic like AI, which I have no time for.

But even if I were looking for grad students, it’s unlikely many would make the cut. Grad students require far too much advisor-based funding these days, and I think this funding, a good amount of which comes from the Canadian Government, should be focused on students who show real talent. I also would want a student who is a *really good* programmer. Like exceptional – able to program in numerous appropriate languages, i.e. nothing with the word M****soft in it. We’re talking C, and Python, and the ability to adapt to new and old languages (e.g. Fortran). They would also have to be a good writer. No one has the time to teach someone to write. If a student doesn’t have much experience writing, a graduate degree will be a struggle. In addition, I would want students who don’t need to be spoon-fed, i.e. can actually think for themselves.

The problem with academia in CS is that many of the good undergrads do not want to go to grad school, because they get jobs, and frankly grad school does little to better that (I have covered this before; there is just very little difference in $ earned between undergrad and grad degrees in CS). Sure, a graduate degree is not supposed to be about the money, but instead about the specialized knowledge… that being said, two more years in academia for an MSc versus a real job? What can you really learn about a topic that you can’t learn on-the-job? So due to the lack of domestic students, grad programs fill up with overseas students. Relying on overseas students to fill programs can be a bad idea, especially when faculty are required to basically fund them. Funding comes in many forms, one of which is Teaching Assistantships – the problem being that if students cannot really program or communicate properly, it reduces the quality of education our undergraduate students receive. Then there are the ethics involved in having international students pay exorbitant fees to study.

All-in-all, I see Canadian universities falling down the same rabbit hole Australian ones have – a reliance on overseas students to fill shortfalls. Look, I am not against overseas grad students – I too was an international grad student once – but it is better to have a system which is much more balanced.

A trace of a recursive function

Let’s consider a recursive function that calculates the square of a number, i.e. x². Here is the Ada function that performs the task:

function sqr(x : in integer) return integer is
   c, s : integer;
begin
   if x = 0 then
      -- base case: 0 squared is 0
      return 0;
   else
      -- recursive case: x² = (x-1)² + (2x-1)
      c := x - 1;
      s := sqr(c);
      return s + 2*x - 1;
   end if;
end sqr;

This works on the principle of x² being calculated using the formula x² = (x-1)² + (2x-1). Based on this, the square of x is the sum of the first x odd numbers:

1² = 1
2² = 1 + 3
3² = 1 + 3 + 5
...
x² = 1 + 3 + 5 + ... + (2x-1)

Here is what the function looks like when visualized as a flow-chart:

What happens if we trace the call sqr(4)?
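Unwinding the recursion by hand, the calculation looks like this:

sqr(4) = (sqr(3) + 7)                       recursive call sqr(3)
       = ((sqr(2) + 5) + 7)                 recursive call sqr(2)
       = (((sqr(1) + 3) + 5) + 7)           recursive call sqr(1)
       = ((((sqr(0) + 1) + 3) + 5) + 7)     recursive call sqr(0), return 0
       = ((((0 + 1) + 3) + 5) + 7)          return (0 + 1)
       = (((1 + 3) + 5) + 7)                return (1 + 3)
       = ((4 + 5) + 7)                      return (4 + 5)
       = (9 + 7)                            return (9 + 7)
       = 16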

AI will likely replace mediocre programmers

There has been a lot of news about AI like ChatGPT eventually taking on the role that some programmers perform, and I have to say this seems very likely. Now I’m not for most facets of AI, because I think it is an intrusive technology (and of course it’s not really artificial intelligence in the true sense of the term, more like “intelligence of a sort”). However, used properly it likely has some benefits, especially in fields like programming. I’m not talking about high-level intelligent programming, I’m talking about more generic, everyday programming. The type that often gets done by people who honestly either don’t have much real-world experience, or just aren’t that good at programming. I’m of course talking about the mediocre programmers.

Mediocre programmers should start to get a little worried, because there is every chance that over the next few years, AI will start to replace human programmers doing these tasks. And I’m talking about programming, not software development. It remains to be seen if large-scale software can be developed by AI in the near future (but then again, given the bungling of some large-scale software by so-called software professionals, maybe AI would be a better option?). People with good skill-sets and abilities need not worry of course, but people who lack the proper programming skills will have issues, and unfortunately universities tend to graduate quite a few of this type of person these days.

How do you prevent yourself from being a victim of AI? Well, there are limits to the intelligence of AI; after all, it is still a piece of software designed by humans, living off data. AI cannot really think for itself in the same way humans can (not yet anyway)… just as digital cameras do not see the world the way humans do. Human intelligence is unique in many ways – we use our computing ability, memory, emotions, and ability to think, often in artistic ways, whereas AI-based software relies on data and specific instructions. Humans also have strategy, creativity, empathy-based social skills, and dexterity. Machines don’t really have the ability to understand beyond the knowledge they have. Yes, there are constraints to human cognition, but there are also constraints to AI.

Basically, if you want to secure your future in computer science, don’t be a mediocre programmer. Be someone who has ideas, is a problem solver, and can think outside the box.

A recursion example: step-by-step

Here is a simple recursive subroutine written in Fortran which accepts an integer, and converts it to binary form.

recursive subroutine conv2binary(n)
   integer, intent(in) :: n

   if (n == 0) then
      ! base case: print the prefix before any digits (advance="no"
      ! keeps the digits on the same line)
      write(*,'(a)',advance="no") "binary = "
   else
      ! recurse on n/2 first, so the most significant digit prints first
      call conv2binary(n/2)
      write(*,'(i1)',advance="no") mod(n,2)
   end if
end subroutine conv2binary

Here is a breakdown of what happens when conv2binary(5) is called.

▶︎ The initial value of n is 5, so the else portion of the code is executed, and conv2binary() is called with n/2 = 5/2 = 2 (integer division) as the parameter.

▶︎▶︎ As the new n for the recursive call is 2, the else is executed again, and conv2binary() is called again but now with the parameter n/2 = 2/2 = 1.

▶︎▶︎▶︎ As the new n for the recursive call is 1, the else is executed again, and conv2binary() is called again but now with the parameter n/2 = 1/2 = 0.

▶︎▶︎▶︎▶︎ As n is now 0, the if portion of the code is finally executed, and the write statement prints “binary = “.

▶︎▶︎▶︎ Control now passes back to the previous level of recursion where n=1, and the write statement prints n mod 2 = 1 mod 2 = 1.

▶︎▶︎ Control then passes back to the previous level of recursion where n=2, and the write statement prints n mod 2 = 2 mod 2 = 0.

▶︎ Finally control then passes back to the original level of recursion where n=5, and the write statement prints n mod 2 = 5 mod 2 = 1.

In summary, the subroutine prints: 101, which is 5 in binary. Here is a visual depiction of the process:

Here is the rest of the program:

program dec2bin

   implicit none
   integer :: num
   write(*,*) "Integer? "
   read(*,*) num

   call conv2binary(num)
   write(*,*)

contains

! subroutine conv2binary() goes here

end program dec2bin
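Assuming an input of 5, a run of the complete program would look something like this:

 Integer? 
5
binary = 101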

Recursion – implementing sum

There are lots of easy-to-understand recursive algorithms. One of the easiest is a function sum(x) which sums all the integers from 1 to x. Here is the function implemented in Ada.

function sum(n : integer) return integer is
begin
   if n = 1 then
      -- base case: the sum of 1 is 1
      return 1;
   else
      -- recursive case: sum(n) = n + sum(n-1)
      return (n + sum(n-1));
   end if;
end sum;

Here the function sum(x) recurses until the value of x becomes 1. At this point the recursion is essentially done, and the calls to sum() backtrack. This is shown in the summary below for sum(5).

sum(5) = (5 + sum(4))                          recursive call sum(4)
       = (5 + (4 + sum(3)))                    recursive call sum(3)
       = (5 + (4 + (3 + sum(2))))              recursive call sum(2)
       = (5 + (4 + (3 + (2 + sum(1)))))        recursive call sum(1), return 1
       = (5 + (4 + (3 + (2 + 1))))             return (2 + 1)
       = (5 + (4 + (3 + 3)))                   return (3 + 3)
       = (5 + (4 + 6))                         return (4 + 6)
       = (5 + 10)                              return (5 + 10)
       = 15

There are four recursive calls to sum() in addition to the original call, which is not considered recursive because the function may terminate without recursing, e.g. if sum(1) is invoked. So if the function were to call sum(10000), there would be 9,999 recursive calls. The problem with recursion of course is that many of these simple algorithms are just as easy to implement iteratively.

Here is the same algorithm represented iteratively:

function sumi(n: integer) return integer is
   s : integer;
begin
   s := 0;
   for i in 1..n loop
      s := s + i;
   end loop;
   return s;
end sumi;

What is recursion?

Recursion is a method of problem solving which is associated with the notion of divide-and-conquer. Using recursion, a problem can be defined in terms of smaller instances of itself.

In general, problems that are naturally governed by a recursive solution possess two characteristics:

  1. The solution of an instance of the problem may be expressed in terms of one or more smaller instances of the problem.
  2. The smallest instance of the problem may be solved directly.

To illustrate these characteristics, let’s consider the problem of finding the sum, s, of a series of integers with values from 1 to some integer value n, i.e. 1,..,n. In mathematics this might be represented as:

s = 1 + 2 + 3 + ... + (n-1) + n

The problem of summing the integers from 1,..,n can now be divided into smaller instances of the problem. If we transform the sum into a function, called sum(), then sum(n) would represent the sum of all integers from 1 to n. The sequence of calculations can then be summarized as:

sum(n) = n + sum(n-1)
sum(n-1) = n-1 + sum(n-2)
...
sum(3) = 3 + sum(2)
sum(2) = 2 + sum(1)
sum(1) = 1

So for n=5, the sequence looks like this:

sum(5) = 5 + sum(4)
sum(4) = 4 + sum(3)
sum(3) = 3 + sum(2)
sum(2) = 2 + sum(1)
sum(1) = 1

The first line of the sequence sees sum(5) being decomposed into the summation of 5 and sum(4), the latter representing a smaller instance of the problem. Lines 2-4 also represent sub-problems. Line 5 is the smallest instance of the problem, which can be solved directly, i.e. the sum of 1 is 1. So sum(5) will end up equalling 5 + 4 + 3 + 2 + 1 = 15. Below is the problem represented in terms of recursive instructions:

1. sum(1) = 1
2. sum(n) = n + sum(n-1)

In programming, recursion is represented as procedures or functions calling themselves.
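As a minimal sketch, these two instructions map directly onto an Ada function: instruction 1 becomes the base case, and instruction 2 the recursive case.

function sum(n : integer) return integer is
begin
   if n = 1 then
      -- instruction 1: the smallest instance, solved directly
      return 1;
   else
      -- instruction 2: expressed in terms of a smaller instance
      return n + sum(n-1);
   end if;
end sum;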

The reluctant academic

I became an academic more out of chance than anything else. After my masters I never really considered doing another degree. Yet life intervened, and in my mid-20s I was diagnosed with Crohn’s. I had given up work, but after 6 months of doing nothing, I was reading a newspaper when I saw an ad for a PhD scholarship, $15K a year at RMIT University in Melbourne. I wasn’t doing anything else, so I figured why not have someone pay me to do a PhD while my body muddled along. I had never lived in Melbourne before, but that didn’t bother me; I had moved to more than one city never having been there before. After my MSc at the University of Manitoba, I knew enough to know I didn’t want to do a North American PhD – too much unnecessary nonsense, and it took far too long. In Australia a PhD is three years of pure research, and tuition for at least those three years was free. So I figured why not. A short while after starting I ended up spending six weeks in hospital, and having major abdominal surgery… which put my health back on track for a long while. But long story short, a little over three years later I handed in my PhD thesis, and waited until it was assessed. So I had a PhD… now what to do. A world trip, and life intervening, led me to an academic career.

But I have always been a reluctant academic. Perhaps it is because my view of academia stems more from what it once was, the nostalgia of the 1950s and 60s, when teaching meant more than it does now. When scholarly work was something one engaged in so as to become more versed in whatever area one studied. Alas this was not to be. Had I missed the boat by three decades? Why were people so focused on publishing things, and doing the dance of the great grant-cash-flow? Why did so few people seem to care about pedagogy? Why were so many people so closed-minded to new ideas? Then one day it dawned on me – I was the outlier. I didn’t fit the mould, but nor did I want to. Universities are meant to be this utopia of free thought, but I have learned over the years that the opposite is closer to the truth. Some people are so absorbed in their own self-interests that they aren’t interested in others’ ideas. Others mostly believe that their research is the be-all-and-end-all of everything, but I think this is mostly a delusion of some sort. Nobody matters that much. So you’re a Dr? Who cares, but people love their titles. Full professor, yeah great. Nobody really cares *that* much. My degree is stuffed in a cupboard somewhere; I honestly don’t care that much.

Maybe it’s one of the reasons I blog. I don’t really write papers anymore. I enjoy writing books, but only interesting ones. Academic papers are too laborious to write, and too boring to read, which is the way with STEM work. Often blog writing isn’t considered “rigorous academic scholarship”, and takes away time that could be spent writing journal articles. But writing blogs is more interesting, and from the perspective of intellectual wellness is far better for you than writing papers few people will ever read.

A historical background to the word “recursion”

recursion (ri-kur’-zhan) noun. See recursion.

The English word recursion likely originated from the Latin recursus, to ‘run back’. Although the word is now mostly associated with mathematics in the context of recursion relations, it can be found in use at least as far back as 1819: in Samuel Johnson’s “A Dictionary of the English Language” (1819), recursion is simply defined as “return”, with the example given being that of a pendulum.

In the “Dictionary of the English and German Languages”, written by J.H. Hilpert in 1831, an example for recursion is again given, in the form “the recursions of a pendulum”.

Strangely, by the early 1900s, the word had disappeared from English dictionaries, such as the Concise Oxford Dictionary (1911). An investigation of English dictionaries throughout the 20th century shows that it doesn’t make a reappearance until the 1970s. Here is the entry from The Concise Oxford Dictionary of Current English (1976). Nothing really to do with computing.

It was not until the 1990s that the word took on a more mathematical/computer science based meaning (The Concise Oxford Dictionary of Current English, 1990).

And later in the decade the entry was expanded to include the word recursive (The Concise Oxford Dictionary, 1999).

Technology has just become a big blah

There isn’t really a lot to be excited about in technology anymore. Not like it was in the 1980s… but then again, people were actually building their own machines, or upgrading them. I remember the amazing number of third-party products available to hot-rod Macs. I recently bought a new iPhone 14, just because my iPhone 8 had come to the end of its useful life. Is the iPhone 14 good? Sure, I mean the battery seems a little better, the interaction space is nice, and the cameras are much improved, if not from a megapixel perspective, then from a functionality perspective.

But… and you knew there would be a but, it’s not an *AMAZING* new experience. It’s just a new experience. I mean apart from the hardware upgrades, nearly everything else is software… and it’s not really perfect. In fact it could use a lot of work – if you’re not convinced, check out Luke Miani’s video “It’s not just true… iOS is getting WORSE.” It’s likely the whole OS needs an update – maybe better things for iOS 17?

Technology for the iPhone has certainly improved in the 16 years between the iPhone 1 and the 14, but has the experience improved 14-fold?

Things are no different with any other technology. I have a lot of Apple technology in my house, but nothing that automates anything (I find that stuff invasive, and frankly I can turn the lights on without assistance). I had two Nest Protect smoke alarms a few years back, but they weren’t really that smart, and were really buggy – not worth the price. Likely a bunch of these devices have AI in them now, which I have no interest in… much of this stuff is just future junk (I mean our houses are full of the stuff). Digital cameras? There is a bit of new stuff, but it has been more to do with advances in optical science than advances in digital technology. Sure, sensors get more photosites, but more megapixels does not necessarily mean better pictures.

Why has technology become such a blah? Because there are very few new ideas in technology. Sure you can add AI to just about anything, but does it make it better, or take something away? AI can make cameras in mobile devices produce nicer images, but does that not diminish the abilities of the human somewhat? So what is the future? Perhaps nothing because we have come up against the limits of most of these devices. Mobile devices can’t get much better, largely because they can’t get much bigger, and therefore only have so much room inside them to add new technology. Technology could be shrunk, but there are limits there as well. Anyone for battery life? In reality the true technological advances will be in realms beyond the consumer.