Computer science and the age of the app

The age of the desktop is dead – long live the mobile device!

But where does this leave computer science? Are we still just giving students a “foundation” in programming, software and hardware with a sprinkling of specialist courses? Fundamentally our curricula haven’t changed much in 30 years – sure, we have mixed up the languages on occasion, but we haven’t made radical changes. Industry has, in many ways, evolved – but there are still vestiges of the early days of computing. Indeed, the greatest amount of code written every year is still Cobol, yet few institutions teach it anymore. But few industries rely solely on one programming language anymore, and even financial institutions which rely heavily on Cobol layer it with Java and Ruby, so the real guts of the system are hidden from view. Companies like Google have increasingly relied on languages such as Python – and software has moved to the mobile device.

Traditional ways of designing and coding software are dead. The evolution of the ubiquitous app has guaranteed that – the app has effectively become an organic entity. It is created, released, and then goes through various cycles of “healing”, whereby crowdsourced testing finds bugs which are fixed, or of advancement, where new capabilities are added. A new version of the operating system or device requires the app to adapt. Failure to advance or adapt results in its expiry. It really is survival of the fittest.

We teach software design – but do we really know what goes on in a modern startup? What we need to do is spend time shadowing software groups in small companies, finding out what makes them tick – and what makes them successful. What software design practices have they developed that we could teach? We also need to provide facilities for training instructors in new techniques such as mobile device programming. Last but not least, we need to move away from the traditional model of teaching computer science. What is needed is a model whereby students are formed into “start-ups” after the first year, and spend their remaining university time working in groups designing real products – say one piece of software per year. No formal classes per se, yet all the relevant concepts are taught by way of workshops, including workshops on mobile usability, agile development, iOS/Android programming, and the like. Assessment would be based on how successful their product is. This would ultimately create a cohort of industry-sensitive individuals ready to tackle the future. Students might start by creating command-line software and progress to creating viable mobile apps.

This IS the future. Whether or not we embrace it and push the envelope is up to us. We too need to evolve.

The history of computing

It saddens me that so few programming students know much about the history of computing. Dijkstra, Wirth, Hoare anyone? There is a lack of knowledge of the computer scientists who built the field, and a lack of understanding of how computer science developed. This is partly our fault for not advocating more historical awareness. It’s not good enough to say “but we’ve moved on from there”. Clearly in many cases we haven’t. Besides which, there are many good discussions to be had by reviewing history. Here are some topics:

  1. Does the use of Cobol really cripple the mind? – Why has Cobol survived since the dawn of computing?
  2. How relevant is Fortran?
  3. What has been the impact of object-oriented design since the early 1990’s?
  4. Java, C, C++ – Is there one language to rule them all?
  5. Who is the greatest computer scientist?
  6. How evil is GOTO?
  7. If GOTO had not been supplanted by structured programming, where would programming be today?
  8. What has AI achieved in the past 30 years?
  9. What were the most revolutionary ideas of the 1970s? The 1980s? And so on.
  10. Has our way of programming really evolved?
  11. When did people begin to start hacking?

One could also look at the evolution of programming languages, the birth of the electronic computer and its role in WW2, a history of algorithms (historical and those considered the most influential), a history of the operating system or usability. In reality there are a myriad of interesting topics to explore, providing a better understanding of why computer science is what it is. Yet few institutions offer a “History of Computer Science” course – maybe because it takes away a valuable slot from a “hot” topic. But much can be learned from delving into the past.

“It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.”
– Edsger W. Dijkstra