Experiential learning and failure

Experiential learning involves learning by doing. You learn to cook by actually cooking, not by studying recipes. You learn to build by building. It is these fundamental skills that help you gain an understanding of problem solving, and of failure. That's right, failure – because not everything will work the first time. The very nature of experiential learning is that it is hands-on, and probably somewhat experimental. A recipe might seem easy in a cookbook, but may not turn out exactly right when you make it, and that's okay. You can't expect to be an instant expert. Some of the notion of instant gratification comes, of course, from building toys such as LEGO. Now LEGO is a great platform for creativity, but not all LEGO. LEGO kits such as the Star Wars vehicles look cool, but there is nothing creative or experiential about them. Open the box, follow the instructions, build the vehicle. There is no real chance to fail (unless there is a piece missing, which *rarely* happens).

But just as there is no chance of failure, there is no real chance of excitement or innovation. Sure, you could build something else with the pieces, but how many kids do? Modern Meccano is no different: no element of failure. Modern toys aren't meant to fail. The harsh reality is that failure gives you some perspective, and something to learn from. What we end up with is kids who grow up with no perspective on what failure is.

But failure is okay, and here are some reasons why:

  1. Failure is inevitable – Everyone fails at something, sometime in their lives. Ever baked a cake that was a failure?
  2. You learn more from failure than success – Perfection does not exist; everything can benefit from improvements, even minor ones.
  3. Failure emboldens risk-taking – If you aren't afraid of failure, you might take more risks – you can't play it safe forever. Failure epitomizes humanity's tireless struggle.
  4. Failure provides new paths – Sometimes things don’t work out, and that allows some reflection, and maybe the realization that alternate paths exist. Failure makes you dig deeper to find new meanings.

However, in modern society failure is often deemed a deficiency. It is time to take a different attitude towards failure. Programmers fail all the time – programs fail to compile, and algorithms fail to work. And you know what? It's okay.

The problem with programming is a lack of problem solving skills

Everyone wants kids to learn to program, but has anyone actually asked the kids? There was a vast body of research performed in the 1970s and 80s into better ways of teaching kids to program. Languages such as LOGO were developed to make it easier, and more fun (because you could program visual things). Now we have tech companies pouring millions into teaching kids to code. I get it, in this day and age being able to program should be up there with reading and writing, right? But here's the kicker: the craft of programming likely doesn't matter much. Yes, coding can be challenging, partially because some of the chosen languages can be unintuitive and complex to learn (C anyone?). Leaving the usability and cognitive aspects of programming languages aside, the real question is *why* do we want kids to learn to code?

To write programs? What sort of programs? There seems to be a mobile app or web app to do just about everything. Many programs that people need have already been written. If I need to filter a photograph, there are a multitude of ways of doing it. I don't need to code my own. Programming is just a tool, so if we want to teach people the concepts of coding, that's fine, but there has to be a purpose. The purpose is of course to use programming to implement an algorithm – the solution to some problem being solved. Without an algorithm to implement, there's no point in coding. It's like owning an axe in Iceland – nice idea, but there aren't that many trees, so what are you going to do with it? We would be better off teaching kids how to solve problems.

That's right, solve problems, in an experiential way. Schools spend too much time doing nothing even remotely experiential. STEM subjects like chemistry and physics are fairly cookie-cutter… not much innovation there (don't get me wrong, there are innovative teachers, but they are often hamstrung by a system designed by clueless people – the same ones who want to get rid of shop classes because someone might "get hurt"). We need more courses that make kids think – at the elementary, secondary and post-secondary level. They do exist, usually in the guise of design or architectural courses. But STEM subjects usually offer few innovative classes, especially experiential ones, i.e. ones *not* involving lectures, but real-life problem solving. Problem solving doesn't need to be theoretical, either. It could be as simple as building something from LEGO, or designing an algorithm to use a drone's photographic abilities in a particular way. Ultimately the best types of problem solving involve building something, because then there is an actual association between thinking and a tangible solution.

It's the sort of thing kids do while playing – one could almost say it's experiential problem solving through play. Remember being a kid, and building forts out of pieces of wood and stuff? Not all kids have those experiences these days. Or LEGO sets that were just a big box of LEGO pieces, and you built whatever you wanted? Or Meccano? Most kids once took some sort of shop class, where you learned skills and learned to problem solve. What is the strongest way of building something with wood? Even cooking lets you problem solve – if we want to make a cake recipe gluten free, what do we substitute for wheat flour so that the cake turns out tasting the same? Innovation comes from experiential learning, and frankly we provide far too few opportunities for kids these days.

Something needs to change.


AI – the great abyss

AI has become somewhat of a pet peeve. Every day you see someone on TV from some company talking about AI, but there doesn't seem to be any real idea of what AI is. Is it artificial intelligence? Hardly, because evolution took millions of years to get human intelligence to where it is today. Has any other life on the planet become self-aware? (If there is, they are keeping it to themselves.) How many other planets in the universe hold intelligent life? (We don't know, but there might be some that don't consider us very intelligent.) So creating intelligence isn't easy. Does artificial intelligence mean "human-like" intelligence? If it does, it could be hard to replicate; if it doesn't, then what intelligence does it mimic?

We assume we as humans have the only form of intelligence; however, it could be that dolphins are just as intelligent as us, but didn't evolve in such a way that they could control the world around them. Ants and bees have intelligence as well, maybe more akin to social intelligence, but intelligence nonetheless. Of course, we don't really know that much about the way the brain works either. We like to *think* we do, but it's made predominantly of water and fat, so understanding how it works is a challenge.

Siri is not AI, and neither is Alexa. Neither is that "smart" thermostat, or dishwasher. They run on algorithms, which may include some form of "self-learning" that analyzes vast repositories of data in a way that humans can't. An autonomous "weed puller" on a farm is by no means intelligent. It works by taking a photo of a plant and determining if it is a weed – by comparing it to possibly tens of thousands of photographs of weeds. If it matches, the plant is pulled. But this is hardly intelligence; it is an algorithm – the weed pulling system is following a set of defined rules. The same goes for autonomous vehicles – using data from a number of sensors, the car drives itself based on the knowledge encoded in its algorithms. But throw in a random variable, like intense rain or snow, and the algorithm likely won't work as well. The car doesn't think for itself. It can't. It all relies on data, lots of it. So AI would be better termed DI – data-based intelligence – or better still, data-based decision making, because that's exactly what it is.
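
To see just how un-intelligent that is, here is a minimal sketch in Python of the weed puller's decision rule. Everything in it is hypothetical – the similarity function is a stand-in for whatever image matching a real system would use – but the shape is the point: a fixed rule, applied over and over.

```python
# Hypothetical sketch of the weed puller's decision rule.
# The "photos" here are just filenames, and similarity() is a stand-in
# for whatever image-matching model a real system would use.

def similarity(photo_a, photo_b):
    """Stand-in for an image matcher; returns a score between 0 and 1."""
    return 1.0 if photo_a == photo_b else 0.0

def is_weed(photo, weed_photos, threshold=0.9):
    """Compare the plant photo against a large reference set of weed photos."""
    return any(similarity(photo, ref) >= threshold for ref in weed_photos)

def process_plant(photo, weed_photos):
    """The entire 'intelligence': if the photo matches a known weed, pull it."""
    return "pull" if is_weed(photo, weed_photos) else "leave"

weed_photos = {"dandelion.jpg", "thistle.jpg"}    # imagine tens of thousands
print(process_plant("thistle.jpg", weed_photos))  # pull
print(process_plant("tomato.jpg", weed_photos))   # leave
```

No matter how large the reference set gets, the machine is still just following the same defined rule.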

Yes, computers do things faster than humans, there is no doubt about that. But a smart thermostat isn't intelligent; it just contains algorithms that look for patterns in human behaviour – if no one walks past the thermostat in 2-3 hours, it assumes the house is empty. All these machines do is modify human behaviour to suit machines. Do we need intelligent fridges? Ones where we can view the contents of the fridge from an internal camera, shown on an external screen? Do we need robotic clothes drawers? We need them about as much as the moving sidewalks and atomic trains of the 1950s – i.e. not at all. In fact, people need to start using their own intelligence, and stop relying on devices to run their lives.
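
For the record, that thermostat "pattern" boils down to something like the following sketch – the timeout and the temperature setpoints are invented for illustration, but the logic is about this deep:

```python
import time

# Hypothetical occupancy rule for a "smart" thermostat: if the motion
# sensor hasn't fired in a few hours, assume the house is empty and set
# the temperature back. The numbers are invented for illustration.
AWAY_AFTER = 3 * 60 * 60           # seconds of no motion before assuming "away"
HOME_TEMP, AWAY_TEMP = 21.0, 16.0  # setpoints in degrees Celsius

def target_temperature(last_motion, now=None):
    """Pick a setpoint based on nothing more than a motion timeout."""
    now = time.time() if now is None else now
    if now - last_motion > AWAY_AFTER:
        return AWAY_TEMP   # no one walked past: assume the house is empty
    return HOME_TEMP       # recent motion: keep it comfortable

print(target_temperature(time.time() - 4 * 3600))  # 16.0 -> "away"
print(target_temperature(time.time()))             # 21.0 -> "home"
```

Pattern matching on a timer – faster than a human, certainly, but not intelligence.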


How the brain rewires itself…

The human brain is an interesting thing. It's made of a lot of water, and some fat. How we are able to think with it is truly remarkable, and we may never really understand how it works (and maybe we don't need to).

Years ago I had a transient ischemic attack (TIA), or mini-stroke. While there were no physical changes to my brain, the years that followed showed some changes in the way I think. I used to have a very active mind; even when walking outside my mind would be churning away… there was rarely a time when it wasn't doing something. Afterwards though, there was an inherent stillness in my mind which I had never encountered before. Now I can walk outside, and my mind is not actively thinking about anything. I also find it easier to be creative, and seem to have a greater flexibility of thought. I don't know how this re-wiring of my brain occurred, but I am almost grateful it did. I have come to the realization that I really enjoy meaningful "writing", and hence the blogging. I enjoy writing the manuscripts I do because they are interesting to me, and I try to make them very readable (although these days it can often take a year to finish writing anything). Many computer scientists used to write in this manner (think Niklaus Wirth, or Dijkstra), but few do these days – too much concern with having all papers conform to some regimented standard of scientific writing, with little or no concern for the reader.

The other thing I have come to realize is that I'm very much a visual learner, and a right-brain thinker. Now more than ever I prefer to build things, and be creative. That's probably why I am somewhat drawn towards photography, and image processing. Computer science isn't exactly a creative field. Sometimes what we do with algorithms is creative, e.g. the software that runs digital cameras, or even photo drones, but there is nothing creative about the code that runs a mobile device, or a microwave. Algorithms don't flow in the same organic manner as the lines on a carved wooden spoon. There is some limited beauty in algorithms, but at the end of the day, the programs they evolve into are just inanimate things. Working with your hands requires design, innovation, and improvisation. Thinking work alone rarely does, and unlike building a piece of furniture, pure thinking, without building of some sort, produces very little in the way of tangible outcomes.


Deciding on post-high school studies

When I went to university I did a science degree, for some reason. I can't remember why; probably the same reason my father wanted me to do the advanced math course in grade 11 (which I failed miserably at). To be honest I was never really that interested in math, or computers, in high school. Nor science – biology was okay, but chemistry was boring, and I loathed it (I even loathed it in the two years I did 1st-year chemistry at university). There was a computer class at high school, but as I remember it was programming in BASIC on a TRS-80. Not exactly awe-inspiring. I would have preferred doing something other than science, but my upper-year high school classes were geared towards science.

I should have made my own decision, and chosen to do an arts degree, likely in history. I did not enjoy many of the science courses I took at university. Computer science was okay, but then again it was the only thing that actually afforded some thinking. Many science classes were repetition. People who were good at rote learning were good at things like biology and genetics. I wasn't. We couldn't take arts classes, so I settled on some from geography, which were fun. I ended up with a major in CS, and a minor in stats – but ask me about stats today, and I could tell you very little. I don't necessarily regret going to university, but I realize now that there could have been more productive things to do with my life. I just never really found university stimulating. That I spent a 5th year doing an honours degree, and then two grad degrees, still astounds me. Look, I am where I am today because of these early decisions… but eventually it will be time to move on to new challenges and do new things.

What I'm trying to say in a very long-winded way is that when you are considering what you want to do post-high school, base the decision on what *you* want. Not what others want. Don't get sucked into the fancy brochures from institutions of higher learning. Think about what will be relevant in 5 or 10 years. Think of what you will enjoy doing. Will the job allow you to travel, or move to different places? Will it allow you to grow, and more importantly, will you be interested in doing it for many years? If you want to learn a trade, then learn a trade. If you want to do a degree in history, then do a degree in history. Do whatever speaks to you. And be aware that you may not be able to find your true calling at age 17. Some of us are still looking. If you are hesitant, take a gap year, and volunteer somewhere, get a job, travel… open yourself to a world of possibilities.

Computing – It's life, Jim, but not as we know it.

When we look at the field of computing, we have to ask ourselves whether or not it has improved our lives. Many aspects of computing certainly have, mostly to do with the speed and precision with which things can be done. Where would we be without the ability to access funds anywhere in the world, easily communicate with many people, plan vacations, shop online, or obtain information?

The problem lies in the fact that things have probably moved far quicker than we anticipated, and now we are stuck "fixing" problems we likely didn't need to create, and moving forward with things we likely don't need. By "fixing", I largely mean the security of our information, and uber-complex systems. We likely wouldn't need so much security if we had taken a bit more time to make our systems more robust to begin with. And the problem is exacerbated by the fact that we keep adding things that make it even worse. Take the "Internet of Things", the CS community's quest to make life easier for everyone by automating everything in our homes. I've talked about this before, but people's ineptitude towards technology (crappy passwords, no passwords, thinking that the tech will do everything itself) is starting to show. How much technology do I want in my home? As little as possible. Some of it is certainly helpful, but the rest just survives long enough to become electronic waste.

We don't need wireless fridges, or fridges with cameras in them – what, so we can watch our leftovers age in real time? They are even selling a "FridgeCam" in the UK, so you too can upgrade your fridge and have a photo of your fridge's contents sent to your smartphone every time the door is opened and closed. Their rationale is partly to reduce food wastage via the camera's "innovative food tracking capabilities". Yikes. Imagine a world where we can't even keep track of what is in our fridge. What's next? OvenCam? BBQCam? DrainCam? We are creating solutions to things that don't need solutions. Our lives are already easier than they have ever been; in fact, making things a bit harder might be good for humans. And don't think AI will solve all our woes… if anything it will make things much, much worse.

A port, a port, my kingdom for a port!

I do love Apple products… at least most of them. I have been waiting for the laptop updates for a while, mainly because I would like to get a MacBook Air to replace my MacBook Pro – and I don't want to compromise on the resolution of the screen. So the new ones came out this week, and they look great except for one thing – the ports. The one thing I liked about the old model was the availability of ports. Now, like all the other laptops Apple makes, there is only the ubiquitous USB-C. Nice. The problems with this are numerous, but they mostly have to do with usability.

The first problem is the loss of the MagSafe charging port. This was an excellent piece of innovation, because if you accidentally tripped on the cord there wouldn't be any issues – it just disengaged from the laptop. Replacing this with a "one-port-does-it-all" seems like a good idea, until you trip on the charging cord the first time and it rips out of the socket. The other big problem is that I need to plug things into my laptop. I need an HDMI port, a USB-A port, and a port for my SD cards. Now I know you can buy adapters from Apple, but let's face it, that's just a pain. It's extra gadgets I have to carry around with me, and hope I don't misplace. You can't misplace built-in ports. I wouldn't even mind if they created some multi-port dock or something… well, they do, but here's the kicker – their reviews on the Apple store bite.

  • The USB-C to USB adapter gets an average rating of 2.5/5, with 60% of reviewers giving it a rating of 1/5.
  • The multi-port adapter with USB, HDMI, and another USB-C port is even worse, rated 2/5, with 67% giving it a rating of 1/5. Somebody called it an "expensive piece of junk".

Some of the reviews naturally give a low rating because of the inconvenience factor – "as if purchasing the latest Macbook wasn't expensive enough, you now have no choice but to buy this as an added extra" – but many lean towards usability, with comments like "…has sacrificed practicality for superficial appearance". Do I want to buy something with such a low rating? No. But you are almost forced to buy it, because, well, otherwise what are you going to do?

So if I bought all the adapters I need, it would cost me $49 (SD card), and $85 for the multi-port adapter (because it seems there is no separate HDMI adapter). I could buy a Moshi display adapter, but it too gets crappy reviews. What is one to do? Thank heavens I don't need to hardwire my network connection. The irony is that I have a perfectly good SD-card reader with a Lightning connector, but guess what? No Lightning to USB-C adapters exist – and here I was thinking that Apple cared about the planet. Guess not. I may just have to buy a third-party HDMI adapter, or maybe a hub adapter from Amazon – could they be any worse?

Now I imagine they got rid of the ports to make life easier, reduce the complexity of the build, or possibly make more room for battery, but the net result is a loss of usability. Look, I like Apple, and chances are I will go out and buy a new MacBook Air, because the thought of going back to the alternative is *not* an option. I just frankly wish that Apple would make decisions that were better for its customers. For a company with supposedly 123,000 employees, we aren't exactly seeing much in the way of innovation.