One of the problems with computer science education is that it is stuck in the dark ages. Don’t get me wrong, there are many good initiatives to modernize the way curricula are taught. But one has to ask the question: in a field that moves so quickly, is teaching classes even relevant anymore? Look, I get the importance of learning the theoretical underpinnings of a subject, but that makes it more of a science, and computer science, even though it has the word science in its title, is more of a craft than a science.
So maybe it should be taught a different way. Think apprenticeship. I know it’s a radical idea, but if universities *really* want to be innovative in how they teach, they have to let go of what they know and move into the light.
The model I would propose is similar in concept to Forest Schools, found in kindergarten and early-elementary settings. There, children spend anywhere from a half day to a full day outdoors in local woodlands and green spaces: urban and near-urban parks, natural spaces adjacent to or on school grounds, or natural playgrounds and outdoor classrooms. The basis of these programs is emergent, experiential, inquiry-based, play-based learning. Why shouldn’t higher education be treated in a similar manner? Okay, I’m not suggesting playing in forests. But hey, why not, if it could be incorporated into some practical project.
And therein lies the whole concept: turning the current system away from lecture-based learning and towards informative, experiential learning. How can this be accomplished? First of all, ditch the current concept of classes. You learn programming by doing, not by sitting and learning about syntax. It does seem radical, but innovative instruction is about being radical.
Next, base the learning experience on spending the period from September to April working in small groups, creating solutions to real problems. Intersperse this with 3-5 day coding bootcamps to help solidify certain ideas or introduce new concepts. Maybe have people from industry give 1-2 day workshops. The projects would hopefully mimic what real-world projects go through to produce an outcome, and the outcomes would have to be real. If students had to learn certain things, they would just have to do so. To take the forest analogy, one project could be using inexpensive drones to measure forest canopies; that might stand out as a good upper-year project. So what about first year? Maybe first year could be about developing something small, like an app? Students would likely gain incredible skills in designing, implementing and marketing an app.
What are the benefits of this approach? Firstly, we could likely cut a degree back to three years. That’s right, three years. Why do we need 4-5 years for a degree? If students wanted to do co-op, they wouldn’t be restricted by “classes they have to take to fulfill the degree requirements”, so they could do it whenever they like. There would likely be issues with ancillary courses such as math or science, but maybe they would need to change their perspective too?
Okay, so some will argue that “practice” is what co-op is for, and I get that. But the reality is that *every* student should have the same experience. Worst-case scenario, opt for a hybrid model where the first two years of a degree are traditional and the last two follow the apprenticeship model (or 1.5/1.5). Will this ever happen? Unlikely. Few institutions have the guts required to take this big jump into the unknown. But you know what? It may just actually work.