Plateauing technology and boring computing

This year marks 37 years since I took my first computer science class. It was in Pascal, and I really had no clue what I was doing. But computing was interesting in the 1980s. Home computers were becoming a thing, Macs were making inroads everywhere, and you could play around with PCs, even if they likely weren’t as interesting as Macs. Everything had its pros and cons. Unix provided an excellent platform to learn programming, much better than Macs, which were awful to code for, and PCs, which just weren’t that interesting language-wise (unless you liked BASIC). But there were lots of things to learn, and the focus wasn’t always on “the language”. On Unix, you could teach yourself shell programming and do some amazing things with the system. You could even bump the system load to the point where professors playing Larn got thrown out of the game (they should have been teaching in class, so it was their own fault). To put this into context, the CS department ran a Pyramid 90x supermini computer, which had a 32-bit RISC processor running at 8 MHz, probably somewhere around 4-16 MB of memory, and sold for around $150K.

But recently I think things have stagnated. As I have mentioned previously, languages aren’t exactly exciting these days, primarily because they haven’t evolved in any perceivable way for the past 25 years. Most technology has plateaued. Sure, things get slightly faster, a bit more memory, and maybe a better camera, but most people couldn’t tell the difference. Machines seem to function much as they did 20 years ago. Sure, there is AI, but it’s no panacea, i.e. it’s not a universal problem solver, and it is a completely different way of doing things. Most programming and technology evolution is still done by humans. Since 2000, almost everything that has been “invented” is, on the whole, a cleverly repackaged and re-engineered prior invention. Nothing new here… keep moving.

I mean, it’s not surprising; there just isn’t the same level of CS creativity in universities as there was in the 1960s-1980s when it comes to language design and computing in general. Those evolutionary decades in computer science brought endless waves of new technology and ideas. Where are the new data structures? Where are the new sorting techniques? Right, there aren’t any, or perhaps there are just new derivatives of old ones. And are faster algorithms any better? Maybe for uber-large datasets, but not for the average person. Few do these comparisons anymore, because although academia spurred the growth of computer science, it has done little in recent decades. Ideas and advancements now come mostly from corporate labs. Nobody funds long-term research into potential ideas in computer science anymore. For example, why hasn’t there ever been a long-term study into the usability of programming languages? Because no one would fund it.
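If you want to test that claim about algorithms for yourself, here is a minimal sketch (my own illustration, not anything rigorous, and in Python rather than anything from the post) that times the language’s built-in sort against a simple quadratic insertion sort on a small, arbitrary dataset. The list size and repetition count are assumptions chosen purely for illustration; the point is that at everyday sizes both finish in a blink, so the asymptotic difference is invisible to the average person.

    # Illustrative sketch only: compare a built-in sort with a textbook
    # insertion sort on a small dataset. Sizes below are arbitrary assumptions.
    import random
    import timeit

    def insertion_sort(items):
        # Classic O(n^2) insertion sort, written plainly.
        result = list(items)
        for i in range(1, len(result)):
            key = result[i]
            j = i - 1
            while j >= 0 and result[j] > key:
                result[j + 1] = result[j]
                j -= 1
            result[j + 1] = key
        return result

    # An "everyday-sized" list: 200 random integers.
    data = [random.randint(0, 1000) for _ in range(200)]

    t_builtin = timeit.timeit(lambda: sorted(data), number=200)
    t_insert = timeit.timeit(lambda: insertion_sort(data), number=200)
    print(f"built-in sort: {t_builtin:.4f}s total, insertion sort: {t_insert:.4f}s total")

Run it and both totals come out as fractions of a second; only when the dataset grows by orders of magnitude does the choice of algorithm start to matter to a human sitting at the keyboard.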

In 2007, the iPhone was a good invention, and we haven’t seen anything comparable since. Apple Vision Pro? Yeah, some people (those into spatial computing) might love it, but it will likely never be ubiquitous tech, not until it evolves past what it is now. Still, all tech has to start somewhere. But it’s not exactly going to change lives. The thing is that it’s hard to come up with good tech ideas these days that are pivotal, innovative, game-changing. I think this is possibly because we lack the innovative thinkers of previous generations. They were often multidisciplinary individuals who could think outside the box. They produced ideas from nothing. Why? Possibly because they had creative childhoods, building things out of Meccano, or playing with home chemistry sets, or even just building forts in the woods.

Of course, maybe we have invented all the technology we’ll ever need? Computing is not the sticking point for future ventures into space, medical enhancements, or filtering CO2 from the air; the bottlenecks there are more likely materials science and sustainable energy. Some computing doesn’t need to go any further, e.g. word processors, or even operating systems for that matter. It’s also possible we are in some sort of lag, the same technological block that abandoned the moon 50 years ago and never evolved the technology of the Space Shuttle. What computing needs is new ideas and innovation, not the same old things rehashed 100 times.
