Machines don’t think, they process

When I hear the term “AI” or “AI-assisted” software in a post somewhere, it makes me cringe. Recently it has been photography posts about recovering severely underexposed RAW images, or AI-powered upscaling, where you take a 12MP photo and upscale it to something larger. But there isn’t much AI involved here; it’s just an algorithm that may use lots of data to “learn” from in order to do whatever it is supposed to do. It’s similar to those algorithms that cut unwanted pieces out of a photograph, e.g. telephone poles or people, and seamlessly fill in the data so it looks like the obscuring item was never there. Making an image larger is of course challenging, because you have to create data. Now, there is a lot of data in a 12MP image, so an “intelligent” algorithm might be able to upscale it effectively. The same could not be said for upscaling a 640×480 image to 12MP… no amount of intelligence is going to make that happen. As I have said before, the foundation of much of the “AI” in the world is data – we have the ability to store vast repositories of data, and that data can be used to create algorithms that learn, in a fashion. But it is not organic learning in any sense of the word.
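To make that point concrete, here is a minimal sketch (in Python, using the Pillow library; the filenames are placeholders) of what plain, non-“AI” upscaling actually does – it interpolates between the pixels you already have, it doesn’t recover detail that was never captured:

```python
# A minimal sketch of plain upscaling with Pillow. The filenames are
# hypothetical, for illustration only.
from PIL import Image

img = Image.open("photo_640x480.jpg")   # hypothetical 640x480 source
print(img.size)                         # (640, 480) -> about 0.3 million pixels

# Resize to roughly 12MP (4000x3000), about a 39x increase in pixel count.
# Bicubic interpolation estimates every new pixel as a weighted average of
# its original neighbours - the "extra" data is invented, not recovered.
big = img.resize((4000, 3000), resample=Image.BICUBIC)
big.save("photo_12mp.jpg")
```

An “AI” upscaler replaces the weighted average with guesses learned from millions of other photographs, but the situation is the same: the new pixels are fabricated, not observed.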

I loathe statements like “AI will make the photos we take better”. What, so I can go on vacation to Norway and take a photo of a fjord on an overcast day, and the camera’s AI will pump out a beautiful sunlit photograph? Why? Look, the world is already full of doctored photographs; we don’t need any more. Photographs are what they are, and short of some basic improvements they shouldn’t be played around with. Doing so destroys their intrinsic value. Basic algorithms like image segmentation still can’t properly identify objects in the simplest of scenes… AI makes this somewhat better, but it still relies on lots of learning from existing data. Facial recognition, just to be clear, is *not* AI – it is an algorithm that finds faces, and it has been around for years. AI, it seems, is the latest shiny object people are chasing because it will “fundamentally change our lives”. But consider this: maybe our lives don’t need fundamental change (except for maybe caring more about the planet, and less about the next piece of transient technology).
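For the curious, here is a minimal sketch of that sort of face-finding algorithm – the classic Viola–Jones detector (2001), as shipped with OpenCV. It is hand-engineered pattern matching that long predates the current “AI” wave. The image filenames are placeholders:

```python
# A minimal sketch of classical face *detection* (Viola-Jones, 2001) using
# OpenCV's bundled Haar cascade. The filenames are hypothetical.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

img = cv2.imread("group_photo.jpg")              # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Slide windows of increasing size across the image and score each one
# against pre-trained edge/line features - pattern matching, not thought.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("group_photo_faces.jpg", img)
```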

Look, I think that from the standpoint of photography, people should take real photographs and spend less time worrying about post-processing them. AI is no substitute for experience and good-quality lenses. From other perspectives, algorithms are good at pattern matching, but that’s not really AI. We can’t even get translation to work effectively. There was a whole article earlier this year in The Atlantic, “The Shallowness of Google Translate”, about exactly this. There are times when translation just doesn’t work. Give it a fancy font and it can’t even detect the characters (I have another post on that). It is good for simple sentences, but it just can’t handle the hard stuff. Machines are devoid of many things, emotion being at the top of the list. But they are also devoid of the sense of thought. Machines don’t think, they process data. They don’t invent, they don’t think outside the box, they don’t paint. Humans do those things, so why try to create something that humans already do?

There is more to life than technology and algorithms.


Too much schooling? (or maybe just a different type)

It’s amazing that we as humans spend so much time in education. I honestly believe that spending as much time as we do in institutions of learning may somehow stifle human creativity. It wasn’t always this way, of course; people used to learn their trades more through apprenticeships. Consider the great British “engineer” Isambard Kingdom Brunel (1806–1859). He was responsible for many of the greatest engineering achievements of the 19th century, such as the Thames Tunnel. When he was 14, Brunel attended the College of Caen in Normandy, and later that same year (1820), the Lycée Henri-Quatre in Paris, famous for its mathematics. But Brunel never trained as an engineer, because there was no formal education in that field. He instead apprenticed with the clockmaker Abraham-Louis Breguet, who made chronometers, watches, and other scientific instruments. He returned to England in 1822, and the rest of his abilities he learned on the job. Brunel likely had a vast ability to think outside the box.

This brings us to the crux of the matter. We spend so much time worrying about education that we have lost sight of what drove many inventions in the past – innovation, building, and experimentation. Not solely education. James Watt, who developed the Watt steam engine in 1776, trained as an instrument maker (for a year), then set up his own business. Now he is credited with founding the field of mechanical engineering, but honestly he was an inventor. There are many individuals who built things who weren’t mechanical engineers. My grandfather was a painter, back in the days when they mixed paints themselves. He also renovated houses, and even lifted them to add a second storey beneath. He wasn’t a structural engineer, but he was able to solve problems. Education, while important, has to be tempered with real-world experiences.

Today, engineers and computer scientists don’t really build things in school. Years ago they had some concept of building things, either from some sort of shop class (wood, metal, or technical drawing), a hobby that involved building things, or a childhood spent exploring with Lego and/or Meccano. People think a mechanical engineering degree will teach you all you need to know to build anything – it likely won’t. Others believe that a computer science degree will make you a software guru – it won’t. Few people design and implement large-scale software projects in university (co-op students do work on projects, and often learn a substantial amount). Both computer scientists and engineers end up with strong analytical skills and the ability to focus well on closed-ended problems, but few skills for thinking outside the box. A common mantra in many “making” fields is that in school you learn about 10% of what you need to know – real-world work will teach you the rest. That gap would shrink if we reduced the amount of archaic traditional teaching we do and concentrated more on an experiential model. Why do we need 4–5 years of classes to get a degree? Would one year in class be enough, followed by two years of real experience interspersed with seminars on relevant topics?

Food for thought.