I see a lot of people on sites like Quora asking about becoming a programmer. The reality, of course, is that programming is not easy, nor is it for everyone. Here are some common myths dispelled.
Myth 1: “I need to be super good at math.”
No, not really. Being great at calculus might help in certain applications where you need to solve equations to derive an algorithm and implement a solution, but most universities put too much emphasis on esoteric math skills. Mathematical knowledge is useful in areas like image processing, but too often it is emphasized at the expense of things that matter more, like problem solving.
Myth 2: “I need a degree in computer science.”
Sure, it helps, in the right context. Some of the most interesting people in computing never went to university, yet they achieved incredible things. Suppose you have two people: one went to university and got an A average; the other taught themselves programming and created an incredibly successful app that sold millions of copies. Who would I hire? The latter. Why? Because they have already proven themselves without any academic hubris, they have a portfolio of real experience, and they are self-motivated. There are many stories like this. Conversely, there are people who barely pass their courses and still get a degree. Do I want someone with a 55% average programming software for a nuclear reactor? Hardly. Oh, and remember: most of the people teaching computer science in institutions of higher learning don’t actually design software for a living.
Myth 3: “I need to be super brilliant.”
Define brilliance. The ability to get straight A’s in university? Hardly. You need to be a hard worker, and more important than being brilliant, you need a good sense of exploration and a willingness to think outside the box. Clever algorithms come from people who can think beyond current knowledge, into the great beyond. Brilliance comes in many forms, not just academic grades.
Myth 4: “I need to learn the best language.”
Define best. There is no best language, despite what anyone says. Every language has inherent strengths and weaknesses and is geared towards slightly different things. In reality, to become a good programmer you will need to learn many different languages, and how they interact. Never have the attitude that “C is best” or “I only code in Java”. Boring… everyone learns those languages. If you want to stand out, learn the languages that others don’t, like Fortran and Ada.
Myth 5: “I’m done learning.”
Many people seem to believe that once they have a degree they are done learning. Wrong. Computing, like many disciplines, continually evolves. You will need to learn new things all the time, and may even need to unlearn some of what you learned in university. University often doesn’t relate completely to the real world. Case in point: many years ago academia stopped teaching languages like Cobol because they thought it wasn’t relevant… news flash… it’s as relevant today as it was in 1970.
Myth 6: “Once I have mastered the syntax of a language, I can do anything.”
Mastering syntax is one thing; being able to actually implement an algorithm is another altogether. There are often many ways to implement an algorithm, and some are more efficient than others. You have to have an innate understanding of how a language can be used to implement an algorithm. In some cases the language may not even be the best choice for the algorithm. For example, you can master Java syntax, but it would not be the best language for implementing a real-time control system for an autonomous train.
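One way to see the difference between knowing syntax and understanding implementation: the same algorithm can be written in ways that differ enormously in efficiency. A minimal sketch in Python (the function names are my own, chosen for illustration), using the Fibonacci numbers as the textbook example:

```python
# Two implementations of the same computation: the n-th Fibonacci number.
# Both use trivially simple syntax; the difference is algorithmic.

def fib_naive(n: int) -> int:
    # Recomputes the same subproblems over and over: exponential time in n.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_iterative(n: int) -> int:
    # Keeps only the last two values: linear time, constant space.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Identical results; wildly different cost as n grows.
print(fib_naive(10), fib_iterative(10))  # prints: 55 55
```

Both functions are syntactically correct, yet the naive one becomes unusable for even moderately large n; that judgment is what understanding an implementation means, and no amount of syntax mastery provides it.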
Myth 7: “I’m good at gaming, so I’ll be a great coder.”
Likely not. Gaming and actually designing and implementing software are worlds apart. If you don’t have any interests outside of the computer, I would imagine you aren’t really able to think outside the box… and I don’t want to hear any malarkey about having great hand-eye coordination and multitasking skills… it’s hyped-up baloney.
Myth 8: “I can master language X in a few weeks.”
🤣 Nope. Nada. Not likely. You may get the hang of the syntax, but master it? That’s like saying you could become a Jedi in a few weeks.
Myth 9: “I learned HTML, and it was easy.”
Congratulations, but HTML is a markup language, not a programming language. Describing the layout of a page is not the same as writing the logic of a program, and it tells you nothing about how easy (or hard) programming will be.
Myth 10: “I’m a woman, programming isn’t for me.”
Why not? Just because there are so many guys in computing? Ignore that; follow your interests. Women were as much at the forefront of computing in its formative years as men (it’s just often conveniently forgotten). In fact, some of the best programmers in my classes are women.
Myth 11: “Programmers sit in front of a machine all day.”
Programming isn’t all about machines, and it isn’t all about coding. It is just as much about coming up with designs and new algorithms as it is about implementing them. Besides, these days you can work from just about anywhere. Some people find inspiration sitting in a cabin in Iceland, or on a beach. It’s what you make of it.
Myth 12: “The more tools I use the better programmer I am.”
No. Tools are fine, but sometimes the more tools you use, the less you understand about what is actually happening. A good example is programmers who eschew learning low-level things like the command line, opting only for integrated development environments. They never see how things work at the lowest level, and so have less of an understanding of what is going on overall.
Myth 13: “I’m a cool programmer because I code everything on the fly.”
No, you’re not. All it proves is that you likely never followed instructions. You probably indent with two spaces or, worse, use tabs. Coding on the fly is okay for trying out small things, experimenting and the like, but it’s not good for large-scale projects because it’s easy to miss things. People usually code on the fly because they think it’s cool. Big mistake. I see it when people try to translate code and then wonder why they get in a mess… it’s usually because they have no clue what they are doing.