Opinion

Should I learn to code? Not necessarily, but thinking like a programmer can help you raise your game

Are you a professional coder? If you heeded President Obama’s 2013 call for as many people as possible to learn coding and hacking, you might be.

The president said, “Learning these skills isn’t just important for your future, it’s important for our country’s future,” and he wasn’t wrong. But I took a different perspective on his message. What if, instead of literally learning how to code, you learnt how to think like a programmer? Would that make you better at your job? I think so, and here’s why.

The swing in attitude towards computer programmers was both peculiar and refreshing. People who had the typical programmer pinned as a basement-dwelling, pizza-eating, live-at-home-in-your-forties, neckbearded, CRT-monitor-wielding sociopath decided that programming was, well, actually sort of cool - and that they were going to do a bit of basement dwelling themselves.

Don’t get me wrong - I thought, and still think, it was great. The massive rise of services like Codecademy and Treehouse is testament to how popular the idea has become. I’ve tried them both, and they’re great resources - slick platforms that are welcoming to people ready to start building a technical foundation.

But I also think there are ways the whole mania could have been better articulated. Primarily, there shouldn’t have been such an emphasis on learning exactly how to code, but rather on the way you need to think in order to code well - in a nutshell, learning how to think like a programmer. I say this because, for the many who may go on to become programmers, there are many more who won’t. For them, rather than writing programming off as a missed opportunity - as something they ‘can’t do’ - it would be great to see people taking its good practices and applying them in their own professional spaces. There really are more parallels than you might think, and a few of them are below.

Problem definition: What problem am I trying to solve?

The first thing programmers need to do, before anything else, is define the problem they need to solve. Defining the problem precisely is key to making sure the code is elegant, scalable and does what it’s supposed to do. On commercial projects, it also stops ‘scope creep’ and ‘bloat’. Understanding the problem shapes the way you decide to solve it - and sometimes, you don’t need to write any code at all.

Problem definition is a very useful skill both in and out of programming. Working out where a problem begins and ends (and its attendant caveats) applies to practically every industry, and defining your problems before tackling them improves the quality of your decision-making enormously.

Critical and creative thinking: Is there a better way to do this?

There is more than one way to solve most programming tasks - some better than others, and sometimes one better than all the rest. In some instances, how you solve the problem is just as important as solving it at all. Working within limited bounds often means that your first few ideas just don’t work, so thinking creatively is often part of getting the job done.
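
As a quick, made-up illustration in Python, here are two perfectly valid ways to count how often each word appears in a sentence - neither is wrong, but the second leans on the standard library and is harder to get wrong:

```python
from collections import Counter

sentence = "the quick brown fox jumps over the lazy dog the end"

# Approach 1: build the counts by hand with a loop.
counts_by_hand = {}
for word in sentence.split():
    counts_by_hand[word] = counts_by_hand.get(word, 0) + 1

# Approach 2: let collections.Counter do the bookkeeping.
counts_with_counter = Counter(sentence.split())

print(counts_by_hand["the"])       # 3
print(counts_with_counter["the"])  # 3
```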

Programming is one of the rare pursuits that is usually equal parts creativity and logic, and being able to step outside the problem itself is a practical, transferable skill. Taking a step back, or a 15-minute break, and then re-evaluating can point out really obvious ways to improve your work - whether you’re a coder or not.

Make no assumptions: What is definitely happening here?

Until you’ve actually told them what to do, computers are dumb. Really, really dumb. They’ll try to do exactly what you tell them to do - regardless of whether or not they can do it - and they don’t handle failure very well. That’s because they know nothing about what you’re trying to do until you tell them explicitly. Computers become useful when they receive some instructions, and ‘smart’ when they receive unambiguous instructions from smart humans. In fact, I’d argue that’s the best and worst part of working in technology.

This level of instruction is rare in ‘real life’ and can be the undoing of new programmers. They don’t realise that they often have to catch errors before they happen, and prepare defensively for failure. Assuming, for example, that a variable is an integer when it’s actually a string, or that it exists at all when it doesn’t, will usually result in some sort of issue. (Here, I’ve assumed you know what a variable, an integer and a string are - but if you don’t, there are many great resources available for learning.)
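
To make that concrete, here’s a minimal defensive sketch in Python (the names ‘payload’ and ‘quantity’ are made up for illustration) that checks a value exists and is the type you expect before using it:

```python
# A made-up payload where the quantity arrived as a string, not an integer.
payload = {"quantity": "3"}

# Check the value exists, rather than assuming it does.
raw_quantity = payload.get("quantity")
if raw_quantity is None:
    raise ValueError("expected a 'quantity' field, but none was provided")

# Convert (or reject) the type, rather than assuming it's already an int.
try:
    quantity = int(raw_quantity)
except (TypeError, ValueError):
    raise ValueError(f"expected an integer quantity, got {raw_quantity!r}")

print(quantity * 2)  # 6 - safe to do arithmetic now
```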

Learning that assumptions are bad is great, and you really can apply it anywhere. Everybody is guilty of making mistakes based on incorrect assumptions. The less we do it, the better for all of us.

Attention to detail: Will it compile?

Deciding to become a professional programmer is one of the easiest ways to learn that a misplaced comma is quite capable of destroying a human being’s happiness for an entire day. Seriously though - attention to detail is critical when writing code. Computers understand a pre-defined list of words, applied within a rigorous syntactic structure, and they enforce it ruthlessly.
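
Here’s a small, made-up Python example of the kind of damage one missing comma can do - it runs without complaint, it just quietly produces the wrong data:

```python
# Adjacent string literals in Python are silently joined together,
# so forgetting a comma changes the data rather than raising an error.
attendees = ["Alice", "Bob" "Carol", "Dan"]   # missing comma after "Bob"
print(attendees)  # ['Alice', 'BobCarol', 'Dan'] - three names, not four

attendees = ["Alice", "Bob", "Carol", "Dan"]  # what was actually intended
print(attendees)  # ['Alice', 'Bob', 'Carol', 'Dan']
```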

Attention to detail is an obvious prerequisite for doing better at life. Even basic things - double-, triple- and quadruple-checking that email before you send it to 500 people, making sure you’ve spelt the CEO’s name right, making sure you know where your meeting room is before the day of the meeting - are important. Make things easier for yourself by learning that the minutiae count. (Without being pedantic - detail without function or reason is very annoying.)

Constructive teamwork: How can I drive respect, communication and education?

It’s likely that as a programmer, you’ll work on projects with a team. Thousands and thousands of lines have been written on how to structure software engineering teams and I won’t broach the topic here - but teamwork and communication are vital in technical disciplines.

Programmers come in various calibres. Working in a team of mixed abilities and strengths is a good way to learn about diplomacy, workload distribution, communication and sometimes just straight-up being friendly. ‘This code sucks, I need to rewrite it’ is far less productive than ‘I’ve seen a few ways we could tidy this up - let’s take 20 minutes to look at it.’ A team that can communicate and learn from each other (and where there is no such thing as a ‘stupid’ question) is far happier and more productive than a team of demotivated, bumbling juniors led by someone who thinks that being a better programmer is the same thing as being a better human.

This translates directly to working well within a team in any industry. Putting productivity, respect and education towards the top of your priorities goes a long way to improving your output and your environment - and, if you’re a business, to making your operation more profitable.

Round-up

A lot of this probably sounds obvious, and it is - but I think these are a few simple ways to demonstrate that there is more to writing code than knowing every Python library that’s ever existed. Starting to think like a programmer is a great step towards becoming more logical, efficient and analytical, regardless of the industry you work in - and it will improve the quality of your decision-making in the long run.