Question

I often hear that a real programmer can easily learn any language within a week. Languages are just tools for getting things done, I'm told. Programming is the ultimate skill that must be learned and mastered.

How can I make sure that I'm actually learning how to program rather than simply learning the details of a language? And how can I develop programming skills that can be applied towards all languages instead of just one?


Solution

Don't worry about meeting some ridiculous standard of "skill" so commonly heard in statements like:

  • All programming languages are basically the same.
  • Once you pick up one language well you can pick up any other language quickly and easily.
  • Languages are just tools, there's some overarching brain-magic that actually makes the software.

These statements are all based on a flawed premise and betray a lack of experience across a broader spectrum of programming languages. They are very common statements and strongly believed by a great swath of programmers, I won't dispute that, but I will dispute their accuracy.

This is easily demonstrated: spend one week (or really any amount of time greater than a couple of days) trying to learn the fundamentals of Haskell, Prolog, or Agda. Soon after, you will start hearing the old Sesame Street song play in your head: "One of these things is not like the others...".

As it turns out, there is a whole swath of programming languages, techniques, and approaches that are foreign to what 95% of us do or have ever done. Many of us are completely unaware that these other concepts even exist, which is fine; these concepts aren't necessary to be an employed, or even effective, programmer.

But the fact remains: These techniques and approaches do exist, they are good for many different things and can be very useful, but they are not just like what you're used to and people cannot simply pick them up with an afternoon of fiddling.

Furthermore, I would say that in the majority of cases where people claim they have learned, or can learn, something as complex as a programming language in as little as a week, they are suffering from a bit of the Dunning–Kruger effect. From Wikipedia (emphasis mine):

The Dunning–Kruger effect is a cognitive bias in which unskilled individuals suffer from illusory superiority, mistakenly rating their ability much higher than average. This bias is attributed to a metacognitive inability of the unskilled to recognize their mistakes.

I would refer people to a more experienced perspective on the concept of learning to program, Peter Norvig's Teach Yourself Programming in Ten Years:

Researchers (Bloom (1985), Bryan & Harter (1899), Hayes (1989), Simon & Chase (1973)) have shown it takes about ten years to develop expertise in any of a wide variety of areas, including chess playing, music composition, telegraph operation, painting, piano playing, swimming, tennis, and research in neuropsychology and topology. The key is deliberative practice: not just doing it again and again, but challenging yourself with a task that is just beyond your current ability, trying it, analyzing your performance while and after doing it, and correcting any mistakes. Then repeat. And repeat again.


Surely, there is a set of overarching principles that will make all languages easy to learn!

Perhaps, but I would argue this set of principles is so large that there will almost always be languages outside of your one-week reach. As you add new concepts to the list you're familiar and comfortable with, this list of languages outside your immediate reach may shrink, but I have a hard time believing it will ever go away. The list of conceptual computing approaches to things is so broad it's baffling, from concatenative languages to vector based languages to languages specializing in AI or metaprogramming (or languages which exist entirely to support regular expressions).

After ten years you will be able to generally program. This means you can write somewhat decent code in some language or style of languages. So after 10 years you are ready to start tackling these countless broad cross-cutting concepts for the rest of your life, and short of being Edsger W. Dijkstra, Donald Knuth or John D. Carmack, you're not going to get to all of them.

OTHER TIPS

... how can I develop programming skills that can be applied towards all languages instead of just one?

The key to this question is to transcend the language: think not in the language you are coding in, but above it.

WAT?

Experienced polyglot programmers think in the abstract syntax tree (AST) of their own mental model of the language. One doesn't think "I need a for loop here", but rather "I need to loop over something", and translates that into the appropriate for, while, iterator, or recursion for that language.
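As a rough illustration (the function names here are invented for the example), a single abstract intent such as "combine the items of a sequence into a total" can be spelled several ways even within Python alone; the polyglot holds the intent and picks whichever spelling the language, or codebase, favors:

```python
# One abstract intent -- "combine the items of a sequence into a total" --
# rendered four equivalent ways. The choice between them is a translation detail.

def sum_for(xs):
    total = 0
    for x in xs:               # explicit for loop
        total += x
    return total

def sum_while(xs):
    total, i = 0, 0
    while i < len(xs):         # index-driven while loop
        total += xs[i]
        i += 1
    return total

def sum_recursive(xs):
    if not xs:                 # recursion, as a functional language might phrase it
        return 0
    return xs[0] + sum_recursive(xs[1:])

def sum_idiomatic(xs):
    return sum(xs)             # the built-in, idiomatic Python spelling

data = [1, 2, 3, 4]
assert sum_for(data) == sum_while(data) == sum_recursive(data) == sum_idiomatic(data) == 10
```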

This is similar to what one sees in learning spoken languages. People who speak many languages fluently think in terms of meaning, and it comes out in whichever language is called for.

One can see some clue of this AST in the pair of eye-tracking videos Code Comprehension with Eye Tracking and Eye-Tracking Code Experiment (Novice), in which the eye movements of a beginner and an experienced programmer are tracked. One can see the experienced programmer 'compile' the code into their mental model and 'run' it in their head, while the beginner has to iterate over the code keyword by keyword.

Thus, the key to developing programming skills that apply to all languages is to learn multiple languages, so that one can distance oneself from the mental model of any single language and develop the ability to build the AST for a problem in one's head, which is then translated into a given language.

Once one has this ability to work with the AST in the head, learning another language within a similar school of thought (going to Befunge is a bit of a jump from Java, but not as much from Forth) becomes much easier: it's 'just' translating the AST to a new language, and that translation gets easier the third, fourth, and fifth (etc.) time it's done.


There is a classic article, Real Programmers Don't Use Pascal. Part of this reads:

... the determined Real Programmer can write Fortran programs in any language

There are also bits for which you can't just use the mental AST - you need to think in the language too. This takes a bit of time to accomplish (I'm still accused of writing Perl code in Python and my first Lisp code was reviewed saying "This is a very good C program.").

To this, I must point out an article published by the ACM, How Not to Write Fortran in Any Language. The third paragraph of the article (not counting the leading quotes) directly addresses the question at hand:

There are characteristics of good coding that transcend all general-purpose programming languages. You can implement good design and transparent style in almost any code, if you apply yourself to it. Just because a programming language allows you to write bad code doesn’t mean that you have to do it. And a programming language that has been engineered to promote good style and design can still be used to write terrible code if the coder is sufficiently creative. You can drown in a bathtub with an inch of water in it, and you can easily write a completely unreadable and unmaintainable program in a language with no gotos or line numbers, with exception handling and generic types and garbage collection. Whether you're writing Fortran or Java, C++ or Smalltalk, you can (and should) choose to write good code instead of bad code.

It isn't just enough to have the AST - it's necessary to have the AST that one can translate into other languages. Having a Fortran AST in your head and writing Fortran code in Java isn't a good thing. One must also be familiar enough with the language and its idioms to be able to think in the language (despite what I said at the very top).

I've seen Java code written by someone who hadn't stopped writing C code. There was one object with a main method. In this object were a bunch of static methods called by main, and private inner classes that had public fields (and thus looked a lot like structs). It was C code written in Java. All that had been done was translating the syntax of one language into another.
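A hypothetical Python analogue of the same mistake (the names and data are invented for illustration): the first version transliterates C habits into Python, the second thinks in the target language.

```python
from dataclasses import dataclass

# "C written in Python": parallel lists and manual index juggling,
# ignoring the language's own data structures.
def total_age_c_style(names, ages):
    total = 0
    i = 0
    while i < len(ages):
        total = total + ages[i]
        i = i + 1
    return total

# The same task thought in Python: a record type and a generator expression.
@dataclass
class Person:
    name: str
    age: int

def total_age(people):
    return sum(p.age for p in people)

people = [Person("Ada", 36), Person("Alan", 41)]
assert total_age_c_style(["Ada", "Alan"], [36, 41]) == total_age(people) == 77
```

Both versions compute the same total; only the second uses the language the way it was meant to be used.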

To get past this point, one needs to keep writing code in multiple languages: not thinking in those languages when designing the code, but thinking in them when translating the design into code, so that the language's idioms are used correctly.

The only way to get there - to be able to develop programming skills that can be applied to all languages - is to keep learning languages and keep that mental programming language flexible rather than tied to one language.

(My apologies to ChaosPandion for borrowing heavily from the idea he presented.)

Pick a language, and start coding. Python is a good choice for a beginner, and there are tutorials available online, so that you can learn how to do it properly.

Everything follows from that. Your interests will lead you to frameworks and design concepts that will add sophistication to your programs. You will discover that there are online courses you can take that will ground you in the fundamentals and the theory, and that there are different programming paradigms you can explore, and so on.

And yes, you will discover languages like Haskell that will teach you something new, once you have a firm grounding in the fundamentals.

Some programmers probably think all languages are the same because they haven't been exposed to any that make them think differently. All of the most commonly used languages are derived from Algol (they are essentially procedural languages), and of those, most are curly-brace languages similar to C. All of them do essentially the same things, albeit some with more sophistication than others.

Programming is about solving problems in such a way that the solution can be expressed in a grammar restricted enough to be implemented in a programming language. The art of programming is therefore the art of solving problems.

Certain languages invite particular programming paradigms, such as object orientation, event-driven design, multi-threading, and MVC-framework-based architecture. These are all just models and patterns, and really have nothing to do with implementation.

If you can sit down and solve a problem on paper in such a way that it can easily be translated into code and matched to an appropriate model for your platform, then you are a programmer. Whether you can then take those solutions and implement them in your chosen language is another matter.

I have been programming for 30 years (OMFG!) and still use php.net to look up commands in PHP because it's not my first language.

I would say that expertise in languages is inversely proportional to how often you look at the manual or stackoverflow. Expertise in programming is how readily you solve problems in a way which is compatible with computer programming languages.

In related news, I learnt Ruby last week. Though I'm no "expert", I can solve a problem that I could write in Perl, say, and then spend an age translating it to Ruby whilst learning it some more.

I think, as with anything, practice makes perfect. Just don't pigeonhole yourself into always doing the same thing or always using the same language and keep continuing to learn things on every project.

I think you can easily draw a parallel to something like learning to play the guitar. Any good musician can learn to play a new song in a very short period of time, because they already know all the chords and all the theory behind why the chords are played the way they are. How do they get that good? They have simply played so many songs that all the patterns have blended together, while at the same time supplementing their knowledge with the documented theory behind those patterns.

So maybe you can play a few songs very well, but you can't deviate or pick up new songs quickly. This is probably the equivalent of a .NET programmer who keeps making the same CRUD application over and over. At some point, try something new: add in some web service calls or an advanced UI, or write it in a whole new language. When you hit a snag, look into why things happen the way they do, ask questions on Stack Exchange, etc. Eventually you will see all the patterns that continually come up and know some of the underlying theory, and learning a new language won't seem nearly as daunting.

I'm not going to address how long it takes to learn a language, or what it means to learn a language; instead I'm going to address your actual problem: how to determine whether you have learned to program or have merely learned a programming language.

You've learned to program if you have learned to break a problem down into discrete processes and then use those processes to solve it. You've learned a programming language if you've learned the syntax of a language and know how to adjust how a process works when implemented in that language.

This is not to say you should program in Fortran when using Lisp, or add up the values of a column in a database table using a cursor. Just that the language is an implementation detail -- one that can change what processes are needed, but not the need for identifying and creating processes. In the end there is a real-world implementation, with input/output and desired results.
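To make the cursor remark concrete, here is a small sketch using Python's standard sqlite3 module (the table and values are invented for the example). The row-by-row cursor loop and the set-based aggregate produce the same answer, but they reflect different mental models of the problem:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (amount INTEGER);
    INSERT INTO orders VALUES (10), (20), (30);
""")

# Process-per-row thinking: iterate a cursor and accumulate by hand.
total_loop = 0
for (amount,) in conn.execute("SELECT amount FROM orders"):
    total_loop += amount

# Set-based thinking: let the database compute the aggregate.
(total_set,) = conn.execute("SELECT SUM(amount) FROM orders").fetchone()

assert total_loop == total_set == 60
```

The second form states *what* is wanted rather than *how* to step through the rows, which is the idiom a relational database expects.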

My strategy has always been to focus on pure skills rather than specific skills.

Instead of learning Python's (or any language's) special syntax for whatever it is you want to do, spend your brain cycles solving abstract problems, like how best to solve every problem in a given category.

That way, you will know what to do no matter the language, and will mostly possess timeless skills that can be used for programming in any language.

Specifically avoid tools that are full of gotchas, like MySQL, or opinionated languages, like Java, as whatever you learn by using these tools will have a big proportion of tool-specific knowledge which is bound to become useless pretty fast.

Contrary to what has been said in many answers, do NOT listen to other programmers. You are a noob, and there is no way you can tell the fakes from the real deal, so you're better off taking everything with a grain of salt.

You want to be questioning all the time and accepting only when the solution is fast, elegant and reliable.

There's the theoretical approach: learning how computers actually work under the covers -- how basic processor instructions are strung together to make the more complex operations and structures we take for granted in high-level programming land.

Then there's the more practical programming approach. The main sticking point that plagues people usually labeled "not good programmers" is that they only really know one language. And even if they know others, they program in them the same way they do in their native language. That's a cycle one must break to really learn how to program. The default answer is to learn at least one language from each programming paradigm: an OOP language, a functional language, a scripting language, etc. And by learning I don't mean learning the syntax. You learn a language by actually using it to create something.

Personally, when I want to learn a new language, I use Project Euler puzzles. I go to a puzzle I have already solved in an OOP language (for example) and try to solve it in a functional one while following the best practices of the new language. When you solve the same problem using two fundamentally different approaches, you not only see what the real differences are; you also see where the common areas lie. The common areas shared by all languages are the real programming; the differences are just different ways to achieve it.
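For instance, Project Euler's first problem (sum the multiples of 3 or 5 below 1000) can be solved imperatively and functionally even within a single language; a rough sketch in Python:

```python
# Project Euler problem 1: sum the multiples of 3 or 5 below 1000.

# Imperative approach: mutate an accumulator inside a loop.
def euler1_imperative(limit):
    total = 0
    for n in range(limit):
        if n % 3 == 0 or n % 5 == 0:
            total += n
    return total

# Functional approach: describe the set of values, then fold it with sum().
def euler1_functional(limit):
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

assert euler1_imperative(1000) == euler1_functional(1000) == 233168
```

The two bodies differ in style (explicit mutation versus a declarative expression), yet the underlying decomposition of the problem is identical -- which is exactly the common area the exercise is meant to reveal.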

Well, most of the things I wanted to say have already been said. What I would like to add is a very simple analogy.

If programming languages are considered mere tools, even then there is absolutely no logic in assuming that being good with one makes being good with another a cakewalk.

Just imagine a band of reputed master swordsmen who suddenly put down their swords and went off to battle with spears after seven days of training. What would happen? They would be massacred.

Languages are often not difficult to learn, but it takes patience and practice to become good at them. Additionally, there is no single right way to learn programming.

Learning programming is like playing an RPG. Sometimes you use swords, sometimes spears, sometimes a shield. For each enemy you kill, you get experience points. Once you have enough experience points, you level up. Now, mastering the sword will not make you excellent with bow and arrow, but a portion of the experience you gained previously will increase your stamina and speed.

Here's a couple of things you might want to do when learning a language.

  • Read about the language. If it sounds interesting, try out the hello-world app(s) yourself.
  • Read some tutorials, tricks, blogs.
  • Make simple apps in it just for fun.
  • Test different features.
  • If you really like it, buy some books and/or video tutorials.
  • Search for good libraries.
  • Search for answers, ask only if you can't find the answers.
  • Help others asking for answers (where better than here?)
  • Make something useful. Making a calculator app may be a good exercise, but if you make a TO-DO list app and actually use it on your PC/phone, the feeling is 100 times more satisfying.

Experience new languages, explore new libraries, learn new tricks on your free time. Before you know it you'll surprise yourself with your own skill.

In my case, I learn how to actually program through the following:

  1. Learn from the masters. Listen to programming podcasts, read professional blogs on your programming topic of choice, read or watch the wonderful tutorials by gurus that are scattered all over the web, and lastly, read epic books like The Pragmatic Programmer. That book contains a lot of programming gems accumulated over the authors' careers. One sure-fire way to learn how to actually code is to know how other successful programmers do it.
  2. Experience by doing. Reading about it and knowing is one thing, actually putting it into practice and getting it to work is another. There is no better teacher than experience, so put your coding cap on and get started.
  3. Ask someone who knows. Just like you're doing now, don't be afraid to ask about best practices or better ways to do things from seniors in your team, or if you're unfortunate enough to not have access to the said seniors or mentors or gurus, then there's still the rest of stackexchange and the internet to ask.

Also, as your commenters have mentioned, don't forget to master your tools as well. Learning all the best practices and greatest theories are all for naught or will be poorly implemented if you don't know enough about your tool, in this case, a programming language.

I think, if you can think analytically, you have a good start.

Learn any language you want and work through a series of examples, e.g. as presented in nearly every book that teaches programming.

Next, try to solve your own problems. Try to find different solutions and compare them. Speed and memory usage are common factors that matter. Discuss your solutions with other programmers.

Read code of other programmers and try to understand why they solved the problem this way.

You should also read some books about algorithms to get an overview over standard approaches. New problems are often modifications of old problems.

A lot of practice and working with code also in teams will help you to increase your skills step by step.

I hope my opinion answers your question, at least partially.

Licensed under: CC-BY-SA with attribution