Experience counts in coding

20170812 Coding

Life is change. I’m not sure if that’s a quote or something I just made up. However, over the past few decades, the pace of technological change has seemingly accelerated. Does that mean that if you are a coder, for example, your experience can quickly become outdated, or does experience still count? Just hold that thought whilst I digress to what will appear to be a totally unrelated subject (but isn’t).


I recently went to a literary festival hosted at the British Library. I admit this is hardly everyone’s cup of tea, listening to authors discussing books, but somehow it’s the type of thing I seem to enjoy. It proved to be a great opportunity to stock up on even more books, which I’ll still struggle to get through. In between the various talks, I had the chance to visit the rest of the British Library. One of its star attractions is the Treasures room. It holds no gold or silver, diamonds or pearls. Instead, it has original manuscripts from creative geniuses, such as Mozart and Da Vinci. It’s strange, but seeing words written by their hands somehow makes it that bit more inspirational. They might have been creative geniuses, but their handwriting is just as human (and messy) as the rest of ours.

Among the various manuscripts I saw was a letter from Ada Lovelace to Charles Babbage from 1843. It proposes a calculation which ‘may be worked out by the engine without having been worked out by human head and hands first’. This is actually the first time that the general principle of a computer program had been set out in writing. It’s been over 150 years since then, and we are still coding!


With all the relentless change in technology, it can be tempting to think that the way people code changes very quickly. However, despite the dizzying pace of technological change, in practice the programming languages which folks use change relatively slowly over time. Let’s take the most popular computing languages (my source here is IEEE Spectrum’s data), which are Python, C, Java and C++. Java is the “youngster” of the group, first released in the mid-1990s, over 20 years ago. It takes time for a language to be adopted, because it takes time for people to learn it! It also takes time for a language to mature enough to attract users. This explains the stickiness of languages. The key point, though, is that once you understand the general thought processes behind coding, it becomes easier to learn newer languages. Yes, language-specific features will need to be learnt. You’ll also have to learn about new APIs and libraries, even if you stick to the same language. Languages themselves also change over time, adopting new features.
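To illustrate the point about transferable thought processes, here is a hedged sketch (my own illustrative example, not from any particular source) of binary search in Python. The underlying logic — a loop, a comparison, and two shrinking bounds — maps almost line for line into Java, C or C++; only the surface syntax changes.

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # midpoint of the current search window
        if items[mid] == target:
            return mid                # found it
        elif items[mid] < target:
            lo = mid + 1              # discard the lower half
        else:
            hi = mid - 1              # discard the upper half
    return -1                         # not present

print(binary_search([1, 3, 5, 7, 9], 7))   # prints 3
print(binary_search([1, 3, 5, 7, 9], 4))   # prints -1
```

Someone who has internalised this pattern in one language can rewrite it in another in minutes; it is the idea, not the syntax, that takes years to learn.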


However, the general experience you have in coding is still very relevant and very transferable. When it comes to designing software, knowing which pitfalls to avoid is something which comes with experience, as is listening to your client. It is no use coding up software which doesn’t fit your client’s requirements! Learning from mistakes is also key to writing good code. Understanding where to optimise code and where to just let it go is another skill that comes with time. The same can be said of making code more maintainable. Combining experience with a willingness to learn about new techniques is a super combination for coding!