Experience isn’t always helpful, and it often doesn’t mean what we think it means. The correlation between years of experience and the ability to meaningfully contribute to a team has yet to be proven, so we shouldn’t rely on it as a key indicator of potential. And yet it’s the best signal we have right now when we’re trying to judge someone’s potential ability to contribute, which leads developers to chase emerging technologies in an effort to pad their resumes so they don’t miss out on future job opportunities.
What is “experience”?
I first started programming when I was 10, and now I’m 32, so I must have 22 years of programming experience, right? While I was studying opera in college and grad school I was still building and configuring my own computers. I wasn’t running Linux, but I knew my way around PowerShell, and whenever anyone had computer problems I was the go-to guy. And during the 5 years when I was a publicist, I was still knee-deep in HTML and jQuery thanks to the beginnings of the world of digital marketing.
Now we all know that I can’t reasonably put 22 years of experience on my resume, since many of those “years of experience” aren’t at all applicable to what I would be doing today. For a good chunk of the first couple of years where I was “programming,” I was just writing programs on my TI-83 in BASIC so I could cheat on math and science tests. But whenever anyone asks me how many years of programming experience I have, I honestly have no idea what to tell them. My confusion always starts with this question: what exactly is a “year of experience”? Does it mean a year of working 40 hours/week at a full-time job? If so, would working 80 hours/week mean you get 2 years of experience out of your 365 days of work?
This is a big reason why I don’t like or care about “years of experience” as a proxy for what “level” a developer is at, or what kind of work they’re capable of. I’ve worked with developers who spent 10 years of their lives working full time at big companies in San Francisco but couldn’t solve even basic problems without guidance from others, and I’ve worked with developers less than six months out of a bootcamp who could solve very complicated problems without any guidance from a more “senior” developer. I know these are just anecdotes and don’t prove anything, but there are other reasons why I’m skeptical of experience - sometimes it can be deadly.
The July Effect
People used to say that you shouldn’t have a heart attack in July, because all the experienced cardiac specialists are out at conferences or on vacation, meaning your chances of survival would be worse. The name people gave to this theory was the July Effect, and it turns out there is indeed a July Effect - but not the one we expected. In fact, you have a significantly better chance of surviving a heart attack when your doctor is less experienced!
Software written by experienced developers won’t kill anybody, but there is a parallel here I’m driving at. Our understanding of technology best practices evolves over time, and that demands constant learning from developers. But learning is one thing; changing our instincts and feelings about how software should be written is another. These instincts often stick with us as we go through our careers, which is why it’s not uncommon to find folks with 10 years of Ruby experience who still write abstract classes to try to enforce an interface on child classes - frequently a vestige of their history with Java. It’s not uncommon to see similar vestigial code in Elixir as well, as Ruby developers move into the language and bring with them OO patterns that aren’t a great fit for that language.
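To make the Ruby example concrete, here’s a minimal sketch of the pattern (the class names are invented for illustration): a Java-style “abstract” parent that exists only to force children to implement a method, next to the duck-typed approach Ruby idiom usually prefers.

```ruby
# Java-style "abstract class" pattern sometimes carried into Ruby:
# the parent class exists only to enforce an interface on children.
class AbstractExporter
  def export
    raise NotImplementedError, "#{self.class} must implement #export"
  end
end

class CsvExporter < AbstractExporter
  def export
    "id,name\n1,Ada"
  end
end

# Idiomatic Ruby usually skips the parent entirely and relies on
# duck typing: anything that responds to #export will do.
class JsonExporter
  def export
    '{"id":1,"name":"Ada"}'
  end
end

# Both work interchangeably wherever an #export method is expected.
[CsvExporter.new, JsonExporter.new].each { |e| puts e.export }
```

The abstract parent adds a class you can never meaningfully instantiate, and its only payoff is a runtime error you’d get anyway (`NoMethodError` instead of `NotImplementedError`) if a collaborator forgot the method.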
In programming we generally revere experience, but we need to be careful to only do so when it’s been proven to be a marker of someone who contributes high-quality code. Funnily enough, in the medical example above there actually was one place where experience correlated strongly with positive outcomes, and that was surgery. Until we know which side of that spectrum software development falls on, we should be much more careful about prejudging those with experience to be inherently competent.
Chasing the dragon
That’s a real ad up there that I pulled from Upwork today. It’s asking for 3-5 years of experience in a library that, at the time of posting, had only been publicly available for about 3 years. Let’s keep that in mind as we move forward in this post.
So, since we know that folks revere experience, that’s what we’re all chasing. And because jobs in tech are frequently rather short-lived (often less than 4 years), developers are smart to keep their skills sharp and keep up with the changing technological landscape. But to land a “good” job while competing with other developers, we need more years of experience than the next person so we can appear to be the better candidate. This, I think, is why chasing emerging technologies is so common in our world. If you don’t get on the train at the beginning, then all the other folks who did will get all the jobs, and you’ll be left out in the cold.
Well, I wish I had some answers, but I don’t. Our whole industry knows hiring is messy, difficult, and broken, but I just don’t have solutions - and trust me, I’ve thought long and hard about them! I think we could nip two problems in the bud if we had some marker other than experience for a person’s potential to contribute positively to a team, but I just don’t know what that would be. It’s worth continued thought, though, since it’s such an important issue - that’s why I write about it - and if you have any other ideas, I’d love to hear them on Twitter!