Are Certifications Worth it?
Are they valuable to developers, or should we collectively decide to ignore them altogether?
The relationship between formal education and software development is radically different from other industries. Unlike doctors, lawyers, technicians, and other credentialed professionals, developers show a general disdain not only for formal education but also for credential-based proof of competence.
This can be most clearly seen in the role that professional certifications play in Software Development.
If you spend any time online or with other developers, you'll quickly see that although traditional formal education like university still carries weight, other forms of credential-based displays of competence are overwhelmingly viewed with skepticism.
Here I'll discuss whether the reputation of certifications is earned, and what merit, if any, they have.
I will preface this article with a piece of advice that I feel obliged to share since I know most people won't finish reading.
If you do like taking certifications, whether for fun, continual learning, or as a nice addition to your resume, do not include them if you're applying to startups or mention them in startup communities.
I do not mean any offense to the startup world, but they do generally have a bias against traditional displays of credentials (unless you're mentioning that you went to Stanford). *
While I overall agree with the sentiment, it can get overzealous. So play your cards carefully.
Any evidence that you care about things like degrees, certifications, etc., will count negatively against you.
- Out of the two dozen or so forum posts that I collected while researching this, the ones that were startup-centered (like Hacker News, certain subreddits, etc.) were the most likely to say that having or taking certifications counted negatively against you. While this isn't enough to formally prove there is any bias, I saw enough of these attitudes to mention it.
Controversies Surrounding Certifications
The issues most people have with certifications can be divided into two different categories. One relates to the fundamental issue surrounding any proof of competence across industries.
The second is specific to software development: certifications' inability to actually correlate with a developer's skill set.
Many of the arguments people make against certifications, even some that people argue are unique to them, are fundamental problems of the exam-driven education system.
The problem with the exam-driven approach is that it optimizes what is measurable, not necessarily the factors that have the most effect on competence.
I cannot find the original publication that showed there was no correlation between technical competence and all measures used in the industry and academia. If you happen to have it on hand or know its name, please comment it below.
The main issues commonly cited with exam-driven learning are:
- Any signal that certification could provide that a developer knows the material is completely destroyed by how easy it is to simply memorize every answer to the certification exam.
- Instead of being used as educational material, certifications are used as a piece of paper that allows non-technical employees at companies to baselessly filter out applicants because they "don't fit the picture" of who they think is their ideal candidate.
- The people creating these exams have no rigorous industry experience. Their only actual technical experience is creating these certifications. Even in organizations with talented people, they are usually too busy to spend time creating certifications.
- Since they are a profit-centered venture, they appeal to the lowest common denominator. But due to the aforementioned ease of "hacking" the exam, even the people they're meant to serve end up learning nothing.
If you spend any time online, you will quickly come across people baffled at the value Hiring Managers and non-developers place in certifications.
Certifications, according to them, are such a terrible measurement of a developer's skill set that you may as well use a random IQ test from the internet to measure competence.
The main issues with certifications commonly cited are:
- Technology changes too quickly for any serious certification, or the developers who take them, to keep up.
- Most companies that make these certificates have no incentive to make them rigorous enough to prove the skills of those who take them. They lack both the depth and breadth to be of any value.
- Any time spent studying for or taking certifications would be vastly better spent working on side projects or contributing to an Open Source project.
It really can't be debated that, at least at present, certifications are the least popular way for software developers to learn and/or prove their talent.
Although there is extremely limited data gathered on it, it's clear from the 2018 Stack Overflow Developer Survey and the 2020 FreeCodeCamp survey that only approximately 2% to 14% of developers have completed, or plan to complete, professional certifications.
Even if you argued that these surveys' questions around certifications were poorly designed, a properly designed survey is highly unlikely to yield a more favorable share, given both the other data published in the surveys and certifications' unpopularity online.
It is interesting to note the lack of surveys focusing on the popularity and perceived value of certifications in the industry. Considering the myriad of surveys run by a multitude of companies, it's notable that this topic has been mostly ignored.
The obvious question that arises from all the arguments we've discussed, and from their unpopularity, is: how on earth do these certifications still exist?
A product whose target audience is at best ambivalent toward it, and commonly dislikes it, would very quickly die out. So why are professional software certifications still alive and well?
That's because the target audience isn't software developers.
An interesting thread running through most of the common arguments against certifications outlined above is that they all assume a third party.
From the argument that they don't demonstrate actual skill, to how easy they are to memorize, to being described as a waste of time (when discussed alongside open source, which is offered as a different proof of skill).
All of these create a third party that:
- Is not knowledgeable enough to separate fakers from talented people
- Will be a judge of whether you are talented or not
The usual culprit mentioned is HR, but this problem extends to all areas of business and beyond. The problem with blaming certifications for letting untalented people pass through is that the issue isn't with the certifications.
The issue is that any random factor could convince a non-expert that you know what you're doing.
This is a problem that's been discussed in books like Thinking, Fast and Slow, and a myriad of psychological publications.
If you wanted to hack the interview process, taking certifications would be just as effective as simply having the same hobby as the person you're interviewing.
Condemning certifications because they don't display talent or skill would also require you to condemn every other way we determine a developer's talent, and the whole recruiting process:
- GitHub activity can be faked
- Projects can be overblown
- Interviews can be hacked via basic soft skills
- Technical interviews require you to simply memorize LeetCode-style questions
- Old school nepotism
While paid take-home projects are most likely the best way to determine a candidate's skill, they still won't prevent either nepotism or a candidate sweet-talking their way to success.
If we want certifications to reflect talent, and to minimize the number of candidates without any actual skill getting through, we'll need to reform the foundations of our recruiting and talent management processes.
The Non-Expert as Yourself
The non-expert problem can also be an issue even if there isn't a third party judging your abilities. Many software developers early in their career use certifications as a crutch, a way to prove to themselves that they're good developers.
When you're early in your career, you simply don't know what you don't know. You end up feeling a general sense of incompetence and confusion, which is extremely discouraging as you watch peers or colleagues breeze through issues and display skills you didn't know existed.
What to do instead
While I can empathize with the desire to have a tangible object you can point to when you feel like you don't belong, certifications will never fix the deeper problem: you are always going to be learning and improving. There will never be a point where you feel like you've "made it". It's an emotional problem, not a technical one.
The only way to get past it is to keep developing and learning, and then one day you'll look back and see everything that you missed and could not understand earlier in your career.
Don't use certifications as a way to try to prove to yourself that you're a good developer. It won't work and it's wasted time that could be better spent elsewhere.
Not Every Certification Is Created Equal
Like in much of online and public discourse, detractors of certifications will point to the absolute worst instances of their opposition as evidence that everything is that bad.
Although this straw-man argument is compelling, it creates a false equivalence between certifications like "C++ language mastery" and an AWS Cloud Architect certification.
Regardless of your opinion about the AWS certifications, you can't honestly claim that those two are equivalent.
There certainly are a myriad of certifications that only serve as a cash machine for the companies selling them, but the most prominent ones are a far cry from that.
The difficulty lies in differentiating between good and bad certifications. Certain guidelines allow us to group certifications into clusters:
- Cloud Platform Certifications
- Language & Framework Certifications
- High-level tool certification (being certified in Chrome Devtools, etc)
- Low-level tool certification (Kubernetes, databases, etc)
From this list we can tell that two axes define a certification:
- Level of System Dependency
- Frequency of Breaking Change in Systems
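To make the two axes concrete, here's a toy sketch. The technologies and numeric scores are purely my own illustrative assumptions, not measured data; the point is only that high system dependency plus low churn is what makes a certification durable:

```python
# Toy model: rank certification targets along the two axes above.
# All scores (1-5) are illustrative assumptions, not measured data.

certs = {
    # name: (system_dependency, change_frequency), each on a 1-5 scale
    "Cloud platform (e.g. AWS)": (5, 2),
    "Database (e.g. Postgres)": (5, 1),
    "DevOps tool (e.g. Kubernetes)": (4, 2),
    "Language/framework": (3, 4),
    "High-level tool (e.g. Chrome DevTools)": (1, 5),
}

def cert_value(dependency: int, change_freq: int) -> int:
    """Higher system dependency and lower churn -> more durable certification."""
    return dependency - change_freq

# Most durable certification targets first.
ranked = sorted(certs.items(), key=lambda kv: cert_value(*kv[1]), reverse=True)
for name, (dep, churn) in ranked:
    print(f"{cert_value(dep, churn):+d}  {name}")
```

With these assumed scores, databases and cloud platforms land at the top and fast-churning high-level tools at the bottom, which matches the clusters above.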
It's vital to deeply understand the strong dependencies your system(s) rely on.
There is a difference between the level of dependency a system has on its database and the level it has on a framework.
You may be heavily tied to your language framework, but moving terabytes of data from one database system to a completely different one is several magnitudes harder. The same argument applies to whichever cloud platform your systems rely on most heavily.
That's why the most popular certifications developers take, if any, are for these kinds of systems: the mission-critical components that are incredibly complicated and require a broad range of knowledge to fully understand.
Correlation between change and the value of a certification
One of the reasons there isn't yet a standard skill credential in software development, like there is in other professions, is the frequency of change.
While medicine, law, and many other professions do evolve and change, none of them come even close to changing as fast as the software industry.
A software developer who has learned nothing new in 5 years will find themselves a relic in the industry. One who hasn't learned anything in 10 years will be unable to meaningfully contribute to the modern problems being faced.
However, this problem isn't evenly distributed within technology itself. Database systems do not change nearly as quickly as front-end frameworks, for instance. The latter face radical changes every two years or so.
This poses a challenge to anyone hoping to create or complete a certification. For the same reason that there are few helpful books about modern web development, there are no good certifications for it.
By the time anyone has come even close to finishing the certification, the entire landscape has shifted. And even if a certification is published while the technology is still relevant, it will stay current only a short time, restarting the cycle all over again and requiring any developer hoping to maintain a valid certification to constantly study for, and retake, an exam that will inevitably become ever more out of date.
How to pick which kind to focus on
From the aforementioned details, it's clear that you get the most value by focusing on certifications for key tools that change infrequently, and for large-scale platforms and tools that your systems rely upon.
So completely avoid trying to certify yourself in languages, frameworks, and specific tools for which there are a myriad of alternatives.
For narrow topics & high-level tools, an online course will serve you better than certifications.
For everything else, a certification will provide a better learning path.
"Okay, so there are some good ones, but it's still a waste of time"
I can understand where you're coming from. And you're right: certificates are a complete waste of time if you're trying to prove to others just how smart you are.
But that's not how you should use them.
Certifications are invaluable as a focusing device for your learning.
They define clear pillars of technologies important to your career from which you can plan out projects, research, and a whole learning path to mastery.
It's better to think of signing up for a certification exam like signing up to run a 10K race 3 months from now.
If you're trying to get in better shape, signing up for a race won't replace actual workouts. But by creating a clear path ahead to move towards, there's no confusion about what exactly you have to do.
Certifications Provide A Clear Learning Path
The argument I'm making isn't that we should all give up workshops, talks, side projects, and all of the other ways that we learn.
This isn't a question of either/or but yes/and. The other methods of learning we've discussed are singular and focused (as they should be).
By choosing the right certifications, you get a clear and measurable goal to guide your learning, using it as a basis to find good online courses, conferences to attend, side projects to complete, or whichever specific learning method works best for you.
So yes, do your side projects, workshops, and everything else. But just as joining competitions focuses your exercise routine, certifications give you a clear vision for your road to mastery.
While not useful for determining a developer's talent and competence, certifications can be valuable to the developer taking them, providing a clear roadmap for their learning.
In the end, they are not a replacement for experience, but those hoping to keep improving at their craft may find them another helpful addition to their learning toolbelt.
My ultimate piece of actionable advice would be to use certifications as a focusing point for your learning. This is specifically valuable for key tools and platforms that you either regularly use at work or you want to use more.
A few certifications that I would recommend are:
- Cloud Platforms - AWS, GCP, Azure, etc
- DevOps tools like Kubernetes
- Databases - Cassandra, DynamoDB, MongoDB, Postgres, etc
Don't just do the certification and stop: get involved in the community, create side projects with the technology, and try to take on more tasks related to it at work.
Certifications can be a really useful tool for individuals to improve their learning process if used appropriately.