In context: Today's dynamic technology landscape is forcing employers in the IT field to rethink their stance regarding college degrees. In many cases, hiring managers are realizing that degree requirements exclude large pools of highly qualified candidates, hindering their ability to fill critical vacancies.

According to CIO magazine, corporate technology leaders and hiring managers across multiple industries are beginning to understand that competency in specific IT skills can be more important, valuable, and relevant than the more generalized knowledge an undergraduate degree provides. This shift in thinking in no way downplays the importance of a candidate's ability to communicate effectively or to apply critical thinking to work-based challenges. Instead, it recognizes that employees can develop those skills through relevant work experience and targeted, specialized training.

A 2022 report from The Burning Glass Institute notes that employers across multiple sectors are resetting expectations around degree requirements. Between 2017 and 2019, 46 percent of mid-level and 31 percent of senior-level technical occupations underwent "material degree resets," in which employers removed educational requirements from position qualifications, placed more emphasis on technical skills, and highlighted soft skills such as writing, communication, and attention to detail within the job description itself.

While many technology leaders see the benefit of trading formal education requirements for practical technical skills, some still feel a degree offers value that experience alone cannot. Veritas Technologies Chief Information Officer Jane Zhu maintains that degrees give candidates intangible benefits, such as social awareness, problem-solving skills, collaborative mindsets, and personal accountability, that are difficult to build through work experience alone.

As an IT contractor who has managed technical teams and projects for more than 20 years, I've seen firsthand the need to shift away from legacy college degree requirements and focus instead on relevant certifications, skills, and experience. This is especially true when working with cutting-edge, niche, or specialized technologies that have a comparatively small practicing community.

For example, when the need for an experienced .NET developer arises, a steady flow of highly qualified local candidates is just about guaranteed. In those cases, if all other skills are equal, a college degree and the skills it provides can be the differentiator that elevates a specific candidate's overall value.

Conversely, when we need to hire for highly specialized technical positions, candidate education requirements turn the process into a nightmare. Rather than receiving dozens of local candidates in a matter of days, as with the .NET example, we struggle to find even five qualified candidates across the entire country. Dropping the education requirement would expand that candidate pool exponentially.

Unfortunately, I do not have the authority to remove those requirements. The result is a much longer, more costly sourcing and hiring process with the added risk of the position going vacant should the current resource retire, take another job, or change their technical focus.

The issue certainly leaves room for debate, but as a working IT management professional, I firmly believe skills and experience play a far more important role than a decades-old degree in a subject that may not even relate to an organization's needs.