We often hear employers and business leaders lament the unfortunate gap between what students learn in college and what they are actually expected to know in order to be job-ready. This is particularly alarming in light of the large and still growing number of people graduating from university: more than 40% of 25- to 34-year-olds in OECD countries, and nearly 50% in America.
Although there is a clear premium on education, with recent reports suggesting that the ROI of a college degree has never been higher for young people, the value added from a college degree decreases as the number of graduates increases. This is why a college degree will boost earnings by over 20% in sub-Saharan Africa (where degrees are relatively rare), but only 9% in Scandinavia (where 40% of adults have degrees). At the same time, as university qualifications become more commonplace, recruiters and employers will increasingly demand them, regardless of whether they are actually required for a specific job. So, while tertiary degrees may still lead to higher-paying jobs, the same employers handing out these jobs are hurting both themselves and young people by limiting their candidate pool to college graduates. In an age of ubiquitous disruption and unpredictable job evolution, it is hard to argue that the knowledge acquisition historically associated with a university degree is still relevant.
There are several data-driven arguments that question the actual, rather than the perceived, value of a college degree. First, meta-analytic reviews have long established that the correlation between education level and job performance is weak. In fact, the research shows that intelligence scores are a much better indicator of job potential. If we were to pick between a candidate with a college degree and a candidate with a higher intelligence score, we could expect the latter to outperform the former in most jobs, particularly jobs that require constant thinking and learning. Academic grades indicate how much a candidate has studied, but performance on an intelligence test reflects their actual ability to learn, reason, and think logically.
College degrees are also confounded with social class and play a part in reducing social mobility and augmenting inequality. Many universities do select students on meritocratic grounds, but even merit-based selection is confounded with variables that decrease the diversity of admitted applicants. In many societies, there is a strong degree of assortative mating based on income and class. In the U.S., affluent people are more likely to marry other affluent people, and families with more money can afford to pay for schools, tutors, extracurriculars, and other privileges that increase their child’s likelihood of accessing an elite college education. This, in turn, affects the entire trajectory of that child’s future, including their career prospects, providing a clear advantage to some and a clear disadvantage to others.
When employers attach value to university qualifications, it’s often because they see them as a reliable indicator of a candidate’s intellectual competence. If that is their focus, why not just use psychological assessments instead, which are much more predictive of future job performance, and less confounded with socioeconomic status and demographic variables?
Having said that, universities could substantially increase the value of the college degree if they spent more time teaching their students critical soft skills. Recruiters and employers are unlikely to be impressed by candidates unless they can demonstrate a certain degree of people skills. This is perhaps one of the biggest differences between what universities and employers look for in applicants. While employers want candidates with higher levels of EQ, resilience, empathy, and integrity, those are rarely attributes that universities nurture or select for in admissions. As the impact of AI and disruptive technology grows, candidates who can perform tasks that machines cannot are becoming more valuable, underscoring the growing importance of soft skills that are hard for machines to emulate.
In a recent ManpowerGroup survey of 2,000 employers, over 50% of organizations listed problem-solving, collaboration, customer service, and communication as the most valued skills. Likewise, a recent report by Josh Bersin noted that employers today are as likely to select candidates for their adaptability, culture fit, and growth potential as for in-demand technical skills (e.g., Python, analytics, cloud computing). Additionally, employers like Google, Amazon, and Microsoft have highlighted the importance of learnability (being curious and having a hungry mind) as a key indicator of career potential. This is likely a result of the growing focus on employee training: one report shows U.S. companies spent over $90 billion on it in 2017. Hiring people with curiosity is likely to maximize the ROI of these programs.
There is also a huge opportunity for colleges to restore their relevance by helping to fill the learning gap many managers face when they are promoted into a leadership role. Today, people often take on leadership positions without much formal management training. Often, the strongest individual contributors are promoted into management, even though they haven’t developed the skills needed to lead a team. But if more schools invested in teaching those skills, organizations would have a larger pool of candidates with leadership potential.
In short, we believe that market demands clearly call for a paradigm change. More and more students are spending more and more money on higher education, and their main goal is largely pragmatic: to boost their employability and become valuable contributors to the economy. Even if a university degree benefits those who obtain it, companies can help change the narrative by putting less weight on “higher education” as a measure of intellectual competence and job potential, and instead approaching hiring with more open-mindedness.