Universities are wrestling with a contradiction. How can there be both a dire lack of IT skills and poor employment prospects for computer science graduates? Does the blame rest with teaching, a lack of industry training or something else entirely?

Image: Martin Neeves Photography / The University of Warwick

When businesses claim workers don't have the digital skills they need, but a disproportionate number of computer science graduates can't find jobs, something is amiss.

This seemingly contradictory situation has been the status quo in the UK for many years now, and is a conundrum that university chiefs are trying to unpick.

The vast majority of UK business leaders and IT executives -- 78 percent -- told PricewaterhouseCoopers' recent Global Digital IQ Survey that a shortage of digital skills was holding their firms back.

But that claim sits uneasily next to the relatively high proportion of computer science graduates struggling to find work, with 11.7 percent unemployed six months after leaving university. Compared to graduates in related disciplines -- in science, technology, engineering and mathematics -- their employment prospects are particularly poor.

SEE: Download: The truth about MOOCs and bootcamps -- their biggest benefit isn't creating more coders

At the Microsoft Transform event in London yesterday, Matthew Gould, director general for digital and media at the Department for Culture, Media and Sport, described the gap as "a total mystery".

Carsten Maple, chairman of the Council of Professors and Heads of Computing, said he and his university colleagues were wrestling with where the problem lay: whether with academia or with industry.

"What we have to understand is, are we producing the wrong kind of graduates? Is the perception industry has of what a computer science graduate is like, is that correct?" he said.

And if universities aren't producing graduates with the tech skills that businesses want, how can that shortfall be satisfactorily resolved, particularly when each firm has specific requirements, he asked.

"When we talk about industry needs for computer science graduates, the needs of IBM will be very different to the needs of the small companies," he said.

SEE: Is tech turning contract work into the future of employment?

Should universities even be expected to produce graduates fully formed and ready for business, or is it the responsibility of firms to train people on the job, he asked.

"Should industry expect someone to come straight from a computer science degree and do advanced data analytics on their first day? Should they [industry] have their own training programmes?"

Although businesses have complained about skills shortages for many years, UK firms have proved increasingly unwilling to bridge the gap by funding IT apprenticeships: the number of people securing placements dropped by one third in 2014, despite a massive rise in the number applying.

The government is now pushing firms to accept more apprentices by levying a tax on the largest firms to fund three million new placements by 2020.

There is also a danger that were universities to focus too heavily on equipping students with the skills in demand at a specific moment, graduates would be ill-equipped to grasp new technological skills as business demands evolve, said Maple.

"We have to decide, do we teach those fundamental skills or do we teach the very latest skills that you will be using in five years?

"That's a real challenge for us. Because technology evolves so fast, what we don't want to be doing is training.

"Universities are educational institutions. What we have to do is to teach students to be able to think in different environments. Some of that will come from fundamental principles, such as logic, the process of doing things.

"If we can teach people to debug 500 lines of Haskell, they can do anything in any language."

While universities need to be aware of what new technologies are emerging, he said, they also "have to stay within our key principles, because otherwise, while graduates may be ready to work on day one, in two years' time as technology moves on, they won't have the key skills and competencies to adapt to the new environments".

There are those who claim the entire notion of an IT skills shortage, which is also a common complaint in the US, is a myth, propagated by technology firms in order to employ foreign workers in these roles more cheaply.

Ron Hira, of the Economic Policy Institute, cites figures showing that, between the financial years 2010 and 2012, the IT services firm Cognizant hired 18,000 foreign workers on H-1B visas and very few domestic employees in the US, a pattern he says is common across large IT services firms. His figures also show that, on average, workers employed on H-1B visas by IT offshoring firms were paid tens of thousands of dollars less than their US counterparts.

In the UK, Sir Nigel Shadbolt this year published a review into how to improve the employability of computer science graduates.

Maple said the group implementing the review's findings is examining relevant data to identify the issues at the root of the contradiction between claims of a skills shortage and graduate unemployment.

There is mounting pressure to resolve the issue, as demand for digital skills is forecast to continue to grow, with the UK Careers and Employability Service predicting a need for 518,000 additional workers for the three highest skilled occupational groups "in the digital arena" by 2022. That's three times the number of computer science graduates produced by the UK over the past 10 years.

