He says threats can be prevented through research and education

As well as asteroids, threats to humanity include AI and climate change

Engineers are working to mitigate the threat, but progress has been slow

No one knows when the next one could strike. It 'could be tomorrow,' says Cox

Humanity, says Brian Cox, could be wiped out by asteroids – and we’re not taking the threat seriously

Brian Cox isn’t a man to mince his words.

He famously described the belief that the world is 6,000 years old as ‘b*****ks’ and those who think the planet would end because of the Mayan calendar as ‘morons.’

But, lately, something far more serious has been weighing on the mind of the Oldham physicist.

Humanity, he says, is at risk of being wiped out by asteroids – and we’re not taking the threat seriously.

‘There is an asteroid with our name on it and it will hit us,’ Professor Cox told MailOnline. In fact, the Earth had a ‘near-miss’ only a few months ago.

‘We didn’t see it,’ says the 46-year-old. ‘We saw it on the way out, but if it had just been a bit further over it would have probably wiped us out. These things happen.’

The bus-sized asteroid, named 2014 EC, came within 38,300 miles (61,637km) of Earth in March - around a sixth of the distance between the moon and our planet.

And it wasn’t the only one threatening Earth. Nasa is currently tracking 1,400 'potentially hazardous asteroids' and predicting their future approaches and impact probabilities.

The threat is so serious that former astronaut Ed Lu has described it as ‘cosmic roulette’ and said that only ‘blind luck’ has so far saved humanity from a serious impact.


No one really knows when a serious impact could happen.

‘It could be tomorrow,’ Professor Cox tells MailOnline. ‘The thing that bothers me about that is we do know how to do something about it.’

For instance, earlier this year US researchers revealed an audacious plan to blow up an asteroid approaching Earth with nuclear weapons.

The Iowa team outlined their vision at a Nasa conference, and say they would need just a week’s notice to launch if the system were developed.

But ideas such as this aren’t progressing fast enough, says Cox. ‘I think it’s human stupidity we need to worry about.


Dramatic proof that such objects can strike Earth came on 15 February last year, when an unknown object exploded high above Chelyabinsk, Russia, with 20–30 times the energy of the Hiroshima atomic bomb.

‘It’s the way we behave, but also just the way we don’t accumulate knowledge at the rate that we could.’

‘We just don’t. I mean, you can see it by the figures. We spend virtually as much on it as everyone else, which is sod all.’

Last year, global spending on government space programs dropped for the first time in almost two decades.

The UK now spends nearly £300 million ($500 million) on civil space programmes, while the US invests $38.7 billion (£23.3 billion) in its space budget.

Professor Cox says this is not nearly enough. Highlighting something known as the Drake equation, he said the clock is ticking for humanity.

The equation was thought up by Dr Frank Drake to bound the terms involved in estimating the number of technological civilisations that may exist in our galaxy.
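For readers unfamiliar with it, the equation is conventionally written in the following form (this is the standard textbook notation, not necessarily how Cox presented it):

```latex
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
```

Here N is the number of detectable civilisations in the galaxy, R* the rate of star formation, fp the fraction of stars with planets, ne the number of habitable planets per system, fl, fi and fc the fractions of those that develop life, intelligence and detectable technology, and L the length of time such a civilisation survives. It is this final term, L, that concerns how long a technological society lasts before it destroys itself.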


In it, he estimated that a technological civilisation might survive only around 200 years before destroying itself.

‘That might not be a bad estimate, actually,’ says Cox. ‘If you think about it, we almost did it with the Cuban missile crisis.

‘We may almost be doing it again. It’s not clear, the way we’re dealing with the environment.

‘But we’ve been close to wiping ourselves out, only a few hundred years after we became a technological civilisation.’

And it’s not just asteroids we should worry about: climate change and artificial intelligence are also key concerns.

The only way to deal with them, he says, is to spend more money on research and education to improve our knowledge.

Professor Cox is currently working to inspire the next generation of scientists with his involvement in a science summer school at St Paul’s Way Trust School (SPWTS) in Poplar, Tower Hamlets.

He says our best hope of mitigating threats such as these is to actively pursue more knowledge, bridge the skills gap in science and engineering and increase our spending on research.


‘I mean, you are talking about the far future. And things you say about the far future often sound fantastical. But [the threat of artificial intelligence] is a legitimate question.

‘If you think the brain is basically a computing device, but a very complicated one, then presumably it operates according to the known laws of physics and therefore you should be able to simulate it.

‘What do you do? It’s an ethical dilemma. I don’t know. Conscious things are notoriously difficult to deal with – look at humans.’

His views echo those of Tesla founder Elon Musk, who earlier this year said that the threat of artificial intelligence is more severe than that of nuclear weapons.

‘Ultimately I would say, it’s about not being so myopic. We know quite a lot about nature,’ says Cox.

‘Let’s say we’re the only civilisation in the Milky Way, which is possible.

‘If you were to sit there from that perspective, to say, how would we rearrange our affairs to protect ourselves because we are the only intelligent civilisation in the galaxy, it wouldn’t be like this…right?’