New analysis of 1.6 billion weather records concludes the world IS warming (but still can't say what's causing it)

Analysis of all available historical data from weather stations and ships

Mathematically 'weighted' so researchers could include all data, even awkward results

World warmed by 0.911 degrees centigrade in last 50 years

Amalgamates results from 10 data archives



Research team includes Nobel-winning physicist

No conclusion on what has caused warming

The Earth is definitely warming up, a large-scale independent study has concluded.

The research by the Berkeley Earth Surface Temperature group - which was largely composed of researchers new to climate science, including Dr Saul Perlmutter, who recently won the Nobel prize for physics for his work on dark energy - analysed all available temperature records.

Despite its conclusion that the temperature of our planet is rising, many remain sceptical of the theory that global warming is a man-made phenomenon.

They say that the latest findings are simply evidence that the earth is going through cycles during which the average temperature fluctuates.



The group's conclusions were clear - Earth's temperature IS rising. Their result also tallied with previous climate change studies

But the Berkeley group's analysis of 1.6 billion temperature reports from weather stations and ships confirmed that over the past 50 years, the world's land surface warmed by 0.911 of a degree centigrade. This tallies with previous estimates by Nasa and the National Oceanic and Atmospheric Administration.

The group's conclusion is not about the causes of climate change - whether due to man-made emissions or natural cycles. It's purely a statistical analysis - but a more far-reaching one than previous studies. The group's four papers are still under review by fellow scientists. Publishing them at this early stage is a highly unorthodox step.

The study head, physicist Richard Muller, said there were 'legitimate issues' with previous studies, but said, 'My hope is that this will win over those people who are properly sceptical.'



The researchers added 32,000 new weather stations to the existing climate data, including many more readings from Africa, Asia, South America and Antarctica

The group's approach is highly rigorous - and their algorithms are designed to deal with the inconsistent climate data provided by some weather stations, and to mathematically 'weight' data according to its accuracy. All available information has been used in the group's analysis.



Previous studies have been criticised for 'pruning' data by hand.



The case for climate change has also been damaged by revelations such as 2009's 'Climategate' when emails between University of East Anglia climate researchers were exposed by a hacker, and included references to using figures selectively to 'hide' a 'decline'.

The group added 32,000 new weather stations to existing data, including many more readings from Asia, Antarctica, Africa and South America. Eighty per cent of the Earth's surface is within 200km of at least one of the stations used in the study.



The worldwide approach ensures data isn't tainted by effects such as 'urban heat islands' - the warmer local climate in cities.



Richard Muller, the physicist who heads the Berkeley team. Muller has described Al Gore as an 'exaggerator' and started the project after he doubted the scientific basis of previous studies

Nobel prize-winning physicist Saul Perlmutter was one of the scientists behind the recent analysis. He does not have a background in climate science. He won the Nobel this year for his work on 'dark energy'











CO2 emissions by country: The Berkeley study does not offer any conclusions as to what might cause global warming - but it confirms that some form of warming effect is taking place

The research amalgamates results from 10 previous data archives, six monthly and four daily.



This all-encompassing approach should, its authors say, reduce 'potential bias' and 'statistical uncertainty'.

The group's stated aim is 'to merge existing surface station temperature data sets, and to review existing temperature processing algorithms for averaging and error analysis'.



The information from weather stations is often vulnerable to local conditions such as warmth from nearby buildings, or even to changes in the time of day when readings were taken.

Local weather stations never 'aimed' to be part of a broader climate record, so many varied their methods for taking temperatures, creating 'blips' which previous studies simply cut out.

The Berkeley group's approach allows the scientists to deal with such data without omitting readings that don't 'fit' with the overall trend.



It also uses a spatial technique that estimates the readings between weather stations.
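To illustrate the idea of estimating readings between stations, here is a minimal inverse-distance-weighting sketch. This is a simplified stand-in for illustration only, not the Berkeley group's actual spatial method; the function name and sample coordinates are invented for the example.

```python
import math

def idw_estimate(target, stations, power=2):
    """Estimate temperature at `target` (lat, lon) from nearby station
    readings using inverse-distance weighting: closer stations count
    for more. `stations` is a list of ((lat, lon), temperature) pairs.
    A toy sketch, not the group's published algorithm."""
    num, den = 0.0, 0.0
    for (lat, lon), temp in stations:
        # flat-earth distance in degrees; adequate for a small sketch
        d = math.hypot(lat - target[0], lon - target[1])
        if d == 0:
            return temp  # exactly at a station: use its reading
        w = 1.0 / d ** power
        num += w * temp
        den += w
    return num / den

# A point midway between two stations gets the average of their readings
readings = [((50.0, 0.0), 10.0), ((52.0, 0.0), 12.0)]
print(idw_estimate((51.0, 0.0), readings))  # → 11.0
```

The key property is that an estimate is always a blend of surrounding stations, so gaps in coverage can be filled without inventing data.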

The Berkeley paper merged results from ten existing data archives - four daily and six monthly - to create the most all-encompassing data archive yet

Their research uses all available climate data, and is 'open': other researchers can access both the data and the tools the group used to reach its conclusions.

'The Berkeley Earth mathematical framework allows one to include short and discontinuous temperature records, so that nearly all temperature data can be used,' says the group.

'The framework contains a weighting process that assesses the quality and consistency of a network of temperature stations as an integral part of the averaging process. This permits data with varying levels of quality to be used without compromising accuracy.'
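The principle the group describes, down-weighting doubtful readings rather than discarding them, can be sketched in a few lines. This toy example assumes each reading has already been assigned a reliability weight; it illustrates the general idea of quality weighting, not the Berkeley Earth framework itself.

```python
def weighted_mean(readings):
    """Combine station readings into one average, down-weighting
    low-quality records instead of throwing them away. `readings` is
    a list of (temperature, weight) pairs, where the weight reflects
    an assessed reliability between 0 and 1. A toy illustration of
    quality weighting, not the group's actual averaging process."""
    total_w = sum(w for _, w in readings)
    return sum(t * w for t, w in readings) / total_w

# Three consistent stations plus one suspect outlier given a low weight:
# the outlier nudges the average slightly instead of distorting it
data = [(11.2, 1.0), (11.4, 1.0), (11.3, 1.0), (18.0, 0.1)]
print(round(weighted_mean(data), 2))  # → 11.52
```

Cutting the outlier entirely would give 11.3; weighting it at 0.1 gives 11.52, so the awkward reading still contributes without compromising the result.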



Climate activists have criticised the study for being funded by a group that also funds Koch Industries, described by Greenpeace as being central to 'climate denial'.



Other scientists have criticised the decision to publish before the peer-review process is finished - but researchers from other climate studies have welcomed the Berkeley group's conclusion.









