I’m adopting this “tag line” about tomatoes for the simple reason that my tomato garden is a more accurate reporter of the temperature than is GIStemp. Normal tomatoes will not set fruit below 50F at night. Cold varieties, like Siberia, can set fruit down to 40F (and some claim even lower, to 35F). My tomato plants reliably report the temperature. GIStemp, not so much… GIStemp was reporting a 115-year record heat in California (because NCDC in GHCN moved all the thermometers to Southern California near the beach, except one at the S.F. Airport…) while my tomatoes were accurately reporting “too cold to set fruit”…

(NCDC is the National Climatic Data Center. GHCN is the Global Historical Climate Network – the thermometer data. GIStemp is the NASA product that turns the GHCN history into that global ‘anomaly map’ and claims to show where it’s hot and how hot we are.)

GIStemp, A “Start Here” page

UPDATE 6: As of November, 2009:

I’ve added a GIStemp high level overview for regular folks (i.e. you don’t need to be a computer geek or weather guy to ‘get it’):

https://chiefio.wordpress.com/2009/11/09/gistemp-a-human-view/

This is a nice high level view, but has links down into all the detail and tech talk that support it, if desired. Most folks ought to read it first.

NOAA / NCDC have Fudged and Corrupted the Input Data Series

A sidebar on data corruption from thermometer deletions:

The GHCN input data to GIStemp “has issues”: NOAA/NCDC deleted 90% or so of the thermometers between about 1990 and 2009, with those deletions focused on cold places. This is the second set of reports most folks ought to read. We explore that here:

https://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/

And there we find what is most likely the key coordinating factor behind the ‘agreement’ between HadCRUT (UEA / CRU, i.e. the “Climategate” folks), NCDC (the GHCN adjusted series and GHCN data fabricators), and GIStemp. They all use GHCN, and the GHCN set has been “buggered” with the deletion of cold thermometers from the recent data, while those thermometers are left in the baseline periods. (Even the Japanese temperature series depends on GHCN.) This, IMHO, is the biggest problem and the most important ‘issue’ in the apparent fraud of AGW. When someone is ‘cooking the books’, literally, I have trouble finding any more polite word than fraud…

Some Prior Updates

UPDATE 5: As of September, 2009:

I ran into this excellent page:

http://www.appinsys.com/GlobalWarming/GW_Part2_GlobalTempMeasure.htm

A very “information dense” but “must read” page.

UPDATE 4: As of August 16, 2009:

I’ve put up a simple intro to the “issues” with the AGW thesis:

https://chiefio.wordpress.com/2009/07/30/agw-basics-of-whats-wrong/

A bit dated (it needs to be updated with Climategate, the UEA subornation of the peer review process, and the NOAA / NCDC GHCN data set molestation, for example) but still a good starter list.

Not exactly GIStemp, but in “characterizing the data” for the GHCN input, I’ve found that all the “warming signal” is carried in the winter months. The summer months do not warm. That can not be caused by CO2. How can you have a ‘runaway greenhouse effect’ with a ‘tipping point at a higher temperature’ when the effect is LESS at higher temperatures, with an apparent damping toward a very stable maximum?

Further, the “warming signal” arrives coincident with the arrival of large numbers of new thermometers. When you look at the longest lived cohort, those over about 100 years lifetime, there is no warming signal present in the data to speak of. When you look at the much shorter lived cohorts, you find a very strong warming signal, especially in the winter months. On further inspection of the data it looks like a lot of thermometers “arrived” at places with low latitudes AND at airports (newly built as the “jet age” arrived).

https://chiefio.wordpress.com/2009/08/05/agw-is-a-thermometer-count-artifact/

https://chiefio.wordpress.com/2009/08/10/well-theres-your-global-warming-problem/

https://chiefio.wordpress.com/2009/08/13/gistemp-quartiles-of-age-bolus-of-heat/

The three links above are still a nice read, but they were early in my discovery process. The end result was that “GHCN Global Analysis”. Still, they are worth reading both for what they say and as an interesting insight into how a discovery process proceeds from a broad ‘something is not quite right here’ observation down into exact, focused detail (who, what, how, when – specifically).

My “working thesis” at this point is that GIStemp is a “filter” that tries to remove the impact of this bolus of thermometers arriving in a spike of time and space (using zones, boxes, grids, et al.), but is just not adequate to the task. GIStemp is not a perfect filter. Looking at the impact of the steps up through zonalizing, they act as a mild amplifier, so some of the “filter” effect in the later steps will only be removing what was added in the early steps.

https://chiefio.wordpress.com/2009/08/12/gistemp-step1-data-change-profile/

But only a little of the impact comes from STEP0 (not surprising, since it is just adding in the Antarctic data and some fiddly bits):

https://chiefio.wordpress.com/2009/08/12/gistemp-step0-data-changes-profile/

Though the way they do the F to C conversion is sloppy, not very efficient, and sensitive to exactly which compiler you use:

https://chiefio.wordpress.com/2009/07/30/gistemp-f-to-c-convert-issues/
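As an illustration of where that sensitivity can bite, here is a minimal Python sketch (my own toy, NOT GIStemp’s FORTRAN; the units are illustrative): integer tenths of a degree F in, integer tenths of a degree C out. Values that land exactly between two tenths of a degree C are where the rounding, and thus the compiler’s arithmetic, can change the last digit.

# Toy F-to-C conversion, integer tenths in and out.
# Halfway-rounding cases are where compiler/library
# arithmetic differences can show up in the last digit.
def tenths_f_to_tenths_c(tf):
    degrees_c = (tf / 10.0 - 32.0) * 5.0 / 9.0
    return int(round(degrees_c * 10.0))   # back to integer tenths of C

print(tenths_f_to_tenths_c(500))   # 50.0 F -> 100, i.e. 10.0 C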

UPDATE 3: As of July 27, 2009:

[ NOTE: The “UPDATE” here is a bit dated at this point. I’ve now worked out in some detail the thermometer deletions that were only hinted at in this stage (and are now nicely documented in the reports above). It has all been duplicated by independent parties, and I’ve caught up on sleep. I’m mostly leaving this part up as a reminder of what it was like back then. A bit of nostalgia, of a sort. Further down a “Geek Corner” continues with computer code and how to download and install GIStemp.]

I’ve identified the “issue” with the STEP4_5 codes that will not run without errors. They were produced on a “bigendian” box like a Sun, and I’m on a “littleendian” box like an x86 PC. The g95 docs say they support the “convert={swap|bigendian|littleendian}” flag to the file “open” statement. I ought to be able to add that directive to the “open” file statements and be done. But the compiler barfs on it. I’m left to suspect that the g95 support for convert=swap is not actually working in the release I have running.

So the bottom line is that the code I have probably works, but it can’t read a couple of the downloaded files (like SBBX.HadR2 ) due to the bytes being in the wrong order for this hardware. A relatively obscure problem usually avoided by not using byte order sensitive data structures for data interchange…

OK, so I’ll either port the code to a Macintosh (which have bigendian processors), dig my ancient Sparc 2 out of the garage, or make a purpose built byte swapping file conversion utility. (Since I downloaded the latest production release of the g95 compiler, it’s not likely that the “fix” is to go fishing for a version with a working convert=swap flag; but I do need to verify this assumption…). I’m on stable.91 and it looks like .92 has the working convert flag.
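For anyone hitting the same wall, here is a minimal Python sketch of the “purpose built byte swapping file conversion utility” option. It assumes (something to verify before trusting it!) that the file consists entirely of 4-byte quantities: the FORTRAN record length markers plus 4-byte INTEGERs and REALs. On that assumption, reversing every 4-byte word converts bigendian to littleendian and back:

# byteswap4.py - swap the byte order of every 4-byte word in a file.
# ASSUMES the file holds only 4-byte quantities (FORTRAN record
# markers, 4-byte INTEGERs/REALs); verify against the record layout.
import sys

def swap4(infile, outfile):
    with open(infile, 'rb') as fin, open(outfile, 'wb') as fout:
        while True:
            word = fin.read(4)
            if not word:
                break                     # clean end of file
            if len(word) != 4:
                raise IOError("file length not a multiple of 4 bytes")
            fout.write(word[::-1])        # reverse bytes: big <-> little endian

if __name__ == '__main__':
    swap4(sys.argv[1], sys.argv[2])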

I’ve also built a couple of tools that do some crude data analysis on the temperature data as it moves from step to step. The most interesting bit so far is that the “warming” all happens in the winter.

Clever stuff, this CO2. It can cause selective warming in winter, with nearly no warming in summer…

I’m going to paste in here a comment I made over on Watts Up With That, in a thread about GIStemp STEP1, giving a first rough characterization of what GIStemp does to the temperature data:

Well, at long last I have a contribution based on the work porting GIStemp. I can now run it up to the “add sea surface anomaly maps” stage, and this means I can inspect the intermediate data for interesting trends. (The STEP4_5 part will take a bit longer. I’ve figured out that SBBX.HadR2 is in “bigendian” format and PCs are littleendian, so I have a data conversion to work out…).

Ok, what have I found in steps 0, 1, 2, …? Plenty. First off, though, I needed a “benchmark” to measure against. I decided to just use the canonical GHCN data set. This is what all the other bits get glued onto, so I wondered, what happens, step by step, as bits get blended into the sausage? I also wondered about the odd “seasonal” anomaly design, and wanted a simple year by year measure. [and also month by month -ems]

So my benchmark is just the GHCN monthly averages, summed for each month of the year, cross footed to an annual “Global Average Temperature”, and then a final GAT for ALL TIME is calculated by averaging those yearly GATs.
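In rough Python (a sketch of the idea, not my actual tool; the input here is simplified to one (year, month, temperature) tuple per station-month), the benchmark amounts to:

from collections import defaultdict

def benchmark_gat(records):
    # records: iterable of (year, month, temp_C) tuples, one per
    # station-month (simplified; real GHCN packs 12 months per line).
    sums   = defaultdict(lambda: [0.0] * 12)
    counts = defaultdict(lambda: [0] * 12)
    for year, month, temp in records:
        sums[year][month - 1] += temp
        counts[year][month - 1] += 1

    yearly_gat = {}
    for year in sums:
        monthly = [s / n for s, n in zip(sums[year], counts[year]) if n]
        yearly_gat[year] = sum(monthly) / len(monthly)  # cross-foot to annual GAT

    # final GAT for ALL TIME: the average of the yearly GATs
    return sum(yearly_gat.values()) / len(yearly_gat), yearly_gat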

Now, there are a couple of caveats, not the least of which is that this is Beta code. I’ve cobbled together these tools on 5 hours sleep a night for the last few days (it’s called a “coding frenzy” in the biz… programmers know what I’m talking about… you don’t dare stop till it’s done…). So I’ve done nearly NO Quality Control and have not had a Code Review yet (though I’ve lined up a friend with 30+ years of high end work, currently doing robotics, to review my stuff. He started tonight.) I’m fairly certain that some of these numbers will change a bit as I find little edge cases where some record was left out of the addition…

Second is that I don’t try to answer the question “Is this change to the data valid?” I’m just asking “What is the degree of change?” These may be valid changes.

And third, I have not fully vetted the input data sets. Some of them came with the source code, some from the GIS web site, etc. There is a small possibility that I might not have the newest or best input data. I think this is valid data, but final results may be a smidgeon different if a newer data set shows up.

Ok enough tush cover: What did I find already?!

First up, the “GLOBAL” temperature shows a pronounced seasonal trend. This is a record from after STEP1, just before the zonalizing:

GAT in year : 1971 3.60 6.20 8.20 12.90 16.50 19.30 20.90 20.70 17.90 13.90 9.50 5.60 14.10

The first number is the year, then 12 monthly averages, then the final number is the global average. The fact that the 100ths place is always a 0 is a direct result of their using C in tenths at this stage. It is “False Precision” in my print format.
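To see the effect, a one-liner (my print format, not GIStemp’s):

gat_tenths = 141                      # 14.1 C carried as integer tenths
print("%6.2f" % (gat_tenths / 10.0))  # prints " 14.10"; that trailing 0 is format, not measurement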

It seems a bit “odd” to me that the “Globe” would be 17C colder in January than it is in July. Does it not have hemispheres that balance each other out? In fairness, the sea temps are added in during STEP4_5, and the SH is mostly sea. But it’s pretty clear that the “Global” record is not very global at the half way point in GIStemp.

Next come three records, each a total over all years in the data set. The first is from raw GHCN; the second is GHCN with the additions (Antarctic, Hohenp…., etc.), the pre-1880 data tossed out, and the first round of the Reference Station Method applied; the third is the data as it leaves STEP1 with its magic sauce. (The individual year trends are still being analyzed – i.e. I need to get some sleep ;-)

2.6 3.9 7.3 11.8 15.8 18.9 20.7 20.3 17.4 13.1 7.9 3.9 11.97

2.6 3.8 7.3 11.7 15.6 18.7 20.5 20.0 17.2 13.0 7.9 3.9 11.85

3.2 4.5 7.9 12.1 15.9 19.0 20.9 20.5 17.7 13.5 8.5 4.5 12.35

It is pretty clear from inspection of these three that the temperature is raised by GIStemp. It’s also pretty clear that STEP0 does not do much of it (in fact, some data points go down – adding the Antarctic can do that!). The “cooking” only really starts with STEP1.

The big surprise for me was not the 0.38 C rise in the total GAT (far right) but the way the winters get warmed up! July and August hardly change (0.2 and 0.3 respectively), yet January has a full 0.6 C rise, as do November, December, February, and March.
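You can cross-check those deltas quickly enough; subtracting the first record from the third, month by month:

ghcn  = [2.6, 3.9, 7.3, 11.8, 15.8, 18.9, 20.7, 20.3, 17.4, 13.1, 7.9, 3.9]
step1 = [3.2, 4.5, 7.9, 12.1, 15.9, 19.0, 20.9, 20.5, 17.7, 13.5, 8.5, 4.5]
print([round(b - a, 1) for a, b in zip(ghcn, step1)])
# -> [0.6, 0.6, 0.6, 0.3, 0.1, 0.1, 0.2, 0.2, 0.3, 0.4, 0.6, 0.6]
# November through March all gain 0.6 C; high summer barely moves.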

So GIStemp thinks it’s getting warmer, but only in the winter! I can live with that! At this point I think it’s mostly in the data, but further dredging around is needed to confirm that. The code as written seems to have a small bias spread over all months, at least as I read it, so I’m at a loss for the asymmetry of winter. Perhaps it’s buried in the Python of Step1 that I’m still learning to read…

Finally, a brief word on trends over the years. The GIStemp numbers are, er, odd. I have to do more work on them, but there are some trends that I just do not find credible. For example, the 1776 record (which is quite representative of that block of time) in GHCN is:

GAT/year: 1776 -1.40 2.30 4.20 7.20 12.10 18.20 19.70 19.30 15.60 9.50 3.00 -0.40 9.89

The 2008 record is:

GAT/year: 2008 8.30 8.30 11.10 14.60 17.60 19.90 20.90 20.90 18.80 15.50 11.00 8.80 15.90

Notice that last, whole year global number? We’re already 6 C warmer!

Now look at the post step1 record for 1881 compared to 1971:

GAT in year : 1881 3.50 4.10 6.40 10.90 15.30 18.20 20.20 19.80 17.20 11.80 6.40 3.40 11.43

GAT in year : 1971 3.60 6.20 8.20 12.90 16.50 19.30 20.90 20.70 17.90 13.90 9.50 5.60 14.10

According to this, we’ve warmed up 4.5C since 1881 and the 1971 record above was a full 2.7C warmer than 1881. But I thought we were freezing in 1971 and a new ice age was forecast?!

Now take a look at January. No change from 1881 to 1971 (well, 0.1 C), but February was up 2.1C, March 1.8C, December 2.2C. And the delta to 2008 is a whopping 4.8C in January and 5.4C in December, yet July is almost identical. Granted, picking one year to compare to another is a bit of a cherry pick, even though these were more or less randomly picked. (There are “better” and “worse”: 1894 was MINUS 2.4 C in January.) But even with that, the “globe” seems to have gotten much, much warmer during the Northern Hemisphere winters. Yet not in summer.

Somehow I suspect we’re seeing a mix of: exit from the LIA in a record that is mostly focused on N. America and Europe; any AGW being substantially in winter in the N.H. and not really doing much for summer heat (if anything); and potentially some kind of bias in the code or temperature recording system that has been warming winter thermometers (heated buildings nearby, car exhausts, huge UHI from massive winter fuel use today vs. a few wood fires 100+ years ago). (Update: see the cold winter location thermometer deletions under the “GHCN global analysis” link at the top.)

I’ve seen nothing in the AGW thesis that would explain these patterns in the data. Certainly not any “runaway greenhouse” effect. The summers are just fine…

So I’m going to dredge through the buckets of “stuff” my new toy is spitting out, and spend a while thinking about what would be a good article to make from this… and do a bit of a code review to make sure I’ve got it right. In the mean time, enjoy your balmy winters ;-)

(And if Anthony would like a copy of the ported GIStemp to play with, well, “Will Program for Beer!” ;-)

E.M.Smith (03:59:22) :

Hmmm…. A bit further pondering….

Does anyone have a graph of S.H. thermometer growth over time? It would be a bit of a “hoot” if the “Global Warming” all came down to more thermometers being put in The Empire in Africa, Australia, et al., and then the Soviet Union dropping Siberia out, in large part…

Could GW all just be where in the world is Carmen Sandiego’s Thermometer?

8-)

END QUOTED POSTINGS.

(I’ve put an update or two in the quote, so it’s not exactly a quote anymore, but 99+% the same.)

Geek Corner starts here: technical stuff and programmer topics.

So at this point I have a working GIStemp port, a clear idea how to get it to “play well with littleendian hardware” and some early clues as to where the warming is coming from (and it isn’t CO2, given the summers are not getting hotter…)

I’ll probably take a break from GIStemp for the next week (or at least ramp down the effort a little, I could use some more sleep 8-) and get back to making some weekly stock market postings. Time to make some more money (my kids need tuition this month ;-)

UPDATE 2: As of July 22, 2009:

I’ve gotten all the code up to STEP4_5 to run to completion. The input data files for STEP4 do not have a location called out in what passes for documentation, so I’m on a bit of a treasure hunt to get them. I’ve found SBBX.HadR2, but not the monthly update files yet. So I still can’t run that step to see what happens.

A note on hardware:

I decided to haul an old box out of the garage for this project. I didn’t care how long it took to run. Mostly I just wanted a dedicated LINUX box that I could “blow away” at any time if I needed a different release, port, or whatever. (I have several LINUX releases on my bookshelf along with a couple of BSD copies, Wasabi, and more…). My goal was mostly just to get through the compile step. Given the small size of the code, I knew that would be “doable” on any size box. I can now also say that the code runs just fine on my box as well. What box?

It started life as a “white box” 486 machine some decades ago. About a decade (plus?) ago it got a motherboard upgrade. As of now, it runs an AMD 400 MHz processor. The memory on that board is a mix. It was from a transition period when both SDRAM and old SIMM memory sticks were supported on one board. Right now I have some of each: 64 MB of 100 MHz SDRAM and 48 MB of slow SIMMs. The Operating System is RedHat 7.2 Linux (though most any Linux or Unix ought to be fine). This was just what was on it when pulled from the garage… The disk is a 10 GB IDE disk, of which GIStemp sucks up about 1 GB.

Most of that is for data. GIStemp makes a copy or two of the data at each step, with minor modifications, and leaves some of them lying about. The code itself is only about 7000 lines after allowing for duplicate copies and “hand tools” that are not part of the normal flow of execution.

A fast disk is much more important than a fast processor. Much of the time the machine is I/O limited, especially in the Python steps. The only time I saw a significant “hardware limitation” was during the make of the “gcc” C compiler chain that I had to do to get the 4.x release level of libraries needed for the g95 FORTRAN compiler port. (If I were running a newer Linux, they would already be there… I just figured a gcc build was less work than a new Linux install.) Watching “top” and its report of swap file usage showed that 256 MB of memory would be the biggest performance improver. The actual execution of GIStemp runs just fine on the box “as is”, taking a few minutes per step, mostly shoveling the 100 MB or so of data into Yet Another Copy…

So if you decide to run this puppy, any old PC will do; fast disk drives (SATA on a newer box) and at least 256 MB of memory matter more than the CPU speed.

I’ll be putting together a posting on what was done for the port, so anyone who would like to know exactly what was done to make it go can either wait a bit, or ask questions below. Also, if anyone wants a copy of the code, I’m happy to make it available. It will get a bit “cleaner” over the next week or two, but if you want it Right Now, just tell me where to leave a “tarball”… ;-)

UPDATE: As of July 21, 2009, I have managed to get GIStemp to compile on a Linux box! I’ve added “Makefiles” for all the steps and moved much of the code into a src/bin structure (it had been compiling code in-line in scripts, running it, then deleting the executables).

I’ve also started to map the “data flow” as it moves from file to file to file to file… A silly way to do a “data structure”, rather reminiscent of how it was done in the 1960’s with tape drives holding batch data. At any rate, the first section is mapped (with more to follow) here:

The Data Go Round

The bottom line is that the “data structure” is still primitive, but the program and data file locations and overall operational flow are now much cleaner: source code (things like sorts.f) lives in separate directories from where it writes its scratch / work files, and there is a clean way to “make” the executables (things like sorts.exe) one time rather than doing it again every time you run the system. Also, where there was potential for ambiguity between a script name and an executable from a FORTRAN program, I’ve changed it so that the script ends in .sh as a qualifier. For example, zonav.f is the FORTRAN source that produces zonav.exe, which has a “wrapper script” that runs it named zonav.sh. The *.f files live in a directory named “src” while the *.exe files live in a directory named “bin”, and the *.sh files come in two flavors: the “driver scripts” for a single “STEP”, which are left in the STEPx directory, and shared scripts, which I’ve put in the “src” directory.

You can now look at the code and have a decent clue what’s going on! For now, each FORTRAN and Python program is basically exactly as written by GISS. I’ll be making some minor changes as I get steps working and test them (things like having the “work” files created in the work_files directory, rather than ‘wherever’ and then moved to the work_files directory when done). I will NOT be making any changes to speak of to the logic or processing of the code. Just cleaning up where it puts things, not changing what it does to them.

Once it’s running, I intend to follow a USHCN site as its data flows through the code and see what happens to it, Step by Step…

This all sounds more complicated than it is. For one thing, moving to a source directory means that the programs shared by STEP3, STEP4 and STEP5 now exist as a single copy rather than as duplicates. It also means that all the “compile foo.f, run foo.exe, delete foo.exe” text that was scattered around all the scripts is replaced with a couple of “Makefile”s: one for the initial step of STEP0 and one for STEP2, STEP3, STEP4_5 combined. (Sometime after I’ve gotten STEP0 tested, I’ll combine the two Makefiles…). The STEP1 Python step has a complicated make system built into it already. I may do something with it, but it seems to work OK for now.

The only real “bugs” I’ve run into so far are that, to get things to compile, I had to remove a “feature” they were using in about half of the programs. They would declare an array and assign data values to it in one step. That is a non-standard extension supported by some FORTRAN compilers, but not by g95. So I had to replace those lines with ones a bit more like:

INTEGER FOO(2)
DATA FOO /1, 2/

(where they had the non-standard one-liner: INTEGER FOO(2)/1,2/ )

There were also a couple of programs that complained about data type mismatches in passed parameters. That can be a “bug” or it can be a “neat trick” depending on what the programmer intended (though it is always poor style…). As I run those programs, I’ll evaluate whether those were bugs or features ;-) IIRC, there were three of them. Two involved print steps, so I’m not too worried. One was a REAL passed to an INTEGER data type in a subroutine. That’s more worrisome and could be a significant bug. We’ll see.

Oh, and in most cases I removed the ksh directive at the top of the scripts (most folks don’t have ksh). A couple of scripts don’t run on bash or sh, so I’m going to be fixing them as I do my “debug and test”.

So, at any rate, it’s time to “pop a cork”, since I now have a significantly cleaned up AND COMPILED version of GIStemp ready to run and test!

https://chiefio.wordpress.com/2009/07/29/gistemp-a-cleaner-approach/

As parts are shown to work correctly, I’ll just use them as is. If a part is broken (due to the compiler differences, or ksh) I’ll re-write it as needed, perhaps even into C. Just about everything on the planet has a C compiler on it… (It was modestly annoying to get and build all the things needed to put a FORTRAN 90 or newer compiler up on Linux…)

At any rate, if you are thinking of trying to make GIStemp go, holler at me first. I’m happy to give folks what I’ve got to help them along. When I have something that works more or less end to end and can be shown to be essentially what GISS intended, I’ll publish it somewhere (need to find a place to put a tarball…). Until then, it will have to be “upon request”.

So now you know where I’ve been the last couple of weeks ;-)

END UPDATE.

Every journey begins with but a single step. This page is it for GIStemp deconstruction.

At present this is a work in progress. As I get through a chunk, I will update this persistent top level page with the appropriate links. It is likely to take a few months.

GIStemp has 6 formal steps (named 0 to 5), each run by a top level script. In the sections on each STEP, I go through that top level script in an overview page. That overview page will have links to the source code and commentary for each program inside that step. As these overviews get done, and as the code pages fill out, links will be added to fill in this structure.

Right now the biggest hole is STEP2, but I hope to have something there soon. STEP1 source code is up, but analysis is lacking for now.

At present, I’m nearly done with STEP0, and I’ve started on STEP1. (That sounds much more impressive when you realize that step 4 and step 5 are in one directory, STEP4_5, and they (it?) are really just a couple of small new programs plus some runs of the same code as STEP3, so there’s roughly STEP2 as the only “big chunk” to go.) With Minus One, Zero, and One done, that makes it about half way already…

STEP3 and STEP4_5 have had the scripts documented and the source code is up, but I have not yet done the FORTRAN deconstruction.

So please expect that this shell will fill in over time. There is only one of me doing it part time for free, so the speed will not be what you might like.

You are welcome to help make it go faster by contributing time, coding skill, commentary, or even just beer. After all, my present motto is “Will Program for Beer!” (Hey, everybody needs a motivator…good management just chooses what they know will work; and since I’m management as well as grunt on this project…)

General Overview Steps

So how big is this puppy and where can I download a copy? What is the general impression of it?

For a bit more detail on the source download and a peek at the “README” file that comes with it, gistemp.txt, we have a starter peek.

And if you would like, you can look in a bit more depth at the ghcn data formats.

General Issues

A list of things I’ve found that make me wonder.

First up, the issue of GIStemp cutting off data at 1880 and using an odd “baseline” period of 1951 to 1980. Is this a Cherry Pick? Were these dates picked deliberately to make warming trends look bigger than they really are (or to fabricate the trend entirely)?

Then there is the issue of false precision. How can you calculate 1/10th of a degree C from data in whole degrees F? Mr. McGuire would NOT approve!

(Please note: to all the folks who wish to run off to the central limit theorem and how an average of a large number of values can have a greater precision: all such discussion belongs under the “Mr. McGuire” thread. It is there already. Further, GIStemp does not do that. NCDC in GHCN averages exactly 2 things: the daily MIN and MAX. Then it takes those daily averages (at most 31) for the month and averages them together. The average of up to 31 daily averages of the daily MIN and MAX is not the monthly mean (though everyone treats it as such). The “problem” is in assigning 1/100 F precision to that as the monthly mean when it: A) isn’t the monthly mean, and we have no error estimates; and B) is calculated in 2 steps with no more than 31 values at any one point. No law of large numbers need apply here. In GIStemp STEP0 this is converted to C one value at a time, so again, no ‘law of large numbers’, just a “monthly mean temperature” measured in 1/10 C that isn’t an accurate monthly mean. This is carried through all the other homogenizing, splicing, UHI adjusting, etc. steps. The only time a large number theorem approach might apply is the final pre-anomaly step, but even there it looks like each box is calculated, then the global mean is calculated from the boxes. So please, leave the central limit theorem stuff out of GIStemp… look at how the data are actually processed, not at theoreticals.)
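To make that two-step averaging concrete, here is a tiny Python sketch with made-up whole-degree-F readings for a short month:

# Hypothetical whole-degree-F daily readings (made up for illustration):
daily_min = [41, 40, 38, 42, 45]     # really up to 31 values per month
daily_max = [61, 58, 57, 63, 66]

# Step 1: each day's "average" is just (MIN + MAX) / 2.
daily_avg = [(lo + hi) / 2.0 for lo, hi in zip(daily_min, daily_max)]

# Step 2: average the (at most 31) daily averages. This is NOT the true
# monthly mean, and with so few values no law of large numbers applies.
monthly = sum(daily_avg) / len(daily_avg)
print("%.2f F" % monthly)   # "51.10 F": 1/100 F precision the data never had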

And then there is the issue of trying to dig a global trend out of a data series that only goes back a few years for most of the planet. Maybe you can get a 50 year trend for America, but not for most of the world. We just don’t have the data for the time period needed.

The Code – Step by Step, Inch by Inch, Slowly He Turns…

Step Minus One

OK, you need to get the data and there is a pre-process to sort the Antarctic data if you get a fresh copy.

Step Zero

So how about those input files scattered about?

Once you have all the data and files, STEP0 processing does what again?

Step One

What does STEP1 look like in an overview sense?

This step uses Python, but the Python programs call a library of C functions. These come in two pieces: Monthlydata and Stationstrings.

Step Two

What does STEP2 look like in an overview sense?

This step has several sub-scripts and FORTRAN programs.

Step Three

What does STEP3 look like? An overview, with mostly just source code listings right now.

Step Four_Five

Some folks have asserted that since GIStemp uses an anomaly map of sea surface temperatures to adjust the anomaly map it has already computed internally, this means that GIStemp “uses satellite data”. After chasing this down for a while, I came to the conclusion that this stretches the truth quite a bit. Yes, it uses a partially satellite-derived SST anomaly map as input to STEP4, but that isn’t quite the same as using direct satellite data.

What does STEP4 look like? An overview, with mostly just source code listings right now.

What does STEP5 look like? An overview, with mostly just source code listings right now.

Related Websites

To Be Done: Add entries.
