So, an admission. It’s been a bit since my last update. Turns out that I don't work on this project full-time :)



Anyone who works on hobby projects knows that time is usually the constraining factor when it comes to getting stuff done. Work, fixing things, cleaning the shop, and so on generally get in the way of meaningful progress on SUPAR AWESOME PROJECTS. That's a big part of why I gave myself a decade to get this project done. Oh, I didn't mention that? Yeah, this is a decade-long project. I think I'm on track.

Believe it or not, though, I have made some great progress since the previous update. Among other things, I have the full Simplified Perturbations models running on the INMOS transputer hardware. Last we checked in, I had some stuff running, including decoding the TLEs (Two-Line Elements), but I had yet to show off actual orbital predictions. After some epic hacking sessions, that stuff is now for-real working. I'm going to write those accomplishments up, but for now what I want to talk about is this map stuff.

Two posts ago, I detailed some plans to get a satellite map running on Apple II hardware. The first part of this was getting DHGR (Double Hi-Res GRaphics) bitmaps to display. As in BMP, the kind of file you can output directly from a modern version of Adobe Photoshop here in Twenty Eighteen. That would provide the background map atop which I could display the satellite position. This turned out to be harder than I thought it'd be.



The first barrier to getting DHGR running on the Apple II was the lack of awesome documentation. I mean, it was out there, but for a guy like me, used to including some standard libraries and pushing stuff to a display, it wasn't exactly intuitive. I'll say right now that it took me a LOT of revisions to get it right. I stared at a lot of on-screen garbage trying to puzzle out what was going on.



So close.

So, if you'll excuse a little diversion I want to talk about Apple II graphics.

First, the fact that we can display a 560x192 monochrome bitmap on the Apple II is kind of amazing.

Woz, that is to say, Steve Wozniak, was a brilliant hardware designer. He was working at Hewlett-Packard when he started drawing up the schematics for what would become the Apple I, and he was all about efficiency in design. Woz was absolutely great at making the very most of the fewest ICs possible. It's fair to say that it would be hard to strip much out of the Apple II and still get meaningful NTSC video out of the design. The entire video system is built around the NTSC (and later, PAL) waveform. Woz basically looked at the required video signal and asked, "How can I design a system with minimal cost that displays meaningful graphics from a video buffer?" To be sure, his business partner Steve Jobs was probably totally on board with this cost optimization in the pursuit of producing an affordable computer with maximum sustainability for Apple.

That said, the side effect of reducing hardware complexity was increased software complexity. There was no sprite hardware, and the video modes played epic tricks with the timing of the video signal to produce colors. As a result of this spartan hardware design, the higher-resolution video modes ended up with fairly convoluted memory maps. In a modern frame buffer, there is a nice linear correspondence between a byte's position in memory and its position on the screen. On the Apple II, you find yourself hopping all over the memory map to load an image so that it displays properly. Line 2 starts at a position farther than it should from Line 1, as does each subsequent line, except...