Thatcher's rants and musings 2009

rants index: current | 2016 archive | 2015 archive | 2014 archive | 2013 archive | 2011 archive | 2009 archive | 2008 archive | 2007 archive | 2006 archive | 2005 archive | 2004 archive | 2003 archive | 2002 archive



# permalink

26 Aug 2009

Hot-rodding the Voloci

I did a fun basement project this summer, taking my Voloci electric motorbike and upping the voltage to 48V, using a new and great LiFePO4 e-bike battery.

Read all about it.

# permalink

22 June 2009

Misc

Amazing game: http://windosill.com

I'm down in the basement listening to Aerosmith's "Get Your Wings". I've been listening to MP3s through headphones for so long, I'd forgotten how awesome classic rock sounds coming off vinyl into a real amplifier through real speakers. This particular album plus the speakers are probably the key ingredients. I'll find out a little later when I blast some MP3s.

# permalink

14 June 2009

The A.G's and Bob Edwards in Flint MI

Shameless nostalgia but I find it pretty funny. Back in 1992 we pulled into Flint MI and encountered one very memorable Bob Edwards. By some stroke of fate the moment was preserved, and now I share it with you:

# permalink

20 April 2009

Movie Awards

I haven't done movie awards since early 2007 so this list covers movies I saw in 2007 and 2008. I only give awards for movies I saw in a theater.

Best Picture -- The Wackness

Second Best Picture -- Superbad

Third Best Picture -- In Bruges

Judd Apatow Honorable Mentions -- Knocked Up, Role Models, Pineapple Express, Forgetting Sarah Marshall

Best Big Budget Hollywood Superhero Movie -- Iron Man

Best Big Budget Hollywood Sci Fi Movie -- I Am Legend

Best Heist Movie -- The Bank Job

Best French Thriller -- Tell No One

Best Hollywood Movie With An African Setting -- Blood Diamond

Best Jerry Bruckheimer Production -- Deja Vu

Oscar Winner That For Once Is Watchable -- Slumdog Millionaire

Worst Waste Of Talent -- The Ex

Cheesiest Movie -- Vicky Cristina Barcelona

Most Tedious Art Flick -- Rachel Getting Married

I think I saw all the worthy Hollywood product in 2007 and 2008. A bold claim, I know. Tell me what I missed and prove me wrong.

# permalink

8 April 2009

Google Earth Tours

This is cool. We just released an updated Google Earth Browser Plugin that supports KML tours. Below is a re-enactment of the Flight 1549 crash, with the cockpit and tower audio synchronized with the flight path of the plane. The tour was put together by Jeral Poskey (blog post). Unlike a video, you can move the viewpoint around while the tour is playing.

(Note: if you experience choppiness in the audio, pause the tour for a minute or two to let the audio load in, and then restart the tour.)

Other good tours include Ira Flatow doing Introduction to Mars and Bill Nye's Mars Exploration.

# permalink

23 March 2009

Keeping Score

I don't think this is common knowledge, probably because nobody outside engineering cares, but engineers are a competitive bunch. We tend to measure our skills and accomplishments against other engineers, and rank each other, implicitly and sometimes explicitly.

(Aside: when I say "engineer", think "programmer". I always used to think it was a little wrong to refer to programmers as engineers, but my current employer has infected my brain with the term and now I hardly notice.)

Anyway, although we like to rank ourselves against each other, I think we often get the criteria wrong. There is definitely a sense of someone's ability or capacity that comes from being exposed to their work, that would be hard to quantify but is very real, and that most programmers would roughly agree on. However, while ability and capacity are interesting, I don't think they are necessarily good measures of utility. I've seen some really stupid counter-productive stuff done by very brilliant people. Likewise, some of the most effective things are done by programmers of seemingly modest skills.

Is there some objective metric by which we can rank the effectiveness of a programmer? It's a complicated multidimensional problem -- there are so many competing ways to evaluate a program. There's time to market, market penetration, revenue, elegance, performance, user-friendliness, etc -- it doesn't seem like there could be a good metric that fairly takes it all into account.

But, I have such a metric. It's very simple:

Your worth as a programmer is proportional to the total number of processor cycles consumed by your code.

It probably sounds crazy to you, but I think it's great. My argument is that if a program is useful, people will choose to run it. The more useful it is, the more people will run it, and the more often they'll run it. People "pay" to run a program by spending their finite CPU cycles on it -- it's a competitive situation, where self-interested humans make the choices.

Objection: But that rewards slow bloatware! Response: No it doesn't. All other things being equal, slow code displaces fast code only if performance doesn't matter; if performance is at all important, fast code displaces slow code. Look at the efforts that go into finding and optimizing the parts of a program that consume the most cycles. If the cycle-hogs are useless overhead, they get eliminated altogether. If the cycle-hogs have a core of irreducible useful work, the bloat may be removed, but the core still gets executed, presumably a lot. If performance isn't important, then speed is not the right metric anyway, and the cycles go to the program that solves the most pressing real-world problem earliest or quickest or cheapest or most conveniently or whatever.

It may be a little unfair to compare programs across genres, like, why should TurboTax score lower than Windows Solitaire, just because tax season comes but once a year? On the other hand, what better way to balance the relative value of bean-counting against time-wasting, than by comparing cycles executed?

It's also cruel to major efforts that resulted in low usage (a scenario close to my heart). To that I say -- utility doesn't care how hard you work, it cares how useful your work is. If people don't use it, then it's not useful. You wasted your time. Sorry.

Anyway, you can work out the other implications for yourself. Put your objections in the comments and I'll see if I can rebut them.

# permalink

17 March 2009

Old TectrixVR Press

Happy St. Patrick's Day. In honor of the holiday, I scanned some more junk from the basement, this time some press articles about Tectrix from 1994.

# permalink

1 March 2009

HTML 5 Awesomeness

Dean McNamee recently clued me into the potential of the <canvas> tag for doing 3D graphics in pure HTML + Javascript, without any plugins. Chrome's V8 Javascript engine really helps unlock it, because you can afford much more Javascript math. I just whipped up an example of perspective-correct texture mapping. This is just the tip of the iceberg!

# permalink

31 January 2009

Two Bits

I just finished reading Two Bits: The Cultural Significance of Free Software by Christopher Kelty (see http://twobits.net). It's an academic anthropology book about (what I would call) open source software development. For a long time, on the basis of peripheral contact with my circle of online buddies, Julie (a professional Historian/Anthropologist) has said that somebody needs to do an anthropological study of the world of computer geeks, and what do you know, here's one. Julie bought it for work, read some, then passed it off to me for my professional geek opinion.

It was an interesting read. I'm not an anthropologist and some of the verbiage went right over my head. But I did learn some things from it:

The concept of a "recursive public". Kelty apparently coined this phrase, and makes a big deal of it, though I'm still not 100% sure I know exactly what he means by it. I think he means that the actual infrastructure of the world of free software is self-created. I.e. participants communicate using email and source-control systems and web servers/browsers and so on, which are themselves artifacts of open source software development (some of them, anyway). This is distinct from, and deeper than, mere social conventions/rules created and observed by a community. It's possible I'm distorting his intent though, because he gets a bit theoretical when he talks about it, and applies it to things that I don't think quite fit my definition.

His observation that geeks fall on a spectrum ranging from "Polymath" through "Transhuman". Polymaths, according to Kelty, have wide technical and non-technical interests ("avowed dilettantism") and tend to have a holistic, humanist outlook, and see technology as an opportunity to influence the non-technical sphere. Transhumanism "focuses on the power of technology to transcend the limitations of the human body as currently evolved." Transhumans tend to believe in the "singularity", "the point at which the speed of technical progress is faster than human control over the course". I don't really buy that this is the principal axis of geek differentiation (and also, Transhumans give me the creeps), but it's an interesting characterization, that pins down and labels some concepts I was vaguely aware of.

Good accounts of the evolution of Unix, including some stories about Ken Thompson and the Lions Book and stuff like that, but also up through the whole mess in the 80's and early 90's of the competing corporate consortiums that attempted to unify Unix, which eventually got overwhelmed by Linux. He also explains the GNU prehistory, including various versions of Emacs and how that led to Stallman flipping his wig and starting the GNU project. I had a vague idea of some of these stories, but Kelty gives a very clear account with many details that were new to me.

Some discussion about whether open source communities have "norms", in the anthropological sense. I don't really know what this means, but Kelty makes it sound important and interesting. There's some discussion about how the Scientific Method exists more in theory than in practice, in contrast to the methods of open source, which exist very concretely in practice and perhaps less in theory. Anyway, I can't really do justice to what he says but it might be significant.

A funny anecdote about Eric Raymond, who famously enlisted some concepts from Anthropology in his essay The Cathedral and the Bazaar. I always suspected Raymond was mostly full of crap, and Kelty, a genuine professor of Anthropology, comes through, as he contemplates their incipient meeting: "Visions of a mortal confrontation between two anthropologists-manqué filled my head. I imagined explaining point by point why his references to self-organization and evolutionary psychology were misguided, and how the long tradition of economic anthropology contradicted basically everything he had to say about gift-exchange." It gets better; the whole anecdote is worth reading.

# permalink

27 January 2009

More Sinkhole albums free online

I finally got around to compiling the content and getting permission, and have posted Sinkhole's other two albums under the Creative Commons Attribution 3.0 license.

Space Freak is our second full-length, and Retrospectacles is our fourth, a combination EP/Greatest Hits record. These are the two that were released on Doctor Strange Records, and personally, I think they're our best two albums, although other members of Sinkhole might disagree.

Turn it up man!

# permalink

26 January 2009

More Google Fanboyism

We had a plumbing scare in Oak Bluffs due to an empty oil tank and cold weather. Our plumbing was fortunately saved in the nick of time by hero Mark R.

I had been thinking for a while about setting up some kind of remote temperature surveillance to provide peace of mind during the cold months, and this incident finally caused me to do something about it. I bought one of these guys, along with a temperature sensor. I figured I could find a way to have it email my web server, or I could poll it via HTTP or something like that.

The device came, and I plugged it in etc. There's a router-like web interface that has graphs and such, which is fine, except I wasn't too eager to open a hole in my firewall and expose this thing's admin interface to the Internets. As I feared, the SMTP options weren't quite compatible with my constraints; the SensorProbe2 can do authenticated SMTP, but not the POP-before-SMTP scheme that my web host uses, and not the elaborate stuff that the gmail relay would need.

I may live to regret this, but I ended up using SNMP to poll the temperature value. I have a cron job set up on my web host that polls the value, keeps a temperature history, and writes some html and xml to show the history & status. I'm using the Google Chart API to draw a graph, like so:

The live html chart is here and the xml version is here. What good is the xml? I have it on the top of my iGoogle page, so above the NYC and Oak Bluffs weather, I can see the actual temperature in our house. So can you, for that matter.

Currently, the thermometer is actually showing the temperature near my window in NY, but someday soon it will be in MA.

# permalink

9 January 2009

Google FriendConnect

I've wanted this for like a million years: a web service that lets you embed comments in an arbitrary web page, and handles all the data storage and other messy stuff like sign-ins and whatnot.

Guess what? Google has come through with Google FriendConnect.

To try it out, I did some furious hacking on Textweb. It didn't take very long; kudos to the FriendConnect team for making it almost completely painless (so far). Anyway, you can log in using a Google or Yahoo account (plus some others), and post comments.

# permalink

3 January 2009

Short String Optimization

Happy New Year!

Won Chun and I came up with a clever way to squeeze an extra byte of storage into the C++ "short string optimization".

Quick review: a vanilla C++ string is a class that contains a string length, a pointer to a memory buffer, and a buffer capacity. The buffer is the sequence of bytes that make up the string, and is compatible with C strings, meaning the last byte is 0. Normally, the buffer is allocated on the heap, and is resized on demand when the string is changed, so that the buffer capacity is always large enough to hold the string. On a 32-bit machine, the C++ class instance itself (not including the allocated buffer) normally is at least 12 bytes: 4 bytes each for length, buffer* and capacity.

The Short String Optimization adds a small byte buffer directly in the string class. If the string length is short enough, the bytes are stored directly in the string instance itself instead of allocated on the heap. There are different ways of doing this, but one of the better ways is to overlay the local buffer on top of the buffer* and capacity fields that are only needed by the heap buffer. E.g.:

class string {
    ...
private:
    union {
        struct {
            unsigned char local_length;
            char local_buffer[15];
        } local_data_;
        struct {
            unsigned char flag_value;  // set to 0xFF when using heap_data_
            char* buffer;
            size_t length;
            size_t capacity;
        } heap_data_;
    };
};

That gives sizeof(string)==16, the local buffer is 15 bytes, and the max local string length is 14 (i.e. not counting the terminating 0 char).

(In practice, STLport and some other STL libs are less size-conscious than that, and don't reuse fields to that extent.)

Anyway, we think strings should be as efficient as possible. So we improve this slightly, making all 15 local buffer bytes usable for string data (max local length 15 instead of 14). We do it like so:

class string {
    ...
private:
    union {
        struct {
            unsigned char local_buffer[15];
            unsigned char fifteen_minus_length;
        } local_data_;
        struct {
            char* buffer;
            size_t length;
            size_t capacity;
            int dummy;
        } heap_data_;
    };
};

Our clever twist is to store (15 - length) when using local data, and put that byte at the end of the local struct. So, when length == 15, (15 - length) == 0, and that means the fifteen_minus_length byte serves double duty as both the length field and the terminating 0 byte! All 15 local_buffer[] bytes are used for string data! When not using local_data_, we fill fifteen_minus_length with a flag value like 0xFF, and the heap_data_ fields are active.

This trick is implemented and tested in tu_string here and it seems to work fine. Also it's parameterized so that you can safely try different sizes for the local buffer, and it should compile and work correctly on 64-bit machines, etc.

Hot!


