Over the years, I have used pretty much every kind of computer available, from the Atari to the Amiga to the Mac, the PC, and even Linux boxes. Lately, I've been fascinated by the differences among them, especially the difference between the Mac and a Windows-based PC, since each has its adherents. It's only recently that I finally isolated a factor that I feel is often overlooked but very important: precision.

Actually, I think it is better described as preciseness.

In most cases, it has to do with the sluggish nature of some machines and the resolution of the screen. But it's more than resolution. This reminds me of the various machines that used to have what I call chubby arrows.

The first machine I recall exhibiting this ran Digital Research's GEM, the graphical environment that was systematically destroyed by Microsoft in the early days of the GUI. Looking back on it, I never liked the feel of the system because it had a big, chubby pointer that never gave the user an impression of preciseness. It was annoying.

This was also apparent on both the Atari and Amiga desktops. All of these systems failed, I think, because of this subtle flaw, not because of marketing issues. I know this is a radical concept, but the big, fat arrow killed a lot of systems, because it conveyed an impression of impreciseness and thus undermined confidence in the machine. The effect was subtle, but enough to create doubt.

Precision is always at the forefront of Apple's Mac computers, and it's taken to a much deeper level than on the PC. The file system is snappier, for example, and has been since the first iMac. When you click on a folder, it snaps open quickly. This gives the user confidence in the machine. Responsiveness is just as important as precision.

Years ago, I used to moan and groan about how some programming languages gave the programs they produced a vaguely sluggish quality. I almost made a career out of complaining about a useful little language called FORTH, which for some years was popularized as an easy way to develop programs. To me, those programs always gave off a sluggish vibe that I detested. I didn't realize it at the time, but this was part of the same precision issue.

Twitter is often annoying when it suddenly begins to buffer text input. You type and type, then look up to see that the entry is 20 to 30 characters behind what you are typing, as the letters appear one by one. This induces panic. I never want to see this sort of sluggish imprecision, ever.

There was a lot of this on PCs with the early versions of Windows, and it still exists. I can assure you that the little flashlight that appears all too often on Windows XP is not a confidence builder. You know what I mean: you open a folder, then click on a sub-folder, and suddenly a stupid flashlight appears to indicate that it's looking for something. What's it looking for? Is the memory clogged up? Can the folder not be opened? Is it lost on the disk somehow? What's the problem?

The problem is imprecision.

In this battle, the Mac beats the PC, and everything else, for that matter.