In this post: Craig tries to convince you to treat asymptotic complexity as a relation, instead of as a set.

Awkward-O

Big-O notation is one of my pet peeves. It’s a perfect example of complicating something simple.

Even ignoring the fact that Big-O notation is hard to learn, because it uses totally arbitrary symbols instead of looking like other notation, it is clearly at odds with how people think about complexities. Have you ever seen someone write f(n) = O(n²), as if a function could be equal to a set of functions, instead of f(n) ∈ O(n²)? How about O(n) = O(n²) instead of O(n) ⊆ O(n²)? I think the reason people write these (literally false) statements is because they’re thinking in terms of comparisons, whereas Big-O notation is formulated in terms of sets.

When a person writes O(n) = O(n²), they almost certainly didn’t intend to say “The set of functions asymptotically upper bounded by n is the same as the set of functions asymptotically upper bounded by n².” That’s trivially false. They meant that linear costs are asymptotically upper bounded by quadratic costs, but ended up misusing/abusing Big-O notation instead.

(I suppose I can’t actually speak for “people”, but personally I do think about asymptotic complexity as a relation.)

Given that we’re thinking of asymptotic complexity in terms of comparisons, why not use notation based on that?

Asymptotic Comparison Notation

So we want to represent asymptotic complexity as a relation. How does that work, exactly?

Well, we’ll use the same symbols we already use for relations and comparisons (<, ≤, =, ≥, and >) but slightly modify them to indicate asymptotic-ness. My personal (and arbitrary) preference is to indicate the asymptotic-ness by circling the operator. I represent “asymptotically less than” as ⧀ (a circled <), “asymptotically equal to” as ⊜ (a circled =), and so forth.

Here’s a table that shows each operator beside its English description, its analogue in Big-O notation, and its definition in terms of a proportional limit (for simplicity, assume the limit of f(n)/g(n) as n → ∞ exists, with ∞ allowed as a value):

| Operator | English | Big-O analogue | Limit definition |
|---|---|---|---|
| f ⧀ g | f is asymptotically less than g | f ∈ o(g) | lim f(n)/g(n) = 0 |
| f ≤⃝ g | f is asymptotically at most g | f ∈ O(g) | lim f(n)/g(n) < ∞ |
| f ⊜ g | f is asymptotically equal to g | f ∈ Θ(g) | 0 < lim f(n)/g(n) < ∞ |
| f ≥⃝ g | f is asymptotically at least g | f ∈ Ω(g) | lim f(n)/g(n) > 0 |
| f ⧁ g | f is asymptotically greater than g | f ∈ ω(g) | lim f(n)/g(n) = ∞ |

Do you see how ⧀ has a definition that is the inverse of the definition of its inverse, ⧁? How ≥⃝’s definition is equivalent to satisfying either ⧁’s definition or ⊜’s definition (because it’s greater than OR equal)? How there’s no confusion about what’s lower bounding and what’s upper bounding, thanks to the re-use of existing symbols?
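As a sanity check (not a proof), the limit definitions can be approximated numerically: sample f(n)/g(n) at large inputs and see where the ratio is heading. The function name and thresholds below are my own invention, a rough sketch rather than a rigorous decision procedure.

```python
def asymptotic_compare(f, g, n=10**6):
    """Heuristically classify the asymptotic relation between f and g
    by sampling the ratio f(n)/g(n) at two large inputs.

    Returns '<', '=', or '>' (circled, in this post's notation).
    This only *suggests* the limit of f(n)/g(n); it is not a proof,
    and slowly diverging ratios (e.g. log factors) can fool the thresholds.
    """
    r1 = f(n) / g(n)
    r2 = f(n * 100) / g(n * 100)      # sample further out to see the trend
    if r2 < r1 / 10 and r2 < 1e-3:    # ratio shrinking toward 0
        return '<'
    if r2 > r1 * 10 and r2 > 1e3:     # ratio blowing up toward infinity
        return '>'
    return '='                        # ratio looks like a positive constant

print(asymptotic_compare(lambda n: n, lambda n: n**2))     # expect '<'
print(asymptotic_compare(lambda n: 3*n + 7, lambda n: n))  # expect '='
print(asymptotic_compare(lambda n: n**2, lambda n: n))     # expect '>'
```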

Doesn’t that just… make sense?

Examples

Here are a few examples of using the asymptotic comparison operators, along with the limits that justify them:

Buying a computer with a faster processor doesn’t affect the complexity of an algorithm: c·f(n) ⊜ f(n), where c > 0, because lim c·f(n)/f(n) = c and 0 < c < ∞.

Linear algorithms are asymptotically less costly than quadratic algorithms: n ⧀ n² because lim n/n² = lim 1/n = 0.

Running two algorithms, one after the other, is at least as asymptotically expensive as running one of them individually: f(n) + g(n) ≥⃝ f(n), where f and g are positive, because (f(n) + g(n))/f(n) = 1 + g(n)/f(n) ≥ 1, so the limit of the ratio is at least 1 > 0.

Logarithmic algorithms are asymptotically less costly than polynomial algorithms (using L’Hôpital’s rule): ln(n) ⧀ nᵉ, for any e > 0, because lim ln(n)/nᵉ = lim (1/n)/(e·nᵉ⁻¹) = lim 1/(e·nᵉ) = 0.

Chaining: the operators compose like ordinary comparisons, e.g. lg(n) ⧀ n ⊜ 2n + 5 ⧀ n² ⧀ 2ⁿ, because each adjacent ratio has limit 0 (or a positive constant, in the ⊜ case).
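The limit claims above can also be eyeballed numerically, under the assumption that sampling the ratio at growing n reflects its limit (a sanity check, not a proof):

```python
import math

def ratio_trend(f, g, ns=(10**2, 10**4, 10**6)):
    """Sample f(n)/g(n) at increasingly large n; the trend hints at the limit."""
    return [f(n) / g(n) for n in ns]

print(ratio_trend(lambda n: n, lambda n: n**2))
# heads toward 0, consistent with n ⧀ n²

print(ratio_trend(lambda n: 5 * n, lambda n: n))
# stays at the constant 5, consistent with 5n ⊜ n

print(ratio_trend(lambda n: math.log(n), lambda n: n**0.5))
# heads toward 0, consistent with ln(n) ⧀ nᵉ for e = 0.5
```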

Summary

Comparison operators are a natural fit when thinking about asymptotic complexity. Personally, I use the existing comparison operators and circle them to indicate the comparison is asymptotic.




Twisted Oak Studios offers consulting and development on high-tech interactive projects. Check out our portfolio, or give us a shout if you have anything you think some really rad engineers should help you with.


