If you put aside the trolling and flaming (and accept that someone can hold right and wrong views on different subjects), then ESR’s controversial blog post on Rust and its follow-up make a lot of sense.

The distilled point is this: a comprehensive standard library is more practical than a proliferation of community packages.

This is because of the discovery cost of selecting an appropriate package. It’s not specific to Rust; it applies to any language where the standard library is small.

To illustrate the discovery cost, I’m going to run through the package selection process in Rust/Node/Clojure/small-stdlib language du jour:

Prerequisite for beginners: find out what the package manager is and how to use it properly.

Search for keywords. Let’s say I’m searching for an “antigravity” package. (XKCD fans will know where I’m going with this.)

Whittle down the results to a shortlist using metrics like popularity. Imagine we end up with three: “antigravity-js”, “floaty” and “upfall”.

Evaluate the leading candidates. Is the license compatible? Do they support your target platform/runtime? Are they actively maintained? Are they stable? Do they have the features you need? What’s the documentation like? Are there any security concerns? What about their dependencies? In our example perhaps antigravity-js is GPLv3 and we intend to distribute proprietary blobs, so that’s a no. Floaty hasn’t seen any Git commits for 6 months, so that’s questionable. So we’ll take upfall, even though we actually liked its API design least. A typical engineering compromise.

Install your best pick. Save to package.json or whatever. Make a call on whether you need to shrinkwrap it or even preserve a local copy.

Start coding. Tomorrow. Because you probably ran out of time doing due diligence today.
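Using npm as a concrete instance, the dance above looks roughly like this (the package names are the made-up ones from our example, so these exact commands won’t find anything real):

```shell
# 1. Search the registry for candidates.
npm search antigravity

# 2. Inspect a finalist's metadata: license, last publish, dependencies.
npm view upfall license
npm view upfall time.modified
npm view upfall dependencies

# 3. Install the winner and record it in package.json.
npm install --save upfall

# 4. Optionally pin the entire dependency tree for reproducible builds.
npm shrinkwrap
```

Every one of those steps is a point where you stop to read, compare and decide — that’s the friction.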

Phew. Being a responsible engineer takes effort. Let’s try it in Python.

Prerequisite for beginners: learn how to search the docs.

    import antigravity

Lift off! 🚀

No shortlisting and evaluating and all that malarkey. Inclusion in the standard library guarantees most of those checklist points. Sure, the API may be stodgy, and occasionally there are three roughly equivalent modules to choose from (.NET and Python both have a plethora of XML options), but it rarely goes further than that. There is so much less friction.
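To make that concrete: even for something as fraught as XML, the Python standard library has you covered with zero evaluation overhead. A minimal sketch using `xml.etree.ElementTree` (one of those “plethora” of stdlib XML options):

```python
# Everything here ships with Python itself -- no package selection step.
import xml.etree.ElementTree as ET

doc = ET.fromstring("<pkg><name>antigravity</name><version>1.0</version></pkg>")
print(doc.findtext("name"))
print(doc.findtext("version"))
```

It may not be the slickest XML API ever designed, but you can start using it thirty seconds after deciding you need to parse XML.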

The point isn’t that the standard library module is always the absolute best option available. But it is highly likely to be a good enough option, which is a fair trade against the cost of assessing and choosing among external libraries.

This doesn’t preclude a large and healthy package community. There are plenty of packages that improve on stdlib modules in some way, and some evolve into de facto standards – like requests in Python and nokogiri in Ruby. That’s great! But it would be even better if/when they are incorporated into the stdlib. With an external package there is always friction.

[As an aside: I think this is distinct from the frameworks vs small composable libraries debate. It’s not about the size of the library, it’s about the time and effort of including it.]

I don’t think it’s fair to single Rust out on this though. They’ve only just stabilised the language itself; how could they possibly provide a large stdlib without Google-like levels of engineering resources to throw at it? And they are doing the right thing by “incubating” crates with a view to later including them. In the meantime the crates system seems well thought out and doesn’t add any further unnecessary friction.

So the future looks good for Rust! But for now, a large standard library is an advantage that other languages enjoy. Seems a fair criticism to me.