In case you are wondering why I’ve written the following tweet:

let me assure you I've already fixed all my issues, but I'm still super upset about how lightly TC39 changed the behavior of a standard that had been shipped for 2+ years, without a single warning to let developers migrate.

This is, long story short, how a small change can become a nightmare.

What happened

We all know strings are immutable primitives in JavaScript, and once we use a string as an object key, it won't ever be collected or removed unless we explicitly drop it. Developers have usually never cared much about cleaning up strings, and to be honest, it's not so common to have many strings in your code, or strings that need to be garbage collected.

A similar concept was standardized in ECMAScript 2015 for template literal objects:

Each TemplateLiteral in the program code of a realm is associated with a unique template object that is used in the evaluation of tagged Templates (12.2.9.5). The template objects are frozen and the same template object is used each time a specific tagged Template is evaluated.

That means that if two template literals contain the same static parts around their interpolations, they are considered identical, the same way two strings with the same content defined in two different places are identical.

```js
function tag(template, ...values) {
  // by ES2015 specifications
  // the template is a frozen Array,
  // unique per each unique static content
  // represented by any template literal
  return template;
}

// same way 'a' === 'a'
tag`a` === tag`a`;           // true
tag`1${2}3` === tag`1${5}3`; // also true
```

The little snippet above produces true in Chrome < 66, Safari, Edge, and NodeJS 6 to 9, but it results in false in NodeJS 10 and Chrome 66+, and it might return false in the next version of Safari too.

On top of that, Babel < 7 will return true, but Babel 7+ will return false …

… because screw you, feature detection …

Firefox < 55, as well as (older?) TypeScript-transpiled code, will always return false, because they implemented the unique nature of each template object incorrectly.
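To see why Babel 7+ yields false across call sites while repeated evaluations of the same literal still match, here is a simplified, hypothetical sketch of per-call-site template caching, in the spirit of what transpilers emit post-change (the helper name and shape are mine, not Babel's actual output):

```js
// hypothetical helper, loosely modeled on transpiler output (not real Babel code)
function taggedTemplate(strings, raw) {
  return Object.freeze(
    Object.defineProperty(strings, 'raw', {
      value: Object.freeze(raw || strings.slice())
    })
  );
}

// each call site caches its own template object in its own closure
let _t1, _t2;
const siteA = tag => tag(_t1 || (_t1 = taggedTemplate(['a'])));
const siteB = tag => tag(_t2 || (_t2 = taggedTemplate(['a'])));

const id = template => template;
siteA(id) === siteA(id); // true: same call site, cached object
siteA(id) === siteB(id); // false: identical content, different call sites
```

The content-based cache the old spec mandated is simply gone: identity now follows the call site, not the static parts.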

If you were trying to feature-detect that behavior, your detection now falls into the fallback path on the latest Node and Chrome too.

The feature detection should now look more like the following:

```js
function tag(template, ...values) { return template; }

function same() { return tag`a`; }

same() === same();
```
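Running both checks side by side gives a quick way to tell which semantics a spec-compliant engine implements; a minimal sketch:

```js
function tag(template, ...values) { return template; }
function same() { return tag`a`; }

// per spec, old and new, the same call site always yields the same object
const sameSite = same() === same();

// identical literals at two different call sites:
// true pre-change, false post-change
const crossSite = tag`a` === tag`a`;

console.log({ sameSite, crossSite });
```

On NodeJS 10+ this logs `{ sameSite: true, crossSite: false }`; on engines with the old semantics, both are true.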

Not a big deal, huh? Well … the fact that I could no longer trust two identical template literals in the same scope to point at the same object made every test of my libraries fail.

Unlike many projects out there, my libraries are 100% code-covered, and that means I have a lot of tests … like, a lot!

If you polyfill the WeakMap your code will break

As things have been since 2015, it made literally no sense to store template objects as WeakMap keys, for the simple reason that they would never have been collected; but since they were unique objects, the memory footprint wasn't too big anyway.

But now things have changed: since template objects can be different even when identical in content, if you were using a Map to store them as keys, your memory consumption might grow without bound, never freeing anything … "how lovely"
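A quick illustration of the Map problem under the new semantics — every call site registers its own distinct key even though the literals are identical:

```js
const cache = new Map();
const tag = template => {
  if (!cache.has(template)) cache.set(template, cache.size);
  return cache.get(template);
};

tag`same content`;
tag`same content`;

// pre-change engines: 1 entry (one shared template object per content)
// post-change engines (NodeJS 10+, Chrome 66+): 2 entries
console.log(cache.size);
```

Since a Map holds its keys strongly and the engine no longer deduplicates template objects by content, a long-lived cache keyed this way only ever grows.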

So now you are forced to use a WeakMap if you want to relate a template object to anything, but there is a little gotcha here: template objects are frozen, and if you polyfill WeakMap by attaching a unique ID to the key object, it will throw once you pass it a template literal transpiled by Babel, because Babel also freezes the object, as per specs.

```js
// a polyfill that easily breaks
const WeakMap = window.WeakMap || function WeakMap() {
  const key = '_' + (Math.random() * Date.now());
  return {
    get(obj) { return obj[key]; },
    set(obj, value) {
      // this breaks with frozen objects:
      // there is no way now to polyfill
      Object.defineProperty(obj, key, {
        configurable: true,
        value
      });
    }
  };
};
```
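The failure mode is easy to reproduce: Object.defineProperty always throws a TypeError when asked to add a property to a frozen (non-extensible) object, strict mode or not:

```js
// template objects arrive frozen, per spec; simulate one here
const template = Object.freeze(['a']);

let threw = false;
try {
  // what the polyfill's set() attempts under the hood
  Object.defineProperty(template, '_id', { configurable: true, value: 1 });
} catch (e) {
  threw = e instanceof TypeError;
}

console.log(threw); // true: the polyfill would blow up right here
```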

One does not simply change a 3-year-old standard behavior

I have always promoted the use of standards because I've always thought they were a safe bet with regard to backward compatibility.

A change like this is the equivalent of changing the output of the ip addr or ifconfig command in a console: something that would break everything, because many people parse that output with regular expressions, trusting it'll always look the same. On the browser side we also have the most convoluted feature-detection techniques ever, which always assume specific behaviors will be consistently truthy or falsy, but I'm now scared to assume anything, because shipped standards might suddenly change without any warning.

Honestly though, what were they thinking? I am one of those few developers who read specs in all their details and triple-check behaviors across every browser and JS engine I can, front end or back end, and now the specs backfire?

If I write code that is supposed to be super robust thanks to its full use of specifications, behind well-defined feature detection, I don't want to fear this kind of change in the future.

A unique template anyway …

For those who don't care about a few extra objects stored in memory, the following solution yields, on every single platform, the same object per unique set of template static parts:

```js
const T = Object.create(null);
const TL = t => {
  const k = '^' + t.join('^');
  return T[k] || (T[k] = t);
};
```
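With that normalization in place (restating TL here so the sketch is self-contained), a tag can route every identical literal, from any call site, back to one canonical object — note that t.join('^') is a pragmatic key and could in principle collide if a static part itself contains '^':

```js
const T = Object.create(null);
const TL = t => {
  const k = '^' + t.join('^');
  return T[k] || (T[k] = t);
};

// hypothetical usage: normalize the template before comparing or caching
const tag = (template, ...values) => TL(template);

const a = () => tag`1${0}3`;
const b = () => tag`1${0}3`;

a() === b(); // true again, on every platform
```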

It's super simple and fast enough, but according to my tests it's about twice as slow as a WeakMap. But remember: WeakMap with these template objects is not possible if you are polyfilling, so welcome to the "spec changed after 3 years" hell I've dealt with for the last 2 days 🎉

My libraries support everything down to IE9 or even IE8, as well as NodeJS from LTS to latest, and since I have no control over the way third parties bundle or transpile my code, I cannot afford these breaking changes.

So please, TC39, don't ever do anything like this again. Thank you!