Type systems rock. Even the most primitive type systems, like C's, rock. They mostly just let us automatically allocate enough memory to store things, but that lets us treat the giant expanse of bytes attached to our machines as if it were a mosaic of primitives, structs, unions, and enums. More advanced type systems can help us avoid use-after-frees and out-of-bounds accesses, or enable polymorphism by magically inserting virtual dispatch into our code. Better yet, some type systems can tell us whether code produces side effects, can fail, or may never return anything at all. All of these features are enabled by type systems, but type systems still suck.

One of the first times I realized that type systems suck was actually in school. In C# and Java, arrays are typed when they are created. This means you can have an Object[] or a String[] or a Whatever[] and store the appropriate type in it. While this is cool, it's also lame: what if I have a method that takes an Object[] and I have a String[]? I know that my Strings are also Objects, so it seems a waste to copy those Strings into a new Object[], and indeed both languages let you treat your String[] like it's an Object[]. But now that it looks like an Object[], it's tempting to store objects in there! So what happens if I go to store a Whatever in that Object[]? A runtime error, of course. We can't allow a Whatever into that Object[], because it's actually a String[] masquerading as an Object[]. Since Java has some crazy support for generic type variance, it's actually possible to make your method take a <T extends Object> type parameter and then a T[], et voilà, you've got yourself an array whose items you can use as Objects, but that you can't store an Object in unless you know it's a T. But while that's real cute, you will always be able to write code the old way because of backwards compatibility, and I wouldn't want it any other way. And that's where type systems go wrong: so often types are so tightly linked to the actual implementation of the language that there's almost no way to evolve without breaking something.
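The same covariance hole is easy to reproduce in TypeScript, which makes for a runnable sketch of the problem. If anything, it's worse there: Java at least throws that runtime error when you store the wrong thing, while TypeScript's compiled output doesn't check the write at all. (The variable names here are mine, just for illustration.)

```typescript
// Arrays are covariant in TypeScript too: a string[] is assignable to a
// (string | number)[], much like String[] to Object[] in Java.
const strings: string[] = ["a", "b"];
const broader: (string | number)[] = strings; // same array, broader type

// This write type-checks against `broader`...
broader.push(42);

// ...but where Java would throw at runtime, compiled TypeScript just lets
// the number sit in an array the checker still swears is all strings.
const last: string = strings[strings.length - 1];
console.log(typeof last); // "number", despite the `string` annotation
```

Note that nothing here ever goes through `any`; every line type-checks, and the `string` annotation on `last` is simply wrong by the time it runs.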

Now, on the other hand, there's TypeScript. It's a thing of beauty and it sucks. When I'm using TypeScript I feel like a JavaScript ninja! No more searching "mdn setTimeout" to figure out whether the function or the timeout goes first; it's right there in my IDE, and it will complain if I get it wrong! But you know what even more insidious problem it does not protect me from? Bound functions, of course. If you're not privy to the way bound functions work in JavaScript, consider yourself blessed, because it is the most messed up and amazing thing that's ever been done. In JavaScript, a "method" is just a function assigned to a field of an object as if it were a number or a string or whatever. Like in C# or Java, methods can use a magic variable called "this" to refer to the object they're called on. Most fantastically of all, "this" is set based on the syntax of the call site. If you pass a method around as a lambda (like you might in Java, C#, or Python), then by the time the receiving code calls it, it will have no idea what object it used to be assigned to. Unfortunately, TypeScript will not complain if you take a method that uses the this keyword and use it as a lambda with a similar signature. Even though TypeScript's type system isn't even tied to its code generation, it still fails to check for some of the most wat-inducing behaviors of the underlying platform.
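To make the hazard concrete, here's a sketch (the class and names are made up for illustration): a method that reads `this` works fine when called through its object, but peeled off and passed around as a plain lambda it loses its receiver at the call site, and TypeScript signs off on every line.

```typescript
class Greeter {
  constructor(private name: string) {}
  greet(): string {
    return `hi, ${this.name}`;
  }
}

const g = new Greeter("sam");

g.greet(); // "hi, sam" — `this` is g, determined by the call syntax

// TypeScript happily lets us peel the method off as a plain () => string:
const asLambda: () => string = g.greet;

let blewUp = false;
try {
  asLambda(); // `this` is undefined here; reading this.name throws
} catch {
  blewUp = true;
}

// The classic fix: bind the method so it remembers its receiver.
const bound = g.greet.bind(g);
console.log(bound()); // "hi, sam"
```

Every line of this compiles cleanly, yet the `asLambda()` call is a guaranteed runtime TypeError.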

So, how is anybody supposed to create a type system that's not a giant tangled mess? Well, let me tell you. I love messiness in programming. I have a deep appreciation for a language like JavaScript, where the way == works requires multiple examples and tables and twitter posts about how stupid it is. But I think we need to either really wallow in the ugliness of types or make our compilers emit the right stuff. Either give me a type for methods that specifies what kind of this it requires and whether it writes to it, or tack on some runtime code that will bind a method when it's being used as a lambda. The absolute worst thing we can do is make type systems just good enough to make us forget that there are real and bizarre things happening at the corners of "defined" behavior. The major problem with my approach is that if you decide to bridge the gap between the language people want to write in and the platform they're running on, you're going to get lampooned for the performance degradation: either you'll bloat the code with polyfills and stubs, or you'll suffer a 10% penalty at runtime every time somebody does something in a way you can't figure out how to optimize. And the problem with expanding your types to match the real world is that all kinds of bad things happen; the halting problem guarantees that somebody will always be able to come up with a program that can't be checked.
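For what it's worth, TypeScript already offers a version of the first option: a method can declare the kind of `this` it requires as a pseudo-parameter, erased at runtime, and under the strict compiler settings (as far as I know, `--noImplicitThis` and friends) detaching the method becomes a compile error instead of a runtime surprise. A sketch, with names of my own invention:

```typescript
class Counter {
  count = 0;

  // The pseudo-parameter `this: Counter` doesn't exist at runtime; it only
  // tells the checker what receiver this method requires.
  increment(this: Counter): number {
    return ++this.count;
  }
}

const c = new Counter();

// With strict checks enabled, the unsafe pattern is rejected at compile time:
// const detached = c.increment;
// detached(); // error: `this` context of type 'void' is not assignable to 'Counter'

// Binding (or calling through the object) satisfies the requirement:
const safe = c.increment.bind(c);
safe();
console.log(c.count); // 1
```

It's exactly the "type that specifies what kind of this it requires" half of the wish; it just isn't applied to ordinary methods unless you opt in, annotation by annotation.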

I hope my tone has come off as a little tongue-in-cheek. I certainly don't hate type systems, but I do think there's still room for improvement in our tooling. If I had to name one culprit (and I'm not claiming that kind of knowledge), it's that a "clean" or "pure" type system is so much more appealing that people prefer to throw everything out and start again rather than deal with the actual things we have going on. Some of it feels like the old joke: a man tells his doctor, "Doc, it hurts when I do this," to which the doctor replies, "Then don't do that." I'm optimistic that in the future we can have types that catch our common problems and hopefully even explain them in a way that helps us understand and move past them. There are only so many hours in a day, and there are smarter people than me working on these problems already, but it still fascinates and humbles me to think about all the paths not yet taken.