If anyone outside Apple saw Swift coming, they certainly weren't making any public predictions. In the middle of a keynote filled with the sorts of announcements you'd expect (even if the details were a surprise), Apple this week announced that it has created a modern replacement for Objective-C, the programming language the company has used since shortly after Steve Jobs founded NeXT.

Swift wasn't a "sometime before the year's out"-style announcement, either. The same day, a 550-page language guide appeared in the iBooks store. Developers were also given access to Xcode 6 betas, which allow application development using the new language. Whatever changes were needed to get the entire Cocoa toolkit to play nice with Swift are apparently already done.

While we haven't yet produced any Swift code, we have read the entire language guide and looked at the code samples Apple provided. What follows is our first take on the language itself, along with some ideas about what Apple hopes to accomplish.

Why were we using Objective-C?

When NeXT began, object-oriented programming hadn't been widely adopted, and few languages available even implemented it. At the time, then, Objective-C probably seemed like a good choice, one that could incorporate legacy C code and programming habits while adding a layer of object orientation on top.

But as it turned out, NeXT was the only major organization to adopt the language. This had some positive aspects, as the company was able to build its entire development environment around the strengths of Objective-C. In turn, anyone who bought into developing in the language ended up using NeXT's approach. For instance, many "language features" of Objective-C aren't actually language features at all; they are implemented by NeXT's base class, NSObject. And some of the design patterns in Cocoa, like the existence of delegates, require the language introspection features of Objective-C, which are used to safely determine whether an object will respond to a specific message.
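To make the delegate point concrete, here is a minimal sketch of how Swift expresses the same pattern. Where Objective-C discovers a delegate's methods at runtime with introspection, Swift leans on optional chaining: the call simply does nothing if no delegate is set. The names (DownloadDelegate, Downloader, Logger) are our own invention for illustration, not Cocoa APIs.

```swift
// Hypothetical names for illustration only; not part of Cocoa.
protocol DownloadDelegate: AnyObject {
    func downloadDidFinish(_ name: String)
}

final class Downloader {
    weak var delegate: DownloadDelegate?

    func finish(name: String) {
        // Optional chaining plays the role that -respondsToSelector:
        // plays in Objective-C: the call is safe even with no delegate.
        delegate?.downloadDidFinish(name)
    }
}

final class Logger: DownloadDelegate {
    var lastMessage = ""
    func downloadDidFinish(_ name: String) {
        lastMessage = "finished \(name)"
    }
}

let logger = Logger()
let downloader = Downloader()
downloader.finish(name: "a.zip")   // no delegate yet: nothing happens
downloader.delegate = logger
downloader.finish(name: "b.zip")   // delegate is notified
```

Note the `weak` reference to the delegate, a Cocoa convention that avoids retain cycles between an object and its delegate.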

The downside of narrow Objective-C adoption was that it forced the language into a niche. When Apple inherited Objective-C, it immediately set about giving developers an alternative in the form of the Carbon libraries, since these enabled a more traditional approach to Mac development.

Things changed with the runaway popularity of the iPhone SDK, which only allowed development in Objective-C. Suddenly, a lot of developers used Objective-C, and many of them already had extensive experience in other programming languages. This was great for Apple, but it caused a bit of strain. Not every developer was entirely happy with Objective-C as a language, and Apple then compounded this problem by announcing that the future of Mac development was Cocoa, the Objective-C frameworks.

What's wrong with Objective-C?

Objective-C has served Apple incredibly well. By controlling the runtime and writing its own compiler, the company has been able to work around some of the limitations it inherited from NeXT and add new features, like properties, a garbage collector, and the garbage collector's replacement, Automatic Reference Counting.

But some things really couldn't be changed. Because it was basically C with a few extensions, Objective-C was limited to using C's method of keeping track of complex objects: pointers, essentially the memory address occupied by an object's first byte. Everything, from an instance of NSString to the most complex table view, was passed around and messaged using its pointer.

For the most part, this didn't pose problems. It was generally possible to write complex applications without ever being reminded that everything you were doing involved pointers. But it was also possible to screw up and try to access the wrong address in memory, causing a program to crash or opening a security hole. The same holds true for a variety of other features of C; developers either had to do careful bounds and length checking or their code could wander off into random places in memory.
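Swift's answer to this class of bug is that collections know their own bounds. As a sketch of the contrast with raw C pointers, here is a checked lookup on a Swift array; the helper name `element(at:)` is our own, not a standard library API (Swift's built-in subscript traps on an out-of-range index rather than reading stray memory).

```swift
// `element(at:)` is an illustrative helper, not a standard API.
extension Array {
    func element(at index: Int) -> Element? {
        // Swift arrays carry their count, so the check is trivial;
        // a raw C pointer has no idea where its buffer ends.
        indices.contains(index) ? self[index] : nil
    }
}

let scores = [70, 85, 92]
let ok = scores.element(at: 1)    // Optional(85)
let oob = scores.element(at: 9)   // nil, rather than whatever bytes
                                  // happen to sit past the buffer
```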

Beyond such pedestrian problems, Objective-C simply began showing its age. Over time, other languages adopted some great features that were difficult to graft back onto a language like C. One example is what's termed a "generic." In C, if you want to do the same math with integers and floating point values, you have to write a separate function for each—and other functions for unsigned long integers, double-precision floating-point values, etc. With generics, you can write a single function that handles everything the compiler recognizes as a number.
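Swift's generics work exactly this way. As a sketch, the single function below sums any array of numbers, whether Int, Double, or anything else the compiler knows how to add; the constraint uses `Numeric`, a standard library protocol from Swift versions later than the one announced here.

```swift
// One generic function replaces a family of per-type C functions.
// `Numeric` (from later Swift releases) constrains T to types
// that support arithmetic, so `+` and the literal 0 are available.
func sum<T: Numeric>(_ values: [T]) -> T {
    values.reduce(0, +)
}

let intTotal = sum([1, 2, 3])       // works on Int
let doubleTotal = sum([1.5, 2.5])   // the same code works on Double
```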

Apple was clearly able to add some significant features to the Objective-C syntax (closures are one example), but it's not clear that it could have added everything it wanted. And the very nature of C meant that the language would always be inherently unsafe, with stability and security open to compromise by a single sloppy coder. Something had to change.
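Closures, for reference, are functions that can be passed around as values and that capture the variables around them. A minimal Swift sketch (the name `makeCounter` is ours, purely for illustration):

```swift
// Returns a closure that captures and mutates `count`,
// which stays alive between calls even after makeCounter returns.
func makeCounter() -> () -> Int {
    var count = 0
    return {
        count += 1
        return count
    }
}

let next = makeCounter()
let first = next()    // 1
let second = next()   // 2: the captured state persists
```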

But why not take the easy route and adopt another existing language? Because of the close relationship between Objective-C and the Cocoa frameworks, Objective-C enabled the sorts of design patterns that made the frameworks effective. Most of the existing, mainstream alternatives didn't provide such a neat fit for the existing Cocoa frameworks. Hence, Swift.