Recently I wanted to create an animated title sequence for Dust using SpriteKit, in which the text would disintegrate, looking as though each letter were breaking apart in the wind. I tried several approaches and went through multiple iterations before finally settling (for now, at least) on the effect shown below. I also discovered a few interesting things about SpriteKit's performance capabilities in the process. In this post I'll describe how the effect was implemented, along with some of the earlier approaches that didn't pan out.

The Final Effect

This effect is achieved with SpriteKit using about 20,000 individual nodes comprising the particles of the text. Each node has its own SKAction animation assigned to it. Because each individual animation is randomized, the effect is unique each time and can be controlled at an extremely granular level. SpriteKit is able to handle all of this out of the box, with no optimization, which is pretty darn impressive.

Attempt 1 - Initial approach

Initially, I started with the easiest approach I could think of, which was to leverage SKEmitterNode and attempt to 'fake' the disintegration. I kept things simple and put a single-letter SKLabelNode on the screen. I then created an emitter in Xcode's editor (using the Spark template as a starting point) to provide the particle effect itself.

I combined the two by running a set of simple SKActions which would present the text node and the disabled emitter (having previously set its particleBirthRate to 0), and then run an action which simultaneously faded the text while briefly turning on the emitter. This achieved a fairly simplistic result which, although not completely terrible, was not convincing.

Attempt 2 - Minor Improvements

Modifying the SKAction sequences by slightly scaling and adjusting the text as part of the animation, and moving the emitter, helped a bit. I then added code which could apply this overall sequence of actions to each letter of an arbitrary string. Adding a little randomization of the timing for each letter helped things visually.
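A condensed sketch of this fade-plus-emitter sequence might look something like the following. This is illustrative Swift, not the original code; the function name, the "Spark" emitter filename, and the timing and birth-rate values are all assumptions:

```swift
import SpriteKit

// Hypothetical sketch of the 'fake' disintegration from Attempts 1-2:
// a label fades out while a briefly-enabled emitter provides the particles.
func fakeDisintegration(letter: Character, at position: CGPoint,
                        in scene: SKScene, delay: TimeInterval) {
    let label = SKLabelNode(text: String(letter))
    label.position = position
    scene.addChild(label)

    // Emitter built in Xcode's particle editor; disabled until the fade begins.
    guard let emitter = SKEmitterNode(fileNamed: "Spark") else { return }
    emitter.position = position
    emitter.particleBirthRate = 0
    scene.addChild(emitter)

    let activeBirthRate: CGFloat = 200  // illustrative value
    label.run(.sequence([
        .wait(forDuration: delay),      // randomized per letter
        .run { emitter.particleBirthRate = activeBirthRate },
        .group([
            .fadeOut(withDuration: 0.6),
            .scale(to: 1.1, duration: 0.6),  // slight scale tweak (Attempt 2)
        ]),
        .run {
            // Stop emitting; existing particles finish out their lifetimes.
            emitter.particleBirthRate = 0
        },
        .removeFromParent(),
    ]))
}
```

Calling something like this once per letter, with a randomized delay, gives each glyph its own slightly offset animation.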
This was the result: again, not terrible, but it leaves a lot to be desired.

Attempt 3 - Different approach

I knew that what I really wanted to do was legitimately break the letters apart into their actual particles, which could be individually controlled. This was going to require a different approach, however. I didn't want to deal with the font glyphs directly for this early proof-of-concept, and SKLabelNode's API is somewhat limited, so that was off the table as well. It didn't really matter how the text was represented, though; I just needed some way to define the individual parts of the glyphs as a 2D grid of nodes - which is exactly what a bitmap is. So the next approach leveraged a texture image representing the string. It's simple, but a lot more fun. Essentially, it recreates the bitmap within the scene using individual SKNodes:

1. Draw the text into a bitmap (or use a premade texture asset; in my case I just drew the string into an NSBitmapImageRep)
2. Define the resolution, or 'node-size to pixel-size' value
3. Read the pixel color values of the bitmap
4. If a pixel is non-white, create a SpriteKit node at a position based on its X,Y coordinates
5. Apply an SKAction to each particle node to create the desired effect

SKShapeNode vs. SKSpriteNode

Initially I leveraged SKShapeNode, but I found that no matter how I adjusted those sprites, the animation performance was abysmal. I then tried creating tiny SKSpriteNodes of a flat color using +spriteNodeWithColor:size:, which provided significantly better performance out of the box. (As an aside, it might be worth mentioning that SKShapeNode has a ton of issues, even as of 10.12. That class deserves a blog post all on its own.)

Reading in the bitmap

Again I went with the lowest-cost solution for this proof-of-concept, which wound up being trivial thanks to AppKit. I simply use NSBitmapImageRep and iterate over the pixels, obtaining each color value via -colorForPixel:. This is definitely not the most robust or performant approach, but for the purposes of this test it was sufficient.

It was helpful in this case to use NSBitmapImageRep directly, rather than attempting to work at a higher level with the more abstract NSImage class. Drawing into an NSImage will allow the system to automatically create and manage the underlying representations, but they may not always be what you expect.
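The pixel-iteration step described above could be sketched roughly as follows. This is illustrative Swift, not the original code; the function name, the white-pixel threshold, and the Y-axis flip are my own assumptions, and the per-pixel read uses NSBitmapImageRep's standard colorAt(x:y:) accessor:

```swift
import AppKit
import SpriteKit

// Illustrative sketch: turn the non-white pixels of a bitmap into tiny
// flat-colored sprite nodes. `pointsPerPixel` is the 'node-size to
// pixel-size' resolution value.
func particleNodes(from bitmap: NSBitmapImageRep,
                   pointsPerPixel: CGFloat) -> [SKSpriteNode] {
    var nodes: [SKSpriteNode] = []
    for y in 0..<bitmap.pixelsHigh {
        for x in 0..<bitmap.pixelsWide {
            guard let color = bitmap.colorAt(x: x, y: y),
                  let rgb = color.usingColorSpace(.deviceRGB) else { continue }
            // Skip (near-)white background pixels; threshold is arbitrary.
            if rgb.brightnessComponent > 0.95 { continue }
            let node = SKSpriteNode(color: rgb,
                                    size: CGSize(width: pointsPerPixel,
                                                 height: pointsPerPixel))
            // Flip Y: bitmap rows grow downward, SpriteKit's Y axis grows up.
            node.position = CGPoint(x: CGFloat(x) * pointsPerPixel,
                                    y: CGFloat(bitmap.pixelsHigh - 1 - y) * pointsPerPixel)
            nodes.append(node)
        }
    }
    return nodes
}
```

Each returned node can then be added to the scene and given its own randomized SKAction.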
For example, in some cases you'll find that the NSImage is backed by an NSCGImageSnapshotRep, a private NSImageRep subclass which doesn't offer the convenience of -colorForPixel:.

Other Effects

Going this route of managing the individual particles based on an input texture provides a lot of power and flexibility in the appearance of the animation. Because you can easily calculate the position and color of each particle node, you can also do other interesting effects, such as assembling the final composite from a randomly-scattered set of pixels. (For an example of this, check out the video below.)

Performance

Even without any optimization, the animation performs surprisingly well. SpriteKit handles 24,000 nodes animating individually without much of a hiccup. This could undoubtedly be improved by using a lower resolution (larger/fewer nodes), grouping sets of nodes together to reduce the number of animations, and so on. It would also be interesting to experiment with different rendering settings and properties for the scene and the SKSpriteNodes themselves. But for this simple test I was surprised at the rendering speed. Chances are I will be using this effect in some form for the opening titles of Dust.

Video Demo