In this Apple Pencil tutorial, you’ll learn about force, touch coalescing, altitude, and azimuth, to add realistic lines and shading to a drawing app.

Note: Updated for Xcode 7.3, iOS 9.3, and Swift 2.2 on 04-01-2016

I know that many of you have gotten yourselves a gorgeous new iPad Pro and snagged a Pencil to go along with it.

If you’re anything like me, once you’ve experienced the awesomeness of drawing with Pencil, you’ll want to include support for it in all of your apps.

I’ve been waiting for something like this device since I purchased the original iPad. As you’ll see from my scribbles, I’m no Rembrandt, but I’ve found Pencil is also great for taking notes. I can only imagine what kinds of amazing works of art people will create now that there is Apple Pencil.

In this Apple Pencil tutorial, you’re going to learn exactly what it takes to support Pencil. Here are the key things you’ll learn:

How to work with force

How to improve accuracy

How to implement shading behavior

How to add an eraser

How to improve the experience by working with predictive and actual drawings

By the end of this tutorial, you’ll be ready to integrate Apple Pencil support into your apps!

Prerequisites

To follow along with this tutorial, you’ll need:

An iPad Pro and Apple Pencil. You cannot test Pencil on the simulator. Also, Pencil doesn’t work with older iPads, just the iPad Pro. Sounds like you’ve got your excuse for an upgrade!

Xcode 7.1 or later, targeting iOS 9.1 or later.

A basic familiarity with Core Graphics. You’ll need to know what contexts are, how to create them and how to draw strokes. Have a look at the first part of our Core Graphics tutorial — that will be plenty to get you up to speed, and the app will remind you to drink water. :]

Getting Started

Throughout this Apple Pencil tutorial, you’ll build an app called Scribble. It’s a simple drawing app with a responsive UI that supports pressure sensitivity and shading.

Download and explore Scribble. Try it out on your iPad Pro, with both Pencil and your finger, making sure to rest your hand on the screen as you draw.

You’ll see that unlike on previous iPads, palm rejection is automatic, although you’ll need to rest a large area of your hand on the screen, because smaller contact areas are read as touches.



Shake the iPad to clear the screen — just like an Etch-A-Sketch!

Under the hood, Scribble is a basic app that consists of a canvas view that captures the touch from finger or Pencil. It also continuously updates the display to reflect your touch.

Take a look at the code in CanvasView.swift.

The most important code can be found in touchesMoved(_:withEvent:), which is triggered when the user interacts with the canvas view. This method creates a Core Graphics context and draws the image displayed by the canvas view into that context.

touchesMoved(_:withEvent:) then calls drawStroke(_:touch:) to draw a line into the graphics context between the previous and current touch.

touchesMoved(_:withEvent:) replaces the image displayed by the canvas view with the updated one from the graphics context.

See? It’s pretty simple. :]

Your First Drawing with Pencil

Drawing with your finger has never been elegant, even in a digital environment. Pencil makes it much more like drawing the old analog way, with the basic UI of a pencil and paper.

You’re now ready to use the first feature of the Pencil: force. When you press harder against the screen, the resulting stroke is wider. This feature doesn’t work with your finger, although there is a little cheat that you’ll learn about later.

The force amount is recorded in touch.force. A force of 1.0 is the force of an average touch, so you need to multiply this force by something to produce the right stroke width. More on that in a moment…

Open CanvasView.swift, and at the top of the class add the following constant:

private let forceSensitivity: CGFloat = 4.0

You can tweak this forceSensitivity constant to make your stroke width more or less sensitive to pressure.

Find lineWidthForDrawing(_:touch:). This method calculates the line width.

Just before the return statement, add the following:

if touch.force > 0 {
  lineWidth = touch.force * forceSensitivity
}

Here you calculate the line width by multiplying the force of the touch by the forceSensitivity multiplier. Remember, this only applies to Pencil: a finger touch reports touch.force as 0, so the stroke width stays at its default.
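As a standalone sketch of that calculation (plain Swift with Double instead of CGFloat; defaultLineWidth is a hypothetical fallback value, not one from the starter project):

```swift
import Foundation

// Sketch of the force-to-width mapping, using Double instead of CGFloat.
// forceSensitivity mirrors the constant defined on CanvasView;
// defaultLineWidth is an assumed fallback for finger touches.
let forceSensitivity = 4.0
let defaultLineWidth = 6.0

func lineWidth(forForce force: Double) -> Double {
  // A finger reports force == 0, so the default width is kept.
  return force > 0 ? force * forceSensitivity : defaultLineWidth
}

print(lineWidth(forForce: 0))    // finger: 6.0
print(lineWidth(forForce: 1.0))  // average Pencil touch: 4.0
print(lineWidth(forForce: 2.5))  // firm press: 10.0
```

Tweaking forceSensitivity simply rescales this line, which is why a single constant is enough to tune the feel.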

Build and run. Draw some lines with Pencil and notice how the stroke varies depending on how hard you press down on the screen:

Smoother Drawing

You’ll notice that when you draw, the lines have sharp points rather than a natural curve. Before Pencil, you had to do complex things like convert your strokes to spline curves to make drawings look decent, but Pencil makes this sort of workaround largely unnecessary.

Apple tells us that the iPad Pro scans for a touch at 120 times per second, but when the Pencil is near the screen the scan rate doubles to 240 times per second.

The iPad Pro’s refresh rate is 60 Hz, or 60 frames per second. This means that at a 120 Hz scan rate, the screen could register two touches but display only one. In addition, if there’s a lot of processing behind the scenes, a touch event could be missed altogether on certain frames because the main thread is occupied and unable to process it.
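To make those numbers concrete, here’s the arithmetic (the 60, 120, and 240 Hz figures are the ones quoted above):

```swift
// Touch scans per displayed frame, from the rates quoted above.
let displayRate = 60.0      // iPad Pro refresh rate, Hz
let fingerScanRate = 120.0  // touch scans per second with a finger
let pencilScanRate = 240.0  // scans per second when Pencil is near

let fingerTouchesPerFrame = fingerScanRate / displayRate  // 2.0
let pencilTouchesPerFrame = pencilScanRate / displayRate  // 4.0

print(fingerTouchesPerFrame, pencilTouchesPerFrame)
```

So for every displayed frame there are two finger scans, or four Pencil scans; without extra handling, all but one of those would be thrown away.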

Try drawing a circle quickly. It should be round, but the result is more akin to the jagged points of a polygon:

Apple came up with the concept of coalesced touches to deal with this problem. Essentially, the touches that would otherwise be lost are captured and stored on the UIEvent, and you can access them as an array via coalescedTouchesForTouch(_:).

Find touchesMoved(_:withEvent:) in CanvasView.swift and replace:

drawStroke(context, touch: touch)

With the following:

// 1
var touches = [UITouch]()

// 2
if let coalescedTouches = event?.coalescedTouchesForTouch(touch) {
  touches = coalescedTouches
} else {
  touches.append(touch)
}

// 3
print(touches.count)

// 4
for touch in touches {
  drawStroke(context, touch: touch)
}

Let’s go through this section by section.

1. First, you set up a new array to hold all the touches you’ll have to process.

2. Check for coalesced touches. If there are any, you save them all to the new array; if there aren’t, you just add the single touch.

3. Add a log statement to see how many touches you’re processing.

4. Finally, instead of calling drawStroke(_:touch:) just once, you call it for every touch saved in the new array.

Build and run. Draw some fancy curlicues with your Pencil and revel in buttery smoothness and stroke width control:

Turn your attention to the debug console. You’ll notice that when you’re drawing with Pencil rather than your finger, you receive many more touches.

You’ll also notice that even with coalesced touches, circles drawn with Pencil are much rounder simply because the iPad Pro scans for touches twice as often when it senses Pencil.

Tilting the Pencil

Now you have lovely fluid drawing in your app. However, if you’ve read or watched any reviews of the Apple Pencil, you’ll remember there was talk of its pencil-like shading abilities. All users need to do is tilt it — but little do they realize that shading doesn’t happen automatically. It’s down to us clever app developers to write the code that makes it work as expected. :]

Altitude, Azimuth and Unit Vectors

In this section, I’ll describe how you measure the tilt. You’ll add support for simple shading in the next section.

When you’re working with Pencil, you can rotate it in three dimensions. The up-and-down tilt is called altitude, while the side-to-side rotation is called azimuth:

The altitudeAngle property on UITouch is new to iOS 9.1, and is there just for the Apple Pencil. It’s an angle measured in radians. When Pencil lies flat on the iPad’s surface, the altitude is 0. When it stands straight up with the point on the screen, the altitude is π/2. Remember that there are 2π radians in a 360-degree circle, so π/2 is equivalent to 90 degrees.
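If radians feel unfamiliar, a tiny conversion helper makes the altitude range concrete. This is a plain-Swift sketch (Double instead of CGFloat; degrees(fromRadians:) is a helper introduced here for illustration, not a UIKit API):

```swift
import Foundation

// Convert the altitudeAngle range to degrees to build intuition.
func degrees(fromRadians radians: Double) -> Double {
  return radians * 180.0 / M_PI
}

print(degrees(fromRadians: 0))         // Pencil flat on the screen: 0 degrees
print(degrees(fromRadians: M_PI / 2))  // Pencil standing straight up: 90 degrees
```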

There are two new methods on UITouch to get azimuth: azimuthAngleInView(_:) and azimuthUnitVectorInView(_:). The least expensive is azimuthUnitVectorInView(_:), but both are useful. The best one for your situation depends on what you need to calculate.

You’ll explore how the azimuth’s unit vector works. For reference, a unit vector has a length of 1 and points from the origin (0, 0) in some direction:

To see for yourself, add the following at the top of touchesMoved(_:withEvent:) , just after the guard statement:

print(touch.azimuthUnitVectorInView(self))

Build and run. With the iPad in landscape orientation — Scribble is landscape only to keep this tutorial focused on Pencil — hold your pen so that the point is touching on the left side of the screen, and the end is leaning right.

The values in the debug console won’t match these exactly, but the vector is approximately 1 unit in the x direction and 0 units in the y direction — in other words, (1, 0).

Rotate Pencil 90 degrees counter-clockwise so the tip is pointing towards the bottom of the iPad. That direction is approximately (0, -1).

Note that the x direction uses cosine and the y direction uses sine. For example, if you hold your pen as in the picture above — about 45 degrees counter-clockwise from your original horizontal direction — the unit vector is (cos(-45°), sin(-45°)), or (0.7071, -0.7071).
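You can reproduce those numbers yourself. Here’s a small plain-Swift sketch (Double rather than CGFloat; azimuthUnitVector is a hypothetical stand-in for what azimuthUnitVectorInView(_:) reports, not the UITouch method itself):

```swift
import Foundation

// A sketch of how an azimuth angle maps to a unit vector:
// x uses cosine, y uses sine.
func azimuthUnitVector(_ angle: Double) -> (dx: Double, dy: Double) {
  return (dx: cos(angle), dy: sin(angle))
}

let right = azimuthUnitVector(0)          // leaning right: (1, 0)
let down = azimuthUnitVector(-M_PI / 2)   // tip toward the bottom: (0, -1)
let diag = azimuthUnitVector(-M_PI / 4)   // roughly (0.7071, -0.7071)
print(diag)
```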

Note: If you don’t know a lot about vectors, it’s a useful bit of knowledge to pursue. Here’s a two-part tutorial on Trigonometry for Games using Sprite Kit that will help you wrap your head around vectors.

Remove that last print statement when you understand how changing the direction of Pencil gives you the vector that indicates where it’s pointing.

Draw With Shading

Now that you know how to measure tilting, you’re ready to add simple shading to Scribble.

When Pencil is at a natural drawing angle, you draw a line by using force to determine the thickness, but when the user tilts it on its side, you use force to measure the shading’s opacity.

You’ll also calculate the thickness of the line based upon the direction of the stroke and the direction in which you’re holding the Pencil.

If you’re not quite following me here, just go find a pencil and paper to try shading by turning the pencil on its side so that the lead has maximum contact with the paper. When you shade in the same direction as the pencil is leaning, the shading is thin. But when you shade at a 90 degree angle to the pencil, the shading is at its thickest:

Working With Texture

The first order of business is to change the texture of the line so that it looks more like shading with a real pencil. The starter app includes an image in the Asset Catalog called PencilTexture to use for this.

Add this property to the top of CanvasView:

private var pencilTexture = UIColor(patternImage: UIImage(named: "PencilTexture")!)

This will allow you to use pencilTexture as a color to draw with, instead of the default red color you’ve used up until now.

Find the following line in drawStroke(_:touch:) :

drawColor.setStroke()

And change it to:

pencilTexture.setStroke()

Build and run. Hey presto! Your lines now look much more like a pencil’s lines:

Note: In this tutorial, you’re using a texture in a rather naive way. Brush engines in full-featured art apps are far more complex, but this approach is enough to get you started.

To check that Pencil is tilted far enough to initiate shading, add this constant to the top of CanvasView:

private let tiltThreshold = π/6 // 30º

If you find that this value doesn’t work for you because you hold the Pencil differently, you can change it to suit.

Note: To type π, press Option-P. π is a convenience constant defined at the top of CanvasView.swift as CGFloat(M_PI). When programming graphics, it’s important to start thinking in radians rather than converting to degrees and back again. Take a look at this image from Wikipedia to see the correlation between radians and degrees.

Next, find the following line in drawStroke(_:touch:) :

let lineWidth = lineWidthForDrawing(context, touch: touch)

And change it to:

var lineWidth: CGFloat
if touch.altitudeAngle < tiltThreshold {
  lineWidth = lineWidthForShading(context, touch: touch)
} else {
  lineWidth = lineWidthForDrawing(context, touch: touch)
}

Here you add a check on the Pencil's altitude: if it's less than π/6, or 30 degrees, the Pencil is tilted far enough over to shade, so you call the shading method rather than the drawing method.

Now, add this method to the bottom of CanvasView :

private func lineWidthForShading(context: CGContext?, touch: UITouch) -> CGFloat {
  // 1
  let previousLocation = touch.previousLocationInView(self)
  let location = touch.locationInView(self)

  // 2 - vector1 is the pencil direction
  let vector1 = touch.azimuthUnitVectorInView(self)

  // 3 - vector2 is the stroke direction
  let vector2 = CGPoint(x: location.x - previousLocation.x,
                        y: location.y - previousLocation.y)

  // 4 - angle difference between the two vectors
  var angle = abs(atan2(vector2.y, vector2.x) - atan2(vector1.dy, vector1.dx))

  // 5
  if angle > π {
    angle = 2 * π - angle
  }
  if angle > π / 2 {
    angle = π - angle
  }

  // 6
  let minAngle: CGFloat = 0
  let maxAngle = π / 2
  let normalizedAngle = (angle - minAngle) / (maxAngle - minAngle)

  // 7
  let maxLineWidth: CGFloat = 60
  var lineWidth = maxLineWidth * normalizedAngle

  return lineWidth
}

There's some complex math in there, so here's a play-by-play:

1. Store the previous touch point and the current touch point.

2. Store the azimuth unit vector of the Pencil.

3. Store the direction vector of the stroke that you're drawing.

4. Calculate the angle difference between the stroke line and the Pencil direction.

5. Reduce the angle so it falls between 0 and 90 degrees; at 90 degrees, the stroke will be at its widest. Remember that all calculations are done in radians, and π/2 is 90 degrees.

6. Normalize this angle between 0 and 1, where 1 corresponds to 90 degrees.

7. Multiply the maximum line width of 60 by the normalized angle to get the correct shading width.

Note: Whenever you're working with Pencil, the following formulae come in handy:

Angle of a vector: angle = atan2(opposite, adjacent)

Normalize: normal = (value - minValue) / (maxValue - minValue)
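Both formulae translate directly into code. Here's a plain-Swift sketch (Double instead of CGFloat; the function names are introduced here for illustration and aren't part of UIKit):

```swift
import Foundation

// Angle of a vector: angle = atan2(opposite, adjacent)
func angle(ofVectorWithDx dx: Double, dy: Double) -> Double {
  return atan2(dy, dx)
}

// Normalize: normal = (value - minValue) / (maxValue - minValue)
func normalize(_ value: Double, min minValue: Double, max maxValue: Double) -> Double {
  return (value - minValue) / (maxValue - minValue)
}

// A 45-degree vector has angle π/4, which normalizes to 0.5
// over the 0...π/2 range used for shading.
let a = angle(ofVectorWithDx: 1, dy: 1)
print(normalize(a, min: 0, max: M_PI / 2))  // 0.5
```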

Build and run. Hold Pencil at about the angle indicated in the picture, as if you're going to shade. Without changing the angle, do a little shading.

Notice how as the stroke direction changes it becomes wider and narrower. It's a bit blobby here with this naive approach, but you can definitely see the potential.

Using Azimuth to Adjust Width

One more thing to do: with a real pencil, the shading line gets narrower as you raise the pencil toward the vertical. But if you try that with your Apple Pencil, the line width stays the same.

In addition to the azimuth angle, you also need to take the Pencil's altitude into account when calculating the width of the line.

Add this constant to the top of the CanvasView class, just below the others:

private let minLineWidth: CGFloat = 5

This will be the narrowest that a shading line can be -- you can change it to suit your own personal shading tastes. :]

At the bottom of lineWidthForShading(_:touch:) , just before the return statement, add the following:

// 1
let minAltitudeAngle: CGFloat = 0.25
let maxAltitudeAngle = tiltThreshold

// 2
let altitudeAngle = touch.altitudeAngle < minAltitudeAngle
  ? minAltitudeAngle : touch.altitudeAngle

// 3
let normalizedAltitude = 1 - ((altitudeAngle - minAltitudeAngle)
  / (maxAltitudeAngle - minAltitudeAngle))

// 4
lineWidth = lineWidth * normalizedAltitude + minLineWidth

Note: Make sure you add this code to lineWidthForShading(_:touch:) , and not lineWidthForDrawing(_:touch:) by mistake.

There's a lot to digest here, so let's take this bit by bit.

1. Theoretically, the minimum altitude of Pencil is 0 degrees, meaning it's lying flat on the iPad with the tip off the screen — so that altitude can't actually be recorded. The practical minimum is somewhere around 0.2, but here you use 0.25.

2. If the altitude is less than the minimum, you use the minimum instead.

3. Just like you did earlier, you normalize this altitude value to be between 0 and 1 — but inverted, so that a flatter Pencil gives a value closer to 1.

4. Finally, you multiply the line width you calculated from the azimuth by this normalized value, and add the minimum line width.
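To sanity-check those steps, here's a standalone sketch of the same arithmetic in plain Swift (Double instead of CGFloat; adjustedWidth is an illustrative helper, and the constants mirror the ones defined on CanvasView):

```swift
import Foundation

// Worked example of the altitude adjustment, using Double instead of CGFloat.
let tiltThreshold = M_PI / 6  // 30 degrees, as defined earlier
let minAltitudeAngle = 0.25
let maxAltitudeAngle = tiltThreshold
let minLineWidth = 5.0

func adjustedWidth(_ lineWidth: Double, altitude: Double) -> Double {
  // Clamp to the practical minimum altitude.
  let altitudeAngle = altitude < minAltitudeAngle ? minAltitudeAngle : altitude
  // Invert while normalizing: a flatter Pencil gives a value closer to 1.
  let normalizedAltitude = 1 - ((altitudeAngle - minAltitudeAngle)
    / (maxAltitudeAngle - minAltitudeAngle))
  return lineWidth * normalizedAltitude + minLineWidth
}

print(adjustedWidth(60, altitude: minAltitudeAngle))  // flattest: 65.0
print(adjustedWidth(60, altitude: maxAltitudeAngle))  // at the threshold: 5.0
```

At the flattest angle you get the full azimuth-based width plus the minimum; at the tilt threshold, only the minimum width remains, which is what lets shading blend smoothly into normal drawing.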

Build and run. As you shade, change the Pencil's altitude and see how the strokes get wider and narrower. Increasing the Pencil's altitude gradually should let you segue smoothly into the drawing line:

Playing with Opacity

The last task in this section is to make the shading look a bit more realistic by turning down the texture's opacity, which you'll calculate with force.

Just before the return statement in lineWidthForShading(_:touch:) , add the following:

let minForce: CGFloat = 0.0
let maxForce: CGFloat = 5

let normalizedAlpha = (touch.force - minForce) / (maxForce - minForce)
CGContextSetAlpha(context, normalizedAlpha)

After working through the previous blocks of code, this one should be self-explanatory. You're simply taking the force and normalizing it to a value between 0 and 1 , and then setting the alpha used by the drawing context to that value.

Build and run. Try shading with varying pressure:

Finger vs. Pencil

If you're anything like me, you've probably made a few sketching errors here and there and wish you could erase those errant lines.

In this section, you're going to look at how you can distinguish between using the Apple Pencil and your finger. More specifically, you'll configure the app so that your finger can play the role of a faithful eraser.

It turns out that checking whether a finger or the Apple Pencil is being used is pretty easy -- you just use the type property on UITouch .

At the top of CanvasView , add a property for the eraser color. You're going to paint in the background color of the canvas view, and it will give the illusion of acting as an eraser. Clever, eh? :]

private var eraserColor: UIColor {
  return backgroundColor ?? UIColor.whiteColor()
}

Here you set eraserColor to the view's background color, unless it's nil , in which case you just set it to white.

Next, find the following code in drawStroke(_:touch:) :

if touch.altitudeAngle < tiltThreshold {
  lineWidth = lineWidthForShading(context, touch: touch)
} else {
  lineWidth = lineWidthForDrawing(context, touch: touch)
}

pencilTexture.setStroke()

And replace it with the following:

if touch.type == .Stylus {
  if touch.altitudeAngle < tiltThreshold {
    lineWidth = lineWidthForShading(context, touch: touch)
  } else {
    lineWidth = lineWidthForDrawing(context, touch: touch)
  }
  pencilTexture.setStroke()
} else {
  lineWidth = 20
  eraserColor.setStroke()
}

Here you've added a check to see whether it's Pencil or a finger, and if it's the latter you change the line width and use the eraser color for drawing.

Build and run. Now you can clean up any untidy edges or erase everything with your finger!

Faking Force For a Finger

Just as an aside, did you know that since iOS 8 you've been able to fake force with your finger? There's a property declared on UITouch called majorRadius , which, as its name implies, holds the size of the touch.

Find this line that you just added in the previous code block:

lineWidth = 20

And replace it with this one:

lineWidth = touch.majorRadius / 2

Build and run. Shade a dark area, and then erase with both the tip of your finger and the flat of your finger to see the varying thicknesses:

Finger painting feels really clumsy, and drawing this way is painful after you've played around with the elegant Apple Pencil. :]

Reducing Latency

You might think that your Pencil zooms over the surface of the iPad with the drawn line following closer than ever. Not quite — it's an illusion, because there's always latency between the touch and the time the line renders. Apple has a trick up its sleeve to deal with it: touch prediction.

Incredible as it may seem, all-seeing Apple knows where your Pencil, or finger, is about to draw. Those predictions are stored on the UIEvent, and you can access them as an array via predictedTouchesForTouch(_:) to draw the predicted touches ahead of time. How cool is that!? :]

Before you can begin working with predicted touches, there's one small technical obstacle to overcome. At the moment, you're drawing strokes in the graphics context, which are then displayed immediately in the canvas view.

You'll need to draw the predicted touches onto the canvas but discard them when the actual touches catch up with the predicted ones.

For example, when you draw an S-shape it predicts the curves, but when you change direction, those predictions will be wrong and need to be discarded. This picture illustrates the problem. The "S" is drawn in red and the predicted touches show in blue.

Here's what your code will need to do to avoid this problem:

1. You'll create a new UIImage property named drawingImage to capture the true -- not predicted -- touches from the graphics context.

2. On each touch move event, you'll draw drawingImage into the graphics context.

3. The real touches will be drawn into the graphics context, and you'll save the result to drawingImage instead of the canvas view's image property.

4. The predicted touches will then be drawn into the graphics context.

5. The graphics context, complete with predicted touches, will be pushed into the canvas view's image property, which is what the user sees.

In this way, no predicted touches will draw into drawingImage and each time a touch move event occurs, the predictions will be deleted.
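You can model that bookkeeping without UIKit at all. In this sketch, the "images" are just arrays of stroke labels, and touchMoved/touchEnded are simplified stand-ins for the UIKit callbacks (all names here are illustrative, not from the project):

```swift
// Simplified model of the prediction bookkeeping. "Images" are arrays of
// stroke labels instead of bitmaps; the names mirror CanvasView's properties.
var drawingImage = [String]()    // true strokes only, no predictions
var displayedImage = [String]()  // what the user would see on screen

// One touch-move event: start from the saved drawing, add the real strokes,
// save that as the new drawingImage, then overlay this frame's predictions.
func touchMoved(real realStrokes: [String], predicted predictedStrokes: [String]) {
  var context = drawingImage       // draw drawingImage into the context
  context += realStrokes           // draw the real strokes
  drawingImage = context           // save WITHOUT predictions
  context += predictedStrokes      // then draw the predictions
  displayedImage = context         // push to screen, predictions included
}

func touchEnded() {
  displayedImage = drawingImage    // discard any leftover predictions
}

touchMoved(real: ["s1"], predicted: ["p1", "p2"])
touchMoved(real: ["s2"], predicted: ["p3"])
print(displayedImage)              // ["s1", "s2", "p3"] -- p1/p2 discarded
touchEnded()
print(displayedImage)              // ["s1", "s2"]
```

Note how the second move event rebuilds the display from drawingImage, so the first frame's predictions vanish automatically.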

Housekeeping: Deleting Drawing Predictions

There's a little housekeeping in order to ensure those predicted touches are properly disposed of at the end of the stroke or when the user cancels the drawing.

Add a new UIImage property at the top of the CanvasView to hold the proper drawn image -- the one without predictions:

private var drawingImage: UIImage?

Next, find the following statement in touchesMoved(_:withEvent:) :

image?.drawInRect(bounds)

And replace it with the following:

drawingImage?.drawInRect(bounds)

Here you're drawing drawingImage into the graphics context, rather than the image being displayed at that time by the canvas view. This will overwrite any predicted touches drawn by the previous move event.

Now, at the bottom of touchesMoved(_:withEvent:) , but just above these lines:

image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

Add this new code:

// 1
drawingImage = UIGraphicsGetImageFromCurrentImageContext()

// 2
if let predictedTouches = event?.predictedTouchesForTouch(touch) {
  for touch in predictedTouches {
    drawStroke(context, touch: touch)
  }
}

Here's what's happening in there:

1. You save the graphics context contents -- with the new stroke that's been drawn, but without the predicted strokes -- into drawingImage.

2. Just like you did with the coalesced touches, you get the array of predicted touches and draw the stroke for each predicted touch.

Now add these two methods:

override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
  image = drawingImage
}

override func touchesCancelled(touches: Set<UITouch>?, withEvent event: UIEvent?) {
  image = drawingImage
}

These are called at the end of a stroke. By replacing the image with drawingImage when a touch ends or is cancelled, you're discarding all the predicted touches that were drawn onto the canvas.

One last thing: You'll need to clear both the canvas and the true drawing when you shake to clear.

In CanvasView.swift, in clearCanvas(animated:) , locate this code inside the animation closure:

self.image = nil

Add this statement straight after that line:

self.drawingImage = nil

Now a little bit further in that same method, locate:

image = nil

and add this code after it:

drawingImage = nil

Here you clear both images of any drawing that you've done.

Build and run. Draw some squiggles and curves. You may notice that you're drawing all the touches that Apple predicted you might make, and consequently perceived latency is reduced. You may need to watch closely because it's very subtle. :]

Note: When you run the second code sample at the end of this tutorial, you'll be able to visualize what predicted touches actually do. You'll see an option to replace the texture with a blue color just for the predicted touches.

Apple's algorithm for predicted touches is astonishingly good. It's the little subtleties like this one that make it a pleasure to develop for Apple platforms.

Where To Go From Here?

Congratulations! You have now completed a simple drawing app where you can scribble and enjoy getting artsy with your Pencil. :] You can download the finished project to see the final result.

You did some pretty awesome things and learned about the following:

Smoothing lines and shapes so they look natural

Working with altitude and azimuth

Implementing drawing and shading

Adding and working with texture

Adding an eraser

Working with predicted touches and discarding them when the real touches arrive

I'm also providing a second project that has buttons to turn coalesced and predicted touches on or off, so that you can visualize their effects.

Apple's WWDC video on touches has a great section on how coalesced and predicted touches work with the 60 Hz frame rate. Watch it to see how latency has improved from 4 frames in iOS 8 to 1.5 frames in iOS 9.1. Pretty spectacular!

FlexMonkey (aka Simon Gladman) has done some really creative things with the Pencil that go well beyond just drawing with it. Take a look at his blog, especially the Pencil Synthesizer and FurrySketch.

I hope you enjoyed this Apple Pencil tutorial - I'd love to see as many apps as possible integrating this very cool device. If you have any questions or comments please join the forum discussion below!