Step 3: Set Up Camera and Display Live Video

Next, we’re going to add the camera to our app and display live video using our new VideoPreviewView.

First things first, we need to add camera permissions to the Info.plist file. As a key, add “Privacy - Camera Usage Description” with a description of why the app needs the camera.
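Under the hood, this sets the NSCameraUsageDescription key. In the raw plist XML it looks something like this (the description string is just an example):

```xml
<key>NSCameraUsageDescription</key>
<string>We use the camera to segment people out of live video.</string>
```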

To display frames from the camera, we need to configure AVCaptureSession and AVCaptureVideoDataOutput objects.

First, we have to set the pixel buffer format to kCVPixelFormatType_32BGRA, the format the segmentation model expects.
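A minimal sketch of that setup, assuming the view controller owns the session (property names like captureSession and videoOutput are our own choices):

```swift
import AVFoundation
import UIKit

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let captureSession = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Feed the back camera into the capture session.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera)
        else { return }
        captureSession.addInput(input)

        // The segmentation model expects 32BGRA pixel buffers.
        videoOutput.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
        captureSession.addOutput(videoOutput)

        captureSession.startRunning()
    }
}
```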

To make sure we have everything hooked up correctly, let’s display the raw video coming from the camera. For reasons that will become apparent in the next step, we’re going to display the camera output via a UIImageView.

When we run the app, we should see normal video displayed. Here, we make sure to update the previewView asynchronously so we don’t block the main thread.
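A sketch of the delegate callback (previewView is assumed to be a UIImageView outlet wired up to the view controller):

```swift
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Pull the pixel buffer out of the frame and wrap it in a UIImage.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let image = UIImage(ciImage: ciImage)

    // Update the UI on the main thread so the video queue isn't blocked.
    DispatchQueue.main.async {
        self.previewView.image = image
    }
}
```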

Run this and you should see live video in your app!

Step 4: Segment Your Images

Once you have the video displaying in your preview view, it’s time to run the image segmentation model.

First, update your Podfile to include the People Image Segmentation Model.

pod 'Fritz/VisionSegmentationModel/People'

There are actually three different kinds of models: Living Room, Outdoor, and People. You can learn more about them in the Fritz documentation.

Initialize the image segmentation model as a variable in the view controller and run the model in the captureOutput function.
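Roughly, it looks like this, following the patterns in Fritz’s published examples (class and method names such as FritzVisionPeopleSegmentationModel and predict are taken from those examples and may differ between SDK versions):

```swift
import Fritz

// Held as a property so the model is only loaded once.
lazy var visionModel = FritzVisionPeopleSegmentationModel()

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Wrap the camera frame for the Fritz SDK and run the model.
    let image = FritzVisionImage(buffer: sampleBuffer)
    visionModel.predict(image) { result, error in
        guard let result = result else { return }
        // result is a FritzVisionSegmentationResult; we turn it into a mask below.
    }
}
```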

The model’s output is a FritzVisionSegmentationResult object. The easiest way to work with the result is to call its toImageMask function. You can customize the appearance of the mask by choosing the class to construct the mask from, a threshold above which a pixel is completely revealed, and a minThreshold below which a pixel is completely hidden.
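Inside the predict callback from above, that might look like this (the parameter labels follow the description above; check the SDK’s headers for the exact signature, and the .person class value is an assumption):

```swift
// Build a mask for the person class: scores above 0.7 are fully revealed,
// scores below 0.3 are fully hidden, and values in between are blended.
let mask = result.toImageMask(
    of: .person,
    threshold: 0.7,
    minThreshold: 0.3
)
```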

The output is a UIImage that can be overlaid on the model’s input image. The color of each pixel represents the class the model predicts. For our people model, black pixels represent people.

The reason we’re using a UIImageView is that this view type lets you pass in another UIImageView as a mask. Pixels where the mask image’s value is greater than zero will let the background shine through.
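Applying the mask from the prediction callback might look like this (previewView already receives each frame from Step 3, so masking it leaves only the person pixels visible; the view names are our own):

```swift
DispatchQueue.main.async {
    // Wrap the mask UIImage in a view and use it to clip the preview.
    let maskImageView = UIImageView(image: mask)
    maskImageView.frame = self.previewView.bounds
    self.previewView.mask = maskImageView
}
```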

When you build and run this example, you should only see people—nothing else!

Step 5: Add a Blur

The last step will be blurring the background. To do this, we need to add _another_ UIImageView to display the background video. Before adding the blur, let’s add the background video back in, as sketched below.
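Assuming a backgroundView outlet (a second UIImageView sitting behind previewView in the view hierarchy), we feed it the same frames, unmasked:

```swift
// Inside captureOutput, after building `image` from the sample buffer (Step 3):
DispatchQueue.main.async {
    self.backgroundView.image = image   // full frame, no mask
    self.previewView.image = image      // clipped down to people by its mask
}
```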

Now, adding the blur is as simple as adding a UIVisualEffectView.
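Something like this, placed over the background view (UIVisualEffectView and UIBlurEffect are stock UIKit; only the view names are ours):

```swift
// Cover the background view with a standard system blur.
let blurView = UIVisualEffectView(effect: UIBlurEffect(style: .regular))
blurView.frame = backgroundView.bounds
blurView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
backgroundView.addSubview(blurView)
```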

A screenshot from our app with the blur effect

However, this blur effect is a bit strong. It will work better with a configurable blur radius. By defining our own CustomBlurView, we can tailor the effect to our liking.
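One common approach is to subclass UIVisualEffectView and freeze a UIViewPropertyAnimator partway through the blur transition. This relies on pausing an animation mid-flight, so treat it as a sketch rather than guaranteed API behavior:

```swift
import UIKit

class CustomBlurView: UIVisualEffectView {

    private let blurIntensity: CGFloat  // 0 = no blur, 1 = full system blur
    private var animator: UIViewPropertyAnimator?

    init(intensity: CGFloat) {
        self.blurIntensity = intensity
        super.init(effect: nil)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func didMoveToSuperview() {
        super.didMoveToSuperview()
        guard superview != nil, animator == nil else { return }

        // Animate toward a full blur, then pause partway through;
        // the paused fraction acts as an adjustable blur radius.
        animator = UIViewPropertyAnimator(duration: 1, curve: .linear) { [weak self] in
            self?.effect = UIBlurEffect(style: .regular)
        }
        animator?.pausesOnCompletion = true
        animator?.fractionComplete = blurIntensity
    }
}
```

With this in place, something like CustomBlurView(intensity: 0.15) gives a much subtler blur than the stock effect.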

Conclusion — How will you use image segmentation?

We’ve built our first app using image segmentation, and hopefully you now have a feel for the power of this technique.

There are many next steps we could take. Create an app that puts custom backgrounds behind people, or one that replaces the sky with cheese; there are so many options available. Have an idea? Let us know in the comments!

Discuss this post on Hacker News