iOS 10.0

This article summarizes the key developer-related features introduced in iOS 10, which runs on currently shipping iOS devices. The article also lists the documents that describe new features in more detail.

For late-breaking news and information about known issues, see Release Notes at https://developer.apple.com/ios/download/. For the complete list of new APIs added in iOS 10, see iOS 10.0 API Diffs. For more information on new devices, see iOS Device Compatibility Reference.

To learn about what’s new in Swift, see Swift Language and The Swift Programming Language (Swift 3).

Providing Haptic Feedback

On iPhone 7 and iPhone 7 Plus, haptics provide additional ways to physically engage users with tactile feedback that gets attention and reinforces actions. Some system-provided interface elements, such as pickers, switches, and sliders, automatically provide haptic feedback as users interact with them. To give you the ability to generate haptics in an app that targets iOS 10, UIKit introduces the new UIFeedbackGenerator class and three concrete subclasses, each of which enables haptics that are appropriate for a specific scenario, as shown in Table 1.

Table 1  Concrete feedback generator classes and example usages

UIImpactFeedbackGenerator: Provides a physical experience that complements the visual feedback for an action or task. For example, the user might feel a thud when a view slides into place or two objects collide.

UINotificationFeedbackGenerator: Indicates that a task or action, such as depositing a check or unlocking a vehicle, has completed, failed, or produced a warning of some kind.

UISelectionFeedbackGenerator: Indicates that the selection is actively changing. For example, the user feels light taps while scrolling a picker wheel.

Using one of the concrete subclasses, you ask the system to generate haptics for a specific scenario, and iOS manages the strength and behavior of the feedback based on the scenario you choose. In addition, you can call the prepare method of UIFeedbackGenerator to inform the system that haptic feedback is about to be required and to minimize latency. To learn how to use haptics to provide the best user experience in your app, see “Haptic Feedback” in iOS Human Interface Guidelines.
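As a sketch of how a concrete generator is used (the view controller and event names here are hypothetical), an app might trigger impact feedback like this:

```swift
import UIKit

class DrawerViewController: UIViewController {
    // Create the generator ahead of the event that should produce haptics.
    let impactGenerator = UIImpactFeedbackGenerator(style: .medium)

    func drawerWillSnapIntoPlace() {
        // prepare() is optional but reduces latency between the
        // triggering event and the haptic; call it shortly beforehand.
        impactGenerator.prepare()
    }

    func drawerDidSnapIntoPlace() {
        // Play the haptic that accompanies the visual "thud."
        impactGenerator.impactOccurred()
    }
}
```

Keeping the generator alive for the duration of the interaction, rather than creating it per event, lets the system keep the Taptic Engine primed.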

SiriKit

Apps that provide services in specific domains can use SiriKit to make those services available from Siri on iOS. Making your services available requires creating one or more app extensions using the Intents and Intents UI frameworks. SiriKit supports services in the following domains:

Audio or video calling

Messaging

Sending or receiving payments

Searching photos

Booking a ride

Managing workouts

Adjusting settings in a CarPlay-enabled vehicle (automotive vendors only)

Making restaurant reservations (requires additional support from Apple)

When the user makes a request involving your service, SiriKit sends your extension an intent object, which describes the user’s request and provides any data related to that request. You use the intent object to provide an appropriate response object, which includes details of how you can handle the user’s request. Siri typically handles all user interactions, but you can use an extension to provide custom UI that incorporates branding or additional information from your app.

SiriKit also provides a mechanism you can use to tell the system about the interactions and activities that occur within your app. SiriKit defines an interaction object, which combines an intent with information about the intent-handling process, including details such as the start time and duration of a specific occurrence of the process. If your app is registered as capable of handling an activity that has the same name as an intent, the system can launch your app with an interaction object containing that intent even if you don’t provide an Intents app extension.

Ride booking is supported by both Maps and Siri, and users can also make restaurant reservations with Maps. Your Intents extension handles interactions that originate from the Maps app in the same way that it handles requests coming from Siri. If you customize the user interface, your Intents UI extension can also configure itself differently, depending on whether the request came from Siri or Maps.

To learn how to support SiriKit and give users new ways to access your services, read SiriKit Programming Guide. When you’re ready to implement the app extensions that handle various intents, see Intents Framework Reference and Intents UI Framework Reference.
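A minimal, hypothetical handler for the messaging domain might look like this (the protocol and response types are from the Intents framework in its Swift 3 form; the class name and the actual message-sending logic are placeholders):

```swift
import Intents

// Sketch of an Intents-extension handler for INSendMessageIntent.
class SendMessageIntentHandler: NSObject, INSendMessageIntentHandling {

    func handle(sendMessage intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        let activity = NSUserActivity(activityType: String(describing: INSendMessageIntent.self))

        // Fail cleanly if Siri resolved no recipients for the request.
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            completion(INSendMessageIntentResponse(code: .failure, userActivity: activity))
            return
        }

        // A real extension would hand the content off to the app's
        // messaging service here before reporting success.
        completion(INSendMessageIntentResponse(code: .success, userActivity: activity))
    }
}
```

Your extension's Info.plist lists the intents it supports; Siri then routes matching requests to the handler above.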

Proactive Suggestions

iOS 10 introduces new ways to increase engagement with your app by helping the system suggest your app to users at appropriate times. If you adopted app search in your iOS 9 app, you gave users access to activities and content deep within your app through Spotlight and Safari search results, Handoff, and Siri suggestions. In iOS 10 and later, you can provide information about what users do in your app, which helps the system promote your app in additional places, such as the keyboard with QuickType suggestions, Maps and CarPlay, the app switcher, Siri interactions, and (for media-playing apps) the lock screen. These opportunities for enhanced integration with the system are supported by a collection of technologies, such as NSUserActivity, web markup defined by Schema.org, and APIs defined in the Core Spotlight, MapKit, UIKit, and Media Player frameworks.

In iOS 10, the NSUserActivity object includes the mapItem property, which lets you provide location information that can be used in other contexts. For example, if your app displays hotel reviews, you can use the mapItem property to hold the location of the hotel the user is viewing so that when the user switches to a travel planning app, that hotel’s location is automatically available. And if you support app search, you can use the new text-based address component properties in CSSearchableItemAttributeSet, such as thoroughfare and postalCode, to fully specify locations to which the user may want to go. Note that when you use the mapItem property, the system automatically populates the contentAttributeSet property, too. To share a location with the system, be sure to specify latitude and longitude values, in addition to values for the address component properties in CSSearchableItemAttributeSet. It’s also recommended that you supply a value for the namedLocation property, so that users can view the name of the location, and the phoneNumbers property, so that users can use Siri to initiate a call to the location.

In iOS 9, adding markup to the structured data on your website enriched the content that users see in Spotlight and Safari search results. In iOS 10, you can use location-related vocabulary defined at Schema.org, such as PostalAddress, to further enhance the user’s experience. For example, if users view a location described on your website, the system can suggest the same location when users switch to Maps. Note that Safari supports both JSON-LD and Microdata encodings of Schema.org vocabularies.

UIKit introduces the textContentType property in the UITextInputTraits protocol so that you can specify the semantic meaning of the content you expect users to enter in a text area. When you provide this information, the system can in some cases automatically select an appropriate keyboard and improve keyboard corrections and proactive integration with information supplied from other apps and websites. For example, if you use UITextContentTypeFullStreetAddress to tell the system that you expect users to enter a complete address in a text field, the system can suggest the address of a location the user was recently viewing.

If your app plays media and you use the MPPlayableContentManager APIs, iOS 10 helps you let users view album art and play media through your app on the lock screen.

If your ride-sharing app uses the MKDirectionsRequest API, iOS 10 can display it in the app switcher when the user is likely to want a ride. To register as a ride-share provider, specify the MKDirectionsModeRideShare value for the MKDirectionsApplicationSupportedModes key in your Info.plist file. If your app supports only ride sharing, the system suggests your app with text that begins “Get a ride to...”; if your app supports both ride sharing and another routing type (such as Automobile or Bike), the system uses the text “Get directions to...”. Note that the MKMapItem object you receive may not include latitude and longitude information and would require geocoding.
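As a sketch of the hotel-review example above, assuming a hypothetical activity type and helper function, an app might advertise a location to the system like this:

```swift
import CoreSpotlight
import MapKit
import MobileCoreServices

// Hypothetical helper: advertise the hotel a user is viewing so the
// system can surface its location in Maps, QuickType, and other apps.
func viewingActivity(forHotelNamed name: String, at placemark: MKPlacemark) -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.hotels.viewing") // placeholder type
    activity.title = name
    activity.mapItem = MKMapItem(placemark: placemark)

    // Supplement the attribute set with address components, a display
    // name, and coordinates so the location is fully specified.
    let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
    attributes.namedLocation = name
    attributes.thoroughfare = placemark.thoroughfare
    attributes.postalCode = placemark.postalCode
    attributes.latitude = NSNumber(value: placemark.coordinate.latitude)
    attributes.longitude = NSNumber(value: placemark.coordinate.longitude)
    activity.contentAttributeSet = attributes

    activity.isEligibleForHandoff = true
    return activity
}
```

Call becomeCurrent() on the returned activity while the hotel is on screen so the system knows it represents what the user is doing now.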

Integrating with the Messages App

In iOS 10, you can create app extensions that interact with the Messages app and let users send text, stickers, media files, and interactive messages, including interactive messages that update as each recipient responds to the message. You can also make your publicly accessible images available to the #images app in Messages. You can create two types of app extensions:

A Sticker pack provides a set of stickers that users can add to their Messages content.

An iMessage app lets you present a custom user interface within the Messages app, create a sticker browser, include text, stickers, and media files within a conversation, and create, send, and update interactive messages. An iMessage app can also help users search images that you host on your app’s related website while they’re in the Messages app.

You can create a Sticker pack without writing any code: Simply drag images into the Sticker Pack folder inside the Stickers asset catalog in Xcode. To develop an iMessage app, you use the APIs in the Messages framework (Messages.framework). To learn about the Messages framework, see Messages Framework Reference. For general information about creating app extensions, see App Extension Programming Guide.

The #images app in Messages shows people popular images from public websites. Your publicly accessible images can be included in #images search results after Apple's web crawler, known as Applebot, has scanned your website. To make your public images available in #images, follow these steps:

Implement an iMessage app.

Add the com.apple.developer.associated-domains key to your app’s entitlements. Include a list of the web domains that host the images you want to make searchable. For each domain, specify the spotlight-image-search service in an entry such as spotlight-image-search:yourdomain.com.

Add an apple-app-site-association file to your website. Add a dictionary for the spotlight-image-search service and include your app ID, which is the team ID or app ID prefix, followed by the bundle ID. You can specify up to 500 paths and patterns that should be included for indexing for #images (for some examples of website paths, see the universal links examples in Creating and Uploading the Association File).

Allow crawling by Applebot (to learn more, see About Applebot).
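For the entitlements step above, the entry might look like this in your app's entitlements property list (yourdomain.com is a placeholder for a domain you control):

```xml
<key>com.apple.developer.associated-domains</key>
<array>
    <string>spotlight-image-search:yourdomain.com</string>
</array>
```

Each additional domain that hosts searchable images gets its own spotlight-image-search entry in the array.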

User Notifications

iOS 10 introduces the User Notifications framework (UserNotifications.framework), which supports the delivery and handling of local and remote notifications. You use the classes of this framework to schedule the delivery of local notifications based on specific conditions, such as time or location. Apps and app extensions can use this framework to receive and potentially modify local and remote notifications when they are delivered to the user’s device.

Also introduced in iOS 10, the User Notifications UI framework (UserNotificationsUI.framework) lets you customize the appearance of local and remote notifications when they appear on the user’s device. You use this framework to define an app extension that receives the notification data and provides the corresponding visual representation. Your extension can also respond to custom actions associated with those notifications.
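A minimal sketch of scheduling a local notification with the new framework (the identifier, text, and three-minute delay are hypothetical):

```swift
import UserNotifications

let center = UNUserNotificationCenter.current()

// Ask for permission once, then schedule a time-based local notification.
center.requestAuthorization(options: [.alert, .sound]) { granted, error in
    guard granted else { return }

    let content = UNMutableNotificationContent()
    content.title = "Pour the tea"
    content.body = "The tea has finished steeping."

    // Fire once, 180 seconds from now.
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 180, repeats: false)
    let request = UNNotificationRequest(identifier: "tea-timer",
                                        content: content,
                                        trigger: trigger)
    center.add(request) { error in
        if let error = error {
            print("Scheduling failed: \(error)")
        }
    }
}
```

Reusing the same identifier replaces any pending request with that identifier, which is convenient for updating a timer-style notification.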

Speech Recognition

iOS 10 introduces a new API that supports continuous speech recognition and helps you build apps that can recognize speech and transcribe it into text. Using the APIs in the Speech framework (Speech.framework), you can perform speech transcription of both real-time and recorded audio. For example, you can get a speech recognizer and start simple speech recognition using code like this:

let recognizer = SFSpeechRecognizer()
let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
recognizer?.recognitionTask(with: request, resultHandler: { (result, error) in
    print(result?.bestTranscription.formattedString)
})

As with accessing other types of protected data, such as Calendar and Photos data, performing speech recognition requires the user’s permission (for more information about accessing protected data classes, see Security and Privacy Enhancements). In the case of speech recognition, permission is required because data is transmitted and temporarily stored on Apple’s servers to increase the accuracy of recognition. To request the user’s permission, you must add the NSSpeechRecognitionUsageDescription key to your app’s Info.plist file and provide content that describes your app’s usage. When you adopt speech recognition in your app, be sure to indicate to users when their speech is being recognized so that they can avoid making sensitive utterances at that time.
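Before starting a recognition task, request authorization explicitly; a minimal sketch (what your app does in the unauthorized cases is up to you):

```swift
import Speech

// Requires an NSSpeechRecognitionUsageDescription entry in Info.plist;
// the system shows that string in the permission prompt.
SFSpeechRecognizer.requestAuthorization { status in
    switch status {
    case .authorized:
        // Safe to create SFSpeechRecognizer tasks now.
        print("Speech recognition authorized")
    case .denied, .restricted, .notDetermined:
        // Fall back to manual text entry or hide the dictation UI.
        print("Speech recognition unavailable")
    }
}
```

The callback may arrive on a background queue, so dispatch to the main queue before updating UI in response to the status.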

Wide Color

Most graphics frameworks throughout the system, including Core Graphics, Core Image, Metal, and AVFoundation, have substantially improved support for extended-range pixel formats and wide-gamut color spaces. By extending this behavior throughout the entire graphics stack, it is easier than ever to support devices with a wide color display. In addition, UIKit standardizes on working in a new extended sRGB color space, making it easy to mix sRGB colors with colors in other, wider color gamuts without a significant performance penalty. Here are some best practices to adopt as you start working with wide color:

In iOS 10, the UIColor class uses the extended sRGB color space, and its initializers no longer clamp raw component values to between 0.0 and 1.0. If your app relies on UIKit to clamp component values (whether you’re creating a color or asking a color for its component values), you need to change your app’s behavior when you link against iOS 10.

When performing custom drawing in a UIView on an iPad Pro (9.7 inch), the underlying drawing environment is configured with an extended sRGB color space.

If your app renders custom image objects, use the new UIGraphicsImageRenderer class to control whether the destination bitmap is created using an extended-range or standard-range format.

If you are performing your own image processing on wide-gamut devices using a lower level API, such as Core Graphics or Metal, you should use an extended range color space and a pixel format that supports 16-bit floating-point component values. When clamping of color values is necessary, you should do so explicitly.

Core Graphics, Core Image, and Metal Performance Shaders provide new options for easily converting colors and images between color spaces.
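A minimal sketch of requesting an extended-range destination with UIGraphicsImageRenderer (the size and the red fill are placeholders for your own drawing):

```swift
import UIKit

// Configure the renderer to use an extended-range bitmap format.
let format = UIGraphicsImageRendererFormat()
format.prefersExtendedRange = true

let renderer = UIGraphicsImageRenderer(size: CGSize(width: 100, height: 100),
                                       format: format)
let image = renderer.image { context in
    // In extended sRGB, component values outside 0.0...1.0 are
    // meaningful rather than being clamped.
    UIColor(red: 1.2, green: 0.0, blue: 0.0, alpha: 1.0).setFill()
    context.fill(CGRect(x: 0, y: 0, width: 100, height: 100))
}
```

Leaving prefersExtendedRange at its default lets the renderer choose based on the device, which is usually the right behavior unless you specifically need one format.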

Adapting to the True Tone Display

The True Tone display uses ambient light sensors to automatically adjust the color and intensity of the display to match the lighting conditions of the current environment. To ensure that your app works well with the standard color shift provided by True Tone, add the new UIWhitePointAdaptivityStyle key to your Info.plist file to describe your app’s primary visual content. For example:

If your app is a photo-editing app, color fidelity is more important than automatic adjustment to the environmental white point. In this case, you can use the UIWhitePointAdaptivityStylePhoto style to reduce the strength of the True Tone shift applied by the system.

If your app is a reading app, conformance with the environmental white point is helpful to users. In this case, you can use the UIWhitePointAdaptivityStyleReading style to increase the strength of True Tone shift applied by the system.
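For the photo-editing case, the Info.plist entry might look like this:

```xml
<key>UIWhitePointAdaptivityStyle</key>
<string>UIWhitePointAdaptivityStylePhoto</string>
```

A reading app would supply UIWhitePointAdaptivityStyleReading as the string value instead.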

App Search Enhancements

iOS 10 and the Core Spotlight framework introduce several enhancements to app search:

In-app searching

Search continuation

Crowdsourcing deep link popularity with differential privacy

Visualization of validation results

The new CSSearchQuery class supports in-app searches of content that you index using existing Core Spotlight APIs. Using this API can eliminate the need to maintain your own separate search index and lets you take advantage of Spotlight’s powerful search technology and matching rules to allow users to search for content without leaving your app, just as they do within Mail, Messages, and Notes.

In iOS 9, using search APIs (such as Core Spotlight, NSUserActivity, and web markup) to index content within your app let users search for that content using the Spotlight and Safari search interfaces. In iOS 10, you can use new Core Spotlight symbols to let users continue a search they began in Spotlight when they open your app. To enable this feature, add the CoreSpotlightContinuation key to your Info.plist file, give it the value YES, and update your code to handle an activity continuation of type CSQueryContinuationActionType. The user info dictionary in the NSUserActivity object that you receive in your application:continueUserActivity:restorationHandler: method includes the CSSearchQueryString key, whose value is a string that represents the user’s query.

iOS 10 introduces a differentially private way to help improve the ranking of your app’s content in search results. iOS submits a subset of differentially private hashes to Apple servers as users use your app and as NSUserActivity objects that include a deep link URL and have their eligibleForPublicIndexing property set to YES are submitted to iOS. The differential privacy of the hashes allows Apple to count the frequency with which popular deep links are visited without ever associating a user with a link.

When you test your website markup and deep links using the App Search API Validation tool, it now displays a visual representation of your results, including supported markup, such as that defined at Schema.org. The validation tool can help you see information that the Applebot web crawler has indexed, such as the title, description, URL, and other supported elements. You can access the validation tool here: https://search.developer.apple.com/appsearch-validation-tool. To learn more about supporting deep links and adding markup, see Mark Up Web Content. To learn how to make your website’s images searchable within the Messages app, see Integrating with the Messages App.
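A minimal in-app search sketch using CSSearchQuery (the query string follows the Core Spotlight query format; the indexed content and attribute names assume items indexed with title and contentDescription):

```swift
import CoreSpotlight

// Match previously indexed items whose title contains "kale",
// case- and diacritic-insensitively (the c and d modifiers).
var foundItems = [CSSearchableItem]()
let query = CSSearchQuery(queryString: "title == \"*kale*\"cd",
                          attributes: ["title", "contentDescription"])

query.foundItemsHandler = { items in
    // Called in batches as matches are found.
    foundItems.append(contentsOf: items)
}
query.completionHandler = { error in
    if let error = error {
        print("Search failed: \(error)")
    } else {
        print("Found \(foundItems.count) items")
    }
}
query.start()
```

Both handlers run on a background queue, so marshal results onto the main queue before updating search UI.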

Widget Enhancements

iOS 10 introduces a new design for the lock screen, which now displays widgets. To ensure that your widget looks good on any background, you can specify widgetPrimaryVibrancyEffect or widgetSecondaryVibrancyEffect, as appropriate (use these properties instead of the deprecated notificationCenterVibrancyEffect property). In addition, widgets now include the concept of display mode (represented by NCWidgetDisplayMode), which lets you describe how much content is available and allows users to choose a compact or expanded view.
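A sketch of a Today widget that opts into the expanded display mode (the class name and the 280-point expanded height are arbitrary choices):

```swift
import UIKit
import NotificationCenter

class TodayViewController: UIViewController, NCWidgetProviding {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Tell the system this widget has more content than the
        // compact view shows, enabling the Show More/Show Less control.
        extensionContext?.widgetLargestAvailableDisplayMode = .expanded
    }

    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        // Compact height is fixed by the system; pick your own
        // height for the expanded mode.
        preferredContentSize = (activeDisplayMode == .compact)
            ? maxSize
            : CGSize(width: maxSize.width, height: 280)
    }
}
```

The system calls the display-mode method whenever the user toggles between the two views, so keep the layout work there lightweight.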

Apple Pay Enhancements

In iOS 10, users can make easy and secure payments using Apple Pay from websites and through interaction with Siri and Maps. For developers, iOS 10 introduces new APIs you can use in code that runs in both iOS and watchOS, the ability to support dynamic payment networks, and a new sandbox testing environment.

iOS 10 introduces new APIs that help you incorporate Apple Pay directly into your website. When you support Apple Pay in your website, users browsing with Safari in iOS or macOS can make payments using their cards in Apple Pay on their iPhone or Apple Watch. To learn more, see ApplePay JS Framework Reference.

The PassKit framework (PassKit.framework) introduces APIs that let you support Apple Pay in places where UIKit is not available. Specifically, PKPaymentAuthorizationController and PKPaymentAuthorizationControllerDelegate enable features provided by PKPaymentAuthorizationViewController and its delegate, but don’t require UIKit. Although the new API is required for supporting Apple Pay in watchOS and in certain intents, it’s recommended that you adopt it in all of your code so that you can provide broad Apple Pay support with a single code base. (To learn more about intents and Siri integration, see SiriKit.)

The PassKit framework also adds features that let card issuers present their cards from within their apps. Specifically, the PKPaymentButtonTypeInStore button type lets you display an Apple Pay button for a card, and the presentPaymentPass: method (defined in PKPassLibrary) lets you programmatically display the card.

When a new payment network becomes available, your app can automatically support the new network without requiring you to modify and recompile your app. The availableNetworks method lets you discover the networks that are available on the user's device at runtime. In addition, the supportedNetworks property is expanded so that it can take some payment provider names as an argument. Your app then automatically supports any networks that the payment provider supports. To learn more, see https://developer.apple.com/apple-pay/.

iOS 10 introduces a new testing environment that lets you provision test cards directly on the device. The test environment returns encrypted test payment data. To use this environment, follow these steps:

Create a testing iCloud account in iTunes Connect.

Log in to that account on your device.

Set the desired region for testing.

Use the test cards listed at https://developer.apple.com/apple-pay/.

Note: When you switch iCloud accounts, the environment switches automatically. You must still test your payments using actual cards in a production environment.
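A minimal, hypothetical sketch of the UIKit-free payment flow described above (merchant identifier, networks, and amounts are placeholders, and delegate handling is omitted):

```swift
import PassKit

// Build a payment request as usual.
let request = PKPaymentRequest()
request.merchantIdentifier = "merchant.com.example.shop"  // placeholder
request.countryCode = "US"
request.currencyCode = "USD"
request.merchantCapabilities = .capability3DS
request.supportedNetworks = [.visa, .masterCard, .amex]
request.paymentSummaryItems = [
    PKPaymentSummaryItem(label: "Example Shop",
                         amount: NSDecimalNumber(string: "12.99"))
]

// PKPaymentAuthorizationController presents the sheet without UIKit.
let controller = PKPaymentAuthorizationController(paymentRequest: request)
// A real app assigns controller.delegate before presenting so it can
// receive the authorization callbacks.
controller.present { presented in
    if !presented {
        print("Apple Pay sheet could not be presented")
    }
}
```

Because this API has no UIKit dependency, the same request-building code can be shared between an iOS app, a watchOS app, and an Intents extension.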

Security and Privacy Enhancements

iOS 10 introduces several changes and additions that help you improve the security of your code and maintain the privacy of user data. To learn more about these items, see https://developer.apple.com/security/.

The new NSAllowsArbitraryLoadsInWebContent key for your Info.plist file gives you a convenient way to allow arbitrary web page loads to work while retaining ATS protections for the rest of your app. To learn more about this key, see NSAppTransportSecurity.
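For example, the corresponding Info.plist fragment might look like this:

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoadsInWebContent</key>
    <true/>
</dict>
```

With this configuration, only content loaded in web views is exempt from ATS; your app's own network connections keep the full protections.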

The SecKey API includes improvements for asymmetric key generation. Use the SecKey API instead of the deprecated Common Data Security Architecture (CDSA) APIs.

The RC4 symmetric cipher suite is now disabled by default for all SSL/TLS connections, and SSLv3 is no longer supported in the Secure Transport API. It’s recommended that you stop using the SHA-1 and 3DES cryptographic algorithms as soon as possible.

The UIPasteboard class supports the Universal Clipboard feature, which lets users copy and paste between devices, and includes API you can use to restrict a pasteboard to a specific device and set an expiration timestamp after which the pasteboard is cleared. Additionally, named pasteboards are no longer persistent—instead, you should use shared containers—and the “Find” pasteboard (that is, the pasteboard identified by the UIPasteboardNameFind constant) is unavailable.
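A sketch of restricting pasteboard contents with the new options (the one-time-code payload and five-minute expiration are hypothetical):

```swift
import UIKit
import MobileCoreServices

let pasteboard = UIPasteboard.general
pasteboard.setItems(
    [[kUTTypeUTF8PlainText as String: "719428"]],
    options: [
        // Keep the item on this device only; don't offer it to
        // other devices via the cross-device clipboard.
        .localOnly: true,
        // Clear the item automatically after five minutes.
        .expirationDate: Date().addingTimeInterval(300)
    ]
)
```

These options suit short-lived sensitive values such as codes and tokens; ordinary copied text can continue to use the default behavior.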

You must statically declare your app’s intended use of protected data classes by including the appropriate purpose string keys in your Info.plist file. For example, you must include the NSCalendarsUsageDescription key to access the user’s Calendar data. If you don’t include the relevant purpose string keys, your app exits when it tries to access the data.
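For example, an app that reads Calendar data might include an entry like this (the description string is illustrative and should explain your app's actual use):

```xml
<key>NSCalendarsUsageDescription</key>
<string>This app adds your reservations to your calendar.</string>
```

The system shows the string in the permission prompt, so write it for the user, not for the reviewer.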

CallKit

The CallKit framework (CallKit.framework) lets VoIP apps integrate with the iPhone UI and give users a great experience. Use this framework to let users view and answer incoming VoIP calls on the lock screen and manage contacts from VoIP calls in the Phone app’s Favorites and Recents views. CallKit also introduces app extensions that enable call blocking and caller identification. You can create an app extension that can associate a phone number with a name or tell the system when a number should be blocked.
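A minimal sketch of a Call Directory extension handler (the phone numbers and label are hypothetical; the system requires entries to be added in ascending numeric order):

```swift
import CallKit

// The principal class of a Call Directory app extension.
class CallDirectoryHandler: CXCallDirectoryProvider {

    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        // Block a known unwanted number.
        context.addBlockingEntry(withNextSequentialPhoneNumber: 1_800_555_1212)

        // Identify a number so incoming calls show a name.
        context.addIdentificationEntry(withNextSequentialPhoneNumber: 1_800_555_4321,
                                       label: "Example Dentistry")

        context.completeRequest()
    }
}
```

The extension runs when the user enables it in Settings and when your app asks the system to reload it, not on every incoming call.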

News Publisher Enhancements

News Publisher makes it easy to deliver beautifully designed news, magazine, and web content to Apple News using the Apple News Format. Anyone can sign up, from major magazines or news organizations to independent publishers and bloggers. To get started or to learn more about recent updates, visit https://newsresources.apple.com.

Video Subscriber Account

iOS 10 introduces the Video Subscriber Account framework (VideoSubscriberAccount.framework) to help apps that support authenticated streaming or authenticated video on demand (also known as TV Everywhere) authenticate with their cable or satellite TV provider. Using the APIs in this framework can help you support a single sign-in experience in which users sign in once to unlock access in all of the streaming video apps that their subscription supports.

App Extensions

iOS 10 introduces several new extension points for which you can create an app extension, such as:

Call Directory

Intents

Intents UI

Messages

Notification Content

Notification Service

Sticker Pack

In addition, iOS 10 includes the following enhancements for third-party keyboard app extensions:

You can automatically detect the input language of a document by using the documentInputMode property of the UITextDocumentProxy class, and change your keyboard extension to align with that language (if supported). When you detect the input language in this way, you can do per-language keyboard switching such as what is built in to Messages.

The new handleInputModeListFromView:withEvent: method lets a keyboard extension display the system’s keyboard picker menu (that is, the globe key menu). A keyboard extension should position the globe key in the same location as the system globe key for each orientation. Also, if you need to provide a custom key, such as one that opens keyboard settings, you should put this key in the same location as the dictation key in the system keyboard.

To learn more about creating app extensions in general, see App Extension Programming Guide.
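A sketch of wiring a custom globe key to the new method, using its Swift name handleInputModeList(from:with:) (button styling and layout constraints are omitted):

```swift
import UIKit

class KeyboardViewController: UIInputViewController {

    let nextKeyboardButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        nextKeyboardButton.setTitle("Next keyboard", for: [])
        // handleInputModeList(from:with:) switches keyboards on a tap
        // and shows the system keyboard picker menu on a long press.
        nextKeyboardButton.addTarget(self,
                                     action: #selector(handleInputModeList(from:with:)),
                                     for: .allTouchEvents)
        view.addSubview(nextKeyboardButton)
    }
}
```

Registering for .allTouchEvents (rather than just .touchUpInside) is what lets the method distinguish taps from long presses.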