In this Metal tutorial, you will learn how to get started with Apple’s 3D graphics API by rendering a simple triangle to the screen.

Update note: Andrew Kharchyshyn updated this tutorial for iOS 12, Xcode 10 and Swift 4.2. He also wrote the original.

In iOS 8, Apple released its own API for GPU-accelerated 3D graphics: Metal.

Metal is similar to OpenGL ES in that it’s a low-level API for interacting with 3D graphics hardware.

The difference is that Metal is not cross-platform. Instead, it’s designed to be extremely efficient with Apple hardware, offering improved speed and low overhead compared to using OpenGL ES.

In this tutorial, you’ll get hands-on experience using the Metal API to create a bare-bones app: drawing a simple triangle. In doing so, you’ll learn some of the most important classes in Metal, such as devices, command queues and more.

This tutorial is designed so that anyone can go through it, regardless of your 3D graphics background — however, things will move along fairly quickly. If you do have some prior 3D-programming or OpenGL experience, you’ll find things much easier, as many of the same concepts apply to Metal.

Note: Metal apps do not run on the iOS Simulator; they require a device with an Apple A7 chip or later, so you'll need such a device to complete this tutorial.

Metal vs. SpriteKit, SceneKit or Unity

Before you get started, it’ll be helpful to understand how Metal compares to higher-level frameworks like SpriteKit, SceneKit or Unity.

Metal is a low-level 3D graphics API, similar to OpenGL ES, but with lower overhead, meaning better performance. It's a very thin layer above the GPU, which means that doing just about anything, such as rendering a sprite or a 3D model to the screen, requires you to write all of the code yourself. The trade-off is that you get full power and control.

Conversely, higher-level game frameworks like SpriteKit, SceneKit and Unity are built on top of lower-level 3D graphics APIs like Metal or OpenGL ES. They provide much of the boilerplate code that you'd normally need to write in a game, such as rendering a sprite or 3D model to the screen.

If all you’re trying to do is make a game, you’ll probably use a higher-level game framework like SpriteKit, SceneKit or Unity most of the time because doing so will make your life much easier. If this sounds like you, we have tons of tutorials to help you get started with Apple Game Frameworks or Unity.

However, there are still two really good reasons to learn Metal:

Push the hardware to its limits: Since Metal is at such a low level, it allows you to really push the hardware to its limits and have full control over how your game works.

It's a great learning experience: Learning Metal teaches you a lot about 3D graphics, writing your own game engine, and how higher-level game frameworks work.

If either of these sounds like a good reason to you, keep reading!

Metal vs. OpenGL ES

OpenGL ES is designed to be cross-platform. That means you can write C++ OpenGL ES code and, most of the time, with some small modifications, run it on other platforms, such as Android.

Apple realized that, although the cross-platform support of OpenGL ES was nice, it was missing something fundamental to how Apple designs its products: the famous Apple integration of the operating system, hardware and software as a complete package.

So Apple took a clean-room approach to see what it would look like if it were to design a graphics API specifically for Apple hardware with the goal of being extremely low overhead and performant, while supporting the latest and greatest features.

The result is Metal, which can provide up to 10× the number of draw calls for your app compared to OpenGL ES. This can enable some amazing effects, as you may remember from the Zen Garden demo in the WWDC 2014 keynote.

Time to dig right in and see some Metal code!

Getting Started

Xcode’s iOS game template comes with a Metal option, but you won’t choose that here. This is because you’re going to put together a Metal app almost from scratch, so you can understand every step of the process.

Download the files that you need for this tutorial using the Download Materials button at the top or bottom of this tutorial. Once you have the files, open HelloMetal.xcodeproj in the HelloMetal_starter folder. You’ll see an empty project with a single ViewController.

There are seven steps required to set up Metal so that you can begin rendering. You need to create a:

1. MTLDevice
2. CAMetalLayer
3. Vertex Buffer
4. Vertex Shader
5. Fragment Shader
6. Render Pipeline
7. Command Queue

You'll go through these one at a time.

1) Creating an MTLDevice

You'll first need to get a reference to an MTLDevice.

Think of an MTLDevice as your direct connection to the GPU. You'll create all the other Metal objects you need (like command queues, buffers and textures) using this MTLDevice.

To do this, open ViewController.swift and add this import to the top of the file:

import Metal

This imports the Metal framework so that you can use Metal classes such as MTLDevice inside this file.

Next, add this property to the ViewController:

var device: MTLDevice!

You’re going to initialize this property in viewDidLoad() rather than in an initializer, so it has to be an optional. Since you know you’re definitely going to initialize it before you use it, you mark it as an implicitly unwrapped optional, for convenience purposes.

Finally, add viewDidLoad() and initialize the device property, like this:

override func viewDidLoad() {
  super.viewDidLoad()

  device = MTLCreateSystemDefaultDevice()
}

MTLCreateSystemDefaultDevice() returns a reference to the default MTLDevice that your code should use.

2) Creating a CAMetalLayer

In iOS, everything you see on screen is backed by a CALayer. There are subclasses of CALayer for different effects, such as gradient layers, shape layers, replicator layers and more.

If you want to draw something on the screen with Metal, you need to use a special subclass of CALayer called CAMetalLayer. You'll add one of these to your view controller.

First, add this new property to the class:

var metalLayer: CAMetalLayer!

Note: If you get a compiler error at this point, make sure that you set the app to target your Metal-compatible iOS device. As mentioned earlier, Metal is not supported on iOS Simulator at this time.

This will store a handy reference to your new layer.

Next, add this code to the end of viewDidLoad():

metalLayer = CAMetalLayer()          // 1
metalLayer.device = device           // 2
metalLayer.pixelFormat = .bgra8Unorm // 3
metalLayer.framebufferOnly = true    // 4
metalLayer.frame = view.layer.frame  // 5
view.layer.addSublayer(metalLayer)   // 6

Going over this line by line:

1. Create a new CAMetalLayer.
2. You must specify the MTLDevice the layer should use. You simply set this to the device you obtained earlier.
3. Set the pixel format to bgra8Unorm, which is a fancy way of saying "8 bits each for Blue, Green, Red and Alpha, in that order, with normalized values between 0 and 1." This is one of only two possible formats to use for a CAMetalLayer, so normally you'd just leave this as-is.
4. Apple encourages you to set framebufferOnly to true for performance reasons, unless you need to sample from the textures generated for this layer or need to run compute kernels on the layer's drawable texture. Most of the time, you don't need to do either.
5. You set the frame of the layer to match the frame of the view.
6. Finally, you add the layer as a sublayer of the view's main layer.

3) Creating a Vertex Buffer

Everything in Metal is a triangle. In this app, you’re just going to draw one triangle, but even complex 3D shapes can be decomposed into a series of triangles.
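As an illustration (the array below is hypothetical and isn't used anywhere in this project), here's how even a simple square decomposes into two triangles:

```swift
// Hypothetical example: a unit square on the Z=0 plane, decomposed
// into two triangles. Each row is one vertex: x, y, z.
let quadVertexData: [Float] = [
  // Triangle 1: lower-left, lower-right, upper-right
  -1.0, -1.0, 0.0,
   1.0, -1.0, 0.0,
   1.0,  1.0, 0.0,
  // Triangle 2: lower-left, upper-right, upper-left
  -1.0, -1.0, 0.0,
   1.0,  1.0, 0.0,
  -1.0,  1.0, 0.0
]

// 2 triangles x 3 vertices x 3 floats = 18 values
print(quadVertexData.count) // 18
```

Note that the two triangles share an edge, so two of the vertices appear twice; real engines avoid this duplication with index buffers, but that's beyond the scope of this tutorial.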

In Metal, the default coordinate system is the normalized coordinate system, which means that by default you’re looking at a 2x2x1 cube centered at (0, 0, 0.5).

If you consider the Z=0 plane, then (-1, -1, 0) is the lower left, (0, 0, 0) is the center, and (1, 1, 0) is the upper right. In this tutorial, you'll draw a triangle with the following three points: (0, 1, 0), (-1, -1, 0) and (1, -1, 0).
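If it helps to see this coordinate system as code, here's a small sketch. The helper function and its names are purely illustrative and aren't part of the project; it maps a point from a view's usual top-left-origin coordinate space into the normalized space described above:

```swift
// Hypothetical helper: maps a point from a view's coordinate space
// (origin at the top-left, y pointing down) to Metal's normalized
// coordinate space (origin at the center, y pointing up, axes in [-1, 1]).
func normalizedCoordinates(x: Double, y: Double,
                           width: Double, height: Double) -> (x: Float, y: Float) {
  let nx = Float(x / width) * 2 - 1
  let ny = 1 - Float(y / height) * 2
  return (x: nx, y: ny)
}

// The middle of a 320x480 view maps to the center of the cube:
let center = normalizedCoordinates(x: 160, y: 240, width: 320, height: 480)
print(center) // (x: 0.0, y: 0.0)

// The top-left corner maps to (-1, 1):
let topLeft = normalizedCoordinates(x: 0, y: 0, width: 320, height: 480)
print(topLeft) // (x: -1.0, y: 1.0)
```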

You’ll have to create a buffer for this. Add the following constant property to your class:

let vertexData: [Float] = [
   0.0,  1.0, 0.0,
  -1.0, -1.0, 0.0,
   1.0, -1.0, 0.0
]

This creates an array of floats on the CPU. You need to send this data to the GPU by moving it into something called an MTLBuffer.

Add another new property for this:

var vertexBuffer: MTLBuffer!

Then, add this code to the end of viewDidLoad():

// 1
let dataSize = vertexData.count * MemoryLayout.size(ofValue: vertexData[0])
// 2
vertexBuffer = device.makeBuffer(bytes: vertexData, length: dataSize, options: [])

Taking it comment by comment:

1. You need to get the size of the vertex data in bytes. You do this by multiplying the size of the first element by the count of elements in the array.
2. You call makeBuffer(bytes:length:options:) on the MTLDevice to create a new buffer on the GPU, passing in the data from the CPU. You pass an empty array for default configuration.
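As a quick sanity check, the size math works out like this. This is plain Swift with no GPU involved; note also that for types with padding, MemoryLayout<T>.stride is the safer measure of an element's footprint in an array, although for Float the size and stride are both 4 bytes:

```swift
let vertexData: [Float] = [
   0.0,  1.0, 0.0,
  -1.0, -1.0, 0.0,
   1.0, -1.0, 0.0
]

// 9 Floats x 4 bytes each = 36 bytes
let dataSize = vertexData.count * MemoryLayout.size(ofValue: vertexData[0])
print(dataSize) // 36

// For Float, size and stride agree, so either works here:
print(MemoryLayout<Float>.size)   // 4
print(MemoryLayout<Float>.stride) // 4
```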

4) Creating a Vertex Shader

The vertices that you created in the previous section will become the input to a little program that you’ll write called a vertex shader.

A vertex shader is simply a tiny program that runs on the GPU, written in a C++-like language called the Metal Shading Language.

A vertex shader is called once per vertex, and its job is to take that vertex’s information, such as position — and possibly other information such as color or texture coordinate — and return a potentially modified position and possibly other data.

To keep things simple, your simple vertex shader will return the same position as the position passed in.

The easiest way to understand vertex shaders is to see one for yourself. Go to File ▸ New ▸ File, choose iOS ▸ Source ▸ Metal File, and click Next. Enter Shaders.metal for the filename and click Create.

Note: In Metal, you can include multiple shaders in a single Metal file. You can also split your shaders across multiple Metal files if you would like, as Metal will load shaders from any Metal file included in your project.

Add the following code to the bottom of Shaders.metal:

vertex float4 basic_vertex(                                 // 1
  const device packed_float3* vertex_array [[ buffer(0) ]], // 2
  unsigned int vid [[ vertex_id ]]) {                       // 3
  return float4(vertex_array[vid], 1.0);                    // 4
}

Here’s what’s going on in the code above:

1. All vertex shaders must begin with the keyword vertex. The function must return (at least) the final position of the vertex. You do this here by indicating float4 (a vector of four floats). You then give the name of the vertex shader; you'll look up the shader later using this name.
2. The first parameter is a pointer to an array of packed_float3 (a packed vector of three floats), i.e., the position of each vertex. You use the [[ ... ]] syntax to declare attributes, which specify additional information such as resource locations, shader inputs and built-in variables. Here, you mark this parameter with [[ buffer(0) ]] to indicate that the first buffer of data that you send to your vertex shader from your Metal code will populate this parameter.
3. The vertex shader also takes a special parameter with the vertex_id attribute, which means that Metal will fill it in with the index of this particular vertex inside the vertex array.
4. Here, you look up the position inside the vertex array based on the vertex ID and return it. You also convert the vector to a float4, where the final value is 1.0; long story short, this is required for 3D math.

5) Creating a Fragment Shader

After the vertex shader completes, Metal calls another shader for each fragment (think pixel) on the screen: the fragment shader.

The fragment shader gets its input values by interpolating the output values from the vertex shader. For example, consider the fragment between the bottom two vertices of the triangle:

The input value for this fragment will be a 50/50 blend of the output value of the bottom two vertices.
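Conceptually, this is just a linear interpolation. The little helper below is hypothetical and not part of the Metal API; the GPU performs this blending for you across every interpolated value, but seeing it in plain Swift makes the idea concrete:

```swift
// Hypothetical sketch of what the GPU does when interpolating a value
// between two vertices: a classic linear interpolation (lerp).
func interpolate(_ a: Float, _ b: Float, t: Float) -> Float {
  return a + (b - a) * t
}

// A fragment exactly halfway between two vertices (t = 0.5)
// gets a 50/50 blend of their output values:
let left: Float = 0.0
let right: Float = 1.0
print(interpolate(left, right, t: 0.5)) // 0.5
```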

The job of a fragment shader is to return the final color for each fragment. To keep things simple, you’ll make each fragment white.

Add the following code to the bottom of Shaders.metal:

fragment half4 basic_fragment() { // 1
  return half4(1.0);              // 2
}

Reviewing line by line:

1. All fragment shaders must begin with the keyword fragment. The function must return (at least) the final color of the fragment. You do so here by indicating half4 (a four-component color value RGBA). Note that half4 is more memory-efficient than float4 because it writes to less GPU memory.
2. Here, you return (1, 1, 1, 1) for the color, which is white.

6) Creating a Render Pipeline

Now that you’ve created a vertex and fragment shader, you need to combine them — along with some other configuration data — into a special object called the render pipeline.

One of the cool things about Metal is that the shaders are precompiled, and the render pipeline configuration is compiled once, when you first set it up. This makes everything extremely efficient.

First, add a new property to ViewController.swift:

var pipelineState: MTLRenderPipelineState!

This will keep track of the compiled render pipeline you’re about to create.

Next, add the following code to the end of viewDidLoad():

// 1
let defaultLibrary = device.makeDefaultLibrary()!
let fragmentProgram = defaultLibrary.makeFunction(name: "basic_fragment")
let vertexProgram = defaultLibrary.makeFunction(name: "basic_vertex")

// 2
let pipelineStateDescriptor = MTLRenderPipelineDescriptor()
pipelineStateDescriptor.vertexFunction = vertexProgram
pipelineStateDescriptor.fragmentFunction = fragmentProgram
pipelineStateDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm

// 3
pipelineState = try! device.makeRenderPipelineState(descriptor: pipelineStateDescriptor)

Taking it section by section:

1. You can access any of the precompiled shaders included in your project through the MTLLibrary object that you get by calling device.makeDefaultLibrary()!. Then, you can look up each shader by name.
2. You set up your render pipeline configuration here. It contains the shaders that you want to use, as well as the pixel format for the color attachment, i.e., the output buffer that you're rendering to, which is the CAMetalLayer itself.
3. Finally, you compile the pipeline configuration into a pipeline state that is efficient to use from here on out.

7) Creating a Command Queue

The final one-time setup step is to create an MTLCommandQueue.

Think of this as an ordered list of commands that you tell the GPU to execute, one at a time.

To create a command queue, simply add a new property:

var commandQueue: MTLCommandQueue!

Then, add the following line at the end of viewDidLoad():

commandQueue = device.makeCommandQueue()

Congrats — your one-time setup code is done!

Rendering the Triangle

Now, it’s time to move on to the code that executes each frame — to render the triangle!

This is done in five steps:

1. Create a Display Link
2. Create a Render Pass Descriptor
3. Create a Command Buffer
4. Create a Render Command Encoder
5. Commit Your Command Buffer

Note: In theory, this app doesn’t actually need to render things once per frame, because the triangle doesn’t move after it’s drawn. However, most apps do have moving pieces, so you’ll do things this way to learn the process. This also gives a nice starting point for future tutorials.

1) Creating a Display Link

You need a way to redraw the screen every time the device screen refreshes.

CADisplayLink is a timer that's synchronized to the display's refresh rate: the perfect tool for the job! To use it, add a new property to the class:

var timer: CADisplayLink!

Initialize it at the end of viewDidLoad() as follows:

timer = CADisplayLink(target: self, selector: #selector(gameloop))
timer.add(to: RunLoop.main, forMode: .default)

This sets up your code to call a method named gameloop() every time the screen refreshes.

Finally, add these stub methods to the class:

func render() {
  // TODO
}

@objc func gameloop() {
  autoreleasepool {
    self.render()
  }
}

Here, gameloop() simply calls render() each frame, which, right now, just has an empty implementation. Time to flesh this out.

2) Creating a Render Pass Descriptor

The next step is to create an MTLRenderPassDescriptor, an object that configures which texture is being rendered to, what the clear color is, and a few other settings.

Add these lines inside render(), in place of the // TODO comment:

guard let drawable = metalLayer?.nextDrawable() else { return }
let renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = drawable.texture
renderPassDescriptor.colorAttachments[0].loadAction = .clear
renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
  red: 0.0,
  green: 104.0/255.0,
  blue: 55.0/255.0,
  alpha: 1.0)

First, you call nextDrawable() on the Metal layer that you created earlier. This returns the texture that you need to draw into in order for something to appear on the screen.

Next, you configure the render pass descriptor to use that texture. You set the load action to Clear, which means “set the texture to the clear color before doing any drawing,” and you set the clear color to the green color used on the site.
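As an aside, the clear color above is just an ordinary RGB value with each 0-255 component divided by 255, since MTLClearColor expects components in the 0.0-1.0 range. A quick sketch of the arithmetic (the helper function is purely illustrative):

```swift
// Converting 0-255 color components to the 0.0-1.0 range Metal expects:
func normalized(_ component: Double) -> Double {
  return component / 255.0
}

print(normalized(104.0)) // ~0.408 (the green component used above)
print(normalized(55.0))  // ~0.216 (the blue component)
print(normalized(255.0)) // 1.0
```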

3) Creating a Command Buffer

The next step is to create a command buffer. Think of this as the list of render commands that you wish to execute for this frame. The cool thing is that nothing actually happens until you commit the command buffer, giving you fine-grained control over when things occur.

Creating a command buffer is easy. Simply add this line to the end of render():

let commandBuffer = commandQueue.makeCommandBuffer()!

A command buffer contains one or more render commands. You’ll create one of these next.

4) Creating a Render Command Encoder

To create a render command, you use a helper object called a render command encoder. To try this out, add these lines to the end of render():

let renderEncoder = commandBuffer
  .makeRenderCommandEncoder(descriptor: renderPassDescriptor)!
renderEncoder.setRenderPipelineState(pipelineState)
renderEncoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
renderEncoder
  .drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3, instanceCount: 1)
renderEncoder.endEncoding()

Here, you create a command encoder and specify the pipeline and vertex buffer that you created earlier.

The most important part is the call to drawPrimitives(type:vertexStart:vertexCount:instanceCount:). Here, you're telling the GPU to draw a set of triangles based on the vertex buffer. To keep things simple, you're drawing only one. The method arguments tell Metal that the triangle consists of three vertices, starting at index 0 inside the vertex buffer, and that there is one triangle in total.

When you're done, you simply call endEncoding().

5) Committing Your Command Buffer

The final step is to commit the command buffer. Add these lines to the end of render():

commandBuffer.present(drawable) commandBuffer.commit()

The first line is needed to make sure that the GPU presents the new texture as soon as the drawing completes. Then you commit the transaction to send the task to the GPU.

Phew! That was a ton of code, but, at long last, you are done! Build and run the app and bask in your triangular glory:

Where to Go From Here?

The final project for this tutorial is included in the materials bundle, which you can download using the Download Materials button at the top or bottom of this tutorial.

You have learned a ton about the Metal API! You now have an understanding of some of the most important concepts in Metal, such as shaders, devices, command buffers, pipelines and more.

Also, be sure to check out Apple's own resources on Metal, such as the Metal developer documentation and the WWDC session videos.

You also might enjoy the Beginning Metal course on our site, where we explain these same concepts in video form, but with even more detail.

Or you can dive into books: Check out our Metal by Tutorials book.

I hope you enjoyed this tutorial, and if you have any comments or questions, please join the forum discussion below!