Wolfenstein 3D for iPhone

For once, developers were able to read the source code of an id Software product just a few days after its release. I spent a week in my spare time reading the internals of the Wolfenstein 3D for iPhone engine. It is by far the cleanest and easiest id source code release to date.



This review is far from exhaustive, but it provides answers to a few of the questions I asked myself:



How was the absence of direct framebuffer access addressed?

How is the engine (ANSI C) coupled with Objective-C?

How was the absence of immediate mode in OpenGL ES addressed?

Is NSTimer the only way to perform animation on this #*! iPhone?

Download





Overall design

If you are not familiar with iPhone development, in a nutshell: you don't control it, it controls you! The big picture does not look like the classic:



int main(int argc,char* argv[]) { while(gameOn) { getUserInputs(); updateTimer(); updateWorld(); renderWorld(); swapBuffer(); } }



With something like this, you would have total control of the thread/process, and that's just not the way the iPhone works. Everything is based on events: you define callback methods and function pointers that the iPhone will call when it needs to:





The "while loop", updateWorld and renderWorld, for instance, are replaced by an NSTimer object that will call your rendering/update method at a FIXED rate. User inputs are handled via callback methods that your main class is supposed to override.

You end up with the following block to set up the NSTimer in EAGLView.m :





self.animationTimer = [NSTimer scheduledTimerWithTimeInterval:0.032
                                                       target:self
                                                     selector:@selector(drawView)
                                                     userInfo:nil
                                                      repeats:YES];



As you can see, the method called by NSTimer is named drawView , and the refresh rate is set to 1/0.032 (roughly 30Hz). The drawView method updates the world, calls the C rendering method ( iphoneFrame ) and finally swaps the buffer to the screen. Most of the engine is ANSI C; Objective-C is just used to host the window, load textures and grab user input. The following block shows the Objective-C method overrides needed to grab user input.





- (void)touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event {
    [self handleTouches:touches withEvent:event];
}

- (void)touchesEnded:(NSSet*)touches withEvent:(UIEvent*)event {
    [self handleTouches:touches withEvent:event];
}

- (void)touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event {
    [self handleTouches:touches withEvent:event];
}

For any finger on the screen, the method handleTouches is called with the touch data as parameters. The core of the ANSI C engine is then updated with the new position and angles.

Engine





The DOS version of Wolfenstein was written in Borland C/TASM with direct framebuffer access; it was a pure raycaster engine:



For every column of pixels on the screen, a ray was cast. Taking advantage of the axis-aligned walls, the intersection of the ray with the first blocker was fast to determine. The distance from the POV gave the height of the pixel column to write in the framebuffer. After a little bit of fishbowl perspective correction, you had the illusion of 3D. Check out F. Permadi's article; it's pure gold if you want to read more about raycaster engines.









Wolfenstein for iPhone could not go this way because the iPhone library CoreSurface/CoreSurface.h needed to access the framebuffer is restricted: you just can't use it in App Store applications.



So, how do you render the world if you cannot write columns of pixels into the framebuffer based on their distance?



You keep the first part of the algorithm for visibility determination:

Cast a ray for every column of pixels (Wolf3D casts 640 rays); this builds the list of visible walls.

Build a list of every sprite entity in the room and sort them by distance: back to front.

Set GL_MODELVIEW and GL_PROJECTION according to the player's POV.

Using the wall coordinates, draw a bunch of big quads for walls and entities.

The resulting engine is actually even more elegant than the original, in my opinion. And it also provides hardware texture filtering!





void iphoneFrame() {
    int msec = 14;    // fixed time
    iphoneFrameNum++;

    if ( consoleActive ) {
        iphoneSet2D();
        Client_Screen_DrawConsole();
        iphoneSavePrevTouches();
        GLimp_EndFrame();
        return;
    }

    // fill the floor and ceiling
    R_Draw_Fill( 0, 0, viddef.width, viddef.height >> 1, r_world->ceilingColour );
    R_Draw_Fill( 0, viddef.height >> 1, viddef.width, viddef.height, r_world->floorColour );

    // draw 3D world
    R_SetGL3D( Player.position );
    R_RayCast( Player.position, r_world );
    R_DrawSprites();

    // draw 2D overlays
    iphoneSet2D();

    // Draw damage or bonus blending, simulate palette switch
    Client_Screen_DrawConsole();
    ShowTilt();

    // do the swapbuffers
    GLimp_EndFrame();
}

Escaping the infamous immediate mode

The iPhone runs OpenGL ES 1.1, hence you have no access to glVertex3f, glTexCoord2f and all the other outdated immediate-mode functions. There were a bunch of calls to these methods in the original Wolf3D Redux by Michael Liebscher. Instead of changing every method call, a really neat abstraction layer was implemented, hiding a vertex array mechanism.



What looks like:



pfglBegin( GL_QUADS );
pfglTexCoord2f( 1.0, 0.0 );
pfglVertex3f( 0.0, 0.0, 0.0 );
...
pfglEnd();

Is actually doing this:



//pfglBegin( GL_QUADS );
curr_vertex = 0;
curr_prim = prim;

//pfglTexCoord2f( 1.0, 0.0 );
vab.st[ 0 ] = s;
vab.st[ 1 ] = t;

//pfglVertex3f( 0.0, 0.0, 0.0 );
vab.xyz[ 0 ] = x;
vab.xyz[ 1 ] = y;
vab.xyz[ 2 ] = z;
immediate[ curr_vertex ] = vab;
curr_vertex++;

//pfglEnd();
qglDrawArrays( curr_prim, 0, curr_vertex );

Of course, to make all this work, OpenGL is initialized as follows:

struct Vertex {
    float   xyz[3];
    float   st[2];
    GLubyte c[4];
};
typedef struct Vertex Vertex;

#define MAX_VERTS 16384
Vertex immediate[ MAX_VERTS ];

void initGL( void ) {
    qglVertexPointer( 3, GL_FLOAT, sizeof( Vertex ), immediate[ 0 ].xyz );
    qglTexCoordPointer( 2, GL_FLOAT, sizeof( Vertex ), immediate[ 0 ].st );
    qglColorPointer( 4, GL_UNSIGNED_BYTE, sizeof( Vertex ), immediate[ 0 ].c );
    qglEnableClientState( GL_VERTEX_ARRAY );
    qglEnableClientState( GL_TEXTURE_COORD_ARRAY );
    qglEnableClientState( GL_COLOR_ARRAY );
}

Short-lived bottleneck

While digging through the source, I found that the framebuffer was being swapped way more often than necessary. After I fired an email to John Carmack, it turned out this is already fixed in v1.1!

From: Fabien Sanglard <fabien.sanglard@fabiensanglard.net>
To: John Carmack <johnc@idsoftware.com>
Subject: How to make Wolfenstein's iphoneFrame 50% faster.

Hello John,

I've been reading the code of Wolfenstein 3D for a few days. I've noticed that the buffer is swapped at the end of every drawView call (pretty much standard). But it looks like the buffer is also swapped in calls to the GLimp_EndFrame method. It doesn't look like the Right Thing to Do (unless I am missing something).

According to my testing, changing:

void GLimp_EndFrame() {
    [eaglview swapBuffers];
}

to

void GLimp_EndFrame() {
    //[eaglview swapBuffers];
}

changed the iphoneFrame runtime from 20ms to 10ms.

From: John Carmack <johnc@idsoftware.com>
To: Fabien Sanglard <fabien.sanglard@fabiensanglard.net>
Subject: Re: How to make Wolfenstein's iphoneFrame 50% faster.

That was already changed in v1.1. It was interesting that the only way that was possible to go unnoticed was the fact that the iPhone uses triple buffering instead of double buffering -- with double buffering the screen would have never been updated at all.

So the next release should be even faster. It's also good news for people hoping for Doom: Wolfenstein is far from pushing the iPhone to its limits. As a side note, I was really surprised to get an answer from John Carmack's public email address. It's pretty amazing that a total stranger can exchange emails with one of the gods of programming.



Compiling

If you are an Apple iPhone Developer, you can actually build the game from the source and upload it on your iPhone.

An early attempt, selecting the configuration "ReleaseEpisode1", will fail:



CodeSign error: Code Signing Identity 'iPhone Developer: John Carmack' does not match any code-signing certificate in your keychain.

No doubt, the code IS still warm. Just remove John Carmack's signing certificate, grab a new "Provisioning Certificate" from the Apple Developer Portal and build again: the game builds flawlessly and gets uploaded to the iPhone device.



Objective-C / C++ communication

The engine features a neat example of method calls between Objective-C and ANSI C.



Objective-C => ANSI C is quite easy: just import the header with the function declarations and call them.

This is done in - (void)drawView , to call iphoneFrame() .



Objective-C <= ANSI C: calling an Objective-C method from a C program is a little bit more tricky.



You need the main UIView to maintain a reference to itself via a global variable:



EAGLView.m

EAGLView *eaglview;

- (id)initWithCoder:(NSCoder*)coder {
    eaglview = self;
}

void GLimp_EndFrame() {
    [eaglview swapBuffers];
}

You then declare the function extern "C" (if you are in C++) and can call it from your C/C++ code:



#ifdef __cplusplus
extern "C" {
#endif

void GLimp_EndFrame( void );

#ifdef __cplusplus
}
#endif

...

void foo( void ) {
    GLimp_EndFrame();
}

Recommended reading

None: I've been playing way too much soccer recently.


