With this code, we have the structure for a chat application. We’re defining three functions in our Channel here:

join - this is called whenever a client initializes the WebSocket connection. In this chat app of ours, the client passes a "name" parameter on joining, and we can save that to the socket with assign. Once a client has joined, they can send and receive messages scoped to that topic ("chat_room" in this instance). It must return {:ok, socket} or {:error, socket, "reason"} to allow or deny pubsub access.

event - this is called whenever a client sends a WebSocket message to the server. In this case, this function handles a new chat message: the client sends a "new:message" event with a "content" parameter, the Channel grabs the user's name (which was saved to the socket connection when the user joined) with get_assign, and broadcasts the new message to all connected clients. It must return socket.

leave - the opposite of join. This removes a client from the set of subscribers to the "chat_room" topic and tears down the connection. It must return socket.
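Pieced together from the three descriptions above, the Channel might look something like this. This is a hypothetical sketch against the early Phoenix Channel API described here - the module name, topic, and payload shapes are ours, not the actual app's code:

```elixir
defmodule Chat.RoomChannel do
  use Phoenix.Channel

  # Called when a client opens the connection; save the passed-in
  # name onto the socket and allow access to the topic.
  def join(socket, "chat_room", %{"name" => name}) do
    {:ok, assign(socket, :name, name)}
  end

  # Called for each incoming event; rebroadcast new chat messages
  # to everyone subscribed to the topic.
  def event(socket, "new:message", %{"content" => content}) do
    name = get_assign(socket, :name)
    broadcast socket, "new:message", %{name: name, content: content}
    socket
  end

  # Called when a client disconnects; nothing extra to clean up here.
  def leave(socket, _message) do
    socket
  end
end
```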



Now that we have the basics of joining, handling/broadcasting events, and leaving, let’s get to the interesting bit.

Syncing Multiple Clients

The idea of Jambells is technically quite similar to the Chrome Racer experiment - multiple users join a “room” using a shared code, and a synchronized game kicks off across everyone’s device. In Jambells, notes descend Guitar Hero style down the screen and you shake your phone (like a handbell) when one of your notes reaches the play area.

With multiple people playing a song together, it's vital that all the devices stay in sync. Client-side clocks keep the game in sync after the start, so the challenge at hand was having all devices “start the game” at the same time.

The first attempt at this had the game leader send a "game:start" event to the server, which would in turn broadcast "game:started" to all the clients.

link to code
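That first attempt amounts to a one-line relay. A hypothetical sketch of what that handler could look like (illustrative only; the real handler is in the linked code):

```elixir
# The leader's "game:start" is simply rebroadcast to every
# connected client as "game:started".
def event(socket, "game:start", _message) do
  broadcast socket, "game:started", %{}
  socket
end
```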

We crossed our fingers and hoped this just worked … but it didn't. Some devices received the "game:started" message later than others, and in a very inconsistent manner. It's incredible how a gap of something like half a second can turn a holiday song everyone knows into a completely unrecognizable sound. Improvements were essential.

We guessed that since different phones had different levels of latency, we could just measure that latency and send the "game:started" event along with some calculated delays.

link to code

So when the leader sends the "game:start" event, the server broadcasts a "game:ping", and all the clients respond immediately with "game:pong". Once all the delays are calculated, "game:started" is broadcast and each client waits its designated delay time before starting.
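The delay bookkeeping itself is simple arithmetic. Here's a self-contained sketch of the idea - the module and function names are ours, not from the actual Jambells code, and it assumes each client should wait out the gap between its own round trip and the slowest client's:

```elixir
defmodule StartDelays do
  # latencies: a map of client id => measured round-trip time in ms.
  # Each client's delay is the difference between its round trip and
  # the slowest one, so in theory everyone starts at the same instant.
  def calculate(latencies) do
    slowest = latencies |> Map.values() |> Enum.max()
    Map.new(latencies, fn {id, rtt} -> {id, slowest - rtt} end)
  end
end

IO.inspect(StartDelays.calculate(%{"ann" => 40, "ben" => 100, "cam" => 90}))
# => %{"ann" => 60, "ben" => 0, "cam" => 10}
```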

We felt smart, crossed our fingers, and tested things out again. Unfortunately, this approach was just as messed up as before. With some logging and video analysis, we discovered that any client who was given a delay would be behind by the amount of time we told it to wait.

Strange, still messed up, but not all bad news. We just removed the delays and everything worked! So it’s not that some devices have different latency issues, but instead that WebSocket communication just needs to be warmed up before being consistently speedy across devices.

link to code

So this is what we landed on. We no longer clock response times to determine latency, but we instead just wait for all clients to pong. Once all the connections have been primed, we broadcast "game:started" and everyone starts together! We’ll admit it’s still not perfect, but without any crazy logic or optimized use of multiple app servers spread across the globe (how Chrome Racer tackles this problem) we’re seeing accurate synchronization within about 1/8th of a second.
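The "wait for all clients to pong" bookkeeping can be sketched with a MapSet. Again, the names here are hypothetical rather than the actual Jambells code:

```elixir
defmodule PongTracker do
  # Track which players still owe a "game:pong"; once the set is
  # empty, the server can safely broadcast "game:started".
  def new(player_ids), do: MapSet.new(player_ids)

  def record_pong(pending, player_id), do: MapSet.delete(pending, player_id)

  def all_ponged?(pending), do: MapSet.size(pending) == 0
end

pending = PongTracker.new(["ann", "ben"])
pending = PongTracker.record_pong(pending, "ann")
IO.inspect(PongTracker.all_ponged?(pending))  # => false
pending = PongTracker.record_pong(pending, "ben")
IO.inspect(PongTracker.all_ponged?(pending))  # => true
```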

Recap

This approach of warming up a WebSocket connection and then relying on it to be speedy enough seems pretty solid. It could of course be improved on - ideally you'd have your server repeatedly broadcast the state so clients can continuously home in on synchronization, and possibly calculate the actual latency once the connection has been “warmed up” to help even more. Given the project constraints though (holiday season release pressure, small server-side team - me), we happily achieved satisfactory synchronization without too much hassle.