Tim Williams • April 18, 2015

Since the inception of the internet, the predominant data transfer paradigm has been the traditional request/response method defined by the HTTP standard. With this method the client issues a request, consisting of headers and possibly a body of data, to a server; the server parses the request and generates a response. Once the client has fully downloaded the response, the connection to the server is closed and the client parses and, typically, displays the response to the user.

Simplified HTTP Request

This method is very efficient for relatively static data. However, with the rising popularity of Single Page Applications such as social networks, collaborative documents, and web-based communication systems, there is a real need for real-time communication. In my opinion, Node.js is the first highly capable solution to this problem.

Node.js is described as a “non-blocking evented I/O” solution. To me that means dealing with streams of data wasn’t an afterthought in the development of the platform, unlike how the support feels bolted on in solutions like the LAMP stack.

At a high level here is why Node.js is the right choice for real-time applications:

- Event Handling: Node.js services offer event-based interaction through a unified interface that is very easy to understand. The internal services all implement a common event abstraction, giving developers a predictable standard with which to hook into these events. This makes reacting to changes on observed objects much simpler, and makes it easy to extend the same standard to emit custom events from developer-defined objects.
- WebSocket Capability: Because Node.js was built around the principle of working with data streams, it has a remarkably solid and simple foundation for WebSockets. Socket.io is a framework built to extend that capability and make it easy for developers to implement the technology.
- Non-Blocking: Asynchronous operations are the other component that makes a simple event interface so powerful. Processes can happen in any order, allowing input from different sources to flow naturally. In real-time applications much of that input comes from users who, unless you are playing a game of chess, don't tend to act synchronously!

Putting together streams and WebSockets creates a backbone for an application which is just about as real-time as the web can offer!

Simplified Overview of Stream Powered WebSocket Communication

This simplified overview demonstrates the communication paradigm made possible by WebSockets and Node.js data streams. The client requests a WebSocket connection over a standard HTTP request. The server upgrades that connection and issues an “upgrade successful” response, leaving the server and client with an open two-way communication channel. The simplified server here observes a process, or perhaps a third-party service, that provides a data stream. Using event handling, the server responds to changes in the state of the object it is observing and communicates those changes to the client. The client then interprets the communication and updates its models, which use two-way data binding to render immediate results to the user. The user can also initiate changes in the model; those changes are hooked by the client through two-way data binding and, using the same event-based logic, communicated back to the server.

Though this overview is very simplified, it’s easy to see where the advantages of this approach shine over using a REST API or another design pattern to emulate a real-time environment. With REST there is no clear way for the server to push events to the client, so the client must inefficiently poll the server with frequent HTTP requests.