Creating software does not end with writing good code. It is complete only when the software is deployed, handles requests properly, and can scale without hurting performance or driving up the cost of running it.

You’re probably thinking about how you have cloud computing to take care of all these things. “So what is this new serverless thing, Vignes?”

Serverless computing is an architectural style in which code is executed on a cloud platform where we don’t need to worry about hardware and software setup, security, performance, or the cost of idle CPU time. It’s an advancement of cloud computing that goes beyond infrastructure, abstracting away the software environment as well. In other words, no configuration is required to run the code.

With serverless, the following will be your working style:

1. Develop the code.
2. Upload the code to the service provider.
3. Configure the trigger (an HTTP request, in our case).

Our work is done! Now the platform provider will take care of incoming requests and scaling.

Introduction to Serverless Microservices

Serverless architecture is often coupled with a microservices style design. A microservice is a standalone part of big software that handles requests for one specific module. By creating microservices that can run in a serverless environment, it becomes easy to maintain the code and speed up deployments.

Introduction to AWS Lambda & GCF, a Comparison

A serverless feature is often called a “back-end as a service” or “function as a service.” The number of serverless computing providers is beginning to increase. However, some of the traditional big players also provide serverless options, such as Amazon Web Services’ AWS Lambda Functions and Google Cloud’s Google Cloud Functions (GCF), the latter of which, while currently in beta, is what I am using. Although they work similarly, there are a few important differences between them.

| | AWS Lambda | Google Cloud Functions |
| --- | --- | --- |
| Language support | Node.js, Python, C#, Java | Node.js |
| Triggers | DynamoDB, Kinesis, S3, SNS, API Gateway (HTTP), CloudFront, and more | HTTP, Cloud Pub/Sub, Cloud Storage bucket |
| Maximum execution time | 300 seconds | 540 seconds |

In this article, we will go through the process of implementing serverless code deployment using GCF. Google Cloud Functions is a lightweight, event-based, asynchronous compute solution that allows you to create small, single-purpose functions which respond to cloud events without the need to manage a server or a runtime environment.

GCF has three possible implementations separated based on triggers.

- HTTP trigger: routes HTTP requests to the cloud function.
- Internal Google Pub/Sub trigger: routes publish and subscribe requests to the cloud function.
- Cloud Storage bucket trigger: routes any changes made to the storage bucket to the cloud function.

Let’s create an HTTP trigger-based setup using Google Cloud Functions

Google Cloud Functions does not require any additional special setup or installation. GCF ensures that the default Node environment is set up and ready for execution. When a cloud function is created with HTTP as the trigger, it provides a URL to trigger the function. Compared with AWS Lambda, which uses an API gateway as a medium to communicate with it, Google Cloud Functions provides the URL immediately, based on the project ID and region.

Creating a Serverless Node.js Application

To make our code executable in GCF, we should wrap it inside one single function. GCF will call that particular function whenever the trigger occurs. There are a few ways to structure the upload:

- Single file: export a default function that will call other functions based on the request.
- Multiple files: have an index.js file requiring all other files and exporting the default function as the starting point.
- Multiple files: have one main file configured in package.json using "main": "main.js" as the starting point.

Any of the above methods will work.

GCF supports one particular Node runtime version, so make sure the code is written for that version. At the time of writing this post, GCF supports Node v6.11.1.

To create a function, there are a few options to consider.

- Memory: how much memory is needed to process the request for one run, defined in MB. For a small application, 128MB should be quite sufficient, but it can be increased up to 2GB.
- Timeout: as the name implies, the expected code execution timeout. After this point, the code will be killed, and any execution still in progress will stop abruptly. The maximum timeout is 540 seconds.
- Function to execute: though more than one function can be exported from the main handler file, we need to configure the one function that should be triggered to process the request. This allows the developer to have multiple entry points based on HTTP method/URL.
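These same options can also be set when deploying from the command line. Here is a hedged sketch using the `gcloud` CLI; the function name is the one we create below, but the exact command and flags may vary with your Cloud SDK version (GCF was in beta at the time of writing):

```shell
# Deploy an HTTP-triggered function with explicit memory, timeout,
# and entry point settings (values here are illustrative).
gcloud beta functions deploy httpServer \
  --trigger-http \
  --memory=128MB \
  --timeout=60s \
  --entry-point=httpServer
```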

To upload the code, simply copy and paste it into the create-function portal. For more than one file, zip the contents and upload the ZIP. In the case of a ZIP file, make sure it contains either an index.js file or a package.json file with the main file specified.

Any NPM module dependencies should be listed in package.json. GCF attempts to install the modules listed in the package.json file during the first-time setup.
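As a sketch, a minimal package.json for a multi-file upload might look like the following; the main file name and the lodash dependency are placeholder examples, not requirements:

```json
{
  "name": "http-server-function",
  "version": "1.0.0",
  "main": "main.js",
  "dependencies": {
    "lodash": "^4.17.4"
  }
}
```

GCF reads "main" to find the handler file and installs everything under "dependencies" on first deployment.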

Let’s create a simple handler that returns a 200 status and a message. Create a function and add the following code as the source.

```javascript
exports.httpServer = function httpServer(req, res) {
  console.log(req);
  res.status(200).send('Server is working');
};
```

Once the function is created, open the URL provided to trigger the function. It should respond like the following.

Now, let’s examine the req object in the logs. To view logs, GCF provides options right from the console. Click the vertical dots and open the logs option.
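The same logs can also be read from the command line. A hedged example using the gcloud CLI follows; the exact command path and flags depend on your Cloud SDK version:

```shell
# Show the most recent log entries for the httpServer function
gcloud beta functions logs read httpServer --limit 20
```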

Now, let’s update the code to handle simple routes for /users.

The following code is used to handle a simple GET & POST request for the /users route:

```javascript
exports.httpServer = function httpServer(req, res) {
  const path = req.path;

  switch (path) {
    case '/users':
      handleUsers(req, res);
      break;
    default:
      res.status(200).send('Server is working');
  }
};

const handleUsers = (req, res) => {
  if (req.method === 'GET') {
    res.status(200).send('Listing users...');
  } else if (req.method === 'POST') {
    res.status(201).send('Creating User...');
  } else {
    // Without send(), the request would hang; send an explicit response.
    res.status(404).send('Not found');
  }
};
```

After updating, let’s test it in-browser now, but this time with /users at the end.

That’s cool. We created a basic HTTP server with routing.

Operations & Debugging

If code were where the story ended, you wouldn’t be researching infrastructure options like serverless Node.js applications. Here’s a brief summary of how to take care of common tasks like deployment and debugging, tasks Node.js developers already handle for other applications.

Deployment:

Code for functions can be deployed in four ways.

Copy pasting the code in the console

Uploading a ZIP file

Deploying from the cloud storage bucket as a ZIP file

Deploying from the cloud source repository

The most convenient option is, obviously, deploying from a source repository.

Invocation:

While creating the function, the console provides the HTTP URL to trigger the function, which is in the format: https://<region>-<project-id>.cloudfunctions.net/<function-name>
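To illustrate that format, here is a small sketch that assembles the trigger URL from a project’s values; the region, project ID, and function name below are made-up examples:

```javascript
// Builds the HTTPS trigger URL for an HTTP-triggered Google Cloud Function.
function functionUrl(region, projectId, functionName) {
  return `https://${region}-${projectId}.cloudfunctions.net/${functionName}`;
}

// Example with placeholder values:
console.log(functionUrl('us-central1', 'my-sample-project', 'httpServer'));
// → https://us-central1-my-sample-project.cloudfunctions.net/httpServer
```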

AWS Lambda functions have cold-start issues that make function execution take additional time to start up. Once started, subsequent executions respond normally. This initial additional start-up time is referred to as a cold start. Though we don’t have official GCF documentation on this topic, cold-start issues didn’t show up during our testing.

Debugging:

GCF integrates with the Stackdriver Logging service in Google Cloud. All console logs and errors will get logged here, and it helps debug code that is deployed already.

Testing:

The console provides options to test the function by passing a JSON payload as input. The function will be called with that JSON as the input, and the output will be displayed in the console. The request (input) and response are similar to those of the Express.js framework, so the function can be unit tested during the development process itself. If you need a refresher on Node.js testing, check out A Node.js Guide to Actually Doing Integration Tests.
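Since the handler receives Express-style req and res objects, it can also be exercised locally with plain stubs. Here is a minimal sketch; the stub shapes below are illustrative assumptions, not a GCF API, and the handler is inlined to keep the example self-contained:

```javascript
// A trimmed-down version of the handler from earlier.
const httpServer = (req, res) => {
  if (req.path === '/users' && req.method === 'GET') {
    res.status(200).send('Listing users...');
  } else {
    res.status(200).send('Server is working');
  }
};

// Minimal stub mimicking the parts of the Express response we use.
function makeRes() {
  const res = { code: null, body: null };
  res.status = (code) => { res.code = code; return res; };
  res.send = (body) => { res.body = body; return res; };
  return res;
}

// Simulate a GET /users request and inspect the recorded response.
const res = makeRes();
httpServer({ path: '/users', method: 'GET' }, res);
console.log(res.code, res.body); // → 200 Listing users...
```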

Limitations and Next Steps

Using serverless functions has its advantages, but it also comes with limitations:

Vendor lock-in: It ties the code we write to one particular service provider. Moving the code to another provider requires rewriting it, with significant migration effort. As this can be a big issue, we should be very careful when choosing a service provider.

Limitations in the number of requests and hardware resources: Providers often limit the number of parallel requests that a function will handle at a time. There are memory restrictions as well. These types of restrictions can be bumped higher by speaking to the provider, but they will still exist.

Google Cloud Functions is maturing quickly, with frequent updates, especially in the languages it supports. If you are planning to use Google Cloud Functions, keep an eye on the changelogs to avoid any breaking changes in your implementation.