Software deployment on the Internet started with servers. Then, there was virtualization. IaaS (infrastructure-as-a-service), the first step into cloud computing, made it possible to provision virtual servers by the hour. PaaS (platform-as-a-service) was the next step to increase abstraction between the physical machine and the developer’s code. Finally, the trend of building microservices instead of monoliths and integrating APIs and remotely hosted services led to the current stage of cloud computing with serverless platforms, or FaaS (function-as-a-service), in which small units of code are deployed and the whole infrastructure is managed automagically by the hosting provider.

Building on the idea of serverless and FaaS, I’d like to introduce a concept that I call the Micro API. I define it as a design pattern describing a piece of software that

exposes a Web API (REST- or RPC-style) to its consumer,

is implemented in a single file with reasonably low LOC (lines of code),

relies on a standardized framework and set of dependencies

and does not require local state.

Micro APIs run in “execution engines” which provide the standardized framework and dependencies. Since the custom logic is extremely small, deployment can happen on demand, meaning that whenever a request comes in that needs to be handled by a specific Micro API, the code can be downloaded from a repository, cached within the engine and immediately executed. Execution engines are multi-tenant by design and place custom code in a sandbox so that different Micro APIs don’t affect each other.
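To make the request path concrete, here is a minimal sketch of such an engine in Python. Everything here is illustrative: the `ExecutionEngine` class, the dictionary-based repository and the `handle` entry point are my own assumptions, and `exec` merely stands in for a real sandbox (such as the PHPSandbox mentioned later).

```python
# Hypothetical sketch of an execution engine's request path:
# code is fetched on demand, cached, then run for each request.

class ExecutionEngine:
    def __init__(self, repository):
        self.repository = repository  # maps API id -> source code
        self.cache = {}               # in-memory code cache

    def handle_request(self, api_id, request):
        handler = self.cache.get(api_id)
        if handler is None:
            source = self.repository[api_id]  # download on first use
            namespace = {}
            exec(source, namespace)           # stand-in for real sandboxing
            handler = namespace["handle"]
            self.cache[api_id] = handler      # cache for later requests
        return handler(request)

# Usage: a trivial Micro API stored as source code in the repository.
repo = {
    "hello-api": (
        "def handle(request):\n"
        "    return {'greeting': 'Hello, ' + request['name']}"
    )
}
engine = ExecutionEngine(repo)
print(engine.handle_request("hello-api", {"name": "World"}))
# -> {'greeting': 'Hello, World'}
```

After the first request, subsequent calls skip the repository lookup entirely and execute straight from the cache, which is what makes on-demand deployment cheap.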

Thanks to on-demand deployment and multi-tenancy, a network of hosted Micro API execution engines on servers distributed all over the world can behave like a content delivery network (CDN). API requests are received and executed, after retrieving their code if it’s not yet cached, at the location closest to the consumer. With such a network of edge servers, distributing and scaling server-side logic can be done just as easily as distributing static web content! No resources need to be provisioned upfront, which means the cost of hosting a Micro API endpoint is essentially zero. High scalability with thousands of requests per second can be achieved by scaling up the execution engines either horizontally or vertically. Alternatively, Micro APIs can also be used in a single-tenant environment with execution engine servers deployed on premises, within private and hybrid clouds, or even in devices at the edge of the network.

Unlike some other serverless environments, a Micro API may seem more limited because the execution engine is deliberately opinionated in its choice of framework and dependencies. This is intentional: it keeps the developer’s focus on the business logic and allows them to quickly launch a huge number of independent Micro APIs without making architectural decisions or managing dependencies for each one. Different execution engines can be designed for specific requirements.

Micro APIs are a perfect choice for tasks like these:

An API which is a proxy or facade for another API, bridging different authentication protocols or converting data formats (e.g. XML to JSON).

A webhook receiver that modifies or checks data before calling other APIs or webhooks.

A simple layer of data validation before data is stored via another API or in a cloud-based storage system.

A mashup combining data from multiple APIs into a single response.

A mockup with static or semi-static responses.

A routing component in a microservice architecture.
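To illustrate the first use case above, a facade that converts data formats could fit in a single file like this. The sketch below is illustrative only: the `handle` entry point, the hard-coded `UPSTREAM_XML` (standing in for a real upstream API response) and the field names are my own assumptions, not part of any specific execution engine.

```python
# Hypothetical single-file Micro API: a facade that takes an XML
# response from an upstream API and returns it as JSON.

import json
import xml.etree.ElementTree as ET

# Stand-in for the body an upstream XML API would return.
UPSTREAM_XML = "<user><id>42</id><name>Alice</name></user>"

def handle(request):
    root = ET.fromstring(UPSTREAM_XML)                # parse the upstream XML
    data = {child.tag: child.text for child in root}  # flatten one level
    return json.dumps(data)                           # re-emit as JSON

print(handle({}))
# -> {"id": "42", "name": "Alice"}
```

The entire Micro API is the `handle` function; everything else — HTTP handling, deployment, scaling — would be the execution engine’s job.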

In summary, you could say that Micro APIs work great as glue that can connect anything with anything through custom code that is easily created and deployed. Given basic coding skills, this makes them more versatile than visual integration and aggregation services. That is what makes me very excited about this pattern, especially when combined with an execution engine that offers great tools for providing and consuming Web APIs.

At CloudObjects we’re building such a Micro API execution engine based on PHP, the Silex micro-framework and leveraging PHPSandbox to provide a safe runtime environment. It’s called phpMAE, is designed around integration with CloudObjects Core and other upcoming CloudObjects products, and it’s provided as a hosted service as well as a fully open source distribution for development and hybrid deployments. Configuration and source code of Micro APIs are stored in and deployed via CloudObjects Core. Curious to learn more? Stay tuned for an introduction to phpMAE and the launch of the hosted service in an upcoming blog post!