Another benefit of the hyperscale clouds is their operational expertise and their infrastructure design. They can spin up a new server instance in seconds, providing compute on demand. And with container technologies, they’re able to quickly switch in new isolated environments and scale them up as necessary.
That’s where serverless technologies come in, building on that operational expertise to rapidly launch small, stateless processes as needed. Like much cloud-native development, these serverless processes are event driven: they respond to messages, process their contents, and pass the results on to the next element of a distributed application. Because they’re launched on demand they can scale rapidly, and because they’re billed per second of CPU time they’re relatively cheap compared to a full-time virtual infrastructure.
The evolution of Azure Functions
Azure’s serverless elements are its Functions: easy-to-construct blocks of code that can be assembled into more complex platforms. They’re designed to work in conjunction with Azure’s messaging infrastructure, from its IoT (Internet of Things) services to its scalable publish-and-subscribe Event Grid. Direct bindings to key Azure services simplify building a Functions-based PaaS application, or using a Function to control and trigger other Azure-hosted applications.
Functions’ limited language support isn’t surprising; even at Microsoft’s size, it doesn’t have the scale to produce Functions runtimes for every language. However, there’s now an option that lets you keep using the language of your choice: for example, you can work in Rust or Go while still looking like a Functions endpoint to the rest of your application.
Adding language support with custom handlers
Under the hood, a Function has a simple architecture. Inputs come in through a trigger, which extracts the input payload and passes it to a language-specific Functions host. This runs your code, sending its output payload to a predefined target. Microsoft now lets you build custom handlers: a generic Functions host calls out over an HTTP connection to an external web server running your own code. The host makes a request to that web server, receives a response, and then formats and forwards it to the function’s output target.
Because custom handlers run outside the language-specific Functions host, you’re no longer limited to the supported languages and can write code in any language you want. All you need to do is provide HTTP endpoints, along with configuration files in the Function app that define each custom handler function and the URI your handler listens on. The code for your handler, including the code that launches its web server, runs inside your Function app.
Building a custom handler
Much of your configuration lives in a host.json file. This contains details of the custom handler executable, along with any arguments that need to be passed to the runtime and details of its working directory. If your custom handler needs to bind to any Azure services, you have to include an extension bundle section in your handler’s host.json. Alternatively, you can explicitly add specific extension packages to your project if an extension or a specific version isn’t included in the bundles. You can even use this technique to add custom extensions you’ve written.
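A minimal host.json for a custom handler might look like the following; the executable name `handler` and the bundle version range are placeholders to adapt to your own build:

```json
{
  "version": "2.0",
  "customHandler": {
    "description": {
      "defaultExecutablePath": "handler",
      "workingDirectory": "",
      "arguments": []
    }
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[2.*, 3.0.0)"
  }
}
```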
Each function in your custom handler needs its own function.json file, in a folder with the same name as the function. This file defines the bindings and triggers your function uses, just as it would for any standard Function.
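For example, a function.json for a queue-triggered function with a queue output could look like this; the binding names, queue names, and connection setting are all placeholders:

```json
{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "myQueueItem",
      "queueName": "items",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "queue",
      "direction": "out",
      "name": "outQueueItem",
      "queueName": "items-processed",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```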
Your custom handler needs to parse the JSON the Functions host sends, which arrives as a data payload plus trigger metadata. Your code processes this and returns a JSON response object containing output data, log entries, and any expected return value. If you’ve built code that works with any modern API-first design model, such as gRPC, you won’t have any difficulty building a custom handler for Azure Functions. All you need is a language with a runtime that can receive and send HTTP requests and parse JSON.
There are some minor restrictions on running a custom handler: the handler must start within 60 seconds. That shouldn’t be a problem for most runtimes, as the custom handler executable is installed in your Function app and runs in the same context as the Functions host. This doesn’t mean you can use a custom handler as a way of hosting web applications outside of Azure App Service: for one thing, there’s no HTTP/2 support or access to streamed data.
A serverless-everywhere future?
By opening up Functions to any language through custom handlers, Microsoft is beginning the process of making its serverless infrastructure language-independent, allowing more developers to work with serverless applications. There’s a lot to like here, especially with how little extra work is needed over and above building a Function with one of its directly supported languages. Support is already in the Azure Functions Visual Studio Code extension, so you can start work in familiar tools as well as with familiar languages.
Things get interesting if you start thinking about these changes to Functions as a prelude to a more-portable serverless model, one that’s able to run at the edge of the network as well as in the cloud. Part of that model is already here: Azure Functions runs on Azure IoT Edge using containerized Functions that are pushed from the Azure Container Registry.
It’s easy then to start imagining custom handlers running on edge-class hardware, using new technologies like WebAssembly to provide a universal runtime that requires minimal deployment. Code written in any language could be compiled as part of a build and deployed to a WASI (WebAssembly System Interface) runtime on an edge device, managed via Kubernetes, without needing a container. As the cloud becomes a fog that encompasses all our devices, a Krustlet-based portable Functions runtime that works with any language could bring the benefits of event-driven serverless applications and cloud-native distributed systems everywhere they’re needed.