Low-Cost Rate Limiting for Azure Functions APIs with API Management’s Consumption Tier

Azure Functions can be used as a lightweight platform for building APIs. They support a number of helpful features for API developers including custom routes and a variety of output bindings that can implement complex business rules. They also have a consumption-based pricing model, which keeps costs low on a pay-per-use basis while you have low levels of traffic, but can scale or burst for higher levels of demand.

The Azure Functions platform also provides Azure Functions Proxies, which gives another set of features to further extend APIs built on top of Azure Functions. These features include more complex routing rules and the ability to do a small amount of request rewriting. These features have led some people to compare Azure Functions Proxies to a very lightweight API management system. However, there are a number of features of an API management platform that Azure Functions Proxies doesn’t support. One common feature of an API management layer is the ability to perform rate limiting on incoming requests.

Azure API Management is a hosted API management service that provides a large number of features. Until recently, API Management’s pricing model was often prohibitive for small APIs, since using it for production workloads required provisioning a service instance costing a minimum of around AUD$200 per month. But Microsoft recently announced a new consumption tier for API Management. Based on a similar pricing model to Azure Functions, the consumption tier for API Management bills per request, which makes it a far more appealing choice for serverless APIs. APIs can now use features like rate limiting and many others without needing to invest in a large monthly expense.

In this post I’ll describe how Azure Functions and the new API Management pricing tier can be used together to build a simple serverless API with rate limiting built in, and at a very low cost per transaction.

Note: this new tier is in preview, and so isn’t yet ready for production workloads, but it will hopefully be generally available and supported soon. In the meantime, it’s only available for preview in a subset of Azure regions. For my testing I’ve been using Australia East.

Example Scenario

In this example, we’ll build a simple serverless API that would benefit from rate limiting. In our example function we simulate performing some business logic to calculate shipping rates for orders. Our hypothetical algorithm is very sophisticated, and so we may later want to monetise our API to make it available for high-volume users. In the meantime we want to allow our customers to try it out a little bit for free, but we want to put limits around their use.

There may be other situations where we need rate limiting too: for example, if we have a back-end system we call into that can only cope with a certain volume of requests, or that bills us when we use it.

First, let’s write a very simple function to simulate some custom business logic.

Function Code

For simplicity I’m going to write a C# script version of an Azure Function. You could easily change this to a precompiled function, or use any of the other languages that Azure Functions supports.

Our simulated function logic is as follows:

1. Receive an HTTP request with a body containing some shipping details.
2. Calculate the shipping cost.
3. Return the shipping cost.

In our simulation we’ll just make up a random value, but of course we may have much more sophisticated logic in future. We could also call into other back-end functions or APIs too.

Here’s our function code:
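A minimal C# script (run.csx) sketch of this logic might look like the following; the request property names and the random-cost calculation are illustrative assumptions, and the HTTP trigger binding itself would be declared in the accompanying function.json:

```csharp
#r "Newtonsoft.Json"

using System.IO;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("Calculating shipping cost.");

    // Read the shipping details from the request body.
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic shippingDetails = JsonConvert.DeserializeObject(requestBody);
    if (shippingDetails == null)
    {
        return new BadRequestObjectResult("Please pass the shipping details in the request body.");
    }

    // Simulate our sophisticated business logic with a random value.
    var shippingCost = Math.Round(new Random().NextDouble() * 100, 2);

    return new OkObjectResult(new { shippingCost });
}
```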

If we paste this code into the Azure Functions portal, we’ll be able to try it out, and sure enough we get a result back.


API Management Policy

Now that we’ve got our core API function working, the next step is to put an API Management gateway in front of it so we can apply our rate limiting logic. API Management works in terms of policies that are applied to incoming requests. When we work with the consumption tier of API Management we can make use of the policy engine, although there are some limitations. Even with these limitations, policies are very powerful and let us express and enforce a lot of complex rules. A full discussion of API Management’s policy system is beyond the scope of this post, but I recommend reviewing the policy documentation.

Here is a policy that we can use to perform our rate limiting:
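A minimal sketch of such a policy, assuming API Management’s rate-limit-by-key policy with the three-calls-per-15-seconds limit described below and the caller’s IP address as the counter key, might look like this:

```xml
<policies>
    <inbound>
        <base />
        <!-- Allow at most 3 calls per 15-second window, grouped by the caller's IP address. -->
        <rate-limit-by-key calls="3"
                           renewal-period="15"
                           counter-key="@(context.Request.IpAddress)" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
```

When the limit is exceeded within the window, API Management rejects the call with an HTTP 429 (Too Many Requests) response.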

This policy uses the caller’s IP address as the rate limit key. This means that if the same IP address makes three API calls within a 15-second period, it will get rate limited and told to try again later. Of course, we can adjust the lockout time, the number of calls allowed, and even the way that we group requests together when determining the rate limit.

Because we may have additional APIs in the future that would be subject to this rate limit, we’ll create an API Management product and apply the policy to that. This means that any APIs we add to that product will have this policy applied.

Securing the Connection

Of course, there’s not much point in putting an API Management layer in front of our function API if someone can simply go around it and call the function directly. There are a variety of ways of securing the connection between an API Management instance and a back-end Azure Functions app, including using function keys, function host keys, and Azure AD tokens. In other tiers of API Management you can also use the IP address of the API Management gateway, but in the consumption tier we don’t get any IP addresses to perform whitelisting on.

For this example we’ll use the function key for simplicity. (For a real production application I’d recommend using a different security model, though.) This means that we will effectively perform a key exchange:

1. Requests will arrive into the API Management service without any keys.
2. The API Management service will perform its rate limiting logic.
3. If this succeeds, the API Management service will call into the function and pass in the function key, which only it knows.

In this way, we’re treating the API Management service as a trusted subsystem: we’re configuring it with the credentials (i.e. the function key) necessary to call the back-end API. Azure API Management provides a configuration system to load secrets like this, but for simplicity we’ll just inject the key straight into a policy. Here’s the policy we’ve used:
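A minimal sketch of this policy uses the set-header policy to attach the function key to the outgoing request; the __FUNCTION_KEY__ token is a hypothetical placeholder that our deployment replaces with the real key:

```xml
<policies>
    <inbound>
        <base />
        <!-- Pass the function key to the back-end Functions app.
             The placeholder token is substituted when the policy is deployed. -->
        <set-header name="x-functions-key" exists-action="override">
            <value>__FUNCTION_KEY__</value>
        </set-header>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
```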

We’ll inject the function key into the policy at the time we deploy the policy.

As this logic is specific to our API, we’ll apply this policy to the API and not to our product.

Deploying Through an ARM Template

We’ll use an ARM template to deploy and configure these components.

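As a rough sketch, the API Management resource in such a template might look like the following; the resource name, location, apiVersion, and publisher details are illustrative assumptions, with the consumption tier selected via the "Consumption" SKU:

```json
{
  "type": "Microsoft.ApiManagement/service",
  "apiVersion": "2018-06-01-preview",
  "name": "my-apim-instance",
  "location": "Australia East",
  "sku": {
    "name": "Consumption",
    "capacity": 0
  },
  "properties": {
    "publisherName": "Contoso",
    "publisherEmail": "admin@contoso.example"
  }
}
```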