
Tutorial: Adding Caching to your Apigee-127 API

alex
Nov 18, 2014

Apigee-127 offers a fast and easy way to develop production-grade APIs, but it takes more than just exposing a set of RESTful endpoints to create a truly robust API. Ideally, an API should also implement a number of other features, such as quota, spike arrest, and caching. In the latest post in our series about Apigee-127, we’ll discuss caching, and take a look at how to quickly and easily add response caching to an Apigee-127 API.

What is caching and why do I need it?

In short, caching in Apigee-127 saves the response to a specific API request so that it can be returned automatically the next time that request is made. This is an invaluable way to increase the performance, efficiency, and reliability of your API, because it bypasses the need to actually execute the business logic specified in your controller for a given endpoint.

For example, imagine you have an Apigee-127 API with an "/addresses" endpoint that calls out to a third-party API to get the addresses of local businesses. Since we would not expect the address of a business to change frequently, we can cache the response each time an address is requested. This means our API won't need to call out for address data for as long as the cache is valid, which gives our API three advantages.

First, the response time for our "/addresses" endpoint becomes very fast, since it is being served from the cache, rather than generated on the fly with each request. Second, we reduce network load on our backend by not calling out to get the address data every time the "/addresses" endpoint is hit. Third, we make more efficient use of the third-party API, which may limit or throttle usage when too many requests are made, or charge based on the number of API requests.

Cache providers

Now that we know more about caching and why it’s a valuable feature to have in an Apigee-127 API, we need a place to actually cache our API responses. Apigee-127 supports three options for storing your cache: in-memory, Apigee, and Redis. 

Which caching provider you choose will depend on the needs of your API. If you are planning to deploy your Apigee-127 API to Apigee's cloud, choosing Apigee as your cache provider also lets you optionally enable analytics, giving you usage and performance data that can help you decide which routes to cache.

Adding caching to your Apigee-127 API

Adding a cache is easy: simply add the following to your Apigee-127 Swagger file under the "x-volos-resources" property:


x-volos-resources:
  cache:
    provider: <cache provider>
    options:
      name: <cache name>
      ttl: <time-to-live>

Then apply the cache to a route under the "x-volos-apply" property:


paths:
  /weather:
    x-swagger-router-controller: cache-controller
    x-volos-apply:
      # Applies the cache to our endpoint
      cache:
        # Gets the cache key by calling
        # cacheKey() in api/helpers/volos.js
        key:
          helper: volos
          function: cacheKey

In the above example, simply substitute one of the following for <cache provider>:


- volos-cache-apigee
- volos-cache-memory
- volos-cache-redis

You will also need to name your cache and provide a time-to-live (ttl) value in milliseconds, which sets how long cached responses are retained.
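
For example, a filled-in declaration using the in-memory provider might look like the following sketch; the cache name "weather-cache" and the 60-second ttl are illustrative values, not requirements:


x-volos-resources:
  cache:
    # In-memory cache; handy for local development
    provider: volos-cache-memory
    options:
      # Any name you like
      name: weather-cache
      # Time-to-live in milliseconds (60 seconds here)
      ttl: 60000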

Selective caching

To further optimize the performance of your API, Apigee-127 supports selective caching via "helper" functions. This allows you to implement fine-grained caching based on custom logic. For example, let's look again at our address API. We would almost certainly want to query our "/addresses" endpoint by the name of a specific business. The request whose response we want to cache would look something like this:


GET https://my.api.com/addresses?name=Walgreens

In this case, we would want to separately cache the response for each business. In our cache, responses are saved as key-value pairs, so we need a helper function that sets a custom cache key for each request. To do this, we create a new file called "volos.js" in the "/api/helpers" directory of our project, and simply include a function that parses the "name" parameter of the request, then returns it as the cache key:


// Export the cacheKey function so it can be referenced in our Apigee-127 Swagger file

module.exports = {
  cacheKey: cacheKey
};

function cacheKey(req) {
  // Get the value of the "name" query parameter from the request
  var key = req.swagger.params.name.value;

  // Return the value as the cache key
  return key;
}
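
If a route depends on more than one query parameter, the same pattern extends to a compound key so that each combination is cached separately. Here is a hypothetical variation; the "city" parameter is purely illustrative and not part of the sample API:


module.exports = {
  cacheKey: cacheKey
};

function cacheKey(req) {
  // Build the key from both query parameters so each
  // name/city combination gets its own cache entry.
  // Note: "city" is a hypothetical parameter used for illustration.
  var name = req.swagger.params.name.value;
  var city = req.swagger.params.city.value;
  return name + ':' + city;
}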

Now all we have to do is tell Apigee-127 that we want to use the cacheKey helper function by specifying the following in the "cache" portion of our "x-volos-apply" block:


x-volos-apply:
  cache:
    key:
      # Name of our file in /api/helpers
      helper: volos

      # Name of our helper function
      function: cacheKey

Note that in this example we named the helper file volos.js, but you can name it anything you like.

Check out the code for this example here: https://github.com/apigee-127/a127-samples/tree/master/cache-example.  Give it a try and let us know your feedback!

As you've seen, it's really easy to add caching to your API with Apigee-127. With the in-memory and Redis providers you can develop and test your API and its caching locally (127.0.0.1), then update the provider to Apigee and deploy to Edge for enterprise-grade, distributed caching in production.
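
As a rough sketch, that switch can be as small as changing the provider in your x-volos-resources block; this assumes the same name and ttl options carry over, and your Apigee deployment may require additional provider-specific options:


x-volos-resources:
  cache:
    # Swap the local provider for Apigee's distributed cache
    provider: volos-cache-apigee
    options:
      name: weather-cache
      ttl: 60000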

Next up, we'll show you how you can access the resources defined in x-volos-resources programmatically, so you can have application-level caches in addition to caches defined and applied to your API endpoints. Thanks for stopping by!
