API Caching, Part I
The web has come a long way since the 90s. In the past, sites were commonly driven by a single, monolithic application that acted as the only communication medium to a centralized database. The modern approach is to break this one large application into a set of interdependent and cooperative services.
The standard method of communication between these services is the Application Programming Interface (API). APIs are often responsible for not only fetching but also mutating data; as such, they can be highly dynamic and notoriously hard to accelerate. By caching your API, however, you can reduce latency, optimize your site’s performance and scale more efficiently.
This is where Fastly comes in. Over the course of this article, we’ll dig into the nuts and bolts of using our Instant Purge™ feature for API caching.
An Example: Constructing an Article Comments API
To begin, let’s imagine an online magazine that has articles and allows users to make comments. From this basic description, it’s pretty easy to imagine a relational schema for such a website’s database: each article can have many comments, and each comment is authored by exactly one user.
You could design a RESTful API specification and use it to manipulate and fetch comments, like so:
GET /comment
- Returns a list of all comments
GET /comment/:id
- Returns a comment with the given id
POST /comment
- Creates a new comment
PUT /comment/:id
- Updates a comment with the given id
DELETE /comment/:id
- Deletes a comment with the given id
The create, read, update, and delete (CRUD) methods ensure the API can perform its basic operations, but they don’t expose the relational aspect of the data. To do so, you could add a couple of relational endpoints, like so:
GET /article/:article_id/comments
- Get a list of comments for a given article
GET /user/:user_id/comments
- Get all comments for a given user
Endpoints such as these make it easier for other programmers to get the information they need to do things like rendering the HTML page for an article or displaying comments on a user’s profile page.
While there are many other possible endpoints we could construct for this API, the set of seven we defined above should suffice for our purposes. As a final note, let’s assume that the API has been programmed to use an object-relational mapping (ORM) library, such as ActiveRecord, when interacting with the database.
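To make that assumption concrete, here’s a minimal sketch of the models this schema implies and the query behind one of the relational endpoints. The class names, column names, and helper method are illustrative, not part of the specification:

class Article < ActiveRecord::Base
  has_many :comments                 # articles.id <- comments.article_id
end

class User < ActiveRecord::Base
  has_many :comments                 # users.id <- comments.user_id
end

class Comment < ActiveRecord::Base
  belongs_to :article
  belongs_to :user
end

# e.g., the lookup behind GET /article/:article_id/comments
def comments_for_article(article_id)
  Comment.where(article_id: article_id)
end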
Caching the Comments API
Using Fastly to cache our comments API is pretty straightforward. To do so, we’ll follow this basic plan of attack:
1. Determine the API endpoints that can be cached.
2. Send PURGE requests to Fastly when data changes.
3. Set up a Fastly configuration, deploy, and CNAME the API’s domain.
Step 1: Determine Endpoints to be Cached
In order to cache the API, we first need to identify which URLs we want to cache. An easy way to go about this is to split the specification endpoints into two groups.
The first group, the “accessors,” is used to fetch or access the comment data. These are the endpoints we want to cache using Fastly. Referencing our specification, we see that four endpoints fit this description:
GET /comment
GET /comment/:id
GET /article/:article_id/comments
GET /user/:user_id/comments
The second group, the “mutators,” is used to change or mutate the comment data. These endpoints are considered “uncacheable” because they will always be dynamic. We see that three of the endpoints fit the bill:
POST /comment
PUT /comment/:id
DELETE /comment/:id
Making a quick note of the HTTP methods for each of these endpoints, you should begin to see a pattern. Because we chose to make the comments API RESTful, we can easily identify which endpoints are accessors and which are mutators via a simple rule:
GET endpoints can be cached, but PUT, POST, & DELETE endpoints cannot.
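If you wanted to encode that rule, it boils down to a one-line check on the request method (a trivial sketch; the helper name is ours, not part of the API):

# An endpoint is cacheable exactly when it is a GET endpoint.
def cacheable?(http_method)
  http_method.to_s.upcase == 'GET'
end

cacheable?('GET')    # => true  (accessor, cache it)
cacheable?('DELETE') # => false (mutator, always dynamic)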
Once we’ve gathered this information, we’re ready to program the API to send Instant Purge™ requests.
Step 2: Send PURGE Requests
While it may be tempting to point at the PUT, POST, and DELETE endpoints as the place where data is being modified, this usually is not the whole picture. In most modern APIs, these endpoints are thought to represent an interface to the actual model code responsible for handling the database modifications.
In the example specification for the API, we made the assumption that we’d be using an ORM to perform the actual database work. Most ORMs allow programmers to set special “callbacks” on models that will fire when certain actions have been performed (e.g., before or after validation, after creating a new record).
For purging, we are interested in whether a model has saved information to the database (whether this is a new record, an update to an existing record, or the deletion of a record). At this point, we’d add a callback that tells the API to send a PURGE request to Fastly for each of the cacheable endpoints we identified above.
For an ActiveRecord comments model, it’s as simple as this:
require 'fastly'

class Comment < ActiveRecord::Base
  # Client for Fastly's purging API
  fastly = Fastly.new(api_key: 'MY_API_KEY')

  # Whenever a comment is created or updated, purge every cached
  # URL that could include this comment's data. (A similar
  # after_destroy callback would cover deletions.)
  after_save do
    fastly.purge "/comment"
    fastly.purge "/comment/#{self.id}"
    fastly.purge "/article/#{self.article_id}/comments"
    fastly.purge "/user/#{self.user_id}/comments"
  end
end
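As a quick sanity check, saving a comment now fires the after_save callback and purges each of the four cacheable URLs that could contain that comment (the attribute values here are made up):

comment = Comment.create!(article_id: 7, user_id: 42, body: 'Great article!')
# after_save has just issued purges for:
#   /comment
#   /comment/#{comment.id}
#   /article/7/comments
#   /user/42/comments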
With our model code in place, the API is now ready to be cached. The final step is to configure a Fastly service for our API.
Step 3: Set Up Fastly Configuration
Configuring a service to handle the comments API is a snap. By default, Fastly will not cache PUT, POST, and DELETE requests (for more information about this, see our article on default caching behavior of HTTP verbs). This means that all we need in our configuration is the following:
The domain for the API
The address to a backend server that runs the API software
Seriously, that’s all it takes. Of course, you can always provide additional rules that tell Fastly how to work around some of the more peculiar aspects of your API (they all have them, right?).
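Whether you provide those two pieces of information through the Fastly web interface or the configuration API is up to you. As one illustration, here’s roughly what it looks like against the API with plain Net::HTTP; the service ID, version number, domain, and backend address are placeholders, and Fastly’s API documentation remains the authoritative reference for these endpoints:

require 'net/http'
require 'uri'

API_KEY    = 'MY_API_KEY'   # your Fastly API key
SERVICE_ID = 'SERVICE_ID'   # ID of a service you have already created
VERSION    = 1              # an unlocked configuration version

def fastly_post(path, params)
  uri = URI("https://api.fastly.com#{path}")
  req = Net::HTTP::Post.new(uri)
  req['Fastly-Key'] = API_KEY
  req.set_form_data(params)
  Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
end

# The domain for the API
fastly_post("/service/#{SERVICE_ID}/version/#{VERSION}/domain",
            { name: 'api.example.com' })

# The address of a backend server that runs the API software
fastly_post("/service/#{SERVICE_ID}/version/#{VERSION}/backend",
            { name: 'api_origin', address: 'origin.example.com', port: 443 })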
Once the configuration is complete, all that remains is to activate the configuration and then CNAME the given domain to point to Fastly. At this point, the API is being cached in production and will be faster than ever!
(For more information on configuring Fastly for your sites and APIs please see our documentation.)
Just the Beginning...
API caching via Fastly is a highly effective way to accelerate the performance of your service-oriented architecture. That’s why companies such as New Relic and Wanelo are using us to speed up their API calls and keep their customers happy.
As we demonstrated in the tutorial above, the process is pretty darn simple. Once you’ve determined what needs to be cached and programmed the API to send Instant Purge™ requests in the right places, Fastly takes care of the rest.
While Instant Purge™ caching can be applied to nearly any API, there are a handful of advanced features that can make it even easier. Stay tuned for part two of our API Caching series, where we will cover using Surrogate-Key and Cache-Control headers from the API perspective.