SOA Series Part 6: Optimizing Your Service for Client Performance

This is the sixth in a series of seven posts on service-oriented architecture derived from a workshop conducted by Cloves Carneiro and Tim Schmelmer at Abril Pro Ruby. The series is called SOA from Day One – A love story in 7 parts.

In a service-oriented environment, optimizing performance as perceived by service clients is essential to keeping your service-based web applications responsive. While part 4 of this series focused on making fewer network calls from a client application to the services it depends on, this sixth part of the “SOA from Day One” series lays out strategies for improving performance through measures inside the service itself.

Restricting the response size

Benchmarking our services at LivingSocial, we noticed that for the lion’s share of slow APIs, response times grew linearly with the size of the (JSON) response.

Once a response carries enough data, connection setup / teardown times, and even database query times, are dwarfed by the time spent on:

  • result serialization into JSON
  • shipping large JSON payloads over the wire
  • de-serializing the JSON client-side

Here are some tips based on how we addressed these issues.

Result paging

Whenever your service exposes endpoints that return lists of objects as results (e.g., #index or #search-like actions), it makes sense to provide the client with an option to request a limited number of results.

A very simple and yet effective way of providing such “poor man’s paging” is to accept limit and offset parameters for list endpoints.

To improve performance, experiment with optimal default and maximum values for such a limit parameter. Finding the ‘sweet spot’ that is acceptable for both the clients and the service depends very much on your particular data and use cases. You will find yourself iterating a couple of times on the best number for such a limit.
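As an illustration, here is a rough sketch of what such a list action could look like in Rails; the default and maximum values, and the InventoryItem model, are placeholders to be tuned for your own data:

    # Sketch of "poor man's paging": cap the limit at a maximum, default the
    # offset to zero, and only then hit the database. Names and numbers are
    # placeholders, not the workshop's actual code.
    class InventoryItemsController < ApplicationController
      DEFAULT_LIMIT = 25
      MAX_LIMIT     = 100

      def index
        limit  = [(params[:limit] || DEFAULT_LIMIT).to_i, MAX_LIMIT].min
        offset = params[:offset].to_i

        items = InventoryItem.order(:id).limit(limit).offset(offset)
        render json: items
      end
    end

A client asking for the third page of 25 results would then request GET /inventory_items?limit=25&offset=50.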

Content Representations

Just as not every client needs all the objects in a single response from a list endpoint, not every client needs all the information the service can expose about a single entity from an endpoint that returns just one object. Some clients might need just two or three of potentially tens, or even hundreds, of existing fields.

A good way to provide flexibility with regard to the amount of information returned about a single object is to honor requests for different representations of an entity. Try to make it easy for clients to define (and iterate on) the best representation for their use case.

A good candidate for information worth removing is anything that comes from a secondary service (e.g., the tags or the city name for an inventory item). Some clients can make such secondary requests themselves, directly to the authoritative service (e.g., to cities-service and tags-service). Other clients might want this aggregated information returned by the service, but only when requesting a single object (e.g., one inventory item), not when asking for an entire list of them.
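As a hedged sketch of what honoring such a representation parameter could look like (the parameter name, the field list, and the CitiesService / TagsService client calls are hypothetical, not the workshop's actual code):

    # Hypothetical sketch: a `representation` query parameter selects between a
    # lean "small" payload and the aggregated "full" payload.
    def show
      item = InventoryItem.find(params[:id])

      json =
        if params[:representation] == "small"
          # small: a few fields plus hyperlinked URIs into cities-service / tags-service
          item.as_json(only: [:id, :name, :price],
                       methods: [:city_uri, :tag_uris])
        else
          # full: aggregate city and tag data fetched from the authoritative services
          item.as_json(only: [:id, :name, :price]).merge(
            "city" => CitiesService.city_for(item.city_id),
            "tags" => TagsService.tags_for(item.id)
          )
        end

      render json: json
    end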

Make the service work less

Retrieving information from the database and subsequently serializing it takes valuable time in your API’s request/response cycle. Here are some ways to avoid incurring this time.

HTTP Conditional GET

The ETag, Last-Modified and Cache-Control response headers are standard parts of the HTTP protocol, and they allow for a great deal of flexibility … and yet they remain unused in many API services we have seen in the wild.

Rails has great support for setting and honoring the respective HTTP request / response headers, allowing clients to specify which version of the service’s objects they already have, and the service to declare when and how that information becomes stale.
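A minimal sketch using Rails’ built-in conditional GET helpers (the model and action are placeholders):

    # If the If-None-Match / If-Modified-Since headers the client sent still
    # match, `stale?` renders an empty 304 Not Modified response and the JSON
    # serialization below is skipped entirely.
    def show
      item = InventoryItem.find(params[:id])

      if stale?(etag: item, last_modified: item.updated_at, public: true)
        render json: item
      end
    end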

While it is not easy to find Ruby HTTP client libraries that automatically honor or send these headers, browsers will certainly honor them out of the box.

Using a Reverse Proxy

Even if your clients don’t send or honor HTTP conditional GET headers, reverse proxies added between your clients and your service will. For our most-trafficked internal API services, LivingSocial relies heavily on Varnish, a reverse proxy that has excellent performance and scaling characteristics. Using Varnish, we saw some endpoints sped up by a factor of 50.

Varnish is also flexible enough to function as a ‘hold the fort’ cache: if the service fronted by Varnish is down, the proxy can return the last “good” (i.e., 2XX status code) response it received from the backend service.
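One way to approximate this is Varnish’s grace feature; here is a small VCL 4.0 sketch with purely illustrative values (details vary between Varnish versions):

    vcl 4.0;

    sub vcl_backend_response {
      # Keep objects around well past their TTL so a stale ("last good") copy
      # can still be served while the backend service is unavailable.
      set beresp.ttl   = 60s;
      set beresp.grace = 6h;
    }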

Varnish can be configured to cache based on the full URI, including or excluding headers. Two tips for configuring your reverse proxy (sketched in VCL after this list):

  1. Sort all query parameters into a canonical order, so that the reverse proxy can achieve a higher cache hit rate

  2. Send all parameters (like authentication) that do not affect the JSON response in request headers, and make Varnish ignore those headers for its cache key
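A hedged VCL 4.0 sketch of both tips; the assumption that the Authorization header does not change the response body is about your particular auth model, not a general rule:

    vcl 4.0;
    import std;

    sub vcl_recv {
      # Tip 1: sort query parameters so equivalent URLs share a single cache entry.
      set req.url = std.querysort(req.url);

      # Tip 2 (assumption): credentials travel in a header that does not change
      # the JSON body, so look the request up in the cache instead of letting
      # the builtin VCL bypass caching for requests carrying credentials.
      if (req.http.Authorization) {
        return (hash);
      }
    }

Varnish’s default hashing keys only on the URL and Host header, so the Authorization header never becomes part of the cache key here.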

For a more in-depth discussion of our usage of HTTP conditional GET and Varnish, you can find a small presentation here. [This presentation was put together using Remark.js, and you can hit the p key to see the presenter notes with more explanations.]

Exercises: “Performance tuning”

  1. Add a small and a full inventory item representation

    1. full is the currently existing representation which makes dependent calls out to cities-service and tags-service for all city and tagging information on inventory items.

    2. small-represented inventory items just contain the hyperlinked URIs for their city and their tags inside cities-service and tags-service

    3. make the representation selectable via a representation query parameter, which will be honored by all endpoints that return inventory items (#show, #index, #in_city, #near_city)

  2. Add “limit and offset” based paging of results

    1. allow for paging through API-returned inventory items by letting the client request a smaller number of results (via a limit query parameter), and starting at a particular offset (via an additional offset parameter)

    2. make these paging parameters be honored by all endpoints that return lists of inventory items (#index, #in_city, #near_city)

  3. Make sure to look at the various places in the application_controller and the inventory_items_controller that implement service-side HTTP Conditional GET logic

    1. Come up with a curl request that makes the inventory_items#show endpoint return a 304 response, based on an If-Modified-Since request header

    2. Come up with a curl request that makes the inventory_items#show endpoint return a 304 response, based on an If-None-Match request header
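For reference, such conditional requests take roughly the following shape; the host, port, id, timestamp, and ETag value below are placeholders, not answers taken from the workshop application:

    # Placeholder values throughout; substitute your own id and validators.
    curl -i -H 'If-Modified-Since: Tue, 15 Apr 2014 10:00:00 GMT' \
      http://localhost:3000/inventory_items/1

    curl -i -H 'If-None-Match: "0123456789abcdef0123456789abcdef"' \
      http://localhost:3000/inventory_items/1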
