Proxy Caching

One of the ways Kong delivers performance is through caching. The Proxy Cache plugin accelerates performance by caching responses based on configurable response codes, content types, and request methods. When caching is enabled, upstream services are not bogged down with repetitive requests, because Kong Gateway responds on their behalf with cached results. Caching can be enabled on specific Kong Gateway objects or for all requests globally.

Cache Time To Live (TTL)

TTL governs the refresh rate of cached content, which is critical for ensuring that clients aren’t served outdated content. A TTL of 30 seconds means content older than 30 seconds is deemed expired and will be refreshed on subsequent requests. TTL configurations should be set based on the type of content the upstream service serves:

  • Static data that is rarely updated can have longer TTL

  • Dynamic data should use shorter TTL to avoid serving outdated data

Kong Gateway follows RFC 7234, section 5.2, for cache control operations. See the specification and the Proxy Cache plugin parameter reference for more details on TTL configuration.
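
For example, if an upstream service already declares its own freshness rules through Cache-Control response headers, you can ask the plugin to honor them instead of relying only on a fixed TTL. The command below is a sketch that assumes the plugin’s config.cache_control option (which enables the RFC 7234 Cache-Control behaviors) is available in your Kong Gateway version; check the Proxy Cache plugin parameter reference before using it:

  curl -i -X POST http://localhost:8001/plugins \
    --data "name=proxy-cache" \
    --data "config.strategy=memory" \
    --data "config.cache_ttl=300" \
    --data "config.cache_control=true"

With cache_control enabled, a response carrying Cache-Control: max-age=30 would be considered stale after 30 seconds, even though cache_ttl is set to 300.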

Enable caching

The following tutorial walks through enabling and managing proxy caching at various levels in Kong Gateway.

Prerequisites

This chapter is part of the Get Started with Kong series. For the best experience, it is recommended that you follow the series from the beginning.

Start with the introduction Get Kong, which includes a list of prerequisites and instructions for running a local Kong Gateway.

Step two of the guide, Services and Routes, includes instructions for installing a mock service used throughout this series.

If you haven’t completed these steps already, complete them before proceeding.

Global proxy caching

Installing the plugin globally means every proxy request to Kong Gateway will potentially be cached.

  1. Enable proxy caching

    The Proxy Cache plugin is installed by default on Kong Gateway, and can be enabled by sending a POST request to the plugins object on the Admin API:

    curl -i -X POST http://localhost:8001/plugins \
      --data "name=proxy-cache" \
      --data "config.request_method=GET" \
      --data "config.response_code=200" \
      --data "config.content_type=application/json; charset=utf-8" \
      --data "config.cache_ttl=30" \
      --data "config.strategy=memory"

    If configuration was successful, you will receive a 201 response code.

    This Admin API request configured the Proxy Cache plugin for all GET requests that result in a 200 response code and a response Content-Type header of application/json; charset=utf-8. The cache_ttl value instructs the plugin to expire cached values after 30 seconds.

    The final option config.strategy=memory specifies the backing data store for cached responses. More information on strategy can be found in the parameter reference for the Proxy Cache plugin.
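
    To confirm that the plugin was created and inspect its stored configuration, you can list configured plugins with a GET request to the same Admin API endpoint. This is an optional sanity check rather than a required step:

    curl -s http://localhost:8001/plugins

    The response lists each configured plugin along with its id and config values, including the proxy-cache entry created above.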

  2. Validate

    You can check that the Proxy Cache plugin is working by sending GET requests and examining the returned headers. In step two of this guide, Services and Routes, you set up a /mock route and service that can help you see proxy caching in action.

    First, make an initial request to the /mock route. The Proxy Cache plugin returns status information headers prefixed with X-Cache, so use grep to filter for that information:

    curl -i -s -XGET http://localhost:8000/mock/requests | grep X-Cache

    On the initial request, there should be no cached responses, and the headers will indicate this with X-Cache-Status: Miss.

    X-Cache-Key: c9e1d4c8e5fd8209a5969eb3b0e85bc6
    X-Cache-Status: Miss

    Within 30 seconds of the initial request, send the same request again and the headers will indicate a cache Hit:

    X-Cache-Key: c9e1d4c8e5fd8209a5969eb3b0e85bc6
    X-Cache-Status: Hit

    The X-Cache-Status header can return the following cache results:

    State    Description
    Miss     The request could be satisfied in cache, but an entry for the resource was not found in cache, and the request was proxied upstream.
    Hit      The request was satisfied and served from cache.
    Refresh  The resource was found in cache, but could not satisfy the request, due to Cache-Control behaviors or reaching its hard-coded cache_ttl threshold.
    Bypass   The request could not be satisfied from cache based on plugin configuration.
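
    To see the Refresh state in action, let the 30-second TTL elapse and then repeat the request. The following sketch simply waits out the TTL before sending the same request again:

    sleep 30
    curl -i -s -XGET http://localhost:8000/mock/requests | grep X-Cache

    Because the cached entry is now older than cache_ttl, the plugin proxies the request upstream again; depending on how long the entry is retained in storage, the headers report X-Cache-Status: Refresh (a stale entry was found) or Miss (the entry was already purged).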

Service level proxy caching

The Proxy Cache plugin can be enabled for specific services. The configuration request is the same as above, but it is sent to the service URL:

  curl -X POST http://localhost:8001/services/example_service/plugins \
    --data "name=proxy-cache" \
    --data "config.request_method=GET" \
    --data "config.response_code=200" \
    --data "config.content_type=application/json; charset=utf-8" \
    --data "config.cache_ttl=30" \
    --data "config.strategy=memory"
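
To confirm the plugin is scoped to the service rather than applied globally, you can list the plugins attached to that service:

  curl -s http://localhost:8001/services/example_service/plugins

The response should show a proxy-cache entry whose service field references example_service.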

Route level proxy caching

The Proxy Cache plugin can be enabled for specific routes. The configuration request is the same as above, but it is sent to the route URL:

  curl -X POST http://localhost:8001/routes/example_route/plugins \
    --data "name=proxy-cache" \
    --data "config.request_method=GET" \
    --data "config.response_code=200" \
    --data "config.content_type=application/json; charset=utf-8" \
    --data "config.cache_ttl=30" \
    --data "config.strategy=memory"
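
To verify the route-level configuration, list the plugins attached to the route:

  curl -s http://localhost:8001/routes/example_route/plugins

If the same plugin is enabled at several levels, Kong Gateway runs only one instance of it per request, preferring the most specific scope, so a route-level configuration takes precedence over a service-level or global one.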

Consumer level proxy caching

In Kong Gateway, consumers are an abstraction that defines a user of a service. Consumer-level proxy caching can be used to cache responses per consumer.

  1. Create a consumer

    Consumers are created using the consumer object in the Admin API.

    curl -X POST http://localhost:8001/consumers/ \
      --data username=sasha

  2. Enable caching for the consumer

    curl -X POST http://localhost:8001/consumers/sasha/plugins \
      --data "name=proxy-cache" \
      --data "config.request_method=GET" \
      --data "config.response_code=200" \
      --data "config.content_type=application/json; charset=utf-8" \
      --data "config.cache_ttl=30" \
      --data "config.strategy=memory"
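
    As with the other levels, you can verify the configuration by listing the plugins attached to the consumer:

    curl -s http://localhost:8001/consumers/sasha/plugins

    Note that consumer-level caching only takes effect once Kong Gateway can identify which consumer is making a request, which requires an authentication plugin such as the key authentication covered in the next chapter of this series.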

Manage cached entities

The Proxy Cache plugin supports administrative endpoints to manage cached entities. Administrators can view and delete cached entities, or purge the entire cache by sending requests to the Admin API.

To retrieve a cached entity, submit a request to the Admin API /proxy-cache endpoint with the X-Cache-Key value of a known cached response. This request must be submitted before the TTL expires; otherwise, the cached entity will already have been purged.

For example, using the response headers above, pass the X-Cache-Key value of c9e1d4c8e5fd8209a5969eb3b0e85bc6 to the Admin API:

  curl -i http://localhost:8001/proxy-cache/c9e1d4c8e5fd8209a5969eb3b0e85bc6

A response with 200 OK will contain full details of the cached entity.
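
The same /proxy-cache endpoint also supports deletions. The commands below are a sketch based on the delete endpoints described in the Proxy Cache plugin documentation; confirm the exact paths for your Kong Gateway version before relying on them:

  curl -i -X DELETE http://localhost:8001/proxy-cache/c9e1d4c8e5fd8209a5969eb3b0e85bc6

  curl -i -X DELETE http://localhost:8001/proxy-cache

The first request deletes a single cached entity by its X-Cache-Key; the second purges all cached entities. A successful delete returns 204 No Content.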

See the Proxy Cache plugin documentation for the full list of the Proxy Cache specific Admin API endpoints.

