Caching

Caching is a great and simple technique that helps improve your app's performance. It acts as a temporary data store that provides high-performance data access.

Installation

First install the required package:

  $ npm install --save cache-manager

In-memory cache

Nest provides a unified API for various cache storage providers. The built-in one is an in-memory data store. However, you can easily switch to a more comprehensive solution, like Redis. In order to enable caching, first import the CacheModule and call its register() method.

  import { CacheModule, Module } from '@nestjs/common';
  import { AppController } from './app.controller';

  @Module({
    imports: [CacheModule.register()],
    controllers: [AppController],
  })
  export class ApplicationModule {}

Then just bind the CacheInterceptor wherever you want to cache data.

  import { CacheInterceptor, Controller, Get, UseInterceptors } from '@nestjs/common';

  @Controller()
  @UseInterceptors(CacheInterceptor)
  export class AppController {
    @Get()
    findAll(): string[] {
      return [];
    }
  }

Warning Only GET endpoints are cached. Also, HTTP server routes that inject the native response object (@Res()) cannot use the CacheInterceptor. See response mapping for more details.
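
For illustration, a handler like the following (the controller and route are hypothetical) injects the platform response object and is therefore skipped by the CacheInterceptor:

  // illustrative only; because @Res() disables Nest's response mapping,
  // the CacheInterceptor cannot cache this handler's output
  import { Controller, Get, Res } from '@nestjs/common';
  import { Response } from 'express';

  @Controller('raw')
  export class RawController {
    @Get()
    findRaw(@Res() response: Response) {
      response.json([]); // the response is sent manually, so nothing is cached
    }
  }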

Global cache

To reduce the amount of required boilerplate, you can bind CacheInterceptor to all endpoints globally:

  import { CacheModule, Module, CacheInterceptor } from '@nestjs/common';
  import { AppController } from './app.controller';
  import { APP_INTERCEPTOR } from '@nestjs/core';

  @Module({
    imports: [CacheModule.register()],
    controllers: [AppController],
    providers: [
      {
        provide: APP_INTERCEPTOR,
        useClass: CacheInterceptor,
      },
    ],
  })
  export class ApplicationModule {}

WebSockets & Microservices

You can also apply the CacheInterceptor to WebSocket subscribers as well as microservice patterns (regardless of the transport method being used).

  @CacheKey('events')
  @UseInterceptors(CacheInterceptor)
  @SubscribeMessage('events')
  handleEvent(client: Client, data: string[]): Observable<string[]> {
    return of([]); // Observable and of are imported from 'rxjs'
  }

Hint The @CacheKey() decorator is imported from the @nestjs/common package.

However, the additional @CacheKey() decorator is required here in order to specify a key used to subsequently store and retrieve the cached data. Also, please note that you shouldn't cache everything. Actions that perform business operations, rather than simply querying data, should never be cached.
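
For instance, a microservice message handler can be cached in the same way. The following is a minimal sketch; the pattern, controller name, and returned data are illustrative:

  // a minimal sketch; the pattern and returned data are illustrative
  import { CacheInterceptor, CacheKey, Controller, UseInterceptors } from '@nestjs/common';
  import { MessagePattern } from '@nestjs/microservices';

  @Controller()
  export class UsersController {
    @CacheKey('users')
    @UseInterceptors(CacheInterceptor)
    @MessagePattern({ cmd: 'get_users' })
    getUsers(): string[] {
      return [];
    }
  }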

Customize caching

All cached data has its own expiration time (TTL). To customize the default values, pass an options object to the register() method.

  CacheModule.register({
    ttl: 5, // seconds
    max: 10, // maximum number of items in cache
  });

Different stores

This service takes advantage of cache-manager under the hood. The cache-manager package supports a wide range of useful stores, for example, the Redis store. A full list of supported stores is available here. To set up the Redis store, simply pass the package together with the corresponding options to the register() method.

  import * as redisStore from 'cache-manager-redis-store';
  import { CacheModule, Module } from '@nestjs/common';
  import { AppController } from './app.controller';

  @Module({
    imports: [
      CacheModule.register({
        store: redisStore,
        host: 'localhost',
        port: 6379,
      }),
    ],
    controllers: [AppController],
  })
  export class ApplicationModule {}
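
Note that the store package used above has to be installed separately (assuming npm):

  $ npm install --save cache-manager-redis-store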

Adjust tracking

By default, Nest uses the request URL (in an HTTP app) or cache key (in websockets and microservices apps, set through the @CacheKey() decorator) to associate cache records with your endpoints. Nevertheless, sometimes you might want to set up tracking based on different factors, for example, using HTTP headers (e.g. Authorization to properly identify profile endpoints).

In order to accomplish that, create a subclass of CacheInterceptor and override the trackBy() method.

  @Injectable()
  class HttpCacheInterceptor extends CacheInterceptor {
    trackBy(context: ExecutionContext): string | undefined {
      return 'key';
    }
  }
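
For example, to track cache entries by HTTP header as described above, the interceptor could build the key from the Authorization header and the request URL. The following is a minimal sketch, assuming an Express-based HTTP application; the key format is illustrative:

  import { CacheInterceptor, ExecutionContext, Injectable } from '@nestjs/common';

  @Injectable()
  export class HttpCacheInterceptor extends CacheInterceptor {
    trackBy(context: ExecutionContext): string | undefined {
      const request = context.switchToHttp().getRequest();
      // mirror the default behavior: only GET requests are cacheable
      if (request.method !== 'GET') {
        return undefined;
      }
      // partition cache entries by the Authorization header as well as the URL
      const authorization = request.headers['authorization'] || 'anonymous';
      return `${authorization}:${request.url}`;
    }
  }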

Async configuration

You may want to asynchronously pass in module options instead of passing them statically at compile time. In this case, use the registerAsync() method, which provides several ways to deal with async configuration.

One approach is to use a factory function:

  CacheModule.registerAsync({
    useFactory: () => ({
      ttl: 5,
    }),
  });

Our factory behaves like all other asynchronous module factories (it can be async and is able to inject dependencies through inject).

  CacheModule.registerAsync({
    imports: [ConfigModule],
    useFactory: async (configService: ConfigService) => ({
      ttl: configService.getString('CACHE_TTL'),
    }),
    inject: [ConfigService],
  });

Alternatively, you can use the useClass method:

  CacheModule.registerAsync({
    useClass: CacheConfigService,
  });

The above construction will instantiate CacheConfigService inside CacheModule and will use it to get the options object. The CacheConfigService has to implement the CacheOptionsFactory interface in order to provide the configuration options:

  @Injectable()
  class CacheConfigService implements CacheOptionsFactory {
    createCacheOptions(): CacheModuleOptions {
      return {
        ttl: 5,
      };
    }
  }

If you wish to use an existing configuration provider imported from a different module, use the useExisting syntax:

  CacheModule.registerAsync({
    imports: [ConfigModule],
    useExisting: ConfigService,
  });

This works the same as useClass with one critical difference: CacheModule will look up imported modules to reuse any already-created ConfigService, instead of instantiating its own.
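
For that to work, the module providing the configuration class has to export it. A minimal sketch, assuming a hypothetical ConfigModule whose ConfigService implements CacheOptionsFactory as shown above:

  // a sketch; ConfigModule and ConfigService are assumed to exist in your project,
  // and ConfigService must implement CacheOptionsFactory
  import { Module } from '@nestjs/common';
  import { ConfigService } from './config.service';

  @Module({
    providers: [ConfigService],
    exports: [ConfigService],
  })
  export class ConfigModule {}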