December 20, 2024 • 15 min
Implementing Redis Caching Strategy in NestJS
#NestJS
#Redis
#Tutorial
Caching is crucial for application performance. In this tutorial, I’ll show you how to implement Redis caching in NestJS with proper invalidation strategies.
Setup Redis in NestJS
Install Dependencies
npm install @nestjs/cache-manager cache-manager
npm install cache-manager-redis-yet
npm install redis
Configure Cache Module
// app.module.ts
import { CacheModule } from '@nestjs/cache-manager';
import { redisStore } from 'cache-manager-redis-yet';
@Module({
  imports: [
    CacheModule.registerAsync({
      isGlobal: true,
      useFactory: async () => ({
        store: await redisStore({
          socket: {
            host: process.env.REDIS_HOST || 'localhost',
            port: parseInt(process.env.REDIS_PORT ?? '6379', 10),
          },
          password: process.env.REDIS_PASSWORD,
          ttl: 60 * 1000, // default TTL: 60 seconds
        }),
      }),
    }),
  ],
})
export class AppModule {}
Basic Caching Pattern
Cache-Aside Pattern
The most common caching pattern:
import { Injectable, Inject } from '@nestjs/common';
import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { Cache } from 'cache-manager';
@Injectable()
export class UsersService {
  constructor(
    @Inject(CACHE_MANAGER) private cacheManager: Cache,
    private prisma: PrismaService,
  ) {}

  async findOne(id: number): Promise<User | null> {
    const cacheKey = `user:${id}`;

    // 1. Try to get from cache
    const cached = await this.cacheManager.get<User>(cacheKey);
    if (cached) {
      console.log('Cache HIT');
      return cached;
    }
    console.log('Cache MISS');

    // 2. If not in cache, get from database
    const user = await this.prisma.user.findUnique({
      where: { id },
    });

    // 3. Store in cache for next time (skip null so missing rows are re-checked)
    if (user) {
      await this.cacheManager.set(cacheKey, user, 300000); // 5 minutes
    }
    return user;
  }
}
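The three cache-aside steps repeat for every cached entity, so they are worth factoring into one reusable helper. A minimal sketch, assuming a small `CacheLike` interface that matches the `get`/`set` calls used above (the `getOrSet` name is illustrative, not part of `cache-manager`):

```typescript
// Minimal cache interface matching the get/set calls used in the service.
interface CacheLike {
  get<T>(key: string): Promise<T | undefined>;
  set(key: string, value: unknown, ttlMs?: number): Promise<void>;
}

// Cache-aside in one place: try the cache, fall back to the loader,
// store the fresh result for next time.
async function getOrSet<T>(
  cache: CacheLike,
  key: string,
  ttlMs: number,
  loader: () => Promise<T>,
): Promise<T> {
  const cached = await cache.get<T>(key);
  if (cached !== undefined) return cached;

  const fresh = await loader();
  if (fresh !== undefined && fresh !== null) {
    // Skip caching empty results so a missing row is re-checked next time.
    await cache.set(key, fresh, ttlMs);
  }
  return fresh;
}
```

With `cacheManager` injected, `findOne` would collapse to a single `getOrSet(this.cacheManager, `user:${id}`, 300_000, () => this.prisma.user.findUnique({ where: { id } }))` call.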
TTL (Time-To-Live) Strategies
Strategy 1: Fixed TTL
// Different TTLs (in seconds) based on data type
const TTL = {
  USER_PROFILE: 300, // 5 minutes
  USER_SETTINGS: 3600, // 1 hour
  STATIC_CONTENT: 86400, // 24 hours
  ANALYTICS: 60, // 1 minute
};

// cache-manager takes the TTL in milliseconds
await this.cacheManager.set('user:1', user, TTL.USER_PROFILE * 1000);
Strategy 2: Dynamic TTL
// TTL based on data characteristics
// TTL (in seconds) based on data characteristics
function calculateTTL(data: any): number {
  if (data.isStatic) return 86400; // 24 hours
  if (data.updateFrequency === 'high') return 60; // 1 minute
  if (data.updateFrequency === 'medium') return 300; // 5 minutes
  return 3600; // 1 hour default
}

const ttl = calculateTTL(userData);
await this.cacheManager.set(key, userData, ttl * 1000);
Cache Invalidation Patterns
Pattern 1: Time-Based Invalidation
Simplest approach - let cache expire naturally:
// Set with TTL, no manual invalidation
await this.cacheManager.set('data', value, 300000); // 5 min
Pros: Simple, no maintenance.
Cons: Might serve stale data until the TTL expires.
Pattern 2: Write-Through Invalidation
Update cache when data changes:
async update(id: number, data: UpdateUserDto): Promise<User> {
  // 1. Update database
  const updated = await this.prisma.user.update({
    where: { id },
    data,
  });

  // 2. Update cache immediately
  const cacheKey = `user:${id}`;
  await this.cacheManager.set(cacheKey, updated, TTL.USER_PROFILE * 1000);
  return updated;
}
Pattern 3: Delete-on-Write
Remove cache entry when data changes:
async update(id: number, data: UpdateUserDto): Promise<User> {
  // 1. Update database
  const updated = await this.prisma.user.update({
    where: { id },
    data,
  });

  // 2. Delete from cache (will be fetched fresh next time)
  await this.cacheManager.del(`user:${id}`);
  return updated;
}
Pattern 4: Tag-Based Invalidation
Invalidate multiple related cache entries:
@Injectable()
class CacheService {
  private tags = new Map<string, Set<string>>();

  constructor(@Inject(CACHE_MANAGER) private cacheManager: Cache) {}

  async setWithTags(key: string, value: any, tags: string[], ttl?: number) {
    // Store value
    await this.cacheManager.set(key, value, ttl);

    // Track tags
    tags.forEach((tag) => {
      if (!this.tags.has(tag)) {
        this.tags.set(tag, new Set());
      }
      this.tags.get(tag)!.add(key);
    });
  }

  async invalidateTag(tag: string) {
    const keys = this.tags.get(tag);
    if (!keys) return;

    // Delete all keys with this tag
    await Promise.all(
      Array.from(keys).map((key) => this.cacheManager.del(key)),
    );
    this.tags.delete(tag);
  }
}

// Usage:
await cacheService.setWithTags(
  'user:1:posts',
  posts,
  ['user:1', 'posts'],
  300000, // 5 minutes (ms)
);

// Later, invalidate all caches tagged with 'user:1'
await cacheService.invalidateTag('user:1');
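One caveat: the tag index above lives in process memory, so it resets on restart and is not shared between app instances (in production you might keep tag membership in Redis sets instead). The semantics themselves are easy to model in isolation; a self-contained sketch with an in-memory store (the `TaggedCache` class is illustrative):

```typescript
// Standalone model of tag-based invalidation: each tag maps to the set
// of cache keys it covers; invalidating a tag deletes exactly those keys.
class TaggedCache {
  private store = new Map<string, unknown>();
  private tagIndex = new Map<string, Set<string>>();

  set(key: string, value: unknown, tags: string[] = []): void {
    this.store.set(key, value);
    for (const tag of tags) {
      if (!this.tagIndex.has(tag)) this.tagIndex.set(tag, new Set());
      this.tagIndex.get(tag)!.add(key);
    }
  }

  get<T>(key: string): T | undefined {
    return this.store.get(key) as T | undefined;
  }

  // Returns the number of cache entries dropped.
  invalidateTag(tag: string): number {
    const keys = this.tagIndex.get(tag);
    if (!keys) return 0;
    for (const key of keys) this.store.delete(key);
    this.tagIndex.delete(tag);
    return keys.size;
  }
}
```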
Advanced Patterns
Decorator-Based Caching
Create a custom decorator:
// cache.decorator.ts
import { SetMetadata } from '@nestjs/common';

export const CACHE_KEY_METADATA = 'cache_key';
export const CACHE_TTL_METADATA = 'cache_ttl';

export const Cacheable = (keyPrefix: string, ttl = 300) => {
  return (target: any, propertyKey: string, descriptor: PropertyDescriptor) => {
    SetMetadata(CACHE_KEY_METADATA, keyPrefix)(target, propertyKey, descriptor);
    SetMetadata(CACHE_TTL_METADATA, ttl)(target, propertyKey, descriptor);
  };
};
Create an interceptor:
// cache.interceptor.ts
import {
  Injectable,
  NestInterceptor,
  ExecutionContext,
  CallHandler,
  Inject,
} from '@nestjs/common';
import { Observable, of } from 'rxjs';
import { tap } from 'rxjs/operators';
import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { Cache } from 'cache-manager';
import { Reflector } from '@nestjs/core';
import { CACHE_KEY_METADATA, CACHE_TTL_METADATA } from './cache.decorator';

@Injectable()
export class CacheInterceptor implements NestInterceptor {
  constructor(
    @Inject(CACHE_MANAGER) private cacheManager: Cache,
    private reflector: Reflector,
  ) {}

  async intercept(context: ExecutionContext, next: CallHandler): Promise<Observable<any>> {
    const keyPrefix = this.reflector.get<string>(CACHE_KEY_METADATA, context.getHandler());
    const ttl = this.reflector.get<number>(CACHE_TTL_METADATA, context.getHandler());
    if (!keyPrefix) {
      return next.handle();
    }

    const request = context.switchToHttp().getRequest();
    const cacheKey = `${keyPrefix}:${JSON.stringify(request.params)}`;

    const cached = await this.cacheManager.get(cacheKey);
    if (cached) {
      return of(cached);
    }

    return next.handle().pipe(
      tap(async (data) => {
        await this.cacheManager.set(cacheKey, data, ttl * 1000);
      }),
    );
  }
}
Usage (the interceptor runs on route handlers, so apply the decorator in a controller and bind the interceptor):

@Controller('users')
@UseInterceptors(CacheInterceptor)
export class UsersController {
  constructor(private usersService: UsersService) {}

  @Get(':id')
  @Cacheable('user', 300)
  findOne(@Param('id') id: string) {
    return this.usersService.findOne(+id);
  }
}
Monitoring Cache Performance
Track Cache Hit Rate
@Injectable()
export class CacheMetrics {
  private hits = 0;
  private misses = 0;

  recordHit() {
    this.hits++;
  }

  recordMiss() {
    this.misses++;
  }

  getHitRate(): number {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : (this.hits / total) * 100;
  }

  getStats() {
    return {
      hits: this.hits,
      misses: this.misses,
      hitRate: `${this.getHitRate().toFixed(2)}%`,
      total: this.hits + this.misses,
    };
  }
}
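Recording hits and misses then reduces to one call per branch of the cache-aside lookup. A small sketch of that wiring, using a standalone `Metrics` class that mirrors `CacheMetrics` above and an illustrative `measuredGet` wrapper:

```typescript
// Mirrors the CacheMetrics class above, kept standalone for the example.
class Metrics {
  private hits = 0;
  private misses = 0;

  recordHit() { this.hits++; }
  recordMiss() { this.misses++; }

  getHitRate(): number {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : (this.hits / total) * 100;
  }
}

// Wrap any async cache lookup so hits and misses are counted automatically:
// a defined value is a hit, undefined is a miss.
async function measuredGet<T>(
  metrics: Metrics,
  get: (key: string) => Promise<T | undefined>,
  key: string,
): Promise<T | undefined> {
  const value = await get(key);
  if (value !== undefined) {
    metrics.recordHit();
  } else {
    metrics.recordMiss();
  }
  return value;
}
```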
Performance Comparison
| Scenario | Without Cache | With Cache | Improvement |
|---|---|---|---|
| User Profile | 45ms | 2ms | 95.6% faster |
| Product List | 120ms | 8ms | 93.3% faster |
| Analytics | 850ms | 15ms | 98.2% faster |
Best Practices
- Set appropriate TTLs - Balance freshness vs performance
- Use cache keys wisely - Include version in key for easy invalidation
- Monitor hit rates - Aim for > 80% hit rate
- Handle cache failures gracefully - App should work if Redis is down
- Don’t cache everything - Only cache frequently accessed data
- Use compression - For large cached values
- Guard against cache stampede - Use locking or request coalescing so one miss triggers one query
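The fourth practice is worth codifying: treat the cache as best-effort, so a Redis outage degrades to direct database reads instead of failing requests. A minimal sketch, assuming an illustrative `BestEffortCache` interface and `cachedOrDirect` helper (names are not from any library):

```typescript
// Minimal cache shape; any get/set errors are swallowed, not propagated.
interface BestEffortCache {
  get<T>(key: string): Promise<T | undefined>;
  set(key: string, value: unknown, ttlMs?: number): Promise<void>;
}

// Cache-aside where the cache is optional: a cache failure is logged and
// the request falls through to the real data source.
async function cachedOrDirect<T>(
  cache: BestEffortCache,
  key: string,
  ttlMs: number,
  loader: () => Promise<T>,
): Promise<T> {
  try {
    const hit = await cache.get<T>(key);
    if (hit !== undefined) return hit;
  } catch (err) {
    console.warn(`cache read failed for ${key}:`, err);
  }

  const value = await loader();

  try {
    await cache.set(key, value, ttlMs);
  } catch (err) {
    console.warn(`cache write failed for ${key}:`, err);
  }
  return value;
}
```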
Common Pitfalls
❌ Cache Stampede
Multiple requests miss cache simultaneously:
// BAD: concurrent misses all hit the database
async findPopularPosts() {
  const cached = await this.cache.get('popular_posts');
  if (!cached) {
    // Problem: multiple requests reach here simultaneously
    const posts = await this.prisma.post.findMany();
    await this.cache.set('popular_posts', posts);
    return posts;
  }
  return cached;
}
Solution: Use locking
import { Mutex } from 'async-mutex';

const mutex = new Mutex();

async findPopularPosts() {
  const cached = await this.cache.get('popular_posts');
  if (!cached) {
    // Only one request fetches data at a time
    return await mutex.runExclusive(async () => {
      // Double-check after acquiring the lock
      const cached2 = await this.cache.get('popular_posts');
      if (cached2) return cached2;

      const posts = await this.prisma.post.findMany();
      await this.cache.set('popular_posts', posts, 300000);
      return posts;
    });
  }
  return cached;
}
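An alternative to a mutex that needs no extra dependency is request coalescing: keep the in-flight promise per key, so concurrent misses share one database call instead of queueing behind a lock. A sketch (the `inFlight` map and `coalesce` function are illustrative names):

```typescript
// One pending promise per key; cleared once the load settles.
const inFlight = new Map<string, Promise<unknown>>();

// Concurrent callers for the same key await the same loader promise,
// so the database is hit once per miss rather than once per caller.
async function coalesce<T>(key: string, loader: () => Promise<T>): Promise<T> {
  const pending = inFlight.get(key);
  if (pending) return pending as Promise<T>;

  const p = loader().finally(() => inFlight.delete(key));
  inFlight.set(key, p);
  return p;
}
```

Unlike the mutex version, later callers never re-run the loader at all; they simply receive the first caller's result.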
Conclusion
Redis caching in NestJS can dramatically improve performance:
- Reduce database load by 80-90%
- Improve response times by 90-98%
- Scale to handle more users
Implement proper invalidation strategies to ensure data freshness while maximizing cache hit rates!