
Caching Strategies and Redis

Implement caching with Redis to improve application performance.

By EME · Published: February 20, 2025
Tags: caching, redis, performance, in-memory, optimization

A Simple Analogy

A cache is like keeping frequently used items on your desk. Instead of walking to the filing cabinet (the database) for the stapler every time, you keep it within arm's reach. Redis is that high-speed desk for your application's most-used data.


What Is Caching?

Caching stores frequently accessed data in fast, temporary storage. Instead of hitting the database on every request, the application returns data from the cache, dramatically reducing latency and database load.


Why Cache?

  • Speed: In-memory reads are typically orders of magnitude faster than a round trip to the database
  • Scalability: Reduce database load
  • User experience: Faster response times
  • Cost: Fewer database queries = less infrastructure
  • Reliability: Cached data can keep serving requests even when the database is slow or briefly unavailable

Caching Strategies

| Strategy | Use Case |
|----------|----------|
| Cache-Aside | Check cache first, load from DB if miss |
| Write-Through | Update cache and DB together |
| Write-Behind | Update cache immediately, DB later |
| Refresh-Ahead | Proactively refresh before expiration |
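
Cache-aside is demonstrated in detail in the practical example further down, and write-through is essentially what the cache invalidation snippet does (database first, cache second). Write-behind is the one strategy not shown elsewhere, so here is a minimal sketch of it, reusing the article's User and IUserRepository types; the WriteBehindUserStore class, the use of System.Threading.Channels as an in-process queue, and the fire-and-forget flush loop are illustrative choices, and a production version would need durability and retry handling.

// Write-behind (write-back): the cache is updated immediately and the database
// write happens later, from a background loop. Requires System.Threading.Channels.
public class WriteBehindUserStore
{
    private readonly IDatabase _cache;
    private readonly IUserRepository _repository;
    private readonly Channel<User> _pendingWrites = Channel.CreateUnbounded<User>();

    public WriteBehindUserStore(IDatabase cache, IUserRepository repository)
    {
        _cache = cache;
        _repository = repository;
        _ = Task.Run(FlushLoopAsync);   // start the background flusher
    }

    public async Task SaveAsync(User user)
    {
        // The caller only waits for the cache write, not the database.
        await _cache.StringSetAsync($"user:{user.Id}", JsonSerializer.Serialize(user));
        await _pendingWrites.Writer.WriteAsync(user);
    }

    private async Task FlushLoopAsync()
    {
        // Persist queued writes to the database in arrival order.
        await foreach (var user in _pendingWrites.Reader.ReadAllAsync())
        {
            await _repository.UpdateAsync(user);
        }
    }
}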


Redis Basics

# Start Redis
docker run -d -p 6379:6379 redis:7

# CLI
redis-cli
> SET key "value"
> GET key
> DEL key
> EXPIRE key 60  # Expire in 60 seconds
> TTL key        # Time to live

.NET with Redis

// Install: dotnet add package StackExchange.Redis
using StackExchange.Redis;

var redis = ConnectionMultiplexer.Connect("localhost:6379");
var db = redis.GetDatabase();

// Set value
db.StringSet("user:123", "Alice", TimeSpan.FromMinutes(30));

// Get value
var user = db.StringGet("user:123");

// Delete
db.KeyDelete("user:123");

// Check existence
bool exists = db.KeyExists("user:123");
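
The CLI commands above map directly onto StackExchange.Redis methods. The short sketch below continues with the same db handle and covers the expiration-related calls plus an atomic counter; the stats:cache-hits key is just an illustrative name.

// Equivalent of EXPIRE key 60: expire in 60 seconds
db.KeyExpire("user:123", TimeSpan.FromSeconds(60));

// Equivalent of TTL key: remaining time to live, or null if the key has no expiry (or does not exist)
TimeSpan? ttl = db.KeyTimeToLive("user:123");

// Atomic increment (INCR), handy for counting cache hits and misses
long hits = db.StringIncrement("stats:cache-hits");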

Practical Example

public class UserService
{
    private readonly IDatabase _cache;
    private readonly IUserRepository _repository;

    public UserService(IDatabase cache, IUserRepository repository)
    {
        _cache = cache;
        _repository = repository;
    }

    public async Task<User> GetUserAsync(int id)
    {
        var cacheKey = $"user:{id}";

        // Try cache first
        var cached = await _cache.StringGetAsync(cacheKey);
        if (cached.HasValue)
        {
            return JsonSerializer.Deserialize<User>(cached.ToString());
        }

        // Cache miss: fetch from database
        var user = await _repository.GetUserAsync(id);

        // Store in cache for 30 minutes
        await _cache.StringSetAsync(
            cacheKey,
            JsonSerializer.Serialize(user),
            TimeSpan.FromMinutes(30)
        );

        return user;
    }
}
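
Because this get/deserialize/fetch/set dance repeats for every cached entity, it is often worth pulling into a small helper. The sketch below is one possible shape for it; GetOrSetAsync is our own extension method, not a StackExchange.Redis API.

public static class CacheExtensions
{
    // Cache-aside in one place: return the cached value if present,
    // otherwise run the factory, cache its result, and return it.
    public static async Task<T> GetOrSetAsync<T>(
        this IDatabase cache,
        string key,
        Func<Task<T>> factory,
        TimeSpan ttl)
    {
        var cached = await cache.StringGetAsync(key);
        if (cached.HasValue)
        {
            return JsonSerializer.Deserialize<T>(cached.ToString());
        }

        var value = await factory();
        await cache.StringSetAsync(key, JsonSerializer.Serialize(value), ttl);
        return value;
    }
}

// GetUserAsync then collapses to:
// return await _cache.GetOrSetAsync($"user:{id}",
//     () => _repository.GetUserAsync(id),
//     TimeSpan.FromMinutes(30));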

Cache Invalidation

public async Task UpdateUserAsync(User user)
{
    // Update database first
    await _repository.UpdateAsync(user);

    // Invalidate the stale cache entry
    await _cache.KeyDeleteAsync($"user:{user.Id}");

    // Re-cache the updated data (optional: StringSet would overwrite the old entry anyway)
    await _cache.StringSetAsync(
        $"user:{user.Id}",
        JsonSerializer.Serialize(user),
        TimeSpan.FromMinutes(30)
    );
}
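
When one update touches several related keys (say user:123, user:123:orders, user:123:roles), deleting them individually gets tedious. One option, sketched below with a hypothetical key layout, is to enumerate matching keys through a server connection; server.Keys uses SCAN rather than the blocking KEYS command, but it still walks the keyspace, so keep the pattern narrow and avoid hot paths.

// Delete every cached key belonging to one user.
public async Task InvalidateUserKeysAsync(ConnectionMultiplexer redis, int userId)
{
    var db = redis.GetDatabase();

    foreach (var endpoint in redis.GetEndPoints())
    {
        var server = redis.GetServer(endpoint);

        // Enumerates keys with SCAN under the hood; non-blocking but not free.
        foreach (var key in server.Keys(pattern: $"user:{userId}*"))
        {
            await db.KeyDeleteAsync(key);
        }
    }
}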

Distributed Caching

// ASP.NET Core distributed cache
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";
});

// In controller
[ApiController]
[Route("api/[controller]")]
public class UsersController : ControllerBase
{
    private readonly IDistributedCache _cache;

    public UsersController(IDistributedCache cache)
    {
        _cache = cache;
    }

    [HttpGet("{id}")]
    public async Task<User> GetUser(int id)
    {
        var cacheKey = $"user:{id}";

        var cached = await _cache.GetStringAsync(cacheKey);
        if (!string.IsNullOrEmpty(cached))
        {
            return JsonSerializer.Deserialize<User>(cached);
        }

        // Fetch and cache...
    }
}
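
On a miss, the elided branch would look roughly like the sketch below. It assumes an IUserRepository injected alongside the cache (not shown above) and uses DistributedCacheEntryOptions to attach the expiration, which is how IDistributedCache expresses TTLs.

// Cache miss: load from the database, cache the serialized result, and return it.
// Assumes a _repository field of type IUserRepository injected via the constructor.
var user = await _repository.GetUserAsync(id);

await _cache.SetStringAsync(
    cacheKey,
    JsonSerializer.Serialize(user),
    new DistributedCacheEntryOptions
    {
        AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30)
    });

return user;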

Best Practices

  1. Short TTL: Avoid stale data
  2. Cache keys: Use consistent naming (user:123)
  3. Invalidate properly: Clear cache when data changes
  4. Monitor hits: Measure cache effectiveness
  5. Graceful degradation: Handle cache failures by falling back to the database (see the sketch after this list)
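
For point 5, here is a minimal sketch of graceful degradation built on the earlier UserService: if Redis is unreachable or slow, log and fall back to the database instead of failing the request. The _logger field is assumed, this version skips re-populating the cache for brevity, and RedisConnectionException and RedisTimeoutException are the exceptions StackExchange.Redis throws for connection and timeout failures.

public async Task<User> GetUserAsync(int id)
{
    var cacheKey = $"user:{id}";

    try
    {
        var cached = await _cache.StringGetAsync(cacheKey);
        if (cached.HasValue)
        {
            return JsonSerializer.Deserialize<User>(cached.ToString());
        }
    }
    catch (Exception ex) when (ex is RedisConnectionException or RedisTimeoutException)
    {
        // Cache is down or slow: log it and serve from the database instead of failing.
        _logger.LogWarning(ex, "Redis unavailable, falling back to the database");
    }

    return await _repository.GetUserAsync(id);
}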

Related Concepts to Explore

  • Cache warming and preloading
  • Cache stampede prevention
  • Distributed caching across servers
  • Memory management and eviction policies
  • Monitoring cache performance

Summary

Caching dramatically improves performance by reducing database load. Use Redis strategically with proper invalidation to balance speed and data freshness.