A backend API project focused on performance, caching strategies, and concurrency handling using Node.js, PostgreSQL, and Redis.
This project was built to study the cache-aside pattern, lazy loading, cache stampede handling, and real-world performance benchmarking.
- Node.js + Express
- TypeScript
- PostgreSQL (data persistence)
- Redis (cache layer)
- Docker / Docker Compose
- Autocannon (load testing)
The project follows a clean, layered architecture:
```
Controller → Service → Repository
                ↓
           Redis Cache
                ↓
     External API (FakeStore)
```
**Controller**
- Handles HTTP requests and responses
- Input validation and status codes
**Service**
- Business logic
- Cache-aside implementation
- Cache invalidation (see the sketch after this list)
- Concurrency handling
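A minimal sketch of the invalidation step mentioned above, assuming node-redis v4, node-postgres, an `items` table, and the `item:{id}` key format (all illustrative, not the project's actual names):

```ts
// Cache invalidation sketch: update the row, then delete the cached entry.
// Table name, columns, and the `item:{id}` key format are assumptions.
import { createClient } from "redis";
import { Pool } from "pg";

const redis = createClient({ url: process.env.REDIS_URL });
const pool = new Pool({ connectionString: process.env.DATABASE_URL });
// assumes `await redis.connect()` has run at application startup

export async function updateItemPrice(id: number, price: number): Promise<void> {
  // 1. Write to PostgreSQL first (source of truth)
  await pool.query("UPDATE items SET price = $1 WHERE id = $2", [price, id]);

  // 2. Invalidate the cache so the next read repopulates it (cache-aside write path)
  await redis.del(`item:${id}`);
}
```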
**Repository**
- Direct database access (PostgreSQL)
- SQL queries
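A minimal repository sketch using node-postgres (`pg`); the `items` table and its columns are assumptions for illustration:

```ts
// Repository sketch: plain SQL, no ORM — this layer only talks to PostgreSQL.
// The `items` table and column names are assumptions.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

export interface ItemRow {
  id: number;
  title: string;
  price: number;
}

export async function findItemById(id: number): Promise<ItemRow | null> {
  const { rows } = await pool.query<ItemRow>(
    "SELECT id, title, price FROM items WHERE id = $1",
    [id]
  );
  return rows[0] ?? null;
}
```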
**External API Gateway**
- Fetches data from the FakeStore API when both the cache and the database miss
Read request flow:
- Try to fetch data from Redis
- If cache miss → query database
- If DB miss → fetch from external API
- Persist data in DB
- Store data in Redis with TTL
```
// simplified flow
cache → database → external API
```
- Cache TTL: 300 seconds
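A minimal sketch of this read path, assuming node-redis v4 and hypothetical repository/gateway interfaces (`findById`, `insert`, `fetchItem` are assumptions, not the project's actual API):

```ts
// Cache-aside / lazy-loading read path (sketch only).
// Interface shapes and method names are assumptions.
import { createClient } from "redis";

interface Item {
  id: number;
  title: string;
  price: number;
}

interface ItemRepository {
  findById(id: number): Promise<Item | null>;
  insert(item: Item): Promise<void>;
}

interface FakeStoreGateway {
  fetchItem(id: number): Promise<Item>;
}

const CACHE_TTL_SECONDS = 300;
const redis = createClient({ url: process.env.REDIS_URL });
// assumes `await redis.connect()` has run at application startup

export async function getItem(
  id: number,
  repo: ItemRepository,
  gateway: FakeStoreGateway
): Promise<Item> {
  const cacheKey = `item:${id}`;

  // 1. Try Redis first
  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached) as Item;

  // 2. Cache miss → query PostgreSQL
  let item = await repo.findById(id);

  // 3. DB miss → fetch from the external FakeStore API and persist it
  if (!item) {
    item = await gateway.fetchItem(id);
    await repo.insert(item);
  }

  // 4. Store in Redis with a TTL (lazy loading), then return
  await redis.set(cacheKey, JSON.stringify(item), { EX: CACHE_TTL_SECONDS });
  return item;
}
```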
Under high concurrency, multiple requests can try to populate the cache at the same time, causing:
- Duplicate database inserts
- High latency spikes
- External API overload
Mitigations implemented:
- Redis-based locking
- Unique constraint in the database
- `ON CONFLICT DO NOTHING` on inserts
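A minimal sketch of these mitigations, assuming node-redis v4 and node-postgres; the lock key format, TTL values, table name, and SQL are illustrative assumptions:

```ts
// Cache-stampede mitigation sketch: Redis lock (SET NX) + idempotent DB insert.
// Lock key format, TTLs, table name, and SQL are assumptions.
import { createClient } from "redis";
import { Pool } from "pg";

const redis = createClient({ url: process.env.REDIS_URL });
const pool = new Pool({ connectionString: process.env.DATABASE_URL });
// assumes `await redis.connect()` has run at application startup

// Only the request that wins the lock repopulates the cache; the rest back off.
export async function populateCacheOnce(
  id: number,
  loadValue: () => Promise<string>
): Promise<boolean> {
  const lockKey = `lock:item:${id}`;

  // NX → set only if the key does not exist; PX → auto-expire the lock after 5 s
  const acquired = await redis.set(lockKey, "1", { NX: true, PX: 5000 });
  if (!acquired) return false; // another request is already populating the cache

  try {
    const value = await loadValue();
    await redis.set(`item:${id}`, value, { EX: 300 });
    return true;
  } finally {
    await redis.del(lockKey);
  }
}

// Idempotent insert: concurrent duplicates are skipped by the unique constraint.
export async function insertItemIgnoringDuplicates(
  id: number,
  title: string,
  price: number
): Promise<void> {
  await pool.query(
    "INSERT INTO items (id, title, price) VALUES ($1, $2, $3) ON CONFLICT (id) DO NOTHING",
    [id, title, price]
  );
}
```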
Tests executed using Autocannon:
```
npx autocannon -c 20 -d 10 http://localhost:3000/api/iten/1
```

**With Redis cache (hot cache):**
- Average latency: ~4–5 ms
- p99 latency: < 10 ms
- Throughput: ~4,000 req/s
- Max latency: ~30 ms
**Without cache (direct PostgreSQL):**
- Average latency: ~7 ms
- p99 latency: ~14 ms
- Throughput: ~2,600 req/s
➡️ Compared to the no-cache baseline, Redis delivers:
- ~40% lower latency
- ~50% higher throughput
- Much better stability under load
- Cold cache: possible latency spikes due to cache population and concurrency
- Hot cache: stable, low-latency responses
This behavior was validated by running sequential load tests.
```
docker compose up --build
```

The API will be available at:
http://localhost:3000/api/iten/:id
- Practical use of Redis as a cache layer
- Cache-aside and lazy loading patterns
- Performance benchmarking and analysis
- Handling concurrency issues in distributed systems
- Clean architecture and separation of concerns