This project implements a production-ready AI agent backend built with Node.js + TypeScript, exposed via a webhook and fronted by a Cloudflare Worker acting as an edge gateway.
The focus of the project is clean architecture, testability, and real-world DevOps practices, rather than framework-heavy abstractions.
```
Client
   ↓
Cloudflare Worker (Edge Gateway)
   ↓
Node.js Backend (Webhook)
   ↓
AI Agent (LLM)
```
- Backend: Handles business logic and AI agent execution
- Agent: Encapsulates LLM interaction and output validation
- Worker: Lightweight edge gateway (routing, validation, forwarding)
- CI: Automated tests on every Pull Request
- Node.js 20 (LTS)
- TypeScript
- Express
- Vitest (unit testing)
- Supertest (API testing)
- Cloudflare Workers (Wrangler)
- Docker (multi-stage build)
- Docker Compose (local orchestration)
- GitHub Actions (CI)
The AI agent:
- Uses a system prompt to control behavior
- Produces structured JSON output
- Validates responses before returning them
- Fails fast on invalid or empty model responses
The agent logic is framework-agnostic and fully testable.
Design decision: The OpenAI Agents SDK was intentionally not used to keep the agent portable, testable, and vendor-neutral.
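A fail-fast validation step of this kind can be sketched as follows. This is a minimal illustration only; `AgentReply` and its fields are assumptions, not the project's actual schema:

```typescript
// Sketch of structured-output validation; the real agent's schema and
// error handling may differ. All names here are assumptions.
interface AgentReply {
  answer: string;
  confidence: number;
}

export function parseAgentReply(raw: string | null | undefined): AgentReply {
  // Fail fast on empty model responses.
  if (!raw || raw.trim() === "") {
    throw new Error("Empty model response");
  }

  // Fail fast on non-JSON output.
  let parsed: unknown;
  try {
    parsed = JSON.parse(raw);
  } catch {
    throw new Error("Model response is not valid JSON");
  }

  // Validate the structure before returning it to callers.
  const reply = parsed as Partial<AgentReply>;
  if (typeof reply.answer !== "string" || typeof reply.confidence !== "number") {
    throw new Error("Model response is missing required fields");
  }
  return reply as AgentReply;
}
```

Because validation happens before the response leaves the agent, callers never see malformed model output.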
- AI agent logic tested with Vitest
- OpenAI SDK is fully mocked (no external calls)
- /ping endpoint tested using Supertest
- Express app tested in-memory (no open ports)
- Tests run automatically on every Pull Request using GitHub Actions
- No real API keys or external services are required in CI
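The effect of mocking the LLM can be illustrated with plain dependency injection. This is a sketch only; the project's actual tests use Vitest's mocking utilities, and the names below are assumptions:

```typescript
// Sketch: the agent depends on an abstract chat function, so tests can
// substitute a deterministic stub instead of the real OpenAI SDK.
type ChatFn = (prompt: string) => Promise<string>;

class Agent {
  constructor(private readonly chat: ChatFn) {}

  async run(prompt: string): Promise<string> {
    const raw = await this.chat(prompt);
    if (!raw || raw.trim() === "") {
      throw new Error("Empty model response"); // fail fast
    }
    return raw;
  }
}

// In a test, the OpenAI-backed ChatFn is replaced by a stub, so no
// network calls or API keys are needed:
const stubChat: ChatFn = async () => '{"answer":"stubbed"}';
const agent = new Agent(stubChat);
```

The same seam is what makes the agent framework-agnostic: swapping vendors means providing a different `ChatFn`, not rewriting the agent.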
The backend is packaged using a multi-stage Docker build for minimal image size and fast startup.
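A multi-stage build of this kind typically looks like the following sketch. Stage names, the build script, and output paths are assumptions, not the project's exact Dockerfile:

```dockerfile
# Stage 1: compile TypeScript (dev dependencies available here only)
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: minimal runtime image with production dependencies only
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=build /app/dist ./dist
EXPOSE 3000
CMD ["node", "dist/server.js"]
```

The build stage (with the TypeScript compiler and dev dependencies) is discarded, so the final image ships only the compiled output and runtime dependencies.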
```
docker build -t ai-backend .
docker run -p 3000:3000 --env-file .env ai-backend
```

```
curl http://localhost:3000/ping
```

Expected response:

```json
{ "message": "Pong!", "server_message": "Server running smoothly" }
```

For local development, the backend and the Cloudflare Worker can be orchestrated together using Docker Compose.

```
docker-compose up --build
```

- Backend: http://localhost:3000
- Worker (local): http://localhost:8787
Note: In production, the Worker runs on Cloudflare Edge and is not containerized.
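A minimal docker-compose.yml for this local setup might look like the following sketch. Service names are assumptions, and it presumes the worker/ directory ships its own Dockerfile that runs the Worker locally (e.g. via wrangler dev):

```yaml
services:
  backend:
    build: .
    ports:
      - "3000:3000"
    env_file: .env
  worker:
    build: ./worker
    ports:
      - "8787:8787"
    depends_on:
      - backend
```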
```
.
├─ src/
│  ├─ agents/     # AI agent logic
│  ├─ services/   # OpenAI client wrapper
│  ├─ prompts/    # System prompts
│  └─ server.ts   # Express app
├─ worker/        # Cloudflare Worker
├─ tests/         # API & integration tests
├─ Dockerfile
├─ docker-compose.yml
├─ .dockerignore
└─ .github/workflows   # CI
```
- Separation of concerns (agent, API, edge)
- Testability first (mocked LLM, deterministic tests)
- No vendor lock-in
- Production-ready DevOps practices
- Fail fast, validate everything
- No real API keys are used in CI
- LLM calls are mocked during tests
- The Worker is intentionally lightweight
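A lightweight edge gateway of this kind can be sketched as a Workers fetch handler. This is an illustration only; the route, validation rules, and the BACKEND_URL binding are assumptions, not the project's actual Worker code:

```typescript
// Sketch of a minimal edge gateway: validate the request shape, then
// forward it to the Node.js backend. All names here are assumptions.
interface Env {
  BACKEND_URL: string; // backend origin, e.g. configured in wrangler.toml
}

const worker = {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    // Basic routing/validation: only JSON POSTs to the webhook route pass.
    if (url.pathname !== "/webhook" || request.method !== "POST") {
      return new Response("Not found", { status: 404 });
    }
    if (request.headers.get("content-type")?.includes("application/json") !== true) {
      return new Response("Unsupported media type", { status: 415 });
    }

    // Forward the request to the backend unchanged.
    return fetch(`${env.BACKEND_URL}/webhook`, {
      method: "POST",
      headers: request.headers,
      body: request.body,
    });
  },
};

export default worker;
```

Keeping the Worker to routing, validation, and forwarding means all business logic stays in the backend, where it can be tested in-memory.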
Built as a real-world example of an AI Engineer / Backend Engineer project focused on architecture, testing, and deployment best practices.