A lightweight API to manage GPU-enabled Docker containers for deep learning workloads. Supports container creation, GPU allocation, image management, and real-time GPU monitoring.


DeepDock: GPU-Aware Container Management API

DeepDock is a lightweight API and CLI tool designed to manage Docker containers with GPU support. It simplifies the lifecycle of deep-learning workloads by providing endpoints for:

  • Pulling and listing images
  • Creating, starting, stopping, and inspecting containers
  • Monitoring GPU utilization in real time
  • Managing per-user volumes
  • Ensuring NVIDIA GPU availability inside containers

DeepDock is ideal for development environments, research labs, and GPU-enabled servers that run multiple AI workloads in parallel.

Features

  • FastAPI-based REST API
  • GPU monitoring using NVML
  • Container lifecycle management
  • Automatic volume binding per user
  • CLI command (deepdock) to start the service
  • Compatible with Docker + NVIDIA Container Toolkit
  • Lightweight, easy to deploy, minimal configuration required

Requirements

Before installing DeepDock, your host system must meet the following requirements:

1. Docker Engine

Install Docker using the official installer: Docker Installer

Verify installation:

docker --version

2. NVIDIA GPU + Drivers

DeepDock requires a GPU-enabled environment.

Install the NVIDIA driver for your GPU: NVIDIA Drivers homepage

Check if the driver is working:

nvidia-smi

3. NVIDIA Container Toolkit

Required for GPU access inside containers.

Install it: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

Verify installation:

nvidia-container-cli --version
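
Once the toolkit is installed, a common way to confirm that containers can actually see the GPU is to run a throwaway CUDA container (the image tag below is just an example; any CUDA base image works):

```shell
# Launch a disposable CUDA container and run nvidia-smi inside it.
# If GPU passthrough works, this prints the same device table as nvidia-smi on the host.
docker run --rm --gpus all nvidia/cuda:12.3.2-base-ubuntu22.04 nvidia-smi
```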

Installation

DeepDock can be installed directly from the GitHub repository using pip:

pip install git+https://github.com/JuniorDurand/DeepDock.git

Running the DeepDock Service

To start the API service:

deepdock

By default, the server starts at:

📍 http://127.0.0.1:8000

with GPU monitoring and container management enabled.
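
To check that the service is up (assuming the default address), you can request the interactive docs page and expect an HTTP 200:

```shell
# Prints the HTTP status code of the Swagger docs page; 200 means the service is running.
curl -fsS -o /dev/null -w "%{http_code}\n" http://127.0.0.1:8000/docs
```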

To run in development mode:

uvicorn deepdock.main:app --reload

API Documentation

The API provides endpoints to manage deep-learning containers with GPU access.

  • Quick reference: interactive Swagger documentation is generated automatically. Once the service is running, access it at:
    http://127.0.0.1:8000/docs

  • Detailed reference: for a full description of all endpoints, parameters, and examples, see the Wiki:
    DeepDock Wiki
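
As a sketch of how the API might be called from code, a minimal client using only the Python standard library could look like the following. Note that the /containers and /images paths and the payload fields below are hypothetical; check the Swagger UI for the actual routes exposed by your DeepDock version.

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000"  # default DeepDock address

def build_request(path, payload=None):
    """Build a GET (no payload) or JSON POST request against the DeepDock service."""
    url = BASE_URL + path
    if payload is None:
        return urllib.request.Request(url)
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )

# Hypothetical endpoint: create a container from an image with one GPU.
req = build_request("/containers", {"image": "pytorch/pytorch", "gpus": 1})
# Sending it with urllib.request.urlopen(req) requires the service to be running.
```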
