Run Django and Celery directly on your laptop (via `uv`) while Docker Compose provides Postgres, Redis, and optional extras. All defaults in `config/settings.py` are tuned so you do not need to export environment variables for local work; the Compose stack and Python processes share the same static credentials.
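The "no env vars needed" behavior usually comes from settings reading the environment with developer-friendly fallbacks. The exact code in `config/settings.py` may differ; this is a minimal sketch of that env-with-fallback pattern (the `DATABASE_HOST` name is illustrative, while the `REDIS_URL` default matches the one mentioned in the troubleshooting notes below):

```python
import os

def env(name: str, default: str) -> str:
    """Read a setting from the environment, falling back to the local dev default."""
    return os.environ.get(name, default)

# Illustrative setting names; the real settings module may use different ones.
DATABASE_HOST = env("DATABASE_HOST", "127.0.0.1")
REDIS_URL = env("REDIS_URL", "redis://localhost:6379/0")
```

With nothing exported, both the host-run processes and the Compose services converge on the same localhost defaults, which is what makes the zero-configuration workflow possible.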
- Docker Desktop (or another Docker engine) with at least 4 GB RAM.
- `uv` ≥ 0.4.
- Python 3.12 (uv will manage the interpreter inside `.venv`).
- Node.js 22.x (ships with npm 10) for the Vite frontend.
From the repository root:
```bash
# Create the reusable virtual environment.
uv venv .venv

# Install Python dependencies in editable mode (no activation needed).
uv run pip install -e .

# Frontend deps (install once; Vite reuses node_modules).
npm ci --prefix frontend

# Start backing services (Postgres, Redis, MinIO).
docker compose -f docker-compose.dev.yaml up

# Bootstrap the database.
uv run python manage.py migrate
uv run python manage.py createsuperuser
```

Tip: `docker-compose.dev.yaml` binds services to `127.0.0.1` using the `postgres`/`postgres` defaults that `config/settings.py` applies when `GOBII_RELEASE_ENV=local` (the default outside containers).
- Ensure services are running:

  ```bash
  docker compose -f docker-compose.dev.yaml up
  ```

  (Optional) add `--profile containers` if you prefer to run Django from within Docker.

- Start the Django ASGI server with live reload:

  ```bash
  uv run uvicorn config.asgi:application --reload --host 0.0.0.0 --port 8000
  ```

- Start the Celery worker (new shell):

  ```bash
  uv run celery -A config worker -l info --pool=threads --concurrency=4
  ```

  macOS disables `fork` by default; the threads pool restores worker startup while remaining autoreload-friendly.

- Front-end hot reload (new shell):

  ```bash
  npm run dev --prefix frontend
  ```

- Optional processes:
  - Celery beat:

    ```bash
    uv run celery -A config beat --loglevel info --scheduler redbeat.RedBeatScheduler
    ```

  - Object storage (MinIO UI) is available at http://localhost:9090 (`minioadmin`/`minioadmin` by default).
  - The sandbox compute server starts by default with `docker compose -f docker-compose.dev.yaml up`. To make host-run Django use it, export `SANDBOX_COMPUTE_BACKEND=http SANDBOX_COMPUTE_API_URL=http://127.0.0.1:8080 SANDBOX_COMPUTE_API_TOKEN=dev-sandbox-token`.
  - Prefer running Django/Celery inside containers? `docker compose -f docker-compose.dev.yaml --profile containers up web` (add `--profile worker` or `--profile beat` as needed) will reuse the same backing services.
Stop everything when finished with Ctrl+C in the Compose terminal (or run `docker compose -f docker-compose.dev.yaml down` in another shell if you prefer a clean exit).
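The uvicorn target `config.asgi:application` implies a standard Django ASGI entry module. The repository's actual file may include extras (e.g. middleware or Channels routing), but assuming the default Django layout, `config/asgi.py` typically has this shape:

```python
# config/asgi.py (typical shape for a standard Django project layout)
import os

from django.core.asgi import get_asgi_application

# Point Django at the settings module before the app registry loads.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")

# The callable uvicorn serves as `config.asgi:application`.
application = get_asgi_application()
```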
Follow the testing guidance in the README or in individual apps. When writing or running tests, prefer targeted modules first, then finish with the full suite:
```bash
# Example: run a focused test file
uv run python manage.py test path.to.app.tests.test_example --settings=config.test_settings

# Run the full suite (use --parallel auto for larger suites when needed)
uv run python manage.py test --settings=config.test_settings --parallel auto
```

- Database migrations fail the first time: ensure Postgres is up (`docker compose -f docker-compose.dev.yaml ps`) and rerun `uv run python manage.py migrate`.
- Celery cannot connect to Redis: verify the container is healthy and that the worker command is using the same shell with `.venv` activated (it will pick up `REDIS_URL=redis://localhost:6379/0` automatically).
- Need a clean database: run `docker compose -f docker-compose.dev.yaml down -v` to drop the local Postgres volume, then bring it back up and rerun migrations.
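Several of the failures above reduce to "is the backing service actually reachable on localhost?". As a quick sanity check (not part of the repo), a few lines of stdlib Python can probe the default dev ports before you dig into Compose logs; the ports assume the stock Postgres (5432) and Redis (6379) mappings:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Default dev ports; adjust if your Compose file maps them differently.
    for name, port in [("Postgres", 5432), ("Redis", 6379)]:
        status = "up" if port_open("127.0.0.1", port) else "DOWN"
        print(f"{name:9s} {status}")
```

If a port shows `DOWN`, check `docker compose -f docker-compose.dev.yaml ps` and the container logs before retrying migrations or the worker.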
The platform includes an end-to-end evaluation system for verifying agent behavior, tool usage, and prompt effectiveness.
To run the standard evaluation suite against a temporary test agent:
```bash
uv run python manage.py run_evals
```

Options:

- `--scenario <slug>`: run a specific scenario (e.g., `--scenario echo_response`).
- `--sync`: run synchronously (eager mode) for debugging without a separate Celery worker.
- `--agent-id <uuid>`: run against an existing Persistent Agent instead of creating a temporary one.