
TALOS Development Guide

Prerequisites

| Tool | Version | Notes |
| --- | --- | --- |
| Python | >= 3.10 | Required for match statements and modern type hints |
| Docker | >= 24.0 | Docker Desktop or Docker Engine |
| Docker Compose | >= 2.20 | Included with Docker Desktop; standalone install also works |
| Git | >= 2.30 | |
| Make | any | GNU Make (included on Linux/macOS; use Git Bash or WSL on Windows) |
| POSIX shell | bash/zsh | Git Bash works on Windows |

Quick Start

git clone https://gitlab.com/pierros/talos.git
cd talos
cp ops/.env.example ops/.env    # Edit with your secrets (see below)
make dev                        # Builds and starts the full stack

The dashboard is available at http://localhost:8000.

Configure Environment

Edit ops/.env and set the required values:

# Database
POSTGRES_USER=talos
POSTGRES_PASSWORD=<generate a secure password>
POSTGRES_DB=talos_core
DATABASE_URL=postgresql://talos:<password>@db:5432/talos_core

# Authentication
SECRET_KEY=<generate with: python3 -c "import secrets; print(secrets.token_hex(32))">

# MQTT
MQTT_USER=talos
MQTT_PASS=<generate a secure password>
BROKER_HOST=broker

Initial Data Sync

After the stack is running, trigger a SatNOGS database sync:

curl -X POST http://localhost:8000/system/sync

Or use the dashboard UI to sync satellite data.


Running Components Individually

For faster iteration during development, run Python components outside Docker while keeping infrastructure services in containers.

Infrastructure Only (Database + Broker)

cd ops
docker compose up -d db broker

Core API

pip install -e ".[core,dev]"
DATABASE_URL=postgresql://talos:talos_password@localhost:5432/talos_core \
BROKER_HOST=localhost \
SECRET_KEY=dev_secret_key \
uvicorn core.app:app --host 0.0.0.0 --port 8000 --reload

The API starts on http://localhost:8000 with auto-reload enabled in development. Prometheus metrics are exposed at /metrics. (python core/main.py also works via the backward-compat shim.)

Director

pip install -e ".[director,dev]"
DATABASE_URL=postgresql://talos:talos_password@localhost:5432/talos_core \
BROKER_HOST=localhost \
python director/mission_director.py

Agent

The agent runs on ground station hardware (e.g., Raspberry Pi) and connects to the MQTT broker. Since v0.4.0 it uses aiomqtt (asyncio) instead of paho-mqtt:

pip install -e ".[agent]"
python agent/agent.py --id <station_id> --key <api_key> --broker <broker_host>

The station ID and API key are generated when provisioning a station through the dashboard or API. Optional TLS: set TALOS_MQTT_TLS_CA to the path of the broker's CA certificate.


Database Migrations (Alembic)

TALOS uses Alembic for database schema versioning. Migration scripts live in migrations/versions/.

Running Migrations

# Apply all pending migrations (bring database to latest schema)
alembic upgrade head

# Check current migration version
alembic current

# View migration history
alembic history

Creating a New Migration

When you modify the SQLModel models in core/database.py, generate a migration:

# Auto-generate migration from model changes
alembic revision --autogenerate -m "describe your change"

# Review the generated file in migrations/versions/
# Then apply it
alembic upgrade head

Rolling Back

# Downgrade one step
alembic downgrade -1

# Downgrade to a specific revision
alembic downgrade <revision_id>

Docker Deployments

When deploying with Docker Compose, migrations run automatically on container startup. For manual control:

docker compose exec core alembic upgrade head

Makefile Targets

Run make help for a full list. Key targets:

| Target | Description |
| --- | --- |
| make dev | Start full stack with dev overrides |
| make up | Start production stack (detached) |
| make down | Stop all services |
| make test | Run full test suite with coverage |
| make test-unit | Unit tests only (shared/) |
| make test-smoke | Route and auth smoke tests |
| make test-campaign | Campaign lifecycle and RBAC tests |
| make test-all | All test suites |
| make test-hil | Hardware-in-the-loop tests (requires real hardware, TALOS_HIL=1) |
| make lint | Run ruff linter |
| make format | Auto-format code with ruff |
| make typecheck | Run mypy type checker |
| make docs | Build MkDocs documentation |
| make docs-serve | Serve docs locally with hot-reload |
| make logs | Tail Docker Compose logs |
| make clean | Remove caches, build artifacts |

Test Tiers

Fast Tests (Unit)

make test
# or directly:
pytest tests/ -v --cov=shared --cov=core

Runs unit tests for shared utilities, schemas, API endpoints, and business logic. Completes in seconds with no external dependencies.

Slow Tests (Benchmarks)

pytest -m slow

Includes performance benchmarks for SGP4 propagation, pass prediction, and MQTT throughput. Requires a running database.

End-to-End Tests (Full Stack)

./tests/test_integration/test_e2e_docker.sh

Spins up the complete Docker Compose stack, runs integration tests against live services (API, broker, database), and tears down. Use this to validate the full deployment pipeline before releases.

Campaign End-to-End Tests

pytest tests/test_integration/test_campaign_e2e.py -v

Covers the full campaign lifecycle: creation, station assignment, activation, and RBAC enforcement. Tests verify that norad_id is resolved from satnogs_id, that campaigns are saved even when transmitter fetching fails, and that viewers receive 403 on operator-only endpoints.

API End-to-End Tests (71 tests)

pytest tests/test_integration/test_e2e.py -v

Comprehensive coverage of the org, station, campaign, and assignment REST APIs. Includes RBAC enforcement tests ensuring role-based access control is applied correctly.

Smoke Tests (Routes and Auth)

pytest tests/test_integration/test_smoke_routes.py -v

Validates all route handlers return expected status codes and that the authentication flow (magic link email via Resend API) works end-to-end.

All Checks (CI Pipeline)

make lint && make typecheck && make test

The GitLab CI pipeline runs 18 jobs across 7 stages: lint (ruff, mypy), test (unit, smoke, campaign, physics, integration, load, hardware), security (SAST, secret detection, dependency scanning), build (Docker images), release (tagged versions), deploy (Fly.io), and pages (MkDocs).


Code Style

Linting and Formatting

TALOS uses Ruff for both linting and formatting:

ruff check .            # Lint (errors only)
ruff check . --fix      # Lint with auto-fix
ruff format .           # Format all files

Configuration is in pyproject.toml:

  • Target: Python 3.10
  • Line length: 120 characters
  • Enabled rule sets: E, F, I, N, W, UP, B, SIM

Type Checking

mypy shared/ core/

Configuration in pyproject.toml enables warn_return_any and warn_unused_configs.


How to Add a New MQTT Message Type

MQTT message types are defined in the shared/ package to ensure consistency across all components.

Step 1: Define the Payload Schema

Add a Pydantic model to shared/schemas.py:

# shared/schemas.py

class TemperatureTelemetry(BaseModel):
    """Agent -> Director: hardware temperature readings."""
    station_id: str
    cpu_temp_c: float
    ambient_temp_c: float | None = None
    ts: str = Field(default_factory=lambda: datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"))

Step 2: Define the Topic

Add a method or constant to shared/topics.py:

# shared/topics.py  (inside class Topics)

@staticmethod
def station_temperature(station_id: str) -> str:
    """Agent -> Director: hardware temperature telemetry."""
    return f"talos/gs/{station_id}/telemetry/temp"

SUB_ALL_STATION_TEMP = "talos/gs/+/telemetry/temp"

Step 3: Publish from the Source Component

In the agent (or whichever component produces the data):

from shared.topics import Topics
from shared.schemas import TemperatureTelemetry

topic = Topics.station_temperature(self.station_id)
payload = TemperatureTelemetry(station_id=self.station_id, cpu_temp_c=cpu_temp)
client.publish(topic, payload.model_dump_json(), qos=0)

Step 4: Subscribe in the Consumer

In the director or core API:

from shared.topics import Topics

client.subscribe(Topics.SUB_ALL_STATION_TEMP)

Step 5: Update Documentation

Add the new topic to the MQTT Topic Hierarchy table in docs/ARCHITECTURE.md.


How to Add a New Ground Station

Via the Dashboard

  1. Log in to the TALOS dashboard at http://localhost:8000
  2. Navigate to the station management page
  3. Enter the SatNOGS Network station ID
  4. TALOS pulls station metadata (name, location, altitude) from SatNOGS
  5. A station_id and api_key are generated automatically
  6. Copy the provided agent command and run it on the station hardware

Via the API

curl -X POST http://localhost:8000/stations/create \
  -H "Content-Type: application/json" \
  -d '{"network_id": 123}'

On the Station Hardware

# On the Raspberry Pi or station host
pip install -e ".[agent]"   # installs aiomqtt and dependencies (paho-mqtt is no longer used as of v0.4.0)
python3 agent/agent.py --id gs_athens_01_a1b2 --key sk_... --broker your-broker-host

Ensure that rotctld and/or rigctld (from the Hamlib package) are running on the station host before starting the agent.
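The agent talks to the Hamlib daemons over their plain-text TCP protocols (rotctld listens on port 4533 by default). A minimal stdlib sketch of pointing a rotator; the function names are illustrative, not the agent's actual API:

```python
import socket


def format_position_cmd(az: float, el: float) -> str:
    """Format Hamlib's set-position command line ('P <azimuth> <elevation>')."""
    return f"P {az:.2f} {el:.2f}\n"


def rotctld_set_position(az: float, el: float, host: str = "localhost", port: int = 4533) -> None:
    """Send a set-position command to a running rotctld instance."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(format_position_cmd(az, el).encode("ascii"))
```

For example, `rotctld_set_position(180.0, 45.0)` would slew the rotator to due south at 45 degrees elevation, assuming rotctld is running locally.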


How to Add a New Satellite Tracking Feature

  1. Research the physics: Determine what orbital mechanics or signal processing is needed.
  2. Add computation to the Director: Extend director/mission_director.py with the new calculation inside the per-assignment physics loop or as a background task. The Director iterates over active campaign assignments, so new features should be scoped per-assignment.
  3. Define MQTT messages: Follow the "Add a New MQTT Message Type" guide above to create schemas and topics for any new data. Use org-scoped topic paths (talos/{org_slug}/gs/{station_id}/...).
  4. Update the Core API: Add any necessary REST endpoints in core/main.py for configuration or data retrieval. New endpoints should be org-scoped (/api/org/{slug}/...) and respect RBAC.
  5. Update the database: If new tables or columns are needed, update the SQLModel models in core/database.py and generate an Alembic migration (see "Database Migrations" above).
  6. Update the Dashboard: Modify core/templates/dashboard.html to display the new feature. The dashboard supports multi-campaign visualization with color-coded satellite tracks.
  7. Add tests: Write unit tests in tests/ covering the new computation and message flow.
  8. Update docs: Add the feature to ARCHITECTURE.md and CHANGELOG.md.

How to Create a Campaign via API

Campaigns replace the single-mission model from v0.1. A campaign targets a specific satellite and can be assigned to one or more stations. As of v0.2.1, assignments are a simple station-to-campaign link (time windows have been removed).

Step 1: Create a Campaign

curl -X POST http://localhost:8000/api/org/{org_slug}/campaigns \
  -H "Content-Type: application/json" \
  -d '{
    "name": "ISS Tracking - Europe Pass",
    "norad_id": 25544,
    "satnogs_id": "XGHT-6662-1886-1195-1498",
    "priority": 7,
    "transmitter_uuid": "optional-transmitter-uuid"
  }'

The campaign is created in draft status.

Step 2: Assign Stations

curl -X POST http://localhost:8000/api/org/{org_slug}/campaigns/{campaign_id}/assignments \
  -H "Content-Type: application/json" \
  -d '{
    "station_id": 1
  }'

Step 3: Activate the Campaign

curl -X POST http://localhost:8000/api/org/{org_slug}/campaigns/{campaign_id}/activate

The Director picks up active assignments and begins tracking. Multiple campaigns can be active simultaneously, with different stations tracking different satellites.

Campaign Lifecycle

draft --> active --> completed
  |
  +--> cancelled

(assignments move: pending --> tracking --> completed/failed)
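The lifecycle above can be expressed as a transition table. A hypothetical sketch (the actual guards are enforced in the Core API's campaign routes, and the exact set of allowed transitions may differ):

```python
# Allowed campaign status transitions, mirroring the lifecycle diagram.
CAMPAIGN_TRANSITIONS: dict[str, set[str]] = {
    "draft": {"active", "cancelled"},
    "active": {"completed"},
    "completed": set(),   # terminal
    "cancelled": set(),   # terminal
}


def can_transition(current: str, target: str) -> bool:
    """True if a campaign may move from `current` to `target`."""
    return target in CAMPAIGN_TRANSITIONS.get(current, set())
```

For example, activating a completed campaign would be rejected, while a draft can be either activated or cancelled.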

How to Manage Organizations

Organizations group users, stations, and campaigns. Every user belongs to at least one organization.

Auto-Creation

When a new user registers and does not belong to any organization, a default organization is automatically created for them with the owner role.

Create an Organization

curl -X POST http://localhost:8000/api/org \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Athens Ground Network",
    "slug": "athens-gn"
  }'

Add a Member

curl -X POST http://localhost:8000/api/org/{org_slug}/members \
  -H "Content-Type: application/json" \
  -d '{
    "email": "operator@example.com",
    "role": "operator"
  }'

Roles

| Role | Capabilities |
| --- | --- |
| owner | Full access: manage members, delete org, all operator permissions |
| operator | Manage stations, create/activate campaigns, manage assignments |
| viewer | Read-only dashboard access |
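Role checks in the API reduce to a capability lookup. A hypothetical sketch of the idea (the real dependency lives in core/deps.py, and the capability names here are illustrative):

```python
# Illustrative capability sets derived from the roles table above.
ROLE_CAPABILITIES: dict[str, set[str]] = {
    "viewer": {"read"},
    "operator": {"read", "manage_stations", "manage_campaigns", "manage_assignments"},
    "owner": {"read", "manage_stations", "manage_campaigns", "manage_assignments",
              "manage_members", "delete_org"},
}


def require(role: str, capability: str) -> None:
    """Raise PermissionError (mapped to HTTP 403 by the API) if `role` lacks `capability`."""
    if capability not in ROLE_CAPABILITIES.get(role, set()):
        raise PermissionError(f"role '{role}' lacks '{capability}'")
```

This is why a viewer hitting an operator-only endpoint receives a 403, as exercised by the campaign RBAC tests.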

Transfer Station Ownership

Stations are owned by organizations (not individual users). To move a station between organizations, an owner can reassign it through the API:

curl -X PATCH http://localhost:8000/api/org/{org_slug}/stations/{station_id} \
  -H "Content-Type: application/json" \
  -d '{"org_id": <new_org_id>}'

Debugging Tips

MQTT Inspector

Use mosquitto_sub to watch all MQTT traffic:

# Subscribe to all TALOS topics
mosquitto_sub -h localhost -t "talos/#" -v -u talos -P <mqtt_password>

# Watch commands for a specific station
mosquitto_sub -h localhost -t "talos/gs/gs_athens_01/cmd/#" -v -u talos -P <mqtt_password>

# Watch all telemetry
mosquitto_sub -h localhost -t "talos/gs/+/telemetry/#" -v -u talos -P <mqtt_password>
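The `+` and `#` patterns above follow standard MQTT wildcard semantics: `+` matches exactly one topic level, `#` matches the entire remainder. A stdlib sketch of the matcher, handy when filtering captured traffic in debugging scripts:

```python
def topic_matches(pattern: str, topic: str) -> bool:
    """Standard MQTT wildcard matching: '+' is one level, '#' is the tail."""
    p_parts, t_parts = pattern.split("/"), topic.split("/")
    for i, p in enumerate(p_parts):
        if p == "#":
            return True          # '#' swallows everything from here on
        if i >= len(t_parts):
            return False         # pattern is longer than the topic
        if p != "+" and p != t_parts[i]:
            return False         # literal level mismatch
    return len(p_parts) == len(t_parts)
```

For instance, `talos/gs/+/telemetry/#` matches `talos/gs/gs_athens_01/telemetry/temp` but not a command topic for the same station.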

Log Levels

All components use Python's logging module. Set the log level via environment variable or code:

# Increase verbosity for debugging
LOG_LEVEL=DEBUG python core/main.py

Log output is structured as timestamp - module - level - message.

Common Issues

| Symptom | Likely Cause | Fix |
| --- | --- | --- |
| Agent connects but no commands arrive | Station not provisioned or wrong station_id | Check station_id matches the dashboard |
| Director not tracking | No active campaign assignments | Create a campaign with station assignments and activate it |
| Database connection refused | PostgreSQL not running or wrong URL | Check docker compose ps and DATABASE_URL |
| MQTT authentication failed | Wrong credentials | Verify MQTT_USER/MQTT_PASS match broker config |
| Rotator not moving | rotctld not running on station host | Start Hamlib: rotctld -m 2 -r /dev/ttyUSB0 |

Project Structure

talos/
  core/                   Web API package (split from monolith in v0.4.0)
    app.py                FastAPI application factory, middleware, lifespan
    config.py             Environment-based settings (single source of truth)
    deps.py               Shared FastAPI dependencies (DB sessions, auth, RBAC)
    mqtt_client.py        Async MQTT helper for publish/subscribe
    sync.py               SatNOGS catalog sync logic
    models.py             Pydantic request/response models
    metrics.py            Prometheus metrics definitions (prometheus-client)
    database.py           SQLModel ORM models (Organization, Campaign, Assignment, Station, User)
    main.py               Backward-compat shim (re-exports from core.app)
    routes/               11 route modules:
      admin.py              Admin actions (join/delete networks, delete stations/campaigns)
      auth.py               Magic link login, session management
      campaigns.py          Campaign CRUD and lifecycle
      dashboard.py          Main dashboard page
      legacy.py             Legacy endpoint compatibility
      networks.py           SatNOGS network integration
      orgs.py               Organization management
      pages.py              Static and management pages
      public.py             Public-facing endpoints
      stations.py           Station provisioning
    static/js/            10 ES modules (extracted from inline <script> tags):
      dashboard.js, campaigns.js, admin.js, public-track.js,
      home.js, login.js, members.js, networks.js, settings.js, stations.js
    templates/            Jinja2 HTML templates
    Dockerfile            Container image definition
    requirements.txt      Python dependencies

  director/               Multi-campaign physics engine (separate process)
    mission_director.py   Per-assignment SGP4 propagation loop, pass prediction, MQTT commands

  agent/                  Ground station edge client (asyncio since v0.4.0)
    agent.py              aiomqtt async subscriber, Hamlib bridge

  shared/                 Common utilities shared across components
    __init__.py
    schemas.py            Pydantic v2 models for all MQTT payloads
    topics.py             MQTT topic string constants and wildcards
    time_utils.py         UTC timestamp helpers
    satnogs_client.py     SatNOGS API client with response caching

  migrations/             Alembic database migration scripts
    alembic.ini           Alembic configuration
    versions/             Migration version files

  ops/                    Deployment configuration
    docker-compose.yml           Full stack definition
    docker-compose.dev.yml       Development overrides
    docker-compose.monitoring.yml  Prometheus + Grafana monitoring stack
    .env.example                 Template for environment variables
    mosquitto/
      config/
        mosquitto.conf    Broker configuration (auth, ACLs, TLS)

  tests/                  Test suite (200+ tests)
    test_shared/          Unit tests for shared utilities (19 tests)
    test_integration/     End-to-end and integration tests
      test_e2e.py           API e2e tests: org/station/campaign/assignment (71 tests)
      test_campaign_e2e.py  Campaign lifecycle and RBAC enforcement (33 tests)
      test_smoke_routes.py  Route handler and auth flow smoke tests (51 tests)
      test_director_e2e.py  Director physics and propagation tests
      test_load.py          Load and performance benchmarks
      test_agent_hardware.py Hardware-in-the-loop agent tests
      test_e2e_docker.sh    Full-stack Docker test script

  docs/                   Documentation (deployed to GitLab Pages via MkDocs)
    ARCHITECTURE.md       System architecture and data flow
    DEVELOPMENT.md        This file
    CHANGELOG.md          Release history
    RELEASING.md          Versioning and release process
    research/             v0.3 research reports (00-05)
    research/archive/     Archived v0.1-v0.2 research reports

  .gitlab/
    issue_templates/       Bug Report, Feature Request, Question, Security Vulnerability
    merge_request_templates/ Default, Hotfix, Release

  .gitlab-ci.yml          CI/CD pipeline (lint, test, security, build, deploy, pages)
  pyproject.toml          Project metadata, dependencies, tool configuration
  Makefile                Development shortcuts
  CONTRIBUTING.md         Contribution guidelines
  SECURITY.md             Security policy and vulnerability reporting