Seismo Fleet Manager v0.2.1
Backend API and HTMX-powered web interface for managing a mixed fleet of seismographs and field modems. Track deployments, monitor health in real time, merge roster intent with incoming telemetry, and control your fleet through a unified database and dashboard.
Features
- Web Dashboard: Modern, responsive UI with dark/light mode, live HTMX updates, and integrated fleet map
- Fleet Monitoring: Track deployed, benched, retired, and ignored units in separate buckets with unknown-emitter triage
- Roster Management: Full CRUD + CSV import/export, device-type aware metadata, and inline actions from the roster tables
- Settings & Safeguards: /settings page exposes roster stats, exports, replace-all imports, and danger-zone reset tools
- Device & Modem Metadata: Capture calibration windows, modem pairings, phone/IP details, and addresses per unit
- Status Management: Automatically mark deployed units as OK, Pending (>12h), or Missing (>24h) based on recent telemetry (see the sketch after this list)
- Data Ingestion: Accept reports from emitter scripts via REST API
- Photo Management: Upload and view photos for each unit
- Interactive Maps: Leaflet-based maps showing unit locations
- SQLite Storage: Lightweight, file-based database for easy deployment
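The status thresholds mentioned above boil down to an age check on each unit's most recent report. A minimal sketch of that rule (illustrative only, not the actual snapshot service code; the function name and arguments are assumptions):

from datetime import datetime, timedelta

def classify_status(last_seen: datetime, now: datetime | None = None) -> str:
    # Classify a deployed unit from the age of its most recent report
    now = now or datetime.utcnow()
    age = now - last_seen
    if age > timedelta(hours=24):
        return "Missing"
    if age > timedelta(hours=12):
        return "Pending"
    return "OK"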
Roster Manager & Settings
Visit /settings to perform bulk roster operations with guardrails:
- CSV export/import: Download the entire roster, merge updates, or replace all units in one transaction.
- Live roster table: Fetch every unit via HTMX, edit metadata, toggle deployed/retired states, move emitters to the ignore list, or delete records in-place.
- Stats at a glance: View counts for the roster, emitters, and ignored units to confirm import/cleanup operations worked.
- Danger zone controls: Clear specific tables or wipe all fleet data when resetting a lab/demo environment.
All UI actions call GET/POST /api/settings/* endpoints so you can automate the same workflows from scripts.
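For example, a backup script can hit the same endpoints the UI uses (a minimal sketch with requests; the base URL and output filename are assumptions, and the exact response shapes are documented at /docs):

import requests

BASE = "http://localhost:8001"  # adjust for your deployment

# Check table counts before doing anything destructive
stats = requests.get(f"{BASE}/api/settings/stats").json()
print(stats)

# Save a CSV backup of the entire roster
csv_bytes = requests.get(f"{BASE}/api/settings/export-csv").content
with open("roster_backup.csv", "wb") as f:
    f.write(csv_bytes)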
Tech Stack
- FastAPI: Modern, fast web framework
- SQLAlchemy: SQL toolkit and ORM
- SQLite: Lightweight database
- HTMX: Dynamic updates without heavy JavaScript frameworks
- TailwindCSS: Utility-first CSS framework
- Leaflet: Interactive maps
- Jinja2: Server-side templating
- uvicorn: ASGI server
- Docker: Containerization for easy deployment
Quick Start with Docker Compose (Recommended)
Prerequisites
- Docker and Docker Compose installed
Running the Application
Start the service:
docker compose up -d
Check logs:
docker compose logs -f
Stop the service:
docker compose down
The application will be available at:
- Web Interface: http://localhost:8001
- API Documentation: http://localhost:8001/docs
- Health Check: http://localhost:8001/health
Data Persistence
The SQLite database and photos are stored in the ./data directory, which is mounted as a volume. Your data will persist even if you restart or rebuild the container.
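Because the database is just a file, you can also inspect it directly from the host (a quick sanity-check sketch; assumes the default data/seismo_fleet.db path used elsewhere in this README):

import sqlite3

# Open the persisted database from the host side of the volume mount
conn = sqlite3.connect("data/seismo_fleet.db")
tables = conn.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
print(tables)  # lists the roster/emitter/ignored tables
conn.close()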
Local Development (Without Docker)
Prerequisites
- Python 3.11+
- pip
Setup
Install dependencies:
pip install -r requirements.txt
Run the server:
uvicorn backend.main:app --host 0.0.0.0 --port 8001 --reload
The application will be available at http://localhost:8001
Optional: Generate Sample Data
Need realistic data quickly? Run:
python create_test_db.py
cp /tmp/sfm_test.db data/seismo_fleet.db
The helper script creates a modem/seismograph mix so you can exercise the dashboard, roster tabs, and unit detail screens immediately.
Upgrading from v0.1.x
Versions ≥0.2 introduce new roster columns (device_type, calibration dates, modem metadata, addresses, etc.). Run the migration once per database file before starting the app:
python backend/migrate_add_device_types.py
The script is idempotent—if the new columns already exist it simply exits.
API Endpoints
Web Pages
- GET / - Dashboard home page
- GET /roster - Fleet roster page
- GET /unit/{unit_id} - Unit detail page
- GET /settings - Roster manager, CSV import/export, and danger-zone utilities
Fleet Status & Monitoring
- GET /api/status-snapshot - Complete fleet status snapshot
- GET /api/roster - List of all units with metadata
- GET /api/unit/{unit_id} - Detailed unit information
- GET /health - Health check endpoint
Roster Management
- POST /api/roster/add - Add a new unit to the roster. Example:
curl -X POST http://localhost:8001/api/roster/add \
  -F "id=BE1234" \
  -F "device_type=seismograph" \
  -F "unit_type=series3" \
  -F "project_id=PROJ-001" \
  -F "deployed=true" \
  -F "note=Main site sensor"
- GET /api/roster/{unit_id} - Fetch a single roster entry for editing
- POST /api/roster/edit/{unit_id} - Update all metadata (device type, calibration dates, modem fields, etc.)
- POST /api/roster/set-deployed/{unit_id} - Toggle deployment status
- POST /api/roster/set-retired/{unit_id} - Toggle retired status
- POST /api/roster/set-note/{unit_id} - Update unit notes
- DELETE /api/roster/{unit_id} - Remove a roster/emitter pair entirely
- POST /api/roster/import-csv - Bulk import from CSV (merge/update mode). Example:
curl -X POST http://localhost:8001/api/roster/import-csv \
  -F "file=@roster.csv" \
  -F "update_existing=true"
- POST /api/roster/ignore/{unit_id} - Move an unknown emitter to the ignore list
- DELETE /api/roster/ignore/{unit_id} - Remove a unit from the ignore list
- GET /api/roster/ignored - List all ignored units with reasons
Settings & Data Management
- GET /api/settings/export-csv - Download the entire roster as CSV
- GET /api/settings/stats - Counts for roster, emitters, and ignored tables
- GET /api/settings/roster-units - Raw roster dump for the settings data grid
- POST /api/settings/import-csv-replace - Replace the entire roster in one atomic transaction
- POST /api/settings/clear-all - Danger-zone action that wipes roster, emitters, and ignored tables
- POST /api/settings/clear-roster - Delete only roster entries
- POST /api/settings/clear-emitters - Delete auto-discovered emitters
- POST /api/settings/clear-ignored - Reset ignore list
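The replace-all import takes a CSV upload much like the merge import shown earlier (a hedged sketch: the "file" form field mirrors /api/roster/import-csv, but confirm the exact schema at /docs before relying on it):

import requests

BASE = "http://localhost:8001"  # adjust for your deployment

# Replace the whole roster in one atomic transaction
with open("roster.csv", "rb") as f:
    resp = requests.post(
        f"{BASE}/api/settings/import-csv-replace",
        files={"file": f},  # assumed field name; check /docs
    )
print(resp.status_code, resp.json())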
CSV Import Format
Create a CSV file with the following columns (only unit_id is required, everything else is optional):
unit_id,unit_type,device_type,deployed,retired,note,project_id,location,address,coordinates,last_calibrated,next_calibration_due,deployed_with_modem_id,ip_address,phone_number,hardware_model
Boolean columns accept true/false, 1/0, or yes/no (case-insensitive). Date columns expect YYYY-MM-DD. Use the same schema whether you merge via /api/roster/import-csv or replace everything with /api/settings/import-csv-replace.
Example
unit_id,unit_type,device_type,deployed,retired,note,project_id,location,address,coordinates,last_calibrated,next_calibration_due,deployed_with_modem_id,ip_address,phone_number,hardware_model
BE1234,series3,seismograph,true,false,Primary sensor,PROJ-001,"Station A","123 Market St, San Francisco, CA","37.7937,-122.3965",2025-01-15,2026-01-15,MDM001,,,
MDM001,modem,modem,true,false,Field modem,PROJ-001,"Station A","123 Market St, San Francisco, CA","37.7937,-122.3965",,,,"192.0.2.10","+1-555-0100","Raven XTV"
See sample_roster.csv for a minimal working example.
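If you generate the CSV programmatically, Python's csv module keeps the columns and value formats straight (a sketch; the column list comes from the schema above, and unset optional values are simply left blank):

import csv

COLUMNS = [
    "unit_id", "unit_type", "device_type", "deployed", "retired", "note",
    "project_id", "location", "address", "coordinates", "last_calibrated",
    "next_calibration_due", "deployed_with_modem_id", "ip_address",
    "phone_number", "hardware_model",
]

rows = [
    {"unit_id": "BE1234", "device_type": "seismograph", "unit_type": "series3",
     "deployed": "true", "last_calibrated": "2025-01-15"},
]

with open("roster.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)  # unspecified columns default to empty strings
    writer.writeheader()
    writer.writerows(rows)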
Emitter Reporting
- POST /emitters/report - Submit status report from a seismograph unit
- POST /api/series3/heartbeat - Series3 multi-unit telemetry payload
- GET /fleet/status - Retrieve status of all seismograph units (legacy)
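A minimal emitter script only needs to POST the report payload on a schedule (a sketch based on the report format in the Testing section below; the unit ID, filename, and interval are placeholders):

import time
from datetime import datetime

import requests

REPORT_URL = "http://localhost:8001/emitters/report"  # adjust host for your deployment

while True:
    payload = {
        "unit": "SEISMO-001",                      # placeholder unit ID
        "unit_type": "series3",
        "timestamp": datetime.utcnow().isoformat(),
        "file": "event_latest.dat",                # placeholder filename
        "status": "OK",
    }
    try:
        requests.post(REPORT_URL, json=payload, timeout=10)
    except requests.RequestException as exc:
        print(f"report failed: {exc}")
    time.sleep(600)  # every 10 minutes; tune to the 12h/24h status thresholds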
Photo Management
- GET /api/unit/{unit_id}/photos - List photos for a unit
- GET /api/unit/{unit_id}/photo/{filename} - Serve a specific photo file
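To script against the photo endpoints (a sketch; the shape of the photo list response isn't documented here, so the filename handling is an assumption, check /docs):

import requests

BASE = "http://localhost:8001"
unit_id = "BE1234"  # example unit ID from the roster examples

# List photos for a unit (exact response shape: see /docs)
photos = requests.get(f"{BASE}/api/unit/{unit_id}/photos").json()
print(photos)

# Fetch one photo, assuming the list yields filenames
filename = "site_photo.jpg"  # hypothetical filename
resp = requests.get(f"{BASE}/api/unit/{unit_id}/photo/{filename}")
with open(filename, "wb") as f:
    f.write(resp.content)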
API Documentation
Once running, interactive API documentation is available at:
- Swagger UI: http://localhost:8001/docs
- ReDoc: http://localhost:8001/redoc
Testing the API
Using curl
Submit a report:
curl -X POST http://localhost:8001/emitters/report \
-H "Content-Type: application/json" \
-d '{
"unit": "SEISMO-001",
"unit_type": "series3",
"timestamp": "2025-11-20T10:30:00",
"file": "event_20251120_103000.dat",
"status": "OK"
}'
Get fleet status:
curl http://localhost:8001/api/roster
Import roster from CSV:
curl -X POST http://localhost:8001/api/roster/import-csv \
-F "file=@sample_roster.csv" \
-F "update_existing=true"
Using Python
import requests
from datetime import datetime
# Submit report
response = requests.post(
    "http://localhost:8001/emitters/report",
    json={
        "unit": "SEISMO-001",
        "unit_type": "series3",
        "timestamp": datetime.utcnow().isoformat(),
        "file": "event_20251120_103000.dat",
        "status": "OK"
    }
)
print(response.json())
# Get fleet status
response = requests.get("http://localhost:8001/api/roster")
print(response.json())
# Import CSV
with open('roster.csv', 'rb') as f:
    files = {'file': f}
    data = {'update_existing': 'true'}
    response = requests.post(
        "http://localhost:8001/api/roster/import-csv",
        files=files,
        data=data
    )
print(response.json())
Data Model
RosterUnit Table (Fleet Roster)
Common fields
| Field | Type | Description |
|---|---|---|
| id | string | Unit identifier (primary key) |
| unit_type | string | Hardware model name (default: series3) |
| device_type | string | seismograph or modem discriminator |
| deployed | boolean | Whether the unit is in the field |
| retired | boolean | Removes the unit from deployments but preserves history |
| note | string | Notes about the unit |
| project_id | string | Associated project identifier |
| location | string | Legacy location label |
| address | string | Human-readable address |
| coordinates | string | lat,lon pair used by the map |
| last_updated | datetime | Last modification timestamp |
Seismograph-only fields
| Field | Type | Description |
|---|---|---|
| last_calibrated | date | Last calibration date |
| next_calibration_due | date | Next calibration date |
| deployed_with_modem_id | string | Which modem is paired during deployment |
Modem-only fields
| Field | Type | Description |
|---|---|---|
| ip_address | string | Assigned IP (static or DHCP) |
| phone_number | string | Cellular number for the modem |
| hardware_model | string | Modem hardware reference |
Emitter Table (Device Check-ins)
| Field | Type | Description |
|---|---|---|
| id | string | Unit identifier (primary key) |
| unit_type | string | Reported device type/model |
| last_seen | datetime | Last report timestamp |
| last_file | string | Last file processed |
| status | string | Current status: OK, Pending, Missing |
| notes | string | Optional notes (nullable) |
IgnoredUnit Table (Noise Management)
| Field | Type | Description |
|---|---|---|
| id | string | Unit identifier (primary key) |
| reason | string | Optional context for ignoring |
| ignored_at | datetime | When the ignore action occurred |
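For orientation, the IgnoredUnit table corresponds to a SQLAlchemy declarative model roughly like this (a sketch reconstructed from the fields above, not a copy of backend/models.py; the table name and column types are assumptions):

from datetime import datetime

from sqlalchemy import Column, DateTime, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class IgnoredUnit(Base):
    __tablename__ = "ignored_units"  # assumed table name

    id = Column(String, primary_key=True)                    # unit identifier
    reason = Column(String, nullable=True)                    # optional context for ignoring
    ignored_at = Column(DateTime, default=datetime.utcnow)    # when the ignore action occurred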
Project Structure
seismo-fleet-manager/
├── backend/
│ ├── main.py # FastAPI app entry point
│ ├── database.py # SQLAlchemy database configuration
│ ├── models.py # Database models (RosterUnit, Emitter, IgnoredUnit)
│ ├── routes.py # Legacy API endpoints
│ ├── routers/ # Modular API routers
│ │ ├── roster.py # Fleet status endpoints
│ │ ├── roster_edit.py # Roster management & CSV import
│ │ ├── units.py # Unit detail endpoints
│ │ ├── photos.py # Photo management
│ │ ├── dashboard.py # Dashboard partials
│ │ ├── dashboard_tabs.py # Dashboard tab endpoints
│ │ └── settings.py # Roster manager/data operations
│ ├── services/
│ │ └── snapshot.py # Fleet status snapshot logic
│ ├── migrate_add_device_types.py # SQLite migration for v0.2 schema
│ └── static/ # Static assets (CSS, etc.)
├── create_test_db.py # Generate a sample SQLite DB with mixed devices
├── templates/ # Jinja2 HTML templates
│ ├── base.html # Base layout with sidebar
│ ├── dashboard.html # Main dashboard
│ ├── roster.html # Fleet roster table
│ ├── unit_detail.html # Unit detail page
│ ├── settings.html # Roster manager UI
│ └── partials/ # HTMX partial templates
│ ├── roster_table.html
│ ├── retired_table.html
│ ├── ignored_table.html
│ └── unknown_emitters.html
├── data/ # SQLite database & photos (persisted)
├── requirements.txt # Python dependencies
├── Dockerfile # Docker container definition
├── docker-compose.yml # Docker Compose configuration
├── CHANGELOG.md # Version history
├── FRONTEND_README.md # Frontend documentation
└── README.md # This file
Docker Commands
Build the image:
docker compose build
Start in foreground:
docker compose up
Start in background:
docker compose up -d
View logs:
docker compose logs -f seismo-backend
Restart service:
docker compose restart
Rebuild and restart:
docker compose up -d --build
Stop and remove containers:
docker compose down
Remove containers and volumes:
docker compose down -v
Release Highlights
v0.2.1 — 2025-12-03
- Added the /settings roster manager with CSV export/import, live stats, and danger-zone table reset actions.
- Deployed/Benched/Retired/Ignored tabs now have dedicated HTMX partials, sorting, and inline actions (edit, deploy toggle, ignore, delete).
- Unit detail pages expose device-type specific metadata (calibration windows, modem pairing, IP/phone fields) with a refreshed editing experience.
- Snapshot summary and dashboard counts now focus on deployed units and include address/coordinate data for mapping widgets.
v0.2.0 — 2025-12-03
- Introduced device-type aware roster schema (seismograph vs modem), plus a migration and the create_test_db.py helper for new installs.
- Added Ignore list model/endpoints to quarantine noisy emitters directly from the roster.
- Roster page gained Add Unit + CSV Import modals, HTMX-driven updates, and unknown emitter callouts.
- Snapshot service now returns active/benched/retired/unknown buckets containing richer metadata for the dashboard and roster tabs.
v0.1.1 — 2025-12-02
- Roster Editing API: Full CRUD operations for managing your fleet roster
- CSV Import: Bulk upload roster data from CSV files
- Enhanced Data Model: Added project_id and location fields to roster
- Bug Fixes: Improved database session management and error handling
- Dashboard Improvements: Separate views for Active, Benched, and Retired units
See CHANGELOG.md for the full release notes.
Future Enhancements
- Email/SMS alerts for missing units
- Historical data tracking and reporting
- Multi-user authentication
- PostgreSQL support for larger deployments
- Advanced filtering and search
- Export roster to various formats
- Automated backup and restore
License
MIT
Version
Current: 0.2.1 — Settings & roster manager refresh (2025-12-03)
Previous: 0.2.0 — Device-type aware roster + ignore list (2025-12-03)
0.1.1 — Roster Management & CSV Import (2025-12-02)
0.1.0 — Initial Release (2024-11-20)