Containerize backend with Docker Compose

Added Docker support for easy deployment:
- Dockerfile: Python 3.11 slim image with FastAPI app
- docker-compose.yml: Service definition with volume mounting for data persistence
- .dockerignore: Exclude unnecessary files from Docker build
- database.py: Updated to store SQLite DB in ./data directory for volume persistence
- .gitignore: Added entries for database files and data directory
- README.md: Comprehensive documentation with Docker and local setup instructions

The application can now be run with: docker compose up -d
The database persists in the ./data directory, which is mounted as a volume.
Author: Claude
Date: 2025-11-20 18:46:46 +00:00
Commit: 05c63367c8 (parent: f976e4e893)

6 changed files with 321 additions and 3 deletions

.dockerignore (new file, 19 lines)

@@ -0,0 +1,19 @@
__pycache__
*.pyc
*.pyo
*.pyd
.Python
*.so
*.egg
*.egg-info
dist
build
.git
.gitignore
*.db
*.db-journal
.env
.venv
venv/
ENV/
data/

.gitignore (6 changed lines)

@@ -205,3 +205,9 @@ cython_debug/
marimo/_static/
marimo/_lsp/
__marimo__/

# Seismo Fleet Manager
# SQLite database files
*.db
*.db-journal
data/

Dockerfile (new file, 19 lines)

@@ -0,0 +1,19 @@
FROM python:3.11-slim

# Set working directory
WORKDIR /app

# Copy requirements first for better caching
COPY requirements.txt .

# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Expose port
EXPOSE 8000

# Run the application
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]

README.md (251 changed lines)

@@ -1,2 +1,249 @@
# Seismo Fleet Manager - Backend v0.1
Backend API for managing seismograph fleet status. Track multiple seismographs calling in data from remote deployments, monitor their status, and manage your fleet through a unified database.
## Features
- **Fleet Monitoring**: Track all seismograph units in one place
- **Status Management**: Automatically mark units as OK, Pending (>12h), or Missing (>24h); see the sketch after this list
- **Data Ingestion**: Accept reports from emitter scripts via REST API
- **SQLite Storage**: Lightweight, file-based database for easy deployment
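The status thresholds above are not yet enforced in code (automated status updates appear under Future Enhancements); the following is a minimal sketch, assuming naive UTC timestamps like those stored by the API, of how a status could be derived from `last_seen`:
```python
from datetime import datetime, timedelta

def derive_status(last_seen: datetime, now: datetime | None = None) -> str:
    """Sketch only: map the age of the last report to OK / Pending / Missing."""
    now = now or datetime.utcnow()
    age = now - last_seen
    if age > timedelta(hours=24):
        return "Missing"
    if age > timedelta(hours=12):
        return "Pending"
    return "OK"
```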
## Tech Stack
- **FastAPI**: Modern, fast web framework
- **SQLAlchemy**: SQL toolkit and ORM
- **SQLite**: Lightweight database
- **uvicorn**: ASGI server
- **Docker**: Containerization for easy deployment
## Quick Start with Docker Compose (Recommended)
### Prerequisites
- Docker and Docker Compose installed
### Running the Application
1. **Start the service:**
```bash
docker compose up -d
```
2. **Check logs:**
```bash
docker compose logs -f
```
3. **Stop the service:**
```bash
docker compose down
```
The API will be available at `http://localhost:8000`
### Data Persistence
The SQLite database is stored in the `./data` directory, which is mounted as a volume. Your data will persist even if you restart or rebuild the container.
## Local Development (Without Docker)
### Prerequisites
- Python 3.11+
- pip
### Setup
1. **Install dependencies:**
```bash
pip install -r requirements.txt
```
2. **Run the server:**
```bash
python main.py
```
Or with auto-reload:
```bash
uvicorn main:app --reload
```
The API will be available at `http://localhost:8000`
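Running `python main.py` directly implies that `main.py` launches uvicorn itself; the actual file is not part of this diff, so the snippet below is only a sketch of what such an entry point typically looks like (route registration omitted):
```python
# main.py (illustrative sketch, not the file from this repository)
import uvicorn
from fastapi import FastAPI

app = FastAPI(title="Seismo Fleet Manager")

@app.get("/")
def root():
    # Simple health check endpoint (see "Root" under API Endpoints)
    return {"status": "ok"}

if __name__ == "__main__":
    # Enables `python main.py` as an alternative to `uvicorn main:app`
    uvicorn.run("main:app", host="0.0.0.0", port=8000)
```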
## API Endpoints
### Root
- **GET** `/` - Health check
### Emitter Report
- **POST** `/emitters/report`
- Submit status report from a seismograph unit
- **Request Body:**
```json
{
  "unit": "SEISMO-001",
  "unit_type": "series3",
  "timestamp": "2025-11-20T10:30:00",
  "file": "event_20251120_103000.dat",
  "status": "OK"
}
```
- **Response:**
```json
{
  "message": "Emitter report received",
  "unit": "SEISMO-001",
  "status": "OK"
}
```
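The request body above maps naturally onto a Pydantic schema; the real model lives elsewhere in the codebase and is not shown in this diff, so the class and field names below are assumptions based on the JSON fields:
```python
# Illustrative sketch of the report payload schema (not the repository's actual model)
from datetime import datetime
from pydantic import BaseModel

class EmitterReport(BaseModel):
    unit: str            # unit identifier, e.g. "SEISMO-001"
    unit_type: str       # e.g. "series3"
    timestamp: datetime  # ISO 8601 timestamp of the report
    file: str            # name of the last file processed
    status: str          # reported status, e.g. "OK"
```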
### Fleet Status
- **GET** `/fleet/status`
- Retrieve status of all seismograph units
- **Response:**
```json
[
  {
    "id": "SEISMO-001",
    "unit_type": "series3",
    "last_seen": "2025-11-20T10:30:00",
    "last_file": "event_20251120_103000.dat",
    "status": "OK",
    "notes": null
  }
]
```
## API Documentation
Once running, interactive API documentation is available at:
- **Swagger UI**: http://localhost:8000/docs
- **ReDoc**: http://localhost:8000/redoc
## Testing the API
### Using curl
**Submit a report:**
```bash
curl -X POST http://localhost:8000/emitters/report \
  -H "Content-Type: application/json" \
  -d '{
    "unit": "SEISMO-001",
    "unit_type": "series3",
    "timestamp": "2025-11-20T10:30:00",
    "file": "event_20251120_103000.dat",
    "status": "OK"
  }'
```
**Get fleet status:**
```bash
curl http://localhost:8000/fleet/status
```
### Using Python
```python
import requests
from datetime import datetime
# Submit report
response = requests.post(
    "http://localhost:8000/emitters/report",
    json={
        "unit": "SEISMO-001",
        "unit_type": "series3",
        "timestamp": datetime.utcnow().isoformat(),
        "file": "event_20251120_103000.dat",
        "status": "OK"
    }
)
print(response.json())

# Get fleet status
response = requests.get("http://localhost:8000/fleet/status")
print(response.json())
```
## Data Model
### Emitters Table
| Field | Type | Description |
|-------|------|-------------|
| id | string | Unit identifier (primary key) |
| unit_type | string | Type of seismograph (e.g., "series3") |
| last_seen | datetime | Last report timestamp |
| last_file | string | Last file processed |
| status | string | Current status: OK, Pending, Missing |
| notes | string | Optional notes (nullable) |
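`models.py` is not included in this diff; below is a sketch of a SQLAlchemy model matching the table above, assuming `database.py` exposes `Base = declarative_base()`:
```python
# models.py (illustrative sketch based on the Emitters table above)
from sqlalchemy import Column, DateTime, String

from database import Base  # assumed: Base = declarative_base() in database.py

class Emitter(Base):
    __tablename__ = "emitters"

    id = Column(String, primary_key=True, index=True)  # unit identifier
    unit_type = Column(String)                          # e.g. "series3"
    last_seen = Column(DateTime)                        # last report timestamp
    last_file = Column(String)                          # last file processed
    status = Column(String)                             # OK, Pending, Missing
    notes = Column(String, nullable=True)               # optional notes
```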
## Project Structure
```
seismo-fleet-manager/
├── main.py # FastAPI app entry point
├── database.py # SQLAlchemy database configuration
├── models.py # Database models
├── routes.py # API endpoints
├── requirements.txt # Python dependencies
├── Dockerfile # Docker container definition
├── docker-compose.yml # Docker Compose configuration
├── .dockerignore # Docker ignore rules
└── data/ # SQLite database directory (created at runtime)
```
## Docker Commands
**Build the image:**
```bash
docker compose build
```
**Start in foreground:**
```bash
docker compose up
```
**Start in background:**
```bash
docker compose up -d
```
**View logs:**
```bash
docker compose logs -f seismo-backend
```
**Restart service:**
```bash
docker compose restart
```
**Stop and remove containers:**
```bash
docker compose down
```
**Remove containers and named volumes** (the bind-mounted `./data` directory is not deleted):
```bash
docker compose down -v
```
## Future Enhancements
- Automated status updates based on last_seen timestamps
- Web-based dashboard for fleet monitoring
- Email/SMS alerts for missing units
- Historical data tracking and reporting
- Multi-user authentication
- PostgreSQL support for larger deployments
## License
MIT
## Version
0.1.0 - Initial Release

database.py

@@ -1,8 +1,12 @@
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
import os

# Ensure data directory exists
os.makedirs("data", exist_ok=True)

SQLALCHEMY_DATABASE_URL = "sqlite:///./data/seismo_fleet.db"

engine = create_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}

docker-compose.yml (new file)

@@ -0,0 +1,23 @@
version: '3.8'

services:
  seismo-backend:
    build: .
    container_name: seismo-fleet-manager
    ports:
      - "8000:8000"
    volumes:
      # Persist SQLite database on the host via a bind mount
      - ./data:/app/data
    environment:
      - PYTHONUNBUFFERED=1
    restart: unless-stopped
    healthcheck:
      # python:3.11-slim does not include curl, so probe the root endpoint with the stdlib
      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8000/')"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s