# Seismo Fleet Manager - Backend v0.1
Backend API for managing seismograph fleet status. Track multiple seismographs calling in data from remote deployments, monitor their status, and manage your fleet through a unified database.
## Features
- **Fleet Monitoring**: Track all seismograph units in one place
- **Status Management**: Automatically mark units as OK, Pending (no report for >12h), or Missing (no report for >24h); see the sketch after this list
- **Data Ingestion**: Accept reports from emitter scripts via REST API
- **SQLite Storage**: Lightweight, file-based database for easy deployment
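How those thresholds translate into a status is straightforward; here is a minimal sketch, assuming a hypothetical `derive_status` helper (the backend's actual logic may differ):
```python
from datetime import datetime, timedelta

def derive_status(last_seen: datetime, now: datetime | None = None) -> str:
    """Map the time since a unit's last report onto OK / Pending / Missing.

    Illustrative only; the real backend may implement this differently.
    """
    now = now or datetime.utcnow()
    age = now - last_seen
    if age > timedelta(hours=24):
        return "Missing"
    if age > timedelta(hours=12):
        return "Pending"
    return "OK"
```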
## Tech Stack
- **FastAPI**: Modern, fast web framework
- **SQLAlchemy**: SQL toolkit and ORM
- **SQLite**: Lightweight database
- **uvicorn**: ASGI server
- **Docker**: Containerization for easy deployment
## Quick Start with Docker Compose (Recommended)
### Prerequisites
- Docker and Docker Compose installed
### Running the Application
1. **Start the service:**
```bash
docker compose up -d
```
2. **Check logs:**
```bash
docker compose logs -f
```
3. **Stop the service:**
```bash
docker compose down
```
Once started, the API is available at `http://localhost:8000`.
### Data Persistence
The SQLite database is stored in the `./data` directory, which is mounted as a volume. Your data will persist even if you restart or rebuild the container.
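For reference, the volume mount described above corresponds to a compose definition roughly like this sketch (the service name `seismo-backend` matches the log command later in this README; the in-container path `/app/data` is an assumption and may differ from the actual `docker-compose.yml`):
```yaml
services:
  seismo-backend:
    build: .
    ports:
      - "8000:8000"
    volumes:
      # Host ./data <-> container /app/data (assumed path) keeps the
      # SQLite file outside the container filesystem.
      - ./data:/app/data
```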
## Local Development (Without Docker)
### Prerequisites
- Python 3.11+
- pip
### Setup
1. **Install dependencies:**
```bash
pip install -r requirements.txt
```
2. **Run the server:**
```bash
python main.py
```
Or with auto-reload:
```bash
uvicorn main:app --reload
```
The API will be available at `http://localhost:8000`.
## API Endpoints
### Root
- **GET** `/` - Health check
### Emitter Report
- **POST** `/emitters/report`
- Submit status report from a seismograph unit
- **Request Body:**
```json
{
  "unit": "SEISMO-001",
  "unit_type": "series3",
  "timestamp": "2025-11-20T10:30:00",
  "file": "event_20251120_103000.dat",
  "status": "OK"
}
```
- **Response:**
```json
{
  "message": "Emitter report received",
  "unit": "SEISMO-001",
  "status": "OK"
}
```
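On the FastAPI side, this payload maps onto a Pydantic model roughly like the sketch below; the field names follow the example payload, but the actual schema in the codebase may differ:
```python
from datetime import datetime
from pydantic import BaseModel

class EmitterReport(BaseModel):
    """Illustrative request schema for POST /emitters/report."""
    unit: str            # unit identifier, e.g. "SEISMO-001"
    unit_type: str       # e.g. "series3"
    timestamp: datetime  # ISO 8601 timestamp, parsed automatically
    file: str            # name of the last data file processed
    status: str          # reported status, e.g. "OK"
```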
### Fleet Status
- **GET** `/fleet/status`
- Retrieve status of all seismograph units
- **Response:**
```json
[
  {
    "id": "SEISMO-001",
    "unit_type": "series3",
    "last_seen": "2025-11-20T10:30:00",
    "last_file": "event_20251120_103000.dat",
    "status": "OK",
    "notes": null
  }
]
```
## API Documentation
Once running, interactive API documentation is available at:
- **Swagger UI**: http://localhost:8000/docs
- **ReDoc**: http://localhost:8000/redoc
## Testing the API
### Using curl
**Submit a report:**
```bash
curl -X POST http://localhost:8000/emitters/report \
-H "Content-Type: application/json" \
-d '{
"unit": "SEISMO-001",
"unit_type": "series3",
"timestamp": "2025-11-20T10:30:00",
"file": "event_20251120_103000.dat",
"status": "OK"
}'
```
**Get fleet status:**
```bash
curl http://localhost:8000/fleet/status
```
### Using Python
```python
import requests
from datetime import datetime

# Submit a report
response = requests.post(
    "http://localhost:8000/emitters/report",
    json={
        "unit": "SEISMO-001",
        "unit_type": "series3",
        "timestamp": datetime.utcnow().isoformat(),
        "file": "event_20251120_103000.dat",
        "status": "OK",
    },
)
print(response.json())

# Get fleet status
response = requests.get("http://localhost:8000/fleet/status")
print(response.json())
```
## Data Model
### Emitters Table
| Field | Type | Description |
|-------|------|-------------|
| id | string | Unit identifier (primary key) |
| unit_type | string | Type of seismograph (e.g., "series3") |
| last_seen | datetime | Last report timestamp |
| last_file | string | Last file processed |
| status | string | Current status: OK, Pending, Missing |
| notes | string | Optional notes (nullable) |
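A SQLAlchemy model matching this table could look like the following sketch (the real definition lives in `models.py` and may use different names or the SQLAlchemy 2.0 `Mapped` syntax):
```python
from sqlalchemy import Column, DateTime, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Emitter(Base):
    """Illustrative model mirroring the Emitters table above."""
    __tablename__ = "emitters"

    id = Column(String, primary_key=True)  # unit identifier
    unit_type = Column(String)             # e.g. "series3"
    last_seen = Column(DateTime)           # last report timestamp
    last_file = Column(String)             # last file processed
    status = Column(String)                # OK, Pending, Missing
    notes = Column(String, nullable=True)  # optional notes
```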
## Project Structure
```
seismo-fleet-manager/
├── main.py # FastAPI app entry point
├── database.py # SQLAlchemy database configuration
├── models.py # Database models
├── routes.py # API endpoints
├── requirements.txt # Python dependencies
├── Dockerfile # Docker container definition
├── docker-compose.yml # Docker Compose configuration
├── .dockerignore # Docker ignore rules
└── data/ # SQLite database directory (created at runtime)
```
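Since the SQLite file lives under `./data`, the database configuration plausibly resembles the sketch below (the filename `seismo.db` is an assumption; see `database.py` for the actual setup):
```python
import os

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Keep the SQLite file inside ./data so the Docker volume mount persists it.
DATA_DIR = "data"
os.makedirs(DATA_DIR, exist_ok=True)

engine = create_engine(
    f"sqlite:///./{DATA_DIR}/seismo.db",  # "seismo.db" is an assumed filename
    connect_args={"check_same_thread": False},  # needed for SQLite with FastAPI
)
SessionLocal = sessionmaker(bind=engine, autocommit=False, autoflush=False)
```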
## Docker Commands
**Build the image:**
```bash
docker compose build
```
**Start in foreground:**
```bash
docker compose up
```
**Start in background:**
```bash
docker compose up -d
```
**View logs:**
```bash
docker compose logs -f seismo-backend
```
**Restart service:**
```bash
docker compose restart
```
**Stop and remove containers:**
```bash
docker compose down
```
**Remove containers and volumes:**
```bash
docker compose down -v
```
## Future Enhancements
- Automated status updates based on last_seen timestamps
- Web-based dashboard for fleet monitoring
- Email/SMS alerts for missing units
- Historical data tracking and reporting
- Multi-user authentication
- PostgreSQL support for larger deployments
## License
MIT
## Version
0.1.0 - Initial Release