Merge pull request #1 from serversdwn/claude/seismo-backend-server-01FsCdpT2WT4B342V3KtWx38

Build backend server for Seismo Fleet Manager v0.1
serversdwn
2025-11-20 17:06:24 -05:00
committed by GitHub
10 changed files with 488 additions and 2 deletions

.dockerignore (new file, 19 lines)

@@ -0,0 +1,19 @@
__pycache__
*.pyc
*.pyo
*.pyd
.Python
*.so
*.egg
*.egg-info
dist
build
.git
.gitignore
*.db
*.db-journal
.env
.venv
venv/
ENV/
data/

.gitignore (vendored, +6 lines)

@@ -205,3 +205,9 @@ cython_debug/
marimo/_static/
marimo/_lsp/
__marimo__/
# Seismo Fleet Manager
# SQLite database files
*.db
*.db-journal
data/

Dockerfile (new file, 19 lines)

@@ -0,0 +1,19 @@
FROM python:3.11-slim

# Set working directory
WORKDIR /app

# Copy requirements first for better caching
COPY requirements.txt .

# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Expose port
EXPOSE 8000

# Run the application
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]

README.md (251 lines changed)

@@ -1,2 +1,249 @@
# Seismo Fleet Manager - Backend v0.1

Backend API for managing seismograph fleet status. Track multiple seismographs calling in data from remote deployments, monitor their status, and manage your fleet through a unified database.
## Features
- **Fleet Monitoring**: Track all seismograph units in one place
- **Status Management**: Automatically mark units as OK, Pending (no report for >12h), or Missing (no report for >24h); see the sketch after this list
- **Data Ingestion**: Accept reports from emitter scripts via REST API
- **SQLite Storage**: Lightweight, file-based database for easy deployment
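The routes in this commit store whatever status each emitter reports; the 12h/24h thresholds above imply a time-based derivation along these lines (a minimal sketch that recomputes status from `last_seen`, not code from this commit):

```python
from datetime import datetime, timedelta

# Hypothetical helper (not in this commit): derive a unit's status from its
# last_seen timestamp, using the thresholds from the feature list above.
def derive_status(last_seen: datetime, now: datetime | None = None) -> str:
    now = now or datetime.utcnow()
    age = now - last_seen
    if age > timedelta(hours=24):
        return "Missing"
    if age > timedelta(hours=12):
        return "Pending"
    return "OK"
```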
## Tech Stack
- **FastAPI**: Modern, fast web framework
- **SQLAlchemy**: SQL toolkit and ORM
- **SQLite**: Lightweight database
- **uvicorn**: ASGI server
- **Docker**: Containerization for easy deployment
## Quick Start with Docker Compose (Recommended)
### Prerequisites
- Docker and Docker Compose installed
### Running the Application
1. **Start the service:**
   ```bash
   docker compose up -d
   ```
2. **Check logs:**
   ```bash
   docker compose logs -f
   ```
3. **Stop the service:**
   ```bash
   docker compose down
   ```
The API will be available at `http://localhost:8001` (the Compose file maps host port 8001 to the container's port 8000)
### Data Persistence
The SQLite database is stored in the `./data` directory, which is mounted as a volume. Your data will persist even if you restart or rebuild the container.
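To verify persistence from the host, the database file can be inspected directly with Python's standard `sqlite3` module (a minimal sketch; assumes at least one report has been ingested):

```python
import sqlite3

# Open the SQLite file created via the ./data volume mount
conn = sqlite3.connect("data/seismo_fleet.db")
for row in conn.execute("SELECT id, status, last_seen FROM emitters"):
    print(row)
conn.close()
```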
## Local Development (Without Docker)
### Prerequisites
- Python 3.11+
- pip
### Setup
1. **Install dependencies:**
   ```bash
   pip install -r requirements.txt
   ```
2. **Run the server:**
   ```bash
   python main.py
   ```
   Or with auto-reload:
   ```bash
   uvicorn main:app --reload
   ```
The API will be available at `http://localhost:8000`
## API Endpoints
### Root
- **GET** `/` - Health check
### Emitter Report
- **POST** `/emitters/report` - Submit a status report from a seismograph unit

  **Request Body:**
  ```json
  {
    "unit": "SEISMO-001",
    "unit_type": "series3",
    "timestamp": "2025-11-20T10:30:00",
    "file": "event_20251120_103000.dat",
    "status": "OK"
  }
  ```

  **Response:**
  ```json
  {
    "message": "Emitter report received",
    "unit": "SEISMO-001",
    "status": "OK"
  }
  ```
### Fleet Status
- **GET** `/fleet/status` - Retrieve the status of all seismograph units

  **Response:**
  ```json
  [
    {
      "id": "SEISMO-001",
      "unit_type": "series3",
      "last_seen": "2025-11-20T10:30:00",
      "last_file": "event_20251120_103000.dat",
      "status": "OK",
      "notes": null
    }
  ]
  ```
## API Documentation
Once running, interactive API documentation is available at:
- **Swagger UI**: http://localhost:8000/docs
- **ReDoc**: http://localhost:8000/redoc
## Testing the API
### Using curl
**Submit a report:**
```bash
curl -X POST http://localhost:8000/emitters/report \
  -H "Content-Type: application/json" \
  -d '{
    "unit": "SEISMO-001",
    "unit_type": "series3",
    "timestamp": "2025-11-20T10:30:00",
    "file": "event_20251120_103000.dat",
    "status": "OK"
  }'
```
**Get fleet status:**
```bash
curl http://localhost:8000/fleet/status
```
### Using Python
```python
import requests
from datetime import datetime

# Submit report
response = requests.post(
    "http://localhost:8000/emitters/report",
    json={
        "unit": "SEISMO-001",
        "unit_type": "series3",
        "timestamp": datetime.utcnow().isoformat(),
        "file": "event_20251120_103000.dat",
        "status": "OK"
    }
)
print(response.json())

# Get fleet status
response = requests.get("http://localhost:8000/fleet/status")
print(response.json())
```
## Data Model
### Emitters Table
| Field | Type | Description |
|-------|------|-------------|
| id | string | Unit identifier (primary key) |
| unit_type | string | Type of seismograph (e.g., "series3") |
| last_seen | datetime | Last report timestamp |
| last_file | string | Last file processed |
| status | string | Current status: OK, Pending, Missing |
| notes | string | Optional notes (nullable) |
## Project Structure
```
seismo-fleet-manager/
├── main.py # FastAPI app entry point
├── database.py # SQLAlchemy database configuration
├── models.py # Database models
├── routes.py # API endpoints
├── requirements.txt # Python dependencies
├── Dockerfile # Docker container definition
├── docker-compose.yml # Docker Compose configuration
├── .dockerignore # Docker ignore rules
└── data/ # SQLite database directory (created at runtime)
```
## Docker Commands
**Build the image:**
```bash
docker compose build
```
**Start in foreground:**
```bash
docker compose up
```
**Start in background:**
```bash
docker compose up -d
```
**View logs:**
```bash
docker compose logs -f seismo-backend
```
**Restart service:**
```bash
docker compose restart
```
**Stop and remove containers:**
```bash
docker compose down
```
**Remove containers and volumes:**
```bash
docker compose down -v
```
## Future Enhancements
- Automated status updates based on last_seen timestamps
- Web-based dashboard for fleet monitoring
- Email/SMS alerts for missing units
- Historical data tracking and reporting
- Multi-user authentication
- PostgreSQL support for larger deployments
## License
MIT
## Version
0.1.0 - Initial Release

database.py (new file, 26 lines)

@@ -0,0 +1,26 @@
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, declarative_base
import os

# Ensure data directory exists
os.makedirs("data", exist_ok=True)

SQLALCHEMY_DATABASE_URL = "sqlite:///./data/seismo_fleet.db"

engine = create_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()

def get_db():
    """Dependency for database sessions"""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
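# Note: get_db exists so FastAPI's Depends() can open and close a session per
# request. For ad-hoc scripts, the factory can also be used directly, e.g.
# (illustrative):
#     with SessionLocal() as db:
#         rows = db.query(Emitter).all()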

docker-compose.yml (new file, 23 lines)

@@ -0,0 +1,23 @@
version: '3.8'

services:
  seismo-backend:
    build: .
    container_name: seismo-fleet-manager
    ports:
      - "8001:8000"
    volumes:
      # Persist SQLite database
      - ./data:/app/data
    environment:
      - PYTHONUNBUFFERED=1
    restart: unless-stopped
    healthcheck:
      # python:3.11-slim does not ship curl, so probe with Python's stdlib
      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8000/')"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s

main.py (new file, 41 lines)

@@ -0,0 +1,41 @@
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from database import engine, Base
from routes import router

# Create database tables
Base.metadata.create_all(bind=engine)

# Initialize FastAPI app
app = FastAPI(
    title="Seismo Fleet Manager",
    description="Backend API for managing seismograph fleet status",
    version="0.1.0"
)

# Configure CORS (adjust origins as needed for your deployment)
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # In production, specify exact origins
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Include API routes
app.include_router(router)

@app.get("/")
def root():
    """Root endpoint - health check"""
    return {
        "message": "Seismo Fleet Manager API v0.1",
        "status": "running"
    }

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)

models.py (new file, 18 lines)

@@ -0,0 +1,18 @@
from sqlalchemy import Column, String, DateTime
from datetime import datetime
from database import Base

class Emitter(Base):
    """Emitter model representing a seismograph unit in the fleet"""
    __tablename__ = "emitters"

    id = Column(String, primary_key=True, index=True)
    unit_type = Column(String, nullable=False)
    last_seen = Column(DateTime, default=datetime.utcnow)
    last_file = Column(String, nullable=False)
    status = Column(String, nullable=False)  # OK, Pending, Missing
    notes = Column(String, nullable=True)

    def __repr__(self):
        return f"<Emitter(id={self.id}, type={self.unit_type}, status={self.status})>"

requirements.txt (new file, 5 lines)

@@ -0,0 +1,5 @@
fastapi==0.104.1
uvicorn[standard]==0.24.0
sqlalchemy==2.0.23
pydantic==2.5.0
python-multipart==0.0.6

routes.py (new file, 82 lines)

@@ -0,0 +1,82 @@
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from pydantic import BaseModel, ConfigDict
from datetime import datetime
from typing import Optional, List

from database import get_db
from models import Emitter

router = APIRouter()

# Pydantic schemas for request/response validation
class EmitterReport(BaseModel):
    unit: str
    unit_type: str
    timestamp: str
    file: str
    status: str

class EmitterResponse(BaseModel):
    id: str
    unit_type: str
    last_seen: datetime
    last_file: str
    status: str
    notes: Optional[str] = None

    model_config = ConfigDict(from_attributes=True)

@router.post("/emitters/report", status_code=200)
def report_emitter(report: EmitterReport, db: Session = Depends(get_db)):
    """
    Endpoint for emitters to report their status.
    Creates a new emitter if it doesn't exist, or updates an existing one.
    """
    try:
        # Parse the timestamp; the replace() keeps a trailing 'Z' (UTC) parseable
        timestamp = datetime.fromisoformat(report.timestamp.replace('Z', '+00:00'))
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid timestamp format")

    # Check if emitter already exists
    emitter = db.query(Emitter).filter(Emitter.id == report.unit).first()

    if emitter:
        # Update existing emitter
        emitter.unit_type = report.unit_type
        emitter.last_seen = timestamp
        emitter.last_file = report.file
        emitter.status = report.status
    else:
        # Create new emitter
        emitter = Emitter(
            id=report.unit,
            unit_type=report.unit_type,
            last_seen=timestamp,
            last_file=report.file,
            status=report.status
        )
        db.add(emitter)

    db.commit()
    db.refresh(emitter)

    return {
        "message": "Emitter report received",
        "unit": emitter.id,
        "status": emitter.status
    }

@router.get("/fleet/status", response_model=List[EmitterResponse])
def get_fleet_status(db: Session = Depends(get_db)):
    """
    Returns a list of all emitters and their current status.
    """
    emitters = db.query(Emitter).all()
    return emitters