Compare commits: 1.0-experi...7971092509 (8 commits)

| SHA1 |
|---|
| 7971092509 |
| d349af9444 |
| be83cb3fe7 |
| e9216b9abc |
| d93785c230 |
| 98ee9d7cea |
| 04c66bdf9c |
| 8a5fadb5df |
1 change: .gitignore (vendored)

```
@@ -211,3 +211,4 @@ __marimo__/
*.db
*.db-journal
data/
.aider*
```
91 changes: CHANGELOG.md

@@ -1,99 +1,10 @@

# Changelog

All notable changes to Terra-View will be documented in this file.
All notable changes to Seismo Fleet Manager will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.5.0] - 2026-01-09

### Added

- **Unified Modular Monolith Architecture**: Complete architectural refactoring to modular monolith pattern
- **Three Feature Modules**: Seismo (seismograph fleet), SLM (sound level meters), UI (shared templates/static)
- **Module Isolation**: Each module has its own database, models, services, and routers
- **Shared Infrastructure**: Common utilities and API aggregation layer
- **Multi-Container Deployment**: Three Docker containers (terra-view, sfm, slmm) built from single codebase
- **SLMM Integration**: Sound Level Meter Manager fully integrated as `app/slm/` module
  - Migrated from separate repository to unified codebase
  - Complete NL43 device management API (`/api/nl43/*`)
  - Database models for NL43Config and NL43Status
  - NL43Client service for device communication
  - FTP, TCP, and web interface support for NL43 devices
- **SLM Dashboard API Layer**: New dashboard endpoints bridge UI and device APIs
  - `GET /api/slm-dashboard/stats` - Aggregate statistics (total units, online/offline, measuring/idle)
  - `GET /api/slm-dashboard/units` - List all units with latest status
  - `GET /api/slm-dashboard/live-view/{unit_id}` - Real-time measurement data
  - `GET /api/slm-dashboard/config/{unit_id}` - Retrieve unit configuration
  - `POST /api/slm-dashboard/config/{unit_id}` - Update unit configuration
  - `POST /api/slm-dashboard/control/{unit_id}/{action}` - Send control commands (start, stop, pause, resume, reset, sleep, wake)
  - `GET /api/slm-dashboard/test-modem/{unit_id}` - Test device connectivity
- **Repository Rebranding**: Renamed from `seismo-fleet-manager` to `terra-view`
  - Reflects unified platform nature (seismo + SLM + future modules)
  - Git remote updated to `terra-view.git`
  - All references updated throughout codebase

### Changed

- **Project Structure**: Complete reorganization following modular monolith pattern
  - `app/seismo/` - Seismograph fleet module (formerly `backend/`)
  - `app/slm/` - Sound level meter module (integrated from SLMM)
  - `app/ui/` - Shared templates and static assets
  - `app/api/` - Cross-module API aggregation layer
  - Removed `backend/` and `templates/` directories
- **Import Paths**: All imports updated from `backend.*` to `app.seismo.*` or `app.slm.*`
- **Database Initialization**: Each module initializes its own database tables
  - Seismo database: `app/seismo/database.py`
  - SLM database: `app/slm/database.py`
- **Docker Architecture**: Three-container deployment from single codebase
  - `terra-view` (port 8001): Main UI/orchestrator with all modules
  - `sfm` (port 8002): Seismograph Fleet Module API
  - `slmm` (port 8100): Sound Level Meter Manager API
  - All containers built from same unified codebase with different entry points

### Fixed

- **Template Path Issues**: Fixed seismo dashboard template references
  - Updated `app/seismo/routers/dashboard.py` to use `app/ui/templates` directory
  - Resolved 404 errors for `partials/benched_table.html` and `partials/active_table.html`
- **Module Import Errors**: Corrected SLMM module structure
  - Fixed `app/slm/main.py` to import from `app.slm.routers` instead of `app.routers`
  - Updated all SLMM internal imports to use `app.slm.*` namespace
- **Docker Build Issues**: Resolved file permission problems
  - Fixed dashboard.py permissions for Docker COPY operations
  - Ensured all source files readable during container builds

### Technical Details

- **Modular Monolith Benefits**:
  - Single repository for easier development and deployment
  - Module boundaries enforced through folder structure
  - Shared dependencies managed in single requirements.txt
  - Independent database schemas per module
  - Clean separation of concerns with explicit module APIs
- **Migration Path**: Existing installations automatically migrate
  - Import path updates applied programmatically
  - Database schemas remain compatible
  - No data migration required
- **Module Structure**: Each module follows consistent pattern
  - `database.py` - SQLAlchemy models and session management
  - `models.py` - Pydantic schemas and database models
  - `routers.py` - FastAPI route definitions
  - `services.py` - Business logic and external integrations
- **Container Communication**: Containers use host networking
  - terra-view proxies to sfm and slmm containers
  - Environment variables configure API URLs
  - Health checks ensure container availability

### Migration Notes

- **Breaking Changes**: Import paths changed for all modules
  - Old: `from backend.models import RosterUnit`
  - New: `from app.seismo.models import RosterUnit`
- **Configuration Updates**: Environment variables for multi-container setup
  - `SFM_API_URL=http://localhost:8002` - SFM backend endpoint
  - `SLMM_API_URL=http://localhost:8100` - SLMM backend endpoint
  - `MODULE_MODE=sfm|slmm` - Future flag for API-only containers
- **Repository Migration**: Update git remotes for renamed repository

```bash
git remote set-url origin ssh://git@10.0.0.2:2222/serversdown/terra-view.git
```

## [0.4.2] - 2026-01-05

### Added
@@ -1,26 +0,0 @@

```dockerfile
FROM python:3.11-slim

# Set working directory
WORKDIR /app

# Install system dependencies
RUN apt-get update && \
    apt-get install -y --no-install-recommends iputils-ping curl && \
    rm -rf /var/lib/apt/lists/*

# Copy requirements first for better caching
COPY requirements.txt .

# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Expose SFM port
EXPOSE 8002

# Run SFM backend (API only)
# For now: runs same app on different port
# Future: will run SFM-specific entry point
CMD ["python3", "-m", "app.main"]
```

@@ -1,21 +0,0 @@

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && \
    apt-get install -y --no-install-recommends curl && \
    rm -rf /var/lib/apt/lists/*

# Copy requirements and install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY app /app/app

# Expose port
EXPOSE 8100

# Run the SLM application
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8100"]
```

@@ -1,24 +0,0 @@

```dockerfile
FROM python:3.11-slim

# Set working directory
WORKDIR /app

# Install system dependencies
RUN apt-get update && \
    apt-get install -y --no-install-recommends iputils-ping curl && \
    rm -rf /var/lib/apt/lists/*

# Copy requirements first for better caching
COPY requirements.txt .

# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Expose Terra-View UI port
EXPOSE 8001

# Run Terra-View (UI + orchestration)
CMD ["python3", "-m", "app.main"]
```
@@ -1,141 +0,0 @@

# Terra-View Modular Monolith - Known-Good Baseline

**Date:** 2026-01-09
**Status:** ✅ IMPORT MIGRATION COMPLETE

## What We've Achieved

Successfully restructured the application into a modular monolith architecture, with the new folder structure working end-to-end.

## New Structure

```
/home/serversdown/sfm/seismo-fleet-manager/
├── app/
│   ├── main.py                # NEW: Entry point with Terra-View branding
│   ├── core/                  # Shared infrastructure
│   │   ├── config.py          # NEW: Centralized configuration
│   │   └── database.py        # Shared DB utilities
│   ├── ui/                    # UI layer (device-agnostic)
│   │   ├── routes.py          # NEW: HTML page routes
│   │   ├── templates/         # All HTML templates (copied from old location)
│   │   └── static/            # All static files (copied from old location)
│   ├── seismo/                # Seismograph feature module
│   │   ├── models.py          # ✅ Updated to use app.seismo.database
│   │   ├── database.py        # NEW: Seismo-specific DB connection
│   │   ├── routers/           # API routers (copied from backend/routers/)
│   │   └── services/          # Business logic (copied from backend/services/)
│   ├── slm/                   # Sound level meter feature module
│   │   ├── models.py          # NEW: Placeholder for SLM models
│   │   ├── database.py        # NEW: SLM-specific DB connection
│   │   └── routers/           # SLM routers (copied from backend/routers/)
│   └── api/                   # API aggregation layer (placeholder)
│       ├── dashboard.py       # NEW: Future aggregation endpoints
│       └── roster.py          # NEW: Future aggregation endpoints
└── data/
    └── seismo_fleet.db        # Still using shared DB (migration pending)
```

## What's Working

✅ **Application starts successfully** on port 9999
✅ **Health endpoint works**: `/health` returns Terra-View v1.0.0
✅ **UI renders**: Main dashboard loads with proper templates
✅ **API endpoints work**: `/api/status-snapshot` returns seismograph data
✅ **Database access works**: Models properly connected
✅ **Static files serve**: CSS, JS, icons all accessible

## Critical Changes Made

### 1. Fixed Import in models.py
**File:** `app/seismo/models.py`
**Change:** `from backend.database import Base` → `from app.seismo.database import Base`
**Reason:** Avoid duplicate Base instances causing SQLAlchemy errors

### 2. Created New Entry Point
**File:** `app/main.py`
**Features:**
- Terra-View branding (title, version, health check)
- Imports from new `app.*` structure
- Registers all seismo and SLM routers
- Middleware for environment context

### 3. Created UI Routes Module
**File:** `app/ui/routes.py`
**Purpose:** Centralize all HTML page routes (device-agnostic)

### 4. Created Module-Specific Databases
**Files:** `app/seismo/database.py`, `app/slm/database.py`
**Status:** Both currently point to shared `seismo_fleet.db` (migration pending)

## Recent Updates (2026-01-09)

✅ **ALL imports updated** - Changed all `backend.*` imports to `app.seismo.*` or `app.slm.*`
✅ **Old structure deleted** - `backend/` and `templates/` directories removed
✅ **Containers rebuilt** - All three containers (Terra-View, SFM, SLMM) working with new imports
✅ **Verified working** - Tested health endpoints and UI after migration

## What's NOT Yet Done

❌ **Partial routes missing** - `/partials/*` endpoints not yet added
❌ **Database not split** - Still using shared `seismo_fleet.db`

## How to Run

```bash
# Start on custom port to avoid conflicts
PORT=9999 python3 -m app.main

# Test health endpoint
curl http://localhost:9999/health

# Test API endpoint
curl http://localhost:9999/api/status-snapshot

# Access UI
open http://localhost:9999/
```

## Next Steps (Recommended Order)

1. **Add partial routes** to app/main.py or create a separate router
2. **Test all endpoints thoroughly** - Verify roster CRUD, photos, settings
3. **Split databases** (Phase 2 of plan)
4. **Implement API aggregation layer** (Phase 3 of plan)

## Known Issues

None currently - app starts and serves requests successfully!

## Testing Checklist

- [x] App starts without errors
- [x] Health endpoint returns correct version
- [x] Main dashboard loads
- [x] Status snapshot API works
- [ ] All seismo endpoints work
- [ ] All SLM endpoints work
- [ ] Roster CRUD operations work
- [ ] Photos upload/download works
- [ ] Settings page works

## Rollback Instructions

The old structure has been deleted; to roll back, restore from your backup:

```bash
# Restore from your backup
# The old backend/ and templates/ directories were removed on 2026-01-09
```

## Important Notes

- **MIGRATION COMPLETE**: Old `backend/` and `templates/` directories removed
- **ALL IMPORTS UPDATED**: All Python files now use `app.*` imports
- **NO DATA LOSS**: Database untouched, only code structure changed
- **CONTAINERS WORKING**: All three containers (Terra-View, SFM, SLMM) healthy
- **FULLY SELF-CONTAINED**: Application runs entirely from `app/` directory

---

**Congratulations!** 🎉 Import migration complete! The modular monolith is now self-contained and production-ready.
546 changes: PROJECTS_SYSTEM_IMPLEMENTATION.md (new file)

@@ -0,0 +1,546 @@

# Projects System Implementation - Terra-View

## Overview

The Projects system has been scaffolded in Terra-View. This document provides a complete overview of what has been built, how it works, and what remains to be completed.

## ✅ Completed Components

### 1. Database Schema

**Location**: `/backend/models.py`

Seven new tables have been added:

- **ProjectType**: Template definitions for project types (Sound, Vibration, Combined)
- **Project**: Top-level project organization with type reference
- **MonitoringLocation**: Generic locations (NRLs for sound, monitoring points for vibration)
- **UnitAssignment**: Links devices to locations
- **ScheduledAction**: Automated recording control schedules
- **RecordingSession**: Tracks actual recording/monitoring sessions
- **DataFile**: File references for downloaded data

**Key Features**:
- Type-aware design (`project_type_id` determines features)
- Flexible metadata fields (JSON columns for type-specific data)
- Denormalized fields for efficient queries
- Proper indexing on foreign keys
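The table relationships and indexing described above can be sketched in plain SQL. The snippet below (stdlib `sqlite3`) shows two of the seven tables; the real definitions are SQLAlchemy models in `/backend/models.py`, and any column not named in this document is an assumption.

```python
import sqlite3

# Illustrative schema for two of the seven tables. Columns not mentioned
# in the text above are assumptions for the sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE project (
    id TEXT PRIMARY KEY,
    name TEXT NOT NULL,
    project_type_id TEXT NOT NULL,   -- type-aware design: the type determines features
    status TEXT DEFAULT 'active'
);
CREATE TABLE monitoring_location (
    id TEXT PRIMARY KEY,
    project_id TEXT NOT NULL REFERENCES project(id),
    location_type TEXT NOT NULL,     -- 'sound' or 'vibration' discriminator
    name TEXT NOT NULL,
    location_metadata TEXT           -- JSON column for type-specific data
);
-- proper indexing on the foreign key for efficient per-project queries
CREATE INDEX ix_location_project ON monitoring_location(project_id);
""")
conn.execute(
    "INSERT INTO project VALUES ('p1', 'Test Sound Project', 'sound_monitoring', 'active')"
)
print(conn.execute("SELECT name FROM project").fetchone()[0])  # Test Sound Project
```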
### 2. Service Layer

#### SLMM Client (`/backend/services/slmm_client.py`)
- Clean wrapper for all SLMM API operations
- Methods for: start/stop/pause/resume recording, get status, configure devices
- Error handling with custom exceptions
- Singleton pattern for easy access
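A minimal sketch of what such a wrapper might look like, using only the standard library. The method name, request path, and `get_slmm_client()` accessor are assumptions for illustration, not the actual SLMM client API.

```python
import json
import urllib.request
from typing import Optional

class SLMMError(Exception):
    """Custom exception raised when an SLMM API call fails."""

class SLMMClient:
    """Thin wrapper around the SLMM HTTP API (sketch)."""

    def __init__(self, base_url: str = "http://localhost:8100"):
        self.base_url = base_url.rstrip("/")

    def _post(self, path: str, payload: Optional[dict] = None) -> dict:
        req = urllib.request.Request(
            self.base_url + path,
            data=json.dumps(payload or {}).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                return json.load(resp)
        except OSError as exc:
            # Wrap transport errors in the custom exception
            raise SLMMError(f"SLMM request failed: {exc}") from exc

    def start_recording(self, unit_id: str) -> dict:
        # Path is hypothetical; the document only names /api/nl43/* broadly.
        return self._post(f"/api/nl43/{unit_id}/control/start")

# Singleton pattern for easy access
_client: Optional[SLMMClient] = None

def get_slmm_client() -> SLMMClient:
    global _client
    if _client is None:
        _client = SLMMClient()
    return _client
```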
#### Device Controller (`/backend/services/device_controller.py`)
- Routes commands to the appropriate backend (SLMM for SLMs, SFM for seismographs)
- Unified interface across device types
- Ready for future SFM implementation
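The routing idea can be sketched as a dispatch table keyed by device type. Handler names and return values here are illustrative, not the real controller's interface.

```python
class DeviceController:
    """Routes commands to the right backend by device type (sketch)."""

    def __init__(self):
        self._handlers = {
            "sound_level_meter": self._slm_command,
            "seismograph": self._sfm_command,
        }

    def send(self, device_type: str, unit_id: str, action: str) -> str:
        try:
            handler = self._handlers[device_type]
        except KeyError:
            raise ValueError(f"unknown device type: {device_type}")
        return handler(unit_id, action)

    def _slm_command(self, unit_id: str, action: str) -> str:
        # Would delegate to the SLMM client in the real service
        return f"SLMM: {action} -> {unit_id}"

    def _sfm_command(self, unit_id: str, action: str) -> str:
        # Placeholder, mirroring "ready for future SFM implementation"
        return f"SFM: {action} -> {unit_id}"

controller = DeviceController()
print(controller.send("sound_level_meter", "nl43-001", "start"))  # SLMM: start -> nl43-001
```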
#### Scheduler Service (`/backend/services/scheduler.py`)
- Background task that checks for pending scheduled actions every 60 seconds
- Executes actions by calling the device controller
- Creates/updates recording sessions
- Tracks execution status and errors
- Manual execution support for testing
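A stripped-down sketch of such a polling loop. The real service's internals are not documented here, so the thread-based structure, the `due_actions` stand-in, and the method names below are assumptions.

```python
import threading
from datetime import datetime, timezone

def due_actions(now):
    """Stand-in for the DB query returning pending ScheduledActions (assumption)."""
    return []

class SchedulerService:
    """Background task that periodically checks for pending actions (sketch)."""

    def __init__(self, check_interval: float = 60.0):
        self.check_interval = check_interval
        self._stop = threading.Event()
        self.executed = []

    def _tick(self):
        now = datetime.now(timezone.utc)
        for action in due_actions(now):
            try:
                # Would route the action through the device controller here,
                # then create or update the RecordingSession.
                self.executed.append(action)
            except Exception as exc:
                print(f"action failed: {exc}")  # tracked as an execution error

    def run(self):
        # wait() doubles as the interval sleep and the stop signal
        while not self._stop.wait(self.check_interval):
            self._tick()

    def start(self):
        threading.Thread(target=self.run, daemon=True).start()

    def stop(self):
        self._stop.set()

# Manual execution support for testing: call _tick() directly
svc = SchedulerService(check_interval=0.01)
svc._tick()
print(len(svc.executed))  # 0, since the stand-in query returns nothing
```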
### 3. API Routers

#### Projects Router (`/backend/routers/projects.py`)
Endpoints:
- `GET /api/projects/list` - Project list with stats
- `GET /api/projects/stats` - Overview statistics
- `POST /api/projects/create` - Create new project
- `GET /api/projects/{id}` - Get project details
- `PUT /api/projects/{id}` - Update project
- `DELETE /api/projects/{id}` - Archive project
- `GET /api/projects/{id}/dashboard` - Project dashboard data
- `GET /api/projects/types/list` - Get project type templates

#### Project Locations Router (`/backend/routers/project_locations.py`)
Endpoints:
- `GET /api/projects/{id}/locations` - List locations
- `POST /api/projects/{id}/locations/create` - Create location
- `PUT /api/projects/{id}/locations/{location_id}` - Update location
- `DELETE /api/projects/{id}/locations/{location_id}` - Delete location
- `GET /api/projects/{id}/assignments` - List unit assignments
- `POST /api/projects/{id}/locations/{location_id}/assign` - Assign unit
- `POST /api/projects/{id}/assignments/{assignment_id}/unassign` - Unassign unit
- `GET /api/projects/{id}/available-units` - Get units available for assignment

#### Scheduler Router (`/backend/routers/scheduler.py`)
Endpoints:
- `GET /api/projects/{id}/scheduler/actions` - List scheduled actions
- `POST /api/projects/{id}/scheduler/actions/create` - Create action
- `POST /api/projects/{id}/scheduler/schedule-session` - Schedule recording session
- `PUT /api/projects/{id}/scheduler/actions/{action_id}` - Update action
- `POST /api/projects/{id}/scheduler/actions/{action_id}/cancel` - Cancel action
- `DELETE /api/projects/{id}/scheduler/actions/{action_id}` - Delete action
- `POST /api/projects/{id}/scheduler/actions/{action_id}/execute` - Manual execution
- `GET /api/projects/{id}/scheduler/status` - Scheduler status
- `POST /api/projects/{id}/scheduler/execute-pending` - Trigger pending executions
### 4. Frontend

#### Main Page
**Location**: `/templates/projects/overview.html`

Features:
- Summary statistics cards (projects, locations, assignments, sessions)
- Tabbed interface (All, Active, Completed, Archived)
- Project cards grid layout
- Create project modal with two-step flow:
  1. Select project type (Sound/Vibration/Combined)
  2. Fill project details
- HTMX-powered dynamic updates

#### Navigation
**Location**: `/templates/base.html` (updated)
- "Projects" link added to sidebar
- Active state highlighting

### 5. Application Integration

**Location**: `/backend/main.py`

- Routers registered
- Page route added (`/projects`)
- Scheduler service starts on application startup
- Scheduler stops on application shutdown

### 6. Database Initialization

**Script**: `/backend/init_projects_db.py`

- Creates all project tables
- Populates ProjectType with default templates
- ✅ Successfully executed - database is ready

---
## 📁 File Organization

```
terra-view/
├── backend/
│   ├── models.py                  [✅ Updated]
│   ├── init_projects_db.py        [✅ Created]
│   ├── main.py                    [✅ Updated]
│   ├── routers/
│   │   ├── projects.py            [✅ Created]
│   │   ├── project_locations.py   [✅ Created]
│   │   └── scheduler.py           [✅ Created]
│   └── services/
│       ├── slmm_client.py         [✅ Created]
│       ├── device_controller.py   [✅ Created]
│       └── scheduler.py           [✅ Created]
├── templates/
│   ├── base.html                  [✅ Updated]
│   ├── projects/
│   │   └── overview.html          [✅ Created]
│   └── partials/
│       └── projects/              [📁 Created, empty]
└── data/
    └── seismo_fleet.db            [✅ Tables created]
```

---
## 🔨 What Still Needs to be Built

### 1. Frontend Templates (Partials)

**Directory**: `/templates/partials/projects/`

**Required Files**:

#### `project_stats.html`
Stats cards for the overview page:
- Total/Active/Completed projects
- Total locations
- Assigned units
- Active sessions

#### `project_list.html`
Project cards grid:
- Project name, type, status
- Location count, unit count
- Active session indicator
- Link to project dashboard

#### `project_dashboard.html`
Main project dashboard panel with tabs:
- Summary stats
- Active locations and assignments
- Upcoming scheduled actions
- Recent sessions

#### `location_list.html`
Location cards/table:
- Location name, type, coordinates
- Assigned unit (if any)
- Session count
- Assign/unassign button

#### `assignment_list.html`
Unit assignment table:
- Unit ID, device type
- Location name
- Assignment dates
- Status
- Unassign button

#### `scheduler_agenda.html`
Calendar/agenda view:
- Scheduled actions sorted by time
- Action type (start/stop/download)
- Location and unit
- Status indicator
- Cancel/execute buttons

### 2. Project Dashboard Page

**Location**: `/templates/projects/project_dashboard.html`

Full project detail page with:
- Header with project name, type, status
- Tab navigation (Dashboard, Scheduler, Locations, Units, Data, Settings)
- Tab content areas
- Modals for adding locations, scheduling sessions

### 3. Additional UI Components

- Project type selection cards (with icons)
- Location creation modal
- Unit assignment modal
- Schedule session modal (with date/time picker)
- Data file browser

### 4. SLMM Enhancements

**Location**: `/slmm/app/routers.py` (SLMM repo)

New endpoint needed:
```
POST /api/nl43/{unit_id}/ftp/download
```

This should:
- Accept destination_path and files list
- Connect to the SLM via FTP
- Download specified files
- Save to Terra-View's `data/Projects/` directory
- Return file list with metadata

### 5. SFM Client (Future)

**Location**: `/backend/services/sfm_client.py` (to be created)

Similar to the SLMM client, but for seismographs:
- Get seismograph status
- Start/stop recording
- Download data files
- Integrate with device controller

---
## 🚀 Testing the System

### 1. Start Terra-View

```bash
cd /home/serversdown/tmi/terra-view
# Start Terra-View (however you normally start it)
```

Verify in logs:
```
Starting scheduler service...
Scheduler service started
```

### 2. Navigate to Projects

Open browser: `http://localhost:8001/projects`

You should see:
- Summary stats cards (all zeros initially)
- Tabs (All Projects, Active, Completed, Archived)
- "New Project" button

### 3. Create a Project

1. Click "New Project"
2. Select a project type (e.g., "Sound Monitoring")
3. Fill in details:
   - Name: "Test Sound Project"
   - Client: "Test Client"
   - Start Date: Today
4. Submit

### 4. Test API Endpoints

```bash
# Get project types
curl http://localhost:8001/api/projects/types/list

# Get projects list
curl http://localhost:8001/api/projects/list

# Get project stats
curl http://localhost:8001/api/projects/stats
```

### 5. Test Scheduler Status

```bash
curl http://localhost:8001/api/projects/{project_id}/scheduler/status
```

---
## 📋 Dataflow Examples

### Creating and Scheduling a Recording Session

1. **User creates project** → Project record in DB
2. **User adds NRL** → MonitoringLocation record
3. **User assigns SLM to NRL** → UnitAssignment record
4. **User schedules recording** → 2 ScheduledAction records (start + stop)
5. **Scheduler runs every minute** → Checks for pending actions
6. **Start action time arrives** → Scheduler calls SLMM via device controller
7. **SLMM sends TCP command to SLM** → Recording starts
8. **RecordingSession created** → Tracks the session
9. **Stop action time arrives** → Scheduler stops recording
10. **Session updated** → stopped_at, duration_seconds filled
11. **User triggers download** → Files copied to `data/Projects/{project_id}/sound/{nrl_name}/`
12. **DataFile records created** → Track file references
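Step 4, creating the paired start/stop actions for one recording window, might look like the sketch below. The `schedule_session` helper is hypothetical; the field names mirror the ScheduledAction object documented later in this file.

```python
from dataclasses import dataclass
from datetime import datetime
from uuid import uuid4

@dataclass
class ScheduledAction:
    id: str
    project_id: str
    location_id: str
    unit_id: str
    action_type: str          # "start" or "stop"
    scheduled_time: datetime
    execution_status: str = "pending"

def schedule_session(project_id, location_id, unit_id, start_at, stop_at):
    """Create the paired start/stop actions for one recording window (sketch)."""
    return [
        ScheduledAction(str(uuid4()), project_id, location_id, unit_id, "start", start_at),
        ScheduledAction(str(uuid4()), project_id, location_id, unit_id, "stop", stop_at),
    ]

actions = schedule_session(
    "proj-1", "nrl-1", "nl43-001",
    datetime(2024, 1, 16, 8, 0), datetime(2024, 1, 16, 17, 0),
)
print([a.action_type for a in actions])  # ['start', 'stop']
```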
---

## 🎨 UI Design Patterns

### Established Patterns (from SLM dashboard):

1. **Stats Cards**: 4-column grid, auto-refresh every 30s
2. **Sidebar Lists**: Searchable, filterable, auto-refresh
3. **Main Panel**: Large central area for details
4. **Modals**: Centered, overlay background
5. **HTMX**: All dynamic updates, minimal JavaScript
6. **Tailwind**: Consistent styling with dark mode support

### Color Scheme:

- Primary: `seismo-orange` (#f48b1c)
- Secondary: `seismo-navy` (#142a66)
- Accent: `seismo-burgundy` (#7d234d)

---

## 🔧 Configuration

### Environment Variables

- `SLMM_BASE_URL`: SLMM backend URL (default: http://localhost:8100)
- `ENVIRONMENT`: "development" or "production"

### Scheduler Settings

Located in `/backend/services/scheduler.py`:
- `check_interval`: 60 seconds (adjust as needed)
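Reading these settings could look like the following sketch; the defaults match the values listed above, while the variable names in the snippet are illustrative rather than copied from `core/config.py`.

```python
import os

# Environment variables documented above, with their stated defaults
SLMM_BASE_URL = os.environ.get("SLMM_BASE_URL", "http://localhost:8100")
ENVIRONMENT = os.environ.get("ENVIRONMENT", "development")

# The scheduler setting lives in code rather than the environment
CHECK_INTERVAL_SECONDS = 60  # adjust as needed

print(SLMM_BASE_URL, ENVIRONMENT, CHECK_INTERVAL_SECONDS)
```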
---

## 📚 Next Steps

### Immediate (Get Basic UI Working):
1. Create partial templates (stats, lists)
2. Test creating projects via UI
3. Implement project dashboard page

### Short-term (Core Features):
4. Add location management UI
5. Add unit assignment UI
6. Add scheduler UI (agenda view)

### Medium-term (Data Flow):
7. Implement SLMM download endpoint
8. Test full recording workflow
9. Add file browser for downloaded data

### Long-term (Complete System):
10. Implement SFM client for seismographs
11. Add data visualization
12. Add project reporting
13. Add user authentication

---

## 🐛 Known Issues / TODOs

1. **Partial templates missing**: Need to create HTML templates for all partials
2. **SLMM download endpoint**: Needs implementation in the SLMM backend
3. **Project dashboard page**: Not yet created
4. **SFM integration**: Placeholder only, needs real implementation
5. **File download tracking**: DataFile records not yet created after downloads
6. **Error handling**: Need better user-facing error messages
7. **Validation**: Form validation could be improved
8. **Testing**: No automated tests yet

---
## 📖 API Documentation

### Project Type Object
```json
{
  "id": "sound_monitoring",
  "name": "Sound Monitoring",
  "description": "...",
  "icon": "volume-2",
  "supports_sound": true,
  "supports_vibration": false
}
```

### Project Object
```json
{
  "id": "uuid",
  "name": "Project Name",
  "description": "...",
  "project_type_id": "sound_monitoring",
  "status": "active",
  "client_name": "Client Inc",
  "site_address": "123 Main St",
  "site_coordinates": "40.7128,-74.0060",
  "start_date": "2024-01-15",
  "end_date": null,
  "created_at": "2024-01-15T10:00:00",
  "updated_at": "2024-01-15T10:00:00"
}
```

### MonitoringLocation Object
```json
{
  "id": "uuid",
  "project_id": "uuid",
  "location_type": "sound",
  "name": "NRL-001",
  "description": "...",
  "coordinates": "40.7128,-74.0060",
  "address": "123 Main St",
  "location_metadata": "{...}",
  "created_at": "2024-01-15T10:00:00"
}
```

### UnitAssignment Object
```json
{
  "id": "uuid",
  "unit_id": "nl43-001",
  "location_id": "uuid",
  "project_id": "uuid",
  "device_type": "sound_level_meter",
  "assigned_at": "2024-01-15T10:00:00",
  "assigned_until": null,
  "status": "active",
  "notes": "..."
}
```

### ScheduledAction Object
```json
{
  "id": "uuid",
  "project_id": "uuid",
  "location_id": "uuid",
  "unit_id": "nl43-001",
  "action_type": "start",
  "device_type": "sound_level_meter",
  "scheduled_time": "2024-01-16T08:00:00",
  "executed_at": null,
  "execution_status": "pending",
  "module_response": null,
  "error_message": null
}
```

---
## 🎓 Architecture Decisions

### Why Project Types?
Allows the system to scale to different monitoring scenarios (air quality, multi-hazard, etc.) without code changes. Just add a new ProjectType record and the UI adapts.

### Why Generic MonitoringLocation?
Instead of separate NRL and MonitoringPoint tables, one table with a `location_type` discriminator keeps the schema clean and allows for combined projects.

### Why Denormalized Fields?
Fields like `project_id` in UnitAssignment (already available through the location) enable faster queries without joins.

### Why Scheduler in Terra-View?
Terra-View is the orchestration layer; SLMM only handles device communication. Keeping scheduling logic in Terra-View allows for complex workflows across multiple device types.

### Why JSON Metadata Columns?
Type-specific fields (like ambient_conditions for sound projects) don't apply to all location types. JSON columns provide flexibility without cluttering the schema.
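The flexibility argument is easy to see in miniature: a sound location can carry sound-specific keys that a vibration location never defines. `ambient_conditions` is the example named above; the other keys below are invented for illustration.

```python
import json

# Type-specific fields serialized into a single JSON text column
location_metadata = json.dumps({
    "ambient_conditions": {"temperature_c": 18.5, "wind_speed_ms": 2.1},
    "facade_reflection": True,   # invented key, for illustration only
})

# Reading it back when rendering a sound-type location
meta = json.loads(location_metadata)
print(meta["ambient_conditions"]["temperature_c"])  # 18.5
```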
---
|
||||
|
||||
## 💡 Tips for Continuing Development
|
||||
|
||||
1. **Follow Existing Patterns**: Look at the SLM dashboard code for reference
|
||||
2. **Use HTMX Aggressively**: Minimize JavaScript, let HTMX handle updates
|
||||
3. **Keep Routers Thin**: Move business logic to service layer
|
||||
4. **Return HTML Partials**: Most endpoints should return HTML, not JSON
|
||||
5. **Test Incrementally**: Build one partial at a time and test in browser
|
||||
6. **Check Logs**: Scheduler logs execution attempts
|
||||
7. **Use Browser DevTools**: Network tab shows HTMX requests

---

## 📞 Support

For questions or issues:

1. Check this document first
2. Review existing dashboards (SLM, Seismographs) for patterns
3. Check logs for scheduler execution details
4. Test API endpoints with curl to isolate issues

---

## ✅ Checklist for Completion

- [x] Database schema designed
- [x] Models created
- [x] Migration script run successfully
- [x] Service layer complete (SLMM client, device controller, scheduler)
- [x] API routers created (projects, locations, scheduler)
- [x] Navigation updated
- [x] Main overview page created
- [x] Routes registered in main.py
- [x] Scheduler service integrated
- [ ] Partial templates created
- [ ] Project dashboard page created
- [ ] Location management UI
- [ ] Unit assignment UI
- [ ] Scheduler UI (agenda view)
- [ ] SLMM download endpoint implemented
- [ ] Full workflow tested end-to-end
- [ ] SFM client implemented (future)

---

**Last Updated**: 2026-01-12

**Database Status**: ✅ Initialized

**Backend Status**: ✅ Complete

**Frontend Status**: 🟡 Partial (overview page only)

**Ready for Testing**: ✅ Yes (basic functionality)
30
README.md
@@ -1,31 +1,5 @@
# Terra-View v0.5.0

Unified platform for managing seismograph fleets and sound level meter deployments. Built as a modular monolith with independent feature modules (Seismo, SLM) sharing a common UI layer. Track deployments, monitor health in real time, merge roster intent with incoming telemetry, and control your entire fleet through a unified database and dashboard.

## Architecture

Terra-View follows a **modular monolith** architecture with independent feature modules in a single codebase:

- **app/seismo/** - Seismograph Fleet Module (SFM)
  - Device roster and deployment tracking
  - Series 3/4 telemetry ingestion
  - Status monitoring (OK/Pending/Missing)
  - Photo management and location tracking
- **app/slm/** - Sound Level Meter Manager (SLMM)
  - NL43 device configuration and control
  - Real-time measurement monitoring
  - TCP/FTP/Web interface support
  - Dashboard statistics and unit management
- **app/ui/** - Shared UI layer
  - Templates, static assets, and common components
  - Progressive Web App (PWA) support
- **app/api/** - API aggregation layer
  - Cross-module endpoints
  - Future unified dashboard APIs

**Multi-Container Deployment**: Three Docker containers built from the same codebase:
- `terra-view` (port 8001) - Main UI with all modules integrated
- `sfm` (port 8002) - Seismo API backend
- `slmm` (port 8100) - SLM API backend

# Seismo Fleet Manager v0.4.2

Backend API and HTMX-powered web interface for managing a mixed fleet of seismographs and field modems. Track deployments, monitor health in real time, merge roster intent with incoming telemetry, and control your fleet through a unified database and dashboard.

## Features
@@ -1,13 +0,0 @@
"""
API Aggregation Layer - Dashboard endpoints
Composes data from multiple feature modules
"""
from fastapi import APIRouter

router = APIRouter(prefix="/api/dashboard", tags=["dashboard-aggregation"])

# TODO: Implement aggregation endpoints that combine data from
# app.seismo and app.slm modules

# For now, individual feature modules expose their own APIs directly
# Future: Add cross-feature aggregation here
@@ -1,13 +0,0 @@
"""
API Aggregation Layer - Roster endpoints
Aggregates roster data from all feature modules
"""
from fastapi import APIRouter

router = APIRouter(prefix="/api/roster-aggregation", tags=["roster-aggregation"])

# TODO: Implement unified roster endpoints that combine data from
# app.seismo and app.slm modules

# For now, individual feature modules expose their own roster APIs
# Future: Add cross-feature roster aggregation here
@@ -1,83 +0,0 @@
"""
SLMM API Proxy
Forwards /api/slmm/* requests to the SLMM backend service
"""
import httpx
import logging
from fastapi import APIRouter, Request, Response, WebSocket
from fastapi.responses import StreamingResponse
from app.core.config import SLMM_API_URL

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/api/slmm", tags=["slmm-proxy"])


@router.api_route("/{path:path}", methods=["GET", "POST", "PUT", "DELETE", "PATCH"])
async def proxy_slmm_request(path: str, request: Request):
    """Proxy HTTP requests to SLMM backend"""
    # Build target URL - rewrite /api/slmm/* to /api/nl43/*
    target_url = f"{SLMM_API_URL}/api/nl43/{path}"

    # Get query params
    query_string = str(request.url.query)
    if query_string:
        target_url += f"?{query_string}"

    logger.info(f"Proxying {request.method} {target_url}")

    # Read request body
    body = await request.body()

    # Forward headers (exclude host)
    headers = {
        key: value
        for key, value in request.headers.items()
        if key.lower() not in ['host', 'content-length']
    }

    async with httpx.AsyncClient(timeout=30.0) as client:
        try:
            # Make proxied request
            response = await client.request(
                method=request.method,
                url=target_url,
                content=body,
                headers=headers
            )

            # Return response
            return Response(
                content=response.content,
                status_code=response.status_code,
                headers=dict(response.headers)
            )
        except httpx.RequestError as e:
            logger.error(f"Proxy request failed: {e}")
            return Response(
                content=f'{{"detail": "SLMM backend unavailable: {str(e)}"}}',
                status_code=502,
                media_type="application/json"
            )


@router.websocket("/{unit_id}/live")
async def proxy_slmm_websocket(websocket: WebSocket, unit_id: str):
    """Proxy WebSocket connections to SLMM backend for live data streaming"""
    await websocket.accept()

    # Build WebSocket URL
    ws_protocol = "ws" if "localhost" in SLMM_API_URL or "127.0.0.1" in SLMM_API_URL else "wss"
    ws_url = SLMM_API_URL.replace("http://", f"{ws_protocol}://").replace("https://", f"{ws_protocol}://")
    ws_target = f"{ws_url}/api/slmm/{unit_id}/live"

    logger.info(f"Proxying WebSocket to {ws_target}")

    async with httpx.AsyncClient() as client:
        try:
            async with client.stream("GET", ws_target) as response:
                async for chunk in response.aiter_bytes():
                    await websocket.send_bytes(chunk)
        except Exception as e:
            logger.error(f"WebSocket proxy error: {e}")
            await websocket.close(code=1011, reason=f"Backend error: {str(e)}")
@@ -1,22 +0,0 @@
"""
Core configuration for Terra-View application
"""
import os

# Application
APP_NAME = "Terra-View"
VERSION = "1.0.0"
ENVIRONMENT = os.getenv("ENVIRONMENT", "production")

# Ports
PORT = int(os.getenv("PORT", 8001))

# External Services
# Terra-View is a unified application with seismograph logic built-in
# The only external HTTP dependency is SLMM for NL-43 device communication
SLMM_API_URL = os.getenv("SLMM_API_URL", "http://localhost:8100")

# Database URLs (feature-specific)
SEISMO_DATABASE_URL = "sqlite:///./data/seismo.db"
SLM_DATABASE_URL = "sqlite:///./data/slm.db"
MODEM_DATABASE_URL = "sqlite:///./data/modem.db"
216
app/main.py
@@ -1,216 +0,0 @@
"""
Terra-View - Unified monitoring platform for device fleets
Modular monolith architecture with strict feature boundaries
"""
import os
import logging
from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.responses import JSONResponse
from fastapi.exceptions import RequestValidationError

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

# Import configuration
from app.core.config import APP_NAME, VERSION, ENVIRONMENT

# Import UI routes
from app.ui import routes as ui_routes

# Import feature module routers (seismo)
from app.seismo.routers import (
    roster as seismo_roster,
    units as seismo_units,
    photos as seismo_photos,
    roster_edit as seismo_roster_edit,
    dashboard as seismo_dashboard,
    dashboard_tabs as seismo_dashboard_tabs,
    activity as seismo_activity,
    seismo_dashboard as seismo_seismo_dashboard,
    settings as seismo_settings,
    partials as seismo_partials,
)
from app.seismo import routes as seismo_legacy_routes

# Import feature module routers (SLM)
from app.slm.routers import router as slm_router
from app.slm.dashboard import router as slm_dashboard_router

# Import API aggregation layer (placeholder for now)
from app.api import dashboard as api_dashboard
from app.api import roster as api_roster

# Initialize database tables
from app.seismo.database import engine as seismo_engine, Base as SeismoBase
SeismoBase.metadata.create_all(bind=seismo_engine)

from app.slm.database import engine as slm_engine, Base as SlmBase
SlmBase.metadata.create_all(bind=slm_engine)

# Initialize FastAPI app
app = FastAPI(
    title=APP_NAME,
    description="Unified monitoring platform for seismograph, modem, and sound level meter fleets",
    version=VERSION
)

# Add validation error handler to log details
@app.exception_handler(RequestValidationError)
async def validation_exception_handler(request: Request, exc: RequestValidationError):
    logger.error(f"Validation error on {request.url}: {exc.errors()}")
    logger.error(f"Body: {await request.body()}")
    return JSONResponse(
        status_code=400,
        content={"detail": exc.errors()}
    )

# Configure CORS
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Mount static files
app.mount("/static", StaticFiles(directory="app/ui/static"), name="static")

# Middleware to add environment to request state
@app.middleware("http")
async def add_environment_to_context(request: Request, call_next):
    """Middleware to add environment variable to request state"""
    request.state.environment = ENVIRONMENT
    response = await call_next(request)
    return response

# ===== INCLUDE ROUTERS =====

# UI Layer (HTML pages)
app.include_router(ui_routes.router)

# Seismograph Feature Module APIs
app.include_router(seismo_roster.router)
app.include_router(seismo_units.router)
app.include_router(seismo_photos.router)
app.include_router(seismo_roster_edit.router)
app.include_router(seismo_dashboard.router)
app.include_router(seismo_dashboard_tabs.router)
app.include_router(seismo_activity.router)
app.include_router(seismo_seismo_dashboard.router)
app.include_router(seismo_settings.router)
app.include_router(seismo_partials.router, prefix="/partials")
app.include_router(seismo_legacy_routes.router)

# SLM Feature Module APIs
app.include_router(slm_router)
app.include_router(slm_dashboard_router)

# SLMM Backend Proxy (forward /api/slmm/* to SLMM service)
from app.api import slmm_proxy
app.include_router(slmm_proxy.router)

# API Aggregation Layer (future cross-feature endpoints)
# app.include_router(api_dashboard.router)  # TODO: Implement aggregation
# app.include_router(api_roster.router)  # TODO: Implement aggregation

# ===== ADDITIONAL ROUTES FROM OLD MAIN.PY =====
# These will need to be migrated to appropriate modules

from fastapi.templating import Jinja2Templates
from typing import List, Dict
from pydantic import BaseModel
from sqlalchemy.orm import Session
from fastapi import Depends

from app.seismo.database import get_db
from app.seismo.services.snapshot import emit_status_snapshot
from app.seismo.models import IgnoredUnit

# TODO: Move these to appropriate feature modules or UI layer

@app.post("/api/sync-edits")
async def sync_edits(request: dict, db: Session = Depends(get_db)):
    """Process offline edit queue and sync to database"""
    # TODO: Move to seismo module
    from app.seismo.models import RosterUnit

    class EditItem(BaseModel):
        id: int
        unitId: str
        changes: Dict
        timestamp: int

    class SyncEditsRequest(BaseModel):
        edits: List[EditItem]

    sync_request = SyncEditsRequest(**request)
    results = []
    synced_ids = []

    for edit in sync_request.edits:
        try:
            unit = db.query(RosterUnit).filter_by(id=edit.unitId).first()

            if not unit:
                results.append({
                    "id": edit.id,
                    "status": "error",
                    "reason": f"Unit {edit.unitId} not found"
                })
                continue

            for key, value in edit.changes.items():
                if hasattr(unit, key):
                    if key in ['deployed', 'retired']:
                        setattr(unit, key, value in ['true', True, 'True', '1', 1])
                    else:
                        setattr(unit, key, value if value != '' else None)

            db.commit()

            results.append({
                "id": edit.id,
                "status": "success"
            })
            synced_ids.append(edit.id)

        except Exception as e:
            db.rollback()
            results.append({
                "id": edit.id,
                "status": "error",
                "reason": str(e)
            })

    synced_count = len(synced_ids)

    return JSONResponse({
        "synced": synced_count,
        "total": len(sync_request.edits),
        "synced_ids": synced_ids,
        "results": results
    })


@app.get("/health")
def health_check():
    """Health check endpoint"""
    return {
        "message": f"{APP_NAME} v{VERSION}",
        "status": "running",
        "version": VERSION,
        "modules": ["seismo", "slm"]
    }


if __name__ == "__main__":
    import uvicorn
    from app.core.config import PORT
    uvicorn.run(app, host="0.0.0.0", port=PORT)
@@ -1,36 +0,0 @@
"""
Seismograph feature module database connection
"""
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
import os

# Ensure data directory exists
os.makedirs("data", exist_ok=True)

# For now, we'll use the old database (seismo_fleet.db) until we migrate
# TODO: Migrate to seismo.db
SQLALCHEMY_DATABASE_URL = "sqlite:///./data/seismo_fleet.db"

engine = create_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

Base = declarative_base()


def get_db():
    """Dependency for database sessions"""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


def get_db_session():
    """Get a database session directly (not as a dependency)"""
    return SessionLocal()
@@ -1,110 +0,0 @@
from sqlalchemy import Column, String, DateTime, Boolean, Text, Date, Integer
from datetime import datetime
from app.seismo.database import Base


class Emitter(Base):
    __tablename__ = "emitters"

    id = Column(String, primary_key=True, index=True)
    unit_type = Column(String, nullable=False)
    last_seen = Column(DateTime, default=datetime.utcnow)
    last_file = Column(String, nullable=False)
    status = Column(String, nullable=False)
    notes = Column(String, nullable=True)


class RosterUnit(Base):
    """
    Roster table: represents our *intended assignment* of a unit.
    This is editable from the GUI.

    Supports multiple device types (seismograph, modem, sound_level_meter) with type-specific fields.
    """
    __tablename__ = "roster"

    # Core fields (all device types)
    id = Column(String, primary_key=True, index=True)
    unit_type = Column(String, default="series3")  # Backward compatibility
    device_type = Column(String, default="seismograph")  # "seismograph" | "modem" | "sound_level_meter"
    deployed = Column(Boolean, default=True)
    retired = Column(Boolean, default=False)
    note = Column(String, nullable=True)
    project_id = Column(String, nullable=True)
    location = Column(String, nullable=True)  # Legacy field - use address/coordinates instead
    address = Column(String, nullable=True)  # Human-readable address
    coordinates = Column(String, nullable=True)  # Lat,Lon format: "34.0522,-118.2437"
    last_updated = Column(DateTime, default=datetime.utcnow)

    # Seismograph-specific fields (nullable for modems and SLMs)
    last_calibrated = Column(Date, nullable=True)
    next_calibration_due = Column(Date, nullable=True)

    # Modem assignment (shared by seismographs and SLMs)
    deployed_with_modem_id = Column(String, nullable=True)  # FK to another RosterUnit (device_type=modem)

    # Modem-specific fields (nullable for seismographs and SLMs)
    ip_address = Column(String, nullable=True)
    phone_number = Column(String, nullable=True)
    hardware_model = Column(String, nullable=True)

    # Sound Level Meter-specific fields (nullable for seismographs and modems)
    slm_host = Column(String, nullable=True)  # Device IP or hostname
    slm_tcp_port = Column(Integer, nullable=True)  # TCP control port (default 2255)
    slm_ftp_port = Column(Integer, nullable=True)  # FTP data retrieval port (default 21)
    slm_model = Column(String, nullable=True)  # NL-43, NL-53, etc.
    slm_serial_number = Column(String, nullable=True)  # Device serial number
    slm_frequency_weighting = Column(String, nullable=True)  # A, C, Z
    slm_time_weighting = Column(String, nullable=True)  # F (Fast), S (Slow), I (Impulse)
    slm_measurement_range = Column(String, nullable=True)  # e.g., "30-130 dB"
    slm_last_check = Column(DateTime, nullable=True)  # Last communication check


class IgnoredUnit(Base):
    """
    Ignored units: units that report but should be filtered out from unknown emitters.
    Used to suppress noise from old projects.
    """
    __tablename__ = "ignored_units"

    id = Column(String, primary_key=True, index=True)
    reason = Column(String, nullable=True)
    ignored_at = Column(DateTime, default=datetime.utcnow)


class UnitHistory(Base):
    """
    Unit history: complete timeline of changes to each unit.
    Tracks note changes, status changes, deployment/benched events, and more.
    """
    __tablename__ = "unit_history"

    id = Column(Integer, primary_key=True, autoincrement=True)
    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id
    change_type = Column(String, nullable=False)  # note_change, deployed_change, retired_change, etc.
    field_name = Column(String, nullable=True)  # Which field changed
    old_value = Column(Text, nullable=True)  # Previous value
    new_value = Column(Text, nullable=True)  # New value
    changed_at = Column(DateTime, default=datetime.utcnow, nullable=False, index=True)
    source = Column(String, default="manual")  # manual, csv_import, telemetry, offline_sync
    notes = Column(Text, nullable=True)  # Optional reason/context for the change


class UserPreferences(Base):
    """
    User preferences: persistent storage for application settings.
    Single-row table (id=1) to store global user preferences.
    """
    __tablename__ = "user_preferences"

    id = Column(Integer, primary_key=True, default=1)
    timezone = Column(String, default="America/New_York")
    theme = Column(String, default="auto")  # auto, light, dark
    auto_refresh_interval = Column(Integer, default=10)  # seconds
    date_format = Column(String, default="MM/DD/YYYY")
    table_rows_per_page = Column(Integer, default=25)
    calibration_interval_days = Column(Integer, default=365)
    calibration_warning_days = Column(Integer, default=30)
    status_ok_threshold_hours = Column(Integer, default=12)
    status_pending_threshold_hours = Column(Integer, default=24)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
@@ -1,140 +0,0 @@
"""
Partial routes for HTMX dynamic content loading.
These routes return HTML fragments that are loaded into the page via HTMX.
"""
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates

from app.seismo.services.snapshot import emit_status_snapshot

router = APIRouter()
templates = Jinja2Templates(directory="app/ui/templates")


@router.get("/unknown-emitters", response_class=HTMLResponse)
async def get_unknown_emitters(request: Request):
    """
    Returns HTML partial with unknown emitters (units reporting but not in roster).
    Called periodically via HTMX (every 10s) from the roster page.
    """
    snapshot = emit_status_snapshot()

    # Convert unknown units dict to list and add required fields
    unknown_list = []
    for unit_id, unit_data in snapshot.get("unknown", {}).items():
        unknown_list.append({
            "id": unit_id,
            "status": unit_data["status"],
            "age": unit_data["age"],
            "fname": unit_data.get("fname", ""),
        })

    # Sort by ID for consistent display
    unknown_list.sort(key=lambda x: x["id"])

    return templates.TemplateResponse(
        "partials/unknown_emitters.html",
        {
            "request": request,
            "unknown_units": unknown_list
        }
    )


@router.get("/devices-all", response_class=HTMLResponse)
async def get_all_devices(request: Request):
    """
    Returns HTML partial with all devices (deployed, benched, retired, ignored).
    Called on page load and when filters are applied.
    """
    snapshot = emit_status_snapshot()

    # Combine all units from different buckets
    all_units = []

    # Add active units (deployed)
    for unit_id, unit_data in snapshot.get("active", {}).items():
        unit_info = {
            "id": unit_id,
            "status": unit_data["status"],
            "age": unit_data["age"],
            "last_seen": unit_data.get("last", ""),
            "fname": unit_data.get("fname", ""),
            "deployed": True,
            "retired": False,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "location": unit_data.get("location", ""),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        }
        all_units.append(unit_info)

    # Add benched units (not deployed, not retired)
    for unit_id, unit_data in snapshot.get("benched", {}).items():
        unit_info = {
            "id": unit_id,
            "status": unit_data["status"],
            "age": unit_data["age"],
            "last_seen": unit_data.get("last", ""),
            "fname": unit_data.get("fname", ""),
            "deployed": False,
            "retired": False,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "location": unit_data.get("location", ""),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        }
        all_units.append(unit_info)

    # Add retired units
    for unit_id, unit_data in snapshot.get("retired", {}).items():
        unit_info = {
            "id": unit_id,
            "status": "Retired",
            "age": unit_data["age"],
            "last_seen": unit_data.get("last", ""),
            "fname": unit_data.get("fname", ""),
            "deployed": False,
            "retired": True,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "location": unit_data.get("location", ""),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        }
        all_units.append(unit_info)

    # Sort by ID for consistent display
    all_units.sort(key=lambda x: x["id"])

    return templates.TemplateResponse(
        "partials/devices_table.html",
        {
            "request": request,
            "units": all_units
        }
    )
@@ -1 +0,0 @@
# SLMM addon package for NL43 integration.
@@ -1,317 +0,0 @@
"""
Dashboard API endpoints for SLM/NL43 devices.
This layer aggregates and transforms data from the device API for UI consumption.
"""
from fastapi import APIRouter, Depends, HTTPException, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates
from sqlalchemy.orm import Session
from sqlalchemy import func
from typing import List, Dict, Any
import logging

from app.slm.database import get_db as get_slm_db
from app.slm.models import NL43Config, NL43Status
from app.slm.services import NL43Client
# Import seismo database for roster data
from app.seismo.database import get_db as get_seismo_db
from app.seismo.models import RosterUnit

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/api/slm-dashboard", tags=["slm-dashboard"])
templates = Jinja2Templates(directory="app/ui/templates")


@router.get("/stats", response_class=HTMLResponse)
async def get_dashboard_stats(request: Request, db: Session = Depends(get_seismo_db)):
    """Get aggregate statistics for the SLM dashboard from roster (returns HTML)."""
    # Query SLMs from the roster
    slms = db.query(RosterUnit).filter_by(
        device_type="sound_level_meter",
        retired=False
    ).all()

    total_units = len(slms)
    deployed = sum(1 for s in slms if s.deployed)
    benched = sum(1 for s in slms if not s.deployed)

    # For "active", count SLMs with recent check-ins (within last hour)
    from datetime import datetime, timedelta, timezone
    one_hour_ago = datetime.now(timezone.utc) - timedelta(hours=1)
    active = sum(1 for s in slms if s.slm_last_check and s.slm_last_check >= one_hour_ago)

    # Map to template variable names
    # total_count, deployed_count, active_count, benched_count
    return templates.TemplateResponse(
        "partials/slm_stats.html",
        {
            "request": request,
            "total_count": total_units,
            "deployed_count": deployed,
            "active_count": active,
            "benched_count": benched
        }
    )


@router.get("/units", response_class=HTMLResponse)
async def get_units_list(request: Request, db: Session = Depends(get_seismo_db)):
    """Get list of all SLM units from roster (returns HTML)."""
    # Query SLMs from the roster (not retired)
    slms = db.query(RosterUnit).filter_by(
        device_type="sound_level_meter",
        retired=False
    ).order_by(RosterUnit.id).all()

    units = []
    for slm in slms:
        # Map to template field names
        unit_data = {
            "id": slm.id,
            "slm_host": slm.slm_host,
            "slm_tcp_port": slm.slm_tcp_port,
            "slm_last_check": slm.slm_last_check,
            "slm_model": slm.slm_model or "NL-43",
            "address": slm.address,
            "deployed_with_modem_id": slm.deployed_with_modem_id,
        }
        units.append(unit_data)

    return templates.TemplateResponse(
        "partials/slm_unit_list.html",
        {
            "request": request,
            "units": units
        }
    )


@router.get("/live-view/{unit_id}", response_class=HTMLResponse)
async def get_live_view(unit_id: str, request: Request, slm_db: Session = Depends(get_slm_db), roster_db: Session = Depends(get_seismo_db)):
    """Get live measurement data for a specific unit (returns HTML)."""
    # Get unit from roster
    unit = roster_db.query(RosterUnit).filter_by(
        id=unit_id,
        device_type="sound_level_meter"
    ).first()

    if not unit:
        return templates.TemplateResponse(
            "partials/slm_live_view_error.html",
            {
                "request": request,
                "error": f"Unit {unit_id} not found in roster"
            }
        )

    # Get status from monitoring database (may not exist yet)
    status = slm_db.query(NL43Status).filter_by(unit_id=unit_id).first()

    # Get modem info if available
    modem = None
    modem_ip = None
    if unit.deployed_with_modem_id:
        modem = roster_db.query(RosterUnit).filter_by(
            id=unit.deployed_with_modem_id,
            device_type="modem"
        ).first()
        if modem:
            modem_ip = modem.ip_address
    elif unit.slm_host:
        modem_ip = unit.slm_host

    # Determine if measuring
    is_measuring = False
    if status and status.measurement_state:
        is_measuring = status.measurement_state.lower() == 'start'

    return templates.TemplateResponse(
        "partials/slm_live_view.html",
        {
            "request": request,
            "unit": unit,
            "modem": modem,
            "modem_ip": modem_ip,
            "current_status": status,
            "is_measuring": is_measuring
        }
    )


@router.get("/config/{unit_id}", response_class=HTMLResponse)
async def get_unit_config(unit_id: str, request: Request, roster_db: Session = Depends(get_seismo_db)):
    """Return the HTML config form for a specific unit."""
    unit = roster_db.query(RosterUnit).filter_by(
        id=unit_id,
        device_type="sound_level_meter"
    ).first()

    if not unit:
        raise HTTPException(status_code=404, detail="Unit configuration not found")

    return templates.TemplateResponse(
        "partials/slm_config_form.html",
        {
            "request": request,
            "unit": unit
        }
    )


@router.post("/config/{unit_id}")
async def update_unit_config(
    unit_id: str,
    request: Request,
    roster_db: Session = Depends(get_seismo_db),
    slm_db: Session = Depends(get_slm_db)
):
    """Update configuration for a specific unit from the form submission."""
    unit = roster_db.query(RosterUnit).filter_by(
        id=unit_id,
        device_type="sound_level_meter"
    ).first()

    if not unit:
        raise HTTPException(status_code=404, detail="Unit configuration not found")

    form = await request.form()

    def get_int(value, default=None):
        try:
            return int(value) if value not in (None, "") else default
        except (TypeError, ValueError):
            return default

    # Update roster fields
    unit.slm_model = form.get("slm_model") or unit.slm_model
|
||||
unit.slm_serial_number = form.get("slm_serial_number") or unit.slm_serial_number
|
||||
unit.slm_frequency_weighting = form.get("slm_frequency_weighting") or unit.slm_frequency_weighting
|
||||
unit.slm_time_weighting = form.get("slm_time_weighting") or unit.slm_time_weighting
|
||||
unit.slm_measurement_range = form.get("slm_measurement_range") or unit.slm_measurement_range
|
||||
|
||||
unit.slm_host = form.get("slm_host") or None
|
||||
unit.slm_tcp_port = get_int(form.get("slm_tcp_port"), unit.slm_tcp_port or 2255)
|
||||
unit.slm_ftp_port = get_int(form.get("slm_ftp_port"), unit.slm_ftp_port or 21)
|
||||
|
||||
deployed_with_modem_id = form.get("deployed_with_modem_id") or None
|
||||
unit.deployed_with_modem_id = deployed_with_modem_id
|
||||
|
||||
roster_db.commit()
|
||||
roster_db.refresh(unit)
|
||||
|
||||
# Update or create NL43 config so SLMM can reach the device
|
||||
config = slm_db.query(NL43Config).filter_by(unit_id=unit_id).first()
|
||||
if not config:
|
||||
config = NL43Config(unit_id=unit_id)
|
||||
slm_db.add(config)
|
||||
|
||||
# Resolve host from modem if present, otherwise fall back to direct IP or existing config
|
||||
host_for_config = None
|
||||
if deployed_with_modem_id:
|
||||
modem = roster_db.query(RosterUnit).filter_by(
|
||||
id=deployed_with_modem_id,
|
||||
device_type="modem"
|
||||
).first()
|
||||
if modem and modem.ip_address:
|
||||
host_for_config = modem.ip_address
|
||||
if not host_for_config:
|
||||
host_for_config = unit.slm_host or config.host or "127.0.0.1"
|
||||
|
||||
config.host = host_for_config
|
||||
config.tcp_port = get_int(form.get("slm_tcp_port"), config.tcp_port or 2255)
|
||||
config.tcp_enabled = True
|
||||
config.ftp_enabled = bool(config.ftp_username and config.ftp_password)
|
||||
|
||||
slm_db.commit()
|
||||
slm_db.refresh(config)
|
||||
|
||||
return {"success": True, "unit_id": unit_id}
|
||||
|
||||
|
||||
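The form handler above leans on a small `get_int` coercion helper that treats blank or malformed input as "keep the fallback" rather than raising. A standalone sketch of the same logic, with its edge cases worked through:

```python
def get_int(value, default=None):
    """Coerce an HTML form field to int; blank, missing, or malformed input yields the default."""
    try:
        return int(value) if value not in (None, "") else default
    except (TypeError, ValueError):
        return default

# A valid port string parses; anything else falls back silently.
print(get_int("2255"))       # valid numeric string
print(get_int("", 21))       # blank field -> default
print(get_int(None, 21))     # missing field -> default
print(get_int("abc", 21))    # garbage -> default, no exception
```

This is why a user clearing the port field in the form resets it to the existing/default port instead of returning a 422.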
@router.post("/control/{unit_id}/{action}")
|
||||
async def control_unit(unit_id: str, action: str, db: Session = Depends(get_slm_db)):
|
||||
"""Send control command to a unit (start, stop, pause, resume, etc.)."""
|
||||
config = db.query(NL43Config).filter_by(unit_id=unit_id).first()
|
||||
if not config:
|
||||
raise HTTPException(status_code=404, detail="Unit configuration not found")
|
||||
|
||||
if not config.tcp_enabled:
|
||||
raise HTTPException(status_code=400, detail="TCP control not enabled for this unit")
|
||||
|
||||
# Create NL43Client
|
||||
client = NL43Client(
|
||||
host=config.host,
|
||||
port=config.tcp_port,
|
||||
timeout=5.0,
|
||||
ftp_username=config.ftp_username,
|
||||
ftp_password=config.ftp_password
|
||||
)
|
||||
|
||||
# Map action to command
|
||||
action_map = {
|
||||
"start": "start_measurement",
|
||||
"stop": "stop_measurement",
|
||||
"pause": "pause_measurement",
|
||||
"resume": "resume_measurement",
|
||||
"reset": "reset_measurement",
|
||||
"sleep": "sleep_mode",
|
||||
"wake": "wake_from_sleep",
|
||||
}
|
||||
|
||||
if action not in action_map:
|
||||
raise HTTPException(status_code=400, detail=f"Unknown action: {action}")
|
||||
|
||||
method_name = action_map[action]
|
||||
method = getattr(client, method_name, None)
|
||||
|
||||
if not method:
|
||||
raise HTTPException(status_code=500, detail=f"Method {method_name} not implemented")
|
||||
|
||||
try:
|
||||
result = await method()
|
||||
return {"success": True, "action": action, "result": result}
|
||||
except Exception as e:
|
||||
logger.error(f"Error executing {action} on {unit_id}: {e}")
|
||||
raise HTTPException(status_code=500, detail=str(e))
|
||||
|
||||
|
||||
@router.get("/test-modem/{unit_id}")
|
||||
async def test_modem(unit_id: str, db: Session = Depends(get_slm_db)):
|
||||
"""Test connectivity to a unit's modem/device."""
|
||||
config = db.query(NL43Config).filter_by(unit_id=unit_id).first()
|
||||
if not config:
|
||||
raise HTTPException(status_code=404, detail="Unit configuration not found")
|
||||
|
||||
if not config.tcp_enabled:
|
||||
raise HTTPException(status_code=400, detail="TCP control not enabled for this unit")
|
||||
|
||||
client = NL43Client(
|
||||
host=config.host,
|
||||
port=config.tcp_port,
|
||||
timeout=5.0,
|
||||
ftp_username=config.ftp_username,
|
||||
ftp_password=config.ftp_password
|
||||
)
|
||||
|
||||
try:
|
||||
# Try to get measurement state as a connectivity test
|
||||
state = await client.get_measurement_state()
|
||||
return {
|
||||
"success": True,
|
||||
"unit_id": unit_id,
|
||||
"host": config.host,
|
||||
"port": config.tcp_port,
|
||||
"reachable": True,
|
||||
"measurement_state": state
|
||||
}
|
||||
except Exception as e:
|
||||
logger.warning(f"Modem test failed for {unit_id}: {e}")
|
||||
return {
|
||||
"success": False,
|
||||
"unit_id": unit_id,
|
||||
"host": config.host,
|
||||
"port": config.tcp_port,
|
||||
"reachable": False,
|
||||
"error": str(e)
|
||||
}
|
||||
27
app/slm/database.py
@@ -1,27 +0,0 @@
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
import os

# Ensure data directory exists for the SLMM addon
os.makedirs("data", exist_ok=True)

SQLALCHEMY_DATABASE_URL = "sqlite:///./data/slmm.db"

engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False})
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()


def get_db():
    """Dependency for database sessions."""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


def get_db_session():
    """Get a database session directly (not as a dependency)."""
    return SessionLocal()
116
app/slm/main.py
@@ -1,116 +0,0 @@
import os
import logging
from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates

from app.slm.database import Base, engine
from app.slm import routers

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
    handlers=[
        logging.StreamHandler(),
        logging.FileHandler("data/slmm.log"),
    ],
)
logger = logging.getLogger(__name__)

# Ensure database tables exist for the addon
Base.metadata.create_all(bind=engine)
logger.info("Database tables initialized")

app = FastAPI(
    title="SLMM NL43 Addon",
    description="Standalone module for NL43 configuration and status APIs",
    version="0.1.0",
)

# CORS configuration - use environment variable for allowed origins.
# Default to "*" for development, but this should be restricted in production.
allowed_origins = os.getenv("CORS_ORIGINS", "*").split(",")
logger.info(f"CORS allowed origins: {allowed_origins}")

app.add_middleware(
    CORSMiddleware,
    allow_origins=allowed_origins,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

templates = Jinja2Templates(directory="templates")

app.include_router(routers.router)


@app.get("/", response_class=HTMLResponse)
def index(request: Request):
    return templates.TemplateResponse("index.html", {"request": request})


@app.get("/health")
async def health():
    """Basic health check endpoint."""
    return {"status": "ok", "service": "slmm-nl43-addon"}


@app.get("/health/devices")
async def health_devices():
    """Enhanced health check that tests device connectivity."""
    from sqlalchemy.orm import Session
    from app.slm.database import SessionLocal
    from app.slm.services import NL43Client
    from app.slm.models import NL43Config

    db: Session = SessionLocal()
    device_status = []

    try:
        configs = db.query(NL43Config).filter_by(tcp_enabled=True).all()

        for cfg in configs:
            client = NL43Client(cfg.host, cfg.tcp_port, timeout=2.0, ftp_username=cfg.ftp_username, ftp_password=cfg.ftp_password)
            status = {
                "unit_id": cfg.unit_id,
                "host": cfg.host,
                "port": cfg.tcp_port,
                "reachable": False,
                "error": None,
            }

            try:
                # Try to connect (don't send a command, to avoid rate limiting issues)
                import asyncio
                reader, writer = await asyncio.wait_for(
                    asyncio.open_connection(cfg.host, cfg.tcp_port), timeout=2.0
                )
                writer.close()
                await writer.wait_closed()
                status["reachable"] = True
            except Exception as e:
                status["error"] = str(type(e).__name__)
                logger.warning(f"Device {cfg.unit_id} health check failed: {e}")

            device_status.append(status)

    finally:
        db.close()

    all_reachable = all(d["reachable"] for d in device_status) if device_status else True

    return {
        "status": "ok" if all_reachable else "degraded",
        "devices": device_status,
        "total_devices": len(device_status),
        "reachable_devices": sum(1 for d in device_status if d["reachable"]),
    }


if __name__ == "__main__":
    import uvicorn

    # This module lives at app/slm/main.py, so the import string must match
    uvicorn.run("app.slm.main:app", host="0.0.0.0", port=int(os.getenv("PORT", "8100")), reload=True)
43
app/slm/models.py
@@ -1,43 +0,0 @@
from sqlalchemy import Column, String, DateTime, Boolean, Integer, Text, func
from app.slm.database import Base


class NL43Config(Base):
    """
    NL43 connection/config metadata for the standalone SLMM addon.
    """

    __tablename__ = "nl43_config"

    unit_id = Column(String, primary_key=True, index=True)
    host = Column(String, default="127.0.0.1")
    tcp_port = Column(Integer, default=80)  # NL43 TCP control port (via RX55)
    tcp_enabled = Column(Boolean, default=True)
    ftp_enabled = Column(Boolean, default=False)
    ftp_username = Column(String, nullable=True)  # FTP login username
    ftp_password = Column(String, nullable=True)  # FTP login password
    web_enabled = Column(Boolean, default=False)


class NL43Status(Base):
    """
    Latest NL43 status snapshot for quick dashboard/API access.
    """

    __tablename__ = "nl43_status"

    unit_id = Column(String, primary_key=True, index=True)
    last_seen = Column(DateTime, default=func.now())
    measurement_state = Column(String, default="unknown")  # "Start" while measuring, "Stop" otherwise
    measurement_start_time = Column(DateTime, nullable=True)  # When measurement started (UTC)
    counter = Column(String, nullable=True)  # d0: Measurement interval counter (1-600)
    lp = Column(String, nullable=True)  # Instantaneous sound pressure level
    leq = Column(String, nullable=True)  # Equivalent continuous sound level
    lmax = Column(String, nullable=True)  # Maximum level
    lmin = Column(String, nullable=True)  # Minimum level
    lpeak = Column(String, nullable=True)  # Peak level
    battery_level = Column(String, nullable=True)
    power_source = Column(String, nullable=True)
    sd_remaining_mb = Column(String, nullable=True)
    sd_free_ratio = Column(String, nullable=True)
    raw_payload = Column(Text, nullable=True)
1333
app/slm/routers.py
@@ -1,828 +0,0 @@
"""
|
||||
NL43 TCP connector and snapshot persistence.
|
||||
|
||||
Implements simple per-request TCP calls to avoid long-lived socket complexity.
|
||||
Extend to pooled connections/DRD streaming later.
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import contextlib
|
||||
import logging
|
||||
import time
|
||||
from dataclasses import dataclass
|
||||
from datetime import datetime
|
||||
from typing import Optional, List
|
||||
from sqlalchemy.orm import Session
|
||||
from ftplib import FTP
|
||||
from pathlib import Path
|
||||
|
||||
from app.slm.models import NL43Status
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@dataclass
|
||||
class NL43Snapshot:
|
||||
unit_id: str
|
||||
measurement_state: str = "unknown"
|
||||
counter: Optional[str] = None # d0: Measurement interval counter (1-600)
|
||||
lp: Optional[str] = None # Instantaneous sound pressure level
|
||||
leq: Optional[str] = None # Equivalent continuous sound level
|
||||
lmax: Optional[str] = None # Maximum level
|
||||
lmin: Optional[str] = None # Minimum level
|
||||
lpeak: Optional[str] = None # Peak level
|
||||
battery_level: Optional[str] = None
|
||||
power_source: Optional[str] = None
|
||||
sd_remaining_mb: Optional[str] = None
|
||||
sd_free_ratio: Optional[str] = None
|
||||
raw_payload: Optional[str] = None
|
||||
|
||||
|
||||
def persist_snapshot(s: NL43Snapshot, db: Session):
    """Persist the latest snapshot for API/dashboard use."""
    try:
        row = db.query(NL43Status).filter_by(unit_id=s.unit_id).first()
        if not row:
            row = NL43Status(unit_id=s.unit_id)
            db.add(row)

        row.last_seen = datetime.utcnow()

        # Track measurement start time by detecting state transition
        previous_state = row.measurement_state
        new_state = s.measurement_state

        logger.info(f"State transition check for {s.unit_id}: '{previous_state}' -> '{new_state}'")

        # Device returns "Start" when measuring, "Stop" when stopped.
        # Normalize to previous behavior for backward compatibility.
        is_measuring = new_state == "Start"
        was_measuring = previous_state == "Start"

        if not was_measuring and is_measuring:
            # Measurement just started - record the start time
            row.measurement_start_time = datetime.utcnow()
            logger.info(f"✓ Measurement started on {s.unit_id} at {row.measurement_start_time}")
        elif was_measuring and not is_measuring:
            # Measurement stopped - clear the start time
            row.measurement_start_time = None
            logger.info(f"✓ Measurement stopped on {s.unit_id}")

        row.measurement_state = new_state
        row.counter = s.counter
        row.lp = s.lp
        row.leq = s.leq
        row.lmax = s.lmax
        row.lmin = s.lmin
        row.lpeak = s.lpeak
        row.battery_level = s.battery_level
        row.power_source = s.power_source
        row.sd_remaining_mb = s.sd_remaining_mb
        row.sd_free_ratio = s.sd_free_ratio
        row.raw_payload = s.raw_payload

        db.commit()
    except Exception as e:
        db.rollback()
        logger.error(f"Failed to persist snapshot for unit {s.unit_id}: {e}")
        raise

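The start/stop edge detection in `persist_snapshot` reduces to comparing two state strings. Isolating that rule as a pure function (a sketch, not part of the module; the function name is invented) makes the behavior easy to verify:

```python
def measurement_transition(previous_state, new_state):
    """Classify a state change the way persist_snapshot does.

    The device reports "Start" while measuring and "Stop" otherwise;
    any other value (e.g. the "unknown" default) counts as not measuring.
    Returns "started", "stopped", or None for no edge.
    """
    was_measuring = previous_state == "Start"
    is_measuring = new_state == "Start"
    if is_measuring and not was_measuring:
        return "started"
    if was_measuring and not is_measuring:
        return "stopped"
    return None
```

Note that a fresh row (state `"unknown"`) followed by a `"Start"` snapshot correctly records a start time, while repeated `"Start"` polls do not reset it.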
# Rate limiting: NL43 requires ≥1 second between commands
_last_command_time = {}
_rate_limit_lock = asyncio.Lock()


class NL43Client:
    def __init__(self, host: str, port: int, timeout: float = 5.0, ftp_username: str = None, ftp_password: str = None):
        self.host = host
        self.port = port
        self.timeout = timeout
        self.ftp_username = ftp_username or "anonymous"
        self.ftp_password = ftp_password or ""
        self.device_key = f"{host}:{port}"

    async def _enforce_rate_limit(self):
        """Ensure ≥1 second between commands to the same device."""
        async with _rate_limit_lock:
            last_time = _last_command_time.get(self.device_key, 0)
            elapsed = time.time() - last_time
            if elapsed < 1.0:
                wait_time = 1.0 - elapsed
                logger.debug(f"Rate limiting: waiting {wait_time:.2f}s for {self.device_key}")
                await asyncio.sleep(wait_time)
            _last_command_time[self.device_key] = time.time()

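The per-device rate limit above is just a wait-time computation around a timestamp map. A deterministic sketch of that arithmetic (names and the `min_interval` parameter are illustrative, not from the module):

```python
def rate_limit_wait(last_command_time: float, now: float, min_interval: float = 1.0) -> float:
    """Seconds to sleep before the next command to the same host:port.

    Mirrors _enforce_rate_limit: if less than min_interval has elapsed
    since the last command, wait out the remainder; otherwise send now.
    """
    elapsed = now - last_command_time
    return max(0.0, min_interval - elapsed)
```

Keeping the timestamp update inside the lock, as the client does, prevents two concurrent coroutines from both observing a stale `last_time` and sending back-to-back commands.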
    async def _send_command(self, cmd: str) -> str:
        """Send ASCII command to NL43 device via TCP.

        NL43 protocol returns two lines for query commands:
        Line 1: Result code (R+0000 for success, error codes otherwise)
        Line 2: Actual data (for query commands ending with '?')
        """
        await self._enforce_rate_limit()

        logger.info(f"Sending command to {self.device_key}: {cmd.strip()}")

        try:
            reader, writer = await asyncio.wait_for(
                asyncio.open_connection(self.host, self.port), timeout=self.timeout
            )
        except asyncio.TimeoutError:
            logger.error(f"Connection timeout to {self.device_key}")
            raise ConnectionError(f"Failed to connect to device at {self.host}:{self.port}")
        except Exception as e:
            logger.error(f"Connection failed to {self.device_key}: {e}")
            raise ConnectionError(f"Failed to connect to device: {str(e)}")

        try:
            writer.write(cmd.encode("ascii"))
            await writer.drain()

            # Read first line (result code)
            first_line_data = await asyncio.wait_for(reader.readuntil(b"\n"), timeout=self.timeout)
            result_code = first_line_data.decode(errors="ignore").strip()

            # Remove leading $ prompt if present
            if result_code.startswith("$"):
                result_code = result_code[1:].strip()

            logger.info(f"Result code from {self.device_key}: {result_code}")

            # Check result code
            if result_code == "R+0000":
                # Success - for query commands, read the second line with actual data
                is_query = cmd.strip().endswith("?")
                if is_query:
                    data_line = await asyncio.wait_for(reader.readuntil(b"\n"), timeout=self.timeout)
                    response = data_line.decode(errors="ignore").strip()
                    logger.debug(f"Data line from {self.device_key}: {response}")
                    return response
                else:
                    # Setting command - return success code
                    return result_code
            elif result_code == "R+0001":
                raise ValueError("Command error - device did not recognize command")
            elif result_code == "R+0002":
                raise ValueError("Parameter error - invalid parameter value")
            elif result_code == "R+0003":
                raise ValueError("Spec/type error - command not supported by this device model")
            elif result_code == "R+0004":
                raise ValueError("Status error - device is in wrong state for this command")
            else:
                raise ValueError(f"Unknown result code: {result_code}")

        except asyncio.TimeoutError:
            logger.error(f"Response timeout from {self.device_key}")
            raise TimeoutError(f"Device did not respond within {self.timeout}s")
        except Exception as e:
            logger.error(f"Communication error with {self.device_key}: {e}")
            raise
        finally:
            writer.close()
            with contextlib.suppress(Exception):
                await writer.wait_closed()

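The result-code branch in `_send_command` can be tabulated. This sketch (not part of the module) maps each `R+000x` code handled above to its meaning and raises the same way the client does:

```python
# Result codes handled by _send_command; None marks success.
NL43_RESULT_CODES = {
    "R+0000": None,
    "R+0001": "Command error - device did not recognize command",
    "R+0002": "Parameter error - invalid parameter value",
    "R+0003": "Spec/type error - command not supported by this device model",
    "R+0004": "Status error - device is in wrong state for this command",
}


def check_result_code(line: str) -> str:
    """Strip an optional leading "$" prompt and raise on any non-success code."""
    code = line[1:].strip() if line.startswith("$") else line.strip()
    if code not in NL43_RESULT_CODES:
        raise ValueError(f"Unknown result code: {code}")
    error = NL43_RESULT_CODES[code]
    if error is not None:
        raise ValueError(error)
    return code
```

A table like this also makes it trivial to add codes later without growing the if/elif chain.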
    async def request_dod(self) -> NL43Snapshot:
        """Request DOD (Data Output Display) snapshot from device.

        Returns parsed measurement data from the device display.
        """
        # _send_command handles result code validation and returns the data line
        resp = await self._send_command("DOD?\r\n")

        # Validate response format
        if not resp:
            logger.warning(f"Empty data response from DOD command on {self.device_key}")
            raise ValueError("Device returned empty data for DOD? command")

        # Remove leading $ prompt if present (shouldn't be there after _send_command, but be safe)
        if resp.startswith("$"):
            resp = resp[1:].strip()

        parts = [p.strip() for p in resp.split(",") if p.strip() != ""]

        # DOD should return at least some data points
        if len(parts) < 2:
            logger.error(f"Malformed DOD data from {self.device_key}: {resp}")
            raise ValueError(f"Malformed DOD data: expected comma-separated values, got: {resp}")

        logger.info(f"Parsed {len(parts)} data points from DOD response")

        # Query actual measurement state (DOD doesn't include this information)
        try:
            measurement_state = await self.get_measurement_state()
        except Exception as e:
            logger.warning(f"Failed to get measurement state, defaulting to 'Measure': {e}")
            measurement_state = "Measure"

        snap = NL43Snapshot(unit_id="", raw_payload=resp, measurement_state=measurement_state)

        # Parse known positions (based on NL43 communication guide - DRD format)
        # DRD format: d0=counter, d1=Lp, d2=Leq, d3=Lmax, d4=Lmin, d5=Lpeak, d6=LIeq, ...
        try:
            # Capture d0 (counter) for timer synchronization
            if len(parts) >= 1:
                snap.counter = parts[0]  # d0: Measurement interval counter (1-600)
            if len(parts) >= 2:
                snap.lp = parts[1]  # d1: Instantaneous sound pressure level
            if len(parts) >= 3:
                snap.leq = parts[2]  # d2: Equivalent continuous sound level
            if len(parts) >= 4:
                snap.lmax = parts[3]  # d3: Maximum level
            if len(parts) >= 5:
                snap.lmin = parts[4]  # d4: Minimum level
            if len(parts) >= 6:
                snap.lpeak = parts[5]  # d5: Peak level
        except (IndexError, ValueError) as e:
            logger.warning(f"Error parsing DOD data points: {e}")

        return snap

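The positional DOD/DRD field parsing above (d0..d5) can be expressed as one small pure function. A sketch, with field names mirroring the snapshot attributes (the function itself is illustrative, not part of the module):

```python
def parse_dod(payload: str) -> dict:
    """Split a DOD/DRD payload into named fields.

    Positions per the NL43 communication guide: d0=counter, d1=Lp, d2=Leq,
    d3=Lmax, d4=Lmin, d5=Lpeak; fields beyond d5 are ignored here.
    Missing trailing fields come back as None.
    """
    if payload.startswith("$"):
        payload = payload[1:].strip()
    parts = [p.strip() for p in payload.split(",") if p.strip() != ""]
    if len(parts) < 2:
        raise ValueError(f"Malformed DOD data: {payload!r}")
    names = ["counter", "lp", "leq", "lmax", "lmin", "lpeak"]
    return {name: (parts[i] if i < len(parts) else None) for i, name in enumerate(names)}
```

Values stay as strings, matching the String columns on `NL43Status`; numeric conversion is left to the consumer.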
    async def start(self):
        """Start measurement on the device.

        According to NL43 protocol: Measure,Start (no $ prefix, capitalized param)
        """
        await self._send_command("Measure,Start\r\n")

    async def stop(self):
        """Stop measurement on the device.

        According to NL43 protocol: Measure,Stop (no $ prefix, capitalized param)
        """
        await self._send_command("Measure,Stop\r\n")

    async def set_store_mode_manual(self):
        """Set the device to Manual Store mode.

        According to NL43 protocol: Store Mode,Manual sets manual storage mode
        """
        await self._send_command("Store Mode,Manual\r\n")
        logger.info(f"Store mode set to Manual on {self.device_key}")

    async def manual_store(self):
        """Manually store the current measurement data.

        According to NL43 protocol: Manual Store,Start executes storing.
        Parameter p1="Start" executes the storage operation.
        Device must be in Manual Store mode first.
        """
        await self._send_command("Manual Store,Start\r\n")
        logger.info(f"Manual store executed on {self.device_key}")

    async def pause(self):
        """Pause the current measurement."""
        await self._send_command("Pause,On\r\n")
        logger.info(f"Measurement paused on {self.device_key}")

    async def resume(self):
        """Resume a paused measurement."""
        await self._send_command("Pause,Off\r\n")
        logger.info(f"Measurement resumed on {self.device_key}")

    async def reset(self):
        """Reset the measurement data."""
        await self._send_command("Reset\r\n")
        logger.info(f"Measurement data reset on {self.device_key}")

    async def get_measurement_state(self) -> str:
        """Get the current measurement state.

        Returns: "Start" if measuring, "Stop" if stopped
        """
        resp = await self._send_command("Measure?\r\n")
        state = resp.strip()
        logger.info(f"Measurement state on {self.device_key}: {state}")
        return state

    async def get_battery_level(self) -> str:
        """Get the battery level."""
        resp = await self._send_command("Battery Level?\r\n")
        logger.info(f"Battery level on {self.device_key}: {resp}")
        return resp.strip()

    async def get_clock(self) -> str:
        """Get the device clock time."""
        resp = await self._send_command("Clock?\r\n")
        logger.info(f"Clock on {self.device_key}: {resp}")
        return resp.strip()

    async def set_clock(self, datetime_str: str):
        """Set the device clock time.

        Args:
            datetime_str: Time in format YYYY/MM/DD,HH:MM:SS or YYYY/MM/DD HH:MM:SS
        """
        # Device expects format: Clock,YYYY/MM/DD HH:MM:SS (space between date and time)
        # Replace comma with space if present to normalize format
        normalized = datetime_str.replace(',', ' ', 1)
        await self._send_command(f"Clock,{normalized}\r\n")
        logger.info(f"Clock set on {self.device_key} to {normalized}")

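The normalization in `set_clock` replaces only the first comma, since the command frame itself is comma-delimited. Pulled out as a sketch (the helper name is invented for illustration):

```python
def normalize_clock_arg(datetime_str: str) -> str:
    """Accept "YYYY/MM/DD,HH:MM:SS" or "YYYY/MM/DD HH:MM:SS".

    The device wants a space between date and time, so only the first
    comma is replaced; the already-spaced form passes through unchanged.
    """
    return datetime_str.replace(",", " ", 1)

# The frame then sent on the wire is f"Clock,{normalize_clock_arg(s)}\r\n".
```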
    async def get_frequency_weighting(self, channel: str = "Main") -> str:
        """Get frequency weighting (A, C, Z, etc.).

        Args:
            channel: Main, Sub1, Sub2, or Sub3
        """
        resp = await self._send_command(f"Frequency Weighting ({channel})?\r\n")
        logger.info(f"Frequency weighting ({channel}) on {self.device_key}: {resp}")
        return resp.strip()

    async def set_frequency_weighting(self, weighting: str, channel: str = "Main"):
        """Set frequency weighting.

        Args:
            weighting: A, C, or Z
            channel: Main, Sub1, Sub2, or Sub3
        """
        await self._send_command(f"Frequency Weighting ({channel}),{weighting}\r\n")
        logger.info(f"Frequency weighting ({channel}) set to {weighting} on {self.device_key}")

    async def get_time_weighting(self, channel: str = "Main") -> str:
        """Get time weighting (F, S, I).

        Args:
            channel: Main, Sub1, Sub2, or Sub3
        """
        resp = await self._send_command(f"Time Weighting ({channel})?\r\n")
        logger.info(f"Time weighting ({channel}) on {self.device_key}: {resp}")
        return resp.strip()

    async def set_time_weighting(self, weighting: str, channel: str = "Main"):
        """Set time weighting.

        Args:
            weighting: F (Fast), S (Slow), or I (Impulse)
            channel: Main, Sub1, Sub2, or Sub3
        """
        await self._send_command(f"Time Weighting ({channel}),{weighting}\r\n")
        logger.info(f"Time weighting ({channel}) set to {weighting} on {self.device_key}")

    async def request_dlc(self) -> dict:
        """Request DLC (Data Last Calculation) - final stored measurement results.

        This retrieves the complete calculation results from the last/current
        measurement, including all statistical data. Similar to DOD but for
        final results.

        Returns:
            Dict with parsed DLC data
        """
        resp = await self._send_command("DLC?\r\n")
        logger.info(f"DLC data received from {self.device_key}: {resp[:100]}...")

        # Parse DLC response - similar format to DOD.
        # The exact format depends on device configuration.
        # For now, return raw data - can be enhanced based on actual response format.
        return {
            "raw_data": resp.strip(),
            "device_key": self.device_key,
        }

    async def sleep(self):
        """Put the device into sleep mode to conserve battery.

        Sleep mode is useful for battery conservation between scheduled measurements.
        Device can be woken up remotely via TCP command or by pressing a button.
        """
        await self._send_command("Sleep Mode,On\r\n")
        logger.info(f"Device {self.device_key} entering sleep mode")

    async def wake(self):
        """Wake the device from sleep mode.

        Note: This may not work if the device is in deep sleep.
        Physical button press might be required in some cases.
        """
        await self._send_command("Sleep Mode,Off\r\n")
        logger.info(f"Device {self.device_key} waking from sleep mode")

    async def get_sleep_status(self) -> str:
        """Get the current sleep mode status."""
        resp = await self._send_command("Sleep Mode?\r\n")
        logger.info(f"Sleep mode status on {self.device_key}: {resp}")
        return resp.strip()

    async def stream_drd(self, callback):
        """Stream continuous DRD output from the device.

        Opens a persistent connection and streams DRD data lines.
        Calls the provided callback function with each parsed snapshot.

        Args:
            callback: Async function that receives NL43Snapshot objects

        The stream continues until an exception occurs or the connection is closed.
        Send SUB character (0x1A) to stop the stream.
        """
        await self._enforce_rate_limit()

        logger.info(f"Starting DRD stream for {self.device_key}")

        try:
            reader, writer = await asyncio.wait_for(
                asyncio.open_connection(self.host, self.port), timeout=self.timeout
            )
        except asyncio.TimeoutError:
            logger.error(f"DRD stream connection timeout to {self.device_key}")
            raise ConnectionError(f"Failed to connect to device at {self.host}:{self.port}")
        except Exception as e:
            logger.error(f"DRD stream connection failed to {self.device_key}: {e}")
            raise ConnectionError(f"Failed to connect to device: {str(e)}")

        try:
            # Start DRD streaming
            writer.write(b"DRD?\r\n")
            await writer.drain()

            # Read initial result code
            first_line_data = await asyncio.wait_for(reader.readuntil(b"\n"), timeout=self.timeout)
            result_code = first_line_data.decode(errors="ignore").strip()

            if result_code.startswith("$"):
                result_code = result_code[1:].strip()

            logger.debug(f"DRD stream result code from {self.device_key}: {result_code}")

            if result_code != "R+0000":
                raise ValueError(f"DRD stream failed to start: {result_code}")

            logger.info(f"DRD stream started successfully for {self.device_key}")

            # Continuously read data lines
            while True:
                try:
                    line_data = await asyncio.wait_for(reader.readuntil(b"\n"), timeout=30.0)
                    line = line_data.decode(errors="ignore").strip()

                    if not line:
                        continue

                    # Remove leading $ if present
                    if line.startswith("$"):
                        line = line[1:].strip()

                    # Parse the DRD data (same format as DOD)
                    parts = [p.strip() for p in line.split(",") if p.strip() != ""]

                    if len(parts) < 2:
                        logger.warning(f"Malformed DRD data from {self.device_key}: {line}")
                        continue

                    snap = NL43Snapshot(unit_id="", raw_payload=line, measurement_state="Measure")

                    # Parse known positions (DRD format - same as DOD)
                    # DRD format: d0=counter, d1=Lp, d2=Leq, d3=Lmax, d4=Lmin, d5=Lpeak, d6=LIeq, ...
                    try:
                        # Capture d0 (counter) for timer synchronization
                        if len(parts) >= 1:
                            snap.counter = parts[0]  # d0: Measurement interval counter (1-600)
                        if len(parts) >= 2:
                            snap.lp = parts[1]  # d1: Instantaneous sound pressure level
                        if len(parts) >= 3:
                            snap.leq = parts[2]  # d2: Equivalent continuous sound level
                        if len(parts) >= 4:
                            snap.lmax = parts[3]  # d3: Maximum level
                        if len(parts) >= 5:
                            snap.lmin = parts[4]  # d4: Minimum level
                        if len(parts) >= 6:
                            snap.lpeak = parts[5]  # d5: Peak level
                    except (IndexError, ValueError) as e:
                        logger.warning(f"Error parsing DRD data points: {e}")

                    # Call the callback with the snapshot
                    await callback(snap)

                except asyncio.TimeoutError:
                    logger.warning(f"DRD stream timeout (no data for 30s) from {self.device_key}")
                    break
                except asyncio.IncompleteReadError:
                    logger.info(f"DRD stream closed by device {self.device_key}")
                    break

        finally:
            # Send SUB character to stop streaming
            with contextlib.suppress(Exception):
                writer.write(b"\x1A")
                await writer.drain()

            writer.close()
            with contextlib.suppress(Exception):
                await writer.wait_closed()

            logger.info(f"DRD stream ended for {self.device_key}")
|
||||
|
||||
    async def set_measurement_time(self, preset: str):
        """Set measurement time preset.

        Args:
            preset: Time preset (10s, 1m, 5m, 10m, 15m, 30m, 1h, 8h, 24h, or custom like "00:05:30")
        """
        await self._send_command(f"Measurement Time Preset Manual,{preset}\r\n")
        logger.info(f"Set measurement time to {preset} on {self.device_key}")

    async def get_measurement_time(self) -> str:
        """Get current measurement time preset.

        Returns: Current time preset setting
        """
        resp = await self._send_command("Measurement Time Preset Manual?\r\n")
        return resp.strip()

    async def set_leq_interval(self, preset: str):
        """Set Leq calculation interval preset.

        Args:
            preset: Interval preset (Off, 10s, 1m, 5m, 10m, 15m, 30m, 1h, 8h, 24h, or custom like "00:05:30")
        """
        await self._send_command(f"Leq Calculation Interval Preset,{preset}\r\n")
        logger.info(f"Set Leq interval to {preset} on {self.device_key}")

    async def get_leq_interval(self) -> str:
        """Get current Leq calculation interval preset.

        Returns: Current interval preset setting
        """
        resp = await self._send_command("Leq Calculation Interval Preset?\r\n")
        return resp.strip()

    async def set_lp_interval(self, preset: str):
        """Set Lp store interval.

        Args:
            preset: Store interval (Off, 10ms, 25ms, 100ms, 200ms, 1s)
        """
        await self._send_command(f"Lp Store Interval,{preset}\r\n")
        logger.info(f"Set Lp interval to {preset} on {self.device_key}")

    async def get_lp_interval(self) -> str:
        """Get current Lp store interval.

        Returns: Current store interval setting
        """
        resp = await self._send_command("Lp Store Interval?\r\n")
        return resp.strip()

    async def set_index_number(self, index: int):
        """Set index number for file numbering (Store Name).

        Args:
            index: Index number (0000-9999)
        """
        if not 0 <= index <= 9999:
            raise ValueError("Index must be between 0000 and 9999")
        await self._send_command(f"Store Name,{index:04d}\r\n")
        logger.info(f"Set store name (index) to {index:04d} on {self.device_key}")

    async def get_index_number(self) -> str:
        """Get current index number (Store Name).

        Returns: Current index number
        """
        resp = await self._send_command("Store Name?\r\n")
        return resp.strip()

    async def get_overwrite_status(self) -> str:
        """Check if saved data exists at current store target.

        This command checks whether saved data exists in the set store target
        (store mode / store name / store address). Use this before storing
        to prevent accidentally overwriting data.

        Returns:
            "None" - No data exists (safe to store)
            "Exist" - Data exists (would overwrite)
        """
        resp = await self._send_command("Overwrite?\r\n")
        return resp.strip()
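`get_overwrite_status` is meant to gate store operations, as its docstring describes. The guard below sketches that usage against a faked client; both `FakeClient` and `can_store` are hypothetical helpers, not part of `NL43Client`:

```python
import asyncio


class FakeClient:
    """Stand-in for NL43Client that returns a canned Overwrite? response."""
    def __init__(self, status: str):
        self._status = status

    async def get_overwrite_status(self) -> str:
        return self._status


async def can_store(client) -> bool:
    # "None" means the store target is empty, so storing will not clobber data.
    return (await client.get_overwrite_status()) == "None"


print(asyncio.run(can_store(FakeClient("None"))))   # True
print(asyncio.run(can_store(FakeClient("Exist"))))  # False
```

A caller that gets `False` here would typically bump the index with `set_index_number` before storing rather than overwrite in place.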
    async def get_all_settings(self) -> dict:
        """Query all device settings for verification.

        Returns: Dictionary with all current device settings
        """
        settings = {}

        # Measurement settings
        try:
            settings["measurement_state"] = await self.get_measurement_state()
        except Exception as e:
            settings["measurement_state"] = f"Error: {e}"

        try:
            settings["frequency_weighting"] = await self.get_frequency_weighting()
        except Exception as e:
            settings["frequency_weighting"] = f"Error: {e}"

        try:
            settings["time_weighting"] = await self.get_time_weighting()
        except Exception as e:
            settings["time_weighting"] = f"Error: {e}"

        # Timing/interval settings
        try:
            settings["measurement_time"] = await self.get_measurement_time()
        except Exception as e:
            settings["measurement_time"] = f"Error: {e}"

        try:
            settings["leq_interval"] = await self.get_leq_interval()
        except Exception as e:
            settings["leq_interval"] = f"Error: {e}"

        try:
            settings["lp_interval"] = await self.get_lp_interval()
        except Exception as e:
            settings["lp_interval"] = f"Error: {e}"

        try:
            settings["index_number"] = await self.get_index_number()
        except Exception as e:
            settings["index_number"] = f"Error: {e}"

        # Device info
        try:
            settings["battery_level"] = await self.get_battery_level()
        except Exception as e:
            settings["battery_level"] = f"Error: {e}"

        try:
            settings["clock"] = await self.get_clock()
        except Exception as e:
            settings["clock"] = f"Error: {e}"

        # Sleep mode
        try:
            settings["sleep_mode"] = await self.get_sleep_status()
        except Exception as e:
            settings["sleep_mode"] = f"Error: {e}"

        # FTP status
        try:
            settings["ftp_status"] = await self.get_ftp_status()
        except Exception as e:
            settings["ftp_status"] = f"Error: {e}"

        logger.info(f"Retrieved all settings for {self.device_key}")
        return settings

    async def enable_ftp(self):
        """Enable FTP server on the device.

        According to NL43 protocol: FTP,On enables the FTP server
        """
        await self._send_command("FTP,On\r\n")
        logger.info(f"FTP enabled on {self.device_key}")

    async def disable_ftp(self):
        """Disable FTP server on the device.

        According to NL43 protocol: FTP,Off disables the FTP server
        """
        await self._send_command("FTP,Off\r\n")
        logger.info(f"FTP disabled on {self.device_key}")

    async def get_ftp_status(self) -> str:
        """Query FTP server status on the device.

        Returns: "On" or "Off"
        """
        resp = await self._send_command("FTP?\r\n")
        logger.info(f"FTP status on {self.device_key}: {resp}")
        return resp.strip()

    async def list_ftp_files(self, remote_path: str = "/") -> List[dict]:
        """List files on the device via FTP.

        Args:
            remote_path: Directory path on the device (default: root)

        Returns:
            List of file info dicts with 'name', 'size', 'modified', 'is_dir'
        """
        logger.info(f"Listing FTP files on {self.device_key} at {remote_path}")

        def _list_ftp_sync():
            """Synchronous FTP listing using ftplib (supports active mode)."""
            ftp = FTP()
            ftp.set_debuglevel(0)
            try:
                # Connect and login
                ftp.connect(self.host, 21, timeout=10)
                ftp.login(self.ftp_username, self.ftp_password)
                ftp.set_pasv(False)  # Force active mode

                # Change to target directory
                if remote_path != "/":
                    ftp.cwd(remote_path)

                # Get directory listing with details
                files = []
                lines = []
                ftp.retrlines('LIST', lines.append)

                for line in lines:
                    # Parse Unix-style ls output
                    parts = line.split(None, 8)
                    if len(parts) < 9:
                        continue

                    is_dir = parts[0].startswith('d')
                    size = int(parts[4]) if not is_dir else 0
                    name = parts[8]

                    # Skip . and ..
                    if name in ('.', '..'):
                        continue

                    # Parse modification time
                    # Format: "Jan 07 14:23" or "Dec 25 2025"
                    modified_str = f"{parts[5]} {parts[6]} {parts[7]}"
                    modified_timestamp = None
                    try:
                        from datetime import datetime
                        # Try parsing with time (recent files: "Jan 07 14:23")
                        try:
                            dt = datetime.strptime(modified_str, "%b %d %H:%M")
                            # Add current year since it's not in the format
                            dt = dt.replace(year=datetime.now().year)

                            # If the resulting date is in the future, it's actually from last year
                            if dt > datetime.now():
                                dt = dt.replace(year=dt.year - 1)

                            modified_timestamp = dt.isoformat()
                        except ValueError:
                            # Try parsing with year (older files: "Dec 25 2025")
                            dt = datetime.strptime(modified_str, "%b %d %Y")
                            modified_timestamp = dt.isoformat()
                    except Exception as e:
                        logger.warning(f"Failed to parse timestamp '{modified_str}': {e}")

                    file_info = {
                        "name": name,
                        "path": f"{remote_path.rstrip('/')}/{name}",
                        "size": size,
                        "modified": modified_str,  # Keep original string
                        "modified_timestamp": modified_timestamp,  # Add parsed timestamp
                        "is_dir": is_dir,
                    }
                    files.append(file_info)
                    logger.debug(f"Found file: {file_info}")

                logger.info(f"Found {len(files)} files/directories on {self.device_key}")
                return files

            finally:
                try:
                    ftp.quit()
                except Exception:
                    pass

        try:
            # Run synchronous FTP in thread pool
            return await asyncio.to_thread(_list_ftp_sync)
        except Exception as e:
            logger.error(f"Failed to list FTP files on {self.device_key}: {e}")
            raise ConnectionError(f"FTP connection failed: {str(e)}")

    async def download_ftp_file(self, remote_path: str, local_path: str):
        """Download a file from the device via FTP.

        Args:
            remote_path: Full path to file on the device
            local_path: Local path where file will be saved
        """
        logger.info(f"Downloading {remote_path} from {self.device_key} to {local_path}")

        def _download_ftp_sync():
            """Synchronous FTP download using ftplib (supports active mode)."""
            ftp = FTP()
            ftp.set_debuglevel(0)
            try:
                # Connect and login
                ftp.connect(self.host, 21, timeout=10)
                ftp.login(self.ftp_username, self.ftp_password)
                ftp.set_pasv(False)  # Force active mode

                # Download file
                with open(local_path, 'wb') as f:
                    ftp.retrbinary(f'RETR {remote_path}', f.write)

                logger.info(f"Successfully downloaded {remote_path} to {local_path}")

            finally:
                try:
                    ftp.quit()
                except Exception:
                    pass

        try:
            # Run synchronous FTP in thread pool
            await asyncio.to_thread(_download_ftp_sync)
        except Exception as e:
            logger.error(f"Failed to download {remote_path} from {self.device_key}: {e}")
            raise ConnectionError(f"FTP download failed: {str(e)}")
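The Unix-style `LIST` parsing inside `list_ftp_files` can be exercised in isolation. This standalone sketch mirrors the field layout and the two date formats the method assumes; `parse_list_line` is a hypothetical helper for illustration, not part of the client (and it takes the reference year as a parameter instead of reading the clock):

```python
from datetime import datetime


def parse_list_line(line: str, year: int):
    """Parse one Unix-style LIST line into (name, size, is_dir, modified_iso)."""
    parts = line.split(None, 8)
    if len(parts) < 9:
        return None  # e.g. a "total 4" summary line
    is_dir = parts[0].startswith("d")
    size = 0 if is_dir else int(parts[4])
    name = parts[8]
    modified_str = f"{parts[5]} {parts[6]} {parts[7]}"
    try:
        # Recent entries carry a time ("Jan 07 14:23"); older ones a year ("Dec 25 2025").
        dt = datetime.strptime(modified_str, "%b %d %H:%M").replace(year=year)
    except ValueError:
        dt = datetime.strptime(modified_str, "%b %d %Y")
    return name, size, is_dir, dt.isoformat()


print(parse_list_line("-rw-r--r-- 1 root root 2048 Jan 07 14:23 NL43_0001.rnd", 2026))
# ('NL43_0001.rnd', 2048, False, '2026-01-07T14:23:00')
```

Note that `LIST` output is not standardized; if a device's FTP server ever emits DOS-style listings, a parser like this (and the one in `list_ftp_files`) would need a second branch or `MLSD`.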
@@ -1,92 +0,0 @@
"""
UI Layer Routes - HTML page routes only (no business logic)
"""
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse, FileResponse
from fastapi.templating import Jinja2Templates
import os

router = APIRouter()

# Setup Jinja2 templates
templates = Jinja2Templates(directory="app/ui/templates")

# Read environment (development or production)
ENVIRONMENT = os.getenv("ENVIRONMENT", "production")
VERSION = "1.0.0"  # Terra-View version

# Override TemplateResponse to include environment and version in context
original_template_response = templates.TemplateResponse

def custom_template_response(name, context=None, *args, **kwargs):
    if context is None:
        context = {}
    context["environment"] = ENVIRONMENT
    context["version"] = VERSION
    return original_template_response(name, context, *args, **kwargs)

templates.TemplateResponse = custom_template_response


# ===== HTML PAGE ROUTES =====

@router.get("/", response_class=HTMLResponse)
async def dashboard(request: Request):
    """Dashboard home page"""
    return templates.TemplateResponse("dashboard.html", {"request": request})


@router.get("/roster", response_class=HTMLResponse)
async def roster_page(request: Request):
    """Fleet roster page"""
    return templates.TemplateResponse("roster.html", {"request": request})


@router.get("/unit/{unit_id}", response_class=HTMLResponse)
async def unit_detail_page(request: Request, unit_id: str):
    """Unit detail page"""
    return templates.TemplateResponse("unit_detail.html", {
        "request": request,
        "unit_id": unit_id
    })


@router.get("/settings", response_class=HTMLResponse)
async def settings_page(request: Request):
    """Settings page for roster management"""
    return templates.TemplateResponse("settings.html", {"request": request})


@router.get("/sound-level-meters", response_class=HTMLResponse)
async def sound_level_meters_page(request: Request):
    """Sound Level Meters management dashboard"""
    return templates.TemplateResponse("sound_level_meters.html", {"request": request})


@router.get("/seismographs", response_class=HTMLResponse)
async def seismographs_page(request: Request):
    """Seismographs management dashboard"""
    return templates.TemplateResponse("seismographs.html", {"request": request})


# ===== PWA ROUTES =====

@router.get("/sw.js")
async def service_worker():
    """Serve service worker with proper headers for PWA"""
    return FileResponse(
        "app/ui/static/sw.js",
        media_type="application/javascript",
        headers={
            "Service-Worker-Allowed": "/",
            "Cache-Control": "no-cache"
        }
    )


@router.get("/offline-db.js")
async def offline_db_script():
    """Serve offline database script"""
    return FileResponse(
        "app/ui/static/offline-db.js",
        media_type="application/javascript",
        headers={"Cache-Control": "no-cache"}
    )
@@ -1,195 +0,0 @@
{% extends "base.html" %}

{% block title %}{{ unit_id }} - Sound Level Meter{% endblock %}

{% block content %}
<div class="mb-6">
    <a href="/roster" class="text-seismo-orange hover:text-seismo-orange-dark flex items-center">
        <svg class="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15 19l-7-7 7-7"></path>
        </svg>
        Back to Roster
    </a>
</div>

<div class="mb-8">
    <div class="flex justify-between items-start">
        <div>
            <h1 class="text-3xl font-bold text-gray-900 dark:text-white flex items-center">
                <svg class="w-8 h-8 mr-3 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
                          d="M9 19V6l12-3v13M9 19c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zm12-3c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zM9 10l12-3">
                    </path>
                </svg>
                {{ unit_id }}
            </h1>
            <p class="text-gray-600 dark:text-gray-400 mt-1">
                {{ unit.slm_model or 'NL-43' }} Sound Level Meter
            </p>
        </div>
        <div class="flex gap-2">
            <span class="px-3 py-1 rounded-full text-sm font-medium
                {% if unit.deployed %}bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-200
                {% else %}bg-gray-100 text-gray-800 dark:bg-gray-700 dark:text-gray-200{% endif %}">
                {% if unit.deployed %}Deployed{% else %}Benched{% endif %}
            </span>
        </div>
    </div>
</div>

<!-- Control Panel -->
<div class="mb-8">
    <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-4">Control Panel</h2>
    <div hx-get="/slm/partials/{{ unit_id }}/controls"
         hx-trigger="load, every 5s"
         hx-swap="innerHTML">
        <div class="text-center py-8 text-gray-500">Loading controls...</div>
    </div>
</div>

<!-- Real-time Data Stream -->
<div class="mb-8">
    <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-4">Real-time Measurements</h2>
    <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
        <div id="slm-stream-container">
            <div class="text-center py-8">
                <button onclick="startStream()"
                        id="stream-start-btn"
                        class="px-6 py-3 bg-seismo-orange text-white rounded-lg hover:bg-seismo-orange-dark transition-colors">
                    Start Real-time Stream
                </button>
                <p class="text-sm text-gray-500 mt-2">Click to begin streaming live measurement data</p>
            </div>
            <div id="stream-data" class="hidden">
                <div class="grid grid-cols-2 md:grid-cols-4 gap-4 mb-4">
                    <div class="bg-gray-50 dark:bg-gray-900 rounded-lg p-4">
                        <div class="text-sm text-gray-600 dark:text-gray-400 mb-1">Lp (Instant)</div>
                        <div id="stream-lp" class="text-3xl font-bold text-gray-900 dark:text-white">--</div>
                        <div class="text-xs text-gray-500">dB</div>
                    </div>
                    <div class="bg-gray-50 dark:bg-gray-900 rounded-lg p-4">
                        <div class="text-sm text-gray-600 dark:text-gray-400 mb-1">Leq (Average)</div>
                        <div id="stream-leq" class="text-3xl font-bold text-blue-600 dark:text-blue-400">--</div>
                        <div class="text-xs text-gray-500">dB</div>
                    </div>
                    <div class="bg-gray-50 dark:bg-gray-900 rounded-lg p-4">
                        <div class="text-sm text-gray-600 dark:text-gray-400 mb-1">Lmax</div>
                        <div id="stream-lmax" class="text-3xl font-bold text-red-600 dark:text-red-400">--</div>
                        <div class="text-xs text-gray-500">dB</div>
                    </div>
                    <div class="bg-gray-50 dark:bg-gray-900 rounded-lg p-4">
                        <div class="text-sm text-gray-600 dark:text-gray-400 mb-1">Lmin</div>
                        <div id="stream-lmin" class="text-3xl font-bold text-green-600 dark:text-green-400">--</div>
                        <div class="text-xs text-gray-500">dB</div>
                    </div>
                </div>
                <div class="flex justify-between items-center">
                    <div class="text-xs text-gray-500">
                        <span class="inline-block w-2 h-2 bg-green-500 rounded-full mr-2 animate-pulse"></span>
                        Streaming
                    </div>
                    <button onclick="stopStream()"
                            class="px-4 py-2 bg-red-600 text-white text-sm rounded-lg hover:bg-red-700 transition-colors">
                        Stop Stream
                    </button>
                </div>
            </div>
        </div>
    </div>
</div>

<!-- Device Information -->
<div class="mb-8">
    <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-4">Device Information</h2>
    <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
        <div class="grid grid-cols-1 md:grid-cols-2 gap-4">
            <div>
                <div class="text-sm text-gray-600 dark:text-gray-400">Model</div>
                <div class="text-lg font-medium text-gray-900 dark:text-white">{{ unit.slm_model or 'NL-43' }}</div>
            </div>
            <div>
                <div class="text-sm text-gray-600 dark:text-gray-400">Serial Number</div>
                <div class="text-lg font-medium text-gray-900 dark:text-white">{{ unit.slm_serial_number or 'N/A' }}</div>
            </div>
            <div>
                <div class="text-sm text-gray-600 dark:text-gray-400">Host</div>
                <div class="text-lg font-medium text-gray-900 dark:text-white">{{ unit.slm_host or 'Not configured' }}</div>
            </div>
            <div>
                <div class="text-sm text-gray-600 dark:text-gray-400">TCP Port</div>
                <div class="text-lg font-medium text-gray-900 dark:text-white">{{ unit.slm_tcp_port or 'N/A' }}</div>
            </div>
            <div>
                <div class="text-sm text-gray-600 dark:text-gray-400">Frequency Weighting</div>
                <div class="text-lg font-medium text-gray-900 dark:text-white">{{ unit.slm_frequency_weighting or 'A' }}</div>
            </div>
            <div>
                <div class="text-sm text-gray-600 dark:text-gray-400">Time Weighting</div>
                <div class="text-lg font-medium text-gray-900 dark:text-white">{{ unit.slm_time_weighting or 'F (Fast)' }}</div>
            </div>
            <div class="md:col-span-2">
                <div class="text-sm text-gray-600 dark:text-gray-400">Location</div>
                <div class="text-lg font-medium text-gray-900 dark:text-white">{{ unit.address or unit.location or 'Not specified' }}</div>
            </div>
            {% if unit.note %}
            <div class="md:col-span-2">
                <div class="text-sm text-gray-600 dark:text-gray-400">Notes</div>
                <div class="text-gray-900 dark:text-white">{{ unit.note }}</div>
            </div>
            {% endif %}
        </div>
    </div>
</div>

<script>
let ws = null;

function startStream() {
    const protocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
    const wsUrl = `${protocol}//${window.location.host}/api/slmm/{{ unit_id }}/stream`;

    ws = new WebSocket(wsUrl);

    ws.onopen = () => {
        document.getElementById('stream-start-btn').classList.add('hidden');
        document.getElementById('stream-data').classList.remove('hidden');
        console.log('WebSocket connected');
    };

    ws.onmessage = (event) => {
        const data = JSON.parse(event.data);

        if (data.error) {
            console.error('Stream error:', data.error);
            stopStream();
            alert('Error: ' + data.error);
            return;
        }

        // Update values
        document.getElementById('stream-lp').textContent = data.lp || '--';
        document.getElementById('stream-leq').textContent = data.leq || '--';
        document.getElementById('stream-lmax').textContent = data.lmax || '--';
        document.getElementById('stream-lmin').textContent = data.lmin || '--';
    };

    ws.onerror = (error) => {
        console.error('WebSocket error:', error);
        stopStream();
    };

    ws.onclose = () => {
        console.log('WebSocket closed');
    };
}

function stopStream() {
    if (ws) {
        ws.close();
        ws = null;
    }
    document.getElementById('stream-start-btn').classList.remove('hidden');
    document.getElementById('stream-data').classList.add('hidden');
}
</script>
{% endblock %}
@@ -1,257 +0,0 @@
{% extends "base.html" %}

{% block title %}Sound Level Meters - Seismo Fleet Manager{% endblock %}

{% block content %}
<div class="mb-8">
    <h1 class="text-3xl font-bold text-gray-900 dark:text-white">Sound Level Meters</h1>
    <p class="text-gray-600 dark:text-gray-400 mt-1">Monitor and manage sound level measurement devices</p>
</div>

<!-- Summary Stats -->
<div class="grid grid-cols-1 md:grid-cols-4 gap-6 mb-8"
     hx-get="/api/slm-dashboard/stats"
     hx-trigger="load, every 10s"
     hx-swap="innerHTML">
    <!-- Stats will be loaded here -->
    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
</div>

<!-- Main Content Grid -->
<div class="grid grid-cols-1 lg:grid-cols-3 gap-6">
    <!-- SLM List -->
    <div class="lg:col-span-1">
        <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
            <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-4">Active Units</h2>

            <!-- Search/Filter -->
            <div class="mb-4">
                <input type="text"
                       placeholder="Search units..."
                       class="w-full px-4 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-white"
                       hx-get="/api/slm-dashboard/units"
                       hx-trigger="keyup changed delay:300ms"
                       hx-target="#slm-list"
                       hx-include="this"
                       name="search">
            </div>

            <!-- SLM List -->
            <div id="slm-list"
                 class="space-y-2 max-h-[600px] overflow-y-auto"
                 hx-get="/api/slm-dashboard/units"
                 hx-trigger="load, every 10s"
                 hx-swap="innerHTML">
                <!-- Loading skeleton -->
                <div class="animate-pulse space-y-2">
                    <div class="bg-gray-200 dark:bg-gray-700 h-20 rounded-lg"></div>
                    <div class="bg-gray-200 dark:bg-gray-700 h-20 rounded-lg"></div>
                    <div class="bg-gray-200 dark:bg-gray-700 h-20 rounded-lg"></div>
                </div>
            </div>
        </div>
    </div>

    <!-- Live View Panel -->
    <div class="lg:col-span-2">
        <div id="live-view-panel" class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
            <!-- Initial state - no unit selected -->
            <div class="flex flex-col items-center justify-center h-[600px] text-gray-400 dark:text-gray-500">
                <svg class="w-24 h-24 mb-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15.536 8.464a5 5 0 010 7.072m2.828-9.9a9 9 0 010 12.728M5.586 15H4a1 1 0 01-1-1v-4a1 1 0 011-1h1.586l4.707-4.707C10.923 3.663 12 4.109 12 5v14c0 .891-1.077 1.337-1.707.707L5.586 15z"></path>
                </svg>
                <p class="text-lg font-medium">No unit selected</p>
                <p class="text-sm mt-2">Select a sound level meter from the list to view live data</p>
            </div>
        </div>
    </div>
</div>

<!-- Configuration Modal -->
<div id="config-modal" class="hidden fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
    <div class="bg-white dark:bg-slate-800 rounded-xl p-6 max-w-2xl w-full mx-4 max-h-[90vh] overflow-y-auto">
        <div class="flex items-center justify-between mb-6">
            <h3 class="text-2xl font-bold text-gray-900 dark:text-white">Configure SLM</h3>
            <button onclick="closeConfigModal()" class="text-gray-500 hover:text-gray-700 dark:text-gray-400 dark:hover:text-gray-200">
                <svg class="w-6 h-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path>
                </svg>
            </button>
        </div>

        <div id="config-modal-content">
            <!-- Content loaded via HTMX -->
            <div class="animate-pulse space-y-4">
                <div class="h-4 bg-gray-200 dark:bg-gray-700 rounded w-3/4"></div>
                <div class="h-4 bg-gray-200 dark:bg-gray-700 rounded"></div>
                <div class="h-4 bg-gray-200 dark:bg-gray-700 rounded w-5/6"></div>
            </div>
        </div>
    </div>
</div>

<script>
// Function to select a unit and load its live view
function selectUnit(unitId) {
    // Remove active state from all items
    document.querySelectorAll('.slm-unit-item').forEach(item => {
        item.classList.remove('bg-seismo-orange', 'text-white');
        item.classList.add('bg-gray-100', 'dark:bg-gray-700');
    });

    // Add active state to the clicked item
    event.currentTarget.classList.remove('bg-gray-100', 'dark:bg-gray-700');
    event.currentTarget.classList.add('bg-seismo-orange', 'text-white');

    // Load live view for this unit
    htmx.ajax('GET', `/api/slm-dashboard/live-view/${unitId}`, {
        target: '#live-view-panel',
        swap: 'innerHTML'
    });
}

// Configuration modal functions
function openConfigModal(unitId) {
    const modal = document.getElementById('config-modal');
    modal.classList.remove('hidden');

    // Load configuration form via HTMX
    htmx.ajax('GET', `/api/slm-dashboard/config/${unitId}`, {
        target: '#config-modal-content',
        swap: 'innerHTML'
    });
}

function closeConfigModal() {
    document.getElementById('config-modal').classList.add('hidden');
}

// Close modal on escape key
document.addEventListener('keydown', function(e) {
    if (e.key === 'Escape') {
        closeConfigModal();
    }
});

// Close modal when clicking outside
document.getElementById('config-modal')?.addEventListener('click', function(e) {
    if (e.target === this) {
        closeConfigModal();
    }
});

// WebSocket for the selected unit
let currentWebSocket = null;

function initLiveDataStream(unitId) {
    // Close existing connection if any
    if (currentWebSocket) {
        currentWebSocket.close();
    }

    // WebSocket URL for SLMM backend via proxy
    const wsProtocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
    const wsUrl = `${wsProtocol}//${window.location.host}/api/slmm/${unitId}/live`;

    currentWebSocket = new WebSocket(wsUrl);

    currentWebSocket.onopen = function() {
        console.log('WebSocket connected');
        // Toggle button visibility
        const startBtn = document.getElementById('start-stream-btn');
        const stopBtn = document.getElementById('stop-stream-btn');
        if (startBtn) startBtn.style.display = 'none';
        if (stopBtn) stopBtn.style.display = 'flex';
    };

    currentWebSocket.onmessage = async function(event) {
        try {
            let payload = event.data;
            if (payload instanceof Blob) {
                payload = await payload.text();
            }
            const data = typeof payload === 'string' ? JSON.parse(payload) : payload;
            updateLiveChart(data);
            updateLiveMetrics(data);
        } catch (error) {
            console.error('Error parsing WebSocket message:', error);
        }
    };

    currentWebSocket.onerror = function(error) {
        console.error('WebSocket error:', error);
    };

    currentWebSocket.onclose = function() {
        console.log('WebSocket closed');
        // Toggle button visibility
        const startBtn = document.getElementById('start-stream-btn');
        const stopBtn = document.getElementById('stop-stream-btn');
        if (startBtn) startBtn.style.display = 'flex';
        if (stopBtn) stopBtn.style.display = 'none';
    };
}

function stopLiveDataStream() {
    if (currentWebSocket) {
        currentWebSocket.close();
        currentWebSocket = null;
    }
}

// Update live chart with a new data point
let chartData = {
    timestamps: [],
    lp: [],
    leq: []
};

function updateLiveChart(data) {
    const now = new Date();
    chartData.timestamps.push(now.toLocaleTimeString());
    chartData.lp.push(parseFloat(data.lp || 0));
    chartData.leq.push(parseFloat(data.leq || 0));

    // Keep only the last 60 data points (1 minute at 1 sample/sec)
    if (chartData.timestamps.length > 60) {
        chartData.timestamps.shift();
        chartData.lp.shift();
        chartData.leq.shift();
    }

    // Update chart (using Chart.js if available)
    if (window.liveChart) {
        window.liveChart.data.labels = chartData.timestamps;
        window.liveChart.data.datasets[0].data = chartData.lp;
        window.liveChart.data.datasets[1].data = chartData.leq;
        window.liveChart.update('none'); // Update without animation for smooth real-time
    }
}

function updateLiveMetrics(data) {
    // Update metric displays
    if (document.getElementById('live-lp')) {
        document.getElementById('live-lp').textContent = data.lp || '--';
    }
    if (document.getElementById('live-leq')) {
        document.getElementById('live-leq').textContent = data.leq || '--';
    }
    if (document.getElementById('live-lmax')) {
        document.getElementById('live-lmax').textContent = data.lmax || '--';
|
||||
}
|
||||
if (document.getElementById('live-lmin')) {
|
||||
document.getElementById('live-lmin').textContent = data.lmin || '--';
|
||||
}
|
||||
}
|
||||
|
||||
// Cleanup on page unload
|
||||
window.addEventListener('beforeunload', function() {
|
||||
if (currentWebSocket) {
|
||||
currentWebSocket.close();
|
||||
}
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
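The rolling 60-sample buffer that `updateLiveChart` maintains above (manual length check plus `shift()`) can be sketched in Python with `collections.deque`, which drops the oldest sample automatically once `maxlen` is reached. This is an illustrative mirror of the JavaScript logic, not code from this repository; the names `push_sample`, `lp_values`, etc. are hypothetical.

```python
from collections import deque

WINDOW = 60  # 1 minute at 1 sample/sec, matching the chart buffer above

# deque(maxlen=WINDOW) evicts the oldest entry on append once full,
# replacing the explicit length check + shift() in the JavaScript.
timestamps = deque(maxlen=WINDOW)
lp_values = deque(maxlen=WINDOW)
leq_values = deque(maxlen=WINDOW)

def push_sample(ts: str, lp: float, leq: float) -> None:
    """Append one live sample; buffers never exceed WINDOW points."""
    timestamps.append(ts)
    lp_values.append(lp)
    leq_values.append(leq)

# Simulate 75 incoming samples; only the most recent 60 survive.
for i in range(75):
    push_sample(f"t{i}", 50.0 + i, 55.0 + i)

print(len(lp_values))   # capped at 60
print(lp_values[0])     # oldest surviving sample is the 16th pushed: 65.0
```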
108 backend/init_projects_db.py Normal file
@@ -0,0 +1,108 @@
#!/usr/bin/env python3
"""
Database initialization script for Projects system.

This script creates the new project management tables and populates
the project_types table with default templates.

Usage:
    python -m backend.init_projects_db
"""

from sqlalchemy.orm import Session

from backend.database import engine, SessionLocal
from backend.models import (
    Base,
    ProjectType,
    Project,
    MonitoringLocation,
    UnitAssignment,
    ScheduledAction,
    RecordingSession,
    DataFile,
)
from datetime import datetime


def init_project_types(db: Session):
    """Initialize default project types."""
    project_types = [
        {
            "id": "sound_monitoring",
            "name": "Sound Monitoring",
            "description": "Noise monitoring projects with sound level meters and NRLs (Noise Recording Locations)",
            "icon": "volume-2",  # Lucide icon name
            "supports_sound": True,
            "supports_vibration": False,
        },
        {
            "id": "vibration_monitoring",
            "name": "Vibration Monitoring",
            "description": "Seismic/vibration monitoring projects with seismographs and monitoring points",
            "icon": "activity",  # Lucide icon name
            "supports_sound": False,
            "supports_vibration": True,
        },
        {
            "id": "combined",
            "name": "Combined Monitoring",
            "description": "Full-spectrum monitoring with both sound and vibration capabilities",
            "icon": "layers",  # Lucide icon name
            "supports_sound": True,
            "supports_vibration": True,
        },
    ]

    for pt_data in project_types:
        existing = db.query(ProjectType).filter_by(id=pt_data["id"]).first()
        if not existing:
            pt = ProjectType(**pt_data)
            db.add(pt)
            print(f"✓ Created project type: {pt_data['name']}")
        else:
            print(f"  Project type already exists: {pt_data['name']}")

    db.commit()


def create_tables():
    """Create all tables defined in models."""
    print("Creating project management tables...")
    Base.metadata.create_all(bind=engine)
    print("✓ Tables created successfully")


def main():
    print("=" * 60)
    print("Terra-View Projects System - Database Initialization")
    print("=" * 60)
    print()

    # Create tables
    create_tables()
    print()

    # Initialize project types
    db = SessionLocal()
    try:
        print("Initializing project types...")
        init_project_types(db)
        print()
        print("=" * 60)
        print("✓ Database initialization complete!")
        print("=" * 60)
        print()
        print("Next steps:")
        print("  1. Restart Terra-View to load new routes")
        print("  2. Navigate to /projects to create your first project")
        print("  3. Check documentation for API endpoints")
    except Exception as e:
        print(f"✗ Error during initialization: {e}")
        db.rollback()
        raise
    finally:
        db.close()


if __name__ == "__main__":
    main()
669 backend/main.py Normal file
@@ -0,0 +1,669 @@
import os
import logging
from fastapi import FastAPI, Request, Depends
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates
from fastapi.responses import HTMLResponse, FileResponse, JSONResponse
from fastapi.exceptions import RequestValidationError
from sqlalchemy.orm import Session
from typing import List, Dict, Optional
from pydantic import BaseModel

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

from backend.database import engine, Base, get_db
from backend.routers import roster, units, photos, roster_edit, roster_rename, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard, projects, project_locations, scheduler
from backend.services.snapshot import emit_status_snapshot
from backend.models import IgnoredUnit

# Create database tables
Base.metadata.create_all(bind=engine)

# Read environment (development or production)
ENVIRONMENT = os.getenv("ENVIRONMENT", "production")

# Initialize FastAPI app
VERSION = "0.4.2"
app = FastAPI(
    title="Seismo Fleet Manager",
    description="Backend API for managing seismograph fleet status",
    version=VERSION
)

# Add validation error handler to log details
@app.exception_handler(RequestValidationError)
async def validation_exception_handler(request: Request, exc: RequestValidationError):
    logger.error(f"Validation error on {request.url}: {exc.errors()}")
    logger.error(f"Body: {await request.body()}")
    return JSONResponse(
        status_code=400,
        content={"detail": exc.errors()}
    )

# Configure CORS
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Mount static files
app.mount("/static", StaticFiles(directory="backend/static"), name="static")

# Setup Jinja2 templates
templates = Jinja2Templates(directory="templates")

# Add custom context processor to inject environment variable into all templates
@app.middleware("http")
async def add_environment_to_context(request: Request, call_next):
    """Middleware to add environment variable to request state"""
    request.state.environment = ENVIRONMENT
    response = await call_next(request)
    return response

# Override TemplateResponse to include environment and version in context
original_template_response = templates.TemplateResponse
def custom_template_response(name, context=None, *args, **kwargs):
    if context is None:
        context = {}
    context["environment"] = ENVIRONMENT
    context["version"] = VERSION
    return original_template_response(name, context, *args, **kwargs)
templates.TemplateResponse = custom_template_response

# Include API routers
app.include_router(roster.router)
app.include_router(units.router)
app.include_router(photos.router)
app.include_router(roster_edit.router)
app.include_router(roster_rename.router)
app.include_router(dashboard.router)
app.include_router(dashboard_tabs.router)
app.include_router(activity.router)
app.include_router(slmm.router)
app.include_router(slm_ui.router)
app.include_router(slm_dashboard.router)
app.include_router(seismo_dashboard.router)

from backend.routers import settings
app.include_router(settings.router)

# Projects system routers
app.include_router(projects.router)
app.include_router(project_locations.router)
app.include_router(scheduler.router)

# Start scheduler service on application startup
from backend.services.scheduler import start_scheduler, stop_scheduler

@app.on_event("startup")
async def startup_event():
    """Initialize services on app startup"""
    logger.info("Starting scheduler service...")
    await start_scheduler()
    logger.info("Scheduler service started")

@app.on_event("shutdown")
def shutdown_event():
    """Clean up services on app shutdown"""
    logger.info("Stopping scheduler service...")
    stop_scheduler()
    logger.info("Scheduler service stopped")


# Legacy routes from the original backend
from backend import routes as legacy_routes
app.include_router(legacy_routes.router)


# HTML page routes
@app.get("/", response_class=HTMLResponse)
async def dashboard(request: Request):
    """Dashboard home page"""
    return templates.TemplateResponse("dashboard.html", {"request": request})


@app.get("/roster", response_class=HTMLResponse)
async def roster_page(request: Request):
    """Fleet roster page"""
    return templates.TemplateResponse("roster.html", {"request": request})


@app.get("/unit/{unit_id}", response_class=HTMLResponse)
async def unit_detail_page(request: Request, unit_id: str):
    """Unit detail page"""
    return templates.TemplateResponse("unit_detail.html", {
        "request": request,
        "unit_id": unit_id
    })

@app.get("/settings", response_class=HTMLResponse)
async def settings_page(request: Request):
    """Settings page for roster management"""
    return templates.TemplateResponse("settings.html", {"request": request})


@app.get("/sound-level-meters", response_class=HTMLResponse)
async def sound_level_meters_page(request: Request):
    """Sound Level Meters management dashboard"""
    return templates.TemplateResponse("sound_level_meters.html", {"request": request})


@app.get("/slm/{unit_id}", response_class=HTMLResponse)
async def slm_legacy_dashboard(
    request: Request,
    unit_id: str,
    from_project: Optional[str] = None,
    from_nrl: Optional[str] = None,
    db: Session = Depends(get_db)
):
    """Legacy SLM control center dashboard for a specific unit"""
    # Get project details if from_project is provided
    project = None
    if from_project:
        from backend.models import Project
        project = db.query(Project).filter_by(id=from_project).first()

    # Get NRL location details if from_nrl is provided
    nrl_location = None
    if from_nrl:
        from backend.models import NRLLocation
        nrl_location = db.query(NRLLocation).filter_by(id=from_nrl).first()

    return templates.TemplateResponse("slm_legacy_dashboard.html", {
        "request": request,
        "unit_id": unit_id,
        "from_project": from_project,
        "from_nrl": from_nrl,
        "project": project,
        "nrl_location": nrl_location
    })


@app.get("/seismographs", response_class=HTMLResponse)
async def seismographs_page(request: Request):
    """Seismographs management dashboard"""
    return templates.TemplateResponse("seismographs.html", {"request": request})


@app.get("/projects", response_class=HTMLResponse)
async def projects_page(request: Request):
    """Projects management and overview"""
    return templates.TemplateResponse("projects/overview.html", {"request": request})


@app.get("/projects/{project_id}", response_class=HTMLResponse)
async def project_detail_page(request: Request, project_id: str):
    """Project detail dashboard"""
    return templates.TemplateResponse("projects/detail.html", {
        "request": request,
        "project_id": project_id
    })


@app.get("/projects/{project_id}/nrl/{location_id}", response_class=HTMLResponse)
async def nrl_detail_page(
    request: Request,
    project_id: str,
    location_id: str,
    db: Session = Depends(get_db)
):
    """NRL (Noise Recording Location) detail page with tabs"""
    from backend.models import Project, MonitoringLocation, UnitAssignment, RosterUnit, RecordingSession, DataFile
    from sqlalchemy import and_

    # Get project
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        return templates.TemplateResponse("404.html", {
            "request": request,
            "message": "Project not found"
        }, status_code=404)

    # Get location
    location = db.query(MonitoringLocation).filter_by(
        id=location_id,
        project_id=project_id
    ).first()

    if not location:
        return templates.TemplateResponse("404.html", {
            "request": request,
            "message": "Location not found"
        }, status_code=404)

    # Get active assignment
    assignment = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.status == "active"
        )
    ).first()

    assigned_unit = None
    if assignment:
        assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()

    # Get session count
    session_count = db.query(RecordingSession).filter_by(location_id=location_id).count()

    # Get file count (DataFile links to session, not directly to location)
    file_count = db.query(DataFile).join(
        RecordingSession,
        DataFile.session_id == RecordingSession.id
    ).filter(RecordingSession.location_id == location_id).count()

    # Check for active session
    active_session = db.query(RecordingSession).filter(
        and_(
            RecordingSession.location_id == location_id,
            RecordingSession.status == "recording"
        )
    ).first()

    return templates.TemplateResponse("nrl_detail.html", {
        "request": request,
        "project_id": project_id,
        "location_id": location_id,
        "project": project,
        "location": location,
        "assignment": assignment,
        "assigned_unit": assigned_unit,
        "session_count": session_count,
        "file_count": file_count,
        "active_session": active_session,
    })

# ===== PWA ROUTES =====

@app.get("/sw.js")
async def service_worker():
    """Serve service worker with proper headers for PWA"""
    return FileResponse(
        "backend/static/sw.js",
        media_type="application/javascript",
        headers={
            "Service-Worker-Allowed": "/",
            "Cache-Control": "no-cache"
        }
    )


@app.get("/offline-db.js")
async def offline_db_script():
    """Serve offline database script"""
    return FileResponse(
        "backend/static/offline-db.js",
        media_type="application/javascript",
        headers={"Cache-Control": "no-cache"}
    )

# Pydantic model for sync edits
class EditItem(BaseModel):
    id: int
    unitId: str
    changes: Dict
    timestamp: int


class SyncEditsRequest(BaseModel):
    edits: List[EditItem]


@app.post("/api/sync-edits")
async def sync_edits(request: SyncEditsRequest, db: Session = Depends(get_db)):
    """Process offline edit queue and sync to database"""
    from backend.models import RosterUnit

    results = []
    synced_ids = []

    for edit in request.edits:
        try:
            # Find the unit
            unit = db.query(RosterUnit).filter_by(id=edit.unitId).first()

            if not unit:
                results.append({
                    "id": edit.id,
                    "status": "error",
                    "reason": f"Unit {edit.unitId} not found"
                })
                continue

            # Apply changes
            for key, value in edit.changes.items():
                if hasattr(unit, key):
                    # Handle boolean conversions
                    if key in ['deployed', 'retired']:
                        setattr(unit, key, value in ['true', True, 'True', '1', 1])
                    else:
                        setattr(unit, key, value if value != '' else None)

            db.commit()

            results.append({
                "id": edit.id,
                "status": "success"
            })
            synced_ids.append(edit.id)

        except Exception as e:
            db.rollback()
            results.append({
                "id": edit.id,
                "status": "error",
                "reason": str(e)
            })

    synced_count = len(synced_ids)

    return JSONResponse({
        "synced": synced_count,
        "total": len(request.edits),
        "synced_ids": synced_ids,
        "results": results
    })

@app.get("/partials/roster-deployed", response_class=HTMLResponse)
async def roster_deployed_partial(request: Request):
    """Partial template for deployed units tab"""
    from datetime import datetime
    snapshot = emit_status_snapshot()

    units_list = []
    for unit_id, unit_data in snapshot["active"].items():
        units_list.append({
            "id": unit_id,
            "status": unit_data.get("status", "Unknown"),
            "age": unit_data.get("age", "N/A"),
            "last_seen": unit_data.get("last", "Never"),
            "deployed": unit_data.get("deployed", False),
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "project_id": unit_data.get("project_id", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        })

    # Sort by status priority (Missing > Pending > OK) then by ID
    status_priority = {"Missing": 0, "Pending": 1, "OK": 2}
    units_list.sort(key=lambda x: (status_priority.get(x["status"], 3), x["id"]))

    return templates.TemplateResponse("partials/roster_table.html", {
        "request": request,
        "units": units_list,
        "timestamp": datetime.now().strftime("%H:%M:%S")
    })


@app.get("/partials/roster-benched", response_class=HTMLResponse)
async def roster_benched_partial(request: Request):
    """Partial template for benched units tab"""
    from datetime import datetime
    snapshot = emit_status_snapshot()

    units_list = []
    for unit_id, unit_data in snapshot["benched"].items():
        units_list.append({
            "id": unit_id,
            "status": unit_data.get("status", "N/A"),
            "age": unit_data.get("age", "N/A"),
            "last_seen": unit_data.get("last", "Never"),
            "deployed": unit_data.get("deployed", False),
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "project_id": unit_data.get("project_id", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        })

    # Sort by ID
    units_list.sort(key=lambda x: x["id"])

    return templates.TemplateResponse("partials/roster_table.html", {
        "request": request,
        "units": units_list,
        "timestamp": datetime.now().strftime("%H:%M:%S")
    })

@app.get("/partials/roster-retired", response_class=HTMLResponse)
async def roster_retired_partial(request: Request):
    """Partial template for retired units tab"""
    from datetime import datetime
    snapshot = emit_status_snapshot()

    units_list = []
    for unit_id, unit_data in snapshot["retired"].items():
        units_list.append({
            "id": unit_id,
            "status": unit_data["status"],
            "age": unit_data["age"],
            "last_seen": unit_data["last"],
            "deployed": unit_data["deployed"],
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        })

    # Sort by ID
    units_list.sort(key=lambda x: x["id"])

    return templates.TemplateResponse("partials/retired_table.html", {
        "request": request,
        "units": units_list,
        "timestamp": datetime.now().strftime("%H:%M:%S")
    })


@app.get("/partials/roster-ignored", response_class=HTMLResponse)
async def roster_ignored_partial(request: Request, db: Session = Depends(get_db)):
    """Partial template for ignored units tab"""
    from datetime import datetime

    ignored = db.query(IgnoredUnit).all()
    ignored_list = []
    for unit in ignored:
        ignored_list.append({
            "id": unit.id,
            "reason": unit.reason or "",
            "ignored_at": unit.ignored_at.strftime("%Y-%m-%d %H:%M:%S") if unit.ignored_at else "Unknown"
        })

    # Sort by ID
    ignored_list.sort(key=lambda x: x["id"])

    return templates.TemplateResponse("partials/ignored_table.html", {
        "request": request,
        "ignored_units": ignored_list,
        "timestamp": datetime.now().strftime("%H:%M:%S")
    })


@app.get("/partials/unknown-emitters", response_class=HTMLResponse)
async def unknown_emitters_partial(request: Request):
    """Partial template for unknown emitters (HTMX)"""
    snapshot = emit_status_snapshot()

    unknown_list = []
    for unit_id, unit_data in snapshot.get("unknown", {}).items():
        unknown_list.append({
            "id": unit_id,
            "status": unit_data["status"],
            "age": unit_data["age"],
            "fname": unit_data.get("fname", ""),
        })

    # Sort by ID
    unknown_list.sort(key=lambda x: x["id"])

    return templates.TemplateResponse("partials/unknown_emitters.html", {
        "request": request,
        "unknown_units": unknown_list
    })

@app.get("/partials/devices-all", response_class=HTMLResponse)
async def devices_all_partial(request: Request):
    """Unified partial template for ALL devices with comprehensive filtering support"""
    from datetime import datetime
    snapshot = emit_status_snapshot()

    units_list = []

    # Add deployed/active units
    for unit_id, unit_data in snapshot["active"].items():
        units_list.append({
            "id": unit_id,
            "status": unit_data.get("status", "Unknown"),
            "age": unit_data.get("age", "N/A"),
            "last_seen": unit_data.get("last", "Never"),
            "deployed": True,
            "retired": False,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "project_id": unit_data.get("project_id", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        })

    # Add benched units
    for unit_id, unit_data in snapshot["benched"].items():
        units_list.append({
            "id": unit_id,
            "status": unit_data.get("status", "N/A"),
            "age": unit_data.get("age", "N/A"),
            "last_seen": unit_data.get("last", "Never"),
            "deployed": False,
            "retired": False,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "project_id": unit_data.get("project_id", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        })

    # Add retired units
    for unit_id, unit_data in snapshot["retired"].items():
        units_list.append({
            "id": unit_id,
            "status": "Retired",
            "age": "N/A",
            "last_seen": "N/A",
            "deployed": False,
            "retired": True,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "project_id": unit_data.get("project_id", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        })

    # Add ignored units
    for unit_id, unit_data in snapshot.get("ignored", {}).items():
        units_list.append({
            "id": unit_id,
            "status": "Ignored",
            "age": "N/A",
            "last_seen": "N/A",
            "deployed": False,
            "retired": False,
            "ignored": True,
            "note": unit_data.get("note", unit_data.get("reason", "")),
            "device_type": unit_data.get("device_type", "unknown"),
            "address": "",
            "coordinates": "",
            "project_id": "",
            "last_calibrated": None,
            "next_calibration_due": None,
            "deployed_with_modem_id": None,
            "ip_address": None,
            "phone_number": None,
            "hardware_model": None,
        })

    # Sort by status category, then by ID
    def sort_key(unit):
        # Priority: deployed (active) -> benched -> retired -> ignored
        if unit["deployed"]:
            return (0, unit["id"])
        elif not unit["retired"] and not unit["ignored"]:
            return (1, unit["id"])
        elif unit["retired"]:
            return (2, unit["id"])
        else:
            return (3, unit["id"])

    units_list.sort(key=sort_key)

    return templates.TemplateResponse("partials/devices_table.html", {
        "request": request,
        "units": units_list,
        "timestamp": datetime.now().strftime("%H:%M:%S")
    })


@app.get("/health")
def health_check():
    """Health check endpoint"""
    return {
        "message": f"Seismo Fleet Manager v{VERSION}",
        "status": "running",
        "version": VERSION
    }


if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8001)
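The `/api/sync-edits` handler above coerces booleans for `deployed`/`retired` with a membership test and turns empty strings into `None`. A minimal sketch of the payload shape and that coercion, in plain Python with no FastAPI (the unit ID and field values are hypothetical; field names come from the `EditItem` model above):

```python
def coerce_bool(value):
    # Same rule as sync_edits: accept common truthy string/int spellings.
    return value in ['true', True, 'True', '1', 1]

# Example offline-edit queue entry matching the EditItem model.
edit = {
    "id": 1,
    "unitId": "UNIT-001",  # hypothetical unit ID
    "changes": {"deployed": "true", "note": "", "address": "123 Main St"},
    "timestamp": 1736400000,
}

applied = {}
for key, value in edit["changes"].items():
    if key in ("deployed", "retired"):
        applied[key] = coerce_bool(value)
    else:
        applied[key] = value if value != "" else None  # empty string -> NULL

print(applied)  # {'deployed': True, 'note': None, 'address': '123 Main St'}
```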
84 backend/migrate_add_device_types.py Normal file
@@ -0,0 +1,84 @@
"""
Migration script to add device type support to the roster table.

This adds columns for:
- device_type (seismograph/modem discriminator)
- Seismograph-specific fields (calibration dates, modem pairing)
- Modem-specific fields (IP address, phone number, hardware model)

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"

def migrate_database():
    """Add new columns to the roster table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if device_type column already exists
    cursor.execute("PRAGMA table_info(roster)")
    columns = [col[1] for col in cursor.fetchall()]

    if "device_type" in columns:
        print("Migration already applied - device_type column exists")
        conn.close()
        return

    print("Adding new columns to roster table...")

    try:
        # Add device type discriminator
        cursor.execute("ALTER TABLE roster ADD COLUMN device_type TEXT DEFAULT 'seismograph'")
        print("  ✓ Added device_type column")

        # Add seismograph-specific fields
        cursor.execute("ALTER TABLE roster ADD COLUMN last_calibrated DATE")
        print("  ✓ Added last_calibrated column")

        cursor.execute("ALTER TABLE roster ADD COLUMN next_calibration_due DATE")
        print("  ✓ Added next_calibration_due column")

        cursor.execute("ALTER TABLE roster ADD COLUMN deployed_with_modem_id TEXT")
        print("  ✓ Added deployed_with_modem_id column")

        # Add modem-specific fields
        cursor.execute("ALTER TABLE roster ADD COLUMN ip_address TEXT")
        print("  ✓ Added ip_address column")

        cursor.execute("ALTER TABLE roster ADD COLUMN phone_number TEXT")
        print("  ✓ Added phone_number column")

        cursor.execute("ALTER TABLE roster ADD COLUMN hardware_model TEXT")
        print("  ✓ Added hardware_model column")

        # Set all existing units to seismograph type
        cursor.execute("UPDATE roster SET device_type = 'seismograph' WHERE device_type IS NULL")
        print("  ✓ Set existing units to seismograph type")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
|
||||
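Reviewer note: the migration above uses a check-then-alter pattern (inspect `PRAGMA table_info`, then `ALTER TABLE` per missing column). A minimal standalone sketch of that pattern against an in-memory database — `add_column_if_missing` is a hypothetical helper name, not part of this commit:

```python
import sqlite3


def add_column_if_missing(conn, table, column, col_type):
    """Hypothetical helper: add a column only if the table lacks it. Returns True if added."""
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column in cols:
        return False
    conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {col_type}")
    return True


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE roster (id TEXT PRIMARY KEY)")

# First call adds the column; a rerun is a no-op, which makes the migration idempotent
assert add_column_if_missing(conn, "roster", "device_type", "TEXT DEFAULT 'seismograph'")
assert not add_column_if_missing(conn, "roster", "device_type", "TEXT")
```

This is why the scripts here can safely be run more than once: the second run exits before issuing any `ALTER TABLE`.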
78
backend/migrate_add_slm_fields.py
Normal file
@@ -0,0 +1,78 @@
#!/usr/bin/env python3
"""
Database migration: Add sound level meter fields to roster table.

Adds columns for sound_level_meter device type support.
"""

import sqlite3
from pathlib import Path


def migrate():
    """Add SLM fields to roster table if they don't exist."""

    # Try multiple possible database locations
    possible_paths = [
        Path("data/seismo_fleet.db"),
        Path("data/sfm.db"),
        Path("data/seismo.db"),
    ]

    db_path = None
    for path in possible_paths:
        if path.exists():
            db_path = path
            break

    if db_path is None:
        print(f"Database not found in any of: {[str(p) for p in possible_paths]}")
        print("Creating the database from models.py will include the new fields automatically.")
        return

    print(f"Using database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    # Check if columns already exist
    cursor.execute("PRAGMA table_info(roster)")
    existing_columns = {row[1] for row in cursor.fetchall()}

    new_columns = {
        "slm_host": "TEXT",
        "slm_tcp_port": "INTEGER",
        "slm_model": "TEXT",
        "slm_serial_number": "TEXT",
        "slm_frequency_weighting": "TEXT",
        "slm_time_weighting": "TEXT",
        "slm_measurement_range": "TEXT",
        "slm_last_check": "DATETIME",
    }

    migrations_applied = []

    for column_name, column_type in new_columns.items():
        if column_name not in existing_columns:
            try:
                cursor.execute(f"ALTER TABLE roster ADD COLUMN {column_name} {column_type}")
                migrations_applied.append(column_name)
                print(f"✓ Added column: {column_name} ({column_type})")
            except sqlite3.OperationalError as e:
                print(f"✗ Failed to add column {column_name}: {e}")
        else:
            print(f"○ Column already exists: {column_name}")

    conn.commit()
    conn.close()

    if migrations_applied:
        print(f"\n✓ Migration complete! Added {len(migrations_applied)} new columns.")
    else:
        print("\n○ No migration needed - all columns already exist.")

    print("\nSound level meter fields are now available in the roster table.")
    print("You can now set device_type='sound_level_meter' for SLM devices.")


if __name__ == "__main__":
    migrate()
78
backend/migrate_add_unit_history.py
Normal file
@@ -0,0 +1,78 @@
"""
Migration script to add unit history timeline support.

This creates the unit_history table to track all changes to units:
- Note changes (archived old notes, new notes)
- Deployment status changes (deployed/benched)
- Retired status changes
- Other field changes

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Create the unit_history table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if unit_history table already exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='unit_history'")
    if cursor.fetchone():
        print("Migration already applied - unit_history table exists")
        conn.close()
        return

    print("Creating unit_history table...")

    try:
        cursor.execute("""
            CREATE TABLE unit_history (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                unit_id TEXT NOT NULL,
                change_type TEXT NOT NULL,
                field_name TEXT,
                old_value TEXT,
                new_value TEXT,
                changed_at TIMESTAMP NOT NULL,
                source TEXT DEFAULT 'manual',
                notes TEXT
            )
        """)
        print(" ✓ Created unit_history table")

        # Create indexes for better query performance
        cursor.execute("CREATE INDEX idx_unit_history_unit_id ON unit_history(unit_id)")
        print(" ✓ Created index on unit_id")

        cursor.execute("CREATE INDEX idx_unit_history_changed_at ON unit_history(changed_at)")
        print(" ✓ Created index on changed_at")

        conn.commit()
        print("\nMigration completed successfully!")
        print("Units will now track their complete history of changes.")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
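Reviewer note: the `unit_history` schema above can be exercised with plain `sqlite3` to see how the timeline reads back. A minimal sketch — the unit id `BE1234` and the field values are illustrative only:

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
# Same table definition as the migration creates
conn.execute("""
    CREATE TABLE unit_history (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        unit_id TEXT NOT NULL,
        change_type TEXT NOT NULL,
        field_name TEXT,
        old_value TEXT,
        new_value TEXT,
        changed_at TIMESTAMP NOT NULL,
        source TEXT DEFAULT 'manual',
        notes TEXT
    )
""")

# Record a deployment status change for a (made-up) unit
conn.execute(
    "INSERT INTO unit_history (unit_id, change_type, field_name, old_value, new_value, changed_at)"
    " VALUES (?, ?, ?, ?, ?, ?)",
    ("BE1234", "deployed_change", "deployed", "0", "1", datetime.utcnow().isoformat()),
)

# The per-unit timeline is a simple ordered query, served by the unit_id index
timeline = conn.execute(
    "SELECT change_type, new_value FROM unit_history WHERE unit_id = ? ORDER BY changed_at",
    ("BE1234",),
).fetchall()
```

The two indexes the migration adds (`unit_id`, `changed_at`) match exactly the filter and sort used by this kind of timeline query.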
80
backend/migrate_add_user_preferences.py
Normal file
@@ -0,0 +1,80 @@
"""
Migration script to add user_preferences table.

This creates a new table for storing persistent user preferences:
- Display settings (timezone, theme, date format)
- Auto-refresh configuration
- Calibration defaults
- Status threshold customization

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Create user_preferences table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if user_preferences table already exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='user_preferences'")
    table_exists = cursor.fetchone()

    if table_exists:
        print("Migration already applied - user_preferences table exists")
        conn.close()
        return

    print("Creating user_preferences table...")

    try:
        cursor.execute("""
            CREATE TABLE user_preferences (
                id INTEGER PRIMARY KEY DEFAULT 1,
                timezone TEXT DEFAULT 'America/New_York',
                theme TEXT DEFAULT 'auto',
                auto_refresh_interval INTEGER DEFAULT 10,
                date_format TEXT DEFAULT 'MM/DD/YYYY',
                table_rows_per_page INTEGER DEFAULT 25,
                calibration_interval_days INTEGER DEFAULT 365,
                calibration_warning_days INTEGER DEFAULT 30,
                status_ok_threshold_hours INTEGER DEFAULT 12,
                status_pending_threshold_hours INTEGER DEFAULT 24,
                updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)
        print(" ✓ Created user_preferences table")

        # Insert default preferences
        cursor.execute("""
            INSERT INTO user_preferences (id) VALUES (1)
        """)
        print(" ✓ Inserted default preferences")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
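Reviewer note: `user_preferences` is deliberately a single-row table (always `id=1`), seeded at migration time so later code can assume the row exists. A minimal sketch of how reads and writes work against that pattern; the column subset and the value `30` are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Trimmed-down version of the table, with the same column defaults
conn.execute("""
    CREATE TABLE user_preferences (
        id INTEGER PRIMARY KEY DEFAULT 1,
        timezone TEXT DEFAULT 'America/New_York',
        auto_refresh_interval INTEGER DEFAULT 10
    )
""")
# Seed the single row, as the migration does
conn.execute("INSERT INTO user_preferences (id) VALUES (1)")

# All updates target id=1, so the table never grows past one row
conn.execute("UPDATE user_preferences SET auto_refresh_interval = ? WHERE id = 1", (30,))

row = conn.execute(
    "SELECT timezone, auto_refresh_interval FROM user_preferences WHERE id = 1"
).fetchone()
```

Column-level `DEFAULT`s mean unset preferences fall back sensibly without any application-side fallback logic.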
277
backend/models.py
Normal file
@@ -0,0 +1,277 @@
from sqlalchemy import Column, String, DateTime, Boolean, Text, Date, Integer
from datetime import datetime
from backend.database import Base


class Emitter(Base):
    __tablename__ = "emitters"

    id = Column(String, primary_key=True, index=True)
    unit_type = Column(String, nullable=False)
    last_seen = Column(DateTime, default=datetime.utcnow)
    last_file = Column(String, nullable=False)
    status = Column(String, nullable=False)
    notes = Column(String, nullable=True)


class RosterUnit(Base):
    """
    Roster table: represents our *intended assignment* of a unit.
    This is editable from the GUI.

    Supports multiple device types (seismograph, modem, sound_level_meter) with type-specific fields.
    """
    __tablename__ = "roster"

    # Core fields (all device types)
    id = Column(String, primary_key=True, index=True)
    unit_type = Column(String, default="series3")  # Backward compatibility
    device_type = Column(String, default="seismograph")  # "seismograph" | "modem" | "sound_level_meter"
    deployed = Column(Boolean, default=True)
    retired = Column(Boolean, default=False)
    note = Column(String, nullable=True)
    project_id = Column(String, nullable=True)
    location = Column(String, nullable=True)  # Legacy field - use address/coordinates instead
    address = Column(String, nullable=True)  # Human-readable address
    coordinates = Column(String, nullable=True)  # Lat,Lon format: "34.0522,-118.2437"
    last_updated = Column(DateTime, default=datetime.utcnow)

    # Seismograph-specific fields (nullable for modems and SLMs)
    last_calibrated = Column(Date, nullable=True)
    next_calibration_due = Column(Date, nullable=True)

    # Modem assignment (shared by seismographs and SLMs)
    deployed_with_modem_id = Column(String, nullable=True)  # FK to another RosterUnit (device_type=modem)

    # Modem-specific fields (nullable for seismographs and SLMs)
    ip_address = Column(String, nullable=True)
    phone_number = Column(String, nullable=True)
    hardware_model = Column(String, nullable=True)

    # Sound Level Meter-specific fields (nullable for seismographs and modems)
    slm_host = Column(String, nullable=True)  # Device IP or hostname
    slm_tcp_port = Column(Integer, nullable=True)  # TCP control port (default 2255)
    slm_ftp_port = Column(Integer, nullable=True)  # FTP data retrieval port (default 21)
    slm_model = Column(String, nullable=True)  # NL-43, NL-53, etc.
    slm_serial_number = Column(String, nullable=True)  # Device serial number
    slm_frequency_weighting = Column(String, nullable=True)  # A, C, Z
    slm_time_weighting = Column(String, nullable=True)  # F (Fast), S (Slow), I (Impulse)
    slm_measurement_range = Column(String, nullable=True)  # e.g., "30-130 dB"
    slm_last_check = Column(DateTime, nullable=True)  # Last communication check


class IgnoredUnit(Base):
    """
    Ignored units: units that report but should be filtered out from unknown emitters.
    Used to suppress noise from old projects.
    """
    __tablename__ = "ignored_units"

    id = Column(String, primary_key=True, index=True)
    reason = Column(String, nullable=True)
    ignored_at = Column(DateTime, default=datetime.utcnow)


class UnitHistory(Base):
    """
    Unit history: complete timeline of changes to each unit.
    Tracks note changes, status changes, deployment/benched events, and more.
    """
    __tablename__ = "unit_history"

    id = Column(Integer, primary_key=True, autoincrement=True)
    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id
    change_type = Column(String, nullable=False)  # note_change, deployed_change, retired_change, etc.
    field_name = Column(String, nullable=True)  # Which field changed
    old_value = Column(Text, nullable=True)  # Previous value
    new_value = Column(Text, nullable=True)  # New value
    changed_at = Column(DateTime, default=datetime.utcnow, nullable=False, index=True)
    source = Column(String, default="manual")  # manual, csv_import, telemetry, offline_sync
    notes = Column(Text, nullable=True)  # Optional reason/context for the change


class UserPreferences(Base):
    """
    User preferences: persistent storage for application settings.
    Single-row table (id=1) to store global user preferences.
    """
    __tablename__ = "user_preferences"

    id = Column(Integer, primary_key=True, default=1)
    timezone = Column(String, default="America/New_York")
    theme = Column(String, default="auto")  # auto, light, dark
    auto_refresh_interval = Column(Integer, default=10)  # seconds
    date_format = Column(String, default="MM/DD/YYYY")
    table_rows_per_page = Column(Integer, default=25)
    calibration_interval_days = Column(Integer, default=365)
    calibration_warning_days = Column(Integer, default=30)
    status_ok_threshold_hours = Column(Integer, default=12)
    status_pending_threshold_hours = Column(Integer, default=24)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)


# ============================================================================
# Project Management System
# ============================================================================

class ProjectType(Base):
    """
    Project type templates: defines available project types and their capabilities.
    Pre-populated with: sound_monitoring, vibration_monitoring, combined.
    """
    __tablename__ = "project_types"

    id = Column(String, primary_key=True)  # sound_monitoring, vibration_monitoring, combined
    name = Column(String, nullable=False, unique=True)  # "Sound Monitoring", "Vibration Monitoring"
    description = Column(Text, nullable=True)
    icon = Column(String, nullable=True)  # Icon identifier for UI
    supports_sound = Column(Boolean, default=False)  # Enables SLM features
    supports_vibration = Column(Boolean, default=False)  # Enables seismograph features
    created_at = Column(DateTime, default=datetime.utcnow)


class Project(Base):
    """
    Projects: top-level organization for monitoring work.
    Type-aware to enable/disable features based on project_type_id.
    """
    __tablename__ = "projects"

    id = Column(String, primary_key=True, index=True)  # UUID
    name = Column(String, nullable=False, unique=True)
    description = Column(Text, nullable=True)
    project_type_id = Column(String, nullable=False)  # FK to ProjectType.id
    status = Column(String, default="active")  # active, completed, archived

    # Project metadata
    client_name = Column(String, nullable=True)
    site_address = Column(String, nullable=True)
    site_coordinates = Column(String, nullable=True)  # "lat,lon"
    start_date = Column(Date, nullable=True)
    end_date = Column(Date, nullable=True)

    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)


class MonitoringLocation(Base):
    """
    Monitoring locations: generic location for monitoring activities.
    Can be NRL (Noise Recording Location) for sound projects,
    or monitoring point for vibration projects.
    """
    __tablename__ = "monitoring_locations"

    id = Column(String, primary_key=True, index=True)  # UUID
    project_id = Column(String, nullable=False, index=True)  # FK to Project.id
    location_type = Column(String, nullable=False)  # "sound" | "vibration"

    name = Column(String, nullable=False)  # NRL-001, VP-North, etc.
    description = Column(Text, nullable=True)
    coordinates = Column(String, nullable=True)  # "lat,lon"
    address = Column(String, nullable=True)

    # Type-specific metadata stored as JSON
    # For sound: {"ambient_conditions": "urban", "expected_sources": ["traffic"]}
    # For vibration: {"ground_type": "bedrock", "depth": "10m"}
    location_metadata = Column(Text, nullable=True)

    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)


class UnitAssignment(Base):
    """
    Unit assignments: links devices (SLMs or seismographs) to monitoring locations.
    Supports temporary assignments with assigned_until.
    """
    __tablename__ = "unit_assignments"

    id = Column(String, primary_key=True, index=True)  # UUID
    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id
    location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id

    assigned_at = Column(DateTime, default=datetime.utcnow)
    assigned_until = Column(DateTime, nullable=True)  # Null = indefinite
    status = Column(String, default="active")  # active, completed, cancelled
    notes = Column(Text, nullable=True)

    # Denormalized for efficient queries
    device_type = Column(String, nullable=False)  # sound_level_meter | seismograph
    project_id = Column(String, nullable=False, index=True)  # FK to Project.id

    created_at = Column(DateTime, default=datetime.utcnow)


class ScheduledAction(Base):
    """
    Scheduled actions: automation for recording start/stop/download.
    Terra-View executes these by calling SLMM or SFM endpoints.
    """
    __tablename__ = "scheduled_actions"

    id = Column(String, primary_key=True, index=True)  # UUID
    project_id = Column(String, nullable=False, index=True)  # FK to Project.id
    location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
    unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (nullable if location-based)

    action_type = Column(String, nullable=False)  # start, stop, download, calibrate
    device_type = Column(String, nullable=False)  # sound_level_meter | seismograph

    scheduled_time = Column(DateTime, nullable=False, index=True)
    executed_at = Column(DateTime, nullable=True)
    execution_status = Column(String, default="pending")  # pending, completed, failed, cancelled

    # Response from device module (SLMM or SFM)
    module_response = Column(Text, nullable=True)  # JSON
    error_message = Column(Text, nullable=True)

    notes = Column(Text, nullable=True)
    created_at = Column(DateTime, default=datetime.utcnow)


class RecordingSession(Base):
    """
    Recording sessions: tracks actual monitoring sessions.
    Created when recording starts, updated when it stops.
    """
    __tablename__ = "recording_sessions"

    id = Column(String, primary_key=True, index=True)  # UUID
    project_id = Column(String, nullable=False, index=True)  # FK to Project.id
    location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id

    session_type = Column(String, nullable=False)  # sound | vibration
    started_at = Column(DateTime, nullable=False)
    stopped_at = Column(DateTime, nullable=True)
    duration_seconds = Column(Integer, nullable=True)
    status = Column(String, default="recording")  # recording, completed, failed

    # Snapshot of device configuration at recording time
    session_metadata = Column(Text, nullable=True)  # JSON

    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)


class DataFile(Base):
    """
    Data files: references to recorded data files.
    Terra-View tracks file metadata; actual files stored in data/Projects/ directory.
    """
    __tablename__ = "data_files"

    id = Column(String, primary_key=True, index=True)  # UUID
    session_id = Column(String, nullable=False, index=True)  # FK to RecordingSession.id

    file_path = Column(String, nullable=False)  # Relative to data/Projects/
    file_type = Column(String, nullable=False)  # wav, csv, mseed, json
    file_size_bytes = Column(Integer, nullable=True)
    downloaded_at = Column(DateTime, nullable=True)
    checksum = Column(String, nullable=True)  # SHA256 or MD5

    # Additional file metadata
    file_metadata = Column(Text, nullable=True)  # JSON

    created_at = Column(DateTime, default=datetime.utcnow)
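Reviewer note: the `device_type` discriminator in `RosterUnit` keeps all device kinds in one `roster` table and filters by type at query time. A self-contained sketch of that pattern using a trimmed-down model (the real model imports `Base` from `backend.database`; the ids `BE1234` and `MODEM-01` are made up):

```python
from sqlalchemy import create_engine, Column, String, Boolean
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()


class RosterUnit(Base):
    """Minimal stand-in for backend.models.RosterUnit (core fields only)."""
    __tablename__ = "roster"

    id = Column(String, primary_key=True)
    device_type = Column(String, default="seismograph")
    deployed = Column(Boolean, default=True)


engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # Defaults make seismographs the common case; modems opt in explicitly
    session.add_all([
        RosterUnit(id="BE1234"),
        RosterUnit(id="MODEM-01", device_type="modem", deployed=False),
    ])
    session.commit()
    modems = session.query(RosterUnit).filter_by(device_type="modem").all()
```

The type-specific columns stay nullable, so a single `ALTER TABLE`-based migration path serves every device kind.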
@@ -4,8 +4,8 @@ from sqlalchemy import desc
 from pathlib import Path
 from datetime import datetime, timedelta, timezone
 from typing import List, Dict, Any
-from app.seismo.database import get_db
-from app.seismo.models import UnitHistory, Emitter, RosterUnit
+from backend.database import get_db
+from backend.models import UnitHistory, Emitter, RosterUnit

 router = APIRouter(prefix="/api", tags=["activity"])
@@ -1,10 +1,10 @@
 from fastapi import APIRouter, Request, Depends
 from fastapi.templating import Jinja2Templates

-from app.seismo.services.snapshot import emit_status_snapshot
+from backend.services.snapshot import emit_status_snapshot

 router = APIRouter()
-templates = Jinja2Templates(directory="app/ui/templates")
+templates = Jinja2Templates(directory="templates")


 @router.get("/dashboard/active")
@@ -2,8 +2,8 @@
 from fastapi import APIRouter, Depends
 from sqlalchemy.orm import Session

-from app.seismo.database import get_db
-from app.seismo.services.snapshot import emit_status_snapshot
+from backend.database import get_db
+from backend.services.snapshot import emit_status_snapshot

 router = APIRouter(prefix="/dashboard", tags=["dashboard-tabs"])
@@ -8,8 +8,8 @@ import shutil
 from PIL import Image
 from PIL.ExifTags import TAGS, GPSTAGS
 from sqlalchemy.orm import Session
-from app.seismo.database import get_db
-from app.seismo.models import RosterUnit
+from backend.database import get_db
+from backend.models import RosterUnit

 router = APIRouter(prefix="/api", tags=["photos"])
488
backend/routers/project_locations.py
Normal file
@@ -0,0 +1,488 @@
|
||||
"""
|
||||
Project Locations Router
|
||||
|
||||
Handles monitoring locations (NRLs for sound, monitoring points for vibration)
|
||||
and unit assignments within projects.
|
||||
"""
|
||||
|
||||
from fastapi import APIRouter, Request, Depends, HTTPException, Query
|
||||
from fastapi.templating import Jinja2Templates
|
||||
from fastapi.responses import HTMLResponse, JSONResponse
|
||||
from sqlalchemy.orm import Session
|
||||
from sqlalchemy import and_, or_
|
||||
from datetime import datetime
|
||||
from typing import Optional
|
||||
import uuid
|
||||
import json
|
||||
|
||||
from backend.database import get_db
|
||||
from backend.models import (
|
||||
Project,
|
||||
ProjectType,
|
||||
MonitoringLocation,
|
||||
UnitAssignment,
|
||||
RosterUnit,
|
||||
RecordingSession,
|
||||
)
|
||||
|
||||
router = APIRouter(prefix="/api/projects/{project_id}", tags=["project-locations"])
|
||||
templates = Jinja2Templates(directory="templates")
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Monitoring Locations CRUD
|
||||
# ============================================================================
|
||||
|
||||
@router.get("/locations", response_class=HTMLResponse)
|
||||
async def get_project_locations(
|
||||
project_id: str,
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
location_type: Optional[str] = Query(None),
|
||||
):
|
||||
"""
|
||||
Get all monitoring locations for a project.
|
||||
Returns HTML partial with location list.
|
||||
"""
|
||||
project = db.query(Project).filter_by(id=project_id).first()
|
||||
if not project:
|
||||
raise HTTPException(status_code=404, detail="Project not found")
|
||||
|
||||
query = db.query(MonitoringLocation).filter_by(project_id=project_id)
|
||||
|
||||
# Filter by type if provided
|
||||
if location_type:
|
||||
query = query.filter_by(location_type=location_type)
|
||||
|
||||
locations = query.order_by(MonitoringLocation.name).all()
|
||||
|
||||
# Enrich with assignment info
|
||||
locations_data = []
|
||||
for location in locations:
|
||||
# Get active assignment
|
||||
assignment = db.query(UnitAssignment).filter(
|
||||
and_(
|
||||
UnitAssignment.location_id == location.id,
|
||||
UnitAssignment.status == "active",
|
||||
)
|
||||
).first()
|
||||
|
||||
assigned_unit = None
|
||||
if assignment:
|
||||
assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
|
||||
|
||||
# Count recording sessions
|
||||
session_count = db.query(RecordingSession).filter_by(
|
||||
location_id=location.id
|
||||
).count()
|
||||
|
||||
locations_data.append({
|
||||
"location": location,
|
||||
"assignment": assignment,
|
||||
"assigned_unit": assigned_unit,
|
||||
"session_count": session_count,
|
||||
})
|
||||
|
||||
return templates.TemplateResponse("partials/projects/location_list.html", {
|
||||
"request": request,
|
||||
"project": project,
|
||||
"locations": locations_data,
|
||||
})
|
||||
|
||||
|
||||
@router.post("/locations/create")
|
||||
async def create_location(
|
||||
project_id: str,
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Create a new monitoring location within a project.
|
||||
"""
|
||||
project = db.query(Project).filter_by(id=project_id).first()
|
||||
if not project:
|
||||
raise HTTPException(status_code=404, detail="Project not found")
|
||||
|
||||
form_data = await request.form()
|
||||
|
||||
location = MonitoringLocation(
|
||||
id=str(uuid.uuid4()),
|
||||
project_id=project_id,
|
||||
location_type=form_data.get("location_type"),
|
||||
name=form_data.get("name"),
|
||||
description=form_data.get("description"),
|
||||
coordinates=form_data.get("coordinates"),
|
||||
address=form_data.get("address"),
|
||||
location_metadata=form_data.get("location_metadata"), # JSON string
|
||||
)
|
||||
|
||||
db.add(location)
|
||||
db.commit()
|
||||
db.refresh(location)
|
||||
|
||||
return JSONResponse({
|
||||
"success": True,
|
||||
"location_id": location.id,
|
||||
"message": f"Location '{location.name}' created successfully",
|
||||
})
|
||||
|
||||
|
||||
@router.put("/locations/{location_id}")
|
||||
async def update_location(
|
||||
project_id: str,
|
||||
location_id: str,
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Update a monitoring location.
|
||||
"""
|
||||
location = db.query(MonitoringLocation).filter_by(
|
||||
id=location_id,
|
||||
project_id=project_id,
|
||||
).first()
|
||||
|
||||
if not location:
|
||||
raise HTTPException(status_code=404, detail="Location not found")
|
||||
|
||||
data = await request.json()
|
||||
|
||||
# Update fields if provided
|
||||
if "name" in data:
|
||||
location.name = data["name"]
|
||||
if "description" in data:
|
||||
location.description = data["description"]
|
||||
if "location_type" in data:
|
||||
location.location_type = data["location_type"]
|
||||
if "coordinates" in data:
|
||||
location.coordinates = data["coordinates"]
|
||||
if "address" in data:
|
||||
location.address = data["address"]
|
||||
if "location_metadata" in data:
|
||||
location.location_metadata = data["location_metadata"]
|
||||
|
||||
location.updated_at = datetime.utcnow()
|
||||
|
||||
db.commit()
|
||||
|
||||
return {"success": True, "message": "Location updated successfully"}
|
||||
|
||||
|
||||
@router.delete("/locations/{location_id}")
|
||||
async def delete_location(
|
||||
project_id: str,
|
||||
location_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Delete a monitoring location.
|
||||
"""
|
||||
location = db.query(MonitoringLocation).filter_by(
|
||||
id=location_id,
|
||||
project_id=project_id,
|
||||
).first()
|
||||
|
||||
if not location:
|
||||
raise HTTPException(status_code=404, detail="Location not found")
|
||||
|
||||
# Check if location has active assignments
|
||||
active_assignments = db.query(UnitAssignment).filter(
|
||||
and_(
|
||||
UnitAssignment.location_id == location_id,
|
||||
UnitAssignment.status == "active",
|
||||
)
|
||||
).count()
|
||||
|
||||
if active_assignments > 0:
|
||||
raise HTTPException(
|
||||
status_code=400,
|
||||
detail="Cannot delete location with active unit assignments. Unassign units first.",
|
||||
)
|
||||
|
||||
db.delete(location)
|
||||
db.commit()
|
||||
|
||||
return {"success": True, "message": "Location deleted successfully"}
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Unit Assignments
|
||||
# ============================================================================
|
||||
|
||||
@router.get("/assignments", response_class=HTMLResponse)
|
||||
async def get_project_assignments(
|
||||
project_id: str,
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
status: Optional[str] = Query("active"),
|
||||
):
|
||||
"""
|
||||
Get all unit assignments for a project.
|
||||
Returns HTML partial with assignment list.
|
||||
"""
|
||||
query = db.query(UnitAssignment).filter_by(project_id=project_id)
|
||||
|
||||
if status:
|
||||
query = query.filter_by(status=status)
|
||||
|
||||
assignments = query.order_by(UnitAssignment.assigned_at.desc()).all()
|
||||
|
||||
# Enrich with unit and location details
|
||||
assignments_data = []
|
||||
for assignment in assignments:
|
||||
unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
|
||||
location = db.query(MonitoringLocation).filter_by(id=assignment.location_id).first()
|
||||
|
||||
assignments_data.append({
|
||||
"assignment": assignment,
|
||||
"unit": unit,
|
||||
"location": location,
|
||||
})
|
||||
|
||||
return templates.TemplateResponse("partials/projects/assignment_list.html", {
|
||||
"request": request,
|
||||
"project_id": project_id,
|
||||
"assignments": assignments_data,
|
||||
})
|
||||
|
||||
|
||||
@router.post("/locations/{location_id}/assign")
async def assign_unit_to_location(
    project_id: str,
    location_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Assign a unit to a monitoring location.
    """
    location = db.query(MonitoringLocation).filter_by(
        id=location_id,
        project_id=project_id,
    ).first()

    if not location:
        raise HTTPException(status_code=404, detail="Location not found")

    form_data = await request.form()
    unit_id = form_data.get("unit_id")

    # Verify unit exists and matches location type
    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
    if not unit:
        raise HTTPException(status_code=404, detail="Unit not found")

    # Check device type matches location type
    expected_device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"
    if unit.device_type != expected_device_type:
        raise HTTPException(
            status_code=400,
            detail=f"Unit type '{unit.device_type}' does not match location type '{location.location_type}'",
        )

    # Check if location already has an active assignment
    existing_assignment = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.status == "active",
        )
    ).first()

    if existing_assignment:
        raise HTTPException(
            status_code=400,
            detail=f"Location already has an active unit assignment ({existing_assignment.unit_id}). Unassign first.",
        )

    # Create new assignment
    assigned_until_str = form_data.get("assigned_until")
    assigned_until = datetime.fromisoformat(assigned_until_str) if assigned_until_str else None

    assignment = UnitAssignment(
        id=str(uuid.uuid4()),
        unit_id=unit_id,
        location_id=location_id,
        project_id=project_id,
        device_type=unit.device_type,
        assigned_until=assigned_until,
        status="active",
        notes=form_data.get("notes"),
    )

    db.add(assignment)
    db.commit()
    db.refresh(assignment)

    return JSONResponse({
        "success": True,
        "assignment_id": assignment.id,
        "message": f"Unit '{unit_id}' assigned to '{location.name}'",
    })

@router.post("/assignments/{assignment_id}/unassign")
async def unassign_unit(
    project_id: str,
    assignment_id: str,
    db: Session = Depends(get_db),
):
    """
    Unassign a unit from a location.
    """
    assignment = db.query(UnitAssignment).filter_by(
        id=assignment_id,
        project_id=project_id,
    ).first()

    if not assignment:
        raise HTTPException(status_code=404, detail="Assignment not found")

    # Check if there are active recording sessions
    active_sessions = db.query(RecordingSession).filter(
        and_(
            RecordingSession.location_id == assignment.location_id,
            RecordingSession.unit_id == assignment.unit_id,
            RecordingSession.status == "recording",
        )
    ).count()

    if active_sessions > 0:
        raise HTTPException(
            status_code=400,
            detail="Cannot unassign unit with active recording sessions. Stop recording first.",
        )

    assignment.status = "completed"
    assignment.assigned_until = datetime.utcnow()

    db.commit()

    return {"success": True, "message": "Unit unassigned successfully"}


# ============================================================================
# Available Units for Assignment
# ============================================================================

@router.get("/available-units", response_class=JSONResponse)
async def get_available_units(
    project_id: str,
    location_type: str = Query(...),
    db: Session = Depends(get_db),
):
    """
    Get list of available units for assignment to a location.

    Filters by device type matching the location type.
    """
    # Determine required device type
    required_device_type = "sound_level_meter" if location_type == "sound" else "seismograph"

    # Get all units of the required type that are deployed and not retired
    all_units = db.query(RosterUnit).filter(
        and_(
            RosterUnit.device_type == required_device_type,
            RosterUnit.deployed == True,
            RosterUnit.retired == False,
        )
    ).all()

    # Filter out units that already have active assignments
    assigned_unit_ids = db.query(UnitAssignment.unit_id).filter(
        UnitAssignment.status == "active"
    ).distinct().all()
    assigned_unit_ids = [uid[0] for uid in assigned_unit_ids]

    available_units = [
        {
            "id": unit.id,
            "device_type": unit.device_type,
            "location": unit.address or unit.location,
            "model": unit.slm_model if unit.device_type == "sound_level_meter" else unit.unit_type,
        }
        for unit in all_units
        if unit.id not in assigned_unit_ids
    ]

    return available_units

# ============================================================================
# NRL-specific endpoints for detail page
# ============================================================================

@router.get("/nrl/{location_id}/sessions", response_class=HTMLResponse)
async def get_nrl_sessions(
    project_id: str,
    location_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Get recording sessions for a specific NRL.

    Returns HTML partial with session list.
    """
    from backend.models import RecordingSession, RosterUnit

    sessions = db.query(RecordingSession).filter_by(
        location_id=location_id
    ).order_by(RecordingSession.started_at.desc()).all()

    # Enrich with unit details
    sessions_data = []
    for session in sessions:
        unit = None
        if session.unit_id:
            unit = db.query(RosterUnit).filter_by(id=session.unit_id).first()

        sessions_data.append({
            "session": session,
            "unit": unit,
        })

    return templates.TemplateResponse("partials/projects/session_list.html", {
        "request": request,
        "project_id": project_id,
        "location_id": location_id,
        "sessions": sessions_data,
    })

@router.get("/nrl/{location_id}/files", response_class=HTMLResponse)
async def get_nrl_files(
    project_id: str,
    location_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Get data files for a specific NRL.

    Returns HTML partial with file list.
    """
    from backend.models import DataFile, RecordingSession

    # Join DataFile with RecordingSession to filter by location_id
    files = db.query(DataFile).join(
        RecordingSession,
        DataFile.session_id == RecordingSession.id
    ).filter(
        RecordingSession.location_id == location_id
    ).order_by(DataFile.created_at.desc()).all()

    # Enrich with session details
    files_data = []
    for file in files:
        session = None
        if file.session_id:
            session = db.query(RecordingSession).filter_by(id=file.session_id).first()

        files_data.append({
            "file": file,
            "session": session,
        })

    return templates.TemplateResponse("partials/projects/file_list.html", {
        "request": request,
        "project_id": project_id,
        "location_id": location_id,
        "files": files_data,
    })

backend/routers/projects.py (new file, 888 lines)
@@ -0,0 +1,888 @@
"""
Projects Router

Provides API endpoints for the Projects system:
- Project CRUD operations
- Project dashboards
- Project statistics
- Type-aware features
"""

from fastapi import APIRouter, Request, Depends, HTTPException, Query
from fastapi.templating import Jinja2Templates
from fastapi.responses import HTMLResponse, JSONResponse
from sqlalchemy.orm import Session
from sqlalchemy import func, and_
from datetime import datetime, timedelta
from typing import Optional
import uuid
import json
import logging

from backend.database import get_db
from backend.models import (
    Project,
    ProjectType,
    MonitoringLocation,
    UnitAssignment,
    RecordingSession,
    ScheduledAction,
    RosterUnit,
)

router = APIRouter(prefix="/api/projects", tags=["projects"])
templates = Jinja2Templates(directory="templates")
logger = logging.getLogger(__name__)

# ============================================================================
# Project List & Overview
# ============================================================================

@router.get("/list", response_class=HTMLResponse)
async def get_projects_list(
    request: Request,
    db: Session = Depends(get_db),
    status: Optional[str] = Query(None),
    project_type_id: Optional[str] = Query(None),
    view: Optional[str] = Query(None),
):
    """
    Get list of all projects.

    Returns HTML partial with project cards.
    """
    query = db.query(Project)

    # Filter by status if provided
    if status:
        query = query.filter(Project.status == status)

    # Filter by project type if provided
    if project_type_id:
        query = query.filter(Project.project_type_id == project_type_id)

    projects = query.order_by(Project.created_at.desc()).all()

    # Enrich each project with stats
    projects_data = []
    for project in projects:
        # Get project type
        project_type = db.query(ProjectType).filter_by(id=project.project_type_id).first()

        # Count locations
        location_count = db.query(func.count(MonitoringLocation.id)).filter_by(
            project_id=project.id
        ).scalar()

        # Count assigned units
        unit_count = db.query(func.count(UnitAssignment.id)).filter(
            and_(
                UnitAssignment.project_id == project.id,
                UnitAssignment.status == "active",
            )
        ).scalar()

        # Count active sessions
        active_session_count = db.query(func.count(RecordingSession.id)).filter(
            and_(
                RecordingSession.project_id == project.id,
                RecordingSession.status == "recording",
            )
        ).scalar()

        projects_data.append({
            "project": project,
            "project_type": project_type,
            "location_count": location_count,
            "unit_count": unit_count,
            "active_session_count": active_session_count,
        })

    template_name = "partials/projects/project_list.html"
    if view == "compact":
        template_name = "partials/projects/project_list_compact.html"

    return templates.TemplateResponse(template_name, {
        "request": request,
        "projects": projects_data,
    })

@router.get("/stats", response_class=HTMLResponse)
async def get_projects_stats(request: Request, db: Session = Depends(get_db)):
    """
    Get summary statistics for projects overview.

    Returns HTML partial with stat cards.
    """
    # Count projects by status
    total_projects = db.query(func.count(Project.id)).scalar()
    active_projects = db.query(func.count(Project.id)).filter_by(status="active").scalar()
    completed_projects = db.query(func.count(Project.id)).filter_by(status="completed").scalar()

    # Count total locations across all projects
    total_locations = db.query(func.count(MonitoringLocation.id)).scalar()

    # Count assigned units
    assigned_units = db.query(func.count(UnitAssignment.id)).filter_by(
        status="active"
    ).scalar()

    # Count active recording sessions
    active_sessions = db.query(func.count(RecordingSession.id)).filter_by(
        status="recording"
    ).scalar()

    return templates.TemplateResponse("partials/projects/project_stats.html", {
        "request": request,
        "total_projects": total_projects,
        "active_projects": active_projects,
        "completed_projects": completed_projects,
        "total_locations": total_locations,
        "assigned_units": assigned_units,
        "active_sessions": active_sessions,
    })


# ============================================================================
# Project CRUD
# ============================================================================

@router.post("/create")
async def create_project(request: Request, db: Session = Depends(get_db)):
    """
    Create a new project.

    Expects form data with project details.
    """
    form_data = await request.form()

    project = Project(
        id=str(uuid.uuid4()),
        name=form_data.get("name"),
        description=form_data.get("description"),
        project_type_id=form_data.get("project_type_id"),
        status="active",
        client_name=form_data.get("client_name"),
        site_address=form_data.get("site_address"),
        site_coordinates=form_data.get("site_coordinates"),
        start_date=datetime.fromisoformat(form_data.get("start_date")) if form_data.get("start_date") else None,
        end_date=datetime.fromisoformat(form_data.get("end_date")) if form_data.get("end_date") else None,
    )

    db.add(project)
    db.commit()
    db.refresh(project)

    return JSONResponse({
        "success": True,
        "project_id": project.id,
        "message": f"Project '{project.name}' created successfully",
    })

@router.get("/{project_id}")
async def get_project(project_id: str, db: Session = Depends(get_db)):
    """
    Get project details by ID.

    Returns JSON with full project data.
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    project_type = db.query(ProjectType).filter_by(id=project.project_type_id).first()

    return {
        "id": project.id,
        "name": project.name,
        "description": project.description,
        "project_type_id": project.project_type_id,
        "project_type_name": project_type.name if project_type else None,
        "status": project.status,
        "client_name": project.client_name,
        "site_address": project.site_address,
        "site_coordinates": project.site_coordinates,
        "start_date": project.start_date.isoformat() if project.start_date else None,
        "end_date": project.end_date.isoformat() if project.end_date else None,
        "created_at": project.created_at.isoformat(),
        "updated_at": project.updated_at.isoformat(),
    }

@router.put("/{project_id}")
async def update_project(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Update project details.

    Expects JSON body with fields to update.
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    data = await request.json()

    # Update fields if provided
    if "name" in data:
        project.name = data["name"]
    if "description" in data:
        project.description = data["description"]
    if "status" in data:
        project.status = data["status"]
    if "client_name" in data:
        project.client_name = data["client_name"]
    if "site_address" in data:
        project.site_address = data["site_address"]
    if "site_coordinates" in data:
        project.site_coordinates = data["site_coordinates"]
    if "start_date" in data:
        project.start_date = datetime.fromisoformat(data["start_date"]) if data["start_date"] else None
    if "end_date" in data:
        project.end_date = datetime.fromisoformat(data["end_date"]) if data["end_date"] else None

    project.updated_at = datetime.utcnow()

    db.commit()

    return {"success": True, "message": "Project updated successfully"}

@router.delete("/{project_id}")
async def delete_project(project_id: str, db: Session = Depends(get_db)):
    """
    Delete a project (soft delete by archiving).
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    project.status = "archived"
    project.updated_at = datetime.utcnow()

    db.commit()

    return {"success": True, "message": "Project archived successfully"}


# ============================================================================
# Project Dashboard Data
# ============================================================================

@router.get("/{project_id}/dashboard", response_class=HTMLResponse)
async def get_project_dashboard(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Get project dashboard data.

    Returns HTML partial with project summary.
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    project_type = db.query(ProjectType).filter_by(id=project.project_type_id).first()

    # Get locations
    locations = db.query(MonitoringLocation).filter_by(project_id=project_id).all()

    # Get assigned units with details
    assignments = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.project_id == project_id,
            UnitAssignment.status == "active",
        )
    ).all()

    assigned_units = []
    for assignment in assignments:
        unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
        if unit:
            assigned_units.append({
                "assignment": assignment,
                "unit": unit,
            })

    # Get active recording sessions
    active_sessions = db.query(RecordingSession).filter(
        and_(
            RecordingSession.project_id == project_id,
            RecordingSession.status == "recording",
        )
    ).all()

    # Get completed sessions count
    completed_sessions_count = db.query(func.count(RecordingSession.id)).filter(
        and_(
            RecordingSession.project_id == project_id,
            RecordingSession.status == "completed",
        )
    ).scalar()

    # Get upcoming scheduled actions
    upcoming_actions = db.query(ScheduledAction).filter(
        and_(
            ScheduledAction.project_id == project_id,
            ScheduledAction.execution_status == "pending",
            ScheduledAction.scheduled_time > datetime.utcnow(),
        )
    ).order_by(ScheduledAction.scheduled_time).limit(5).all()

    return templates.TemplateResponse("partials/projects/project_dashboard.html", {
        "request": request,
        "project": project,
        "project_type": project_type,
        "locations": locations,
        "assigned_units": assigned_units,
        "active_sessions": active_sessions,
        "completed_sessions_count": completed_sessions_count,
        "upcoming_actions": upcoming_actions,
    })


# ============================================================================
# Project Types
# ============================================================================

@router.get("/{project_id}/header", response_class=JSONResponse)
async def get_project_header(project_id: str, db: Session = Depends(get_db)):
    """
    Get project header information for dynamic display.

    Returns JSON with project name, status, and type.
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    project_type = db.query(ProjectType).filter_by(id=project.project_type_id).first()

    return JSONResponse({
        "id": project.id,
        "name": project.name,
        "status": project.status,
        "project_type_id": project.project_type_id,
        "project_type_name": project_type.name if project_type else None,
    })

@router.get("/{project_id}/units", response_class=HTMLResponse)
async def get_project_units(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Get all units assigned to this project's locations.

    Returns HTML partial with unit list.
    """
    from backend.models import DataFile

    # Get all assignments for this project
    assignments = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.project_id == project_id,
            UnitAssignment.status == "active",
        )
    ).all()

    # Enrich with unit and location details
    units_data = []
    for assignment in assignments:
        unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
        location = db.query(MonitoringLocation).filter_by(id=assignment.location_id).first()

        # Count sessions for this assignment
        session_count = db.query(func.count(RecordingSession.id)).filter_by(
            location_id=assignment.location_id,
            unit_id=assignment.unit_id,
        ).scalar()

        # Count files from sessions
        file_count = db.query(func.count(DataFile.id)).join(
            RecordingSession,
            DataFile.session_id == RecordingSession.id
        ).filter(
            RecordingSession.location_id == assignment.location_id,
            RecordingSession.unit_id == assignment.unit_id,
        ).scalar()

        # Check if currently recording
        active_session = db.query(RecordingSession).filter(
            and_(
                RecordingSession.location_id == assignment.location_id,
                RecordingSession.unit_id == assignment.unit_id,
                RecordingSession.status == "recording",
            )
        ).first()

        units_data.append({
            "assignment": assignment,
            "unit": unit,
            "location": location,
            "session_count": session_count,
            "file_count": file_count,
            "active_session": active_session,
        })

    # Get project type for label context
    project = db.query(Project).filter_by(id=project_id).first()
    project_type = db.query(ProjectType).filter_by(id=project.project_type_id).first() if project else None

    return templates.TemplateResponse("partials/projects/unit_list.html", {
        "request": request,
        "project_id": project_id,
        "units": units_data,
        "project_type": project_type,
    })

@router.get("/{project_id}/schedules", response_class=HTMLResponse)
async def get_project_schedules(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
    status: Optional[str] = Query(None),
):
    """
    Get scheduled actions for this project.

    Returns HTML partial with schedule list.
    Optional status filter: pending, completed, failed, cancelled
    """
    query = db.query(ScheduledAction).filter_by(project_id=project_id)

    # Filter by status if provided
    if status:
        query = query.filter(ScheduledAction.execution_status == status)

    schedules = query.order_by(ScheduledAction.scheduled_time.desc()).all()

    # Enrich with location details
    schedules_data = []
    for schedule in schedules:
        location = None
        if schedule.location_id:
            location = db.query(MonitoringLocation).filter_by(id=schedule.location_id).first()

        schedules_data.append({
            "schedule": schedule,
            "location": location,
        })

    return templates.TemplateResponse("partials/projects/schedule_list.html", {
        "request": request,
        "project_id": project_id,
        "schedules": schedules_data,
    })

@router.get("/{project_id}/sessions", response_class=HTMLResponse)
async def get_project_sessions(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
    status: Optional[str] = Query(None),
):
    """
    Get all recording sessions for this project.

    Returns HTML partial with session list.
    Optional status filter: recording, completed, paused, failed
    """
    query = db.query(RecordingSession).filter_by(project_id=project_id)

    # Filter by status if provided
    if status:
        query = query.filter(RecordingSession.status == status)

    sessions = query.order_by(RecordingSession.started_at.desc()).all()

    # Enrich with unit and location details
    sessions_data = []
    for session in sessions:
        unit = None
        location = None

        if session.unit_id:
            unit = db.query(RosterUnit).filter_by(id=session.unit_id).first()
        if session.location_id:
            location = db.query(MonitoringLocation).filter_by(id=session.location_id).first()

        sessions_data.append({
            "session": session,
            "unit": unit,
            "location": location,
        })

    return templates.TemplateResponse("partials/projects/session_list.html", {
        "request": request,
        "project_id": project_id,
        "sessions": sessions_data,
    })

@router.get("/{project_id}/files", response_class=HTMLResponse)
async def get_project_files(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
    file_type: Optional[str] = Query(None),
):
    """
    Get all data files from all sessions in this project.

    Returns HTML partial with file list.
    Optional file_type filter: audio, data, log, etc.
    """
    from backend.models import DataFile

    # Join through RecordingSession to get project files
    query = db.query(DataFile).join(
        RecordingSession,
        DataFile.session_id == RecordingSession.id
    ).filter(RecordingSession.project_id == project_id)

    # Filter by file type if provided
    if file_type:
        query = query.filter(DataFile.file_type == file_type)

    files = query.order_by(DataFile.created_at.desc()).all()

    # Enrich with session details
    files_data = []
    for file in files:
        session = None
        if file.session_id:
            session = db.query(RecordingSession).filter_by(id=file.session_id).first()

        files_data.append({
            "file": file,
            "session": session,
        })

    return templates.TemplateResponse("partials/projects/file_list.html", {
        "request": request,
        "project_id": project_id,
        "files": files_data,
    })

@router.get("/{project_id}/ftp-browser", response_class=HTMLResponse)
async def get_ftp_browser(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Get FTP browser interface for downloading files from assigned SLMs.

    Returns HTML partial with FTP browser.
    """
    from backend.models import DataFile

    # Get all assignments for this project
    assignments = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.project_id == project_id,
            UnitAssignment.status == "active",
        )
    ).all()

    # Enrich with unit and location details
    units_data = []
    for assignment in assignments:
        unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
        location = db.query(MonitoringLocation).filter_by(id=assignment.location_id).first()

        # Only include SLM units
        if unit and unit.device_type == "sound_level_meter":
            units_data.append({
                "assignment": assignment,
                "unit": unit,
                "location": location,
            })

    return templates.TemplateResponse("partials/projects/ftp_browser.html", {
        "request": request,
        "project_id": project_id,
        "units": units_data,
    })

@router.post("/{project_id}/ftp-download-to-server")
|
||||
async def ftp_download_to_server(
|
||||
project_id: str,
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Download a file from an SLM to the server via FTP.
|
||||
Creates a DataFile record and stores the file in data/Projects/{project_id}/
|
||||
"""
|
||||
import httpx
|
||||
import os
|
||||
import hashlib
|
||||
from pathlib import Path
|
||||
from backend.models import DataFile
|
||||
|
||||
data = await request.json()
|
||||
unit_id = data.get("unit_id")
|
||||
remote_path = data.get("remote_path")
|
||||
location_id = data.get("location_id")
|
||||
|
||||
if not unit_id or not remote_path:
|
||||
raise HTTPException(status_code=400, detail="Missing unit_id or remote_path")
|
||||
|
||||
# Get or create active session for this location/unit
|
||||
session = db.query(RecordingSession).filter(
|
||||
and_(
|
||||
RecordingSession.project_id == project_id,
|
||||
RecordingSession.location_id == location_id,
|
||||
RecordingSession.unit_id == unit_id,
|
||||
RecordingSession.status.in_(["recording", "paused"])
|
||||
)
|
||||
).first()
|
||||
|
||||
# If no active session, create one
|
||||
if not session:
|
||||
        session = RecordingSession(
            id=str(uuid.uuid4()),
            project_id=project_id,
            location_id=location_id,
            unit_id=unit_id,
            status="completed",
            started_at=datetime.utcnow(),
            stopped_at=datetime.utcnow(),
            notes="Auto-created for FTP download"
        )
        db.add(session)
        db.commit()
        db.refresh(session)

    # Download file from SLMM
    SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")

    try:
        async with httpx.AsyncClient(timeout=300.0) as client:
            response = await client.post(
                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/ftp/download",
                json={"remote_path": remote_path}
            )

            if not response.is_success:
                raise HTTPException(
                    status_code=response.status_code,
                    detail=f"Failed to download from SLMM: {response.text}"
                )

            # Extract filename from remote_path
            filename = os.path.basename(remote_path)

            # Determine file type from extension
            ext = os.path.splitext(filename)[1].lower()
            file_type_map = {
                '.wav': 'audio',
                '.mp3': 'audio',
                '.csv': 'data',
                '.txt': 'data',
                '.log': 'log',
                '.json': 'data',
            }
            file_type = file_type_map.get(ext, 'data')

            # Create directory structure: data/Projects/{project_id}/{session_id}/
            project_dir = Path(f"data/Projects/{project_id}/{session.id}")
            project_dir.mkdir(parents=True, exist_ok=True)

            # Save file to disk
            file_path = project_dir / filename
            file_content = response.content

            with open(file_path, 'wb') as f:
                f.write(file_content)

            # Calculate checksum
            checksum = hashlib.sha256(file_content).hexdigest()

            # Create DataFile record
            data_file = DataFile(
                id=str(uuid.uuid4()),
                session_id=session.id,
                file_path=str(file_path.relative_to("data")),  # Store relative to data/
                file_type=file_type,
                file_size_bytes=len(file_content),
                downloaded_at=datetime.utcnow(),
                checksum=checksum,
                file_metadata=json.dumps({
                    "source": "ftp",
                    "remote_path": remote_path,
                    "unit_id": unit_id,
                    "location_id": location_id,
                })
            )

            db.add(data_file)
            db.commit()

            return {
                "success": True,
                "message": f"Downloaded {filename} to server",
                "file_id": data_file.id,
                "file_path": str(file_path),
                "file_size": len(file_content),
            }

    except httpx.TimeoutException:
        raise HTTPException(
            status_code=504,
            detail="Timeout downloading file from SLM"
        )
    except HTTPException:
        # Re-raise HTTP errors untouched so SLMM status codes propagate
        raise
    except Exception as e:
        logger.error(f"Error downloading file to server: {e}")
        raise HTTPException(
            status_code=500,
            detail=f"Failed to download file to server: {str(e)}"
        )

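The single-file endpoint above classifies each download by extension and fingerprints it with SHA-256 before writing the `DataFile` record. A minimal sketch isolating that logic — the mapping and hashing come straight from the handler, while `classify_and_hash` itself is a hypothetical helper name used here for illustration:

```python
import hashlib
import os

# Extension-to-type mapping, as used by the download handler above
FILE_TYPE_MAP = {
    '.wav': 'audio', '.mp3': 'audio',
    '.csv': 'data', '.txt': 'data',
    '.log': 'log', '.json': 'data',
}

def classify_and_hash(remote_path: str, content: bytes) -> dict:
    """Hypothetical helper: derive filename, file_type, and checksum."""
    filename = os.path.basename(remote_path)
    ext = os.path.splitext(filename)[1].lower()
    return {
        "filename": filename,
        # Unknown extensions fall back to 'data', matching the handler
        "file_type": FILE_TYPE_MAP.get(ext, 'data'),
        "checksum": hashlib.sha256(content).hexdigest(),
        "size": len(content),
    }

info = classify_and_hash("/SD_CARD/Auto_0001/measurement.wav", b"RIFF....")
```

This keeps the classification testable apart from the FastAPI plumbing; the checksum lets later re-downloads be deduplicated by content.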
@router.post("/{project_id}/ftp-download-folder-to-server")
async def ftp_download_folder_to_server(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Download an entire folder from an SLM to the server via FTP as a ZIP file.
    Creates a DataFile record and stores the ZIP in data/Projects/{project_id}/
    """
    import httpx
    import os
    import hashlib
    from pathlib import Path
    from backend.models import DataFile

    data = await request.json()
    unit_id = data.get("unit_id")
    remote_path = data.get("remote_path")
    location_id = data.get("location_id")

    if not unit_id or not remote_path:
        raise HTTPException(status_code=400, detail="Missing unit_id or remote_path")

    # Get or create active session for this location/unit
    session = db.query(RecordingSession).filter(
        and_(
            RecordingSession.project_id == project_id,
            RecordingSession.location_id == location_id,
            RecordingSession.unit_id == unit_id,
            RecordingSession.status.in_(["recording", "paused"])
        )
    ).first()

    # If no active session, create one
    if not session:
        session = RecordingSession(
            id=str(uuid.uuid4()),
            project_id=project_id,
            location_id=location_id,
            unit_id=unit_id,
            status="completed",
            started_at=datetime.utcnow(),
            stopped_at=datetime.utcnow(),
            notes="Auto-created for FTP folder download"
        )
        db.add(session)
        db.commit()
        db.refresh(session)

    # Download folder from SLMM
    SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")

    try:
        async with httpx.AsyncClient(timeout=600.0) as client:  # Longer timeout for folders
            response = await client.post(
                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/ftp/download-folder",
                json={"remote_path": remote_path}
            )

            if not response.is_success:
                raise HTTPException(
                    status_code=response.status_code,
                    detail=f"Failed to download folder from SLMM: {response.text}"
                )

            # Extract folder name from remote_path
            folder_name = os.path.basename(remote_path.rstrip('/'))
            filename = f"{folder_name}.zip"

            # Create directory structure: data/Projects/{project_id}/{session_id}/
            project_dir = Path(f"data/Projects/{project_id}/{session.id}")
            project_dir.mkdir(parents=True, exist_ok=True)

            # Save ZIP file to disk
            file_path = project_dir / filename
            file_content = response.content

            with open(file_path, 'wb') as f:
                f.write(file_content)

            # Calculate checksum
            checksum = hashlib.sha256(file_content).hexdigest()

            # Create DataFile record
            data_file = DataFile(
                id=str(uuid.uuid4()),
                session_id=session.id,
                file_path=str(file_path.relative_to("data")),  # Store relative to data/
                file_type='archive',  # ZIP archives
                file_size_bytes=len(file_content),
                downloaded_at=datetime.utcnow(),
                checksum=checksum,
                file_metadata=json.dumps({
                    "source": "ftp_folder",
                    "remote_path": remote_path,
                    "unit_id": unit_id,
                    "location_id": location_id,
                    "folder_name": folder_name,
                })
            )

            db.add(data_file)
            db.commit()

            return {
                "success": True,
                "message": f"Downloaded folder {folder_name} to server as ZIP",
                "file_id": data_file.id,
                "file_path": str(file_path),
                "file_size": len(file_content),
            }

    except httpx.TimeoutException:
        raise HTTPException(
            status_code=504,
            detail="Timeout downloading folder from SLM (large folders may take a while)"
        )
    except HTTPException:
        # Re-raise HTTP errors untouched so SLMM status codes propagate
        raise
    except Exception as e:
        logger.error(f"Error downloading folder to server: {e}")
        raise HTTPException(
            status_code=500,
            detail=f"Failed to download folder to server: {str(e)}"
        )

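The folder endpoint derives the ZIP filename from the remote path, stripping any trailing slash first. A small sketch of that naming rule (`zip_name_for_folder` is a hypothetical name; the handler does this inline):

```python
import os

def zip_name_for_folder(remote_path: str) -> str:
    """Mirror of the handler's naming: trailing slashes are stripped,
    so '/SD_CARD/Auto_0003/' and '/SD_CARD/Auto_0003' both yield 'Auto_0003.zip'."""
    folder_name = os.path.basename(remote_path.rstrip('/'))
    return f"{folder_name}.zip"
```

Without the `rstrip('/')`, `os.path.basename` on a trailing-slash path would return an empty string and produce a bare `.zip` filename.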
# ============================================================================
# Project Types
# ============================================================================

@router.get("/types/list", response_class=HTMLResponse)
async def get_project_types(request: Request, db: Session = Depends(get_db)):
    """
    Get all available project types.
    Returns HTML partial with project type cards.
    """
    project_types = db.query(ProjectType).all()

    return templates.TemplateResponse("partials/projects/project_type_cards.html", {
        "request": request,
        "project_types": project_types,
    })
@@ -4,8 +4,8 @@ from datetime import datetime, timedelta
 from typing import Dict, Any
 import random
 
-from app.seismo.database import get_db
-from app.seismo.services.snapshot import emit_status_snapshot
+from backend.database import get_db
+from backend.services.snapshot import emit_status_snapshot
 
 router = APIRouter(prefix="/api", tags=["roster"])
@@ -8,8 +8,8 @@ import logging
 import httpx
 import os
 
-from app.seismo.database import get_db
-from app.seismo.models import RosterUnit, IgnoredUnit, Emitter, UnitHistory
+from backend.database import get_db
+from backend.models import RosterUnit, IgnoredUnit, Emitter, UnitHistory
 
 router = APIRouter(prefix="/api/roster", tags=["roster-edit"])
 logger = logging.getLogger(__name__)
139  backend/routers/roster_rename.py  Normal file
@@ -0,0 +1,139 @@
"""
Roster Unit Rename Router

Provides endpoint for safely renaming unit IDs across all database tables.
"""

from fastapi import APIRouter, Depends, HTTPException, Form
from sqlalchemy.orm import Session
from datetime import datetime
import logging

from backend.database import get_db
from backend.models import RosterUnit, Emitter, UnitHistory
from backend.routers.roster_edit import record_history, sync_slm_to_slmm_cache

router = APIRouter(prefix="/api/roster", tags=["roster-rename"])
logger = logging.getLogger(__name__)


@router.post("/rename")
async def rename_unit(
    old_id: str = Form(...),
    new_id: str = Form(...),
    db: Session = Depends(get_db)
):
    """
    Rename a unit ID across all tables.
    Updates the unit ID in roster, emitters, unit_history, and all foreign key references.

    IMPORTANT: This operation updates the primary key, which affects all relationships.
    """
    # Validate input
    if not old_id or not new_id:
        raise HTTPException(status_code=400, detail="Both old_id and new_id are required")

    if old_id == new_id:
        raise HTTPException(status_code=400, detail="New ID must be different from old ID")

    # Check if old unit exists
    old_unit = db.query(RosterUnit).filter(RosterUnit.id == old_id).first()
    if not old_unit:
        raise HTTPException(status_code=404, detail=f"Unit '{old_id}' not found")

    # Check if new ID already exists
    existing_unit = db.query(RosterUnit).filter(RosterUnit.id == new_id).first()
    if existing_unit:
        raise HTTPException(status_code=409, detail=f"Unit ID '{new_id}' already exists")

    device_type = old_unit.device_type

    try:
        # Record history for the rename operation (using old_id since that's still valid)
        record_history(
            db=db,
            unit_id=old_id,
            change_type="id_change",
            field_name="id",
            old_value=old_id,
            new_value=new_id,
            source="manual",
            notes=f"Unit renamed from '{old_id}' to '{new_id}'"
        )

        # Update roster table (primary)
        old_unit.id = new_id
        old_unit.last_updated = datetime.utcnow()

        # Update emitters table
        emitter = db.query(Emitter).filter(Emitter.id == old_id).first()
        if emitter:
            emitter.id = new_id

        # Update unit_history table (all entries for this unit)
        db.query(UnitHistory).filter(UnitHistory.unit_id == old_id).update(
            {"unit_id": new_id},
            synchronize_session=False
        )

        # Update deployed_with_modem_id references (units that reference this as modem)
        db.query(RosterUnit).filter(RosterUnit.deployed_with_modem_id == old_id).update(
            {"deployed_with_modem_id": new_id},
            synchronize_session=False
        )

        # Update unit_assignments table (if exists)
        try:
            from backend.models import UnitAssignment
            db.query(UnitAssignment).filter(UnitAssignment.unit_id == old_id).update(
                {"unit_id": new_id},
                synchronize_session=False
            )
        except Exception as e:
            logger.warning(f"Could not update unit_assignments: {e}")

        # Update recording_sessions table (if exists)
        try:
            from backend.models import RecordingSession
            db.query(RecordingSession).filter(RecordingSession.unit_id == old_id).update(
                {"unit_id": new_id},
                synchronize_session=False
            )
        except Exception as e:
            logger.warning(f"Could not update recording_sessions: {e}")

        # Commit all changes
        db.commit()

        # If sound level meter, sync updated config to SLMM cache
        if device_type == "sound_level_meter":
            logger.info(f"Syncing renamed SLM {new_id} (was {old_id}) config to SLMM cache...")
            result = await sync_slm_to_slmm_cache(
                unit_id=new_id,
                host=old_unit.slm_host,
                tcp_port=old_unit.slm_tcp_port,
                ftp_port=old_unit.slm_ftp_port,
                deployed_with_modem_id=old_unit.deployed_with_modem_id,
                db=db
            )

            if not result["success"]:
                logger.warning(f"SLMM cache sync warning for renamed unit {new_id}: {result['message']}")

        logger.info(f"Successfully renamed unit '{old_id}' to '{new_id}'")

        return {
            "success": True,
            "message": f"Successfully renamed unit from '{old_id}' to '{new_id}'",
            "old_id": old_id,
            "new_id": new_id,
            "device_type": device_type
        }

    except Exception as e:
        db.rollback()
        logger.error(f"Error renaming unit '{old_id}' to '{new_id}': {e}")
        raise HTTPException(
            status_code=500,
            detail=f"Failed to rename unit: {str(e)}"
        )
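The rename endpoint's four pre-flight checks (both IDs present, IDs differ, old exists, new is free) can be restated as a pure function, which makes the 400/404/409 contract easy to verify without a database. `validate_rename` is a hypothetical restatement for illustration, not a helper in the codebase:

```python
def validate_rename(old_id: str, new_id: str, existing_ids: set) -> tuple[int, str]:
    """Hypothetical pure-function restatement of the endpoint's checks.
    Returns (http_status, detail); 200 means the rename may proceed."""
    if not old_id or not new_id:
        return 400, "Both old_id and new_id are required"
    if old_id == new_id:
        return 400, "New ID must be different from old ID"
    if old_id not in existing_ids:
        return 404, f"Unit '{old_id}' not found"
    if new_id in existing_ids:
        return 409, f"Unit ID '{new_id}' already exists"
    return 200, "ok"
```

The ordering matters: the self-rename check runs before the existence checks, so `old_id == new_id` is always a 400 even when the unit exists.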
409  backend/routers/scheduler.py  Normal file
@@ -0,0 +1,409 @@
"""
Scheduler Router

Handles scheduled actions for automated recording control.
"""

from fastapi import APIRouter, Request, Depends, HTTPException, Query
from fastapi.templating import Jinja2Templates
from fastapi.responses import HTMLResponse, JSONResponse
from sqlalchemy.orm import Session
from sqlalchemy import and_, or_
from datetime import datetime, timedelta
from typing import Optional
import uuid
import json

from backend.database import get_db
from backend.models import (
    Project,
    ScheduledAction,
    MonitoringLocation,
    UnitAssignment,
    RosterUnit,
)
from backend.services.scheduler import get_scheduler

router = APIRouter(prefix="/api/projects/{project_id}/scheduler", tags=["scheduler"])
templates = Jinja2Templates(directory="templates")


# ============================================================================
# Scheduled Actions List
# ============================================================================

@router.get("/actions", response_class=HTMLResponse)
async def get_scheduled_actions(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
    status: Optional[str] = Query(None),
    start_date: Optional[str] = Query(None),
    end_date: Optional[str] = Query(None),
):
    """
    Get scheduled actions for a project.
    Returns HTML partial with agenda/calendar view.
    """
    query = db.query(ScheduledAction).filter_by(project_id=project_id)

    # Filter by status
    if status:
        query = query.filter_by(execution_status=status)
    else:
        # By default, show pending actions plus completed/failed from the last 7 days
        query = query.filter(
            or_(
                ScheduledAction.execution_status == "pending",
                and_(
                    ScheduledAction.execution_status.in_(["completed", "failed"]),
                    ScheduledAction.scheduled_time >= datetime.utcnow() - timedelta(days=7),
                ),
            )
        )

    # Filter by date range
    if start_date:
        query = query.filter(ScheduledAction.scheduled_time >= datetime.fromisoformat(start_date))
    if end_date:
        query = query.filter(ScheduledAction.scheduled_time <= datetime.fromisoformat(end_date))

    actions = query.order_by(ScheduledAction.scheduled_time).all()

    # Enrich with location and unit details
    actions_data = []
    for action in actions:
        location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()

        unit = None
        if action.unit_id:
            unit = db.query(RosterUnit).filter_by(id=action.unit_id).first()
        else:
            # Get from assignment
            assignment = db.query(UnitAssignment).filter(
                and_(
                    UnitAssignment.location_id == action.location_id,
                    UnitAssignment.status == "active",
                )
            ).first()
            if assignment:
                unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()

        actions_data.append({
            "action": action,
            "location": location,
            "unit": unit,
        })

    return templates.TemplateResponse("partials/projects/scheduler_agenda.html", {
        "request": request,
        "project_id": project_id,
        "actions": actions_data,
    })


# ============================================================================
# Create Scheduled Action
# ============================================================================

@router.post("/actions/create")
async def create_scheduled_action(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Create a new scheduled action.
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    form_data = await request.form()

    location_id = form_data.get("location_id")
    location = db.query(MonitoringLocation).filter_by(
        id=location_id,
        project_id=project_id,
    ).first()

    if not location:
        raise HTTPException(status_code=404, detail="Location not found")

    # Determine device type from location
    device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"

    # Get unit_id (optional - can be determined from assignment at execution time)
    unit_id = form_data.get("unit_id")

    action = ScheduledAction(
        id=str(uuid.uuid4()),
        project_id=project_id,
        location_id=location_id,
        unit_id=unit_id,
        action_type=form_data.get("action_type"),
        device_type=device_type,
        scheduled_time=datetime.fromisoformat(form_data.get("scheduled_time")),
        execution_status="pending",
        notes=form_data.get("notes"),
    )

    db.add(action)
    db.commit()
    db.refresh(action)

    return JSONResponse({
        "success": True,
        "action_id": action.id,
        "message": f"Scheduled action '{action.action_type}' created for {action.scheduled_time}",
    })


# ============================================================================
# Schedule Recording Session
# ============================================================================

@router.post("/schedule-session")
async def schedule_recording_session(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Schedule a complete recording session (start + stop).
    Creates two scheduled actions: start and stop.
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    form_data = await request.form()

    location_id = form_data.get("location_id")
    location = db.query(MonitoringLocation).filter_by(
        id=location_id,
        project_id=project_id,
    ).first()

    if not location:
        raise HTTPException(status_code=404, detail="Location not found")

    device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"
    unit_id = form_data.get("unit_id")

    start_time = datetime.fromisoformat(form_data.get("start_time"))
    duration_minutes = int(form_data.get("duration_minutes", 60))
    stop_time = start_time + timedelta(minutes=duration_minutes)

    # Create START action
    start_action = ScheduledAction(
        id=str(uuid.uuid4()),
        project_id=project_id,
        location_id=location_id,
        unit_id=unit_id,
        action_type="start",
        device_type=device_type,
        scheduled_time=start_time,
        execution_status="pending",
        notes=form_data.get("notes"),
    )

    # Create STOP action
    stop_action = ScheduledAction(
        id=str(uuid.uuid4()),
        project_id=project_id,
        location_id=location_id,
        unit_id=unit_id,
        action_type="stop",
        device_type=device_type,
        scheduled_time=stop_time,
        execution_status="pending",
        notes=f"Auto-stop after {duration_minutes} minutes",
    )

    db.add(start_action)
    db.add(stop_action)
    db.commit()

    return JSONResponse({
        "success": True,
        "start_action_id": start_action.id,
        "stop_action_id": stop_action.id,
        "message": f"Recording session scheduled from {start_time} to {stop_time}",
    })


# ============================================================================
# Update/Cancel Scheduled Action
# ============================================================================

@router.put("/actions/{action_id}")
async def update_scheduled_action(
    project_id: str,
    action_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Update a scheduled action (only if not yet executed).
    """
    action = db.query(ScheduledAction).filter_by(
        id=action_id,
        project_id=project_id,
    ).first()

    if not action:
        raise HTTPException(status_code=404, detail="Action not found")

    if action.execution_status != "pending":
        raise HTTPException(
            status_code=400,
            detail="Cannot update action that has already been executed",
        )

    data = await request.json()

    if "scheduled_time" in data:
        action.scheduled_time = datetime.fromisoformat(data["scheduled_time"])
    if "notes" in data:
        action.notes = data["notes"]

    db.commit()

    return {"success": True, "message": "Action updated successfully"}


@router.post("/actions/{action_id}/cancel")
async def cancel_scheduled_action(
    project_id: str,
    action_id: str,
    db: Session = Depends(get_db),
):
    """
    Cancel a pending scheduled action.
    """
    action = db.query(ScheduledAction).filter_by(
        id=action_id,
        project_id=project_id,
    ).first()

    if not action:
        raise HTTPException(status_code=404, detail="Action not found")

    if action.execution_status != "pending":
        raise HTTPException(
            status_code=400,
            detail="Can only cancel pending actions",
        )

    action.execution_status = "cancelled"
    db.commit()

    return {"success": True, "message": "Action cancelled successfully"}


@router.delete("/actions/{action_id}")
async def delete_scheduled_action(
    project_id: str,
    action_id: str,
    db: Session = Depends(get_db),
):
    """
    Delete a scheduled action (only if pending or cancelled).
    """
    action = db.query(ScheduledAction).filter_by(
        id=action_id,
        project_id=project_id,
    ).first()

    if not action:
        raise HTTPException(status_code=404, detail="Action not found")

    if action.execution_status not in ["pending", "cancelled"]:
        raise HTTPException(
            status_code=400,
            detail="Cannot delete action that has been executed",
        )

    db.delete(action)
    db.commit()

    return {"success": True, "message": "Action deleted successfully"}


# ============================================================================
# Manual Execution
# ============================================================================

@router.post("/actions/{action_id}/execute")
async def execute_action_now(
    project_id: str,
    action_id: str,
    db: Session = Depends(get_db),
):
    """
    Manually trigger execution of a scheduled action (for testing/debugging).
    """
    action = db.query(ScheduledAction).filter_by(
        id=action_id,
        project_id=project_id,
    ).first()

    if not action:
        raise HTTPException(status_code=404, detail="Action not found")

    if action.execution_status != "pending":
        raise HTTPException(
            status_code=400,
            detail="Action is not pending",
        )

    # Execute via scheduler service
    scheduler = get_scheduler()
    result = await scheduler.execute_action_by_id(action_id)

    # Refresh from DB to get updated status
    db.refresh(action)

    return JSONResponse({
        "success": result.get("success", False),
        "result": result,
        "action": {
            "id": action.id,
            "execution_status": action.execution_status,
            "executed_at": action.executed_at.isoformat() if action.executed_at else None,
            "error_message": action.error_message,
        },
    })


# ============================================================================
# Scheduler Status
# ============================================================================

@router.get("/status")
async def get_scheduler_status():
    """
    Get scheduler service status.
    """
    scheduler = get_scheduler()

    return {
        "running": scheduler.running,
        "check_interval": scheduler.check_interval,
    }


@router.post("/execute-pending")
async def trigger_pending_execution():
    """
    Manually trigger execution of all pending actions (for testing).
    """
    scheduler = get_scheduler()
    results = await scheduler.execute_pending_actions()

    return {
        "success": True,
        "executed_count": len(results),
        "results": results,
    }
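The `/schedule-session` endpoint derives the stop action's time from the start time plus a duration in minutes (default 60). A minimal sketch of that window arithmetic, pulled out of the handler for clarity — `session_window` is an illustrative name, not a function in the codebase:

```python
from datetime import datetime, timedelta

def session_window(start_iso: str, duration_minutes: int = 60) -> tuple[datetime, datetime]:
    """Mirrors the handler: parse the ISO-format start, add the duration to get stop."""
    start_time = datetime.fromisoformat(start_iso)
    stop_time = start_time + timedelta(minutes=duration_minutes)
    return start_time, stop_time

start, stop = session_window("2026-01-09T10:30:00", 90)
# start 10:30, stop 12:00 on the same day
```

Note that `datetime.fromisoformat` raises `ValueError` on malformed input, so a bad `start_time` form field surfaces as a 500 from this handler rather than a 400.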
@@ -7,11 +7,11 @@ from fastapi import APIRouter, Request, Depends, Query
 from fastapi.responses import HTMLResponse
 from fastapi.templating import Jinja2Templates
 from sqlalchemy.orm import Session
-from app.seismo.database import get_db
-from app.seismo.models import RosterUnit
+from backend.database import get_db
+from backend.models import RosterUnit
 
 router = APIRouter(prefix="/api/seismo-dashboard", tags=["seismo-dashboard"])
-templates = Jinja2Templates(directory="app/ui/templates")
+templates = Jinja2Templates(directory="templates")
 
 
 @router.get("/stats", response_class=HTMLResponse)
@@ -9,9 +9,9 @@ import io
 import shutil
 from pathlib import Path
 
-from app.seismo.database import get_db
-from app.seismo.models import RosterUnit, Emitter, IgnoredUnit, UserPreferences
-from app.seismo.services.database_backup import DatabaseBackupService
+from backend.database import get_db
+from backend.models import RosterUnit, Emitter, IgnoredUnit, UserPreferences
+from backend.services.database_backup import DatabaseBackupService
 
 router = APIRouter(prefix="/api/settings", tags=["settings"])
 
380  backend/routers/slm_dashboard.py  Normal file
@@ -0,0 +1,380 @@
"""
SLM Dashboard Router

Provides API endpoints for the Sound Level Meters dashboard page.
"""

from fastapi import APIRouter, Request, Depends, Query
from fastapi.templating import Jinja2Templates
from fastapi.responses import HTMLResponse
from sqlalchemy.orm import Session
from sqlalchemy import func
from datetime import datetime, timedelta
import asyncio
import httpx
import logging
import os

from backend.database import get_db
from backend.models import RosterUnit
from backend.routers.roster_edit import sync_slm_to_slmm_cache

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/api/slm-dashboard", tags=["slm-dashboard"])
templates = Jinja2Templates(directory="templates")

# SLMM backend URL - configurable via environment variable
SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")


@router.get("/stats", response_class=HTMLResponse)
async def get_slm_stats(request: Request, db: Session = Depends(get_db)):
    """
    Get summary statistics for SLM dashboard.
    Returns HTML partial with stat cards.
    """
    # Query all SLMs
    all_slms = db.query(RosterUnit).filter_by(device_type="sound_level_meter").all()

    # Count deployed vs benched
    deployed_count = sum(1 for slm in all_slms if slm.deployed and not slm.retired)
    benched_count = sum(1 for slm in all_slms if not slm.deployed and not slm.retired)
    retired_count = sum(1 for slm in all_slms if slm.retired)

    # Count recently active (checked in last hour)
    one_hour_ago = datetime.utcnow() - timedelta(hours=1)
    active_count = sum(1 for slm in all_slms
                       if slm.slm_last_check and slm.slm_last_check > one_hour_ago)

    return templates.TemplateResponse("partials/slm_stats.html", {
        "request": request,
        "total_count": len(all_slms),
        "deployed_count": deployed_count,
        "benched_count": benched_count,
        "active_count": active_count,
        "retired_count": retired_count
    })


@router.get("/units", response_class=HTMLResponse)
async def get_slm_units(
    request: Request,
    db: Session = Depends(get_db),
    search: str = Query(None),
    project: str = Query(None),
    include_measurement: bool = Query(False),
):
    """
    Get list of SLM units for the sidebar.
    Returns HTML partial with unit cards.
    """
    query = db.query(RosterUnit).filter_by(device_type="sound_level_meter")

    # Filter by project if provided
    if project:
        query = query.filter(RosterUnit.project_id == project)

    # Filter by search term if provided
    if search:
        search_term = f"%{search}%"
        query = query.filter(
            (RosterUnit.id.like(search_term)) |
            (RosterUnit.slm_model.like(search_term)) |
            (RosterUnit.address.like(search_term))
        )

    units = query.order_by(
        RosterUnit.retired.asc(),
        RosterUnit.deployed.desc(),
        RosterUnit.id.asc()
    ).all()

    one_hour_ago = datetime.utcnow() - timedelta(hours=1)
    for unit in units:
        unit.is_recent = bool(unit.slm_last_check and unit.slm_last_check > one_hour_ago)

    if include_measurement:
        async def fetch_measurement_state(client: httpx.AsyncClient, unit_id: str) -> str | None:
            try:
                response = await client.get(f"{SLMM_BASE_URL}/api/nl43/{unit_id}/measurement-state")
                if response.status_code == 200:
                    return response.json().get("measurement_state")
            except Exception:
                return None
            return None

        deployed_units = [unit for unit in units if unit.deployed and not unit.retired]
        if deployed_units:
            async with httpx.AsyncClient(timeout=3.0) as client:
                tasks = [fetch_measurement_state(client, unit.id) for unit in deployed_units]
                results = await asyncio.gather(*tasks, return_exceptions=True)

            for unit, state in zip(deployed_units, results):
                if isinstance(state, Exception):
                    unit.measurement_state = None
                else:
                    unit.measurement_state = state

    return templates.TemplateResponse("partials/slm_device_list.html", {
        "request": request,
        "units": units
    })


@router.get("/live-view/{unit_id}", response_class=HTMLResponse)
async def get_live_view(request: Request, unit_id: str, db: Session = Depends(get_db)):
    """
    Get live view panel for a specific SLM unit.
    Returns HTML partial with live metrics and chart.
    """
    # Get unit from database
    unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="sound_level_meter").first()

    if not unit:
        return templates.TemplateResponse("partials/slm_live_view_error.html", {
            "request": request,
            "error": f"Unit {unit_id} not found"
        })

    # Get modem information if assigned
    modem = None
    modem_ip = None
    if unit.deployed_with_modem_id:
        modem = db.query(RosterUnit).filter_by(id=unit.deployed_with_modem_id, device_type="modem").first()
        if modem:
            modem_ip = modem.ip_address
        else:
            logger.warning(f"SLM {unit_id} is assigned to modem {unit.deployed_with_modem_id} but modem not found")

    # Fallback to direct slm_host if no modem assigned (backward compatibility)
    if not modem_ip and unit.slm_host:
        modem_ip = unit.slm_host
        logger.info(f"Using legacy slm_host for {unit_id}: {modem_ip}")

    # Try to get current status from SLMM
    current_status = None
    measurement_state = None
    is_measuring = False

    try:
        async with httpx.AsyncClient(timeout=10.0) as client:
            # Get measurement state
            state_response = await client.get(
                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/measurement-state"
            )
            if state_response.status_code == 200:
                state_data = state_response.json()
                measurement_state = state_data.get("measurement_state", "Unknown")
                is_measuring = state_data.get("is_measuring", False)

            # If measuring, sync start time from FTP to database (fixes wrong timestamps)
            if is_measuring:
                try:
                    sync_response = await client.post(
                        f"{SLMM_BASE_URL}/api/nl43/{unit_id}/sync-start-time",
                        timeout=10.0
                    )
                    if sync_response.status_code == 200:
                        sync_data = sync_response.json()
                        logger.info(f"Synced start time for {unit_id}: {sync_data.get('message')}")
                    else:
                        logger.warning(f"Failed to sync start time for {unit_id}: {sync_response.status_code}")
                except Exception as e:
                    # Don't fail the whole request if sync fails
                    logger.warning(f"Could not sync start time for {unit_id}: {e}")

            # Get live status (now with corrected start time)
            status_response = await client.get(
                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/live"
            )
            if status_response.status_code == 200:
                status_data = status_response.json()
                current_status = status_data.get("data", {})
    except Exception as e:
        logger.error(f"Failed to get status for {unit_id}: {e}")

    return templates.TemplateResponse("partials/slm_live_view.html", {
        "request": request,
        "unit": unit,
        "modem": modem,
        "modem_ip": modem_ip,
        "current_status": current_status,
        "measurement_state": measurement_state,
|
||||
"is_measuring": is_measuring
|
||||
})
|
||||
|
||||
|
||||
@router.post("/control/{unit_id}/{action}")
|
||||
async def control_slm(unit_id: str, action: str):
|
||||
"""
|
||||
Send control commands to SLM (start, stop, pause, resume, reset).
|
||||
Proxies to SLMM backend.
|
||||
"""
|
||||
valid_actions = ["start", "stop", "pause", "resume", "reset"]
|
||||
|
||||
if action not in valid_actions:
|
||||
return {"status": "error", "detail": f"Invalid action. Must be one of: {valid_actions}"}
|
||||
|
||||
try:
|
||||
async with httpx.AsyncClient(timeout=10.0) as client:
|
||||
response = await client.post(
|
||||
f"{SLMM_BASE_URL}/api/nl43/{unit_id}/{action}"
|
||||
)
|
||||
|
||||
if response.status_code == 200:
|
||||
return response.json()
|
||||
else:
|
||||
return {
|
||||
"status": "error",
|
||||
"detail": f"SLMM returned status {response.status_code}"
|
||||
}
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to control {unit_id}: {e}")
|
||||
return {
|
||||
"status": "error",
|
||||
"detail": str(e)
|
||||
}
|
||||
|
||||
@router.get("/config/{unit_id}", response_class=HTMLResponse)
|
||||
async def get_slm_config(request: Request, unit_id: str, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Get configuration form for a specific SLM unit.
|
||||
Returns HTML partial with configuration form.
|
||||
"""
|
||||
unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="sound_level_meter").first()
|
||||
|
||||
if not unit:
|
||||
return HTMLResponse(
|
||||
content=f'<div class="text-red-500">Unit {unit_id} not found</div>',
|
||||
status_code=404
|
||||
)
|
||||
|
||||
return templates.TemplateResponse("partials/slm_config_form.html", {
|
||||
"request": request,
|
||||
"unit": unit
|
||||
})
|
||||
|
||||
|
||||
@router.post("/config/{unit_id}")
|
||||
async def save_slm_config(request: Request, unit_id: str, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Save SLM configuration.
|
||||
Updates unit parameters in the database.
|
||||
"""
|
||||
unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="sound_level_meter").first()
|
||||
|
||||
if not unit:
|
||||
return {"status": "error", "detail": f"Unit {unit_id} not found"}
|
||||
|
||||
try:
|
||||
# Get form data
|
||||
form_data = await request.form()
|
||||
|
||||
# Update SLM-specific fields
|
||||
unit.slm_model = form_data.get("slm_model") or None
|
||||
unit.slm_serial_number = form_data.get("slm_serial_number") or None
|
||||
unit.slm_frequency_weighting = form_data.get("slm_frequency_weighting") or None
|
||||
unit.slm_time_weighting = form_data.get("slm_time_weighting") or None
|
||||
unit.slm_measurement_range = form_data.get("slm_measurement_range") or None
|
||||
|
||||
# Update network configuration
|
||||
modem_id = form_data.get("deployed_with_modem_id")
|
||||
unit.deployed_with_modem_id = modem_id if modem_id else None
|
||||
|
||||
# Always update TCP and FTP ports (used regardless of modem assignment)
|
||||
unit.slm_tcp_port = int(form_data.get("slm_tcp_port")) if form_data.get("slm_tcp_port") else None
|
||||
unit.slm_ftp_port = int(form_data.get("slm_ftp_port")) if form_data.get("slm_ftp_port") else None
|
||||
|
||||
# Only update direct IP if no modem is assigned
|
||||
if not modem_id:
|
||||
unit.slm_host = form_data.get("slm_host") or None
|
||||
else:
|
||||
# Clear legacy direct IP field when modem is assigned
|
||||
unit.slm_host = None
|
||||
|
||||
db.commit()
|
||||
logger.info(f"Updated configuration for SLM {unit_id}")
|
||||
|
||||
# Sync updated configuration to SLMM cache
|
||||
logger.info(f"Syncing SLM {unit_id} config changes to SLMM cache...")
|
||||
result = await sync_slm_to_slmm_cache(
|
||||
unit_id=unit_id,
|
||||
host=unit.slm_host, # Use the updated host from Terra-View
|
||||
tcp_port=unit.slm_tcp_port,
|
||||
ftp_port=unit.slm_ftp_port,
|
||||
deployed_with_modem_id=unit.deployed_with_modem_id, # Resolve modem IP if assigned
|
||||
db=db
|
||||
)
|
||||
|
||||
if not result["success"]:
|
||||
logger.warning(f"SLMM cache sync warning for {unit_id}: {result['message']}")
|
||||
# Config still saved in Terra-View (source of truth)
|
||||
|
||||
return {"status": "success", "unit_id": unit_id}
|
||||
|
||||
except Exception as e:
|
||||
db.rollback()
|
||||
logger.error(f"Failed to save config for {unit_id}: {e}")
|
||||
return {"status": "error", "detail": str(e)}
|
||||
|
||||
|
||||
@router.get("/test-modem/{modem_id}")
|
||||
async def test_modem_connection(modem_id: str, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Test modem connectivity with a simple ping/health check.
|
||||
Returns response time and connection status.
|
||||
"""
|
||||
import subprocess
|
||||
import time
|
||||
|
||||
# Get modem from database
|
||||
modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
|
||||
|
||||
if not modem:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} not found"}
|
||||
|
||||
if not modem.ip_address:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} has no IP address configured"}
|
||||
|
||||
try:
|
||||
# Ping the modem (1 packet, 2 second timeout)
|
||||
start_time = time.time()
|
||||
result = subprocess.run(
|
||||
["ping", "-c", "1", "-W", "2", modem.ip_address],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
timeout=3
|
||||
)
|
||||
response_time = int((time.time() - start_time) * 1000) # Convert to milliseconds
|
||||
|
||||
if result.returncode == 0:
|
||||
return {
|
||||
"status": "success",
|
||||
"modem_id": modem_id,
|
||||
"ip_address": modem.ip_address,
|
||||
"response_time": response_time,
|
||||
"message": "Modem is responding to ping"
|
||||
}
|
||||
else:
|
||||
return {
|
||||
"status": "error",
|
||||
"modem_id": modem_id,
|
||||
"ip_address": modem.ip_address,
|
||||
"detail": "Modem not responding to ping"
|
||||
}
|
||||
|
||||
except subprocess.TimeoutExpired:
|
||||
return {
|
||||
"status": "error",
|
||||
"modem_id": modem_id,
|
||||
"ip_address": modem.ip_address,
|
||||
"detail": "Ping timeout (> 2 seconds)"
|
||||
}
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to ping modem {modem_id}: {e}")
|
||||
return {
|
||||
"status": "error",
|
||||
"modem_id": modem_id,
|
||||
"detail": str(e)
|
||||
}
|
||||
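The device-list handler above fans out one status request per deployed unit and uses `asyncio.gather(..., return_exceptions=True)` so that one unreachable device cannot take down the whole page. A minimal, self-contained sketch of that pattern (the hypothetical `fetch_state` stands in for the real SLMM HTTP call):

```python
import asyncio

async def fetch_state(unit_id: str) -> str:
    # Stand-in for the real SLMM HTTP call; "bad" simulates a device error
    if unit_id == "bad":
        raise ConnectionError("device unreachable")
    return "Start"

async def gather_states(unit_ids):
    # One task per unit; return_exceptions=True keeps one failure from
    # cancelling the other lookups
    results = await asyncio.gather(
        *(fetch_state(u) for u in unit_ids), return_exceptions=True
    )
    # Map exceptions to None, mirroring how unit.measurement_state is set
    return {
        u: (None if isinstance(r, Exception) else r)
        for u, r in zip(unit_ids, results)
    }

states = asyncio.run(gather_states(["nl43-001", "bad", "nl43-002"]))
# states == {"nl43-001": "Start", "bad": None, "nl43-002": "Start"}
```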
backend/routers/slm_ui.py (new file, 123 lines)
@@ -0,0 +1,123 @@
"""
Sound Level Meter UI Router

Provides endpoints for SLM dashboard cards, detail pages, and real-time data.
"""

from fastapi import APIRouter, Depends, HTTPException, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates
from sqlalchemy.orm import Session
from datetime import datetime
import httpx
import logging
import os

from backend.database import get_db
from backend.models import RosterUnit

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/slm", tags=["slm-ui"])
templates = Jinja2Templates(directory="templates")

SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://172.19.0.1:8100")


@router.get("/{unit_id}", response_class=HTMLResponse)
async def slm_detail_page(request: Request, unit_id: str, db: Session = Depends(get_db)):
    """Sound level meter detail page with controls."""

    # Get roster unit
    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
    if not unit or unit.device_type != "sound_level_meter":
        raise HTTPException(status_code=404, detail="Sound level meter not found")

    return templates.TemplateResponse("slm_detail.html", {
        "request": request,
        "unit": unit,
        "unit_id": unit_id
    })


@router.get("/api/{unit_id}/summary")
async def get_slm_summary(unit_id: str, db: Session = Depends(get_db)):
    """Get SLM summary data for dashboard card."""

    # Get roster unit
    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
    if not unit or unit.device_type != "sound_level_meter":
        raise HTTPException(status_code=404, detail="Sound level meter not found")

    # Try to get live status from SLMM
    status_data = None
    try:
        async with httpx.AsyncClient(timeout=3.0) as client:
            response = await client.get(f"{SLMM_BASE_URL}/api/nl43/{unit_id}/status")
            if response.status_code == 200:
                status_data = response.json().get("data")
    except Exception as e:
        logger.warning(f"Failed to get SLM status for {unit_id}: {e}")

    return {
        "unit_id": unit_id,
        "device_type": "sound_level_meter",
        "deployed": unit.deployed,
        "model": unit.slm_model or "NL-43",
        "location": unit.address or unit.location,
        "coordinates": unit.coordinates,
        "note": unit.note,
        "status": status_data,
        "last_check": unit.slm_last_check.isoformat() if unit.slm_last_check else None,
    }


@router.get("/partials/{unit_id}/card", response_class=HTMLResponse)
async def slm_dashboard_card(request: Request, unit_id: str, db: Session = Depends(get_db)):
    """Render SLM dashboard card partial."""

    summary = await get_slm_summary(unit_id, db)

    return templates.TemplateResponse("partials/slm_card.html", {
        "request": request,
        "slm": summary
    })


@router.get("/partials/{unit_id}/controls", response_class=HTMLResponse)
async def slm_controls_partial(request: Request, unit_id: str, db: Session = Depends(get_db)):
    """Render SLM control panel partial."""

    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
    if not unit or unit.device_type != "sound_level_meter":
        raise HTTPException(status_code=404, detail="Sound level meter not found")

    # Get current status from SLMM
    measurement_state = None
    battery_level = None
    try:
        async with httpx.AsyncClient(timeout=3.0) as client:
            # Get measurement state
            state_response = await client.get(
                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/measurement-state"
            )
            if state_response.status_code == 200:
                measurement_state = state_response.json().get("measurement_state")

            # Get battery level
            battery_response = await client.get(
                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/battery"
            )
            if battery_response.status_code == 200:
                battery_level = battery_response.json().get("battery_level")
    except Exception as e:
        logger.warning(f"Failed to get SLM control data for {unit_id}: {e}")

    return templates.TemplateResponse("partials/slm_controls.html", {
        "request": request,
        "unit_id": unit_id,
        "unit": unit,
        "measurement_state": measurement_state,
        "battery_level": battery_level,
        "is_measuring": measurement_state == "Start"
    })
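`get_slm_summary` above deliberately degrades: if SLMM is down, the summary is still returned with `"status": None` rather than failing the dashboard card. The same try/except-with-default shape as a standalone sketch (`flaky_status` and `best_effort` are illustrative names, not part of the diff):

```python
import asyncio

async def best_effort(coro, default=None, log=lambda m: None):
    # Await a status lookup, but never let its failure break the response
    try:
        return await coro
    except Exception as e:
        log(f"status lookup failed: {e}")
        return default

async def flaky_status():
    # Simulates SLMM being unreachable
    raise TimeoutError("SLMM timed out")

async def build_summary(unit_id: str):
    status = await best_effort(flaky_status())
    # The card still renders; "status" is simply None when SLMM is down
    return {"unit_id": unit_id, "status": status}

summary = asyncio.run(build_summary("nl43-001"))
# summary == {"unit_id": "nl43-001", "status": None}
```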
backend/routers/slmm.py (new file, 301 lines)
@@ -0,0 +1,301 @@
"""
SLMM (Sound Level Meter Manager) Proxy Router

Proxies requests from SFM to the standalone SLMM backend service.
SLMM runs on port 8100 and handles NL43/NL53 sound level meter communication.
"""

from fastapi import APIRouter, HTTPException, Request, Response, WebSocket, WebSocketDisconnect
from fastapi.responses import StreamingResponse
import httpx
import websockets
import asyncio
import logging
import os

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/api/slmm", tags=["slmm"])

# SLMM backend URL - configurable via environment variable
SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
# WebSocket URL derived from HTTP URL
SLMM_WS_BASE_URL = SLMM_BASE_URL.replace("http://", "ws://").replace("https://", "wss://")


@router.get("/health")
async def check_slmm_health():
    """
    Check if the SLMM backend service is reachable and healthy.
    """
    try:
        async with httpx.AsyncClient(timeout=5.0) as client:
            response = await client.get(f"{SLMM_BASE_URL}/health")

            if response.status_code == 200:
                data = response.json()
                return {
                    "status": "ok",
                    "slmm_status": "connected",
                    "slmm_url": SLMM_BASE_URL,
                    "slmm_version": data.get("version", "unknown"),
                    "slmm_response": data
                }
            else:
                return {
                    "status": "degraded",
                    "slmm_status": "error",
                    "slmm_url": SLMM_BASE_URL,
                    "detail": f"SLMM returned status {response.status_code}"
                }

    except httpx.ConnectError:
        return {
            "status": "error",
            "slmm_status": "unreachable",
            "slmm_url": SLMM_BASE_URL,
            "detail": "Cannot connect to SLMM backend. Is it running?"
        }
    except Exception as e:
        return {
            "status": "error",
            "slmm_status": "error",
            "slmm_url": SLMM_BASE_URL,
            "detail": str(e)
        }


# WebSocket routes MUST come before the catch-all route
@router.websocket("/{unit_id}/stream")
async def proxy_websocket_stream(websocket: WebSocket, unit_id: str):
    """
    Proxy WebSocket connections to SLMM's /stream endpoint.

    This allows real-time streaming of measurement data from NL43 devices
    through the SFM unified interface.
    """
    await websocket.accept()
    logger.info(f"WebSocket connection accepted for SLMM unit {unit_id}")

    # Build target WebSocket URL
    target_ws_url = f"{SLMM_WS_BASE_URL}/api/nl43/{unit_id}/stream"
    logger.info(f"Connecting to SLMM WebSocket: {target_ws_url}")

    backend_ws = None

    try:
        # Connect to SLMM backend WebSocket
        backend_ws = await websockets.connect(target_ws_url)
        logger.info(f"Connected to SLMM backend WebSocket for {unit_id}")

        # Create tasks for bidirectional communication
        async def forward_to_backend():
            """Forward messages from client to SLMM backend"""
            try:
                while True:
                    data = await websocket.receive_text()
                    await backend_ws.send(data)
            except WebSocketDisconnect:
                logger.info(f"Client WebSocket disconnected for {unit_id}")
            except Exception as e:
                logger.error(f"Error forwarding to backend: {e}")

        async def forward_to_client():
            """Forward messages from SLMM backend to client"""
            try:
                async for message in backend_ws:
                    await websocket.send_text(message)
            except websockets.exceptions.ConnectionClosed:
                logger.info(f"Backend WebSocket closed for {unit_id}")
            except Exception as e:
                logger.error(f"Error forwarding to client: {e}")

        # Run both forwarding tasks concurrently
        await asyncio.gather(
            forward_to_backend(),
            forward_to_client(),
            return_exceptions=True
        )

    except websockets.exceptions.WebSocketException as e:
        logger.error(f"WebSocket error connecting to SLMM backend: {e}")
        try:
            await websocket.send_json({
                "error": "Failed to connect to SLMM backend",
                "detail": str(e)
            })
        except Exception:
            pass
    except Exception as e:
        logger.error(f"Unexpected error in WebSocket proxy for {unit_id}: {e}")
        try:
            await websocket.send_json({
                "error": "Internal server error",
                "detail": str(e)
            })
        except Exception:
            pass
    finally:
        # Clean up connections
        if backend_ws:
            try:
                await backend_ws.close()
            except Exception:
                pass
        try:
            await websocket.close()
        except Exception:
            pass
        logger.info(f"WebSocket proxy closed for {unit_id}")


@router.websocket("/{unit_id}/live")
async def proxy_websocket_live(websocket: WebSocket, unit_id: str):
    """
    Proxy WebSocket connections to SLMM's /live endpoint.

    Alternative WebSocket endpoint that may be used by some frontend components.
    """
    await websocket.accept()
    logger.info(f"WebSocket connection accepted for SLMM unit {unit_id} (live endpoint)")

    # Build target WebSocket URL - use /stream, as SLMM serves WebSockets there
    target_ws_url = f"{SLMM_WS_BASE_URL}/api/nl43/{unit_id}/stream"
    logger.info(f"Connecting to SLMM WebSocket: {target_ws_url}")

    backend_ws = None

    try:
        # Connect to SLMM backend WebSocket
        backend_ws = await websockets.connect(target_ws_url)
        logger.info(f"Connected to SLMM backend WebSocket for {unit_id} (live endpoint)")

        # Create tasks for bidirectional communication
        async def forward_to_backend():
            """Forward messages from client to SLMM backend"""
            try:
                while True:
                    data = await websocket.receive_text()
                    await backend_ws.send(data)
            except WebSocketDisconnect:
                logger.info(f"Client WebSocket disconnected for {unit_id} (live)")
            except Exception as e:
                logger.error(f"Error forwarding to backend (live): {e}")

        async def forward_to_client():
            """Forward messages from SLMM backend to client"""
            try:
                async for message in backend_ws:
                    await websocket.send_text(message)
            except websockets.exceptions.ConnectionClosed:
                logger.info(f"Backend WebSocket closed for {unit_id} (live)")
            except Exception as e:
                logger.error(f"Error forwarding to client (live): {e}")

        # Run both forwarding tasks concurrently
        await asyncio.gather(
            forward_to_backend(),
            forward_to_client(),
            return_exceptions=True
        )

    except websockets.exceptions.WebSocketException as e:
        logger.error(f"WebSocket error connecting to SLMM backend (live): {e}")
        try:
            await websocket.send_json({
                "error": "Failed to connect to SLMM backend",
                "detail": str(e)
            })
        except Exception:
            pass
    except Exception as e:
        logger.error(f"Unexpected error in WebSocket proxy for {unit_id} (live): {e}")
        try:
            await websocket.send_json({
                "error": "Internal server error",
                "detail": str(e)
            })
        except Exception:
            pass
    finally:
        # Clean up connections
        if backend_ws:
            try:
                await backend_ws.close()
            except Exception:
                pass
        try:
            await websocket.close()
        except Exception:
            pass
        logger.info(f"WebSocket proxy closed for {unit_id} (live)")


# HTTP catch-all route MUST come after specific routes (including WebSocket routes)
@router.api_route("/{path:path}", methods=["GET", "POST", "PUT", "DELETE", "PATCH"])
async def proxy_to_slmm(path: str, request: Request):
    """
    Proxy all requests to the SLMM backend service.

    This allows SFM to act as a unified frontend for all device types,
    while SLMM remains a standalone backend service.
    """
    # Build target URL
    target_url = f"{SLMM_BASE_URL}/api/nl43/{path}"

    # Get query parameters
    query_params = dict(request.query_params)

    # Get request body if present
    body = None
    if request.method in ["POST", "PUT", "PATCH"]:
        try:
            body = await request.body()
        except Exception as e:
            logger.error(f"Failed to read request body: {e}")
            body = None

    # Get headers (exclude host and other proxy-specific headers)
    headers = dict(request.headers)
    headers_to_exclude = ["host", "content-length", "transfer-encoding", "connection"]
    proxy_headers = {k: v for k, v in headers.items() if k.lower() not in headers_to_exclude}

    logger.info(f"Proxying {request.method} request to SLMM: {target_url}")

    try:
        async with httpx.AsyncClient(timeout=30.0) as client:
            # Forward the request to SLMM
            response = await client.request(
                method=request.method,
                url=target_url,
                params=query_params,
                headers=proxy_headers,
                content=body
            )

            # Return the response from SLMM, dropping length/encoding and
            # hop-by-hop headers: httpx decompresses the body, so the original
            # content-length and content-encoding no longer match the content
            response_headers = {
                k: v for k, v in response.headers.items()
                if k.lower() not in ("content-length", "transfer-encoding", "content-encoding", "connection")
            }
            return Response(
                content=response.content,
                status_code=response.status_code,
                headers=response_headers,
                media_type=response.headers.get("content-type")
            )

    except httpx.ConnectError:
        logger.error(f"Failed to connect to SLMM backend at {SLMM_BASE_URL}")
        raise HTTPException(
            status_code=503,
            detail=f"SLMM backend service unavailable. Is SLMM running on {SLMM_BASE_URL}?"
        )
    except httpx.TimeoutException:
        logger.error(f"Timeout connecting to SLMM backend at {SLMM_BASE_URL}")
        raise HTTPException(
            status_code=504,
            detail="SLMM backend timeout"
        )
    except Exception as e:
        logger.error(f"Error proxying to SLMM: {e}")
        raise HTTPException(
            status_code=500,
            detail=f"Failed to proxy request to SLMM: {str(e)}"
        )
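`proxy_to_slmm` strips connection-level headers before forwarding, since `Host`, `Content-Length`, and friends describe the hop rather than the payload. That filter is a pure function and easy to check in isolation; a small sketch (`filter_proxy_headers` is an illustrative name, not from the diff):

```python
# Headers describing the connection itself must not be forwarded by a proxy
HEADERS_TO_EXCLUDE = {"host", "content-length", "transfer-encoding", "connection"}

def filter_proxy_headers(headers: dict) -> dict:
    # Case-insensitive exclusion, same shape as the router's dict comprehension
    return {k: v for k, v in headers.items() if k.lower() not in HEADERS_TO_EXCLUDE}

incoming = {
    "Host": "terra-view.local",
    "Content-Length": "42",
    "Authorization": "Bearer abc",
    "Accept": "application/json",
}
forwarded = filter_proxy_headers(incoming)
# forwarded == {"Authorization": "Bearer abc", "Accept": "application/json"}
```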
@@ -3,8 +3,9 @@ from sqlalchemy.orm import Session
 from datetime import datetime
 from typing import Dict, Any
 
-from app.seismo.database import get_db
-from app.seismo.services.snapshot import emit_status_snapshot
+from backend.database import get_db
+from backend.services.snapshot import emit_status_snapshot
+from backend.models import RosterUnit
 
 router = APIRouter(prefix="/api", tags=["units"])
@@ -42,3 +43,32 @@ def get_unit_detail(unit_id: str, db: Session = Depends(get_db)):
         "note": unit_data.get("note", ""),
         "coordinates": coords
     }
+
+
+@router.get("/units/{unit_id}")
+def get_unit_by_id(unit_id: str, db: Session = Depends(get_db)):
+    """
+    Get unit data directly from the roster (for settings/configuration).
+    """
+    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
+
+    if not unit:
+        raise HTTPException(status_code=404, detail=f"Unit {unit_id} not found")
+
+    return {
+        "id": unit.id,
+        "unit_type": unit.unit_type,
+        "device_type": unit.device_type,
+        "deployed": unit.deployed,
+        "retired": unit.retired,
+        "note": unit.note,
+        "location": unit.location,
+        "address": unit.address,
+        "coordinates": unit.coordinates,
+        "slm_host": unit.slm_host,
+        "slm_tcp_port": unit.slm_tcp_port,
+        "slm_ftp_port": unit.slm_ftp_port,
+        "slm_model": unit.slm_model,
+        "slm_serial_number": unit.slm_serial_number,
+        "deployed_with_modem_id": unit.deployed_with_modem_id
+    }
@@ -4,8 +4,8 @@ from pydantic import BaseModel
 from datetime import datetime
 from typing import Optional, List
 
-from app.seismo.database import get_db
-from app.seismo.models import Emitter
+from backend.database import get_db
+from backend.models import Emitter
 
 router = APIRouter()
@@ -10,7 +10,7 @@ from datetime import datetime
 from typing import Optional
 import logging
 
-from app.seismo.services.database_backup import DatabaseBackupService
+from backend.services.database_backup import DatabaseBackupService
 
 logger = logging.getLogger(__name__)
backend/services/device_controller.py (new file, 384 lines)
@@ -0,0 +1,384 @@
"""
Device Controller Service

Routes device operations to the appropriate backend module:
- SLMM for sound level meters
- SFM for seismographs (future implementation)

This abstraction allows the Projects system to work with any device type
without knowing the underlying communication protocol.
"""

from typing import Dict, Any, Optional, List
from backend.services.slmm_client import get_slmm_client, SLMMClientError


class DeviceControllerError(Exception):
    """Base exception for device controller errors."""
    pass


class UnsupportedDeviceTypeError(DeviceControllerError):
    """Raised when device type is not supported."""
    pass


class DeviceController:
    """
    Unified interface for controlling all device types.

    Routes commands to the appropriate backend module based on device_type.

    Usage:
        controller = DeviceController()
        await controller.start_recording("nl43-001", "sound_level_meter", config={})
        await controller.stop_recording("seismo-042", "seismograph")
    """

    def __init__(self):
        self.slmm_client = get_slmm_client()

    # ========================================================================
    # Recording Control
    # ========================================================================

    async def start_recording(
        self,
        unit_id: str,
        device_type: str,
        config: Optional[Dict[str, Any]] = None,
    ) -> Dict[str, Any]:
        """
        Start recording on a device.

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            config: Device-specific recording configuration

        Returns:
            Response dict from device module

        Raises:
            UnsupportedDeviceTypeError: Device type not supported
            DeviceControllerError: Operation failed
        """
        if device_type == "sound_level_meter":
            try:
                return await self.slmm_client.start_recording(unit_id, config)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            # TODO: Implement SFM client for seismograph control
            # For now, return a placeholder response
            return {
                "status": "not_implemented",
                "message": "Seismograph recording control not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(
                f"Device type '{device_type}' is not supported. "
                f"Supported types: sound_level_meter, seismograph"
            )

    async def stop_recording(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Stop recording on a device.

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"

        Returns:
            Response dict from device module
        """
        if device_type == "sound_level_meter":
            try:
                return await self.slmm_client.stop_recording(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            # TODO: Implement SFM client
            return {
                "status": "not_implemented",
                "message": "Seismograph recording control not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    async def pause_recording(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Pause recording on a device.

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"

        Returns:
            Response dict from device module
        """
        if device_type == "sound_level_meter":
            try:
                return await self.slmm_client.pause_recording(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph pause not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    async def resume_recording(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Resume paused recording on a device.

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"

        Returns:
            Response dict from device module
        """
        if device_type == "sound_level_meter":
            try:
                return await self.slmm_client.resume_recording(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph resume not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    # ========================================================================
    # Status & Monitoring
    # ========================================================================

    async def get_device_status(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Get current device status.

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"

        Returns:
            Status dict from device module
        """
        if device_type == "sound_level_meter":
            try:
                return await self.slmm_client.get_unit_status(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            # TODO: Implement SFM status check
            return {
                "status": "not_implemented",
                "message": "Seismograph status not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    async def get_live_data(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Get live data from device.

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"

        Returns:
            Live data dict from device module
        """
        if device_type == "sound_level_meter":
            try:
                return await self.slmm_client.get_live_data(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph live data not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    # ========================================================================
    # Data Download
    # ========================================================================

    async def download_files(
        self,
        unit_id: str,
        device_type: str,
        destination_path: str,
        files: Optional[List[str]] = None,
    ) -> Dict[str, Any]:
        """
        Download data files from device.

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            destination_path: Local path to save files
            files: List of filenames, or None for all

        Returns:
            Download result with file list
        """
if device_type == "sound_level_meter":
|
||||
try:
|
||||
return await self.slmm_client.download_files(
|
||||
unit_id,
|
||||
destination_path,
|
||||
files,
|
||||
)
|
||||
except SLMMClientError as e:
|
||||
raise DeviceControllerError(f"SLMM error: {str(e)}")
|
||||
|
||||
elif device_type == "seismograph":
|
||||
# TODO: Implement SFM file download
|
||||
return {
|
||||
"status": "not_implemented",
|
||||
"message": "Seismograph file download not yet implemented",
|
||||
"unit_id": unit_id,
|
||||
}
|
||||
|
||||
else:
|
||||
raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
|
||||
|
||||
# ========================================================================
|
||||
# Device Configuration
|
||||
# ========================================================================
|
||||
|
||||
async def update_device_config(
|
||||
self,
|
||||
unit_id: str,
|
||||
device_type: str,
|
||||
config: Dict[str, Any],
|
||||
) -> Dict[str, Any]:
|
||||
"""
|
||||
Update device configuration.
|
||||
|
||||
Args:
|
||||
unit_id: Unit identifier
|
||||
device_type: "sound_level_meter" | "seismograph"
|
||||
config: Configuration parameters
|
||||
|
||||
Returns:
|
||||
Updated config from device module
|
||||
"""
|
||||
if device_type == "sound_level_meter":
|
||||
try:
|
||||
return await self.slmm_client.update_unit_config(
|
||||
unit_id,
|
||||
host=config.get("host"),
|
||||
tcp_port=config.get("tcp_port"),
|
||||
ftp_port=config.get("ftp_port"),
|
||||
ftp_username=config.get("ftp_username"),
|
||||
ftp_password=config.get("ftp_password"),
|
||||
)
|
||||
except SLMMClientError as e:
|
||||
raise DeviceControllerError(f"SLMM error: {str(e)}")
|
||||
|
||||
elif device_type == "seismograph":
|
||||
return {
|
||||
"status": "not_implemented",
|
||||
"message": "Seismograph config update not yet implemented",
|
||||
"unit_id": unit_id,
|
||||
}
|
||||
|
||||
else:
|
||||
raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
|
||||
|
||||
# ========================================================================
|
||||
# Health Check
|
||||
# ========================================================================
|
||||
|
||||
async def check_device_connectivity(
|
||||
self,
|
||||
unit_id: str,
|
||||
device_type: str,
|
||||
) -> bool:
|
||||
"""
|
||||
Check if device is reachable.
|
||||
|
||||
Args:
|
||||
unit_id: Unit identifier
|
||||
device_type: "sound_level_meter" | "seismograph"
|
||||
|
||||
Returns:
|
||||
True if device is reachable, False otherwise
|
||||
"""
|
||||
if device_type == "sound_level_meter":
|
||||
try:
|
||||
status = await self.slmm_client.get_unit_status(unit_id)
|
||||
return status.get("last_seen") is not None
|
||||
except:
|
||||
return False
|
||||
|
||||
elif device_type == "seismograph":
|
||||
# TODO: Implement SFM connectivity check
|
||||
return False
|
||||
|
||||
else:
|
||||
return False
|
||||
|
||||
|
||||
# Singleton instance
|
||||
_default_controller: Optional[DeviceController] = None
|
||||
|
||||
|
||||
def get_device_controller() -> DeviceController:
|
||||
"""
|
||||
Get the default device controller instance.
|
||||
|
||||
Returns:
|
||||
DeviceController instance
|
||||
"""
|
||||
global _default_controller
|
||||
if _default_controller is None:
|
||||
_default_controller = DeviceController()
|
||||
return _default_controller
|
||||
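The per-device-type routing above follows one repeated shape: delegate known types to a module client, return a structured `not_implemented` placeholder for recognized-but-unwired types, and raise for anything else. A minimal self-contained sketch of that shape (names here are illustrative stand-ins, not the repo's actual classes):

```python
from typing import Any, Dict


class UnsupportedDeviceTypeError(Exception):
    """Raised for device types the controller does not know about."""


def route_command(device_type: str, unit_id: str) -> Dict[str, Any]:
    # Known type: delegate to the module client (stubbed as a dict here).
    if device_type == "sound_level_meter":
        return {"status": "ok", "unit_id": unit_id}
    # Recognized but not yet wired up: structured placeholder response.
    elif device_type == "seismograph":
        return {"status": "not_implemented", "unit_id": unit_id}
    # Unknown type: fail loudly rather than silently no-op.
    else:
        raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
```

The structured placeholder (rather than an exception) lets callers treat "not implemented yet" as a normal response while the SFM side is built out.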
355 backend/services/scheduler.py Normal file
@@ -0,0 +1,355 @@
"""
Scheduler Service

Executes scheduled actions for Projects system.
Monitors pending scheduled actions and executes them by calling device modules (SLMM/SFM).

This service runs as a background task in FastAPI, checking for pending actions
every minute and executing them when their scheduled time arrives.
"""

import asyncio
import json
from datetime import datetime, timedelta
from typing import Optional, List, Dict, Any
from sqlalchemy.orm import Session
from sqlalchemy import and_

from backend.database import SessionLocal
from backend.models import ScheduledAction, RecordingSession, MonitoringLocation, Project
from backend.services.device_controller import get_device_controller, DeviceControllerError
import uuid


class SchedulerService:
    """
    Service for executing scheduled actions.

    Usage:
        scheduler = SchedulerService()
        await scheduler.start()  # Start background loop
        scheduler.stop()         # Stop background loop
    """

    def __init__(self, check_interval: int = 60):
        """
        Initialize scheduler.

        Args:
            check_interval: Seconds between checks for pending actions (default: 60)
        """
        self.check_interval = check_interval
        self.running = False
        self.task: Optional[asyncio.Task] = None
        self.device_controller = get_device_controller()

    async def start(self):
        """Start the scheduler background task."""
        if self.running:
            print("Scheduler is already running")
            return

        self.running = True
        self.task = asyncio.create_task(self._run_loop())
        print(f"Scheduler started (checking every {self.check_interval}s)")

    def stop(self):
        """Stop the scheduler background task."""
        self.running = False
        if self.task:
            self.task.cancel()
        print("Scheduler stopped")

    async def _run_loop(self):
        """Main scheduler loop."""
        while self.running:
            try:
                await self.execute_pending_actions()
            except Exception as e:
                print(f"Scheduler error: {e}")
                # Continue running even if there's an error

            await asyncio.sleep(self.check_interval)

    async def execute_pending_actions(self) -> List[Dict[str, Any]]:
        """
        Find and execute all pending scheduled actions that are due.

        Returns:
            List of execution results
        """
        db = SessionLocal()
        results = []

        try:
            # Find pending actions that are due
            now = datetime.utcnow()
            pending_actions = db.query(ScheduledAction).filter(
                and_(
                    ScheduledAction.execution_status == "pending",
                    ScheduledAction.scheduled_time <= now,
                )
            ).order_by(ScheduledAction.scheduled_time).all()

            if not pending_actions:
                return []

            print(f"Found {len(pending_actions)} pending action(s) to execute")

            for action in pending_actions:
                result = await self._execute_action(action, db)
                results.append(result)

            db.commit()

        except Exception as e:
            print(f"Error executing pending actions: {e}")
            db.rollback()
        finally:
            db.close()

        return results

    async def _execute_action(
        self,
        action: ScheduledAction,
        db: Session,
    ) -> Dict[str, Any]:
        """
        Execute a single scheduled action.

        Args:
            action: ScheduledAction to execute
            db: Database session

        Returns:
            Execution result dict
        """
        print(f"Executing action {action.id}: {action.action_type} for unit {action.unit_id}")

        result = {
            "action_id": action.id,
            "action_type": action.action_type,
            "unit_id": action.unit_id,
            "scheduled_time": action.scheduled_time.isoformat(),
            "success": False,
            "error": None,
        }

        try:
            # Determine which unit to use
            # If unit_id is specified, use it; otherwise get from location assignment
            unit_id = action.unit_id
            if not unit_id:
                # Get assigned unit from location
                from backend.models import UnitAssignment
                assignment = db.query(UnitAssignment).filter(
                    and_(
                        UnitAssignment.location_id == action.location_id,
                        UnitAssignment.status == "active",
                    )
                ).first()

                if not assignment:
                    raise Exception(f"No active unit assigned to location {action.location_id}")

                unit_id = assignment.unit_id

            # Execute the action based on type
            if action.action_type == "start":
                response = await self._execute_start(action, unit_id, db)
            elif action.action_type == "stop":
                response = await self._execute_stop(action, unit_id, db)
            elif action.action_type == "download":
                response = await self._execute_download(action, unit_id, db)
            else:
                raise Exception(f"Unknown action type: {action.action_type}")

            # Mark action as completed
            action.execution_status = "completed"
            action.executed_at = datetime.utcnow()
            action.module_response = json.dumps(response)

            result["success"] = True
            result["response"] = response

            print(f"✓ Action {action.id} completed successfully")

        except Exception as e:
            # Mark action as failed
            action.execution_status = "failed"
            action.executed_at = datetime.utcnow()
            action.error_message = str(e)

            result["error"] = str(e)

            print(f"✗ Action {action.id} failed: {e}")

        return result

    async def _execute_start(
        self,
        action: ScheduledAction,
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a 'start' action."""
        # Start recording via device controller
        response = await self.device_controller.start_recording(
            unit_id,
            action.device_type,
            config={},  # TODO: Load config from action.notes or metadata
        )

        # Create recording session
        session = RecordingSession(
            id=str(uuid.uuid4()),
            project_id=action.project_id,
            location_id=action.location_id,
            unit_id=unit_id,
            session_type="sound" if action.device_type == "sound_level_meter" else "vibration",
            started_at=datetime.utcnow(),
            status="recording",
            session_metadata=json.dumps({"scheduled_action_id": action.id}),
        )
        db.add(session)

        return {
            "status": "started",
            "session_id": session.id,
            "device_response": response,
        }

    async def _execute_stop(
        self,
        action: ScheduledAction,
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a 'stop' action."""
        # Stop recording via device controller
        response = await self.device_controller.stop_recording(
            unit_id,
            action.device_type,
        )

        # Find and update the active recording session
        active_session = db.query(RecordingSession).filter(
            and_(
                RecordingSession.location_id == action.location_id,
                RecordingSession.unit_id == unit_id,
                RecordingSession.status == "recording",
            )
        ).first()

        if active_session:
            active_session.stopped_at = datetime.utcnow()
            active_session.status = "completed"
            active_session.duration_seconds = int(
                (active_session.stopped_at - active_session.started_at).total_seconds()
            )

        return {
            "status": "stopped",
            "session_id": active_session.id if active_session else None,
            "device_response": response,
        }

    async def _execute_download(
        self,
        action: ScheduledAction,
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a 'download' action."""
        # Get project and location info for file path
        location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
        project = db.query(Project).filter_by(id=action.project_id).first()

        if not location or not project:
            raise Exception("Project or location not found")

        # Build destination path
        # Example: data/Projects/{project-id}/sound/{location-name}/session-{timestamp}/
        session_timestamp = datetime.utcnow().strftime("%Y-%m-%d-%H%M")
        location_type_dir = "sound" if action.device_type == "sound_level_meter" else "vibration"

        destination_path = (
            f"data/Projects/{project.id}/{location_type_dir}/"
            f"{location.name}/session-{session_timestamp}/"
        )

        # Download files via device controller
        response = await self.device_controller.download_files(
            unit_id,
            action.device_type,
            destination_path,
            files=None,  # Download all files
        )

        # TODO: Create DataFile records for downloaded files

        return {
            "status": "downloaded",
            "destination_path": destination_path,
            "device_response": response,
        }

    # ========================================================================
    # Manual Execution (for testing/debugging)
    # ========================================================================

    async def execute_action_by_id(self, action_id: str) -> Dict[str, Any]:
        """
        Manually execute a specific action by ID.

        Args:
            action_id: ScheduledAction ID

        Returns:
            Execution result
        """
        db = SessionLocal()
        try:
            action = db.query(ScheduledAction).filter_by(id=action_id).first()
            if not action:
                return {"success": False, "error": "Action not found"}

            result = await self._execute_action(action, db)
            db.commit()
            return result

        except Exception as e:
            db.rollback()
            return {"success": False, "error": str(e)}
        finally:
            db.close()


# Singleton instance
_scheduler_instance: Optional[SchedulerService] = None


def get_scheduler() -> SchedulerService:
    """
    Get the scheduler singleton instance.

    Returns:
        SchedulerService instance
    """
    global _scheduler_instance
    if _scheduler_instance is None:
        _scheduler_instance = SchedulerService()
    return _scheduler_instance


async def start_scheduler():
    """Start the global scheduler instance."""
    scheduler = get_scheduler()
    await scheduler.start()


def stop_scheduler():
    """Stop the global scheduler instance."""
    scheduler = get_scheduler()
    scheduler.stop()
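The start/stop/loop pattern used by `SchedulerService` (spawn a polling task with `asyncio.create_task`, swallow per-iteration errors so the loop survives, cancel on stop) can be sketched in isolation. This toy version uses a tick counter in place of `execute_pending_actions()` and a short interval so it finishes quickly; all names are illustrative:

```python
import asyncio
from typing import List, Optional


class MiniScheduler:
    """Toy version of the start/stop/_run_loop pattern above."""

    def __init__(self, check_interval: float = 0.01):
        self.check_interval = check_interval
        self.running = False
        self.task: Optional[asyncio.Task] = None
        self.ticks: List[int] = []  # stand-in for executed actions

    async def start(self):
        if self.running:
            return
        self.running = True
        self.task = asyncio.create_task(self._run_loop())

    def stop(self):
        self.running = False
        if self.task:
            self.task.cancel()

    async def _run_loop(self):
        while self.running:
            try:
                self.ticks.append(1)  # stand-in for execute_pending_actions()
            except Exception:
                pass  # keep the loop alive even if one iteration fails
            await asyncio.sleep(self.check_interval)


async def demo() -> int:
    s = MiniScheduler()
    await s.start()
    await asyncio.sleep(0.05)  # let the loop run a few iterations
    s.stop()
    return len(s.ticks)
```

In the real service the FastAPI app would call `start_scheduler()` on startup and `stop_scheduler()` on shutdown so the loop's lifetime matches the server's.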
423 backend/services/slmm_client.py Normal file
@@ -0,0 +1,423 @@
"""
SLMM API Client Wrapper

Provides a clean interface for Terra-View to interact with the SLMM backend.
All SLM operations should go through this client instead of direct HTTP calls.

SLMM (Sound Level Meter Manager) is a separate service running on port 8100
that handles TCP/FTP communication with Rion NL-43/NL-53 devices.
"""

import httpx
from typing import Optional, Dict, Any, List
from datetime import datetime
import json


# SLMM backend base URLs
SLMM_BASE_URL = "http://localhost:8100"
SLMM_API_BASE = f"{SLMM_BASE_URL}/api/nl43"


class SLMMClientError(Exception):
    """Base exception for SLMM client errors."""
    pass


class SLMMConnectionError(SLMMClientError):
    """Raised when cannot connect to SLMM backend."""
    pass


class SLMMDeviceError(SLMMClientError):
    """Raised when device operation fails."""
    pass


class SLMMClient:
    """
    Client for interacting with SLMM backend.

    Usage:
        client = SLMMClient()
        units = await client.get_all_units()
        status = await client.get_unit_status("nl43-001")
        await client.start_recording("nl43-001", config={...})
    """

    def __init__(self, base_url: str = SLMM_BASE_URL, timeout: float = 30.0):
        self.base_url = base_url
        self.api_base = f"{base_url}/api/nl43"
        self.timeout = timeout

    async def _request(
        self,
        method: str,
        endpoint: str,
        data: Optional[Dict] = None,
        params: Optional[Dict] = None,
    ) -> Dict[str, Any]:
        """
        Make an HTTP request to SLMM backend.

        Args:
            method: HTTP method (GET, POST, PUT, DELETE)
            endpoint: API endpoint (e.g., "/units", "/{unit_id}/status")
            data: JSON body for POST/PUT requests
            params: Query parameters

        Returns:
            Response JSON as dict

        Raises:
            SLMMConnectionError: Cannot reach SLMM
            SLMMDeviceError: Device operation failed
        """
        url = f"{self.api_base}{endpoint}"

        try:
            async with httpx.AsyncClient(timeout=self.timeout) as client:
                response = await client.request(
                    method=method,
                    url=url,
                    json=data,
                    params=params,
                )
                response.raise_for_status()

                # Handle empty responses
                if not response.content:
                    return {}

                return response.json()

        except httpx.ConnectError as e:
            raise SLMMConnectionError(
                f"Cannot connect to SLMM backend at {self.base_url}. "
                f"Is SLMM running? Error: {str(e)}"
            )
        except httpx.HTTPStatusError as e:
            error_detail = "Unknown error"
            try:
                error_data = e.response.json()
                error_detail = error_data.get("detail", str(error_data))
            except Exception:
                error_detail = e.response.text or str(e)

            raise SLMMDeviceError(
                f"SLMM operation failed: {error_detail}"
            )
        except Exception as e:
            raise SLMMClientError(f"Unexpected error: {str(e)}")

    # ========================================================================
    # Unit Management
    # ========================================================================

    async def get_all_units(self) -> List[Dict[str, Any]]:
        """
        Get all configured SLM units from SLMM.

        Returns:
            List of unit dicts with id, config, and status
        """
        # SLMM doesn't have a /units endpoint yet, so we'll need to add this
        # For now, return empty list or implement when SLMM endpoint is ready
        try:
            response = await self._request("GET", "/units")
            return response.get("units", [])
        except SLMMClientError:
            # Endpoint may not exist yet
            return []

    async def get_unit_config(self, unit_id: str) -> Dict[str, Any]:
        """
        Get unit configuration from SLMM cache.

        Args:
            unit_id: Unit identifier (e.g., "nl43-001")

        Returns:
            Config dict with host, tcp_port, ftp_port, etc.
        """
        return await self._request("GET", f"/{unit_id}/config")

    async def update_unit_config(
        self,
        unit_id: str,
        host: Optional[str] = None,
        tcp_port: Optional[int] = None,
        ftp_port: Optional[int] = None,
        ftp_username: Optional[str] = None,
        ftp_password: Optional[str] = None,
    ) -> Dict[str, Any]:
        """
        Update unit configuration in SLMM cache.

        Args:
            unit_id: Unit identifier
            host: Device IP address
            tcp_port: TCP control port (default: 2255)
            ftp_port: FTP data port (default: 21)
            ftp_username: FTP username
            ftp_password: FTP password

        Returns:
            Updated config
        """
        config = {}
        if host is not None:
            config["host"] = host
        if tcp_port is not None:
            config["tcp_port"] = tcp_port
        if ftp_port is not None:
            config["ftp_port"] = ftp_port
        if ftp_username is not None:
            config["ftp_username"] = ftp_username
        if ftp_password is not None:
            config["ftp_password"] = ftp_password

        return await self._request("PUT", f"/{unit_id}/config", data=config)

    # ========================================================================
    # Status & Monitoring
    # ========================================================================

    async def get_unit_status(self, unit_id: str) -> Dict[str, Any]:
        """
        Get cached status snapshot from SLMM.

        Args:
            unit_id: Unit identifier

        Returns:
            Status dict with measurement_state, lp, leq, battery, etc.
        """
        return await self._request("GET", f"/{unit_id}/status")

    async def get_live_data(self, unit_id: str) -> Dict[str, Any]:
        """
        Request fresh data from device (DOD command).

        Args:
            unit_id: Unit identifier

        Returns:
            Live data snapshot
        """
        return await self._request("GET", f"/{unit_id}/live")

    # ========================================================================
    # Recording Control
    # ========================================================================

    async def start_recording(
        self,
        unit_id: str,
        config: Optional[Dict[str, Any]] = None,
    ) -> Dict[str, Any]:
        """
        Start recording on a unit.

        Args:
            unit_id: Unit identifier
            config: Optional recording config (interval, settings, etc.)

        Returns:
            Response from SLMM with success status
        """
        return await self._request("POST", f"/{unit_id}/start", data=config or {})

    async def stop_recording(self, unit_id: str) -> Dict[str, Any]:
        """
        Stop recording on a unit.

        Args:
            unit_id: Unit identifier

        Returns:
            Response from SLMM
        """
        return await self._request("POST", f"/{unit_id}/stop")

    async def pause_recording(self, unit_id: str) -> Dict[str, Any]:
        """
        Pause recording on a unit.

        Args:
            unit_id: Unit identifier

        Returns:
            Response from SLMM
        """
        return await self._request("POST", f"/{unit_id}/pause")

    async def resume_recording(self, unit_id: str) -> Dict[str, Any]:
        """
        Resume paused recording on a unit.

        Args:
            unit_id: Unit identifier

        Returns:
            Response from SLMM
        """
        return await self._request("POST", f"/{unit_id}/resume")

    async def reset_data(self, unit_id: str) -> Dict[str, Any]:
        """
        Reset measurement data on a unit.

        Args:
            unit_id: Unit identifier

        Returns:
            Response from SLMM
        """
        return await self._request("POST", f"/{unit_id}/reset")

    # ========================================================================
    # Device Settings
    # ========================================================================

    async def get_frequency_weighting(self, unit_id: str) -> Dict[str, Any]:
        """
        Get frequency weighting setting (A, C, or Z).

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with current weighting
        """
        return await self._request("GET", f"/{unit_id}/frequency-weighting")

    async def set_frequency_weighting(
        self,
        unit_id: str,
        weighting: str,
    ) -> Dict[str, Any]:
        """
        Set frequency weighting (A, C, or Z).

        Args:
            unit_id: Unit identifier
            weighting: "A", "C", or "Z"

        Returns:
            Confirmation response
        """
        return await self._request(
            "PUT",
            f"/{unit_id}/frequency-weighting",
            data={"weighting": weighting},
        )

    async def get_time_weighting(self, unit_id: str) -> Dict[str, Any]:
        """
        Get time weighting setting (F, S, or I).

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with current time weighting
        """
        return await self._request("GET", f"/{unit_id}/time-weighting")

    async def set_time_weighting(
        self,
        unit_id: str,
        weighting: str,
    ) -> Dict[str, Any]:
        """
        Set time weighting (F=Fast, S=Slow, I=Impulse).

        Args:
            unit_id: Unit identifier
            weighting: "F", "S", or "I"

        Returns:
            Confirmation response
        """
        return await self._request(
            "PUT",
            f"/{unit_id}/time-weighting",
            data={"weighting": weighting},
        )

    async def get_all_settings(self, unit_id: str) -> Dict[str, Any]:
        """
        Get all device settings.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with all settings
        """
        return await self._request("GET", f"/{unit_id}/settings")

    # ========================================================================
    # Data Download (Future)
    # ========================================================================

    async def download_files(
        self,
        unit_id: str,
        destination_path: str,
        files: Optional[List[str]] = None,
    ) -> Dict[str, Any]:
        """
        Download files from unit via FTP.

        NOTE: This endpoint doesn't exist in SLMM yet. Will need to implement.

        Args:
            unit_id: Unit identifier
            destination_path: Local path to save files
            files: List of filenames to download, or None for all

        Returns:
            Dict with downloaded files list and metadata
        """
        data = {
            "destination_path": destination_path,
            "files": files or "all",
        }
        return await self._request("POST", f"/{unit_id}/ftp/download", data=data)

    # ========================================================================
    # Health Check
    # ========================================================================

    async def health_check(self) -> bool:
        """
        Check if SLMM backend is reachable.

        Returns:
            True if SLMM is responding, False otherwise
        """
        try:
            async with httpx.AsyncClient(timeout=5.0) as client:
                response = await client.get(f"{self.base_url}/health")
                return response.status_code == 200
        except Exception:
            return False


# Singleton instance for convenience
_default_client: Optional[SLMMClient] = None


def get_slmm_client() -> SLMMClient:
    """
    Get the default SLMM client instance.

    Returns:
        SLMMClient instance
    """
    global _default_client
    if _default_client is None:
        _default_client = SLMMClient()
    return _default_client
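`update_unit_config` builds its JSON payload from only the explicitly provided keyword arguments, so a partial update never overwrites existing values with `None` on the SLMM side. That payload-building step can be isolated and exercised without a running backend (function name here is illustrative):

```python
from typing import Any, Dict, Optional


def build_config_payload(
    host: Optional[str] = None,
    tcp_port: Optional[int] = None,
    ftp_port: Optional[int] = None,
) -> Dict[str, Any]:
    """Include only fields the caller actually supplied, so a partial
    PUT leaves the unit's other config fields untouched."""
    payload: Dict[str, Any] = {}
    if host is not None:
        payload["host"] = host
    if tcp_port is not None:
        payload["tcp_port"] = tcp_port
    if ftp_port is not None:
        payload["ftp_port"] = ftp_port
    return payload
```

For example, `build_config_payload(host="10.0.0.5")` sends only `{"host": "10.0.0.5"}`, leaving the unit's ports unchanged.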
@@ -1,8 +1,8 @@
 from datetime import datetime, timezone
 from sqlalchemy.orm import Session
 
-from app.seismo.database import get_db_session
-from app.seismo.models import Emitter, RosterUnit, IgnoredUnit
+from backend.database import get_db_session
+from backend.models import Emitter, RosterUnit, IgnoredUnit
 
 
 def ensure_utc(dt):
@@ -60,7 +60,7 @@ def emit_status_snapshot():
     db = get_db_session()
     try:
         # Get user preferences for status thresholds
-        from app.seismo.models import UserPreferences
+        from backend.models import UserPreferences
         prefs = db.query(UserPreferences).filter_by(id=1).first()
         status_ok_threshold = prefs.status_ok_threshold_hours if prefs else 12
         status_pending_threshold = prefs.status_pending_threshold_hours if prefs else 24
(Binary image files changed; dimensions and sizes unchanged.)
@@ -1,25 +1,22 @@
 services:
 
-  # --- TERRA-VIEW (PRODUCTION) ---
-  # Unified application: UI + Seismograph logic + SLM dashboard/proxy
-  # Only external HTTP dependency: SLMM backend for NL-43 device communication
+  # --- TERRA-VIEW PRODUCTION ---
   terra-view-prod:
-    build:
-      context: .
-      dockerfile: Dockerfile.terraview
+    build: .
     container_name: terra-view
-    network_mode: host
+    ports:
+      - "8001:8001"
     volumes:
       - ./data:/app/data
     environment:
       - PYTHONUNBUFFERED=1
       - ENVIRONMENT=production
-      - PORT=8001
-      - SLMM_API_URL=http://localhost:8100  # Points to SLMM container
+      - SLMM_BASE_URL=http://host.docker.internal:8100
     restart: unless-stopped
-    depends_on:
-      - slmm
-    command: python3 -m app.main  # Runs full Terra-View (UI + seismo + SLM)
+    extra_hosts:
+      - "host.docker.internal:host-gateway"
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8001/health"]
       interval: 30s
@@ -27,42 +24,37 @@ services:
       retries: 3
       start_period: 40s
 
-  # --- TERRA-VIEW (DEVELOPMENT) ---
-  terra-view-dev:
-    build:
-      context: .
-      dockerfile: Dockerfile.terraview
-    container_name: terra-view-dev
-    network_mode: host
-    volumes:
-      - ./data-dev:/app/data
-    environment:
-      - PYTHONUNBUFFERED=1
-      - ENVIRONMENT=development
-      - PORT=1001
-      - SLMM_API_URL=http://localhost:8100
-    restart: unless-stopped
-    depends_on:
-      - slmm
-    profiles:
-      - dev
-    command: python3 -m app.main
-    healthcheck:
-      test: ["CMD", "curl", "-f", "http://localhost:1001/health"]
-      interval: 30s
-      timeout: 10s
-      retries: 3
-      start_period: 40s
+  # --- TERRA-VIEW DEVELOPMENT ---
+  # terra-view-dev:
+  #   build: .
+  #   container_name: terra-view-dev
+  #   ports:
+  #     - "1001:8001"
+  #   volumes:
+  #     - ./data-dev:/app/data
+  #   environment:
+  #     - PYTHONUNBUFFERED=1
+  #     - ENVIRONMENT=development
+  #     - SLMM_BASE_URL=http://slmm:8100
+  #   restart: unless-stopped
+  #   depends_on:
+  #     - slmm
+  #   healthcheck:
+  #     test: ["CMD", "curl", "-f", "http://localhost:8001/health"]
+  #     interval: 30s
+  #     timeout: 10s
+  #     retries: 3
+  #     start_period: 40s
 
   # --- SLMM (Sound Level Meter Manager) ---
   slmm:
     build:
-      context: .
-      dockerfile: Dockerfile.slm
+      context: ../slmm
+      dockerfile: Dockerfile
     container_name: slmm
     network_mode: host
     volumes:
-      - ./data-slm:/app/data
+      - ../slmm/data:/app/data
     environment:
      - PYTHONUNBUFFERED=1
      - PORT=8100
@@ -78,4 +70,3 @@ services:
 volumes:
   data:
   data-dev:
-  data-slm:
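With host networking dropped in favor of published ports, the production container reaches host-local services via the `host-gateway` alias. A minimal compose sketch of the pattern (service name and port values illustrative, not taken from this diff):

```yaml
services:
  app:
    build: .
    ports:
      - "8001:8001"            # publish a port instead of network_mode: host
    extra_hosts:
      # make "host.docker.internal" resolve to the Docker host on Linux
      - "host.docker.internal:host-gateway"
    environment:
      - SLMM_BASE_URL=http://host.docker.internal:8100
```

On Docker Desktop `host.docker.internal` resolves automatically; the `extra_hosts` entry is what makes the same URL work on a plain Linux engine.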
rename_unit.py (new file, 138 lines)
@@ -0,0 +1,138 @@
#!/usr/bin/env python3
"""
Script to rename a unit ID in the database.
This updates the unit across all tables with proper foreign key handling.
"""

import sys
from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

DATABASE_URL = "sqlite:///data/sfm.db"


def rename_unit(old_id: str, new_id: str):
    """
    Rename a unit ID across all relevant tables.

    Args:
        old_id: Current unit ID (e.g., "SLM4301")
        new_id: New unit ID (e.g., "SLM-43-01")
    """
    engine = create_engine(DATABASE_URL)
    Session = sessionmaker(bind=engine)
    session = Session()

    try:
        # Check if old unit exists
        result = session.execute(
            text("SELECT id, device_type FROM roster WHERE id = :old_id"),
            {"old_id": old_id}
        ).fetchone()

        if not result:
            print(f"❌ Error: Unit '{old_id}' not found in roster")
            return False

        device_type = result[1]
        print(f"✓ Found unit '{old_id}' (device_type: {device_type})")

        # Check if new ID already exists
        result = session.execute(
            text("SELECT id FROM roster WHERE id = :new_id"),
            {"new_id": new_id}
        ).fetchone()

        if result:
            print(f"❌ Error: Unit ID '{new_id}' already exists")
            return False

        print(f"\n🔄 Renaming '{old_id}' → '{new_id}'...\n")

        # Update roster table (primary)
        session.execute(
            text("UPDATE roster SET id = :new_id WHERE id = :old_id"),
            {"new_id": new_id, "old_id": old_id}
        )
        print("  ✓ Updated roster")

        # Update emitters table
        result = session.execute(
            text("UPDATE emitters SET id = :new_id WHERE id = :old_id"),
            {"new_id": new_id, "old_id": old_id}
        )
        if result.rowcount > 0:
            print(f"  ✓ Updated emitters ({result.rowcount} rows)")

        # Update unit_history table
        result = session.execute(
            text("UPDATE unit_history SET unit_id = :new_id WHERE unit_id = :old_id"),
            {"new_id": new_id, "old_id": old_id}
        )
        if result.rowcount > 0:
            print(f"  ✓ Updated unit_history ({result.rowcount} rows)")

        # Update deployed_with_modem_id references
        result = session.execute(
            text("UPDATE roster SET deployed_with_modem_id = :new_id WHERE deployed_with_modem_id = :old_id"),
            {"new_id": new_id, "old_id": old_id}
        )
        if result.rowcount > 0:
            print(f"  ✓ Updated modem references ({result.rowcount} rows)")

        # Update unit_assignments table (if exists)
        try:
            result = session.execute(
                text("UPDATE unit_assignments SET unit_id = :new_id WHERE unit_id = :old_id"),
                {"new_id": new_id, "old_id": old_id}
            )
            if result.rowcount > 0:
                print(f"  ✓ Updated unit_assignments ({result.rowcount} rows)")
        except Exception:
            pass  # Table may not exist

        # Update recording_sessions table (if exists)
        try:
            result = session.execute(
                text("UPDATE recording_sessions SET unit_id = :new_id WHERE unit_id = :old_id"),
                {"new_id": new_id, "old_id": old_id}
            )
            if result.rowcount > 0:
                print(f"  ✓ Updated recording_sessions ({result.rowcount} rows)")
        except Exception:
            pass  # Table may not exist

        # Commit all changes
        session.commit()
        print(f"\n✅ Successfully renamed unit '{old_id}' to '{new_id}'")
        return True

    except Exception as e:
        session.rollback()
        print(f"\n❌ Error during rename: {e}")
        return False
    finally:
        session.close()


if __name__ == "__main__":
    if len(sys.argv) != 3:
        print("Usage: python rename_unit.py <old_id> <new_id>")
        print("Example: python rename_unit.py SLM4301 SLM-43-01")
        sys.exit(1)

    old_id = sys.argv[1]
    new_id = sys.argv[2]

    print("Unit Renaming Tool")
    print("=" * 50)
    print(f"Old ID: {old_id}")
    print(f"New ID: {new_id}")
    print("=" * 50)

    confirm = input(f"\nAre you sure you want to rename '{old_id}' to '{new_id}'? (yes/no): ")
    if confirm.lower() != 'yes':
        print("❌ Rename cancelled")
        sys.exit(0)

    success = rename_unit(old_id, new_id)
    sys.exit(0 if success else 1)
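The script above boils down to parameterized UPDATEs executed inside one transaction, so a failure partway through leaves nothing half-renamed. A minimal self-contained sketch of the same pattern against an in-memory SQLite database (table layout reduced to two of the tables the script touches):

```python
import sqlite3

def rename_unit(conn: sqlite3.Connection, old_id: str, new_id: str) -> None:
    """Rename a unit across roster and unit_history in a single transaction."""
    with conn:  # commits on success, rolls back if any statement raises
        if conn.execute("SELECT 1 FROM roster WHERE id = ?", (old_id,)).fetchone() is None:
            raise ValueError(f"unit {old_id!r} not found in roster")
        if conn.execute("SELECT 1 FROM roster WHERE id = ?", (new_id,)).fetchone() is not None:
            raise ValueError(f"unit ID {new_id!r} already exists")
        conn.execute("UPDATE roster SET id = ? WHERE id = ?", (new_id, old_id))
        conn.execute("UPDATE unit_history SET unit_id = ? WHERE unit_id = ?", (new_id, old_id))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE roster (id TEXT PRIMARY KEY)")
conn.execute("CREATE TABLE unit_history (unit_id TEXT)")
conn.execute("INSERT INTO roster VALUES ('SLM4301')")
conn.execute("INSERT INTO unit_history VALUES ('SLM4301')")

rename_unit(conn, "SLM4301", "SLM-43-01")
print(conn.execute("SELECT id FROM roster").fetchone()[0])  # SLM-43-01
```

One caveat worth noting: because the roster primary key itself changes, any real foreign keys referencing it would need `PRAGMA foreign_keys` handling or `ON UPDATE CASCADE`; the script sidesteps this by updating each referencing table explicitly.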
@@ -127,7 +127,7 @@
                 Sound Level Meters
             </a>
 
-            <a href="#" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 opacity-50 cursor-not-allowed">
+            <a href="/projects" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path.startswith('/projects') %}bg-gray-100 dark:bg-gray-700{% endif %}">
                 <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                     <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10"></path>
                 </svg>
templates/nrl_detail.html (new file, 563 lines)
@@ -0,0 +1,563 @@
{% extends "base.html" %}

{% block title %}{{ location.name }} - NRL Detail{% endblock %}

{% block content %}
<!-- Breadcrumb Navigation -->
<div class="mb-6">
    <nav class="flex items-center space-x-2 text-sm">
        <a href="/projects" class="text-seismo-orange hover:text-seismo-navy flex items-center">
            <svg class="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15 19l-7-7 7-7"></path>
            </svg>
            Projects
        </a>
        <svg class="w-4 h-4 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5l7 7-7 7"></path>
        </svg>
        <a href="/projects/{{ project_id }}" class="text-seismo-orange hover:text-seismo-navy">
            {{ project.name }}
        </a>
        <svg class="w-4 h-4 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5l7 7-7 7"></path>
        </svg>
        <span class="text-gray-900 dark:text-white font-medium">{{ location.name }}</span>
    </nav>
</div>

<!-- Header -->
<div class="mb-8">
    <div class="flex justify-between items-start">
        <div>
            <h1 class="text-3xl font-bold text-gray-900 dark:text-white flex items-center">
                <svg class="w-8 h-8 mr-3 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M17.657 16.657L13.414 20.9a1.998 1.998 0 01-2.827 0l-4.244-4.243a8 8 0 1111.314 0z"></path>
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15 11a3 3 0 11-6 0 3 3 0 016 0z"></path>
                </svg>
                {{ location.name }}
            </h1>
            <p class="text-gray-600 dark:text-gray-400 mt-1">
                Noise Recording Location • {{ project.name }}
            </p>
        </div>
        <div class="flex gap-2">
            {% if assigned_unit %}
            <span class="px-3 py-1 rounded-full text-sm font-medium bg-green-100 text-green-800 dark:bg-green-900/30 dark:text-green-300">
                <svg class="w-4 h-4 inline mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M5 13l4 4L19 7"></path>
                </svg>
                Unit Assigned
            </span>
            {% else %}
            <span class="px-3 py-1 rounded-full text-sm font-medium bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300">
                No Unit Assigned
            </span>
            {% endif %}
        </div>
    </div>
</div>

<!-- Tab Navigation -->
<div class="mb-6 border-b border-gray-200 dark:border-gray-700">
    <nav class="flex space-x-6">
        <button onclick="switchTab('overview')"
                data-tab="overview"
                class="tab-button px-4 py-3 border-b-2 font-medium text-sm transition-colors border-seismo-orange text-seismo-orange">
            Overview
        </button>
        <button onclick="switchTab('settings')"
                data-tab="settings"
                class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
            Settings
        </button>
        {% if assigned_unit %}
        <button onclick="switchTab('command')"
                data-tab="command"
                class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
            Command Center
        </button>
        {% endif %}
        <button onclick="switchTab('sessions')"
                data-tab="sessions"
                class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
            Recording Sessions
        </button>
        <button onclick="switchTab('data')"
                data-tab="data"
                class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
            Data Files
        </button>
    </nav>
</div>

<!-- Tab Content -->
<div id="tab-content">
    <!-- Overview Tab -->
    <div id="overview-tab" class="tab-panel">
        <div class="grid grid-cols-1 lg:grid-cols-2 gap-6">
            <!-- Location Details Card -->
            <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
                <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-4">Location Details</h2>
                <div class="space-y-4">
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Name</div>
                        <div class="text-lg font-medium text-gray-900 dark:text-white">{{ location.name }}</div>
                    </div>
                    {% if location.description %}
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Description</div>
                        <div class="text-gray-900 dark:text-white">{{ location.description }}</div>
                    </div>
                    {% endif %}
                    {% if location.address %}
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Address</div>
                        <div class="text-gray-900 dark:text-white">{{ location.address }}</div>
                    </div>
                    {% endif %}
                    {% if location.coordinates %}
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Coordinates</div>
                        <div class="text-gray-900 dark:text-white font-mono text-sm">{{ location.coordinates }}</div>
                    </div>
                    {% endif %}
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Created</div>
                        <div class="text-gray-900 dark:text-white">{{ location.created_at.strftime('%Y-%m-%d %H:%M') if location.created_at else 'N/A' }}</div>
                    </div>
                </div>
            </div>

            <!-- Assignment Card -->
            <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
                <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-4">Unit Assignment</h2>
                {% if assigned_unit %}
                <div class="space-y-4">
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Assigned Unit</div>
                        <div class="text-lg font-medium text-gray-900 dark:text-white">
                            <a href="/slm/{{ assigned_unit.id }}?from_project={{ project_id }}&from_nrl={{ location_id }}" class="text-seismo-orange hover:text-seismo-navy">
                                {{ assigned_unit.id }}
                            </a>
                        </div>
                    </div>
                    {% if assigned_unit.slm_model %}
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Model</div>
                        <div class="text-gray-900 dark:text-white">{{ assigned_unit.slm_model }}</div>
                    </div>
                    {% endif %}
                    {% if assignment %}
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Assigned Since</div>
                        <div class="text-gray-900 dark:text-white">{{ assignment.assigned_at.strftime('%Y-%m-%d %H:%M') if assignment.assigned_at else 'N/A' }}</div>
                    </div>
                    {% if assignment.notes %}
                    <div>
                        <div class="text-sm text-gray-600 dark:text-gray-400">Notes</div>
                        <div class="text-gray-900 dark:text-white text-sm">{{ assignment.notes }}</div>
                    </div>
                    {% endif %}
                    {% endif %}
                    <div class="pt-2">
                        <button onclick="unassignUnit('{{ assignment.id }}')"
                                class="px-4 py-2 bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300 rounded-lg hover:bg-amber-200 dark:hover:bg-amber-900/50 transition-colors">
                            Unassign Unit
                        </button>
                    </div>
                </div>
                {% else %}
                <div class="text-center py-8">
                    <svg class="w-16 h-16 mx-auto mb-4 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M20 13V6a2 2 0 00-2-2H6a2 2 0 00-2 2v7m16 0v5a2 2 0 01-2 2H6a2 2 0 01-2-2v-5m16 0h-2.586a1 1 0 00-.707.293l-2.414 2.414a1 1 0 01-.707.293h-3.172a1 1 0 01-.707-.293l-2.414-2.414A1 1 0 006.586 13H4"></path>
                    </svg>
                    <p class="text-gray-500 dark:text-gray-400 mb-4">No unit currently assigned</p>
                    <button onclick="openAssignModal()"
                            class="px-4 py-2 bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors">
                        Assign a Unit
                    </button>
                </div>
                {% endif %}
            </div>
        </div>

        <!-- Stats Cards -->
        <div class="grid grid-cols-1 md:grid-cols-3 gap-6 mt-6">
            <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
                <div class="flex items-center justify-between">
                    <div>
                        <p class="text-sm text-gray-600 dark:text-gray-400">Total Sessions</p>
                        <p class="text-3xl font-bold text-gray-900 dark:text-white mt-2">{{ session_count }}</p>
                    </div>
                    <div class="w-12 h-12 bg-blue-100 dark:bg-blue-900/30 rounded-lg flex items-center justify-center">
                        <svg class="w-6 h-6 text-blue-600 dark:text-blue-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19V6l12-3v13M9 19c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zm12-3c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zM9 10l12-3"></path>
                        </svg>
                    </div>
                </div>
            </div>

            <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
                <div class="flex items-center justify-between">
                    <div>
                        <p class="text-sm text-gray-600 dark:text-gray-400">Data Files</p>
                        <p class="text-3xl font-bold text-gray-900 dark:text-white mt-2">{{ file_count }}</p>
                    </div>
                    <div class="w-12 h-12 bg-green-100 dark:bg-green-900/30 rounded-lg flex items-center justify-center">
                        <svg class="w-6 h-6 text-green-600 dark:text-green-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 21h10a2 2 0 002-2V9.414a1 1 0 00-.293-.707l-5.414-5.414A1 1 0 0012.586 3H7a2 2 0 00-2 2v14a2 2 0 002 2z"></path>
                        </svg>
                    </div>
                </div>
            </div>

            <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
                <div class="flex items-center justify-between">
                    <div>
                        <p class="text-sm text-gray-600 dark:text-gray-400">Active Session</p>
                        <p class="text-lg font-semibold text-gray-900 dark:text-white mt-2">
                            {% if active_session %}
                            <span class="text-green-600 dark:text-green-400">Recording</span>
                            {% else %}
                            <span class="text-gray-500">Idle</span>
                            {% endif %}
                        </p>
                    </div>
                    <div class="w-12 h-12 bg-purple-100 dark:bg-purple-900/30 rounded-lg flex items-center justify-center">
                        <svg class="w-6 h-6 text-purple-600 dark:text-purple-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z"></path>
                        </svg>
                    </div>
                </div>
            </div>
        </div>
    </div>

    <!-- Settings Tab -->
    <div id="settings-tab" class="tab-panel hidden">
        <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
            <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-6">Location Settings</h2>

            <form id="location-settings-form" class="space-y-6">
                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">Name</label>
                    <input type="text" id="settings-name" value="{{ location.name }}"
                           class="w-full px-4 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-white" required>
                </div>

                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">Description</label>
                    <textarea id="settings-description" rows="3"
                              class="w-full px-4 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-white">{{ location.description or '' }}</textarea>
                </div>

                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">Address</label>
                    <input type="text" id="settings-address" value="{{ location.address or '' }}"
                           class="w-full px-4 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-white">
                </div>

                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">Coordinates</label>
                    <input type="text" id="settings-coordinates" value="{{ location.coordinates or '' }}"
                           placeholder="40.7128,-74.0060"
                           class="w-full px-4 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-white">
                    <p class="text-xs text-gray-500 mt-1">Format: latitude,longitude</p>
                </div>

                <div id="settings-error" class="hidden text-sm text-red-600"></div>

                <div class="flex justify-end gap-3 pt-2">
                    <button type="button" onclick="window.location.href='/projects/{{ project_id }}'"
                            class="px-6 py-2 border border-gray-300 dark:border-gray-600 rounded-lg text-gray-700 dark:text-gray-300 hover:bg-gray-50 dark:hover:bg-gray-700">
                        Cancel
                    </button>
                    <button type="submit"
                            class="px-6 py-2 bg-seismo-orange hover:bg-seismo-navy text-white rounded-lg font-medium">
                        Save Changes
                    </button>
                </div>
            </form>
        </div>
    </div>

    <!-- Command Center Tab -->
    {% if assigned_unit %}
    <div id="command-tab" class="tab-panel hidden">
        <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
            <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-6">
                SLM Command Center - {{ assigned_unit.id }}
            </h2>

            <div id="slm-command-center"
                 hx-get="/api/slm-dashboard/live-view/{{ assigned_unit.id if assigned_unit else '' }}"
                 hx-trigger="load"
                 hx-swap="innerHTML">
                <div class="text-center py-8 text-gray-500">
                    <div class="animate-spin rounded-full h-12 w-12 border-b-2 border-seismo-orange mx-auto mb-4"></div>
                    <p>Loading command center...</p>
                </div>
            </div>
        </div>
    </div>
    {% endif %}

    <!-- Recording Sessions Tab -->
    <div id="sessions-tab" class="tab-panel hidden">
        <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
            <div class="flex items-center justify-between mb-6">
                <h2 class="text-xl font-semibold text-gray-900 dark:text-white">Recording Sessions</h2>
                {% if assigned_unit %}
                <button onclick="openScheduleModal()"
                        class="px-4 py-2 bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors">
                    Schedule Session
                </button>
                {% endif %}
            </div>

            <div id="sessions-list"
                 hx-get="/api/projects/{{ project_id }}/nrl/{{ location_id }}/sessions"
                 hx-trigger="load, every 30s"
                 hx-swap="innerHTML">
                <div class="text-center py-8 text-gray-500">Loading sessions...</div>
            </div>
        </div>
    </div>

    <!-- Data Files Tab -->
    <div id="data-tab" class="tab-panel hidden">
        <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
            <div class="flex items-center justify-between mb-6">
                <h2 class="text-xl font-semibold text-gray-900 dark:text-white">Data Files</h2>
                <div class="text-sm text-gray-500">
                    <span class="font-medium">{{ file_count }}</span> files
                </div>
            </div>

            <div id="data-files-list"
                 hx-get="/api/projects/{{ project_id }}/nrl/{{ location_id }}/files"
                 hx-trigger="load"
                 hx-swap="innerHTML">
                <div class="text-center py-8 text-gray-500">Loading data files...</div>
            </div>
        </div>
    </div>
</div>

<!-- Assign Unit Modal -->
<div id="assign-modal" class="hidden fixed inset-0 bg-black bg-opacity-50 z-50 flex items-center justify-center">
    <div class="bg-white dark:bg-slate-800 rounded-xl shadow-2xl w-full max-w-2xl max-h-[90vh] overflow-y-auto m-4">
        <div class="p-6 border-b border-gray-200 dark:border-gray-700 flex items-center justify-between">
            <div>
                <h2 class="text-2xl font-bold text-gray-900 dark:text-white">Assign Unit</h2>
                <p class="text-gray-600 dark:text-gray-400 mt-1">Attach a sound level meter to this location</p>
            </div>
            <button onclick="closeAssignModal()" class="text-gray-500 hover:text-gray-700 dark:text-gray-400 dark:hover:text-gray-200">
                <svg class="w-6 h-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path>
                </svg>
            </button>
        </div>

        <form id="assign-form" class="p-6 space-y-4">
            <div>
                <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">Available Units</label>
                <select id="assign-unit-id" name="unit_id"
                        class="w-full px-4 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-white" required>
                    <option value="">Loading units...</option>
                </select>
                <p id="assign-empty" class="hidden text-xs text-gray-500 mt-2">No available units for this location type.</p>
            </div>

            <div>
                <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">Notes</label>
                <textarea id="assign-notes" name="notes" rows="2"
                          class="w-full px-4 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-white"></textarea>
            </div>

            <div id="assign-error" class="hidden text-sm text-red-600"></div>

            <div class="flex justify-end gap-3 pt-2">
                <button type="button" onclick="closeAssignModal()"
                        class="px-6 py-2 border border-gray-300 dark:border-gray-600 rounded-lg text-gray-700 dark:text-gray-300 hover:bg-gray-50 dark:hover:bg-gray-700">
                    Cancel
                </button>
                <button type="submit"
                        class="px-6 py-2 bg-seismo-orange hover:bg-seismo-navy text-white rounded-lg font-medium">
                    Assign Unit
                </button>
            </div>
        </form>
    </div>
</div>

<script>
const projectId = "{{ project_id }}";
const locationId = "{{ location_id }}";

// Tab switching
function switchTab(tabName) {
    // Hide all tab panels
    document.querySelectorAll('.tab-panel').forEach(panel => {
        panel.classList.add('hidden');
    });

    // Reset all tab buttons
    document.querySelectorAll('.tab-button').forEach(button => {
        button.classList.remove('border-seismo-orange', 'text-seismo-orange');
        button.classList.add('border-transparent', 'text-gray-600', 'dark:text-gray-400');
    });

    // Show selected tab panel
    const panel = document.getElementById(`${tabName}-tab`);
    if (panel) {
        panel.classList.remove('hidden');
    }

    // Highlight selected tab button
    const button = document.querySelector(`[data-tab="${tabName}"]`);
    if (button) {
        button.classList.remove('border-transparent', 'text-gray-600', 'dark:text-gray-400');
        button.classList.add('border-seismo-orange', 'text-seismo-orange');
    }
}

// Location settings form submission
document.getElementById('location-settings-form').addEventListener('submit', async function(e) {
    e.preventDefault();

    const payload = {
        name: document.getElementById('settings-name').value.trim(),
        description: document.getElementById('settings-description').value.trim() || null,
        address: document.getElementById('settings-address').value.trim() || null,
        coordinates: document.getElementById('settings-coordinates').value.trim() || null,
    };

    try {
        const response = await fetch(`/api/projects/${projectId}/locations/${locationId}`, {
            method: 'PUT',
            headers: {'Content-Type': 'application/json'},
            body: JSON.stringify(payload)
        });

        if (!response.ok) {
            const data = await response.json().catch(() => ({}));
            throw new Error(data.detail || 'Failed to update location');
        }

        window.location.reload();
    } catch (err) {
        const errorEl = document.getElementById('settings-error');
        errorEl.textContent = err.message || 'Failed to update location.';
        errorEl.classList.remove('hidden');
    }
});

// Assign modal functions
function openAssignModal() {
    const modal = document.getElementById('assign-modal');
    modal.classList.remove('hidden');
    loadAvailableUnits();
}

function closeAssignModal() {
    document.getElementById('assign-modal').classList.add('hidden');
}

async function loadAvailableUnits() {
    try {
        const response = await fetch(`/api/projects/${projectId}/available-units?location_type=sound`);
        if (!response.ok) {
            throw new Error('Failed to load available units');
        }
        const data = await response.json();
        const select = document.getElementById('assign-unit-id');
        select.innerHTML = '<option value="">Select a unit</option>';

        if (!data.length) {
            document.getElementById('assign-empty').classList.remove('hidden');
            return;
        }

        data.forEach(unit => {
            const option = document.createElement('option');
            option.value = unit.id;
            option.textContent = `${unit.id} • ${unit.model || unit.device_type}`;
            select.appendChild(option);
        });
    } catch (err) {
        const errorEl = document.getElementById('assign-error');
        errorEl.textContent = err.message || 'Failed to load units.';
        errorEl.classList.remove('hidden');
    }
}

document.getElementById('assign-form').addEventListener('submit', async function(e) {
    e.preventDefault();

    const unitId = document.getElementById('assign-unit-id').value;
    const notes = document.getElementById('assign-notes').value.trim();

    if (!unitId) {
        document.getElementById('assign-error').textContent = 'Select a unit to assign.';
        document.getElementById('assign-error').classList.remove('hidden');
        return;
    }

    try {
        const formData = new FormData();
        formData.append('unit_id', unitId);
        formData.append('notes', notes);

        const response = await fetch(`/api/projects/${projectId}/locations/${locationId}/assign`, {
            method: 'POST',
            body: formData
        });

        if (!response.ok) {
            const data = await response.json().catch(() => ({}));
            throw new Error(data.detail || 'Failed to assign unit');
        }

        window.location.reload();
    } catch (err) {
        const errorEl = document.getElementById('assign-error');
        errorEl.textContent = err.message || 'Failed to assign unit.';
        errorEl.classList.remove('hidden');
    }
});

async function unassignUnit(assignmentId) {
    if (!confirm('Unassign this unit from the location?')) return;

    try {
        const response = await fetch(`/api/projects/${projectId}/assignments/${assignmentId}/unassign`, {
            method: 'POST'
        });

        if (!response.ok) {
            const data = await response.json().catch(() => ({}));
            throw new Error(data.detail || 'Failed to unassign unit');
        }

        window.location.reload();
    } catch (err) {
        alert(err.message || 'Failed to unassign unit.');
    }
}

// Keyboard shortcuts
document.addEventListener('keydown', function(e) {
    if (e.key === 'Escape') {
        closeAssignModal();
    }
});

// Click outside to close modal
document.getElementById('assign-modal')?.addEventListener('click', function(e) {
    if (e.target === this) {
        closeAssignModal();
    }
});
</script>
{% endblock %}