Compare commits: 7971092509...v0.7.0 (76 commits)
| SHA1 |
|---|
| 1dd396acd8 |
| e89a04f58c |
| e4ef065db8 |
| 86010de60c |
| f89f04cd6f |
| 67a2faa2d3 |
| 14856e61ef |
| 2b69518b33 |
| 6070d03e83 |
| 240552751c |
| 015ce0a254 |
| ef8c046f31 |
| 3637cf5af8 |
| 7fde14d882 |
| bd3d937a82 |
| 291fa8e862 |
| 8e292b1aca |
| 7516bbea70 |
| da4e5f66c5 |
| dae2595303 |
| 0c4e7aa5e6 |
| 229499ccf6 |
| fdc4adeaee |
| b3bf91880a |
| 17b3f91dfc |
| 6c1d0bc467 |
| abd059983f |
| 0f17841218 |
| 65362bab21 |
| dc77a362ce |
| 28942600ab |
| 80861997af |
| b15d434fce |
| 70ef43de11 |
| 7b4e12c127 |
| 24473c9ca3 |
| caabfd0c42 |
| ebe60d2b7d |
| 842e9d6f61 |
| 742a98a8ed |
| 3b29c4d645 |
| 63d9c59873 |
| 794bfc00dc |
| 89662d2fa5 |
| eb0a99796d |
| b47e69e609 |
| 1cb25b6c17 |
| e515bff1a9 |
| f296806fd1 |
| 24da5ab79f |
| 305540f564 |
| 639b485c28 |
| d78bafb76e |
| 8373cff10d |
| 4957a08198 |
| 05482bd903 |
| 5ee6f5eb28 |
| 7ce0f6115d |
| 6492fdff82 |
| 44d7841852 |
| 38c600aca3 |
| eeda94926f |
| 57be9bf1f1 |
| 8431784708 |
| c771a86675 |
| 65ea0920db |
| 1f3fa7a718 |
| a9c9b1fd48 |
| 4c213c96ee |
| ff38b74548 |
| c8a030a3ba |
| d8a8330427 |
| 1ef0557ccb |
| 6c7ce5aad0 |
| 54754e2279 |
| 8787a2dbb8 |
@@ -1,3 +1,5 @@
docker-compose.override.yml

# Python cache / compiled
__pycache__
*.pyc
@@ -28,6 +30,7 @@ ENV/

# Runtime data (mounted volumes)
data/
data-dev/

# Editors / OS junk
.vscode/
.gitignore (19 changes, vendored)
@@ -1,3 +1,16 @@
# Terra-View Specifics
# Dev build counter (local only, never commit)
build_number.txt

# SQLite database files
*.db
*.db-journal
data/
data-dev/
.aider*
docker-compose.override.yml

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[codz]
@@ -206,9 +219,3 @@ marimo/_static/
marimo/_lsp/
__marimo__/

# Seismo Fleet Manager
# SQLite database files
*.db
*.db-journal
data/
.aider*
CHANGELOG.md (159 changes)
@@ -1,10 +1,161 @@
# Changelog

-All notable changes to Seismo Fleet Manager will be documented in this file.
+All notable changes to Terra-View will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.7.0] - 2026-03-07

### Added
- **Project Status Management**: Projects can now be placed `on_hold` or `archived`, with automatic cancellation of pending scheduled actions
- **Hard Delete Projects**: Support for permanently deleting projects, in addition to soft-delete with auto-pruning
- **Vibration Location Detail**: New dedicated template for vibration project location detail views
- **Vibration Project Isolation**: Vibration projects no longer show SLM-specific project tabs
- **Manual SD Card Data Upload**: Upload offline NRL data directly from SD card via ZIP or multi-file select
  - Accepts `.rnd`/`.rnh` files; parses `.rnh` metadata for session start/stop times, serial number, and store name
  - Creates `MonitoringSession` and `DataFile` records automatically; no unit assignment required
  - Upload panel on NRL detail Data Files tab with inline feedback and auto-refresh via HTMX
- **Standalone SLM Type**: New SLM device mode that operates without a modem (direct IP connection)
- **NL32 Data Support**: Report generator and web viewer now support NL32 measurement data format
- **Combined Report Wizard**: Multi-session combined Excel report generation tool
  - Wizard UI grouped by location with period type badges (day/night)
  - Each selected session produces one `.xlsx` in a ZIP archive
  - Period type filtering: day sessions keep last calendar date (7AM–6:59PM); night sessions span both days (7PM–6:59AM)
- **Combined Report Preview**: Interactive spreadsheet-style preview before generating combined reports
- **Chart Preview**: Live chart preview in the report generator matching final report styling
- **SLM Model Schemas**: Per-model configuration schemas for NL32, NL43, NL53 devices
- **Data Collection Mode**: Projects now store a data collection mode field with UI controls and migration
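The day/night period-type rule described for the Combined Report Wizard can be sketched as follows. This is an illustrative sketch of the stated rule only; `period_window` is a hypothetical helper name, not the shipped code:

```python
from datetime import date, datetime, time, timedelta

def period_window(last_date: date, period_type: str):
    """Reporting window for a session, per the wizard's period-type rule.

    Day sessions keep the last calendar date (7:00 AM-6:59 PM); night
    sessions span both days (7:00 PM-6:59 AM). Hypothetical helper.
    """
    if period_type == "day":
        return (datetime.combine(last_date, time(7, 0)),
                datetime.combine(last_date, time(18, 59)))
    # "night": starts the evening before, ends the morning of last_date
    return (datetime.combine(last_date - timedelta(days=1), time(19, 0)),
            datetime.combine(last_date, time(6, 59)))
```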
### Changed
- **MonitoringSession rename**: `RecordingSession` renamed to `MonitoringSession` throughout codebase; DB table renamed from `recording_sessions` to `monitoring_sessions`
  - Migration: `backend/migrate_rename_recording_to_monitoring_sessions.py`
- **Combined Report Split Logic**: Separate days now generate separate `.xlsx` files; NRLs remain one per sheet
- **Mass Upload Parsing**: Smarter file filtering; no longer imports unneeded Lp files or `.xlsx` files
- **SLM Start Time Grace Period**: 15-minute grace window added so data starting at session start time is included
- **NL32 Date Parsing**: Date now read from `start_time` field instead of file metadata
- **Project Data Labels**: Improved Jinja filters and UI label clarity for project data views

### Fixed
- **Dev/Prod Separation**: Dev server now uses Docker Compose override; production deployment no longer affected by dev config
- **SLM Modal**: Bench/deploy toggle now correctly shown in SLM unit modal
- **Auto-Downloaded Files**: Files downloaded by scheduler now appear in project file listings
- **Duplicate Download**: Removed duplicate file download that occurred following a scheduled stop
- **SLMM Environment Variables**: `TCP_IDLE_TTL` and `TCP_MAX_AGE` now correctly passed to SLMM service via docker-compose

### Technical Details
- `session_label` and `period_type` stored on `monitoring_sessions` table (migration: `migrate_add_session_period_type.py`)
- `device_model` stored on `monitoring_sessions` table (migration: `migrate_add_session_device_model.py`)
- Upload endpoint: `POST /api/projects/{project_id}/nrl/{location_id}/upload-data`
- ZIP filename format: `{session_label}_{project_name}_report.xlsx` (label first)
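The ZIP member naming above can be sketched as a small helper. The format string is the one documented in Technical Details; the function name itself is hypothetical:

```python
def combined_report_member_name(session_label: str, project_name: str) -> str:
    """Build the per-session .xlsx filename inside the combined-report ZIP.

    Follows the documented format {session_label}_{project_name}_report.xlsx
    (label first). Hypothetical helper for illustration.
    """
    return f"{session_label}_{project_name}_report.xlsx"
```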
### Migration Notes
Run the following migration scripts once per database before deploying:

```bash
python backend/migrate_rename_recording_to_monitoring_sessions.py
python backend/migrate_add_session_period_type.py
python backend/migrate_add_session_device_model.py
```

---
## [0.6.1] - 2026-02-16

### Added
- **One-Off Recording Schedules**: Support for scheduling single recordings with specific start and end datetimes
- **Bidirectional Pairing Sync**: Pairing a device with a modem now automatically updates both sides, clearing stale pairings when reassigned
- **Auto-Fill Notes from Modem**: Notes are now copied from modem to paired device when fields are empty
- **SLMM Download Requests**: New `_download_request` method in SLMM client for binary file downloads with local save
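The bidirectional pairing sync described above can be sketched with plain dicts. This is a minimal sketch of the stated behavior, assuming the `deployed_with_modem_id` / `deployed_with_unit_id` fields documented elsewhere on this page; it is not the actual ORM code:

```python
def pair_device(recorder, modem, recorders_by_id, modems_by_id):
    """Pair a recorder with a modem, updating both sides and clearing
    any stale pairings left over from previous assignments (sketch)."""
    # Clear the modem's previous recorder, if it had one
    old_recorder = recorders_by_id.get(modem.get("deployed_with_unit_id"))
    if old_recorder is not None and old_recorder is not recorder:
        old_recorder["deployed_with_modem_id"] = None
    # Clear the recorder's previous modem, if it had one
    old_modem = modems_by_id.get(recorder.get("deployed_with_modem_id"))
    if old_modem is not None and old_modem is not modem:
        old_modem["deployed_with_unit_id"] = None
    # Write the new pairing on both sides
    recorder["deployed_with_modem_id"] = modem["id"]
    modem["deployed_with_unit_id"] = recorder["id"]
```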
### Fixed
- **Scheduler Timezone**: One-off scheduler times now use local time instead of UTC
- **Pairing Consistency**: Old device references are properly cleared when a modem is re-paired to a new device

## [0.6.0] - 2026-02-06

### Added
- **Calendar & Reservation Mode**: Fleet calendar view with reservation system for scheduling device deployments
- **Device Pairing Interface**: New two-column pairing page (`/pair-devices`) for linking recorders (seismographs/SLMs) with modems
  - Visual pairing interface with drag-and-drop style interactions
  - Fuzzy-search modem pairing for SLMs
  - Pairing options now accessible from modem page
  - Improved pair status sharing across views
- **Modem Dashboard Enhancements**:
  - Modem model number now a dedicated configuration field with per-model options
  - Direct link to modem login page from unit detail view
  - Modem view converted to list format
- **Seismograph List Improvements**:
  - Enhanced visibility with better filtering and sorting
  - Calibration dates now color-coded for quick status assessment
  - User sets date of previous calibration (not expiry) for clearer workflow
- **SLMM Device Control Lock**: Prevents command flooding to NL-43 devices

### Changed
- **Calibration Date UX**: Users now set the date of the previous calibration rather than upcoming expiry dates, a more intuitive workflow
- **Settings Persistence**: Settings save no longer reloads the page
- **Tab State**: Tab state now persists in URL hash for better navigation
- **Scheduler Management**: Schedule changes now cascade to individual events
- **Dashboard Filtering**: Enhanced dashboard with additional filtering options and SLM status sync
- **SLMM Polling Intervals**: Fixed and improved polling intervals for better responsiveness
- **24-Hour Scheduler Cycle**: Improved cycle handling to prevent issues with scheduled downloads

### Fixed
- **SLM Modal Fields**: Modal now only contains correct device-specific fields
- **IP Address Handling**: IP address correctly passed via modem pairing
- **Mobile Type Display**: Fixed incorrect device type display in roster and device tables
- **SLMM Scheduled Downloads**: Fixed issues with scheduled download operations
## [0.5.1] - 2026-01-27

### Added
- **Dashboard Schedule View**: Today's scheduled actions now display directly on the main dashboard
  - New "Today's Actions" panel showing upcoming and past scheduled events
  - Schedule list partial for project-specific schedule views
  - API endpoint for fetching today's schedule data
- **New Branding Assets**: Complete logo rework for Terra-View
  - New Terra-View logos for light and dark themes
  - Retina-ready (@2x) logo variants
  - Updated favicons (16px and 32px)
  - Refreshed PWA icons (72px through 512px)

### Changed
- **Dashboard Layout**: Reorganized to include schedule information panel
- **Base Template**: Updated to use new Terra-View logos with theme-aware switching

## [0.5.0] - 2026-01-23

_Note: This version was not formally released; changes were included in v0.5.1._

## [0.4.4] - 2026-01-23

### Added
- **Recurring schedules**: New scheduler service, recurring schedule APIs, and schedule templates (calendar/interval/list).
- **Alerts UI + backend**: Alerting service plus dropdown/list templates for surfacing notifications.
- **Report templates + viewers**: CRUD API for report templates, report preview screen, and RND file viewer.
- **SLM tooling**: SLM settings modal and SLM project report generator workflow.

### Changed
- **Project data management**: Unified files view, refreshed FTP browser, and new project header/templates for file/session/unit/assignment lists.
- **Device/SLM sync**: Standardized SLM device types and tightened SLMM sync paths.
- **Docs/scripts**: Cleanup pass and expanded device-type documentation.

### Fixed
- **Scheduler actions**: Strict command definitions so actions run reliably.
- **Project view title**: Resolved JSON string rendering in project headers.

## [0.4.3] - 2026-01-14

### Added
- **Sound Level Meter roster tooling**: Roster manager surfaces SLM metadata, supports rename unit flows, and adds return-to-project navigation to keep SLM dashboard users oriented.
- **Project management templates**: New schedule and unit list templates plus file/session lists show what each project stores before teams dive into deployments.

### Changed
- **Project view refresh**: FTP browser now downloads folders locally, the countdown timer was rebuilt, and project/device templates gained edit modals for projects and locations so navigation feels smoother.
- **SLM control sync & accuracy**: Control center groundwork now runs inside the dev UI, configuration edits propagate to SLMM (which caches configs for faster responses), and the SLM live view reads the correct DRD fields after the refactor.

### Fixed
- **SLM UI syntax bug**: Resolved the unexpected token error that appeared in the refreshed SLM components.

## [0.4.2] - 2026-01-05

### Added

@@ -348,6 +499,12 @@ No database migration required for v0.4.0. All new features use existing databas
- Photo management per unit
- Automated status categorization (OK/Pending/Missing)
[0.7.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.6.1...v0.7.0
[0.6.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.5.1...v0.6.0
[0.5.1]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.5.0...v0.5.1
[0.5.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.4...v0.5.0
[0.4.4]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.3...v0.4.4
[0.4.3]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.2...v0.4.3
[0.4.2]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.1...v0.4.2
[0.4.1]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.0...v0.4.1
[0.4.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.3.3...v0.4.0
@@ -1,5 +1,9 @@
FROM python:3.11-slim

# Build number for dev builds (injected via --build-arg)
ARG BUILD_NUMBER=0
ENV BUILD_NUMBER=${BUILD_NUMBER}

# Set working directory
WORKDIR /app
README.md (86 changes)
@@ -1,4 +1,4 @@
-# Seismo Fleet Manager v0.4.2
+# Terra-View v0.7.0

Backend API and HTMX-powered web interface for managing a mixed fleet of seismographs and field modems. Track deployments, monitor health in real time, merge roster intent with incoming telemetry, and control your fleet through a unified database and dashboard.

## Features

@@ -308,7 +308,7 @@ print(response.json())
|-------|------|-------------|
| id | string | Unit identifier (primary key) |
| unit_type | string | Hardware model name (default: `series3`) |
-| device_type | string | `seismograph` or `modem` discriminator |
+| device_type | string | Device type: `"seismograph"`, `"modem"`, or `"slm"` (sound level meter) |
| deployed | boolean | Whether the unit is in the field |
| retired | boolean | Removes the unit from deployments but preserves history |
| note | string | Notes about the unit |

@@ -334,6 +334,39 @@ print(response.json())
| phone_number | string | Cellular number for the modem |
| hardware_model | string | Modem hardware reference |

**Sound Level Meter (SLM) fields**

| Field | Type | Description |
|-------|------|-------------|
| slm_host | string | Direct IP address for SLM (if not using modem) |
| slm_tcp_port | integer | TCP control port (default: 2255) |
| slm_ftp_port | integer | FTP file transfer port (default: 21) |
| slm_model | string | Device model (NL-43, NL-53) |
| slm_serial_number | string | Manufacturer serial number |
| slm_frequency_weighting | string | Frequency weighting setting (A, C, Z) |
| slm_time_weighting | string | Time weighting setting (F=Fast, S=Slow) |
| slm_measurement_range | string | Measurement range setting |
| slm_last_check | datetime | Last status check timestamp |
| deployed_with_modem_id | string | Modem pairing (shared with seismographs) |
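Given the fields above, resolving a standalone SLM's control endpoint might look like this. This is a minimal sketch assuming only the documented defaults; `slm_endpoint` is a hypothetical helper, not part of the API:

```python
SLM_TCP_PORT_DEFAULT = 2255  # documented default TCP control port
SLM_FTP_PORT_DEFAULT = 21    # documented default FTP port

def slm_endpoint(unit: dict):
    """Return (host, tcp_port) for a standalone SLM using a direct IP.

    Falls back to the documented default port when slm_tcp_port is unset.
    Hypothetical helper for illustration.
    """
    host = unit.get("slm_host")
    port = unit.get("slm_tcp_port") or SLM_TCP_PORT_DEFAULT
    return (host, port)
```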
### Device Type Schema

Terra-View supports three device types with the following standardized `device_type` values:

- **`"seismograph"`** (default): seismic monitoring devices (Series 3, Series 4, Micromate)
  - Uses: calibration dates, modem pairing
  - Examples: BE1234, UM12345 (Series 3/4 units)
- **`"modem"`**: field modems and network equipment
  - Uses: IP address, phone number, hardware model
  - Examples: MDM001, MODEM-2025-01
- **`"slm"`**: sound level meters (Rion NL-43/NL-53)
  - Uses: TCP/FTP configuration, measurement settings, modem pairing
  - Examples: SLM-43-01, NL43-001

**Important**: All `device_type` values must be lowercase. The legacy value `"sound_level_meter"` has been deprecated in favor of the shorter `"slm"`. Run `backend/migrate_standardize_device_types.py` to update existing databases.
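The normalization performed by `backend/migrate_standardize_device_types.py` can be sketched as follows. This is an assumption based on the rules stated above, not the script's actual source:

```python
def normalize_device_type(value):
    """Normalize a device_type value per the documented schema:
    lowercase, map the deprecated "sound_level_meter" to "slm",
    and treat a missing value as "seismograph" (the default).
    Hypothetical helper for illustration."""
    if value is None:
        return "seismograph"
    v = value.strip().lower()
    return "slm" if v == "sound_level_meter" else v
```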
### Emitter Table (Device Check-ins)

| Field | Type | Description |

@@ -463,6 +496,35 @@ docker compose down -v

## Release Highlights

### v0.7.0 — 2026-03-07
- **Project Status Management**: On-hold and archived project states with automatic cancellation of pending actions
- **Manual SD Card Upload**: Upload offline NRL/SLM data directly from SD card (ZIP or multi-file); auto-creates monitoring sessions from `.rnh` metadata
- **Combined Report Wizard**: Multi-session Excel report generation with location grouping, period type filtering, and ZIP download
- **NL32 Support**: Report generator and web viewer now handle NL32 measurement data
- **Chart Preview**: Live chart preview in the report generator matching final output styling
- **Standalone SLM Mode**: SLMs can now be configured without a paired modem (direct IP)
- **Vibration Project Isolation**: Vibration project views no longer show SLM-specific tabs
- **MonitoringSession Rename**: `RecordingSession` renamed to `MonitoringSession` throughout; run migration before deploying

### v0.6.1 — 2026-02-16
- **One-Off Recording Schedules**: Schedule single recordings with specific start/end datetimes
- **Bidirectional Pairing Sync**: Device-modem pairing now updates both sides automatically
- **Scheduler Timezone Fix**: One-off schedule times use local time instead of UTC

### v0.6.0 — 2026-02-06
- **Calendar & Reservation Mode**: Fleet calendar view with device deployment scheduling and reservation system
- **Device Pairing Interface**: New `/pair-devices` page with two-column layout for linking recorders with modems, fuzzy-search, and visual pairing workflow
- **Calibration UX Overhaul**: Users now set date of previous calibration (not expiry); seismograph list enhanced with color-coded calibration status, filtering, and sorting
- **Modem Dashboard**: Model number as dedicated config, modem login links, list view format, and pairing options accessible from modem page
- **SLMM Improvements**: Device control lock prevents command flooding, fixed polling intervals and scheduled downloads
- **UI Polish**: Tab state persists in URL hash, settings save without reload, scheduler changes cascade to events, fixed mobile type display

### v0.4.3 — 2026-01-14
- **Sound Level Meter workflow**: Roster manager surfaces SLM metadata, supports rename actions, and adds return-to-project navigation plus schedule/unit templates for project planning.
- **Project insight panels**: Project dashboards now expose file and session lists so teams can see what each project stores before diving into units.
- **Project view polish**: FTP browser supports folder downloads, the timer display was reimplemented, and the project/device templates gained edit modals for projects and locations to streamline navigation.
- **SLM sync & accuracy**: Configuration edits now propagate to SLMM (which caches configs for faster responses) and the live view uses the correct DRD fields so telemetry aligns with the control center.

### v0.4.0 — 2025-12-16
- **Database Management System**: Complete backup and restore functionality with manual snapshots, restore operations, and upload/download capabilities
- **Remote Database Cloning**: New `clone_db_to_dev.py` script for copying production database to remote dev servers over WAN

@@ -532,9 +594,25 @@ MIT

## Version

-**Current: 0.4.0** — Database management system with backup/restore and remote cloning (2025-12-16)
+**Current: 0.7.0** — Project status management, manual SD card upload, combined report wizard, NL32 support, MonitoringSession rename (2026-03-07)

-Previous: 0.3.3 — Mobile navigation improvements and better status visibility (2025-12-12)
+Previous: 0.6.1 — One-off recording schedules, bidirectional pairing sync, scheduler timezone fix (2026-02-16)

0.6.0 — Calendar & reservation mode, device pairing interface, calibration UX overhaul, modem dashboard enhancements (2026-02-06)

0.5.1 — Dashboard schedule view with today's actions panel, new Terra-View branding and logo rework (2026-01-27)

0.4.4 — Recurring schedules, alerting UI, report templates + RND viewer, and SLM workflow polish (2026-01-23)

0.4.3 — SLM roster/project view refresh, project insight panels, FTP browser folder downloads, and SLMM sync (2026-01-14)

0.4.2 — SLM configuration interface with TCP/FTP controls, modem diagnostics, and dashboard endpoints for Sound Level Meters (2026-01-05)

0.4.1 — Sound Level Meter integration with full management UI for SLM units (2026-01-05)

0.4.0 — Database management system with backup/restore and remote cloning (2025-12-16)

0.3.3 — Mobile navigation improvements and better status visibility (2025-12-12)

0.3.2 — Progressive Web App with mobile optimization (2025-12-12)
BIN  assets/terra-view-icon_large.png  (new binary file, 36 KiB)
@@ -18,7 +18,7 @@ from backend.models import (
    MonitoringLocation,
    UnitAssignment,
    ScheduledAction,
-    RecordingSession,
+    MonitoringSession,
    DataFile,
)
from datetime import datetime
backend/main.py (146 changes)
@@ -1,6 +1,6 @@
import os
import logging
-from fastapi import FastAPI, Request, Depends
+from fastapi import FastAPI, Request, Depends, HTTPException
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates
@@ -18,9 +18,10 @@ logging.basicConfig(
logger = logging.getLogger(__name__)

from backend.database import engine, Base, get_db
-from backend.routers import roster, units, photos, roster_edit, roster_rename, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard, projects, project_locations, scheduler
+from backend.routers import roster, units, photos, roster_edit, roster_rename, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard, projects, project_locations, scheduler, modem_dashboard
from backend.services.snapshot import emit_status_snapshot
from backend.models import IgnoredUnit
from backend.utils.timezone import get_user_timezone

# Create database tables
Base.metadata.create_all(bind=engine)
@@ -29,7 +30,11 @@ Base.metadata.create_all(bind=engine)
ENVIRONMENT = os.getenv("ENVIRONMENT", "production")

# Initialize FastAPI app
-VERSION = "0.4.2"
+VERSION = "0.7.0"
+if ENVIRONMENT == "development":
+    _build = os.getenv("BUILD_NUMBER", "0")
+    if _build and _build != "0":
+        VERSION = f"{VERSION}-{_build}"
app = FastAPI(
    title="Seismo Fleet Manager",
    description="Backend API for managing seismograph fleet status",
@@ -58,8 +63,8 @@ app.add_middleware(
# Mount static files
app.mount("/static", StaticFiles(directory="backend/static"), name="static")

-# Setup Jinja2 templates
-templates = Jinja2Templates(directory="templates")
+# Use shared templates configuration with timezone filters
+from backend.templates_config import templates

# Add custom context processor to inject environment variable into all templates
@app.middleware("http")
@@ -92,6 +97,7 @@ app.include_router(slmm.router)
app.include_router(slm_ui.router)
app.include_router(slm_dashboard.router)
app.include_router(seismo_dashboard.router)
+app.include_router(modem_dashboard.router)

from backend.routers import settings
app.include_router(settings.router)
@@ -101,8 +107,25 @@ app.include_router(projects.router)
app.include_router(project_locations.router)
app.include_router(scheduler.router)

-# Start scheduler service on application startup
+# Report templates router
+from backend.routers import report_templates
+app.include_router(report_templates.router)
+
+# Alerts router
+from backend.routers import alerts
+app.include_router(alerts.router)
+
+# Recurring schedules router
+from backend.routers import recurring_schedules
+app.include_router(recurring_schedules.router)
+
+# Fleet Calendar router
+from backend.routers import fleet_calendar
+app.include_router(fleet_calendar.router)
+
+# Start scheduler service and device status monitor on application startup
from backend.services.scheduler import start_scheduler, stop_scheduler
+from backend.services.device_status_monitor import start_device_status_monitor, stop_device_status_monitor

@app.on_event("startup")
async def startup_event():
@@ -111,9 +134,17 @@ async def startup_event():
    await start_scheduler()
    logger.info("Scheduler service started")

+    logger.info("Starting device status monitor...")
+    await start_device_status_monitor()
+    logger.info("Device status monitor started")
+
@app.on_event("shutdown")
def shutdown_event():
    """Clean up services on app shutdown"""
+    logger.info("Stopping device status monitor...")
+    stop_device_status_monitor()
+    logger.info("Device status monitor stopped")
+
    logger.info("Stopping scheduler service...")
    stop_scheduler()
    logger.info("Scheduler service stopped")
@@ -195,6 +226,73 @@ async def seismographs_page(request: Request):
    return templates.TemplateResponse("seismographs.html", {"request": request})


@app.get("/modems", response_class=HTMLResponse)
async def modems_page(request: Request):
    """Field modems management dashboard"""
    return templates.TemplateResponse("modems.html", {"request": request})


@app.get("/pair-devices", response_class=HTMLResponse)
async def pair_devices_page(request: Request, db: Session = Depends(get_db)):
    """
    Device pairing page - two-column layout for pairing recorders with modems.
    """
    from backend.models import RosterUnit

    # Get all non-retired recorders (seismographs and SLMs)
    recorders = db.query(RosterUnit).filter(
        RosterUnit.retired == False,
        RosterUnit.device_type.in_(["seismograph", "slm", None])  # None defaults to seismograph
    ).order_by(RosterUnit.id).all()

    # Get all non-retired modems
    modems = db.query(RosterUnit).filter(
        RosterUnit.retired == False,
        RosterUnit.device_type == "modem"
    ).order_by(RosterUnit.id).all()

    # Build existing pairings list
    pairings = []
    for recorder in recorders:
        if recorder.deployed_with_modem_id:
            modem = next((m for m in modems if m.id == recorder.deployed_with_modem_id), None)
            pairings.append({
                "recorder_id": recorder.id,
                "recorder_type": (recorder.device_type or "seismograph").upper(),
                "modem_id": recorder.deployed_with_modem_id,
                "modem_ip": modem.ip_address if modem else None
            })

    # Convert to dicts for template
    recorders_data = [
        {
            "id": r.id,
            "device_type": r.device_type or "seismograph",
            "deployed": r.deployed,
            "deployed_with_modem_id": r.deployed_with_modem_id
        }
        for r in recorders
    ]

    modems_data = [
        {
            "id": m.id,
            "deployed": m.deployed,
            "deployed_with_unit_id": m.deployed_with_unit_id,
            "ip_address": m.ip_address,
            "phone_number": m.phone_number
        }
        for m in modems
    ]

    return templates.TemplateResponse("pair_devices.html", {
        "request": request,
        "recorders": recorders_data,
        "modems": modems_data,
        "pairings": pairings
    })
@app.get("/projects", response_class=HTMLResponse)
async def projects_page(request: Request):
    """Projects management and overview"""
@@ -218,7 +316,7 @@ async def nrl_detail_page(
    db: Session = Depends(get_db)
):
    """NRL (Noise Recording Location) detail page with tabs"""
-    from backend.models import Project, MonitoringLocation, UnitAssignment, RosterUnit, RecordingSession, DataFile
+    from backend.models import Project, MonitoringLocation, UnitAssignment, RosterUnit, MonitoringSession, DataFile
    from sqlalchemy import and_

    # Get project
@@ -254,23 +352,33 @@ async def nrl_detail_page(
    assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()

    # Get session count
-    session_count = db.query(RecordingSession).filter_by(location_id=location_id).count()
+    session_count = db.query(MonitoringSession).filter_by(location_id=location_id).count()

    # Get file count (DataFile links to session, not directly to location)
    file_count = db.query(DataFile).join(
-        RecordingSession,
-        DataFile.session_id == RecordingSession.id
-    ).filter(RecordingSession.location_id == location_id).count()
+        MonitoringSession,
+        DataFile.session_id == MonitoringSession.id
+    ).filter(MonitoringSession.location_id == location_id).count()

    # Check for active session
-    active_session = db.query(RecordingSession).filter(
+    active_session = db.query(MonitoringSession).filter(
        and_(
-            RecordingSession.location_id == location_id,
-            RecordingSession.status == "recording"
+            MonitoringSession.location_id == location_id,
+            MonitoringSession.status == "recording"
        )
    ).first()

-    return templates.TemplateResponse("nrl_detail.html", {
+    # Parse connection_mode from location_metadata JSON
+    import json as _json
+    connection_mode = "connected"
+    try:
+        meta = _json.loads(location.location_metadata or "{}")
+        connection_mode = meta.get("connection_mode", "connected")
+    except Exception:
+        pass
+
+    template = "vibration_location_detail.html" if location.location_type == "vibration" else "nrl_detail.html"
+    return templates.TemplateResponse(template, {
        "request": request,
        "project_id": project_id,
        "location_id": location_id,
@@ -281,6 +389,7 @@ async def nrl_detail_page(
        "session_count": session_count,
        "file_count": file_count,
        "active_session": active_session,
+        "connection_mode": connection_mode,
    })
@@ -559,6 +668,7 @@ async def devices_all_partial(request: Request):
|
||||
"last_calibrated": unit_data.get("last_calibrated"),
|
||||
"next_calibration_due": unit_data.get("next_calibration_due"),
|
||||
"deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
|
||||
"deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
|
||||
"ip_address": unit_data.get("ip_address"),
|
||||
"phone_number": unit_data.get("phone_number"),
|
||||
"hardware_model": unit_data.get("hardware_model"),
|
||||
@@ -582,6 +692,7 @@ async def devices_all_partial(request: Request):
|
||||
"last_calibrated": unit_data.get("last_calibrated"),
|
||||
"next_calibration_due": unit_data.get("next_calibration_due"),
|
||||
"deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
|
||||
"deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
|
||||
"ip_address": unit_data.get("ip_address"),
|
||||
"phone_number": unit_data.get("phone_number"),
|
||||
"hardware_model": unit_data.get("hardware_model"),
|
||||
@@ -605,6 +716,7 @@ async def devices_all_partial(request: Request):
|
||||
"last_calibrated": unit_data.get("last_calibrated"),
|
||||
"next_calibration_due": unit_data.get("next_calibration_due"),
|
||||
"deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
|
||||
"deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
|
||||
"ip_address": unit_data.get("ip_address"),
|
||||
"phone_number": unit_data.get("phone_number"),
|
||||
"hardware_model": unit_data.get("hardware_model"),
|
||||
@@ -628,6 +740,7 @@ async def devices_all_partial(request: Request):
|
||||
"last_calibrated": None,
|
||||
"next_calibration_due": None,
|
||||
"deployed_with_modem_id": None,
|
||||
"deployed_with_unit_id": None,
|
||||
"ip_address": None,
|
||||
"phone_number": None,
|
||||
"hardware_model": None,
|
||||
@@ -650,7 +763,8 @@ async def devices_all_partial(request: Request):
|
||||
return templates.TemplateResponse("partials/devices_table.html", {
|
||||
"request": request,
|
||||
"units": units_list,
|
||||
"timestamp": datetime.now().strftime("%H:%M:%S")
|
||||
"timestamp": datetime.now().strftime("%H:%M:%S"),
|
||||
"user_timezone": get_user_timezone()
|
||||
})
|
||||
|
||||
|
||||
|
||||
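The connection_mode fallback added in the hunk above can be exercised in isolation. A minimal sketch of the same behavior (the standalone function name is illustrative, not part of the codebase):

```python
import json

def parse_connection_mode(location_metadata):
    # Mirrors the route's fallback: default to "connected" on missing or bad JSON.
    try:
        meta = json.loads(location_metadata or "{}")
        return meta.get("connection_mode", "connected")
    except (TypeError, ValueError):
        return "connected"

print(parse_connection_mode('{"connection_mode": "disconnected"}'))  # disconnected
print(parse_connection_mode(None))                                   # connected
print(parse_connection_mode("not json"))                             # connected
```

The broad `except Exception: pass` in the route collapses to the same default for any malformed metadata.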
backend/migrate_add_auto_increment_index.py (new file, 67 lines)
@@ -0,0 +1,67 @@
"""
Migration: Add auto_increment_index column to recurring_schedules table

This migration adds the auto_increment_index column that controls whether
the scheduler should automatically find an unused store index before starting
a new measurement.

Run this script once to update existing databases:
    python -m backend.migrate_add_auto_increment_index
"""

import sqlite3
import os

DB_PATH = "data/seismo_fleet.db"


def migrate():
    """Add auto_increment_index column to recurring_schedules table."""
    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        return False

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    try:
        # Check if recurring_schedules table exists
        cursor.execute("""
            SELECT name FROM sqlite_master
            WHERE type='table' AND name='recurring_schedules'
        """)
        if not cursor.fetchone():
            print("recurring_schedules table does not exist yet. Will be created on app startup.")
            conn.close()
            return True

        # Check if auto_increment_index column already exists
        cursor.execute("PRAGMA table_info(recurring_schedules)")
        columns = [row[1] for row in cursor.fetchall()]

        if "auto_increment_index" in columns:
            print("auto_increment_index column already exists in recurring_schedules table.")
            conn.close()
            return True

        # Add the column
        print("Adding auto_increment_index column to recurring_schedules table...")
        cursor.execute("""
            ALTER TABLE recurring_schedules
            ADD COLUMN auto_increment_index BOOLEAN DEFAULT 1
        """)
        conn.commit()
        print("Successfully added auto_increment_index column.")

        conn.close()
        return True

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.close()
        return False


if __name__ == "__main__":
    success = migrate()
    exit(0 if success else 1)
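Each migration script in this changeset repeats the same check-then-ALTER pattern: read `PRAGMA table_info`, skip if the column exists, otherwise `ALTER TABLE ... ADD COLUMN`. A generic helper illustrating that pattern (a sketch for reference; no such helper exists in the repo):

```python
import sqlite3

def add_column_if_missing(conn, table, column, typedef):
    # PRAGMA table_info returns one row per column; index 1 is the column name.
    cols = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    if column in cols:
        return False  # already migrated; safe to re-run
    conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {typedef}")
    conn.commit()
    return True

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recurring_schedules (id TEXT PRIMARY KEY)")
print(add_column_if_missing(conn, "recurring_schedules", "auto_increment_index", "BOOLEAN DEFAULT 1"))  # True
print(add_column_if_missing(conn, "recurring_schedules", "auto_increment_index", "BOOLEAN DEFAULT 1"))  # False
```

Because SQLite's `ALTER TABLE ... ADD COLUMN` cannot be wrapped in `IF NOT EXISTS`, the PRAGMA check is what makes these migrations idempotent.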
backend/migrate_add_deployment_type.py (new file, 84 lines)
@@ -0,0 +1,84 @@
"""
Migration script to add deployment_type and deployed_with_unit_id fields to roster table.

deployment_type: tracks what type of device a modem is deployed with:
- "seismograph" - Modem is connected to a seismograph
- "slm" - Modem is connected to a sound level meter
- NULL/empty - Not assigned or unknown

deployed_with_unit_id: stores the ID of the seismograph/SLM this modem is deployed with
(reverse relationship of deployed_with_modem_id)

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Add deployment_type and deployed_with_unit_id columns to roster table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if roster table exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='roster'")
    table_exists = cursor.fetchone()

    if not table_exists:
        print("Roster table does not exist yet - will be created when app runs")
        conn.close()
        return

    # Check existing columns
    cursor.execute("PRAGMA table_info(roster)")
    columns = [col[1] for col in cursor.fetchall()]

    try:
        # Add deployment_type if not exists
        if 'deployment_type' not in columns:
            print("Adding deployment_type column to roster table...")
            cursor.execute("ALTER TABLE roster ADD COLUMN deployment_type TEXT")
            print("  Added deployment_type column")

            cursor.execute("CREATE INDEX IF NOT EXISTS ix_roster_deployment_type ON roster(deployment_type)")
            print("  Created index on deployment_type")
        else:
            print("deployment_type column already exists")

        # Add deployed_with_unit_id if not exists
        if 'deployed_with_unit_id' not in columns:
            print("Adding deployed_with_unit_id column to roster table...")
            cursor.execute("ALTER TABLE roster ADD COLUMN deployed_with_unit_id TEXT")
            print("  Added deployed_with_unit_id column")

            cursor.execute("CREATE INDEX IF NOT EXISTS ix_roster_deployed_with_unit_id ON roster(deployed_with_unit_id)")
            print("  Created index on deployed_with_unit_id")
        else:
            print("deployed_with_unit_id column already exists")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
backend/migrate_add_job_reservations.py (new file, 103 lines)
@@ -0,0 +1,103 @@
"""
Migration script to add job reservations for the Fleet Calendar feature.

This creates two tables:
- job_reservations: Track future unit assignments for jobs/projects
- job_reservation_units: Link specific units to reservations

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Create the job_reservations and job_reservation_units tables"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if job_reservations table already exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
    if cursor.fetchone():
        print("Migration already applied - job_reservations table exists")
        conn.close()
        return

    print("Creating job_reservations table...")

    try:
        # Create job_reservations table
        cursor.execute("""
            CREATE TABLE job_reservations (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                project_id TEXT,
                start_date DATE NOT NULL,
                end_date DATE NOT NULL,
                assignment_type TEXT NOT NULL DEFAULT 'quantity',
                device_type TEXT DEFAULT 'seismograph',
                quantity_needed INTEGER,
                notes TEXT,
                color TEXT DEFAULT '#3B82F6',
                created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)
        print("  Created job_reservations table")

        # Create indexes for job_reservations
        cursor.execute("CREATE INDEX idx_job_reservations_project_id ON job_reservations(project_id)")
        print("  Created index on project_id")

        cursor.execute("CREATE INDEX idx_job_reservations_dates ON job_reservations(start_date, end_date)")
        print("  Created index on dates")

        # Create job_reservation_units table
        print("Creating job_reservation_units table...")
        cursor.execute("""
            CREATE TABLE job_reservation_units (
                id TEXT PRIMARY KEY,
                reservation_id TEXT NOT NULL,
                unit_id TEXT NOT NULL,
                assignment_source TEXT DEFAULT 'specific',
                assigned_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                FOREIGN KEY (reservation_id) REFERENCES job_reservations(id),
                FOREIGN KEY (unit_id) REFERENCES roster(id)
            )
        """)
        print("  Created job_reservation_units table")

        # Create indexes for job_reservation_units
        cursor.execute("CREATE INDEX idx_job_reservation_units_reservation_id ON job_reservation_units(reservation_id)")
        print("  Created index on reservation_id")

        cursor.execute("CREATE INDEX idx_job_reservation_units_unit_id ON job_reservation_units(unit_id)")
        print("  Created index on unit_id")

        conn.commit()
        print("\nMigration completed successfully!")
        print("You can now use the Fleet Calendar to manage unit reservations.")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
backend/migrate_add_oneoff_schedule_fields.py (new file, 73 lines)
@@ -0,0 +1,73 @@
"""
Migration: Add one-off schedule fields to recurring_schedules table

Adds start_datetime and end_datetime columns for one-off recording schedules.

Run this script once to update existing databases:
    python -m backend.migrate_add_oneoff_schedule_fields
"""

import sqlite3
import os

DB_PATH = "data/seismo_fleet.db"


def migrate():
    """Add one-off schedule columns to recurring_schedules table."""
    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        return False

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    try:
        cursor.execute("""
            SELECT name FROM sqlite_master
            WHERE type='table' AND name='recurring_schedules'
        """)
        if not cursor.fetchone():
            print("recurring_schedules table does not exist yet. Will be created on app startup.")
            conn.close()
            return True

        cursor.execute("PRAGMA table_info(recurring_schedules)")
        columns = [row[1] for row in cursor.fetchall()]

        added = False

        if "start_datetime" not in columns:
            print("Adding start_datetime column to recurring_schedules table...")
            cursor.execute("""
                ALTER TABLE recurring_schedules
                ADD COLUMN start_datetime DATETIME NULL
            """)
            added = True

        if "end_datetime" not in columns:
            print("Adding end_datetime column to recurring_schedules table...")
            cursor.execute("""
                ALTER TABLE recurring_schedules
                ADD COLUMN end_datetime DATETIME NULL
            """)
            added = True

        if added:
            conn.commit()
            print("Successfully added one-off schedule columns.")
        else:
            print("One-off schedule columns already exist.")

        conn.close()
        return True

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.close()
        return False


if __name__ == "__main__":
    success = migrate()
    exit(0 if success else 1)
backend/migrate_add_project_data_collection_mode.py (new file, 53 lines)
@@ -0,0 +1,53 @@
#!/usr/bin/env python3
"""
Migration: Add data_collection_mode column to projects table.

Values:
    "remote" — units have modems; data pulled via FTP/scheduler automatically
    "manual" — no modem; SD cards retrieved daily and uploaded by hand

All existing projects are backfilled to "manual" (safe conservative default).

Run once inside the Docker container:
    docker exec terra-view python3 backend/migrate_add_project_data_collection_mode.py
"""
from pathlib import Path

DB_PATH = Path("data/seismo_fleet.db")


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # ── 1. Add column (idempotent) ───────────────────────────────────────────
    cur.execute("PRAGMA table_info(projects)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    if "data_collection_mode" not in existing_cols:
        cur.execute("ALTER TABLE projects ADD COLUMN data_collection_mode TEXT DEFAULT 'manual'")
        conn.commit()
        print("✓ Added column data_collection_mode to projects")
    else:
        print("○ Column data_collection_mode already exists — skipping ALTER TABLE")

    # ── 2. Backfill NULLs to 'manual' ────────────────────────────────────────
    cur.execute("UPDATE projects SET data_collection_mode = 'manual' WHERE data_collection_mode IS NULL")
    updated = cur.rowcount
    conn.commit()
    conn.close()

    if updated:
        print(f"✓ Backfilled {updated} project(s) to data_collection_mode='manual'.")
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
backend/migrate_add_project_deleted_at.py (new file, 56 lines)
@@ -0,0 +1,56 @@
"""
Migration: Add deleted_at column to projects table

Adds columns:
- projects.deleted_at: Timestamp set when status='deleted'; data hard-deleted after 60 days
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='projects'")
        if not cursor.fetchone():
            print("projects table does not exist. Skipping migration.")
            return

        cursor.execute("PRAGMA table_info(projects)")
        existing_cols = {row[1] for row in cursor.fetchall()}

        if 'deleted_at' not in existing_cols:
            print("Adding deleted_at column to projects...")
            cursor.execute("ALTER TABLE projects ADD COLUMN deleted_at DATETIME")
        else:
            print("deleted_at column already exists. Skipping.")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    db_path = "./data/terra-view.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
backend/migrate_add_project_number.py (new file, 80 lines)
@@ -0,0 +1,80 @@
"""
Migration script to add project_number field to projects table.

This adds a new column for TMI internal project numbering:
- Format: xxxx-YY (e.g., "2567-23")
- xxxx = incremental project number
- YY = year project was started

Combined with client_name and name (project/site name), this enables
smart searching across all project identifiers.

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Add project_number column to projects table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if projects table exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='projects'")
    table_exists = cursor.fetchone()

    if not table_exists:
        print("Projects table does not exist yet - will be created when app runs")
        conn.close()
        return

    # Check if project_number column already exists
    cursor.execute("PRAGMA table_info(projects)")
    columns = [col[1] for col in cursor.fetchall()]

    if 'project_number' in columns:
        print("Migration already applied - project_number column exists")
        conn.close()
        return

    print("Adding project_number column to projects table...")

    try:
        cursor.execute("ALTER TABLE projects ADD COLUMN project_number TEXT")
        print("  Added project_number column")

        # Create index for faster searching
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_projects_project_number ON projects(project_number)")
        print("  Created index on project_number")

        # Also add index on client_name if it doesn't exist
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_projects_client_name ON projects(client_name)")
        print("  Created index on client_name")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
backend/migrate_add_report_templates.py (new file, 88 lines)
@@ -0,0 +1,88 @@
"""
Migration script to add report_templates table.

This creates a new table for storing report generation configurations:
- Template name and project association
- Time filtering settings (start/end time)
- Date range filtering (optional)
- Report title defaults

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Create report_templates table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if report_templates table already exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='report_templates'")
    table_exists = cursor.fetchone()

    if table_exists:
        print("Migration already applied - report_templates table exists")
        conn.close()
        return

    print("Creating report_templates table...")

    try:
        cursor.execute("""
            CREATE TABLE report_templates (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                project_id TEXT,
                report_title TEXT DEFAULT 'Background Noise Study',
                start_time TEXT,
                end_time TEXT,
                start_date TEXT,
                end_date TEXT,
                created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)
        print("  ✓ Created report_templates table")

        # Insert default templates
        import uuid

        default_templates = [
            (str(uuid.uuid4()), "Nighttime (7PM-7AM)", None, "Background Noise Study", "19:00", "07:00", None, None),
            (str(uuid.uuid4()), "Daytime (7AM-7PM)", None, "Background Noise Study", "07:00", "19:00", None, None),
            (str(uuid.uuid4()), "Full Day (All Data)", None, "Background Noise Study", None, None, None, None),
        ]

        cursor.executemany("""
            INSERT INTO report_templates (id, name, project_id, report_title, start_time, end_time, start_date, end_date)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?)
        """, default_templates)
        print("  ✓ Inserted default templates (Nighttime, Daytime, Full Day)")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
backend/migrate_add_session_device_model.py (new file, 127 lines)
@@ -0,0 +1,127 @@
#!/usr/bin/env python3
"""
Migration: Add device_model column to monitoring_sessions table.

Records which physical SLM model produced each session's data (e.g. "NL-43",
"NL-53", "NL-32"). Used by report generation to apply the correct parsing
logic without re-opening files to detect format.

Run once inside the Docker container:
    docker exec terra-view python3 backend/migrate_add_session_device_model.py

Backfill strategy for existing rows:
  1. If session.unit_id is set, use roster.slm_model for that unit.
  2. Else, peek at the first .rnd file in the session: presence of the 'LAeq'
     column header identifies AU2 / NL-32 format.
Sessions where neither hint is available remain NULL — the file-content
fallback in report code handles them transparently.
"""
import csv
import io
from pathlib import Path

DB_PATH = Path("data/seismo_fleet.db")


def _peek_first_row(abs_path: Path) -> dict:
    """Read only the header + first data row of an RND file. Very cheap."""
    try:
        with open(abs_path, "r", encoding="utf-8", errors="replace") as f:
            reader = csv.DictReader(f)
            return next(reader, None) or {}
    except Exception:
        return {}


def _detect_model_from_rnd(abs_path: Path) -> str | None:
    """Return 'NL-32' if file uses AU2 column format, else None."""
    row = _peek_first_row(abs_path)
    if "LAeq" in row:
        return "NL-32"
    return None


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # ── 1. Add column (idempotent) ───────────────────────────────────────────
    cur.execute("PRAGMA table_info(monitoring_sessions)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    if "device_model" not in existing_cols:
        cur.execute("ALTER TABLE monitoring_sessions ADD COLUMN device_model TEXT")
        conn.commit()
        print("✓ Added column device_model to monitoring_sessions")
    else:
        print("○ Column device_model already exists — skipping ALTER TABLE")

    # ── 2. Backfill existing NULL rows ───────────────────────────────────────
    cur.execute(
        "SELECT id, unit_id FROM monitoring_sessions WHERE device_model IS NULL"
    )
    sessions = cur.fetchall()
    print(f"Backfilling {len(sessions)} session(s) with device_model=NULL...")

    updated = skipped = 0
    for row in sessions:
        session_id = row["id"]
        unit_id = row["unit_id"]
        device_model = None

        # Strategy A: look up unit's slm_model from the roster
        if unit_id:
            cur.execute(
                "SELECT slm_model FROM roster WHERE id = ?", (unit_id,)
            )
            unit_row = cur.fetchone()
            if unit_row and unit_row["slm_model"]:
                device_model = unit_row["slm_model"]

        # Strategy B: detect from first .rnd file in the session
        if device_model is None:
            cur.execute(
                """SELECT file_path FROM data_files
                   WHERE session_id = ?
                     AND lower(file_path) LIKE '%.rnd'
                   LIMIT 1""",
                (session_id,),
            )
            file_row = cur.fetchone()
            if file_row:
                abs_path = Path("data") / file_row["file_path"]
                device_model = _detect_model_from_rnd(abs_path)
                # None here means NL-43/NL-53 format (or unreadable file) —
                # leave as NULL so the existing fallback applies.

        if device_model:
            cur.execute(
                "UPDATE monitoring_sessions SET device_model = ? WHERE id = ?",
                (device_model, session_id),
            )
            updated += 1
        else:
            skipped += 1

    conn.commit()
    conn.close()

    print(f"✓ Backfilled {updated} session(s) with a device_model.")
    if skipped:
        print(
            f"  {skipped} session(s) left as NULL "
            "(no unit link and no AU2 file hint — NL-43/NL-53 or unknown; "
            "file-content detection applies at report time)."
        )
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
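Strategy B's format sniff only needs the CSV header plus one data row, so it can be demonstrated against in-memory text (the sample columns here are invented for illustration):

```python
import csv
import io

def detect_model_from_header(text):
    # Presence of an 'LAeq' column in the first row identifies AU2 / NL-32 format,
    # mirroring _detect_model_from_rnd above but reading from a string.
    row = next(csv.DictReader(io.StringIO(text)), None) or {}
    return "NL-32" if "LAeq" in row else None

print(detect_model_from_header("Time,LAeq\n00:00:10,45.2"))  # NL-32
print(detect_model_from_header("Time,Leq\n00:00:10,45.2"))   # None
```

Returning None for non-matching headers is deliberate: it lets the session stay NULL so the report-time file-content detection still applies.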
131
backend/migrate_add_session_period_type.py
Normal file
@@ -0,0 +1,131 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Migration: Add session_label and period_type columns to monitoring_sessions.
|
||||
|
||||
session_label - user-editable display name, e.g. "NRL-1 Sun 2/23 Night"
|
||||
period_type - one of: weekday_day | weekday_night | weekend_day | weekend_night
|
||||
Auto-derived from started_at when NULL.
|
||||
|
||||
Period definitions (used in report stats table):
|
||||
weekday_day Mon-Fri 07:00-22:00 -> Daytime (7AM-10PM)
|
||||
weekday_night Mon-Fri 22:00-07:00 -> Nighttime (10PM-7AM)
|
||||
weekend_day Sat-Sun 07:00-22:00 -> Daytime (7AM-10PM)
|
||||
weekend_night Sat-Sun 22:00-07:00 -> Nighttime (10PM-7AM)
|
||||
|
||||
Run once inside the Docker container:
|
||||
docker exec terra-view python3 backend/migrate_add_session_period_type.py
|
||||
"""
|
||||
from pathlib import Path
|
||||
from datetime import datetime
|
||||
|
||||
DB_PATH = Path("data/seismo_fleet.db")
|
||||
|
||||
|
||||
def _derive_period_type(started_at_str: str) -> str | None:
|
||||
"""Derive period_type from a started_at ISO datetime string."""
|
||||
if not started_at_str:
|
||||
return None
|
||||
try:
|
||||
dt = datetime.fromisoformat(started_at_str)
|
||||
except ValueError:
|
||||
return None
|
||||
is_weekend = dt.weekday() >= 5 # 5=Sat, 6=Sun
|
||||
is_night = dt.hour >= 22 or dt.hour < 7
|
||||
if is_weekend:
|
||||
return "weekend_night" if is_night else "weekend_day"
|
||||
else:
|
||||
return "weekday_night" if is_night else "weekday_day"
|
||||
|
||||
|
||||
def _build_label(started_at_str: str, location_name: str | None, period_type: str | None) -> str | None:
|
||||
"""Build a human-readable session label."""
|
||||
if not started_at_str:
|
||||
return None
|
||||
try:
|
||||
dt = datetime.fromisoformat(started_at_str)
|
||||
except ValueError:
|
||||
return None
|
||||
|
||||
day_abbr = dt.strftime("%a") # Mon, Tue, Sun, etc.
|
||||
    date_str = dt.strftime("%-m/%-d")  # 2/23

    period_labels = {
        "weekday_day": "Day",
        "weekday_night": "Night",
        "weekend_day": "Day",
        "weekend_night": "Night",
    }
    period_str = period_labels.get(period_type or "", "")

    parts = []
    if location_name:
        parts.append(location_name)
    parts.append(f"{day_abbr} {date_str}")
    if period_str:
        parts.append(period_str)
    return " — ".join(parts)


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # 1. Add columns (idempotent)
    cur.execute("PRAGMA table_info(monitoring_sessions)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    for col, typedef in [("session_label", "TEXT"), ("period_type", "TEXT")]:
        if col not in existing_cols:
            cur.execute(f"ALTER TABLE monitoring_sessions ADD COLUMN {col} {typedef}")
            conn.commit()
            print(f"✓ Added column {col} to monitoring_sessions")
        else:
            print(f"○ Column {col} already exists — skipping ALTER TABLE")

    # 2. Backfill existing rows
    cur.execute(
        """SELECT ms.id, ms.started_at, ms.location_id
           FROM monitoring_sessions ms
           WHERE ms.period_type IS NULL OR ms.session_label IS NULL"""
    )
    sessions = cur.fetchall()
    print(f"Backfilling {len(sessions)} session(s)...")

    updated = 0
    for row in sessions:
        session_id = row["id"]
        started_at = row["started_at"]
        location_id = row["location_id"]

        # Look up location name
        location_name = None
        if location_id:
            cur.execute("SELECT name FROM monitoring_locations WHERE id = ?", (location_id,))
            loc_row = cur.fetchone()
            if loc_row:
                location_name = loc_row["name"]

        period_type = _derive_period_type(started_at)
        label = _build_label(started_at, location_name, period_type)

        cur.execute(
            "UPDATE monitoring_sessions SET period_type = ?, session_label = ? WHERE id = ?",
            (period_type, label, session_id),
        )
        updated += 1

    conn.commit()
    conn.close()
    print(f"✓ Backfilled {updated} session(s).")
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
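The backfill above calls `_derive_period_type()`, which is defined earlier in this file and not shown in this excerpt. A minimal sketch of the classification it implies follows; the 19:00/07:00 day/night thresholds are an assumption borrowed from the `ReportTemplate` defaults elsewhere in the repo, and the helper name here is illustrative, not the project's actual code.

```python
from datetime import datetime

# Assumed thresholds: "night" spans 19:00-07:00, mirroring the default
# start_time/end_time values used by ReportTemplate. The real
# _derive_period_type() in this file may differ.
DAY_START_HOUR = 7
NIGHT_START_HOUR = 19


def derive_period_type(started_at: str) -> str:
    """Classify an ISO timestamp into the four buckets the backfill
    writes: weekday_day | weekday_night | weekend_day | weekend_night."""
    dt = datetime.fromisoformat(started_at)
    is_weekend = dt.weekday() >= 5  # Saturday=5, Sunday=6
    is_day = DAY_START_HOUR <= dt.hour < NIGHT_START_HOUR
    return f"{'weekend' if is_weekend else 'weekday'}_{'day' if is_day else 'night'}"
```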
@@ -71,7 +71,7 @@ def migrate():
        print("\n○ No migration needed - all columns already exist.")

    print("\nSound level meter fields are now available in the roster table.")
-    print("You can now set device_type='sound_level_meter' for SLM devices.")
+    print("Note: Use device_type='slm' for Sound Level Meters. Legacy 'sound_level_meter' has been deprecated.")


 if __name__ == "__main__":
89
backend/migrate_add_tbd_dates.py
Normal file
@@ -0,0 +1,89 @@
"""
Migration: Add TBD date support to job reservations

Adds columns:
- job_reservations.estimated_end_date: For planning when end is TBD
- job_reservations.end_date_tbd: Boolean flag for TBD end dates
- job_reservation_units.unit_start_date: Unit-specific start (for swaps)
- job_reservation_units.unit_end_date: Unit-specific end (for swaps)
- job_reservation_units.unit_end_tbd: Unit-specific TBD flag
- job_reservation_units.notes: Notes for the assignment

Also makes job_reservations.end_date nullable.
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Check if job_reservations table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
        if not cursor.fetchone():
            print("job_reservations table does not exist. Skipping migration.")
            return

        # Get existing columns in job_reservations
        cursor.execute("PRAGMA table_info(job_reservations)")
        existing_cols = {row[1] for row in cursor.fetchall()}

        # Add new columns to job_reservations if they don't exist
        if 'estimated_end_date' not in existing_cols:
            print("Adding estimated_end_date column to job_reservations...")
            cursor.execute("ALTER TABLE job_reservations ADD COLUMN estimated_end_date DATE")

        if 'end_date_tbd' not in existing_cols:
            print("Adding end_date_tbd column to job_reservations...")
            cursor.execute("ALTER TABLE job_reservations ADD COLUMN end_date_tbd BOOLEAN DEFAULT 0")

        # Get existing columns in job_reservation_units
        cursor.execute("PRAGMA table_info(job_reservation_units)")
        unit_cols = {row[1] for row in cursor.fetchall()}

        # Add new columns to job_reservation_units if they don't exist
        if 'unit_start_date' not in unit_cols:
            print("Adding unit_start_date column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_start_date DATE")

        if 'unit_end_date' not in unit_cols:
            print("Adding unit_end_date column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_end_date DATE")

        if 'unit_end_tbd' not in unit_cols:
            print("Adding unit_end_tbd column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_end_tbd BOOLEAN DEFAULT 0")

        if 'notes' not in unit_cols:
            print("Adding notes column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN notes TEXT")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    # Default to dev database
    db_path = "./data-dev/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
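The `PRAGMA table_info` guard used throughout this script can be exercised against a throwaway in-memory database. A reduced sketch (same table name, minimal schema) showing why re-running the migration is safe:

```python
import sqlite3

# Throwaway in-memory database with a minimal job_reservations table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE job_reservations (id TEXT PRIMARY KEY, start_date DATE)")


def add_column_if_missing(col: str, typedef: str) -> bool:
    """Mirror of the script's guard: ALTER only when the column is absent."""
    cur.execute("PRAGMA table_info(job_reservations)")
    existing = {row[1] for row in cur.fetchall()}  # row[1] is the column name
    if col in existing:
        return False  # already present -- safe to re-run
    cur.execute(f"ALTER TABLE job_reservations ADD COLUMN {col} {typedef}")
    return True


assert add_column_if_missing("end_date_tbd", "BOOLEAN DEFAULT 0") is True
assert add_column_if_missing("end_date_tbd", "BOOLEAN DEFAULT 0") is False  # idempotent
```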
105
backend/migrate_fix_end_date_nullable.py
Normal file
@@ -0,0 +1,105 @@
"""
Migration: Make job_reservations.end_date nullable for TBD support

SQLite doesn't support ALTER COLUMN, so we need to:
1. Create a new table with the correct schema
2. Copy data
3. Drop old table
4. Rename new table
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Check if job_reservations table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
        if not cursor.fetchone():
            print("job_reservations table does not exist. Skipping migration.")
            return

        # Check current schema
        cursor.execute("PRAGMA table_info(job_reservations)")
        columns = cursor.fetchall()
        col_info = {row[1]: row for row in columns}

        # Check if end_date is already nullable (notnull=0)
        if 'end_date' in col_info and col_info['end_date'][3] == 0:
            print("end_date is already nullable. Skipping table recreation.")
            return

        print("Recreating job_reservations table with nullable end_date...")

        # Create new table with correct schema
        cursor.execute("""
            CREATE TABLE job_reservations_new (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                project_id TEXT,
                start_date DATE NOT NULL,
                end_date DATE,
                estimated_end_date DATE,
                end_date_tbd BOOLEAN DEFAULT 0,
                assignment_type TEXT NOT NULL DEFAULT 'quantity',
                device_type TEXT DEFAULT 'seismograph',
                quantity_needed INTEGER,
                notes TEXT,
                color TEXT DEFAULT '#3B82F6',
                created_at DATETIME,
                updated_at DATETIME
            )
        """)

        # Copy existing data
        cursor.execute("""
            INSERT INTO job_reservations_new
            SELECT
                id, name, project_id, start_date, end_date,
                COALESCE(estimated_end_date, NULL) as estimated_end_date,
                COALESCE(end_date_tbd, 0) as end_date_tbd,
                assignment_type, device_type, quantity_needed, notes, color,
                created_at, updated_at
            FROM job_reservations
        """)

        # Drop old table
        cursor.execute("DROP TABLE job_reservations")

        # Rename new table
        cursor.execute("ALTER TABLE job_reservations_new RENAME TO job_reservations")

        # Recreate index
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_job_reservations_id ON job_reservations (id)")
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_job_reservations_project_id ON job_reservations (project_id)")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    # Default to dev database
    db_path = "./data-dev/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
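The four-step recreate/copy/drop/rename dance this docstring describes can be demonstrated in miniature against an in-memory database (schema reduced to two columns; illustrative only):

```python
import sqlite3

# Start from a table whose end_date is NOT NULL, then relax the constraint
# by rebuilding the table, since SQLite cannot alter a column in place.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE jobs (id TEXT PRIMARY KEY, end_date DATE NOT NULL)")
cur.execute("INSERT INTO jobs VALUES ('a', '2025-06-01')")

cur.execute("CREATE TABLE jobs_new (id TEXT PRIMARY KEY, end_date DATE)")  # 1. new schema, nullable
cur.execute("INSERT INTO jobs_new SELECT id, end_date FROM jobs")          # 2. copy data
cur.execute("DROP TABLE jobs")                                             # 3. drop old table
cur.execute("ALTER TABLE jobs_new RENAME TO jobs")                         # 4. rename new table
conn.commit()

cur.execute("INSERT INTO jobs VALUES ('b', NULL)")  # NULL end_date now accepted
```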
54
backend/migrate_rename_recording_to_monitoring_sessions.py
Normal file
@@ -0,0 +1,54 @@
"""
Migration: Rename recording_sessions table to monitoring_sessions

Renames the table and updates the model name from RecordingSession to MonitoringSession.
Run once per database: python backend/migrate_rename_recording_to_monitoring_sessions.py
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='recording_sessions'")
        if not cursor.fetchone():
            cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='monitoring_sessions'")
            if cursor.fetchone():
                print("monitoring_sessions table already exists. Skipping migration.")
            else:
                print("recording_sessions table does not exist. Skipping migration.")
            return

        print("Renaming recording_sessions -> monitoring_sessions...")
        cursor.execute("ALTER TABLE recording_sessions RENAME TO monitoring_sessions")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    db_path = "./data/terra-view.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
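The double `sqlite_master` lookup is what makes this rename safe to re-run. The same checks in miniature, against an in-memory database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE recording_sessions (id TEXT PRIMARY KEY)")


def table_exists(name: str) -> bool:
    """Check for a table by name in sqlite_master, as the migration does."""
    cur.execute("SELECT name FROM sqlite_master WHERE type='table' AND name=?", (name,))
    return cur.fetchone() is not None


# Rename only when the old table exists and the new one does not;
# a second run falls through both checks and does nothing.
if table_exists("recording_sessions") and not table_exists("monitoring_sessions"):
    cur.execute("ALTER TABLE recording_sessions RENAME TO monitoring_sessions")
```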
106
backend/migrate_standardize_device_types.py
Normal file
@@ -0,0 +1,106 @@
"""
Database Migration: Standardize device_type values

This migration ensures all device_type values follow the official schema:
- "seismograph" - Seismic monitoring devices
- "modem" - Field modems and network equipment
- "slm" - Sound level meters (NL-43/NL-53)

Changes:
- Converts "sound_level_meter" → "slm"
- Safe to run multiple times (idempotent)
- No data loss

Usage:
    python backend/migrate_standardize_device_types.py
"""

import sys
import os

# Add parent directory to path so we can import backend modules
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

# Database configuration
SQLALCHEMY_DATABASE_URL = "sqlite:///./data/seismo_fleet.db"
engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False})
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def migrate():
    """Standardize device_type values in the database"""
    db = SessionLocal()

    try:
        print("=" * 70)
        print("Database Migration: Standardize device_type values")
        print("=" * 70)
        print()

        # Check for existing "sound_level_meter" values
        result = db.execute(
            text("SELECT COUNT(*) as count FROM roster WHERE device_type = 'sound_level_meter'")
        ).fetchone()

        count_to_migrate = result[0] if result else 0

        if count_to_migrate == 0:
            print("✓ No records need migration - all device_type values are already standardized")
            print()
            print("Current device_type distribution:")

            # Show distribution
            distribution = db.execute(
                text("SELECT device_type, COUNT(*) as count FROM roster GROUP BY device_type ORDER BY count DESC")
            ).fetchall()

            for row in distribution:
                device_type, count = row
                print(f"  - {device_type}: {count} units")

            print()
            print("Migration not needed.")
            return

        print(f"Found {count_to_migrate} record(s) with device_type='sound_level_meter'")
        print()
        print("Converting 'sound_level_meter' → 'slm'...")

        # Perform the migration
        db.execute(
            text("UPDATE roster SET device_type = 'slm' WHERE device_type = 'sound_level_meter'")
        )
        db.commit()

        print(f"✓ Successfully migrated {count_to_migrate} record(s)")
        print()

        # Show final distribution
        print("Updated device_type distribution:")
        distribution = db.execute(
            text("SELECT device_type, COUNT(*) as count FROM roster GROUP BY device_type ORDER BY count DESC")
        ).fetchall()

        for row in distribution:
            device_type, count = row
            print(f"  - {device_type}: {count} units")

        print()
        print("=" * 70)
        print("Migration completed successfully!")
        print("=" * 70)

    except Exception as e:
        db.rollback()
        print(f"\n❌ Error during migration: {e}")
        print("\nRolling back changes...")
        raise
    finally:
        db.close()


if __name__ == "__main__":
    migrate()
@@ -19,14 +19,17 @@ class RosterUnit(Base):
     Roster table: represents our *intended assignment* of a unit.
     This is editable from the GUI.

-    Supports multiple device types (seismograph, modem, sound_level_meter) with type-specific fields.
+    Supports multiple device types with type-specific fields:
+    - "seismograph" - Seismic monitoring devices (default)
+    - "modem" - Field modems and network equipment
+    - "slm" - Sound level meters (NL-43/NL-53)
     """
     __tablename__ = "roster"

     # Core fields (all device types)
     id = Column(String, primary_key=True, index=True)
     unit_type = Column(String, default="series3")  # Backward compatibility
-    device_type = Column(String, default="seismograph")  # "seismograph" | "modem" | "sound_level_meter"
+    device_type = Column(String, default="seismograph")  # "seismograph" | "modem" | "slm"
     deployed = Column(Boolean, default=True)
     retired = Column(Boolean, default=False)
     note = Column(String, nullable=True)
@@ -47,6 +50,8 @@ class RosterUnit(Base):
     ip_address = Column(String, nullable=True)
     phone_number = Column(String, nullable=True)
     hardware_model = Column(String, nullable=True)
+    deployment_type = Column(String, nullable=True)  # "seismograph" | "slm" - what type of device this modem is deployed with
+    deployed_with_unit_id = Column(String, nullable=True)  # ID of seismograph/SLM this modem is deployed with

     # Sound Level Meter-specific fields (nullable for seismographs and modems)
     slm_host = Column(String, nullable=True)  # Device IP or hostname
@@ -134,17 +139,31 @@ class Project(Base):
     """
     Projects: top-level organization for monitoring work.
     Type-aware to enable/disable features based on project_type_id.
+
+    Project naming convention:
+    - project_number: TMI internal ID format xxxx-YY (e.g., "2567-23")
+    - client_name: Client/contractor name (e.g., "PJ Dick")
+    - name: Project/site name (e.g., "RKM Hall", "CMU Campus")
+
+    Display format: "2567-23 - PJ Dick - RKM Hall"
+    Users can search by any of these fields.
     """
     __tablename__ = "projects"

     id = Column(String, primary_key=True, index=True)  # UUID
-    name = Column(String, nullable=False, unique=True)
+    project_number = Column(String, nullable=True, index=True)  # TMI ID: xxxx-YY format (e.g., "2567-23")
+    name = Column(String, nullable=False, unique=True)  # Project/site name (e.g., "RKM Hall")
     description = Column(Text, nullable=True)
     project_type_id = Column(String, nullable=False)  # FK to ProjectType.id
-    status = Column(String, default="active")  # active, completed, archived
+    status = Column(String, default="active")  # active, on_hold, completed, archived, deleted
+
+    # Data collection mode: how field data reaches Terra-View.
+    # "remote" — units have modems; data pulled via FTP/scheduler automatically
+    # "manual" — no modem; SD cards retrieved daily and uploaded by hand
+    data_collection_mode = Column(String, default="manual")  # remote | manual

     # Project metadata
-    client_name = Column(String, nullable=True)
+    client_name = Column(String, nullable=True, index=True)  # Client name (e.g., "PJ Dick")
     site_address = Column(String, nullable=True)
     site_coordinates = Column(String, nullable=True)  # "lat,lon"
     start_date = Column(Date, nullable=True)
@@ -152,6 +171,7 @@ class Project(Base):

     created_at = Column(DateTime, default=datetime.utcnow)
     updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+    deleted_at = Column(DateTime, nullable=True)  # Set when status='deleted'; hard delete scheduled after 60 days


 class MonitoringLocation(Base):
@@ -197,7 +217,7 @@ class UnitAssignment(Base):
     notes = Column(Text, nullable=True)

     # Denormalized for efficient queries
-    device_type = Column(String, nullable=False)  # sound_level_meter | seismograph
+    device_type = Column(String, nullable=False)  # "slm" | "seismograph"
     project_id = Column(String, nullable=False, index=True)  # FK to Project.id

     created_at = Column(DateTime, default=datetime.utcnow)
@@ -215,8 +235,8 @@ class ScheduledAction(Base):
     location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
     unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (nullable if location-based)

-    action_type = Column(String, nullable=False)  # start, stop, download, calibrate
-    device_type = Column(String, nullable=False)  # sound_level_meter | seismograph
+    action_type = Column(String, nullable=False)  # start, stop, download, cycle, calibrate
+    device_type = Column(String, nullable=False)  # "slm" | "seismograph"

     scheduled_time = Column(DateTime, nullable=False, index=True)
     executed_at = Column(DateTime, nullable=True)
@@ -230,17 +250,21 @@ class ScheduledAction(Base):
     created_at = Column(DateTime, default=datetime.utcnow)


-class RecordingSession(Base):
+class MonitoringSession(Base):
     """
-    Recording sessions: tracks actual monitoring sessions.
-    Created when recording starts, updated when it stops.
+    Monitoring sessions: tracks actual monitoring sessions.
+    Created when monitoring starts, updated when it stops.
     """
-    __tablename__ = "recording_sessions"
+    __tablename__ = "monitoring_sessions"

     id = Column(String, primary_key=True, index=True)  # UUID
     project_id = Column(String, nullable=False, index=True)  # FK to Project.id
     location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
-    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id
+    unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (nullable for offline uploads)
+
+    # Physical device model that produced this session's data (e.g. "NL-43", "NL-53", "NL-32").
+    # Null for older records; report code falls back to file-content detection when null.
+    device_model = Column(String, nullable=True)

     session_type = Column(String, nullable=False)  # sound | vibration
     started_at = Column(DateTime, nullable=False)
@@ -248,6 +272,14 @@ class RecordingSession(Base):
     duration_seconds = Column(Integer, nullable=True)
     status = Column(String, default="recording")  # recording, completed, failed

+    # Human-readable label auto-derived from date/location, editable by user.
+    # e.g. "NRL-1 — Sun 2/23 — Night"
+    session_label = Column(String, nullable=True)
+
+    # Period classification for report stats columns.
+    # weekday_day | weekday_night | weekend_day | weekend_night
+    period_type = Column(String, nullable=True)
+
     # Snapshot of device configuration at recording time
     session_metadata = Column(Text, nullable=True)  # JSON

@@ -263,7 +295,7 @@ class DataFile(Base):
     __tablename__ = "data_files"

     id = Column(String, primary_key=True, index=True)  # UUID
-    session_id = Column(String, nullable=False, index=True)  # FK to RecordingSession.id
+    session_id = Column(String, nullable=False, index=True)  # FK to MonitoringSession.id

     file_path = Column(String, nullable=False)  # Relative to data/Projects/
     file_type = Column(String, nullable=False)  # wav, csv, mseed, json
@@ -275,3 +307,190 @@ class DataFile(Base):
     file_metadata = Column(Text, nullable=True)  # JSON

     created_at = Column(DateTime, default=datetime.utcnow)
+
+
+class ReportTemplate(Base):
+    """
+    Report templates: saved configurations for generating Excel reports.
+    Allows users to save time filter presets, titles, etc. for reuse.
+    """
+    __tablename__ = "report_templates"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+    name = Column(String, nullable=False)  # "Nighttime Report", "Full Day Report"
+    project_id = Column(String, nullable=True)  # Optional: project-specific template
+
+    # Template settings
+    report_title = Column(String, default="Background Noise Study")
+    start_time = Column(String, nullable=True)  # "19:00" format
+    end_time = Column(String, nullable=True)  # "07:00" format
+    start_date = Column(String, nullable=True)  # "2025-01-15" format (optional)
+    end_date = Column(String, nullable=True)  # "2025-01-20" format (optional)
+
+    created_at = Column(DateTime, default=datetime.utcnow)
+    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+# ============================================================================
+# Sound Monitoring Scheduler
+# ============================================================================
+
+class RecurringSchedule(Base):
+    """
+    Recurring schedule definitions for automated sound monitoring.
+
+    Supports three schedule types:
+    - "weekly_calendar": Select specific days with start/end times (e.g., Mon/Wed/Fri 7pm-7am)
+    - "simple_interval": For 24/7 monitoring with daily stop/download/restart cycles
+    - "one_off": Single recording session with specific start and end date/time
+    """
+    __tablename__ = "recurring_schedules"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+    project_id = Column(String, nullable=False, index=True)  # FK to Project.id
+    location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
+    unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (optional, can use assignment)
+
+    name = Column(String, nullable=False)  # "Weeknight Monitoring", "24/7 Continuous"
+    schedule_type = Column(String, nullable=False)  # "weekly_calendar" | "simple_interval" | "one_off"
+    device_type = Column(String, nullable=False)  # "slm" | "seismograph"
+
+    # Weekly Calendar fields (schedule_type = "weekly_calendar")
+    # JSON format: {
+    #   "monday": {"enabled": true, "start": "19:00", "end": "07:00"},
+    #   "tuesday": {"enabled": false},
+    #   ...
+    # }
+    weekly_pattern = Column(Text, nullable=True)
+
+    # Simple Interval fields (schedule_type = "simple_interval")
+    interval_type = Column(String, nullable=True)  # "daily" | "hourly"
+    cycle_time = Column(String, nullable=True)  # "00:00" - time to run stop/download/restart
+    include_download = Column(Boolean, default=True)  # Download data before restart
+
+    # One-Off fields (schedule_type = "one_off")
+    start_datetime = Column(DateTime, nullable=True)  # Exact start date+time (stored as UTC)
+    end_datetime = Column(DateTime, nullable=True)  # Exact end date+time (stored as UTC)
+
+    # Automation options (applies to all schedule types)
+    auto_increment_index = Column(Boolean, default=True)  # Auto-increment store/index number before start
+    # When True: prevents "overwrite data?" prompts by using a new index each time
+
+    # Shared configuration
+    enabled = Column(Boolean, default=True)
+    timezone = Column(String, default="America/New_York")
+
+    # Tracking
+    last_generated_at = Column(DateTime, nullable=True)  # When actions were last generated
+    next_occurrence = Column(DateTime, nullable=True)  # Computed next action time
+
+    created_at = Column(DateTime, default=datetime.utcnow)
+    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
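One way a scheduler might evaluate the `weekly_pattern` JSON documented on the model above, including how an overnight window (e.g. 19:00-07:00) spills into the next day. The helper is a sketch under assumptions, not Terra-View's actual scheduler code:

```python
import json
from datetime import datetime

DAYS = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"]


def is_window_active(weekly_pattern: str, now: datetime) -> bool:
    """Return True if `now` falls inside a configured monitoring window."""
    pattern = json.loads(weekly_pattern)
    hhmm = now.strftime("%H:%M")
    cfg = pattern.get(DAYS[now.weekday()], {})
    if cfg.get("enabled"):
        start, end = cfg["start"], cfg["end"]
        if start <= end:
            return start <= hhmm < end      # same-day window
        return hhmm >= start or hhmm < end  # overnight window starting today
    # An overnight window from the previous day may still be running.
    prev = pattern.get(DAYS[(now.weekday() - 1) % 7], {})
    if prev.get("enabled") and prev["start"] > prev["end"]:
        return hhmm < prev["end"]
    return False


pattern = json.dumps({"monday": {"enabled": True, "start": "19:00", "end": "07:00"}})
```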
+class Alert(Base):
+    """
+    In-app alerts for device status changes and system events.
+
+    Designed for future expansion to email/webhook notifications.
+    Currently supports:
+    - device_offline: Device became unreachable
+    - device_online: Device came back online
+    - schedule_failed: Scheduled action failed to execute
+    - schedule_completed: Scheduled action completed successfully
+    """
+    __tablename__ = "alerts"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+
+    # Alert classification
+    alert_type = Column(String, nullable=False)  # "device_offline" | "device_online" | "schedule_failed" | "schedule_completed"
+    severity = Column(String, default="warning")  # "info" | "warning" | "critical"
+
+    # Related entities (nullable - may not all apply)
+    project_id = Column(String, nullable=True, index=True)
+    location_id = Column(String, nullable=True, index=True)
+    unit_id = Column(String, nullable=True, index=True)
+    schedule_id = Column(String, nullable=True)  # RecurringSchedule or ScheduledAction id
+
+    # Alert content
+    title = Column(String, nullable=False)  # "NRL-001 Device Offline"
+    message = Column(Text, nullable=True)  # Detailed description
+    alert_metadata = Column(Text, nullable=True)  # JSON: additional context data
+
+    # Status tracking
+    status = Column(String, default="active")  # "active" | "acknowledged" | "resolved" | "dismissed"
+    acknowledged_at = Column(DateTime, nullable=True)
+    resolved_at = Column(DateTime, nullable=True)
+
+    created_at = Column(DateTime, default=datetime.utcnow)
+    expires_at = Column(DateTime, nullable=True)  # Auto-dismiss after this time
+
+
+# ============================================================================
+# Fleet Calendar & Job Reservations
+# ============================================================================
+
+class JobReservation(Base):
+    """
+    Job reservations: reserve units for future jobs/projects.
+
+    Supports two assignment modes:
+    - "specific": Pick exact units (SN-001, SN-002, etc.)
+    - "quantity": Reserve a number of units (e.g., "need 8 seismographs")
+
+    Used by the Fleet Calendar to visualize unit availability over time.
+    """
+    __tablename__ = "job_reservations"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+    name = Column(String, nullable=False)  # "Job A - March deployment"
+    project_id = Column(String, nullable=True, index=True)  # Optional FK to Project
+
+    # Date range for the reservation
+    start_date = Column(Date, nullable=False)
+    end_date = Column(Date, nullable=True)  # Nullable = TBD / ongoing
+    estimated_end_date = Column(Date, nullable=True)  # For planning when end is TBD
+    end_date_tbd = Column(Boolean, default=False)  # True = end date unknown
+
+    # Assignment type: "specific" or "quantity"
+    assignment_type = Column(String, nullable=False, default="quantity")
+
+    # For quantity reservations
+    device_type = Column(String, default="seismograph")  # seismograph | slm
+    quantity_needed = Column(Integer, nullable=True)  # e.g., 8 units
+
+    # Metadata
+    notes = Column(Text, nullable=True)
+    color = Column(String, default="#3B82F6")  # For calendar display (blue default)
+
+    created_at = Column(DateTime, default=datetime.utcnow)
+    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+class JobReservationUnit(Base):
+    """
+    Links specific units to job reservations.
+
+    Used when:
+    - assignment_type="specific": Units are directly assigned
+    - assignment_type="quantity": Units can be filled in later
+
+    Supports unit swaps: same reservation can have multiple units with
+    different date ranges (e.g., BE17353 Feb-Jun, then BE18438 Jun-Nov).
+    """
+    __tablename__ = "job_reservation_units"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+    reservation_id = Column(String, nullable=False, index=True)  # FK to JobReservation
+    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit
+
+    # Unit-specific date range (for swaps) - defaults to reservation dates if null
+    unit_start_date = Column(Date, nullable=True)  # When this specific unit starts
+    unit_end_date = Column(Date, nullable=True)  # When this unit ends (swap out date)
+    unit_end_tbd = Column(Boolean, default=False)  # True = end unknown (until cal expires or job ends)
+
+    # Track how this assignment was made
+    assignment_source = Column(String, default="specific")  # "specific" | "filled" | "swap"
+    assigned_at = Column(DateTime, default=datetime.utcnow)
+    notes = Column(Text, nullable=True)  # "Replacing BE17353" etc.
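A back-of-envelope version of the availability count the Fleet Calendar needs from these tables: for a given day, sum `quantity_needed` across reservations whose date range covers it, treating TBD ends as ongoing. Plain dicts stand in for ORM rows; the helper is purely illustrative:

```python
from datetime import date


def units_reserved(reservations, on: date) -> int:
    """Count units reserved on a given day; open-ended/TBD reservations count."""
    total = 0
    for r in reservations:
        ends_open = r.get("end_date_tbd") or r["end_date"] is None
        if r["start_date"] <= on and (ends_open or on <= r["end_date"]):
            total += r.get("quantity_needed") or 0
    return total


reservations = [
    {"start_date": date(2025, 3, 1), "end_date": date(2025, 3, 10),
     "end_date_tbd": False, "quantity_needed": 8},
    {"start_date": date(2025, 3, 5), "end_date": None,
     "end_date_tbd": True, "quantity_needed": 3},  # ongoing, end TBD
]
```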
326 backend/routers/alerts.py Normal file
@@ -0,0 +1,326 @@
"""
Alerts Router

API endpoints for managing in-app alerts.
"""

from fastapi import APIRouter, Request, Depends, HTTPException, Query
from fastapi.responses import HTMLResponse, JSONResponse
from sqlalchemy.orm import Session
from typing import Optional
from datetime import datetime, timedelta

from backend.database import get_db
from backend.models import Alert, RosterUnit
from backend.services.alert_service import get_alert_service
from backend.templates_config import templates

router = APIRouter(prefix="/api/alerts", tags=["alerts"])


# ============================================================================
# Alert List and Count
# ============================================================================

@router.get("/")
async def list_alerts(
    db: Session = Depends(get_db),
    status: Optional[str] = Query(None, description="Filter by status: active, acknowledged, resolved, dismissed"),
    project_id: Optional[str] = Query(None),
    unit_id: Optional[str] = Query(None),
    alert_type: Optional[str] = Query(None, description="Filter by type: device_offline, device_online, schedule_failed"),
    limit: int = Query(50, le=100),
    offset: int = Query(0, ge=0),
):
    """
    List alerts with optional filters.
    """
    alert_service = get_alert_service(db)

    alerts = alert_service.get_all_alerts(
        status=status,
        project_id=project_id,
        unit_id=unit_id,
        alert_type=alert_type,
        limit=limit,
        offset=offset,
    )

    return {
        "alerts": [
            {
                "id": a.id,
                "alert_type": a.alert_type,
                "severity": a.severity,
                "title": a.title,
                "message": a.message,
                "status": a.status,
                "unit_id": a.unit_id,
                "project_id": a.project_id,
                "location_id": a.location_id,
                "created_at": a.created_at.isoformat() if a.created_at else None,
                "acknowledged_at": a.acknowledged_at.isoformat() if a.acknowledged_at else None,
                "resolved_at": a.resolved_at.isoformat() if a.resolved_at else None,
            }
            for a in alerts
        ],
        "count": len(alerts),
        "limit": limit,
        "offset": offset,
    }


@router.get("/active")
async def list_active_alerts(
    db: Session = Depends(get_db),
    project_id: Optional[str] = Query(None),
    unit_id: Optional[str] = Query(None),
    alert_type: Optional[str] = Query(None),
    min_severity: Optional[str] = Query(None, description="Minimum severity: info, warning, critical"),
    limit: int = Query(50, le=100),
):
    """
    List only active alerts.
    """
    alert_service = get_alert_service(db)

    alerts = alert_service.get_active_alerts(
        project_id=project_id,
        unit_id=unit_id,
        alert_type=alert_type,
        min_severity=min_severity,
        limit=limit,
    )

    return {
        "alerts": [
            {
                "id": a.id,
                "alert_type": a.alert_type,
                "severity": a.severity,
                "title": a.title,
                "message": a.message,
                "unit_id": a.unit_id,
                "project_id": a.project_id,
                "created_at": a.created_at.isoformat() if a.created_at else None,
            }
            for a in alerts
        ],
        "count": len(alerts),
    }


@router.get("/active/count")
async def get_active_alert_count(db: Session = Depends(get_db)):
    """
    Get count of active alerts (for navbar badge).
    """
    alert_service = get_alert_service(db)
    count = alert_service.get_active_alert_count()
    return {"count": count}


# ============================================================================
# Single Alert Operations
# ============================================================================

@router.get("/{alert_id}")
async def get_alert(
    alert_id: str,
    db: Session = Depends(get_db),
):
    """
    Get a specific alert.
    """
    alert = db.query(Alert).filter_by(id=alert_id).first()
    if not alert:
        raise HTTPException(status_code=404, detail="Alert not found")

    # Get related unit info
    unit = None
    if alert.unit_id:
        unit = db.query(RosterUnit).filter_by(id=alert.unit_id).first()

    return {
        "id": alert.id,
        "alert_type": alert.alert_type,
        "severity": alert.severity,
        "title": alert.title,
        "message": alert.message,
        "metadata": alert.alert_metadata,
        "status": alert.status,
        "unit_id": alert.unit_id,
        "unit_name": unit.id if unit else None,
        "project_id": alert.project_id,
        "location_id": alert.location_id,
        "schedule_id": alert.schedule_id,
        "created_at": alert.created_at.isoformat() if alert.created_at else None,
        "acknowledged_at": alert.acknowledged_at.isoformat() if alert.acknowledged_at else None,
        "resolved_at": alert.resolved_at.isoformat() if alert.resolved_at else None,
        "expires_at": alert.expires_at.isoformat() if alert.expires_at else None,
    }


@router.post("/{alert_id}/acknowledge")
async def acknowledge_alert(
    alert_id: str,
    db: Session = Depends(get_db),
):
    """
    Mark alert as acknowledged.
    """
    alert_service = get_alert_service(db)
    alert = alert_service.acknowledge_alert(alert_id)

    if not alert:
        raise HTTPException(status_code=404, detail="Alert not found")

    return {
        "success": True,
        "alert_id": alert.id,
        "status": alert.status,
    }


@router.post("/{alert_id}/dismiss")
async def dismiss_alert(
    alert_id: str,
    db: Session = Depends(get_db),
):
    """
    Dismiss alert.
    """
    alert_service = get_alert_service(db)
    alert = alert_service.dismiss_alert(alert_id)

    if not alert:
        raise HTTPException(status_code=404, detail="Alert not found")

    return {
        "success": True,
        "alert_id": alert.id,
        "status": alert.status,
    }


@router.post("/{alert_id}/resolve")
async def resolve_alert(
    alert_id: str,
    db: Session = Depends(get_db),
):
    """
    Manually resolve an alert.
    """
    alert_service = get_alert_service(db)
    alert = alert_service.resolve_alert(alert_id)

    if not alert:
        raise HTTPException(status_code=404, detail="Alert not found")

    return {
        "success": True,
        "alert_id": alert.id,
        "status": alert.status,
    }


# ============================================================================
# HTML Partials for HTMX
# ============================================================================

@router.get("/partials/dropdown", response_class=HTMLResponse)
async def get_alert_dropdown(
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Return HTML partial for alert dropdown in navbar.
    """
    alert_service = get_alert_service(db)
    alerts = alert_service.get_active_alerts(limit=10)

    # Calculate relative time for each alert
    now = datetime.utcnow()
    alerts_data = []
    for alert in alerts:
        delta = now - alert.created_at
        if delta.days > 0:
            time_ago = f"{delta.days}d ago"
        elif delta.seconds >= 3600:
            time_ago = f"{delta.seconds // 3600}h ago"
        elif delta.seconds >= 60:
            time_ago = f"{delta.seconds // 60}m ago"
        else:
            time_ago = "just now"

        alerts_data.append({
            "alert": alert,
            "time_ago": time_ago,
        })

    return templates.TemplateResponse("partials/alerts/alert_dropdown.html", {
        "request": request,
        "alerts": alerts_data,
        "total_count": alert_service.get_active_alert_count(),
    })


@router.get("/partials/list", response_class=HTMLResponse)
async def get_alert_list(
    request: Request,
    db: Session = Depends(get_db),
    status: Optional[str] = Query(None),
    limit: int = Query(20),
):
    """
    Return HTML partial for alert list page.
    """
    alert_service = get_alert_service(db)

    if status:
        alerts = alert_service.get_all_alerts(status=status, limit=limit)
    else:
        alerts = alert_service.get_all_alerts(limit=limit)

    # Calculate relative time for each alert
    now = datetime.utcnow()
    alerts_data = []
    for alert in alerts:
        delta = now - alert.created_at
        if delta.days > 0:
            time_ago = f"{delta.days}d ago"
        elif delta.seconds >= 3600:
            time_ago = f"{delta.seconds // 3600}h ago"
        elif delta.seconds >= 60:
            time_ago = f"{delta.seconds // 60}m ago"
        else:
            time_ago = "just now"

        alerts_data.append({
            "alert": alert,
            "time_ago": time_ago,
        })

    return templates.TemplateResponse("partials/alerts/alert_list.html", {
        "request": request,
        "alerts": alerts_data,
        "status_filter": status,
    })


# ============================================================================
# Cleanup
# ============================================================================

@router.post("/cleanup-expired")
async def cleanup_expired_alerts(db: Session = Depends(get_db)):
    """
    Cleanup expired alerts (admin/maintenance endpoint).
    """
    alert_service = get_alert_service(db)
    count = alert_service.cleanup_expired_alerts()

    return {
        "success": True,
        "cleaned_up": count,
    }
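The relative-time formatting above is duplicated verbatim in the dropdown and list handlers. A minimal standalone sketch of that logic, factored into a hypothetical `time_ago` helper (the name and the explicit `now` parameter are assumptions for testability, not part of the repo):

```python
from datetime import datetime, timedelta


def time_ago(created_at: datetime, now: datetime) -> str:
    """Coarse relative-time label, mirroring the handlers' branching.

    timedelta splits into .days and .seconds (0..86399), so the
    hour/minute branches only look at the sub-day remainder.
    """
    delta = now - created_at
    if delta.days > 0:
        return f"{delta.days}d ago"
    elif delta.seconds >= 3600:
        return f"{delta.seconds // 3600}h ago"
    elif delta.seconds >= 60:
        return f"{delta.seconds // 60}m ago"
    return "just now"
```

Pulling this into a shared helper would remove the duplication without changing either endpoint's output.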
@@ -1,10 +1,15 @@
from fastapi import APIRouter, Request, Depends
from fastapi.templating import Jinja2Templates
from sqlalchemy.orm import Session
from sqlalchemy import and_
from datetime import datetime, timedelta

from backend.database import get_db
from backend.models import ScheduledAction, MonitoringLocation, Project
from backend.services.snapshot import emit_status_snapshot
from backend.templates_config import templates
from backend.utils.timezone import utc_to_local, local_to_utc, get_user_timezone

router = APIRouter()
templates = Jinja2Templates(directory="templates")


@router.get("/dashboard/active")
@@ -23,3 +28,79 @@ def dashboard_benched(request: Request):
        "partials/benched_table.html",
        {"request": request, "units": snapshot["benched"]}
    )


@router.get("/dashboard/todays-actions")
def dashboard_todays_actions(request: Request, db: Session = Depends(get_db)):
    """
    Get today's scheduled actions for the dashboard card.
    Shows upcoming, completed, and failed actions for today.
    """
    import json
    from zoneinfo import ZoneInfo

    # Get today's date range in local timezone
    tz = ZoneInfo(get_user_timezone())
    now_local = datetime.now(tz)
    today_start_local = now_local.replace(hour=0, minute=0, second=0, microsecond=0)
    today_end_local = today_start_local + timedelta(days=1)

    # Convert to UTC for database query
    today_start_utc = today_start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
    today_end_utc = today_end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

    # Exclude actions from paused/removed projects
    paused_project_ids = [
        p.id for p in db.query(Project.id).filter(
            Project.status.in_(["on_hold", "archived", "deleted"])
        ).all()
    ]

    # Query today's actions
    actions = db.query(ScheduledAction).filter(
        ScheduledAction.scheduled_time >= today_start_utc,
        ScheduledAction.scheduled_time < today_end_utc,
        ScheduledAction.project_id.notin_(paused_project_ids),
    ).order_by(ScheduledAction.scheduled_time.asc()).all()

    # Enrich with location/project info and parse results
    enriched_actions = []
    for action in actions:
        location = None
        project = None
        if action.location_id:
            location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
        if action.project_id:
            project = db.query(Project).filter_by(id=action.project_id).first()

        # Parse module_response for result details
        result_data = None
        if action.module_response:
            try:
                result_data = json.loads(action.module_response)
            except json.JSONDecodeError:
                pass

        enriched_actions.append({
            "action": action,
            "location": location,
            "project": project,
            "result": result_data,
        })

    # Count by status
    pending_count = sum(1 for a in actions if a.execution_status == "pending")
    completed_count = sum(1 for a in actions if a.execution_status == "completed")
    failed_count = sum(1 for a in actions if a.execution_status == "failed")

    return templates.TemplateResponse(
        "partials/dashboard/todays_actions.html",
        {
            "request": request,
            "actions": enriched_actions,
            "pending_count": pending_count,
            "completed_count": completed_count,
            "failed_count": failed_count,
            "total_count": len(actions),
        }
    )
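The handler above computes "today" in the user's timezone and converts the bounds to naive UTC for the database query. A minimal self-contained sketch of that conversion (the helper name `local_day_bounds_utc` is hypothetical; the repo derives the timezone from `get_user_timezone()`):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo


def local_day_bounds_utc(now_local: datetime) -> tuple:
    """Return (start, end) of the local calendar day as naive-UTC datetimes.

    Mirrors dashboard_todays_actions: snap to local midnight, add one day,
    then convert each bound to UTC and drop tzinfo to match naive DB columns.
    """
    start_local = now_local.replace(hour=0, minute=0, second=0, microsecond=0)
    end_local = start_local + timedelta(days=1)

    def to_naive_utc(d: datetime) -> datetime:
        return d.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

    return to_naive_utc(start_local), to_naive_utc(end_local)
```

Querying with half-open bounds (`>= start`, `< end`), as the handler does, avoids double-counting actions scheduled exactly at midnight.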
610 backend/routers/fleet_calendar.py Normal file
@@ -0,0 +1,610 @@
"""
Fleet Calendar Router

API endpoints for the Fleet Calendar feature:
- Calendar page and data
- Job reservation CRUD
- Unit assignment management
- Availability checking
"""

from fastapi import APIRouter, Request, Depends, HTTPException, Query
from fastapi.responses import HTMLResponse, JSONResponse
from sqlalchemy.orm import Session
from datetime import datetime, date, timedelta
from typing import Optional, List
import uuid
import logging

from backend.database import get_db
from backend.models import (
    RosterUnit, JobReservation, JobReservationUnit,
    UserPreferences, Project
)
from backend.templates_config import templates
from backend.services.fleet_calendar_service import (
    get_day_summary,
    get_calendar_year_data,
    get_rolling_calendar_data,
    check_calibration_conflicts,
    get_available_units_for_period,
    get_calibration_status
)

router = APIRouter(tags=["fleet-calendar"])
logger = logging.getLogger(__name__)


# ============================================================================
# Calendar Page
# ============================================================================

@router.get("/fleet-calendar", response_class=HTMLResponse)
async def fleet_calendar_page(
    request: Request,
    year: Optional[int] = None,
    month: Optional[int] = None,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Main Fleet Calendar page with rolling 12-month view."""
    today = date.today()

    # Default to current month as the start
    if year is None:
        year = today.year
    if month is None:
        month = today.month

    # Get calendar data for 12 months starting from year/month
    calendar_data = get_rolling_calendar_data(db, year, month, device_type)

    # Get projects for the reservation form dropdown
    projects = db.query(Project).filter(
        Project.status == "active"
    ).order_by(Project.name).all()

    # Calculate prev/next month navigation
    prev_year, prev_month = (year - 1, 12) if month == 1 else (year, month - 1)
    next_year, next_month = (year + 1, 1) if month == 12 else (year, month + 1)

    return templates.TemplateResponse(
        "fleet_calendar.html",
        {
            "request": request,
            "start_year": year,
            "start_month": month,
            "prev_year": prev_year,
            "prev_month": prev_month,
            "next_year": next_year,
            "next_month": next_month,
            "device_type": device_type,
            "calendar_data": calendar_data,
            "projects": projects,
            "today": today.isoformat()
        }
    )

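The prev/next navigation above wraps the month at year boundaries with conditional tuples. The same arithmetic, extracted into a hypothetical standalone helper for illustration:

```python
def month_nav(year: int, month: int):
    """Previous and next (year, month) pairs, wrapping at January/December,
    as computed inline in fleet_calendar_page."""
    prev_year, prev_month = (year - 1, 12) if month == 1 else (year, month - 1)
    next_year, next_month = (year + 1, 1) if month == 12 else (year, month + 1)
    return (prev_year, prev_month), (next_year, next_month)
```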
# ============================================================================
# Calendar Data API
# ============================================================================

@router.get("/api/fleet-calendar/data", response_class=JSONResponse)
async def get_calendar_data(
    year: int,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Get calendar data for a specific year."""
    return get_calendar_year_data(db, year, device_type)


@router.get("/api/fleet-calendar/day/{date_str}", response_class=HTMLResponse)
async def get_day_detail(
    request: Request,
    date_str: str,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Get detailed view for a specific day (HTMX partial)."""
    try:
        check_date = date.fromisoformat(date_str)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")

    day_data = get_day_summary(db, check_date, device_type)

    # Get projects for display names
    projects = {p.id: p for p in db.query(Project).all()}

    return templates.TemplateResponse(
        "partials/fleet_calendar/day_detail.html",
        {
            "request": request,
            "day_data": day_data,
            "date_str": date_str,
            "date_display": check_date.strftime("%B %d, %Y"),
            "device_type": device_type,
            "projects": projects
        }
    )


# ============================================================================
# Reservation CRUD
# ============================================================================

@router.post("/api/fleet-calendar/reservations", response_class=JSONResponse)
async def create_reservation(
    request: Request,
    db: Session = Depends(get_db)
):
    """Create a new job reservation."""
    data = await request.json()

    # Validate required fields
    required = ["name", "start_date", "assignment_type"]
    for field in required:
        if field not in data:
            raise HTTPException(status_code=400, detail=f"Missing required field: {field}")

    # Need either end_date or end_date_tbd
    end_date_tbd = data.get("end_date_tbd", False)
    if not end_date_tbd and not data.get("end_date"):
        raise HTTPException(status_code=400, detail="End date is required unless marked as TBD")

    try:
        start_date = date.fromisoformat(data["start_date"])
        end_date = date.fromisoformat(data["end_date"]) if data.get("end_date") else None
        estimated_end_date = date.fromisoformat(data["estimated_end_date"]) if data.get("estimated_end_date") else None
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")

    if end_date and end_date < start_date:
        raise HTTPException(status_code=400, detail="End date must be after start date")

    if estimated_end_date and estimated_end_date < start_date:
        raise HTTPException(status_code=400, detail="Estimated end date must be after start date")

    reservation = JobReservation(
        id=str(uuid.uuid4()),
        name=data["name"],
        project_id=data.get("project_id"),
        start_date=start_date,
        end_date=end_date,
        estimated_end_date=estimated_end_date,
        end_date_tbd=end_date_tbd,
        assignment_type=data["assignment_type"],
        device_type=data.get("device_type", "seismograph"),
        quantity_needed=data.get("quantity_needed"),
        notes=data.get("notes"),
        color=data.get("color", "#3B82F6")
    )

    db.add(reservation)

    # If specific units were provided, assign them
    if data.get("unit_ids") and data["assignment_type"] == "specific":
        for unit_id in data["unit_ids"]:
            assignment = JobReservationUnit(
                id=str(uuid.uuid4()),
                reservation_id=reservation.id,
                unit_id=unit_id,
                assignment_source="specific"
            )
            db.add(assignment)

    db.commit()

    logger.info(f"Created reservation: {reservation.name} ({reservation.id})")

    return {
        "success": True,
        "reservation_id": reservation.id,
        "message": f"Created reservation: {reservation.name}"
    }


@router.get("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
async def get_reservation(
    reservation_id: str,
    db: Session = Depends(get_db)
):
    """Get a specific reservation with its assigned units."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    # Get assigned units
    assignments = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id
    ).all()

    unit_ids = [a.unit_id for a in assignments]
    units = db.query(RosterUnit).filter(RosterUnit.id.in_(unit_ids)).all() if unit_ids else []

    return {
        "id": reservation.id,
        "name": reservation.name,
        "project_id": reservation.project_id,
        "start_date": reservation.start_date.isoformat(),
        "end_date": reservation.end_date.isoformat() if reservation.end_date else None,
        "estimated_end_date": reservation.estimated_end_date.isoformat() if reservation.estimated_end_date else None,
        "end_date_tbd": reservation.end_date_tbd,
        "assignment_type": reservation.assignment_type,
        "device_type": reservation.device_type,
        "quantity_needed": reservation.quantity_needed,
        "notes": reservation.notes,
        "color": reservation.color,
        "assigned_units": [
            {
                "id": u.id,
                "last_calibrated": u.last_calibrated.isoformat() if u.last_calibrated else None,
                "deployed": u.deployed
            }
            for u in units
        ]
    }


@router.put("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
async def update_reservation(
    reservation_id: str,
    request: Request,
    db: Session = Depends(get_db)
):
    """Update an existing reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    data = await request.json()

    # Update fields if provided
    if "name" in data:
        reservation.name = data["name"]
    if "project_id" in data:
        reservation.project_id = data["project_id"]
    if "start_date" in data:
        reservation.start_date = date.fromisoformat(data["start_date"])
    if "end_date" in data:
        reservation.end_date = date.fromisoformat(data["end_date"]) if data["end_date"] else None
    if "estimated_end_date" in data:
        reservation.estimated_end_date = date.fromisoformat(data["estimated_end_date"]) if data["estimated_end_date"] else None
    if "end_date_tbd" in data:
        reservation.end_date_tbd = data["end_date_tbd"]
    if "assignment_type" in data:
        reservation.assignment_type = data["assignment_type"]
    if "quantity_needed" in data:
        reservation.quantity_needed = data["quantity_needed"]
    if "notes" in data:
        reservation.notes = data["notes"]
    if "color" in data:
        reservation.color = data["color"]

    reservation.updated_at = datetime.utcnow()

    db.commit()

    logger.info(f"Updated reservation: {reservation.name} ({reservation.id})")

    return {
        "success": True,
        "message": f"Updated reservation: {reservation.name}"
    }


@router.delete("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
async def delete_reservation(
    reservation_id: str,
    db: Session = Depends(get_db)
):
    """Delete a reservation and its unit assignments."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    # Delete unit assignments first
    db.query(JobReservationUnit).filter_by(reservation_id=reservation_id).delete()

    # Delete the reservation
    db.delete(reservation)
    db.commit()

    logger.info(f"Deleted reservation: {reservation.name} ({reservation_id})")

    return {
        "success": True,
        "message": "Reservation deleted"
    }

# ============================================================================
# Unit Assignment
# ============================================================================

@router.post("/api/fleet-calendar/reservations/{reservation_id}/assign-units", response_class=JSONResponse)
async def assign_units_to_reservation(
    reservation_id: str,
    request: Request,
    db: Session = Depends(get_db)
):
    """Assign specific units to a reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    data = await request.json()
    unit_ids = data.get("unit_ids", [])

    if not unit_ids:
        raise HTTPException(status_code=400, detail="No units specified")

    # Verify units exist
    units = db.query(RosterUnit).filter(RosterUnit.id.in_(unit_ids)).all()
    found_ids = {u.id for u in units}
    missing = set(unit_ids) - found_ids
    if missing:
        raise HTTPException(status_code=404, detail=f"Units not found: {', '.join(missing)}")

    # Check for conflicts (already assigned to overlapping reservations)
    conflicts = []
    for unit_id in unit_ids:
        # Check if unit is already assigned to this reservation
        existing = db.query(JobReservationUnit).filter_by(
            reservation_id=reservation_id,
            unit_id=unit_id
        ).first()
        if existing:
            continue  # Already assigned, skip

        # Check overlapping reservations
        overlapping = db.query(JobReservation).join(
            JobReservationUnit, JobReservation.id == JobReservationUnit.reservation_id
        ).filter(
            JobReservationUnit.unit_id == unit_id,
            JobReservation.id != reservation_id,
            JobReservation.start_date <= reservation.end_date,
            JobReservation.end_date >= reservation.start_date
        ).first()

        if overlapping:
            conflicts.append({
                "unit_id": unit_id,
                "conflict_reservation": overlapping.name,
                "conflict_dates": f"{overlapping.start_date} - {overlapping.end_date}"
            })
            continue

        # Add assignment
        assignment = JobReservationUnit(
            id=str(uuid.uuid4()),
            reservation_id=reservation_id,
            unit_id=unit_id,
            assignment_source="filled" if reservation.assignment_type == "quantity" else "specific"
        )
        db.add(assignment)

    db.commit()

    # Check for calibration conflicts
    cal_conflicts = check_calibration_conflicts(db, reservation_id)

    assigned_count = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id
    ).count()

    return {
        "success": True,
        "assigned_count": assigned_count,
        "conflicts": conflicts,
        "calibration_warnings": cal_conflicts,
        "message": f"Assigned {len(unit_ids) - len(conflicts)} units"
    }


@router.delete("/api/fleet-calendar/reservations/{reservation_id}/units/{unit_id}", response_class=JSONResponse)
async def remove_unit_from_reservation(
    reservation_id: str,
    unit_id: str,
    db: Session = Depends(get_db)
):
    """Remove a unit from a reservation."""
    assignment = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id,
        unit_id=unit_id
    ).first()

    if not assignment:
        raise HTTPException(status_code=404, detail="Unit assignment not found")

    db.delete(assignment)
    db.commit()

    return {
        "success": True,
        "message": f"Removed {unit_id} from reservation"
    }

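The conflict query in `assign_units_to_reservation` encodes the standard closed-interval overlap test in SQL: two reservations collide when each one starts on or before the other's end. A minimal sketch of that condition in plain Python (the helper name `dates_overlap` is hypothetical):

```python
from datetime import date


def dates_overlap(a_start: date, a_end: date, b_start: date, b_end: date) -> bool:
    """True when closed ranges [a_start, a_end] and [b_start, b_end] share
    at least one day -- the same predicate the SQL filter expresses with
    start_date <= other.end_date and end_date >= other.start_date."""
    return a_start <= b_end and a_end >= b_start
```

Note that with closed intervals, ranges that merely touch at an endpoint count as overlapping, which is why a swap-out and swap-in on the same day would be flagged as a conflict here.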
# ============================================================================
# Availability & Conflicts
# ============================================================================

@router.get("/api/fleet-calendar/availability", response_class=JSONResponse)
async def check_availability(
    start_date: str,
    end_date: str,
    device_type: str = "seismograph",
    exclude_reservation_id: Optional[str] = None,
    db: Session = Depends(get_db)
):
    """Get units available for a specific date range."""
    try:
        start = date.fromisoformat(start_date)
        end = date.fromisoformat(end_date)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")

    available = get_available_units_for_period(
        db, start, end, device_type, exclude_reservation_id
    )

    return {
        "start_date": start_date,
        "end_date": end_date,
        "device_type": device_type,
        "available_units": available,
        "count": len(available)
    }


@router.get("/api/fleet-calendar/reservations/{reservation_id}/conflicts", response_class=JSONResponse)
async def get_reservation_conflicts(
    reservation_id: str,
    db: Session = Depends(get_db)
):
    """Check for calibration conflicts in a reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    conflicts = check_calibration_conflicts(db, reservation_id)

    return {
        "reservation_id": reservation_id,
        "reservation_name": reservation.name,
        "conflicts": conflicts,
        "has_conflicts": len(conflicts) > 0
    }


# ============================================================================
# HTMX Partials
# ============================================================================

@router.get("/api/fleet-calendar/reservations-list", response_class=HTMLResponse)
async def get_reservations_list(
    request: Request,
    year: Optional[int] = None,
    month: Optional[int] = None,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Get list of reservations as HTMX partial."""
    from sqlalchemy import or_

    today = date.today()
    if year is None:
        year = today.year
    if month is None:
        month = today.month

    # Calculate 12-month window
    start_date = date(year, month, 1)
    # End date is 12 months later
    end_year = year + ((month + 10) // 12)
    end_month = ((month + 10) % 12) + 1
    if end_month == 12:
        end_date = date(end_year, 12, 31)
    else:
        end_date = date(end_year, end_month + 1, 1) - timedelta(days=1)

    # Include TBD reservations that started before window end
    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= end_date,
        or_(
            JobReservation.end_date >= start_date,
            JobReservation.end_date == None  # TBD reservations
        )
    ).order_by(JobReservation.start_date).all()

    # Get assignment counts
    reservation_data = []
    for res in reservations:
        assigned_count = db.query(JobReservationUnit).filter_by(
            reservation_id=res.id
        ).count()

        # Check for calibration conflicts
        conflicts = check_calibration_conflicts(db, res.id)

        reservation_data.append({
            "reservation": res,
            "assigned_count": assigned_count,
            "has_conflicts": len(conflicts) > 0,
            "conflict_count": len(conflicts)
        })

    return templates.TemplateResponse(
        "partials/fleet_calendar/reservations_list.html",
        {
            "request": request,
            "reservations": reservation_data,
            "year": year,
            "device_type": device_type
        }
    )
|
||||
|
||||
@router.get("/api/fleet-calendar/available-units", response_class=HTMLResponse)
|
||||
async def get_available_units_partial(
|
||||
request: Request,
|
||||
start_date: str,
|
||||
end_date: str,
|
||||
device_type: str = "seismograph",
|
||||
reservation_id: Optional[str] = None,
|
||||
db: Session = Depends(get_db)
|
||||
):
|
||||
"""Get available units as HTMX partial for the assignment modal."""
|
||||
try:
|
||||
start = date.fromisoformat(start_date)
|
||||
end = date.fromisoformat(end_date)
|
||||
except ValueError:
|
||||
raise HTTPException(status_code=400, detail="Invalid date format")
|
||||
|
||||
available = get_available_units_for_period(
|
||||
db, start, end, device_type, reservation_id
|
||||
)
|
||||
|
||||
return templates.TemplateResponse(
|
||||
"partials/fleet_calendar/available_units.html",
|
||||
{
|
||||
"request": request,
|
||||
"units": available,
|
||||
"start_date": start_date,
|
||||
"end_date": end_date,
|
||||
"device_type": device_type,
|
||||
"reservation_id": reservation_id
|
||||
}
|
||||
)
|
||||
|
||||
|
||||
@router.get("/api/fleet-calendar/month/{year}/{month}", response_class=HTMLResponse)
|
||||
async def get_month_partial(
|
||||
request: Request,
|
||||
year: int,
|
||||
month: int,
|
||||
device_type: str = "seismograph",
|
||||
db: Session = Depends(get_db)
|
||||
):
|
||||
"""Get a single month calendar as HTMX partial."""
|
||||
calendar_data = get_calendar_year_data(db, year, device_type)
|
||||
month_data = calendar_data["months"].get(month)
|
||||
|
||||
if not month_data:
|
||||
raise HTTPException(status_code=404, detail="Invalid month")
|
||||
|
||||
return templates.TemplateResponse(
|
||||
"partials/fleet_calendar/month_grid.html",
|
||||
{
|
||||
"request": request,
|
||||
"year": year,
|
||||
"month": month,
|
||||
"month_data": month_data,
|
||||
"device_type": device_type,
|
||||
"today": date.today().isoformat()
|
||||
}
|
||||
)
|
||||
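The 12-month window arithmetic in `get_reservations_list` is easy to verify in isolation. A standalone restatement of just the date math (the function name is ours, not part of the router):

```python
from datetime import date, timedelta

def twelve_month_window(year: int, month: int) -> tuple[date, date]:
    # Same arithmetic as the endpoint: a window starting at (year, month)
    # ends on the last day of the 11th following month.
    start = date(year, month, 1)
    end_year = year + ((month + 10) // 12)
    end_month = ((month + 10) % 12) + 1
    if end_month == 12:
        end = date(end_year, 12, 31)
    else:
        end = date(end_year, end_month + 1, 1) - timedelta(days=1)
    return start, end

# A window starting February 2025 runs through 31 January 2026.
print(twelve_month_window(2025, 2))
```

The `end_month == 12` special case exists because the generic branch computes "first day of the following month minus one day", and there is no month 13.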
backend/routers/modem_dashboard.py (new file, 429 lines)
@@ -0,0 +1,429 @@
"""
|
||||
Modem Dashboard Router
|
||||
|
||||
Provides API endpoints for the Field Modems management page.
|
||||
"""
|
||||
|
||||
from fastapi import APIRouter, Request, Depends, Query
|
||||
from fastapi.responses import HTMLResponse
|
||||
from sqlalchemy.orm import Session
|
||||
from datetime import datetime
|
||||
import subprocess
|
||||
import time
|
||||
import logging
|
||||
|
||||
from backend.database import get_db
|
||||
from backend.models import RosterUnit
|
||||
from backend.templates_config import templates
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
router = APIRouter(prefix="/api/modem-dashboard", tags=["modem-dashboard"])
|
||||
|
||||
|
||||
@router.get("/stats", response_class=HTMLResponse)
|
||||
async def get_modem_stats(request: Request, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Get summary statistics for modem dashboard.
|
||||
Returns HTML partial with stat cards.
|
||||
"""
|
||||
# Query all modems
|
||||
all_modems = db.query(RosterUnit).filter_by(device_type="modem").all()
|
||||
|
||||
# Get IDs of modems that have devices paired to them
|
||||
paired_modem_ids = set()
|
||||
devices_with_modems = db.query(RosterUnit).filter(
|
||||
RosterUnit.deployed_with_modem_id.isnot(None),
|
||||
RosterUnit.retired == False
|
||||
).all()
|
||||
for device in devices_with_modems:
|
||||
if device.deployed_with_modem_id:
|
||||
paired_modem_ids.add(device.deployed_with_modem_id)
|
||||
|
||||
# Count categories
|
||||
total_count = len(all_modems)
|
||||
retired_count = sum(1 for m in all_modems if m.retired)
|
||||
|
||||
# In use = deployed AND paired with a device
|
||||
in_use_count = sum(1 for m in all_modems
|
||||
if m.deployed and not m.retired and m.id in paired_modem_ids)
|
||||
|
||||
# Spare = deployed but NOT paired (available for assignment)
|
||||
spare_count = sum(1 for m in all_modems
|
||||
if m.deployed and not m.retired and m.id not in paired_modem_ids)
|
||||
|
||||
# Benched = not deployed and not retired
|
||||
benched_count = sum(1 for m in all_modems if not m.deployed and not m.retired)
|
||||
|
||||
return templates.TemplateResponse("partials/modem_stats.html", {
|
||||
"request": request,
|
||||
"total_count": total_count,
|
||||
"in_use_count": in_use_count,
|
||||
"spare_count": spare_count,
|
||||
"benched_count": benched_count,
|
||||
"retired_count": retired_count
|
||||
})
|
||||
|
||||
|
||||
@router.get("/units", response_class=HTMLResponse)
|
||||
async def get_modem_units(
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
search: str = Query(None),
|
||||
filter_status: str = Query(None), # "in_use", "spare", "benched", "retired"
|
||||
):
|
||||
"""
|
||||
Get list of modem units for the dashboard.
|
||||
Returns HTML partial with modem cards.
|
||||
"""
|
||||
query = db.query(RosterUnit).filter_by(device_type="modem")
|
||||
|
||||
# Filter by search term if provided
|
||||
if search:
|
||||
search_term = f"%{search}%"
|
||||
query = query.filter(
|
||||
(RosterUnit.id.ilike(search_term)) |
|
||||
(RosterUnit.ip_address.ilike(search_term)) |
|
||||
(RosterUnit.hardware_model.ilike(search_term)) |
|
||||
(RosterUnit.phone_number.ilike(search_term)) |
|
||||
(RosterUnit.location.ilike(search_term))
|
||||
)
|
||||
|
||||
modems = query.order_by(
|
||||
RosterUnit.retired.asc(),
|
||||
RosterUnit.deployed.desc(),
|
||||
RosterUnit.id.asc()
|
||||
).all()
|
||||
|
||||
# Get paired device info for each modem
|
||||
paired_devices = {}
|
||||
devices_with_modems = db.query(RosterUnit).filter(
|
||||
RosterUnit.deployed_with_modem_id.isnot(None),
|
||||
RosterUnit.retired == False
|
||||
).all()
|
||||
for device in devices_with_modems:
|
||||
if device.deployed_with_modem_id:
|
||||
paired_devices[device.deployed_with_modem_id] = {
|
||||
"id": device.id,
|
||||
"device_type": device.device_type,
|
||||
"deployed": device.deployed
|
||||
}
|
||||
|
||||
# Annotate modems with paired device info
|
||||
modem_list = []
|
||||
for modem in modems:
|
||||
paired = paired_devices.get(modem.id)
|
||||
|
||||
# Determine status category
|
||||
if modem.retired:
|
||||
status = "retired"
|
||||
elif not modem.deployed:
|
||||
status = "benched"
|
||||
elif paired:
|
||||
status = "in_use"
|
||||
else:
|
||||
status = "spare"
|
||||
|
||||
# Apply filter if specified
|
||||
if filter_status and status != filter_status:
|
||||
continue
|
||||
|
||||
modem_list.append({
|
||||
"id": modem.id,
|
||||
"ip_address": modem.ip_address,
|
||||
"phone_number": modem.phone_number,
|
||||
"hardware_model": modem.hardware_model,
|
||||
"deployed": modem.deployed,
|
||||
"retired": modem.retired,
|
||||
"location": modem.location,
|
||||
"project_id": modem.project_id,
|
||||
"paired_device": paired,
|
||||
"status": status
|
||||
})
|
||||
|
||||
return templates.TemplateResponse("partials/modem_list.html", {
|
||||
"request": request,
|
||||
"modems": modem_list
|
||||
})
|
||||
|
||||
|
||||
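The retired/benched/in_use/spare branching above (and the matching counts in the stats endpoint) reduces to three booleans per modem. A standalone restatement for a quick check (the helper name is ours, not part of the router):

```python
def modem_status(deployed: bool, retired: bool, paired: bool) -> str:
    # Mirrors the dashboard's categories: retired wins, then benched for
    # undeployed units; deployed units are "in_use" when paired, else "spare".
    if retired:
        return "retired"
    if not deployed:
        return "benched"
    return "in_use" if paired else "spare"

print(modem_status(deployed=True, retired=False, paired=False))  # spare
```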
@router.get("/{modem_id}/paired-device")
|
||||
async def get_paired_device(modem_id: str, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Get the device (SLM/seismograph) that is paired with this modem.
|
||||
Returns JSON with device info or null if not paired.
|
||||
"""
|
||||
# Check modem exists
|
||||
modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
|
||||
if not modem:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} not found"}
|
||||
|
||||
# Find device paired with this modem
|
||||
device = db.query(RosterUnit).filter(
|
||||
RosterUnit.deployed_with_modem_id == modem_id,
|
||||
RosterUnit.retired == False
|
||||
).first()
|
||||
|
||||
if device:
|
||||
return {
|
||||
"paired": True,
|
||||
"device": {
|
||||
"id": device.id,
|
||||
"device_type": device.device_type,
|
||||
"deployed": device.deployed,
|
||||
"project_id": device.project_id,
|
||||
"location": device.location or device.address
|
||||
}
|
||||
}
|
||||
|
||||
return {"paired": False, "device": None}
|
||||
|
||||
|
||||
@router.get("/{modem_id}/paired-device-html", response_class=HTMLResponse)
|
||||
async def get_paired_device_html(modem_id: str, request: Request, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Get HTML partial showing the device paired with this modem.
|
||||
Used by unit_detail.html for modems.
|
||||
"""
|
||||
# Check modem exists
|
||||
modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
|
||||
if not modem:
|
||||
return HTMLResponse('<p class="text-red-500">Modem not found</p>')
|
||||
|
||||
# Find device paired with this modem
|
||||
device = db.query(RosterUnit).filter(
|
||||
RosterUnit.deployed_with_modem_id == modem_id,
|
||||
RosterUnit.retired == False
|
||||
).first()
|
||||
|
||||
return templates.TemplateResponse("partials/modem_paired_device.html", {
|
||||
"request": request,
|
||||
"modem_id": modem_id,
|
||||
"device": device
|
||||
})
|
||||
|
||||
|
||||
@router.get("/{modem_id}/ping")
|
||||
async def ping_modem(modem_id: str, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Test modem connectivity with a simple ping.
|
||||
Returns response time and connection status.
|
||||
"""
|
||||
# Get modem from database
|
||||
modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
|
||||
|
||||
if not modem:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} not found"}
|
||||
|
||||
if not modem.ip_address:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} has no IP address configured"}
|
||||
|
||||
try:
|
||||
# Ping the modem (1 packet, 2 second timeout)
|
||||
start_time = time.time()
|
||||
result = subprocess.run(
|
||||
["ping", "-c", "1", "-W", "2", modem.ip_address],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
timeout=3
|
||||
)
|
||||
response_time = int((time.time() - start_time) * 1000) # Convert to milliseconds
|
||||
|
||||
if result.returncode == 0:
|
||||
return {
|
||||
"status": "success",
|
||||
"modem_id": modem_id,
|
||||
"ip_address": modem.ip_address,
|
||||
"response_time_ms": response_time,
|
||||
"message": "Modem is responding"
|
||||
}
|
||||
else:
|
||||
return {
|
||||
"status": "error",
|
||||
"modem_id": modem_id,
|
||||
"ip_address": modem.ip_address,
|
||||
"detail": "Modem not responding to ping"
|
||||
}
|
||||
|
||||
except subprocess.TimeoutExpired:
|
||||
return {
|
||||
"status": "error",
|
||||
"modem_id": modem_id,
|
||||
"ip_address": modem.ip_address,
|
||||
"detail": "Ping timeout"
|
||||
}
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to ping modem {modem_id}: {e}")
|
||||
return {
|
||||
"status": "error",
|
||||
"modem_id": modem_id,
|
||||
"detail": str(e)
|
||||
}
|
||||
|
||||
|
||||
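One portability note on the ping invocation above: the `-W` flag is a timeout in seconds for Linux iputils ping, but the BSD/macOS ping interprets `-W` in milliseconds. If the app ever runs outside a Linux container, a platform-aware command builder could look like this (a sketch; the helper name is ours, not part of the router):

```python
import sys

def ping_command(ip: str, timeout_s: int = 2) -> list[str]:
    # iputils ping on Linux takes -W in seconds; BSD/macOS ping takes
    # -W in milliseconds, so scale accordingly.
    if sys.platform == "darwin":
        return ["ping", "-c", "1", "-W", str(timeout_s * 1000), ip]
    return ["ping", "-c", "1", "-W", str(timeout_s), ip]

print(ping_command("192.0.2.10"))
```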
@router.get("/{modem_id}/diagnostics")
|
||||
async def get_modem_diagnostics(modem_id: str, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Get modem diagnostics (signal strength, data usage, uptime).
|
||||
|
||||
Currently returns placeholders. When ModemManager is available,
|
||||
this endpoint will query it for real diagnostics.
|
||||
"""
|
||||
modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
|
||||
if not modem:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} not found"}
|
||||
|
||||
# TODO: Query ModemManager backend when available
|
||||
return {
|
||||
"status": "unavailable",
|
||||
"message": "ModemManager integration not yet available",
|
||||
"modem_id": modem_id,
|
||||
"signal_strength_dbm": None,
|
||||
"data_usage_mb": None,
|
||||
"uptime_seconds": None,
|
||||
"carrier": None,
|
||||
"connection_type": None # LTE, 5G, etc.
|
||||
}
|
||||
|
||||
|
||||
@router.get("/{modem_id}/pairable-devices")
|
||||
async def get_pairable_devices(
|
||||
modem_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
search: str = Query(None),
|
||||
hide_paired: bool = Query(True)
|
||||
):
|
||||
"""
|
||||
Get list of devices (seismographs and SLMs) that can be paired with this modem.
|
||||
Used by the device picker modal in unit_detail.html.
|
||||
"""
|
||||
# Check modem exists
|
||||
modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
|
||||
if not modem:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} not found"}
|
||||
|
||||
# Query seismographs and SLMs
|
||||
query = db.query(RosterUnit).filter(
|
||||
RosterUnit.device_type.in_(["seismograph", "sound_level_meter"]),
|
||||
RosterUnit.retired == False
|
||||
)
|
||||
|
||||
# Filter by search term if provided
|
||||
if search:
|
||||
search_term = f"%{search}%"
|
||||
query = query.filter(
|
||||
(RosterUnit.id.ilike(search_term)) |
|
||||
(RosterUnit.project_id.ilike(search_term)) |
|
||||
(RosterUnit.location.ilike(search_term)) |
|
||||
(RosterUnit.address.ilike(search_term)) |
|
||||
(RosterUnit.note.ilike(search_term))
|
||||
)
|
||||
|
||||
devices = query.order_by(
|
||||
RosterUnit.deployed.desc(),
|
||||
RosterUnit.device_type.asc(),
|
||||
RosterUnit.id.asc()
|
||||
).all()
|
||||
|
||||
# Build device list
|
||||
device_list = []
|
||||
for device in devices:
|
||||
# Skip already paired devices if hide_paired is True
|
||||
is_paired_to_other = (
|
||||
device.deployed_with_modem_id is not None and
|
||||
device.deployed_with_modem_id != modem_id
|
||||
)
|
||||
is_paired_to_this = device.deployed_with_modem_id == modem_id
|
||||
|
||||
if hide_paired and is_paired_to_other:
|
||||
continue
|
||||
|
||||
device_list.append({
|
||||
"id": device.id,
|
||||
"device_type": device.device_type,
|
||||
"deployed": device.deployed,
|
||||
"project_id": device.project_id,
|
||||
"location": device.location or device.address,
|
||||
"note": device.note,
|
||||
"paired_modem_id": device.deployed_with_modem_id,
|
||||
"is_paired_to_this": is_paired_to_this,
|
||||
"is_paired_to_other": is_paired_to_other
|
||||
})
|
||||
|
||||
return {"devices": device_list, "modem_id": modem_id}
|
||||
|
||||
|
||||
@router.post("/{modem_id}/pair")
|
||||
async def pair_device_to_modem(
|
||||
modem_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
device_id: str = Query(..., description="ID of the device to pair")
|
||||
):
|
||||
"""
|
||||
Pair a device (seismograph or SLM) to this modem.
|
||||
Updates the device's deployed_with_modem_id field.
|
||||
"""
|
||||
# Check modem exists
|
||||
modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
|
||||
if not modem:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} not found"}
|
||||
|
||||
# Find the device
|
||||
device = db.query(RosterUnit).filter(
|
||||
RosterUnit.id == device_id,
|
||||
RosterUnit.device_type.in_(["seismograph", "sound_level_meter"]),
|
||||
RosterUnit.retired == False
|
||||
).first()
|
||||
if not device:
|
||||
return {"status": "error", "detail": f"Device {device_id} not found"}
|
||||
|
||||
# Unpair any device currently paired to this modem
|
||||
currently_paired = db.query(RosterUnit).filter(
|
||||
RosterUnit.deployed_with_modem_id == modem_id
|
||||
).all()
|
||||
for paired_device in currently_paired:
|
||||
paired_device.deployed_with_modem_id = None
|
||||
|
||||
# Pair the new device
|
||||
device.deployed_with_modem_id = modem_id
|
||||
db.commit()
|
||||
|
||||
return {
|
||||
"status": "success",
|
||||
"modem_id": modem_id,
|
||||
"device_id": device_id,
|
||||
"message": f"Device {device_id} paired to modem {modem_id}"
|
||||
}
|
||||
|
||||
|
||||
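The pair endpoint keeps at most one device per modem by clearing existing pairings before writing the new one. The same invariant can be sketched as a pure function over a device-to-modem map (illustrative only; the IDs and helper are ours, not part of the router):

```python
def pair(pairings: dict[str, str], modem_id: str, device_id: str) -> dict[str, str]:
    # Drop whatever device was on this modem, then pair the new one —
    # the same order of operations as the endpoint's DB updates.
    updated = {dev: mod for dev, mod in pairings.items() if mod != modem_id}
    updated[device_id] = modem_id
    return updated

print(pair({"SLM-101": "MDM-7"}, "MDM-7", "NRL-3"))
```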
@router.post("/{modem_id}/unpair")
|
||||
async def unpair_device_from_modem(modem_id: str, db: Session = Depends(get_db)):
|
||||
"""
|
||||
Unpair any device currently paired to this modem.
|
||||
"""
|
||||
# Check modem exists
|
||||
modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
|
||||
if not modem:
|
||||
return {"status": "error", "detail": f"Modem {modem_id} not found"}
|
||||
|
||||
# Find and unpair device
|
||||
device = db.query(RosterUnit).filter(
|
||||
RosterUnit.deployed_with_modem_id == modem_id
|
||||
).first()
|
||||
|
||||
if device:
|
||||
old_device_id = device.id
|
||||
device.deployed_with_modem_id = None
|
||||
db.commit()
|
||||
return {
|
||||
"status": "success",
|
||||
"modem_id": modem_id,
|
||||
"unpaired_device_id": old_device_id,
|
||||
"message": f"Device {old_device_id} unpaired from modem {modem_id}"
|
||||
}
|
||||
|
||||
return {
|
||||
"status": "success",
|
||||
"modem_id": modem_id,
|
||||
"message": "No device was paired to this modem"
|
||||
}
|
||||
@@ -6,7 +6,6 @@ and unit assignments within projects.
 """
 
 from fastapi import APIRouter, Request, Depends, HTTPException, Query
-from fastapi.templating import Jinja2Templates
 from fastapi.responses import HTMLResponse, JSONResponse
 from sqlalchemy.orm import Session
 from sqlalchemy import and_, or_
@@ -15,6 +14,12 @@ from typing import Optional
 import uuid
 import json
 
+from fastapi import UploadFile, File
+import zipfile
+import hashlib
+import io
+from pathlib import Path
+
 from backend.database import get_db
 from backend.models import (
     Project,
@@ -22,11 +27,45 @@ from backend.models import (
     MonitoringLocation,
     UnitAssignment,
     RosterUnit,
-    RecordingSession,
+    MonitoringSession,
+    DataFile,
 )
+from backend.templates_config import templates
 
 router = APIRouter(prefix="/api/projects/{project_id}", tags=["project-locations"])
-templates = Jinja2Templates(directory="templates")
+
+
+# ============================================================================
+# Session period helpers
+# ============================================================================
+
+def _derive_period_type(dt: datetime) -> str:
+    """
+    Classify a session start time into one of four period types.
+    Night = 22:00–07:00, Day = 07:00–22:00.
+    Weekend = Saturday (5) or Sunday (6).
+    """
+    is_weekend = dt.weekday() >= 5
+    is_night = dt.hour >= 22 or dt.hour < 7
+    if is_weekend:
+        return "weekend_night" if is_night else "weekend_day"
+    return "weekday_night" if is_night else "weekday_day"
+
+
+def _build_session_label(dt: datetime, location_name: str, period_type: str) -> str:
+    """Build a human-readable session label, e.g. 'NRL-1 — Sun 2/23 — Night'.
+    Uses started_at date as-is; user can correct period_type in the wizard.
+    """
+    day_abbr = dt.strftime("%a")
+    date_str = f"{dt.month}/{dt.day}"
+    period_str = {
+        "weekday_day": "Day",
+        "weekday_night": "Night",
+        "weekend_day": "Day",
+        "weekend_night": "Night",
+    }.get(period_type, "")
+    parts = [p for p in [location_name, f"{day_abbr} {date_str}", period_str] if p]
+    return " — ".join(parts)
+
+
+# ============================================================================
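The period helpers added in this hunk are pure date logic and can be exercised standalone (the function name is shortened here for illustration):

```python
from datetime import datetime

def derive_period_type(dt: datetime) -> str:
    # Night = 22:00–07:00; weekend = Saturday/Sunday (weekday() 5 or 6),
    # matching _derive_period_type in the hunk above.
    is_weekend = dt.weekday() >= 5
    is_night = dt.hour >= 22 or dt.hour < 7
    if is_weekend:
        return "weekend_night" if is_night else "weekend_day"
    return "weekday_night" if is_night else "weekday_day"

# Sunday 2026-02-22 at 23:30 falls in the weekend-night bucket.
print(derive_period_type(datetime(2026, 2, 22, 23, 30)))
```

Note the night window wraps midnight, so an early-morning weekday session (say 06:59) still classifies as `weekday_night`.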
@@ -71,8 +110,8 @@ async def get_project_locations(
         if assignment:
             assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
 
-        # Count recording sessions
-        session_count = db.query(RecordingSession).filter_by(
+        # Count monitoring sessions
+        session_count = db.query(MonitoringSession).filter_by(
             location_id=location.id
         ).count()
@@ -90,6 +129,40 @@ async def get_project_locations(
         })
 
 
+@router.get("/locations-json")
+async def get_project_locations_json(
+    project_id: str,
+    db: Session = Depends(get_db),
+    location_type: Optional[str] = Query(None),
+):
+    """
+    Get all monitoring locations for a project as JSON.
+    Used by the schedule modal to populate location dropdown.
+    """
+    project = db.query(Project).filter_by(id=project_id).first()
+    if not project:
+        raise HTTPException(status_code=404, detail="Project not found")
+
+    query = db.query(MonitoringLocation).filter_by(project_id=project_id)
+
+    if location_type:
+        query = query.filter_by(location_type=location_type)
+
+    locations = query.order_by(MonitoringLocation.name).all()
+
+    return [
+        {
+            "id": loc.id,
+            "name": loc.name,
+            "location_type": loc.location_type,
+            "description": loc.description,
+            "address": loc.address,
+            "coordinates": loc.coordinates,
+        }
+        for loc in locations
+    ]
+
+
 @router.post("/locations/create")
 async def create_location(
     project_id: str,
@@ -273,7 +346,7 @@ async def assign_unit_to_location(
         raise HTTPException(status_code=404, detail="Unit not found")
 
     # Check device type matches location type
-    expected_device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"
+    expected_device_type = "slm" if location.location_type == "sound" else "seismograph"
     if unit.device_type != expected_device_type:
         raise HTTPException(
             status_code=400,
@@ -337,19 +410,19 @@ async def unassign_unit(
     if not assignment:
         raise HTTPException(status_code=404, detail="Assignment not found")
 
-    # Check if there are active recording sessions
-    active_sessions = db.query(RecordingSession).filter(
+    # Check if there are active monitoring sessions
+    active_sessions = db.query(MonitoringSession).filter(
         and_(
-            RecordingSession.location_id == assignment.location_id,
-            RecordingSession.unit_id == assignment.unit_id,
-            RecordingSession.status == "recording",
+            MonitoringSession.location_id == assignment.location_id,
+            MonitoringSession.unit_id == assignment.unit_id,
+            MonitoringSession.status == "recording",
         )
     ).count()
 
     if active_sessions > 0:
         raise HTTPException(
             status_code=400,
-            detail="Cannot unassign unit with active recording sessions. Stop recording first.",
+            detail="Cannot unassign unit with active monitoring sessions. Stop monitoring first.",
         )
 
     assignment.status = "completed"
@@ -375,7 +448,7 @@ async def get_available_units(
     Filters by device type matching the location type.
     """
     # Determine required device type
-    required_device_type = "sound_level_meter" if location_type == "sound" else "seismograph"
+    required_device_type = "slm" if location_type == "sound" else "seismograph"
 
     # Get all units of the required type that are deployed and not retired
     all_units = db.query(RosterUnit).filter(
@@ -397,7 +470,7 @@ async def get_available_units(
             "id": unit.id,
             "device_type": unit.device_type,
             "location": unit.address or unit.location,
-            "model": unit.slm_model if unit.device_type == "sound_level_meter" else unit.unit_type,
+            "model": unit.slm_model if unit.device_type == "slm" else unit.unit_type,
         }
         for unit in all_units
         if unit.id not in assigned_unit_ids
@@ -418,14 +491,12 @@ async def get_nrl_sessions(
     db: Session = Depends(get_db),
 ):
     """
-    Get recording sessions for a specific NRL.
+    Get monitoring sessions for a specific NRL.
     Returns HTML partial with session list.
     """
-    from backend.models import RecordingSession, RosterUnit
-
-    sessions = db.query(RecordingSession).filter_by(
+    sessions = db.query(MonitoringSession).filter_by(
         location_id=location_id
-    ).order_by(RecordingSession.started_at.desc()).all()
+    ).order_by(MonitoringSession.started_at.desc()).all()
 
     # Enrich with unit details
     sessions_data = []
@@ -458,14 +529,12 @@ async def get_nrl_files(
     Get data files for a specific NRL.
     Returns HTML partial with file list.
     """
-    from backend.models import DataFile, RecordingSession
-
-    # Join DataFile with RecordingSession to filter by location_id
+    # Join DataFile with MonitoringSession to filter by location_id
     files = db.query(DataFile).join(
-        RecordingSession,
-        DataFile.session_id == RecordingSession.id
+        MonitoringSession,
+        DataFile.session_id == MonitoringSession.id
     ).filter(
-        RecordingSession.location_id == location_id
+        MonitoringSession.location_id == location_id
     ).order_by(DataFile.created_at.desc()).all()
 
     # Enrich with session details
@@ -473,7 +542,7 @@ async def get_nrl_files(
     for file in files:
         session = None
         if file.session_id:
-            session = db.query(RecordingSession).filter_by(id=file.session_id).first()
+            session = db.query(MonitoringSession).filter_by(id=file.session_id).first()
 
         files_data.append({
             "file": file,
@@ -486,3 +555,310 @@ async def get_nrl_files(
         "location_id": location_id,
         "files": files_data,
     })
+
+
+# ============================================================================
+# Manual SD Card Data Upload
+# ============================================================================
+
+def _parse_rnh(content: bytes) -> dict:
+    """
+    Parse a Rion .rnh metadata file (INI-style with [Section] headers).
+    Returns a dict of key metadata fields.
+    """
+    result = {}
+    try:
+        text = content.decode("utf-8", errors="replace")
+        for line in text.splitlines():
+            line = line.strip()
+            if not line or line.startswith("["):
+                continue
+            if "," in line:
+                key, _, value = line.partition(",")
+                key = key.strip()
+                value = value.strip()
+                if key == "Serial Number":
+                    result["serial_number"] = value
+                elif key == "Store Name":
+                    result["store_name"] = value
+                elif key == "Index Number":
+                    result["index_number"] = value
+                elif key == "Measurement Start Time":
+                    result["start_time_str"] = value
+                elif key == "Measurement Stop Time":
+                    result["stop_time_str"] = value
+                elif key == "Total Measurement Time":
+                    result["total_time_str"] = value
+    except Exception:
+        pass
+    return result
+
+
+def _parse_rnh_datetime(s: str):
+    """Parse RNH datetime string: '2026/02/17 19:00:19' -> datetime"""
+    from datetime import datetime
+    if not s:
+        return None
+    try:
+        return datetime.strptime(s.strip(), "%Y/%m/%d %H:%M:%S")
+    except Exception:
+        return None
+
+
+def _classify_file(filename: str) -> str:
+    """Classify a file by name into a DataFile file_type."""
+    name = filename.lower()
+    if name.endswith(".rnh"):
+        return "log"
+    if name.endswith(".rnd"):
+        return "measurement"
+    if name.endswith(".zip"):
+        return "archive"
+    return "data"
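`_parse_rnh` above is a simple comma-keyed scan over INI-style lines. A trimmed standalone restatement against an invented sample (the bytes below are fabricated for illustration, not real SD-card output):

```python
def parse_rnh(content: bytes) -> dict:
    # Same parse as _parse_rnh above, reduced to two fields for brevity:
    # skip [Section] headers, split each remaining line on its first comma.
    result = {}
    for line in content.decode("utf-8", errors="replace").splitlines():
        line = line.strip()
        if not line or line.startswith("[") or "," not in line:
            continue
        key, _, value = line.partition(",")
        key, value = key.strip(), value.strip()
        if key == "Serial Number":
            result["serial_number"] = value
        elif key == "Measurement Start Time":
            result["start_time_str"] = value
    return result

sample = b"[Setting]\r\nSerial Number,00123456\r\nMeasurement Start Time,2026/02/17 19:00:19\r\n"
print(parse_rnh(sample))
```

`str.partition` splits only on the first comma, so values containing commas (like a timestamped field) survive intact.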
@router.post("/nrl/{location_id}/upload-data")
|
||||
async def upload_nrl_data(
|
||||
project_id: str,
|
||||
location_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
files: list[UploadFile] = File(...),
|
||||
):
|
||||
"""
|
||||
Manually upload SD card data for an offline NRL.
|
||||
|
||||
Accepts either:
|
||||
- A single .zip file (the Auto_#### folder zipped) — auto-extracted
|
||||
- Multiple .rnd / .rnh files selected directly from the SD card folder
|
||||
|
||||
Creates a MonitoringSession from .rnh metadata and DataFile records
|
||||
for each measurement file. No unit assignment required.
|
||||
"""
|
||||
from datetime import datetime
|
||||
|
||||
# Verify project and location exist
|
||||
location = db.query(MonitoringLocation).filter_by(
|
||||
id=location_id, project_id=project_id
|
||||
).first()
|
||||
if not location:
|
||||
raise HTTPException(status_code=404, detail="Location not found")
|
||||
|
||||
# --- Step 1: Normalize to (filename, bytes) list ---
|
||||
file_entries: list[tuple[str, bytes]] = []
|
||||
|
||||
if len(files) == 1 and files[0].filename.lower().endswith(".zip"):
|
||||
raw = await files[0].read()
|
||||
try:
|
||||
with zipfile.ZipFile(io.BytesIO(raw)) as zf:
|
||||
for info in zf.infolist():
|
||||
if info.is_dir():
|
||||
continue
|
||||
name = Path(info.filename).name # strip folder path
|
||||
if not name:
|
||||
continue
|
||||
file_entries.append((name, zf.read(info)))
|
||||
except zipfile.BadZipFile:
|
||||
raise HTTPException(status_code=400, detail="Uploaded file is not a valid ZIP archive.")
|
||||
else:
|
||||
for uf in files:
|
||||
data = await uf.read()
|
||||
file_entries.append((uf.filename, data))
|
||||
|
||||
if not file_entries:
|
||||
raise HTTPException(status_code=400, detail="No usable files found in upload.")
|
||||
|
||||
# --- Step 1b: Filter to only relevant files ---
|
||||
# Keep: .rnh (metadata) and measurement .rnd files
|
||||
# NL-43 generates two .rnd types: _Leq_ (15-min averages, wanted) and _Lp_ (1-sec granular, skip)
|
||||
# AU2 (NL-23/older Rion) generates a single Au2_####.rnd per session — always keep those
|
||||
# Drop: _Lp_ .rnd, .xlsx, .mp3, and anything else
|
||||
def _is_wanted(fname: str) -> bool:
|
||||
n = fname.lower()
|
||||
if n.endswith(".rnh"):
|
||||
return True
|
||||
if n.endswith(".rnd"):
|
||||
if "_leq_" in n: # NL-43 Leq file
|
||||
return True
|
||||
if n.startswith("au2_"): # AU2 format (NL-23) — always Leq equivalent
|
||||
return True
|
||||
if "_lp" not in n and "_leq_" not in n:
|
||||
            # Unknown .rnd format — include it so we don't silently drop data
            return True
        return False

    file_entries = [(fname, fbytes) for fname, fbytes in file_entries if _is_wanted(fname)]

    if not file_entries:
        raise HTTPException(status_code=400, detail="No usable .rnd or .rnh files found. Expected NL-43 _Leq_ files or AU2 format .rnd files.")

    # --- Step 2: Find and parse .rnh metadata ---
    rnh_meta = {}
    for fname, fbytes in file_entries:
        if fname.lower().endswith(".rnh"):
            rnh_meta = _parse_rnh(fbytes)
            break

    started_at = _parse_rnh_datetime(rnh_meta.get("start_time_str")) or datetime.utcnow()
    stopped_at = _parse_rnh_datetime(rnh_meta.get("stop_time_str"))
    duration_seconds = None
    if started_at and stopped_at:
        duration_seconds = int((stopped_at - started_at).total_seconds())

    store_name = rnh_meta.get("store_name", "")
    serial_number = rnh_meta.get("serial_number", "")
    index_number = rnh_meta.get("index_number", "")

    # --- Step 3: Create MonitoringSession ---
    period_type = _derive_period_type(started_at) if started_at else None
    session_label = _build_session_label(started_at, location.name, period_type) if started_at else None

    session_id = str(uuid.uuid4())
    monitoring_session = MonitoringSession(
        id=session_id,
        project_id=project_id,
        location_id=location_id,
        unit_id=None,
        session_type="sound",
        started_at=started_at,
        stopped_at=stopped_at,
        duration_seconds=duration_seconds,
        status="completed",
        session_label=session_label,
        period_type=period_type,
        session_metadata=json.dumps({
            "source": "manual_upload",
            "store_name": store_name,
            "serial_number": serial_number,
            "index_number": index_number,
        }),
    )
    db.add(monitoring_session)
    db.commit()
    db.refresh(monitoring_session)

    # --- Step 4: Write files to disk and create DataFile records ---
    output_dir = Path("data/Projects") / project_id / session_id
    output_dir.mkdir(parents=True, exist_ok=True)

    leq_count = 0
    lp_count = 0
    metadata_count = 0
    files_imported = 0

    for fname, fbytes in file_entries:
        file_type = _classify_file(fname)
        fname_lower = fname.lower()

        # Track counts for summary
        if fname_lower.endswith(".rnd"):
            if "_leq_" in fname_lower:
                leq_count += 1
            elif "_lp" in fname_lower:
                lp_count += 1
        elif fname_lower.endswith(".rnh"):
            metadata_count += 1

        # Write to disk
        dest = output_dir / fname
        dest.write_bytes(fbytes)

        # Compute checksum
        checksum = hashlib.sha256(fbytes).hexdigest()

        # Store relative path from data/ dir
        rel_path = str(dest.relative_to("data"))

        data_file = DataFile(
            id=str(uuid.uuid4()),
            session_id=session_id,
            file_path=rel_path,
            file_type=file_type,
            file_size_bytes=len(fbytes),
            downloaded_at=datetime.utcnow(),
            checksum=checksum,
            file_metadata=json.dumps({
                "source": "manual_upload",
                "original_filename": fname,
                "store_name": store_name,
            }),
        )
        db.add(data_file)
        files_imported += 1

    db.commit()

    return {
        "success": True,
        "session_id": session_id,
        "files_imported": files_imported,
        "leq_files": leq_count,
        "lp_files": lp_count,
        "metadata_files": metadata_count,
        "store_name": store_name,
        "started_at": started_at.isoformat() if started_at else None,
        "stopped_at": stopped_at.isoformat() if stopped_at else None,
    }


# ============================================================================
# NRL Live Status (connected NRLs only)
# ============================================================================

@router.get("/nrl/{location_id}/live-status", response_class=HTMLResponse)
async def get_nrl_live_status(
    project_id: str,
    location_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Fetch cached status from SLMM for the unit assigned to this NRL and
    return a compact HTML status card. Used in the NRL overview tab for
    connected NRLs. Gracefully shows an offline message if SLMM is unreachable.
    """
    import os
    import httpx

    # Find the assigned unit
    assignment = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.status == "active",
        )
    ).first()

    if not assignment:
        return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
            "request": request,
            "status": None,
            "error": "No unit assigned",
        })

    unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
    if not unit:
        return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
            "request": request,
            "status": None,
            "error": "Assigned unit not found",
        })

    slmm_base = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
    status_data = None
    error_msg = None

    try:
        async with httpx.AsyncClient(timeout=5.0) as client:
            resp = await client.get(f"{slmm_base}/api/nl43/{unit.id}/status")
            if resp.status_code == 200:
                status_data = resp.json()
            else:
                error_msg = f"SLMM returned {resp.status_code}"
    except Exception:
        error_msg = "SLMM unreachable"

    return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
        "request": request,
        "unit": unit,
        "status": status_data,
        "error": error_msg,
    })
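The per-file counting in Step 4 above keys entirely off filename conventions (`_Leq_` and `_Lp` markers in `.rnd` names, `.rnh` for metadata). That logic can be sketched as a small standalone helper — `classify_upload` is a hypothetical name for illustration, not the router's actual `_classify_file`:

```python
from typing import Optional

def classify_upload(filename: str) -> Optional[str]:
    """Classify an uploaded NL-43 file the same way the import summary counts it."""
    name = filename.lower()
    if name.endswith(".rnd"):
        if "_leq_" in name:
            return "leq"        # interval Leq data
        if "_lp" in name:
            return "lp"         # sound-pressure time history
        return "other_rnd"      # unknown .rnd kept so data isn't silently dropped
    if name.endswith(".rnh"):
        return "metadata"       # header file with start/stop times, serial number
    return None                 # not an NL-43 file

# The summary counters fall out of a single pass over the upload
files = ["ST_Leq_0001.rnd", "ST_Lp_0001.rnd", "ST_0001.rnh", "readme.txt"]
counts = {}
for f in files:
    kind = classify_upload(f)
    if kind:
        counts[kind] = counts.get(kind, 0) + 1
```

Matching is case-insensitive, mirroring the router's `fname.lower()` before each check.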
backend/routers/recurring_schedules.py (new file, 522 lines)
@@ -0,0 +1,522 @@
"""
|
||||
Recurring Schedules Router
|
||||
|
||||
API endpoints for managing recurring monitoring schedules.
|
||||
"""
|
||||
|
||||
from fastapi import APIRouter, Request, Depends, HTTPException, Query
|
||||
from fastapi.responses import HTMLResponse, JSONResponse
|
||||
from sqlalchemy.orm import Session
|
||||
from typing import Optional
|
||||
from datetime import datetime
|
||||
import json
|
||||
|
||||
from backend.database import get_db
|
||||
from backend.models import RecurringSchedule, MonitoringLocation, Project, RosterUnit
|
||||
from backend.services.recurring_schedule_service import get_recurring_schedule_service
|
||||
from backend.templates_config import templates
|
||||
|
||||
router = APIRouter(prefix="/api/projects/{project_id}/recurring-schedules", tags=["recurring-schedules"])
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# List and Get
|
||||
# ============================================================================
|
||||
|
||||
@router.get("/")
|
||||
async def list_recurring_schedules(
|
||||
project_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
enabled_only: bool = Query(False),
|
||||
):
|
||||
"""
|
||||
List all recurring schedules for a project.
|
||||
"""
|
||||
project = db.query(Project).filter_by(id=project_id).first()
|
||||
if not project:
|
||||
raise HTTPException(status_code=404, detail="Project not found")
|
||||
|
||||
query = db.query(RecurringSchedule).filter_by(project_id=project_id)
|
||||
if enabled_only:
|
||||
query = query.filter_by(enabled=True)
|
||||
|
||||
schedules = query.order_by(RecurringSchedule.created_at.desc()).all()
|
||||
|
||||
return {
|
||||
"schedules": [
|
||||
{
|
||||
"id": s.id,
|
||||
"name": s.name,
|
||||
"schedule_type": s.schedule_type,
|
||||
"device_type": s.device_type,
|
||||
"location_id": s.location_id,
|
||||
"unit_id": s.unit_id,
|
||||
"enabled": s.enabled,
|
||||
"weekly_pattern": json.loads(s.weekly_pattern) if s.weekly_pattern else None,
|
||||
"interval_type": s.interval_type,
|
||||
"cycle_time": s.cycle_time,
|
||||
"include_download": s.include_download,
|
||||
"timezone": s.timezone,
|
||||
"next_occurrence": s.next_occurrence.isoformat() if s.next_occurrence else None,
|
||||
"last_generated_at": s.last_generated_at.isoformat() if s.last_generated_at else None,
|
||||
"created_at": s.created_at.isoformat() if s.created_at else None,
|
||||
}
|
||||
for s in schedules
|
||||
],
|
||||
"count": len(schedules),
|
||||
}
|
||||
|
||||
|
||||
@router.get("/{schedule_id}")
|
||||
async def get_recurring_schedule(
|
||||
project_id: str,
|
||||
schedule_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Get a specific recurring schedule.
|
||||
"""
|
||||
schedule = db.query(RecurringSchedule).filter_by(
|
||||
id=schedule_id,
|
||||
project_id=project_id,
|
||||
).first()
|
||||
|
||||
if not schedule:
|
||||
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||
|
||||
# Get related location and unit info
|
||||
location = db.query(MonitoringLocation).filter_by(id=schedule.location_id).first()
|
||||
unit = None
|
||||
if schedule.unit_id:
|
||||
unit = db.query(RosterUnit).filter_by(id=schedule.unit_id).first()
|
||||
|
||||
return {
|
||||
"id": schedule.id,
|
||||
"name": schedule.name,
|
||||
"schedule_type": schedule.schedule_type,
|
||||
"device_type": schedule.device_type,
|
||||
"location_id": schedule.location_id,
|
||||
"location_name": location.name if location else None,
|
||||
"unit_id": schedule.unit_id,
|
||||
"unit_name": unit.id if unit else None,
|
||||
"enabled": schedule.enabled,
|
||||
"weekly_pattern": json.loads(schedule.weekly_pattern) if schedule.weekly_pattern else None,
|
||||
"interval_type": schedule.interval_type,
|
||||
"cycle_time": schedule.cycle_time,
|
||||
"include_download": schedule.include_download,
|
||||
"timezone": schedule.timezone,
|
||||
"next_occurrence": schedule.next_occurrence.isoformat() if schedule.next_occurrence else None,
|
||||
"last_generated_at": schedule.last_generated_at.isoformat() if schedule.last_generated_at else None,
|
||||
"created_at": schedule.created_at.isoformat() if schedule.created_at else None,
|
||||
"updated_at": schedule.updated_at.isoformat() if schedule.updated_at else None,
|
||||
}
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Create
|
||||
# ============================================================================
|
||||
|
||||
@router.post("/")
|
||||
async def create_recurring_schedule(
|
||||
project_id: str,
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Create recurring schedules for one or more locations.
|
||||
|
||||
Body for weekly_calendar (supports multiple locations):
|
||||
{
|
||||
"name": "Weeknight Monitoring",
|
||||
"schedule_type": "weekly_calendar",
|
||||
"location_ids": ["uuid1", "uuid2"], // Array of location IDs
|
||||
"weekly_pattern": {
|
||||
"monday": {"enabled": true, "start": "19:00", "end": "07:00"},
|
||||
"tuesday": {"enabled": false},
|
||||
...
|
||||
},
|
||||
"include_download": true,
|
||||
"auto_increment_index": true,
|
||||
"timezone": "America/New_York"
|
||||
}
|
||||
|
||||
Body for simple_interval (supports multiple locations):
|
||||
{
|
||||
"name": "24/7 Continuous",
|
||||
"schedule_type": "simple_interval",
|
||||
"location_ids": ["uuid1", "uuid2"], // Array of location IDs
|
||||
"interval_type": "daily",
|
||||
"cycle_time": "00:00",
|
||||
"include_download": true,
|
||||
"auto_increment_index": true,
|
||||
"timezone": "America/New_York"
|
||||
}
|
||||
|
||||
Legacy single location support (backwards compatible):
|
||||
{
|
||||
"name": "...",
|
||||
"location_id": "uuid", // Single location ID
|
||||
...
|
||||
}
|
||||
"""
|
||||
project = db.query(Project).filter_by(id=project_id).first()
|
||||
if not project:
|
||||
raise HTTPException(status_code=404, detail="Project not found")
|
||||
|
||||
data = await request.json()
|
||||
|
||||
# Support both location_ids (array) and location_id (single) for backwards compatibility
|
||||
location_ids = data.get("location_ids", [])
|
||||
if not location_ids and data.get("location_id"):
|
||||
location_ids = [data.get("location_id")]
|
||||
|
||||
if not location_ids:
|
||||
raise HTTPException(status_code=400, detail="At least one location is required")
|
||||
|
||||
# Validate all locations exist
|
||||
locations = db.query(MonitoringLocation).filter(
|
||||
MonitoringLocation.id.in_(location_ids),
|
||||
MonitoringLocation.project_id == project_id,
|
||||
).all()
|
||||
|
||||
if len(locations) != len(location_ids):
|
||||
raise HTTPException(status_code=404, detail="One or more locations not found")
|
||||
|
||||
service = get_recurring_schedule_service(db)
|
||||
created_schedules = []
|
||||
base_name = data.get("name", "Unnamed Schedule")
|
||||
|
||||
# Parse one-off datetime fields if applicable
|
||||
one_off_start = None
|
||||
one_off_end = None
|
||||
if data.get("schedule_type") == "one_off":
|
||||
from zoneinfo import ZoneInfo
|
||||
|
||||
tz = ZoneInfo(data.get("timezone", "America/New_York"))
|
||||
|
||||
start_dt_str = data.get("start_datetime")
|
||||
end_dt_str = data.get("end_datetime")
|
||||
|
||||
if not start_dt_str or not end_dt_str:
|
||||
raise HTTPException(status_code=400, detail="One-off schedules require start and end date/time")
|
||||
|
||||
try:
|
||||
start_local = datetime.fromisoformat(start_dt_str).replace(tzinfo=tz)
|
||||
end_local = datetime.fromisoformat(end_dt_str).replace(tzinfo=tz)
|
||||
except ValueError:
|
||||
raise HTTPException(status_code=400, detail="Invalid datetime format")
|
||||
|
||||
duration = end_local - start_local
|
||||
if duration.total_seconds() < 900:
|
||||
raise HTTPException(status_code=400, detail="Duration must be at least 15 minutes")
|
||||
if duration.total_seconds() > 86400:
|
||||
raise HTTPException(status_code=400, detail="Duration cannot exceed 24 hours")
|
||||
|
||||
from datetime import timezone as dt_timezone
|
||||
now_local = datetime.now(tz)
|
||||
if start_local <= now_local:
|
||||
raise HTTPException(status_code=400, detail="Start time must be in the future")
|
||||
|
||||
# Convert to UTC for storage
|
||||
one_off_start = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
|
||||
one_off_end = end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
|
||||
|
||||
# Create a schedule for each location
|
||||
for location in locations:
|
||||
# Determine device type from location
|
||||
device_type = "slm" if location.location_type == "sound" else "seismograph"
|
||||
|
||||
# Append location name if multiple locations
|
||||
schedule_name = f"{base_name} - {location.name}" if len(locations) > 1 else base_name
|
||||
|
||||
schedule = service.create_schedule(
|
||||
project_id=project_id,
|
||||
location_id=location.id,
|
||||
name=schedule_name,
|
||||
schedule_type=data.get("schedule_type", "weekly_calendar"),
|
||||
device_type=device_type,
|
||||
unit_id=data.get("unit_id"),
|
||||
weekly_pattern=data.get("weekly_pattern"),
|
||||
interval_type=data.get("interval_type"),
|
||||
cycle_time=data.get("cycle_time"),
|
||||
include_download=data.get("include_download", True),
|
||||
auto_increment_index=data.get("auto_increment_index", True),
|
||||
timezone=data.get("timezone", "America/New_York"),
|
||||
start_datetime=one_off_start,
|
||||
end_datetime=one_off_end,
|
||||
)
|
||||
|
||||
# Generate actions immediately so they appear right away
|
||||
generated_actions = service.generate_actions_for_schedule(schedule, horizon_days=7)
|
||||
|
||||
created_schedules.append({
|
||||
"schedule_id": schedule.id,
|
||||
"location_id": location.id,
|
||||
"location_name": location.name,
|
||||
"actions_generated": len(generated_actions),
|
||||
})
|
||||
|
||||
total_actions = sum(s.get("actions_generated", 0) for s in created_schedules)
|
||||
|
||||
return JSONResponse({
|
||||
"success": True,
|
||||
"schedules": created_schedules,
|
||||
"count": len(created_schedules),
|
||||
"actions_generated": total_actions,
|
||||
"message": f"Created {len(created_schedules)} recurring schedule(s) with {total_actions} upcoming actions",
|
||||
})
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Update
|
||||
# ============================================================================
|
||||
|
||||
@router.put("/{schedule_id}")
|
||||
async def update_recurring_schedule(
|
||||
project_id: str,
|
||||
schedule_id: str,
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Update a recurring schedule.
|
||||
"""
|
||||
schedule = db.query(RecurringSchedule).filter_by(
|
||||
id=schedule_id,
|
||||
project_id=project_id,
|
||||
).first()
|
||||
|
||||
if not schedule:
|
||||
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||
|
||||
data = await request.json()
|
||||
service = get_recurring_schedule_service(db)
|
||||
|
||||
# Build update kwargs
|
||||
update_kwargs = {}
|
||||
for field in ["name", "weekly_pattern", "interval_type", "cycle_time",
|
||||
"include_download", "auto_increment_index", "timezone", "unit_id"]:
|
||||
if field in data:
|
||||
update_kwargs[field] = data[field]
|
||||
|
||||
updated = service.update_schedule(schedule_id, **update_kwargs)
|
||||
|
||||
return {
|
||||
"success": True,
|
||||
"schedule_id": updated.id,
|
||||
"message": "Schedule updated successfully",
|
||||
}
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Delete
|
||||
# ============================================================================
|
||||
|
||||
@router.delete("/{schedule_id}")
|
||||
async def delete_recurring_schedule(
|
||||
project_id: str,
|
||||
schedule_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Delete a recurring schedule.
|
||||
"""
|
||||
service = get_recurring_schedule_service(db)
|
||||
deleted = service.delete_schedule(schedule_id)
|
||||
|
||||
if not deleted:
|
||||
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||
|
||||
return {
|
||||
"success": True,
|
||||
"message": "Schedule deleted successfully",
|
||||
}
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Enable/Disable
|
||||
# ============================================================================
|
||||
|
||||
@router.post("/{schedule_id}/enable")
|
||||
async def enable_schedule(
|
||||
project_id: str,
|
||||
schedule_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Enable a disabled schedule.
|
||||
"""
|
||||
service = get_recurring_schedule_service(db)
|
||||
schedule = service.enable_schedule(schedule_id)
|
||||
|
||||
if not schedule:
|
||||
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||
|
||||
return {
|
||||
"success": True,
|
||||
"schedule_id": schedule.id,
|
||||
"enabled": schedule.enabled,
|
||||
"message": "Schedule enabled",
|
||||
}
|
||||
|
||||
|
||||
@router.post("/{schedule_id}/disable")
|
||||
async def disable_schedule(
|
||||
project_id: str,
|
||||
schedule_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Disable a schedule and cancel all its pending actions.
|
||||
"""
|
||||
service = get_recurring_schedule_service(db)
|
||||
|
||||
# Count pending actions before disabling (for response message)
|
||||
from sqlalchemy import and_
|
||||
from backend.models import ScheduledAction
|
||||
pending_count = db.query(ScheduledAction).filter(
|
||||
and_(
|
||||
ScheduledAction.execution_status == "pending",
|
||||
ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%'),
|
||||
)
|
||||
).count()
|
||||
|
||||
schedule = service.disable_schedule(schedule_id)
|
||||
|
||||
if not schedule:
|
||||
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||
|
||||
message = "Schedule disabled"
|
||||
if pending_count > 0:
|
||||
message += f" and {pending_count} pending action(s) cancelled"
|
||||
|
||||
return {
|
||||
"success": True,
|
||||
"schedule_id": schedule.id,
|
||||
"enabled": schedule.enabled,
|
||||
"cancelled_actions": pending_count,
|
||||
"message": message,
|
||||
}
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Preview Generated Actions
|
||||
# ============================================================================
|
||||
|
||||
@router.post("/{schedule_id}/generate-preview")
|
||||
async def preview_generated_actions(
|
||||
project_id: str,
|
||||
schedule_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
days: int = Query(7, ge=1, le=30),
|
||||
):
|
||||
"""
|
||||
Preview what actions would be generated without saving them.
|
||||
"""
|
||||
schedule = db.query(RecurringSchedule).filter_by(
|
||||
id=schedule_id,
|
||||
project_id=project_id,
|
||||
).first()
|
||||
|
||||
if not schedule:
|
||||
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||
|
||||
service = get_recurring_schedule_service(db)
|
||||
actions = service.generate_actions_for_schedule(
|
||||
schedule,
|
||||
horizon_days=days,
|
||||
preview_only=True,
|
||||
)
|
||||
|
||||
return {
|
||||
"schedule_id": schedule_id,
|
||||
"schedule_name": schedule.name,
|
||||
"preview_days": days,
|
||||
"actions": [
|
||||
{
|
||||
"action_type": a.action_type,
|
||||
"scheduled_time": a.scheduled_time.isoformat(),
|
||||
"notes": a.notes,
|
||||
}
|
||||
for a in actions
|
||||
],
|
||||
"action_count": len(actions),
|
||||
}
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Manual Generation Trigger
|
||||
# ============================================================================
|
||||
|
||||
@router.post("/{schedule_id}/generate")
|
||||
async def generate_actions_now(
|
||||
project_id: str,
|
||||
schedule_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
days: int = Query(7, ge=1, le=30),
|
||||
):
|
||||
"""
|
||||
Manually trigger action generation for a schedule.
|
||||
"""
|
||||
schedule = db.query(RecurringSchedule).filter_by(
|
||||
id=schedule_id,
|
||||
project_id=project_id,
|
||||
).first()
|
||||
|
||||
if not schedule:
|
||||
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||
|
||||
if not schedule.enabled:
|
||||
raise HTTPException(status_code=400, detail="Schedule is disabled")
|
||||
|
||||
service = get_recurring_schedule_service(db)
|
||||
actions = service.generate_actions_for_schedule(
|
||||
schedule,
|
||||
horizon_days=days,
|
||||
preview_only=False,
|
||||
)
|
||||
|
||||
return {
|
||||
"success": True,
|
||||
"schedule_id": schedule_id,
|
||||
"generated_count": len(actions),
|
||||
"message": f"Generated {len(actions)} scheduled actions",
|
||||
}
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# HTML Partials
|
||||
# ============================================================================
|
||||
|
||||
@router.get("/partials/list", response_class=HTMLResponse)
|
||||
async def get_schedule_list_partial(
|
||||
project_id: str,
|
||||
request: Request,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Return HTML partial for schedule list.
|
||||
"""
|
||||
project = db.query(Project).filter_by(id=project_id).first()
|
||||
project_status = project.status if project else "active"
|
||||
|
||||
schedules = db.query(RecurringSchedule).filter_by(
|
||||
project_id=project_id
|
||||
).order_by(RecurringSchedule.created_at.desc()).all()
|
||||
|
||||
# Enrich with location info
|
||||
schedule_data = []
|
||||
for s in schedules:
|
||||
location = db.query(MonitoringLocation).filter_by(id=s.location_id).first()
|
||||
schedule_data.append({
|
||||
"schedule": s,
|
||||
"location": location,
|
||||
"pattern": json.loads(s.weekly_pattern) if s.weekly_pattern else None,
|
||||
})
|
||||
|
||||
return templates.TemplateResponse("partials/projects/recurring_schedule_list.html", {
|
||||
"request": request,
|
||||
"project_id": project_id,
|
||||
"schedules": schedule_data,
|
||||
"project_status": project_status,
|
||||
})
|
||||
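The one-off branch in `create_recurring_schedule` above enforces a 15-minute floor, a 24-hour cap, and stores naive UTC datetimes. Those checks can be exercised in isolation; `validate_one_off` below is a hypothetical standalone sketch of the same bounds (the future-start check is omitted since it depends on the current clock):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def validate_one_off(start_str: str, end_str: str, tz_name: str = "America/New_York"):
    """Mirror the router's one-off checks: parse local times, enforce bounds, return naive UTC."""
    tz = ZoneInfo(tz_name)
    start_local = datetime.fromisoformat(start_str).replace(tzinfo=tz)
    end_local = datetime.fromisoformat(end_str).replace(tzinfo=tz)

    seconds = (end_local - start_local).total_seconds()
    if seconds < 900:
        raise ValueError("Duration must be at least 15 minutes")
    if seconds > 86400:
        raise ValueError("Duration cannot exceed 24 hours")

    # Store naive UTC, matching the router's .astimezone(UTC).replace(tzinfo=None)
    start_utc = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
    end_utc = end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
    return start_utc, end_utc
```

A 19:00-to-07:00 overnight window in New York (UTC-5 in winter) comes back as 00:00 to 12:00 UTC the next day.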
backend/routers/report_templates.py (new file, 187 lines)
@@ -0,0 +1,187 @@
"""
|
||||
Report Templates Router
|
||||
|
||||
CRUD operations for report template management.
|
||||
Templates store time filter presets and report configuration for reuse.
|
||||
"""
|
||||
|
||||
from fastapi import APIRouter, Depends, HTTPException
|
||||
from fastapi.responses import JSONResponse
|
||||
from sqlalchemy.orm import Session
|
||||
from datetime import datetime
|
||||
from typing import Optional
|
||||
import uuid
|
||||
|
||||
from backend.database import get_db
|
||||
from backend.models import ReportTemplate
|
||||
|
||||
router = APIRouter(prefix="/api/report-templates", tags=["report-templates"])
|
||||
|
||||
|
||||
@router.get("")
|
||||
async def list_templates(
|
||||
project_id: Optional[str] = None,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
List all report templates.
|
||||
Optionally filter by project_id (includes global templates with project_id=None).
|
||||
"""
|
||||
query = db.query(ReportTemplate)
|
||||
|
||||
if project_id:
|
||||
# Include global templates (project_id=None) AND project-specific templates
|
||||
query = query.filter(
|
||||
(ReportTemplate.project_id == None) | (ReportTemplate.project_id == project_id)
|
||||
)
|
||||
|
||||
templates = query.order_by(ReportTemplate.name).all()
|
||||
|
||||
return [
|
||||
{
|
||||
"id": t.id,
|
||||
"name": t.name,
|
||||
"project_id": t.project_id,
|
||||
"report_title": t.report_title,
|
||||
"start_time": t.start_time,
|
||||
"end_time": t.end_time,
|
||||
"start_date": t.start_date,
|
||||
"end_date": t.end_date,
|
||||
"created_at": t.created_at.isoformat() if t.created_at else None,
|
||||
"updated_at": t.updated_at.isoformat() if t.updated_at else None,
|
||||
}
|
||||
for t in templates
|
||||
]
|
||||
|
||||
|
||||
@router.post("")
|
||||
async def create_template(
|
||||
data: dict,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""
|
||||
Create a new report template.
|
||||
|
||||
Request body:
|
||||
- name: Template name (required)
|
||||
- project_id: Optional project ID for project-specific template
|
||||
- report_title: Default report title
|
||||
- start_time: Start time filter (HH:MM format)
|
||||
- end_time: End time filter (HH:MM format)
|
||||
- start_date: Start date filter (YYYY-MM-DD format)
|
||||
- end_date: End date filter (YYYY-MM-DD format)
|
||||
"""
|
||||
name = data.get("name")
|
||||
if not name:
|
||||
raise HTTPException(status_code=400, detail="Template name is required")
|
||||
|
||||
template = ReportTemplate(
|
||||
id=str(uuid.uuid4()),
|
||||
name=name,
|
||||
project_id=data.get("project_id"),
|
||||
report_title=data.get("report_title", "Background Noise Study"),
|
||||
start_time=data.get("start_time"),
|
||||
end_time=data.get("end_time"),
|
||||
start_date=data.get("start_date"),
|
||||
end_date=data.get("end_date"),
|
||||
)
|
||||
|
||||
db.add(template)
|
||||
db.commit()
|
||||
db.refresh(template)
|
||||
|
||||
return {
|
||||
"id": template.id,
|
||||
"name": template.name,
|
||||
"project_id": template.project_id,
|
||||
"report_title": template.report_title,
|
||||
"start_time": template.start_time,
|
||||
"end_time": template.end_time,
|
||||
"start_date": template.start_date,
|
||||
"end_date": template.end_date,
|
||||
"created_at": template.created_at.isoformat() if template.created_at else None,
|
||||
}
|
||||
|
||||
|
||||
@router.get("/{template_id}")
|
||||
async def get_template(
|
||||
template_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Get a specific report template by ID."""
|
||||
template = db.query(ReportTemplate).filter_by(id=template_id).first()
|
||||
if not template:
|
||||
raise HTTPException(status_code=404, detail="Template not found")
|
||||
|
||||
return {
|
||||
"id": template.id,
|
||||
"name": template.name,
|
||||
"project_id": template.project_id,
|
||||
"report_title": template.report_title,
|
||||
"start_time": template.start_time,
|
||||
"end_time": template.end_time,
|
||||
"start_date": template.start_date,
|
||||
"end_date": template.end_date,
|
||||
"created_at": template.created_at.isoformat() if template.created_at else None,
|
||||
"updated_at": template.updated_at.isoformat() if template.updated_at else None,
|
||||
}
|
||||
|
||||
|
||||
@router.put("/{template_id}")
|
||||
async def update_template(
|
||||
template_id: str,
|
||||
data: dict,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Update an existing report template."""
|
||||
template = db.query(ReportTemplate).filter_by(id=template_id).first()
|
||||
if not template:
|
||||
raise HTTPException(status_code=404, detail="Template not found")
|
||||
|
||||
# Update fields if provided
|
||||
if "name" in data:
|
||||
template.name = data["name"]
|
||||
if "project_id" in data:
|
||||
template.project_id = data["project_id"]
|
||||
if "report_title" in data:
|
||||
template.report_title = data["report_title"]
|
||||
if "start_time" in data:
|
||||
template.start_time = data["start_time"]
|
||||
if "end_time" in data:
|
||||
template.end_time = data["end_time"]
|
||||
if "start_date" in data:
|
||||
template.start_date = data["start_date"]
|
||||
if "end_date" in data:
|
||||
template.end_date = data["end_date"]
|
||||
|
||||
template.updated_at = datetime.utcnow()
|
||||
db.commit()
|
||||
db.refresh(template)
|
||||
|
||||
return {
|
||||
"id": template.id,
|
||||
"name": template.name,
|
||||
"project_id": template.project_id,
|
||||
"report_title": template.report_title,
|
||||
"start_time": template.start_time,
|
||||
"end_time": template.end_time,
|
||||
"start_date": template.start_date,
|
||||
"end_date": template.end_date,
|
||||
"updated_at": template.updated_at.isoformat() if template.updated_at else None,
|
||||
}
|
||||
|
||||
|
||||
@router.delete("/{template_id}")
|
||||
async def delete_template(
|
||||
template_id: str,
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Delete a report template."""
|
||||
template = db.query(ReportTemplate).filter_by(id=template_id).first()
|
||||
if not template:
|
||||
raise HTTPException(status_code=404, detail="Template not found")
|
||||
|
||||
db.delete(template)
|
||||
db.commit()
|
||||
|
||||
return JSONResponse({"status": "success", "message": "Template deleted"})
|
||||
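The `list_templates` filter above returns global templates (NULL `project_id`) alongside a project's own templates. The same visibility rule over plain dicts, as a sketch with hypothetical data:

```python
def visible_templates(templates, project_id=None):
    """Templates visible for a project: global (project_id None) plus that project's own."""
    if project_id is None:
        return list(templates)  # no filter requested: everything
    return [t for t in templates
            if t["project_id"] is None or t["project_id"] == project_id]

rows = [
    {"name": "Default Night", "project_id": None},   # global preset
    {"name": "Site A Day", "project_id": "a"},
    {"name": "Site B Day", "project_id": "b"},
]
```

Project "a" sees the global preset plus its own row, but never project "b"'s.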
@@ -2,20 +2,32 @@ from fastapi import APIRouter, Depends
 from sqlalchemy.orm import Session
 from datetime import datetime, timedelta
 from typing import Dict, Any
+import asyncio
+import logging
 import random
 
 from backend.database import get_db
 from backend.services.snapshot import emit_status_snapshot
+from backend.services.slm_status_sync import sync_slm_status_to_emitters
 
 router = APIRouter(prefix="/api", tags=["roster"])
+logger = logging.getLogger(__name__)
 
 
 @router.get("/status-snapshot")
-def get_status_snapshot(db: Session = Depends(get_db)):
+async def get_status_snapshot(db: Session = Depends(get_db)):
     """
     Calls emit_status_snapshot() to get current fleet status.
-    This will be replaced with real Series3 emitter logic later.
+    Syncs SLM status from SLMM before generating snapshot.
     """
+    # Sync SLM status from SLMM (with timeout to prevent blocking)
+    try:
+        await asyncio.wait_for(sync_slm_status_to_emitters(), timeout=2.0)
+    except asyncio.TimeoutError:
+        logger.warning("SLM status sync timed out, using cached data")
+    except Exception as e:
+        logger.warning(f"SLM status sync failed: {e}")
+
     return emit_status_snapshot()
 
 
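The `asyncio.wait_for` guard in the hunk above degrades to cached data instead of letting a slow SLMM block the snapshot endpoint. The pattern in isolation, with a hypothetical `sync_with_fallback` wrapper:

```python
import asyncio

async def sync_with_fallback(sync_coro_factory, timeout=2.0):
    """Run a sync step, but never let it block the caller past `timeout` seconds."""
    try:
        await asyncio.wait_for(sync_coro_factory(), timeout=timeout)
        return "synced"
    except asyncio.TimeoutError:
        return "timed_out"   # caller falls back to cached data
    except Exception:
        return "failed"      # any sync error is non-fatal

async def slow_sync():
    await asyncio.sleep(5)   # stands in for an unreachable SLMM

async def fast_sync():
    await asyncio.sleep(0)

print(asyncio.run(sync_with_fallback(slow_sync, timeout=0.05)))  # timed_out
```

`wait_for` cancels the pending sync when the deadline passes, so the snapshot is always served within the budget.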
@@ -1,7 +1,7 @@
-from fastapi import APIRouter, Depends, HTTPException, Form, UploadFile, File, Request
+from fastapi import APIRouter, Depends, HTTPException, Form, UploadFile, File, Request, Query
 from fastapi.exceptions import RequestValidationError
 from sqlalchemy.orm import Session
-from datetime import datetime, date
+from datetime import datetime, date, timedelta
 import csv
 import io
 import logging
@@ -9,11 +9,20 @@ import httpx
 import os
 
 from backend.database import get_db
-from backend.models import RosterUnit, IgnoredUnit, Emitter, UnitHistory
+from backend.models import RosterUnit, IgnoredUnit, Emitter, UnitHistory, UserPreferences
 from backend.services.slmm_sync import sync_slm_to_slmm
 
 router = APIRouter(prefix="/api/roster", tags=["roster-edit"])
 logger = logging.getLogger(__name__)
 
+
+def get_calibration_interval(db: Session) -> int:
+    """Get calibration interval from user preferences, default 365 days."""
+    prefs = db.query(UserPreferences).first()
+    if prefs and prefs.calibration_interval_days:
+        return prefs.calibration_interval_days
+    return 365
+
+
 # SLMM backend URL for syncing device configs to cache
 SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
@@ -150,6 +159,8 @@ async def add_roster_unit(
     ip_address: str = Form(None),
     phone_number: str = Form(None),
     hardware_model: str = Form(None),
+    deployment_type: str = Form(None),  # "seismograph" | "slm" - what device type modem is deployed with
+    deployed_with_unit_id: str = Form(None),  # ID of seismograph/SLM this modem is deployed with
     # Sound Level Meter-specific fields
     slm_host: str = Form(None),
     slm_tcp_port: str = Form(None),
@@ -182,8 +193,13 @@ async def add_roster_unit(
|
||||
except ValueError:
|
||||
raise HTTPException(status_code=400, detail="Invalid last_calibrated date format. Use YYYY-MM-DD")
|
||||
|
||||
# Auto-calculate next_calibration_due from last_calibrated using calibration interval
|
||||
next_cal_date = None
|
||||
if next_calibration_due:
|
||||
if last_cal_date:
|
||||
cal_interval = get_calibration_interval(db)
|
||||
next_cal_date = last_cal_date + timedelta(days=cal_interval)
|
||||
elif next_calibration_due:
|
||||
# Fallback: allow explicit setting if no last_calibrated
|
||||
try:
|
||||
next_cal_date = datetime.strptime(next_calibration_due, "%Y-%m-%d").date()
|
||||
except ValueError:
|
||||
@@ -209,6 +225,7 @@ async def add_roster_unit(
|
||||
ip_address=ip_address if ip_address else None,
|
||||
phone_number=phone_number if phone_number else None,
|
||||
hardware_model=hardware_model if hardware_model else None,
|
||||
deployment_type=deployment_type if deployment_type else None,
|
||||
# Sound Level Meter-specific fields
|
||||
slm_host=slm_host if slm_host else None,
|
||||
slm_tcp_port=slm_tcp_port_int,
|
||||
@@ -219,11 +236,46 @@ async def add_roster_unit(
|
||||
slm_time_weighting=slm_time_weighting if slm_time_weighting else None,
|
||||
slm_measurement_range=slm_measurement_range if slm_measurement_range else None,
|
||||
)
|
||||
|
||||
# Auto-fill data from modem if pairing and fields are empty
|
||||
if deployed_with_modem_id:
|
||||
modem = db.query(RosterUnit).filter(
|
||||
RosterUnit.id == deployed_with_modem_id,
|
||||
RosterUnit.device_type == "modem"
|
||||
).first()
|
||||
if modem:
|
||||
if not unit.location and modem.location:
|
||||
unit.location = modem.location
|
||||
if not unit.address and modem.address:
|
||||
unit.address = modem.address
|
||||
if not unit.coordinates and modem.coordinates:
|
||||
unit.coordinates = modem.coordinates
|
||||
if not unit.project_id and modem.project_id:
|
||||
unit.project_id = modem.project_id
|
||||
if not unit.note and modem.note:
|
||||
unit.note = modem.note
|
||||
|
||||
# Bidirectional pairing sync for new units
|
||||
if device_type in ("seismograph", "slm") and deployed_with_modem_id:
|
||||
modem_to_update = db.query(RosterUnit).filter(
|
||||
RosterUnit.id == deployed_with_modem_id,
|
||||
RosterUnit.device_type == "modem"
|
||||
).first()
|
||||
if modem_to_update:
|
||||
# Clear old device's reference if modem was paired elsewhere
|
||||
if modem_to_update.deployed_with_unit_id and modem_to_update.deployed_with_unit_id != id:
|
||||
old_device = db.query(RosterUnit).filter(
|
||||
RosterUnit.id == modem_to_update.deployed_with_unit_id
|
||||
).first()
|
||||
if old_device and old_device.deployed_with_modem_id == deployed_with_modem_id:
|
||||
old_device.deployed_with_modem_id = None
|
||||
modem_to_update.deployed_with_unit_id = id
|
||||
|
||||
db.add(unit)
|
||||
db.commit()
|
||||
|
||||
# If sound level meter, sync config to SLMM cache
|
||||
if device_type == "sound_level_meter":
|
||||
if device_type == "slm":
|
||||
logger.info(f"Syncing SLM {id} config to SLMM cache...")
|
||||
result = await sync_slm_to_slmm_cache(
|
||||
unit_id=id,
|
||||
@@ -259,6 +311,145 @@ def get_modems_list(db: Session = Depends(get_db)):
|
||||
]
|
||||
|
||||
|
||||
@router.get("/search/modems")
|
||||
def search_modems(
|
||||
request: Request,
|
||||
q: str = Query("", description="Search term"),
|
||||
deployed_only: bool = Query(False, description="Only show deployed modems"),
|
||||
exclude_retired: bool = Query(True, description="Exclude retired modems"),
|
||||
limit: int = Query(10, le=50),
|
||||
db: Session = Depends(get_db)
|
||||
):
|
||||
"""
|
||||
Search modems by ID, IP address, or note. Returns HTML partial for HTMX dropdown.
|
||||
|
||||
Used by modem picker component to find modems to link with seismographs/SLMs.
|
||||
"""
|
||||
from fastapi.responses import HTMLResponse
|
||||
from fastapi.templating import Jinja2Templates
|
||||
|
||||
templates = Jinja2Templates(directory="templates")
|
||||
|
||||
query = db.query(RosterUnit).filter(RosterUnit.device_type == "modem")
|
||||
|
||||
if deployed_only:
|
||||
query = query.filter(RosterUnit.deployed == True)
|
||||
|
||||
if exclude_retired:
|
||||
query = query.filter(RosterUnit.retired == False)
|
||||
|
||||
# Search by ID, IP address, or note
|
||||
if q and q.strip():
|
||||
search_term = f"%{q.strip()}%"
|
||||
query = query.filter(
|
||||
(RosterUnit.id.ilike(search_term)) |
|
||||
(RosterUnit.ip_address.ilike(search_term)) |
|
||||
(RosterUnit.note.ilike(search_term))
|
||||
)
|
||||
|
||||
modems = query.order_by(RosterUnit.id).limit(limit).all()
|
||||
|
||||
# Build results
|
||||
results = []
|
||||
for modem in modems:
|
||||
# Build display text: ID - IP - Note (if available)
|
||||
display_parts = [modem.id]
|
||||
if modem.ip_address:
|
||||
display_parts.append(modem.ip_address)
|
||||
if modem.note:
|
||||
display_parts.append(modem.note)
|
||||
display = " - ".join(display_parts)
|
||||
|
||||
results.append({
|
||||
"id": modem.id,
|
||||
"ip_address": modem.ip_address or "",
|
||||
"phone_number": modem.phone_number or "",
|
||||
"note": modem.note or "",
|
||||
"deployed": modem.deployed,
|
||||
"display": display
|
||||
})
|
||||
|
||||
# Determine if we should show "no results" message
|
||||
show_empty = len(results) == 0 and q and q.strip()
|
||||
|
||||
return templates.TemplateResponse(
|
||||
"partials/modem_search_results.html",
|
||||
{
|
||||
"request": request,
|
||||
"modems": results,
|
||||
"query": q,
|
||||
"show_empty": show_empty
|
||||
}
|
||||
)
|
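The dropdown label in `search_modems` joins whichever of ID, IP address, and note are present. A standalone sketch of that `display` assembly, with plain dicts standing in for `RosterUnit` rows (field names match the endpoint; the sample values are invented):

```python
def modem_display(modem: dict) -> str:
    """Build display text: ID - IP - Note, keeping only the parts that exist."""
    parts = [modem["id"]]
    if modem.get("ip_address"):
        parts.append(modem["ip_address"])
    if modem.get("note"):
        parts.append(modem["note"])
    return " - ".join(parts)


print(modem_display({"id": "M-101", "ip_address": "10.0.0.7", "note": "yard gate"}))
print(modem_display({"id": "M-102"}))  # falls back to the bare ID
```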


@router.get("/search/units")
def search_units(
    request: Request,
    q: str = Query("", description="Search term"),
    device_type: str = Query(None, description="Filter by device type: seismograph, modem, slm"),
    deployed_only: bool = Query(False, description="Only show deployed units"),
    exclude_retired: bool = Query(True, description="Exclude retired units"),
    limit: int = Query(10, le=50),
    db: Session = Depends(get_db)
):
    """
    Search roster units by ID or note. Returns HTML partial for HTMX dropdown.

    Used by unit picker component to find seismographs/SLMs to link with modems.
    """
    from fastapi.responses import HTMLResponse
    from fastapi.templating import Jinja2Templates

    templates = Jinja2Templates(directory="templates")

    query = db.query(RosterUnit)

    # Apply filters
    if device_type:
        query = query.filter(RosterUnit.device_type == device_type)

    if deployed_only:
        query = query.filter(RosterUnit.deployed == True)

    if exclude_retired:
        query = query.filter(RosterUnit.retired == False)

    # Search by ID or note
    if q and q.strip():
        search_term = f"%{q.strip()}%"
        query = query.filter(
            (RosterUnit.id.ilike(search_term)) |
            (RosterUnit.note.ilike(search_term))
        )

    units = query.order_by(RosterUnit.id).limit(limit).all()

    # Build results
    results = []
    for unit in units:
        results.append({
            "id": unit.id,
            "device_type": unit.device_type or "seismograph",
            "note": unit.note or "",
            "deployed": unit.deployed,
            "display": f"{unit.id}" + (f" - {unit.note}" if unit.note else "")
        })

    # Determine if we should show "no results" message
    show_empty = len(results) == 0 and q and q.strip()

    return templates.TemplateResponse(
        "partials/unit_search_results.html",
        {
            "request": request,
            "units": results,
            "query": q,
            "show_empty": show_empty
        }
    )


@router.get("/{unit_id}")
def get_roster_unit(unit_id: str, db: Session = Depends(get_db)):
    """Get a single roster unit by ID"""
@@ -283,6 +474,8 @@ def get_roster_unit(unit_id: str, db: Session = Depends(get_db)):
        "ip_address": unit.ip_address or "",
        "phone_number": unit.phone_number or "",
        "hardware_model": unit.hardware_model or "",
        "deployment_type": unit.deployment_type or "",
        "deployed_with_unit_id": unit.deployed_with_unit_id or "",
        "slm_host": unit.slm_host or "",
        "slm_tcp_port": unit.slm_tcp_port or "",
        "slm_ftp_port": unit.slm_ftp_port or "",
@@ -295,7 +488,7 @@ def get_roster_unit(unit_id: str, db: Session = Depends(get_db)):


@router.post("/edit/{unit_id}")
def edit_roster_unit(
async def edit_roster_unit(
    unit_id: str,
    device_type: str = Form("seismograph"),
    unit_type: str = Form("series3"),
@@ -314,6 +507,8 @@ def edit_roster_unit(
    ip_address: str = Form(None),
    phone_number: str = Form(None),
    hardware_model: str = Form(None),
    deployment_type: str = Form(None),
    deployed_with_unit_id: str = Form(None),
    # Sound Level Meter-specific fields
    slm_host: str = Form(None),
    slm_tcp_port: str = Form(None),
@@ -323,6 +518,14 @@ def edit_roster_unit(
    slm_frequency_weighting: str = Form(None),
    slm_time_weighting: str = Form(None),
    slm_measurement_range: str = Form(None),
    # Cascade options - sync fields to paired device
    cascade_to_unit_id: str = Form(None),
    cascade_deployed: str = Form(None),
    cascade_retired: str = Form(None),
    cascade_project: str = Form(None),
    cascade_location: str = Form(None),
    cascade_coordinates: str = Form(None),
    cascade_note: str = Form(None),
    db: Session = Depends(get_db)
):
    unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
@@ -345,8 +548,13 @@ def edit_roster_unit(
        except ValueError:
            raise HTTPException(status_code=400, detail="Invalid last_calibrated date format. Use YYYY-MM-DD")

    # Auto-calculate next_calibration_due from last_calibrated using calibration interval
    next_cal_date = None
    if next_calibration_due:
    if last_cal_date:
        cal_interval = get_calibration_interval(db)
        next_cal_date = last_cal_date + timedelta(days=cal_interval)
    elif next_calibration_due:
        # Fallback: allow explicit setting if no last_calibrated
        try:
            next_cal_date = datetime.strptime(next_calibration_due, "%Y-%m-%d").date()
        except ValueError:
@@ -374,10 +582,31 @@ def edit_roster_unit(
    unit.next_calibration_due = next_cal_date
    unit.deployed_with_modem_id = deployed_with_modem_id if deployed_with_modem_id else None

    # Auto-fill data from modem if pairing and fields are empty
    if deployed_with_modem_id:
        modem = db.query(RosterUnit).filter(
            RosterUnit.id == deployed_with_modem_id,
            RosterUnit.device_type == "modem"
        ).first()
        if modem:
            # Only fill if the device field is empty
            if not unit.location and modem.location:
                unit.location = modem.location
            if not unit.address and modem.address:
                unit.address = modem.address
            if not unit.coordinates and modem.coordinates:
                unit.coordinates = modem.coordinates
            if not unit.project_id and modem.project_id:
                unit.project_id = modem.project_id
            if not unit.note and modem.note:
                unit.note = modem.note

    # Modem-specific fields
    unit.ip_address = ip_address if ip_address else None
    unit.phone_number = phone_number if phone_number else None
    unit.hardware_model = hardware_model if hardware_model else None
    unit.deployment_type = deployment_type if deployment_type else None
    unit.deployed_with_unit_id = deployed_with_unit_id if deployed_with_unit_id else None

    # Sound Level Meter-specific fields
    unit.slm_host = slm_host if slm_host else None
@@ -389,6 +618,51 @@ def edit_roster_unit(
    unit.slm_time_weighting = slm_time_weighting if slm_time_weighting else None
    unit.slm_measurement_range = slm_measurement_range if slm_measurement_range else None

    # Bidirectional pairing sync
    new_modem_id = deployed_with_modem_id if deployed_with_modem_id else None
    new_unit_pair_id = deployed_with_unit_id if deployed_with_unit_id else None

    # When a device (seismograph/SLM) sets deployed_with_modem_id, update modem's deployed_with_unit_id
    if device_type in ("seismograph", "slm"):
        # Clear old modem's reference if modem changed
        old_modem_id = db.query(RosterUnit.deployed_with_modem_id).filter(
            RosterUnit.id == unit_id
        ).scalar()
        # old_modem_id is already the new value at this point since we set it above,
        # but we need to check the *previous* modem. We already set it, so check if
        # there's a modem pointing to us that we're no longer paired with.
        if new_modem_id:
            modem_to_update = db.query(RosterUnit).filter(
                RosterUnit.id == new_modem_id,
                RosterUnit.device_type == "modem"
            ).first()
            if modem_to_update and modem_to_update.deployed_with_unit_id != unit_id:
                # Clear old device's reference to this modem if modem was paired elsewhere
                if modem_to_update.deployed_with_unit_id:
                    old_device = db.query(RosterUnit).filter(
                        RosterUnit.id == modem_to_update.deployed_with_unit_id
                    ).first()
                    if old_device and old_device.deployed_with_modem_id == new_modem_id:
                        old_device.deployed_with_modem_id = None
                modem_to_update.deployed_with_unit_id = unit_id

    # When a modem sets deployed_with_unit_id, update device's deployed_with_modem_id
    if device_type == "modem":
        if new_unit_pair_id:
            device_to_update = db.query(RosterUnit).filter(
                RosterUnit.id == new_unit_pair_id,
                RosterUnit.device_type.in_(["seismograph", "slm"])
            ).first()
            if device_to_update and device_to_update.deployed_with_modem_id != unit_id:
                # Clear old modem's reference to this device if device was paired elsewhere
                if device_to_update.deployed_with_modem_id:
                    old_modem = db.query(RosterUnit).filter(
                        RosterUnit.id == device_to_update.deployed_with_modem_id
                    ).first()
                    if old_modem and old_modem.deployed_with_unit_id == new_unit_pair_id:
                        old_modem.deployed_with_unit_id = None
                device_to_update.deployed_with_modem_id = unit_id
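The pairing sync above maintains a two-way invariant: a device points at its modem, the modem points back, and any stale third party is unlinked first. A toy in-memory sketch of the same re-pairing rule, with dicts standing in for SQLAlchemy rows (IDs are illustrative):

```python
def pair_device_to_modem(units: dict, device_id: str, modem_id: str) -> None:
    """Point device at modem and modem back at device, clearing any stale pairing."""
    modem = units[modem_id]
    old_device_id = modem.get("deployed_with_unit_id")
    # Clear the previously paired device's back-reference, if it still points here
    if old_device_id and old_device_id != device_id:
        old_device = units.get(old_device_id)
        if old_device and old_device.get("deployed_with_modem_id") == modem_id:
            old_device["deployed_with_modem_id"] = None
    units[device_id]["deployed_with_modem_id"] = modem_id
    modem["deployed_with_unit_id"] = device_id


units = {
    "S-1": {"deployed_with_modem_id": "M-9"},
    "S-2": {"deployed_with_modem_id": None},
    "M-9": {"deployed_with_unit_id": "S-1"},
}
pair_device_to_modem(units, "S-2", "M-9")  # re-pair M-9 from S-1 to S-2
print(units["S-1"]["deployed_with_modem_id"], units["M-9"]["deployed_with_unit_id"])
```

After the call, the old device S-1 is unlinked and both sides of the S-2/M-9 pairing agree, which is exactly the invariant the endpoint enforces through the database.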

    # Record history entries for changed fields
    if old_note != note:
        record_history(db, unit_id, "note_change", "note", old_note, note, "manual")
@@ -403,12 +677,93 @@ def edit_roster_unit(
        old_status_text = "retired" if old_retired else "active"
        record_history(db, unit_id, "retired_change", "retired", old_status_text, status_text, "manual")

    # Handle cascade to paired device
    cascaded_unit_id = None
    if cascade_to_unit_id and cascade_to_unit_id.strip():
        paired_unit = db.query(RosterUnit).filter(RosterUnit.id == cascade_to_unit_id).first()
        if paired_unit:
            cascaded_unit_id = paired_unit.id

            # Cascade deployed status
            if cascade_deployed in ['true', 'True', '1', 'yes']:
                old_paired_deployed = paired_unit.deployed
                paired_unit.deployed = deployed_bool
                paired_unit.last_updated = datetime.utcnow()
                if old_paired_deployed != deployed_bool:
                    status_text = "deployed" if deployed_bool else "benched"
                    old_status_text = "deployed" if old_paired_deployed else "benched"
                    record_history(db, paired_unit.id, "deployed_change", "deployed",
                                   old_status_text, status_text, f"cascade from {unit_id}")

            # Cascade retired status
            if cascade_retired in ['true', 'True', '1', 'yes']:
                old_paired_retired = paired_unit.retired
                paired_unit.retired = retired_bool
                paired_unit.last_updated = datetime.utcnow()
                if old_paired_retired != retired_bool:
                    status_text = "retired" if retired_bool else "active"
                    old_status_text = "retired" if old_paired_retired else "active"
                    record_history(db, paired_unit.id, "retired_change", "retired",
                                   old_status_text, status_text, f"cascade from {unit_id}")

            # Cascade project
            if cascade_project in ['true', 'True', '1', 'yes']:
                old_paired_project = paired_unit.project_id
                paired_unit.project_id = project_id
                paired_unit.last_updated = datetime.utcnow()
                if old_paired_project != project_id:
                    record_history(db, paired_unit.id, "project_change", "project_id",
                                   old_paired_project or "", project_id or "", f"cascade from {unit_id}")

            # Cascade address/location
            if cascade_location in ['true', 'True', '1', 'yes']:
                old_paired_address = paired_unit.address
                old_paired_location = paired_unit.location
                paired_unit.address = address
                paired_unit.location = location
                paired_unit.last_updated = datetime.utcnow()
                if old_paired_address != address:
                    record_history(db, paired_unit.id, "address_change", "address",
                                   old_paired_address or "", address or "", f"cascade from {unit_id}")

            # Cascade coordinates
            if cascade_coordinates in ['true', 'True', '1', 'yes']:
                old_paired_coords = paired_unit.coordinates
                paired_unit.coordinates = coordinates
                paired_unit.last_updated = datetime.utcnow()
                if old_paired_coords != coordinates:
                    record_history(db, paired_unit.id, "coordinates_change", "coordinates",
                                   old_paired_coords or "", coordinates or "", f"cascade from {unit_id}")

            # Cascade note
            if cascade_note in ['true', 'True', '1', 'yes']:
                old_paired_note = paired_unit.note
                paired_unit.note = note
                paired_unit.last_updated = datetime.utcnow()
                if old_paired_note != note:
                    record_history(db, paired_unit.id, "note_change", "note",
                                   old_paired_note or "", note or "", f"cascade from {unit_id}")

    db.commit()
    return {"message": "Unit updated", "id": unit_id, "device_type": device_type}

    # Sync SLM polling config to SLMM when deployed/retired status changes
    # This ensures benched units stop being polled
    if device_type == "slm" and (old_deployed != deployed_bool or old_retired != retired_bool):
        db.refresh(unit)  # Refresh to get committed values
        try:
            await sync_slm_to_slmm(unit)
            logger.info(f"Synced SLM {unit_id} polling config to SLMM (deployed={deployed_bool}, retired={retired_bool})")
        except Exception as e:
            logger.warning(f"Failed to sync SLM {unit_id} polling config to SLMM: {e}")

    response = {"message": "Unit updated", "id": unit_id, "device_type": device_type}
    if cascaded_unit_id:
        response["cascaded_to"] = cascaded_unit_id
    return response


@router.post("/set-deployed/{unit_id}")
def set_deployed(unit_id: str, deployed: bool = Form(...), db: Session = Depends(get_db)):
async def set_deployed(unit_id: str, deployed: bool = Form(...), db: Session = Depends(get_db)):
    unit = get_or_create_roster_unit(db, unit_id)
    old_deployed = unit.deployed
    unit.deployed = deployed
@@ -429,11 +784,21 @@ def set_deployed(unit_id: str, deployed: bool = Form(...), db: Session = Depends
        )

    db.commit()

    # Sync SLM polling config to SLMM when deployed status changes
    if unit.device_type == "slm" and old_deployed != deployed:
        db.refresh(unit)
        try:
            await sync_slm_to_slmm(unit)
            logger.info(f"Synced SLM {unit_id} polling config to SLMM (deployed={deployed})")
        except Exception as e:
            logger.warning(f"Failed to sync SLM {unit_id} polling config to SLMM: {e}")

    return {"message": "Updated", "id": unit_id, "deployed": deployed}


@router.post("/set-retired/{unit_id}")
def set_retired(unit_id: str, retired: bool = Form(...), db: Session = Depends(get_db)):
async def set_retired(unit_id: str, retired: bool = Form(...), db: Session = Depends(get_db)):
    unit = get_or_create_roster_unit(db, unit_id)
    old_retired = unit.retired
    unit.retired = retired
@@ -454,20 +819,34 @@ def set_retired(unit_id: str, retired: bool = Form(...), db: Session = Depends(g
        )

    db.commit()

    # Sync SLM polling config to SLMM when retired status changes
    if unit.device_type == "slm" and old_retired != retired:
        db.refresh(unit)
        try:
            await sync_slm_to_slmm(unit)
            logger.info(f"Synced SLM {unit_id} polling config to SLMM (retired={retired})")
        except Exception as e:
            logger.warning(f"Failed to sync SLM {unit_id} polling config to SLMM: {e}")

    return {"message": "Updated", "id": unit_id, "retired": retired}


@router.delete("/{unit_id}")
def delete_roster_unit(unit_id: str, db: Session = Depends(get_db)):
async def delete_roster_unit(unit_id: str, db: Session = Depends(get_db)):
    """
    Permanently delete a unit from the database.
    Checks roster, emitters, and ignored_units tables and deletes from any table where the unit exists.

    For SLM devices, also removes from SLMM to stop background polling.
    """
    deleted = False
    was_slm = False

    # Try to delete from roster table
    roster_unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
    if roster_unit:
        was_slm = roster_unit.device_type == "slm"
        db.delete(roster_unit)
        deleted = True

@@ -488,6 +867,19 @@ def delete_roster_unit(unit_id: str, db: Session = Depends(get_db)):
        raise HTTPException(status_code=404, detail="Unit not found")

    db.commit()

    # If it was an SLM, also delete from SLMM
    if was_slm:
        try:
            async with httpx.AsyncClient(timeout=5.0) as client:
                response = await client.delete(f"{SLMM_BASE_URL}/api/nl43/{unit_id}/config")
                if response.status_code in [200, 404]:
                    logger.info(f"Deleted SLM {unit_id} from SLMM")
                else:
                    logger.warning(f"Failed to delete SLM {unit_id} from SLMM: {response.status_code}")
        except Exception as e:
            logger.error(f"Error deleting SLM {unit_id} from SLMM: {e}")

    return {"message": "Unit deleted", "id": unit_id}


@@ -514,6 +906,37 @@ def set_note(unit_id: str, note: str = Form(""), db: Session = Depends(get_db)):
    return {"message": "Updated", "id": unit_id, "note": note}


def _parse_bool(value: str) -> bool:
    """Parse boolean from CSV string value."""
    return value.lower() in ('true', '1', 'yes') if value else False


def _parse_int(value: str) -> int | None:
    """Parse integer from CSV string value, return None if empty or invalid."""
    if not value or not value.strip():
        return None
    try:
        return int(value.strip())
    except ValueError:
        return None


def _parse_date(value: str) -> date | None:
    """Parse date from CSV string value (YYYY-MM-DD format)."""
    if not value or not value.strip():
        return None
    try:
        return datetime.strptime(value.strip(), '%Y-%m-%d').date()
    except ValueError:
        return None


def _get_csv_value(row: dict, key: str, default=None):
    """Get value from CSV row, return default if empty."""
    value = row.get(key, '').strip() if row.get(key) else ''
    return value if value else default


@router.post("/import-csv")
async def import_csv(
    file: UploadFile = File(...),
@@ -524,13 +947,40 @@ async def import_csv(
    Import roster units from CSV file.

    Expected CSV columns (unit_id is required, others are optional):
    - unit_id: Unique identifier for the unit
    - unit_type: Type of unit (default: "series3")
    - deployed: Boolean for deployment status (default: False)
    - retired: Boolean for retirement status (default: False)

    Common fields (all device types):
    - unit_id: Unique identifier for the unit (REQUIRED)
    - device_type: "seismograph", "modem", or "slm" (default: "seismograph")
    - unit_type: Sub-type (e.g., "series3", "series4" for seismographs)
    - deployed: Boolean (true/false/yes/no/1/0)
    - retired: Boolean
    - note: Notes about the unit
    - project_id: Project identifier
    - location: Location description
    - address: Street address
    - coordinates: GPS coordinates (lat;lon or lat,lon)

    Seismograph-specific:
    - last_calibrated: Date (YYYY-MM-DD)
    - next_calibration_due: Date (YYYY-MM-DD)
    - deployed_with_modem_id: ID of paired modem

    Modem-specific:
    - ip_address: Device IP address
    - phone_number: SIM card phone number
    - hardware_model: Hardware model (e.g., IBR900, RV55)

    SLM-specific:
    - slm_host: Device IP or hostname
    - slm_tcp_port: TCP control port (default 2255)
    - slm_ftp_port: FTP port (default 21)
    - slm_model: Device model (NL-43, NL-53)
    - slm_serial_number: Serial number
    - slm_frequency_weighting: A, C, or Z
    - slm_time_weighting: F (Fast), S (Slow), I (Impulse)
    - slm_measurement_range: e.g., "30-130 dB"

    Lines starting with # are treated as comments and skipped.

    Args:
        file: CSV file upload
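As a concrete illustration of the documented columns, a small import file and the comment-stripping pass the endpoint applies before parsing might look like this (unit IDs and values are invented for the example):

```python
import csv
import io

# Hypothetical import file exercising a few of the documented columns
CSV_TEXT = """\
# comment lines like this are skipped by the importer
unit_id,device_type,deployed,retired,slm_host,slm_tcp_port,last_calibrated
SLM-001,slm,true,false,192.168.1.50,2255,2024-03-01
BE-1234,seismograph,yes,no,,,2024-05-20
"""

# Same pre-filtering the endpoint applies before DictReader sees the text
lines = [line for line in CSV_TEXT.split('\n') if not line.strip().startswith('#')]
rows = list(csv.DictReader(io.StringIO('\n'.join(lines))))
print([r['unit_id'] for r in rows])
```

Note that `DictReader` yields every field as a string, which is why the import path funnels values through `_parse_bool`, `_parse_int`, and `_parse_date`.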
@@ -543,6 +993,46 @@ async def import_csv(
    # Read file content
    contents = await file.read()
    csv_text = contents.decode('utf-8')

    # Filter out comment lines (starting with #)
    lines = csv_text.split('\n')
    filtered_lines = [line for line in lines if not line.strip().startswith('#')]
    csv_text = '\n'.join(filtered_lines)

    # First pass: validate for duplicates and empty unit_ids
    csv_reader = csv.DictReader(io.StringIO(csv_text))
    seen_unit_ids = {}  # unit_id -> list of row numbers
    empty_unit_id_rows = []

    for row_num, row in enumerate(csv_reader, start=2):
        unit_id = row.get('unit_id', '').strip()
        if not unit_id:
            empty_unit_id_rows.append(row_num)
        else:
            if unit_id not in seen_unit_ids:
                seen_unit_ids[unit_id] = []
            seen_unit_ids[unit_id].append(row_num)

    # Check for validation errors
    validation_errors = []

    # Report empty unit_ids
    if empty_unit_id_rows:
        validation_errors.append(f"Empty unit_id on row(s): {', '.join(map(str, empty_unit_id_rows))}")

    # Report duplicates
    duplicates = {uid: rows for uid, rows in seen_unit_ids.items() if len(rows) > 1}
    if duplicates:
        for uid, rows in duplicates.items():
            validation_errors.append(f"Duplicate unit_id '{uid}' on rows: {', '.join(map(str, rows))}")

    if validation_errors:
        raise HTTPException(
            status_code=400,
            detail="CSV validation failed:\n" + "\n".join(validation_errors)
        )
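The first pass above only records row numbers; nothing touches the database until validation succeeds. The duplicate/empty-ID bookkeeping reduces to this sketch (row numbering starts at 2 to account for the header line):

```python
def validate_unit_ids(rows: list) -> list:
    """Collect validation errors for empty and duplicate unit_ids."""
    seen, empty, errors = {}, [], []
    for row_num, row in enumerate(rows, start=2):  # row 1 is the CSV header
        unit_id = (row.get('unit_id') or '').strip()
        if not unit_id:
            empty.append(row_num)
        else:
            seen.setdefault(unit_id, []).append(row_num)
    if empty:
        errors.append(f"Empty unit_id on row(s): {', '.join(map(str, empty))}")
    for uid, nums in seen.items():
        if len(nums) > 1:
            errors.append(f"Duplicate unit_id '{uid}' on rows: {', '.join(map(str, nums))}")
    return errors


print(validate_unit_ids([{'unit_id': 'A'}, {'unit_id': ''}, {'unit_id': 'A'}]))
```

Running two passes over the same `csv_text` costs an extra parse, but it keeps the import all-or-nothing: a 400 with every problem listed, rather than a half-imported file.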

    # Second pass: actually import the data
    csv_reader = csv.DictReader(io.StringIO(csv_text))

    results = {
@@ -563,6 +1053,9 @@ async def import_csv(
            })
            continue

        # Determine device type
        device_type = _get_csv_value(row, 'device_type', 'seismograph')

        # Check if unit exists
        existing_unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()

@@ -571,31 +1064,104 @@ async def import_csv(
                results["skipped"].append(unit_id)
                continue

            # Update existing unit
            existing_unit.unit_type = row.get('unit_type', existing_unit.unit_type or 'series3')
            existing_unit.deployed = row.get('deployed', '').lower() in ('true', '1', 'yes') if row.get('deployed') else existing_unit.deployed
            existing_unit.retired = row.get('retired', '').lower() in ('true', '1', 'yes') if row.get('retired') else existing_unit.retired
            existing_unit.note = row.get('note', existing_unit.note or '')
            existing_unit.project_id = row.get('project_id', existing_unit.project_id)
            existing_unit.location = row.get('location', existing_unit.location)
            existing_unit.address = row.get('address', existing_unit.address)
            existing_unit.coordinates = row.get('coordinates', existing_unit.coordinates)
            # Update existing unit - common fields
            existing_unit.device_type = device_type
            existing_unit.unit_type = _get_csv_value(row, 'unit_type', existing_unit.unit_type or 'series3')
            existing_unit.deployed = _parse_bool(row.get('deployed', '')) if row.get('deployed') else existing_unit.deployed
            existing_unit.retired = _parse_bool(row.get('retired', '')) if row.get('retired') else existing_unit.retired
            existing_unit.note = _get_csv_value(row, 'note', existing_unit.note)
            existing_unit.project_id = _get_csv_value(row, 'project_id', existing_unit.project_id)
            existing_unit.location = _get_csv_value(row, 'location', existing_unit.location)
            existing_unit.address = _get_csv_value(row, 'address', existing_unit.address)
            existing_unit.coordinates = _get_csv_value(row, 'coordinates', existing_unit.coordinates)
            existing_unit.last_updated = datetime.utcnow()

            # Seismograph-specific fields
            if row.get('last_calibrated'):
                last_cal = _parse_date(row.get('last_calibrated'))
                existing_unit.last_calibrated = last_cal
                # Auto-calculate next_calibration_due using calibration interval
                if last_cal:
                    cal_interval = get_calibration_interval(db)
                    existing_unit.next_calibration_due = last_cal + timedelta(days=cal_interval)
            elif row.get('next_calibration_due'):
                # Only use explicit next_calibration_due if no last_calibrated
                existing_unit.next_calibration_due = _parse_date(row.get('next_calibration_due'))
            if row.get('deployed_with_modem_id'):
                existing_unit.deployed_with_modem_id = _get_csv_value(row, 'deployed_with_modem_id')

            # Modem-specific fields
            if row.get('ip_address'):
                existing_unit.ip_address = _get_csv_value(row, 'ip_address')
            if row.get('phone_number'):
                existing_unit.phone_number = _get_csv_value(row, 'phone_number')
            if row.get('hardware_model'):
                existing_unit.hardware_model = _get_csv_value(row, 'hardware_model')
            if row.get('deployment_type'):
                existing_unit.deployment_type = _get_csv_value(row, 'deployment_type')
            if row.get('deployed_with_unit_id'):
                existing_unit.deployed_with_unit_id = _get_csv_value(row, 'deployed_with_unit_id')

            # SLM-specific fields
            if row.get('slm_host'):
                existing_unit.slm_host = _get_csv_value(row, 'slm_host')
            if row.get('slm_tcp_port'):
                existing_unit.slm_tcp_port = _parse_int(row.get('slm_tcp_port'))
            if row.get('slm_ftp_port'):
                existing_unit.slm_ftp_port = _parse_int(row.get('slm_ftp_port'))
            if row.get('slm_model'):
                existing_unit.slm_model = _get_csv_value(row, 'slm_model')
            if row.get('slm_serial_number'):
                existing_unit.slm_serial_number = _get_csv_value(row, 'slm_serial_number')
            if row.get('slm_frequency_weighting'):
                existing_unit.slm_frequency_weighting = _get_csv_value(row, 'slm_frequency_weighting')
            if row.get('slm_time_weighting'):
                existing_unit.slm_time_weighting = _get_csv_value(row, 'slm_time_weighting')
            if row.get('slm_measurement_range'):
                existing_unit.slm_measurement_range = _get_csv_value(row, 'slm_measurement_range')

            results["updated"].append(unit_id)
        else:
            # Create new unit
            # Calculate next_calibration_due from last_calibrated
            last_cal = _parse_date(row.get('last_calibrated', ''))
            if last_cal:
                cal_interval = get_calibration_interval(db)
                next_cal = last_cal + timedelta(days=cal_interval)
            else:
                next_cal = _parse_date(row.get('next_calibration_due', ''))

            # Create new unit with all fields
            new_unit = RosterUnit(
                id=unit_id,
                unit_type=row.get('unit_type', 'series3'),
                deployed=row.get('deployed', '').lower() in ('true', '1', 'yes'),
                retired=row.get('retired', '').lower() in ('true', '1', 'yes'),
                note=row.get('note', ''),
                project_id=row.get('project_id'),
                location=row.get('location'),
                address=row.get('address'),
                coordinates=row.get('coordinates'),
                last_updated=datetime.utcnow()
                device_type=device_type,
                unit_type=_get_csv_value(row, 'unit_type', 'series3'),
                deployed=_parse_bool(row.get('deployed', '')),
                retired=_parse_bool(row.get('retired', '')),
                note=_get_csv_value(row, 'note', ''),
                project_id=_get_csv_value(row, 'project_id'),
                location=_get_csv_value(row, 'location'),
                address=_get_csv_value(row, 'address'),
                coordinates=_get_csv_value(row, 'coordinates'),
                last_updated=datetime.utcnow(),
                # Seismograph fields - auto-calc next_calibration_due from last_calibrated
                last_calibrated=last_cal,
                next_calibration_due=next_cal,
                deployed_with_modem_id=_get_csv_value(row, 'deployed_with_modem_id'),
                # Modem fields
                ip_address=_get_csv_value(row, 'ip_address'),
                phone_number=_get_csv_value(row, 'phone_number'),
                hardware_model=_get_csv_value(row, 'hardware_model'),
                deployment_type=_get_csv_value(row, 'deployment_type'),
|
||||
deployed_with_unit_id=_get_csv_value(row, 'deployed_with_unit_id'),
|
||||
# SLM fields
|
||||
slm_host=_get_csv_value(row, 'slm_host'),
|
||||
slm_tcp_port=_parse_int(row.get('slm_tcp_port', '')),
|
||||
slm_ftp_port=_parse_int(row.get('slm_ftp_port', '')),
|
||||
slm_model=_get_csv_value(row, 'slm_model'),
|
||||
slm_serial_number=_get_csv_value(row, 'slm_serial_number'),
|
||||
slm_frequency_weighting=_get_csv_value(row, 'slm_frequency_weighting'),
|
||||
slm_time_weighting=_get_csv_value(row, 'slm_time_weighting'),
|
||||
slm_measurement_range=_get_csv_value(row, 'slm_measurement_range'),
|
||||
)
|
||||
db.add(new_unit)
|
||||
results["added"].append(unit_id)
@@ -718,3 +1284,145 @@ def delete_history_entry(history_id: int, db: Session = Depends(get_db)):
    db.delete(history_entry)
    db.commit()
    return {"message": "History entry deleted", "id": history_id}


@router.post("/pair-devices")
async def pair_devices(
    request: Request,
    db: Session = Depends(get_db)
):
    """
    Create a bidirectional pairing between a recorder (seismograph/SLM) and a modem.

    Sets:
    - recorder.deployed_with_modem_id = modem_id
    - modem.deployed_with_unit_id = recorder_id

    Also clears any previous pairings for both devices.
    """
    data = await request.json()
    recorder_id = data.get("recorder_id")
    modem_id = data.get("modem_id")

    if not recorder_id or not modem_id:
        raise HTTPException(status_code=400, detail="Both recorder_id and modem_id are required")

    # Look up both units
    recorder = db.query(RosterUnit).filter(RosterUnit.id == recorder_id).first()
    modem = db.query(RosterUnit).filter(RosterUnit.id == modem_id).first()

    if not recorder:
        raise HTTPException(status_code=404, detail=f"Recorder {recorder_id} not found in roster")
    if not modem:
        raise HTTPException(status_code=404, detail=f"Modem {modem_id} not found in roster")

    # Validate device types
    if recorder.device_type == "modem":
        raise HTTPException(status_code=400, detail=f"{recorder_id} is a modem, not a recorder")
    if modem.device_type != "modem":
        raise HTTPException(status_code=400, detail=f"{modem_id} is not a modem (type: {modem.device_type})")

    # Clear any previous pairings
    # If recorder was paired with a different modem, clear that modem's link
    if recorder.deployed_with_modem_id and recorder.deployed_with_modem_id != modem_id:
        old_modem = db.query(RosterUnit).filter(RosterUnit.id == recorder.deployed_with_modem_id).first()
        if old_modem and old_modem.deployed_with_unit_id == recorder_id:
            record_history(db, old_modem.id, "update", "deployed_with_unit_id",
                           old_modem.deployed_with_unit_id, None, "pair_devices", "Cleared by new pairing")
            old_modem.deployed_with_unit_id = None

    # If modem was paired with a different recorder, clear that recorder's link
    if modem.deployed_with_unit_id and modem.deployed_with_unit_id != recorder_id:
        old_recorder = db.query(RosterUnit).filter(RosterUnit.id == modem.deployed_with_unit_id).first()
        if old_recorder and old_recorder.deployed_with_modem_id == modem_id:
            record_history(db, old_recorder.id, "update", "deployed_with_modem_id",
                           old_recorder.deployed_with_modem_id, None, "pair_devices", "Cleared by new pairing")
            old_recorder.deployed_with_modem_id = None

    # Record the pre-pairing state for history
    old_recorder_modem = recorder.deployed_with_modem_id
    old_modem_unit = modem.deployed_with_unit_id

    # Set the new pairing
    recorder.deployed_with_modem_id = modem_id
    modem.deployed_with_unit_id = recorder_id

    # Record history
    if old_recorder_modem != modem_id:
        record_history(db, recorder_id, "update", "deployed_with_modem_id",
                       old_recorder_modem, modem_id, "pair_devices", "Paired with modem")
    if old_modem_unit != recorder_id:
        record_history(db, modem_id, "update", "deployed_with_unit_id",
                       old_modem_unit, recorder_id, "pair_devices", "Paired with recorder")

    db.commit()

    logger.info(f"Paired {recorder_id} with modem {modem_id}")

    # If SLM, sync to SLMM cache
    if recorder.device_type == "slm":
        await sync_slm_to_slmm_cache(
            unit_id=recorder_id,
            host=recorder.slm_host,
            tcp_port=recorder.slm_tcp_port,
            ftp_port=recorder.slm_ftp_port,
            deployed_with_modem_id=modem_id,
            db=db
        )

    return {
        "success": True,
        "message": f"Paired {recorder_id} with {modem_id}",
        "recorder_id": recorder_id,
        "modem_id": modem_id
    }
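The clearing logic in `pair_devices` maintains a simple invariant: each device participates in at most one pairing, and both sides of a link always agree. The same invariant in isolation, sketched over a plain dict instead of the SQLAlchemy models (the IDs are made up):

```python
def pair(links: dict, recorder_id: str, modem_id: str) -> dict:
    """Bidirectional pairing: links maps device id -> partner id.
    Clears any previous partner on either side before setting the new link."""
    # Clear the recorder's old modem, if it pointed back at this recorder
    old_modem = links.get(recorder_id)
    if old_modem and links.get(old_modem) == recorder_id:
        links.pop(old_modem, None)
    # Clear the modem's old recorder, symmetrically
    old_recorder = links.get(modem_id)
    if old_recorder and links.get(old_recorder) == modem_id:
        links.pop(old_recorder, None)
    # Set the new bidirectional link
    links[recorder_id] = modem_id
    links[modem_id] = recorder_id
    return links
```

Re-pairing a recorder to a new modem automatically orphans neither side: the old modem's back-reference is dropped, matching what the endpoint does with `deployed_with_unit_id`.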
@router.post("/unpair-devices")
async def unpair_devices(
    request: Request,
    db: Session = Depends(get_db)
):
    """
    Remove the bidirectional pairing between a recorder and modem.

    Clears:
    - recorder.deployed_with_modem_id
    - modem.deployed_with_unit_id
    """
    data = await request.json()
    recorder_id = data.get("recorder_id")
    modem_id = data.get("modem_id")

    if not recorder_id or not modem_id:
        raise HTTPException(status_code=400, detail="Both recorder_id and modem_id are required")

    recorder = db.query(RosterUnit).filter(RosterUnit.id == recorder_id).first()
    modem = db.query(RosterUnit).filter(RosterUnit.id == modem_id).first()

    changes_made = False

    if recorder and recorder.deployed_with_modem_id == modem_id:
        record_history(db, recorder_id, "update", "deployed_with_modem_id",
                       recorder.deployed_with_modem_id, None, "unpair_devices", "Unpairing")
        recorder.deployed_with_modem_id = None
        changes_made = True

    if modem and modem.deployed_with_unit_id == recorder_id:
        record_history(db, modem_id, "update", "deployed_with_unit_id",
                       modem.deployed_with_unit_id, None, "unpair_devices", "Unpairing")
        modem.deployed_with_unit_id = None
        changes_made = True

    if changes_made:
        db.commit()
        logger.info(f"Unpaired {recorder_id} from modem {modem_id}")
        return {
            "success": True,
            "message": f"Unpaired {recorder_id} from {modem_id}"
        }
    else:
        return {
            "success": False,
            "message": "No pairing found between these devices"
        }

@@ -92,21 +92,21 @@ async def rename_unit(
     except Exception as e:
         logger.warning(f"Could not update unit_assignments: {e}")

-    # Update recording_sessions table (if exists)
+    # Update monitoring_sessions table (if exists)
     try:
-        from backend.models import RecordingSession
-        db.query(RecordingSession).filter(RecordingSession.unit_id == old_id).update(
+        from backend.models import MonitoringSession
+        db.query(MonitoringSession).filter(MonitoringSession.unit_id == old_id).update(
             {"unit_id": new_id},
             synchronize_session=False
         )
     except Exception as e:
-        logger.warning(f"Could not update recording_sessions: {e}")
+        logger.warning(f"Could not update monitoring_sessions: {e}")

     # Commit all changes
     db.commit()

     # If sound level meter, sync updated config to SLMM cache
-    if device_type == "sound_level_meter":
+    if device_type == "slm":
         logger.info(f"Syncing renamed SLM {new_id} (was {old_id}) config to SLMM cache...")
         result = await sync_slm_to_slmm_cache(
             unit_id=new_id,

@@ -5,7 +5,6 @@ Handles scheduled actions for automated recording control.
 """

 from fastapi import APIRouter, Request, Depends, HTTPException, Query
-from fastapi.templating import Jinja2Templates
 from fastapi.responses import HTMLResponse, JSONResponse
 from sqlalchemy.orm import Session
 from sqlalchemy import and_, or_
@@ -23,9 +22,9 @@ from backend.models import (
     RosterUnit,
 )
 from backend.services.scheduler import get_scheduler
+from backend.templates_config import templates

 router = APIRouter(prefix="/api/projects/{project_id}/scheduler", tags=["scheduler"])
-templates = Jinja2Templates(directory="templates")


 # ============================================================================
@@ -131,7 +130,7 @@ async def create_scheduled_action(
         raise HTTPException(status_code=404, detail="Location not found")

     # Determine device type from location
-    device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"
+    device_type = "slm" if location.location_type == "sound" else "seismograph"

     # Get unit_id (optional - can be determined from assignment at execution time)
     unit_id = form_data.get("unit_id")
@@ -188,7 +187,7 @@ async def schedule_recording_session(
     if not location:
         raise HTTPException(status_code=404, detail="Location not found")

-    device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"
+    device_type = "slm" if location.location_type == "sound" else "seismograph"
     unit_id = form_data.get("unit_id")

     start_time = datetime.fromisoformat(form_data.get("start_time"))

@@ -3,15 +3,16 @@ Seismograph Dashboard API Router
 Provides endpoints for the seismograph-specific dashboard
 """

+from datetime import date
+
 from fastapi import APIRouter, Request, Depends, Query
 from fastapi.responses import HTMLResponse
-from fastapi.templating import Jinja2Templates
 from sqlalchemy.orm import Session
 from backend.database import get_db
 from backend.models import RosterUnit
+from backend.templates_config import templates

 router = APIRouter(prefix="/api/seismo-dashboard", tags=["seismo-dashboard"])
-templates = Jinja2Templates(directory="templates")


 @router.get("/stats", response_class=HTMLResponse)
@@ -50,10 +51,14 @@ async def get_seismo_stats(request: Request, db: Session = Depends(get_db)):
 async def get_seismo_units(
     request: Request,
     db: Session = Depends(get_db),
-    search: str = Query(None)
+    search: str = Query(None),
+    sort: str = Query("id"),
+    order: str = Query("asc"),
+    status: str = Query(None),
+    modem: str = Query(None)
 ):
     """
-    Returns HTML partial with filterable seismograph unit list
+    Returns HTML partial with filterable and sortable seismograph unit list
     """
     query = db.query(RosterUnit).filter_by(
         device_type="seismograph",
@@ -62,20 +67,52 @@ async def get_seismo_units(

     # Apply search filter
     if search:
         search_lower = search.lower()
         query = query.filter(
             (RosterUnit.id.ilike(f"%{search}%")) |
             (RosterUnit.note.ilike(f"%{search}%")) |
             (RosterUnit.address.ilike(f"%{search}%"))
         )

-    seismos = query.order_by(RosterUnit.id).all()
+    # Apply status filter
+    if status == "deployed":
+        query = query.filter(RosterUnit.deployed == True)
+    elif status == "benched":
+        query = query.filter(RosterUnit.deployed == False)
+
+    # Apply modem filter
+    if modem == "with":
+        query = query.filter(RosterUnit.deployed_with_modem_id.isnot(None))
+    elif modem == "without":
+        query = query.filter(RosterUnit.deployed_with_modem_id.is_(None))
+
+    # Apply sorting
+    sort_column_map = {
+        "id": RosterUnit.id,
+        "status": RosterUnit.deployed,
+        "modem": RosterUnit.deployed_with_modem_id,
+        "location": RosterUnit.address,
+        "last_calibrated": RosterUnit.last_calibrated,
+        "notes": RosterUnit.note
+    }
+    sort_column = sort_column_map.get(sort, RosterUnit.id)
+
+    if order == "desc":
+        query = query.order_by(sort_column.desc())
+    else:
+        query = query.order_by(sort_column.asc())
+
+    seismos = query.all()

     return templates.TemplateResponse(
         "partials/seismo_unit_list.html",
         {
             "request": request,
             "units": seismos,
-            "search": search or ""
+            "search": search or "",
+            "sort": sort,
+            "order": order,
+            "status": status or "",
+            "modem": modem or "",
+            "today": date.today()
         }
     )
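Mapping the `sort` query parameter through an explicit dict, as above, whitelists the sortable columns so arbitrary client input can never reach `order_by`. The same pattern in miniature, over plain dicts instead of SQLAlchemy columns (the field names here are illustrative):

```python
def sort_units(units, sort="id", order="asc"):
    # Whitelist of client-visible sort keys -> record fields;
    # unknown keys silently fall back to sorting by "id"
    key_map = {
        "id": lambda u: u["id"],
        "status": lambda u: u["deployed"],
        "location": lambda u: u["address"] or "",  # tolerate missing addresses
    }
    key = key_map.get(sort, key_map["id"])
    return sorted(units, key=key, reverse=(order == "desc"))

units = [
    {"id": "BE2", "deployed": True, "address": "Main St"},
    {"id": "BE1", "deployed": False, "address": None},
]
```

Because lookup goes through `key_map.get`, a hostile or mistyped `sort` value degrades to the default ordering instead of raising or being interpolated anywhere.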
@@ -477,3 +477,75 @@ async def upload_snapshot(file: UploadFile = File(...)):

    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Upload failed: {str(e)}")


# ============================================================================
# SLMM SYNC ENDPOINTS
# ============================================================================

@router.post("/slmm/sync-all")
async def sync_all_slms(db: Session = Depends(get_db)):
    """
    Manually trigger a full sync of all SLM devices from the Terra-View roster to SLMM.

    This ensures the SLMM database matches the Terra-View roster (source of truth).
    Also cleans up orphaned devices in SLMM that are not in Terra-View.
    """
    from backend.services.slmm_sync import sync_all_slms_to_slmm, cleanup_orphaned_slmm_devices

    try:
        # Sync all SLMs
        sync_results = await sync_all_slms_to_slmm(db)

        # Clean up orphaned devices
        cleanup_results = await cleanup_orphaned_slmm_devices(db)

        return {
            "status": "ok",
            "sync": sync_results,
            "cleanup": cleanup_results
        }

    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Sync failed: {str(e)}")


@router.get("/slmm/status")
async def get_slmm_sync_status(db: Session = Depends(get_db)):
    """
    Get the status of SLMM synchronization.

    Shows which devices are in the Terra-View roster vs the SLMM database.
    """
    from backend.services.slmm_sync import get_slmm_devices

    try:
        # Get devices from both systems
        roster_slms = db.query(RosterUnit).filter_by(device_type="slm").all()
        slmm_devices = await get_slmm_devices()

        if slmm_devices is None:
            raise HTTPException(status_code=503, detail="SLMM service unavailable")

        roster_unit_ids = {unit.id for unit in roster_slms}
        slmm_unit_ids = set(slmm_devices)

        # Find differences
        in_roster_only = roster_unit_ids - slmm_unit_ids
        in_slmm_only = slmm_unit_ids - roster_unit_ids
        in_both = roster_unit_ids & slmm_unit_ids

        return {
            "status": "ok",
            "terra_view_total": len(roster_unit_ids),
            "slmm_total": len(slmm_unit_ids),
            "synced": len(in_both),
            "missing_from_slmm": list(in_roster_only),
            "orphaned_in_slmm": list(in_slmm_only),
            "in_sync": len(in_roster_only) == 0 and len(in_slmm_only) == 0
        }

    except HTTPException:
        raise
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Status check failed: {str(e)}")
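The status endpoint's drift detection is plain set algebra over the two ID collections. In isolation (the IDs are made up):

```python
def diff_rosters(roster_ids, slmm_ids):
    # Compare the source-of-truth roster against the mirror's device list
    roster, slmm = set(roster_ids), set(slmm_ids)
    return {
        "missing_from_slmm": sorted(roster - slmm),  # need to be pushed to the mirror
        "orphaned_in_slmm": sorted(slmm - roster),   # need cleanup in the mirror
        "in_sync": roster == slmm,
    }
```

`in_sync` is equivalent to both difference sets being empty, which is exactly the check the endpoint performs with `len(...) == 0`.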
@@ -5,7 +5,6 @@ Provides API endpoints for the Sound Level Meters dashboard page.
 """

 from fastapi import APIRouter, Request, Depends, Query
-from fastapi.templating import Jinja2Templates
 from fastapi.responses import HTMLResponse
 from sqlalchemy.orm import Session
 from sqlalchemy import func
@@ -18,11 +17,11 @@ import os
 from backend.database import get_db
 from backend.models import RosterUnit
 from backend.routers.roster_edit import sync_slm_to_slmm_cache
+from backend.templates_config import templates

 logger = logging.getLogger(__name__)

 router = APIRouter(prefix="/api/slm-dashboard", tags=["slm-dashboard"])
-templates = Jinja2Templates(directory="templates")

 # SLMM backend URL - configurable via environment variable
 SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
@@ -35,7 +34,7 @@ async def get_slm_stats(request: Request, db: Session = Depends(get_db)):
     Returns HTML partial with stat cards.
     """
     # Query all SLMs
-    all_slms = db.query(RosterUnit).filter_by(device_type="sound_level_meter").all()
+    all_slms = db.query(RosterUnit).filter_by(device_type="slm").all()

     # Count deployed vs benched
     deployed_count = sum(1 for slm in all_slms if slm.deployed and not slm.retired)
@@ -69,7 +68,7 @@ async def get_slm_units(
     Get list of SLM units for the sidebar.
     Returns HTML partial with unit cards.
     """
-    query = db.query(RosterUnit).filter_by(device_type="sound_level_meter")
+    query = db.query(RosterUnit).filter_by(device_type="slm")

     # Filter by project if provided
     if project:
@@ -129,7 +128,7 @@ async def get_live_view(request: Request, unit_id: str, db: Session = Depends(ge
     Returns HTML partial with live metrics and chart.
     """
     # Get unit from database
-    unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="sound_level_meter").first()
+    unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="slm").first()

     if not unit:
         return templates.TemplateResponse("partials/slm_live_view_error.html", {
@@ -168,23 +167,7 @@ async def get_live_view(request: Request, unit_id: str, db: Session = Depends(ge
     measurement_state = state_data.get("measurement_state", "Unknown")
     is_measuring = state_data.get("is_measuring", False)

-    # If measuring, sync start time from FTP to database (fixes wrong timestamps)
-    if is_measuring:
-        try:
-            sync_response = await client.post(
-                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/sync-start-time",
-                timeout=10.0
-            )
-            if sync_response.status_code == 200:
-                sync_data = sync_response.json()
-                logger.info(f"Synced start time for {unit_id}: {sync_data.get('message')}")
-            else:
-                logger.warning(f"Failed to sync start time for {unit_id}: {sync_response.status_code}")
-        except Exception as e:
-            # Don't fail the whole request if sync fails
-            logger.warning(f"Could not sync start time for {unit_id}: {e}")
-
-    # Get live status (now with corrected start time)
+    # Get live status (measurement_start_time is already stored in SLMM database)
     status_response = await client.get(
         f"{SLMM_BASE_URL}/api/nl43/{unit_id}/live"
     )
@@ -242,7 +225,7 @@ async def get_slm_config(request: Request, unit_id: str, db: Session = Depends(g
     Get configuration form for a specific SLM unit.
     Returns HTML partial with configuration form.
     """
-    unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="sound_level_meter").first()
+    unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="slm").first()

     if not unit:
         return HTMLResponse(
@@ -262,7 +245,7 @@ async def save_slm_config(request: Request, unit_id: str, db: Session = Depends(
     Save SLM configuration.
     Updates unit parameters in the database.
     """
-    unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="sound_level_meter").first()
+    unit = db.query(RosterUnit).filter_by(id=unit_id, device_type="slm").first()

     if not unit:
         return {"status": "error", "detail": f"Unit {unit_id} not found"}

@@ -6,7 +6,6 @@ Provides endpoints for SLM dashboard cards, detail pages, and real-time data.

 from fastapi import APIRouter, Depends, HTTPException, Request
 from fastapi.responses import HTMLResponse
-from fastapi.templating import Jinja2Templates
 from sqlalchemy.orm import Session
 from datetime import datetime
 import httpx
@@ -15,11 +14,11 @@ import os

 from backend.database import get_db
 from backend.models import RosterUnit
+from backend.templates_config import templates

 logger = logging.getLogger(__name__)

 router = APIRouter(prefix="/slm", tags=["slm-ui"])
-templates = Jinja2Templates(directory="templates")

 SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://172.19.0.1:8100")
@@ -30,7 +29,7 @@ async def slm_detail_page(request: Request, unit_id: str, db: Session = Depends(

     # Get roster unit
     unit = db.query(RosterUnit).filter_by(id=unit_id).first()
-    if not unit or unit.device_type != "sound_level_meter":
+    if not unit or unit.device_type != "slm":
         raise HTTPException(status_code=404, detail="Sound level meter not found")

     return templates.TemplateResponse("slm_detail.html", {
@@ -46,7 +45,7 @@ async def get_slm_summary(unit_id: str, db: Session = Depends(get_db)):

     # Get roster unit
     unit = db.query(RosterUnit).filter_by(id=unit_id).first()
-    if not unit or unit.device_type != "sound_level_meter":
+    if not unit or unit.device_type != "slm":
         raise HTTPException(status_code=404, detail="Sound level meter not found")

     # Try to get live status from SLMM
@@ -61,7 +60,7 @@ async def get_slm_summary(unit_id: str, db: Session = Depends(get_db)):

     return {
         "unit_id": unit_id,
-        "device_type": "sound_level_meter",
+        "device_type": "slm",
         "deployed": unit.deployed,
         "model": unit.slm_model or "NL-43",
         "location": unit.address or unit.location,
@@ -89,7 +88,7 @@ async def slm_controls_partial(request: Request, unit_id: str, db: Session = Dep
     """Render SLM control panel partial."""

     unit = db.query(RosterUnit).filter_by(id=unit_id).first()
-    if not unit or unit.device_type != "sound_level_meter":
+    if not unit or unit.device_type != "slm":
         raise HTTPException(status_code=404, detail="Sound level meter not found")

     # Get current status from SLMM

backend/services/alert_service.py (new file, 462 lines)
@@ -0,0 +1,462 @@
"""
Alert Service

Manages in-app alerts for device status changes and system events.
Provides a foundation for future notification channels (email, webhook).
"""

import json
import uuid
import logging
from datetime import datetime, timedelta
from typing import Optional, List, Dict, Any

from sqlalchemy.orm import Session
from sqlalchemy import and_, or_

from backend.models import Alert, RosterUnit

logger = logging.getLogger(__name__)


class AlertService:
    """
    Service for managing alerts.

    Handles the alert lifecycle:
    - Create alerts from various triggers
    - Query active alerts
    - Acknowledge/resolve/dismiss alerts
    - (Future) Dispatch to notification channels
    """

    def __init__(self, db: Session):
        self.db = db

    def create_alert(
        self,
        alert_type: str,
        title: str,
        message: str = None,
        severity: str = "warning",
        unit_id: str = None,
        project_id: str = None,
        location_id: str = None,
        schedule_id: str = None,
        metadata: dict = None,
        expires_hours: int = 24,
    ) -> Alert:
        """
        Create a new alert.

        Args:
            alert_type: Type of alert (device_offline, device_online, schedule_failed)
            title: Short alert title
            message: Detailed description
            severity: info, warning, or critical
            unit_id: Related unit ID (optional)
            project_id: Related project ID (optional)
            location_id: Related location ID (optional)
            schedule_id: Related schedule ID (optional)
            metadata: Additional JSON data
            expires_hours: Hours until auto-expiry (default 24)

        Returns:
            Created Alert instance
        """
        alert = Alert(
            id=str(uuid.uuid4()),
            alert_type=alert_type,
            title=title,
            message=message,
            severity=severity,
            unit_id=unit_id,
            project_id=project_id,
            location_id=location_id,
            schedule_id=schedule_id,
            alert_metadata=json.dumps(metadata) if metadata else None,
            status="active",
            expires_at=datetime.utcnow() + timedelta(hours=expires_hours),
        )

        self.db.add(alert)
        self.db.commit()
        self.db.refresh(alert)

        logger.info(f"Created alert: {alert.title} ({alert.alert_type})")
        return alert

    def create_device_offline_alert(
        self,
        unit_id: str,
        consecutive_failures: int = 0,
        last_error: str = None,
    ) -> Optional[Alert]:
        """
        Create an alert when a device becomes unreachable.

        Only creates one if no active offline alert exists for this device.

        Args:
            unit_id: The unit that went offline
            consecutive_failures: Number of consecutive poll failures
            last_error: Last error message from polling

        Returns:
            Created Alert or None if an alert already exists
        """
        # Check if an active offline alert already exists
        existing = self.db.query(Alert).filter(
            and_(
                Alert.unit_id == unit_id,
                Alert.alert_type == "device_offline",
                Alert.status == "active",
            )
        ).first()

        if existing:
            logger.debug(f"Offline alert already exists for {unit_id}")
            return None

        # Get unit info for the title
        unit = self.db.query(RosterUnit).filter_by(id=unit_id).first()
        unit_name = unit.id if unit else unit_id

        # Determine severity based on failure count
        severity = "critical" if consecutive_failures >= 5 else "warning"

        return self.create_alert(
            alert_type="device_offline",
            title=f"{unit_name} is offline",
            message=f"Device has been unreachable after {consecutive_failures} failed connection attempts."
            + (f" Last error: {last_error}" if last_error else ""),
            severity=severity,
            unit_id=unit_id,
            metadata={
                "consecutive_failures": consecutive_failures,
                "last_error": last_error,
            },
            expires_hours=48,  # Offline alerts stay longer
        )

    def resolve_device_offline_alert(self, unit_id: str) -> Optional[Alert]:
        """
        Auto-resolve the offline alert when a device comes back online.

        Also creates a "device_online" info alert to notify the user.

        Args:
            unit_id: The unit that came back online

        Returns:
            The resolved Alert or None if no alert existed
        """
        # Find the active offline alert
        alert = self.db.query(Alert).filter(
            and_(
                Alert.unit_id == unit_id,
                Alert.alert_type == "device_offline",
                Alert.status == "active",
            )
        ).first()

        if not alert:
            return None

        # Resolve the offline alert
        alert.status = "resolved"
        alert.resolved_at = datetime.utcnow()
        self.db.commit()

        logger.info(f"Resolved offline alert for {unit_id}")

        # Create online notification
        unit = self.db.query(RosterUnit).filter_by(id=unit_id).first()
        unit_name = unit.id if unit else unit_id

        self.create_alert(
            alert_type="device_online",
            title=f"{unit_name} is back online",
            message="Device connection has been restored.",
            severity="info",
            unit_id=unit_id,
            expires_hours=6,  # Info alerts expire quickly
        )

        return alert

    def create_schedule_failed_alert(
        self,
        schedule_id: str,
        action_type: str,
        unit_id: str = None,
        error_message: str = None,
        project_id: str = None,
        location_id: str = None,
    ) -> Alert:
        """
        Create an alert when a scheduled action fails.

        Args:
            schedule_id: The ScheduledAction or RecurringSchedule ID
            action_type: start, stop, download, cycle
            unit_id: Related unit
            error_message: Error from execution
            project_id: Related project
            location_id: Related location

        Returns:
            Created Alert
        """
        return self.create_alert(
            alert_type="schedule_failed",
            title=f"Scheduled {action_type} failed",
            message=error_message or f"The scheduled {action_type} action did not complete successfully.",
            severity="warning",
            unit_id=unit_id,
            project_id=project_id,
            location_id=location_id,
            schedule_id=schedule_id,
            metadata={"action_type": action_type},
            expires_hours=24,
        )

    def create_schedule_completed_alert(
        self,
        schedule_id: str,
        action_type: str,
        unit_id: str = None,
        project_id: str = None,
        location_id: str = None,
        metadata: dict = None,
    ) -> Alert:
        """
        Create an alert when a scheduled action completes successfully.

        Args:
            schedule_id: The ScheduledAction ID
            action_type: start, stop, download, cycle
            unit_id: Related unit
            project_id: Related project
            location_id: Related location
            metadata: Additional info (e.g., downloaded folder, index numbers)

        Returns:
            Created Alert
        """
        # Build a descriptive message based on action type and metadata
        if action_type == "stop" and metadata:
            download_folder = metadata.get("downloaded_folder")
            download_success = metadata.get("download_success", False)
            if download_success and download_folder:
                message = f"Measurement stopped and data downloaded ({download_folder})"
            elif download_success is False and metadata.get("download_attempted"):
                message = "Measurement stopped but download failed"
            else:
                message = "Measurement stopped successfully"
        elif action_type == "start" and metadata:
            new_index = metadata.get("new_index")
            if new_index is not None:
                message = f"Measurement started (index {new_index:04d})"
            else:
                message = "Measurement started successfully"
        else:
            message = f"Scheduled {action_type} completed successfully"

        return self.create_alert(
            alert_type="schedule_completed",
            title=f"Scheduled {action_type} completed",
            message=message,
            severity="info",
            unit_id=unit_id,
            project_id=project_id,
            location_id=location_id,
            schedule_id=schedule_id,
            metadata={"action_type": action_type, **(metadata or {})},
            expires_hours=12,  # Info alerts expire quickly
        )

    def get_active_alerts(
        self,
        project_id: str = None,
        unit_id: str = None,
        alert_type: str = None,
        min_severity: str = None,
        limit: int = 50,
    ) -> List[Alert]:
        """
        Query active alerts with optional filters.

        Args:
            project_id: Filter by project
            unit_id: Filter by unit
            alert_type: Filter by alert type
            min_severity: Minimum severity (info, warning, critical)
            limit: Maximum results

        Returns:
            List of matching alerts
        """
        query = self.db.query(Alert).filter(Alert.status == "active")

        if project_id:
|
||||
query = query.filter(Alert.project_id == project_id)
|
||||
|
||||
if unit_id:
|
||||
query = query.filter(Alert.unit_id == unit_id)
|
||||
|
||||
if alert_type:
|
||||
query = query.filter(Alert.alert_type == alert_type)
|
||||
|
||||
if min_severity:
|
||||
# Map severity to numeric for comparison
|
||||
severity_levels = {"info": 1, "warning": 2, "critical": 3}
|
||||
min_level = severity_levels.get(min_severity, 1)
|
||||
|
||||
if min_level == 2:
|
||||
query = query.filter(Alert.severity.in_(["warning", "critical"]))
|
||||
elif min_level == 3:
|
||||
query = query.filter(Alert.severity == "critical")
|
||||
|
||||
return query.order_by(Alert.created_at.desc()).limit(limit).all()
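The `min_severity` branch above maps severity names to numeric levels and filters to the names at or above the threshold. A minimal standalone sketch of that mapping (the `severities_at_or_above` helper name is illustrative, not part of the service):

```python
SEVERITY_LEVELS = {"info": 1, "warning": 2, "critical": 3}

def severities_at_or_above(min_severity: str) -> list:
    """Return the severity names that meet or exceed the given threshold.

    Unknown names fall back to level 1, matching the service's .get(min_severity, 1).
    """
    min_level = SEVERITY_LEVELS.get(min_severity, 1)
    return [name for name, level in SEVERITY_LEVELS.items() if level >= min_level]

print(severities_at_or_above("warning"))  # ['warning', 'critical']
```

The resulting list is exactly what the service feeds to `Alert.severity.in_(...)` for the warning case.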

    def get_all_alerts(
        self,
        status: str = None,
        project_id: str = None,
        unit_id: str = None,
        alert_type: str = None,
        limit: int = 50,
        offset: int = 0,
    ) -> List[Alert]:
        """
        Query all alerts with optional filters (includes non-active).

        Args:
            status: Filter by status (active, acknowledged, resolved, dismissed)
            project_id: Filter by project
            unit_id: Filter by unit
            alert_type: Filter by alert type
            limit: Maximum results
            offset: Pagination offset

        Returns:
            List of matching alerts
        """
        query = self.db.query(Alert)

        if status:
            query = query.filter(Alert.status == status)

        if project_id:
            query = query.filter(Alert.project_id == project_id)

        if unit_id:
            query = query.filter(Alert.unit_id == unit_id)

        if alert_type:
            query = query.filter(Alert.alert_type == alert_type)

        return (
            query.order_by(Alert.created_at.desc())
            .offset(offset)
            .limit(limit)
            .all()
        )

    def get_active_alert_count(self) -> int:
        """Get count of active alerts for badge display."""
        return self.db.query(Alert).filter(Alert.status == "active").count()

    def acknowledge_alert(self, alert_id: str) -> Optional[Alert]:
        """
        Mark alert as acknowledged.

        Args:
            alert_id: Alert to acknowledge

        Returns:
            Updated Alert or None if not found
        """
        alert = self.db.query(Alert).filter_by(id=alert_id).first()
        if not alert:
            return None

        alert.status = "acknowledged"
        alert.acknowledged_at = datetime.utcnow()
        self.db.commit()

        logger.info(f"Acknowledged alert: {alert.title}")
        return alert

    def dismiss_alert(self, alert_id: str) -> Optional[Alert]:
        """
        Dismiss alert (user chose to ignore).

        Args:
            alert_id: Alert to dismiss

        Returns:
            Updated Alert or None if not found
        """
        alert = self.db.query(Alert).filter_by(id=alert_id).first()
        if not alert:
            return None

        alert.status = "dismissed"
        self.db.commit()

        logger.info(f"Dismissed alert: {alert.title}")
        return alert

    def resolve_alert(self, alert_id: str) -> Optional[Alert]:
        """
        Manually resolve an alert.

        Args:
            alert_id: Alert to resolve

        Returns:
            Updated Alert or None if not found
        """
        alert = self.db.query(Alert).filter_by(id=alert_id).first()
        if not alert:
            return None

        alert.status = "resolved"
        alert.resolved_at = datetime.utcnow()
        self.db.commit()

        logger.info(f"Resolved alert: {alert.title}")
        return alert

    def cleanup_expired_alerts(self) -> int:
        """
        Remove alerts past their expiration time.

        Returns:
            Number of alerts cleaned up
        """
        now = datetime.utcnow()
        expired = self.db.query(Alert).filter(
            and_(
                Alert.expires_at.isnot(None),
                Alert.expires_at < now,
                Alert.status == "active",
            )
        ).all()

        count = len(expired)
        for alert in expired:
            alert.status = "dismissed"

        if count > 0:
            self.db.commit()
            logger.info(f"Cleaned up {count} expired alerts")

        return count


def get_alert_service(db: Session) -> AlertService:
    """Get an AlertService instance with the given database session."""
    return AlertService(db)
@@ -31,7 +31,7 @@ class DeviceController:

    Usage:
        controller = DeviceController()
        await controller.start_recording("nl43-001", "sound_level_meter", config={})
        await controller.start_recording("nl43-001", "slm", config={})
        await controller.stop_recording("seismo-042", "seismograph")
    """

@@ -53,7 +53,7 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"
            config: Device-specific recording configuration

        Returns:
@@ -63,7 +63,7 @@ class DeviceController:
            UnsupportedDeviceTypeError: Device type not supported
            DeviceControllerError: Operation failed
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                return await self.slmm_client.start_recording(unit_id, config)
            except SLMMClientError as e:
@@ -81,7 +81,7 @@ class DeviceController:
        else:
            raise UnsupportedDeviceTypeError(
                f"Device type '{device_type}' is not supported. "
                f"Supported types: sound_level_meter, seismograph"
                f"Supported types: slm, seismograph"
            )

    async def stop_recording(
@@ -94,12 +94,12 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"

        Returns:
            Response dict from device module
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                return await self.slmm_client.stop_recording(unit_id)
            except SLMMClientError as e:
@@ -126,12 +126,12 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"

        Returns:
            Response dict from device module
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                return await self.slmm_client.pause_recording(unit_id)
            except SLMMClientError as e:
@@ -157,12 +157,12 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"

        Returns:
            Response dict from device module
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                return await self.slmm_client.resume_recording(unit_id)
            except SLMMClientError as e:
@@ -192,12 +192,12 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"

        Returns:
            Status dict from device module
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                return await self.slmm_client.get_unit_status(unit_id)
            except SLMMClientError as e:
@@ -224,12 +224,12 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"

        Returns:
            Live data dict from device module
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                return await self.slmm_client.get_live_data(unit_id)
            except SLMMClientError as e:
@@ -261,14 +261,14 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"
            destination_path: Local path to save files
            files: List of filenames, or None for all

        Returns:
            Download result with file list
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                return await self.slmm_client.download_files(
                    unit_id,
@@ -289,6 +289,74 @@ class DeviceController:
        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    # ========================================================================
    # FTP Control
    # ========================================================================

    async def enable_ftp(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Enable FTP server on device.

        Must be called before downloading files.

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"

        Returns:
            Response dict with status
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.enable_ftp(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph FTP not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    async def disable_ftp(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Disable FTP server on device.

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"

        Returns:
            Response dict with status
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.disable_ftp(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph FTP not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    # ========================================================================
    # Device Configuration
    # ========================================================================
@@ -304,13 +372,13 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"
            config: Configuration parameters

        Returns:
            Updated config from device module
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                return await self.slmm_client.update_unit_config(
                    unit_id,
@@ -333,6 +401,157 @@ class DeviceController:
        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    # ========================================================================
    # Store/Index Management
    # ========================================================================

    async def increment_index(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Increment the store/index number on a device.

        For SLMs, this increments the store name to prevent "overwrite data?" prompts.
        Should be called before starting a new measurement if auto_increment_index is enabled.

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"

        Returns:
            Response dict with old_index and new_index
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.increment_index(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            # Seismographs may not have the same concept of store index
            return {
                "status": "not_applicable",
                "message": "Index increment not applicable for seismographs",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    async def get_index_number(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Get current store/index number from device.

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"

        Returns:
            Response dict with current index_number
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.get_index_number(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_applicable",
                "message": "Index number not applicable for seismographs",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    # ========================================================================
    # Cycle Commands (for scheduled automation)
    # ========================================================================

    async def start_cycle(
        self,
        unit_id: str,
        device_type: str,
        sync_clock: bool = True,
    ) -> Dict[str, Any]:
        """
        Execute complete start cycle for scheduled automation.

        This handles the full pre-recording workflow:
        1. Sync device clock to server time
        2. Find next safe index (with overwrite protection)
        3. Start measurement

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"
            sync_clock: Whether to sync device clock to server time

        Returns:
            Response dict from device module
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.start_cycle(unit_id, sync_clock)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph start cycle not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    async def stop_cycle(
        self,
        unit_id: str,
        device_type: str,
        download: bool = True,
    ) -> Dict[str, Any]:
        """
        Execute complete stop cycle for scheduled automation.

        This handles the full post-recording workflow:
        1. Stop measurement
        2. Enable FTP
        3. Download measurement folder
        4. Verify download

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"
            download: Whether to download measurement data

        Returns:
            Response dict from device module
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.stop_cycle(unit_id, download)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph stop cycle not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
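The stop cycle's four-step workflow (stop, enable FTP, download, verify) can be sketched as a small async orchestrator. This is a hedged illustration only — the `stop`, `enable_ftp`, `download`, and `verify` callables stand in for the real SLMM client methods, and the stub return values are invented for the demo:

```python
import asyncio
from typing import Any, Awaitable, Callable, Dict

async def run_stop_cycle(
    stop: Callable[[], Awaitable[Dict[str, Any]]],
    enable_ftp: Callable[[], Awaitable[None]],
    download: Callable[[], Awaitable[str]],
    verify: Callable[[str], Awaitable[bool]],
    do_download: bool = True,
) -> Dict[str, Any]:
    """Stop the measurement, then optionally enable FTP, download, and verify."""
    result = await stop()
    if not do_download:
        return {"stopped": True, "downloaded": False, **result}
    await enable_ftp()
    folder = await download()
    ok = await verify(folder)
    return {"stopped": True, "downloaded": ok, "folder": folder, **result}

async def main() -> Dict[str, Any]:
    # Stub device operations standing in for the SLMM client
    async def stop(): return {"status": "stopped"}
    async def enable_ftp(): return None
    async def download(): return "MEAS_0042"
    async def verify(folder): return folder.startswith("MEAS_")
    return await run_stop_cycle(stop, enable_ftp, download, verify)

outcome = asyncio.run(main())
print(outcome["downloaded"], outcome["folder"])
```

Keeping the steps as injected callables makes the sequencing testable without a device on the network.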

    # ========================================================================
    # Health Check
    # ========================================================================
@@ -347,12 +566,12 @@ class DeviceController:

        Args:
            unit_id: Unit identifier
            device_type: "sound_level_meter" | "seismograph"
            device_type: "slm" | "seismograph"

        Returns:
            True if device is reachable, False otherwise
        """
        if device_type == "sound_level_meter":
        if device_type == "slm":
            try:
                status = await self.slmm_client.get_unit_status(unit_id)
                return status.get("last_seen") is not None
backend/services/device_status_monitor.py (new file, 184 lines)
@@ -0,0 +1,184 @@
"""
Device Status Monitor

Background task that monitors device reachability via SLMM polling status
and triggers alerts when devices go offline or come back online.

This service bridges SLMM's device polling with Terra-View's alert system.
"""

import asyncio
import logging
from datetime import datetime
from typing import Optional, Dict

from backend.database import SessionLocal
from backend.services.slmm_client import get_slmm_client, SLMMClientError
from backend.services.alert_service import get_alert_service

logger = logging.getLogger(__name__)


class DeviceStatusMonitor:
    """
    Monitors device reachability via SLMM's polling status endpoint.

    Detects state transitions (online→offline, offline→online) and
    triggers AlertService to create/resolve alerts.

    Usage:
        monitor = DeviceStatusMonitor()
        await monitor.start()  # Start background monitoring
        monitor.stop()         # Stop monitoring
    """

    def __init__(self, check_interval: int = 60):
        """
        Initialize the monitor.

        Args:
            check_interval: Seconds between status checks (default: 60)
        """
        self.check_interval = check_interval
        self.running = False
        self.task: Optional[asyncio.Task] = None
        self.slmm_client = get_slmm_client()

        # Track previous device states to detect transitions
        self._device_states: Dict[str, bool] = {}

    async def start(self):
        """Start the monitoring background task."""
        if self.running:
            logger.warning("DeviceStatusMonitor is already running")
            return

        self.running = True
        self.task = asyncio.create_task(self._monitor_loop())
        logger.info(f"DeviceStatusMonitor started (checking every {self.check_interval}s)")

    def stop(self):
        """Stop the monitoring background task."""
        self.running = False
        if self.task:
            self.task.cancel()
        logger.info("DeviceStatusMonitor stopped")

    async def _monitor_loop(self):
        """Main monitoring loop."""
        while self.running:
            try:
                await self._check_all_devices()
            except Exception as e:
                logger.error(f"Error in device status monitor: {e}", exc_info=True)

            # Sleep in small intervals for graceful shutdown
            for _ in range(self.check_interval):
                if not self.running:
                    break
                await asyncio.sleep(1)

        logger.info("DeviceStatusMonitor loop exited")
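The sleep loop above trades a single long `asyncio.sleep(check_interval)` for `check_interval` one-second naps, so `stop()` takes effect within about a second instead of up to a full interval. A minimal standalone sketch of the same pattern (toy names, 0.01-second ticks so it runs fast):

```python
import asyncio

async def interruptible_sleep(flag: dict, ticks: int, tick_seconds: float) -> int:
    """Sleep up to ticks * tick_seconds, returning early once flag['running'] is False."""
    slept = 0
    for _ in range(ticks):
        if not flag["running"]:
            break
        await asyncio.sleep(tick_seconds)
        slept += 1
    return slept

async def demo() -> int:
    flag = {"running": True}

    async def stopper():
        # Simulates someone calling stop() mid-sleep
        await asyncio.sleep(0.03)
        flag["running"] = False

    sleeper = asyncio.create_task(interruptible_sleep(flag, ticks=100, tick_seconds=0.01))
    await stopper()
    return await sleeper

slept = asyncio.run(demo())
print(slept)  # far fewer than 100 ticks: the sleep was cut short
```

An `asyncio.Event` with `wait_for(..., timeout=...)` would achieve the same responsiveness without polling, but the tick loop matches what the monitor actually does.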

    async def _check_all_devices(self):
        """
        Fetch polling status from SLMM and detect state transitions.

        Uses GET /api/slmm/_polling/status (proxied to SLMM)
        """
        try:
            # Get status from SLMM
            status_response = await self.slmm_client.get_polling_status()
            devices = status_response.get("devices", [])

            if not devices:
                logger.debug("No devices in polling status response")
                return

            db = SessionLocal()
            try:
                alert_service = get_alert_service(db)

                for device in devices:
                    unit_id = device.get("unit_id")
                    if not unit_id:
                        continue

                    is_reachable = device.get("is_reachable", True)
                    previous_reachable = self._device_states.get(unit_id)

                    # Skip if this is the first check (no previous state)
                    if previous_reachable is None:
                        self._device_states[unit_id] = is_reachable
                        logger.debug(f"Initial state for {unit_id}: reachable={is_reachable}")
                        continue

                    # Detect offline transition (was online, now offline)
                    if previous_reachable and not is_reachable:
                        logger.warning(f"Device {unit_id} went OFFLINE")
                        alert_service.create_device_offline_alert(
                            unit_id=unit_id,
                            consecutive_failures=device.get("consecutive_failures", 0),
                            last_error=device.get("last_error"),
                        )

                    # Detect online transition (was offline, now online)
                    elif not previous_reachable and is_reachable:
                        logger.info(f"Device {unit_id} came back ONLINE")
                        alert_service.resolve_device_offline_alert(unit_id)

                    # Update tracked state
                    self._device_states[unit_id] = is_reachable

                # Cleanup expired alerts while we're here
                alert_service.cleanup_expired_alerts()

            finally:
                db.close()

        except SLMMClientError as e:
            logger.warning(f"Could not reach SLMM for status check: {e}")
        except Exception as e:
            logger.error(f"Error checking device status: {e}", exc_info=True)
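The transition logic above reduces to comparing each unit's previous boolean against its current one, with first observations producing no event. A minimal standalone sketch of that comparison (the function name and unit IDs are illustrative, not the service's API):

```python
from typing import Dict, List, Tuple

def detect_transitions(previous: Dict[str, bool], current: Dict[str, bool]) -> List[Tuple[str, str]]:
    """Return (unit_id, event) pairs, where event is 'offline' or 'online'.

    Units with no previous state produce no event (first observation).
    """
    events = []
    for unit_id, reachable in current.items():
        prev = previous.get(unit_id)
        if prev is None:
            continue  # first check: just record, no alert
        if prev and not reachable:
            events.append((unit_id, "offline"))
        elif not prev and reachable:
            events.append((unit_id, "online"))
    return events

print(detect_transitions(
    {"nl43-001": True, "seismo-042": False},
    {"nl43-001": False, "seismo-042": True, "new-unit": True},
))
# [('nl43-001', 'offline'), ('seismo-042', 'online')]
```

Only edges trigger events, so a device that stays offline raises one alert rather than one per polling cycle.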

    def get_tracked_devices(self) -> Dict[str, bool]:
        """
        Get the current tracked device states.

        Returns:
            Dict mapping unit_id to is_reachable status
        """
        return dict(self._device_states)

    def clear_tracked_devices(self):
        """Clear all tracked device states (useful for testing)."""
        self._device_states.clear()


# Singleton instance
_monitor_instance: Optional[DeviceStatusMonitor] = None


def get_device_status_monitor() -> DeviceStatusMonitor:
    """
    Get the device status monitor singleton instance.

    Returns:
        DeviceStatusMonitor instance
    """
    global _monitor_instance
    if _monitor_instance is None:
        _monitor_instance = DeviceStatusMonitor()
    return _monitor_instance


async def start_device_status_monitor():
    """Start the global device status monitor."""
    monitor = get_device_status_monitor()
    await monitor.start()


def stop_device_status_monitor():
    """Stop the global device status monitor."""
    monitor = get_device_status_monitor()
    monitor.stop()
backend/services/fleet_calendar_service.py (new file, 668 lines)
@@ -0,0 +1,668 @@
"""
Fleet Calendar Service

Business logic for:
- Calculating unit availability on any given date
- Calibration status tracking (valid, expiring soon, expired)
- Job reservation management
- Conflict detection (calibration expires mid-job)
"""

from datetime import date, datetime, timedelta
from typing import Dict, List, Optional, Tuple
from sqlalchemy.orm import Session
from sqlalchemy import and_, or_

from backend.models import (
    RosterUnit, JobReservation, JobReservationUnit,
    UserPreferences, Project
)


def get_calibration_status(
    unit: RosterUnit,
    check_date: date,
    warning_days: int = 30
) -> str:
    """
    Determine calibration status for a unit on a specific date.

    Returns:
        "valid" - Calibration is good on this date
        "expiring_soon" - Within warning_days of expiry
        "expired" - Calibration has expired
        "needs_calibration" - No calibration date set
    """
    if not unit.last_calibrated:
        return "needs_calibration"

    # Calculate expiry date (1 year from last calibration)
    expiry_date = unit.last_calibrated + timedelta(days=365)

    if check_date >= expiry_date:
        return "expired"
    elif check_date >= expiry_date - timedelta(days=warning_days):
        return "expiring_soon"
    else:
        return "valid"
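The date arithmetic above can be exercised without the ORM; a minimal standalone version of the same rules, taking a plain `date` in place of a `RosterUnit` (the `calibration_status` name and sample dates are illustrative):

```python
from datetime import date, timedelta
from typing import Optional

def calibration_status(last_calibrated: Optional[date], check_date: date, warning_days: int = 30) -> str:
    """Mirror of the service rules: expiry is 365 days after last calibration."""
    if last_calibrated is None:
        return "needs_calibration"
    expiry = last_calibrated + timedelta(days=365)
    if check_date >= expiry:
        return "expired"
    if check_date >= expiry - timedelta(days=warning_days):
        return "expiring_soon"
    return "valid"

cal = date(2024, 1, 1)  # expiry lands on 2024-12-31 (2024 is a leap year)
print(calibration_status(cal, date(2024, 6, 1)))    # valid
print(calibration_status(cal, date(2024, 12, 15)))  # expiring_soon
print(calibration_status(cal, date(2025, 1, 2)))    # expired
```

Note that a fixed 365-day window drifts by a day across leap years; if "one calendar year" is the intended contract, `dateutil.relativedelta(years=1)` would track it exactly.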
|
||||
|
||||
|
||||
def get_unit_reservations_on_date(
|
||||
db: Session,
|
||||
unit_id: str,
|
||||
check_date: date
|
||||
) -> List[JobReservation]:
|
||||
"""Get all reservations that include this unit on the given date."""
|
||||
|
||||
# Get reservation IDs that have this unit assigned
|
||||
assigned_reservation_ids = db.query(JobReservationUnit.reservation_id).filter(
|
||||
JobReservationUnit.unit_id == unit_id
|
||||
).subquery()
|
||||
|
||||
# Get reservations that:
|
||||
# 1. Have this unit assigned AND date is within range
|
||||
reservations = db.query(JobReservation).filter(
|
||||
JobReservation.id.in_(assigned_reservation_ids),
|
||||
JobReservation.start_date <= check_date,
|
||||
JobReservation.end_date >= check_date
|
||||
).all()
|
||||
|
||||
return reservations
|
||||
|
||||
|
||||
def is_unit_available_on_date(
|
||||
db: Session,
|
||||
unit: RosterUnit,
|
||||
check_date: date,
|
||||
warning_days: int = 30
|
||||
) -> Tuple[bool, str, Optional[str]]:
|
||||
"""
|
||||
Check if a unit is available on a specific date.
|
||||
|
||||
Returns:
|
||||
(is_available, status, reservation_name)
|
||||
- is_available: True if unit can be assigned to new work
|
||||
- status: "available", "reserved", "expired", "retired", "needs_calibration"
|
||||
- reservation_name: Name of blocking reservation (if any)
|
||||
"""
|
||||
# Check if retired
|
||||
if unit.retired:
|
||||
return False, "retired", None
|
||||
|
||||
# Check calibration status
|
||||
cal_status = get_calibration_status(unit, check_date, warning_days)
|
||||
if cal_status == "expired":
|
||||
return False, "expired", None
|
||||
if cal_status == "needs_calibration":
|
||||
return False, "needs_calibration", None
|
||||
|
||||
# Check if already reserved
|
||||
reservations = get_unit_reservations_on_date(db, unit.id, check_date)
|
||||
if reservations:
|
||||
return False, "reserved", reservations[0].name
|
||||
|
||||
# Unit is available (even if expiring soon - that's just a warning)
|
||||
return True, "available", None
|
||||
|
||||
|
||||
def get_day_summary(
|
||||
db: Session,
|
||||
check_date: date,
|
||||
device_type: str = "seismograph"
|
||||
) -> Dict:
|
||||
"""
|
||||
Get a complete summary of fleet status for a specific day.
|
||||
|
||||
Returns dict with:
|
||||
- available_units: List of available unit IDs with calibration info
|
||||
- reserved_units: List of reserved unit IDs with reservation info
|
||||
- expired_units: List of units with expired calibration
|
||||
- expiring_soon_units: List of units expiring within warning period
|
||||
- reservations: List of active reservations on this date
|
||||
- counts: Summary counts
|
||||
"""
|
||||
# Get user preferences for warning days
|
||||
prefs = db.query(UserPreferences).filter_by(id=1).first()
|
||||
warning_days = prefs.calibration_warning_days if prefs else 30
|
||||
|
||||
# Get all non-retired units of the specified device type
|
||||
units = db.query(RosterUnit).filter(
|
||||
RosterUnit.device_type == device_type,
|
||||
RosterUnit.retired == False
|
||||
).all()
|
||||
|
||||
available_units = []
|
||||
reserved_units = []
|
||||
expired_units = []
|
||||
expiring_soon_units = []
|
||||
needs_calibration_units = []
|
||||
cal_expiring_today = [] # Units whose calibration expires ON this day
|
||||
|
||||
for unit in units:
|
||||
        is_avail, status, reservation_name = is_unit_available_on_date(
            db, unit, check_date, warning_days
        )

        cal_status = get_calibration_status(unit, check_date, warning_days)
        expiry_date = None
        if unit.last_calibrated:
            expiry_date = (unit.last_calibrated + timedelta(days=365)).isoformat()

        unit_info = {
            "id": unit.id,
            "last_calibrated": unit.last_calibrated.isoformat() if unit.last_calibrated else None,
            "expiry_date": expiry_date,
            "calibration_status": cal_status,
            "deployed": unit.deployed,
            "note": unit.note or ""
        }

        # Check if calibration expires ON this specific day
        if unit.last_calibrated:
            unit_expiry_date = unit.last_calibrated + timedelta(days=365)
            if unit_expiry_date == check_date:
                cal_expiring_today.append(unit_info)

        if status == "available":
            available_units.append(unit_info)
            if cal_status == "expiring_soon":
                expiring_soon_units.append(unit_info)
        elif status == "reserved":
            unit_info["reservation_name"] = reservation_name
            reserved_units.append(unit_info)
            if cal_status == "expiring_soon":
                expiring_soon_units.append(unit_info)
        elif status == "expired":
            expired_units.append(unit_info)
        elif status == "needs_calibration":
            needs_calibration_units.append(unit_info)

    # Get active reservations on this date
    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= check_date,
        JobReservation.end_date >= check_date
    ).all()

    reservation_list = []
    for res in reservations:
        # Count assigned units for this reservation
        assigned_count = db.query(JobReservationUnit).filter(
            JobReservationUnit.reservation_id == res.id
        ).count()

        reservation_list.append({
            "id": res.id,
            "name": res.name,
            "start_date": res.start_date.isoformat(),
            "end_date": res.end_date.isoformat(),
            "assignment_type": res.assignment_type,
            "quantity_needed": res.quantity_needed,
            "assigned_count": assigned_count,
            "color": res.color,
            "project_id": res.project_id
        })

    return {
        "date": check_date.isoformat(),
        "device_type": device_type,
        "available_units": available_units,
        "reserved_units": reserved_units,
        "expired_units": expired_units,
        "expiring_soon_units": expiring_soon_units,
        "needs_calibration_units": needs_calibration_units,
        "cal_expiring_today": cal_expiring_today,
        "reservations": reservation_list,
        "counts": {
            "available": len(available_units),
            "reserved": len(reserved_units),
            "expired": len(expired_units),
            "expiring_soon": len(expiring_soon_units),
            "needs_calibration": len(needs_calibration_units),
            "cal_expiring_today": len(cal_expiring_today),
            "total": len(units)
        }
    }


def get_calendar_year_data(
    db: Session,
    year: int,
    device_type: str = "seismograph"
) -> Dict:
    """
    Get calendar data for an entire year.

    For performance, this returns summary counts per day rather than
    full unit lists. Use get_day_summary() for detailed day data.
    """
    # Get user preferences
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    # Get all units
    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    # Get all reservations that overlap with this year.
    # Include TBD reservations (end_date is null) that started before year end.
    year_start = date(year, 1, 1)
    year_end = date(year, 12, 31)

    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= year_end,
        or_(
            JobReservation.end_date >= year_start,
            JobReservation.end_date == None  # TBD reservations
        )
    ).all()

    # Get all unit assignments for these reservations
    reservation_ids = [r.id for r in reservations]
    assignments = db.query(JobReservationUnit).filter(
        JobReservationUnit.reservation_id.in_(reservation_ids)
    ).all() if reservation_ids else []

    # Build a lookup: unit_id -> list of (start_date, end_date, reservation_name).
    # For TBD reservations, use estimated_end_date if available, or a far-future date.
    unit_reservations = {}
    for res in reservations:
        res_assignments = [a for a in assignments if a.reservation_id == res.id]
        for assignment in res_assignments:
            unit_id = assignment.unit_id
            # Use unit-specific dates if set, otherwise use reservation dates
            start_d = assignment.unit_start_date or res.start_date
            if assignment.unit_end_tbd or (assignment.unit_end_date is None and res.end_date_tbd):
                # TBD: use estimated date or far future for availability calculation
                end_d = res.estimated_end_date or date(year + 5, 12, 31)
            else:
                end_d = assignment.unit_end_date or res.end_date or date(year + 5, 12, 31)

            if unit_id not in unit_reservations:
                unit_reservations[unit_id] = []
            unit_reservations[unit_id].append((start_d, end_d, res.name))

    # Generate data for each month
    months_data = {}

    for month in range(1, 13):
        # Get first and last day of month
        first_day = date(year, month, 1)
        if month == 12:
            last_day = date(year, 12, 31)
        else:
            last_day = date(year, month + 1, 1) - timedelta(days=1)

        days_data = {}
        current_day = first_day

        while current_day <= last_day:
            available = 0
            reserved = 0
            expired = 0
            expiring_soon = 0
            needs_cal = 0
            cal_expiring_on_day = 0  # Units whose calibration expires ON this day
            cal_expired_on_day = 0   # Units whose calibration expired the previous day

            for unit in units:
                # Check calibration
                cal_status = get_calibration_status(unit, current_day, warning_days)

                # Check if calibration expires/expired ON this specific day
                if unit.last_calibrated:
                    unit_expiry = unit.last_calibrated + timedelta(days=365)
                    if unit_expiry == current_day:
                        cal_expiring_on_day += 1
                    # Check if it expired yesterday (first day of being expired)
                    elif unit_expiry == current_day - timedelta(days=1):
                        cal_expired_on_day += 1

                if cal_status == "expired":
                    expired += 1
                    continue
                if cal_status == "needs_calibration":
                    needs_cal += 1
                    continue

                # Check if reserved
                is_reserved = False
                if unit.id in unit_reservations:
                    for start_d, end_d, _ in unit_reservations[unit.id]:
                        if start_d <= current_day <= end_d:
                            is_reserved = True
                            break

                if is_reserved:
                    reserved += 1
                else:
                    available += 1

                if cal_status == "expiring_soon":
                    expiring_soon += 1

            days_data[current_day.day] = {
                "available": available,
                "reserved": reserved,
                "expired": expired,
                "expiring_soon": expiring_soon,
                "needs_calibration": needs_cal,
                "cal_expiring_on_day": cal_expiring_on_day,
                "cal_expired_on_day": cal_expired_on_day
            }

            current_day += timedelta(days=1)

        months_data[month] = {
            "name": first_day.strftime("%B"),
            "short_name": first_day.strftime("%b"),
            "days": days_data,
            "first_weekday": first_day.weekday(),  # 0=Monday, 6=Sunday
            "num_days": last_day.day
        }

    # Also include a reservation summary for the year
    reservation_list = []
    for res in reservations:
        assigned_count = len([a for a in assignments if a.reservation_id == res.id])
        reservation_list.append({
            "id": res.id,
            "name": res.name,
            "start_date": res.start_date.isoformat(),
            # TBD reservations are included above, so end_date may be None here
            "end_date": res.end_date.isoformat() if res.end_date else None,
            "quantity_needed": res.quantity_needed,
            "assigned_count": assigned_count,
            "color": res.color
        })

    return {
        "year": year,
        "device_type": device_type,
        "months": months_data,
        "reservations": reservation_list,
        "total_units": len(units)
    }


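Both calendar builders derive a month's final day with the same first-of-next-month-minus-one trick. As a standalone sketch (the helper name is illustrative, not part of the service):

```python
from datetime import date, timedelta

def last_day_of_month(year: int, month: int) -> date:
    """Return the final calendar day of the given month."""
    if month == 12:
        return date(year, 12, 31)
    # First day of the following month, stepped back one day,
    # handles 28/29/30/31-day months (including leap years) uniformly.
    return date(year, month + 1, 1) - timedelta(days=1)
```

This avoids hard-coding month lengths and is leap-year correct for free.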
def get_rolling_calendar_data(
    db: Session,
    start_year: int,
    start_month: int,
    device_type: str = "seismograph"
) -> Dict:
    """
    Get calendar data for 12 months starting from a specific month/year.

    This supports the rolling calendar view where users can scroll through
    months one at a time, viewing any 12-month window.
    """
    # Get user preferences
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    # Get all units
    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    # Calculate the date range: the window ends 11 months after the start month
    first_date = date(start_year, start_month, 1)
    end_year = start_year + ((start_month + 10) // 12)
    end_month = ((start_month + 10) % 12) + 1
    if end_month == 12:
        last_date = date(end_year, 12, 31)
    else:
        last_date = date(end_year, end_month + 1, 1) - timedelta(days=1)

    # Get all reservations that overlap with this 12-month range
    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= last_date,
        or_(
            JobReservation.end_date >= first_date,
            JobReservation.end_date == None  # TBD reservations
        )
    ).all()

    # Get all unit assignments for these reservations
    reservation_ids = [r.id for r in reservations]
    assignments = db.query(JobReservationUnit).filter(
        JobReservationUnit.reservation_id.in_(reservation_ids)
    ).all() if reservation_ids else []

    # Build a lookup: unit_id -> list of (start_date, end_date, reservation_name)
    unit_reservations = {}
    for res in reservations:
        res_assignments = [a for a in assignments if a.reservation_id == res.id]
        for assignment in res_assignments:
            unit_id = assignment.unit_id
            start_d = assignment.unit_start_date or res.start_date
            if assignment.unit_end_tbd or (assignment.unit_end_date is None and res.end_date_tbd):
                end_d = res.estimated_end_date or date(start_year + 5, 12, 31)
            else:
                end_d = assignment.unit_end_date or res.end_date or date(start_year + 5, 12, 31)

            if unit_id not in unit_reservations:
                unit_reservations[unit_id] = []
            unit_reservations[unit_id].append((start_d, end_d, res.name))

    # Generate data for each of the 12 months
    months_data = []

    for i in range(12):
        # Calculate this month's year and month
        m_year = start_year + ((start_month - 1 + i) // 12)
        m_month = ((start_month - 1 + i) % 12) + 1

        first_day = date(m_year, m_month, 1)
        if m_month == 12:
            last_day = date(m_year, 12, 31)
        else:
            last_day = date(m_year, m_month + 1, 1) - timedelta(days=1)

        days_data = {}
        current_day = first_day

        while current_day <= last_day:
            available = 0
            reserved = 0
            expired = 0
            expiring_soon = 0
            needs_cal = 0
            cal_expiring_on_day = 0
            cal_expired_on_day = 0

            for unit in units:
                cal_status = get_calibration_status(unit, current_day, warning_days)

                if unit.last_calibrated:
                    unit_expiry = unit.last_calibrated + timedelta(days=365)
                    if unit_expiry == current_day:
                        cal_expiring_on_day += 1
                    elif unit_expiry == current_day - timedelta(days=1):
                        cal_expired_on_day += 1

                if cal_status == "expired":
                    expired += 1
                    continue
                if cal_status == "needs_calibration":
                    needs_cal += 1
                    continue

                is_reserved = False
                if unit.id in unit_reservations:
                    for start_d, end_d, _ in unit_reservations[unit.id]:
                        if start_d <= current_day <= end_d:
                            is_reserved = True
                            break

                if is_reserved:
                    reserved += 1
                else:
                    available += 1

                if cal_status == "expiring_soon":
                    expiring_soon += 1

            days_data[current_day.day] = {
                "available": available,
                "reserved": reserved,
                "expired": expired,
                "expiring_soon": expiring_soon,
                "needs_calibration": needs_cal,
                "cal_expiring_on_day": cal_expiring_on_day,
                "cal_expired_on_day": cal_expired_on_day
            }

            current_day += timedelta(days=1)

        months_data.append({
            "year": m_year,
            "month": m_month,
            "name": first_day.strftime("%B"),
            "short_name": first_day.strftime("%b"),
            "year_short": first_day.strftime("%y"),
            "days": days_data,
            "first_weekday": first_day.weekday(),
            "num_days": last_day.day
        })

    return {
        "start_year": start_year,
        "start_month": start_month,
        "device_type": device_type,
        "months": months_data,
        "total_units": len(units)
    }


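The per-month year/month arithmetic in the loop above (`(start_month - 1 + i) // 12` and `% 12 + 1`) can be exercised in isolation; a hypothetical helper using the same formula:

```python
def window_months(start_year: int, start_month: int, count: int = 12):
    """Yield (year, month) pairs for `count` consecutive months,
    rolling over into the next year as needed."""
    for i in range(count):
        # Work in a zero-based month offset so // and % split cleanly
        offset = start_month - 1 + i
        yield (start_year + offset // 12, offset % 12 + 1)
```

A November 2024 start, for example, rolls into 2025 on the third month of the window.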
def check_calibration_conflicts(
    db: Session,
    reservation_id: str
) -> List[Dict]:
    """
    Check if any units assigned to a reservation will have their
    calibration expire during the reservation period.

    Returns a list of conflicts with unit info and expiry date.
    """
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        return []
    if reservation.end_date is None:
        # TBD reservations have no end date, so there is no bounded
        # period to check conflicts against
        return []

    # Get assigned units
    assigned = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id
    ).all()

    conflicts = []
    for assignment in assigned:
        unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
        if not unit or not unit.last_calibrated:
            continue

        expiry_date = unit.last_calibrated + timedelta(days=365)

        # Check if expiry falls within the reservation period
        if reservation.start_date < expiry_date <= reservation.end_date:
            conflicts.append({
                "unit_id": unit.id,
                "last_calibrated": unit.last_calibrated.isoformat(),
                "expiry_date": expiry_date.isoformat(),
                "reservation_name": reservation.name,
                "days_into_job": (expiry_date - reservation.start_date).days
            })

    return conflicts


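The conflict test above is a half-open interval check: a calibration lapsing exactly on the start date is not a conflict (the unit was already unusable before the job began), while one lapsing on the end date is. Isolated, with an illustrative function name:

```python
from datetime import date

def expires_during(job_start: date, job_end: date, expiry: date) -> bool:
    """True when calibration lapses after the job starts
    but on or before the day it ends: job_start < expiry <= job_end."""
    return job_start < expiry <= job_end
```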
def get_available_units_for_period(
    db: Session,
    start_date: date,
    end_date: date,
    device_type: str = "seismograph",
    exclude_reservation_id: Optional[str] = None
) -> List[Dict]:
    """
    Get units that are available for the entire specified period.

    A unit is available if:
    - Not retired
    - Calibration is valid through the end date
    - Not assigned to any other reservation that overlaps the period
    """
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    # Get reservations that overlap with this period
    overlapping_reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= end_date,
        JobReservation.end_date >= start_date
    )

    if exclude_reservation_id:
        overlapping_reservations = overlapping_reservations.filter(
            JobReservation.id != exclude_reservation_id
        )

    overlapping_reservations = overlapping_reservations.all()

    # Collect all units assigned to overlapping reservations
    reserved_unit_ids = set()
    for res in overlapping_reservations:
        assigned = db.query(JobReservationUnit).filter_by(
            reservation_id=res.id
        ).all()
        for a in assigned:
            reserved_unit_ids.add(a.unit_id)

    available_units = []
    for unit in units:
        # Skip units that are already reserved
        if unit.id in reserved_unit_ids:
            continue

        # Check calibration through the end of the period
        if not unit.last_calibrated:
            continue  # Needs calibration

        expiry_date = unit.last_calibrated + timedelta(days=365)
        if expiry_date <= end_date:
            continue  # Calibration expires during the period

        cal_status = get_calibration_status(unit, end_date, warning_days)

        available_units.append({
            "id": unit.id,
            "last_calibrated": unit.last_calibrated.isoformat(),
            "expiry_date": expiry_date.isoformat(),
            "calibration_status": cal_status,
            "deployed": unit.deployed,
            "note": unit.note or ""
        })

    return available_units

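The reservation query above relies on the standard interval-intersection predicate: two inclusive date ranges overlap exactly when each starts on or before the other ends. As a standalone sketch with hypothetical names:

```python
from datetime import date

def periods_overlap(a_start: date, a_end: date,
                    b_start: date, b_end: date) -> bool:
    """True when the inclusive ranges [a_start, a_end] and
    [b_start, b_end] share at least one day."""
    return a_start <= b_end and b_start <= a_end
```

Sharing a single boundary day counts as an overlap under this inclusive definition.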
backend/services/recurring_schedule_service.py (new file)
@@ -0,0 +1,611 @@
"""
Recurring Schedule Service

Manages recurring schedule definitions and generates ScheduledAction
instances based on patterns (weekly calendar, simple interval).
"""

import json
import uuid
import logging
from datetime import datetime, timedelta, date, time
from typing import Optional, List, Dict, Any, Tuple
from zoneinfo import ZoneInfo

from sqlalchemy.orm import Session
from sqlalchemy import and_

from backend.models import RecurringSchedule, ScheduledAction, MonitoringLocation, UnitAssignment, Project

logger = logging.getLogger(__name__)

# Day name mapping: index matches date.weekday() (0 = Monday)
DAY_NAMES = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"]


class RecurringScheduleService:
    """
    Service for managing recurring schedules and generating ScheduledActions.

    Supports three schedule types:
    - weekly_calendar: Specific days with start/end times
    - simple_interval: Daily stop/download/restart cycles for 24/7 monitoring
    - one_off: A single start/stop pair at explicit datetimes
    """

    def __init__(self, db: Session):
        self.db = db

    def create_schedule(
        self,
        project_id: str,
        location_id: str,
        name: str,
        schedule_type: str,
        device_type: str = "slm",
        unit_id: Optional[str] = None,
        weekly_pattern: Optional[dict] = None,
        interval_type: Optional[str] = None,
        cycle_time: Optional[str] = None,
        include_download: bool = True,
        auto_increment_index: bool = True,
        timezone: str = "America/New_York",
        start_datetime: Optional[datetime] = None,
        end_datetime: Optional[datetime] = None,
    ) -> RecurringSchedule:
        """
        Create a new recurring schedule.

        Args:
            project_id: Project ID
            location_id: Monitoring location ID
            name: Schedule name
            schedule_type: "weekly_calendar", "simple_interval", or "one_off"
            device_type: "slm" or "seismograph"
            unit_id: Specific unit (optional, can use assignment)
            weekly_pattern: Dict of day patterns for weekly_calendar
            interval_type: "daily" or "hourly" for simple_interval
            cycle_time: Time string "HH:MM" for the cycle
            include_download: Whether to download data on each cycle
            auto_increment_index: Whether to auto-increment the store index before start
            timezone: Timezone for schedule times
            start_datetime: Start date+time in UTC (one_off only)
            end_datetime: End date+time in UTC (one_off only)

        Returns:
            The created RecurringSchedule
        """
        schedule = RecurringSchedule(
            id=str(uuid.uuid4()),
            project_id=project_id,
            location_id=location_id,
            unit_id=unit_id,
            name=name,
            schedule_type=schedule_type,
            device_type=device_type,
            weekly_pattern=json.dumps(weekly_pattern) if weekly_pattern else None,
            interval_type=interval_type,
            cycle_time=cycle_time,
            include_download=include_download,
            auto_increment_index=auto_increment_index,
            enabled=True,
            timezone=timezone,
            start_datetime=start_datetime,
            end_datetime=end_datetime,
        )

        # Calculate the next occurrence
        schedule.next_occurrence = self._calculate_next_occurrence(schedule)

        self.db.add(schedule)
        self.db.commit()
        self.db.refresh(schedule)

        logger.info(f"Created recurring schedule: {name} ({schedule_type})")
        return schedule

    def update_schedule(
        self,
        schedule_id: str,
        **kwargs,
    ) -> Optional[RecurringSchedule]:
        """
        Update a recurring schedule.

        Args:
            schedule_id: Schedule to update
            **kwargs: Fields to update

        Returns:
            The updated schedule, or None if not found
        """
        schedule = self.db.query(RecurringSchedule).filter_by(id=schedule_id).first()
        if not schedule:
            return None

        for key, value in kwargs.items():
            if hasattr(schedule, key):
                if key == "weekly_pattern" and isinstance(value, dict):
                    value = json.dumps(value)
                setattr(schedule, key, value)

        # Recalculate the next occurrence
        schedule.next_occurrence = self._calculate_next_occurrence(schedule)

        self.db.commit()
        self.db.refresh(schedule)

        logger.info(f"Updated recurring schedule: {schedule.name}")
        return schedule

    def delete_schedule(self, schedule_id: str) -> bool:
        """
        Delete a recurring schedule and its pending generated actions.

        Args:
            schedule_id: Schedule to delete

        Returns:
            True if deleted, False if not found
        """
        schedule = self.db.query(RecurringSchedule).filter_by(id=schedule_id).first()
        if not schedule:
            return False

        # Delete pending generated actions for this schedule.
        # The schedule_id is stored in the notes field as JSON.
        pending_actions = self.db.query(ScheduledAction).filter(
            and_(
                ScheduledAction.execution_status == "pending",
                ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%'),
            )
        ).all()

        deleted_count = len(pending_actions)
        for action in pending_actions:
            self.db.delete(action)

        self.db.delete(schedule)
        self.db.commit()

        logger.info(f"Deleted recurring schedule: {schedule.name} (and {deleted_count} pending actions)")
        return True

    def enable_schedule(self, schedule_id: str) -> Optional[RecurringSchedule]:
        """Enable a disabled schedule."""
        return self.update_schedule(schedule_id, enabled=True)

    def disable_schedule(self, schedule_id: str) -> Optional[RecurringSchedule]:
        """Disable a schedule and cancel its pending actions."""
        schedule = self.update_schedule(schedule_id, enabled=False)
        if schedule:
            # Cancel all pending actions generated by this schedule
            pending_actions = self.db.query(ScheduledAction).filter(
                and_(
                    ScheduledAction.execution_status == "pending",
                    ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%'),
                )
            ).all()

            for action in pending_actions:
                action.execution_status = "cancelled"

            if pending_actions:
                self.db.commit()
                logger.info(f"Cancelled {len(pending_actions)} pending actions for disabled schedule {schedule.name}")

        return schedule

    def generate_actions_for_schedule(
        self,
        schedule: RecurringSchedule,
        horizon_days: int = 7,
        preview_only: bool = False,
    ) -> List[ScheduledAction]:
        """
        Generate ScheduledAction entries for the next N days based on the pattern.

        Args:
            schedule: The recurring schedule
            horizon_days: Days ahead to generate
            preview_only: If True, don't save to DB (for preview)

        Returns:
            List of generated ScheduledAction instances
        """
        if not schedule.enabled:
            return []

        if schedule.schedule_type == "weekly_calendar":
            actions = self._generate_weekly_calendar_actions(schedule, horizon_days)
        elif schedule.schedule_type == "simple_interval":
            actions = self._generate_interval_actions(schedule, horizon_days)
        elif schedule.schedule_type == "one_off":
            actions = self._generate_one_off_actions(schedule)
        else:
            logger.warning(f"Unknown schedule type: {schedule.schedule_type}")
            return []

        if not preview_only and actions:
            for action in actions:
                self.db.add(action)

            schedule.last_generated_at = datetime.utcnow()
            schedule.next_occurrence = self._calculate_next_occurrence(schedule)

            self.db.commit()
            logger.info(f"Generated {len(actions)} actions for schedule: {schedule.name}")

        return actions

    def _generate_weekly_calendar_actions(
        self,
        schedule: RecurringSchedule,
        horizon_days: int,
    ) -> List[ScheduledAction]:
        """
        Generate actions from a weekly calendar pattern.

        Pattern format:
            {
                "monday": {"enabled": true, "start": "19:00", "end": "07:00"},
                "tuesday": {"enabled": false},
                ...
            }
        """
        if not schedule.weekly_pattern:
            return []

        try:
            pattern = json.loads(schedule.weekly_pattern)
        except json.JSONDecodeError:
            logger.error(f"Invalid weekly_pattern JSON for schedule {schedule.id}")
            return []

        actions = []
        tz = ZoneInfo(schedule.timezone)
        now_utc = datetime.utcnow()
        now_local = now_utc.replace(tzinfo=ZoneInfo("UTC")).astimezone(tz)

        # Get unit_id (from the schedule or its assignment)
        unit_id = self._resolve_unit_id(schedule)

        for day_offset in range(horizon_days):
            check_date = now_local.date() + timedelta(days=day_offset)
            day_name = DAY_NAMES[check_date.weekday()]
            day_config = pattern.get(day_name, {})

            if not day_config.get("enabled", False):
                continue

            start_time_str = day_config.get("start")
            end_time_str = day_config.get("end")

            if not start_time_str or not end_time_str:
                continue

            # Parse times
            start_time = self._parse_time(start_time_str)
            end_time = self._parse_time(end_time_str)

            if not start_time or not end_time:
                continue

            # Create the start datetime in the local timezone, then convert to naive UTC
            start_local = datetime.combine(check_date, start_time, tzinfo=tz)
            start_utc = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

            # Handle overnight schedules (end time falls on the next day)
            if end_time <= start_time:
                end_date = check_date + timedelta(days=1)
            else:
                end_date = check_date

            end_local = datetime.combine(end_date, end_time, tzinfo=tz)
            end_utc = end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

            # Skip if the start time has already passed
            if start_utc <= now_utc:
                continue

            # Skip if the action already exists
            if self._action_exists(schedule.project_id, schedule.location_id, "start", start_utc):
                continue

            # Build notes with automation metadata
            start_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "auto_increment_index": schedule.auto_increment_index,
            })

            # Create the START action
            start_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
                location_id=schedule.location_id,
                unit_id=unit_id,
                action_type="start",
                device_type=schedule.device_type,
                scheduled_time=start_utc,
                execution_status="pending",
                notes=start_notes,
            )
            actions.append(start_action)

            # Create the STOP action (stop_cycle handles download when include_download is True)
            stop_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "schedule_type": "weekly_calendar",
                "include_download": schedule.include_download,
            })
            stop_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
                location_id=schedule.location_id,
                unit_id=unit_id,
                action_type="stop",
                device_type=schedule.device_type,
                scheduled_time=end_utc,
                execution_status="pending",
                notes=stop_notes,
            )
            actions.append(stop_action)

        return actions

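The local-to-UTC conversion and overnight rollover above can be exercised in isolation. A sketch under the assumption that the zoneinfo database is available (the helper name is illustrative, not part of the service):

```python
from datetime import datetime, date, time, timedelta
from zoneinfo import ZoneInfo

def local_window_to_utc(day: date, start: time, end: time, tz_name: str):
    """Convert a local start/end time pair to naive UTC datetimes,
    rolling the end to the next day when it is at or before the
    start (an overnight monitoring window)."""
    tz = ZoneInfo(tz_name)
    start_local = datetime.combine(day, start, tzinfo=tz)
    # end <= start means the window crosses midnight
    end_day = day + timedelta(days=1) if end <= start else day
    end_local = datetime.combine(end_day, end, tzinfo=tz)

    def to_naive_utc(dt: datetime) -> datetime:
        return dt.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

    return to_naive_utc(start_local), to_naive_utc(end_local)
```

A 19:00-to-07:00 New York window in January (EST, UTC-5), for instance, spans midnight UTC into the following day.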
    def _generate_interval_actions(
        self,
        schedule: RecurringSchedule,
        horizon_days: int,
    ) -> List[ScheduledAction]:
        """
        Generate actions from a simple interval pattern.

        For daily cycles: stop, download (optional), and restart at cycle_time each day.
        """
        if not schedule.cycle_time:
            return []

        cycle_time = self._parse_time(schedule.cycle_time)
        if not cycle_time:
            return []

        actions = []
        tz = ZoneInfo(schedule.timezone)
        now_utc = datetime.utcnow()
        now_local = now_utc.replace(tzinfo=ZoneInfo("UTC")).astimezone(tz)

        # Get unit_id
        unit_id = self._resolve_unit_id(schedule)

        for day_offset in range(horizon_days):
            check_date = now_local.date() + timedelta(days=day_offset)

            # Create the cycle datetime in the local timezone
            cycle_local = datetime.combine(check_date, cycle_time, tzinfo=tz)
            cycle_utc = cycle_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

            # Skip if the time has already passed
            if cycle_utc <= now_utc:
                continue

            # Skip if the cycle action already exists
            if self._action_exists(schedule.project_id, schedule.location_id, "cycle", cycle_utc):
                continue

            # Build notes with metadata for the cycle action
            cycle_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "cycle_type": "daily",
                "include_download": schedule.include_download,
                "auto_increment_index": schedule.auto_increment_index,
            })

            # Create a single CYCLE action that handles stop -> download -> start.
            # The scheduler's _execute_cycle method handles the full workflow with delays.
            cycle_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
                location_id=schedule.location_id,
                unit_id=unit_id,
                action_type="cycle",
                device_type=schedule.device_type,
                scheduled_time=cycle_utc,
                execution_status="pending",
                notes=cycle_notes,
            )
            actions.append(cycle_action)

        return actions

    def _generate_one_off_actions(
        self,
        schedule: RecurringSchedule,
    ) -> List[ScheduledAction]:
        """
        Generate start and stop actions for a one-off recording.

        Unlike the recurring types, this generates exactly one start and one stop
        action, using the schedule's start_datetime and end_datetime directly.
        """
        if not schedule.start_datetime or not schedule.end_datetime:
            logger.warning(f"One-off schedule {schedule.id} missing start/end datetime")
            return []

        actions = []
        now_utc = datetime.utcnow()
        unit_id = self._resolve_unit_id(schedule)

        # Skip if the end time has already passed
        if schedule.end_datetime <= now_utc:
            return []

        # Skip if actions already exist for this schedule
        if self._action_exists(schedule.project_id, schedule.location_id, "start", schedule.start_datetime):
            return []

        # Create the START action (only if the start time hasn't passed)
        if schedule.start_datetime > now_utc:
            start_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "schedule_type": "one_off",
                "auto_increment_index": schedule.auto_increment_index,
            })

            start_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
                location_id=schedule.location_id,
                unit_id=unit_id,
                action_type="start",
                device_type=schedule.device_type,
                scheduled_time=schedule.start_datetime,
                execution_status="pending",
                notes=start_notes,
            )
            actions.append(start_action)

        # Create the STOP action
        stop_notes = json.dumps({
            "schedule_name": schedule.name,
            "schedule_id": schedule.id,
            "schedule_type": "one_off",
            "include_download": schedule.include_download,
        })

        stop_action = ScheduledAction(
            id=str(uuid.uuid4()),
            project_id=schedule.project_id,
            location_id=schedule.location_id,
            unit_id=unit_id,
            action_type="stop",
            device_type=schedule.device_type,
            scheduled_time=schedule.end_datetime,
            execution_status="pending",
            notes=stop_notes,
        )
        actions.append(stop_action)

        return actions

    def _calculate_next_occurrence(self, schedule: RecurringSchedule) -> Optional[datetime]:
        """Calculate when the next action should occur."""
        if not schedule.enabled:
            return None

        tz = ZoneInfo(schedule.timezone)
        now_utc = datetime.utcnow()
        now_local = now_utc.replace(tzinfo=ZoneInfo("UTC")).astimezone(tz)

        if schedule.schedule_type == "weekly_calendar" and schedule.weekly_pattern:
            try:
                pattern = json.loads(schedule.weekly_pattern)
            except (json.JSONDecodeError, TypeError):
                return None

            # Find next enabled day
            for day_offset in range(8):  # Check up to a week ahead
                check_date = now_local.date() + timedelta(days=day_offset)
                day_name = DAY_NAMES[check_date.weekday()]
                day_config = pattern.get(day_name, {})

                if day_config.get("enabled") and day_config.get("start"):
                    start_time = self._parse_time(day_config["start"])
                    if start_time:
                        start_local = datetime.combine(check_date, start_time, tzinfo=tz)
                        start_utc = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
                        if start_utc > now_utc:
                            return start_utc

        elif schedule.schedule_type == "simple_interval" and schedule.cycle_time:
            cycle_time = self._parse_time(schedule.cycle_time)
            if cycle_time:
                # Find next cycle time
                for day_offset in range(2):
                    check_date = now_local.date() + timedelta(days=day_offset)
                    cycle_local = datetime.combine(check_date, cycle_time, tzinfo=tz)
                    cycle_utc = cycle_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
                    if cycle_utc > now_utc:
                        return cycle_utc

        elif schedule.schedule_type == "one_off":
            if schedule.start_datetime and schedule.start_datetime > now_utc:
                return schedule.start_datetime
            elif schedule.end_datetime and schedule.end_datetime > now_utc:
                return schedule.end_datetime
            return None

        return None

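The local-to-UTC conversion above (`datetime.combine(...).astimezone(ZoneInfo("UTC")).replace(tzinfo=None)`) is the load-bearing pattern in this method. A minimal standalone sketch, with a helper name of my own (not from the codebase):

```python
from datetime import date, datetime, time
from zoneinfo import ZoneInfo

def wall_time_to_naive_utc(day: date, wall: time, tz_name: str) -> datetime:
    # Attach the schedule's timezone to the wall-clock time, convert to UTC,
    # then drop tzinfo so the result compares against datetime.utcnow()-style
    # naive timestamps used elsewhere in the service.
    local = datetime.combine(day, wall, tzinfo=ZoneInfo(tz_name))
    return local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
```

Stripping tzinfo at the end is what keeps comparisons like `start_utc > now_utc` valid, since mixing aware and naive datetimes raises `TypeError`.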
    def _resolve_unit_id(self, schedule: RecurringSchedule) -> Optional[str]:
        """Get unit_id from schedule or active assignment."""
        if schedule.unit_id:
            return schedule.unit_id

        # Try to get from active assignment
        assignment = self.db.query(UnitAssignment).filter(
            and_(
                UnitAssignment.location_id == schedule.location_id,
                UnitAssignment.status == "active",
            )
        ).first()

        return assignment.unit_id if assignment else None

    def _action_exists(
        self,
        project_id: str,
        location_id: str,
        action_type: str,
        scheduled_time: datetime,
    ) -> bool:
        """Check if an action already exists for this time slot."""
        # Allow 5-minute window for duplicate detection
        time_window_start = scheduled_time - timedelta(minutes=5)
        time_window_end = scheduled_time + timedelta(minutes=5)

        exists = self.db.query(ScheduledAction).filter(
            and_(
                ScheduledAction.project_id == project_id,
                ScheduledAction.location_id == location_id,
                ScheduledAction.action_type == action_type,
                ScheduledAction.scheduled_time >= time_window_start,
                ScheduledAction.scheduled_time <= time_window_end,
                ScheduledAction.execution_status == "pending",
            )
        ).first()

        return exists is not None

    @staticmethod
    def _parse_time(time_str: str) -> Optional[time]:
        """Parse time string "HH:MM" to time object."""
        try:
            parts = time_str.split(":")
            return time(int(parts[0]), int(parts[1]))
        except (ValueError, IndexError):
            return None

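The ±5-minute duplicate-detection window that `_action_exists` expresses as a SQL range filter can be isolated as a small pure predicate (illustrative helper, not part of the codebase):

```python
from datetime import datetime, timedelta

def within_dedup_window(existing: datetime, candidate: datetime, minutes: int = 5) -> bool:
    # A candidate action counts as a duplicate when it falls within
    # +/- `minutes` of an already-scheduled pending action's time.
    delta = timedelta(minutes=minutes)
    return existing - delta <= candidate <= existing + delta
```

The window exists because hourly regeneration revisits the same 7-day horizon; without it, each pass would insert the same start/stop actions again.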
    def get_schedules_for_project(self, project_id: str) -> List[RecurringSchedule]:
        """Get all recurring schedules for a project."""
        return self.db.query(RecurringSchedule).filter_by(project_id=project_id).all()

    def get_enabled_schedules(self) -> List[RecurringSchedule]:
        """Get all enabled recurring schedules for projects that are not on hold or deleted."""
        active_project_ids = [
            p.id for p in self.db.query(Project.id).filter(
                Project.status.notin_(["on_hold", "archived", "deleted"])
            ).all()
        ]
        return self.db.query(RecurringSchedule).filter(
            RecurringSchedule.enabled == True,
            RecurringSchedule.project_id.in_(active_project_ids),
        ).all()


def get_recurring_schedule_service(db: Session) -> RecurringScheduleService:
    """Get a RecurringScheduleService instance."""
    return RecurringScheduleService(db)
@@ -4,22 +4,30 @@ Scheduler Service
Executes scheduled actions for Projects system.
Monitors pending scheduled actions and executes them by calling device modules (SLMM/SFM).

Extended to support recurring schedules:
- Generates ScheduledActions from RecurringSchedule patterns
- Cleans up old completed/failed actions

This service runs as a background task in FastAPI, checking for pending actions
every minute and executing them when their scheduled time arrives.
"""

import asyncio
import json
import logging
from datetime import datetime, timedelta
from typing import Optional, List, Dict, Any
from sqlalchemy.orm import Session
from sqlalchemy import and_

from backend.database import SessionLocal
from backend.models import ScheduledAction, RecordingSession, MonitoringLocation, Project
from backend.models import ScheduledAction, MonitoringSession, MonitoringLocation, Project, RecurringSchedule
from backend.services.device_controller import get_device_controller, DeviceControllerError
from backend.services.alert_service import get_alert_service
import uuid

logger = logging.getLogger(__name__)


class SchedulerService:
    """
@@ -62,11 +70,26 @@ class SchedulerService:

    async def _run_loop(self):
        """Main scheduler loop."""
        # Track when we last generated recurring actions (do this once per hour)
        last_generation_check = datetime.utcnow() - timedelta(hours=1)

        while self.running:
            try:
                # Execute pending actions
                await self.execute_pending_actions()

                # Generate actions from recurring schedules (every hour)
                now = datetime.utcnow()
                if (now - last_generation_check).total_seconds() >= 3600:
                    await self.generate_recurring_actions()
                    last_generation_check = now

                # Cleanup old actions (also every hour, during generation cycle)
                if (now - last_generation_check).total_seconds() < 60:
                    await self.cleanup_old_actions()

            except Exception as e:
                print(f"Scheduler error: {e}")
                logger.error(f"Scheduler error: {e}", exc_info=True)
                # Continue running even if there's an error

            await asyncio.sleep(self.check_interval)
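The loop above interleaves a minutely pending-action check with hourly generation by comparing timestamps. The gating condition can be sketched as a pure function (hypothetical helper name, mirroring the `3600`-second threshold in the loop):

```python
from datetime import datetime

def hourly_tasks_due(last_check: datetime, now: datetime, interval_seconds: int = 3600) -> bool:
    # The scheduler wakes on every check_interval tick; generation only fires
    # once the full interval has elapsed since the previous generation run.
    return (now - last_check).total_seconds() >= interval_seconds
```

Seeding `last_generation_check` one hour in the past, as the loop does, makes this condition true on the very first iteration, so generation runs immediately at startup.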
@@ -84,10 +107,19 @@ class SchedulerService:
        try:
            # Find pending actions that are due
            now = datetime.utcnow()

            # Only execute actions for active/completed projects (not on_hold, archived, or deleted)
            active_project_ids = [
                p.id for p in db.query(Project.id).filter(
                    Project.status.notin_(["on_hold", "archived", "deleted"])
                ).all()
            ]

            pending_actions = db.query(ScheduledAction).filter(
                and_(
                    ScheduledAction.execution_status == "pending",
                    ScheduledAction.scheduled_time <= now,
                    ScheduledAction.project_id.in_(active_project_ids),
                )
            ).order_by(ScheduledAction.scheduled_time).all()

@@ -162,6 +194,8 @@ class SchedulerService:
                response = await self._execute_stop(action, unit_id, db)
            elif action.action_type == "download":
                response = await self._execute_download(action, unit_id, db)
            elif action.action_type == "cycle":
                response = await self._execute_cycle(action, unit_id, db)
            else:
                raise Exception(f"Unknown action type: {action.action_type}")

@@ -175,6 +209,21 @@ class SchedulerService:

            print(f"✓ Action {action.id} completed successfully")

            # Create success alert
            try:
                alert_service = get_alert_service(db)
                alert_metadata = response.get("cycle_response", {}) if isinstance(response, dict) else {}
                alert_service.create_schedule_completed_alert(
                    schedule_id=action.id,
                    action_type=action.action_type,
                    unit_id=unit_id,
                    project_id=action.project_id,
                    location_id=action.location_id,
                    metadata=alert_metadata,
                )
            except Exception as alert_err:
                logger.warning(f"Failed to create success alert: {alert_err}")

        except Exception as e:
            # Mark action as failed
            action.execution_status = "failed"
@@ -185,6 +234,20 @@ class SchedulerService:

            print(f"✗ Action {action.id} failed: {e}")

            # Create failure alert
            try:
                alert_service = get_alert_service(db)
                alert_service.create_schedule_failed_alert(
                    schedule_id=action.id,
                    action_type=action.action_type,
                    unit_id=unit_id if 'unit_id' in locals() else action.unit_id,
                    error_message=str(e),
                    project_id=action.project_id,
                    location_id=action.location_id,
                )
            except Exception as alert_err:
                logger.warning(f"Failed to create failure alert: {alert_err}")

        return result

    async def _execute_start(
@@ -193,31 +256,41 @@ class SchedulerService:
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a 'start' action."""
        # Start recording via device controller
        response = await self.device_controller.start_recording(
        """Execute a 'start' action using the start_cycle command.

        start_cycle handles:
        1. Sync device clock to server time
        2. Find next safe index (with overwrite protection)
        3. Start measurement
        """
        # Execute the full start cycle via device controller
        # SLMM handles clock sync, index increment, and start
        cycle_response = await self.device_controller.start_cycle(
            unit_id,
            action.device_type,
            config={},  # TODO: Load config from action.notes or metadata
            sync_clock=True,
        )

        # Create recording session
        session = RecordingSession(
        session = MonitoringSession(
            id=str(uuid.uuid4()),
            project_id=action.project_id,
            location_id=action.location_id,
            unit_id=unit_id,
            session_type="sound" if action.device_type == "sound_level_meter" else "vibration",
            session_type="sound" if action.device_type == "slm" else "vibration",
            started_at=datetime.utcnow(),
            status="recording",
            session_metadata=json.dumps({"scheduled_action_id": action.id}),
            session_metadata=json.dumps({
                "scheduled_action_id": action.id,
                "cycle_response": cycle_response,
            }),
        )
        db.add(session)

        return {
            "status": "started",
            "session_id": session.id,
            "device_response": response,
            "cycle_response": cycle_response,
        }

    async def _execute_stop(
@@ -226,19 +299,48 @@ class SchedulerService:
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a 'stop' action."""
        # Stop recording via device controller
        response = await self.device_controller.stop_recording(
        """Execute a 'stop' action using the stop_cycle command.

        stop_cycle handles:
        1. Stop measurement
        2. Enable FTP
        3. Download measurement folder to SLMM local storage

        After stop_cycle, if download succeeded, this method fetches the ZIP
        from SLMM and extracts it into Terra-View's project directory, creating
        DataFile records for each file.
        """
        import hashlib
        import io
        import os
        import zipfile
        import httpx
        from pathlib import Path
        from backend.models import DataFile

        # Parse notes for download preference
        include_download = True
        try:
            if action.notes:
                notes_data = json.loads(action.notes)
                include_download = notes_data.get("include_download", True)
        except json.JSONDecodeError:
            pass  # Notes is plain text, not JSON

        # Execute the full stop cycle via device controller
        # SLMM handles stop, FTP enable, and download to SLMM-local storage
        cycle_response = await self.device_controller.stop_cycle(
            unit_id,
            action.device_type,
            download=include_download,
        )

        # Find and update the active recording session
        active_session = db.query(RecordingSession).filter(
        active_session = db.query(MonitoringSession).filter(
            and_(
                RecordingSession.location_id == action.location_id,
                RecordingSession.unit_id == unit_id,
                RecordingSession.status == "recording",
                MonitoringSession.location_id == action.location_id,
                MonitoringSession.unit_id == unit_id,
                MonitoringSession.status == "recording",
            )
        ).first()

@@ -248,11 +350,91 @@ class SchedulerService:
            active_session.duration_seconds = int(
                (active_session.stopped_at - active_session.started_at).total_seconds()
            )
            # Store download info in session metadata
            if cycle_response.get("download_success"):
                try:
                    metadata = json.loads(active_session.session_metadata or "{}")
                    metadata["downloaded_folder"] = cycle_response.get("downloaded_folder")
                    metadata["local_path"] = cycle_response.get("local_path")
                    active_session.session_metadata = json.dumps(metadata)
                except json.JSONDecodeError:
                    pass

        db.commit()

        # If SLMM downloaded the folder successfully, fetch the ZIP from SLMM
        # and extract it into Terra-View's project directory, creating DataFile records
        files_created = 0
        if include_download and cycle_response.get("download_success") and active_session:
            folder_name = cycle_response.get("downloaded_folder")  # e.g. "Auto_0058"
            remote_path = f"/NL-43/{folder_name}"

            try:
                SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
                async with httpx.AsyncClient(timeout=600.0) as client:
                    zip_response = await client.post(
                        f"{SLMM_BASE_URL}/api/nl43/{unit_id}/ftp/download-folder",
                        json={"remote_path": remote_path}
                    )

                if zip_response.is_success and len(zip_response.content) > 22:
                    base_dir = Path(f"data/Projects/{action.project_id}/{active_session.id}/{folder_name}")
                    base_dir.mkdir(parents=True, exist_ok=True)

                    file_type_map = {
                        '.wav': 'audio', '.mp3': 'audio',
                        '.csv': 'data', '.txt': 'data', '.json': 'data', '.dat': 'data',
                        '.rnd': 'data', '.rnh': 'data',
                        '.log': 'log',
                        '.zip': 'archive',
                        '.jpg': 'image', '.jpeg': 'image', '.png': 'image',
                        '.pdf': 'document',
                    }

                    with zipfile.ZipFile(io.BytesIO(zip_response.content)) as zf:
                        for zip_info in zf.filelist:
                            if zip_info.is_dir():
                                continue
                            file_data = zf.read(zip_info.filename)
                            file_path = base_dir / zip_info.filename
                            file_path.parent.mkdir(parents=True, exist_ok=True)
                            with open(file_path, 'wb') as f:
                                f.write(file_data)
                            checksum = hashlib.sha256(file_data).hexdigest()
                            ext = os.path.splitext(zip_info.filename)[1].lower()
                            data_file = DataFile(
                                id=str(uuid.uuid4()),
                                session_id=active_session.id,
                                file_path=str(file_path.relative_to("data")),
                                file_type=file_type_map.get(ext, 'data'),
                                file_size_bytes=len(file_data),
                                downloaded_at=datetime.utcnow(),
                                checksum=checksum,
                                file_metadata=json.dumps({
                                    "source": "stop_cycle",
                                    "remote_path": remote_path,
                                    "unit_id": unit_id,
                                    "folder_name": folder_name,
                                    "relative_path": zip_info.filename,
                                }),
                            )
                            db.add(data_file)
                            files_created += 1

                    db.commit()
                    logger.info(f"Created {files_created} DataFile records for session {active_session.id} from {folder_name}")
                else:
                    logger.warning(f"ZIP from SLMM for {folder_name} was empty or failed, skipping DataFile creation")

            except Exception as e:
                logger.error(f"Failed to extract ZIP and create DataFile records for {folder_name}: {e}")
                # Don't fail the stop action — the device was stopped successfully

        return {
            "status": "stopped",
            "session_id": active_session.id if active_session else None,
            "device_response": response,
            "cycle_response": cycle_response,
            "files_created": files_created,
        }

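The in-memory ZIP extraction with per-file SHA-256 checksums used for DataFile creation can be sketched in isolation (hypothetical helper; the real method also builds ORM records and skips the 22-byte empty-ZIP case):

```python
import hashlib
import io
import zipfile
from pathlib import Path

def extract_zip_with_checksums(zip_bytes: bytes, dest_dir: Path) -> dict:
    """Extract an in-memory ZIP into dest_dir; return {relative_path: sha256 hex}."""
    checksums = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue  # directory entries carry no data
            data = zf.read(info.filename)
            target = dest_dir / info.filename
            target.parent.mkdir(parents=True, exist_ok=True)
            target.write_bytes(data)
            checksums[info.filename] = hashlib.sha256(data).hexdigest()
    return checksums
```

Hashing the bytes already held in memory (rather than re-reading the written file) matches the service's approach and avoids a second disk pass.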
    async def _execute_download(
@@ -261,7 +443,14 @@ class SchedulerService:
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a 'download' action."""
        """Execute a 'download' action.

        This handles standalone download actions (not part of stop_cycle).
        The workflow is:
        1. Enable FTP on device
        2. Download current measurement folder
        3. (Optionally disable FTP - left enabled for now)
        """
        # Get project and location info for file path
        location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
        project = db.query(Project).filter_by(id=action.project_id).first()
@@ -269,22 +458,33 @@ class SchedulerService:
        if not location or not project:
            raise Exception("Project or location not found")

        # Build destination path
        # Example: data/Projects/{project-id}/sound/{location-name}/session-{timestamp}/
        # Build destination path (for logging/metadata reference)
        # Actual download location is managed by SLMM (data/downloads/{unit_id}/)
        session_timestamp = datetime.utcnow().strftime("%Y-%m-%d-%H%M")
        location_type_dir = "sound" if action.device_type == "sound_level_meter" else "vibration"
        location_type_dir = "sound" if action.device_type == "slm" else "vibration"

        destination_path = (
            f"data/Projects/{project.id}/{location_type_dir}/"
            f"{location.name}/session-{session_timestamp}/"
        )

        # Download files via device controller
        # Step 1: Disable FTP first to reset any stale connection state
        # Then enable FTP on device
        logger.info(f"Resetting FTP on {unit_id} for download (disable then enable)")
        try:
            await self.device_controller.disable_ftp(unit_id, action.device_type)
        except Exception as e:
            logger.warning(f"FTP disable failed (may already be off): {e}")
        await self.device_controller.enable_ftp(unit_id, action.device_type)

        # Step 2: Download current measurement folder
        # The slmm_client.download_files() now automatically determines the correct
        # folder based on the device's current index number
        response = await self.device_controller.download_files(
            unit_id,
            action.device_type,
            destination_path,
            files=None,  # Download all files
            files=None,  # Download all files in current measurement folder
        )

        # TODO: Create DataFile records for downloaded files
@@ -295,6 +495,293 @@ class SchedulerService:
            "device_response": response,
        }

    async def _execute_cycle(
        self,
        action: ScheduledAction,
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a full 'cycle' action: stop -> download -> start.

        This combines stop, download, and start into a single action with
        appropriate delays between steps to ensure device stability.

        Workflow:
        0. Pause background polling to prevent command conflicts
        1. Stop measurement (wait 10s)
        2. Disable FTP to reset state (wait 10s)
        3. Enable FTP (wait 10s)
        4. Download current measurement folder
        5. Wait 30s for device to settle
        6. Start new measurement cycle
        7. Re-enable background polling

        Total time: ~70-90 seconds depending on download size
        """
        logger.info(f"[CYCLE] === Starting full cycle for {unit_id} ===")

        result = {
            "status": "cycle_complete",
            "steps": {},
            "old_session_id": None,
            "new_session_id": None,
            "polling_paused": False,
        }

        # Step 0: Pause background polling for this device to prevent command conflicts
        # NL-43 devices only support one TCP connection at a time
        logger.info(f"[CYCLE] Step 0: Pausing background polling for {unit_id}")
        polling_was_enabled = False
        try:
            if action.device_type == "slm":
                # Get current polling state to restore later
                from backend.services.slmm_client import get_slmm_client
                slmm = get_slmm_client()
                try:
                    polling_config = await slmm.get_device_polling_config(unit_id)
                    polling_was_enabled = polling_config.get("poll_enabled", False)
                except Exception:
                    polling_was_enabled = True  # Assume enabled if can't check

                # Disable polling during cycle
                await slmm.update_device_polling_config(unit_id, poll_enabled=False)
                result["polling_paused"] = True
                logger.info(f"[CYCLE] Background polling paused for {unit_id}")
        except Exception as e:
            logger.warning(f"[CYCLE] Failed to pause polling (continuing anyway): {e}")

        try:
            # Step 1: Stop measurement
            logger.info(f"[CYCLE] Step 1/7: Stopping measurement on {unit_id}")
            try:
                stop_response = await self.device_controller.stop_recording(unit_id, action.device_type)
                result["steps"]["stop"] = {"success": True, "response": stop_response}
                logger.info(f"[CYCLE] Measurement stopped, waiting 10s...")
            except Exception as e:
                logger.warning(f"[CYCLE] Stop failed (may already be stopped): {e}")
                result["steps"]["stop"] = {"success": False, "error": str(e)}

            await asyncio.sleep(10)

            # Step 2: Disable FTP to reset any stale state
            logger.info(f"[CYCLE] Step 2/7: Disabling FTP on {unit_id}")
            try:
                await self.device_controller.disable_ftp(unit_id, action.device_type)
                result["steps"]["ftp_disable"] = {"success": True}
                logger.info(f"[CYCLE] FTP disabled, waiting 10s...")
            except Exception as e:
                logger.warning(f"[CYCLE] FTP disable failed (may already be off): {e}")
                result["steps"]["ftp_disable"] = {"success": False, "error": str(e)}

            await asyncio.sleep(10)

            # Step 3: Enable FTP
            logger.info(f"[CYCLE] Step 3/7: Enabling FTP on {unit_id}")
            try:
                await self.device_controller.enable_ftp(unit_id, action.device_type)
                result["steps"]["ftp_enable"] = {"success": True}
                logger.info(f"[CYCLE] FTP enabled, waiting 10s...")
            except Exception as e:
                logger.error(f"[CYCLE] FTP enable failed: {e}")
                result["steps"]["ftp_enable"] = {"success": False, "error": str(e)}
                # Continue anyway - download will fail but we can still try to start

            await asyncio.sleep(10)

            # Step 4: Download current measurement folder
            logger.info(f"[CYCLE] Step 4/7: Downloading measurement data from {unit_id}")
            location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
            project = db.query(Project).filter_by(id=action.project_id).first()

            if location and project:
                session_timestamp = datetime.utcnow().strftime("%Y-%m-%d-%H%M")
                location_type_dir = "sound" if action.device_type == "slm" else "vibration"
                destination_path = (
                    f"data/Projects/{project.id}/{location_type_dir}/"
                    f"{location.name}/session-{session_timestamp}/"
                )

                try:
                    download_response = await self.device_controller.download_files(
                        unit_id,
                        action.device_type,
                        destination_path,
                        files=None,
                    )
                    result["steps"]["download"] = {"success": True, "response": download_response}
                    logger.info(f"[CYCLE] Download complete")
                except Exception as e:
                    logger.error(f"[CYCLE] Download failed: {e}")
                    result["steps"]["download"] = {"success": False, "error": str(e)}
            else:
                result["steps"]["download"] = {"success": False, "error": "Project or location not found"}

            # Close out the old recording session
            active_session = db.query(MonitoringSession).filter(
                and_(
                    MonitoringSession.location_id == action.location_id,
                    MonitoringSession.unit_id == unit_id,
                    MonitoringSession.status == "recording",
                )
            ).first()

            if active_session:
                active_session.stopped_at = datetime.utcnow()
                active_session.status = "completed"
                active_session.duration_seconds = int(
                    (active_session.stopped_at - active_session.started_at).total_seconds()
                )
                result["old_session_id"] = active_session.id

            # Step 5: Wait for device to settle before starting new measurement
            logger.info(f"[CYCLE] Step 5/7: Waiting 30s for device to settle...")
            await asyncio.sleep(30)

            # Step 6: Start new measurement cycle
            logger.info(f"[CYCLE] Step 6/7: Starting new measurement on {unit_id}")
            try:
                cycle_response = await self.device_controller.start_cycle(
                    unit_id,
                    action.device_type,
                    sync_clock=True,
                )
                result["steps"]["start"] = {"success": True, "response": cycle_response}

                # Create new recording session
                new_session = MonitoringSession(
                    id=str(uuid.uuid4()),
                    project_id=action.project_id,
                    location_id=action.location_id,
                    unit_id=unit_id,
                    session_type="sound" if action.device_type == "slm" else "vibration",
                    started_at=datetime.utcnow(),
                    status="recording",
                    session_metadata=json.dumps({
                        "scheduled_action_id": action.id,
                        "cycle_response": cycle_response,
                        "action_type": "cycle",
                    }),
                )
                db.add(new_session)
                result["new_session_id"] = new_session.id

                logger.info(f"[CYCLE] New measurement started, session {new_session.id}")

            except Exception as e:
                logger.error(f"[CYCLE] Start failed: {e}")
                result["steps"]["start"] = {"success": False, "error": str(e)}
                raise  # Re-raise to mark the action as failed

        finally:
            # Step 7: Re-enable background polling (always runs, even on failure)
            if result.get("polling_paused") and polling_was_enabled:
                logger.info(f"[CYCLE] Step 7/7: Re-enabling background polling for {unit_id}")
                try:
                    if action.device_type == "slm":
                        from backend.services.slmm_client import get_slmm_client
                        slmm = get_slmm_client()
                        await slmm.update_device_polling_config(unit_id, poll_enabled=True)
                        logger.info(f"[CYCLE] Background polling re-enabled for {unit_id}")
                except Exception as e:
                    logger.error(f"[CYCLE] Failed to re-enable polling: {e}")
                    # Don't raise - cycle completed, just log the error

        logger.info(f"[CYCLE] === Cycle complete for {unit_id} ===")
        return result

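The pause-polling / do-work / always-restore shape of the cycle action reduces to a generic async pattern. A minimal sketch, assuming the hypothetical `pause`/`resume`/`work` coroutines stand in for the SLMM polling-config calls and device steps:

```python
import asyncio

async def with_polling_paused(pause, resume, work):
    # pause() reports whether polling was on, so we only restore what we
    # disabled; the finally block guarantees restore even when work() raises.
    was_enabled = await pause()
    try:
        return await work()
    finally:
        if was_enabled:
            await resume()
```

Putting the restore in `finally` is what keeps a failed start (which re-raises to mark the action failed) from leaving the device permanently unpolled.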
    # ========================================================================
    # Recurring Schedule Generation
    # ========================================================================

    async def generate_recurring_actions(self) -> int:
        """
        Generate ScheduledActions from all enabled recurring schedules.

        Runs once per hour to generate actions for the next 7 days.

        Returns:
            Total number of actions generated
        """
        db = SessionLocal()
        total_generated = 0

        try:
            from backend.services.recurring_schedule_service import get_recurring_schedule_service

            service = get_recurring_schedule_service(db)
            schedules = service.get_enabled_schedules()

            if not schedules:
                logger.debug("No enabled recurring schedules found")
                return 0

            logger.info(f"Generating actions for {len(schedules)} recurring schedule(s)")

            for schedule in schedules:
                try:
                    # Auto-disable one-off schedules whose end time has passed
                    if schedule.schedule_type == "one_off" and schedule.end_datetime:
                        if schedule.end_datetime <= datetime.utcnow():
                            schedule.enabled = False
                            schedule.next_occurrence = None
                            db.commit()
                            logger.info(f"Auto-disabled completed one-off schedule: {schedule.name}")
                            continue

                    actions = service.generate_actions_for_schedule(schedule, horizon_days=7)
                    total_generated += len(actions)
                except Exception as e:
                    logger.error(f"Error generating actions for schedule {schedule.id}: {e}")

            if total_generated > 0:
                logger.info(f"Generated {total_generated} scheduled actions from recurring schedules")

        except Exception as e:
            logger.error(f"Error in generate_recurring_actions: {e}", exc_info=True)
        finally:
            db.close()

        return total_generated

    async def cleanup_old_actions(self, retention_days: int = 30) -> int:
        """
        Remove old completed/failed actions to prevent database bloat.

        Args:
            retention_days: Keep actions newer than this many days

        Returns:
            Number of actions cleaned up
        """
        db = SessionLocal()
        cleaned = 0

        try:
            cutoff = datetime.utcnow() - timedelta(days=retention_days)

            old_actions = db.query(ScheduledAction).filter(
                and_(
                    ScheduledAction.execution_status.in_(["completed", "failed", "cancelled"]),
                    ScheduledAction.executed_at < cutoff,
                )
            ).all()

            cleaned = len(old_actions)
            for action in old_actions:
                db.delete(action)

            if cleaned > 0:
                db.commit()
                logger.info(f"Cleaned up {cleaned} old scheduled actions (>{retention_days} days)")

        except Exception as e:
            logger.error(f"Error cleaning up old actions: {e}")
            db.rollback()
        finally:
            db.close()

        return cleaned

    # ========================================================================
    # Manual Execution (for testing/debugging)
    # ========================================================================

129
backend/services/slm_status_sync.py
Normal file
@@ -0,0 +1,129 @@
"""
SLM Status Synchronization Service

Syncs SLM device status from the SLMM backend to Terra-View's Emitter table.
This bridges SLMM's polling data with Terra-View's status snapshot system.

SLMM tracks device reachability via background polling. This service
fetches that data and creates/updates Emitter records so SLMs appear
correctly in the dashboard status snapshot.
"""

import logging
from datetime import datetime, timezone
from typing import Dict, Any

from backend.database import get_db_session
from backend.models import Emitter
from backend.services.slmm_client import get_slmm_client, SLMMClientError

logger = logging.getLogger(__name__)


async def sync_slm_status_to_emitters() -> Dict[str, Any]:
    """
    Fetch SLM status from SLMM and sync to Terra-View's Emitter table.

    For each device in SLMM's polling status:
    - If last_success exists, create/update the Emitter with that timestamp
    - If not reachable, update the Emitter with the last known timestamp (or None)

    Returns:
        Dict with synced_count, error_count, errors list
    """
    client = get_slmm_client()
    synced = 0
    errors = []

    try:
        # Get polling status from SLMM
        status_response = await client.get_polling_status()

        # Handle nested response structure
        data = status_response.get("data", status_response)
        devices = data.get("devices", [])

        if not devices:
            logger.debug("No SLM devices in SLMM polling status")
            return {"synced_count": 0, "error_count": 0, "errors": []}

        db = get_db_session()
        try:
            for device in devices:
                unit_id = device.get("unit_id")
                if not unit_id:
                    continue

                try:
                    # Get or create Emitter record
                    emitter = db.query(Emitter).filter(Emitter.id == unit_id).first()

                    # Determine last_seen from SLMM data
                    last_success_str = device.get("last_success")
                    is_reachable = device.get("is_reachable", False)

                    if last_success_str:
                        # Parse ISO format timestamp
                        last_seen = datetime.fromisoformat(
                            last_success_str.replace("Z", "+00:00")
                        )
                        # Convert to naive UTC for consistency with existing code
                        if last_seen.tzinfo:
                            last_seen = last_seen.astimezone(timezone.utc).replace(tzinfo=None)
                    elif is_reachable:
                        # Device is reachable but has no last_success yet (first poll or just started).
                        # Use the current time so it shows as OK, not Missing.
                        last_seen = datetime.utcnow()
                    else:
                        last_seen = None

                    # Status will be recalculated by snapshot.py based on time thresholds;
                    # just store a provisional status here.
                    status = "OK" if is_reachable else "Missing"

                    # Store the last error message if available
                    last_error = device.get("last_error") or ""

                    if emitter:
                        # Update existing record
                        emitter.last_seen = last_seen
                        emitter.status = status
                        emitter.unit_type = "slm"
                        emitter.last_file = last_error
                    else:
                        # Create new record
                        emitter = Emitter(
                            id=unit_id,
                            unit_type="slm",
                            last_seen=last_seen,
                            last_file=last_error,
                            status=status
                        )
                        db.add(emitter)

                    synced += 1

                except Exception as e:
                    errors.append(f"{unit_id}: {str(e)}")
                    logger.error(f"Error syncing SLM {unit_id}: {e}")

            db.commit()

        finally:
            db.close()

        if synced > 0:
            logger.info(f"Synced {synced} SLM device(s) to Emitter table")

    except SLMMClientError as e:
        logger.warning(f"Could not reach SLMM for status sync: {e}")
        errors.append(f"SLMM unreachable: {str(e)}")
    except Exception as e:
        logger.error(f"Error in SLM status sync: {e}", exc_info=True)
        errors.append(str(e))

    return {
        "synced_count": synced,
        "error_count": len(errors),
        "errors": errors
    }
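The timestamp handling in the sync loop — parse an ISO-8601 string that may end in `Z`, then normalize to naive UTC — can be exercised in isolation. A standalone sketch of that logic (the helper name is illustrative):

```python
from datetime import datetime, timezone

def to_naive_utc(iso_str):
    """Parse an ISO-8601 timestamp (possibly 'Z'-suffixed) into naive UTC.

    datetime.fromisoformat() rejects the 'Z' suffix on older Pythons,
    so it is rewritten as an explicit +00:00 offset first.
    """
    dt = datetime.fromisoformat(iso_str.replace("Z", "+00:00"))
    if dt.tzinfo is not None:
        # Shift aware datetimes to UTC, then drop the tzinfo
        dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
    return dt
```

Normalizing to naive UTC at the boundary keeps comparisons against `datetime.utcnow()` elsewhere in the codebase consistent.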
@@ -9,13 +9,14 @@ that handles TCP/FTP communication with Rion NL-43/NL-53 devices.
"""

import httpx
import os
from typing import Optional, Dict, Any, List
from datetime import datetime
import json


-# SLMM backend base URLs
-SLMM_BASE_URL = "http://localhost:8100"
+# SLMM backend base URLs - use environment variable if set (for Docker)
+SLMM_BASE_URL = os.environ.get("SLMM_BASE_URL", "http://localhost:8100")
SLMM_API_BASE = f"{SLMM_BASE_URL}/api/nl43"


@@ -108,7 +109,71 @@ class SLMMClient:
                f"SLMM operation failed: {error_detail}"
            )
        except Exception as e:
-            raise SLMMClientError(f"Unexpected error: {str(e)}")
+            error_msg = str(e) if str(e) else type(e).__name__
+            raise SLMMClientError(f"Unexpected error: {error_msg}")

    async def _download_request(
        self,
        endpoint: str,
        data: Dict[str, Any],
        unit_id: str,
    ) -> Dict[str, Any]:
        """
        Make a download request to SLMM that returns binary file content (not JSON).

        Saves the file locally and returns metadata about the download.
        """
        url = f"{self.api_base}{endpoint}"

        try:
            async with httpx.AsyncClient(timeout=httpx.Timeout(300.0)) as client:
                response = await client.post(url, json=data)
                response.raise_for_status()

                # Determine filename from Content-Disposition header or generate one
                content_disp = response.headers.get("content-disposition", "")
                filename = None
                if "filename=" in content_disp:
                    filename = content_disp.split("filename=")[-1].strip('" ')

                if not filename:
                    remote_path = data.get("remote_path", "download")
                    base = os.path.basename(remote_path.rstrip("/"))
                    filename = f"{base}.zip" if not base.endswith(".zip") else base

                # Save to local downloads directory
                download_dir = os.path.join("data", "downloads", unit_id)
                os.makedirs(download_dir, exist_ok=True)
                local_path = os.path.join(download_dir, filename)

                with open(local_path, "wb") as f:
                    f.write(response.content)

                return {
                    "success": True,
                    "local_path": local_path,
                    "filename": filename,
                    "size_bytes": len(response.content),
                }

        except httpx.ConnectError as e:
            raise SLMMConnectionError(
                f"Cannot connect to SLMM backend at {self.base_url}. "
                f"Is SLMM running? Error: {str(e)}"
            )
        except httpx.HTTPStatusError as e:
            error_detail = "Unknown error"
            try:
                error_data = e.response.json()
                error_detail = error_data.get("detail", str(error_data))
            except Exception:
                error_detail = e.response.text or str(e)
            raise SLMMDeviceError(f"SLMM download failed: {error_detail}")
        except (SLMMConnectionError, SLMMDeviceError):
            raise
        except Exception as e:
            error_msg = str(e) if str(e) else type(e).__name__
            raise SLMMClientError(f"Download error: {error_msg}")

    # ========================================================================
    # Unit Management
@@ -276,6 +341,124 @@ class SLMMClient:
        """
        return await self._request("POST", f"/{unit_id}/reset")

    # ========================================================================
    # Store/Index Management
    # ========================================================================

    async def get_index_number(self, unit_id: str) -> Dict[str, Any]:
        """
        Get current store/index number from device.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with current index_number (store name)
        """
        return await self._request("GET", f"/{unit_id}/index-number")

    async def set_index_number(
        self,
        unit_id: str,
        index_number: int,
    ) -> Dict[str, Any]:
        """
        Set store/index number on device.

        Args:
            unit_id: Unit identifier
            index_number: New index number to set

        Returns:
            Confirmation response
        """
        return await self._request(
            "PUT",
            f"/{unit_id}/index-number",
            data={"index_number": index_number},
        )

    async def check_overwrite_status(self, unit_id: str) -> Dict[str, Any]:
        """
        Check if data exists at the current store index.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with:
            - overwrite_status: "None" (safe) or "Exist" (would overwrite)
            - will_overwrite: bool
            - safe_to_store: bool
        """
        return await self._request("GET", f"/{unit_id}/overwrite-check")

    async def increment_index(self, unit_id: str, max_attempts: int = 100) -> Dict[str, Any]:
        """
        Find and set the next available (unused) store/index number.

        Checks the current index - if it would overwrite existing data,
        increments until finding an unused index number.

        Args:
            unit_id: Unit identifier
            max_attempts: Maximum number of indices to try before giving up

        Returns:
            Dict with old_index, new_index, and attempts_made
        """
        # Get current index
        current = await self.get_index_number(unit_id)
        old_index = current.get("index_number", 0)

        # Check if current index is safe
        overwrite_check = await self.check_overwrite_status(unit_id)
        if overwrite_check.get("safe_to_store", False):
            # Current index is safe, no need to increment
            return {
                "success": True,
                "old_index": old_index,
                "new_index": old_index,
                "unit_id": unit_id,
                "already_safe": True,
                "attempts_made": 0,
            }

        # Need to find an unused index
        attempts = 0
        test_index = old_index + 1

        while attempts < max_attempts:
            # Set the new index
            await self.set_index_number(unit_id, test_index)

            # Check if this index is safe
            overwrite_check = await self.check_overwrite_status(unit_id)
            attempts += 1

            if overwrite_check.get("safe_to_store", False):
                return {
                    "success": True,
                    "old_index": old_index,
                    "new_index": test_index,
                    "unit_id": unit_id,
                    "already_safe": False,
                    "attempts_made": attempts,
                }

            # Try next index (wrap around at 9999)
            test_index = (test_index + 1) % 10000

            # Avoid infinite loops if we've wrapped around
            if test_index == old_index:
                break

        # Could not find a safe index
        raise SLMMDeviceError(
            f"Could not find unused store index for {unit_id} after {attempts} attempts. "
            f"Consider downloading and clearing data from the device."
        )
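The search loop in `increment_index` is, at its core, a modulo-10000 scan for the first free slot. Factored out as a pure function (names are illustrative; `is_free` stands in for the device round-trip of set-index + overwrite-check):

```python
def find_free_index(start, is_free, max_attempts=100, modulus=10000):
    """Scan forward from `start` (wrapping at `modulus`) for a free index.

    Returns the first index for which is_free(i) is True, or None if no
    free index is found within max_attempts probes.
    """
    if is_free(start):
        return start  # current index is already safe
    candidate = (start + 1) % modulus
    for _ in range(max_attempts):
        if is_free(candidate):
            return candidate
        candidate = (candidate + 1) % modulus
        if candidate == start:
            break  # wrapped all the way around
    return None
```

Separating the scan from the I/O makes the wrap-around and give-up conditions testable without a device on the bench.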
    # ========================================================================
    # Device Settings
    # ========================================================================
@@ -359,9 +542,130 @@ class SLMMClient:
        return await self._request("GET", f"/{unit_id}/settings")

    # ========================================================================
-    # Data Download (Future)
+    # FTP Control
    # ========================================================================

    async def enable_ftp(self, unit_id: str) -> Dict[str, Any]:
        """
        Enable FTP server on device.

        Must be called before downloading files. FTP and TCP can work in tandem.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with status message
        """
        return await self._request("POST", f"/{unit_id}/ftp/enable")

    async def disable_ftp(self, unit_id: str) -> Dict[str, Any]:
        """
        Disable FTP server on device.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with status message
        """
        return await self._request("POST", f"/{unit_id}/ftp/disable")

    async def get_ftp_status(self, unit_id: str) -> Dict[str, Any]:
        """
        Get FTP server status on device.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with ftp_enabled status
        """
        return await self._request("GET", f"/{unit_id}/ftp/status")

    # ========================================================================
    # Data Download
    # ========================================================================

    async def download_file(
        self,
        unit_id: str,
        remote_path: str,
    ) -> Dict[str, Any]:
        """
        Download a single file from unit via FTP.

        Args:
            unit_id: Unit identifier
            remote_path: Path on device to download (e.g., "/NL43_DATA/measurement.wav")

        Returns:
            Dict with local_path, filename, size_bytes
        """
        return await self._download_request(
            f"/{unit_id}/ftp/download",
            {"remote_path": remote_path},
            unit_id,
        )

    async def download_folder(
        self,
        unit_id: str,
        remote_path: str,
    ) -> Dict[str, Any]:
        """
        Download an entire folder from unit via FTP as a ZIP archive.

        Useful for downloading complete measurement sessions (e.g., Auto_0000 folders).

        Args:
            unit_id: Unit identifier
            remote_path: Folder path on device to download (e.g., "/NL43_DATA/Auto_0000")

        Returns:
            Dict with local_path, folder_name, size_bytes
        """
        return await self._download_request(
            f"/{unit_id}/ftp/download-folder",
            {"remote_path": remote_path},
            unit_id,
        )

    async def download_current_measurement(
        self,
        unit_id: str,
    ) -> Dict[str, Any]:
        """
        Download the current measurement folder based on the device's index number.

        This is the recommended method for scheduled downloads - it automatically
        determines which folder to download based on the device's current store index.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with local_path, folder_name, file_count, zip_size_bytes, index_number
        """
        # Get current index number from device
        index_info = await self.get_index_number(unit_id)
        index_number_raw = index_info.get("index_number", 0)

        # Convert to int - device returns a string like "0000" or "0001"
        try:
            index_number = int(index_number_raw)
        except (ValueError, TypeError):
            index_number = 0

        # Format as Auto_XXXX folder name
        folder_name = f"Auto_{index_number:04d}"
        remote_path = f"/NL-43/{folder_name}"

        # Download the folder
        result = await self.download_folder(unit_id, remote_path)
        result["index_number"] = index_number
        return result

@@ -369,23 +673,153 @@ class SLMMClient:
    async def download_files(
        self,
        unit_id: str,
        files: Optional[List[str]] = None,
    ) -> Dict[str, Any]:
        """
-        Download files from unit via FTP.
+        Download measurement files from unit via FTP.

-        NOTE: This endpoint doesn't exist in SLMM yet. Will need to implement.
+        This method automatically determines the current measurement folder and downloads it.
+        The destination_path parameter is logged for reference, but the actual download location
+        is managed by SLMM (data/downloads/{unit_id}/).

        Args:
            unit_id: Unit identifier
-            destination_path: Local path to save files
-            files: List of filenames to download, or None for all
+            destination_path: Reference path (for logging/metadata, not used by SLMM)
+            files: Ignored - always downloads the current measurement folder

        Returns:
-            Dict with downloaded files list and metadata
+            Dict with download result including local_path, folder_name, etc.
        """
-        data = {
-            "destination_path": destination_path,
-            "files": files or "all",
-        }
-        return await self._request("POST", f"/{unit_id}/ftp/download", data=data)
+        # Use the new method that automatically determines what to download
+        result = await self.download_current_measurement(unit_id)
+        result["requested_destination"] = destination_path
+        return result

    # ========================================================================
    # Cycle Commands (for scheduled automation)
    # ========================================================================

    async def start_cycle(
        self,
        unit_id: str,
        sync_clock: bool = True,
    ) -> Dict[str, Any]:
        """
        Execute complete start cycle on device via SLMM.

        This handles the full pre-recording workflow:
        1. Sync device clock to server time
        2. Find next safe index (with overwrite protection)
        3. Start measurement

        Args:
            unit_id: Unit identifier
            sync_clock: Whether to sync device clock to server time

        Returns:
            Dict with clock_synced, old_index, new_index, started, etc.
        """
        return await self._request(
            "POST",
            f"/{unit_id}/start-cycle",
            data={"sync_clock": sync_clock},
        )

    async def stop_cycle(
        self,
        unit_id: str,
        download: bool = True,
        download_path: Optional[str] = None,
    ) -> Dict[str, Any]:
        """
        Execute complete stop cycle on device via SLMM.

        This handles the full post-recording workflow:
        1. Stop measurement
        2. Enable FTP
        3. Download measurement folder (if download=True)
        4. Verify download

        Args:
            unit_id: Unit identifier
            download: Whether to download measurement data
            download_path: Custom path for downloaded ZIP (optional)

        Returns:
            Dict with stopped, ftp_enabled, download_success, local_path, etc.
        """
        data = {"download": download}
        if download_path:
            data["download_path"] = download_path
        return await self._request(
            "POST",
            f"/{unit_id}/stop-cycle",
            data=data,
        )

    # ========================================================================
    # Polling Status (for device monitoring/alerts)
    # ========================================================================

    async def get_polling_status(self) -> Dict[str, Any]:
        """
        Get global polling status from SLMM.

        Returns device reachability information for all polled devices.
        Used by DeviceStatusMonitor to detect offline/online transitions.

        Returns:
            Dict with devices list containing:
            - unit_id
            - is_reachable
            - consecutive_failures
            - last_poll_attempt
            - last_success
            - last_error
        """
        try:
            async with httpx.AsyncClient(timeout=self.timeout) as client:
                response = await client.get(f"{self.base_url}/api/nl43/_polling/status")
                response.raise_for_status()
                return response.json()
        except httpx.ConnectError:
            raise SLMMConnectionError("Cannot connect to SLMM for polling status")
        except Exception as e:
            raise SLMMClientError(f"Failed to get polling status: {str(e)}")

    async def get_device_polling_config(self, unit_id: str) -> Dict[str, Any]:
        """
        Get polling configuration for a specific device.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with poll_enabled and poll_interval_seconds
        """
        return await self._request("GET", f"/{unit_id}/polling/config")

    async def update_device_polling_config(
        self,
        unit_id: str,
        poll_enabled: Optional[bool] = None,
        poll_interval_seconds: Optional[int] = None,
    ) -> Dict[str, Any]:
        """
        Update polling configuration for a device.

        Args:
            unit_id: Unit identifier
            poll_enabled: Enable/disable polling
            poll_interval_seconds: Polling interval (10-3600)

        Returns:
            Updated config
        """
        config = {}
        if poll_enabled is not None:
            config["poll_enabled"] = poll_enabled
        if poll_interval_seconds is not None:
            config["poll_interval_seconds"] = poll_interval_seconds

        return await self._request("PUT", f"/{unit_id}/polling/config", data=config)

    # ========================================================================
    # Health Check
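`update_device_polling_config` builds a partial payload so the PUT only touches fields the caller actually supplied, while an explicit `False` still gets through. That pattern generalizes to a tiny helper (illustrative, not part of the client):

```python
def partial_payload(**fields):
    """Drop None-valued fields so a partial update only sends what the caller set.

    `is not None` (rather than truthiness) keeps legitimate falsy values
    like False or 0 in the payload.
    """
    return {k: v for k, v in fields.items() if v is not None}
```

Usage mirrors the client method: `partial_payload(poll_enabled=False)` sends `{"poll_enabled": False}`, whereas an omitted or `None` field is left untouched on the server.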
231
backend/services/slmm_sync.py
Normal file
@@ -0,0 +1,231 @@
"""
SLMM Synchronization Service

This service ensures the Terra-View roster is the single source of truth for SLM device configuration.
When SLM devices are added, edited, or deleted in Terra-View, the changes are automatically synced to SLMM.
"""

import logging
import httpx
import os
from typing import Optional
from sqlalchemy.orm import Session

from backend.models import RosterUnit

logger = logging.getLogger(__name__)

SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")


async def sync_slm_to_slmm(unit: RosterUnit) -> bool:
    """
    Sync a single SLM device from the Terra-View roster to SLMM.

    Args:
        unit: RosterUnit with device_type="slm"

    Returns:
        True if sync successful, False otherwise
    """
    if unit.device_type != "slm":
        logger.warning(f"Attempted to sync non-SLM unit {unit.id} to SLMM")
        return False

    if not unit.slm_host:
        logger.warning(f"SLM {unit.id} has no host configured, skipping SLMM sync")
        return False

    # Disable polling if the unit is benched (deployed=False) or retired;
    # only actively deployed units should be polled.
    should_poll = unit.deployed and not unit.retired

    try:
        async with httpx.AsyncClient(timeout=5.0) as client:
            response = await client.put(
                f"{SLMM_BASE_URL}/api/nl43/{unit.id}/config",
                json={
                    "host": unit.slm_host,
                    "tcp_port": unit.slm_tcp_port or 2255,
                    "tcp_enabled": True,
                    "ftp_enabled": True,
                    "ftp_username": "USER",  # Default NL43 credentials
                    "ftp_password": "0000",
                    "poll_enabled": should_poll,  # Disable polling for benched or retired units
                    "poll_interval_seconds": 3600,  # Default to 1 hour polling
                }
            )

            if response.status_code in [200, 201]:
                logger.info(f"✓ Synced SLM {unit.id} to SLMM at {unit.slm_host}:{unit.slm_tcp_port or 2255}")
                return True
            else:
                logger.error(f"Failed to sync SLM {unit.id} to SLMM: {response.status_code} {response.text}")
                return False

    except httpx.TimeoutException:
        logger.error(f"Timeout syncing SLM {unit.id} to SLMM")
        return False
    except Exception as e:
        logger.error(f"Error syncing SLM {unit.id} to SLMM: {e}")
        return False


async def delete_slm_from_slmm(unit_id: str) -> bool:
    """
    Delete a device from the SLMM database.

    Args:
        unit_id: The unit ID to delete

    Returns:
        True if deletion successful or device doesn't exist, False on error
    """
    try:
        async with httpx.AsyncClient(timeout=5.0) as client:
            response = await client.delete(
                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/config"
            )

            if response.status_code == 200:
                logger.info(f"✓ Deleted SLM {unit_id} from SLMM")
                return True
            elif response.status_code == 404:
                logger.info(f"SLM {unit_id} not found in SLMM (already deleted)")
                return True
            else:
                logger.error(f"Failed to delete SLM {unit_id} from SLMM: {response.status_code} {response.text}")
                return False

    except httpx.TimeoutException:
        logger.error(f"Timeout deleting SLM {unit_id} from SLMM")
        return False
    except Exception as e:
        logger.error(f"Error deleting SLM {unit_id} from SLMM: {e}")
        return False


async def sync_all_slms_to_slmm(db: Session) -> dict:
    """
    Sync all SLM devices from the Terra-View roster to SLMM.

    This ensures the SLMM database matches the Terra-View roster as the source of truth.
    Should be called on Terra-View startup and optionally via an admin endpoint.

    Args:
        db: Database session

    Returns:
        Dictionary with sync results
    """
    logger.info("Starting full SLM sync to SLMM...")

    # Get all SLM units from the roster
    slm_units = db.query(RosterUnit).filter_by(device_type="slm").all()

    results = {
        "total": len(slm_units),
        "synced": 0,
        "skipped": 0,
        "failed": 0
    }

    for unit in slm_units:
        # Skip units without a host configured
        if not unit.slm_host:
            results["skipped"] += 1
            logger.debug(f"Skipped {unit.id} - no host configured")
            continue

        # Sync to SLMM
        success = await sync_slm_to_slmm(unit)
        if success:
            results["synced"] += 1
        else:
            results["failed"] += 1

    logger.info(
        f"SLM sync complete: {results['synced']} synced, "
        f"{results['skipped']} skipped, {results['failed']} failed"
    )

    return results


async def get_slmm_devices() -> Optional[list]:
    """
    Get a list of all devices currently in the SLMM database.

    Returns:
        List of device unit_ids, or None on error
    """
    try:
        async with httpx.AsyncClient(timeout=5.0) as client:
            response = await client.get(f"{SLMM_BASE_URL}/api/nl43/_polling/status")

            if response.status_code == 200:
                data = response.json()
                return [device["unit_id"] for device in data["data"]["devices"]]
            else:
                logger.error(f"Failed to get SLMM devices: {response.status_code}")
                return None

    except Exception as e:
        logger.error(f"Error getting SLMM devices: {e}")
        return None


async def cleanup_orphaned_slmm_devices(db: Session) -> dict:
    """
    Remove devices from SLMM that are not in the Terra-View roster.

    This cleans up orphaned test devices or devices that were manually added to SLMM.

    Args:
        db: Database session

    Returns:
        Dictionary with cleanup results
    """
    logger.info("Checking for orphaned devices in SLMM...")

    # Get all device IDs from SLMM
    slmm_devices = await get_slmm_devices()
    if slmm_devices is None:
        return {"error": "Failed to get SLMM device list"}

    # Get all SLM unit IDs from the Terra-View roster
    roster_units = db.query(RosterUnit.id).filter_by(device_type="slm").all()
    roster_unit_ids = {unit.id for unit in roster_units}

    # Find orphaned devices (in SLMM but not in the roster)
    orphaned = [uid for uid in slmm_devices if uid not in roster_unit_ids]

    results = {
        "total_in_slmm": len(slmm_devices),
        "total_in_roster": len(roster_unit_ids),
        "orphaned": len(orphaned),
        "deleted": 0,
        "failed": 0,
        "orphaned_devices": orphaned
    }

    if not orphaned:
        logger.info("No orphaned devices found in SLMM")
        return results

    logger.info(f"Found {len(orphaned)} orphaned devices in SLMM: {orphaned}")

    # Delete orphaned devices
    for unit_id in orphaned:
        success = await delete_slm_from_slmm(unit_id)
        if success:
            results["deleted"] += 1
        else:
            results["failed"] += 1

    logger.info(
        f"Cleanup complete: {results['deleted']} deleted, {results['failed']} failed"
    )

    return results
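The orphan detection in `cleanup_orphaned_slmm_devices` is a set difference between SLMM's device list and the roster's IDs, keeping SLMM's order for readable logs. As a pure function (name illustrative):

```python
def find_orphans(slmm_ids, roster_ids):
    """Return IDs present in SLMM but absent from the roster, preserving SLMM order."""
    roster = set(roster_ids)  # O(1) membership checks
    return [uid for uid in slmm_ids if uid not in roster]
```

Building the set once keeps the scan linear in the number of SLMM devices rather than quadratic.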
@@ -108,6 +108,7 @@ def emit_status_snapshot():
            "last_calibrated": r.last_calibrated.isoformat() if r.last_calibrated else None,
            "next_calibration_due": r.next_calibration_due.isoformat() if r.next_calibration_due else None,
            "deployed_with_modem_id": r.deployed_with_modem_id,
+           "deployed_with_unit_id": r.deployed_with_unit_id,
            "ip_address": r.ip_address,
            "phone_number": r.phone_number,
            "hardware_model": r.hardware_model,
@@ -137,6 +138,7 @@ def emit_status_snapshot():
            "last_calibrated": None,
            "next_calibration_due": None,
            "deployed_with_modem_id": None,
+           "deployed_with_unit_id": None,
            "ip_address": None,
            "phone_number": None,
            "hardware_model": None,
@@ -146,6 +148,34 @@ def emit_status_snapshot():
            "coordinates": "",
        }

    # --- Derive modem status from paired devices ---
    # Modems don't have their own check-in system, so we inherit status
    # from whatever device they're paired with (seismograph or SLM).
    # Check both directions: modem.deployed_with_unit_id OR device.deployed_with_modem_id.
    for unit_id, unit_data in units.items():
        if unit_data.get("device_type") == "modem" and not unit_data.get("retired"):
            paired_unit_id = None
            roster_unit = roster.get(unit_id)

            # First, check if the modem has deployed_with_unit_id set
            if roster_unit and roster_unit.deployed_with_unit_id:
                paired_unit_id = roster_unit.deployed_with_unit_id
            else:
                # Fallback: check if any device has this modem in deployed_with_modem_id
                for other_id, other_roster in roster.items():
                    if other_roster.deployed_with_modem_id == unit_id:
                        paired_unit_id = other_id
                        break

            if paired_unit_id:
                paired_unit = units.get(paired_unit_id)
                if paired_unit:
                    # Inherit status from the paired device
                    unit_data["status"] = paired_unit.get("status", "Missing")
                    unit_data["age"] = paired_unit.get("age", "N/A")
                    unit_data["last"] = paired_unit.get("last")
                    unit_data["derived_from"] = paired_unit_id

    # Separate buckets for UI
    active_units = {
        uid: u for uid, u in units.items()
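The two-direction pairing lookup above can be exercised in isolation. A minimal sketch, with plain dicts standing in for RosterUnit rows (names and keys are illustrative):

```python
def resolve_paired_unit(modem_id, roster):
    """Find the device a modem is paired with, checking both link directions.

    `roster` maps unit_id -> dict with optional 'deployed_with_unit_id'
    and 'deployed_with_modem_id' keys. The forward link on the modem wins;
    otherwise fall back to scanning for a device that points at the modem.
    """
    entry = roster.get(modem_id, {})
    if entry.get("deployed_with_unit_id"):
        return entry["deployed_with_unit_id"]
    for other_id, other in roster.items():
        if other.get("deployed_with_modem_id") == modem_id:
            return other_id
    return None
```

Checking the forward link first keeps the common case O(1); the reverse scan only runs for legacy pairings recorded on the device side.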
BIN  backend/static/icons/favicon-16.png  (Normal file)
| After Width: | Height: | Size: 424 B |
BIN  backend/static/icons/favicon-32.png  (Normal file)
| After Width: | Height: | Size: 1.1 KiB |
| Before Width: | Height: | Size: 1.9 KiB | After Width: | Height: | Size: 7.7 KiB |
| Before Width: | Height: | Size: 2.2 KiB | After Width: | Height: | Size: 9.2 KiB |
| Before Width: | Height: | Size: 2.2 KiB | After Width: | Height: | Size: 10 KiB |
| Before Width: | Height: | Size: 2.9 KiB | After Width: | Height: | Size: 15 KiB |
| Before Width: | Height: | Size: 5.8 KiB | After Width: | Height: | Size: 44 KiB |
| Before Width: | Height: | Size: 7.8 KiB | After Width: | Height: | Size: 68 KiB |
| Before Width: | Height: | Size: 1.1 KiB | After Width: | Height: | Size: 3.2 KiB |
| Before Width: | Height: | Size: 1.4 KiB | After Width: | Height: | Size: 5.0 KiB |
BIN  backend/static/terra-view-logo-dark.png  (Normal file)
| After Width: | Height: | Size: 13 KiB |
BIN  backend/static/terra-view-logo-dark@2x.png  (Normal file)
| After Width: | Height: | Size: 57 KiB |
BIN  backend/static/terra-view-logo-light.png  (Normal file)
| After Width: | Height: | Size: 14 KiB |
BIN  backend/static/terra-view-logo-light@2x.png  (Normal file)
| After Width: | Height: | Size: 49 KiB |
backend/templates_config.py (new file, 70 lines)
@@ -0,0 +1,70 @@
"""
Shared Jinja2 templates configuration.

All routers should import `templates` from this module to get consistent
filter and global function registration.
"""

import json as _json
from fastapi.templating import Jinja2Templates

# Import timezone utilities
from backend.utils.timezone import (
    format_local_datetime, format_local_time,
    get_user_timezone, get_timezone_abbreviation
)


def jinja_local_datetime(dt, fmt="%Y-%m-%d %H:%M"):
    """Jinja filter to convert UTC datetime to local timezone."""
    return format_local_datetime(dt, fmt)


def jinja_local_time(dt):
    """Jinja filter to format time in local timezone."""
    return format_local_time(dt)


def jinja_timezone_abbr():
    """Jinja global to get current timezone abbreviation."""
    return get_timezone_abbreviation()


# Create templates instance
templates = Jinja2Templates(directory="templates")


def jinja_local_date(dt, fmt="%m-%d-%y"):
    """Jinja filter: format a UTC datetime as a local date string (e.g. 02-19-26)."""
    return format_local_datetime(dt, fmt)


def jinja_fromjson(s):
    """Jinja filter: parse a JSON string into a dict (returns {} on failure)."""
    if not s:
        return {}
    try:
        return _json.loads(s)
    except Exception:
        return {}


def jinja_same_date(dt1, dt2) -> bool:
    """Jinja global: True if two datetimes fall on the same local date."""
    if not dt1 or not dt2:
        return False
    try:
        d1 = format_local_datetime(dt1, "%Y-%m-%d")
        d2 = format_local_datetime(dt2, "%Y-%m-%d")
        return d1 == d2
    except Exception:
        return False


# Register Jinja filters and globals
templates.env.filters["local_datetime"] = jinja_local_datetime
templates.env.filters["local_time"] = jinja_local_time
templates.env.filters["local_date"] = jinja_local_date
templates.env.filters["fromjson"] = jinja_fromjson
templates.env.globals["timezone_abbr"] = jinja_timezone_abbr
templates.env.globals["get_user_timezone"] = get_user_timezone
templates.env.globals["same_date"] = jinja_same_date
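The filter registration above is plain Jinja2 under FastAPI's `Jinja2Templates` wrapper; a standalone sketch of the same mechanism using `jinja2.Environment` directly (the template string here is illustrative, not from this diff):

```python
import json
from jinja2 import Environment

env = Environment()

def fromjson(s):
    """Parse a JSON string into a dict, returning {} on any failure."""
    if not s:
        return {}
    try:
        return json.loads(s)
    except Exception:
        return {}

# Registering on env.filters works the same way as on templates.env.filters above
env.filters["fromjson"] = fromjson

tmpl = env.from_string("{{ payload | fromjson | length }}")
print(tmpl.render(payload='{"a": 1, "b": 2}'))  # -> 2
```

Returning `{}` on failure means templates can safely chain further filters without guarding against parse errors.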
backend/utils/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
# Utils package
backend/utils/timezone.py (new file, 173 lines)
@@ -0,0 +1,173 @@
"""
Timezone utilities for Terra-View.

Provides consistent timezone handling throughout the application.
All database times are stored in UTC; this module converts for display.
"""

from datetime import datetime
from zoneinfo import ZoneInfo
from typing import Optional

from backend.database import SessionLocal
from backend.models import UserPreferences


# Default timezone if none set
DEFAULT_TIMEZONE = "America/New_York"


def get_user_timezone() -> str:
    """
    Get the user's configured timezone from preferences.

    Returns:
        Timezone string (e.g., "America/New_York")
    """
    db = SessionLocal()
    try:
        prefs = db.query(UserPreferences).filter_by(id=1).first()
        if prefs and prefs.timezone:
            return prefs.timezone
        return DEFAULT_TIMEZONE
    finally:
        db.close()


def get_timezone_info(tz_name: str = None) -> ZoneInfo:
    """
    Get ZoneInfo object for the specified or user's timezone.

    Args:
        tz_name: Timezone name, or None to use user preference

    Returns:
        ZoneInfo object
    """
    if tz_name is None:
        tz_name = get_user_timezone()
    try:
        return ZoneInfo(tz_name)
    except Exception:
        return ZoneInfo(DEFAULT_TIMEZONE)


def utc_to_local(dt: datetime, tz_name: str = None) -> datetime:
    """
    Convert a UTC datetime to local timezone.

    Args:
        dt: Datetime in UTC (naive or aware)
        tz_name: Target timezone, or None to use user preference

    Returns:
        Datetime in local timezone
    """
    if dt is None:
        return None

    tz = get_timezone_info(tz_name)

    # Assume naive datetime is UTC
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=ZoneInfo("UTC"))

    return dt.astimezone(tz)


def local_to_utc(dt: datetime, tz_name: str = None) -> datetime:
    """
    Convert a local datetime to UTC.

    Args:
        dt: Datetime in local timezone (naive or aware)
        tz_name: Source timezone, or None to use user preference

    Returns:
        Datetime in UTC (naive, for database storage)
    """
    if dt is None:
        return None

    tz = get_timezone_info(tz_name)

    # Assume naive datetime is in local timezone
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=tz)

    # Convert to UTC and strip tzinfo for database storage
    return dt.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)


def format_local_datetime(dt: datetime, fmt: str = "%Y-%m-%d %H:%M", tz_name: str = None) -> str:
    """
    Format a UTC datetime as local time string.

    Args:
        dt: Datetime in UTC
        fmt: strftime format string
        tz_name: Target timezone, or None to use user preference

    Returns:
        Formatted datetime string in local time
    """
    if dt is None:
        return "N/A"

    local_dt = utc_to_local(dt, tz_name)
    return local_dt.strftime(fmt)


def format_local_time(dt: datetime, tz_name: str = None) -> str:
    """
    Format a UTC datetime as local time (HH:MM format).

    Args:
        dt: Datetime in UTC
        tz_name: Target timezone

    Returns:
        Time string in HH:MM format
    """
    return format_local_datetime(dt, "%H:%M", tz_name)


def format_local_date(dt: datetime, tz_name: str = None) -> str:
    """
    Format a UTC datetime as local date (YYYY-MM-DD format).

    Args:
        dt: Datetime in UTC
        tz_name: Target timezone

    Returns:
        Date string
    """
    return format_local_datetime(dt, "%Y-%m-%d", tz_name)


def get_timezone_abbreviation(tz_name: str = None) -> str:
    """
    Get the abbreviation for a timezone (e.g., EST, EDT, PST).

    Args:
        tz_name: Timezone name, or None to use user preference

    Returns:
        Timezone abbreviation
    """
    tz = get_timezone_info(tz_name)
    now = datetime.now(tz)
    return now.strftime("%Z")


# Common US timezone choices for settings dropdown
TIMEZONE_CHOICES = [
    ("America/New_York", "Eastern Time (ET)"),
    ("America/Chicago", "Central Time (CT)"),
    ("America/Denver", "Mountain Time (MT)"),
    ("America/Los_Angeles", "Pacific Time (PT)"),
    ("America/Anchorage", "Alaska Time (AKT)"),
    ("Pacific/Honolulu", "Hawaii Time (HT)"),
    ("UTC", "UTC"),
]
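Stripped of the preferences lookup, the module's naive-datetime convention (naive values are UTC in the database, localized only at display time) round-trips like this. A standalone sketch using `zoneinfo` directly, with a hard-coded zone instead of `get_user_timezone()`:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def utc_to_local(dt: datetime, tz_name: str = "America/New_York") -> datetime:
    """Treat a naive datetime as UTC and convert it to the target zone."""
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=ZoneInfo("UTC"))
    return dt.astimezone(ZoneInfo(tz_name))

def local_to_utc(dt: datetime, tz_name: str = "America/New_York") -> datetime:
    """Treat a naive datetime as local time and return naive UTC for storage."""
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=ZoneInfo(tz_name))
    return dt.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

# A winter timestamp: New York is UTC-5 (EST) in January
stored = datetime(2026, 1, 15, 17, 30)            # naive UTC, as read from the DB
print(utc_to_local(stored).strftime("%H:%M"))     # -> 12:30
print(local_to_utc(utc_to_local(stored)) == stored)  # -> True (round-trips)
```

Stripping `tzinfo` before storage keeps every row comparable without per-row offset bookkeeping, which is why the converters attach a zone only at the boundaries.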
@@ -1,10 +0,0 @@
{
  "filename": "snapshot_20251216_201738.db",
  "created_at": "20251216_201738",
  "created_at_iso": "2025-12-16T20:17:38.638982",
  "description": "Auto-backup before restore",
  "size_bytes": 57344,
  "size_mb": 0.05,
  "original_db_size_bytes": 57344,
  "type": "manual"
}

@@ -1,9 +0,0 @@
{
  "filename": "snapshot_uploaded_20251216_201732.db",
  "created_at": "20251216_201732",
  "created_at_iso": "2025-12-16T20:17:32.574205",
  "description": "Uploaded: snapshot_20251216_200259.db",
  "size_bytes": 77824,
  "size_mb": 0.07,
  "type": "uploaded"
}
@@ -1,7 +1,7 @@
 services:

   # --- TERRA-VIEW PRODUCTION ---
-  terra-view-prod:
+  terra-view:
     build: .
     container_name: terra-view
     ports:

@@ -24,28 +24,6 @@ services:
       retries: 3
       start_period: 40s

-  # --- TERRA-VIEW DEVELOPMENT ---
-  # terra-view-dev:
-  #   build: .
-  #   container_name: terra-view-dev
-  #   ports:
-  #     - "1001:8001"
-  #   volumes:
-  #     - ./data-dev:/app/data
-  #   environment:
-  #     - PYTHONUNBUFFERED=1
-  #     - ENVIRONMENT=development
-  #     - SLMM_BASE_URL=http://slmm:8100
-  #   restart: unless-stopped
-  #   depends_on:
-  #     - slmm
-  #   healthcheck:
-  #     test: ["CMD", "curl", "-f", "http://localhost:8001/health"]
-  #     interval: 30s
-  #     timeout: 10s
-  #     retries: 3
-  #     start_period: 40s
-
   # --- SLMM (Sound Level Meter Manager) ---
   slmm:
     build:

@@ -59,6 +37,8 @@ services:
       - PYTHONUNBUFFERED=1
       - PORT=8100
       - CORS_ORIGINS=*
+      - TCP_IDLE_TTL=-1
+      - TCP_MAX_AGE=-1
     restart: unless-stopped
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8100/health"]

@@ -69,4 +49,3 @@ services:

 volumes:
   data:
-  data-dev:
@@ -125,7 +125,7 @@ seismos = db.query(RosterUnit).filter_by(

 ### Sound Level Meters Query
 ```python
 slms = db.query(RosterUnit).filter_by(
-    device_type="sound_level_meter",
+    device_type="slm",
     retired=False
 ).all()
 ```
docs/DEVICE_TYPE_SCHEMA.md (new file, 288 lines)
@@ -0,0 +1,288 @@
# Device Type Schema - Terra-View

## Overview

Terra-View uses a single roster table to manage three different device types. The `device_type` field is the primary discriminator that determines which fields are relevant for each unit.

## Official device_type Values

As of **Terra-View v0.4.3**, the following device_type values are standardized:

### 1. `"seismograph"` (Default)
**Purpose**: Seismic monitoring devices

**Applicable Fields**:
- Common: id, unit_type, deployed, retired, note, project_id, location, address, coordinates
- Specific: last_calibrated, next_calibration_due, deployed_with_modem_id

**Examples**:
- `BE1234` - Series 3 seismograph
- `UM12345` - Series 4 Micromate unit
- `SEISMO-001` - Custom seismograph

**Unit Type Values**:
- `series3` - Series 3 devices (default)
- `series4` - Series 4 devices
- `micromate` - Micromate devices

---

### 2. `"modem"`
**Purpose**: Field modems and network equipment

**Applicable Fields**:
- Common: id, unit_type, deployed, retired, note, project_id, location, address, coordinates
- Specific: ip_address, phone_number, hardware_model

**Examples**:
- `MDM001` - Field modem
- `MODEM-2025-01` - Network modem
- `RAVEN-XTV-01` - Specific modem model

**Unit Type Values**:
- `modem` - Generic modem
- `raven-xtv` - Raven XTV model
- Custom values for specific hardware

---

### 3. `"slm"` ⭐
**Purpose**: Sound level meters (Rion NL-43/NL-53)

**Applicable Fields**:
- Common: id, unit_type, deployed, retired, note, project_id, location, address, coordinates
- Specific: slm_host, slm_tcp_port, slm_ftp_port, slm_model, slm_serial_number, slm_frequency_weighting, slm_time_weighting, slm_measurement_range, slm_last_check, deployed_with_modem_id

**Examples**:
- `SLM-43-01` - NL-43 sound level meter
- `NL43-001` - NL-43 unit
- `NL53-002` - NL-53 unit

**Unit Type Values**:
- `nl43` - Rion NL-43 model
- `nl53` - Rion NL-53 model

---

## Migration from Legacy Values

### Deprecated Values

The following device_type values have been **deprecated** and should be migrated:

- ❌ `"sound_level_meter"` → ✅ `"slm"`

### How to Migrate

Run the standardization migration script to update existing databases:

```bash
cd /home/serversdown/tmi/terra-view
python3 backend/migrate_standardize_device_types.py
```

This script:
- Converts all `"sound_level_meter"` values to `"slm"`
- Is idempotent (safe to run multiple times)
- Shows before/after distribution of device types
- No data loss

---
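The migration's core is a single idempotent UPDATE. A standalone sketch against an in-memory SQLite database (the real script's internals are not shown in this diff; the minimal table layout here is assumed from the schema):

```python
import sqlite3

def standardize_device_types(conn: sqlite3.Connection) -> int:
    """Rename legacy 'sound_level_meter' rows to 'slm'. Safe to re-run."""
    cur = conn.execute(
        "UPDATE roster SET device_type = 'slm' "
        "WHERE device_type = 'sound_level_meter'"
    )
    conn.commit()
    return cur.rowcount  # 0 on a second run: the WHERE clause no longer matches

# Demo on an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE roster (id TEXT PRIMARY KEY, device_type TEXT)")
conn.executemany(
    "INSERT INTO roster VALUES (?, ?)",
    [("SLM-43-01", "sound_level_meter"), ("BE1234", "seismograph")],
)
print(standardize_device_types(conn))  # -> 1
print(standardize_device_types(conn))  # -> 0 (idempotent)
```

Idempotence falls out of the WHERE clause: once every legacy row is renamed, re-running the statement matches nothing and changes nothing.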
## Database Schema

### RosterUnit Model (`backend/models.py`)

```python
class RosterUnit(Base):
    """
    Supports multiple device types:
    - "seismograph" - Seismic monitoring devices (default)
    - "modem" - Field modems and network equipment
    - "slm" - Sound level meters (NL-43/NL-53)
    """
    __tablename__ = "roster"

    # Core fields (all device types)
    id = Column(String, primary_key=True)
    unit_type = Column(String, default="series3")
    device_type = Column(String, default="seismograph")  # "seismograph" | "modem" | "slm"
    deployed = Column(Boolean, default=True)
    retired = Column(Boolean, default=False)
    # ... other common fields

    # Seismograph-specific
    last_calibrated = Column(Date, nullable=True)
    next_calibration_due = Column(Date, nullable=True)

    # Modem-specific
    ip_address = Column(String, nullable=True)
    phone_number = Column(String, nullable=True)
    hardware_model = Column(String, nullable=True)

    # SLM-specific
    slm_host = Column(String, nullable=True)
    slm_tcp_port = Column(Integer, nullable=True)
    slm_ftp_port = Column(Integer, nullable=True)
    slm_model = Column(String, nullable=True)
    slm_serial_number = Column(String, nullable=True)
    slm_frequency_weighting = Column(String, nullable=True)
    slm_time_weighting = Column(String, nullable=True)
    slm_measurement_range = Column(String, nullable=True)
    slm_last_check = Column(DateTime, nullable=True)

    # Shared fields (seismograph + SLM)
    deployed_with_modem_id = Column(String, nullable=True)  # FK to modem
```

---

## API Usage

### Adding a New Unit

**Seismograph**:
```bash
curl -X POST http://localhost:8001/api/roster/add \
  -F "id=BE1234" \
  -F "device_type=seismograph" \
  -F "unit_type=series3" \
  -F "deployed=true"
```

**Modem**:
```bash
curl -X POST http://localhost:8001/api/roster/add \
  -F "id=MDM001" \
  -F "device_type=modem" \
  -F "ip_address=192.0.2.10" \
  -F "phone_number=+1-555-0100"
```

**Sound Level Meter**:
```bash
curl -X POST http://localhost:8001/api/roster/add \
  -F "id=SLM-43-01" \
  -F "device_type=slm" \
  -F "slm_host=63.45.161.30" \
  -F "slm_tcp_port=2255" \
  -F "slm_model=NL-43"
```

### CSV Import Format

```csv
unit_id,unit_type,device_type,deployed,slm_host,slm_tcp_port,slm_model
SLM-43-01,nl43,slm,true,63.45.161.30,2255,NL-43
SLM-43-02,nl43,slm,true,63.45.161.31,2255,NL-43
BE1234,series3,seismograph,true,,,
MDM001,modem,modem,true,,,
```
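Rows in that format can be loaded with the standard `csv` module. A sketch of the parsing side (the import endpoint's actual implementation is not shown in this diff; empty cells are read as "not applicable for this device type"):

```python
import csv
import io

SAMPLE = """unit_id,unit_type,device_type,deployed,slm_host,slm_tcp_port,slm_model
SLM-43-01,nl43,slm,true,63.45.161.30,2255,NL-43
BE1234,series3,seismograph,true,,,
"""

def parse_roster_csv(text: str) -> list:
    """Parse roster CSV rows: booleans coerced, empty strings become None."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        parsed = {k: (v or None) for k, v in row.items()}
        parsed["deployed"] = (row["deployed"] or "").lower() == "true"
        rows.append(parsed)
    return rows

units = parse_roster_csv(SAMPLE)
print(units[0]["device_type"], units[0]["slm_tcp_port"])  # -> slm 2255
print(units[1]["slm_host"])  # -> None
```

Mapping empty strings to `None` mirrors the nullable columns in the model, so a seismograph row simply leaves every SLM field unset.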
---

## Frontend Behavior

### Device Type Selection

**Templates**: `unit_detail.html`, `roster.html`

```html
<select name="device_type">
  <option value="seismograph">Seismograph</option>
  <option value="modem">Modem</option>
  <option value="slm">Sound Level Meter</option>
</select>
```

### Conditional Field Display

JavaScript functions check `device_type` to show/hide relevant fields:

```javascript
function toggleDetailFields() {
    const deviceType = document.getElementById('device_type').value;

    if (deviceType === 'seismograph') {
        // Show calibration fields
    } else if (deviceType === 'modem') {
        // Show network fields
    } else if (deviceType === 'slm') {
        // Show SLM configuration fields
    }
}
```

---

## Code Conventions

### Always Use Lowercase

✅ **Correct**:
```python
if unit.device_type == "slm":
    # Handle sound level meter
```

❌ **Incorrect**:
```python
if unit.device_type == "SLM":  # Wrong - case sensitive
if unit.device_type == "sound_level_meter":  # Deprecated
```

### Query Patterns

**Filter by device type**:
```python
# Get all SLMs
slms = db.query(RosterUnit).filter_by(device_type="slm").all()

# Get deployed seismographs
seismos = db.query(RosterUnit).filter_by(
    device_type="seismograph",
    deployed=True
).all()

# Get all modems
modems = db.query(RosterUnit).filter_by(device_type="modem").all()
```

---
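Because legacy rows or external imports may still carry deprecated or mixed-case values, a small normalizer keeps comparisons safe. This helper is illustrative, not part of the diff; the default of `"seismograph"` follows the column default above:

```python
# Map deprecated device_type spellings to their canonical lowercase values.
LEGACY_DEVICE_TYPES = {"sound_level_meter": "slm"}

def normalize_device_type(value: str) -> str:
    """Lowercase the value and translate deprecated names to canonical ones."""
    v = (value or "seismograph").strip().lower()
    return LEGACY_DEVICE_TYPES.get(v, v)

print(normalize_device_type("SLM"))                # -> slm
print(normalize_device_type("sound_level_meter"))  # -> slm
print(normalize_device_type(""))                   # -> seismograph
```

Applying this at every input boundary (forms, CSV import, API) means downstream code only ever compares against the three canonical values.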
## Testing

### Verify Device Type Distribution

```bash
# Quick check
sqlite3 data/seismo_fleet.db "SELECT device_type, COUNT(*) FROM roster GROUP BY device_type;"

# Detailed view
sqlite3 data/seismo_fleet.db "SELECT id, device_type, unit_type, deployed FROM roster ORDER BY device_type, id;"
```

### Check for Legacy Values

```bash
# Should return 0 rows after migration
sqlite3 data/seismo_fleet.db "SELECT id FROM roster WHERE device_type = 'sound_level_meter';"
```

---

## Version History

- **v0.4.3** (2026-01-16) - Standardized device_type values, deprecated `"sound_level_meter"` → `"slm"`
- **v0.4.0** (2026-01-05) - Added SLM support with `"sound_level_meter"` value
- **v0.2.0** (2025-12-03) - Added modem device type
- **v0.1.0** (2024-11-20) - Initial release with seismograph-only support

---

## Related Documentation

- [README.md](../README.md) - Main project documentation with data model
- [DEVICE_TYPE_SLM_SUPPORT.md](DEVICE_TYPE_SLM_SUPPORT.md) - Legacy SLM implementation notes
- [SOUND_LEVEL_METERS_DASHBOARD.md](SOUND_LEVEL_METERS_DASHBOARD.md) - SLM dashboard features
- [SLM_CONFIGURATION.md](SLM_CONFIGURATION.md) - SLM device configuration guide
@@ -1,5 +1,7 @@
 # Sound Level Meter Device Type Support

+**⚠️ IMPORTANT**: This documentation uses the legacy `sound_level_meter` device type value. As of v0.4.3, the standardized value is `"slm"`. Run `backend/migrate_standardize_device_types.py` to update your database.
+
 ## Overview

 Added full support for "Sound Level Meter" as a device type in the roster management system. Users can now create, edit, and manage SLM units through the Fleet Roster interface.

@@ -95,7 +97,7 @@ All SLM fields are updated when editing existing unit.

 The database schema already included SLM fields (no changes needed):
 - All fields are nullable to support multiple device types
-- Fields are only relevant when `device_type = "sound_level_meter"`
+- Fields are only relevant when `device_type = "slm"`

 ## Usage

@@ -125,7 +127,7 @@ The form automatically shows/hides relevant fields based on device type:

 ## Integration with SLMM Dashboard

-Units with `device_type = "sound_level_meter"` will:
+Units with `device_type = "slm"` will:
 - Appear in the Sound Level Meters dashboard (`/sound-level-meters`)
 - Be available for live monitoring and control
 - Use the configured `slm_host` and `slm_tcp_port` for device communication

@@ -300,7 +300,7 @@ slm.deployed_with_modem_id = "modem-001"
 ```json
 {
   "id": "nl43-001",
-  "device_type": "sound_level_meter",
+  "device_type": "slm",
   "deployed_with_modem_id": "modem-001",
   "slm_tcp_port": 2255,
   "slm_model": "NL-43",

@@ -135,7 +135,7 @@ The dashboard communicates with the SLMM backend service running on port 8100:
 SLM-specific fields in the RosterUnit model:

 ```python
-device_type = "sound_level_meter"  # Distinguishes SLMs from seismographs
+device_type = "slm"  # Distinguishes SLMs from seismographs
 slm_host = String  # Device IP or hostname
 slm_tcp_port = Integer  # TCP control port (default 2255)
 slm_model = String  # NL-43, NL-53, etc.
docs/archive/README.md (new file, 17 lines)
@@ -0,0 +1,17 @@
# Terra-View Documentation Archive

This directory contains old documentation files that are no longer actively maintained but preserved for historical reference.

## Archived Documents

### PROJECTS_SYSTEM_IMPLEMENTATION.md
Early implementation notes for the projects system. Superseded by current documentation in main docs directory.

### .aider.chat.history.md
AI assistant chat history from development sessions. Contains context and decision-making process.

## Note

These documents may contain outdated information. For current documentation, see:
- [Main README](../../README.md)
- [Active Documentation](../)
rebuild-dev.sh (new executable file, 19 lines)
@@ -0,0 +1,19 @@
#!/bin/bash
# Dev rebuild script — increments build number, rebuilds and restarts terra-view
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
BUILD_FILE="$SCRIPT_DIR/build_number.txt"

# Read and increment build number
BUILD_NUMBER=$(cat "$BUILD_FILE" 2>/dev/null || echo "0")
BUILD_NUMBER=$((BUILD_NUMBER + 1))
echo "$BUILD_NUMBER" > "$BUILD_FILE"

echo "Building terra-view dev (build #$BUILD_NUMBER)..."

cd "$SCRIPT_DIR"
docker compose build --build-arg BUILD_NUMBER="$BUILD_NUMBER" terra-view
docker compose up -d terra-view

echo "Done — terra-view v0.6.1-$BUILD_NUMBER is running on :1001"

rebuild-prod.sh (new file, 12 lines)
@@ -0,0 +1,12 @@
#!/bin/bash
# Production rebuild script — rebuilds and restarts terra-view on :8001
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"

echo "Building terra-view production..."
docker compose -f docker-compose.yml build terra-view
docker compose -f docker-compose.yml up -d terra-view

echo "Done — terra-view production is running on :8001"
@@ -7,3 +7,4 @@ jinja2==3.1.2
 aiofiles==23.2.1
 Pillow==10.1.0
 httpx==0.25.2
+openpyxl==3.1.2
@@ -1,6 +1,23 @@
-unit_id,unit_type,deployed,retired,note,project_id,location
-BE1234,series3,true,false,Primary unit at main site,PROJ-001,San Francisco CA
-BE5678,series3,true,false,Backup sensor,PROJ-001,Los Angeles CA
-BE9012,series3,false,false,In maintenance,PROJ-002,Workshop
-BE3456,series3,true,false,,PROJ-003,New York NY
-BE7890,series3,false,true,Decommissioned 2024,,Storage
+unit_id,device_type,unit_type,deployed,retired,note,project_id,location,address,coordinates,last_calibrated,next_calibration_due,deployed_with_modem_id,ip_address,phone_number,hardware_model,slm_host,slm_tcp_port,slm_ftp_port,slm_model,slm_serial_number,slm_frequency_weighting,slm_time_weighting,slm_measurement_range
+# ============================================
+# SEISMOGRAPHS (device_type=seismograph)
+# ============================================
+BE1234,seismograph,series3,true,false,Primary unit at main site,PROJ-001,San Francisco CA,123 Market St,37.7749;-122.4194,2025-06-15,2026-06-15,MDM001,,,,,,,,,,,
+BE5678,seismograph,series3,true,false,Backup sensor,PROJ-001,Los Angeles CA,456 Sunset Blvd,34.0522;-118.2437,2025-03-01,2026-03-01,MDM002,,,,,,,,,,,
+BE9012,seismograph,series4,false,false,In maintenance - needs calibration,PROJ-002,Workshop,789 Industrial Way,,,,,,,,,,,,,,
+BE3456,seismograph,series3,true,false,,PROJ-003,New York NY,101 Broadway,40.7128;-74.0060,2025-01-10,2026-01-10,,,,,,,,,,,
+BE7890,seismograph,series3,false,true,Decommissioned 2024,,Storage,Warehouse B,,,,,,,,,,,,,,,
+# ============================================
+# MODEMS (device_type=modem)
+# ============================================
+MDM001,modem,,true,false,Cradlepoint at SF site,PROJ-001,San Francisco CA,123 Market St,37.7749;-122.4194,,,,,192.168.1.100,+1-555-0101,IBR900,,,,,,,
+MDM002,modem,,true,false,Sierra Wireless at LA site,PROJ-001,Los Angeles CA,456 Sunset Blvd,34.0522;-118.2437,,,,,10.0.0.50,+1-555-0102,RV55,,,,,,,
+MDM003,modem,,false,false,Spare modem in storage,,,Storage,Warehouse A,,,,,,+1-555-0103,IBR600,,,,,,,
+MDM004,modem,,true,false,NYC backup modem,PROJ-003,New York NY,101 Broadway,40.7128;-74.0060,,,,,172.16.0.25,+1-555-0104,IBR1700,,,,,,,
+# ============================================
+# SOUND LEVEL METERS (device_type=slm)
+# ============================================
+SLM001,slm,,true,false,NL-43 at construction site A,PROJ-004,Downtown Site,500 Main St,40.7589;-73.9851,,,,,,,,192.168.10.101,2255,21,NL-43,12345678,A,F,30-130 dB
+SLM002,slm,,true,false,NL-43 at construction site B,PROJ-004,Midtown Site,600 Park Ave,40.7614;-73.9776,,,MDM004,,,,,192.168.10.102,2255,21,NL-43,12345679,A,S,30-130 dB
+SLM003,slm,,false,false,NL-53 spare unit,,,Storage,Warehouse A,,,,,,,,,,,NL-53,98765432,C,F,25-138 dB
+SLM004,slm,,true,false,NL-43 nighttime monitoring,PROJ-005,Residential Area,200 Quiet Lane,40.7484;-73.9857,,,,,,,,10.0.5.50,2255,21,NL-43,11112222,A,S,30-130 dB
@@ -1,120 +1,20 @@
-# Helper Scripts
+# Terra-View Utility Scripts

-This directory contains helper scripts for database management and testing.
+This directory contains utility scripts for database operations, testing, and maintenance.

-## Database Migration Scripts
+## Scripts

-### migrate_dev_db.py
-Migrates the DEV database schema to add SLM-specific columns to the `roster` table.
+### create_test_db.py
+Generate a realistic test database with sample data.

-**Usage:**
-```bash
-cd /home/serversdown/sfm/seismo-fleet-manager
-python3 scripts/migrate_dev_db.py
-```
+Usage: python scripts/create_test_db.py

-**What it does:**
-- Adds 8 SLM-specific columns to the DEV database (data-dev/seismo_fleet.db)
-- Columns: slm_host, slm_tcp_port, slm_model, slm_serial_number, slm_frequency_weighting, slm_time_weighting, slm_measurement_range, slm_last_check
-- Safe to run multiple times (skips existing columns)
+### rename_unit.py
+Rename a unit ID across all tables.

-### update_dev_db_schema.py
-Inspects and displays the DEV database schema.
+Usage: python scripts/rename_unit.py <old_id> <new_id>

-**Usage:**
-```bash
-python3 scripts/update_dev_db_schema.py
-```
+### sync_slms_to_slmm.py
+Manually sync all SLM devices from Terra-View to SLMM.

-**What it does:**
-- Shows all tables in the DEV database
-- Lists all columns in the roster table
-- Useful for verifying schema after migrations

-## Test Data Scripts

-### add_test_slms.py
-Adds test Sound Level Meter units to the DEV database.

-**Usage:**
-```bash
-python3 scripts/add_test_slms.py
-```

-**What it creates:**
-- nl43-001: NL-43 SLM at Construction Site A
-- nl43-002: NL-43 SLM at Construction Site B
-- nl53-001: NL-53 SLM at Residential Area
-- nl43-003: NL-43 SLM (not deployed, spare unit)

-### add_test_modems.py
-Adds test modem units to the DEV database and assigns them to SLMs.

-**Usage:**
-```bash
-python3 scripts/add_test_modems.py
-```

-**What it creates:**
-- modem-001, modem-002, modem-003: Deployed modems (Raven XTV and Sierra Wireless)
-- modem-004: Spare modem (not deployed)

-**Modem assignments:**
-- nl43-001 → modem-001
-- nl43-002 → modem-002
-- nl53-001 → modem-003

-## Cleanup Scripts

-### remove_test_data_from_prod.py
-**⚠️ PRODUCTION DATABASE CLEANUP**

-Removes test data from the production database (data/seismo_fleet.db).

-**Status:** Already executed successfully. Production database is clean.

-**What it removed:**
-- All test SLM units (nl43-001, nl43-002, nl53-001, nl43-003)
-- All test modem units (modem-001, modem-002, modem-003, modem-004)

-## Database Cloning

-### clone_db_to_dev.py
-Clones the production database to create/update the DEV database.

-**Usage:**
-```bash
-python3 scripts/clone_db_to_dev.py
-```

-**What it does:**
-- Copies data/seismo_fleet.db → data-dev/seismo_fleet.db
-- Useful for syncing DEV database with production schema/data

-## Setup Sequence

-To set up a fresh DEV database with test data:

-```bash
-cd /home/serversdown/sfm/seismo-fleet-manager

-# 1. Fix permissions (if needed)
-sudo chown -R serversdown:serversdown data-dev/

-# 2. Migrate schema
-python3 scripts/migrate_dev_db.py

-# 3. Add test data
-python3 scripts/add_test_slms.py
-python3 scripts/add_test_modems.py

-# 4. Verify
-sqlite3 data-dev/seismo_fleet.db "SELECT id, device_type FROM roster WHERE device_type IN ('sound_level_meter', 'modem');"
-```

-## Important Notes

-- **DEV Database**: `data-dev/seismo_fleet.db` - Used for development and testing
-- **Production Database**: `data/seismo_fleet.db` - Used by the running application
-- All test scripts are configured to use the DEV database only
-- Never run test data scripts against production
+Usage: python scripts/sync_slms_to_slmm.py
@@ -90,14 +90,14 @@ def rename_unit(old_id: str, new_id: str):
     except Exception:
         pass  # Table may not exist

-    # Update recording_sessions table (if exists)
+    # Update monitoring_sessions table (if exists)
     try:
         result = session.execute(
-            text("UPDATE recording_sessions SET unit_id = :new_id WHERE unit_id = :old_id"),
+            text("UPDATE monitoring_sessions SET unit_id = :new_id WHERE unit_id = :old_id"),
             {"new_id": new_id, "old_id": old_id}
         )
         if result.rowcount > 0:
-            print(f"  ✓ Updated recording_sessions ({result.rowcount} rows)")
+            print(f"  ✓ Updated monitoring_sessions ({result.rowcount} rows)")
     except Exception:
         pass  # Table may not exist
@@ -25,7 +25,7 @@ async def sync_all_slms():
     try:
         # Get all SLM devices from Terra-View (source of truth)
         slm_devices = db.query(RosterUnit).filter_by(
-            device_type="sound_level_meter"
+            device_type="slm"
         ).all()

         logger.info(f"Found {len(slm_devices)} SLM devices in Terra-View roster")
```diff
@@ -20,6 +20,9 @@
 
     <!-- PWA Manifest -->
     <link rel="manifest" href="/static/manifest.json">
+    <link rel="icon" type="image/png" sizes="32x32" href="/static/icons/favicon-32.png">
+    <link rel="icon" type="image/png" sizes="16x16" href="/static/icons/favicon-16.png">
+    <link rel="apple-touch-icon" sizes="180x180" href="/static/icons/icon-192.png">
    <meta name="theme-color" content="#f48b1c">
    <meta name="apple-mobile-web-app-capable" content="yes">
    <meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
@@ -68,7 +71,7 @@
 
    {% block extra_head %}{% endblock %}
 </head>
-<body class="bg-gray-100 dark:bg-gray-900 text-gray-900 dark:text-gray-100">
+<body class="bg-gray-100 dark:bg-slate-800 text-gray-900 dark:text-gray-100">
 
    <!-- Offline Indicator -->
    <div id="offlineIndicator" class="offline-indicator">
```
```diff
@@ -85,10 +88,10 @@
        <aside id="sidebar" class="sidebar w-64 bg-white dark:bg-slate-800 shadow-lg flex flex-col">
            <!-- Logo -->
            <div class="p-6 border-b border-gray-200 dark:border-gray-700">
-                <h1 class="text-2xl font-bold text-seismo-navy dark:text-seismo-orange">
-                    Seismo<br>
-                    <span class="text-seismo-orange dark:text-seismo-burgundy">Fleet Manager</span>
-                </h1>
+                <a href="/" class="block">
+                    <img src="/static/terra-view-logo-light.png" srcset="/static/terra-view-logo-light.png 1x, /static/terra-view-logo-light@2x.png 2x" alt="Terra-View" class="block dark:hidden w-44 h-auto">
+                    <img src="/static/terra-view-logo-dark.png" srcset="/static/terra-view-logo-dark.png 1x, /static/terra-view-logo-dark@2x.png 2x" alt="Terra-View" class="hidden dark:block w-44 h-auto">
+                </a>
                <div class="flex items-center justify-between mt-2">
                    <p class="text-xs text-gray-500 dark:text-gray-400">v {{ version }}</p>
                    {% if environment == 'development' %}
```
```diff
@@ -127,6 +130,20 @@
                    Sound Level Meters
                </a>
 
+                <a href="/modems" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/modems' %}bg-gray-100 dark:bg-gray-700{% endif %}">
+                    <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
+                    </svg>
+                    Modems
+                </a>
+
+                <a href="/pair-devices" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/pair-devices' %}bg-gray-100 dark:bg-gray-700{% endif %}">
+                    <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1"></path>
+                    </svg>
+                    Pair Devices
+                </a>
+
                <a href="/projects" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path.startswith('/projects') %}bg-gray-100 dark:bg-gray-700{% endif %}">
                    <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10"></path>
```
```diff
@@ -134,6 +151,13 @@
                    Projects
                </a>
 
+                <a href="/fleet-calendar" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path.startswith('/fleet-calendar') %}bg-gray-100 dark:bg-gray-700{% endif %}">
+                    <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"></path>
+                    </svg>
+                    Fleet Calendar
+                </a>
+
                <a href="/settings" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/settings' %}bg-gray-100 dark:bg-gray-700{% endif %}">
                    <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M10.325 4.317c.426-1.756 2.924-1.756 3.35 0a1.724 1.724 0 002.573 1.066c1.543-.94 3.31.826 2.37 2.37a1.724 1.724 0 001.065 2.572c1.756.426 1.756 2.924 0 3.35a1.724 1.724 0 00-1.066 2.573c.94 1.543-.826 3.31-2.37 2.37a1.724 1.724 0 00-2.572 1.065c-.426 1.756-2.924 1.756-3.35 0a1.724 1.724 0 00-2.573-1.066c-1.543.94-3.31-.826-2.37-2.37a1.724 1.724 0 00-1.065-2.572c-1.756-.426-1.756-2.924 0-3.35a1.724 1.724 0 001.066-2.573c-.94-1.543.826-3.31 2.37-2.37.996.608 2.296.07 2.572-1.065z"></path>
```
```diff
@@ -374,10 +398,10 @@
    </script>
 
    <!-- Offline Database -->
-    <script src="/static/offline-db.js?v=0.4.0"></script>
+    <script src="/static/offline-db.js?v=0.6.1"></script>
 
    <!-- Mobile JavaScript -->
-    <script src="/static/mobile.js?v=0.4.0"></script>
+    <script src="/static/mobile.js?v=0.6.1"></script>
 
    {% block extra_scripts %}{% endblock %}
 </body>
```
templates/combined_report_preview.html (new file, 315 lines)

```html
{% extends "base.html" %}

{% block title %}Combined Report Preview - {{ project.name }}{% endblock %}

{% block content %}
<!-- jspreadsheet CSS -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/jspreadsheet-ce@4/dist/jspreadsheet.min.css" />
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/jsuites@5/dist/jsuites.min.css" />

<div class="min-h-screen bg-gray-100 dark:bg-slate-900">
    <!-- Header -->
    <div class="bg-white dark:bg-slate-800 shadow-sm border-b border-gray-200 dark:border-gray-700">
        <div class="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-4">
            <div class="flex flex-col md:flex-row md:items-center md:justify-between gap-4">
                <div>
                    <h1 class="text-2xl font-bold text-gray-900 dark:text-white">Combined Report Preview & Editor</h1>
                    <p class="text-sm text-gray-500 dark:text-gray-400 mt-1">
                        {{ location_data|length }} location{{ 's' if location_data|length != 1 else '' }}
                        {% if time_filter_desc %} | {{ time_filter_desc }}{% endif %}
                        | {{ total_rows }} total row{{ 's' if total_rows != 1 else '' }}
                    </p>
                </div>
                <div class="flex items-center gap-3">
                    <button onclick="downloadCombinedReport()" id="download-btn"
                            class="px-4 py-2 bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 transition-colors flex items-center gap-2 text-sm font-medium">
                        <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-4l-4 4m0 0l-4-4m4 4V4"></path>
                        </svg>
                        Generate Reports (ZIP)
                    </button>
                    <a href="/api/projects/{{ project_id }}/combined-report-wizard"
                       class="px-4 py-2 bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-lg hover:bg-gray-300 dark:hover:bg-gray-600 transition-colors text-sm">
                        ← Back to Config
                    </a>
                </div>
            </div>
        </div>
    </div>

    <div class="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-4 space-y-4">

        <!-- Report Metadata -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-4">
            <div class="grid grid-cols-1 md:grid-cols-3 gap-4">
                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Report Title</label>
                    <input type="text" id="edit-report-title" value="{{ report_title }}"
                           class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
                </div>
                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Project Name</label>
                    <input type="text" id="edit-project-name" value="{{ project_name }}"
                           class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
                </div>
                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Client Name</label>
                    <input type="text" id="edit-client-name" value="{{ client_name }}"
                           class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
                </div>
            </div>
        </div>

        <!-- Location Tabs + Spreadsheet -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700">

            <!-- Tab Bar -->
            <div class="border-b border-gray-200 dark:border-gray-700 overflow-x-auto">
                <div class="flex min-w-max" id="tab-bar">
                    {% for loc in location_data %}
                    <button onclick="switchTab({{ loop.index0 }})"
                            id="tab-btn-{{ loop.index0 }}"
                            class="tab-btn px-4 py-3 text-sm font-medium whitespace-nowrap border-b-2 transition-colors
                                   {% if loop.first %}border-emerald-500 text-emerald-600 dark:text-emerald-400
                                   {% else %}border-transparent text-gray-500 dark:text-gray-400 hover:text-gray-700 dark:hover:text-gray-300 hover:border-gray-300{% endif %}">
                        {{ loc.location_name }}
                        <span class="ml-1.5 text-xs px-1.5 py-0.5 rounded-full
                                     {% if loop.first %}bg-emerald-100 text-emerald-700 dark:bg-emerald-900/40 dark:text-emerald-400
                                     {% else %}bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400{% endif %}"
                              id="tab-count-{{ loop.index0 }}">
                            {{ loc.filtered_count }}
                        </span>
                    </button>
                    {% endfor %}
                </div>
            </div>

            <!-- Spreadsheet Panels -->
            <div class="p-4">
                <div class="flex items-center justify-between mb-3">
                    <h3 class="text-base font-semibold text-gray-900 dark:text-white" id="active-tab-title">
                        {{ location_data[0].location_name if location_data else '' }}
                    </h3>
                    <div class="flex items-center gap-2 text-sm text-gray-500 dark:text-gray-400">
                        <span>Right-click for options</span>
                        <span class="text-gray-300 dark:text-gray-600">|</span>
                        <span>Double-click to edit</span>
                    </div>
                </div>

                {% for loc in location_data %}
                <div id="panel-{{ loop.index0 }}" class="tab-panel {% if not loop.first %}hidden{% endif %} overflow-x-auto">
                    <div id="spreadsheet-{{ loop.index0 }}"></div>
                </div>
                {% endfor %}
            </div>
        </div>

        <!-- Help -->
        <div class="bg-blue-50 dark:bg-blue-900/20 rounded-lg p-4">
            <h3 class="text-sm font-medium text-blue-800 dark:text-blue-300 mb-2">Editing Tips</h3>
            <ul class="text-sm text-blue-700 dark:text-blue-400 list-disc list-inside space-y-1">
                <li>Double-click any cell to edit its value</li>
                <li>Use the Comments column to add notes about specific measurements</li>
                <li>Right-click a row to insert or delete rows</li>
                <li>Press Enter to confirm edits, Escape to cancel</li>
                <li>Switch between location tabs to edit each location's data independently</li>
            </ul>
        </div>

    </div>
</div>

<!-- jspreadsheet JS -->
<script src="https://cdn.jsdelivr.net/npm/jsuites@5/dist/jsuites.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/jspreadsheet-ce@4/dist/index.min.js"></script>

<script>
const allLocationData = {{ locations_json | safe }};
const spreadsheets = {};
let activeTabIdx = 0;

const columnDef = [
    { title: 'Test #', width: 80, type: 'numeric' },
    { title: 'Date', width: 110, type: 'text' },
    { title: 'Time', width: 90, type: 'text' },
    { title: 'LAmax (dBA)', width: 110, type: 'numeric' },
    { title: 'LA01 (dBA)', width: 110, type: 'numeric' },
    { title: 'LA10 (dBA)', width: 110, type: 'numeric' },
    { title: 'Comments', width: 250, type: 'text' },
];

const jssOptions = {
    columns: columnDef,
    allowInsertRow: true,
    allowDeleteRow: true,
    allowInsertColumn: false,
    allowDeleteColumn: false,
    rowDrag: true,
    columnSorting: true,
    search: true,
    pagination: 50,
    paginationOptions: [25, 50, 100, 200],
    defaultColWidth: 100,
    minDimensions: [7, 1],
    tableOverflow: true,
    tableWidth: '100%',
    contextMenu: function(instance, col, row, e) {
        const items = [];
        if (row !== null) {
            items.push({
                title: 'Insert row above',
                onclick: function() { instance.insertRow(1, row, true); }
            });
            items.push({
                title: 'Insert row below',
                onclick: function() { instance.insertRow(1, row + 1, false); }
            });
            items.push({
                title: 'Delete this row',
                onclick: function() { instance.deleteRow(row); }
            });
        }
        return items;
    },
    style: {
        A: 'text-align: center;',
        B: 'text-align: center;',
        C: 'text-align: center;',
        D: 'text-align: right;',
        E: 'text-align: right;',
        F: 'text-align: right;',
    }
};

document.addEventListener('DOMContentLoaded', function() {
    allLocationData.forEach(function(loc, idx) {
        const el = document.getElementById('spreadsheet-' + idx);
        if (!el) return;
        const opts = Object.assign({}, jssOptions, { data: loc.spreadsheet_data });
        spreadsheets[idx] = jspreadsheet(el, opts);
    });
    if (allLocationData.length > 0) {
        switchTab(0);
    }
});

function switchTab(idx) {
    activeTabIdx = idx;

    // Update panels
    document.querySelectorAll('.tab-panel').forEach(function(panel, i) {
        panel.classList.toggle('hidden', i !== idx);
    });

    // Update tab button styles
    document.querySelectorAll('.tab-btn').forEach(function(btn, i) {
        const countBadge = document.getElementById('tab-count-' + i);
        if (i === idx) {
            btn.classList.add('border-emerald-500', 'text-emerald-600', 'dark:text-emerald-400');
            btn.classList.remove('border-transparent', 'text-gray-500', 'dark:text-gray-400');
            if (countBadge) {
                countBadge.classList.add('bg-emerald-100', 'text-emerald-700', 'dark:bg-emerald-900/40', 'dark:text-emerald-400');
                countBadge.classList.remove('bg-gray-100', 'text-gray-500', 'dark:bg-gray-700', 'dark:text-gray-400');
            }
        } else {
            btn.classList.remove('border-emerald-500', 'text-emerald-600', 'dark:text-emerald-400');
            btn.classList.add('border-transparent', 'text-gray-500', 'dark:text-gray-400');
            if (countBadge) {
                countBadge.classList.remove('bg-emerald-100', 'text-emerald-700', 'dark:bg-emerald-900/40', 'dark:text-emerald-400');
                countBadge.classList.add('bg-gray-100', 'text-gray-500', 'dark:bg-gray-700', 'dark:text-gray-400');
            }
        }
    });

    // Update title
    if (allLocationData[idx]) {
        document.getElementById('active-tab-title').textContent = allLocationData[idx].location_name;
    }

    // Refresh jspreadsheet rendering after showing panel
    if (spreadsheets[idx]) {
        try { spreadsheets[idx].updateTable(); } catch(e) {}
    }
}

async function downloadCombinedReport() {
    const btn = document.getElementById('download-btn');
    const originalText = btn.innerHTML;
    btn.disabled = true;
    btn.innerHTML = '<svg class="w-5 h-5 animate-spin" fill="none" viewBox="0 0 24 24"><circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle><path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4z"></path></svg> Generating ZIP...';

    try {
        const locations = allLocationData.map(function(loc, idx) {
            return {
                session_id: loc.session_id || '',
                session_label: loc.session_label || '',
                period_type: loc.period_type || '',
                started_at: loc.started_at || '',
                location_name: loc.location_name,
                spreadsheet_data: spreadsheets[idx] ? spreadsheets[idx].getData() : loc.spreadsheet_data,
            };
        });

        const payload = {
            report_title: document.getElementById('edit-report-title').value || 'Background Noise Study',
            project_name: document.getElementById('edit-project-name').value || '',
            client_name: document.getElementById('edit-client-name').value || '',
            locations: locations,
        };

        const response = await fetch('/api/projects/{{ project_id }}/generate-combined-from-preview', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(payload),
        });

        if (response.ok) {
            const blob = await response.blob();
            const url = window.URL.createObjectURL(blob);
            const a = document.createElement('a');
            a.href = url;

            const contentDisposition = response.headers.get('Content-Disposition');
            let filename = 'combined_reports.zip';
            if (contentDisposition) {
                const match = contentDisposition.match(/filename="(.+)"/);
                if (match) filename = match[1];
            }

            a.download = filename;
            document.body.appendChild(a);
            a.click();
            window.URL.revokeObjectURL(url);
            a.remove();
        } else {
            const error = await response.json();
            alert('Error generating report: ' + (error.detail || 'Unknown error'));
        }
    } catch (error) {
        alert('Error generating report: ' + error.message);
    } finally {
        btn.disabled = false;
        btn.innerHTML = originalText;
    }
}
</script>

<style>
/* Dark mode jspreadsheet styles */
.dark .jexcel { background-color: #1e293b; color: #e2e8f0; }
.dark .jexcel thead td { background-color: #334155 !important; color: #e2e8f0 !important; border-color: #475569 !important; }
.dark .jexcel tbody td { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel tbody td:hover { background-color: #334155; }
.dark .jexcel tbody tr:nth-child(even) td { background-color: #0f172a; }
.dark .jexcel_pagination { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_pagination a { color: #e2e8f0; }
.dark .jexcel_search { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_search input { background-color: #334155; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_content { background-color: #1e293b; }
.dark .jexcel_contextmenu { background-color: #1e293b; border-color: #475569; }
.dark .jexcel_contextmenu a { color: #e2e8f0; }
.dark .jexcel_contextmenu a:hover { background-color: #334155; }
.jexcel_content { max-height: 600px; overflow: auto; }
</style>
{% endblock %}
```
templates/combined_report_wizard.html (new file, 393 lines)

```html
{% extends "base.html" %}

{% block title %}Combined Report Wizard - {{ project.name }}{% endblock %}

{% block content %}
<div class="min-h-screen bg-gray-100 dark:bg-slate-900">
    <!-- Header -->
    <div class="bg-white dark:bg-slate-800 shadow-sm border-b border-gray-200 dark:border-gray-700">
        <div class="max-w-4xl mx-auto px-4 sm:px-6 lg:px-8 py-4">
            <div class="flex flex-col md:flex-row md:items-center md:justify-between gap-4">
                <div>
                    <h1 class="text-2xl font-bold text-gray-900 dark:text-white">Combined Report Wizard</h1>
                    <p class="text-sm text-gray-500 dark:text-gray-400 mt-1">{{ project.name }}</p>
                </div>
                <a href="/projects/{{ project_id }}"
                   class="px-4 py-2 bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-lg hover:bg-gray-300 dark:hover:bg-gray-600 transition-colors text-sm w-fit">
                    ← Back to Project
                </a>
            </div>
        </div>
    </div>

    <div class="max-w-4xl mx-auto px-4 sm:px-6 lg:px-8 py-6 space-y-6">

        <!-- Report Settings Card -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-6">
            <h2 class="text-lg font-semibold text-gray-900 dark:text-white mb-4">Report Settings</h2>

            <!-- Template Selection -->
            <div class="flex items-end gap-2 mb-4">
                <div class="flex-1">
                    <label for="template-select" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                        Load Template
                    </label>
                    <select id="template-select" onchange="applyTemplate()"
                            class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
                        <option value="">-- Select a template --</option>
                    </select>
                </div>
                <button type="button" onclick="saveAsTemplate()"
                        class="px-3 py-2 text-sm bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-md hover:bg-gray-300 dark:hover:bg-gray-600"
                        title="Save current settings as template">
                    <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7H5a2 2 0 00-2 2v9a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-3m-1 4l-3 3m0 0l-3-3m3 3V4"></path>
                    </svg>
                </button>
            </div>

            <!-- Report Title -->
            <div class="mb-4">
                <label for="report-title" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                    Report Title
                </label>
                <input type="text" id="report-title" value="Background Noise Study"
                       class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
            </div>

            <!-- Project and Client -->
            <div class="grid grid-cols-1 sm:grid-cols-2 gap-4">
                <div>
                    <label for="report-project" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                        Project Name
                    </label>
                    <input type="text" id="report-project" value="{{ project.name }}"
                           class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
                </div>
                <div>
                    <label for="report-client" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                        Client Name
                    </label>
                    <input type="text" id="report-client" value="{{ project.client_name if project.client_name else '' }}"
                           class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
                </div>
            </div>
        </div>

        <!-- Sessions Card -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-6 overflow-hidden">
            <div class="flex items-center justify-between mb-1">
                <h2 class="text-lg font-semibold text-gray-900 dark:text-white">Monitoring Sessions</h2>
                <div class="flex gap-3 text-sm">
                    <button type="button" onclick="selectAllSessions()" class="text-emerald-600 dark:text-emerald-400 hover:underline">Select All</button>
                    <button type="button" onclick="deselectAllSessions()" class="text-gray-500 dark:text-gray-400 hover:underline">Deselect All</button>
                </div>
            </div>
            <p class="text-sm text-gray-500 dark:text-gray-400 mb-4">
                <span id="selected-count">0</span> session(s) selected — each selected session becomes one sheet in the ZIP.
                Change the period type per session to control how stats are bucketed (Day vs Night).
            </p>

            {% if locations %}
            {% for loc in locations %}
            {% set loc_name = loc.name %}
            {% set sessions = loc.sessions %}
            <div class="border border-gray-200 dark:border-gray-700 rounded-lg mb-3 overflow-hidden">
                <!-- Location header / toggle -->
                <button type="button"
                        onclick="toggleLocation('loc-{{ loop.index }}')"
                        class="w-full flex items-center justify-between px-4 py-3 bg-gray-50 dark:bg-slate-700/50 hover:bg-gray-100 dark:hover:bg-slate-700 transition-colors text-left">
                    <div class="flex items-center gap-3">
                        <svg id="chevron-loc-{{ loop.index }}" class="w-4 h-4 text-gray-400 transition-transform" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
                        </svg>
                        <span class="font-medium text-gray-900 dark:text-white text-sm">{{ loc_name }}</span>
                        <span class="text-xs text-gray-400 dark:text-gray-500">{{ sessions|length }} session{{ 's' if sessions|length != 1 else '' }}</span>
                    </div>
                    <div class="flex items-center gap-3 text-xs" onclick="event.stopPropagation()">
                        <button type="button" onclick="selectLocation('loc-{{ loop.index }}')"
                                class="text-emerald-600 dark:text-emerald-400 hover:underline">All</button>
                        <button type="button" onclick="deselectLocation('loc-{{ loop.index }}')"
                                class="text-gray-400 hover:underline">None</button>
                    </div>
                </button>

                <!-- Session rows -->
                <div id="loc-{{ loop.index }}" class="divide-y divide-gray-100 dark:divide-gray-700/50">
                    {% for s in sessions %}
                    {% set pt_colors = {
                        'weekday_day': 'bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300',
                        'weekday_night': 'bg-indigo-100 text-indigo-800 dark:bg-indigo-900/30 dark:text-indigo-300',
                        'weekend_day': 'bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300',
                        'weekend_night': 'bg-purple-100 text-purple-800 dark:bg-purple-900/30 dark:text-purple-300',
                    } %}
                    {% set pt_labels = {
                        'weekday_day': 'Weekday Day',
                        'weekday_night': 'Weekday Night',
                        'weekend_day': 'Weekend Day',
                        'weekend_night': 'Weekend Night',
                    } %}
                    <div class="flex items-center gap-3 px-4 py-3 hover:bg-gray-50 dark:hover:bg-slate-700/30 transition-colors">
                        <!-- Checkbox -->
                        <input type="checkbox"
                               class="session-cb loc-{{ loop.index }}-cb h-4 w-4 text-emerald-600 border-gray-300 dark:border-gray-600 rounded focus:ring-emerald-500 shrink-0"
                               value="{{ s.session_id }}"
                               checked
                               onchange="updateSelectionStats()">

                        <!-- Date/day info -->
                        <div class="min-w-0 flex-1">
                            <div class="flex flex-wrap items-center gap-2">
                                <span class="text-sm font-medium text-gray-900 dark:text-white">
                                    {{ s.day_of_week }} {{ s.date_display }}
                                </span>
                                {% if s.session_label %}
                                <span class="text-xs text-gray-400 dark:text-gray-500 truncate">{{ s.session_label }}</span>
                                {% endif %}
                                {% if s.status == 'recording' %}
                                <span class="px-1.5 py-0.5 text-xs bg-red-100 text-red-700 dark:bg-red-900/30 dark:text-red-300 rounded-full flex items-center gap-1">
                                    <span class="w-1.5 h-1.5 bg-red-500 rounded-full animate-pulse"></span>Recording
                                </span>
                                {% endif %}
                            </div>
                            <div class="flex items-center gap-3 mt-0.5 text-xs text-gray-400 dark:text-gray-500">
                                {% if s.started_at %}
                                <span>{{ s.started_at }}</span>
                                {% endif %}
                                {% if s.duration_h is not none %}
                                <span>{{ s.duration_h }}h {{ s.duration_m }}m</span>
                                {% endif %}
                            </div>
                        </div>

                        <!-- Period type dropdown -->
                        <div class="relative shrink-0" id="wiz-period-wrap-{{ s.session_id }}">
                            <button type="button"
                                    onclick="toggleWizPeriodMenu('{{ s.session_id }}')"
                                    id="wiz-period-badge-{{ s.session_id }}"
                                    class="px-2 py-0.5 text-xs font-medium rounded-full flex items-center gap-1 transition-colors {{ pt_colors.get(s.period_type, 'bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400') }}"
                                    title="Click to change period type">
                                <span id="wiz-period-label-{{ s.session_id }}">{{ pt_labels.get(s.period_type, 'Set period') }}</span>
                                <svg class="w-3 h-3 opacity-60 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
                                </svg>
                            </button>
                            <div id="wiz-period-menu-{{ s.session_id }}"
                                 class="hidden absolute right-0 top-full mt-1 z-20 bg-white dark:bg-slate-700 border border-gray-200 dark:border-gray-600 rounded-lg shadow-lg min-w-[160px] py-1">
                                {% for pt, pt_label in [('weekday_day','Weekday Day'),('weekday_night','Weekday Night'),('weekend_day','Weekend Day'),('weekend_night','Weekend Night')] %}
                                <button type="button"
                                        onclick="setWizPeriodType('{{ s.session_id }}', '{{ pt }}')"
                                        class="w-full text-left px-3 py-1.5 text-xs hover:bg-gray-100 dark:hover:bg-slate-600 text-gray-700 dark:text-gray-300 {% if s.period_type == pt %}font-bold{% endif %}">
                                    {{ pt_label }}
                                </button>
                                {% endfor %}
                            </div>
                        </div>
                    </div>
                    {% endfor %}
                </div>
            </div>
            {% endfor %}
            {% else %}
            <div class="text-center py-10 text-gray-500 dark:text-gray-400">
                <svg class="w-12 h-12 mx-auto mb-3 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19V6l12-3v13M9 19c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zm12-3c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zM9 10l12-3"></path>
                </svg>
                <p>No monitoring sessions found.</p>
                <p class="text-sm mt-1">Upload data files to create sessions first.</p>
            </div>
            {% endif %}
        </div>

        <!-- Footer Buttons -->
        <div class="flex flex-col sm:flex-row items-center justify-between gap-3 pb-6">
            <a href="/projects/{{ project_id }}"
               class="w-full sm:w-auto px-6 py-2.5 border border-gray-300 dark:border-gray-600 text-gray-700 dark:text-gray-300 bg-white dark:bg-gray-700 rounded-lg hover:bg-gray-50 dark:hover:bg-gray-600 transition-colors text-center text-sm font-medium">
                Cancel
            </a>
            <button type="button" onclick="gotoPreview()" id="preview-btn"
                    {% if not locations %}disabled{% endif %}
                    class="w-full sm:w-auto px-6 py-2.5 bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 transition-colors text-sm font-medium flex items-center justify-center gap-2 disabled:opacity-50 disabled:cursor-not-allowed">
                <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15 12a3 3 0 11-6 0 3 3 0 016 0z"></path>
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M2.458 12C3.732 7.943 7.523 5 12 5c4.478 0 8.268 2.943 9.542 7-1.274 4.057-5.064 7-9.542 7-4.477 0-8.268-2.943-9.542-7z"></path>
                </svg>
                Preview & Edit →
            </button>
        </div>

    </div>
</div>

<script>
const PROJECT_ID = '{{ project_id }}';

const PERIOD_COLORS = {
    weekday_day: 'bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300',
    weekday_night: 'bg-indigo-100 text-indigo-800 dark:bg-indigo-900/30 dark:text-indigo-300',
    weekend_day: 'bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300',
    weekend_night: 'bg-purple-100 text-purple-800 dark:bg-purple-900/30 dark:text-purple-300',
};
const PERIOD_LABELS = {
    weekday_day: 'Weekday Day',
    weekday_night: 'Weekday Night',
    weekend_day: 'Weekend Day',
    weekend_night: 'Weekend Night',
};
const ALL_PERIOD_BADGE_CLASSES = [
    'bg-gray-100','text-gray-500','dark:bg-gray-700','dark:text-gray-400',
    ...new Set(Object.values(PERIOD_COLORS).flatMap(s => s.split(' ')))
];

// ── Location accordion ────────────────────────────────────────────

function toggleLocation(locId) {
    const body = document.getElementById(locId);
    const chevron = document.getElementById('chevron-' + locId);
    body.classList.toggle('hidden');
    chevron.style.transform = body.classList.contains('hidden') ? 'rotate(-90deg)' : '';
```
|
||||
}
|
||||
|
||||
function selectLocation(locId) {
|
||||
document.querySelectorAll('.' + locId + '-cb').forEach(cb => cb.checked = true);
|
||||
updateSelectionStats();
|
||||
}
|
||||
|
||||
function deselectLocation(locId) {
|
||||
document.querySelectorAll('.' + locId + '-cb').forEach(cb => cb.checked = false);
|
||||
updateSelectionStats();
|
||||
}
|
||||
|
||||
function selectAllSessions() {
|
||||
document.querySelectorAll('.session-cb').forEach(cb => cb.checked = true);
|
||||
updateSelectionStats();
|
||||
}
|
||||
|
||||
function deselectAllSessions() {
|
||||
document.querySelectorAll('.session-cb').forEach(cb => cb.checked = false);
|
||||
updateSelectionStats();
|
||||
}
|
||||
|
||||
function updateSelectionStats() {
|
||||
const count = document.querySelectorAll('.session-cb:checked').length;
|
||||
document.getElementById('selected-count').textContent = count;
|
||||
document.getElementById('preview-btn').disabled = count === 0;
|
||||
}
|
||||
|
||||
// ── Period type dropdown (wizard) ─────────────────────────────────
|
||||
|
||||
function toggleWizPeriodMenu(sessionId) {
|
||||
const menu = document.getElementById('wiz-period-menu-' + sessionId);
|
||||
document.querySelectorAll('[id^="wiz-period-menu-"]').forEach(m => {
|
||||
if (m.id !== 'wiz-period-menu-' + sessionId) m.classList.add('hidden');
|
||||
});
|
||||
menu.classList.toggle('hidden');
|
||||
}
|
||||
|
||||
document.addEventListener('click', function(e) {
|
||||
if (!e.target.closest('[id^="wiz-period-wrap-"]')) {
|
||||
document.querySelectorAll('[id^="wiz-period-menu-"]').forEach(m => m.classList.add('hidden'));
|
||||
}
|
||||
});
|
||||
|
||||
async function setWizPeriodType(sessionId, periodType) {
|
||||
document.getElementById('wiz-period-menu-' + sessionId).classList.add('hidden');
|
||||
const badge = document.getElementById('wiz-period-badge-' + sessionId);
|
||||
badge.disabled = true;
|
||||
try {
|
||||
const resp = await fetch(`/api/projects/${PROJECT_ID}/sessions/${sessionId}`, {
|
||||
method: 'PATCH',
|
||||
headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify({period_type: periodType}),
|
||||
});
|
||||
if (!resp.ok) throw new Error(await resp.text());
|
||||
ALL_PERIOD_BADGE_CLASSES.forEach(c => badge.classList.remove(c));
|
||||
const colorStr = PERIOD_COLORS[periodType] || 'bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400';
|
||||
badge.classList.add(...colorStr.split(' ').filter(Boolean));
|
||||
document.getElementById('wiz-period-label-' + sessionId).textContent = PERIOD_LABELS[periodType] || periodType;
|
||||
} catch(err) {
|
||||
alert('Failed to update period type: ' + err.message);
|
||||
} finally {
|
||||
badge.disabled = false;
|
||||
}
|
||||
}
|
||||
|
||||
// ── Template management ───────────────────────────────────────────
|
||||
|
||||
let reportTemplates = [];
|
||||
|
||||
async function loadTemplates() {
|
||||
try {
|
||||
const resp = await fetch('/api/report-templates?project_id=' + PROJECT_ID);
|
||||
if (resp.ok) {
|
||||
reportTemplates = await resp.json();
|
||||
populateTemplateDropdown();
|
||||
}
|
||||
} catch(e) { console.error('Error loading templates:', e); }
|
||||
}
|
||||
|
||||
function populateTemplateDropdown() {
|
||||
const select = document.getElementById('template-select');
|
||||
if (!select) return;
|
||||
select.innerHTML = '<option value="">-- Select a template --</option>';
|
||||
reportTemplates.forEach(t => {
|
||||
const opt = document.createElement('option');
|
||||
opt.value = t.id;
|
||||
opt.textContent = t.name;
|
||||
opt.dataset.config = JSON.stringify(t);
|
||||
select.appendChild(opt);
|
||||
});
|
||||
}
|
||||
|
||||
function applyTemplate() {
|
||||
const select = document.getElementById('template-select');
|
||||
const opt = select.options[select.selectedIndex];
|
||||
if (!opt.value) return;
|
||||
const t = JSON.parse(opt.dataset.config);
|
||||
if (t.report_title) document.getElementById('report-title').value = t.report_title;
|
||||
}
|
||||
|
||||
async function saveAsTemplate() {
|
||||
const name = prompt('Enter a name for this template:');
|
||||
if (!name) return;
|
||||
const data = {
|
||||
name,
|
||||
project_id: PROJECT_ID,
|
||||
report_title: document.getElementById('report-title').value || 'Background Noise Study',
|
||||
};
|
||||
try {
|
||||
const resp = await fetch('/api/report-templates', {
|
||||
method: 'POST',
|
||||
headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify(data),
|
||||
});
|
||||
if (resp.ok) { alert('Template saved!'); loadTemplates(); }
|
||||
else alert('Failed to save template');
|
||||
} catch(e) { alert('Error: ' + e.message); }
|
||||
}
|
||||
|
||||
// ── Navigate to preview ───────────────────────────────────────────
|
||||
|
||||
function gotoPreview() {
|
||||
const checked = Array.from(document.querySelectorAll('.session-cb:checked')).map(cb => cb.value);
|
||||
if (checked.length === 0) {
|
||||
alert('Please select at least one session.');
|
||||
return;
|
||||
}
|
||||
const params = new URLSearchParams({
|
||||
report_title: document.getElementById('report-title').value || 'Background Noise Study',
|
||||
project_name: document.getElementById('report-project').value || '',
|
||||
client_name: document.getElementById('report-client').value || '',
|
||||
selected_sessions: checked.join(','),
|
||||
});
|
||||
window.location.href = `/api/projects/${PROJECT_ID}/combined-report-preview?${params.toString()}`;
|
||||
}
|
||||
|
||||
// ── Init ─────────────────────────────────────────────────────────
|
||||
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
updateSelectionStats();
|
||||
loadTemplates();
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
|
||||
@@ -27,10 +27,10 @@
|
||||
hx-swap="none"
|
||||
hx-on::after-request="updateDashboard(event)">
|
||||
|
||||
<div class="grid grid-cols-1 md:grid-cols-3 gap-6 mb-8">
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6 mb-8">
|
||||
|
||||
<!-- Fleet Summary Card -->
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="fleet-summary-card">
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="fleet-summary-card">
|
||||
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-summary')">
|
||||
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">Fleet Summary</h2>
|
||||
<div class="flex items-center gap-2">
|
||||
@@ -118,7 +118,7 @@
|
||||
</div>
|
||||
|
||||
<!-- Recent Alerts Card -->
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="recent-alerts-card">
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="recent-alerts-card">
|
||||
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-alerts')">
|
||||
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">Recent Alerts</h2>
|
||||
<div class="flex items-center gap-2">
|
||||
@@ -138,7 +138,7 @@
|
||||
</div>
|
||||
|
||||
<!-- Recently Called In Units Card -->
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="recent-callins-card">
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="recent-callins-card">
|
||||
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-callins')">
|
||||
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">Recent Call-Ins</h2>
|
||||
<div class="flex items-center gap-2">
|
||||
@@ -162,10 +162,95 @@
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Today's Scheduled Actions Card -->
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="todays-actions-card">
|
||||
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('todays-actions')">
|
||||
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">Today's Schedule</h2>
|
||||
<div class="flex items-center gap-2">
|
||||
<svg class="w-6 h-6 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
|
||||
d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z">
|
||||
</path>
|
||||
</svg>
|
||||
<svg class="w-5 h-5 text-gray-500 transition-transform md:hidden chevron" id="todays-actions-chevron" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
|
||||
</svg>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card-content" id="todays-actions-content"
|
||||
hx-get="/dashboard/todays-actions"
|
||||
hx-trigger="load, every 30s"
|
||||
hx-swap="innerHTML">
|
||||
<p class="text-sm text-gray-500 dark:text-gray-400">Loading scheduled actions...</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- Dashboard Filters -->
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-4 mb-4" id="dashboard-filters-card">
|
||||
<div class="flex items-center justify-between mb-3">
|
||||
<h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300">Filter Dashboard</h3>
|
||||
<button onclick="resetFilters()" class="text-xs text-gray-500 hover:text-seismo-orange dark:hover:text-seismo-orange transition-colors">
|
||||
Reset Filters
|
||||
</button>
|
||||
</div>
|
||||
|
||||
<div class="flex flex-wrap gap-6">
|
||||
<!-- Device Type Filters -->
|
||||
<div class="flex flex-col gap-1">
|
||||
<span class="text-xs text-gray-500 dark:text-gray-400 font-medium uppercase tracking-wide">Device Type</span>
|
||||
<div class="flex gap-4">
|
||||
<label class="flex items-center gap-1.5 cursor-pointer">
|
||||
<input type="checkbox" id="filter-seismograph" checked
|
||||
class="rounded border-gray-300 text-blue-600 focus:ring-blue-500 dark:border-gray-600 dark:bg-slate-800"
|
||||
onchange="applyFilters()">
|
||||
<span class="text-sm text-gray-700 dark:text-gray-300">Seismographs</span>
|
||||
</label>
|
||||
<label class="flex items-center gap-1.5 cursor-pointer">
|
||||
<input type="checkbox" id="filter-slm" checked
|
||||
class="rounded border-gray-300 text-purple-600 focus:ring-purple-500 dark:border-gray-600 dark:bg-slate-800"
|
||||
onchange="applyFilters()">
|
||||
<span class="text-sm text-gray-700 dark:text-gray-300">SLMs</span>
|
||||
</label>
|
||||
<label class="flex items-center gap-1.5 cursor-pointer">
|
||||
<input type="checkbox" id="filter-modem" checked
|
||||
class="rounded border-gray-300 text-cyan-600 focus:ring-cyan-500 dark:border-gray-600 dark:bg-slate-800"
|
||||
onchange="applyFilters()">
|
||||
<span class="text-sm text-gray-700 dark:text-gray-300">Modems</span>
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Status Filters -->
|
||||
<div class="flex flex-col gap-1">
|
||||
<span class="text-xs text-gray-500 dark:text-gray-400 font-medium uppercase tracking-wide">Status</span>
|
||||
<div class="flex gap-4">
|
||||
<label class="flex items-center gap-1.5 cursor-pointer">
|
||||
<input type="checkbox" id="filter-ok" checked
|
||||
class="rounded border-gray-300 text-green-600 focus:ring-green-500 dark:border-gray-600 dark:bg-slate-800"
|
||||
onchange="applyFilters()">
|
||||
<span class="text-sm text-green-600 dark:text-green-400">OK</span>
|
||||
</label>
|
||||
<label class="flex items-center gap-1.5 cursor-pointer">
|
||||
<input type="checkbox" id="filter-pending" checked
|
||||
class="rounded border-gray-300 text-yellow-600 focus:ring-yellow-500 dark:border-gray-600 dark:bg-slate-800"
|
||||
onchange="applyFilters()">
|
||||
<span class="text-sm text-yellow-600 dark:text-yellow-400">Pending</span>
|
||||
</label>
|
||||
<label class="flex items-center gap-1.5 cursor-pointer">
|
||||
<input type="checkbox" id="filter-missing" checked
|
||||
class="rounded border-gray-300 text-red-600 focus:ring-red-500 dark:border-gray-600 dark:bg-slate-800"
|
||||
onchange="applyFilters()">
|
||||
<span class="text-sm text-red-600 dark:text-red-400">Missing</span>
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Fleet Map -->
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6 mb-8" id="fleet-map-card">
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6 mb-8" id="fleet-map-card">
|
||||
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-map')">
|
||||
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Fleet Map</h2>
|
||||
<div class="flex items-center gap-2">
|
||||
@@ -181,7 +266,7 @@
|
||||
</div>
|
||||
|
||||
<!-- Recent Photos Section -->
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6 mb-8" id="recent-photos-card">
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6 mb-8" id="recent-photos-card">
|
||||
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-photos')">
|
||||
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Recent Photos</h2>
|
||||
<div class="flex items-center gap-2">
|
||||
@@ -201,7 +286,7 @@
|
||||
</div>
|
||||
|
||||
<!-- Fleet Status Section with Tabs -->
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="fleet-status-card">
|
||||
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="fleet-status-card">
|
||||
|
||||
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-status')">
|
||||
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Fleet Status</h2>
|
||||
@@ -279,6 +364,255 @@
|
||||
|

<script>
// ===== Dashboard Filtering System =====
let currentSnapshotData = null; // Store latest snapshot data for re-filtering

// Filter state - tracks which device types and statuses to show
const filters = {
  deviceTypes: {
    seismograph: true,
    sound_level_meter: true,
    modem: true
  },
  statuses: {
    OK: true,
    Pending: true,
    Missing: true
  }
};

// Load saved filter preferences from localStorage
function loadFilterPreferences() {
  const saved = localStorage.getItem('dashboardFilters');
  if (saved) {
    try {
      const parsed = JSON.parse(saved);
      if (parsed.deviceTypes) Object.assign(filters.deviceTypes, parsed.deviceTypes);
      if (parsed.statuses) Object.assign(filters.statuses, parsed.statuses);
    } catch (e) {
      console.error('Error loading filter preferences:', e);
    }
  }

  // Sync checkboxes with loaded state
  const seismoCheck = document.getElementById('filter-seismograph');
  const slmCheck = document.getElementById('filter-slm');
  const modemCheck = document.getElementById('filter-modem');
  const okCheck = document.getElementById('filter-ok');
  const pendingCheck = document.getElementById('filter-pending');
  const missingCheck = document.getElementById('filter-missing');

  if (seismoCheck) seismoCheck.checked = filters.deviceTypes.seismograph;
  if (slmCheck) slmCheck.checked = filters.deviceTypes.sound_level_meter;
  if (modemCheck) modemCheck.checked = filters.deviceTypes.modem;
  if (okCheck) okCheck.checked = filters.statuses.OK;
  if (pendingCheck) pendingCheck.checked = filters.statuses.Pending;
  if (missingCheck) missingCheck.checked = filters.statuses.Missing;
}

// Save filter preferences to localStorage
function saveFilterPreferences() {
  localStorage.setItem('dashboardFilters', JSON.stringify(filters));
}

// Apply filters - called when any checkbox changes
function applyFilters() {
  // Update filter state from checkboxes
  const seismoCheck = document.getElementById('filter-seismograph');
  const slmCheck = document.getElementById('filter-slm');
  const modemCheck = document.getElementById('filter-modem');
  const okCheck = document.getElementById('filter-ok');
  const pendingCheck = document.getElementById('filter-pending');
  const missingCheck = document.getElementById('filter-missing');

  if (seismoCheck) filters.deviceTypes.seismograph = seismoCheck.checked;
  if (slmCheck) filters.deviceTypes.sound_level_meter = slmCheck.checked;
  if (modemCheck) filters.deviceTypes.modem = modemCheck.checked;
  if (okCheck) filters.statuses.OK = okCheck.checked;
  if (pendingCheck) filters.statuses.Pending = pendingCheck.checked;
  if (missingCheck) filters.statuses.Missing = missingCheck.checked;

  saveFilterPreferences();

  // Re-render with current data and filters
  if (currentSnapshotData) {
    renderFilteredDashboard(currentSnapshotData);
  }
}

// Reset all filters to show everything
function resetFilters() {
  filters.deviceTypes = { seismograph: true, sound_level_meter: true, modem: true };
  filters.statuses = { OK: true, Pending: true, Missing: true };

  // Update all checkboxes
  const checkboxes = [
    'filter-seismograph', 'filter-slm', 'filter-modem',
    'filter-ok', 'filter-pending', 'filter-missing'
  ];
  checkboxes.forEach(id => {
    const el = document.getElementById(id);
    if (el) el.checked = true;
  });

  saveFilterPreferences();

  if (currentSnapshotData) {
    renderFilteredDashboard(currentSnapshotData);
  }
}

// Check if a unit passes the current filters
function unitPassesFilter(unit) {
  const deviceType = unit.device_type || 'seismograph';
  const status = unit.status || 'Missing';

  // Check device type filter
  if (!filters.deviceTypes[deviceType]) {
    return false;
  }

  // Check status filter
  if (!filters.statuses[status]) {
    return false;
  }

  return true;
}

// Get display label for device type
function getDeviceTypeLabel(deviceType) {
  switch(deviceType) {
    case 'sound_level_meter': return 'SLM';
    case 'modem': return 'Modem';
    default: return 'Seismograph';
  }
}

// Render dashboard with filtered data
function renderFilteredDashboard(data) {
  // Filter active units for alerts
  const filteredActive = {};
  Object.entries(data.active || {}).forEach(([id, unit]) => {
    if (unitPassesFilter(unit)) {
      filteredActive[id] = unit;
    }
  });

  // Update alerts with filtered data
  updateAlertsFiltered(filteredActive);

  // Update map with filtered data
  updateFleetMapFiltered(data.units);
}

// Update the Recent Alerts section with filtering
function updateAlertsFiltered(filteredActive) {
  const alertsList = document.getElementById('alerts-list');
  const missingUnits = Object.entries(filteredActive).filter(([_, u]) => u.status === 'Missing');

  if (!missingUnits.length) {
    // Check if this is because of filters or genuinely no alerts
    const anyMissing = currentSnapshotData && Object.values(currentSnapshotData.active || {}).some(u => u.status === 'Missing');
    if (anyMissing) {
      alertsList.innerHTML = '<p class="text-sm text-gray-500 dark:text-gray-400">No alerts match current filters</p>';
    } else {
      alertsList.innerHTML = '<p class="text-sm text-green-600 dark:text-green-400">All units reporting normally</p>';
    }
  } else {
    let alertsHtml = '';
    missingUnits.forEach(([id, unit]) => {
      const deviceLabel = getDeviceTypeLabel(unit.device_type);
      alertsHtml += `
        <div class="flex items-start space-x-2 text-sm">
          <span class="w-2 h-2 rounded-full bg-red-500 mt-1.5"></span>
          <div>
            <a href="/unit/${id}" class="font-medium text-red-600 dark:text-red-400 hover:underline">${id}</a>
            <span class="text-xs text-gray-500 ml-1">(${deviceLabel})</span>
            <p class="text-gray-600 dark:text-gray-400">Missing for ${unit.age}</p>
          </div>
        </div>`;
    });
    alertsList.innerHTML = alertsHtml;
  }
}

// Update map with filtered data
function updateFleetMapFiltered(allUnits) {
  if (!fleetMap) return;

  // Clear existing markers
  fleetMarkers.forEach(marker => fleetMap.removeLayer(marker));
  fleetMarkers = [];

  // Get deployed units with coordinates that pass the filter
  const deployedUnits = Object.entries(allUnits || {})
    .filter(([_, u]) => u.deployed && u.coordinates && unitPassesFilter(u));

  if (deployedUnits.length === 0) {
    return;
  }

  const bounds = [];

  deployedUnits.forEach(([id, unit]) => {
    const coords = parseLocation(unit.coordinates);
    if (coords) {
      const [lat, lon] = coords;

      // Color based on status
      const markerColor = unit.status === 'OK' ? 'green' :
                          unit.status === 'Pending' ? 'orange' : 'red';

      // Different marker style per device type
      const deviceType = unit.device_type || 'seismograph';
      let radius = 8;
      let weight = 2;

      if (deviceType === 'modem') {
        radius = 6;
        weight = 2;
      } else if (deviceType === 'sound_level_meter') {
        radius = 8;
        weight = 3;
      }

      const marker = L.circleMarker([lat, lon], {
        radius: radius,
        fillColor: markerColor,
        color: '#fff',
        weight: weight,
        opacity: 1,
        fillOpacity: 0.8
      }).addTo(fleetMap);

      // Popup with device type
      const deviceLabel = getDeviceTypeLabel(deviceType);

      marker.bindPopup(`
        <div class="p-2">
          <h3 class="font-bold text-lg">${id}</h3>
          <p class="text-sm text-gray-600">${deviceLabel}</p>
          <p class="text-sm">Status: <span style="color: ${markerColor}">${unit.status}</span></p>
          ${unit.note ? `<p class="text-sm text-gray-600">${unit.note}</p>` : ''}
          <a href="/unit/${id}" class="text-blue-600 hover:underline text-sm">View Details</a>
        </div>
      `);

      fleetMarkers.push(marker);
      bounds.push([lat, lon]);
    }
  });

  // Only fit bounds on initial load, not on subsequent updates
  // This preserves the user's current map view when auto-refreshing
  if (bounds.length > 0 && !fleetMapInitialized) {
    const padding = window.innerWidth < 768 ? [20, 20] : [50, 50];
    fleetMap.fitBounds(bounds, { padding: padding });
    fleetMapInitialized = true;
  }
}

// Toggle card collapse/expand (mobile only)
function toggleCard(cardName) {
  // Only work on mobile
@@ -316,7 +650,7 @@ function toggleCard(cardName) {
// Restore card states from localStorage on page load
function restoreCardStates() {
  const cardStates = JSON.parse(localStorage.getItem('dashboardCardStates') || '{}');
  const cardNames = ['fleet-summary', 'recent-alerts', 'recent-callins', 'fleet-map', 'fleet-status'];
  const cardNames = ['fleet-summary', 'recent-alerts', 'recent-callins', 'todays-actions', 'fleet-map', 'fleet-status'];

  cardNames.forEach(cardName => {
    const content = document.getElementById(`${cardName}-content`);
@@ -343,8 +677,17 @@ if (document.readyState === 'loading') {

function updateDashboard(event) {
  try {
    // Only process responses from /api/status-snapshot
    const requestUrl = event.detail.xhr.responseURL || event.detail.pathInfo?.requestPath;
    if (!requestUrl || !requestUrl.includes('/api/status-snapshot')) {
      return; // Ignore responses from other endpoints (like /dashboard/todays-actions)
    }

    const data = JSON.parse(event.detail.xhr.response);

    // Store data for filter re-application
    currentSnapshotData = data;

    // Update "Last updated" timestamp with timezone
    const now = new Date();
    const timezone = localStorage.getItem('timezone') || 'America/New_York';
@@ -356,7 +699,7 @@ function updateDashboard(event) {
      timeZoneName: 'short'
    });

    // ===== Fleet summary numbers =====
    // ===== Fleet summary numbers (always unfiltered) =====
    document.getElementById('total-units').textContent = data.summary?.total ?? 0;
    document.getElementById('deployed-units').textContent = data.summary?.active ?? 0;
    document.getElementById('benched-units').textContent = data.summary?.benched ?? 0;
@@ -364,9 +707,10 @@ function updateDashboard(event) {
    document.getElementById('status-pending').textContent = data.summary?.pending ?? 0;
    document.getElementById('status-missing').textContent = data.summary?.missing ?? 0;

    // ===== Device type counts =====
    // ===== Device type counts (always unfiltered) =====
    let seismoCount = 0;
    let slmCount = 0;
    let modemCount = 0;
    Object.values(data.units || {}).forEach(unit => {
      if (unit.retired) return; // Don't count retired units
      const deviceType = unit.device_type || 'seismograph';
@@ -374,46 +718,26 @@ function updateDashboard(event) {
        seismoCount++;
      } else if (deviceType === 'sound_level_meter') {
        slmCount++;
      } else if (deviceType === 'modem') {
        modemCount++;
      }
    });
    document.getElementById('seismo-count').textContent = seismoCount;
    document.getElementById('slm-count').textContent = slmCount;

    // ===== Alerts =====
    const alertsList = document.getElementById('alerts-list');
    // Only show alerts for deployed units that are MISSING (not pending)
    const missingUnits = Object.entries(data.active).filter(([_, u]) => u.status === 'Missing');

    if (!missingUnits.length) {
      alertsList.innerHTML =
        '<p class="text-sm text-green-600 dark:text-green-400">✓ All units reporting normally</p>';
    } else {
      let alertsHtml = '';

      missingUnits.forEach(([id, unit]) => {
        alertsHtml += `
          <div class="flex items-start space-x-2 text-sm">
            <span class="w-2 h-2 rounded-full bg-red-500 mt-1.5"></span>
            <div>
              <a href="/unit/${id}" class="font-medium text-red-600 dark:text-red-400 hover:underline">${id}</a>
              <p class="text-gray-600 dark:text-gray-400">Missing for ${unit.age}</p>
            </div>
          </div>`;
      });

      alertsList.innerHTML = alertsHtml;
    }

    // ===== Update Fleet Map =====
    updateFleetMap(data);
    // ===== Apply filters and render map + alerts =====
    renderFilteredDashboard(data);

  } catch (err) {
    console.error("Dashboard update error:", err);
  }
}

// Handle tab switching
// Handle tab switching and initialize components
document.addEventListener('DOMContentLoaded', function() {
  // Load filter preferences
  loadFilterPreferences();

  const tabButtons = document.querySelectorAll('.tab-button');

  tabButtons.forEach(button => {
@@ -453,64 +777,6 @@ function initFleetMap() {
  }, 100);
}

function updateFleetMap(data) {
  if (!fleetMap) return;

  // Clear existing markers
  fleetMarkers.forEach(marker => fleetMap.removeLayer(marker));
  fleetMarkers = [];

  // Get deployed units with coordinates data
  const deployedUnits = Object.entries(data.units).filter(([_, u]) => u.deployed && u.coordinates);

  if (deployedUnits.length === 0) {
    return;
  }

  const bounds = [];

  deployedUnits.forEach(([id, unit]) => {
    const coords = parseLocation(unit.coordinates);
    if (coords) {
      const [lat, lon] = coords;

      // Create marker with custom color based on status
      const markerColor = unit.status === 'OK' ? 'green' : unit.status === 'Pending' ? 'orange' : 'red';

      const marker = L.circleMarker([lat, lon], {
        radius: 8,
        fillColor: markerColor,
        color: '#fff',
        weight: 2,
        opacity: 1,
        fillOpacity: 0.8
      }).addTo(fleetMap);

      // Add popup with unit info
      marker.bindPopup(`
        <div class="p-2">
          <h3 class="font-bold text-lg">${id}</h3>
          <p class="text-sm">Status: <span style="color: ${markerColor}">${unit.status}</span></p>
          <p class="text-sm">Type: ${unit.device_type}</p>
          ${unit.note ? `<p class="text-sm text-gray-600">${unit.note}</p>` : ''}
          <a href="/unit/${id}" class="text-blue-600 hover:underline text-sm">View Details →</a>
        </div>
      `);

      fleetMarkers.push(marker);
      bounds.push([lat, lon]);
    }
  });

  // Fit map to show all markers
  if (bounds.length > 0) {
    // Use different padding for mobile vs desktop
    const padding = window.innerWidth < 768 ? [20, 20] : [50, 50];
    fleetMap.fitBounds(bounds, { padding: padding });
    fleetMapInitialized = true;
  }
}

function parseLocation(location) {
  if (!location) return null;
|
||||
811
templates/fleet_calendar.html
Normal file
@@ -0,0 +1,811 @@
{% extends "base.html" %}

{% block title %}Fleet Calendar - Terra-View{% endblock %}

{% block extra_head %}
<style>
    /* Calendar grid layout */
    .calendar-grid {
        display: grid;
        grid-template-columns: repeat(4, 1fr);
        gap: 1rem;
    }

    @media (max-width: 1280px) {
        .calendar-grid {
            grid-template-columns: repeat(3, 1fr);
        }
    }

    @media (max-width: 768px) {
        .calendar-grid {
            grid-template-columns: repeat(2, 1fr);
        }
    }

    @media (max-width: 480px) {
        .calendar-grid {
            grid-template-columns: 1fr;
        }
    }

    /* Month card */
    .month-card {
        background: white;
        border-radius: 0.5rem;
        padding: 0.75rem;
        box-shadow: 0 1px 3px rgba(0,0,0,0.1);
    }

    .dark .month-card {
        background: rgb(30 41 59);
    }

    /* Day grid */
    .day-grid {
        display: grid;
        grid-template-columns: repeat(7, 1fr);
        gap: 2px;
        font-size: 0.75rem;
    }

    .day-header {
        text-align: center;
        font-weight: 600;
        color: #6b7280;
        padding: 0.25rem 0;
    }

    .dark .day-header {
        color: #9ca3af;
    }

    /* Day cell */
    .day-cell {
        aspect-ratio: 1;
        display: flex;
        align-items: center;
        justify-content: center;
        border-radius: 0.25rem;
        cursor: pointer;
        position: relative;
        transition: all 0.15s ease;
        font-size: 0.7rem;
    }

    .day-cell:hover {
        transform: scale(1.1);
        z-index: 10;
    }
    .day-cell.today {
        /* `ring` is a Tailwind utility, not a CSS property; emulate it with box-shadow */
        box-shadow: 0 0 0 2px #f48b1c;
        font-weight: bold;
    }

    /* Default neutral day style */
    .day-cell.neutral {
        background-color: #f3f4f6;
        color: #374151;
    }

    .dark .day-cell.neutral {
        background-color: rgba(55, 65, 81, 0.5);
        color: #d1d5db;
    }

    /* Status indicator dot */
    .day-cell .status-dot {
        position: absolute;
        top: 2px;
        right: 2px;
        width: 5px;
        height: 5px;
        border-radius: 50%;
    }

    .day-cell .status-dot.expired {
        background-color: #ef4444;
    }

    /* Reservation mode colors - applied dynamically */
    .day-cell.reservation-available {
        background-color: #dcfce7;
        color: #166534;
    }

    .dark .day-cell.reservation-available {
        background-color: rgba(34, 197, 94, 0.2);
        color: #86efac;
    }

    .day-cell.reservation-partial {
        background-color: #fef3c7;
        color: #92400e;
    }

    .dark .day-cell.reservation-partial {
        background-color: rgba(245, 158, 11, 0.2);
        color: #fcd34d;
    }

    .day-cell.reservation-unavailable {
        background-color: #fee2e2;
        color: #991b1b;
    }

    .dark .day-cell.reservation-unavailable {
        background-color: rgba(239, 68, 68, 0.2);
        color: #fca5a5;
    }

    /* Legacy status colors (kept for day detail) */
    .day-cell.some-reserved {
        background-color: #dbeafe;
        color: #1e40af;
    }

    .dark .day-cell.some-reserved {
        background-color: rgba(59, 130, 246, 0.2);
        color: #93c5fd;
    }

    .day-cell.empty {
        background: transparent;
        cursor: default;
    }

    .day-cell.empty:hover {
        transform: none;
    }

    /* Reservation bar */
    .reservation-bar {
        display: flex;
        align-items: center;
        padding: 0.5rem 0.75rem;
        border-radius: 0.375rem;
        margin-bottom: 0.5rem;
        cursor: pointer;
        transition: opacity 0.15s ease;
    }

    .reservation-bar:hover {
        opacity: 0.8;
    }

    /* Slide-over panel */
    .slide-panel {
        position: fixed;
        top: 0;
        right: 0;
        width: 100%;
        max-width: 28rem;
        height: 100vh;
        background: white;
        box-shadow: -4px 0 15px rgba(0,0,0,0.1);
        transform: translateX(100%);
        transition: transform 0.3s ease;
        z-index: 50;
        overflow-y: auto;
    }

    .dark .slide-panel {
        background: rgb(30 41 59);
    }

    .slide-panel.open {
        transform: translateX(0);
    }

    .panel-backdrop {
        position: fixed;
        inset: 0;
        background: rgba(0,0,0,0.3);
        opacity: 0;
        visibility: hidden;
        transition: all 0.3s ease;
        z-index: 40;
    }

    .panel-backdrop.open {
        opacity: 1;
        visibility: visible;
    }
</style>
{% endblock %}

{% block content %}
<div class="mb-6">
    <div class="flex flex-col sm:flex-row sm:items-center sm:justify-between gap-4">
        <div>
            <h1 class="text-3xl font-bold text-gray-900 dark:text-white">Fleet Calendar</h1>
            <p class="text-gray-600 dark:text-gray-400 mt-1">Plan unit assignments and track calibrations</p>
        </div>
    </div>
</div>

<!-- Summary Stats -->
<div class="grid grid-cols-2 md:grid-cols-5 gap-4 mb-6">
    <div class="bg-white dark:bg-slate-800 rounded-lg p-4 shadow">
        <p class="text-sm text-gray-500 dark:text-gray-400">Total Units</p>
        <p class="text-2xl font-bold text-gray-900 dark:text-white">{{ calendar_data.total_units }}</p>
    </div>
    <div class="bg-green-50 dark:bg-green-900/20 rounded-lg p-4 shadow">
        <p class="text-sm text-green-600 dark:text-green-400">Available Today</p>
        <p class="text-2xl font-bold text-green-700 dark:text-green-300" id="available-today">--</p>
    </div>
    <div class="bg-blue-50 dark:bg-blue-900/20 rounded-lg p-4 shadow">
        <p class="text-sm text-blue-600 dark:text-blue-400">Reserved Today</p>
        <p class="text-2xl font-bold text-blue-700 dark:text-blue-300" id="reserved-today">--</p>
    </div>
    <div class="bg-yellow-50 dark:bg-yellow-900/20 rounded-lg p-4 shadow">
        <p class="text-sm text-yellow-600 dark:text-yellow-400">Expiring Soon</p>
        <p class="text-2xl font-bold text-yellow-700 dark:text-yellow-300" id="expiring-today">--</p>
    </div>
    <div class="bg-red-50 dark:bg-red-900/20 rounded-lg p-4 shadow">
        <p class="text-sm text-red-600 dark:text-red-400">Cal. Expired</p>
        <p class="text-2xl font-bold text-red-700 dark:text-red-300" id="expired-today">--</p>
    </div>
</div>

<!-- Action Bar -->
<div class="flex flex-wrap items-center justify-between gap-4 mb-6">
    <div class="flex items-center gap-4">
        <!-- Device Type Toggle -->
        <div class="flex rounded-lg bg-gray-100 dark:bg-gray-700 p-1">
            <a href="/fleet-calendar?year={{ start_year }}&month={{ start_month }}&device_type=seismograph"
               class="px-4 py-2 rounded-md text-sm font-medium transition-colors {% if device_type == 'seismograph' %}bg-white dark:bg-slate-600 text-gray-900 dark:text-white shadow{% else %}text-gray-600 dark:text-gray-300 hover:text-gray-900 dark:hover:text-white{% endif %}">
                Seismographs
            </a>
            <a href="/fleet-calendar?year={{ start_year }}&month={{ start_month }}&device_type=slm"
               class="px-4 py-2 rounded-md text-sm font-medium transition-colors {% if device_type == 'slm' %}bg-white dark:bg-slate-600 text-gray-900 dark:text-white shadow{% else %}text-gray-600 dark:text-gray-300 hover:text-gray-900 dark:hover:text-white{% endif %}">
                SLMs
            </a>
        </div>

        <!-- Legend -->
        <div class="hidden md:flex items-center gap-4 text-sm" id="main-legend">
            <span class="flex items-center gap-1">
                <span class="w-3 h-3 rounded bg-gray-200 dark:bg-gray-600 relative">
                    <span class="absolute top-0 right-0 w-1.5 h-1.5 rounded-full bg-red-500"></span>
                </span>
                Cal expires
            </span>
        </div>
        <!-- Reservation mode legend (hidden by default) -->
        <div class="hidden md:hidden items-center gap-4 text-sm" id="reservation-legend">
            <span class="flex items-center gap-1">
                <span class="w-3 h-3 rounded bg-green-200 dark:bg-green-700"></span>
                Available
            </span>
            <span class="flex items-center gap-1">
                <span class="w-3 h-3 rounded bg-yellow-200 dark:bg-yellow-700"></span>
                Partial
            </span>
            <span class="flex items-center gap-1">
                <span class="w-3 h-3 rounded bg-red-200 dark:bg-red-700"></span>
                Unavailable
            </span>
        </div>
    </div>

    <button onclick="openReservationModal()"
            class="px-4 py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700 flex items-center gap-2">
        <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 4v16m8-8H4"/>
        </svg>
        New Reservation
    </button>
</div>

<!-- Calendar Grid -->
<div class="calendar-grid mb-8">
    {% for month_data in calendar_data.months %}
    <div class="month-card">
        <h3 class="font-semibold text-gray-900 dark:text-white mb-2 text-center">
            {{ month_data.short_name }} '{{ month_data.year_short }}
        </h3>
        <div class="day-grid">
            <!-- Day headers -->
            <div class="day-header">S</div>
            <div class="day-header">M</div>
            <div class="day-header">T</div>
            <div class="day-header">W</div>
            <div class="day-header">T</div>
            <div class="day-header">F</div>
            <div class="day-header">S</div>

            <!-- Empty cells for alignment (first_weekday is 0=Mon, we need 0=Sun) -->
            {% set first_day_offset = (month_data.first_weekday + 1) % 7 %}
            {% for i in range(first_day_offset) %}
            <div class="day-cell empty"></div>
            {% endfor %}

            <!-- Day cells -->
            {% for day_num in range(1, month_data.num_days + 1) %}
            {% set day_data = month_data.days[day_num] %}
            {% set date_str = '%04d-%02d-%02d'|format(month_data.year, month_data.month, day_num) %}
            {% set is_today = date_str == today %}
            {% set has_cal_expiring = day_data.cal_expiring_on_day is defined and day_data.cal_expiring_on_day > 0 %}
            <div class="day-cell neutral {% if is_today %}today ring-2 ring-seismo-orange{% endif %}"
                 data-date="{{ date_str }}"
                 data-available="{{ day_data.available }}"
                 onclick="openDayPanel('{{ date_str }}')"
                 title="Available: {{ day_data.available }}, Reserved: {{ day_data.reserved }}{% if has_cal_expiring %}, Cal expires: {{ day_data.cal_expiring_on_day }}{% endif %}">
                {{ day_num }}
                {% if has_cal_expiring %}
                <span class="status-dot expired"></span>
                {% endif %}
            </div>
            {% endfor %}
        </div>
    </div>
    {% endfor %}
</div>

<!-- Month Navigation (centered below calendar) -->
<div class="flex items-center justify-center gap-3 mb-8">
    <a href="/fleet-calendar?year={{ prev_year }}&month={{ prev_month }}&device_type={{ device_type }}"
       class="px-3 py-2 rounded-lg bg-gray-100 dark:bg-gray-700 hover:bg-gray-200 dark:hover:bg-gray-600 text-gray-700 dark:text-gray-300"
       title="Previous month">
        <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15 19l-7-7 7-7"/>
        </svg>
    </a>
    <span class="text-lg font-bold text-gray-900 dark:text-white px-3">
        {{ calendar_data.months[0].name }} {{ calendar_data.months[0].year }} - {{ calendar_data.months[11].name }} {{ calendar_data.months[11].year }}
    </span>
    <a href="/fleet-calendar?year={{ next_year }}&month={{ next_month }}&device_type={{ device_type }}"
       class="px-3 py-2 rounded-lg bg-gray-100 dark:bg-gray-700 hover:bg-gray-200 dark:hover:bg-gray-600 text-gray-700 dark:text-gray-300"
       title="Next month">
        <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5l7 7-7 7"/>
        </svg>
    </a>
    <a href="/fleet-calendar?device_type={{ device_type }}"
       class="ml-2 px-4 py-2 rounded-lg bg-seismo-orange text-white hover:bg-orange-600">
        Today
    </a>
</div>

<!-- Active Reservations -->
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6 mb-8">
    <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-4">Active Reservations</h2>
    <div id="reservations-list"
         hx-get="/api/fleet-calendar/reservations-list?year={{ start_year }}&month={{ start_month }}&device_type={{ device_type }}"
         hx-trigger="load"
         hx-swap="innerHTML">
        <p class="text-gray-500">Loading reservations...</p>
    </div>
</div>

<!-- Day Detail Slide Panel -->
<div id="panel-backdrop" class="panel-backdrop" onclick="closeDayPanel()"></div>
<div id="day-panel" class="slide-panel">
    <div class="p-6" id="day-panel-content">
        <!-- Content loaded via HTMX -->
    </div>
</div>

<!-- Reservation Modal -->
<div id="reservation-modal" class="fixed inset-0 z-50 hidden">
    <div class="fixed inset-0 bg-black/50" onclick="closeReservationModal()"></div>
    <div class="fixed inset-0 flex items-center justify-center p-4">
        <div class="bg-white dark:bg-slate-800 rounded-xl shadow-2xl max-w-lg w-full max-h-[90vh] overflow-y-auto" onclick="event.stopPropagation()">
            <div class="p-6">
                <div class="flex items-center justify-between mb-6">
                    <h2 class="text-xl font-semibold text-gray-900 dark:text-white">New Reservation</h2>
                    <button onclick="closeReservationModal()" class="text-gray-400 hover:text-gray-600 dark:hover:text-gray-300">
                        <svg class="w-6 h-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
                        </svg>
                    </button>
                </div>

                <form id="reservation-form" onsubmit="submitReservation(event)">
                    <!-- Name -->
                    <div class="mb-4">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Reservation Name *</label>
                        <input type="text" name="name" required
                               placeholder="e.g., Job A - March Deployment"
                               class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-blue-500">
                    </div>

                    <!-- Project (optional) -->
                    <div class="mb-4">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Link to Project (optional)</label>
                        <select name="project_id"
                                class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-blue-500">
                            <option value="">-- No project --</option>
                            {% for project in projects %}
                            <option value="{{ project.id }}">{{ project.name }}</option>
                            {% endfor %}
                        </select>
                    </div>

                    <!-- Date Range -->
                    <div class="mb-4">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Start Date *</label>
                        <input type="date" name="start_date" required
                               class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-blue-500">
                    </div>

                    <div class="mb-4">
                        <div class="flex items-center justify-between mb-1">
                            <label class="block text-sm font-medium text-gray-700 dark:text-gray-300">End Date</label>
                            <label class="flex items-center gap-2 cursor-pointer text-sm">
                                <input type="checkbox" name="end_date_tbd" id="end_date_tbd"
                                       onchange="toggleEndDateTBD()"
                                       class="w-4 h-4 text-blue-600 focus:ring-blue-500 rounded border-gray-300 dark:border-gray-600">
                                <span class="text-gray-600 dark:text-gray-400">TBD / Ongoing</span>
                            </label>
                        </div>
                        <input type="date" name="end_date" id="end_date_input"
                               class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-blue-500">
                    </div>

                    <!-- Estimated End Date (shown when TBD is checked) -->
                    <div id="estimated-end-section" class="mb-4 hidden">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                            Estimated End Date <span class="text-gray-400 font-normal">(for planning)</span>
                        </label>
                        <input type="date" name="estimated_end_date"
                               class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-blue-500">
                        <p class="text-xs text-gray-500 mt-1">Used for calendar visualization only. Can be updated later.</p>
                    </div>

                    <!-- Assignment Type -->
                    <div class="mb-4">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">Assignment Type *</label>
                        <div class="flex gap-4">
                            <label class="flex items-center gap-2 cursor-pointer">
                                <input type="radio" name="assignment_type" value="quantity" checked
                                       onchange="toggleAssignmentType(this.value)"
                                       class="text-blue-600 focus:ring-blue-500">
                                <span class="text-gray-700 dark:text-gray-300">Reserve quantity</span>
                            </label>
                            <label class="flex items-center gap-2 cursor-pointer">
                                <input type="radio" name="assignment_type" value="specific"
                                       onchange="toggleAssignmentType(this.value)"
                                       class="text-blue-600 focus:ring-blue-500">
                                <span class="text-gray-700 dark:text-gray-300">Pick specific units</span>
                            </label>
                        </div>
                    </div>

                    <!-- Quantity (shown for quantity type) -->
                    <div id="quantity-section" class="mb-4">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Units Needed</label>
                        <input type="number" name="quantity_needed" min="1" value="1"
                               onchange="updateCalendarAvailability()"
                               oninput="updateCalendarAvailability()"
                               class="w-32 px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-blue-500">
                        <p class="text-xs text-gray-500 mt-1">Calendar will highlight availability based on quantity</p>
                    </div>

                    <!-- Specific Units (shown for specific type) -->
                    <div id="specific-section" class="mb-4 hidden">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Select Units</label>
                        <div id="available-units-list" class="max-h-48 overflow-y-auto border border-gray-300 dark:border-gray-600 rounded-lg p-2">
                            <p class="text-gray-500 text-sm">Select dates first to see available units</p>
                        </div>
                    </div>

                    <!-- Color -->
                    <div class="mb-4">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Color</label>
                        <div class="flex gap-2">
                            {% for color in ['#3B82F6', '#10B981', '#F59E0B', '#EF4444', '#8B5CF6', '#EC4899'] %}
                            <label class="cursor-pointer">
                                <input type="radio" name="color" value="{{ color }}" {% if loop.first %}checked{% endif %} class="sr-only peer">
                                <span class="block w-8 h-8 rounded-full peer-checked:ring-2 peer-checked:ring-offset-2 peer-checked:ring-gray-900 dark:peer-checked:ring-white"
                                      style="background-color: {{ color }}"></span>
                            </label>
                            {% endfor %}
                        </div>
                    </div>

                    <!-- Notes -->
                    <div class="mb-6">
                        <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Notes</label>
                        <textarea name="notes" rows="2"
                                  placeholder="Optional notes about this reservation"
                                  class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-blue-500"></textarea>
                    </div>

                    <!-- Actions -->
                    <div class="flex justify-end gap-3">
                        <button type="button" onclick="closeReservationModal()"
                                class="px-4 py-2 text-gray-700 dark:text-gray-300 hover:bg-gray-100 dark:hover:bg-gray-700 rounded-lg">
                            Cancel
                        </button>
                        <button type="submit"
                                class="px-4 py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700">
                            Create Reservation
                        </button>
                    </div>
                </form>
            </div>
        </div>
    </div>
</div>

<script>
const deviceType = '{{ device_type }}';
const startYear = {{ start_year }};
const startMonth = {{ start_month }};

// Load today's stats
document.addEventListener('DOMContentLoaded', function() {
    loadTodayStats();
});

function loadTodayStats() {
    try {
        // Today's numbers come from the server-rendered calendar data,
        // so no extra API call is needed here.
        const todayData = findTodayInCalendar();
        if (todayData) {
            document.getElementById('available-today').textContent = todayData.available;
            document.getElementById('reserved-today').textContent = todayData.reserved;
            document.getElementById('expiring-today').textContent = todayData.expiring_soon;
            document.getElementById('expired-today').textContent = todayData.expired;
        }
    } catch (error) {
        console.error('Error loading today stats:', error);
    }
}

function findTodayInCalendar() {
    const today = new Date();
    const year = today.getFullYear();
    const month = today.getMonth() + 1;
    const day = today.getDate();
    const calendarData = {{ calendar_data | tojson }};
    // months is now an array, find the matching month
    const monthData = calendarData.months.find(m => m.year === year && m.month === month);
    if (monthData && monthData.days[day]) {
        return monthData.days[day];
    }
    return null;
}

function openDayPanel(dateStr) {
    const panel = document.getElementById('day-panel');
    const backdrop = document.getElementById('panel-backdrop');

    // Load content via HTMX
    htmx.ajax('GET', `/api/fleet-calendar/day/${dateStr}?device_type=${deviceType}`, {
        target: '#day-panel-content',
        swap: 'innerHTML'
    });

    panel.classList.add('open');
    backdrop.classList.add('open');
}

function closeDayPanel() {
    const panel = document.getElementById('day-panel');
    const backdrop = document.getElementById('panel-backdrop');
    panel.classList.remove('open');
    backdrop.classList.remove('open');
}

let reservationModeActive = false;

function openReservationModal() {
    document.getElementById('reservation-modal').classList.remove('hidden');
    reservationModeActive = true;
    // Show reservation legend, hide main legend
    document.getElementById('main-legend').classList.add('md:hidden');
    document.getElementById('main-legend').classList.remove('md:flex');
    document.getElementById('reservation-legend').classList.remove('md:hidden');
    document.getElementById('reservation-legend').classList.add('md:flex');
    // Trigger availability update
    updateCalendarAvailability();
}

function closeReservationModal() {
    document.getElementById('reservation-modal').classList.add('hidden');
    document.getElementById('reservation-form').reset();
    reservationModeActive = false;
    // Restore main legend
    document.getElementById('main-legend').classList.remove('md:hidden');
    document.getElementById('main-legend').classList.add('md:flex');
    document.getElementById('reservation-legend').classList.add('md:hidden');
    document.getElementById('reservation-legend').classList.remove('md:flex');
    // Reset calendar colors
    resetCalendarColors();
}

function resetCalendarColors() {
    document.querySelectorAll('.day-cell:not(.empty)').forEach(cell => {
        cell.classList.remove('reservation-available', 'reservation-partial', 'reservation-unavailable');
        cell.classList.add('neutral');
    });
}

function updateCalendarAvailability() {
    if (!reservationModeActive) return;

    const quantityInput = document.querySelector('input[name="quantity_needed"]');
    const quantity = parseInt(quantityInput?.value) || 1;

    // Update each day cell based on available units vs quantity needed
    document.querySelectorAll('.day-cell:not(.empty)').forEach(cell => {
        const available = parseInt(cell.dataset.available) || 0;

        // Remove all status classes
        cell.classList.remove('neutral', 'reservation-available', 'reservation-partial', 'reservation-unavailable');

        if (available >= quantity) {
            cell.classList.add('reservation-available');
        } else if (available > 0) {
            cell.classList.add('reservation-partial');
        } else {
            cell.classList.add('reservation-unavailable');
        }
    });
}

function toggleEndDateTBD() {
    const checkbox = document.getElementById('end_date_tbd');
    const endDateInput = document.getElementById('end_date_input');
    const estimatedSection = document.getElementById('estimated-end-section');

    if (checkbox.checked) {
        endDateInput.disabled = true;
        endDateInput.value = '';
        endDateInput.classList.add('opacity-50', 'cursor-not-allowed');
        estimatedSection.classList.remove('hidden');
    } else {
        endDateInput.disabled = false;
        endDateInput.classList.remove('opacity-50', 'cursor-not-allowed');
        estimatedSection.classList.add('hidden');
    }
}

function toggleAssignmentType(type) {
    const quantitySection = document.getElementById('quantity-section');
    const specificSection = document.getElementById('specific-section');

    if (type === 'quantity') {
        quantitySection.classList.remove('hidden');
        specificSection.classList.add('hidden');
    } else {
        quantitySection.classList.add('hidden');
        specificSection.classList.remove('hidden');
        // Load available units based on selected dates
        loadAvailableUnits();
    }
}

async function loadAvailableUnits() {
    const startDate = document.querySelector('input[name="start_date"]').value;
    const endDate = document.querySelector('input[name="end_date"]').value;

    if (!startDate || !endDate) {
        document.getElementById('available-units-list').innerHTML =
            '<p class="text-gray-500 text-sm">Select dates first to see available units</p>';
        return;
    }

    try {
        const response = await fetch(
            `/api/fleet-calendar/availability?start_date=${startDate}&end_date=${endDate}&device_type=${deviceType}`
        );
        const data = await response.json();

        if (data.available_units.length === 0) {
            document.getElementById('available-units-list').innerHTML =
                '<p class="text-gray-500 text-sm">No units available for this period</p>';
            return;
        }

        let html = '';
        for (const unit of data.available_units) {
            html += `
                <label class="flex items-center gap-2 p-2 hover:bg-gray-50 dark:hover:bg-gray-700 rounded cursor-pointer">
                    <input type="checkbox" name="unit_ids" value="${unit.id}"
                           class="text-blue-600 focus:ring-blue-500 rounded">
                    <span class="text-gray-900 dark:text-white font-medium">${unit.id}</span>
                    <span class="text-gray-500 text-sm">Cal: ${unit.last_calibrated || 'N/A'}</span>
                    ${unit.calibration_status === 'expiring_soon' ?
                        '<span class="text-yellow-600 text-xs">Expiring soon</span>' : ''}
                </label>
            `;
        }
        document.getElementById('available-units-list').innerHTML = html;
    } catch (error) {
        console.error('Error loading available units:', error);
    }
}

// Watch for date changes to reload available units
document.addEventListener('DOMContentLoaded', function() {
    const startInput = document.querySelector('input[name="start_date"]');
    const endInput = document.querySelector('input[name="end_date"]');

    if (startInput && endInput) {
        startInput.addEventListener('change', function() {
            if (document.querySelector('input[name="assignment_type"]:checked').value === 'specific') {
                loadAvailableUnits();
            }
        });
        endInput.addEventListener('change', function() {
            if (document.querySelector('input[name="assignment_type"]:checked').value === 'specific') {
                loadAvailableUnits();
            }
        });
    }
});

async function submitReservation(event) {
    event.preventDefault();

    const form = event.target;
    const formData = new FormData(form);
    const endDateTbd = formData.get('end_date_tbd') === 'on';

    const data = {
        name: formData.get('name'),
        project_id: formData.get('project_id') || null,
        start_date: formData.get('start_date'),
        end_date: endDateTbd ? null : formData.get('end_date'),
        end_date_tbd: endDateTbd,
        estimated_end_date: endDateTbd ? (formData.get('estimated_end_date') || null) : null,
        assignment_type: formData.get('assignment_type'),
        device_type: deviceType,
        color: formData.get('color'),
        notes: formData.get('notes') || null
    };

    // Validate: need either end_date or TBD checked
    if (!data.end_date && !data.end_date_tbd) {
        alert('Please enter an end date or check "TBD / Ongoing"');
        return;
    }

    if (data.assignment_type === 'quantity') {
        data.quantity_needed = parseInt(formData.get('quantity_needed'));
    } else {
        data.unit_ids = formData.getAll('unit_ids');
    }

    try {
        const response = await fetch('/api/fleet-calendar/reservations', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(data)
        });

        const result = await response.json();

        if (result.success) {
            closeReservationModal();
            // Reload the page to refresh calendar
            window.location.reload();
        } else {
            alert('Error creating reservation: ' + (result.detail || 'Unknown error'));
        }
    } catch (error) {
        console.error('Error:', error);
        alert('Error creating reservation');
    }
}

// Close panels on escape key
document.addEventListener('keydown', function(e) {
    if (e.key === 'Escape') {
        closeDayPanel();
        closeReservationModal();
    }
});
</script>
{% endblock %}
templates/modems.html (new file, 102 lines)
@@ -0,0 +1,102 @@
{% extends "base.html" %}

{% block title %}Field Modems - Terra-View{% endblock %}

{% block content %}
<div class="mb-8">
    <h1 class="text-3xl font-bold text-gray-900 dark:text-white flex items-center">
        <svg class="w-8 h-8 mr-3 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
        </svg>
        Field Modems
    </h1>
    <p class="text-gray-600 dark:text-gray-400 mt-1">Manage network connectivity devices for field equipment</p>
</div>

<!-- Summary Stats -->
<div class="grid grid-cols-1 md:grid-cols-4 gap-6 mb-8"
     hx-get="/api/modem-dashboard/stats"
     hx-trigger="load, every 30s"
     hx-swap="innerHTML">
    <!-- Stats will be loaded here -->
    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
</div>

<!-- Modem List -->
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
    <div class="flex items-center justify-between mb-6">
        <h2 class="text-xl font-semibold text-gray-900 dark:text-white">All Modems</h2>
        <div class="flex items-center gap-4">
            <!-- Search -->
            <div class="relative">
                <input type="text"
                       id="modem-search"
                       placeholder="Search modems..."
                       class="pl-9 pr-4 py-2 text-sm border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-seismo-orange focus:border-transparent"
                       hx-get="/api/modem-dashboard/units"
                       hx-trigger="keyup changed delay:300ms"
                       hx-target="#modem-list"
                       hx-include="[name='search']"
                       name="search">
                <svg class="w-4 h-4 absolute left-3 top-1/2 transform -translate-y-1/2 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"></path>
                </svg>
            </div>
            <a href="/roster?device_type=modem" class="text-sm text-seismo-orange hover:underline">
                Add modem in roster
            </a>
        </div>
    </div>

    <div id="modem-list"
         hx-get="/api/modem-dashboard/units"
         hx-trigger="load, every 30s"
         hx-swap="innerHTML">
        <p class="text-gray-500 dark:text-gray-400">Loading modems...</p>
    </div>
</div>

<script>
// Ping a modem and show result
async function pingModem(modemId) {
    const btn = document.getElementById(`ping-btn-${modemId}`);
    const resultDiv = document.getElementById(`ping-result-${modemId}`);

    // Show loading state
    const originalText = btn.textContent;
    btn.textContent = 'Pinging...';
    btn.disabled = true;
    resultDiv.classList.remove('hidden');
    resultDiv.className = 'mt-2 text-xs text-gray-500';
    resultDiv.textContent = 'Testing connection...';

    try {
        const response = await fetch(`/api/modem-dashboard/${modemId}/ping`);
        const data = await response.json();

        if (data.status === 'success') {
            resultDiv.className = 'mt-2 text-xs text-green-600 dark:text-green-400';
            resultDiv.innerHTML = `<span class="inline-block w-2 h-2 bg-green-500 rounded-full mr-1"></span>Online (${data.response_time_ms}ms)`;
        } else {
            resultDiv.className = 'mt-2 text-xs text-red-600 dark:text-red-400';
            resultDiv.innerHTML = `<span class="inline-block w-2 h-2 bg-red-500 rounded-full mr-1"></span>${data.detail || 'Offline'}`;
        }
    } catch (error) {
        resultDiv.className = 'mt-2 text-xs text-red-600 dark:text-red-400';
        resultDiv.textContent = 'Error: ' + error.message;
    }

    // Restore button
    btn.textContent = originalText;
    btn.disabled = false;

    // Hide result after 10 seconds
    setTimeout(() => {
        resultDiv.classList.add('hidden');
    }, 10000);
}
</script>
{% endblock %}
@@ -70,7 +70,7 @@
          class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
      Settings
  </button>
- {% if assigned_unit %}
+ {% if assigned_unit and connection_mode == 'connected' %}
  <button onclick="switchTab('command')"
          data-tab="command"
          class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
@@ -80,7 +80,7 @@
  <button onclick="switchTab('sessions')"
          data-tab="sessions"
          class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
-     Recording Sessions
+     Monitoring Sessions
  </button>
  <button onclick="switchTab('data')"
          data-tab="data"
@@ -123,7 +123,7 @@
  {% endif %}
  <div>
      <div class="text-sm text-gray-600 dark:text-gray-400">Created</div>
-     <div class="text-gray-900 dark:text-white">{{ location.created_at.strftime('%Y-%m-%d %H:%M') if location.created_at else 'N/A' }}</div>
+     <div class="text-gray-900 dark:text-white">{{ location.created_at|local_datetime if location.created_at else 'N/A' }}</div>
  </div>
  </div>
  </div>
@@ -150,7 +150,7 @@
  {% if assignment %}
  <div>
      <div class="text-sm text-gray-600 dark:text-gray-400">Assigned Since</div>
-     <div class="text-gray-900 dark:text-white">{{ assignment.assigned_at.strftime('%Y-%m-%d %H:%M') if assignment.assigned_at else 'N/A' }}</div>
+     <div class="text-gray-900 dark:text-white">{{ assignment.assigned_at|local_datetime if assignment.assigned_at else 'N/A' }}</div>
  </div>
  {% if assignment.notes %}
  <div>
@@ -214,23 +214,54 @@
  <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
      <div class="flex items-center justify-between">
          <div>
+             {% if connection_mode == 'connected' %}
              <p class="text-sm text-gray-600 dark:text-gray-400">Active Session</p>
              <p class="text-lg font-semibold text-gray-900 dark:text-white mt-2">
                  {% if active_session %}
-                 <span class="text-green-600 dark:text-green-400">Recording</span>
+                 <span class="text-green-600 dark:text-green-400">Monitoring</span>
                  {% else %}
                  <span class="text-gray-500">Idle</span>
                  {% endif %}
              </p>
+             {% else %}
+             <p class="text-sm text-gray-600 dark:text-gray-400">Mode</p>
+             <p class="text-lg font-semibold mt-2">
+                 <span class="text-amber-600 dark:text-amber-400">Offline / Manual</span>
+             </p>
+             {% endif %}
          </div>
          <div class="w-12 h-12 bg-purple-100 dark:bg-purple-900/30 rounded-lg flex items-center justify-center">
+             {% if connection_mode == 'connected' %}
              <svg class="w-6 h-6 text-purple-600 dark:text-purple-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                  <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z"></path>
              </svg>
+             {% else %}
+             <svg class="w-6 h-6 text-amber-600 dark:text-amber-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                 <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-8l-4-4m0 0L8 8m4-4v12"></path>
+             </svg>
+             {% endif %}
          </div>
      </div>
  </div>
  </div>
+
+ {% if connection_mode == 'connected' and assigned_unit %}
+ <!-- Live Status Row (connected NRLs only) -->
+ <div class="mt-6">
+     <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
+         <div class="flex items-center justify-between mb-4">
+             <h3 class="text-lg font-semibold text-gray-900 dark:text-white">Live Status</h3>
+             <span class="text-xs text-gray-500 dark:text-gray-400">{{ assigned_unit.id }}</span>
+         </div>
+         <div id="nrl-live-status"
+              hx-get="/api/projects/{{ project_id }}/nrl/{{ location_id }}/live-status"
+              hx-trigger="load, every 30s"
+              hx-swap="innerHTML">
+             <div class="text-center py-4 text-gray-500 text-sm">Loading status…</div>
+         </div>
+     </div>
+ </div>
+ {% endif %}
  </div>

  <!-- Settings Tab -->
@@ -281,8 +312,8 @@
  </div>
  </div>

- <!-- Command Center Tab -->
- {% if assigned_unit %}
+ <!-- Command Center Tab (connected NRLs only) -->
+ {% if assigned_unit and connection_mode == 'connected' %}
  <div id="command-tab" class="tab-panel hidden">
      <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
          <h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-6">
@@ -302,11 +333,11 @@
  </div>
  {% endif %}

- <!-- Recording Sessions Tab -->
+ <!-- Monitoring Sessions Tab -->
  <div id="sessions-tab" class="tab-panel hidden">
      <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
          <div class="flex items-center justify-between mb-6">
-             <h2 class="text-xl font-semibold text-gray-900 dark:text-white">Recording Sessions</h2>
+             <h2 class="text-xl font-semibold text-gray-900 dark:text-white">Monitoring Sessions</h2>
              {% if assigned_unit %}
              <button onclick="openScheduleModal()"
                      class="px-4 py-2 bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors">
@@ -329,8 +360,51 @@
  <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
      <div class="flex items-center justify-between mb-6">
          <h2 class="text-xl font-semibold text-gray-900 dark:text-white">Data Files</h2>
-         <div class="text-sm text-gray-500">
-             <span class="font-medium">{{ file_count }}</span> files
+         <div class="flex items-center gap-3">
+             <span class="text-sm text-gray-500"><span class="font-medium">{{ file_count }}</span> files</span>
+             <button onclick="toggleUploadPanel()"
+                     class="px-3 py-1.5 text-sm bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors flex items-center gap-1.5">
+                 <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                     <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-8l-4-4m0 0L8 8m4-4v12"></path>
+                 </svg>
+                 Upload Data
+             </button>
          </div>
      </div>

+     <!-- Upload Panel -->
+     <div id="upload-panel" class="hidden mb-6 p-4 border-2 border-dashed border-gray-300 dark:border-gray-600 rounded-xl bg-gray-50 dark:bg-gray-800/50">
+         <p class="text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Upload SD Card Data</p>
+         <p class="text-xs text-gray-500 dark:text-gray-400 mb-3">
+             Select a ZIP file, or select all files from inside an <code class="bg-gray-200 dark:bg-gray-700 px-1 rounded">Auto_####</code> folder. File types (.rnd, .rnh) are auto-detected.
+         </p>
+         <input type="file" id="upload-input" multiple
+                accept=".zip,.rnd,.rnh,.RND,.RNH"
+                class="block w-full text-sm text-gray-500 dark:text-gray-400
+                       file:mr-3 file:py-1.5 file:px-3 file:rounded-lg file:border-0
+                       file:text-sm file:font-medium file:bg-seismo-orange file:text-white
+                       hover:file:bg-seismo-navy file:cursor-pointer" />
+         <div class="flex items-center gap-3 mt-3">
+             <button id="upload-btn" onclick="submitUpload()"
+                     class="px-4 py-1.5 text-sm bg-green-600 text-white rounded-lg hover:bg-green-700 transition-colors">
+                 Import Files
+             </button>
+             <button id="upload-cancel-btn" onclick="toggleUploadPanel()"
+                     class="px-4 py-1.5 text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white transition-colors">
+                 Cancel
+             </button>
+             <span id="upload-status" class="text-sm hidden"></span>
+         </div>
+         <!-- Progress bar (hidden until upload starts) -->
+         <div id="upload-progress-wrap" class="hidden mt-3">
+             <div class="flex justify-between text-xs text-gray-500 dark:text-gray-400 mb-1">
+                 <span id="upload-progress-label">Uploading…</span>
+             </div>
+             <div class="w-full bg-gray-200 dark:bg-gray-700 rounded-full h-2">
+                 <div id="upload-progress-bar"
+                      class="bg-green-500 h-2 rounded-full transition-all duration-300"
+                      style="width: 0%"></div>
+             </div>
+         </div>
+     </div>
+
@@ -559,5 +633,112 @@ document.getElementById('assign-modal')?.addEventListener('click', function(e) {
      closeAssignModal();
  }
  });
+
+ // ── Upload Data ─────────────────────────────────────────────────────────────
+
+ function toggleUploadPanel() {
+     const panel = document.getElementById('upload-panel');
+     const status = document.getElementById('upload-status');
+     panel.classList.toggle('hidden');
+     // Reset state when reopening
+     if (!panel.classList.contains('hidden')) {
+         status.textContent = '';
+         status.className = 'text-sm hidden';
+         document.getElementById('upload-input').value = '';
+         document.getElementById('upload-progress-wrap').classList.add('hidden');
+         document.getElementById('upload-progress-bar').style.width = '0%';
+     }
+ }
+
+ function submitUpload() {
+     const input = document.getElementById('upload-input');
+     const status = document.getElementById('upload-status');
+     const btn = document.getElementById('upload-btn');
+     const cancelBtn = document.getElementById('upload-cancel-btn');
+     const progressWrap = document.getElementById('upload-progress-wrap');
+     const progressBar = document.getElementById('upload-progress-bar');
+     const progressLabel = document.getElementById('upload-progress-label');
+
+     if (!input.files.length) {
+         alert('Please select files to upload.');
+         return;
+     }
+
+     const fileCount = input.files.length;
+     const formData = new FormData();
+     for (const file of input.files) {
+         formData.append('files', file);
+     }
+
+     // Disable controls and show progress bar
+     btn.disabled = true;
+     btn.textContent = 'Uploading\u2026';
+     btn.classList.add('opacity-60', 'cursor-not-allowed');
+     cancelBtn.disabled = true;
+     cancelBtn.classList.add('opacity-40', 'cursor-not-allowed');
+     status.className = 'text-sm hidden';
+     progressWrap.classList.remove('hidden');
+     progressBar.style.width = '0%';
+     progressLabel.textContent = `Uploading ${fileCount} file${fileCount !== 1 ? 's' : ''}\u2026`;
+
+     const xhr = new XMLHttpRequest();
+
+     xhr.upload.addEventListener('progress', (e) => {
+         if (e.lengthComputable) {
+             const pct = Math.round((e.loaded / e.total) * 100);
+             progressBar.style.width = pct + '%';
+             progressLabel.textContent = `Uploading ${fileCount} file${fileCount !== 1 ? 's' : ''}\u2026 ${pct}%`;
+         }
+     });
+
+     xhr.upload.addEventListener('load', () => {
+         progressBar.style.width = '100%';
+         progressLabel.textContent = 'Processing files on server\u2026';
+     });
+
+     xhr.addEventListener('load', () => {
+         progressWrap.classList.add('hidden');
+         btn.disabled = false;
+         btn.textContent = 'Import Files';
+         btn.classList.remove('opacity-60', 'cursor-not-allowed');
+         cancelBtn.disabled = false;
+         cancelBtn.classList.remove('opacity-40', 'cursor-not-allowed');
+
+         try {
+             const data = JSON.parse(xhr.responseText);
+             if (xhr.status >= 200 && xhr.status < 300) {
+                 const parts = [`Imported ${data.files_imported} file${data.files_imported !== 1 ? 's' : ''}`];
+                 if (data.leq_files || data.lp_files) {
+                     parts.push(`(${data.leq_files} Leq, ${data.lp_files} Lp)`);
+                 }
+                 if (data.store_name) parts.push(`\u2014 ${data.store_name}`);
+                 status.textContent = parts.join(' ');
+                 status.className = 'text-sm text-green-600 dark:text-green-400';
+                 input.value = '';
+                 htmx.trigger(document.getElementById('data-files-list'), 'load');
+             } else {
+                 status.textContent = `Error: ${data.detail || 'Upload failed'}`;
+                 status.className = 'text-sm text-red-600 dark:text-red-400';
+             }
+         } catch {
+             status.textContent = 'Error: Unexpected server response';
+             status.className = 'text-sm text-red-600 dark:text-red-400';
+         }
+     });
+
+     xhr.addEventListener('error', () => {
+         progressWrap.classList.add('hidden');
+         btn.disabled = false;
+         btn.textContent = 'Import Files';
+         btn.classList.remove('opacity-60', 'cursor-not-allowed');
+         cancelBtn.disabled = false;
+         cancelBtn.classList.remove('opacity-40', 'cursor-not-allowed');
+         status.textContent = 'Error: Network error during upload';
+         status.className = 'text-sm text-red-600 dark:text-red-400';
+     });
+
+     xhr.open('POST', `/api/projects/${projectId}/nrl/${locationId}/upload-data`);
+     xhr.send(formData);
+ }
  </script>
  {% endblock %}
templates/pair_devices.html (new file, 566 lines)
@@ -0,0 +1,566 @@
{% extends "base.html" %}

{% block title %}Pair Devices - Terra-View{% endblock %}

{% block content %}
<div class="max-w-7xl mx-auto">
    <!-- Header -->
    <div class="mb-6">
        <h1 class="text-2xl font-bold text-gray-900 dark:text-white">Pair Devices</h1>
        <p class="mt-1 text-sm text-gray-600 dark:text-gray-400">
            Select a recorder (seismograph or SLM) and a modem to create a bidirectional pairing.
        </p>
    </div>

    <!-- Selection Summary Bar -->
    <div id="selection-bar" class="mb-6 p-4 bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
        <div class="flex items-center justify-between flex-wrap gap-4">
            <div class="flex items-center gap-6">
                <div class="flex items-center gap-2">
                    <span class="text-sm text-gray-600 dark:text-gray-400">Recorder:</span>
                    <span id="selected-recorder" class="font-mono font-medium text-gray-900 dark:text-white">None selected</span>
                </div>
                <svg class="w-5 h-5 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M14 5l7 7m0 0l-7 7m7-7H3"></path>
                </svg>
                <div class="flex items-center gap-2">
                    <span class="text-sm text-gray-600 dark:text-gray-400">Modem:</span>
                    <span id="selected-modem" class="font-mono font-medium text-gray-900 dark:text-white">None selected</span>
                </div>
            </div>
            <div class="flex items-center gap-3">
                <button id="clear-selection-btn"
                        onclick="clearSelection()"
                        class="px-4 py-2 text-sm font-medium text-gray-700 dark:text-gray-300 bg-gray-100 dark:bg-gray-700 rounded-lg hover:bg-gray-200 dark:hover:bg-gray-600 disabled:opacity-50 disabled:cursor-not-allowed"
                        disabled>
                    Clear
                </button>
                <button id="pair-btn"
                        onclick="pairDevices()"
                        class="px-4 py-2 text-sm font-medium text-white bg-seismo-orange rounded-lg hover:bg-orange-600 disabled:opacity-50 disabled:cursor-not-allowed"
                        disabled>
                    Pair Devices
                </button>
            </div>
        </div>
    </div>

    <!-- Two Column Layout -->
    <div class="grid grid-cols-1 lg:grid-cols-2 gap-6">
        <!-- Left Column: Recorders (Seismographs + SLMs) -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
            <div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
                <div class="flex items-center justify-between mb-3">
                    <h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
                        <svg class="w-5 h-5 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
                        </svg>
                        Recorders
                        <span id="recorder-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ recorders|length }})</span>
                    </h2>
                </div>
                <!-- Recorder Search & Filters -->
                <div class="space-y-2">
                    <input type="text" id="recorder-search" placeholder="Search by ID..."
                           class="w-full px-3 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white text-sm focus:ring-2 focus:ring-seismo-orange focus:border-seismo-orange"
                           oninput="filterRecorders()">
                    <div class="flex items-center gap-4">
                        <label class="flex items-center gap-2 cursor-pointer">
                            <input type="checkbox" id="recorder-hide-paired" onchange="filterRecorders()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
                            <span class="text-xs text-gray-600 dark:text-gray-400">Hide paired</span>
                        </label>
                        <label class="flex items-center gap-2 cursor-pointer">
                            <input type="checkbox" id="recorder-deployed-only" onchange="filterRecorders()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
                            <span class="text-xs text-gray-600 dark:text-gray-400">Deployed only</span>
                        </label>
                    </div>
                </div>
            </div>
            <div class="max-h-[600px] overflow-y-auto">
                <div id="recorders-list" class="divide-y divide-gray-200 dark:divide-gray-700">
                    {% for unit in recorders %}
                    <div class="device-row recorder-row p-3 hover:bg-gray-50 dark:hover:bg-gray-700/50 cursor-pointer transition-colors"
                         data-id="{{ unit.id }}"
                         data-deployed="{{ unit.deployed|lower }}"
                         data-paired-with="{{ unit.deployed_with_modem_id or '' }}"
                         data-device-type="{{ unit.device_type }}"
                         onclick="selectRecorder('{{ unit.id }}')">
                        <div class="flex items-center justify-between">
                            <div class="flex items-center gap-3">
                                <div class="w-8 h-8 rounded-full flex items-center justify-center
                                            {% if unit.device_type == 'slm' %}bg-purple-100 dark:bg-purple-900/30 text-purple-600 dark:text-purple-400
                                            {% else %}bg-blue-100 dark:bg-blue-900/30 text-blue-600 dark:text-blue-400{% endif %}">
                                    {% if unit.device_type == 'slm' %}
                                    <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15.536 8.464a5 5 0 010 7.072m2.828-9.9a9 9 0 010 12.728M5.586 15H4a1 1 0 01-1-1v-4a1 1 0 011-1h1.586l4.707-4.707C10.923 3.663 12 4.109 12 5v14c0 .891-1.077 1.337-1.707.707L5.586 15z"></path>
                                    </svg>
                                    {% else %}
                                    <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
                                    </svg>
                                    {% endif %}
                                </div>
                                <div>
                                    <div class="font-mono font-medium text-gray-900 dark:text-white">{{ unit.id }}</div>
                                    <div class="text-xs text-gray-500 dark:text-gray-400">
                                        {{ unit.device_type|capitalize }}
                                        {% if not unit.deployed %}<span class="text-yellow-600 dark:text-yellow-400">(Benched)</span>{% endif %}
                                    </div>
                                </div>
                            </div>
                            <div class="flex items-center gap-2">
                                {% if unit.deployed_with_modem_id %}
                                <span class="px-2 py-1 text-xs rounded-full bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400">
                                    → {{ unit.deployed_with_modem_id }}
                                </span>
                                {% endif %}
                                <div class="w-5 h-5 rounded-full border-2 border-gray-300 dark:border-gray-600 flex items-center justify-center selection-indicator">
                                    <svg class="w-3 h-3 text-seismo-orange hidden check-icon" fill="currentColor" viewBox="0 0 20 20">
                                        <path fill-rule="evenodd" d="M16.707 5.293a1 1 0 010 1.414l-8 8a1 1 0 01-1.414 0l-4-4a1 1 0 011.414-1.414L8 12.586l7.293-7.293a1 1 0 011.414 0z" clip-rule="evenodd"></path>
                                    </svg>
                                </div>
                            </div>
                        </div>
                    </div>
                    {% else %}
                    <div class="p-8 text-center text-gray-500 dark:text-gray-400">
                        No recorders found in roster
                    </div>
                    {% endfor %}
                </div>
            </div>
        </div>

        <!-- Right Column: Modems -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
            <div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
                <div class="flex items-center justify-between mb-3">
                    <h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
                        <svg class="w-5 h-5 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
                        </svg>
                        Modems
                        <span id="modem-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ modems|length }})</span>
                    </h2>
                </div>
                <!-- Modem Search & Filters -->
                <div class="space-y-2">
                    <input type="text" id="modem-search" placeholder="Search by ID, IP, or phone..."
                           class="w-full px-3 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white text-sm focus:ring-2 focus:ring-seismo-orange focus:border-seismo-orange"
                           oninput="filterModems()">
                    <div class="flex items-center gap-4">
                        <label class="flex items-center gap-2 cursor-pointer">
                            <input type="checkbox" id="modem-hide-paired" onchange="filterModems()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
                            <span class="text-xs text-gray-600 dark:text-gray-400">Hide paired</span>
                        </label>
                        <label class="flex items-center gap-2 cursor-pointer">
                            <input type="checkbox" id="modem-deployed-only" onchange="filterModems()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
                            <span class="text-xs text-gray-600 dark:text-gray-400">Deployed only</span>
                        </label>
                    </div>
                </div>
            </div>
            <div class="max-h-[600px] overflow-y-auto">
                <div id="modems-list" class="divide-y divide-gray-200 dark:divide-gray-700">
                    {% for unit in modems %}
                    <div class="device-row modem-row p-3 hover:bg-gray-50 dark:hover:bg-gray-700/50 cursor-pointer transition-colors"
                         data-id="{{ unit.id }}"
                         data-deployed="{{ unit.deployed|lower }}"
                         data-paired-with="{{ unit.deployed_with_unit_id or '' }}"
                         data-ip="{{ unit.ip_address or '' }}"
                         data-phone="{{ unit.phone_number or '' }}"
                         onclick="selectModem('{{ unit.id }}')">
                        <div class="flex items-center justify-between">
                            <div class="flex items-center gap-3">
                                <div class="w-8 h-8 rounded-full bg-amber-100 dark:bg-amber-900/30 flex items-center justify-center text-amber-600 dark:text-amber-400">
                                    <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
                                    </svg>
                                </div>
                                <div>
                                    <div class="font-mono font-medium text-gray-900 dark:text-white">{{ unit.id }}</div>
                                    <div class="text-xs text-gray-500 dark:text-gray-400">
                                        {% if unit.ip_address %}<span class="font-mono">{{ unit.ip_address }}</span>{% endif %}
                                        {% if unit.phone_number %}{% if unit.ip_address %} · {% endif %}{{ unit.phone_number }}{% endif %}
                                        {% if not unit.ip_address and not unit.phone_number %}Modem{% endif %}
                                        {% if not unit.deployed %}<span class="text-yellow-600 dark:text-yellow-400">(Benched)</span>{% endif %}
                                    </div>
                                </div>
                            </div>
                            <div class="flex items-center gap-2">
                                {% if unit.deployed_with_unit_id %}
                                <span class="px-2 py-1 text-xs rounded-full bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400">
                                    ← {{ unit.deployed_with_unit_id }}
                                </span>
                                {% endif %}
                                <div class="w-5 h-5 rounded-full border-2 border-gray-300 dark:border-gray-600 flex items-center justify-center selection-indicator">
                                    <svg class="w-3 h-3 text-seismo-orange hidden check-icon" fill="currentColor" viewBox="0 0 20 20">
                                        <path fill-rule="evenodd" d="M16.707 5.293a1 1 0 010 1.414l-8 8a1 1 0 01-1.414 0l-4-4a1 1 0 011.414-1.414L8 12.586l7.293-7.293a1 1 0 011.414 0z" clip-rule="evenodd"></path>
                                    </svg>
                                </div>
                            </div>
                        </div>
                    </div>
                    {% else %}
                    <div class="p-8 text-center text-gray-500 dark:text-gray-400">
                        No modems found in roster
                    </div>
                    {% endfor %}
                </div>
            </div>
        </div>
    </div>

    <!-- Existing Pairings Section -->
    <div class="mt-8 bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
        <div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
            <h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
                <svg class="w-5 h-5 text-green-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1"></path>
                </svg>
                Existing Pairings
                <span id="pairing-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ pairings|length }})</span>
            </h2>
        </div>
        <div class="max-h-[400px] overflow-y-auto">
            <div id="pairings-list" class="divide-y divide-gray-200 dark:divide-gray-700">
                {% for pairing in pairings %}
                <div class="pairing-row p-3 flex items-center justify-between hover:bg-gray-50 dark:hover:bg-gray-700/50">
                    <div class="flex items-center gap-4">
                        <div class="flex items-center gap-2">
                            <span class="px-2 py-1 text-sm font-mono rounded bg-blue-100 dark:bg-blue-900/30 text-blue-700 dark:text-blue-400">
                                {{ pairing.recorder_id }}
                            </span>
                            <span class="text-xs text-gray-500 dark:text-gray-400">{{ pairing.recorder_type }}</span>
                        </div>
                        <svg class="w-5 h-5 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7h12m0 0l-4-4m4 4l-4 4m0 6H4m0 0l4 4m-4-4l4-4"></path>
                        </svg>
                        <div class="flex items-center gap-2">
                            <span class="px-2 py-1 text-sm font-mono rounded bg-amber-100 dark:bg-amber-900/30 text-amber-700 dark:text-amber-400">
                                {{ pairing.modem_id }}
                            </span>
                            {% if pairing.modem_ip %}
                            <span class="text-xs font-mono text-gray-500 dark:text-gray-400">{{ pairing.modem_ip }}</span>
                            {% endif %}
                        </div>
                    </div>
                    <button onclick="unpairDevices('{{ pairing.recorder_id }}', '{{ pairing.modem_id }}')"
                            class="p-2 text-red-600 dark:text-red-400 hover:bg-red-100 dark:hover:bg-red-900/30 rounded-lg transition-colors"
                            title="Unpair devices">
                        <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path>
                        </svg>
                    </button>
                </div>
                {% else %}
                <div class="p-8 text-center text-gray-500 dark:text-gray-400">
                    No pairings found. Select a recorder and modem above to create one.
                </div>
                {% endfor %}
            </div>
        </div>
    </div>
</div>

<!-- Toast notification -->
<div id="toast" class="fixed bottom-4 right-4 px-4 py-3 rounded-lg shadow-lg transform translate-y-full opacity-0 transition-all duration-300 z-50"></div>

<script>
let selectedRecorder = null;
let selectedModem = null;

function selectRecorder(id) {
    // Deselect previous
    document.querySelectorAll('.recorder-row').forEach(row => {
        row.classList.remove('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
|
||||
row.querySelector('.selection-indicator').classList.remove('border-seismo-orange', 'bg-seismo-orange');
|
||||
row.querySelector('.selection-indicator').classList.add('border-gray-300', 'dark:border-gray-600');
|
||||
row.querySelector('.check-icon').classList.add('hidden');
|
||||
});
|
||||
|
||||
// Toggle selection
|
||||
if (selectedRecorder === id) {
|
||||
selectedRecorder = null;
|
||||
document.getElementById('selected-recorder').textContent = 'None selected';
|
||||
} else {
|
||||
selectedRecorder = id;
|
||||
document.getElementById('selected-recorder').textContent = id;
|
||||
|
||||
// Highlight selected
|
||||
const row = document.querySelector(`.recorder-row[data-id="${id}"]`);
|
||||
if (row) {
|
||||
row.classList.add('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
|
||||
row.querySelector('.selection-indicator').classList.remove('border-gray-300', 'dark:border-gray-600');
|
||||
row.querySelector('.selection-indicator').classList.add('border-seismo-orange', 'bg-seismo-orange');
|
||||
row.querySelector('.check-icon').classList.remove('hidden');
|
||||
}
|
||||
}
|
||||
|
||||
updateButtons();
|
||||
}
|
||||
|
||||
function selectModem(id) {
|
||||
// Deselect previous
|
||||
document.querySelectorAll('.modem-row').forEach(row => {
|
||||
row.classList.remove('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
|
||||
row.querySelector('.selection-indicator').classList.remove('border-seismo-orange', 'bg-seismo-orange');
|
||||
row.querySelector('.selection-indicator').classList.add('border-gray-300', 'dark:border-gray-600');
|
||||
row.querySelector('.check-icon').classList.add('hidden');
|
||||
});
|
||||
|
||||
// Toggle selection
|
||||
if (selectedModem === id) {
|
||||
selectedModem = null;
|
||||
document.getElementById('selected-modem').textContent = 'None selected';
|
||||
} else {
|
||||
selectedModem = id;
|
||||
document.getElementById('selected-modem').textContent = id;
|
||||
|
||||
// Highlight selected
|
||||
const row = document.querySelector(`.modem-row[data-id="${id}"]`);
|
||||
if (row) {
|
||||
row.classList.add('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
|
||||
row.querySelector('.selection-indicator').classList.remove('border-gray-300', 'dark:border-gray-600');
|
||||
row.querySelector('.selection-indicator').classList.add('border-seismo-orange', 'bg-seismo-orange');
|
||||
row.querySelector('.check-icon').classList.remove('hidden');
|
||||
}
|
||||
}
|
||||
|
||||
updateButtons();
|
||||
}
|
||||
|
||||
function updateButtons() {
|
||||
const pairBtn = document.getElementById('pair-btn');
|
||||
const clearBtn = document.getElementById('clear-selection-btn');
|
||||
|
||||
pairBtn.disabled = !(selectedRecorder && selectedModem);
|
||||
clearBtn.disabled = !(selectedRecorder || selectedModem);
|
||||
}
|
||||
|
||||
function clearSelection() {
|
||||
if (selectedRecorder) selectRecorder(selectedRecorder);
|
||||
if (selectedModem) selectModem(selectedModem);
|
||||
}
|
||||
|
||||
function filterRecorders() {
|
||||
const searchTerm = document.getElementById('recorder-search').value.toLowerCase().trim();
|
||||
const hidePaired = document.getElementById('recorder-hide-paired').checked;
|
||||
const deployedOnly = document.getElementById('recorder-deployed-only').checked;
|
||||
|
||||
let visibleRecorders = 0;
|
||||
|
||||
document.querySelectorAll('.recorder-row').forEach(row => {
|
||||
const id = row.dataset.id.toLowerCase();
|
||||
const pairedWith = row.dataset.pairedWith;
|
||||
const deployed = row.dataset.deployed === 'true';
|
||||
|
||||
let show = true;
|
||||
if (searchTerm && !id.includes(searchTerm)) show = false;
|
||||
if (hidePaired && pairedWith) show = false;
|
||||
if (deployedOnly && !deployed) show = false;
|
||||
|
||||
row.style.display = show ? '' : 'none';
|
||||
if (show) visibleRecorders++;
|
||||
});
|
||||
|
||||
document.getElementById('recorder-count').textContent = `(${visibleRecorders})`;
|
||||
}
|
||||
|
||||
function filterModems() {
|
||||
const searchTerm = document.getElementById('modem-search').value.toLowerCase().trim();
|
||||
const hidePaired = document.getElementById('modem-hide-paired').checked;
|
||||
const deployedOnly = document.getElementById('modem-deployed-only').checked;
|
||||
|
||||
let visibleModems = 0;
|
||||
|
||||
document.querySelectorAll('.modem-row').forEach(row => {
|
||||
const id = row.dataset.id.toLowerCase();
|
||||
const ip = (row.dataset.ip || '').toLowerCase();
|
||||
const phone = (row.dataset.phone || '').toLowerCase();
|
||||
const pairedWith = row.dataset.pairedWith;
|
||||
const deployed = row.dataset.deployed === 'true';
|
||||
|
||||
let show = true;
|
||||
if (searchTerm && !id.includes(searchTerm) && !ip.includes(searchTerm) && !phone.includes(searchTerm)) show = false;
|
||||
if (hidePaired && pairedWith) show = false;
|
||||
if (deployedOnly && !deployed) show = false;
|
||||
|
||||
row.style.display = show ? '' : 'none';
|
||||
if (show) visibleModems++;
|
||||
});
|
||||
|
||||
document.getElementById('modem-count').textContent = `(${visibleModems})`;
|
||||
}
|
||||
|
||||
function saveScrollPositions() {
|
||||
const recordersList = document.getElementById('recorders-list').parentElement;
|
||||
const modemsList = document.getElementById('modems-list').parentElement;
|
||||
const pairingsList = document.getElementById('pairings-list').parentElement;
|
||||
|
||||
sessionStorage.setItem('pairDevices_recorderScroll', recordersList.scrollTop);
|
||||
sessionStorage.setItem('pairDevices_modemScroll', modemsList.scrollTop);
|
||||
sessionStorage.setItem('pairDevices_pairingScroll', pairingsList.scrollTop);
|
||||
|
||||
// Save recorder filter state
|
||||
sessionStorage.setItem('pairDevices_recorderSearch', document.getElementById('recorder-search').value);
|
||||
sessionStorage.setItem('pairDevices_recorderHidePaired', document.getElementById('recorder-hide-paired').checked);
|
||||
sessionStorage.setItem('pairDevices_recorderDeployedOnly', document.getElementById('recorder-deployed-only').checked);
|
||||
|
||||
// Save modem filter state
|
||||
sessionStorage.setItem('pairDevices_modemSearch', document.getElementById('modem-search').value);
|
||||
sessionStorage.setItem('pairDevices_modemHidePaired', document.getElementById('modem-hide-paired').checked);
|
||||
sessionStorage.setItem('pairDevices_modemDeployedOnly', document.getElementById('modem-deployed-only').checked);
|
||||
}
|
||||
|
||||
function restoreScrollPositions() {
|
||||
const recorderScroll = sessionStorage.getItem('pairDevices_recorderScroll');
|
||||
const modemScroll = sessionStorage.getItem('pairDevices_modemScroll');
|
||||
const pairingScroll = sessionStorage.getItem('pairDevices_pairingScroll');
|
||||
|
||||
if (recorderScroll) {
|
||||
document.getElementById('recorders-list').parentElement.scrollTop = parseInt(recorderScroll);
|
||||
}
|
||||
if (modemScroll) {
|
||||
document.getElementById('modems-list').parentElement.scrollTop = parseInt(modemScroll);
|
||||
}
|
||||
if (pairingScroll) {
|
||||
document.getElementById('pairings-list').parentElement.scrollTop = parseInt(pairingScroll);
|
||||
}
|
||||
|
||||
// Restore recorder filter state
|
||||
const recorderSearch = sessionStorage.getItem('pairDevices_recorderSearch');
|
||||
const recorderHidePaired = sessionStorage.getItem('pairDevices_recorderHidePaired');
|
||||
const recorderDeployedOnly = sessionStorage.getItem('pairDevices_recorderDeployedOnly');
|
||||
|
||||
if (recorderSearch) document.getElementById('recorder-search').value = recorderSearch;
|
||||
if (recorderHidePaired === 'true') document.getElementById('recorder-hide-paired').checked = true;
|
||||
if (recorderDeployedOnly === 'true') document.getElementById('recorder-deployed-only').checked = true;
|
||||
|
||||
// Restore modem filter state
|
||||
const modemSearch = sessionStorage.getItem('pairDevices_modemSearch');
|
||||
const modemHidePaired = sessionStorage.getItem('pairDevices_modemHidePaired');
|
||||
const modemDeployedOnly = sessionStorage.getItem('pairDevices_modemDeployedOnly');
|
||||
|
||||
if (modemSearch) document.getElementById('modem-search').value = modemSearch;
|
||||
if (modemHidePaired === 'true') document.getElementById('modem-hide-paired').checked = true;
|
||||
if (modemDeployedOnly === 'true') document.getElementById('modem-deployed-only').checked = true;
|
||||
|
||||
// Apply filters if any were set
|
||||
if (recorderSearch || recorderHidePaired === 'true' || recorderDeployedOnly === 'true') {
|
||||
filterRecorders();
|
||||
}
|
||||
if (modemSearch || modemHidePaired === 'true' || modemDeployedOnly === 'true') {
|
||||
filterModems();
|
||||
}
|
||||
|
||||
// Clear stored values
|
||||
sessionStorage.removeItem('pairDevices_recorderScroll');
|
||||
sessionStorage.removeItem('pairDevices_modemScroll');
|
||||
sessionStorage.removeItem('pairDevices_pairingScroll');
|
||||
sessionStorage.removeItem('pairDevices_recorderSearch');
|
||||
sessionStorage.removeItem('pairDevices_recorderHidePaired');
|
||||
sessionStorage.removeItem('pairDevices_recorderDeployedOnly');
|
||||
sessionStorage.removeItem('pairDevices_modemSearch');
|
||||
sessionStorage.removeItem('pairDevices_modemHidePaired');
|
||||
sessionStorage.removeItem('pairDevices_modemDeployedOnly');
|
||||
}
|
||||
|
||||
// Restore scroll positions on page load
|
||||
document.addEventListener('DOMContentLoaded', restoreScrollPositions);
|
||||
|
||||
async function pairDevices() {
|
||||
if (!selectedRecorder || !selectedModem) return;
|
||||
|
||||
const pairBtn = document.getElementById('pair-btn');
|
||||
pairBtn.disabled = true;
|
||||
pairBtn.textContent = 'Pairing...';
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/roster/pair-devices', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
recorder_id: selectedRecorder,
|
||||
modem_id: selectedModem
|
||||
})
|
||||
});
|
||||
|
||||
const result = await response.json();
|
||||
|
||||
if (response.ok) {
|
||||
showToast(`Paired ${selectedRecorder} with ${selectedModem}`, 'success');
|
||||
// Save scroll positions before reload
|
||||
saveScrollPositions();
|
||||
setTimeout(() => window.location.reload(), 500);
|
||||
} else {
|
||||
showToast(result.detail || 'Failed to pair devices', 'error');
|
||||
}
|
||||
} catch (error) {
|
||||
showToast('Error pairing devices: ' + error.message, 'error');
|
||||
} finally {
|
||||
pairBtn.disabled = false;
|
||||
pairBtn.textContent = 'Pair Devices';
|
||||
}
|
||||
}
|
||||
|
||||
async function unpairDevices(recorderId, modemId) {
|
||||
if (!confirm(`Unpair ${recorderId} from ${modemId}?`)) return;
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/roster/unpair-devices', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
recorder_id: recorderId,
|
||||
modem_id: modemId
|
||||
})
|
||||
});
|
||||
|
||||
const result = await response.json();
|
||||
|
||||
if (response.ok) {
|
||||
showToast(`Unpaired ${recorderId} from ${modemId}`, 'success');
|
||||
// Save scroll positions before reload
|
||||
saveScrollPositions();
|
||||
setTimeout(() => window.location.reload(), 500);
|
||||
} else {
|
||||
showToast(result.detail || 'Failed to unpair devices', 'error');
|
||||
}
|
||||
} catch (error) {
|
||||
showToast('Error unpairing devices: ' + error.message, 'error');
|
||||
}
|
||||
}
|
||||
|
||||
function showToast(message, type = 'info') {
|
||||
const toast = document.getElementById('toast');
|
||||
toast.textContent = message;
|
||||
toast.className = 'fixed bottom-4 right-4 px-4 py-3 rounded-lg shadow-lg transform transition-all duration-300 z-50';
|
||||
|
||||
if (type === 'success') {
|
||||
toast.classList.add('bg-green-500', 'text-white');
|
||||
} else if (type === 'error') {
|
||||
toast.classList.add('bg-red-500', 'text-white');
|
||||
} else {
|
||||
toast.classList.add('bg-gray-800', 'text-white');
|
||||
}
|
||||
|
||||
// Show
|
||||
toast.classList.remove('translate-y-full', 'opacity-0');
|
||||
|
||||
// Hide after 3 seconds
|
||||
setTimeout(() => {
|
||||
toast.classList.add('translate-y-full', 'opacity-0');
|
||||
}, 3000);
|
||||
}
|
||||
</script>
|
||||
|
||||
<style>
|
||||
.bg-seismo-orange\/10 {
|
||||
background-color: rgb(249 115 22 / 0.1);
|
||||
}
|
||||
.dark\:bg-seismo-orange\/20:is(.dark *) {
|
||||
background-color: rgb(249 115 22 / 0.2);
|
||||
}
|
||||
</style>
|
||||
{% endblock %}
|
||||
87 templates/partials/alerts/alert_dropdown.html (new file)
@@ -0,0 +1,87 @@
<!-- Alert Dropdown Content -->
<!-- Loaded via HTMX into the alert dropdown in the navbar -->

<div class="max-h-96 overflow-y-auto">
{% if alerts %}
{% for item in alerts %}
<div class="p-3 border-b border-gray-200 dark:border-gray-700 hover:bg-gray-50 dark:hover:bg-gray-700/50 transition-colors
{% if item.alert.severity == 'critical' %}bg-red-50 dark:bg-red-900/20{% endif %}">
<div class="flex items-start gap-3">
<!-- Severity icon -->
{% if item.alert.severity == 'critical' %}
<span class="text-red-500 flex-shrink-0 mt-0.5">
<svg class="w-5 h-5" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
</svg>
</span>
{% elif item.alert.severity == 'warning' %}
<span class="text-yellow-500 flex-shrink-0 mt-0.5">
<svg class="w-5 h-5" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
</svg>
</span>
{% else %}
<span class="text-blue-500 flex-shrink-0 mt-0.5">
<svg class="w-5 h-5" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M18 10a8 8 0 11-16 0 8 8 0 0116 0zm-7-4a1 1 0 11-2 0 1 1 0 012 0zM9 9a1 1 0 000 2v3a1 1 0 001 1h1a1 1 0 100-2v-3a1 1 0 00-1-1H9z" clip-rule="evenodd"/>
</svg>
</span>
{% endif %}

<div class="flex-1 min-w-0">
<p class="text-sm font-medium text-gray-900 dark:text-white truncate">
{{ item.alert.title }}
</p>
{% if item.alert.message %}
<p class="text-xs text-gray-500 dark:text-gray-400 line-clamp-2 mt-0.5">
{{ item.alert.message }}
</p>
{% endif %}
<p class="text-xs text-gray-400 dark:text-gray-500 mt-1">
{{ item.time_ago }}
</p>
</div>

<!-- Actions -->
<div class="flex items-center gap-1 flex-shrink-0">
<button hx-post="/api/alerts/{{ item.alert.id }}/acknowledge"
hx-swap="none"
hx-on::after-request="htmx.trigger('#alert-dropdown-content', 'refresh')"
class="p-1.5 text-gray-400 hover:text-green-600 dark:hover:text-green-400 rounded transition-colors"
title="Acknowledge">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M5 13l4 4L19 7"/>
</svg>
</button>
<button hx-post="/api/alerts/{{ item.alert.id }}/dismiss"
hx-swap="none"
hx-on::after-request="htmx.trigger('#alert-dropdown-content', 'refresh')"
class="p-1.5 text-gray-400 hover:text-red-600 dark:hover:text-red-400 rounded transition-colors"
title="Dismiss">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
</svg>
</button>
</div>
</div>
</div>
{% endfor %}
{% else %}
<div class="p-8 text-center">
<svg class="w-12 h-12 mx-auto mb-3 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z"/>
</svg>
<p class="text-gray-500 dark:text-gray-400 text-sm">No active alerts</p>
<p class="text-gray-400 dark:text-gray-500 text-xs mt-1">All systems operational</p>
</div>
{% endif %}
</div>

<!-- View all link -->
{% if total_count > 0 %}
<div class="p-3 border-t border-gray-200 dark:border-gray-700 text-center bg-gray-50 dark:bg-gray-800/50">
<a href="/alerts" class="text-sm text-seismo-orange hover:text-seismo-navy dark:hover:text-orange-300 font-medium">
View all {{ total_count }} alert{{ 's' if total_count != 1 else '' }}
</a>
</div>
{% endif %}
125 templates/partials/alerts/alert_list.html (new file)
@@ -0,0 +1,125 @@
<!-- Alert List Partial -->
<!-- Full list of alerts for the alerts page -->

<div class="space-y-3">
{% if alerts %}
{% for item in alerts %}
<div class="bg-white dark:bg-gray-800 rounded-lg border border-gray-200 dark:border-gray-700 p-4
{% if item.alert.severity == 'critical' and item.alert.status == 'active' %}border-l-4 border-l-red-500{% endif %}
{% if item.alert.severity == 'warning' and item.alert.status == 'active' %}border-l-4 border-l-yellow-500{% endif %}
{% if item.alert.status != 'active' %}opacity-60{% endif %}">
<div class="flex items-start gap-4">
<!-- Severity icon -->
<div class="flex-shrink-0">
{% if item.alert.severity == 'critical' %}
<div class="w-10 h-10 rounded-full bg-red-100 dark:bg-red-900/30 flex items-center justify-center">
<svg class="w-5 h-5 text-red-600 dark:text-red-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
</svg>
</div>
{% elif item.alert.severity == 'warning' %}
<div class="w-10 h-10 rounded-full bg-yellow-100 dark:bg-yellow-900/30 flex items-center justify-center">
<svg class="w-5 h-5 text-yellow-600 dark:text-yellow-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
</svg>
</div>
{% else %}
<div class="w-10 h-10 rounded-full bg-blue-100 dark:bg-blue-900/30 flex items-center justify-center">
<svg class="w-5 h-5 text-blue-600 dark:text-blue-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M18 10a8 8 0 11-16 0 8 8 0 0116 0zm-7-4a1 1 0 11-2 0 1 1 0 012 0zM9 9a1 1 0 000 2v3a1 1 0 001 1h1a1 1 0 100-2v-3a1 1 0 00-1-1H9z" clip-rule="evenodd"/>
</svg>
</div>
{% endif %}
</div>

<!-- Content -->
<div class="flex-1 min-w-0">
<div class="flex items-center gap-2 mb-1">
<h3 class="text-base font-semibold text-gray-900 dark:text-white">
{{ item.alert.title }}
</h3>
<!-- Status badge -->
{% if item.alert.status == 'active' %}
<span class="px-2 py-0.5 text-xs font-medium rounded-full bg-red-100 text-red-700 dark:bg-red-900/30 dark:text-red-300">
Active
</span>
{% elif item.alert.status == 'acknowledged' %}
<span class="px-2 py-0.5 text-xs font-medium rounded-full bg-yellow-100 text-yellow-700 dark:bg-yellow-900/30 dark:text-yellow-300">
Acknowledged
</span>
{% elif item.alert.status == 'resolved' %}
<span class="px-2 py-0.5 text-xs font-medium rounded-full bg-green-100 text-green-700 dark:bg-green-900/30 dark:text-green-300">
Resolved
</span>
{% elif item.alert.status == 'dismissed' %}
<span class="px-2 py-0.5 text-xs font-medium rounded-full bg-gray-100 text-gray-600 dark:bg-gray-700 dark:text-gray-400">
Dismissed
</span>
{% endif %}
</div>

{% if item.alert.message %}
<p class="text-sm text-gray-600 dark:text-gray-300 mb-2">
{{ item.alert.message }}
</p>
{% endif %}

<div class="flex items-center gap-4 text-xs text-gray-500 dark:text-gray-400">
<span>{{ item.time_ago }}</span>
{% if item.alert.unit_id %}
<span class="flex items-center gap-1">
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 3v2m6-2v2M9 19v2m6-2v2M5 9H3m2 6H3m18-6h-2m2 6h-2M7 19h10a2 2 0 002-2V7a2 2 0 00-2-2H7a2 2 0 00-2 2v10a2 2 0 002 2zM9 9h6v6H9V9z"/>
</svg>
{{ item.alert.unit_id }}
</span>
{% endif %}
<span class="capitalize">{{ item.alert.alert_type | replace('_', ' ') }}</span>
</div>
</div>

<!-- Actions -->
{% if item.alert.status == 'active' %}
<div class="flex items-center gap-2 flex-shrink-0">
<button hx-post="/api/alerts/{{ item.alert.id }}/acknowledge"
hx-swap="none"
hx-on::after-request="htmx.trigger('#alert-list', 'refresh')"
class="px-3 py-1.5 text-sm bg-gray-100 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-lg hover:bg-gray-200 dark:hover:bg-gray-600 transition-colors">
Acknowledge
</button>
<button hx-post="/api/alerts/{{ item.alert.id }}/resolve"
hx-swap="none"
hx-on::after-request="htmx.trigger('#alert-list', 'refresh')"
class="px-3 py-1.5 text-sm bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-300 rounded-lg hover:bg-green-200 dark:hover:bg-green-900/50 transition-colors">
Resolve
</button>
<button hx-post="/api/alerts/{{ item.alert.id }}/dismiss"
hx-swap="none"
hx-on::after-request="htmx.trigger('#alert-list', 'refresh')"
class="px-3 py-1.5 text-sm text-gray-500 hover:text-red-600 dark:hover:text-red-400 transition-colors"
title="Dismiss">
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
</svg>
</button>
</div>
{% endif %}
</div>
</div>
{% endfor %}
{% else %}
<div class="bg-white dark:bg-gray-800 rounded-lg border border-gray-200 dark:border-gray-700 p-12 text-center">
<svg class="w-16 h-16 mx-auto mb-4 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z"/>
</svg>
<h3 class="text-lg font-medium text-gray-900 dark:text-white mb-2">No alerts</h3>
<p class="text-gray-500 dark:text-gray-400">
{% if status_filter %}
No {{ status_filter }} alerts found.
{% else %}
All systems are operating normally.
{% endif %}
</p>
</div>
{% endif %}
</div>
131 templates/partials/dashboard/todays_actions.html (new file)
@@ -0,0 +1,131 @@
<!-- Today's Scheduled Actions - Dashboard Card Content -->

<!-- Summary stats -->
<div class="flex items-center gap-4 mb-4 text-sm">
{% if pending_count > 0 %}
<div class="flex items-center gap-1.5">
<span class="w-2 h-2 bg-yellow-400 rounded-full"></span>
<span class="text-gray-600 dark:text-gray-400">{{ pending_count }} pending</span>
</div>
{% endif %}
{% if completed_count > 0 %}
<div class="flex items-center gap-1.5">
<span class="w-2 h-2 bg-green-400 rounded-full"></span>
<span class="text-gray-600 dark:text-gray-400">{{ completed_count }} completed</span>
</div>
{% endif %}
{% if failed_count > 0 %}
<div class="flex items-center gap-1.5">
<span class="w-2 h-2 bg-red-400 rounded-full"></span>
<span class="text-gray-600 dark:text-gray-400">{{ failed_count }} failed</span>
</div>
{% endif %}
{% if total_count == 0 %}
<span class="text-gray-500 dark:text-gray-400">No actions scheduled for today</span>
{% endif %}
</div>

<!-- Actions list -->
{% if actions %}
<div class="space-y-2 max-h-64 overflow-y-auto">
{% for item in actions %}
<div class="flex items-center gap-3 p-2 rounded-lg
{% if item.action.execution_status == 'pending' %}bg-yellow-50 dark:bg-yellow-900/20
{% elif item.action.execution_status == 'completed' %}bg-green-50 dark:bg-green-900/20
{% elif item.action.execution_status == 'failed' %}bg-red-50 dark:bg-red-900/20
{% else %}bg-gray-50 dark:bg-gray-700/50{% endif %}">

<!-- Action type icon -->
<div class="flex-shrink-0">
{% if item.action.action_type == 'start' %}
<div class="w-8 h-8 rounded-full bg-green-100 dark:bg-green-900/30 flex items-center justify-center">
<svg class="w-4 h-4 text-green-600 dark:text-green-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM9.555 7.168A1 1 0 008 8v4a1 1 0 001.555.832l3-2a1 1 0 000-1.664l-3-2z" clip-rule="evenodd"/>
</svg>
</div>
{% elif item.action.action_type == 'stop' %}
<div class="w-8 h-8 rounded-full bg-red-100 dark:bg-red-900/30 flex items-center justify-center">
<svg class="w-4 h-4 text-red-600 dark:text-red-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM8 7a1 1 0 00-1 1v4a1 1 0 001 1h4a1 1 0 001-1V8a1 1 0 00-1-1H8z" clip-rule="evenodd"/>
</svg>
</div>
{% elif item.action.action_type == 'download' %}
<div class="w-8 h-8 rounded-full bg-blue-100 dark:bg-blue-900/30 flex items-center justify-center">
<svg class="w-4 h-4 text-blue-600 dark:text-blue-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M3 17a1 1 0 011-1h12a1 1 0 110 2H4a1 1 0 01-1-1zm3.293-7.707a1 1 0 011.414 0L9 10.586V3a1 1 0 112 0v7.586l1.293-1.293a1 1 0 111.414 1.414l-3 3a1 1 0 01-1.414 0l-3-3a1 1 0 010-1.414z" clip-rule="evenodd"/>
</svg>
</div>
{% endif %}
</div>

<!-- Action details -->
<div class="flex-1 min-w-0">
<div class="flex items-center gap-2">
<span class="font-medium text-sm text-gray-900 dark:text-white capitalize">{{ item.action.action_type }}</span>

<!-- Status indicator -->
{% if item.action.execution_status == 'pending' %}
<span class="text-xs text-yellow-600 dark:text-yellow-400">
{{ item.action.scheduled_time|local_datetime('%H:%M') }}
</span>
{% elif item.action.execution_status == 'completed' %}
<svg class="w-4 h-4 text-green-500" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zm3.707-9.293a1 1 0 00-1.414-1.414L9 10.586 7.707 9.293a1 1 0 00-1.414 1.414l2 2a1 1 0 001.414 0l4-4z" clip-rule="evenodd"/>
</svg>
{% elif item.action.execution_status == 'failed' %}
<svg class="w-4 h-4 text-red-500" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM8.707 7.293a1 1 0 00-1.414 1.414L8.586 10l-1.293 1.293a1 1 0 101.414 1.414L10 11.414l1.293 1.293a1 1 0 001.414-1.414L11.414 10l1.293-1.293a1 1 0 00-1.414-1.414L10 8.586 8.707 7.293z" clip-rule="evenodd"/>
</svg>
{% endif %}
</div>

<!-- Location/Project info -->
<div class="text-xs text-gray-500 dark:text-gray-400 truncate">
{% if item.location %}
<a href="/projects/{{ item.action.project_id }}/nrl/{{ item.location.id }}"
class="hover:text-seismo-orange">
{{ item.location.name }}
</a>
{% elif item.project %}
<a href="/projects/{{ item.project.id }}" class="hover:text-seismo-orange">
{{ item.project.name }}
</a>
{% endif %}
</div>

<!-- Result details for completed/failed -->
{% if item.action.execution_status == 'completed' and item.result %}
{% if item.result.cycle_response and item.result.cycle_response.downloaded_folder %}
<div class="text-xs text-green-600 dark:text-green-400">
{{ item.result.cycle_response.downloaded_folder }}
{% if item.result.cycle_response.download_success %}downloaded{% endif %}
</div>
{% endif %}
{% elif item.action.execution_status == 'failed' and item.action.error_message %}
<div class="text-xs text-red-600 dark:text-red-400 truncate" title="{{ item.action.error_message }}">
{{ item.action.error_message[:50] }}{% if item.action.error_message|length > 50 %}...{% endif %}
</div>
{% endif %}
</div>

<!-- Time -->
<div class="flex-shrink-0 text-right">
{% if item.action.execution_status == 'pending' %}
<span class="text-xs text-gray-500 dark:text-gray-400">Scheduled</span>
{% elif item.action.executed_at %}
<span class="text-xs text-gray-500 dark:text-gray-400">
{{ item.action.executed_at|local_datetime('%H:%M') }}
</span>
{% endif %}
</div>
</div>
{% endfor %}
</div>
{% else %}
<div class="text-center py-6 text-gray-500 dark:text-gray-400">
<svg class="w-10 h-10 mx-auto mb-2 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"></path>
</svg>
<p class="text-sm">No scheduled actions for today</p>
</div>
{% endif %}
@@ -60,7 +60,9 @@
|
||||
data-note="{{ unit.note if unit.note else '' }}">
|
||||
<td class="px-6 py-4 whitespace-nowrap">
|
||||
<div class="flex items-center space-x-2">
|
||||
{% if unit.status == 'OK' %}
|
||||
{% if not unit.deployed %}
|
||||
<span class="w-3 h-3 rounded-full bg-gray-400 dark:bg-gray-500" title="Benched"></span>
|
||||
{% elif unit.status == 'OK' %}
|
||||
<span class="w-3 h-3 rounded-full bg-green-500" title="OK"></span>
|
||||
{% elif unit.status == 'Pending' %}
|
||||
<span class="w-3 h-3 rounded-full bg-yellow-500" title="Pending"></span>
|
||||
@@ -85,6 +87,10 @@
|
||||
<span class="px-2 py-1 rounded-full bg-purple-100 dark:bg-purple-900/30 text-purple-800 dark:text-purple-300 text-xs font-medium">
|
||||
Modem
|
||||
</span>
|
||||
{% elif unit.device_type == 'slm' %}
|
||||
<span class="px-2 py-1 rounded-full bg-orange-100 dark:bg-orange-900/30 text-orange-800 dark:text-orange-300 text-xs font-medium">
|
||||
SLM
|
||||
</span>
|
||||
{% else %}
|
||||
<span class="px-2 py-1 rounded-full bg-blue-100 dark:bg-blue-900/30 text-blue-800 dark:text-blue-300 text-xs font-medium">
|
||||
Seismograph
|
||||
@@ -100,14 +106,19 @@
|
||||
{% if unit.phone_number %}
|
||||
<div>{{ unit.phone_number }}</div>
|
||||
{% endif %}
|
||||
{% if unit.hardware_model %}
|
||||
<div class="text-gray-500 dark:text-gray-500">{{ unit.hardware_model }}</div>
|
||||
{% if unit.deployed_with_unit_id %}
|
||||
<div>
|
||||
<span class="text-gray-500 dark:text-gray-500">Linked:</span>
|
||||
<a href="/unit/{{ unit.deployed_with_unit_id }}" class="text-seismo-orange hover:underline font-medium">
|
||||
{{ unit.deployed_with_unit_id }}
|
||||
</a>
|
||||
</div>
|
||||
{% endif %}
|
||||
{% else %}
|
||||
{% if unit.next_calibration_due %}
|
||||
{% if unit.last_calibrated %}
|
||||
<div>
|
||||
<span class="text-gray-500 dark:text-gray-500">Cal Due:</span>
|
||||
<span class="font-medium">{{ unit.next_calibration_due }}</span>
|
||||
<span class="text-gray-500 dark:text-gray-500">Last Cal:</span>
|
||||
<span class="font-medium">{{ unit.last_calibrated }}</span>
|
||||
</div>
|
||||
{% endif %}
|
||||
{% if unit.deployed_with_modem_id %}
|
||||
@@ -122,7 +133,7 @@
|
||||
</div>
|
||||
</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap">
|
||||
<div class="text-sm text-gray-500 dark:text-gray-400">{{ unit.last_seen }}</div>
|
||||
<div class="text-sm text-gray-500 dark:text-gray-400 last-seen-cell" data-iso="{{ unit.last_seen }}">{{ unit.last_seen }}</div>
|
||||
</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap">
|
||||
<div class="text-sm
|
||||
@@ -199,7 +210,9 @@
|
||||
<!-- Header: Status Dot + Unit ID + Status Badge -->
|
||||
<div class="flex items-center justify-between mb-2">
|
||||
<div class="flex items-center gap-2">
|
||||
{% if unit.status == 'OK' %}
|
||||
{% if not unit.deployed %}
|
||||
<span class="w-4 h-4 rounded-full bg-gray-400 dark:bg-gray-500" title="Benched"></span>
|
||||
{% elif unit.status == 'OK' %}
|
||||
<span class="w-4 h-4 rounded-full bg-green-500" title="OK"></span>
|
||||
{% elif unit.status == 'Pending' %}
|
||||
<span class="w-4 h-4 rounded-full bg-yellow-500" title="Pending"></span>
|
||||
@@ -226,6 +239,10 @@
|
||||
<span class="px-2 py-1 rounded-full bg-purple-100 dark:bg-purple-900/30 text-purple-800 dark:text-purple-300 text-xs font-medium">
|
||||
Modem
|
||||
</span>
|
||||
{% elif unit.device_type == 'slm' %}
|
||||
<span class="px-2 py-1 rounded-full bg-orange-100 dark:bg-orange-900/30 text-orange-800 dark:text-orange-300 text-xs font-medium">
|
||||
SLM
|
||||
</span>
|
||||
{% else %}
|
||||
<span class="px-2 py-1 rounded-full bg-blue-100 dark:bg-blue-900/30 text-blue-800 dark:text-blue-300 text-xs font-medium">
|
||||
Seismograph
|
||||
@@ -341,6 +358,39 @@
|
||||
</style>
|
||||
|
||||
<script>
|
||||
(function() {
|
||||
// User's configured timezone from settings (defaults to America/New_York)
|
||||
const userTimezone = '{{ user_timezone | default("America/New_York") }}';
|
||||
|
||||
// Format ISO timestamp to human-readable format in user's timezone
|
||||
function formatLastSeenLocal(isoString) {
|
||||
if (!isoString || isoString === 'Never' || isoString === 'N/A') {
|
||||
return isoString || 'Never';
|
||||
}
|
||||
try {
|
||||
const date = new Date(isoString);
|
||||
if (isNaN(date.getTime())) return isoString;
|
||||
|
||||
// Format in user's configured timezone
|
||||
return date.toLocaleString('en-US', {
|
||||
timeZone: userTimezone,
|
||||
month: 'short',
|
||||
day: 'numeric',
|
||||
hour: 'numeric',
|
||||
minute: '2-digit',
|
||||
hour12: true
|
||||
});
|
||||
} catch (e) {
|
||||
return isoString;
|
||||
}
|
||||
}
|
||||
|
||||
// Format all last-seen cells on page load
|
||||
document.querySelectorAll('.last-seen-cell').forEach(cell => {
|
||||
const isoDate = cell.getAttribute('data-iso');
|
||||
cell.textContent = formatLastSeenLocal(isoDate);
|
||||
});
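The conversion above leans entirely on `Date.prototype.toLocaleString` with the `timeZone` option, so no manual UTC-offset math is needed; the browser's Intl data handles DST. A minimal standalone sketch of the same approach (the zone and timestamp below are illustrative, not taken from the app's settings):

```javascript
// Format an ISO timestamp in a given IANA timezone, mirroring the
// formatLastSeenLocal helper above; unparseable input passes through.
function formatInZone(isoString, timeZone) {
  const date = new Date(isoString);
  if (isNaN(date.getTime())) return isoString;
  return date.toLocaleString('en-US', {
    timeZone,        // e.g. 'America/New_York'
    month: 'short',
    day: 'numeric',
    hour: 'numeric',
    minute: '2-digit',
    hour12: true,
  });
}

// 16:30 UTC on June 1 is 12:30 PM in New York (EDT, UTC-4)
formatInZone('2024-06-01T16:30:00Z', 'America/New_York');
```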

// Update timestamp
const timestampElement = document.getElementById('last-updated');
if (timestampElement) {
@@ -361,20 +411,23 @@
};
return acc;
}, {});
})();

// Sorting state
let currentSort = { column: null, direction: 'asc' };
// Sorting state (needs to persist across swaps)
if (typeof window.currentSort === 'undefined') {
window.currentSort = { column: null, direction: 'asc' };
}

function sortTable(column) {
const tbody = document.getElementById('roster-tbody');
const rows = Array.from(tbody.getElementsByTagName('tr'));

// Determine sort direction
if (currentSort.column === column) {
currentSort.direction = currentSort.direction === 'asc' ? 'desc' : 'asc';
if (window.currentSort.column === column) {
window.currentSort.direction = window.currentSort.direction === 'asc' ? 'desc' : 'asc';
} else {
currentSort.column = column;
currentSort.direction = 'asc';
window.currentSort.column = column;
window.currentSort.direction = 'asc';
}

// Sort rows
@@ -402,8 +455,8 @@
bVal = bVal.toLowerCase();
}

if (aVal < bVal) return currentSort.direction === 'asc' ? -1 : 1;
if (aVal > bVal) return currentSort.direction === 'asc' ? 1 : -1;
if (aVal < bVal) return window.currentSort.direction === 'asc' ? -1 : 1;
if (aVal > bVal) return window.currentSort.direction === 'asc' ? 1 : -1;
return 0;
});

@@ -439,10 +492,10 @@
});

// Set current indicator
if (currentSort.column) {
const indicator = document.querySelector(`.sort-indicator[data-column="${currentSort.column}"]`);
if (window.currentSort.column) {
const indicator = document.querySelector(`.sort-indicator[data-column="${window.currentSort.column}"]`);
if (indicator) {
indicator.className = `sort-indicator ${currentSort.direction}`;
indicator.className = `sort-indicator ${window.currentSort.direction}`;
}
}
}
40
templates/partials/fleet_calendar/available_units.html
Normal file
@@ -0,0 +1,40 @@
<!-- Available Units for Assignment -->
{% if units %}
<div class="space-y-1">
{% for unit in units %}
<label class="flex items-center gap-3 p-2 hover:bg-gray-50 dark:hover:bg-gray-700 rounded cursor-pointer">
<input type="checkbox" name="unit_ids" value="{{ unit.id }}"
class="w-4 h-4 text-blue-600 focus:ring-blue-500 rounded border-gray-300 dark:border-gray-600">
<span class="font-medium text-gray-900 dark:text-white">{{ unit.id }}</span>
<span class="text-sm text-gray-500 dark:text-gray-400 flex-1">
{% if unit.last_calibrated %}
Cal: {{ unit.last_calibrated }}
{% else %}
No cal date
{% endif %}
</span>
{% if unit.calibration_status == 'expiring_soon' %}
<span class="text-xs px-2 py-0.5 bg-yellow-100 dark:bg-yellow-900/30 text-yellow-700 dark:text-yellow-400 rounded-full">
Expiring soon
</span>
{% endif %}
{% if unit.deployed %}
<span class="text-xs px-2 py-0.5 bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400 rounded-full">
Deployed
</span>
{% else %}
<span class="text-xs px-2 py-0.5 bg-gray-100 dark:bg-gray-700 text-gray-600 dark:text-gray-400 rounded-full">
Benched
</span>
{% endif %}
</label>
{% endfor %}
</div>
{% else %}
<p class="text-gray-500 dark:text-gray-400 text-sm py-4 text-center">
No units available for this date range.
{% if start_date and end_date %}
<br><span class="text-xs">All units are either reserved, have expired calibrations, or are retired.</span>
{% endif %}
</p>
{% endif %}
186
templates/partials/fleet_calendar/day_detail.html
Normal file
@@ -0,0 +1,186 @@
<!-- Day Detail Panel Content -->
<div class="flex items-center justify-between mb-6">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">{{ date_display }}</h2>
<button onclick="closeDayPanel()" class="text-gray-400 hover:text-gray-600 dark:hover:text-gray-300">
<svg class="w-6 h-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
</svg>
</button>
</div>

<!-- Summary Stats -->
<div class="grid grid-cols-2 gap-3 mb-6">
<div class="bg-green-50 dark:bg-green-900/20 rounded-lg p-3 text-center">
<p class="text-2xl font-bold text-green-700 dark:text-green-300">{{ day_data.counts.available }}</p>
<p class="text-xs text-green-600 dark:text-green-400">Available</p>
</div>
<div class="bg-blue-50 dark:bg-blue-900/20 rounded-lg p-3 text-center">
<p class="text-2xl font-bold text-blue-700 dark:text-blue-300">{{ day_data.counts.reserved }}</p>
<p class="text-xs text-blue-600 dark:text-blue-400">Reserved</p>
</div>
<div class="bg-yellow-50 dark:bg-yellow-900/20 rounded-lg p-3 text-center">
<p class="text-2xl font-bold text-yellow-700 dark:text-yellow-300">{{ day_data.counts.expiring_soon }}</p>
<p class="text-xs text-yellow-600 dark:text-yellow-400">Expiring Soon</p>
</div>
<div class="bg-red-50 dark:bg-red-900/20 rounded-lg p-3 text-center">
<p class="text-2xl font-bold text-red-700 dark:text-red-300">{{ day_data.counts.expired }}</p>
<p class="text-xs text-red-600 dark:text-red-400">Cal. Expired</p>
</div>
</div>

<!-- Calibration Expiring TODAY - Important alert -->
{% if day_data.cal_expiring_today %}
<div class="mb-6 p-3 bg-red-50 dark:bg-red-900/30 border border-red-200 dark:border-red-800 rounded-lg">
<h3 class="text-sm font-semibold text-red-700 dark:text-red-400 mb-2 flex items-center gap-2">
<svg class="w-4 h-4" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
</svg>
Calibration Expires Today ({{ day_data.cal_expiring_today|length }})
</h3>
<div class="space-y-1">
{% for unit in day_data.cal_expiring_today %}
<div class="flex items-center justify-between p-2 bg-white dark:bg-gray-800 rounded text-sm">
<a href="/unit/{{ unit.id }}" class="font-medium text-red-600 dark:text-red-400 hover:underline">
{{ unit.id }}
</a>
<span class="text-red-500 text-xs">
Last cal: {{ unit.last_calibrated }}
</span>
</div>
{% endfor %}
</div>
</div>
{% endif %}

<!-- Reservations on this date -->
{% if day_data.reservations %}
<div class="mb-6">
<h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">Reservations</h3>
{% for res in day_data.reservations %}
<div class="reservation-bar mb-2" style="background-color: {{ res.color }}20; border-left: 4px solid {{ res.color }};">
<div class="flex-1">
<p class="font-medium text-gray-900 dark:text-white">{{ res.name }}</p>
<p class="text-xs text-gray-500 dark:text-gray-400">
{{ res.start_date }} - {{ res.end_date }}
</p>
</div>
<div class="text-right">
<p class="font-semibold text-gray-900 dark:text-white">
{% if res.assignment_type == 'quantity' %}
{{ res.assigned_count }}/{{ res.quantity_needed or '?' }}
{% else %}
{{ res.assigned_count }} units
{% endif %}
</p>
</div>
</div>
{% endfor %}
</div>
{% endif %}

<!-- Available Units -->
{% if day_data.available_units %}
<div class="mb-6">
<h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">
Available Units ({{ day_data.available_units|length }})
</h3>
<div class="max-h-48 overflow-y-auto space-y-1">
{% for unit in day_data.available_units %}
<div class="flex items-center justify-between p-2 bg-gray-50 dark:bg-gray-700/50 rounded text-sm">
<a href="/unit/{{ unit.id }}" class="font-medium text-blue-600 dark:text-blue-400 hover:underline">
{{ unit.id }}
</a>
<span class="text-gray-500 dark:text-gray-400 text-xs">
{% if unit.last_calibrated %}
Cal: {{ unit.last_calibrated }}
{% else %}
No cal date
{% endif %}
</span>
</div>
{% endfor %}
</div>
</div>
{% endif %}

<!-- Reserved Units -->
{% if day_data.reserved_units %}
<div class="mb-6">
<h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">
Reserved Units ({{ day_data.reserved_units|length }})
</h3>
<div class="max-h-48 overflow-y-auto space-y-1">
{% for unit in day_data.reserved_units %}
<div class="flex items-center justify-between p-2 bg-blue-50 dark:bg-blue-900/20 rounded text-sm">
<a href="/unit/{{ unit.id }}" class="font-medium text-blue-600 dark:text-blue-400 hover:underline">
{{ unit.id }}
</a>
<span class="text-blue-600 dark:text-blue-400 text-xs">
{{ unit.reservation_name }}
</span>
</div>
{% endfor %}
</div>
</div>
{% endif %}

<!-- Calibration Expired -->
{% if day_data.expired_units %}
<div class="mb-6">
<h3 class="text-sm font-semibold text-red-600 dark:text-red-400 mb-3">
Calibration Expired ({{ day_data.expired_units|length }})
</h3>
<div class="max-h-48 overflow-y-auto space-y-1">
{% for unit in day_data.expired_units %}
<div class="flex items-center justify-between p-2 bg-red-50 dark:bg-red-900/20 rounded text-sm">
<a href="/unit/{{ unit.id }}" class="font-medium text-red-600 dark:text-red-400 hover:underline">
{{ unit.id }}
</a>
<span class="text-red-500 text-xs">
Expired: {{ unit.expiry_date }}
</span>
</div>
{% endfor %}
</div>
</div>
{% endif %}

<!-- Needs Calibration -->
{% if day_data.needs_calibration_units %}
<div class="mb-6">
<h3 class="text-sm font-semibold text-gray-600 dark:text-gray-400 mb-3">
Needs Calibration Date ({{ day_data.needs_calibration_units|length }})
</h3>
<div class="max-h-32 overflow-y-auto space-y-1">
{% for unit in day_data.needs_calibration_units %}
<div class="flex items-center justify-between p-2 bg-gray-100 dark:bg-gray-700/50 rounded text-sm">
<a href="/unit/{{ unit.id }}" class="font-medium text-gray-600 dark:text-gray-400 hover:underline">
{{ unit.id }}
</a>
<span class="text-gray-400 text-xs">No cal date set</span>
</div>
{% endfor %}
</div>
</div>
{% endif %}

<!-- Expiring Soon (informational) -->
{% if day_data.expiring_soon_units %}
<div class="mb-6">
<h3 class="text-sm font-semibold text-yellow-600 dark:text-yellow-400 mb-3">
Calibration Expiring Soon ({{ day_data.expiring_soon_units|length }})
</h3>
<div class="max-h-32 overflow-y-auto space-y-1">
{% for unit in day_data.expiring_soon_units %}
<div class="flex items-center justify-between p-2 bg-yellow-50 dark:bg-yellow-900/20 rounded text-sm">
<a href="/unit/{{ unit.id }}" class="font-medium text-yellow-700 dark:text-yellow-400 hover:underline">
{{ unit.id }}
</a>
<span class="text-yellow-600 text-xs">
Expires: {{ unit.expiry_date }}
</span>
</div>
{% endfor %}
</div>
</div>
{% endif %}
103
templates/partials/fleet_calendar/reservations_list.html
Normal file
@@ -0,0 +1,103 @@
<!-- Reservations List -->
{% if reservations %}
<div class="space-y-3">
{% for item in reservations %}
{% set res = item.reservation %}
<div class="flex items-center justify-between p-4 rounded-lg border border-gray-200 dark:border-gray-700 hover:bg-gray-50 dark:hover:bg-gray-700/50 transition-colors"
style="border-left: 4px solid {{ res.color }};">
<div class="flex-1">
<div class="flex items-center gap-2">
<h3 class="font-semibold text-gray-900 dark:text-white">{{ res.name }}</h3>
{% if item.has_conflicts %}
<span class="px-2 py-0.5 text-xs font-medium bg-red-100 dark:bg-red-900/30 text-red-700 dark:text-red-400 rounded-full"
title="{{ item.conflict_count }} unit(s) have calibration expiring during this job">
{{ item.conflict_count }} conflict{{ 's' if item.conflict_count != 1 else '' }}
</span>
{% endif %}
</div>
<p class="text-sm text-gray-500 dark:text-gray-400 mt-1">
{{ res.start_date.strftime('%b %d, %Y') }} -
{% if res.end_date %}
{{ res.end_date.strftime('%b %d, %Y') }}
{% elif res.end_date_tbd %}
<span class="text-yellow-600 dark:text-yellow-400 font-medium">TBD</span>
{% if res.estimated_end_date %}
<span class="text-gray-400">(est. {{ res.estimated_end_date.strftime('%b %d, %Y') }})</span>
{% endif %}
{% else %}
<span class="text-yellow-600 dark:text-yellow-400">Ongoing</span>
{% endif %}
</p>
{% if res.notes %}
<p class="text-sm text-gray-400 dark:text-gray-500 mt-1">{{ res.notes }}</p>
{% endif %}
</div>
<div class="text-right ml-4">
<p class="text-lg font-bold text-gray-900 dark:text-white">
{% if res.assignment_type == 'quantity' %}
{{ item.assigned_count }}/{{ res.quantity_needed or '?' }}
{% else %}
{{ item.assigned_count }}
{% endif %}
</p>
<p class="text-xs text-gray-500 dark:text-gray-400">
{{ 'units needed' if res.assignment_type == 'quantity' else 'units assigned' }}
</p>
</div>
<div class="ml-4 flex items-center gap-2">
<button onclick="editReservation('{{ res.id }}')"
class="p-2 text-gray-400 hover:text-blue-600 dark:hover:text-blue-400 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700"
title="Edit reservation">
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z"/>
</svg>
</button>
<button onclick="deleteReservation('{{ res.id }}', '{{ res.name }}')"
class="p-2 text-gray-400 hover:text-red-600 dark:hover:text-red-400 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700"
title="Delete reservation">
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"/>
</svg>
</button>
</div>
</div>
{% endfor %}
</div>

<script>
async function deleteReservation(id, name) {
if (!confirm(`Delete reservation "${name}"?\n\nThis will remove all unit assignments.`)) {
return;
}

try {
const response = await fetch(`/api/fleet-calendar/reservations/${id}`, {
method: 'DELETE'
});

if (response.ok) {
window.location.reload();
} else {
const data = await response.json();
alert('Error: ' + (data.detail || 'Failed to delete'));
}
} catch (error) {
console.error('Error:', error);
alert('Error deleting reservation');
}
}

function editReservation(id) {
// For now, just show alert - can implement edit modal later
alert('Edit functionality coming soon. For now, delete and recreate the reservation.');
}
</script>
{% else %}
<div class="text-center py-8">
<svg class="w-12 h-12 mx-auto text-gray-400 dark:text-gray-500 mb-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"/>
</svg>
<p class="text-gray-500 dark:text-gray-400">No reservations for {{ year }}</p>
<p class="text-sm text-gray-400 dark:text-gray-500 mt-1">Click "New Reservation" to plan unit assignments</p>
</div>
{% endif %}