Compare commits: ff38b74548...cleanup/pr — 105 commits
Commits (SHA1):

49bc625c1a 95fedca8c9 e8e155556a 33e962e73d ac48fb2977
3c4b81cf78 d135727ebd 64d4423308 4f71d528ce 4f56dea4f3
57a85f565b e6555ba924 8694282dd0 bc02dc9564 0d01715f81
b3ec249c5e b6e74258f1 1a87ff13c9 22c62c0729 0f47b69c92
76667454b3 0e3f512203 15d962ba42 e4d1f0d684 b571dc29bc
e2c841d5d7 cc94493331 5a5426cceb 66eddd6fe2 c77794787c
61c84bc71d fbf7f2a65d 202fcaf91c 3a411d0a89 0c2186f5d8
c138e8c6a0 1dd396acd8 e89a04f58c e4ef065db8 86010de60c
f89f04cd6f 67a2faa2d3 14856e61ef 2b69518b33 6070d03e83
240552751c 015ce0a254 ef8c046f31 3637cf5af8 7fde14d882
bd3d937a82 291fa8e862 8e292b1aca 7516bbea70 da4e5f66c5
dae2595303 0c4e7aa5e6 229499ccf6 fdc4adeaee b3bf91880a
17b3f91dfc 6c1d0bc467 abd059983f 0f17841218 65362bab21
dc77a362ce 28942600ab 80861997af b15d434fce 70ef43de11
7b4e12c127 24473c9ca3 caabfd0c42 ebe60d2b7d 842e9d6f61
742a98a8ed 3b29c4d645 63d9c59873 794bfc00dc 89662d2fa5
eb0a99796d b47e69e609 1cb25b6c17 e515bff1a9 f296806fd1
24da5ab79f 305540f564 639b485c28 d78bafb76e 8373cff10d
4957a08198 05482bd903 5ee6f5eb28 7ce0f6115d 6492fdff82
44d7841852 38c600aca3 eeda94926f 57be9bf1f1 8431784708
c771a86675 65ea0920db 1f3fa7a718 a9c9b1fd48 4c213c96ee
@@ -1,3 +1,5 @@
+docker-compose.override.yml
+
 # Python cache / compiled
 __pycache__
 *.pyc
@@ -28,6 +30,7 @@ ENV/
 
 # Runtime data (mounted volumes)
 data/
+data-dev/
 
 # Editors / OS junk
 .vscode/
.gitignore (vendored) — 20 lines changed
@@ -1,3 +1,17 @@
+# Terra-View Specifics
+# Dev build counter (local only, never commit)
+build_number.txt
+docker-compose.override.yml
+
+# SQLite database files
+*.db
+*.db-journal
+data/
+data-dev/
+.aider*
+.aider*
+
+
 # Byte-compiled / optimized / DLL files
 __pycache__/
 *.py[codz]
@@ -206,10 +220,14 @@ marimo/_static/
 marimo/_lsp/
 __marimo__/
 
+<<<<<<< HEAD
 # Seismo Fleet Manager
 # SQLite database files
 *.db
 *.db-journal
-data/
+/data/
+/data-dev/
 .aider*
 .aider*
+=======
+>>>>>>> 0c2186f5d89d948b0357d674c0773a67a67d8027
CHANGELOG.md — 260 lines changed (this hunk is almost entirely additions; the intro line also renames "Seismo Fleet Manager" to "Terra-View")

# Changelog

All notable changes to Terra-View will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.9.2] - 2026-03-27

### Added

- **Deployment Records**: Seismographs now track a full deployment history (location, project, dates). Each deployment is logged on the unit detail page with start/end dates, and the fleet calendar service uses this history for availability calculations.
- **Allocated Unit Status**: New `allocated` status for units reserved for an upcoming job but not yet deployed. Allocated units appear in the dashboard summary, roster filters, and devices table with visual indicators.
- **Project Allocation**: Units can be linked to a project via `allocated_to_project_id`. Allocation is shown on the unit detail page and in a new quick-info modal accessible from the fleet calendar and roster.
- **Quick-Info Unit Modal**: Click any unit in the fleet calendar or roster to open a modal showing cal status, project allocation, upcoming jobs, and deployment state — without leaving the page.
- **Cal Date in Planner**: When a unit is selected for a monitoring location slot in the Job Planner, its calibration expiry date is now shown inline so you can spot near-expiry units before committing.
- **Inline Seismograph Editing**: Unit rows in the seismograph dashboard now support inline editing of cal date, notes, and deployment status without navigating to the full detail page.

### Migration Notes

Run on each database before deploying:

```bash
docker compose exec terra-view python3 backend/migrate_add_allocated.py
docker compose exec terra-view python3 backend/migrate_add_deployment_records.py
```

---

## [0.9.1] - 2026-03-20

### Fixed

- **Location slots not persisting**: Empty monitoring location slots (no unit assigned yet) were lost on save/reload. Added a `location_slots` JSON column to `job_reservations` to store the full slot list, including empty slots.
- **Modems in Recent Alerts**: Modems no longer appear in the dashboard Recent Alerts panel — alerts are for seismographs and SLMs only. Modem status is still tracked internally via paired device inheritance.
- **Series 4 heartbeat `source_id`**: Updated the heartbeat endpoint to accept the new `source_id` field from Series 4 units, with fallback to the legacy field for backwards compatibility.
### Migration Notes

Run on each database before deploying:

```bash
docker compose exec terra-view python3 backend/migrate_add_location_slots.py
```

---

## [0.9.0] - 2026-03-19

### Added

- **Job Planner**: Full redesign of the Fleet Calendar into a two-tab Job Planner / Calendar interface
  - **Planner tab**: Create and manage job reservations with name, device type, dates, color, estimated units, and monitoring locations
  - **Calendar tab**: 12-month rolling heatmap with colored job bars per day; confirmed jobs solid, planned jobs dashed
  - **Monitoring Locations**: Each job has named location slots (filled = unit assigned, empty = needs a unit); progress shown as `2/5` with colored squares that fill as units are assigned
  - **Estimated Units**: Separate planning number independent of actual location count; shown prominently on job cards
  - **Fleet Summary panel**: Unit counts as clickable filter buttons; unit list shows reservation badges with job name, dates, and color
  - **Available Units panel**: Shows units available for the job's date range when assigning
  - **Smart color picker**: 18-swatch palette + custom color wheel; new jobs auto-pick a color maximally distant in hue from existing jobs
  - **Job card progress**: `est. N · X/Y (Z more)` with filled/empty squares; amber → green when fully assigned
  - **Promote to Project**: Promote a planned job to a tracked project directly from the planner form
  - **Collapsible job details**: Name, dates, device type, color, project link, and estimated units collapse into a summary header
  - **Calendar bar tooltips**: Hover any job bar to see job name and date range
  - **Hash-based tab persistence**: `#cal` in URL restores Calendar tab on refresh; device type toggle preserves active tab
  - **Auto-scroll to today**: Switching to Calendar tab smooth-scrolls to the current month
- **Upcoming project status**: New `upcoming` status for projects promoted from reservations
- **Job device type**: Reservations carry a device type so they only appear on the correct calendar
- **Project filtering by device type**: Projects only appear on the calendar matching their type (vibration → seismograph, sound → SLM, combined → both)
- **Confirmed/Planned toggles**: Independent show/hide toggles for job bar layers on the calendar
- **Cal expire dots toggle**: Calibration expiry dots off by default, togglable
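"Maximally distant in hue" can be computed by scanning candidate hues and keeping the one with the largest minimum circular distance to the hues already in use. A sketch of that idea (not the app's actual implementation, which runs in the browser):

```python
def pick_distinct_hue(existing_hues, candidates=range(0, 360, 5)):
    """Return the candidate hue (degrees) farthest from all existing hues on the color wheel."""
    if not existing_hues:
        return 210  # arbitrary default when no jobs exist yet

    def min_dist(h):
        # Circular distance: going around the wheel the short way.
        return min(min(abs(h - e) % 360, 360 - abs(h - e) % 360) for e in existing_hues)

    return max(candidates, key=min_dist)

# With jobs at 0° (red) and 120° (green), the farthest hue is 240° (blue).
print(pick_distinct_hue([0, 120]))  # 240
```

The same greedy rule generalizes to any number of existing jobs: each new job lands in the widest remaining gap on the hue circle.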
### Changed

- **Renamed**: "Fleet Calendar" / "Reservation Planner" → **"Job Planner"** throughout UI and sidebar
- **Project status dropdown**: Inline `<select>` in project header for quick status changes
- **"All Projects" tab**: Shows everything except deleted; default view excludes archived/completed
- **Toast notifications**: All `alert()` dialogs replaced with non-blocking toasts (green = success, red = error)

### Migration Notes

Run on each database before deploying:

```bash
docker compose exec terra-view python3 -c "
import sqlite3
conn = sqlite3.connect('/app/data/seismo_fleet.db')
conn.execute('ALTER TABLE job_reservations ADD COLUMN estimated_units INTEGER')
conn.commit()
conn.close()
"
```

---
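One caveat with the inline `ALTER TABLE` above: SQLite raises "duplicate column name" if it runs twice. A defensive variant (a sketch, not the project's migration script) checks `PRAGMA table_info` first, so reruns are harmless:

```python
import sqlite3

def add_column_if_missing(conn, table, column, decl):
    """Add a column only if it is not already present, so reruns are safe."""
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column not in cols:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")
        conn.commit()

# Demo against a throwaway in-memory database with the same table name.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_reservations (id INTEGER PRIMARY KEY)")
add_column_if_missing(conn, "job_reservations", "estimated_units", "INTEGER")
add_column_if_missing(conn, "job_reservations", "estimated_units", "INTEGER")  # no-op on rerun
```

Against the real database you would pass `sqlite3.connect('/app/data/seismo_fleet.db')` instead of the in-memory connection.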
## [0.8.0] - 2026-03-18

### Added

- **Watcher Manager**: New admin page (`/admin/watchers`) for monitoring field watcher agents
  - Live status cards per agent showing connectivity, version, IP, last-seen age, and log tail
  - Trigger Update button to queue a self-update on the agent's next heartbeat
  - Expand/collapse log tail with full-log expand mode
  - Live surgical refresh every 30 seconds via `/api/admin/watchers` — no full page reload, open logs stay open

### Changed

- **Watcher status logic**: Agent status now reflects whether Terra-View is hearing from the watcher (ok if seen within 60 minutes, missing otherwise) — previously it reflected the worst unit status from the last heartbeat payload, which caused false alarms when units went missing
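Under the new rule, an agent's status depends only on heartbeat recency. A sketch of the check, using the 60-minute threshold stated above (the function name is assumed):

```python
from datetime import datetime, timedelta

def watcher_status(last_seen, now, threshold=timedelta(minutes=60)):
    """'ok' if the watcher has been heard from within the threshold, else 'missing'.

    Note: unit-level status from the heartbeat payload is deliberately ignored here.
    """
    if last_seen is None:
        return "missing"
    return "ok" if now - last_seen <= threshold else "missing"

now = datetime(2026, 3, 18, 12, 0)
print(watcher_status(datetime(2026, 3, 18, 11, 30), now))  # ok (30 min ago)
print(watcher_status(datetime(2026, 3, 18, 9, 0), now))    # missing (3 h ago)
```

Decoupling agent connectivity from unit health is what removes the false alarms: a missing unit no longer makes a perfectly reachable watcher look down.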
### Fixed

- **Watcher Manager meta row**: Dark mode background was white due to an invalid `dark:bg-slate-850` Tailwind class; corrected to `dark:bg-slate-800`

---

## [0.7.1] - 2026-03-12

### Added

- **"Out for Calibration" Unit Status**: New `out_for_cal` status for units currently away for calibration, with visual indicators in the roster, unit list, and seismograph stats panel
- **Reservation Modal**: Fleet calendar reservation modal is now fully functional for creating and managing device reservations

### Changed

- **Retire Unit Button**: Redesigned to be more visually prominent/destructive to reduce accidental clicks

### Fixed

- **Migration Scripts**: Fixed database path references in several migration scripts
- **Docker Compose**: Removed the dev override file from the repository; dev environment config is kept separate

### Migration Notes

Run the following migration script once per database before deploying:

```bash
python backend/migrate_add_out_for_calibration.py
```

---

## [0.7.0] - 2026-03-07

### Added

- **Project Status Management**: Projects can now be placed `on_hold` or `archived`, with automatic cancellation of pending scheduled actions
- **Hard Delete Projects**: Support for permanently deleting projects, in addition to soft-delete with auto-pruning
- **Vibration Location Detail**: New dedicated template for vibration project location detail views
- **Vibration Project Isolation**: Vibration projects no longer show SLM-specific project tabs
- **Manual SD Card Data Upload**: Upload offline NRL data directly from SD card via ZIP or multi-file select
  - Accepts `.rnd`/`.rnh` files; parses `.rnh` metadata for session start/stop times, serial number, and store name
  - Creates `MonitoringSession` and `DataFile` records automatically; no unit assignment required
  - Upload panel on the NRL detail Data Files tab with inline feedback and auto-refresh via HTMX
- **Standalone SLM Type**: New SLM device mode that operates without a modem (direct IP connection)
- **NL32 Data Support**: Report generator and web viewer now support the NL32 measurement data format
- **Combined Report Wizard**: Multi-session combined Excel report generation tool
  - Wizard UI grouped by location with period type badges (day/night)
  - Each selected session produces one `.xlsx` in a ZIP archive
  - Period type filtering: day sessions keep the last calendar date (7AM–6:59PM); night sessions span both days (7PM–6:59AM)
- **Combined Report Preview**: Interactive spreadsheet-style preview before generating combined reports
- **Chart Preview**: Live chart preview in the report generator matching final report styling
- **SLM Model Schemas**: Per-model configuration schemas for NL32, NL43, NL53 devices
- **Data Collection Mode**: Projects now store a data collection mode field with UI controls and migration
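The day/night period filtering described above reduces to a simple hour test: day covers 07:00–18:59 and night covers 19:00–06:59, spanning midnight. A sketch of the rule (the real wizard's session-splitting logic is more involved):

```python
from datetime import datetime

def period_type(ts):
    """Classify a sample timestamp: 'day' = 07:00-18:59, 'night' = 19:00-06:59."""
    return "day" if 7 <= ts.hour < 19 else "night"

def filter_period(samples, wanted):
    """Keep only samples belonging to the requested period type."""
    return [ts for ts in samples if period_type(ts) == wanted]

samples = [datetime(2026, 3, 7, 8, 0),   # day
           datetime(2026, 3, 7, 23, 0),  # night
           datetime(2026, 3, 8, 2, 30)]  # night (same night session, next morning)
print(len(filter_period(samples, "night")))  # 2
```

Because the night window wraps midnight, a single night session naturally collects samples from two calendar dates, which is exactly why night sessions "span both days" in the report split.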
### Changed

- **MonitoringSession rename**: `RecordingSession` renamed to `MonitoringSession` throughout the codebase; DB table renamed from `recording_sessions` to `monitoring_sessions`
  - Migration: `backend/migrate_rename_recording_to_monitoring_sessions.py`
- **Combined Report Split Logic**: Separate days now generate separate `.xlsx` files; NRLs remain one per sheet
- **Mass Upload Parsing**: Smarter file filtering — no longer imports unneeded Lp files or `.xlsx` files
- **SLM Start Time Grace Period**: 15-minute grace window added so data starting at the session start time is included
- **NL32 Date Parsing**: Date now read from the `start_time` field instead of file metadata
- **Project Data Labels**: Improved Jinja filters and UI label clarity for project data views

### Fixed

- **Dev/Prod Separation**: Dev server now uses a Docker Compose override; production deployment is no longer affected by dev config
- **SLM Modal**: Bench/deploy toggle now correctly shown in the SLM unit modal
- **Auto-Downloaded Files**: Files downloaded by the scheduler now appear in project file listings
- **Duplicate Download**: Removed a duplicate file download that occurred following a scheduled stop
- **SLMM Environment Variables**: `TCP_IDLE_TTL` and `TCP_MAX_AGE` now correctly passed to the SLMM service via docker-compose

### Technical Details

- `session_label` and `period_type` stored on the `monitoring_sessions` table (migration: `migrate_add_session_period_type.py`)
- `device_model` stored on the `monitoring_sessions` table (migration: `migrate_add_session_device_model.py`)
- Upload endpoint: `POST /api/projects/{project_id}/nrl/{location_id}/upload-data`
- ZIP filename format: `{session_label}_{project_name}_report.xlsx` (label first)
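Both the upload path and the ZIP member name above are plain string templates. A small sketch that builds them (the helper names and the base URL are mine, not the app's):

```python
def upload_url(project_id, location_id, base="http://localhost:8000"):
    """Build the SD-card upload endpoint path from the Technical Details above."""
    return f"{base}/api/projects/{project_id}/nrl/{location_id}/upload-data"

def report_filename(session_label, project_name):
    """ZIP member name: label first, then project, per the stated format."""
    return f"{session_label}_{project_name}_report.xlsx"

print(upload_url(12, 3))
print(report_filename("L1-Night", "Main-St-Tower"))
```

A client would POST multipart form data (the `.rnd`/`.rnh` files or a ZIP) to that URL; the exact form field names are not given in this diff.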
### Migration Notes

Run the following migration scripts once per database before deploying:

```bash
python backend/migrate_rename_recording_to_monitoring_sessions.py
python backend/migrate_add_session_period_type.py
python backend/migrate_add_session_device_model.py
```

---

## [0.6.1] - 2026-02-16

### Added

- **One-Off Recording Schedules**: Support for scheduling single recordings with specific start and end datetimes
- **Bidirectional Pairing Sync**: Pairing a device with a modem now automatically updates both sides, clearing stale pairings when reassigned
- **Auto-Fill Notes from Modem**: Notes are now copied from the modem to the paired device when the fields are empty
- **SLMM Download Requests**: New `_download_request` method in the SLMM client for binary file downloads with local save

### Fixed

- **Scheduler Timezone**: One-off scheduler times now use local time instead of UTC
- **Pairing Consistency**: Old device references are properly cleared when a modem is re-paired to a new device

## [0.6.0] - 2026-02-06

### Added

- **Calendar & Reservation Mode**: Fleet calendar view with a reservation system for scheduling device deployments
- **Device Pairing Interface**: New two-column pairing page (`/pair-devices`) for linking recorders (seismographs/SLMs) with modems
  - Visual pairing interface with drag-and-drop style interactions
  - Fuzzy-search modem pairing for SLMs
  - Pairing options now accessible from the modem page
  - Improved pair status sharing across views
- **Modem Dashboard Enhancements**:
  - Modem model number is now a dedicated configuration field with per-model options
  - Direct link to the modem login page from the unit detail view
  - Modem view converted to list format
- **Seismograph List Improvements**:
  - Enhanced visibility with better filtering and sorting
  - Calibration dates now color-coded for quick status assessment
  - User sets the date of the previous calibration (not expiry) for a clearer workflow
- **SLMM Device Control Lock**: Prevents command flooding to NL-43 devices
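With the previous-calibration-date workflow, the expiry has to be derived before a row can be color-coded. A sketch of that derivation, assuming an annual calibration interval and a 30-day warning window (both numbers are assumptions, not stated in this diff):

```python
from datetime import date, timedelta

CAL_INTERVAL = timedelta(days=365)  # assumed annual calibration cycle
WARN_WINDOW = timedelta(days=30)    # assumed near-expiry warning window

def cal_color(last_cal, today):
    """green = current, amber = expiring soon, red = expired."""
    expiry = last_cal + CAL_INTERVAL
    if today > expiry:
        return "red"
    if expiry - today <= WARN_WINDOW:
        return "amber"
    return "green"

today = date(2026, 2, 6)
print(cal_color(date(2025, 6, 1), today))   # green (expires 2026-06-01)
print(cal_color(date(2025, 2, 20), today))  # amber (expires 2026-02-20)
print(cal_color(date(2024, 12, 1), today))  # red (expired 2025-12-01)
```

Storing the last calibration date and deriving the expiry keeps the data entry simple (copy the date off the cal certificate) while the UI still gets a traffic-light status.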
### Changed

- **Calibration Date UX**: Users now set the date of the previous calibration rather than upcoming expiry dates, a more intuitive workflow
- **Settings Persistence**: Settings save no longer reloads the page
- **Tab State**: Tab state now persists in the URL hash for better navigation
- **Scheduler Management**: Schedule changes now cascade to individual events
- **Dashboard Filtering**: Enhanced dashboard with additional filtering options and SLM status sync
- **SLMM Polling Intervals**: Fixed and improved polling intervals for better responsiveness
- **24-Hour Scheduler Cycle**: Improved cycle handling to prevent issues with scheduled downloads

### Fixed

- **SLM Modal Fields**: Modal now only contains the correct device-specific fields
- **IP Address Handling**: IP address correctly passed via modem pairing
- **Mobile Type Display**: Fixed incorrect device type display in roster and device tables
- **SLMM Scheduled Downloads**: Fixed issues with scheduled download operations

## [0.5.1] - 2026-01-27

### Added

- **Dashboard Schedule View**: Today's scheduled actions now display directly on the main dashboard
  - New "Today's Actions" panel showing upcoming and past scheduled events
  - Schedule list partial for project-specific schedule views
  - API endpoint for fetching today's schedule data
- **New Branding Assets**: Complete logo rework for Terra-View
  - New Terra-View logos for light and dark themes
  - Retina-ready (@2x) logo variants
  - Updated favicons (16px and 32px)
  - Refreshed PWA icons (72px through 512px)

### Changed

- **Dashboard Layout**: Reorganized to include the schedule information panel
- **Base Template**: Updated to use the new Terra-View logos with theme-aware switching

## [0.5.0] - 2026-01-23

_Note: This version was not formally released; changes were included in v0.5.1._

## [0.4.4] - 2026-01-23

### Added

- **Recurring schedules**: New scheduler service, recurring schedule APIs, and schedule templates (calendar/interval/list).
- **Alerts UI + backend**: Alerting service plus dropdown/list templates for surfacing notifications.
- **Report templates + viewers**: CRUD API for report templates, report preview screen, and RND file viewer.
- **SLM tooling**: SLM settings modal and SLM project report generator workflow.

### Changed

- **Project data management**: Unified files view, refreshed FTP browser, and new project header/templates for file/session/unit/assignment lists.
- **Device/SLM sync**: Standardized SLM device types and tightened SLMM sync paths.
- **Docs/scripts**: Cleanup pass and expanded device-type documentation.

### Fixed

- **Scheduler actions**: Strict command definitions so actions run reliably.
- **Project view title**: Resolved JSON string rendering in project headers.

## [0.4.3] - 2026-01-14

### Added

A second hunk appends compare links for the new versions:

@@ -361,6 +614,11 @@ No database migration required for v0.4.0. All new features use existing databas
 - Photo management per unit
 - Automated status categorization (OK/Pending/Missing)
 
+[0.7.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.6.1...v0.7.0
+[0.6.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.5.1...v0.6.0
+[0.5.1]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.5.0...v0.5.1
+[0.5.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.4...v0.5.0
+[0.4.4]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.3...v0.4.4
 [0.4.3]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.2...v0.4.3
 [0.4.2]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.1...v0.4.2
 [0.4.1]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.0...v0.4.1
Dockerfile diff:

@@ -1,5 +1,9 @@
 FROM python:3.11-slim
 
+# Build number for dev builds (injected via --build-arg)
+ARG BUILD_NUMBER=0
+ENV BUILD_NUMBER=${BUILD_NUMBER}
+
 # Set working directory
 WORKDIR /app
README.md — 48 lines changed
@@ -1,4 +1,4 @@
-# Seismo Fleet Manager v0.4.3
+# Terra-View v0.9.2
 Backend API and HTMX-powered web interface for managing a mixed fleet of seismographs and field modems. Track deployments, monitor health in real time, merge roster intent with incoming telemetry, and control your fleet through a unified database and dashboard.
 
 ## Features
@@ -496,6 +496,34 @@ docker compose down -v
 
 ## Release Highlights
 
+### v0.8.0 — 2026-03-18
+- **Watcher Manager**: Admin page for monitoring field watcher agents with live status cards, log tails, and one-click update triggering
+- **Watcher Status Fix**: Agent status now reflects heartbeat connectivity (missing if not heard from in >60 min) rather than unit-level data staleness
+- **Live Refresh**: Watcher Manager surgically patches status, last-seen, and pending indicators every 30s without a full page reload
+
+### v0.7.0 — 2026-03-07
+- **Project Status Management**: On-hold and archived project states with automatic cancellation of pending actions
+- **Manual SD Card Upload**: Upload offline NRL/SLM data directly from SD card (ZIP or multi-file); auto-creates monitoring sessions from `.rnh` metadata
+- **Combined Report Wizard**: Multi-session Excel report generation with location grouping, period type filtering, and ZIP download
+- **NL32 Support**: Report generator and web viewer now handle NL32 measurement data
+- **Chart Preview**: Live chart preview in the report generator matching final output styling
+- **Standalone SLM Mode**: SLMs can now be configured without a paired modem (direct IP)
+- **Vibration Project Isolation**: Vibration project views no longer show SLM-specific tabs
+- **MonitoringSession Rename**: `RecordingSession` renamed to `MonitoringSession` throughout; run migration before deploying
+
+### v0.6.1 — 2026-02-16
+- **One-Off Recording Schedules**: Schedule single recordings with specific start/end datetimes
+- **Bidirectional Pairing Sync**: Device-modem pairing now updates both sides automatically
+- **Scheduler Timezone Fix**: One-off schedule times use local time instead of UTC
+
+### v0.6.0 — 2026-02-06
+- **Calendar & Reservation Mode**: Fleet calendar view with device deployment scheduling and reservation system
+- **Device Pairing Interface**: New `/pair-devices` page with two-column layout for linking recorders with modems, fuzzy-search, and visual pairing workflow
+- **Calibration UX Overhaul**: Users now set date of previous calibration (not expiry); seismograph list enhanced with color-coded calibration status, filtering, and sorting
+- **Modem Dashboard**: Model number as dedicated config, modem login links, list view format, and pairing options accessible from modem page
+- **SLMM Improvements**: Device control lock prevents command flooding, fixed polling intervals and scheduled downloads
+- **UI Polish**: Tab state persists in URL hash, settings save without reload, scheduler changes cascade to events, fixed mobile type display
+
 ### v0.4.3 — 2026-01-14
 - **Sound Level Meter workflow**: Roster manager surfaces SLM metadata, supports rename actions, and adds return-to-project navigation plus schedule/unit templates for project planning.
 - **Project insight panels**: Project dashboards now expose file and session lists so teams can see what each project stores before diving into units.
@@ -571,9 +599,23 @@ MIT
 
 ## Version
 
-**Current: 0.4.3** — SLM roster/project view refresh, project insight panels, FTP browser folder downloads, and SLMM sync (2026-01-14)
+**Current: 0.8.0** — Watcher Manager admin page, live agent status refresh, watcher connectivity-based status (2026-03-18)
 
-Previous: 0.4.2 — SLM configuration interface with TCP/FTP controls, modem diagnostics, and dashboard endpoints for Sound Level Meters (2026-01-05)
+Previous: 0.7.1 — Out-for-calibration status, reservation modal, migration fixes (2026-03-12)
+
+0.7.0 — Project status management, manual SD card upload, combined report wizard, NL32 support, MonitoringSession rename (2026-03-07)
+
+0.6.1 — One-off recording schedules, bidirectional pairing sync, scheduler timezone fix (2026-02-16)
+
+0.6.0 — Calendar & reservation mode, device pairing interface, calibration UX overhaul, modem dashboard enhancements (2026-02-06)
+
+0.5.1 — Dashboard schedule view with today's actions panel, new Terra-View branding and logo rework (2026-01-27)
+
+0.4.4 — Recurring schedules, alerting UI, report templates + RND viewer, and SLM workflow polish (2026-01-23)
+
+0.4.3 — SLM roster/project view refresh, project insight panels, FTP browser folder downloads, and SLMM sync (2026-01-14)
+
+0.4.2 — SLM configuration interface with TCP/FTP controls, modem diagnostics, and dashboard endpoints for Sound Level Meters (2026-01-05)
+
 0.4.1 — Sound Level Meter integration with full management UI for SLM units (2026-01-05)
assets/terra-view-icon_large.png — BIN, new file (36 KiB)
@@ -18,7 +18,7 @@ from backend.models import (
     MonitoringLocation,
     UnitAssignment,
     ScheduledAction,
-    RecordingSession,
+    MonitoringSession,
     DataFile,
 )
 from datetime import datetime
223
backend/main.py
@@ -18,9 +18,10 @@ logging.basicConfig(
|
|||||||
logger = logging.getLogger(__name__)
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
from backend.database import engine, Base, get_db
|
from backend.database import engine, Base, get_db
|
||||||
from backend.routers import roster, units, photos, roster_edit, roster_rename, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard, projects, project_locations, scheduler
|
from backend.routers import roster, units, photos, roster_edit, roster_rename, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard, projects, project_locations, scheduler, modem_dashboard
|
||||||
from backend.services.snapshot import emit_status_snapshot
|
from backend.services.snapshot import emit_status_snapshot
|
||||||
from backend.models import IgnoredUnit
|
from backend.models import IgnoredUnit
|
||||||
|
from backend.utils.timezone import get_user_timezone
|
||||||
|
|
||||||
# Create database tables
|
# Create database tables
|
||||||
Base.metadata.create_all(bind=engine)
|
Base.metadata.create_all(bind=engine)
|
||||||
@@ -29,7 +30,11 @@ Base.metadata.create_all(bind=engine)
 ENVIRONMENT = os.getenv("ENVIRONMENT", "production")

 # Initialize FastAPI app
-VERSION = "0.4.3"
+VERSION = "0.9.2"
+if ENVIRONMENT == "development":
+    _build = os.getenv("BUILD_NUMBER", "0")
+    if _build and _build != "0":
+        VERSION = f"{VERSION}-{_build}"
 app = FastAPI(
     title="Seismo Fleet Manager",
     description="Backend API for managing seismograph fleet status",
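The dev-build version suffix added in this hunk can be exercised in isolation. A minimal sketch — the `ENVIRONMENT` and `BUILD_NUMBER` values below are made up for illustration:

```python
import os

# Sketch of the build-number suffix logic from the hunk above.
# ENVIRONMENT would normally come from os.getenv("ENVIRONMENT", "production").
VERSION = "0.9.2"
ENVIRONMENT = "development"  # hypothetical value
os.environ["BUILD_NUMBER"] = "17"  # hypothetical value

if ENVIRONMENT == "development":
    _build = os.getenv("BUILD_NUMBER", "0")
    if _build and _build != "0":
        VERSION = f"{VERSION}-{_build}"

print(VERSION)  # 0.9.2-17
```

In production the guard leaves `VERSION` untouched, so only development builds carry the `-<build>` suffix.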
@@ -58,8 +63,8 @@ app.add_middleware(
 # Mount static files
 app.mount("/static", StaticFiles(directory="backend/static"), name="static")

-# Setup Jinja2 templates
-templates = Jinja2Templates(directory="templates")
+# Use shared templates configuration with timezone filters
+from backend.templates_config import templates

 # Add custom context processor to inject environment variable into all templates
 @app.middleware("http")
@@ -92,17 +97,42 @@ app.include_router(slmm.router)
 app.include_router(slm_ui.router)
 app.include_router(slm_dashboard.router)
 app.include_router(seismo_dashboard.router)
+app.include_router(modem_dashboard.router)

 from backend.routers import settings
 app.include_router(settings.router)

+from backend.routers import watcher_manager
+app.include_router(watcher_manager.router)
+
 # Projects system routers
 app.include_router(projects.router)
 app.include_router(project_locations.router)
 app.include_router(scheduler.router)

-# Start scheduler service on application startup
+# Report templates router
+from backend.routers import report_templates
+app.include_router(report_templates.router)
+
+# Alerts router
+from backend.routers import alerts
+app.include_router(alerts.router)
+
+# Recurring schedules router
+from backend.routers import recurring_schedules
+app.include_router(recurring_schedules.router)
+
+# Fleet Calendar router
+from backend.routers import fleet_calendar
+app.include_router(fleet_calendar.router)
+
+# Deployment Records router
+from backend.routers import deployments
+app.include_router(deployments.router)
+
+# Start scheduler service and device status monitor on application startup
 from backend.services.scheduler import start_scheduler, stop_scheduler
+from backend.services.device_status_monitor import start_device_status_monitor, stop_device_status_monitor

 @app.on_event("startup")
 async def startup_event():
@@ -111,9 +141,17 @@ async def startup_event():
     await start_scheduler()
     logger.info("Scheduler service started")

+    logger.info("Starting device status monitor...")
+    await start_device_status_monitor()
+    logger.info("Device status monitor started")
+
 @app.on_event("shutdown")
 def shutdown_event():
     """Clean up services on app shutdown"""
+    logger.info("Stopping device status monitor...")
+    stop_device_status_monitor()
+    logger.info("Device status monitor stopped")
+
     logger.info("Stopping scheduler service...")
     stop_scheduler()
     logger.info("Scheduler service stopped")
@@ -195,6 +233,73 @@ async def seismographs_page(request: Request):
     return templates.TemplateResponse("seismographs.html", {"request": request})


+@app.get("/modems", response_class=HTMLResponse)
+async def modems_page(request: Request):
+    """Field modems management dashboard"""
+    return templates.TemplateResponse("modems.html", {"request": request})
+
+
+@app.get("/pair-devices", response_class=HTMLResponse)
+async def pair_devices_page(request: Request, db: Session = Depends(get_db)):
+    """
+    Device pairing page - two-column layout for pairing recorders with modems.
+    """
+    from backend.models import RosterUnit
+
+    # Get all non-retired recorders (seismographs and SLMs)
+    recorders = db.query(RosterUnit).filter(
+        RosterUnit.retired == False,
+        RosterUnit.device_type.in_(["seismograph", "slm", None])  # None defaults to seismograph
+    ).order_by(RosterUnit.id).all()
+
+    # Get all non-retired modems
+    modems = db.query(RosterUnit).filter(
+        RosterUnit.retired == False,
+        RosterUnit.device_type == "modem"
+    ).order_by(RosterUnit.id).all()
+
+    # Build existing pairings list
+    pairings = []
+    for recorder in recorders:
+        if recorder.deployed_with_modem_id:
+            modem = next((m for m in modems if m.id == recorder.deployed_with_modem_id), None)
+            pairings.append({
+                "recorder_id": recorder.id,
+                "recorder_type": (recorder.device_type or "seismograph").upper(),
+                "modem_id": recorder.deployed_with_modem_id,
+                "modem_ip": modem.ip_address if modem else None
+            })
+
+    # Convert to dicts for template
+    recorders_data = [
+        {
+            "id": r.id,
+            "device_type": r.device_type or "seismograph",
+            "deployed": r.deployed,
+            "deployed_with_modem_id": r.deployed_with_modem_id
+        }
+        for r in recorders
+    ]
+
+    modems_data = [
+        {
+            "id": m.id,
+            "deployed": m.deployed,
+            "deployed_with_unit_id": m.deployed_with_unit_id,
+            "ip_address": m.ip_address,
+            "phone_number": m.phone_number
+        }
+        for m in modems
+    ]
+
+    return templates.TemplateResponse("pair_devices.html", {
+        "request": request,
+        "recorders": recorders_data,
+        "modems": modems_data,
+        "pairings": pairings
+    })
+
+
 @app.get("/projects", response_class=HTMLResponse)
 async def projects_page(request: Request):
     """Projects management and overview"""
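The pairing-list construction in `pair_devices_page` can be sketched with plain dicts standing in for `RosterUnit` rows; all IDs and addresses below are hypothetical:

```python
# Sketch of the pairings loop from pair_devices_page, using plain dicts
# (hypothetical data) in place of SQLAlchemy RosterUnit rows.
recorders = [
    {"id": "BE1234", "device_type": "seismograph", "deployed_with_modem_id": "M7"},
    {"id": "SLM-02", "device_type": "slm", "deployed_with_modem_id": None},
]
modems = [{"id": "M7", "ip_address": "10.0.0.7"}]

pairings = []
for recorder in recorders:
    if recorder["deployed_with_modem_id"]:
        # next() with a default returns None when the modem is not in the list
        modem = next((m for m in modems if m["id"] == recorder["deployed_with_modem_id"]), None)
        pairings.append({
            "recorder_id": recorder["id"],
            "recorder_type": (recorder["device_type"] or "seismograph").upper(),
            "modem_id": recorder["deployed_with_modem_id"],
            "modem_ip": modem["ip_address"] if modem else None,
        })

print(pairings)  # one pairing: BE1234 paired with M7 at 10.0.0.7
```

Only recorders with a non-empty `deployed_with_modem_id` produce a pairing, so the unpaired SLM is skipped.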
@@ -218,7 +323,7 @@ async def nrl_detail_page(
     db: Session = Depends(get_db)
 ):
     """NRL (Noise Recording Location) detail page with tabs"""
-    from backend.models import Project, MonitoringLocation, UnitAssignment, RosterUnit, RecordingSession, DataFile
+    from backend.models import Project, MonitoringLocation, UnitAssignment, RosterUnit, MonitoringSession, DataFile
     from sqlalchemy import and_

     # Get project
@@ -250,27 +355,40 @@ async def nrl_detail_page(
     ).first()

     assigned_unit = None
+    assigned_modem = None
     if assignment:
         assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
+        if assigned_unit and assigned_unit.deployed_with_modem_id:
+            assigned_modem = db.query(RosterUnit).filter_by(id=assigned_unit.deployed_with_modem_id).first()

     # Get session count
-    session_count = db.query(RecordingSession).filter_by(location_id=location_id).count()
+    session_count = db.query(MonitoringSession).filter_by(location_id=location_id).count()

     # Get file count (DataFile links to session, not directly to location)
     file_count = db.query(DataFile).join(
-        RecordingSession,
-        DataFile.session_id == RecordingSession.id
-    ).filter(RecordingSession.location_id == location_id).count()
+        MonitoringSession,
+        DataFile.session_id == MonitoringSession.id
+    ).filter(MonitoringSession.location_id == location_id).count()

     # Check for active session
-    active_session = db.query(RecordingSession).filter(
+    active_session = db.query(MonitoringSession).filter(
         and_(
-            RecordingSession.location_id == location_id,
-            RecordingSession.status == "recording"
+            MonitoringSession.location_id == location_id,
+            MonitoringSession.status == "recording"
         )
     ).first()

-    return templates.TemplateResponse("nrl_detail.html", {
+    # Parse connection_mode from location_metadata JSON
+    import json as _json
+    connection_mode = "connected"
+    try:
+        meta = _json.loads(location.location_metadata or "{}")
+        connection_mode = meta.get("connection_mode", "connected")
+    except Exception:
+        pass
+
+    template = "vibration_location_detail.html" if location.location_type == "vibration" else "nrl_detail.html"
+    return templates.TemplateResponse(template, {
         "request": request,
         "project_id": project_id,
         "location_id": location_id,
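The defensive `connection_mode` parse added above can be factored into a small helper; a sketch — the helper name `parse_connection_mode` is illustrative, not part of the diff:

```python
import json

def parse_connection_mode(location_metadata):
    """Mirror of the defensive parse in nrl_detail_page: fall back to
    "connected" when metadata is missing, empty, or malformed JSON."""
    connection_mode = "connected"
    try:
        meta = json.loads(location_metadata or "{}")
        connection_mode = meta.get("connection_mode", "connected")
    except Exception:
        pass
    return connection_mode

print(parse_connection_mode('{"connection_mode": "offline"}'))  # offline
print(parse_connection_mode(None))                              # connected
print(parse_connection_mode("not json"))                        # connected
```

The broad `except Exception: pass` means a corrupt `location_metadata` column degrades the page to the default mode instead of raising a 500.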
@@ -278,9 +396,11 @@ async def nrl_detail_page(
         "location": location,
         "assignment": assignment,
         "assigned_unit": assigned_unit,
+        "assigned_modem": assigned_modem,
         "session_count": session_count,
         "file_count": file_count,
         "active_session": active_session,
+        "connection_mode": connection_mode,
     })
@@ -550,6 +670,7 @@ async def devices_all_partial(request: Request):
             "last_seen": unit_data.get("last", "Never"),
             "deployed": True,
             "retired": False,
+            "out_for_calibration": False,
             "ignored": False,
             "note": unit_data.get("note", ""),
             "device_type": unit_data.get("device_type", "seismograph"),
@@ -559,6 +680,7 @@ async def devices_all_partial(request: Request):
             "last_calibrated": unit_data.get("last_calibrated"),
             "next_calibration_due": unit_data.get("next_calibration_due"),
             "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
+            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
             "ip_address": unit_data.get("ip_address"),
             "phone_number": unit_data.get("phone_number"),
             "hardware_model": unit_data.get("hardware_model"),
@@ -573,6 +695,7 @@ async def devices_all_partial(request: Request):
             "last_seen": unit_data.get("last", "Never"),
             "deployed": False,
             "retired": False,
+            "out_for_calibration": False,
             "ignored": False,
             "note": unit_data.get("note", ""),
             "device_type": unit_data.get("device_type", "seismograph"),
@@ -582,6 +705,59 @@ async def devices_all_partial(request: Request):
             "last_calibrated": unit_data.get("last_calibrated"),
             "next_calibration_due": unit_data.get("next_calibration_due"),
             "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
+            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
+            "ip_address": unit_data.get("ip_address"),
+            "phone_number": unit_data.get("phone_number"),
+            "hardware_model": unit_data.get("hardware_model"),
+        })
+
+    # Add allocated units
+    for unit_id, unit_data in snapshot.get("allocated", {}).items():
+        units_list.append({
+            "id": unit_id,
+            "status": "Allocated",
+            "age": "N/A",
+            "last_seen": "N/A",
+            "deployed": False,
+            "retired": False,
+            "out_for_calibration": False,
+            "allocated": True,
+            "allocated_to_project_id": unit_data.get("allocated_to_project_id", ""),
+            "ignored": False,
+            "note": unit_data.get("note", ""),
+            "device_type": unit_data.get("device_type", "seismograph"),
+            "address": unit_data.get("address", ""),
+            "coordinates": unit_data.get("coordinates", ""),
+            "project_id": unit_data.get("project_id", ""),
+            "last_calibrated": unit_data.get("last_calibrated"),
+            "next_calibration_due": unit_data.get("next_calibration_due"),
+            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
+            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
+            "ip_address": unit_data.get("ip_address"),
+            "phone_number": unit_data.get("phone_number"),
+            "hardware_model": unit_data.get("hardware_model"),
+        })
+
+    # Add out-for-calibration units
+    for unit_id, unit_data in snapshot["out_for_calibration"].items():
+        units_list.append({
+            "id": unit_id,
+            "status": "Out for Calibration",
+            "age": "N/A",
+            "last_seen": "N/A",
+            "deployed": False,
+            "retired": False,
+            "out_for_calibration": True,
+            "ignored": False,
+            "note": unit_data.get("note", ""),
+            "device_type": unit_data.get("device_type", "seismograph"),
+            "address": unit_data.get("address", ""),
+            "coordinates": unit_data.get("coordinates", ""),
+            "project_id": unit_data.get("project_id", ""),
+            "last_calibrated": unit_data.get("last_calibrated"),
+            "next_calibration_due": unit_data.get("next_calibration_due"),
+            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
+            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
             "ip_address": unit_data.get("ip_address"),
             "phone_number": unit_data.get("phone_number"),
             "hardware_model": unit_data.get("hardware_model"),
@@ -596,6 +772,7 @@ async def devices_all_partial(request: Request):
             "last_seen": "N/A",
             "deployed": False,
             "retired": True,
+            "out_for_calibration": False,
             "ignored": False,
             "note": unit_data.get("note", ""),
             "device_type": unit_data.get("device_type", "seismograph"),
@@ -605,6 +782,7 @@ async def devices_all_partial(request: Request):
             "last_calibrated": unit_data.get("last_calibrated"),
             "next_calibration_due": unit_data.get("next_calibration_due"),
             "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
+            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
             "ip_address": unit_data.get("ip_address"),
             "phone_number": unit_data.get("phone_number"),
             "hardware_model": unit_data.get("hardware_model"),
@@ -619,6 +797,7 @@ async def devices_all_partial(request: Request):
             "last_seen": "N/A",
             "deployed": False,
             "retired": False,
+            "out_for_calibration": False,
             "ignored": True,
             "note": unit_data.get("note", unit_data.get("reason", "")),
             "device_type": unit_data.get("device_type", "unknown"),
@@ -628,6 +807,7 @@ async def devices_all_partial(request: Request):
             "last_calibrated": None,
             "next_calibration_due": None,
             "deployed_with_modem_id": None,
+            "deployed_with_unit_id": None,
             "ip_address": None,
             "phone_number": None,
             "hardware_model": None,
@@ -635,22 +815,27 @@ async def devices_all_partial(request: Request):

     # Sort by status category, then by ID
     def sort_key(unit):
-        # Priority: deployed (active) -> benched -> retired -> ignored
+        # Priority: deployed (active) -> allocated -> benched -> out_for_calibration -> retired -> ignored
         if unit["deployed"]:
             return (0, unit["id"])
-        elif not unit["retired"] and not unit["ignored"]:
+        elif unit.get("allocated"):
             return (1, unit["id"])
-        elif unit["retired"]:
+        elif not unit["retired"] and not unit["out_for_calibration"] and not unit["ignored"]:
             return (2, unit["id"])
-        else:
+        elif unit["out_for_calibration"]:
             return (3, unit["id"])
+        elif unit["retired"]:
+            return (4, unit["id"])
+        else:
+            return (5, unit["id"])

     units_list.sort(key=sort_key)

     return templates.TemplateResponse("partials/devices_table.html", {
         "request": request,
         "units": units_list,
-        "timestamp": datetime.now().strftime("%H:%M:%S")
+        "timestamp": datetime.now().strftime("%H:%M:%S"),
+        "user_timezone": get_user_timezone()
     })
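The reworked `sort_key` relies on Python's tuple ordering: units compare first by status priority, then by ID. A self-contained sketch with hypothetical unit dicts shows the resulting deployed, allocated, out-for-calibration, retired ordering:

```python
# Standalone sketch of the tuple-key sort used by devices_all_partial.
# The unit dicts are hypothetical; only the key logic mirrors the diff.
def sort_key(unit):
    if unit["deployed"]:
        return (0, unit["id"])
    elif unit.get("allocated"):
        return (1, unit["id"])
    elif not unit["retired"] and not unit["out_for_calibration"] and not unit["ignored"]:
        return (2, unit["id"])  # benched
    elif unit["out_for_calibration"]:
        return (3, unit["id"])
    elif unit["retired"]:
        return (4, unit["id"])
    else:
        return (5, unit["id"])  # ignored

units = [
    {"id": "U3", "deployed": False, "allocated": False, "retired": True,  "out_for_calibration": False, "ignored": False},
    {"id": "U1", "deployed": True,  "allocated": False, "retired": False, "out_for_calibration": False, "ignored": False},
    {"id": "U2", "deployed": False, "allocated": True,  "retired": False, "out_for_calibration": False, "ignored": False},
    {"id": "U4", "deployed": False, "allocated": False, "retired": False, "out_for_calibration": True,  "ignored": False},
]
units.sort(key=sort_key)
print([u["id"] for u in units])  # ['U1', 'U2', 'U4', 'U3']
```

Using `unit.get("allocated")` rather than `unit["allocated"]` matters here, since only the allocated branch adds that key to its dicts.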
backend/migrate_add_allocated.py (new file, 35 lines)
@@ -0,0 +1,35 @@
+"""
+Migration: Add allocated and allocated_to_project_id columns to roster table.
+Run once: python backend/migrate_add_allocated.py
+"""
+import sqlite3
+import os
+
+DB_PATH = os.path.join(os.path.dirname(__file__), '..', 'data', 'seismo_fleet.db')
+
+def run():
+    conn = sqlite3.connect(DB_PATH)
+    cur = conn.cursor()
+
+    # Check existing columns
+    cur.execute("PRAGMA table_info(roster)")
+    cols = {row[1] for row in cur.fetchall()}
+
+    if 'allocated' not in cols:
+        cur.execute("ALTER TABLE roster ADD COLUMN allocated BOOLEAN DEFAULT 0 NOT NULL")
+        print("Added column: allocated")
+    else:
+        print("Column already exists: allocated")
+
+    if 'allocated_to_project_id' not in cols:
+        cur.execute("ALTER TABLE roster ADD COLUMN allocated_to_project_id VARCHAR")
+        print("Added column: allocated_to_project_id")
+    else:
+        print("Column already exists: allocated_to_project_id")
+
+    conn.commit()
+    conn.close()
+    print("Migration complete.")
+
+if __name__ == '__main__':
+    run()
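This migration, like the others in this branch, uses a check-then-alter pattern: read the existing columns via `PRAGMA table_info`, then `ALTER TABLE ... ADD COLUMN` only when the column is missing, so reruns are harmless. A generic sketch of that idempotent pattern — the helper `add_column_if_missing` is illustrative, not from the repo:

```python
import sqlite3

# Generic form of the idempotent column-add pattern used by the migrations.
def add_column_if_missing(conn, table, column, ddl):
    """Add a column only if PRAGMA table_info shows it is absent."""
    cols = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    if column not in cols:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {ddl}")
        return True
    return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE roster (id TEXT PRIMARY KEY)")

# First call adds the column; the rerun is a no-op.
print(add_column_if_missing(conn, "roster", "allocated", "BOOLEAN DEFAULT 0 NOT NULL"))  # True
print(add_column_if_missing(conn, "roster", "allocated", "BOOLEAN DEFAULT 0 NOT NULL"))  # False
```

SQLite permits `NOT NULL` on an added column only when it carries a non-NULL default, which is why the migration pairs `NOT NULL` with `DEFAULT 0`.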
backend/migrate_add_auto_increment_index.py (new file, 67 lines)
@@ -0,0 +1,67 @@
+"""
+Migration: Add auto_increment_index column to recurring_schedules table
+
+This migration adds the auto_increment_index column that controls whether
+the scheduler should automatically find an unused store index before starting
+a new measurement.
+
+Run this script once to update existing databases:
+    python -m backend.migrate_add_auto_increment_index
+"""
+
+import sqlite3
+import os
+
+DB_PATH = "data/seismo_fleet.db"
+
+
+def migrate():
+    """Add auto_increment_index column to recurring_schedules table."""
+    if not os.path.exists(DB_PATH):
+        print(f"Database not found at {DB_PATH}")
+        return False
+
+    conn = sqlite3.connect(DB_PATH)
+    cursor = conn.cursor()
+
+    try:
+        # Check if recurring_schedules table exists
+        cursor.execute("""
+            SELECT name FROM sqlite_master
+            WHERE type='table' AND name='recurring_schedules'
+        """)
+        if not cursor.fetchone():
+            print("recurring_schedules table does not exist yet. Will be created on app startup.")
+            conn.close()
+            return True
+
+        # Check if auto_increment_index column already exists
+        cursor.execute("PRAGMA table_info(recurring_schedules)")
+        columns = [row[1] for row in cursor.fetchall()]
+
+        if "auto_increment_index" in columns:
+            print("auto_increment_index column already exists in recurring_schedules table.")
+            conn.close()
+            return True
+
+        # Add the column
+        print("Adding auto_increment_index column to recurring_schedules table...")
+        cursor.execute("""
+            ALTER TABLE recurring_schedules
+            ADD COLUMN auto_increment_index BOOLEAN DEFAULT 1
+        """)
+        conn.commit()
+        print("Successfully added auto_increment_index column.")
+
+        conn.close()
+        return True
+
+    except Exception as e:
+        print(f"Migration failed: {e}")
+        conn.close()
+        return False
+
+
+if __name__ == "__main__":
+    success = migrate()
+    exit(0 if success else 1)
backend/migrate_add_deployment_records.py (new file, 79 lines)
@@ -0,0 +1,79 @@
+"""
+Migration: Add deployment_records table.
+
+Tracks each time a unit is sent to the field and returned.
+The active deployment is the row with actual_removal_date IS NULL.
+
+Run once per database:
+    python backend/migrate_add_deployment_records.py
+"""
+
+import sqlite3
+import os
+
+DB_PATH = "./data/seismo_fleet.db"
+
+
+def migrate_database():
+    if not os.path.exists(DB_PATH):
+        print(f"Database not found at {DB_PATH}")
+        return
+
+    conn = sqlite3.connect(DB_PATH)
+    cursor = conn.cursor()
+
+    try:
+        # Check if table already exists
+        cursor.execute("""
+            SELECT name FROM sqlite_master
+            WHERE type='table' AND name='deployment_records'
+        """)
+        if cursor.fetchone():
+            print("✓ deployment_records table already exists, skipping")
+            return
+
+        print("Creating deployment_records table...")
+        cursor.execute("""
+            CREATE TABLE deployment_records (
+                id TEXT PRIMARY KEY,
+                unit_id TEXT NOT NULL,
+                deployed_date DATE,
+                estimated_removal_date DATE,
+                actual_removal_date DATE,
+                project_ref TEXT,
+                project_id TEXT,
+                location_name TEXT,
+                notes TEXT,
+                created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+                updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+            )
+        """)
+
+        cursor.execute("""
+            CREATE INDEX idx_deployment_records_unit_id
+            ON deployment_records(unit_id)
+        """)
+        cursor.execute("""
+            CREATE INDEX idx_deployment_records_project_id
+            ON deployment_records(project_id)
+        """)
+        # Index for finding active deployments quickly
+        cursor.execute("""
+            CREATE INDEX idx_deployment_records_active
+            ON deployment_records(unit_id, actual_removal_date)
+        """)
+
+        conn.commit()
+        print("✓ deployment_records table created successfully")
+        print("✓ Indexes created")
+
+    except Exception as e:
+        conn.rollback()
+        print(f"✗ Migration failed: {e}")
+        raise
+    finally:
+        conn.close()
+
+
+if __name__ == "__main__":
+    migrate_database()
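The docstring's rule that the active deployment is the row with `actual_removal_date IS NULL` translates directly into a query, which is what the `idx_deployment_records_active` index on `(unit_id, actual_removal_date)` accelerates. A sketch against an abbreviated in-memory copy of the schema — the row values are made up:

```python
import sqlite3

# Abbreviated deployment_records schema from the migration, in memory,
# to show the active-deployment lookup. Data is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE deployment_records (
        id TEXT PRIMARY KEY,
        unit_id TEXT NOT NULL,
        deployed_date DATE,
        actual_removal_date DATE
    )
""")
# A completed deployment and a still-active one for the same unit.
conn.execute("INSERT INTO deployment_records VALUES ('d1', 'BE1234', '2025-11-01', '2025-12-01')")
conn.execute("INSERT INTO deployment_records VALUES ('d2', 'BE1234', '2026-01-02', NULL)")

# Active deployment = the row whose actual_removal_date IS NULL.
active = conn.execute(
    "SELECT id FROM deployment_records WHERE unit_id = ? AND actual_removal_date IS NULL",
    ("BE1234",),
).fetchone()
print(active[0])  # d2
conn.close()
```

Closing a deployment is then just an `UPDATE ... SET actual_removal_date = ?` on that row, which keeps at most one active record per unit by convention.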
backend/migrate_add_deployment_type.py (new file, 84 lines)
@@ -0,0 +1,84 @@
+"""
+Migration script to add deployment_type and deployed_with_unit_id fields to roster table.
+
+deployment_type: tracks what type of device a modem is deployed with:
+- "seismograph" - Modem is connected to a seismograph
+- "slm" - Modem is connected to a sound level meter
+- NULL/empty - Not assigned or unknown
+
+deployed_with_unit_id: stores the ID of the seismograph/SLM this modem is deployed with
+(reverse relationship of deployed_with_modem_id)
+
+Run this script once to migrate an existing database.
+"""
+
+import sqlite3
+import os
+
+# Database path
+DB_PATH = "./data/seismo_fleet.db"
+
+
+def migrate_database():
+    """Add deployment_type and deployed_with_unit_id columns to roster table"""
+
+    if not os.path.exists(DB_PATH):
+        print(f"Database not found at {DB_PATH}")
+        print("The database will be created automatically when you run the application.")
+        return
+
+    print(f"Migrating database: {DB_PATH}")
+
+    conn = sqlite3.connect(DB_PATH)
+    cursor = conn.cursor()
+
+    # Check if roster table exists
+    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='roster'")
+    table_exists = cursor.fetchone()
+
+    if not table_exists:
+        print("Roster table does not exist yet - will be created when app runs")
+        conn.close()
+        return
+
+    # Check existing columns
+    cursor.execute("PRAGMA table_info(roster)")
+    columns = [col[1] for col in cursor.fetchall()]
+
+    try:
+        # Add deployment_type if not exists
+        if 'deployment_type' not in columns:
+            print("Adding deployment_type column to roster table...")
+            cursor.execute("ALTER TABLE roster ADD COLUMN deployment_type TEXT")
+            print("  Added deployment_type column")
+
+            cursor.execute("CREATE INDEX IF NOT EXISTS ix_roster_deployment_type ON roster(deployment_type)")
+            print("  Created index on deployment_type")
+        else:
+            print("deployment_type column already exists")
+
+        # Add deployed_with_unit_id if not exists
+        if 'deployed_with_unit_id' not in columns:
+            print("Adding deployed_with_unit_id column to roster table...")
+            cursor.execute("ALTER TABLE roster ADD COLUMN deployed_with_unit_id TEXT")
+            print("  Added deployed_with_unit_id column")
+
+            cursor.execute("CREATE INDEX IF NOT EXISTS ix_roster_deployed_with_unit_id ON roster(deployed_with_unit_id)")
+            print("  Created index on deployed_with_unit_id")
+        else:
+            print("deployed_with_unit_id column already exists")
+
+        conn.commit()
+        print("\nMigration completed successfully!")
+
+    except sqlite3.Error as e:
+        print(f"\nError during migration: {e}")
+        conn.rollback()
+        raise
+
+    finally:
+        conn.close()
+
+
+if __name__ == "__main__":
+    migrate_database()
62  backend/migrate_add_estimated_units.py  Normal file
@@ -0,0 +1,62 @@
"""
Migration: Add estimated_units to job_reservations

Adds column:
- job_reservations.estimated_units: Estimated number of units for the reservation (nullable integer)
"""

import sqlite3
import sys
from pathlib import Path

# Default database path (matches production pattern)
DB_PATH = "./data/seismo_fleet.db"


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Check if job_reservations table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
        if not cursor.fetchone():
            print("job_reservations table does not exist. Skipping migration.")
            return

        # Get existing columns in job_reservations
        cursor.execute("PRAGMA table_info(job_reservations)")
        existing_cols = {row[1] for row in cursor.fetchall()}

        # Add estimated_units column if it doesn't exist
        if 'estimated_units' not in existing_cols:
            print("Adding estimated_units column to job_reservations...")
            cursor.execute("ALTER TABLE job_reservations ADD COLUMN estimated_units INTEGER")
        else:
            print("estimated_units column already exists. Skipping.")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    db_path = DB_PATH

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
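Nearly every script in this PR repeats the same idempotent check-then-alter pattern: read the column list with `PRAGMA table_info`, then `ALTER TABLE ... ADD COLUMN` only when the column is missing. A minimal sketch of that shared pattern, factored into a helper (the helper name is illustrative and not part of this codebase):

```python
import sqlite3


def add_column_if_missing(conn: sqlite3.Connection, table: str, column: str, decl: str) -> bool:
    """ALTER TABLE only if the column is absent. Returns True when it added the column."""
    cols = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    if column in cols:
        return False
    conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")
    conn.commit()
    return True


# Demo against a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_reservations (id TEXT PRIMARY KEY)")
assert add_column_if_missing(conn, "job_reservations", "estimated_units", "INTEGER")
# Re-running is a no-op, which is what makes these scripts safe to run twice.
assert not add_column_if_missing(conn, "job_reservations", "estimated_units", "INTEGER")
```

SQLite has no `ADD COLUMN IF NOT EXISTS`, so the `PRAGMA table_info` check is what makes re-runs safe.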
103  backend/migrate_add_job_reservations.py  Normal file
@@ -0,0 +1,103 @@
"""
Migration script to add job reservations for the Fleet Calendar feature.

This creates two tables:
- job_reservations: Track future unit assignments for jobs/projects
- job_reservation_units: Link specific units to reservations

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Create the job_reservations and job_reservation_units tables"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if job_reservations table already exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
    if cursor.fetchone():
        print("Migration already applied - job_reservations table exists")
        conn.close()
        return

    print("Creating job_reservations table...")

    try:
        # Create job_reservations table
        cursor.execute("""
            CREATE TABLE job_reservations (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                project_id TEXT,
                start_date DATE NOT NULL,
                end_date DATE NOT NULL,
                assignment_type TEXT NOT NULL DEFAULT 'quantity',
                device_type TEXT DEFAULT 'seismograph',
                quantity_needed INTEGER,
                notes TEXT,
                color TEXT DEFAULT '#3B82F6',
                created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)
        print("  Created job_reservations table")

        # Create indexes for job_reservations
        cursor.execute("CREATE INDEX idx_job_reservations_project_id ON job_reservations(project_id)")
        print("  Created index on project_id")

        cursor.execute("CREATE INDEX idx_job_reservations_dates ON job_reservations(start_date, end_date)")
        print("  Created index on dates")

        # Create job_reservation_units table
        print("Creating job_reservation_units table...")
        cursor.execute("""
            CREATE TABLE job_reservation_units (
                id TEXT PRIMARY KEY,
                reservation_id TEXT NOT NULL,
                unit_id TEXT NOT NULL,
                assignment_source TEXT DEFAULT 'specific',
                assigned_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                FOREIGN KEY (reservation_id) REFERENCES job_reservations(id),
                FOREIGN KEY (unit_id) REFERENCES roster(id)
            )
        """)
        print("  Created job_reservation_units table")

        # Create indexes for job_reservation_units
        cursor.execute("CREATE INDEX idx_job_reservation_units_reservation_id ON job_reservation_units(reservation_id)")
        print("  Created index on reservation_id")

        cursor.execute("CREATE INDEX idx_job_reservation_units_unit_id ON job_reservation_units(unit_id)")
        print("  Created index on unit_id")

        conn.commit()
        print("\nMigration completed successfully!")
        print("You can now use the Fleet Calendar to manage unit reservations.")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
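As a quick sanity check (not part of the migration itself), the `job_reservations` DDL above can be exercised against an in-memory SQLite database to confirm the column defaults apply on a bare insert. The DDL is copied from the script; the sample row values are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE job_reservations (
    id TEXT PRIMARY KEY,
    name TEXT NOT NULL,
    project_id TEXT,
    start_date DATE NOT NULL,
    end_date DATE NOT NULL,
    assignment_type TEXT NOT NULL DEFAULT 'quantity',
    device_type TEXT DEFAULT 'seismograph',
    quantity_needed INTEGER,
    notes TEXT,
    color TEXT DEFAULT '#3B82F6',
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
""")

# Insert only the NOT NULL columns; everything else should take its default.
conn.execute(
    "INSERT INTO job_reservations (id, name, start_date, end_date) VALUES (?, ?, ?, ?)",
    ("r1", "Bridge survey", "2024-01-01", "2024-01-07"),
)
row = conn.execute(
    "SELECT assignment_type, device_type, color FROM job_reservations WHERE id = 'r1'"
).fetchone()
assert row == ("quantity", "seismograph", "#3B82F6")
```

The `CURRENT_TIMESTAMP` defaults likewise populate `created_at`/`updated_at` without the application supplying them.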
24  backend/migrate_add_location_slots.py  Normal file
@@ -0,0 +1,24 @@
"""
Migration: Add location_slots column to job_reservations table.
Stores the full ordered slot list (including empty/unassigned slots) as JSON.
Run once per database.
"""
import sqlite3
import os

DB_PATH = os.environ.get("DB_PATH", "/app/data/seismo_fleet.db")

def run():
    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()
    existing = [r[1] for r in cursor.execute("PRAGMA table_info(job_reservations)").fetchall()]
    if "location_slots" not in existing:
        cursor.execute("ALTER TABLE job_reservations ADD COLUMN location_slots TEXT")
        conn.commit()
        print("Added location_slots column to job_reservations.")
    else:
        print("location_slots column already exists, skipping.")
    conn.close()

if __name__ == "__main__":
    run()
73  backend/migrate_add_oneoff_schedule_fields.py  Normal file
@@ -0,0 +1,73 @@
"""
Migration: Add one-off schedule fields to recurring_schedules table

Adds start_datetime and end_datetime columns for one-off recording schedules.

Run this script once to update existing databases:
    python -m backend.migrate_add_oneoff_schedule_fields
"""

import sqlite3
import os

DB_PATH = "data/seismo_fleet.db"


def migrate():
    """Add one-off schedule columns to recurring_schedules table."""
    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        return False

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    try:
        cursor.execute("""
            SELECT name FROM sqlite_master
            WHERE type='table' AND name='recurring_schedules'
        """)
        if not cursor.fetchone():
            print("recurring_schedules table does not exist yet. Will be created on app startup.")
            conn.close()
            return True

        cursor.execute("PRAGMA table_info(recurring_schedules)")
        columns = [row[1] for row in cursor.fetchall()]

        added = False

        if "start_datetime" not in columns:
            print("Adding start_datetime column to recurring_schedules table...")
            cursor.execute("""
                ALTER TABLE recurring_schedules
                ADD COLUMN start_datetime DATETIME NULL
            """)
            added = True

        if "end_datetime" not in columns:
            print("Adding end_datetime column to recurring_schedules table...")
            cursor.execute("""
                ALTER TABLE recurring_schedules
                ADD COLUMN end_datetime DATETIME NULL
            """)
            added = True

        if added:
            conn.commit()
            print("Successfully added one-off schedule columns.")
        else:
            print("One-off schedule columns already exist.")

        conn.close()
        return True

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.close()
        return False


if __name__ == "__main__":
    success = migrate()
    exit(0 if success else 1)
54  backend/migrate_add_out_for_calibration.py  Normal file
@@ -0,0 +1,54 @@
"""
Database Migration: Add out_for_calibration field to roster table

Changes:
- Adds out_for_calibration BOOLEAN column (default FALSE) to roster table
- Safe to run multiple times (idempotent)
- No data loss

Usage:
    python backend/migrate_add_out_for_calibration.py
"""

import sys
import os

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

SQLALCHEMY_DATABASE_URL = "sqlite:///./data/seismo_fleet.db"
engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False})
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def migrate():
    db = SessionLocal()
    try:
        print("=" * 60)
        print("Migration: Add out_for_calibration to roster")
        print("=" * 60)

        # Check if column already exists
        result = db.execute(text("PRAGMA table_info(roster)")).fetchall()
        columns = [row[1] for row in result]

        if "out_for_calibration" in columns:
            print("Column out_for_calibration already exists. Skipping.")
        else:
            db.execute(text("ALTER TABLE roster ADD COLUMN out_for_calibration BOOLEAN DEFAULT FALSE"))
            db.commit()
            print("Added out_for_calibration column to roster table.")

        print("Migration complete.")
    except Exception as e:
        db.rollback()
        print(f"Error: {e}")
        raise
    finally:
        db.close()


if __name__ == "__main__":
    migrate()
53  backend/migrate_add_project_data_collection_mode.py  Normal file
@@ -0,0 +1,53 @@
#!/usr/bin/env python3
"""
Migration: Add data_collection_mode column to projects table.

Values:
    "remote" — units have modems; data pulled via FTP/scheduler automatically
    "manual" — no modem; SD cards retrieved daily and uploaded by hand

All existing projects are backfilled to "manual" (safe conservative default).

Run once inside the Docker container:
    docker exec terra-view python3 backend/migrate_add_project_data_collection_mode.py
"""
from pathlib import Path

DB_PATH = Path("data/seismo_fleet.db")


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # ── 1. Add column (idempotent) ───────────────────────────────────────────
    cur.execute("PRAGMA table_info(projects)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    if "data_collection_mode" not in existing_cols:
        cur.execute("ALTER TABLE projects ADD COLUMN data_collection_mode TEXT DEFAULT 'manual'")
        conn.commit()
        print("✓ Added column data_collection_mode to projects")
    else:
        print("○ Column data_collection_mode already exists — skipping ALTER TABLE")

    # ── 2. Backfill NULLs to 'manual' ────────────────────────────────────────
    cur.execute("UPDATE projects SET data_collection_mode = 'manual' WHERE data_collection_mode IS NULL")
    updated = cur.rowcount
    conn.commit()
    conn.close()

    if updated:
        print(f"✓ Backfilled {updated} project(s) to data_collection_mode='manual'.")
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
56  backend/migrate_add_project_deleted_at.py  Normal file
@@ -0,0 +1,56 @@
"""
Migration: Add deleted_at column to projects table

Adds columns:
- projects.deleted_at: Timestamp set when status='deleted'; data hard-deleted after 60 days
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='projects'")
        if not cursor.fetchone():
            print("projects table does not exist. Skipping migration.")
            return

        cursor.execute("PRAGMA table_info(projects)")
        existing_cols = {row[1] for row in cursor.fetchall()}

        if 'deleted_at' not in existing_cols:
            print("Adding deleted_at column to projects...")
            cursor.execute("ALTER TABLE projects ADD COLUMN deleted_at DATETIME")
        else:
            print("deleted_at column already exists. Skipping.")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    db_path = "./data/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
80  backend/migrate_add_project_number.py  Normal file
@@ -0,0 +1,80 @@
"""
Migration script to add project_number field to projects table.

This adds a new column for TMI internal project numbering:
- Format: xxxx-YY (e.g., "2567-23")
- xxxx = incremental project number
- YY = year project was started

Combined with client_name and name (project/site name), this enables
smart searching across all project identifiers.

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Add project_number column to projects table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if projects table exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='projects'")
    table_exists = cursor.fetchone()

    if not table_exists:
        print("Projects table does not exist yet - will be created when app runs")
        conn.close()
        return

    # Check if project_number column already exists
    cursor.execute("PRAGMA table_info(projects)")
    columns = [col[1] for col in cursor.fetchall()]

    if 'project_number' in columns:
        print("Migration already applied - project_number column exists")
        conn.close()
        return

    print("Adding project_number column to projects table...")

    try:
        cursor.execute("ALTER TABLE projects ADD COLUMN project_number TEXT")
        print("  Added project_number column")

        # Create index for faster searching
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_projects_project_number ON projects(project_number)")
        print("  Created index on project_number")

        # Also add index on client_name if it doesn't exist
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_projects_client_name ON projects(client_name)")
        print("  Created index on client_name")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
88  backend/migrate_add_report_templates.py  Normal file
@@ -0,0 +1,88 @@
"""
Migration script to add report_templates table.

This creates a new table for storing report generation configurations:
- Template name and project association
- Time filtering settings (start/end time)
- Date range filtering (optional)
- Report title defaults

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"

def migrate_database():
    """Create report_templates table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if report_templates table already exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='report_templates'")
    table_exists = cursor.fetchone()

    if table_exists:
        print("Migration already applied - report_templates table exists")
        conn.close()
        return

    print("Creating report_templates table...")

    try:
        cursor.execute("""
            CREATE TABLE report_templates (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                project_id TEXT,
                report_title TEXT DEFAULT 'Background Noise Study',
                start_time TEXT,
                end_time TEXT,
                start_date TEXT,
                end_date TEXT,
                created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)
        print("  ✓ Created report_templates table")

        # Insert default templates
        import uuid

        default_templates = [
            (str(uuid.uuid4()), "Nighttime (7PM-7AM)", None, "Background Noise Study", "19:00", "07:00", None, None),
            (str(uuid.uuid4()), "Daytime (7AM-7PM)", None, "Background Noise Study", "07:00", "19:00", None, None),
            (str(uuid.uuid4()), "Full Day (All Data)", None, "Background Noise Study", None, None, None, None),
        ]

        cursor.executemany("""
            INSERT INTO report_templates (id, name, project_id, report_title, start_time, end_time, start_date, end_date)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?)
        """, default_templates)
        print("  ✓ Inserted default templates (Nighttime, Daytime, Full Day)")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
127  backend/migrate_add_session_device_model.py  Normal file
@@ -0,0 +1,127 @@
#!/usr/bin/env python3
"""
Migration: Add device_model column to monitoring_sessions table.

Records which physical SLM model produced each session's data (e.g. "NL-43",
"NL-53", "NL-32"). Used by report generation to apply the correct parsing
logic without re-opening files to detect format.

Run once inside the Docker container:
    docker exec terra-view python3 backend/migrate_add_session_device_model.py

Backfill strategy for existing rows:
    1. If session.unit_id is set, use roster.slm_model for that unit.
    2. Else, peek at the first .rnd file in the session: presence of the 'LAeq'
       column header identifies AU2 / NL-32 format.
Sessions where neither hint is available remain NULL — the file-content
fallback in report code handles them transparently.
"""
import csv
import io
from pathlib import Path

DB_PATH = Path("data/seismo_fleet.db")


def _peek_first_row(abs_path: Path) -> dict:
    """Read only the header + first data row of an RND file. Very cheap."""
    try:
        with open(abs_path, "r", encoding="utf-8", errors="replace") as f:
            reader = csv.DictReader(f)
            return next(reader, None) or {}
    except Exception:
        return {}


def _detect_model_from_rnd(abs_path: Path) -> str | None:
    """Return 'NL-32' if file uses AU2 column format, else None."""
    row = _peek_first_row(abs_path)
    if "LAeq" in row:
        return "NL-32"
    return None


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # ── 1. Add column (idempotent) ───────────────────────────────────────────
    cur.execute("PRAGMA table_info(monitoring_sessions)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    if "device_model" not in existing_cols:
        cur.execute("ALTER TABLE monitoring_sessions ADD COLUMN device_model TEXT")
        conn.commit()
        print("✓ Added column device_model to monitoring_sessions")
    else:
        print("○ Column device_model already exists — skipping ALTER TABLE")

    # ── 2. Backfill existing NULL rows ───────────────────────────────────────
    cur.execute(
        "SELECT id, unit_id FROM monitoring_sessions WHERE device_model IS NULL"
    )
    sessions = cur.fetchall()
    print(f"Backfilling {len(sessions)} session(s) with device_model=NULL...")

    updated = skipped = 0
    for row in sessions:
        session_id = row["id"]
        unit_id = row["unit_id"]
        device_model = None

        # Strategy A: look up unit's slm_model from the roster
        if unit_id:
            cur.execute(
                "SELECT slm_model FROM roster WHERE id = ?", (unit_id,)
            )
            unit_row = cur.fetchone()
            if unit_row and unit_row["slm_model"]:
                device_model = unit_row["slm_model"]

        # Strategy B: detect from first .rnd file in the session
        if device_model is None:
            cur.execute(
                """SELECT file_path FROM data_files
                   WHERE session_id = ?
                     AND lower(file_path) LIKE '%.rnd'
                   LIMIT 1""",
                (session_id,),
            )
            file_row = cur.fetchone()
            if file_row:
                abs_path = Path("data") / file_row["file_path"]
                device_model = _detect_model_from_rnd(abs_path)
                # None here means NL-43/NL-53 format (or unreadable file) —
                # leave as NULL so the existing fallback applies.

        if device_model:
            cur.execute(
                "UPDATE monitoring_sessions SET device_model = ? WHERE id = ?",
                (device_model, session_id),
            )
            updated += 1
        else:
            skipped += 1

    conn.commit()
    conn.close()

    print(f"✓ Backfilled {updated} session(s) with a device_model.")
    if skipped:
        print(
            f"  {skipped} session(s) left as NULL "
            "(no unit link and no AU2 file hint — NL-43/NL-53 or unknown; "
            "file-content detection applies at report time)."
        )
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
42  backend/migrate_add_session_period_hours.py  Normal file
@@ -0,0 +1,42 @@
"""
Migration: add period_start_hour and period_end_hour to monitoring_sessions.

Run once:
    python backend/migrate_add_session_period_hours.py

Or inside the container:
    docker exec terra-view python3 backend/migrate_add_session_period_hours.py
"""

import sys
import os
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from backend.database import engine
from sqlalchemy import text

def run():
    with engine.connect() as conn:
        # Check which columns already exist
        result = conn.execute(text("PRAGMA table_info(monitoring_sessions)"))
        existing = {row[1] for row in result}

        added = []
        for col, definition in [
            ("period_start_hour", "INTEGER"),
            ("period_end_hour", "INTEGER"),
        ]:
            if col not in existing:
                conn.execute(text(f"ALTER TABLE monitoring_sessions ADD COLUMN {col} {definition}"))
                added.append(col)
            else:
                print(f"  Column '{col}' already exists — skipping.")

        conn.commit()

    if added:
        print(f"  Added columns: {', '.join(added)}")
    print("Migration complete.")

if __name__ == "__main__":
    run()
131
backend/migrate_add_session_period_type.py
Normal file
@@ -0,0 +1,131 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
"""
|
||||||
|
Migration: Add session_label and period_type columns to monitoring_sessions.
|
||||||
|
|
||||||
|
session_label - user-editable display name, e.g. "NRL-1 Sun 2/23 Night"
|
||||||
|
period_type - one of: weekday_day | weekday_night | weekend_day | weekend_night
|
||||||
|
Auto-derived from started_at when NULL.
|
||||||
|
|
||||||
|
Period definitions (used in report stats table):
|
||||||
|
weekday_day Mon-Fri 07:00-22:00 -> Daytime (7AM-10PM)
|
||||||
|
weekday_night Mon-Fri 22:00-07:00 -> Nighttime (10PM-7AM)
|
||||||
|
weekend_day Sat-Sun 07:00-22:00 -> Daytime (7AM-10PM)
|
||||||
|
weekend_night Sat-Sun 22:00-07:00 -> Nighttime (10PM-7AM)
|
||||||
|
|
||||||
|
Run once inside the Docker container:
|
||||||
|
docker exec terra-view python3 backend/migrate_add_session_period_type.py
|
||||||
|
"""
|
||||||
|
from pathlib import Path
|
||||||
|
from datetime import datetime
|
||||||
|
|
||||||
|
DB_PATH = Path("data/seismo_fleet.db")
|
||||||
|
|
||||||
|
|
||||||
|
def _derive_period_type(started_at_str: str) -> str | None:
|
||||||
|
"""Derive period_type from a started_at ISO datetime string."""
|
||||||
|
if not started_at_str:
|
||||||
|
return None
|
||||||
|
try:
|
||||||
|
dt = datetime.fromisoformat(started_at_str)
|
||||||
|
except ValueError:
|
||||||
|
return None
|
||||||
|
is_weekend = dt.weekday() >= 5 # 5=Sat, 6=Sun
|
||||||
|
is_night = dt.hour >= 22 or dt.hour < 7
|
||||||
|
if is_weekend:
|
||||||
|
return "weekend_night" if is_night else "weekend_day"
|
||||||
|
else:
|
||||||
|
return "weekday_night" if is_night else "weekday_day"
|
||||||
|
|
||||||
|
|
def _build_label(started_at_str: str, location_name: str | None, period_type: str | None) -> str | None:
    """Build a human-readable session label."""
    if not started_at_str:
        return None
    try:
        dt = datetime.fromisoformat(started_at_str)
    except ValueError:
        return None

    day_abbr = dt.strftime("%a")  # Mon, Tue, Sun, etc.
    date_str = dt.strftime("%-m/%-d")  # 2/23 (POSIX-only strftime code; use %#m/%#d on Windows)

    period_labels = {
        "weekday_day": "Day",
        "weekday_night": "Night",
        "weekend_day": "Day",
        "weekend_night": "Night",
    }
    period_str = period_labels.get(period_type or "", "")

    parts = []
    if location_name:
        parts.append(location_name)
    parts.append(f"{day_abbr} {date_str}")
    if period_str:
        parts.append(period_str)
    return " — ".join(parts)


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # 1. Add columns (idempotent)
    cur.execute("PRAGMA table_info(monitoring_sessions)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    for col, typedef in [("session_label", "TEXT"), ("period_type", "TEXT")]:
        if col not in existing_cols:
            cur.execute(f"ALTER TABLE monitoring_sessions ADD COLUMN {col} {typedef}")
            conn.commit()
            print(f"✓ Added column {col} to monitoring_sessions")
        else:
            print(f"○ Column {col} already exists — skipping ALTER TABLE")

    # 2. Backfill existing rows
    cur.execute(
        """SELECT ms.id, ms.started_at, ms.location_id
           FROM monitoring_sessions ms
           WHERE ms.period_type IS NULL OR ms.session_label IS NULL"""
    )
    sessions = cur.fetchall()
    print(f"Backfilling {len(sessions)} session(s)...")

    updated = 0
    for row in sessions:
        session_id = row["id"]
        started_at = row["started_at"]
        location_id = row["location_id"]

        # Look up location name
        location_name = None
        if location_id:
            cur.execute("SELECT name FROM monitoring_locations WHERE id = ?", (location_id,))
            loc_row = cur.fetchone()
            if loc_row:
                location_name = loc_row["name"]

        period_type = _derive_period_type(started_at)
        label = _build_label(started_at, location_name, period_type)

        cur.execute(
            "UPDATE monitoring_sessions SET period_type = ?, session_label = ? WHERE id = ?",
            (period_type, label, session_id),
        )
        updated += 1

    conn.commit()
    conn.close()
    print(f"✓ Backfilled {updated} session(s).")
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
backend/migrate_add_session_report_date.py (new file, 41 lines)
@@ -0,0 +1,41 @@
"""
Migration: add report_date to monitoring_sessions.

Run once:
    python backend/migrate_add_session_report_date.py

Or inside the container:
    docker exec terra-view-terra-view-1 python3 backend/migrate_add_session_report_date.py
"""

import sys
import os

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from backend.database import engine
from sqlalchemy import text


def run():
    with engine.connect() as conn:
        # Check which columns already exist
        result = conn.execute(text("PRAGMA table_info(monitoring_sessions)"))
        existing = {row[1] for row in result}

        added = []
        for col, definition in [
            ("report_date", "DATE"),
        ]:
            if col not in existing:
                conn.execute(text(f"ALTER TABLE monitoring_sessions ADD COLUMN {col} {definition}"))
                added.append(col)
            else:
                print(f"  Column '{col}' already exists — skipping.")

        conn.commit()

    if added:
        print(f"  Added columns: {', '.join(added)}")
    print("Migration complete.")


if __name__ == "__main__":
    run()
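The `PRAGMA table_info` guard used here is what makes the script safe to re-run. A small self-contained sketch (plain `sqlite3`, scratch in-memory database, helper name is ours) shows the idempotent add-column pattern:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE monitoring_sessions (id TEXT PRIMARY KEY)")

def add_column_if_missing(cur, table: str, col: str, definition: str) -> bool:
    # PRAGMA table_info yields one row per column; index 1 is the column name.
    cur.execute(f"PRAGMA table_info({table})")
    existing = {row[1] for row in cur.fetchall()}
    if col in existing:
        return False  # already applied — safe to re-run
    cur.execute(f"ALTER TABLE {table} ADD COLUMN {col} {definition}")
    return True

first = add_column_if_missing(cur, "monitoring_sessions", "report_date", "DATE")
second = add_column_if_missing(cur, "monitoring_sessions", "report_date", "DATE")
```

Running it twice adds the column once and skips the second time, which is exactly the behavior the migration relies on.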
backend/migrate_add_tbd_dates.py (new file, 89 lines)
@@ -0,0 +1,89 @@
"""
Migration: Add TBD date support to job reservations

Adds columns:
- job_reservations.estimated_end_date: For planning when end is TBD
- job_reservations.end_date_tbd: Boolean flag for TBD end dates
- job_reservation_units.unit_start_date: Unit-specific start (for swaps)
- job_reservation_units.unit_end_date: Unit-specific end (for swaps)
- job_reservation_units.unit_end_tbd: Unit-specific TBD flag
- job_reservation_units.notes: Notes for the assignment

Making job_reservations.end_date nullable requires a table rebuild and is
handled separately by migrate_fix_end_date_nullable.py.
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Check if job_reservations table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
        if not cursor.fetchone():
            print("job_reservations table does not exist. Skipping migration.")
            return

        # Get existing columns in job_reservations
        cursor.execute("PRAGMA table_info(job_reservations)")
        existing_cols = {row[1] for row in cursor.fetchall()}

        # Add new columns to job_reservations if they don't exist
        if 'estimated_end_date' not in existing_cols:
            print("Adding estimated_end_date column to job_reservations...")
            cursor.execute("ALTER TABLE job_reservations ADD COLUMN estimated_end_date DATE")

        if 'end_date_tbd' not in existing_cols:
            print("Adding end_date_tbd column to job_reservations...")
            cursor.execute("ALTER TABLE job_reservations ADD COLUMN end_date_tbd BOOLEAN DEFAULT 0")

        # Get existing columns in job_reservation_units
        cursor.execute("PRAGMA table_info(job_reservation_units)")
        unit_cols = {row[1] for row in cursor.fetchall()}

        # Add new columns to job_reservation_units if they don't exist
        if 'unit_start_date' not in unit_cols:
            print("Adding unit_start_date column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_start_date DATE")

        if 'unit_end_date' not in unit_cols:
            print("Adding unit_end_date column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_end_date DATE")

        if 'unit_end_tbd' not in unit_cols:
            print("Adding unit_end_tbd column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_end_tbd BOOLEAN DEFAULT 0")

        if 'notes' not in unit_cols:
            print("Adding notes column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN notes TEXT")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    # Default to dev database
    db_path = "./data-dev/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
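One detail behind `BOOLEAN DEFAULT 0` above: SQLite has no native boolean type, so these flags are stored and read back as plain integers. A quick in-memory check (illustrative table, not the real schema) makes that concrete:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE job_reservations (id TEXT PRIMARY KEY)")
# BOOLEAN is only a type-affinity hint in SQLite; values land as INTEGER 0/1.
cur.execute("ALTER TABLE job_reservations ADD COLUMN end_date_tbd BOOLEAN DEFAULT 0")
cur.execute("INSERT INTO job_reservations (id) VALUES ('r1')")          # takes DEFAULT 0
cur.execute("INSERT INTO job_reservations (id, end_date_tbd) VALUES ('r2', 1)")
rows = dict(cur.execute("SELECT id, end_date_tbd FROM job_reservations"))
```

Application code (and SQLAlchemy's `Boolean` type) should therefore expect `0`/`1`, not `True`/`False`, at the raw SQL level.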
backend/migrate_fix_end_date_nullable.py (new file, 105 lines)
@@ -0,0 +1,105 @@
"""
Migration: Make job_reservations.end_date nullable for TBD support

SQLite doesn't support ALTER COLUMN, so we need to:
1. Create a new table with the correct schema
2. Copy data
3. Drop old table
4. Rename new table
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Check if job_reservations table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
        if not cursor.fetchone():
            print("job_reservations table does not exist. Skipping migration.")
            return

        # Check current schema
        cursor.execute("PRAGMA table_info(job_reservations)")
        columns = cursor.fetchall()
        col_info = {row[1]: row for row in columns}

        # Check if end_date is already nullable (notnull=0)
        if 'end_date' in col_info and col_info['end_date'][3] == 0:
            print("end_date is already nullable. Skipping table recreation.")
            return

        print("Recreating job_reservations table with nullable end_date...")

        # Create new table with correct schema
        cursor.execute("""
            CREATE TABLE job_reservations_new (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                project_id TEXT,
                start_date DATE NOT NULL,
                end_date DATE,
                estimated_end_date DATE,
                end_date_tbd BOOLEAN DEFAULT 0,
                assignment_type TEXT NOT NULL DEFAULT 'quantity',
                device_type TEXT DEFAULT 'seismograph',
                quantity_needed INTEGER,
                notes TEXT,
                color TEXT DEFAULT '#3B82F6',
                created_at DATETIME,
                updated_at DATETIME
            )
        """)

        # Copy existing data (assumes migrate_add_tbd_dates.py already added
        # estimated_end_date and end_date_tbd to the old table)
        cursor.execute("""
            INSERT INTO job_reservations_new
            SELECT
                id, name, project_id, start_date, end_date,
                estimated_end_date,
                COALESCE(end_date_tbd, 0) as end_date_tbd,
                assignment_type, device_type, quantity_needed, notes, color,
                created_at, updated_at
            FROM job_reservations
        """)

        # Drop old table
        cursor.execute("DROP TABLE job_reservations")

        # Rename new table
        cursor.execute("ALTER TABLE job_reservations_new RENAME TO job_reservations")

        # Recreate indexes
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_job_reservations_id ON job_reservations (id)")
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_job_reservations_project_id ON job_reservations (project_id)")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    # Default to dev database
    db_path = "./data-dev/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
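The whole rebuild hinges on the `notnull` flag at index 3 of each `PRAGMA table_info` row. A condensed in-memory sketch of the same check-then-rebuild sequence (toy two-column table, helper name is ours):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE job_reservations (id TEXT PRIMARY KEY, end_date DATE NOT NULL)")

def end_date_nullable(cur) -> bool:
    # PRAGMA table_info row layout: (cid, name, type, notnull, dflt_value, pk)
    cur.execute("PRAGMA table_info(job_reservations)")
    info = {row[1]: row for row in cur.fetchall()}
    return info["end_date"][3] == 0

before = end_date_nullable(cur)  # NOT NULL, so False: rebuild needed
# The rebuild itself: new table, copy, drop, rename — same four steps as above.
cur.execute("CREATE TABLE jr_new (id TEXT PRIMARY KEY, end_date DATE)")
cur.execute("INSERT INTO jr_new SELECT id, end_date FROM job_reservations")
cur.execute("DROP TABLE job_reservations")
cur.execute("ALTER TABLE jr_new RENAME TO job_reservations")
after = end_date_nullable(cur)   # now nullable, so True: re-runs will skip
```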
backend/migrate_rename_recording_to_monitoring_sessions.py (new file, 54 lines)
@@ -0,0 +1,54 @@
"""
Migration: Rename recording_sessions table to monitoring_sessions

Renames the table to match the model rename from RecordingSession to MonitoringSession.
Run once per database: python backend/migrate_rename_recording_to_monitoring_sessions.py
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='recording_sessions'")
        if not cursor.fetchone():
            cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='monitoring_sessions'")
            if cursor.fetchone():
                print("monitoring_sessions table already exists. Skipping migration.")
            else:
                print("recording_sessions table does not exist. Skipping migration.")
            return

        print("Renaming recording_sessions -> monitoring_sessions...")
        cursor.execute("ALTER TABLE recording_sessions RENAME TO monitoring_sessions")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    db_path = "./data/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
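The existence-guarded rename pattern above can be verified against a scratch database; a minimal sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE recording_sessions (id TEXT PRIMARY KEY)")

def table_exists(cur, name: str) -> bool:
    # sqlite_master is the catalog of all schema objects in the database.
    cur.execute("SELECT name FROM sqlite_master WHERE type='table' AND name=?", (name,))
    return cur.fetchone() is not None

# Guard then rename, mirroring the migration's happy path.
if table_exists(cur, "recording_sessions"):
    cur.execute("ALTER TABLE recording_sessions RENAME TO monitoring_sessions")

renamed = table_exists(cur, "monitoring_sessions") and not table_exists(cur, "recording_sessions")
```

On a second run the guard fails and the ALTER is skipped, which is what makes the migration idempotent.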
@@ -32,6 +32,9 @@ class RosterUnit(Base):
     device_type = Column(String, default="seismograph")  # "seismograph" | "modem" | "slm"
     deployed = Column(Boolean, default=True)
     retired = Column(Boolean, default=False)
+    out_for_calibration = Column(Boolean, default=False)
+    allocated = Column(Boolean, default=False)  # Staged for an upcoming job, not yet deployed
+    allocated_to_project_id = Column(String, nullable=True)  # Which project it's allocated to
     note = Column(String, nullable=True)
     project_id = Column(String, nullable=True)
     location = Column(String, nullable=True)  # Legacy field - use address/coordinates instead
@@ -50,6 +53,8 @@
     ip_address = Column(String, nullable=True)
     phone_number = Column(String, nullable=True)
     hardware_model = Column(String, nullable=True)
+    deployment_type = Column(String, nullable=True)  # "seismograph" | "slm" - what type of device this modem is deployed with
+    deployed_with_unit_id = Column(String, nullable=True)  # ID of seismograph/SLM this modem is deployed with
 
     # Sound Level Meter-specific fields (nullable for seismographs and modems)
     slm_host = Column(String, nullable=True)  # Device IP or hostname
@@ -63,6 +68,26 @@
     slm_last_check = Column(DateTime, nullable=True)  # Last communication check
 
 
+class WatcherAgent(Base):
+    """
+    Watcher agents: tracks the watcher processes (series3-watcher, thor-watcher)
+    that run on field machines and report unit heartbeats.
+
+    Updated on every heartbeat received from each source_id.
+    """
+    __tablename__ = "watcher_agents"
+
+    id = Column(String, primary_key=True, index=True)  # source_id (hostname)
+    source_type = Column(String, nullable=False)  # series3_watcher | series4_watcher
+    version = Column(String, nullable=True)  # e.g. "1.4.0"
+    last_seen = Column(DateTime, default=datetime.utcnow)
+    status = Column(String, nullable=False, default="unknown")  # ok | pending | missing | error | unknown
+    ip_address = Column(String, nullable=True)
+    log_tail = Column(Text, nullable=True)  # last N log lines (JSON array of strings)
+    update_pending = Column(Boolean, default=False)  # set True to trigger remote update
+    update_version = Column(String, nullable=True)  # target version to update to
+
+
 class IgnoredUnit(Base):
     """
     Ignored units: units that report but should be filtered out from unknown emitters.
@@ -137,17 +162,31 @@
     """
     Projects: top-level organization for monitoring work.
     Type-aware to enable/disable features based on project_type_id.
+
+    Project naming convention:
+    - project_number: TMI internal ID format xxxx-YY (e.g., "2567-23")
+    - client_name: Client/contractor name (e.g., "PJ Dick")
+    - name: Project/site name (e.g., "RKM Hall", "CMU Campus")
+
+    Display format: "2567-23 - PJ Dick - RKM Hall"
+    Users can search by any of these fields.
     """
     __tablename__ = "projects"
 
     id = Column(String, primary_key=True, index=True)  # UUID
-    name = Column(String, nullable=False, unique=True)
+    project_number = Column(String, nullable=True, index=True)  # TMI ID: xxxx-YY format (e.g., "2567-23")
+    name = Column(String, nullable=False, unique=True)  # Project/site name (e.g., "RKM Hall")
     description = Column(Text, nullable=True)
     project_type_id = Column(String, nullable=False)  # FK to ProjectType.id
-    status = Column(String, default="active")  # active, completed, archived
+    status = Column(String, default="active")  # active, on_hold, completed, archived, deleted
+
+    # Data collection mode: how field data reaches Terra-View.
+    # "remote" — units have modems; data pulled via FTP/scheduler automatically
+    # "manual" — no modem; SD cards retrieved daily and uploaded by hand
+    data_collection_mode = Column(String, default="manual")  # remote | manual
 
     # Project metadata
-    client_name = Column(String, nullable=True)
+    client_name = Column(String, nullable=True, index=True)  # Client name (e.g., "PJ Dick")
     site_address = Column(String, nullable=True)
     site_coordinates = Column(String, nullable=True)  # "lat,lon"
     start_date = Column(Date, nullable=True)
@@ -155,6 +194,7 @@
 
     created_at = Column(DateTime, default=datetime.utcnow)
     updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+    deleted_at = Column(DateTime, nullable=True)  # Set when status='deleted'; hard delete scheduled after 60 days
 
 
 class MonitoringLocation(Base):
@@ -218,7 +258,7 @@ class ScheduledAction(Base):
     location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
     unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (nullable if location-based)
 
-    action_type = Column(String, nullable=False)  # start, stop, download, calibrate
+    action_type = Column(String, nullable=False)  # start, stop, download, cycle, calibrate
     device_type = Column(String, nullable=False)  # "slm" | "seismograph"
 
     scheduled_time = Column(DateTime, nullable=False, index=True)
@@ -233,17 +273,21 @@
     created_at = Column(DateTime, default=datetime.utcnow)
 
 
-class RecordingSession(Base):
+class MonitoringSession(Base):
     """
-    Recording sessions: tracks actual monitoring sessions.
-    Created when recording starts, updated when it stops.
+    Monitoring sessions: tracks actual monitoring sessions.
+    Created when monitoring starts, updated when it stops.
     """
-    __tablename__ = "recording_sessions"
+    __tablename__ = "monitoring_sessions"
 
     id = Column(String, primary_key=True, index=True)  # UUID
     project_id = Column(String, nullable=False, index=True)  # FK to Project.id
     location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
-    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id
+    unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (nullable for offline uploads)
+
+    # Physical device model that produced this session's data (e.g. "NL-43", "NL-53", "NL-32").
+    # Null for older records; report code falls back to file-content detection when null.
+    device_model = Column(String, nullable=True)
 
     session_type = Column(String, nullable=False)  # sound | vibration
     started_at = Column(DateTime, nullable=False)
@@ -251,6 +295,25 @@
     duration_seconds = Column(Integer, nullable=True)
     status = Column(String, default="recording")  # recording, completed, failed
 
+    # Human-readable label auto-derived from date/location, editable by user.
+    # e.g. "NRL-1 — Sun 2/23 — Night"
+    session_label = Column(String, nullable=True)
+
+    # Period classification for report stats columns.
+    # weekday_day | weekday_night | weekend_day | weekend_night
+    period_type = Column(String, nullable=True)
+
+    # Effective monitoring window (hours 0–23). Night sessions cross midnight
+    # (period_end_hour < period_start_hour). NULL = no filtering applied.
+    # e.g. Day: start=7, end=19  Night: start=19, end=7
+    period_start_hour = Column(Integer, nullable=True)
+    period_end_hour = Column(Integer, nullable=True)
+
+    # For day sessions: the specific calendar date to use for report filtering.
+    # Overrides the automatic "last date with daytime rows" heuristic.
+    # Null = use heuristic.
+    report_date = Column(Date, nullable=True)
+
     # Snapshot of device configuration at recording time
     session_metadata = Column(Text, nullable=True)  # JSON
 
@@ -266,7 +329,7 @@ class DataFile(Base):
     __tablename__ = "data_files"
 
     id = Column(String, primary_key=True, index=True)  # UUID
-    session_id = Column(String, nullable=False, index=True)  # FK to RecordingSession.id
+    session_id = Column(String, nullable=False, index=True)  # FK to MonitoringSession.id
 
     file_path = Column(String, nullable=False)  # Relative to data/Projects/
     file_type = Column(String, nullable=False)  # wav, csv, mseed, json
@@ -278,3 +341,237 @@
     file_metadata = Column(Text, nullable=True)  # JSON
 
     created_at = Column(DateTime, default=datetime.utcnow)
+
+
+class ReportTemplate(Base):
+    """
+    Report templates: saved configurations for generating Excel reports.
+    Allows users to save time filter presets, titles, etc. for reuse.
+    """
+    __tablename__ = "report_templates"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+    name = Column(String, nullable=False)  # "Nighttime Report", "Full Day Report"
+    project_id = Column(String, nullable=True)  # Optional: project-specific template
+
+    # Template settings
+    report_title = Column(String, default="Background Noise Study")
+    start_time = Column(String, nullable=True)  # "19:00" format
+    end_time = Column(String, nullable=True)  # "07:00" format
+    start_date = Column(String, nullable=True)  # "2025-01-15" format (optional)
+    end_date = Column(String, nullable=True)  # "2025-01-20" format (optional)
+
+    created_at = Column(DateTime, default=datetime.utcnow)
+    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+# ============================================================================
+# Sound Monitoring Scheduler
+# ============================================================================
+
+class RecurringSchedule(Base):
+    """
+    Recurring schedule definitions for automated sound monitoring.
+
+    Supports three schedule types:
+    - "weekly_calendar": Select specific days with start/end times (e.g., Mon/Wed/Fri 7pm-7am)
+    - "simple_interval": For 24/7 monitoring with daily stop/download/restart cycles
+    - "one_off": Single recording session with specific start and end date/time
+    """
+    __tablename__ = "recurring_schedules"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+    project_id = Column(String, nullable=False, index=True)  # FK to Project.id
+    location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
+    unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (optional, can use assignment)
+
+    name = Column(String, nullable=False)  # "Weeknight Monitoring", "24/7 Continuous"
+    schedule_type = Column(String, nullable=False)  # "weekly_calendar" | "simple_interval" | "one_off"
+    device_type = Column(String, nullable=False)  # "slm" | "seismograph"
+
+    # Weekly Calendar fields (schedule_type = "weekly_calendar")
+    # JSON format: {
+    #   "monday": {"enabled": true, "start": "19:00", "end": "07:00"},
+    #   "tuesday": {"enabled": false},
+    #   ...
+    # }
+    weekly_pattern = Column(Text, nullable=True)
+
+    # Simple Interval fields (schedule_type = "simple_interval")
+    interval_type = Column(String, nullable=True)  # "daily" | "hourly"
+    cycle_time = Column(String, nullable=True)  # "00:00" - time to run stop/download/restart
+    include_download = Column(Boolean, default=True)  # Download data before restart
+
+    # One-Off fields (schedule_type = "one_off")
+    start_datetime = Column(DateTime, nullable=True)  # Exact start date+time (stored as UTC)
+    end_datetime = Column(DateTime, nullable=True)  # Exact end date+time (stored as UTC)
+
+    # Automation options (applies to all schedule types)
+    auto_increment_index = Column(Boolean, default=True)  # Auto-increment store/index number before start
+    # When True: prevents "overwrite data?" prompts by using a new index each time
+
+    # Shared configuration
+    enabled = Column(Boolean, default=True)
+    timezone = Column(String, default="America/New_York")
+
+    # Tracking
+    last_generated_at = Column(DateTime, nullable=True)  # When actions were last generated
+    next_occurrence = Column(DateTime, nullable=True)  # Computed next action time
+
+    created_at = Column(DateTime, default=datetime.utcnow)
+    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+class Alert(Base):
+    """
+    In-app alerts for device status changes and system events.
+
+    Designed for future expansion to email/webhook notifications.
+    Currently supports:
+    - device_offline: Device became unreachable
+    - device_online: Device came back online
+    - schedule_failed: Scheduled action failed to execute
+    - schedule_completed: Scheduled action completed successfully
+    """
+    __tablename__ = "alerts"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+
+    # Alert classification
+    alert_type = Column(String, nullable=False)  # "device_offline" | "device_online" | "schedule_failed" | "schedule_completed"
+    severity = Column(String, default="warning")  # "info" | "warning" | "critical"
+
+    # Related entities (nullable - may not all apply)
+    project_id = Column(String, nullable=True, index=True)
+    location_id = Column(String, nullable=True, index=True)
+    unit_id = Column(String, nullable=True, index=True)
+    schedule_id = Column(String, nullable=True)  # RecurringSchedule or ScheduledAction id
+
+    # Alert content
+    title = Column(String, nullable=False)  # "NRL-001 Device Offline"
+    message = Column(Text, nullable=True)  # Detailed description
+    alert_metadata = Column(Text, nullable=True)  # JSON: additional context data
+
+    # Status tracking
+    status = Column(String, default="active")  # "active" | "acknowledged" | "resolved" | "dismissed"
+    acknowledged_at = Column(DateTime, nullable=True)
+    resolved_at = Column(DateTime, nullable=True)
+
+    created_at = Column(DateTime, default=datetime.utcnow)
+    expires_at = Column(DateTime, nullable=True)  # Auto-dismiss after this time
+
+
+# ============================================================================
+# Deployment Records
+# ============================================================================
+
+class DeploymentRecord(Base):
+    """
+    Deployment records: tracks each time a unit is sent to the field and returned.
+
+    Each row represents one deployment. The active deployment is the record
+    with actual_removal_date IS NULL. The fleet calendar uses this to show
+    units as "In Field" and surface their expected return date.
+
+    project_ref is a freeform string for legacy/vibration jobs like "Fay I-80".
+    project_id will be populated once those jobs are migrated to proper Project records.
+    """
+    __tablename__ = "deployment_records"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id
+
+    deployed_date = Column(Date, nullable=True)  # When unit left the yard
+    estimated_removal_date = Column(Date, nullable=True)  # Expected return date
+    actual_removal_date = Column(Date, nullable=True)  # Filled in when returned; NULL = still out
+
+    # Project linkage: freeform for legacy jobs, FK for proper project records
+    project_ref = Column(String, nullable=True)  # e.g. "Fay I-80" (vibration jobs)
+    project_id = Column(String, nullable=True, index=True)  # FK to Project.id (when available)
+
+    location_name = Column(String, nullable=True)  # e.g. "North Gate", "VP-001"
+    notes = Column(Text, nullable=True)
+
+    created_at = Column(DateTime, default=datetime.utcnow)
+    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+# ============================================================================
+# Fleet Calendar & Job Reservations
+# ============================================================================
+
+class JobReservation(Base):
+    """
+    Job reservations: reserve units for future jobs/projects.
+
+    Supports two assignment modes:
+    - "specific": Pick exact units (SN-001, SN-002, etc.)
+    - "quantity": Reserve a number of units (e.g., "need 8 seismographs")
+
+    Used by the Fleet Calendar to visualize unit availability over time.
+    """
+    __tablename__ = "job_reservations"
+
+    id = Column(String, primary_key=True, index=True)  # UUID
+    name = Column(String, nullable=False)  # "Job A - March deployment"
|
||||||
|
project_id = Column(String, nullable=True, index=True) # Optional FK to Project
|
||||||
|
|
||||||
|
# Date range for the reservation
|
||||||
|
start_date = Column(Date, nullable=False)
|
||||||
|
end_date = Column(Date, nullable=True) # Nullable = TBD / ongoing
|
||||||
|
estimated_end_date = Column(Date, nullable=True) # For planning when end is TBD
|
||||||
|
end_date_tbd = Column(Boolean, default=False) # True = end date unknown
|
||||||
|
|
||||||
|
# Assignment type: "specific" or "quantity"
|
||||||
|
assignment_type = Column(String, nullable=False, default="quantity")
|
||||||
|
|
||||||
|
# For quantity reservations
|
||||||
|
device_type = Column(String, default="seismograph") # seismograph | slm
|
||||||
|
quantity_needed = Column(Integer, nullable=True) # e.g., 8 units
|
||||||
|
estimated_units = Column(Integer, nullable=True)
|
||||||
|
|
||||||
|
# Full slot list as JSON: [{"location_name": "North Gate", "unit_id": null}, ...]
|
||||||
|
# Includes empty slots (no unit assigned yet). Filled slots are authoritative in JobReservationUnit.
|
||||||
|
location_slots = Column(Text, nullable=True)
|
||||||
|
|
||||||
|
# Metadata
|
||||||
|
notes = Column(Text, nullable=True)
|
||||||
|
color = Column(String, default="#3B82F6") # For calendar display (blue default)
|
||||||
|
|
||||||
|
created_at = Column(DateTime, default=datetime.utcnow)
|
||||||
|
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||||
|
|
||||||
|
|
||||||
|
class JobReservationUnit(Base):
|
||||||
|
"""
|
||||||
|
Links specific units to job reservations.
|
||||||
|
|
||||||
|
Used when:
|
||||||
|
- assignment_type="specific": Units are directly assigned
|
||||||
|
- assignment_type="quantity": Units can be filled in later
|
||||||
|
|
||||||
|
Supports unit swaps: same reservation can have multiple units with
|
||||||
|
different date ranges (e.g., BE17353 Feb-Jun, then BE18438 Jun-Nov).
|
||||||
|
"""
|
||||||
|
__tablename__ = "job_reservation_units"
|
||||||
|
|
||||||
|
id = Column(String, primary_key=True, index=True) # UUID
|
||||||
|
reservation_id = Column(String, nullable=False, index=True) # FK to JobReservation
|
||||||
|
unit_id = Column(String, nullable=False, index=True) # FK to RosterUnit
|
||||||
|
|
||||||
|
# Unit-specific date range (for swaps) - defaults to reservation dates if null
|
||||||
|
unit_start_date = Column(Date, nullable=True) # When this specific unit starts
|
||||||
|
unit_end_date = Column(Date, nullable=True) # When this unit ends (swap out date)
|
||||||
|
unit_end_tbd = Column(Boolean, default=False) # True = end unknown (until cal expires or job ends)
|
||||||
|
|
||||||
|
# Track how this assignment was made
|
||||||
|
assignment_source = Column(String, default="specific") # "specific" | "filled" | "swap"
|
||||||
|
assigned_at = Column(DateTime, default=datetime.utcnow)
|
||||||
|
notes = Column(Text, nullable=True) # "Replacing BE17353" etc.
|
||||||
|
|
||||||
|
# Power requirements for this deployment slot
|
||||||
|
power_type = Column(String, nullable=True) # "ac" | "solar" | None
|
||||||
|
|
||||||
|
# Location identity
|
||||||
|
location_name = Column(String, nullable=True) # e.g. "North Gate", "Main Entrance"
|
||||||
|
slot_index = Column(Integer, nullable=True) # Order within reservation (0-based)
|
||||||
|
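The "defaults to reservation dates if null" rule on unit_start_date/unit_end_date can be made concrete with a tiny helper. This is a sketch, not code from the diff; `Reservation` and `ReservationUnit` below are hypothetical stand-ins for the ORM rows:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional, Tuple

# Hypothetical stand-ins for JobReservation / JobReservationUnit rows.
@dataclass
class Reservation:
    start_date: date
    end_date: Optional[date]

@dataclass
class ReservationUnit:
    unit_start_date: Optional[date]
    unit_end_date: Optional[date]

def effective_dates(res: Reservation, ru: ReservationUnit) -> Tuple[date, Optional[date]]:
    """Unit-level dates fall back to the reservation's dates when unset.

    The end may stay None when the reservation's end date is TBD.
    """
    start = ru.unit_start_date or res.start_date
    end = ru.unit_end_date or res.end_date
    return start, end
```

This is what lets a swap pair like "BE17353 Feb-Jun, then BE18438 Jun-Nov" coexist under one reservation: each JobReservationUnit row carries its own window, and rows without one inherit the reservation's.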
backend/routers/alerts.py (new file, 326 lines)
@@ -0,0 +1,326 @@
"""
|
||||||
|
Alerts Router
|
||||||
|
|
||||||
|
API endpoints for managing in-app alerts.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from fastapi import APIRouter, Request, Depends, HTTPException, Query
|
||||||
|
from fastapi.responses import HTMLResponse, JSONResponse
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from typing import Optional
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
|
||||||
|
from backend.database import get_db
|
||||||
|
from backend.models import Alert, RosterUnit
|
||||||
|
from backend.services.alert_service import get_alert_service
|
||||||
|
from backend.templates_config import templates
|
||||||
|
|
||||||
|
router = APIRouter(prefix="/api/alerts", tags=["alerts"])
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# Alert List and Count
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
@router.get("/")
|
||||||
|
async def list_alerts(
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
status: Optional[str] = Query(None, description="Filter by status: active, acknowledged, resolved, dismissed"),
|
||||||
|
project_id: Optional[str] = Query(None),
|
||||||
|
unit_id: Optional[str] = Query(None),
|
||||||
|
alert_type: Optional[str] = Query(None, description="Filter by type: device_offline, device_online, schedule_failed"),
|
||||||
|
limit: int = Query(50, le=100),
|
||||||
|
offset: int = Query(0, ge=0),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
List alerts with optional filters.
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
|
||||||
|
alerts = alert_service.get_all_alerts(
|
||||||
|
status=status,
|
||||||
|
project_id=project_id,
|
||||||
|
unit_id=unit_id,
|
||||||
|
alert_type=alert_type,
|
||||||
|
limit=limit,
|
||||||
|
offset=offset,
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"alerts": [
|
||||||
|
{
|
||||||
|
"id": a.id,
|
||||||
|
"alert_type": a.alert_type,
|
||||||
|
"severity": a.severity,
|
||||||
|
"title": a.title,
|
||||||
|
"message": a.message,
|
||||||
|
"status": a.status,
|
||||||
|
"unit_id": a.unit_id,
|
||||||
|
"project_id": a.project_id,
|
||||||
|
"location_id": a.location_id,
|
||||||
|
"created_at": a.created_at.isoformat() if a.created_at else None,
|
||||||
|
"acknowledged_at": a.acknowledged_at.isoformat() if a.acknowledged_at else None,
|
||||||
|
"resolved_at": a.resolved_at.isoformat() if a.resolved_at else None,
|
||||||
|
}
|
||||||
|
for a in alerts
|
||||||
|
],
|
||||||
|
"count": len(alerts),
|
||||||
|
"limit": limit,
|
||||||
|
"offset": offset,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/active")
|
||||||
|
async def list_active_alerts(
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
project_id: Optional[str] = Query(None),
|
||||||
|
unit_id: Optional[str] = Query(None),
|
||||||
|
alert_type: Optional[str] = Query(None),
|
||||||
|
min_severity: Optional[str] = Query(None, description="Minimum severity: info, warning, critical"),
|
||||||
|
limit: int = Query(50, le=100),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
List only active alerts.
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
|
||||||
|
alerts = alert_service.get_active_alerts(
|
||||||
|
project_id=project_id,
|
||||||
|
unit_id=unit_id,
|
||||||
|
alert_type=alert_type,
|
||||||
|
min_severity=min_severity,
|
||||||
|
limit=limit,
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"alerts": [
|
||||||
|
{
|
||||||
|
"id": a.id,
|
||||||
|
"alert_type": a.alert_type,
|
||||||
|
"severity": a.severity,
|
||||||
|
"title": a.title,
|
||||||
|
"message": a.message,
|
||||||
|
"unit_id": a.unit_id,
|
||||||
|
"project_id": a.project_id,
|
||||||
|
"created_at": a.created_at.isoformat() if a.created_at else None,
|
||||||
|
}
|
||||||
|
for a in alerts
|
||||||
|
],
|
||||||
|
"count": len(alerts),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/active/count")
|
||||||
|
async def get_active_alert_count(db: Session = Depends(get_db)):
|
||||||
|
"""
|
||||||
|
Get count of active alerts (for navbar badge).
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
count = alert_service.get_active_alert_count()
|
||||||
|
return {"count": count}
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# Single Alert Operations
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
@router.get("/{alert_id}")
|
||||||
|
async def get_alert(
|
||||||
|
alert_id: str,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Get a specific alert.
|
||||||
|
"""
|
||||||
|
alert = db.query(Alert).filter_by(id=alert_id).first()
|
||||||
|
if not alert:
|
||||||
|
raise HTTPException(status_code=404, detail="Alert not found")
|
||||||
|
|
||||||
|
# Get related unit info
|
||||||
|
unit = None
|
||||||
|
if alert.unit_id:
|
||||||
|
unit = db.query(RosterUnit).filter_by(id=alert.unit_id).first()
|
||||||
|
|
||||||
|
return {
|
||||||
|
"id": alert.id,
|
||||||
|
"alert_type": alert.alert_type,
|
||||||
|
"severity": alert.severity,
|
||||||
|
"title": alert.title,
|
||||||
|
"message": alert.message,
|
||||||
|
"metadata": alert.alert_metadata,
|
||||||
|
"status": alert.status,
|
||||||
|
"unit_id": alert.unit_id,
|
||||||
|
"unit_name": unit.id if unit else None,
|
||||||
|
"project_id": alert.project_id,
|
||||||
|
"location_id": alert.location_id,
|
||||||
|
"schedule_id": alert.schedule_id,
|
||||||
|
"created_at": alert.created_at.isoformat() if alert.created_at else None,
|
||||||
|
"acknowledged_at": alert.acknowledged_at.isoformat() if alert.acknowledged_at else None,
|
||||||
|
"resolved_at": alert.resolved_at.isoformat() if alert.resolved_at else None,
|
||||||
|
"expires_at": alert.expires_at.isoformat() if alert.expires_at else None,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/{alert_id}/acknowledge")
|
||||||
|
async def acknowledge_alert(
|
||||||
|
alert_id: str,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Mark alert as acknowledged.
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
alert = alert_service.acknowledge_alert(alert_id)
|
||||||
|
|
||||||
|
if not alert:
|
||||||
|
raise HTTPException(status_code=404, detail="Alert not found")
|
||||||
|
|
||||||
|
return {
|
||||||
|
"success": True,
|
||||||
|
"alert_id": alert.id,
|
||||||
|
"status": alert.status,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/{alert_id}/dismiss")
|
||||||
|
async def dismiss_alert(
|
||||||
|
alert_id: str,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Dismiss alert.
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
alert = alert_service.dismiss_alert(alert_id)
|
||||||
|
|
||||||
|
if not alert:
|
||||||
|
raise HTTPException(status_code=404, detail="Alert not found")
|
||||||
|
|
||||||
|
return {
|
||||||
|
"success": True,
|
||||||
|
"alert_id": alert.id,
|
||||||
|
"status": alert.status,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/{alert_id}/resolve")
|
||||||
|
async def resolve_alert(
|
||||||
|
alert_id: str,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Manually resolve an alert.
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
alert = alert_service.resolve_alert(alert_id)
|
||||||
|
|
||||||
|
if not alert:
|
||||||
|
raise HTTPException(status_code=404, detail="Alert not found")
|
||||||
|
|
||||||
|
return {
|
||||||
|
"success": True,
|
||||||
|
"alert_id": alert.id,
|
||||||
|
"status": alert.status,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# HTML Partials for HTMX
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
@router.get("/partials/dropdown", response_class=HTMLResponse)
|
||||||
|
async def get_alert_dropdown(
|
||||||
|
request: Request,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Return HTML partial for alert dropdown in navbar.
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
alerts = alert_service.get_active_alerts(limit=10)
|
||||||
|
|
||||||
|
# Calculate relative time for each alert
|
||||||
|
now = datetime.utcnow()
|
||||||
|
alerts_data = []
|
||||||
|
for alert in alerts:
|
||||||
|
delta = now - alert.created_at
|
||||||
|
if delta.days > 0:
|
||||||
|
time_ago = f"{delta.days}d ago"
|
||||||
|
elif delta.seconds >= 3600:
|
||||||
|
time_ago = f"{delta.seconds // 3600}h ago"
|
||||||
|
elif delta.seconds >= 60:
|
||||||
|
time_ago = f"{delta.seconds // 60}m ago"
|
||||||
|
else:
|
||||||
|
time_ago = "just now"
|
||||||
|
|
||||||
|
alerts_data.append({
|
||||||
|
"alert": alert,
|
||||||
|
"time_ago": time_ago,
|
||||||
|
})
|
||||||
|
|
||||||
|
return templates.TemplateResponse("partials/alerts/alert_dropdown.html", {
|
||||||
|
"request": request,
|
||||||
|
"alerts": alerts_data,
|
||||||
|
"total_count": alert_service.get_active_alert_count(),
|
||||||
|
})
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/partials/list", response_class=HTMLResponse)
|
||||||
|
async def get_alert_list(
|
||||||
|
request: Request,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
status: Optional[str] = Query(None),
|
||||||
|
limit: int = Query(20),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Return HTML partial for alert list page.
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
|
||||||
|
if status:
|
||||||
|
alerts = alert_service.get_all_alerts(status=status, limit=limit)
|
||||||
|
else:
|
||||||
|
alerts = alert_service.get_all_alerts(limit=limit)
|
||||||
|
|
||||||
|
# Calculate relative time for each alert
|
||||||
|
now = datetime.utcnow()
|
||||||
|
alerts_data = []
|
||||||
|
for alert in alerts:
|
||||||
|
delta = now - alert.created_at
|
||||||
|
if delta.days > 0:
|
||||||
|
time_ago = f"{delta.days}d ago"
|
||||||
|
elif delta.seconds >= 3600:
|
||||||
|
time_ago = f"{delta.seconds // 3600}h ago"
|
||||||
|
elif delta.seconds >= 60:
|
||||||
|
time_ago = f"{delta.seconds // 60}m ago"
|
||||||
|
else:
|
||||||
|
time_ago = "just now"
|
||||||
|
|
||||||
|
alerts_data.append({
|
||||||
|
"alert": alert,
|
||||||
|
"time_ago": time_ago,
|
||||||
|
})
|
||||||
|
|
||||||
|
return templates.TemplateResponse("partials/alerts/alert_list.html", {
|
||||||
|
"request": request,
|
||||||
|
"alerts": alerts_data,
|
||||||
|
"status_filter": status,
|
||||||
|
})
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# Cleanup
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
@router.post("/cleanup-expired")
|
||||||
|
async def cleanup_expired_alerts(db: Session = Depends(get_db)):
|
||||||
|
"""
|
||||||
|
Cleanup expired alerts (admin/maintenance endpoint).
|
||||||
|
"""
|
||||||
|
alert_service = get_alert_service(db)
|
||||||
|
count = alert_service.cleanup_expired_alerts()
|
||||||
|
|
||||||
|
return {
|
||||||
|
"success": True,
|
||||||
|
"cleaned_up": count,
|
||||||
|
}
|
||||||
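Both HTMX partial endpoints above compute the same relative-time label inline. The shared logic could be factored into one helper; this is a refactoring sketch, not code from the diff:

```python
from datetime import datetime

def time_ago(created_at: datetime, now: datetime) -> str:
    """Mirror the dropdown/list logic: days first, then hours, then minutes.

    Note that timedelta.seconds is the sub-day remainder (0..86399),
    so the days check must come first, exactly as in the endpoints.
    """
    delta = now - created_at
    if delta.days > 0:
        return f"{delta.days}d ago"
    if delta.seconds >= 3600:
        return f"{delta.seconds // 3600}h ago"
    if delta.seconds >= 60:
        return f"{delta.seconds // 60}m ago"
    return "just now"
```

With this helper, each endpoint would reduce its loop body to `alerts_data.append({"alert": alert, "time_ago": time_ago(alert.created_at, now)})`.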
@@ -1,10 +1,15 @@
 from fastapi import APIRouter, Request, Depends
-from fastapi.templating import Jinja2Templates
+from sqlalchemy.orm import Session
+from sqlalchemy import and_
+from datetime import datetime, timedelta
+
+from backend.database import get_db
+from backend.models import ScheduledAction, MonitoringLocation, Project
 from backend.services.snapshot import emit_status_snapshot
+from backend.templates_config import templates
+from backend.utils.timezone import utc_to_local, local_to_utc, get_user_timezone
+
 router = APIRouter()
-templates = Jinja2Templates(directory="templates")
 
 
 @router.get("/dashboard/active")
@@ -23,3 +28,79 @@ def dashboard_benched(request: Request):
         "partials/benched_table.html",
         {"request": request, "units": snapshot["benched"]}
     )
+
+
+@router.get("/dashboard/todays-actions")
+def dashboard_todays_actions(request: Request, db: Session = Depends(get_db)):
+    """
+    Get today's scheduled actions for the dashboard card.
+    Shows upcoming, completed, and failed actions for today.
+    """
+    import json
+    from zoneinfo import ZoneInfo
+
+    # Get today's date range in local timezone
+    tz = ZoneInfo(get_user_timezone())
+    now_local = datetime.now(tz)
+    today_start_local = now_local.replace(hour=0, minute=0, second=0, microsecond=0)
+    today_end_local = today_start_local + timedelta(days=1)
+
+    # Convert to UTC for database query
+    today_start_utc = today_start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
+    today_end_utc = today_end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
+
+    # Exclude actions from paused/removed projects
+    paused_project_ids = [
+        p.id for p in db.query(Project.id).filter(
+            Project.status.in_(["on_hold", "archived", "deleted"])
+        ).all()
+    ]
+
+    # Query today's actions
+    actions = db.query(ScheduledAction).filter(
+        ScheduledAction.scheduled_time >= today_start_utc,
+        ScheduledAction.scheduled_time < today_end_utc,
+        ScheduledAction.project_id.notin_(paused_project_ids),
+    ).order_by(ScheduledAction.scheduled_time.asc()).all()
+
+    # Enrich with location/project info and parse results
+    enriched_actions = []
+    for action in actions:
+        location = None
+        project = None
+        if action.location_id:
+            location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
+        if action.project_id:
+            project = db.query(Project).filter_by(id=action.project_id).first()
+
+        # Parse module_response for result details
+        result_data = None
+        if action.module_response:
+            try:
+                result_data = json.loads(action.module_response)
+            except json.JSONDecodeError:
+                pass
+
+        enriched_actions.append({
+            "action": action,
+            "location": location,
+            "project": project,
+            "result": result_data,
+        })
+
+    # Count by status
+    pending_count = sum(1 for a in actions if a.execution_status == "pending")
+    completed_count = sum(1 for a in actions if a.execution_status == "completed")
+    failed_count = sum(1 for a in actions if a.execution_status == "failed")
+
+    return templates.TemplateResponse(
+        "partials/dashboard/todays_actions.html",
+        {
+            "request": request,
+            "actions": enriched_actions,
+            "pending_count": pending_count,
+            "completed_count": completed_count,
+            "failed_count": failed_count,
+            "total_count": len(actions),
+        }
+    )
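The local-day windowing added to the dashboard (local midnight boundaries converted to naive UTC for the database query) can be isolated as a helper and exercised on its own. A sketch, with America/New_York used purely for illustration:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

def local_day_bounds_utc(now_local: datetime) -> tuple:
    """Return today's [start, end) window as naive-UTC datetimes,
    matching how the dashboard query brackets scheduled_time."""
    start_local = now_local.replace(hour=0, minute=0, second=0, microsecond=0)
    end_local = start_local + timedelta(days=1)

    def to_naive_utc(dt: datetime) -> datetime:
        # Naive UTC matches how scheduled_time is stored in the DB.
        return dt.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

    return to_naive_utc(start_local), to_naive_utc(end_local)
```

Using a half-open `[start, end)` window (note the `<` on the upper bound in the query) avoids double-counting an action scheduled exactly at the next local midnight.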
backend/routers/deployments.py (new file, 154 lines)
@@ -0,0 +1,154 @@
|
|||||||
|
from fastapi import APIRouter, Depends, HTTPException
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from datetime import datetime, date
|
||||||
|
from typing import Optional
|
||||||
|
import uuid
|
||||||
|
|
||||||
|
from backend.database import get_db
|
||||||
|
from backend.models import DeploymentRecord, RosterUnit
|
||||||
|
|
||||||
|
router = APIRouter(prefix="/api", tags=["deployments"])
|
||||||
|
|
||||||
|
|
||||||
|
def _serialize(record: DeploymentRecord) -> dict:
|
||||||
|
return {
|
||||||
|
"id": record.id,
|
||||||
|
"unit_id": record.unit_id,
|
||||||
|
"deployed_date": record.deployed_date.isoformat() if record.deployed_date else None,
|
||||||
|
"estimated_removal_date": record.estimated_removal_date.isoformat() if record.estimated_removal_date else None,
|
||||||
|
"actual_removal_date": record.actual_removal_date.isoformat() if record.actual_removal_date else None,
|
||||||
|
"project_ref": record.project_ref,
|
||||||
|
"project_id": record.project_id,
|
||||||
|
"location_name": record.location_name,
|
||||||
|
"notes": record.notes,
|
||||||
|
"created_at": record.created_at.isoformat() if record.created_at else None,
|
||||||
|
"updated_at": record.updated_at.isoformat() if record.updated_at else None,
|
||||||
|
"is_active": record.actual_removal_date is None,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/deployments/{unit_id}")
|
||||||
|
def get_deployments(unit_id: str, db: Session = Depends(get_db)):
|
||||||
|
"""Get all deployment records for a unit, newest first."""
|
||||||
|
unit = db.query(RosterUnit).filter_by(id=unit_id).first()
|
||||||
|
if not unit:
|
||||||
|
raise HTTPException(status_code=404, detail=f"Unit {unit_id} not found")
|
||||||
|
|
||||||
|
records = (
|
||||||
|
db.query(DeploymentRecord)
|
||||||
|
.filter_by(unit_id=unit_id)
|
||||||
|
.order_by(DeploymentRecord.deployed_date.desc(), DeploymentRecord.created_at.desc())
|
||||||
|
.all()
|
||||||
|
)
|
||||||
|
return {"deployments": [_serialize(r) for r in records]}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/deployments/{unit_id}/active")
|
||||||
|
def get_active_deployment(unit_id: str, db: Session = Depends(get_db)):
|
||||||
|
"""Get the current active deployment (actual_removal_date is NULL), or null."""
|
||||||
|
record = (
|
||||||
|
db.query(DeploymentRecord)
|
||||||
|
.filter(
|
||||||
|
DeploymentRecord.unit_id == unit_id,
|
||||||
|
DeploymentRecord.actual_removal_date == None
|
||||||
|
)
|
||||||
|
.order_by(DeploymentRecord.created_at.desc())
|
||||||
|
.first()
|
||||||
|
)
|
||||||
|
return {"deployment": _serialize(record) if record else None}
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/deployments/{unit_id}")
|
||||||
|
def create_deployment(unit_id: str, payload: dict, db: Session = Depends(get_db)):
|
||||||
|
"""
|
||||||
|
Create a new deployment record for a unit.
|
||||||
|
|
||||||
|
Body fields (all optional):
|
||||||
|
deployed_date (YYYY-MM-DD)
|
||||||
|
estimated_removal_date (YYYY-MM-DD)
|
||||||
|
project_ref (freeform string)
|
||||||
|
project_id (UUID if linked to Project)
|
||||||
|
location_name
|
||||||
|
notes
|
||||||
|
"""
|
||||||
|
unit = db.query(RosterUnit).filter_by(id=unit_id).first()
|
||||||
|
if not unit:
|
||||||
|
raise HTTPException(status_code=404, detail=f"Unit {unit_id} not found")
|
||||||
|
|
||||||
|
def parse_date(val) -> Optional[date]:
|
||||||
|
if not val:
|
||||||
|
return None
|
||||||
|
if isinstance(val, date):
|
||||||
|
return val
|
||||||
|
return date.fromisoformat(str(val))
|
||||||
|
|
||||||
|
record = DeploymentRecord(
|
||||||
|
id=str(uuid.uuid4()),
|
||||||
|
unit_id=unit_id,
|
||||||
|
deployed_date=parse_date(payload.get("deployed_date")),
|
||||||
|
estimated_removal_date=parse_date(payload.get("estimated_removal_date")),
|
||||||
|
actual_removal_date=None,
|
||||||
|
project_ref=payload.get("project_ref"),
|
||||||
|
project_id=payload.get("project_id"),
|
||||||
|
location_name=payload.get("location_name"),
|
||||||
|
notes=payload.get("notes"),
|
||||||
|
created_at=datetime.utcnow(),
|
||||||
|
updated_at=datetime.utcnow(),
|
||||||
|
)
|
||||||
|
db.add(record)
|
||||||
|
db.commit()
|
||||||
|
db.refresh(record)
|
||||||
|
return _serialize(record)
|
||||||
|
|
||||||
|
|
||||||
|
@router.put("/deployments/{unit_id}/{deployment_id}")
|
||||||
|
def update_deployment(unit_id: str, deployment_id: str, payload: dict, db: Session = Depends(get_db)):
|
||||||
|
"""
|
||||||
|
Update a deployment record. Used for:
|
||||||
|
- Setting/changing estimated_removal_date
|
||||||
|
- Closing a deployment (set actual_removal_date to mark unit returned)
|
||||||
|
- Editing project_ref, location_name, notes
|
||||||
|
"""
|
||||||
|
record = db.query(DeploymentRecord).filter_by(id=deployment_id, unit_id=unit_id).first()
|
||||||
|
if not record:
|
||||||
|
raise HTTPException(status_code=404, detail="Deployment record not found")
|
||||||
|
|
||||||
|
def parse_date(val) -> Optional[date]:
|
||||||
|
if val is None:
|
||||||
|
return None
|
||||||
|
if val == "":
|
||||||
|
return None
|
||||||
|
if isinstance(val, date):
|
||||||
|
return val
|
||||||
|
return date.fromisoformat(str(val))
|
||||||
|
|
||||||
|
if "deployed_date" in payload:
|
||||||
|
record.deployed_date = parse_date(payload["deployed_date"])
|
||||||
|
if "estimated_removal_date" in payload:
|
||||||
|
record.estimated_removal_date = parse_date(payload["estimated_removal_date"])
|
||||||
|
if "actual_removal_date" in payload:
|
||||||
|
record.actual_removal_date = parse_date(payload["actual_removal_date"])
|
||||||
|
if "project_ref" in payload:
|
||||||
|
record.project_ref = payload["project_ref"]
|
||||||
|
if "project_id" in payload:
|
||||||
|
record.project_id = payload["project_id"]
|
||||||
|
if "location_name" in payload:
|
||||||
|
record.location_name = payload["location_name"]
|
||||||
|
if "notes" in payload:
|
||||||
|
record.notes = payload["notes"]
|
||||||
|
|
||||||
|
record.updated_at = datetime.utcnow()
|
||||||
|
db.commit()
|
||||||
|
db.refresh(record)
|
||||||
|
return _serialize(record)
|
||||||
|
|
||||||
|
|
||||||
|
@router.delete("/deployments/{unit_id}/{deployment_id}")
|
||||||
|
def delete_deployment(unit_id: str, deployment_id: str, db: Session = Depends(get_db)):
|
||||||
|
"""Delete a deployment record."""
|
||||||
|
record = db.query(DeploymentRecord).filter_by(id=deployment_id, unit_id=unit_id).first()
|
||||||
|
if not record:
|
||||||
|
raise HTTPException(status_code=404, detail="Deployment record not found")
|
||||||
|
db.delete(record)
|
||||||
|
db.commit()
|
||||||
|
return {"ok": True}
|
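create_deployment and update_deployment each define a local parse_date with slightly different empty-value handling (`if not val` versus explicit None/"" checks); the observable behavior is the same. A shared version could look like this (a refactoring sketch, not present in the diff):

```python
from datetime import date
from typing import Optional

def parse_date(val) -> Optional[date]:
    """Tolerant date parsing as used by the deployments router:
    None and "" pass through as None, date objects are returned
    as-is, and strings must be ISO YYYY-MM-DD."""
    if val is None or val == "":
        return None
    if isinstance(val, date):
        return val
    return date.fromisoformat(str(val))
```

Accepting both real `date` objects and ISO strings lets the same endpoint serve JSON clients and internal callers that already hold parsed dates.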
backend/routers/fleet_calendar.py (new file, 928 lines)
@@ -0,0 +1,928 @@
|
|||||||
|
"""
|
||||||
|
Fleet Calendar Router
|
||||||
|
|
||||||
|
API endpoints for the Fleet Calendar feature:
|
||||||
|
- Calendar page and data
|
||||||
|
- Job reservation CRUD
|
||||||
|
- Unit assignment management
|
||||||
|
- Availability checking
|
||||||
|
"""
|
||||||
|
|
||||||
|
from fastapi import APIRouter, Request, Depends, HTTPException, Query
|
||||||
|
from fastapi.responses import HTMLResponse, JSONResponse
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from datetime import datetime, date, timedelta
|
||||||
|
from typing import Optional, List
|
||||||
|
import uuid
|
||||||
|
import logging
|
||||||
|
|
||||||
|
from backend.database import get_db
|
||||||
|
from backend.models import (
|
||||||
|
RosterUnit, JobReservation, JobReservationUnit,
|
||||||
|
UserPreferences, Project, MonitoringLocation, UnitAssignment
|
||||||
|
)
|
||||||
|
from backend.templates_config import templates
|
||||||
|
from backend.services.fleet_calendar_service import (
|
||||||
|
get_day_summary,
|
||||||
|
get_calendar_year_data,
|
||||||
|
get_rolling_calendar_data,
|
||||||
|
check_calibration_conflicts,
|
||||||
|
get_available_units_for_period,
|
||||||
|
get_calibration_status
|
||||||
|
)
|
||||||
|
|
||||||
|
router = APIRouter(tags=["fleet-calendar"])
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# Calendar Page
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
@router.get("/fleet-calendar", response_class=HTMLResponse)
|
||||||
|
async def fleet_calendar_page(
|
||||||
|
request: Request,
|
||||||
|
year: Optional[int] = None,
|
||||||
|
month: Optional[int] = None,
|
||||||
|
device_type: str = "seismograph",
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""Main Fleet Calendar page with rolling 12-month view."""
|
||||||
|
today = date.today()
|
||||||
|
|
||||||
|
# Default to current month as the start
|
||||||
|
if year is None:
|
||||||
|
year = today.year
|
||||||
|
if month is None:
|
||||||
|
month = today.month
|
||||||
|
|
||||||
|
# Get calendar data for 12 months starting from year/month
|
||||||
|
calendar_data = get_rolling_calendar_data(db, year, month, device_type)
|
||||||
|
|
||||||
|
# Get projects for the reservation form dropdown
|
||||||
|
projects = db.query(Project).filter(
|
||||||
|
Project.status.in_(["active", "upcoming", "on_hold"])
|
||||||
|
).order_by(Project.name).all()
|
||||||
|
|
||||||
|
# Build a serializable list of items with dates for calendar bars
|
||||||
|
# Includes both tracked Projects (with dates) and Job Reservations (matching device_type)
|
||||||
|
project_colors = ['#3B82F6', '#10B981', '#F59E0B', '#EF4444', '#8B5CF6', '#EC4899', '#06B6D4', '#F97316']
|
||||||
|
# Map calendar device_type to project_type_ids
|
||||||
|
device_type_to_project_types = {
|
||||||
|
"seismograph": ["vibration_monitoring", "combined"],
|
||||||
|
"slm": ["sound_monitoring", "combined"],
|
||||||
|
}
|
||||||
|
relevant_project_types = device_type_to_project_types.get(device_type, [])
|
||||||
|
|
||||||
|
calendar_projects = []
|
||||||
|
for i, p in enumerate(projects):
|
||||||
|
if p.start_date and p.project_type_id in relevant_project_types:
|
||||||
|
calendar_projects.append({
|
||||||
|
"id": p.id,
|
||||||
|
"name": p.name,
|
||||||
|
"start_date": p.start_date.isoformat(),
|
||||||
|
"end_date": p.end_date.isoformat() if p.end_date else None,
|
||||||
|
"color": project_colors[i % len(project_colors)],
|
||||||
|
"confirmed": True,
|
||||||
|
})
|
||||||
|
|
||||||
|
# Add job reservations for this device_type as bars
|
||||||
|
from sqlalchemy import or_ as _or
|
||||||
|
cal_window_end = date(year + ((month + 10) // 12), ((month + 10) % 12) + 1, 1)
|
||||||
|
reservations_for_cal = db.query(JobReservation).filter(
|
||||||
|
JobReservation.device_type == device_type,
|
||||||
|
JobReservation.start_date <= cal_window_end,
|
||||||
|
_or(
|
||||||
|
JobReservation.end_date >= date(year, month, 1),
|
||||||
|
JobReservation.end_date == None,
|
||||||
|
)
|
||||||
|
).all()
|
||||||
|
for res in reservations_for_cal:
|
||||||
|
end = res.end_date or res.estimated_end_date
|
||||||
|
calendar_projects.append({
|
||||||
|
"id": res.id,
|
||||||
|
"name": res.name,
|
||||||
|
"start_date": res.start_date.isoformat(),
|
||||||
|
"end_date": end.isoformat() if end else None,
|
||||||
|
"color": res.color,
|
||||||
|
"confirmed": bool(res.project_id),
|
||||||
|
})
|
||||||
|
|
||||||
|
# Calculate prev/next month navigation
|
||||||
|
prev_year, prev_month = (year - 1, 12) if month == 1 else (year, month - 1)
|
||||||
|
next_year, next_month = (year + 1, 1) if month == 12 else (year, month + 1)
|
||||||
|
|
||||||
|
return templates.TemplateResponse(
|
||||||
|
"fleet_calendar.html",
|
||||||
|
{
|
||||||
|
"request": request,
|
||||||
|
"start_year": year,
|
||||||
|
"start_month": month,
|
||||||
|
"prev_year": prev_year,
|
||||||
|
"prev_month": prev_month,
|
||||||
|
"next_year": next_year,
|
||||||
|
"next_month": next_month,
|
||||||
|
"device_type": device_type,
|
||||||
|
"calendar_data": calendar_data,
|
||||||
|
"projects": projects,
|
||||||
|
"calendar_projects": calendar_projects,
|
||||||
|
"today": today.isoformat()
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
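The `(month + 10)` arithmetic in the route above yields the first day of the final month of the rolling 12-month window. A minimal standalone sketch of the same formula (the helper name is illustrative, not part of this codebase):

```python
from datetime import date

def window_last_month_start(year: int, month: int) -> date:
    # First day of the 12th month of a window that starts at (year, month).
    # month + 10 is an 11-month offset expressed with 0-based months, so
    # integer division carries the year and the remainder picks the month.
    return date(year + ((month + 10) // 12), ((month + 10) % 12) + 1, 1)
```

For example, a window starting January 2024 runs through December 2024, while one starting March 2024 runs through February 2025.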
# ============================================================================
# Calendar Data API
# ============================================================================

@router.get("/api/fleet-calendar/data", response_class=JSONResponse)
async def get_calendar_data(
    year: int,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Get calendar data for a specific year."""
    return get_calendar_year_data(db, year, device_type)


@router.get("/api/fleet-calendar/day/{date_str}", response_class=HTMLResponse)
async def get_day_detail(
    request: Request,
    date_str: str,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Get detailed view for a specific day (HTMX partial)."""
    try:
        check_date = date.fromisoformat(date_str)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")

    day_data = get_day_summary(db, check_date, device_type)

    # Get projects for display names
    projects = {p.id: p for p in db.query(Project).all()}

    return templates.TemplateResponse(
        "partials/fleet_calendar/day_detail.html",
        {
            "request": request,
            "day_data": day_data,
            "date_str": date_str,
            "date_display": check_date.strftime("%B %d, %Y"),
            "device_type": device_type,
            "projects": projects
        }
    )


# ============================================================================
# Reservation CRUD
# ============================================================================

@router.post("/api/fleet-calendar/reservations", response_class=JSONResponse)
async def create_reservation(
    request: Request,
    db: Session = Depends(get_db)
):
    """Create a new job reservation."""
    data = await request.json()

    # Validate required fields
    required = ["name", "start_date", "assignment_type"]
    for field in required:
        if field not in data:
            raise HTTPException(status_code=400, detail=f"Missing required field: {field}")

    # Need either end_date or end_date_tbd
    end_date_tbd = data.get("end_date_tbd", False)
    if not end_date_tbd and not data.get("end_date"):
        raise HTTPException(status_code=400, detail="End date is required unless marked as TBD")

    try:
        start_date = date.fromisoformat(data["start_date"])
        end_date = date.fromisoformat(data["end_date"]) if data.get("end_date") else None
        estimated_end_date = date.fromisoformat(data["estimated_end_date"]) if data.get("estimated_end_date") else None
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")

    if end_date and end_date < start_date:
        raise HTTPException(status_code=400, detail="End date cannot be before start date")

    if estimated_end_date and estimated_end_date < start_date:
        raise HTTPException(status_code=400, detail="Estimated end date cannot be before start date")

    import json as _json
    reservation = JobReservation(
        id=str(uuid.uuid4()),
        name=data["name"],
        project_id=data.get("project_id"),
        start_date=start_date,
        end_date=end_date,
        estimated_end_date=estimated_end_date,
        end_date_tbd=end_date_tbd,
        assignment_type=data["assignment_type"],
        device_type=data.get("device_type", "seismograph"),
        quantity_needed=data.get("quantity_needed"),
        estimated_units=data.get("estimated_units"),
        location_slots=_json.dumps(data["location_slots"]) if data.get("location_slots") is not None else None,
        notes=data.get("notes"),
        color=data.get("color", "#3B82F6")
    )

    db.add(reservation)

    # If specific units were provided, assign them
    if data.get("unit_ids") and data["assignment_type"] == "specific":
        for unit_id in data["unit_ids"]:
            assignment = JobReservationUnit(
                id=str(uuid.uuid4()),
                reservation_id=reservation.id,
                unit_id=unit_id,
                assignment_source="specific"
            )
            db.add(assignment)

    db.commit()

    logger.info(f"Created reservation: {reservation.name} ({reservation.id})")

    return {
        "success": True,
        "reservation_id": reservation.id,
        "message": f"Created reservation: {reservation.name}"
    }


@router.get("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
|
||||||
|
async def get_reservation(
|
||||||
|
reservation_id: str,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""Get a specific reservation with its assigned units."""
|
||||||
|
reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
|
||||||
|
if not reservation:
|
||||||
|
raise HTTPException(status_code=404, detail="Reservation not found")
|
||||||
|
|
||||||
|
# Get assigned units
|
||||||
|
assignments = db.query(JobReservationUnit).filter_by(
|
||||||
|
reservation_id=reservation_id
|
||||||
|
).all()
|
||||||
|
|
||||||
|
# Sort assignments by slot_index so order is preserved
|
||||||
|
assignments_sorted = sorted(assignments, key=lambda a: (a.slot_index if a.slot_index is not None else 999))
|
||||||
|
unit_ids = [a.unit_id for a in assignments_sorted]
|
||||||
|
units = db.query(RosterUnit).filter(RosterUnit.id.in_(unit_ids)).all() if unit_ids else []
|
||||||
|
units_by_id = {u.id: u for u in units}
|
||||||
|
# Build per-unit lookups from assignments
|
||||||
|
assignment_map = {a.unit_id: a for a in assignments_sorted}
|
||||||
|
|
||||||
|
import json as _json
|
||||||
|
stored_slots = _json.loads(reservation.location_slots) if reservation.location_slots else None
|
||||||
|
|
||||||
|
return {
|
||||||
|
"id": reservation.id,
|
||||||
|
"name": reservation.name,
|
||||||
|
"project_id": reservation.project_id,
|
||||||
|
"start_date": reservation.start_date.isoformat(),
|
||||||
|
"end_date": reservation.end_date.isoformat() if reservation.end_date else None,
|
||||||
|
"estimated_end_date": reservation.estimated_end_date.isoformat() if reservation.estimated_end_date else None,
|
||||||
|
"end_date_tbd": reservation.end_date_tbd,
|
||||||
|
"assignment_type": reservation.assignment_type,
|
||||||
|
"device_type": reservation.device_type,
|
||||||
|
"quantity_needed": reservation.quantity_needed,
|
||||||
|
"estimated_units": reservation.estimated_units,
|
||||||
|
"location_slots": stored_slots,
|
||||||
|
"notes": reservation.notes,
|
||||||
|
"color": reservation.color,
|
||||||
|
"assigned_units": [
|
||||||
|
{
|
||||||
|
"id": uid,
|
||||||
|
"last_calibrated": units_by_id[uid].last_calibrated.isoformat() if uid in units_by_id and units_by_id[uid].last_calibrated else None,
|
||||||
|
"deployed": units_by_id[uid].deployed if uid in units_by_id else False,
|
||||||
|
"power_type": assignment_map[uid].power_type,
|
||||||
|
"notes": assignment_map[uid].notes,
|
||||||
|
"location_name": assignment_map[uid].location_name,
|
||||||
|
"slot_index": assignment_map[uid].slot_index,
|
||||||
|
}
|
||||||
|
for uid in unit_ids
|
||||||
|
]
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
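Several routes here order unit assignments by `slot_index` and fall back to a large sentinel so unslotted units sort last. A standalone sketch of that sort key (the 999 sentinel mirrors the code above):

```python
def slot_sort_key(slot_index):
    # None (no slot recorded) sorts after any real slot index.
    return slot_index if slot_index is not None else 999

slots = [None, 2, 0, None, 1]
ordered = sorted(slots, key=slot_sort_key)
```

Because Python's sort is stable, unslotted entries also keep their original relative order among themselves.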
@router.put("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
async def update_reservation(
    reservation_id: str,
    request: Request,
    db: Session = Depends(get_db)
):
    """Update an existing reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    data = await request.json()

    # Update fields if provided
    if "name" in data:
        reservation.name = data["name"]
    if "project_id" in data:
        reservation.project_id = data["project_id"]
    # Parse date fields defensively so bad input returns 400, not 500
    try:
        if "start_date" in data:
            reservation.start_date = date.fromisoformat(data["start_date"])
        if "end_date" in data:
            reservation.end_date = date.fromisoformat(data["end_date"]) if data["end_date"] else None
        if "estimated_end_date" in data:
            reservation.estimated_end_date = date.fromisoformat(data["estimated_end_date"]) if data["estimated_end_date"] else None
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")
    if "end_date_tbd" in data:
        reservation.end_date_tbd = data["end_date_tbd"]
    if "assignment_type" in data:
        reservation.assignment_type = data["assignment_type"]
    if "quantity_needed" in data:
        reservation.quantity_needed = data["quantity_needed"]
    if "estimated_units" in data:
        reservation.estimated_units = data["estimated_units"]
    if "location_slots" in data:
        import json as _json
        reservation.location_slots = _json.dumps(data["location_slots"]) if data["location_slots"] is not None else None
    if "notes" in data:
        reservation.notes = data["notes"]
    if "color" in data:
        reservation.color = data["color"]

    reservation.updated_at = datetime.utcnow()

    db.commit()

    logger.info(f"Updated reservation: {reservation.name} ({reservation.id})")

    return {
        "success": True,
        "message": f"Updated reservation: {reservation.name}"
    }


@router.delete("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
async def delete_reservation(
    reservation_id: str,
    db: Session = Depends(get_db)
):
    """Delete a reservation and its unit assignments."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    # Delete unit assignments first
    db.query(JobReservationUnit).filter_by(reservation_id=reservation_id).delete()

    # Delete the reservation
    db.delete(reservation)
    db.commit()

    logger.info(f"Deleted reservation: {reservation.name} ({reservation_id})")

    return {
        "success": True,
        "message": "Reservation deleted"
    }


# ============================================================================
# Unit Assignment
# ============================================================================

@router.post("/api/fleet-calendar/reservations/{reservation_id}/assign-units", response_class=JSONResponse)
async def assign_units_to_reservation(
    reservation_id: str,
    request: Request,
    db: Session = Depends(get_db)
):
    """Assign specific units to a reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    data = await request.json()
    unit_ids = data.get("unit_ids", [])
    # Optional per-unit dicts keyed by unit_id
    power_types = data.get("power_types", {})
    location_notes = data.get("location_notes", {})
    location_names = data.get("location_names", {})
    # slot_indices: {"BE17354": 0, "BE9441": 1, ...}
    slot_indices = data.get("slot_indices", {})

    # Verify units exist (allow empty list to clear all assignments)
    if unit_ids:
        units = db.query(RosterUnit).filter(RosterUnit.id.in_(unit_ids)).all()
        found_ids = {u.id for u in units}
        missing = set(unit_ids) - found_ids
        if missing:
            raise HTTPException(status_code=404, detail=f"Units not found: {', '.join(missing)}")

    # Full replace: delete all existing assignments for this reservation first
    db.query(JobReservationUnit).filter_by(reservation_id=reservation_id).delete()
    db.flush()

    # Check for conflicts with other reservations and insert new assignments
    conflicts = []
    for unit_id in unit_ids:
        # Check overlapping reservations (only possible when this reservation
        # has a concrete end date)
        if reservation.end_date:
            overlapping = db.query(JobReservation).join(
                JobReservationUnit, JobReservation.id == JobReservationUnit.reservation_id
            ).filter(
                JobReservationUnit.unit_id == unit_id,
                JobReservation.id != reservation_id,
                JobReservation.start_date <= reservation.end_date,
                JobReservation.end_date >= reservation.start_date
            ).first()

            if overlapping:
                conflicts.append({
                    "unit_id": unit_id,
                    "conflict_reservation": overlapping.name,
                    "conflict_dates": f"{overlapping.start_date} - {overlapping.end_date}"
                })
                continue

        # Add assignment
        assignment = JobReservationUnit(
            id=str(uuid.uuid4()),
            reservation_id=reservation_id,
            unit_id=unit_id,
            assignment_source="filled" if reservation.assignment_type == "quantity" else "specific",
            power_type=power_types.get(unit_id),
            notes=location_notes.get(unit_id),
            location_name=location_names.get(unit_id),
            slot_index=slot_indices.get(unit_id),
        )
        db.add(assignment)

    db.commit()

    # Check for calibration conflicts
    cal_conflicts = check_calibration_conflicts(db, reservation_id)

    assigned_count = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id
    ).count()

    return {
        "success": True,
        "assigned_count": assigned_count,
        "conflicts": conflicts,
        "calibration_warnings": cal_conflicts,
        "message": f"Assigned {len(unit_ids) - len(conflicts)} units"
    }


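The conflict check above is the standard closed-interval overlap test: two date ranges overlap exactly when each one starts no later than the other one ends. A minimal standalone sketch (the function name is illustrative):

```python
from datetime import date

def ranges_overlap(start_a: date, end_a: date, start_b: date, end_b: date) -> bool:
    # Closed intervals [start_a, end_a] and [start_b, end_b] overlap
    # iff each interval starts on or before the other one ends.
    return start_a <= end_b and start_b <= end_a
```

Note that ranges touching on a single day count as a conflict, matching the `<=`/`>=` comparisons in the query above.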
@router.delete("/api/fleet-calendar/reservations/{reservation_id}/units/{unit_id}", response_class=JSONResponse)
async def remove_unit_from_reservation(
    reservation_id: str,
    unit_id: str,
    db: Session = Depends(get_db)
):
    """Remove a unit from a reservation."""
    assignment = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id,
        unit_id=unit_id
    ).first()

    if not assignment:
        raise HTTPException(status_code=404, detail="Unit assignment not found")

    db.delete(assignment)
    db.commit()

    return {
        "success": True,
        "message": f"Removed {unit_id} from reservation"
    }


# ============================================================================
# Availability & Conflicts
# ============================================================================

@router.get("/api/fleet-calendar/availability", response_class=JSONResponse)
async def check_availability(
    start_date: str,
    end_date: str,
    device_type: str = "seismograph",
    exclude_reservation_id: Optional[str] = None,
    db: Session = Depends(get_db)
):
    """Get units available for a specific date range."""
    try:
        start = date.fromisoformat(start_date)
        end = date.fromisoformat(end_date)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")

    available = get_available_units_for_period(
        db, start, end, device_type, exclude_reservation_id
    )

    return {
        "start_date": start_date,
        "end_date": end_date,
        "device_type": device_type,
        "available_units": available,
        "count": len(available)
    }


@router.get("/api/fleet-calendar/reservations/{reservation_id}/conflicts", response_class=JSONResponse)
async def get_reservation_conflicts(
    reservation_id: str,
    db: Session = Depends(get_db)
):
    """Check for calibration conflicts in a reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    conflicts = check_calibration_conflicts(db, reservation_id)

    return {
        "reservation_id": reservation_id,
        "reservation_name": reservation.name,
        "conflicts": conflicts,
        "has_conflicts": len(conflicts) > 0
    }


# ============================================================================
# HTMX Partials
# ============================================================================

@router.get("/api/fleet-calendar/reservations-list", response_class=HTMLResponse)
async def get_reservations_list(
    request: Request,
    year: Optional[int] = None,
    month: Optional[int] = None,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Get list of reservations as HTMX partial."""
    from sqlalchemy import or_

    today = date.today()
    if year is None:
        year = today.year
    if month is None:
        month = today.month

    # Calculate 12-month window
    start_date = date(year, month, 1)
    # End date is the last day of the 12th month of the window
    end_year = year + ((month + 10) // 12)
    end_month = ((month + 10) % 12) + 1
    if end_month == 12:
        end_date = date(end_year, 12, 31)
    else:
        end_date = date(end_year, end_month + 1, 1) - timedelta(days=1)

    # Filter by device_type and date window
    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= end_date,
        or_(
            JobReservation.end_date >= start_date,
            JobReservation.end_date.is_(None)  # TBD reservations
        )
    ).order_by(JobReservation.start_date).all()

    # Get assignment counts
    reservation_data = []
    for res in reservations:
        assignments = db.query(JobReservationUnit).filter_by(
            reservation_id=res.id
        ).all()
        assigned_count = len(assignments)

        # Enrich assignments with unit details, sorted by slot_index
        assignments_sorted = sorted(assignments, key=lambda a: (a.slot_index if a.slot_index is not None else 999))
        unit_ids = [a.unit_id for a in assignments_sorted]
        units = db.query(RosterUnit).filter(RosterUnit.id.in_(unit_ids)).all() if unit_ids else []
        units_by_id = {u.id: u for u in units}
        assigned_units = [
            {
                "id": a.unit_id,
                "power_type": a.power_type,
                "notes": a.notes,
                "location_name": a.location_name,
                "slot_index": a.slot_index,
                "deployed": units_by_id[a.unit_id].deployed if a.unit_id in units_by_id else False,
                "last_calibrated": units_by_id[a.unit_id].last_calibrated if a.unit_id in units_by_id else None,
            }
            for a in assignments_sorted
        ]

        # Check for calibration conflicts
        conflicts = check_calibration_conflicts(db, res.id)

        location_count = res.quantity_needed or assigned_count
        reservation_data.append({
            "reservation": res,
            "assigned_count": assigned_count,
            "location_count": location_count,
            "assigned_units": assigned_units,
            "has_conflicts": len(conflicts) > 0,
            "conflict_count": len(conflicts)
        })

    return templates.TemplateResponse(
        "partials/fleet_calendar/reservations_list.html",
        {
            "request": request,
            "reservations": reservation_data,
            "year": year,
            "device_type": device_type
        }
    )


@router.get("/api/fleet-calendar/planner-availability", response_class=JSONResponse)
|
||||||
|
async def get_planner_availability(
|
||||||
|
device_type: str = "seismograph",
|
||||||
|
start_date: Optional[str] = None,
|
||||||
|
end_date: Optional[str] = None,
|
||||||
|
exclude_reservation_id: Optional[str] = None,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""Get available units for the reservation planner split-panel UI.
|
||||||
|
Dates are optional — if omitted, returns all non-retired units regardless of reservations.
|
||||||
|
"""
|
||||||
|
if start_date and end_date:
|
||||||
|
try:
|
||||||
|
start = date.fromisoformat(start_date)
|
||||||
|
end = date.fromisoformat(end_date)
|
||||||
|
except ValueError:
|
||||||
|
raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")
|
||||||
|
units = get_available_units_for_period(db, start, end, device_type, exclude_reservation_id)
|
||||||
|
else:
|
||||||
|
# No dates: return all non-retired units of this type, with current reservation info
|
||||||
|
from backend.models import RosterUnit as RU
|
||||||
|
from datetime import timedelta
|
||||||
|
today = date.today()
|
||||||
|
all_units = db.query(RU).filter(
|
||||||
|
RU.device_type == device_type,
|
||||||
|
RU.retired == False
|
||||||
|
).all()
|
||||||
|
|
||||||
|
# Build a map: unit_id -> list of active/upcoming reservations
|
||||||
|
active_assignments = db.query(JobReservationUnit).join(
|
||||||
|
JobReservation, JobReservationUnit.reservation_id == JobReservation.id
|
||||||
|
).filter(
|
||||||
|
JobReservation.device_type == device_type,
|
||||||
|
JobReservation.end_date >= today
|
||||||
|
).all()
|
||||||
|
unit_reservations = {}
|
||||||
|
for assignment in active_assignments:
|
||||||
|
res = db.query(JobReservation).filter(JobReservation.id == assignment.reservation_id).first()
|
||||||
|
if not res:
|
||||||
|
continue
|
||||||
|
unit_reservations.setdefault(assignment.unit_id, []).append({
|
||||||
|
"reservation_id": res.id,
|
||||||
|
"reservation_name": res.name,
|
||||||
|
"start_date": res.start_date.isoformat() if res.start_date else None,
|
||||||
|
"end_date": res.end_date.isoformat() if res.end_date else None,
|
||||||
|
"color": res.color or "#3B82F6"
|
||||||
|
})
|
||||||
|
|
||||||
|
units = []
|
||||||
|
for u in all_units:
|
||||||
|
expiry = (u.last_calibrated + timedelta(days=365)) if u.last_calibrated else None
|
||||||
|
units.append({
|
||||||
|
"id": u.id,
|
||||||
|
"last_calibrated": u.last_calibrated.isoformat() if u.last_calibrated else None,
|
||||||
|
"expiry_date": expiry.isoformat() if expiry else None,
|
||||||
|
"calibration_status": "needs_calibration" if not u.last_calibrated else "valid",
|
||||||
|
"deployed": u.deployed,
|
||||||
|
"out_for_calibration": u.out_for_calibration or False,
|
||||||
|
"allocated": getattr(u, 'allocated', False) or False,
|
||||||
|
"allocated_to_project_id": getattr(u, 'allocated_to_project_id', None) or "",
|
||||||
|
"note": u.note or "",
|
||||||
|
"reservations": unit_reservations.get(u.id, [])
|
||||||
|
})
|
||||||
|
|
||||||
|
# Sort: benched first (easier to assign), then deployed, then by ID
|
||||||
|
units.sort(key=lambda u: (1 if u["deployed"] else 0, u["id"]))
|
||||||
|
|
||||||
|
return {
|
||||||
|
"units": units,
|
||||||
|
"start_date": start_date,
|
||||||
|
"end_date": end_date,
|
||||||
|
"count": len(units)
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
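Both planner endpoints derive a calibration expiry by adding a fixed 365-day validity period to `last_calibrated`. A sketch of that derivation (the one-year validity mirrors the `timedelta(days=365)` used above; the helper name and constant are illustrative):

```python
from datetime import date, timedelta
from typing import Optional

CAL_VALIDITY = timedelta(days=365)  # validity window assumed by these routes

def calibration_expiry(last_calibrated: Optional[date]) -> Optional[date]:
    # No calibration on record -> no expiry can be computed.
    return last_calibrated + CAL_VALIDITY if last_calibrated else None
```

Because the window is a fixed day count rather than a calendar year, a unit calibrated on 2024-01-01 (a leap year) expires on 2024-12-31, not 2025-01-01.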
@router.get("/api/fleet-calendar/unit-quick-info/{unit_id}", response_class=JSONResponse)
|
||||||
|
async def get_unit_quick_info(unit_id: str, db: Session = Depends(get_db)):
|
||||||
|
"""Return at-a-glance info for the planner quick-view modal."""
|
||||||
|
from backend.models import Emitter
|
||||||
|
u = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
|
||||||
|
if not u:
|
||||||
|
raise HTTPException(status_code=404, detail="Unit not found")
|
||||||
|
|
||||||
|
today = date.today()
|
||||||
|
expiry = (u.last_calibrated + timedelta(days=365)) if u.last_calibrated else None
|
||||||
|
|
||||||
|
# Active/upcoming reservations
|
||||||
|
assignments = db.query(JobReservationUnit).filter(JobReservationUnit.unit_id == unit_id).all()
|
||||||
|
reservations = []
|
||||||
|
for a in assignments:
|
||||||
|
res = db.query(JobReservation).filter(
|
||||||
|
JobReservation.id == a.reservation_id,
|
||||||
|
JobReservation.end_date >= today
|
||||||
|
).first()
|
||||||
|
if res:
|
||||||
|
reservations.append({
|
||||||
|
"name": res.name,
|
||||||
|
"start_date": res.start_date.isoformat() if res.start_date else None,
|
||||||
|
"end_date": res.end_date.isoformat() if res.end_date else None,
|
||||||
|
"end_date_tbd": res.end_date_tbd,
|
||||||
|
"color": res.color or "#3B82F6",
|
||||||
|
"location_name": a.location_name,
|
||||||
|
})
|
||||||
|
|
||||||
|
# Last seen from emitter
|
||||||
|
emitter = db.query(Emitter).filter(Emitter.unit_type == unit_id).first()
|
||||||
|
|
||||||
|
return {
|
||||||
|
"id": u.id,
|
||||||
|
"unit_type": u.unit_type,
|
||||||
|
"deployed": u.deployed,
|
||||||
|
"out_for_calibration": u.out_for_calibration or False,
|
||||||
|
"note": u.note or "",
|
||||||
|
"project_id": u.project_id or "",
|
||||||
|
"address": u.address or u.location or "",
|
||||||
|
"coordinates": u.coordinates or "",
|
||||||
|
"deployed_with_modem_id": u.deployed_with_modem_id or "",
|
||||||
|
"last_calibrated": u.last_calibrated.isoformat() if u.last_calibrated else None,
|
||||||
|
"next_calibration_due": u.next_calibration_due.isoformat() if u.next_calibration_due else (expiry.isoformat() if expiry else None),
|
||||||
|
"cal_expired": not u.last_calibrated or (expiry and expiry < today),
|
||||||
|
"last_seen": emitter.last_seen.isoformat() if emitter and emitter.last_seen else None,
|
||||||
|
"reservations": reservations,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/api/fleet-calendar/available-units", response_class=HTMLResponse)
|
||||||
|
async def get_available_units_partial(
|
||||||
|
request: Request,
|
||||||
|
start_date: str,
|
||||||
|
end_date: str,
|
||||||
|
device_type: str = "seismograph",
|
||||||
|
reservation_id: Optional[str] = None,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""Get available units as HTMX partial for the assignment modal."""
|
||||||
|
try:
|
||||||
|
start = date.fromisoformat(start_date)
|
||||||
|
end = date.fromisoformat(end_date)
|
||||||
|
except ValueError:
|
||||||
|
raise HTTPException(status_code=400, detail="Invalid date format")
|
||||||
|
|
||||||
|
available = get_available_units_for_period(
|
||||||
|
db, start, end, device_type, reservation_id
|
||||||
|
)
|
||||||
|
|
||||||
|
return templates.TemplateResponse(
|
||||||
|
"partials/fleet_calendar/available_units.html",
|
||||||
|
{
|
||||||
|
"request": request,
|
||||||
|
"units": available,
|
||||||
|
"start_date": start_date,
|
||||||
|
"end_date": end_date,
|
||||||
|
"device_type": device_type,
|
||||||
|
"reservation_id": reservation_id
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/api/fleet-calendar/month/{year}/{month}", response_class=HTMLResponse)
|
||||||
|
async def get_month_partial(
|
||||||
|
request: Request,
|
||||||
|
year: int,
|
||||||
|
month: int,
|
||||||
|
device_type: str = "seismograph",
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""Get a single month calendar as HTMX partial."""
|
||||||
|
calendar_data = get_calendar_year_data(db, year, device_type)
|
||||||
|
month_data = calendar_data["months"].get(month)
|
||||||
|
|
||||||
|
if not month_data:
|
||||||
|
raise HTTPException(status_code=404, detail="Invalid month")
|
||||||
|
|
||||||
|
return templates.TemplateResponse(
|
||||||
|
"partials/fleet_calendar/month_grid.html",
|
||||||
|
{
|
||||||
|
"request": request,
|
||||||
|
"year": year,
|
||||||
|
"month": month,
|
||||||
|
"month_data": month_data,
|
||||||
|
"device_type": device_type,
|
||||||
|
"today": date.today().isoformat()
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
# Promote Reservation to Project
# ============================================================================

@router.post("/api/fleet-calendar/reservations/{reservation_id}/promote-to-project", response_class=JSONResponse)
async def promote_reservation_to_project(
    reservation_id: str,
    request: Request,
    db: Session = Depends(get_db)
):
    """
    Promote a job reservation to a full project in the projects DB.
    Creates: Project + MonitoringLocations + UnitAssignments.
    """
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    data = await request.json()
    project_number = data.get("project_number") or None
    client_name = data.get("client_name") or None

    # Map device_type to project_type_id
    if reservation.device_type == "slm":
        project_type_id = "sound_monitoring"
        location_type = "sound"
    else:
        project_type_id = "vibration_monitoring"
        location_type = "vibration"

    # Check for duplicate project name
    existing = db.query(Project).filter_by(name=reservation.name).first()
    if existing:
        raise HTTPException(status_code=409, detail=f"A project named '{reservation.name}' already exists.")

    # Create the project
    project_id = str(uuid.uuid4())
    project = Project(
        id=project_id,
        name=reservation.name,
        project_number=project_number,
        client_name=client_name,
        project_type_id=project_type_id,
        status="upcoming",
        start_date=reservation.start_date,
        end_date=reservation.end_date,
        description=reservation.notes,
    )
    db.add(project)
    db.flush()

    # Load assignments sorted by slot_index
    assignments = db.query(JobReservationUnit).filter_by(reservation_id=reservation_id).all()
    assignments_sorted = sorted(assignments, key=lambda a: (a.slot_index if a.slot_index is not None else 999))

    locations_created = 0
    units_assigned = 0

    for i, assignment in enumerate(assignments_sorted):
        loc_num = str(i + 1).zfill(3)
        loc_name = assignment.location_name or f"Location {i + 1}"

        location = MonitoringLocation(
            id=str(uuid.uuid4()),
||||||
|
project_id=project_id,
|
||||||
|
location_type=location_type,
|
||||||
|
name=loc_name,
|
||||||
|
description=assignment.notes,
|
||||||
|
)
|
||||||
|
db.add(location)
|
||||||
|
db.flush()
|
||||||
|
locations_created += 1
|
||||||
|
|
||||||
|
if assignment.unit_id:
|
||||||
|
unit_assignment = UnitAssignment(
|
||||||
|
id=str(uuid.uuid4()),
|
||||||
|
unit_id=assignment.unit_id,
|
||||||
|
location_id=location.id,
|
||||||
|
project_id=project_id,
|
||||||
|
device_type=reservation.device_type or "seismograph",
|
||||||
|
status="active",
|
||||||
|
notes=f"Power: {assignment.power_type}" if assignment.power_type else None,
|
||||||
|
)
|
||||||
|
db.add(unit_assignment)
|
||||||
|
units_assigned += 1
|
||||||
|
|
||||||
|
db.commit()
|
||||||
|
|
||||||
|
logger.info(f"Promoted reservation '{reservation.name}' to project {project_id}")
|
||||||
|
|
||||||
|
return {
|
||||||
|
"success": True,
|
||||||
|
"project_id": project_id,
|
||||||
|
"project_name": reservation.name,
|
||||||
|
"locations_created": locations_created,
|
||||||
|
"units_assigned": units_assigned,
|
||||||
|
}
|
||||||
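The promotion loop above orders reservation slots so that rows with an unset `slot_index` sort last (via a 999 sentinel), then falls back to a generated name when `location_name` is empty. A standalone sketch of that ordering, using a hypothetical `Slot` dataclass in place of the real `JobReservationUnit` ORM model:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Slot:  # hypothetical stand-in for JobReservationUnit
    unit_id: str
    slot_index: Optional[int] = None
    location_name: Optional[str] = None


def order_slots(slots: List[Slot]) -> List[Slot]:
    # A None slot_index sorts after any real index, mirroring the 999 sentinel
    return sorted(slots, key=lambda s: s.slot_index if s.slot_index is not None else 999)


slots = [Slot("BE1234"), Slot("BE1111", 0, "North Wall"), Slot("BE2222", 1)]
ordered = order_slots(slots)
names = [s.location_name or f"Location {i + 1}" for i, s in enumerate(ordered)]
print(names)  # ['North Wall', 'Location 2', 'Location 3']
```

Note that the fallback name depends on the position after sorting, so an unindexed slot always gets a number after the explicitly ordered ones.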
backend/routers/modem_dashboard.py (new file, 429 lines)
@@ -0,0 +1,429 @@
"""
Modem Dashboard Router

Provides API endpoints for the Field Modems management page.
"""

from fastapi import APIRouter, Request, Depends, Query
from fastapi.responses import HTMLResponse
from sqlalchemy.orm import Session
from datetime import datetime
import subprocess
import time
import logging

from backend.database import get_db
from backend.models import RosterUnit
from backend.templates_config import templates

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/api/modem-dashboard", tags=["modem-dashboard"])


@router.get("/stats", response_class=HTMLResponse)
async def get_modem_stats(request: Request, db: Session = Depends(get_db)):
    """
    Get summary statistics for modem dashboard.
    Returns HTML partial with stat cards.
    """
    # Query all modems
    all_modems = db.query(RosterUnit).filter_by(device_type="modem").all()

    # Get IDs of modems that have devices paired to them
    paired_modem_ids = set()
    devices_with_modems = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id.isnot(None),
        RosterUnit.retired == False
    ).all()
    for device in devices_with_modems:
        if device.deployed_with_modem_id:
            paired_modem_ids.add(device.deployed_with_modem_id)

    # Count categories
    total_count = len(all_modems)
    retired_count = sum(1 for m in all_modems if m.retired)

    # In use = deployed AND paired with a device
    in_use_count = sum(1 for m in all_modems
                       if m.deployed and not m.retired and m.id in paired_modem_ids)

    # Spare = deployed but NOT paired (available for assignment)
    spare_count = sum(1 for m in all_modems
                      if m.deployed and not m.retired and m.id not in paired_modem_ids)

    # Benched = not deployed and not retired
    benched_count = sum(1 for m in all_modems if not m.deployed and not m.retired)

    return templates.TemplateResponse("partials/modem_stats.html", {
        "request": request,
        "total_count": total_count,
        "in_use_count": in_use_count,
        "spare_count": spare_count,
        "benched_count": benched_count,
        "retired_count": retired_count
    })
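The stats endpoint above buckets each modem into exactly one of four states, with `retired` taking precedence, then bench status, then pairing. A minimal sketch of that same rule as a pure function (not part of this diff; plain booleans stand in for the ORM row):

```python
def modem_status(deployed: bool, retired: bool, paired: bool) -> str:
    # Mirrors the dashboard's bucketing: retired wins, then benched,
    # then pairing decides in_use vs spare. Every modem lands in one bucket.
    if retired:
        return "retired"
    if not deployed:
        return "benched"
    return "in_use" if paired else "spare"


print(modem_status(True, False, True))    # in_use
print(modem_status(True, False, False))   # spare
print(modem_status(False, False, False))  # benched
```

Because the buckets are mutually exclusive, `in_use + spare + benched + retired` always equals the total modem count, which is what makes the stat cards add up.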


@router.get("/units", response_class=HTMLResponse)
async def get_modem_units(
    request: Request,
    db: Session = Depends(get_db),
    search: str = Query(None),
    filter_status: str = Query(None),  # "in_use", "spare", "benched", "retired"
):
    """
    Get list of modem units for the dashboard.
    Returns HTML partial with modem cards.
    """
    query = db.query(RosterUnit).filter_by(device_type="modem")

    # Filter by search term if provided
    if search:
        search_term = f"%{search}%"
        query = query.filter(
            (RosterUnit.id.ilike(search_term)) |
            (RosterUnit.ip_address.ilike(search_term)) |
            (RosterUnit.hardware_model.ilike(search_term)) |
            (RosterUnit.phone_number.ilike(search_term)) |
            (RosterUnit.location.ilike(search_term))
        )

    modems = query.order_by(
        RosterUnit.retired.asc(),
        RosterUnit.deployed.desc(),
        RosterUnit.id.asc()
    ).all()

    # Get paired device info for each modem
    paired_devices = {}
    devices_with_modems = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id.isnot(None),
        RosterUnit.retired == False
    ).all()
    for device in devices_with_modems:
        if device.deployed_with_modem_id:
            paired_devices[device.deployed_with_modem_id] = {
                "id": device.id,
                "device_type": device.device_type,
                "deployed": device.deployed
            }

    # Annotate modems with paired device info
    modem_list = []
    for modem in modems:
        paired = paired_devices.get(modem.id)

        # Determine status category
        if modem.retired:
            status = "retired"
        elif not modem.deployed:
            status = "benched"
        elif paired:
            status = "in_use"
        else:
            status = "spare"

        # Apply filter if specified
        if filter_status and status != filter_status:
            continue

        modem_list.append({
            "id": modem.id,
            "ip_address": modem.ip_address,
            "phone_number": modem.phone_number,
            "hardware_model": modem.hardware_model,
            "deployed": modem.deployed,
            "retired": modem.retired,
            "location": modem.location,
            "project_id": modem.project_id,
            "paired_device": paired,
            "status": status
        })

    return templates.TemplateResponse("partials/modem_list.html", {
        "request": request,
        "modems": modem_list
    })


@router.get("/{modem_id}/paired-device")
async def get_paired_device(modem_id: str, db: Session = Depends(get_db)):
    """
    Get the device (SLM/seismograph) that is paired with this modem.
    Returns JSON with device info or null if not paired.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # Find device paired with this modem
    device = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id == modem_id,
        RosterUnit.retired == False
    ).first()

    if device:
        return {
            "paired": True,
            "device": {
                "id": device.id,
                "device_type": device.device_type,
                "deployed": device.deployed,
                "project_id": device.project_id,
                "location": device.location or device.address
            }
        }

    return {"paired": False, "device": None}


@router.get("/{modem_id}/paired-device-html", response_class=HTMLResponse)
async def get_paired_device_html(modem_id: str, request: Request, db: Session = Depends(get_db)):
    """
    Get HTML partial showing the device paired with this modem.
    Used by unit_detail.html for modems.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return HTMLResponse('<p class="text-red-500">Modem not found</p>')

    # Find device paired with this modem
    device = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id == modem_id,
        RosterUnit.retired == False
    ).first()

    return templates.TemplateResponse("partials/modem_paired_device.html", {
        "request": request,
        "modem_id": modem_id,
        "device": device
    })


@router.get("/{modem_id}/ping")
async def ping_modem(modem_id: str, db: Session = Depends(get_db)):
    """
    Test modem connectivity with a simple ping.
    Returns response time and connection status.
    """
    # Get modem from database
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()

    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    if not modem.ip_address:
        return {"status": "error", "detail": f"Modem {modem_id} has no IP address configured"}

    try:
        # Ping the modem (1 packet, 2 second timeout)
        start_time = time.time()
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "2", modem.ip_address],
            capture_output=True,
            text=True,
            timeout=3
        )
        response_time = int((time.time() - start_time) * 1000)  # Convert to milliseconds

        if result.returncode == 0:
            return {
                "status": "success",
                "modem_id": modem_id,
                "ip_address": modem.ip_address,
                "response_time_ms": response_time,
                "message": "Modem is responding"
            }
        else:
            return {
                "status": "error",
                "modem_id": modem_id,
                "ip_address": modem.ip_address,
                "detail": "Modem not responding to ping"
            }

    except subprocess.TimeoutExpired:
        return {
            "status": "error",
            "modem_id": modem_id,
            "ip_address": modem.ip_address,
            "detail": "Ping timeout"
        }
    except Exception as e:
        logger.error(f"Failed to ping modem {modem_id}: {e}")
        return {
            "status": "error",
            "modem_id": modem_id,
            "detail": str(e)
        }
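The ping endpoint shells out with `ping -c 1 -W 2`, which is the Linux flag syntax; Windows spells the same options `-n` (count) and `-w` (timeout in milliseconds). If the server ever needs to run off-Linux, one approach (an assumption, not something this PR does) is to pick the argument list by platform:

```python
import platform
from typing import List


def ping_args(ip: str) -> List[str]:
    # Linux uses -c (count) and -W (timeout, seconds);
    # Windows uses -n (count) and -w (timeout, milliseconds).
    if platform.system() == "Windows":
        return ["ping", "-n", "1", "-w", "2000", ip]
    return ["ping", "-c", "1", "-W", "2", ip]


args = ping_args("10.0.0.1")
print(args[0], args[-1])
```

The list could then be passed straight to `subprocess.run(...)` in place of the hard-coded one.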


@router.get("/{modem_id}/diagnostics")
async def get_modem_diagnostics(modem_id: str, db: Session = Depends(get_db)):
    """
    Get modem diagnostics (signal strength, data usage, uptime).

    Currently returns placeholders. When ModemManager is available,
    this endpoint will query it for real diagnostics.
    """
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # TODO: Query ModemManager backend when available
    return {
        "status": "unavailable",
        "message": "ModemManager integration not yet available",
        "modem_id": modem_id,
        "signal_strength_dbm": None,
        "data_usage_mb": None,
        "uptime_seconds": None,
        "carrier": None,
        "connection_type": None  # LTE, 5G, etc.
    }


@router.get("/{modem_id}/pairable-devices")
async def get_pairable_devices(
    modem_id: str,
    db: Session = Depends(get_db),
    search: str = Query(None),
    hide_paired: bool = Query(True)
):
    """
    Get list of devices (seismographs and SLMs) that can be paired with this modem.
    Used by the device picker modal in unit_detail.html.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # Query seismographs and SLMs
    query = db.query(RosterUnit).filter(
        RosterUnit.device_type.in_(["seismograph", "sound_level_meter"]),
        RosterUnit.retired == False
    )

    # Filter by search term if provided
    if search:
        search_term = f"%{search}%"
        query = query.filter(
            (RosterUnit.id.ilike(search_term)) |
            (RosterUnit.project_id.ilike(search_term)) |
            (RosterUnit.location.ilike(search_term)) |
            (RosterUnit.address.ilike(search_term)) |
            (RosterUnit.note.ilike(search_term))
        )

    devices = query.order_by(
        RosterUnit.deployed.desc(),
        RosterUnit.device_type.asc(),
        RosterUnit.id.asc()
    ).all()

    # Build device list
    device_list = []
    for device in devices:
        # Skip already paired devices if hide_paired is True
        is_paired_to_other = (
            device.deployed_with_modem_id is not None and
            device.deployed_with_modem_id != modem_id
        )
        is_paired_to_this = device.deployed_with_modem_id == modem_id

        if hide_paired and is_paired_to_other:
            continue

        device_list.append({
            "id": device.id,
            "device_type": device.device_type,
            "deployed": device.deployed,
            "project_id": device.project_id,
            "location": device.location or device.address,
            "note": device.note,
            "paired_modem_id": device.deployed_with_modem_id,
            "is_paired_to_this": is_paired_to_this,
            "is_paired_to_other": is_paired_to_other
        })

    return {"devices": device_list, "modem_id": modem_id}


@router.post("/{modem_id}/pair")
async def pair_device_to_modem(
    modem_id: str,
    db: Session = Depends(get_db),
    device_id: str = Query(..., description="ID of the device to pair")
):
    """
    Pair a device (seismograph or SLM) to this modem.
    Updates the device's deployed_with_modem_id field.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # Find the device
    device = db.query(RosterUnit).filter(
        RosterUnit.id == device_id,
        RosterUnit.device_type.in_(["seismograph", "sound_level_meter"]),
        RosterUnit.retired == False
    ).first()
    if not device:
        return {"status": "error", "detail": f"Device {device_id} not found"}

    # Unpair any device currently paired to this modem
    currently_paired = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id == modem_id
    ).all()
    for paired_device in currently_paired:
        paired_device.deployed_with_modem_id = None

    # Pair the new device
    device.deployed_with_modem_id = modem_id
    db.commit()

    return {
        "status": "success",
        "modem_id": modem_id,
        "device_id": device_id,
        "message": f"Device {device_id} paired to modem {modem_id}"
    }


@router.post("/{modem_id}/unpair")
async def unpair_device_from_modem(modem_id: str, db: Session = Depends(get_db)):
    """
    Unpair any device currently paired to this modem.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # Find and unpair device
    device = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id == modem_id
    ).first()

    if device:
        old_device_id = device.id
        device.deployed_with_modem_id = None
        db.commit()
        return {
            "status": "success",
            "modem_id": modem_id,
            "unpaired_device_id": old_device_id,
            "message": f"Device {old_device_id} unpaired from modem {modem_id}"
        }

    return {
        "status": "success",
        "modem_id": modem_id,
        "message": "No device was paired to this modem"
    }
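The pair endpoint keeps the relationship one-device-per-modem by first clearing any row whose `deployed_with_modem_id` points at this modem, then writing the new pairing. A dict-based sketch of that invariant (hypothetical state, not the ORM):

```python
from typing import Dict


def pair(pairings: Dict[str, str], modem_id: str, device_id: str) -> Dict[str, str]:
    """pairings maps device_id -> modem_id; at most one device per modem."""
    # Unpair whatever currently points at this modem, then pair the new device
    cleared = {d: m for d, m in pairings.items() if m != modem_id}
    cleared[device_id] = modem_id
    return cleared


state = {"BE1111": "M-01", "SLM-7": "M-02"}
state = pair(state, "M-01", "BE2222")
print(state)  # {'SLM-7': 'M-02', 'BE2222': 'M-01'}
```

Re-pairing is therefore idempotent from the modem's point of view: the old device is always released before the new one is attached, so no modem ever ends up with two devices.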
@@ -6,7 +6,6 @@ and unit assignments within projects.
 """
 
 from fastapi import APIRouter, Request, Depends, HTTPException, Query
-from fastapi.templating import Jinja2Templates
 from fastapi.responses import HTMLResponse, JSONResponse
 from sqlalchemy.orm import Session
 from sqlalchemy import and_, or_
@@ -15,6 +14,12 @@ from typing import Optional
 import uuid
 import json
 
+from fastapi import UploadFile, File
+import zipfile
+import hashlib
+import io
+from pathlib import Path
+
 from backend.database import get_db
 from backend.models import (
     Project,
@@ -22,11 +27,59 @@ from backend.models import (
     MonitoringLocation,
     UnitAssignment,
     RosterUnit,
-    RecordingSession,
+    MonitoringSession,
+    DataFile,
 )
+from backend.templates_config import templates
+from backend.utils.timezone import local_to_utc
 
 router = APIRouter(prefix="/api/projects/{project_id}", tags=["project-locations"])
-templates = Jinja2Templates(directory="templates")
+
+
+# ============================================================================
+# Shared helpers
+# ============================================================================
+
+def _require_sound_project(project) -> None:
+    """Raise 400 if the project is not a sound_monitoring project."""
+    if not project or project.project_type_id != "sound_monitoring":
+        raise HTTPException(
+            status_code=400,
+            detail="This feature is only available for Sound Monitoring projects.",
+        )
+
+
+# ============================================================================
+# Session period helpers
+# ============================================================================
+
+def _derive_period_type(dt: datetime) -> str:
+    """
+    Classify a session start time into one of four period types.
+    Night = 22:00–07:00, Day = 07:00–22:00.
+    Weekend = Saturday (5) or Sunday (6).
+    """
+    is_weekend = dt.weekday() >= 5
+    is_night = dt.hour >= 22 or dt.hour < 7
+    if is_weekend:
+        return "weekend_night" if is_night else "weekend_day"
+    return "weekday_night" if is_night else "weekday_day"
+
+
+def _build_session_label(dt: datetime, location_name: str, period_type: str) -> str:
+    """Build a human-readable session label, e.g. 'NRL-1 — Sun 2/23 — Night'.
+    Uses started_at date as-is; user can correct period_type in the wizard.
+    """
+    day_abbr = dt.strftime("%a")
+    date_str = f"{dt.month}/{dt.day}"
+    period_str = {
+        "weekday_day": "Day",
+        "weekday_night": "Night",
+        "weekend_day": "Day",
+        "weekend_night": "Night",
+    }.get(period_type, "")
+    parts = [p for p in [location_name, f"{day_abbr} {date_str}", period_str] if p]
+    return " — ".join(parts)
+
+
 # ============================================================================
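The new `_derive_period_type` helper is a pure function of the start time, so its rules (night = 22:00 to 07:00, weekend = Saturday or Sunday) are easy to exercise standalone. A sketch reimplementing the same classification outside the router:

```python
from datetime import datetime


def derive_period_type(dt: datetime) -> str:
    # Same rules as the router helper: weekend = Sat/Sun (weekday >= 5),
    # night = 22:00-07:00; otherwise day.
    is_weekend = dt.weekday() >= 5
    is_night = dt.hour >= 22 or dt.hour < 7
    if is_weekend:
        return "weekend_night" if is_night else "weekend_day"
    return "weekday_night" if is_night else "weekday_day"


print(derive_period_type(datetime(2025, 2, 23, 23, 30)))  # Sunday 23:30 -> weekend_night
print(derive_period_type(datetime(2025, 2, 25, 9, 0)))    # Tuesday 09:00 -> weekday_day
```

One boundary worth noting: 07:00 exactly counts as day and 22:00 exactly counts as night, and a night session that starts before midnight keeps the weekday of its start date.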
@@ -59,11 +112,11 @@ async def get_project_locations(
|
|||||||
# Enrich with assignment info
|
# Enrich with assignment info
|
||||||
locations_data = []
|
locations_data = []
|
||||||
for location in locations:
|
for location in locations:
|
||||||
# Get active assignment
|
# Get active assignment (active = assigned_until IS NULL)
|
||||||
assignment = db.query(UnitAssignment).filter(
|
assignment = db.query(UnitAssignment).filter(
|
||||||
and_(
|
and_(
|
||||||
UnitAssignment.location_id == location.id,
|
UnitAssignment.location_id == location.id,
|
||||||
UnitAssignment.status == "active",
|
UnitAssignment.assigned_until == None,
|
||||||
)
|
)
|
||||||
).first()
|
).first()
|
||||||
|
|
||||||
@@ -71,8 +124,8 @@ async def get_project_locations(
|
|||||||
if assignment:
|
if assignment:
|
||||||
assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
|
assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
|
||||||
|
|
||||||
# Count recording sessions
|
# Count monitoring sessions
|
||||||
session_count = db.query(RecordingSession).filter_by(
|
session_count = db.query(MonitoringSession).filter_by(
|
||||||
location_id=location.id
|
location_id=location.id
|
||||||
).count()
|
).count()
|
||||||
|
|
||||||
@@ -90,6 +143,40 @@ async def get_project_locations(
|
|||||||
})
|
})
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/locations-json")
|
||||||
|
async def get_project_locations_json(
|
||||||
|
project_id: str,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
location_type: Optional[str] = Query(None),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Get all monitoring locations for a project as JSON.
|
||||||
|
Used by the schedule modal to populate location dropdown.
|
||||||
|
"""
|
||||||
|
project = db.query(Project).filter_by(id=project_id).first()
|
||||||
|
if not project:
|
||||||
|
raise HTTPException(status_code=404, detail="Project not found")
|
||||||
|
|
||||||
|
query = db.query(MonitoringLocation).filter_by(project_id=project_id)
|
||||||
|
|
||||||
|
if location_type:
|
||||||
|
query = query.filter_by(location_type=location_type)
|
||||||
|
|
||||||
|
locations = query.order_by(MonitoringLocation.name).all()
|
||||||
|
|
||||||
|
return [
|
||||||
|
{
|
||||||
|
"id": loc.id,
|
||||||
|
"name": loc.name,
|
||||||
|
"location_type": loc.location_type,
|
||||||
|
"description": loc.description,
|
||||||
|
"address": loc.address,
|
||||||
|
"coordinates": loc.coordinates,
|
||||||
|
}
|
||||||
|
for loc in locations
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
@router.post("/locations/create")
|
@router.post("/locations/create")
|
||||||
async def create_location(
|
async def create_location(
|
||||||
project_id: str,
|
project_id: str,
|
||||||
@@ -185,11 +272,11 @@ async def delete_location(
|
|||||||
if not location:
|
if not location:
|
||||||
raise HTTPException(status_code=404, detail="Location not found")
|
raise HTTPException(status_code=404, detail="Location not found")
|
||||||
|
|
||||||
# Check if location has active assignments
|
# Check if location has active assignments (active = assigned_until IS NULL)
|
||||||
active_assignments = db.query(UnitAssignment).filter(
|
active_assignments = db.query(UnitAssignment).filter(
|
||||||
and_(
|
and_(
|
||||||
UnitAssignment.location_id == location_id,
|
UnitAssignment.location_id == location_id,
|
||||||
UnitAssignment.status == "active",
|
UnitAssignment.assigned_until == None,
|
||||||
)
|
)
|
||||||
).count()
|
).count()
|
||||||
|
|
||||||
@@ -280,18 +367,18 @@ async def assign_unit_to_location(
|
|||||||
detail=f"Unit type '{unit.device_type}' does not match location type '{location.location_type}'",
|
detail=f"Unit type '{unit.device_type}' does not match location type '{location.location_type}'",
|
||||||
)
|
)
|
||||||
|
|
||||||
# Check if location already has an active assignment
|
# Check if location already has an active assignment (active = assigned_until IS NULL)
|
||||||
existing_assignment = db.query(UnitAssignment).filter(
|
existing_assignment = db.query(UnitAssignment).filter(
|
||||||
and_(
|
and_(
|
||||||
UnitAssignment.location_id == location_id,
|
UnitAssignment.location_id == location_id,
|
||||||
UnitAssignment.status == "active",
|
UnitAssignment.assigned_until == None,
|
||||||
)
|
)
|
||||||
).first()
|
).first()
|
||||||
|
|
||||||
if existing_assignment:
|
if existing_assignment:
|
||||||
raise HTTPException(
|
raise HTTPException(
|
||||||
status_code=400,
|
status_code=400,
|
||||||
detail=f"Location already has an active unit assignment ({existing_assignment.unit_id}). Unassign first.",
|
detail=f"Location already has an active unit assignment ({existing_assignment.unit_id}). Use swap to replace it.",
|
||||||
)
|
)
|
||||||
|
|
||||||
# Create new assignment
|
# Create new assignment
|
||||||
@@ -337,19 +424,19 @@ async def unassign_unit(
|
|||||||
if not assignment:
|
if not assignment:
|
||||||
raise HTTPException(status_code=404, detail="Assignment not found")
|
raise HTTPException(status_code=404, detail="Assignment not found")
|
||||||
|
|
||||||
# Check if there are active recording sessions
|
# Check if there are active monitoring sessions
|
||||||
active_sessions = db.query(RecordingSession).filter(
|
active_sessions = db.query(MonitoringSession).filter(
|
||||||
and_(
|
and_(
|
||||||
RecordingSession.location_id == assignment.location_id,
|
MonitoringSession.location_id == assignment.location_id,
|
||||||
RecordingSession.unit_id == assignment.unit_id,
|
MonitoringSession.unit_id == assignment.unit_id,
|
||||||
RecordingSession.status == "recording",
|
MonitoringSession.status == "recording",
|
||||||
)
|
)
|
||||||
).count()
|
).count()
|
||||||
|
|
||||||
if active_sessions > 0:
|
if active_sessions > 0:
|
||||||
raise HTTPException(
|
raise HTTPException(
|
||||||
status_code=400,
|
status_code=400,
|
||||||
detail="Cannot unassign unit with active recording sessions. Stop recording first.",
|
detail="Cannot unassign unit with active monitoring sessions. Stop monitoring first.",
|
||||||
)
|
)
|
||||||
|
|
||||||
assignment.status = "completed"
|
assignment.status = "completed"
|
||||||
@@ -360,10 +447,120 @@ async def unassign_unit(
|
|||||||
return {"success": True, "message": "Unit unassigned successfully"}
|
return {"success": True, "message": "Unit unassigned successfully"}
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/locations/{location_id}/swap")
|
||||||
|
async def swap_unit_on_location(
|
||||||
|
project_id: str,
|
||||||
|
location_id: str,
|
||||||
|
request: Request,
|
||||||
|
    db: Session = Depends(get_db),
):
    """
    Swap the unit assigned to a vibration monitoring location.
    Ends the current active assignment (if any), creates a new one,
    and optionally updates modem pairing on the seismograph.
    Works for first-time assignments too (no current assignment = just create).
    """
    location = db.query(MonitoringLocation).filter_by(
        id=location_id,
        project_id=project_id,
    ).first()
    if not location:
        raise HTTPException(status_code=404, detail="Location not found")

    form_data = await request.form()
    unit_id = form_data.get("unit_id")
    modem_id = form_data.get("modem_id") or None
    notes = form_data.get("notes") or None

    if not unit_id:
        raise HTTPException(status_code=400, detail="unit_id is required")

    # Validate new unit
    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
    if not unit:
        raise HTTPException(status_code=404, detail="Unit not found")

    expected_device_type = "slm" if location.location_type == "sound" else "seismograph"
    if unit.device_type != expected_device_type:
        raise HTTPException(
            status_code=400,
            detail=f"Unit type '{unit.device_type}' does not match location type '{location.location_type}'",
        )

    # End current active assignment if one exists (active = assigned_until IS NULL)
    current = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.assigned_until == None,
        )
    ).first()
    if current:
        current.assigned_until = datetime.utcnow()
        current.status = "completed"

    # Create new assignment
    new_assignment = UnitAssignment(
        id=str(uuid.uuid4()),
        unit_id=unit_id,
        location_id=location_id,
        project_id=project_id,
        device_type=unit.device_type,
        assigned_until=None,
        status="active",
        notes=notes,
    )
    db.add(new_assignment)

    # Update modem pairing on the seismograph if modem provided
    if modem_id:
        modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
        if not modem:
            raise HTTPException(status_code=404, detail=f"Modem '{modem_id}' not found")
        unit.deployed_with_modem_id = modem_id
        modem.deployed_with_unit_id = unit_id
    else:
        # Clear modem pairing if not provided
        unit.deployed_with_modem_id = None

    db.commit()

    return JSONResponse({
        "success": True,
        "assignment_id": new_assignment.id,
        "message": f"Unit '{unit_id}' assigned to '{location.name}'" + (f" with modem '{modem_id}'" if modem_id else ""),
    })

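The swap endpoint above maintains a simple invariant: at most one assignment per location has `assigned_until IS NULL`. A minimal sketch of that state transition using plain dataclasses (the names here are illustrative, not the app's actual models):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class Assignment:
    unit_id: str
    location_id: str
    assigned_until: Optional[datetime] = None  # None means "currently active"
    status: str = "active"


def swap_unit(assignments: list, location_id: str, new_unit_id: str) -> Assignment:
    """End the active assignment for a location (if any), then create a new one."""
    for a in assignments:
        if a.location_id == location_id and a.assigned_until is None:
            a.assigned_until = datetime.utcnow()
            a.status = "completed"
    new = Assignment(unit_id=new_unit_id, location_id=location_id)
    assignments.append(new)
    return new


log = []
swap_unit(log, "loc1", "U-100")  # first-time assignment: nothing to end
swap_unit(log, "loc1", "U-200")  # ends U-100's assignment, creates U-200's
active = [a for a in log if a.assigned_until is None]
print([a.unit_id for a in active])  # ['U-200']
```

Because ending the old row and adding the new one happen before a single commit in the real endpoint, a reader never observes two active rows for the same location.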
# ============================================================================
# Available Units for Assignment
# ============================================================================


@router.get("/available-modems", response_class=JSONResponse)
async def get_available_modems(
    project_id: str,
    db: Session = Depends(get_db),
):
    """
    Get all deployed, non-retired modems for the modem assignment dropdown.
    """
    modems = db.query(RosterUnit).filter(
        and_(
            RosterUnit.device_type == "modem",
            RosterUnit.deployed == True,
            RosterUnit.retired == False,
        )
    ).order_by(RosterUnit.id).all()

    return [
        {
            "id": m.id,
            "hardware_model": m.hardware_model,
            "ip_address": m.ip_address,
        }
        for m in modems
    ]


@router.get("/available-units", response_class=JSONResponse)
async def get_available_units(
    project_id: str,
@@ -386,9 +583,9 @@ async def get_available_units(
        )
    ).all()

-    # Filter out units that already have active assignments
+    # Filter out units that already have active assignments (active = assigned_until IS NULL)
    assigned_unit_ids = db.query(UnitAssignment.unit_id).filter(
-        UnitAssignment.status == "active"
+        UnitAssignment.assigned_until == None
    ).distinct().all()
    assigned_unit_ids = [uid[0] for uid in assigned_unit_ids]

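The hunk above changes the "active" test from a status string to the `assigned_until IS NULL` predicate. The SQL semantics can be checked directly with sqlite3 (the table and rows here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE unit_assignments (unit_id TEXT, assigned_until TEXT)")
conn.executemany(
    "INSERT INTO unit_assignments VALUES (?, ?)",
    [("U-1", None), ("U-2", "2025-01-01T00:00:00"), ("U-3", None)],
)
# Active assignments are exactly the rows where assigned_until IS NULL.
# Note that "assigned_until = NULL" would match nothing in SQL; IS NULL
# is the correct comparison, which is why SQLAlchemy translates the
# Python expression `assigned_until == None` into IS NULL.
active = [row[0] for row in conn.execute(
    "SELECT unit_id FROM unit_assignments WHERE assigned_until IS NULL ORDER BY unit_id"
)]
print(active)  # ['U-1', 'U-3']
```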
@@ -418,14 +615,12 @@ async def get_nrl_sessions(
    db: Session = Depends(get_db),
):
    """
-    Get recording sessions for a specific NRL.
+    Get monitoring sessions for a specific NRL.
    Returns HTML partial with session list.
    """
-    from backend.models import RecordingSession, RosterUnit
-
-    sessions = db.query(RecordingSession).filter_by(
+    sessions = db.query(MonitoringSession).filter_by(
        location_id=location_id
-    ).order_by(RecordingSession.started_at.desc()).all()
+    ).order_by(MonitoringSession.started_at.desc()).all()

    # Enrich with unit details
    sessions_data = []
@@ -458,14 +653,12 @@ async def get_nrl_files(
    Get data files for a specific NRL.
    Returns HTML partial with file list.
    """
-    from backend.models import DataFile, RecordingSession
-
-    # Join DataFile with RecordingSession to filter by location_id
+    # Join DataFile with MonitoringSession to filter by location_id
    files = db.query(DataFile).join(
-        RecordingSession,
+        MonitoringSession,
-        DataFile.session_id == RecordingSession.id
+        DataFile.session_id == MonitoringSession.id
    ).filter(
-        RecordingSession.location_id == location_id
+        MonitoringSession.location_id == location_id
    ).order_by(DataFile.created_at.desc()).all()

    # Enrich with session details
@@ -473,7 +666,7 @@ async def get_nrl_files(
    for file in files:
        session = None
        if file.session_id:
-            session = db.query(RecordingSession).filter_by(id=file.session_id).first()
+            session = db.query(MonitoringSession).filter_by(id=file.session_id).first()

        files_data.append({
            "file": file,
@@ -486,3 +679,324 @@ async def get_nrl_files(
        "location_id": location_id,
        "files": files_data,
    })
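The join in this hunk filters files by location indirectly, through their session, since `DataFile` carries only a `session_id`. An equivalent sqlite3 sketch (schema simplified and hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE monitoring_sessions (id TEXT PRIMARY KEY, location_id TEXT);
CREATE TABLE data_files (id TEXT PRIMARY KEY, session_id TEXT, file_path TEXT);
INSERT INTO monitoring_sessions VALUES ('s1', 'locA'), ('s2', 'locB');
INSERT INTO data_files VALUES
    ('f1', 's1', 'a.rnd'),
    ('f2', 's2', 'b.rnd'),
    ('f3', 's1', 'a.rnh');
""")
# data_files has no location_id column of its own, so reach the
# location through the session the file belongs to.
rows = conn.execute("""
    SELECT df.file_path
    FROM data_files df
    JOIN monitoring_sessions ms ON df.session_id = ms.id
    WHERE ms.location_id = 'locA'
    ORDER BY df.file_path
""").fetchall()
print([r[0] for r in rows])  # ['a.rnd', 'a.rnh']
```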


# ============================================================================
# Manual SD Card Data Upload
# ============================================================================


def _parse_rnh(content: bytes) -> dict:
    """
    Parse a Rion .rnh metadata file (INI-style with [Section] headers).
    Returns a dict of key metadata fields.
    """
    result = {}
    try:
        text = content.decode("utf-8", errors="replace")
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith("["):
                continue
            if "," in line:
                key, _, value = line.partition(",")
                key = key.strip()
                value = value.strip()
                if key == "Serial Number":
                    result["serial_number"] = value
                elif key == "Store Name":
                    result["store_name"] = value
                elif key == "Index Number":
                    result["index_number"] = value
                elif key == "Measurement Start Time":
                    result["start_time_str"] = value
                elif key == "Measurement Stop Time":
                    result["stop_time_str"] = value
                elif key == "Total Measurement Time":
                    result["total_time_str"] = value
    except Exception:
        pass
    return result


def _parse_rnh_datetime(s: str):
    """Parse RNH datetime string: '2026/02/17 19:00:19' -> datetime"""
    from datetime import datetime
    if not s:
        return None
    try:
        return datetime.strptime(s.strip(), "%Y/%m/%d %H:%M:%S")
    except Exception:
        return None


def _classify_file(filename: str) -> str:
    """Classify a file by name into a DataFile file_type."""
    name = filename.lower()
    if name.endswith(".rnh"):
        return "log"
    if name.endswith(".rnd"):
        return "measurement"
    if name.endswith(".zip"):
        return "archive"
    return "data"

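The `.rnh` format handled by the helpers above is a comma-delimited key/value file with `[Section]` headers. A self-contained sketch of the same parsing approach, fed a synthetic sample (the content below is illustrative, not a real NL-43 dump):

```python
from datetime import datetime

SAMPLE = b"""[Setting]
Serial Number,00123456
Store Name,Auto_0001
Measurement Start Time,2026/02/17 19:00:19
Measurement Stop Time,2026/02/18 07:00:19
"""


def parse_rnh_fields(content: bytes) -> dict:
    """Skip [Section] headers and blanks; split each data line at the first comma."""
    fields = {}
    for line in content.decode("utf-8", errors="replace").splitlines():
        line = line.strip()
        if not line or line.startswith("["):
            continue
        key, _, value = line.partition(",")
        fields[key.strip()] = value.strip()
    return fields


meta = parse_rnh_fields(SAMPLE)
start = datetime.strptime(meta["Measurement Start Time"], "%Y/%m/%d %H:%M:%S")
print(meta["Serial Number"], start.isoformat())  # 00123456 2026-02-17T19:00:19
```

`str.partition` splits only at the first comma, so values containing commas survive intact, which is why the router's `_parse_rnh` uses the same call.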
@router.post("/nrl/{location_id}/upload-data")
async def upload_nrl_data(
    project_id: str,
    location_id: str,
    db: Session = Depends(get_db),
    files: list[UploadFile] = File(...),
):
    """
    Manually upload SD card data for an offline NRL.

    Accepts either:
    - A single .zip file (the Auto_#### folder zipped) — auto-extracted
    - Multiple .rnd / .rnh files selected directly from the SD card folder

    Creates a MonitoringSession from .rnh metadata and DataFile records
    for each measurement file. No unit assignment required.
    """
    from datetime import datetime

    # Verify project and location exist
    project = db.query(Project).filter_by(id=project_id).first()
    _require_sound_project(project)

    location = db.query(MonitoringLocation).filter_by(
        id=location_id, project_id=project_id
    ).first()
    if not location:
        raise HTTPException(status_code=404, detail="Location not found")

    # --- Step 1: Normalize to (filename, bytes) list ---
    file_entries: list[tuple[str, bytes]] = []

    if len(files) == 1 and files[0].filename.lower().endswith(".zip"):
        raw = await files[0].read()
        try:
            with zipfile.ZipFile(io.BytesIO(raw)) as zf:
                for info in zf.infolist():
                    if info.is_dir():
                        continue
                    name = Path(info.filename).name  # strip folder path
                    if not name:
                        continue
                    file_entries.append((name, zf.read(info)))
        except zipfile.BadZipFile:
            raise HTTPException(status_code=400, detail="Uploaded file is not a valid ZIP archive.")
    else:
        for uf in files:
            data = await uf.read()
            file_entries.append((uf.filename, data))

    if not file_entries:
        raise HTTPException(status_code=400, detail="No usable files found in upload.")

    # --- Step 1b: Filter to only relevant files ---
    # Keep: .rnh (metadata) and measurement .rnd files
    # NL-43 generates two .rnd types: _Leq_ (15-min averages, wanted) and _Lp_ (1-sec granular, skip)
    # AU2 (NL-23/older Rion) generates a single Au2_####.rnd per session — always keep those
    # Drop: _Lp_ .rnd, .xlsx, .mp3, and anything else
    def _is_wanted(fname: str) -> bool:
        n = fname.lower()
        if n.endswith(".rnh"):
            return True
        if n.endswith(".rnd"):
            if "_leq_" in n:  # NL-43 Leq file
                return True
            if n.startswith("au2_"):  # AU2 format (NL-23) — always Leq equivalent
                return True
            if "_lp" not in n and "_leq_" not in n:
                # Unknown .rnd format — include it so we don't silently drop data
                return True
        return False

    file_entries = [(fname, fbytes) for fname, fbytes in file_entries if _is_wanted(fname)]

    if not file_entries:
        raise HTTPException(status_code=400, detail="No usable .rnd or .rnh files found. Expected NL-43 _Leq_ files or AU2 format .rnd files.")

    # --- Step 2: Find and parse .rnh metadata ---
    rnh_meta = {}
    for fname, fbytes in file_entries:
        if fname.lower().endswith(".rnh"):
            rnh_meta = _parse_rnh(fbytes)
            break

    # RNH files store local time (no UTC offset). Use local values for period
    # classification / label generation, then convert to UTC for DB storage so
    # the local_datetime Jinja filter displays the correct time.
    started_at_local = _parse_rnh_datetime(rnh_meta.get("start_time_str")) or datetime.utcnow()
    stopped_at_local = _parse_rnh_datetime(rnh_meta.get("stop_time_str"))

    started_at = local_to_utc(started_at_local)
    stopped_at = local_to_utc(stopped_at_local) if stopped_at_local else None

    duration_seconds = None
    if started_at and stopped_at:
        duration_seconds = int((stopped_at - started_at).total_seconds())

    store_name = rnh_meta.get("store_name", "")
    serial_number = rnh_meta.get("serial_number", "")
    index_number = rnh_meta.get("index_number", "")

    # --- Step 3: Create MonitoringSession ---
    # Use local times for period/label so classification reflects the clock at the site.
    period_type = _derive_period_type(started_at_local) if started_at_local else None
    session_label = _build_session_label(started_at_local, location.name, period_type) if started_at_local else None

    session_id = str(uuid.uuid4())
    monitoring_session = MonitoringSession(
        id=session_id,
        project_id=project_id,
        location_id=location_id,
        unit_id=None,
        session_type="sound",
        started_at=started_at,
        stopped_at=stopped_at,
        duration_seconds=duration_seconds,
        status="completed",
        session_label=session_label,
        period_type=period_type,
        session_metadata=json.dumps({
            "source": "manual_upload",
            "store_name": store_name,
            "serial_number": serial_number,
            "index_number": index_number,
        }),
    )
    db.add(monitoring_session)
    db.commit()
    db.refresh(monitoring_session)

    # --- Step 4: Write files to disk and create DataFile records ---
    output_dir = Path("data/Projects") / project_id / session_id
    output_dir.mkdir(parents=True, exist_ok=True)

    leq_count = 0
    lp_count = 0
    metadata_count = 0
    files_imported = 0

    for fname, fbytes in file_entries:
        file_type = _classify_file(fname)
        fname_lower = fname.lower()

        # Track counts for summary
        if fname_lower.endswith(".rnd"):
            if "_leq_" in fname_lower:
                leq_count += 1
            elif "_lp" in fname_lower:
                lp_count += 1
        elif fname_lower.endswith(".rnh"):
            metadata_count += 1

        # Write to disk
        dest = output_dir / fname
        dest.write_bytes(fbytes)

        # Compute checksum
        checksum = hashlib.sha256(fbytes).hexdigest()

        # Store relative path from data/ dir
        rel_path = str(dest.relative_to("data"))

        data_file = DataFile(
            id=str(uuid.uuid4()),
            session_id=session_id,
            file_path=rel_path,
            file_type=file_type,
            file_size_bytes=len(fbytes),
            downloaded_at=datetime.utcnow(),
            checksum=checksum,
            file_metadata=json.dumps({
                "source": "manual_upload",
                "original_filename": fname,
                "store_name": store_name,
            }),
        )
        db.add(data_file)
        files_imported += 1

    db.commit()

    return {
        "success": True,
        "session_id": session_id,
        "files_imported": files_imported,
        "leq_files": leq_count,
        "lp_files": lp_count,
        "metadata_files": metadata_count,
        "store_name": store_name,
        "started_at": started_at.isoformat() if started_at else None,
        "stopped_at": stopped_at.isoformat() if stopped_at else None,
    }


# ============================================================================
# NRL Live Status (connected NRLs only)
# ============================================================================


@router.get("/nrl/{location_id}/live-status", response_class=HTMLResponse)
async def get_nrl_live_status(
    project_id: str,
    location_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Fetch cached status from SLMM for the unit assigned to this NRL and
    return a compact HTML status card. Used in the NRL overview tab for
    connected NRLs. Gracefully shows an offline message if SLMM is unreachable.
    Sound Monitoring projects only.
    """
    import os
    import httpx

    _require_sound_project(db.query(Project).filter_by(id=project_id).first())

    # Find the assigned unit (active = assigned_until IS NULL)
    assignment = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.assigned_until == None,
        )
    ).first()

    if not assignment:
        return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
            "request": request,
            "status": None,
            "error": "No unit assigned",
        })

    unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
    if not unit:
        return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
            "request": request,
            "status": None,
            "error": "Assigned unit not found",
        })

    slmm_base = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
    status_data = None
    error_msg = None

    try:
        async with httpx.AsyncClient(timeout=5.0) as client:
            resp = await client.get(f"{slmm_base}/api/nl43/{unit.id}/status")
            if resp.status_code == 200:
                status_data = resp.json()
            else:
                error_msg = f"SLMM returned {resp.status_code}"
    except Exception as e:
        error_msg = "SLMM unreachable"

    return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
        "request": request,
        "unit": unit,
        "status": status_data,
        "error": error_msg,
    })
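Step 2 of the upload endpoint relies on a project-specific `local_to_utc` helper to turn the RNH file's naive wall-clock times into UTC for storage. The underlying conversion can be sketched with the standard library alone, assuming a fixed site timezone (the helper shown here is an assumption, not the app's actual implementation):

```python
from datetime import datetime
from zoneinfo import ZoneInfo


def local_to_utc(naive_local: datetime, tz_name: str = "America/New_York") -> datetime:
    """Attach the site's timezone to a naive local datetime, then convert to naive UTC."""
    aware = naive_local.replace(tzinfo=ZoneInfo(tz_name))
    return aware.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)


# RNH files carry local wall-clock time with no offset marker.
started_local = datetime(2026, 2, 17, 19, 0, 19)
started_utc = local_to_utc(started_local)
print(started_utc.isoformat())  # 2026-02-18T00:00:19 (EST is UTC-5 in February)
```

Storing naive UTC while classifying by local time matches the comment in the router: the period label reflects the clock at the site, while the database stays timezone-consistent.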
backend/routers/recurring_schedules.py (new file, 522 lines)
@@ -0,0 +1,522 @@
"""
Recurring Schedules Router

API endpoints for managing recurring monitoring schedules.
"""

from fastapi import APIRouter, Request, Depends, HTTPException, Query
from fastapi.responses import HTMLResponse, JSONResponse
from sqlalchemy.orm import Session
from typing import Optional
from datetime import datetime
import json

from backend.database import get_db
from backend.models import RecurringSchedule, MonitoringLocation, Project, RosterUnit
from backend.services.recurring_schedule_service import get_recurring_schedule_service
from backend.templates_config import templates

router = APIRouter(prefix="/api/projects/{project_id}/recurring-schedules", tags=["recurring-schedules"])


# ============================================================================
# List and Get
# ============================================================================


@router.get("/")
async def list_recurring_schedules(
    project_id: str,
    db: Session = Depends(get_db),
    enabled_only: bool = Query(False),
):
    """
    List all recurring schedules for a project.
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    query = db.query(RecurringSchedule).filter_by(project_id=project_id)
    if enabled_only:
        query = query.filter_by(enabled=True)

    schedules = query.order_by(RecurringSchedule.created_at.desc()).all()

    return {
        "schedules": [
            {
                "id": s.id,
                "name": s.name,
                "schedule_type": s.schedule_type,
                "device_type": s.device_type,
                "location_id": s.location_id,
                "unit_id": s.unit_id,
                "enabled": s.enabled,
                "weekly_pattern": json.loads(s.weekly_pattern) if s.weekly_pattern else None,
                "interval_type": s.interval_type,
                "cycle_time": s.cycle_time,
                "include_download": s.include_download,
                "timezone": s.timezone,
                "next_occurrence": s.next_occurrence.isoformat() if s.next_occurrence else None,
                "last_generated_at": s.last_generated_at.isoformat() if s.last_generated_at else None,
                "created_at": s.created_at.isoformat() if s.created_at else None,
            }
            for s in schedules
        ],
        "count": len(schedules),
    }
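The list endpoint deserializes `weekly_pattern` from a JSON text column on the way out. A round-trip sketch of that storage convention (the column layout is inferred from the code, not confirmed elsewhere):

```python
import json

# weekly_pattern is stored as a JSON string in a TEXT column and
# decoded in the response: json.loads(...) if set, else None.
pattern = {
    "monday": {"enabled": True, "start": "19:00", "end": "07:00"},
    "tuesday": {"enabled": False},
}
stored = json.dumps(pattern)                      # what goes into the column
decoded = json.loads(stored) if stored else None  # what the API returns
print(decoded["monday"]["start"])  # 19:00
```

Python's `True`/`False` serialize to JSON `true`/`false`, so the decoded dict compares equal to the original and the API body matches the documented request shape.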


@router.get("/{schedule_id}")
async def get_recurring_schedule(
    project_id: str,
    schedule_id: str,
    db: Session = Depends(get_db),
):
    """
    Get a specific recurring schedule.
    """
    schedule = db.query(RecurringSchedule).filter_by(
        id=schedule_id,
        project_id=project_id,
    ).first()

    if not schedule:
        raise HTTPException(status_code=404, detail="Schedule not found")

    # Get related location and unit info
    location = db.query(MonitoringLocation).filter_by(id=schedule.location_id).first()
    unit = None
    if schedule.unit_id:
        unit = db.query(RosterUnit).filter_by(id=schedule.unit_id).first()

    return {
        "id": schedule.id,
        "name": schedule.name,
        "schedule_type": schedule.schedule_type,
        "device_type": schedule.device_type,
        "location_id": schedule.location_id,
        "location_name": location.name if location else None,
        "unit_id": schedule.unit_id,
        "unit_name": unit.id if unit else None,
        "enabled": schedule.enabled,
        "weekly_pattern": json.loads(schedule.weekly_pattern) if schedule.weekly_pattern else None,
        "interval_type": schedule.interval_type,
        "cycle_time": schedule.cycle_time,
        "include_download": schedule.include_download,
        "timezone": schedule.timezone,
        "next_occurrence": schedule.next_occurrence.isoformat() if schedule.next_occurrence else None,
        "last_generated_at": schedule.last_generated_at.isoformat() if schedule.last_generated_at else None,
        "created_at": schedule.created_at.isoformat() if schedule.created_at else None,
        "updated_at": schedule.updated_at.isoformat() if schedule.updated_at else None,
    }


# ============================================================================
# Create
# ============================================================================


@router.post("/")
async def create_recurring_schedule(
    project_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Create recurring schedules for one or more locations.

    Body for weekly_calendar (supports multiple locations):
    {
        "name": "Weeknight Monitoring",
        "schedule_type": "weekly_calendar",
        "location_ids": ["uuid1", "uuid2"],  // Array of location IDs
        "weekly_pattern": {
            "monday": {"enabled": true, "start": "19:00", "end": "07:00"},
            "tuesday": {"enabled": false},
            ...
        },
        "include_download": true,
        "auto_increment_index": true,
        "timezone": "America/New_York"
    }

    Body for simple_interval (supports multiple locations):
    {
        "name": "24/7 Continuous",
        "schedule_type": "simple_interval",
        "location_ids": ["uuid1", "uuid2"],  // Array of location IDs
        "interval_type": "daily",
        "cycle_time": "00:00",
        "include_download": true,
        "auto_increment_index": true,
        "timezone": "America/New_York"
    }

    Legacy single location support (backwards compatible):
    {
        "name": "...",
        "location_id": "uuid",  // Single location ID
        ...
    }
    """
    project = db.query(Project).filter_by(id=project_id).first()
    if not project:
        raise HTTPException(status_code=404, detail="Project not found")

    data = await request.json()

    # Support both location_ids (array) and location_id (single) for backwards compatibility
    location_ids = data.get("location_ids", [])
    if not location_ids and data.get("location_id"):
        location_ids = [data.get("location_id")]

    if not location_ids:
        raise HTTPException(status_code=400, detail="At least one location is required")

    # Validate all locations exist
    locations = db.query(MonitoringLocation).filter(
        MonitoringLocation.id.in_(location_ids),
        MonitoringLocation.project_id == project_id,
    ).all()

    if len(locations) != len(location_ids):
        raise HTTPException(status_code=404, detail="One or more locations not found")

    service = get_recurring_schedule_service(db)
    created_schedules = []
    base_name = data.get("name", "Unnamed Schedule")

    # Parse one-off datetime fields if applicable
    one_off_start = None
    one_off_end = None
    if data.get("schedule_type") == "one_off":
        from zoneinfo import ZoneInfo

        tz = ZoneInfo(data.get("timezone", "America/New_York"))

        start_dt_str = data.get("start_datetime")
        end_dt_str = data.get("end_datetime")

        if not start_dt_str or not end_dt_str:
            raise HTTPException(status_code=400, detail="One-off schedules require start and end date/time")

        try:
            start_local = datetime.fromisoformat(start_dt_str).replace(tzinfo=tz)
            end_local = datetime.fromisoformat(end_dt_str).replace(tzinfo=tz)
        except ValueError:
            raise HTTPException(status_code=400, detail="Invalid datetime format")

        duration = end_local - start_local
        if duration.total_seconds() < 900:
            raise HTTPException(status_code=400, detail="Duration must be at least 15 minutes")
        if duration.total_seconds() > 86400:
            raise HTTPException(status_code=400, detail="Duration cannot exceed 24 hours")

        now_local = datetime.now(tz)
        if start_local <= now_local:
            raise HTTPException(status_code=400, detail="Start time must be in the future")

        # Convert to UTC for storage
        one_off_start = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
        one_off_end = end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

    # Create a schedule for each location
    for location in locations:
        # Determine device type from location
        device_type = "slm" if location.location_type == "sound" else "seismograph"

        # Append location name if multiple locations
        schedule_name = f"{base_name} - {location.name}" if len(locations) > 1 else base_name

        schedule = service.create_schedule(
            project_id=project_id,
            location_id=location.id,
            name=schedule_name,
            schedule_type=data.get("schedule_type", "weekly_calendar"),
            device_type=device_type,
            unit_id=data.get("unit_id"),
            weekly_pattern=data.get("weekly_pattern"),
            interval_type=data.get("interval_type"),
            cycle_time=data.get("cycle_time"),
            include_download=data.get("include_download", True),
            auto_increment_index=data.get("auto_increment_index", True),
            timezone=data.get("timezone", "America/New_York"),
            start_datetime=one_off_start,
            end_datetime=one_off_end,
        )

        # Generate actions immediately so they appear right away
        generated_actions = service.generate_actions_for_schedule(schedule, horizon_days=7)

        created_schedules.append({
            "schedule_id": schedule.id,
            "location_id": location.id,
            "location_name": location.name,
            "actions_generated": len(generated_actions),
        })

    total_actions = sum(s.get("actions_generated", 0) for s in created_schedules)

    return JSONResponse({
        "success": True,
        "schedules": created_schedules,
        "count": len(created_schedules),
        "actions_generated": total_actions,
        "message": f"Created {len(created_schedules)} recurring schedule(s) with {total_actions} upcoming actions",
    })
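The one-off branch of the create endpoint enforces three rules: duration of at least 15 minutes, at most 24 hours, and a start time in the future. A standalone sketch of the same checks, with the error type simplified to `ValueError` in place of `HTTPException`:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo


def validate_one_off(start_local: datetime, end_local: datetime, now: datetime) -> None:
    """Raise ValueError if a one-off window violates the duration or future-start rules."""
    duration = end_local - start_local
    if duration.total_seconds() < 900:       # 15 minutes
        raise ValueError("Duration must be at least 15 minutes")
    if duration.total_seconds() > 86400:     # 24 hours
        raise ValueError("Duration cannot exceed 24 hours")
    if start_local <= now:
        raise ValueError("Start time must be in the future")


tz = ZoneInfo("America/New_York")
now = datetime(2026, 2, 17, 12, 0, tzinfo=tz)
start = datetime(2026, 2, 17, 19, 0, tzinfo=tz)
validate_one_off(start, start + timedelta(hours=12), now)  # passes: 12h window, future
try:
    validate_one_off(start, start + timedelta(minutes=5), now)
except ValueError as e:
    print(e)  # Duration must be at least 15 minutes
```

Keeping all comparisons on timezone-aware datetimes in the same zone avoids the naive/aware mixing that `datetime` refuses to compare.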
# ============================================================================
# Update
# ============================================================================


@router.put("/{schedule_id}")
async def update_recurring_schedule(
    project_id: str,
    schedule_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Update a recurring schedule.
    """
    schedule = db.query(RecurringSchedule).filter_by(
        id=schedule_id,
        project_id=project_id,
    ).first()

    if not schedule:
        raise HTTPException(status_code=404, detail="Schedule not found")

    data = await request.json()
    service = get_recurring_schedule_service(db)

    # Build update kwargs
    update_kwargs = {}
    for field in ["name", "weekly_pattern", "interval_type", "cycle_time",
                  "include_download", "auto_increment_index", "timezone", "unit_id"]:
        if field in data:
            update_kwargs[field] = data[field]

    updated = service.update_schedule(schedule_id, **update_kwargs)

    return {
        "success": True,
        "schedule_id": updated.id,
        "message": "Schedule updated successfully",
    }


# ============================================================================
# Delete
# ============================================================================


@router.delete("/{schedule_id}")
async def delete_recurring_schedule(
    project_id: str,
    schedule_id: str,
    db: Session = Depends(get_db),
):
    """
    Delete a recurring schedule.
    """
    service = get_recurring_schedule_service(db)
    deleted = service.delete_schedule(schedule_id)

    if not deleted:
        raise HTTPException(status_code=404, detail="Schedule not found")

    return {
        "success": True,
        "message": "Schedule deleted successfully",
    }


# ============================================================================
# Enable/Disable
# ============================================================================


@router.post("/{schedule_id}/enable")
async def enable_schedule(
    project_id: str,
    schedule_id: str,
    db: Session = Depends(get_db),
):
    """
    Enable a disabled schedule.
    """
    service = get_recurring_schedule_service(db)
    schedule = service.enable_schedule(schedule_id)

    if not schedule:
        raise HTTPException(status_code=404, detail="Schedule not found")

    return {
        "success": True,
        "schedule_id": schedule.id,
        "enabled": schedule.enabled,
        "message": "Schedule enabled",
    }


@router.post("/{schedule_id}/disable")
async def disable_schedule(
    project_id: str,
    schedule_id: str,
    db: Session = Depends(get_db),
):
    """
    Disable a schedule and cancel all its pending actions.
    """
    service = get_recurring_schedule_service(db)

    # Count pending actions before disabling (for response message)
    from sqlalchemy import and_
    from backend.models import ScheduledAction
    pending_count = db.query(ScheduledAction).filter(
        and_(
            ScheduledAction.execution_status == "pending",
            ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%'),
        )
    ).count()

    schedule = service.disable_schedule(schedule_id)
|
||||||
|
|
||||||
|
if not schedule:
|
||||||
|
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||||
|
|
||||||
|
message = "Schedule disabled"
|
||||||
|
if pending_count > 0:
|
||||||
|
message += f" and {pending_count} pending action(s) cancelled"
|
||||||
|
|
||||||
|
return {
|
||||||
|
"success": True,
|
||||||
|
"schedule_id": schedule.id,
|
||||||
|
"enabled": schedule.enabled,
|
||||||
|
"cancelled_actions": pending_count,
|
||||||
|
"message": message,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# Preview Generated Actions
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
@router.post("/{schedule_id}/generate-preview")
|
||||||
|
async def preview_generated_actions(
|
||||||
|
project_id: str,
|
||||||
|
schedule_id: str,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
days: int = Query(7, ge=1, le=30),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Preview what actions would be generated without saving them.
|
||||||
|
"""
|
||||||
|
schedule = db.query(RecurringSchedule).filter_by(
|
||||||
|
id=schedule_id,
|
||||||
|
project_id=project_id,
|
||||||
|
).first()
|
||||||
|
|
||||||
|
if not schedule:
|
||||||
|
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||||
|
|
||||||
|
service = get_recurring_schedule_service(db)
|
||||||
|
actions = service.generate_actions_for_schedule(
|
||||||
|
schedule,
|
||||||
|
horizon_days=days,
|
||||||
|
preview_only=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"schedule_id": schedule_id,
|
||||||
|
"schedule_name": schedule.name,
|
||||||
|
"preview_days": days,
|
||||||
|
"actions": [
|
||||||
|
{
|
||||||
|
"action_type": a.action_type,
|
||||||
|
"scheduled_time": a.scheduled_time.isoformat(),
|
||||||
|
"notes": a.notes,
|
||||||
|
}
|
||||||
|
for a in actions
|
||||||
|
],
|
||||||
|
"action_count": len(actions),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# Manual Generation Trigger
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
@router.post("/{schedule_id}/generate")
|
||||||
|
async def generate_actions_now(
|
||||||
|
project_id: str,
|
||||||
|
schedule_id: str,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
days: int = Query(7, ge=1, le=30),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Manually trigger action generation for a schedule.
|
||||||
|
"""
|
||||||
|
schedule = db.query(RecurringSchedule).filter_by(
|
||||||
|
id=schedule_id,
|
||||||
|
project_id=project_id,
|
||||||
|
).first()
|
||||||
|
|
||||||
|
if not schedule:
|
||||||
|
raise HTTPException(status_code=404, detail="Schedule not found")
|
||||||
|
|
||||||
|
if not schedule.enabled:
|
||||||
|
raise HTTPException(status_code=400, detail="Schedule is disabled")
|
||||||
|
|
||||||
|
service = get_recurring_schedule_service(db)
|
||||||
|
actions = service.generate_actions_for_schedule(
|
||||||
|
schedule,
|
||||||
|
horizon_days=days,
|
||||||
|
preview_only=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"success": True,
|
||||||
|
"schedule_id": schedule_id,
|
||||||
|
"generated_count": len(actions),
|
||||||
|
"message": f"Generated {len(actions)} scheduled actions",
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# HTML Partials
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
@router.get("/partials/list", response_class=HTMLResponse)
|
||||||
|
async def get_schedule_list_partial(
|
||||||
|
project_id: str,
|
||||||
|
request: Request,
|
||||||
|
db: Session = Depends(get_db),
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Return HTML partial for schedule list.
|
||||||
|
"""
|
||||||
|
project = db.query(Project).filter_by(id=project_id).first()
|
||||||
|
project_status = project.status if project else "active"
|
||||||
|
|
||||||
|
schedules = db.query(RecurringSchedule).filter_by(
|
||||||
|
project_id=project_id
|
||||||
|
).order_by(RecurringSchedule.created_at.desc()).all()
|
||||||
|
|
||||||
|
# Enrich with location info
|
||||||
|
schedule_data = []
|
||||||
|
for s in schedules:
|
||||||
|
location = db.query(MonitoringLocation).filter_by(id=s.location_id).first()
|
||||||
|
schedule_data.append({
|
||||||
|
"schedule": s,
|
||||||
|
"location": location,
|
||||||
|
"pattern": json.loads(s.weekly_pattern) if s.weekly_pattern else None,
|
||||||
|
})
|
||||||
|
|
||||||
|
return templates.TemplateResponse("partials/projects/recurring_schedule_list.html", {
|
||||||
|
"request": request,
|
||||||
|
"project_id": project_id,
|
||||||
|
"schedules": schedule_data,
|
||||||
|
"project_status": project_status,
|
||||||
|
})
|
||||||
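The disable endpoint above counts pending actions by LIKE-matching the schedule_id inside the JSON stored in `ScheduledAction.notes`. A minimal sketch of why the pattern string is shaped that way (a plain-Python stand-in for the SQL LIKE; the sample notes payload is illustrative):

```python
import json

# Notes are written as a JSON string; json.dumps uses ": " as the key/value
# separator, so the literal '"schedule_id": "<id>"' appears verbatim in it.
notes = json.dumps({"schedule_id": "abc-123", "action": "start_recording"})

# Plain-Python mirror of:
#     ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%')
pattern = '"schedule_id": "abc-123"'
found = pattern in notes
```

Note the match depends on the writer producing json.dumps-style spacing; SQLite's `json_extract` would be a more robust filter if the notes format ever changes.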
backend/routers/report_templates.py (new file, 187 lines)
@@ -0,0 +1,187 @@
"""
Report Templates Router

CRUD operations for report template management.
Templates store time filter presets and report configuration for reuse.
"""

from fastapi import APIRouter, Depends, HTTPException
from fastapi.responses import JSONResponse
from sqlalchemy.orm import Session
from datetime import datetime
from typing import Optional
import uuid

from backend.database import get_db
from backend.models import ReportTemplate

router = APIRouter(prefix="/api/report-templates", tags=["report-templates"])


@router.get("")
async def list_templates(
    project_id: Optional[str] = None,
    db: Session = Depends(get_db),
):
    """
    List all report templates.
    Optionally filter by project_id (includes global templates with project_id=None).
    """
    query = db.query(ReportTemplate)

    if project_id:
        # Include global templates (project_id=None) AND project-specific templates
        query = query.filter(
            (ReportTemplate.project_id == None) | (ReportTemplate.project_id == project_id)
        )

    templates = query.order_by(ReportTemplate.name).all()

    return [
        {
            "id": t.id,
            "name": t.name,
            "project_id": t.project_id,
            "report_title": t.report_title,
            "start_time": t.start_time,
            "end_time": t.end_time,
            "start_date": t.start_date,
            "end_date": t.end_date,
            "created_at": t.created_at.isoformat() if t.created_at else None,
            "updated_at": t.updated_at.isoformat() if t.updated_at else None,
        }
        for t in templates
    ]


@router.post("")
async def create_template(
    data: dict,
    db: Session = Depends(get_db),
):
    """
    Create a new report template.

    Request body:
    - name: Template name (required)
    - project_id: Optional project ID for project-specific template
    - report_title: Default report title
    - start_time: Start time filter (HH:MM format)
    - end_time: End time filter (HH:MM format)
    - start_date: Start date filter (YYYY-MM-DD format)
    - end_date: End date filter (YYYY-MM-DD format)
    """
    name = data.get("name")
    if not name:
        raise HTTPException(status_code=400, detail="Template name is required")

    template = ReportTemplate(
        id=str(uuid.uuid4()),
        name=name,
        project_id=data.get("project_id"),
        report_title=data.get("report_title", "Background Noise Study"),
        start_time=data.get("start_time"),
        end_time=data.get("end_time"),
        start_date=data.get("start_date"),
        end_date=data.get("end_date"),
    )

    db.add(template)
    db.commit()
    db.refresh(template)

    return {
        "id": template.id,
        "name": template.name,
        "project_id": template.project_id,
        "report_title": template.report_title,
        "start_time": template.start_time,
        "end_time": template.end_time,
        "start_date": template.start_date,
        "end_date": template.end_date,
        "created_at": template.created_at.isoformat() if template.created_at else None,
    }


@router.get("/{template_id}")
async def get_template(
    template_id: str,
    db: Session = Depends(get_db),
):
    """Get a specific report template by ID."""
    template = db.query(ReportTemplate).filter_by(id=template_id).first()
    if not template:
        raise HTTPException(status_code=404, detail="Template not found")

    return {
        "id": template.id,
        "name": template.name,
        "project_id": template.project_id,
        "report_title": template.report_title,
        "start_time": template.start_time,
        "end_time": template.end_time,
        "start_date": template.start_date,
        "end_date": template.end_date,
        "created_at": template.created_at.isoformat() if template.created_at else None,
        "updated_at": template.updated_at.isoformat() if template.updated_at else None,
    }


@router.put("/{template_id}")
async def update_template(
    template_id: str,
    data: dict,
    db: Session = Depends(get_db),
):
    """Update an existing report template."""
    template = db.query(ReportTemplate).filter_by(id=template_id).first()
    if not template:
        raise HTTPException(status_code=404, detail="Template not found")

    # Update fields if provided
    if "name" in data:
        template.name = data["name"]
    if "project_id" in data:
        template.project_id = data["project_id"]
    if "report_title" in data:
        template.report_title = data["report_title"]
    if "start_time" in data:
        template.start_time = data["start_time"]
    if "end_time" in data:
        template.end_time = data["end_time"]
    if "start_date" in data:
        template.start_date = data["start_date"]
    if "end_date" in data:
        template.end_date = data["end_date"]

    template.updated_at = datetime.utcnow()
    db.commit()
    db.refresh(template)

    return {
        "id": template.id,
        "name": template.name,
        "project_id": template.project_id,
        "report_title": template.report_title,
        "start_time": template.start_time,
        "end_time": template.end_time,
        "start_date": template.start_date,
        "end_date": template.end_date,
        "updated_at": template.updated_at.isoformat() if template.updated_at else None,
    }


@router.delete("/{template_id}")
async def delete_template(
    template_id: str,
    db: Session = Depends(get_db),
):
    """Delete a report template."""
    template = db.query(ReportTemplate).filter_by(id=template_id).first()
    if not template:
        raise HTTPException(status_code=404, detail="Template not found")

    db.delete(template)
    db.commit()

    return JSONResponse({"status": "success", "message": "Template deleted"})
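The `(ReportTemplate.project_id == None) | (ReportTemplate.project_id == project_id)` filter in `list_templates` compiles to `project_id IS NULL OR project_id = ?`, which is how "global OR project-specific" is expressed in one query. A runnable sketch of that SQL against a throwaway stdlib-sqlite3 table (the schema and row values are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE report_templates (id TEXT, name TEXT, project_id TEXT)")
conn.executemany(
    "INSERT INTO report_templates VALUES (?, ?, ?)",
    [("1", "Global", None), ("2", "Mine", "p1"), ("3", "Other", "p2")],
)

# Global templates (project_id IS NULL) OR templates for this project
rows = conn.execute(
    "SELECT name FROM report_templates "
    "WHERE project_id IS NULL OR project_id = ? ORDER BY name",
    ("p1",),
).fetchall()
names = [r[0] for r in rows]  # templates visible to project p1
```

A plain `project_id = ?` comparison would silently drop the global rows, since `NULL = ?` is never true in SQL; the explicit `IS NULL` arm is what keeps them.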
@@ -2,20 +2,32 @@ from fastapi import APIRouter, Depends
 from sqlalchemy.orm import Session
 from datetime import datetime, timedelta
 from typing import Dict, Any
+import asyncio
+import logging
 import random

 from backend.database import get_db
 from backend.services.snapshot import emit_status_snapshot
+from backend.services.slm_status_sync import sync_slm_status_to_emitters

 router = APIRouter(prefix="/api", tags=["roster"])
+logger = logging.getLogger(__name__)


 @router.get("/status-snapshot")
-def get_status_snapshot(db: Session = Depends(get_db)):
+async def get_status_snapshot(db: Session = Depends(get_db)):
     """
     Calls emit_status_snapshot() to get current fleet status.
-    This will be replaced with real Series3 emitter logic later.
+    Syncs SLM status from SLMM before generating snapshot.
     """
+    # Sync SLM status from SLMM (with timeout to prevent blocking)
+    try:
+        await asyncio.wait_for(sync_slm_status_to_emitters(), timeout=2.0)
+    except asyncio.TimeoutError:
+        logger.warning("SLM status sync timed out, using cached data")
+    except Exception as e:
+        logger.warning(f"SLM status sync failed: {e}")
+
     return emit_status_snapshot()
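The hunk above bounds the sync call with `asyncio.wait_for` so a slow SLMM never blocks the snapshot endpoint. A minimal sketch of that timeout-guard pattern (the coroutine names and delays here are stand-ins, not the real service):

```python
import asyncio

async def slow_sync():
    # Stand-in for sync_slm_status_to_emitters(); deliberately slower than the timeout
    await asyncio.sleep(5)
    return "synced"

async def snapshot():
    try:
        await asyncio.wait_for(slow_sync(), timeout=0.05)
        return "fresh"
    except asyncio.TimeoutError:
        # Fall back to cached data instead of hanging the endpoint
        return "cached"

result = asyncio.run(snapshot())
```

`wait_for` cancels the inner coroutine on timeout, so the endpoint returns promptly with whatever data it already has.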
@@ -92,15 +92,15 @@ async def rename_unit(
     except Exception as e:
         logger.warning(f"Could not update unit_assignments: {e}")

-    # Update recording_sessions table (if exists)
+    # Update monitoring_sessions table (if exists)
     try:
-        from backend.models import RecordingSession
-        db.query(RecordingSession).filter(RecordingSession.unit_id == old_id).update(
+        from backend.models import MonitoringSession
+        db.query(MonitoringSession).filter(MonitoringSession.unit_id == old_id).update(
             {"unit_id": new_id},
             synchronize_session=False
         )
     except Exception as e:
-        logger.warning(f"Could not update recording_sessions: {e}")
+        logger.warning(f"Could not update monitoring_sessions: {e}")

     # Commit all changes
     db.commit()
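A runnable sketch of the bulk-rename pattern in that hunk: `Query.update()` with `synchronize_session=False` issues a single UPDATE statement and skips reconciling in-memory objects, which is safe here because the change is committed immediately after. The `SessionRow` model below is a throwaway stand-in, not the real `MonitoringSession`:

```python
from sqlalchemy import Column, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class SessionRow(Base):
    __tablename__ = "monitoring_sessions"
    id = Column(String, primary_key=True)
    unit_id = Column(String)

engine = create_engine("sqlite://")  # in-memory database
Base.metadata.create_all(engine)

with Session(engine) as db:
    db.add_all([SessionRow(id="a", unit_id="old"), SessionRow(id="b", unit_id="old")])
    db.commit()

    # Single UPDATE ... WHERE unit_id = 'old'; returns the matched row count
    updated = db.query(SessionRow).filter(SessionRow.unit_id == "old").update(
        {"unit_id": "new"},
        synchronize_session=False,
    )
    db.commit()

    values = [r.unit_id for r in db.query(SessionRow).order_by(SessionRow.id).all()]
```

Committing right away also sidesteps the usual caveat with `synchronize_session=False`, namely that already-loaded instances would otherwise keep stale attribute values until the next expire.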
@@ -5,7 +5,6 @@ Handles scheduled actions for automated recording control.
 """

 from fastapi import APIRouter, Request, Depends, HTTPException, Query
-from fastapi.templating import Jinja2Templates
 from fastapi.responses import HTMLResponse, JSONResponse
 from sqlalchemy.orm import Session
 from sqlalchemy import and_, or_
@@ -23,9 +22,9 @@ from backend.models import (
     RosterUnit,
 )
 from backend.services.scheduler import get_scheduler
+from backend.templates_config import templates

 router = APIRouter(prefix="/api/projects/{project_id}/scheduler", tags=["scheduler"])
-templates = Jinja2Templates(directory="templates")


 # ============================================================================
@@ -3,15 +3,16 @@ Seismograph Dashboard API Router
 Provides endpoints for the seismograph-specific dashboard
 """

-from fastapi import APIRouter, Request, Depends, Query
+from datetime import date, datetime, timedelta
+
+from fastapi import APIRouter, Request, Depends, Query, Form, HTTPException
 from fastapi.responses import HTMLResponse
-from fastapi.templating import Jinja2Templates
 from sqlalchemy.orm import Session
 from backend.database import get_db
-from backend.models import RosterUnit
+from backend.models import RosterUnit, UnitHistory, UserPreferences
+from backend.templates_config import templates

 router = APIRouter(prefix="/api/seismo-dashboard", tags=["seismo-dashboard"])
-templates = Jinja2Templates(directory="templates")


 @router.get("/stats", response_class=HTMLResponse)
@@ -27,7 +28,8 @@ async def get_seismo_stats(request: Request, db: Session = Depends(get_db)):

     total = len(seismos)
     deployed = sum(1 for s in seismos if s.deployed)
-    benched = sum(1 for s in seismos if not s.deployed)
+    benched = sum(1 for s in seismos if not s.deployed and not s.out_for_calibration)
+    out_for_calibration = sum(1 for s in seismos if s.out_for_calibration)

     # Count modems assigned to deployed seismographs
     with_modem = sum(1 for s in seismos if s.deployed and s.deployed_with_modem_id)
@@ -40,6 +42,7 @@ async def get_seismo_stats(request: Request, db: Session = Depends(get_db)):
         "total": total,
         "deployed": deployed,
         "benched": benched,
+        "out_for_calibration": out_for_calibration,
         "with_modem": with_modem,
         "without_modem": without_modem
     }
@@ -50,10 +53,14 @@ async def get_seismo_stats(request: Request, db: Session = Depends(get_db)):
 async def get_seismo_units(
     request: Request,
     db: Session = Depends(get_db),
-    search: str = Query(None)
+    search: str = Query(None),
+    sort: str = Query("id"),
+    order: str = Query("asc"),
+    status: str = Query(None),
+    modem: str = Query(None)
 ):
     """
-    Returns HTML partial with filterable seismograph unit list
+    Returns HTML partial with filterable and sortable seismograph unit list
     """
     query = db.query(RosterUnit).filter_by(
         device_type="seismograph",
@@ -62,20 +69,160 @@ async def get_seismo_units(

     # Apply search filter
     if search:
-        search_lower = search.lower()
         query = query.filter(
             (RosterUnit.id.ilike(f"%{search}%")) |
             (RosterUnit.note.ilike(f"%{search}%")) |
             (RosterUnit.address.ilike(f"%{search}%"))
         )

-    seismos = query.order_by(RosterUnit.id).all()
+    # Apply status filter
+    if status == "deployed":
+        query = query.filter(RosterUnit.deployed == True)
+    elif status == "benched":
+        query = query.filter(RosterUnit.deployed == False, RosterUnit.out_for_calibration == False)
+    elif status == "out_for_calibration":
+        query = query.filter(RosterUnit.out_for_calibration == True)
+
+    # Apply modem filter
+    if modem == "with":
+        query = query.filter(RosterUnit.deployed_with_modem_id.isnot(None))
+    elif modem == "without":
+        query = query.filter(RosterUnit.deployed_with_modem_id.is_(None))
+
+    # Apply sorting
+    sort_column_map = {
+        "id": RosterUnit.id,
+        "status": RosterUnit.deployed,
+        "modem": RosterUnit.deployed_with_modem_id,
+        "location": RosterUnit.address,
+        "last_calibrated": RosterUnit.last_calibrated,
+        "notes": RosterUnit.note
+    }
+    sort_column = sort_column_map.get(sort, RosterUnit.id)
+
+    if order == "desc":
+        query = query.order_by(sort_column.desc())
+    else:
+        query = query.order_by(sort_column.asc())
+
+    seismos = query.all()

     return templates.TemplateResponse(
         "partials/seismo_unit_list.html",
         {
             "request": request,
             "units": seismos,
-            "search": search or ""
+            "search": search or "",
+            "sort": sort,
+            "order": order,
+            "status": status or "",
+            "modem": modem or "",
+            "today": date.today()
         }
     )
+
+
+def _get_calibration_interval(db: Session) -> int:
+    prefs = db.query(UserPreferences).first()
+    if prefs and prefs.calibration_interval_days:
+        return prefs.calibration_interval_days
+    return 365
+
+
+def _row_context(request: Request, unit: RosterUnit) -> dict:
+    return {"request": request, "unit": unit, "today": date.today()}
+
+
+@router.get("/unit/{unit_id}/view-row", response_class=HTMLResponse)
+async def get_seismo_view_row(unit_id: str, request: Request, db: Session = Depends(get_db)):
+    unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
+    if not unit:
+        raise HTTPException(status_code=404, detail="Unit not found")
+    return templates.TemplateResponse("partials/seismo_row_view.html", _row_context(request, unit))
+
+
+@router.get("/unit/{unit_id}/edit-row", response_class=HTMLResponse)
+async def get_seismo_edit_row(unit_id: str, request: Request, db: Session = Depends(get_db)):
+    unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
+    if not unit:
+        raise HTTPException(status_code=404, detail="Unit not found")
+    return templates.TemplateResponse("partials/seismo_row_edit.html", _row_context(request, unit))
+
+
+@router.post("/unit/{unit_id}/quick-update", response_class=HTMLResponse)
+async def quick_update_seismo_unit(
+    unit_id: str,
+    request: Request,
+    db: Session = Depends(get_db),
+    status: str = Form(...),
+    last_calibrated: str = Form(""),
+    note: str = Form(""),
+):
+    unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
+    if not unit:
+        raise HTTPException(status_code=404, detail="Unit not found")
+
+    # --- Status ---
+    old_deployed = unit.deployed
+    old_out_for_cal = unit.out_for_calibration
+    if status == "deployed":
+        unit.deployed = True
+        unit.out_for_calibration = False
+    elif status == "out_for_calibration":
+        unit.deployed = False
+        unit.out_for_calibration = True
+    else:
+        unit.deployed = False
+        unit.out_for_calibration = False
+
+    if unit.deployed != old_deployed or unit.out_for_calibration != old_out_for_cal:
+        old_status = "deployed" if old_deployed else ("out_for_calibration" if old_out_for_cal else "benched")
+        db.add(UnitHistory(
+            unit_id=unit_id,
+            change_type="deployed_change",
+            field_name="status",
+            old_value=old_status,
+            new_value=status,
+            source="manual",
+        ))
+
+    # --- Last calibrated ---
+    old_cal = unit.last_calibrated
+    if last_calibrated:
+        try:
+            new_cal = datetime.strptime(last_calibrated, "%Y-%m-%d").date()
+        except ValueError:
+            raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")
+        unit.last_calibrated = new_cal
+        unit.next_calibration_due = new_cal + timedelta(days=_get_calibration_interval(db))
+    else:
+        unit.last_calibrated = None
+        unit.next_calibration_due = None
+
+    if unit.last_calibrated != old_cal:
+        db.add(UnitHistory(
+            unit_id=unit_id,
+            change_type="calibration_status_change",
+            field_name="last_calibrated",
+            old_value=old_cal.strftime("%Y-%m-%d") if old_cal else None,
+            new_value=last_calibrated or None,
+            source="manual",
+        ))

+    # --- Note ---
+    old_note = unit.note
+    unit.note = note or None
+    if unit.note != old_note:
+        db.add(UnitHistory(
+            unit_id=unit_id,
+            change_type="note_change",
+            field_name="note",
+            old_value=old_note,
+            new_value=unit.note,
+            source="manual",
+        ))
+
+    db.commit()
+    db.refresh(unit)
+
+    return templates.TemplateResponse("partials/seismo_row_view.html", _row_context(request, unit))
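The due-date logic in `quick_update_seismo_unit` above parses the submitted `YYYY-MM-DD` string and adds the configured interval (365-day default from `_get_calibration_interval`). A standalone sketch of that computation:

```python
from datetime import datetime, timedelta

def next_due(last_calibrated: str, interval_days: int = 365):
    # Parse the submitted form value; raises ValueError on a malformed date,
    # which the endpoint translates into a 400 response
    new_cal = datetime.strptime(last_calibrated, "%Y-%m-%d").date()
    return new_cal + timedelta(days=interval_days)

due = next_due("2024-01-15")  # 2024 is a leap year, so 365 days lands on 2025-01-14
```

Adding a fixed day count rather than "same date next year" means the due date drifts by a day across leap years; that matches the code as written.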
@@ -5,7 +5,6 @@ Provides API endpoints for the Sound Level Meters dashboard page.
 """

 from fastapi import APIRouter, Request, Depends, Query
-from fastapi.templating import Jinja2Templates
 from fastapi.responses import HTMLResponse
 from sqlalchemy.orm import Session
 from sqlalchemy import func
@@ -18,11 +17,11 @@ import os
 from backend.database import get_db
 from backend.models import RosterUnit
 from backend.routers.roster_edit import sync_slm_to_slmm_cache
+from backend.templates_config import templates

 logger = logging.getLogger(__name__)

 router = APIRouter(prefix="/api/slm-dashboard", tags=["slm-dashboard"])
-templates = Jinja2Templates(directory="templates")

 # SLMM backend URL - configurable via environment variable
 SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
@@ -168,23 +167,7 @@ async def get_live_view(request: Request, unit_id: str, db: Session = Depends(ge
     measurement_state = state_data.get("measurement_state", "Unknown")
     is_measuring = state_data.get("is_measuring", False)

-    # If measuring, sync start time from FTP to database (fixes wrong timestamps)
-    if is_measuring:
-        try:
-            sync_response = await client.post(
-                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/sync-start-time",
-                timeout=10.0
-            )
-            if sync_response.status_code == 200:
-                sync_data = sync_response.json()
-                logger.info(f"Synced start time for {unit_id}: {sync_data.get('message')}")
-            else:
-                logger.warning(f"Failed to sync start time for {unit_id}: {sync_response.status_code}")
-        except Exception as e:
-            # Don't fail the whole request if sync fails
-            logger.warning(f"Could not sync start time for {unit_id}: {e}")
-
-    # Get live status (now with corrected start time)
+    # Get live status (measurement_start_time is already stored in SLMM database)
     status_response = await client.get(
         f"{SLMM_BASE_URL}/api/nl43/{unit_id}/live"
     )
@@ -6,7 +6,6 @@ Provides endpoints for SLM dashboard cards, detail pages, and real-time data.

 from fastapi import APIRouter, Depends, HTTPException, Request
 from fastapi.responses import HTMLResponse
-from fastapi.templating import Jinja2Templates
 from sqlalchemy.orm import Session
 from datetime import datetime
 import httpx
@@ -15,11 +14,11 @@ import os

 from backend.database import get_db
 from backend.models import RosterUnit
+from backend.templates_config import templates

 logger = logging.getLogger(__name__)

 router = APIRouter(prefix="/slm", tags=["slm-ui"])
-templates = Jinja2Templates(directory="templates")

 SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://172.19.0.1:8100")
backend/routers/watcher_manager.py (new file, 133 lines)
@@ -0,0 +1,133 @@

"""
Watcher Manager — admin API for series3-watcher and thor-watcher agents.

Endpoints:
    GET  /api/admin/watchers                            — list all watcher agents
    GET  /api/admin/watchers/{agent_id}                 — get single agent detail
    POST /api/admin/watchers/{agent_id}/trigger-update  — flag agent for update
    POST /api/admin/watchers/{agent_id}/clear-update    — clear update flag
    GET  /api/admin/watchers/{agent_id}/update-check    — polled by watcher on heartbeat

Page:
    GET /admin/watchers — HTML admin page
"""

from datetime import datetime, timezone
from fastapi import APIRouter, Depends, HTTPException, Request
from fastapi.responses import HTMLResponse
from pydantic import BaseModel
from sqlalchemy.orm import Session
from typing import Optional

from backend.database import get_db
from backend.models import WatcherAgent
from backend.templates_config import templates

router = APIRouter(tags=["admin"])


# ── helpers ──────────────────────────────────────────────────────────────────

def _agent_to_dict(agent: WatcherAgent) -> dict:
    last_seen = agent.last_seen
    if last_seen:
        now_utc = datetime.utcnow()
        age_minutes = int((now_utc - last_seen).total_seconds() // 60)
        if age_minutes > 60:
            status = "missing"
        else:
            status = "ok"
    else:
        age_minutes = None
        status = "missing"

    return {
        "id": agent.id,
        "source_type": agent.source_type,
        "version": agent.version,
        "last_seen": last_seen.isoformat() if last_seen else None,
        "age_minutes": age_minutes,
        "status": status,
        "ip_address": agent.ip_address,
        "log_tail": agent.log_tail,
        "update_pending": bool(agent.update_pending),
        "update_version": agent.update_version,
    }


# ── API routes ────────────────────────────────────────────────────────────────

@router.get("/api/admin/watchers")
def list_watchers(db: Session = Depends(get_db)):
    agents = db.query(WatcherAgent).order_by(WatcherAgent.last_seen.desc()).all()
    return [_agent_to_dict(a) for a in agents]


@router.get("/api/admin/watchers/{agent_id}")
def get_watcher(agent_id: str, db: Session = Depends(get_db)):
    agent = db.query(WatcherAgent).filter(WatcherAgent.id == agent_id).first()
    if not agent:
        raise HTTPException(status_code=404, detail="Watcher agent not found")
    return _agent_to_dict(agent)


class TriggerUpdateRequest(BaseModel):
    version: Optional[str] = None  # target version label (informational)


@router.post("/api/admin/watchers/{agent_id}/trigger-update")
def trigger_update(agent_id: str, body: TriggerUpdateRequest, db: Session = Depends(get_db)):
    agent = db.query(WatcherAgent).filter(WatcherAgent.id == agent_id).first()
    if not agent:
        raise HTTPException(status_code=404, detail="Watcher agent not found")
    agent.update_pending = True
    agent.update_version = body.version
    db.commit()
    return {"ok": True, "agent_id": agent_id, "update_pending": True}


@router.post("/api/admin/watchers/{agent_id}/clear-update")
def clear_update(agent_id: str, db: Session = Depends(get_db)):
    agent = db.query(WatcherAgent).filter(WatcherAgent.id == agent_id).first()
    if not agent:
        raise HTTPException(status_code=404, detail="Watcher agent not found")
    agent.update_pending = False
    agent.update_version = None
    db.commit()
    return {"ok": True, "agent_id": agent_id, "update_pending": False}


@router.get("/api/admin/watchers/{agent_id}/update-check")
def update_check(agent_id: str, db: Session = Depends(get_db)):
    """
    Polled by watcher agents on each heartbeat cycle.
    Returns update_available=True when an update has been triggered via the UI.
    Automatically clears the flag after the watcher acknowledges it.
    """
    agent = db.query(WatcherAgent).filter(WatcherAgent.id == agent_id).first()
    if not agent:
        return {"update_available": False}

    pending = bool(agent.update_pending)

    if pending:
        # Clear the flag — the watcher will now self-update
        agent.update_pending = False
        db.commit()

    return {
        "update_available": pending,
        "version": agent.update_version,
    }


# ── HTML page ─────────────────────────────────────────────────────────────────

@router.get("/admin/watchers", response_class=HTMLResponse)
def admin_watchers_page(request: Request, db: Session = Depends(get_db)):
    agents = db.query(WatcherAgent).order_by(WatcherAgent.last_seen.desc()).all()
    agents_data = [_agent_to_dict(a) for a in agents]
    return templates.TemplateResponse("admin_watchers.html", {
        "request": request,
        "agents": agents_data,
    })
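The three update endpoints form a small poll-and-acknowledge protocol: `trigger-update` arms the flag, `update-check` hands it to the watcher and clears it in the same request, and `clear-update` is the manual escape hatch. Because the server clears `update_pending` as it serves `update_available=True`, each trigger produces exactly one self-update. A minimal sketch of the watcher-side handling (the `handle_update_check` helper is illustrative, not part of this PR):

```python
import json

def handle_update_check(body: str) -> tuple:
    """Parse an /update-check response body.

    Returns (update_available, target_version). The endpoint clears its
    update_pending flag in the same request that returns
    update_available=True, so the watcher should act on a True result
    exactly once (e.g. pull the new code and re-exec itself).
    """
    data = json.loads(body)
    return bool(data.get("update_available")), data.get("version")

# The two response shapes the endpoint can produce:
armed, version = handle_update_check('{"update_available": true, "version": "v2"}')
idle, _ = handle_update_check('{"update_available": false}')
```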
@@ -5,7 +5,7 @@ from datetime import datetime
 from typing import Optional, List
 
 from backend.database import get_db
-from backend.models import Emitter
+from backend.models import Emitter, WatcherAgent
 
 router = APIRouter()
 
@@ -107,6 +107,35 @@ def get_fleet_status(db: Session = Depends(get_db)):
     emitters = db.query(Emitter).all()
     return emitters
 
+
+# ── Watcher agent upsert helper ───────────────────────────────────────────────
+
+def _upsert_watcher_agent(db: Session, source_id: str, source_type: str,
+                          version: str, ip_address: str, log_tail: str,
+                          status: str) -> None:
+    """Create or update the WatcherAgent row for a given source_id."""
+    agent = db.query(WatcherAgent).filter(WatcherAgent.id == source_id).first()
+    if agent:
+        agent.source_type = source_type
+        agent.version = version
+        agent.last_seen = datetime.utcnow()
+        agent.status = status
+        if ip_address:
+            agent.ip_address = ip_address
+        if log_tail is not None:
+            agent.log_tail = log_tail
+    else:
+        agent = WatcherAgent(
+            id=source_id,
+            source_type=source_type,
+            version=version,
+            last_seen=datetime.utcnow(),
+            status=status,
+            ip_address=ip_address,
+            log_tail=log_tail,
+        )
+        db.add(agent)
+
+
 # series3v1.1 Standardized Heartbeat Schema (multi-unit)
 from fastapi import Request
 
@@ -120,6 +149,11 @@ async def series3_heartbeat(request: Request, db: Session = Depends(get_db)):
 
     source = payload.get("source_id")
     units = payload.get("units", [])
+    version = payload.get("version")
+    log_tail = payload.get("log_tail")  # list of strings or None
+    import json as _json
+    log_tail_str = _json.dumps(log_tail) if log_tail is not None else None
+    client_ip = request.client.host if request.client else None
 
     print("\n=== Series 3 Heartbeat ===")
     print("Source:", source)
@@ -182,13 +216,27 @@ async def series3_heartbeat(request: Request, db: Session = Depends(get_db)):
 
         results.append({"unit": uid, "status": status})
 
+    if source:
+        _upsert_watcher_agent(db, source, "series3_watcher", version,
+                              client_ip, log_tail_str, "ok")
+
     db.commit()
 
+    # Check if an update has been triggered for this agent
+    update_available = False
+    if source:
+        agent = db.query(WatcherAgent).filter(WatcherAgent.id == source).first()
+        if agent and agent.update_pending:
+            update_available = True
+            agent.update_pending = False
+            db.commit()
+
     return {
         "message": "Heartbeat processed",
         "source": source,
         "units_processed": len(results),
-        "results": results
+        "results": results,
+        "update_available": update_available,
     }
 
 
@@ -219,8 +267,14 @@ async def series4_heartbeat(request: Request, db: Session = Depends(get_db)):
     """
     payload = await request.json()
 
-    source = payload.get("source", "series4_emitter")
+    # Accept source_id (new standard field) with fallback to legacy "source" key
+    source = payload.get("source_id") or payload.get("source", "series4_emitter")
     units = payload.get("units", [])
+    version = payload.get("version")
+    log_tail = payload.get("log_tail")
+    import json as _json
+    log_tail_str = _json.dumps(log_tail) if log_tail is not None else None
+    client_ip = request.client.host if request.client else None
 
     print("\n=== Series 4 Heartbeat ===")
     print("Source:", source)
@@ -276,11 +330,25 @@ async def series4_heartbeat(request: Request, db: Session = Depends(get_db)):
 
         results.append({"unit": uid, "status": status})
 
+    if source:
+        _upsert_watcher_agent(db, source, "series4_watcher", version,
+                              client_ip, log_tail_str, "ok")
+
    db.commit()
 
+    # Check if an update has been triggered for this agent
+    update_available = False
+    if source:
+        agent = db.query(WatcherAgent).filter(WatcherAgent.id == source).first()
+        if agent and agent.update_pending:
+            update_available = True
+            agent.update_pending = False
+            db.commit()
+
     return {
         "message": "Heartbeat processed",
         "source": source,
         "units_processed": len(results),
-        "results": results
+        "results": results,
+        "update_available": update_available,
     }
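Both heartbeat handlers now read the same extended fields, upsert a `WatcherAgent` row, and echo `update_available` back to the agent in the response. A sketch of a conforming series3 payload and the `log_tail` serialization the handlers perform (unit ids and log lines are invented):

```python
import json

# Illustrative heartbeat payload; it carries exactly the fields the handler
# reads: source_id, units, version, log_tail. Values are made up.
payload = {
    "source_id": "series3-watcher-01",
    "version": "1.1",
    "units": [{"unit": "unit-001", "status": "ok"}],
    "log_tail": ["poll ok", "poll ok"],
}

# The handler stores log_tail re-serialized as a JSON string, mirroring
# log_tail_str = _json.dumps(log_tail) in the endpoint.
log_tail_str = json.dumps(payload["log_tail"]) if payload["log_tail"] is not None else None
```

Storing the tail as a JSON string keeps the `WatcherAgent.log_tail` column a plain text field while preserving the list structure for the admin page.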
backend/services/alert_service.py (new file, 462 lines)
@@ -0,0 +1,462 @@

"""
Alert Service

Manages in-app alerts for device status changes and system events.
Provides foundation for future notification channels (email, webhook).
"""

import json
import uuid
import logging
from datetime import datetime, timedelta
from typing import Optional, List, Dict, Any

from sqlalchemy.orm import Session
from sqlalchemy import and_, or_

from backend.models import Alert, RosterUnit

logger = logging.getLogger(__name__)


class AlertService:
    """
    Service for managing alerts.

    Handles alert lifecycle:
    - Create alerts from various triggers
    - Query active alerts
    - Acknowledge/resolve/dismiss alerts
    - (Future) Dispatch to notification channels
    """

    def __init__(self, db: Session):
        self.db = db

    def create_alert(
        self,
        alert_type: str,
        title: str,
        message: str = None,
        severity: str = "warning",
        unit_id: str = None,
        project_id: str = None,
        location_id: str = None,
        schedule_id: str = None,
        metadata: dict = None,
        expires_hours: int = 24,
    ) -> Alert:
        """
        Create a new alert.

        Args:
            alert_type: Type of alert (device_offline, device_online, schedule_failed)
            title: Short alert title
            message: Detailed description
            severity: info, warning, or critical
            unit_id: Related unit ID (optional)
            project_id: Related project ID (optional)
            location_id: Related location ID (optional)
            schedule_id: Related schedule ID (optional)
            metadata: Additional JSON data
            expires_hours: Hours until auto-expiry (default 24)

        Returns:
            Created Alert instance
        """
        alert = Alert(
            id=str(uuid.uuid4()),
            alert_type=alert_type,
            title=title,
            message=message,
            severity=severity,
            unit_id=unit_id,
            project_id=project_id,
            location_id=location_id,
            schedule_id=schedule_id,
            alert_metadata=json.dumps(metadata) if metadata else None,
            status="active",
            expires_at=datetime.utcnow() + timedelta(hours=expires_hours),
        )

        self.db.add(alert)
        self.db.commit()
        self.db.refresh(alert)

        logger.info(f"Created alert: {alert.title} ({alert.alert_type})")
        return alert

    def create_device_offline_alert(
        self,
        unit_id: str,
        consecutive_failures: int = 0,
        last_error: str = None,
    ) -> Optional[Alert]:
        """
        Create alert when device becomes unreachable.

        Only creates if no active offline alert exists for this device.

        Args:
            unit_id: The unit that went offline
            consecutive_failures: Number of consecutive poll failures
            last_error: Last error message from polling

        Returns:
            Created Alert or None if alert already exists
        """
        # Check if active offline alert already exists
        existing = self.db.query(Alert).filter(
            and_(
                Alert.unit_id == unit_id,
                Alert.alert_type == "device_offline",
                Alert.status == "active",
            )
        ).first()

        if existing:
            logger.debug(f"Offline alert already exists for {unit_id}")
            return None

        # Get unit info for title
        unit = self.db.query(RosterUnit).filter_by(id=unit_id).first()
        unit_name = unit.id if unit else unit_id

        # Determine severity based on failure count
        severity = "critical" if consecutive_failures >= 5 else "warning"

        return self.create_alert(
            alert_type="device_offline",
            title=f"{unit_name} is offline",
            message=f"Device has been unreachable after {consecutive_failures} failed connection attempts."
            + (f" Last error: {last_error}" if last_error else ""),
            severity=severity,
            unit_id=unit_id,
            metadata={
                "consecutive_failures": consecutive_failures,
                "last_error": last_error,
            },
            expires_hours=48,  # Offline alerts stay longer
        )

    def resolve_device_offline_alert(self, unit_id: str) -> Optional[Alert]:
        """
        Auto-resolve offline alert when device comes back online.

        Also creates a "device_online" info alert to notify the user.

        Args:
            unit_id: The unit that came back online

        Returns:
            The resolved Alert or None if no alert existed
        """
        # Find active offline alert
        alert = self.db.query(Alert).filter(
            and_(
                Alert.unit_id == unit_id,
                Alert.alert_type == "device_offline",
                Alert.status == "active",
            )
        ).first()

        if not alert:
            return None

        # Resolve the offline alert
        alert.status = "resolved"
        alert.resolved_at = datetime.utcnow()
        self.db.commit()

        logger.info(f"Resolved offline alert for {unit_id}")

        # Create online notification
        unit = self.db.query(RosterUnit).filter_by(id=unit_id).first()
        unit_name = unit.id if unit else unit_id

        self.create_alert(
            alert_type="device_online",
            title=f"{unit_name} is back online",
            message="Device connection has been restored.",
            severity="info",
            unit_id=unit_id,
            expires_hours=6,  # Info alerts expire quickly
        )

        return alert

    def create_schedule_failed_alert(
        self,
        schedule_id: str,
        action_type: str,
        unit_id: str = None,
        error_message: str = None,
        project_id: str = None,
        location_id: str = None,
    ) -> Alert:
        """
        Create alert when a scheduled action fails.

        Args:
            schedule_id: The ScheduledAction or RecurringSchedule ID
            action_type: start, stop, download, cycle
            unit_id: Related unit
            error_message: Error from execution
            project_id: Related project
            location_id: Related location

        Returns:
            Created Alert
        """
        return self.create_alert(
            alert_type="schedule_failed",
            title=f"Scheduled {action_type} failed",
            message=error_message or f"The scheduled {action_type} action did not complete successfully.",
            severity="warning",
            unit_id=unit_id,
            project_id=project_id,
            location_id=location_id,
            schedule_id=schedule_id,
            metadata={"action_type": action_type},
            expires_hours=24,
        )

    def create_schedule_completed_alert(
        self,
        schedule_id: str,
        action_type: str,
        unit_id: str = None,
        project_id: str = None,
        location_id: str = None,
        metadata: dict = None,
    ) -> Alert:
        """
        Create alert when a scheduled action completes successfully.

        Args:
            schedule_id: The ScheduledAction ID
            action_type: start, stop, download, cycle
            unit_id: Related unit
            project_id: Related project
            location_id: Related location
            metadata: Additional info (e.g., downloaded folder, index numbers)

        Returns:
            Created Alert
        """
        # Build descriptive message based on action type and metadata
        if action_type == "stop" and metadata:
            download_folder = metadata.get("downloaded_folder")
            download_success = metadata.get("download_success", False)
            if download_success and download_folder:
                message = f"Measurement stopped and data downloaded ({download_folder})"
            elif download_success is False and metadata.get("download_attempted"):
                message = "Measurement stopped but download failed"
            else:
                message = "Measurement stopped successfully"
        elif action_type == "start" and metadata:
            new_index = metadata.get("new_index")
            if new_index is not None:
                message = f"Measurement started (index {new_index:04d})"
            else:
                message = "Measurement started successfully"
        else:
            message = f"Scheduled {action_type} completed successfully"

        return self.create_alert(
            alert_type="schedule_completed",
            title=f"Scheduled {action_type} completed",
            message=message,
            severity="info",
            unit_id=unit_id,
            project_id=project_id,
            location_id=location_id,
            schedule_id=schedule_id,
            metadata={"action_type": action_type, **(metadata or {})},
            expires_hours=12,  # Info alerts expire quickly
        )

    def get_active_alerts(
        self,
        project_id: str = None,
        unit_id: str = None,
        alert_type: str = None,
        min_severity: str = None,
        limit: int = 50,
    ) -> List[Alert]:
        """
        Query active alerts with optional filters.

        Args:
            project_id: Filter by project
            unit_id: Filter by unit
            alert_type: Filter by alert type
            min_severity: Minimum severity (info, warning, critical)
            limit: Maximum results

        Returns:
            List of matching alerts
        """
        query = self.db.query(Alert).filter(Alert.status == "active")

        if project_id:
            query = query.filter(Alert.project_id == project_id)

        if unit_id:
            query = query.filter(Alert.unit_id == unit_id)

        if alert_type:
            query = query.filter(Alert.alert_type == alert_type)

        if min_severity:
            # Map severity to numeric for comparison
            severity_levels = {"info": 1, "warning": 2, "critical": 3}
            min_level = severity_levels.get(min_severity, 1)

            if min_level == 2:
                query = query.filter(Alert.severity.in_(["warning", "critical"]))
            elif min_level == 3:
                query = query.filter(Alert.severity == "critical")

        return query.order_by(Alert.created_at.desc()).limit(limit).all()

    def get_all_alerts(
        self,
        status: str = None,
        project_id: str = None,
        unit_id: str = None,
        alert_type: str = None,
        limit: int = 50,
        offset: int = 0,
    ) -> List[Alert]:
        """
        Query all alerts with optional filters (includes non-active).

        Args:
            status: Filter by status (active, acknowledged, resolved, dismissed)
            project_id: Filter by project
            unit_id: Filter by unit
            alert_type: Filter by alert type
            limit: Maximum results
            offset: Pagination offset

        Returns:
            List of matching alerts
        """
        query = self.db.query(Alert)

        if status:
            query = query.filter(Alert.status == status)

        if project_id:
            query = query.filter(Alert.project_id == project_id)

        if unit_id:
            query = query.filter(Alert.unit_id == unit_id)

        if alert_type:
            query = query.filter(Alert.alert_type == alert_type)

        return (
            query.order_by(Alert.created_at.desc())
            .offset(offset)
            .limit(limit)
            .all()
        )

    def get_active_alert_count(self) -> int:
        """Get count of active alerts for badge display."""
        return self.db.query(Alert).filter(Alert.status == "active").count()

    def acknowledge_alert(self, alert_id: str) -> Optional[Alert]:
        """
        Mark alert as acknowledged.

        Args:
            alert_id: Alert to acknowledge

        Returns:
            Updated Alert or None if not found
        """
        alert = self.db.query(Alert).filter_by(id=alert_id).first()
        if not alert:
            return None

        alert.status = "acknowledged"
        alert.acknowledged_at = datetime.utcnow()
        self.db.commit()

        logger.info(f"Acknowledged alert: {alert.title}")
        return alert

    def dismiss_alert(self, alert_id: str) -> Optional[Alert]:
        """
        Dismiss alert (user chose to ignore).

        Args:
            alert_id: Alert to dismiss

        Returns:
            Updated Alert or None if not found
        """
        alert = self.db.query(Alert).filter_by(id=alert_id).first()
        if not alert:
            return None

        alert.status = "dismissed"
        self.db.commit()

        logger.info(f"Dismissed alert: {alert.title}")
        return alert

    def resolve_alert(self, alert_id: str) -> Optional[Alert]:
        """
        Manually resolve an alert.

        Args:
            alert_id: Alert to resolve

        Returns:
            Updated Alert or None if not found
        """
        alert = self.db.query(Alert).filter_by(id=alert_id).first()
        if not alert:
            return None

        alert.status = "resolved"
        alert.resolved_at = datetime.utcnow()
        self.db.commit()

        logger.info(f"Resolved alert: {alert.title}")
        return alert

    def cleanup_expired_alerts(self) -> int:
        """
        Remove alerts past their expiration time.

        Returns:
            Number of alerts cleaned up
        """
        now = datetime.utcnow()
        expired = self.db.query(Alert).filter(
            and_(
                Alert.expires_at.isnot(None),
                Alert.expires_at < now,
                Alert.status == "active",
            )
        ).all()

        count = len(expired)
        for alert in expired:
            alert.status = "dismissed"

        if count > 0:
            self.db.commit()
            logger.info(f"Cleaned up {count} expired alerts")

        return count


def get_alert_service(db: Session) -> AlertService:
    """Get an AlertService instance with the given database session."""
    return AlertService(db)
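The `min_severity` handling ranks the three levels numerically and widens the SQL filter to everything at or above the threshold; an unrecognized name falls back to rank 1 and filters nothing. The same mapping as a pure function (the `severities_admitted` helper is mine, not part of the service):

```python
def severities_admitted(min_severity: str) -> set:
    """Mirror the min_severity filter in AlertService.get_active_alerts:
    map a threshold name to the set of severity values the query matches.
    Unknown names default to rank 1, i.e. no filtering."""
    levels = {"info": 1, "warning": 2, "critical": 3}
    min_level = levels.get(min_severity, 1)
    return {name for name, rank in levels.items() if rank >= min_level}
```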
@@ -289,6 +289,74 @@ class DeviceController:
|
|||||||
else:
|
else:
|
||||||
raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
|
raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
|
||||||
|
|
||||||
|
# ========================================================================
|
||||||
|
# FTP Control
|
||||||
|
# ========================================================================
|
||||||
|
|
||||||
|
async def enable_ftp(
|
||||||
|
self,
|
||||||
|
unit_id: str,
|
||||||
|
device_type: str,
|
||||||
|
) -> Dict[str, Any]:
|
||||||
|
"""
|
||||||
|
Enable FTP server on device.
|
||||||
|
|
||||||
|
Must be called before downloading files.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
unit_id: Unit identifier
|
||||||
|
device_type: "slm" | "seismograph"
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response dict with status
|
||||||
|
"""
|
||||||
|
if device_type == "slm":
|
||||||
|
try:
|
||||||
|
return await self.slmm_client.enable_ftp(unit_id)
|
||||||
|
except SLMMClientError as e:
|
||||||
|
raise DeviceControllerError(f"SLMM error: {str(e)}")
|
||||||
|
|
||||||
|
elif device_type == "seismograph":
|
||||||
|
return {
|
||||||
|
"status": "not_implemented",
|
||||||
|
"message": "Seismograph FTP not yet implemented",
|
||||||
|
"unit_id": unit_id,
|
||||||
|
}
|
||||||
|
|
||||||
|
else:
|
||||||
|
raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
|
||||||
|
|
||||||
|
async def disable_ftp(
|
||||||
|
self,
|
||||||
|
unit_id: str,
|
||||||
|
device_type: str,
|
||||||
|
) -> Dict[str, Any]:
|
||||||
|
"""
|
||||||
|
Disable FTP server on device.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
unit_id: Unit identifier
|
||||||
|
device_type: "slm" | "seismograph"
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response dict with status
|
||||||
|
"""
|
||||||
|
if device_type == "slm":
|
||||||
|
try:
|
||||||
|
return await self.slmm_client.disable_ftp(unit_id)
|
||||||
|
except SLMMClientError as e:
|
||||||
|
raise DeviceControllerError(f"SLMM error: {str(e)}")
|
||||||
|
|
||||||
|
elif device_type == "seismograph":
|
||||||
|
return {
|
||||||
|
"status": "not_implemented",
|
||||||
|
"message": "Seismograph FTP not yet implemented",
|
||||||
|
"unit_id": unit_id,
|
||||||
|
}
|
||||||
|
|
||||||
|
else:
|
||||||
|
raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
|
||||||
|
|
||||||
# ========================================================================
|
# ========================================================================
|
||||||
# Device Configuration
|
# Device Configuration
|
||||||
# ========================================================================
|
# ========================================================================
|
||||||
@@ -333,6 +401,157 @@ class DeviceController:
|
|||||||
else:
|
else:
|
||||||
raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
|
raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
|
    # ========================================================================
    # Store/Index Management
    # ========================================================================

    async def increment_index(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Increment the store/index number on a device.

        For SLMs, this increments the store name to prevent "overwrite data?" prompts.
        Should be called before starting a new measurement if auto_increment_index is enabled.

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"

        Returns:
            Response dict with old_index and new_index
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.increment_index(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            # Seismographs may not have the same concept of store index
            return {
                "status": "not_applicable",
                "message": "Index increment not applicable for seismographs",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    async def get_index_number(
        self,
        unit_id: str,
        device_type: str,
    ) -> Dict[str, Any]:
        """
        Get current store/index number from device.

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"

        Returns:
            Response dict with current index_number
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.get_index_number(unit_id)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_applicable",
                "message": "Index number not applicable for seismographs",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    # ========================================================================
    # Cycle Commands (for scheduled automation)
    # ========================================================================

    async def start_cycle(
        self,
        unit_id: str,
        device_type: str,
        sync_clock: bool = True,
    ) -> Dict[str, Any]:
        """
        Execute complete start cycle for scheduled automation.

        This handles the full pre-recording workflow:
        1. Sync device clock to server time
        2. Find next safe index (with overwrite protection)
        3. Start measurement

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"
            sync_clock: Whether to sync device clock to server time

        Returns:
            Response dict from device module
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.start_cycle(unit_id, sync_clock)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph start cycle not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    async def stop_cycle(
        self,
        unit_id: str,
        device_type: str,
        download: bool = True,
    ) -> Dict[str, Any]:
        """
        Execute complete stop cycle for scheduled automation.

        This handles the full post-recording workflow:
        1. Stop measurement
        2. Enable FTP
        3. Download measurement folder
        4. Verify download

        Args:
            unit_id: Unit identifier
            device_type: "slm" | "seismograph"
            download: Whether to download measurement data

        Returns:
            Response dict from device module
        """
        if device_type == "slm":
            try:
                return await self.slmm_client.stop_cycle(unit_id, download)
            except SLMMClientError as e:
                raise DeviceControllerError(f"SLMM error: {str(e)}")

        elif device_type == "seismograph":
            return {
                "status": "not_implemented",
                "message": "Seismograph stop cycle not yet implemented",
                "unit_id": unit_id,
            }

        else:
            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")

    # ========================================================================
    # Health Check
    # ========================================================================
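Every controller method in this hunk follows the same dispatch shape: delegate to the SLMM client for `slm`, return a stub payload for `seismograph`, and reject anything else. A minimal runnable sketch of that pattern — `FakeSLMMClient` and its payload are illustrative stand-ins, not the real `slmm_client` API:

```python
import asyncio
from typing import Any, Dict


class SLMMClientError(Exception):
    """Raised by the (stubbed) SLMM client on transport/protocol errors."""


class DeviceControllerError(Exception):
    """Controller-level wrapper for downstream client errors."""


class UnsupportedDeviceTypeError(Exception):
    """Raised when device_type is neither 'slm' nor 'seismograph'."""


class FakeSLMMClient:
    """Illustrative stand-in for the real SLMM HTTP client."""

    async def increment_index(self, unit_id: str) -> Dict[str, Any]:
        return {"unit_id": unit_id, "old_index": 4, "new_index": 5}


async def increment_index(client, unit_id: str, device_type: str) -> Dict[str, Any]:
    # Same dispatch shape as the DeviceController methods above.
    if device_type == "slm":
        try:
            return await client.increment_index(unit_id)
        except SLMMClientError as e:
            raise DeviceControllerError(f"SLMM error: {e}")
    elif device_type == "seismograph":
        return {"status": "not_applicable", "unit_id": unit_id}
    else:
        raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")


result = asyncio.run(increment_index(FakeSLMMClient(), "SLM-01", "slm"))
```

Wrapping `SLMMClientError` in `DeviceControllerError` keeps callers insulated from the transport layer; only unknown device types escape as a distinct error.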
184	backend/services/device_status_monitor.py	Normal file
@@ -0,0 +1,184 @@
"""
Device Status Monitor

Background task that monitors device reachability via SLMM polling status
and triggers alerts when devices go offline or come back online.

This service bridges SLMM's device polling with Terra-View's alert system.
"""

import asyncio
import logging
from typing import Optional, Dict

from backend.database import SessionLocal
from backend.services.slmm_client import get_slmm_client, SLMMClientError
from backend.services.alert_service import get_alert_service

logger = logging.getLogger(__name__)


class DeviceStatusMonitor:
    """
    Monitors device reachability via SLMM's polling status endpoint.

    Detects state transitions (online→offline, offline→online) and
    triggers AlertService to create/resolve alerts.

    Usage:
        monitor = DeviceStatusMonitor()
        await monitor.start()  # Start background monitoring
        monitor.stop()         # Stop monitoring
    """

    def __init__(self, check_interval: int = 60):
        """
        Initialize the monitor.

        Args:
            check_interval: Seconds between status checks (default: 60)
        """
        self.check_interval = check_interval
        self.running = False
        self.task: Optional[asyncio.Task] = None
        self.slmm_client = get_slmm_client()

        # Track previous device states to detect transitions
        self._device_states: Dict[str, bool] = {}

    async def start(self):
        """Start the monitoring background task."""
        if self.running:
            logger.warning("DeviceStatusMonitor is already running")
            return

        self.running = True
        self.task = asyncio.create_task(self._monitor_loop())
        logger.info(f"DeviceStatusMonitor started (checking every {self.check_interval}s)")

    def stop(self):
        """Stop the monitoring background task."""
        self.running = False
        if self.task:
            self.task.cancel()
        logger.info("DeviceStatusMonitor stopped")

    async def _monitor_loop(self):
        """Main monitoring loop."""
        while self.running:
            try:
                await self._check_all_devices()
            except Exception as e:
                logger.error(f"Error in device status monitor: {e}", exc_info=True)

            # Sleep in small intervals for graceful shutdown
            for _ in range(self.check_interval):
                if not self.running:
                    break
                await asyncio.sleep(1)

        logger.info("DeviceStatusMonitor loop exited")

    async def _check_all_devices(self):
        """
        Fetch polling status from SLMM and detect state transitions.

        Uses GET /api/slmm/_polling/status (proxied to SLMM).
        """
        try:
            # Get status from SLMM
            status_response = await self.slmm_client.get_polling_status()
            devices = status_response.get("devices", [])

            if not devices:
                logger.debug("No devices in polling status response")
                return

            db = SessionLocal()
            try:
                alert_service = get_alert_service(db)

                for device in devices:
                    unit_id = device.get("unit_id")
                    if not unit_id:
                        continue

                    is_reachable = device.get("is_reachable", True)
                    previous_reachable = self._device_states.get(unit_id)

                    # Skip if this is the first check (no previous state)
                    if previous_reachable is None:
                        self._device_states[unit_id] = is_reachable
                        logger.debug(f"Initial state for {unit_id}: reachable={is_reachable}")
                        continue

                    # Detect offline transition (was online, now offline)
                    if previous_reachable and not is_reachable:
                        logger.warning(f"Device {unit_id} went OFFLINE")
                        alert_service.create_device_offline_alert(
                            unit_id=unit_id,
                            consecutive_failures=device.get("consecutive_failures", 0),
                            last_error=device.get("last_error"),
                        )

                    # Detect online transition (was offline, now online)
                    elif not previous_reachable and is_reachable:
                        logger.info(f"Device {unit_id} came back ONLINE")
                        alert_service.resolve_device_offline_alert(unit_id)

                    # Update tracked state
                    self._device_states[unit_id] = is_reachable

                # Cleanup expired alerts while we're here
                alert_service.cleanup_expired_alerts()

            finally:
                db.close()

        except SLMMClientError as e:
            logger.warning(f"Could not reach SLMM for status check: {e}")
        except Exception as e:
            logger.error(f"Error checking device status: {e}", exc_info=True)

    def get_tracked_devices(self) -> Dict[str, bool]:
        """
        Get the current tracked device states.

        Returns:
            Dict mapping unit_id to is_reachable status
        """
        return dict(self._device_states)

    def clear_tracked_devices(self):
        """Clear all tracked device states (useful for testing)."""
        self._device_states.clear()


# Singleton instance
_monitor_instance: Optional[DeviceStatusMonitor] = None


def get_device_status_monitor() -> DeviceStatusMonitor:
    """
    Get the device status monitor singleton instance.

    Returns:
        DeviceStatusMonitor instance
    """
    global _monitor_instance
    if _monitor_instance is None:
        _monitor_instance = DeviceStatusMonitor()
    return _monitor_instance


async def start_device_status_monitor():
    """Start the global device status monitor."""
    monitor = get_device_status_monitor()
    await monitor.start()


def stop_device_status_monitor():
    """Stop the global device status monitor."""
    monitor = get_device_status_monitor()
    monitor.stop()
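The heart of `_check_all_devices` is a small state machine: compare each device's current `is_reachable` flag to the last recorded value and act only on transitions, treating the first sighting of a unit as a baseline with no event. That comparison can be isolated into a pure, testable function (a sketch; `detect_transitions` and its event tuples are illustrative, not part of the service):

```python
from typing import Dict, List, Optional, Tuple


def detect_transitions(
    previous: Dict[str, bool],
    current: Dict[str, bool],
) -> List[Tuple[str, str]]:
    """Return (unit_id, event) pairs, updating `previous` to the new states.

    First sighting of a unit records its state without emitting an event,
    mirroring the "skip if this is the first check" branch in the monitor.
    """
    events: List[Tuple[str, str]] = []
    for unit_id, is_reachable in current.items():
        prev: Optional[bool] = previous.get(unit_id)
        if prev is None:
            pass  # first check: just record the baseline state
        elif prev and not is_reachable:
            events.append((unit_id, "offline"))
        elif not prev and is_reachable:
            events.append((unit_id, "online"))
        previous[unit_id] = is_reachable
    return events


states: Dict[str, bool] = {}
detect_transitions(states, {"SLM-01": True, "SLM-02": True})  # baseline, no events
evts = detect_transitions(states, {"SLM-01": False, "SLM-02": True})
```

Keeping the transition logic free of I/O is what makes the alert-triggering side effects in `_check_all_devices` easy to reason about.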
725	backend/services/fleet_calendar_service.py	Normal file
@@ -0,0 +1,725 @@
"""
|
||||||
|
Fleet Calendar Service
|
||||||
|
|
||||||
|
Business logic for:
|
||||||
|
- Calculating unit availability on any given date
|
||||||
|
- Calibration status tracking (valid, expiring soon, expired)
|
||||||
|
- Job reservation management
|
||||||
|
- Conflict detection (calibration expires mid-job)
|
||||||
|
"""
|
||||||
|
|
||||||
|
from datetime import date, datetime, timedelta
|
||||||
|
from typing import Dict, List, Optional, Tuple
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from sqlalchemy import and_, or_
|
||||||
|
|
||||||
|
from backend.models import (
|
||||||
|
RosterUnit, JobReservation, JobReservationUnit,
|
||||||
|
UserPreferences, Project, DeploymentRecord
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def get_calibration_status(
|
||||||
|
unit: RosterUnit,
|
||||||
|
check_date: date,
|
||||||
|
warning_days: int = 30
|
||||||
|
) -> str:
|
||||||
|
"""
|
||||||
|
Determine calibration status for a unit on a specific date.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
"valid" - Calibration is good on this date
|
||||||
|
"expiring_soon" - Within warning_days of expiry
|
||||||
|
"expired" - Calibration has expired
|
||||||
|
"needs_calibration" - No calibration date set
|
||||||
|
"""
|
||||||
|
if not unit.last_calibrated:
|
||||||
|
return "needs_calibration"
|
||||||
|
|
||||||
|
# Calculate expiry date (1 year from last calibration)
|
||||||
|
expiry_date = unit.last_calibrated + timedelta(days=365)
|
||||||
|
|
||||||
|
if check_date >= expiry_date:
|
||||||
|
return "expired"
|
||||||
|
elif check_date >= expiry_date - timedelta(days=warning_days):
|
||||||
|
return "expiring_soon"
|
||||||
|
else:
|
||||||
|
return "valid"
|
||||||
|
|
||||||
|
|
||||||
|
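The thresholds above hinge on two `>=` comparisons: expiry is `last_calibrated + 365 days`, and the warning window opens `warning_days` before that, so the expiry date itself already counts as expired. A standalone restatement on bare dates (no `RosterUnit` model) makes the boundaries concrete:

```python
from datetime import date, timedelta


def calibration_status(last_calibrated, check_date, warning_days=30):
    """Same decision table as get_calibration_status, on plain dates."""
    if not last_calibrated:
        return "needs_calibration"
    expiry = last_calibrated + timedelta(days=365)
    if check_date >= expiry:
        return "expired"
    if check_date >= expiry - timedelta(days=warning_days):
        return "expiring_soon"
    return "valid"


cal = date(2024, 1, 1)  # 2024 is a leap year, so +365 days lands on 2024-12-31
status_on_expiry = calibration_status(cal, date(2024, 12, 31))
```

Note that `timedelta(days=365)` is a fixed offset, not "same day next year": across a leap day the expiry lands one calendar day early.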
def get_unit_reservations_on_date(
    db: Session,
    unit_id: str,
    check_date: date
) -> List[JobReservation]:
    """Get all reservations that include this unit on the given date."""

    # Get reservation IDs that have this unit assigned
    assigned_reservation_ids = db.query(JobReservationUnit.reservation_id).filter(
        JobReservationUnit.unit_id == unit_id
    ).subquery()

    # Get reservations that have this unit assigned AND whose date range
    # contains check_date
    reservations = db.query(JobReservation).filter(
        JobReservation.id.in_(assigned_reservation_ids),
        JobReservation.start_date <= check_date,
        JobReservation.end_date >= check_date
    ).all()

    return reservations


def get_active_deployment(db: Session, unit_id: str) -> Optional[DeploymentRecord]:
    """Return the active (unreturned) deployment record for a unit, or None."""
    return (
        db.query(DeploymentRecord)
        .filter(
            DeploymentRecord.unit_id == unit_id,
            DeploymentRecord.actual_removal_date == None
        )
        .order_by(DeploymentRecord.created_at.desc())
        .first()
    )


def is_unit_available_on_date(
    db: Session,
    unit: RosterUnit,
    check_date: date,
    warning_days: int = 30
) -> Tuple[bool, str, Optional[str]]:
    """
    Check if a unit is available on a specific date.

    Returns:
        (is_available, status, reservation_name)
        - is_available: True if unit can be assigned to new work
        - status: "available", "reserved", "expired", "retired", "needs_calibration", "in_field"
        - reservation_name: Name of blocking reservation or project ref (if any)
    """
    # Check if retired
    if unit.retired:
        return False, "retired", None

    # Check calibration status
    cal_status = get_calibration_status(unit, check_date, warning_days)
    if cal_status == "expired":
        return False, "expired", None
    if cal_status == "needs_calibration":
        return False, "needs_calibration", None

    # Check for an active deployment record (unit is physically in the field)
    active_deployment = get_active_deployment(db, unit.id)
    if active_deployment:
        label = active_deployment.project_ref or "Field deployment"
        return False, "in_field", label

    # Check if already reserved
    reservations = get_unit_reservations_on_date(db, unit.id, check_date)
    if reservations:
        return False, "reserved", reservations[0].name

    # Unit is available (even if expiring soon - that's just a warning)
    return True, "available", None
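Note the fixed precedence here: retired beats calibration problems, which beat an active field deployment, which beats reservations, and `expiring_soon` never blocks availability. A dependency-free sketch of that ordering, with plain values standing in for the ORM lookups (`availability` is illustrative, not part of the service):

```python
from datetime import date, timedelta
from typing import Optional, Tuple


def availability(unit: dict, check_date: date, in_field: bool,
                 reserved_as: Optional[str]) -> Tuple[bool, str, Optional[str]]:
    """Mirror the precedence order of is_unit_available_on_date."""
    if unit.get("retired"):
        return (False, "retired", None)
    last_cal = unit.get("last_calibrated")
    if not last_cal:
        return (False, "needs_calibration", None)
    if check_date >= last_cal + timedelta(days=365):
        return (False, "expired", None)
    if in_field:
        return (False, "in_field", unit.get("project_ref") or "Field deployment")
    if reserved_as:
        return (False, "reserved", reserved_as)
    # expiring_soon is a warning, not a blocker
    return (True, "available", None)


unit = {"retired": False, "last_calibrated": date(2025, 1, 1)}
```

Because each check returns early, a retired unit with expired calibration reports `"retired"`, never `"expired"` — the status field carries only the highest-precedence reason.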
def get_day_summary(
    db: Session,
    check_date: date,
    device_type: str = "seismograph"
) -> Dict:
    """
    Get a complete summary of fleet status for a specific day.

    Returns dict with:
    - available_units: List of available unit IDs with calibration info
    - reserved_units: List of reserved unit IDs with reservation info
    - expired_units: List of units with expired calibration
    - expiring_soon_units: List of units expiring within warning period
    - reservations: List of active reservations on this date
    - counts: Summary counts
    """
    # Get user preferences for warning days
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    # Get all non-retired units of the specified device type
    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    available_units = []
    reserved_units = []
    expired_units = []
    expiring_soon_units = []
    needs_calibration_units = []
    in_field_units = []
    cal_expiring_today = []  # Units whose calibration expires ON this day

    for unit in units:
        is_avail, status, reservation_name = is_unit_available_on_date(
            db, unit, check_date, warning_days
        )

        cal_status = get_calibration_status(unit, check_date, warning_days)
        expiry_date = None
        if unit.last_calibrated:
            expiry_date = (unit.last_calibrated + timedelta(days=365)).isoformat()

        unit_info = {
            "id": unit.id,
            "last_calibrated": unit.last_calibrated.isoformat() if unit.last_calibrated else None,
            "expiry_date": expiry_date,
            "calibration_status": cal_status,
            "deployed": unit.deployed,
            "note": unit.note or ""
        }

        # Check if calibration expires ON this specific day
        if unit.last_calibrated:
            unit_expiry_date = unit.last_calibrated + timedelta(days=365)
            if unit_expiry_date == check_date:
                cal_expiring_today.append(unit_info)

        if status == "available":
            available_units.append(unit_info)
            if cal_status == "expiring_soon":
                expiring_soon_units.append(unit_info)
        elif status == "in_field":
            unit_info["project_ref"] = reservation_name
            in_field_units.append(unit_info)
        elif status == "reserved":
            unit_info["reservation_name"] = reservation_name
            reserved_units.append(unit_info)
            if cal_status == "expiring_soon":
                expiring_soon_units.append(unit_info)
        elif status == "expired":
            expired_units.append(unit_info)
        elif status == "needs_calibration":
            needs_calibration_units.append(unit_info)

    # Get active reservations on this date
    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= check_date,
        JobReservation.end_date >= check_date
    ).all()

    reservation_list = []
    for res in reservations:
        # Count assigned units for this reservation
        assigned_count = db.query(JobReservationUnit).filter(
            JobReservationUnit.reservation_id == res.id
        ).count()

        reservation_list.append({
            "id": res.id,
            "name": res.name,
            "start_date": res.start_date.isoformat(),
            "end_date": res.end_date.isoformat(),
            "assignment_type": res.assignment_type,
            "quantity_needed": res.quantity_needed,
            "assigned_count": assigned_count,
            "color": res.color,
            "project_id": res.project_id
        })

    return {
        "date": check_date.isoformat(),
        "device_type": device_type,
        "available_units": available_units,
        "in_field_units": in_field_units,
        "reserved_units": reserved_units,
        "expired_units": expired_units,
        "expiring_soon_units": expiring_soon_units,
        "needs_calibration_units": needs_calibration_units,
        "cal_expiring_today": cal_expiring_today,
        "reservations": reservation_list,
        "counts": {
            "available": len(available_units),
            "in_field": len(in_field_units),
            "reserved": len(reserved_units),
            "expired": len(expired_units),
            "expiring_soon": len(expiring_soon_units),
            "needs_calibration": len(needs_calibration_units),
            "cal_expiring_today": len(cal_expiring_today),
            "total": len(units)
        }
    }
def get_calendar_year_data(
    db: Session,
    year: int,
    device_type: str = "seismograph"
) -> Dict:
    """
    Get calendar data for an entire year.

    For performance, this returns summary counts per day rather than
    full unit lists. Use get_day_summary() for detailed day data.
    """
    # Get user preferences
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    # Get all units
    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    # Get all reservations that overlap with this year.
    # Include TBD reservations (end_date is null) that started before year end.
    year_start = date(year, 1, 1)
    year_end = date(year, 12, 31)

    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= year_end,
        or_(
            JobReservation.end_date >= year_start,
            JobReservation.end_date == None  # TBD reservations
        )
    ).all()

    # Get all unit assignments for these reservations
    reservation_ids = [r.id for r in reservations]
    assignments = db.query(JobReservationUnit).filter(
        JobReservationUnit.reservation_id.in_(reservation_ids)
    ).all() if reservation_ids else []

    # Build a lookup: unit_id -> list of (start_date, end_date, reservation_name).
    # For TBD reservations, use estimated_end_date if available, or a far-future date.
    unit_reservations = {}
    for res in reservations:
        res_assignments = [a for a in assignments if a.reservation_id == res.id]
        for assignment in res_assignments:
            unit_id = assignment.unit_id
            # Use unit-specific dates if set, otherwise use reservation dates
            start_d = assignment.unit_start_date or res.start_date
            if assignment.unit_end_tbd or (assignment.unit_end_date is None and res.end_date_tbd):
                # TBD: use estimated date or far future for availability calculation
                end_d = res.estimated_end_date or date(year + 5, 12, 31)
            else:
                end_d = assignment.unit_end_date or res.end_date or date(year + 5, 12, 31)

            if unit_id not in unit_reservations:
                unit_reservations[unit_id] = []
            unit_reservations[unit_id].append((start_d, end_d, res.name))

    # Build set of unit IDs that have an active deployment record (still in the field)
    unit_ids = [u.id for u in units]
    active_deployments = db.query(DeploymentRecord.unit_id).filter(
        DeploymentRecord.unit_id.in_(unit_ids),
        DeploymentRecord.actual_removal_date == None
    ).all()
    unit_in_field = {row.unit_id for row in active_deployments}

    # Generate data for each month
    months_data = {}

    for month in range(1, 13):
        # Get first and last day of month
        first_day = date(year, month, 1)
        if month == 12:
            last_day = date(year, 12, 31)
        else:
            last_day = date(year, month + 1, 1) - timedelta(days=1)

        days_data = {}
        current_day = first_day

        while current_day <= last_day:
            available = 0
            in_field = 0
            reserved = 0
            expired = 0
            expiring_soon = 0
            needs_cal = 0
            cal_expiring_on_day = 0  # Units whose calibration expires ON this day
            cal_expired_on_day = 0   # Units whose calibration expired ON this day

            for unit in units:
                # Check calibration
                cal_status = get_calibration_status(unit, current_day, warning_days)

                # Check if calibration expires/expired ON this specific day
                if unit.last_calibrated:
                    unit_expiry = unit.last_calibrated + timedelta(days=365)
                    if unit_expiry == current_day:
                        cal_expiring_on_day += 1
                    # Check if expiry was yesterday (first full day of being expired)
                    elif unit_expiry == current_day - timedelta(days=1):
                        cal_expired_on_day += 1

                if cal_status == "expired":
                    expired += 1
                    continue
                if cal_status == "needs_calibration":
                    needs_cal += 1
                    continue

                # Check active deployment record (in field)
                if unit.id in unit_in_field:
                    in_field += 1
                    continue

                # Check if reserved
                is_reserved = False
                if unit.id in unit_reservations:
                    for start_d, end_d, _ in unit_reservations[unit.id]:
                        if start_d <= current_day <= end_d:
                            is_reserved = True
                            break

                if is_reserved:
                    reserved += 1
                else:
                    available += 1

                if cal_status == "expiring_soon":
                    expiring_soon += 1

            days_data[current_day.day] = {
                "available": available,
                "in_field": in_field,
                "reserved": reserved,
                "expired": expired,
                "expiring_soon": expiring_soon,
                "needs_calibration": needs_cal,
                "cal_expiring_on_day": cal_expiring_on_day,
                "cal_expired_on_day": cal_expired_on_day
            }

            current_day += timedelta(days=1)

        months_data[month] = {
            "name": first_day.strftime("%B"),
            "short_name": first_day.strftime("%b"),
            "days": days_data,
            "first_weekday": first_day.weekday(),  # 0=Monday, 6=Sunday
            "num_days": last_day.day
        }

    # Also include a reservation summary for the year
    reservation_list = []
    for res in reservations:
        assigned_count = len([a for a in assignments if a.reservation_id == res.id])
        reservation_list.append({
            "id": res.id,
            "name": res.name,
            "start_date": res.start_date.isoformat(),
            # end_date may be None for TBD reservations, so guard the isoformat call
            "end_date": res.end_date.isoformat() if res.end_date else None,
            "quantity_needed": res.quantity_needed,
            "assigned_count": assigned_count,
            "color": res.color
        })

    return {
        "year": year,
        "device_type": device_type,
        "months": months_data,
        "reservations": reservation_list,
        "total_units": len(units)
    }
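`get_rolling_calendar_data` below derives each month of its 12-month window with div/mod arithmetic on a zero-based month index, and both calendar functions compute a month's last day as first-of-next-month minus one day. A standalone check of both calculations:

```python
from datetime import date, timedelta


def month_at(start_year: int, start_month: int, i: int):
    """i-th month (0-based) of the rolling window, same arithmetic as the service."""
    m_year = start_year + ((start_month - 1 + i) // 12)
    m_month = ((start_month - 1 + i) % 12) + 1
    return m_year, m_month


def last_day(m_year: int, m_month: int) -> date:
    """Last calendar day of a month, via first-of-next-month minus one day."""
    if m_month == 12:
        return date(m_year, 12, 31)
    return date(m_year, m_month + 1, 1) - timedelta(days=1)


# A window starting in November 2025 spans Nov 2025 through Oct 2026
window = [month_at(2025, 11, i) for i in range(12)]
```

Converting to a zero-based index before the div/mod and back afterwards is what keeps the year rollover correct for any starting month.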
def get_rolling_calendar_data(
|
||||||
|
db: Session,
|
||||||
|
start_year: int,
|
||||||
|
start_month: int,
|
||||||
|
device_type: str = "seismograph"
|
||||||
|
) -> Dict:
|
||||||
|
"""
|
||||||
|
Get calendar data for 12 months starting from a specific month/year.
|
||||||
|
|
||||||
|
This supports the rolling calendar view where users can scroll through
|
||||||
|
months one at a time, viewing any 12-month window.
|
||||||
|
"""
|
||||||
|
# Get user preferences
|
||||||
|
prefs = db.query(UserPreferences).filter_by(id=1).first()
|
||||||
|
warning_days = prefs.calibration_warning_days if prefs else 30
|
||||||
|
|
||||||
|
# Get all units
|
||||||
|
units = db.query(RosterUnit).filter(
|
||||||
|
RosterUnit.device_type == device_type,
|
||||||
|
RosterUnit.retired == False
|
||||||
|
).all()
|
||||||
|
|
||||||
|
# Calculate the date range for 12 months
|
||||||
|
first_date = date(start_year, start_month, 1)
|
||||||
|
# Calculate end date (12 months later)
|
||||||
|
end_year = start_year + 1 if start_month == 1 else start_year
|
||||||
|
end_month = 12 if start_month == 1 else start_month - 1
|
||||||
|
if start_month == 1:
|
||||||
|
end_year = start_year
|
||||||
|
end_month = 12
|
||||||
|
else:
|
||||||
|
# 12 months from start_month means we end at start_month - 1 next year
|
||||||
|
end_year = start_year + 1
|
||||||
|
end_month = start_month - 1
|
||||||
|
|
||||||
|
# Actually, simpler: go 11 months forward from start
|
||||||
|
end_year = start_year + ((start_month + 10) // 12)
|
||||||
|
end_month = ((start_month + 10) % 12) + 1
|
||||||
|
if end_month == 12:
|
||||||
|
last_date = date(end_year, 12, 31)
|
||||||
|
else:
|
||||||
|
last_date = date(end_year, end_month + 1, 1) - timedelta(days=1)
|
||||||
|
|
||||||
|
# Get all reservations that overlap with this 12-month range
|
||||||
|
reservations = db.query(JobReservation).filter(
|
||||||
|
JobReservation.device_type == device_type,
|
||||||
|
JobReservation.start_date <= last_date,
|
||||||
|
or_(
|
||||||
|
JobReservation.end_date >= first_date,
|
||||||
|
JobReservation.end_date == None # TBD reservations
|
||||||
|
)
|
||||||
|
).all()
|
||||||
|
|
||||||
|
# Get all unit assignments for these reservations
reservation_ids = [r.id for r in reservations]
assignments = db.query(JobReservationUnit).filter(
    JobReservationUnit.reservation_id.in_(reservation_ids)
).all() if reservation_ids else []

# Build a lookup: unit_id -> list of (start_date, end_date, reservation_name)
unit_reservations = {}
for res in reservations:
    res_assignments = [a for a in assignments if a.reservation_id == res.id]
    for assignment in res_assignments:
        unit_id = assignment.unit_id
        start_d = assignment.unit_start_date or res.start_date
        if assignment.unit_end_tbd or (assignment.unit_end_date is None and res.end_date_tbd):
            # Open-ended assignment: fall back to the estimate, else far future
            end_d = res.estimated_end_date or date(start_year + 5, 12, 31)
        else:
            end_d = assignment.unit_end_date or res.end_date or date(start_year + 5, 12, 31)

        unit_reservations.setdefault(unit_id, []).append((start_d, end_d, res.name))

# Build the set of unit IDs that have an active deployment record (still in the field)
unit_ids = [u.id for u in units]
active_deployments = db.query(DeploymentRecord.unit_id).filter(
    DeploymentRecord.unit_id.in_(unit_ids),
    DeploymentRecord.actual_removal_date == None
).all()
unit_in_field = {row.unit_id for row in active_deployments}

# Generate data for each of the 12 months
months_data = []

for i in range(12):
    # Calculate this month's year and month
    m_year = start_year + ((start_month - 1 + i) // 12)
    m_month = ((start_month - 1 + i) % 12) + 1

    first_day = date(m_year, m_month, 1)
    if m_month == 12:
        last_day = date(m_year, 12, 31)
    else:
        last_day = date(m_year, m_month + 1, 1) - timedelta(days=1)

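The December special case exists because `date(m_year, 13, 1)` would raise; the standard library's `calendar.monthrange` gives the same last-day result without branching, as a sketch:

```python
import calendar
from datetime import date

def last_day_of_month(year: int, month: int) -> date:
    # monthrange returns (weekday_of_first_day, number_of_days)
    return date(year, month, calendar.monthrange(year, month)[1])

print(last_day_of_month(2024, 2))   # 2024-02-29 (leap year)
print(last_day_of_month(2024, 12))  # 2024-12-31
```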
    days_data = {}
    current_day = first_day

    while current_day <= last_day:
        available = 0
        reserved = 0
        expired = 0
        expiring_soon = 0
        needs_cal = 0
        cal_expiring_on_day = 0
        cal_expired_on_day = 0

        for unit in units:
            cal_status = get_calibration_status(unit, current_day, warning_days)

            if unit.last_calibrated:
                unit_expiry = unit.last_calibrated + timedelta(days=365)
                if unit_expiry == current_day:
                    cal_expiring_on_day += 1
                elif unit_expiry == current_day - timedelta(days=1):
                    # Expired as of yesterday: first full day out of calibration
                    cal_expired_on_day += 1

            if cal_status == "expired":
                expired += 1
                continue
            if cal_status == "needs_calibration":
                needs_cal += 1
                continue

            is_reserved = False
            if unit.id in unit_reservations:
                for start_d, end_d, _ in unit_reservations[unit.id]:
                    if start_d <= current_day <= end_d:
                        is_reserved = True
                        break

            if is_reserved:
                reserved += 1
            else:
                available += 1

            if cal_status == "expiring_soon":
                expiring_soon += 1

        days_data[current_day.day] = {
            "available": available,
            "reserved": reserved,
            "expired": expired,
            "expiring_soon": expiring_soon,
            "needs_calibration": needs_cal,
            "cal_expiring_on_day": cal_expiring_on_day,
            "cal_expired_on_day": cal_expired_on_day
        }

        current_day += timedelta(days=1)

    months_data.append({
        "year": m_year,
        "month": m_month,
        "name": first_day.strftime("%B"),
        "short_name": first_day.strftime("%b"),
        "year_short": first_day.strftime("%y"),
        "days": days_data,
        "first_weekday": first_day.weekday(),
        "num_days": last_day.day
    })

return {
    "start_year": start_year,
    "start_month": start_month,
    "device_type": device_type,
    "months": months_data,
    "total_units": len(units)
}


def check_calibration_conflicts(
    db: Session,
    reservation_id: str
) -> List[Dict]:
    """
    Check whether any units assigned to a reservation will have their
    calibration expire during the reservation period.

    Returns a list of conflicts with unit info and expiry date.
    """
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        return []

    # Get assigned units
    assigned = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id
    ).all()

    conflicts = []
    for assignment in assigned:
        unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
        if not unit or not unit.last_calibrated:
            continue

        expiry_date = unit.last_calibrated + timedelta(days=365)

        # TBD reservations have no end_date; fall back to the estimate if any
        res_end = reservation.end_date or reservation.estimated_end_date

        # Check if the expiry falls within the reservation period
        if res_end and reservation.start_date < expiry_date <= res_end:
            conflicts.append({
                "unit_id": unit.id,
                "last_calibrated": unit.last_calibrated.isoformat(),
                "expiry_date": expiry_date.isoformat(),
                "reservation_name": reservation.name,
                "days_into_job": (expiry_date - reservation.start_date).days
            })

    return conflicts


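The core check, isolated as a pure function for testing (the 365-day calibration window matches the code above; the helper name is illustrative):

```python
from datetime import date, timedelta
from typing import Optional

CAL_VALID_DAYS = 365  # calibration assumed valid for one year, as above

def expires_during_job(last_calibrated: date, job_start: date,
                       job_end: Optional[date]) -> bool:
    """True when the calibration expiry lands strictly inside the job window."""
    expiry = last_calibrated + timedelta(days=CAL_VALID_DAYS)
    return job_end is not None and job_start < expiry <= job_end

# Calibrated 2024-01-10 -> expires 2025-01-09, inside a Dec..Feb job window.
print(expires_during_job(date(2024, 1, 10), date(2024, 12, 1), date(2025, 2, 1)))  # True
print(expires_during_job(date(2024, 1, 10), date(2025, 2, 1), date(2025, 3, 1)))   # False
```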
def get_available_units_for_period(
    db: Session,
    start_date: date,
    end_date: date,
    device_type: str = "seismograph",
    exclude_reservation_id: Optional[str] = None
) -> List[Dict]:
    """
    Get units that are available for the entire specified period.

    A unit is available if:
    - Not retired
    - Not assigned to any other reservation that overlaps the period
    - Not currently deployed in the field

    Each returned unit includes its calibration status so callers can
    filter on validity through the end date.
    """
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    # Get reservations that overlap with this period (TBD end dates are
    # open-ended, so they always overlap once started)
    overlapping_reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= end_date,
        or_(
            JobReservation.end_date >= start_date,
            JobReservation.end_date == None  # TBD reservations
        )
    )

    if exclude_reservation_id:
        overlapping_reservations = overlapping_reservations.filter(
            JobReservation.id != exclude_reservation_id
        )

    overlapping_reservations = overlapping_reservations.all()

    # Get all units assigned to overlapping reservations
    reserved_unit_ids = set()
    for res in overlapping_reservations:
        assigned = db.query(JobReservationUnit).filter_by(
            reservation_id=res.id
        ).all()
        for a in assigned:
            reserved_unit_ids.add(a.unit_id)

    # Get units with active deployment records (still in the field)
    unit_ids = [u.id for u in units]
    active_deps = db.query(DeploymentRecord.unit_id).filter(
        DeploymentRecord.unit_id.in_(unit_ids),
        DeploymentRecord.actual_removal_date == None
    ).all()
    in_field_unit_ids = {row.unit_id for row in active_deps}

    available_units = []
    for unit in units:
        # Skip units already reserved or currently in the field
        if unit.id in reserved_unit_ids:
            continue
        if unit.id in in_field_unit_ids:
            continue

        if unit.last_calibrated:
            expiry_date = unit.last_calibrated + timedelta(days=365)
            cal_status = get_calibration_status(unit, end_date, warning_days)
        else:
            expiry_date = None
            cal_status = "needs_calibration"

        available_units.append({
            "id": unit.id,
            "last_calibrated": unit.last_calibrated.isoformat() if unit.last_calibrated else None,
            "expiry_date": expiry_date.isoformat() if expiry_date else None,
            "calibration_status": cal_status,
            "deployed": unit.deployed,
            "out_for_calibration": unit.out_for_calibration or False,
            "note": unit.note or ""
        })

    return available_units
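The availability filter reduces to set exclusion over the reserved and in-field ID sets; a minimal sketch with plain data (names illustrative):

```python
def filter_available(unit_ids, reserved_ids, in_field_ids):
    """Units neither reserved for the period nor still deployed."""
    blocked = set(reserved_ids) | set(in_field_ids)
    return [u for u in unit_ids if u not in blocked]

print(filter_available(["A1", "A2", "A3", "A4"], {"A2"}, {"A4"}))  # ['A1', 'A3']
```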
backend/services/recurring_schedule_service.py  (new file, 611 lines)
@@ -0,0 +1,611 @@
"""
Recurring Schedule Service

Manages recurring schedule definitions and generates ScheduledAction
instances based on patterns (weekly calendar, simple interval).
"""

import json
import uuid
import logging
from datetime import datetime, timedelta, date, time
from typing import Optional, List, Dict, Any, Tuple
from zoneinfo import ZoneInfo

from sqlalchemy.orm import Session
from sqlalchemy import and_

from backend.models import RecurringSchedule, ScheduledAction, MonitoringLocation, UnitAssignment, Project

logger = logging.getLogger(__name__)

# Day name mapping, indexed by date.weekday() (0 = Monday)
DAY_NAMES = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"]

class RecurringScheduleService:
    """
    Service for managing recurring schedules and generating ScheduledActions.

    Supports three schedule types:
    - weekly_calendar: Specific days with start/end times
    - simple_interval: Daily stop/download/restart cycles for 24/7 monitoring
    - one_off: A single start/stop pair at fixed datetimes
    """

    def __init__(self, db: Session):
        self.db = db

    def create_schedule(
        self,
        project_id: str,
        location_id: str,
        name: str,
        schedule_type: str,
        device_type: str = "slm",
        unit_id: Optional[str] = None,
        weekly_pattern: Optional[dict] = None,
        interval_type: Optional[str] = None,
        cycle_time: Optional[str] = None,
        include_download: bool = True,
        auto_increment_index: bool = True,
        timezone: str = "America/New_York",
        start_datetime: Optional[datetime] = None,
        end_datetime: Optional[datetime] = None,
    ) -> RecurringSchedule:
        """
        Create a new recurring schedule.

        Args:
            project_id: Project ID
            location_id: Monitoring location ID
            name: Schedule name
            schedule_type: "weekly_calendar", "simple_interval", or "one_off"
            device_type: "slm" or "seismograph"
            unit_id: Specific unit (optional, can use assignment)
            weekly_pattern: Dict of day patterns for weekly_calendar
            interval_type: "daily" or "hourly" for simple_interval
            cycle_time: Time string "HH:MM" for the cycle
            include_download: Whether to download data on cycle
            auto_increment_index: Whether to auto-increment the store index before start
            timezone: Timezone for schedule times
            start_datetime: Start date+time in UTC (one_off only)
            end_datetime: End date+time in UTC (one_off only)

        Returns:
            Created RecurringSchedule
        """
        schedule = RecurringSchedule(
            id=str(uuid.uuid4()),
            project_id=project_id,
            location_id=location_id,
            unit_id=unit_id,
            name=name,
            schedule_type=schedule_type,
            device_type=device_type,
            weekly_pattern=json.dumps(weekly_pattern) if weekly_pattern else None,
            interval_type=interval_type,
            cycle_time=cycle_time,
            include_download=include_download,
            auto_increment_index=auto_increment_index,
            enabled=True,
            timezone=timezone,
            start_datetime=start_datetime,
            end_datetime=end_datetime,
        )

        # Calculate the next occurrence
        schedule.next_occurrence = self._calculate_next_occurrence(schedule)

        self.db.add(schedule)
        self.db.commit()
        self.db.refresh(schedule)

        logger.info(f"Created recurring schedule: {name} ({schedule_type})")
        return schedule

    def update_schedule(
        self,
        schedule_id: str,
        **kwargs,
    ) -> Optional[RecurringSchedule]:
        """
        Update a recurring schedule.

        Args:
            schedule_id: Schedule to update
            **kwargs: Fields to update

        Returns:
            Updated schedule, or None if not found
        """
        schedule = self.db.query(RecurringSchedule).filter_by(id=schedule_id).first()
        if not schedule:
            return None

        for key, value in kwargs.items():
            if hasattr(schedule, key):
                if key == "weekly_pattern" and isinstance(value, dict):
                    value = json.dumps(value)
                setattr(schedule, key, value)

        # Recalculate the next occurrence
        schedule.next_occurrence = self._calculate_next_occurrence(schedule)

        self.db.commit()
        self.db.refresh(schedule)

        logger.info(f"Updated recurring schedule: {schedule.name}")
        return schedule

    def delete_schedule(self, schedule_id: str) -> bool:
        """
        Delete a recurring schedule and its pending generated actions.

        Args:
            schedule_id: Schedule to delete

        Returns:
            True if deleted, False if not found
        """
        schedule = self.db.query(RecurringSchedule).filter_by(id=schedule_id).first()
        if not schedule:
            return False

        # Delete pending generated actions for this schedule.
        # The schedule_id is stored in the notes field as JSON.
        pending_actions = self.db.query(ScheduledAction).filter(
            and_(
                ScheduledAction.execution_status == "pending",
                ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%'),
            )
        ).all()

        deleted_count = len(pending_actions)
        for action in pending_actions:
            self.db.delete(action)

        schedule_name = schedule.name  # capture before the instance is deleted
        self.db.delete(schedule)
        self.db.commit()

        logger.info(f"Deleted recurring schedule: {schedule_name} (and {deleted_count} pending actions)")
        return True

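The LIKE match above depends on `json.dumps` emitting keys exactly as `"schedule_id": "<id>"` with its default separators; a minimal sketch of the tagging round-trip (helper name illustrative):

```python
import json

def tag_notes(schedule_id: str, schedule_name: str) -> str:
    """Embed automation metadata in the free-text notes field as JSON."""
    return json.dumps({"schedule_name": schedule_name, "schedule_id": schedule_id})

notes = tag_notes("abc-123", "Night shift")
# Default separators produce the exact substring the SQL LIKE pattern expects.
pattern_fragment = '"schedule_id": "abc-123"'
print(pattern_fragment in notes)  # True
```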
    def enable_schedule(self, schedule_id: str) -> Optional[RecurringSchedule]:
        """Enable a disabled schedule."""
        return self.update_schedule(schedule_id, enabled=True)

    def disable_schedule(self, schedule_id: str) -> Optional[RecurringSchedule]:
        """Disable a schedule and cancel its pending actions."""
        schedule = self.update_schedule(schedule_id, enabled=False)
        if schedule:
            # Cancel all pending actions generated by this schedule
            pending_actions = self.db.query(ScheduledAction).filter(
                and_(
                    ScheduledAction.execution_status == "pending",
                    ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%'),
                )
            ).all()

            for action in pending_actions:
                action.execution_status = "cancelled"

            if pending_actions:
                self.db.commit()
                logger.info(f"Cancelled {len(pending_actions)} pending actions for disabled schedule {schedule.name}")

        return schedule

    def generate_actions_for_schedule(
        self,
        schedule: RecurringSchedule,
        horizon_days: int = 7,
        preview_only: bool = False,
    ) -> List[ScheduledAction]:
        """
        Generate ScheduledAction entries for the next N days based on the pattern.

        Args:
            schedule: The recurring schedule
            horizon_days: Days ahead to generate
            preview_only: If True, don't save to the DB (for preview)

        Returns:
            List of generated ScheduledAction instances
        """
        if not schedule.enabled:
            return []

        if schedule.schedule_type == "weekly_calendar":
            actions = self._generate_weekly_calendar_actions(schedule, horizon_days)
        elif schedule.schedule_type == "simple_interval":
            actions = self._generate_interval_actions(schedule, horizon_days)
        elif schedule.schedule_type == "one_off":
            actions = self._generate_one_off_actions(schedule)
        else:
            logger.warning(f"Unknown schedule type: {schedule.schedule_type}")
            return []

        if not preview_only and actions:
            for action in actions:
                self.db.add(action)

            schedule.last_generated_at = datetime.utcnow()
            schedule.next_occurrence = self._calculate_next_occurrence(schedule)

            self.db.commit()
            logger.info(f"Generated {len(actions)} actions for schedule: {schedule.name}")

        return actions

    def _generate_weekly_calendar_actions(
        self,
        schedule: RecurringSchedule,
        horizon_days: int,
    ) -> List[ScheduledAction]:
        """
        Generate actions from a weekly calendar pattern.

        Pattern format:
        {
            "monday": {"enabled": true, "start": "19:00", "end": "07:00"},
            "tuesday": {"enabled": false},
            ...
        }
        """
        if not schedule.weekly_pattern:
            return []

        try:
            pattern = json.loads(schedule.weekly_pattern)
        except json.JSONDecodeError:
            logger.error(f"Invalid weekly_pattern JSON for schedule {schedule.id}")
            return []

        actions = []
        tz = ZoneInfo(schedule.timezone)
        now_utc = datetime.utcnow()
        now_local = now_utc.replace(tzinfo=ZoneInfo("UTC")).astimezone(tz)

        # Get unit_id (from the schedule or the active assignment)
        unit_id = self._resolve_unit_id(schedule)

        for day_offset in range(horizon_days):
            check_date = now_local.date() + timedelta(days=day_offset)
            day_name = DAY_NAMES[check_date.weekday()]
            day_config = pattern.get(day_name, {})

            if not day_config.get("enabled", False):
                continue

            start_time_str = day_config.get("start")
            end_time_str = day_config.get("end")

            if not start_time_str or not end_time_str:
                continue

            # Parse times
            start_time = self._parse_time(start_time_str)
            end_time = self._parse_time(end_time_str)

            if not start_time or not end_time:
                continue

            # Create the start datetime in the local timezone
            start_local = datetime.combine(check_date, start_time, tzinfo=tz)
            start_utc = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

            # Handle overnight schedules (end time is on the next day)
            if end_time <= start_time:
                end_date = check_date + timedelta(days=1)
            else:
                end_date = check_date

            end_local = datetime.combine(end_date, end_time, tzinfo=tz)
            end_utc = end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

            # Skip if the start time has already passed
            if start_utc <= now_utc:
                continue

            # Check if an action already exists
            if self._action_exists(schedule.project_id, schedule.location_id, "start", start_utc):
                continue

            # Build notes with automation metadata
            start_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "auto_increment_index": schedule.auto_increment_index,
            })

            # Create the START action
            start_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
                location_id=schedule.location_id,
                unit_id=unit_id,
                action_type="start",
                device_type=schedule.device_type,
                scheduled_time=start_utc,
                execution_status="pending",
                notes=start_notes,
            )
            actions.append(start_action)

            # Create the STOP action (stop_cycle handles the download when include_download is True)
            stop_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "schedule_type": "weekly_calendar",
                "include_download": schedule.include_download,
            })
            stop_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
                location_id=schedule.location_id,
                unit_id=unit_id,
                action_type="stop",
                device_type=schedule.device_type,
                scheduled_time=end_utc,
                execution_status="pending",
                notes=stop_notes,
            )
            actions.append(stop_action)

        return actions

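Overnight windows (e.g. 19:00 to 07:00) are detected by `end_time <= start_time`, which pushes the stop to the next day. A minimal sketch of that rule (helper name illustrative):

```python
from datetime import date, time, datetime, timedelta

def action_window(day: date, start: time, end: time) -> tuple[datetime, datetime]:
    """Start/end datetimes for one day's window; end <= start rolls to the next day."""
    start_dt = datetime.combine(day, start)
    end_day = day + timedelta(days=1) if end <= start else day
    return start_dt, datetime.combine(end_day, end)

s, e = action_window(date(2024, 6, 3), time(19, 0), time(7, 0))
print(e - s)  # 12:00:00 -- a 12-hour overnight window
```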
    def _generate_interval_actions(
        self,
        schedule: RecurringSchedule,
        horizon_days: int,
    ) -> List[ScheduledAction]:
        """
        Generate actions from a simple interval pattern.

        For daily cycles: stop, download (optional), and restart at cycle_time each day.
        """
        if not schedule.cycle_time:
            return []

        cycle_time = self._parse_time(schedule.cycle_time)
        if not cycle_time:
            return []

        actions = []
        tz = ZoneInfo(schedule.timezone)
        now_utc = datetime.utcnow()
        now_local = now_utc.replace(tzinfo=ZoneInfo("UTC")).astimezone(tz)

        # Get unit_id
        unit_id = self._resolve_unit_id(schedule)

        for day_offset in range(horizon_days):
            check_date = now_local.date() + timedelta(days=day_offset)

            # Create the cycle datetime in the local timezone
            cycle_local = datetime.combine(check_date, cycle_time, tzinfo=tz)
            cycle_utc = cycle_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

            # Skip if the time has passed
            if cycle_utc <= now_utc:
                continue

            # Check if a cycle action already exists
            if self._action_exists(schedule.project_id, schedule.location_id, "cycle", cycle_utc):
                continue

            # Build notes with metadata for the cycle action
            cycle_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "cycle_type": "daily",
                "include_download": schedule.include_download,
                "auto_increment_index": schedule.auto_increment_index,
            })

            # Create a single CYCLE action that handles stop -> download -> start.
            # The scheduler's _execute_cycle method runs the full workflow with delays.
            cycle_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
                location_id=schedule.location_id,
                unit_id=unit_id,
                action_type="cycle",
                device_type=schedule.device_type,
                scheduled_time=cycle_utc,
                execution_status="pending",
                notes=cycle_notes,
            )
            actions.append(cycle_action)

        return actions

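Scheduled times are stored as naive UTC; the conversion from a wall-clock time in the schedule's timezone follows the same zoneinfo pattern used above (helper name illustrative):

```python
from datetime import datetime, date, time
from zoneinfo import ZoneInfo

def local_to_naive_utc(day: date, wall_time: time, tz_name: str) -> datetime:
    """Combine a local date and time, then strip to naive UTC for storage."""
    local = datetime.combine(day, wall_time, tzinfo=ZoneInfo(tz_name))
    return local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

# 19:00 in New York on a January date (EST, UTC-5) is midnight UTC the next day.
print(local_to_naive_utc(date(2024, 1, 15), time(19, 0), "America/New_York"))
# 2024-01-16 00:00:00
```

Because zoneinfo applies the DST rules for the named zone, the same 19:00 wall time in July maps to 23:00 UTC instead.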
    def _generate_one_off_actions(
        self,
        schedule: RecurringSchedule,
    ) -> List[ScheduledAction]:
        """
        Generate start and stop actions for a one-off recording.

        Unlike the recurring types, this generates exactly one start and one stop
        action, using the schedule's start_datetime and end_datetime directly.
        """
        if not schedule.start_datetime or not schedule.end_datetime:
            logger.warning(f"One-off schedule {schedule.id} missing start/end datetime")
            return []

        actions = []
        now_utc = datetime.utcnow()
        unit_id = self._resolve_unit_id(schedule)

        # Skip if the end time has already passed
        if schedule.end_datetime <= now_utc:
            return []

        # Check if actions already exist for this schedule
        if self._action_exists(schedule.project_id, schedule.location_id, "start", schedule.start_datetime):
            return []

        # Create the START action (only if the start time hasn't passed)
        if schedule.start_datetime > now_utc:
            start_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "schedule_type": "one_off",
                "auto_increment_index": schedule.auto_increment_index,
            })

            start_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
                location_id=schedule.location_id,
                unit_id=unit_id,
                action_type="start",
                device_type=schedule.device_type,
                scheduled_time=schedule.start_datetime,
                execution_status="pending",
                notes=start_notes,
            )
            actions.append(start_action)

        # Create the STOP action
        stop_notes = json.dumps({
            "schedule_name": schedule.name,
            "schedule_id": schedule.id,
            "schedule_type": "one_off",
            "include_download": schedule.include_download,
        })

        stop_action = ScheduledAction(
            id=str(uuid.uuid4()),
            project_id=schedule.project_id,
            location_id=schedule.location_id,
            unit_id=unit_id,
            action_type="stop",
            device_type=schedule.device_type,
            scheduled_time=schedule.end_datetime,
            execution_status="pending",
            notes=stop_notes,
        )
        actions.append(stop_action)

        return actions

    def _calculate_next_occurrence(self, schedule: RecurringSchedule) -> Optional[datetime]:
        """Calculate when the next action should occur."""
        if not schedule.enabled:
            return None

        tz = ZoneInfo(schedule.timezone)
        now_utc = datetime.utcnow()
        now_local = now_utc.replace(tzinfo=ZoneInfo("UTC")).astimezone(tz)

        if schedule.schedule_type == "weekly_calendar" and schedule.weekly_pattern:
            try:
                pattern = json.loads(schedule.weekly_pattern)
            except json.JSONDecodeError:
                return None

            # Find the next enabled day
            for day_offset in range(8):  # Check up to a week ahead
                check_date = now_local.date() + timedelta(days=day_offset)
                day_name = DAY_NAMES[check_date.weekday()]
                day_config = pattern.get(day_name, {})

                if day_config.get("enabled") and day_config.get("start"):
                    start_time = self._parse_time(day_config["start"])
                    if start_time:
                        start_local = datetime.combine(check_date, start_time, tzinfo=tz)
                        start_utc = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
                        if start_utc > now_utc:
                            return start_utc

        elif schedule.schedule_type == "simple_interval" and schedule.cycle_time:
            cycle_time = self._parse_time(schedule.cycle_time)
            if cycle_time:
                # Find the next cycle time (today or tomorrow)
                for day_offset in range(2):
                    check_date = now_local.date() + timedelta(days=day_offset)
                    cycle_local = datetime.combine(check_date, cycle_time, tzinfo=tz)
                    cycle_utc = cycle_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
                    if cycle_utc > now_utc:
                        return cycle_utc

        elif schedule.schedule_type == "one_off":
            if schedule.start_datetime and schedule.start_datetime > now_utc:
                return schedule.start_datetime
            elif schedule.end_datetime and schedule.end_datetime > now_utc:
                return schedule.end_datetime
            return None

        return None

    def _resolve_unit_id(self, schedule: RecurringSchedule) -> Optional[str]:
        """Get the unit_id from the schedule or the active assignment."""
        if schedule.unit_id:
            return schedule.unit_id

        # Fall back to the active assignment at this location
        assignment = self.db.query(UnitAssignment).filter(
            and_(
                UnitAssignment.location_id == schedule.location_id,
                UnitAssignment.status == "active",
            )
        ).first()

        return assignment.unit_id if assignment else None

    def _action_exists(
        self,
        project_id: str,
        location_id: str,
        action_type: str,
        scheduled_time: datetime,
    ) -> bool:
        """Check whether a pending action already exists for this time slot."""
        # Allow a 5-minute window for duplicate detection
        time_window_start = scheduled_time - timedelta(minutes=5)
        time_window_end = scheduled_time + timedelta(minutes=5)

        exists = self.db.query(ScheduledAction).filter(
            and_(
                ScheduledAction.project_id == project_id,
                ScheduledAction.location_id == location_id,
                ScheduledAction.action_type == action_type,
                ScheduledAction.scheduled_time >= time_window_start,
                ScheduledAction.scheduled_time <= time_window_end,
                ScheduledAction.execution_status == "pending",
            )
        ).first()

        return exists is not None

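The ±5-minute window above guards against regenerating near-duplicate actions when the horizon job reruns; the predicate in isolation (helper name illustrative):

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)

def is_duplicate(candidate: datetime, existing: list[datetime]) -> bool:
    """True when a pending action already sits within +/-5 minutes."""
    return any(abs(candidate - t) <= WINDOW for t in existing)

pending = [datetime(2024, 6, 1, 19, 0)]
print(is_duplicate(datetime(2024, 6, 1, 19, 3), pending))   # True
print(is_duplicate(datetime(2024, 6, 1, 19, 10), pending))  # False
```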
    @staticmethod
    def _parse_time(time_str: str) -> Optional[time]:
        """Parse an "HH:MM" time string to a time object; None if malformed."""
        try:
            parts = time_str.split(":")
            return time(int(parts[0]), int(parts[1]))
        except (ValueError, IndexError):
            return None

def get_schedules_for_project(self, project_id: str) -> List[RecurringSchedule]:
|
||||||
|
"""Get all recurring schedules for a project."""
|
||||||
|
return self.db.query(RecurringSchedule).filter_by(project_id=project_id).all()
|
||||||
|
|
||||||
|
def get_enabled_schedules(self) -> List[RecurringSchedule]:
|
||||||
|
"""Get all enabled recurring schedules for projects that are not on hold or deleted."""
|
||||||
|
active_project_ids = [
|
||||||
|
p.id for p in self.db.query(Project.id).filter(
|
||||||
|
Project.status.notin_(["on_hold", "archived", "deleted"])
|
||||||
|
).all()
|
||||||
|
]
|
||||||
|
return self.db.query(RecurringSchedule).filter(
|
||||||
|
RecurringSchedule.enabled == True,
|
||||||
|
RecurringSchedule.project_id.in_(active_project_ids),
|
||||||
|
).all()
|
||||||
|
|
||||||
|
|
||||||
|
def get_recurring_schedule_service(db: Session) -> RecurringScheduleService:
|
||||||
|
"""Get a RecurringScheduleService instance."""
|
||||||
|
return RecurringScheduleService(db)
|
||||||
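The duplicate-detection window above (the `>=`/`<=` filters on `scheduled_time`) can be sketched as a pure function, with the SQLAlchemy query replaced by an in-memory list of pending-action times; `is_duplicate` is an illustrative name, not part of the codebase:

```python
from datetime import datetime, timedelta

def is_duplicate(scheduled_time: datetime, existing_times: list,
                 window_minutes: int = 5) -> bool:
    """Return True if any existing pending action falls within the window."""
    start = scheduled_time - timedelta(minutes=window_minutes)
    end = scheduled_time + timedelta(minutes=window_minutes)
    return any(start <= t <= end for t in existing_times)
```

Both bounds are inclusive, matching the `>=`/`<=` comparisons in the query, so an action exactly 5 minutes away still counts as a duplicate.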
@@ -4,22 +4,30 @@ Scheduler Service
 Executes scheduled actions for Projects system.
 Monitors pending scheduled actions and executes them by calling device modules (SLMM/SFM).
 
+Extended to support recurring schedules:
+- Generates ScheduledActions from RecurringSchedule patterns
+- Cleans up old completed/failed actions
+
 This service runs as a background task in FastAPI, checking for pending actions
 every minute and executing them when their scheduled time arrives.
 """
 
 import asyncio
 import json
+import logging
 from datetime import datetime, timedelta
 from typing import Optional, List, Dict, Any
 from sqlalchemy.orm import Session
 from sqlalchemy import and_
 
 from backend.database import SessionLocal
-from backend.models import ScheduledAction, RecordingSession, MonitoringLocation, Project
+from backend.models import ScheduledAction, MonitoringSession, MonitoringLocation, Project, RecurringSchedule
 from backend.services.device_controller import get_device_controller, DeviceControllerError
+from backend.services.alert_service import get_alert_service
 import uuid
 
+logger = logging.getLogger(__name__)
+
 
 class SchedulerService:
     """
@@ -62,11 +70,26 @@ class SchedulerService:
 
     async def _run_loop(self):
         """Main scheduler loop."""
+        # Track when we last generated recurring actions (do this once per hour)
+        last_generation_check = datetime.utcnow() - timedelta(hours=1)
+
         while self.running:
             try:
+                # Execute pending actions
                 await self.execute_pending_actions()
+
+                # Generate actions from recurring schedules (every hour)
+                now = datetime.utcnow()
+                if (now - last_generation_check).total_seconds() >= 3600:
+                    await self.generate_recurring_actions()
+                    last_generation_check = now
+
+                # Cleanup old actions (also every hour, during generation cycle)
+                if (now - last_generation_check).total_seconds() < 60:
+                    await self.cleanup_old_actions()
+
             except Exception as e:
-                print(f"Scheduler error: {e}")
+                logger.error(f"Scheduler error: {e}", exc_info=True)
                 # Continue running even if there's an error
 
             await asyncio.sleep(self.check_interval)
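The once-per-hour gating added to `_run_loop` reduces to an elapsed-seconds check. A minimal sketch (the function name is an assumption, not from the source):

```python
from datetime import datetime, timedelta

def hourly_gate_due(now: datetime, last_check: datetime,
                    interval_seconds: int = 3600) -> bool:
    """True when at least interval_seconds have elapsed since the last pass."""
    return (now - last_check).total_seconds() >= interval_seconds
```

Seeding `last_generation_check` one hour in the past, as the diff does, makes the gate fire on the very first loop iteration rather than an hour after startup.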
@@ -84,10 +107,19 @@ class SchedulerService:
         try:
             # Find pending actions that are due
             now = datetime.utcnow()
+
+            # Only execute actions for active/completed projects (not on_hold, archived, or deleted)
+            active_project_ids = [
+                p.id for p in db.query(Project.id).filter(
+                    Project.status.notin_(["on_hold", "archived", "deleted"])
+                ).all()
+            ]
+
             pending_actions = db.query(ScheduledAction).filter(
                 and_(
                     ScheduledAction.execution_status == "pending",
                     ScheduledAction.scheduled_time <= now,
+                    ScheduledAction.project_id.in_(active_project_ids),
                 )
             ).order_by(ScheduledAction.scheduled_time).all()
 
@@ -162,6 +194,8 @@ class SchedulerService:
                 response = await self._execute_stop(action, unit_id, db)
             elif action.action_type == "download":
                 response = await self._execute_download(action, unit_id, db)
+            elif action.action_type == "cycle":
+                response = await self._execute_cycle(action, unit_id, db)
             else:
                 raise Exception(f"Unknown action type: {action.action_type}")
 
@@ -175,6 +209,21 @@ class SchedulerService:
 
             print(f"✓ Action {action.id} completed successfully")
 
+            # Create success alert
+            try:
+                alert_service = get_alert_service(db)
+                alert_metadata = response.get("cycle_response", {}) if isinstance(response, dict) else {}
+                alert_service.create_schedule_completed_alert(
+                    schedule_id=action.id,
+                    action_type=action.action_type,
+                    unit_id=unit_id,
+                    project_id=action.project_id,
+                    location_id=action.location_id,
+                    metadata=alert_metadata,
+                )
+            except Exception as alert_err:
+                logger.warning(f"Failed to create success alert: {alert_err}")
+
         except Exception as e:
             # Mark action as failed
             action.execution_status = "failed"
@@ -185,6 +234,20 @@ class SchedulerService:
 
             print(f"✗ Action {action.id} failed: {e}")
 
+            # Create failure alert
+            try:
+                alert_service = get_alert_service(db)
+                alert_service.create_schedule_failed_alert(
+                    schedule_id=action.id,
+                    action_type=action.action_type,
+                    unit_id=unit_id if 'unit_id' in dir() else action.unit_id,
+                    error_message=str(e),
+                    project_id=action.project_id,
+                    location_id=action.location_id,
+                )
+            except Exception as alert_err:
+                logger.warning(f"Failed to create failure alert: {alert_err}")
+
         return result
 
     async def _execute_start(
@@ -193,16 +256,23 @@ class SchedulerService:
         unit_id: str,
         db: Session,
     ) -> Dict[str, Any]:
-        """Execute a 'start' action."""
-        # Start recording via device controller
-        response = await self.device_controller.start_recording(
+        """Execute a 'start' action using the start_cycle command.
+
+        start_cycle handles:
+        1. Sync device clock to server time
+        2. Find next safe index (with overwrite protection)
+        3. Start measurement
+        """
+        # Execute the full start cycle via device controller
+        # SLMM handles clock sync, index increment, and start
+        cycle_response = await self.device_controller.start_cycle(
             unit_id,
             action.device_type,
-            config={},  # TODO: Load config from action.notes or metadata
+            sync_clock=True,
         )
 
         # Create recording session
-        session = RecordingSession(
+        session = MonitoringSession(
             id=str(uuid.uuid4()),
             project_id=action.project_id,
             location_id=action.location_id,
@@ -210,14 +280,17 @@ class SchedulerService:
             session_type="sound" if action.device_type == "slm" else "vibration",
             started_at=datetime.utcnow(),
             status="recording",
-            session_metadata=json.dumps({"scheduled_action_id": action.id}),
+            session_metadata=json.dumps({
+                "scheduled_action_id": action.id,
+                "cycle_response": cycle_response,
+            }),
         )
         db.add(session)
 
         return {
             "status": "started",
             "session_id": session.id,
-            "device_response": response,
+            "cycle_response": cycle_response,
         }
 
     async def _execute_stop(
@@ -226,19 +299,48 @@ class SchedulerService:
         unit_id: str,
         db: Session,
     ) -> Dict[str, Any]:
-        """Execute a 'stop' action."""
-        # Stop recording via device controller
-        response = await self.device_controller.stop_recording(
+        """Execute a 'stop' action using the stop_cycle command.
+
+        stop_cycle handles:
+        1. Stop measurement
+        2. Enable FTP
+        3. Download measurement folder to SLMM local storage
+
+        After stop_cycle, if download succeeded, this method fetches the ZIP
+        from SLMM and extracts it into Terra-View's project directory, creating
+        DataFile records for each file.
+        """
+        import hashlib
+        import io
+        import os
+        import zipfile
+        import httpx
+        from pathlib import Path
+        from backend.models import DataFile
+
+        # Parse notes for download preference
+        include_download = True
+        try:
+            if action.notes:
+                notes_data = json.loads(action.notes)
+                include_download = notes_data.get("include_download", True)
+        except json.JSONDecodeError:
+            pass  # Notes is plain text, not JSON
+
+        # Execute the full stop cycle via device controller
+        # SLMM handles stop, FTP enable, and download to SLMM-local storage
+        cycle_response = await self.device_controller.stop_cycle(
             unit_id,
             action.device_type,
+            download=include_download,
         )
 
         # Find and update the active recording session
-        active_session = db.query(RecordingSession).filter(
+        active_session = db.query(MonitoringSession).filter(
             and_(
-                RecordingSession.location_id == action.location_id,
-                RecordingSession.unit_id == unit_id,
-                RecordingSession.status == "recording",
+                MonitoringSession.location_id == action.location_id,
+                MonitoringSession.unit_id == unit_id,
+                MonitoringSession.status == "recording",
             )
         ).first()
 
@@ -248,11 +350,91 @@ class SchedulerService:
             active_session.duration_seconds = int(
                 (active_session.stopped_at - active_session.started_at).total_seconds()
             )
+            # Store download info in session metadata
+            if cycle_response.get("download_success"):
+                try:
+                    metadata = json.loads(active_session.session_metadata or "{}")
+                    metadata["downloaded_folder"] = cycle_response.get("downloaded_folder")
+                    metadata["local_path"] = cycle_response.get("local_path")
+                    active_session.session_metadata = json.dumps(metadata)
+                except json.JSONDecodeError:
+                    pass
+
+        db.commit()
+
+        # If SLMM downloaded the folder successfully, fetch the ZIP from SLMM
+        # and extract it into Terra-View's project directory, creating DataFile records
+        files_created = 0
+        if include_download and cycle_response.get("download_success") and active_session:
+            folder_name = cycle_response.get("downloaded_folder")  # e.g. "Auto_0058"
+            remote_path = f"/NL-43/{folder_name}"
+
+            try:
+                SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
+                async with httpx.AsyncClient(timeout=600.0) as client:
+                    zip_response = await client.post(
+                        f"{SLMM_BASE_URL}/api/nl43/{unit_id}/ftp/download-folder",
+                        json={"remote_path": remote_path}
+                    )
+
+                if zip_response.is_success and len(zip_response.content) > 22:
+                    base_dir = Path(f"data/Projects/{action.project_id}/{active_session.id}/{folder_name}")
+                    base_dir.mkdir(parents=True, exist_ok=True)
+
+                    file_type_map = {
+                        '.wav': 'audio', '.mp3': 'audio',
+                        '.csv': 'data', '.txt': 'data', '.json': 'data', '.dat': 'data',
+                        '.rnd': 'data', '.rnh': 'data',
+                        '.log': 'log',
+                        '.zip': 'archive',
+                        '.jpg': 'image', '.jpeg': 'image', '.png': 'image',
+                        '.pdf': 'document',
+                    }
+
+                    with zipfile.ZipFile(io.BytesIO(zip_response.content)) as zf:
+                        for zip_info in zf.filelist:
+                            if zip_info.is_dir():
+                                continue
+                            file_data = zf.read(zip_info.filename)
+                            file_path = base_dir / zip_info.filename
+                            file_path.parent.mkdir(parents=True, exist_ok=True)
+                            with open(file_path, 'wb') as f:
+                                f.write(file_data)
+                            checksum = hashlib.sha256(file_data).hexdigest()
+                            ext = os.path.splitext(zip_info.filename)[1].lower()
+                            data_file = DataFile(
+                                id=str(uuid.uuid4()),
+                                session_id=active_session.id,
+                                file_path=str(file_path.relative_to("data")),
+                                file_type=file_type_map.get(ext, 'data'),
+                                file_size_bytes=len(file_data),
+                                downloaded_at=datetime.utcnow(),
+                                checksum=checksum,
+                                file_metadata=json.dumps({
+                                    "source": "stop_cycle",
+                                    "remote_path": remote_path,
+                                    "unit_id": unit_id,
+                                    "folder_name": folder_name,
+                                    "relative_path": zip_info.filename,
+                                }),
+                            )
+                            db.add(data_file)
+                            files_created += 1
+
+                    db.commit()
+                    logger.info(f"Created {files_created} DataFile records for session {active_session.id} from {folder_name}")
+                else:
+                    logger.warning(f"ZIP from SLMM for {folder_name} was empty or failed, skipping DataFile creation")
+
+            except Exception as e:
+                logger.error(f"Failed to extract ZIP and create DataFile records for {folder_name}: {e}")
+                # Don't fail the stop action — the device was stopped successfully
+
         return {
             "status": "stopped",
             "session_id": active_session.id if active_session else None,
-            "device_response": response,
+            "cycle_response": cycle_response,
+            "files_created": files_created,
         }
 
     async def _execute_download(
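The ZIP handling in `_execute_stop` can be exercised in isolation. A minimal sketch of the per-file read-and-checksum loop; the `<= 22` guard mirrors the `len(zip_response.content) > 22` check above (22 bytes is the size of an empty ZIP's end-of-central-directory record), and the function name is illustrative, not from the codebase:

```python
import hashlib
import io
import zipfile

def checksum_zip_members(zip_bytes: bytes) -> dict:
    """Return {member_name: sha256_hex} for each file in the archive, skipping directories."""
    if len(zip_bytes) <= 22:  # an empty ZIP is exactly 22 bytes
        return {}
    checksums = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for info in zf.filelist:
            if info.is_dir():
                continue
            checksums[info.filename] = hashlib.sha256(zf.read(info.filename)).hexdigest()
    return checksums
```

Member names inside the archive can include subdirectories (e.g. `Auto_0058/a.csv`), which is why the real code calls `file_path.parent.mkdir(parents=True, exist_ok=True)` before writing each file.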
@@ -261,7 +443,14 @@ class SchedulerService:
         unit_id: str,
         db: Session,
     ) -> Dict[str, Any]:
-        """Execute a 'download' action."""
+        """Execute a 'download' action.
+
+        This handles standalone download actions (not part of stop_cycle).
+        The workflow is:
+        1. Enable FTP on device
+        2. Download current measurement folder
+        3. (Optionally disable FTP - left enabled for now)
+        """
         # Get project and location info for file path
         location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
         project = db.query(Project).filter_by(id=action.project_id).first()
@@ -269,8 +458,8 @@ class SchedulerService:
         if not location or not project:
             raise Exception("Project or location not found")
 
-        # Build destination path
-        # Example: data/Projects/{project-id}/sound/{location-name}/session-{timestamp}/
+        # Build destination path (for logging/metadata reference)
+        # Actual download location is managed by SLMM (data/downloads/{unit_id}/)
         session_timestamp = datetime.utcnow().strftime("%Y-%m-%d-%H%M")
         location_type_dir = "sound" if action.device_type == "slm" else "vibration"
 
@@ -279,12 +468,23 @@ class SchedulerService:
             f"{location.name}/session-{session_timestamp}/"
         )
 
-        # Download files via device controller
+        # Step 1: Disable FTP first to reset any stale connection state
+        # Then enable FTP on device
+        logger.info(f"Resetting FTP on {unit_id} for download (disable then enable)")
+        try:
+            await self.device_controller.disable_ftp(unit_id, action.device_type)
+        except Exception as e:
+            logger.warning(f"FTP disable failed (may already be off): {e}")
+        await self.device_controller.enable_ftp(unit_id, action.device_type)
+
+        # Step 2: Download current measurement folder
+        # The slmm_client.download_files() now automatically determines the correct
+        # folder based on the device's current index number
         response = await self.device_controller.download_files(
             unit_id,
             action.device_type,
             destination_path,
-            files=None,  # Download all files
+            files=None,  # Download all files in current measurement folder
         )
 
         # TODO: Create DataFile records for downloaded files
@@ -295,6 +495,293 @@ class SchedulerService:
             "device_response": response,
         }
 
+    async def _execute_cycle(
+        self,
+        action: ScheduledAction,
+        unit_id: str,
+        db: Session,
+    ) -> Dict[str, Any]:
+        """Execute a full 'cycle' action: stop -> download -> start.
+
+        This combines stop, download, and start into a single action with
+        appropriate delays between steps to ensure device stability.
+
+        Workflow:
+        0. Pause background polling to prevent command conflicts
+        1. Stop measurement (wait 10s)
+        2. Disable FTP to reset state (wait 10s)
+        3. Enable FTP (wait 10s)
+        4. Download current measurement folder
+        5. Wait 30s for device to settle
+        6. Start new measurement cycle
+        7. Re-enable background polling
+
+        Total time: ~70-90 seconds depending on download size
+        """
+        logger.info(f"[CYCLE] === Starting full cycle for {unit_id} ===")
+
+        result = {
+            "status": "cycle_complete",
+            "steps": {},
+            "old_session_id": None,
+            "new_session_id": None,
+            "polling_paused": False,
+        }
+
+        # Step 0: Pause background polling for this device to prevent command conflicts
+        # NL-43 devices only support one TCP connection at a time
+        logger.info(f"[CYCLE] Step 0: Pausing background polling for {unit_id}")
+        polling_was_enabled = False
+        try:
+            if action.device_type == "slm":
+                # Get current polling state to restore later
+                from backend.services.slmm_client import get_slmm_client
+                slmm = get_slmm_client()
+                try:
+                    polling_config = await slmm.get_device_polling_config(unit_id)
+                    polling_was_enabled = polling_config.get("poll_enabled", False)
+                except Exception:
+                    polling_was_enabled = True  # Assume enabled if can't check
+
+                # Disable polling during cycle
+                await slmm.update_device_polling_config(unit_id, poll_enabled=False)
+                result["polling_paused"] = True
+                logger.info(f"[CYCLE] Background polling paused for {unit_id}")
+        except Exception as e:
+            logger.warning(f"[CYCLE] Failed to pause polling (continuing anyway): {e}")
+
+        try:
+            # Step 1: Stop measurement
+            logger.info(f"[CYCLE] Step 1/7: Stopping measurement on {unit_id}")
+            try:
+                stop_response = await self.device_controller.stop_recording(unit_id, action.device_type)
+                result["steps"]["stop"] = {"success": True, "response": stop_response}
+                logger.info(f"[CYCLE] Measurement stopped, waiting 10s...")
+            except Exception as e:
+                logger.warning(f"[CYCLE] Stop failed (may already be stopped): {e}")
+                result["steps"]["stop"] = {"success": False, "error": str(e)}
+
+            await asyncio.sleep(10)
+
+            # Step 2: Disable FTP to reset any stale state
+            logger.info(f"[CYCLE] Step 2/7: Disabling FTP on {unit_id}")
+            try:
+                await self.device_controller.disable_ftp(unit_id, action.device_type)
+                result["steps"]["ftp_disable"] = {"success": True}
+                logger.info(f"[CYCLE] FTP disabled, waiting 10s...")
+            except Exception as e:
+                logger.warning(f"[CYCLE] FTP disable failed (may already be off): {e}")
+                result["steps"]["ftp_disable"] = {"success": False, "error": str(e)}
+
+            await asyncio.sleep(10)
+
+            # Step 3: Enable FTP
+            logger.info(f"[CYCLE] Step 3/7: Enabling FTP on {unit_id}")
+            try:
+                await self.device_controller.enable_ftp(unit_id, action.device_type)
+                result["steps"]["ftp_enable"] = {"success": True}
+                logger.info(f"[CYCLE] FTP enabled, waiting 10s...")
+            except Exception as e:
+                logger.error(f"[CYCLE] FTP enable failed: {e}")
+                result["steps"]["ftp_enable"] = {"success": False, "error": str(e)}
+                # Continue anyway - download will fail but we can still try to start
+
+            await asyncio.sleep(10)
+
+            # Step 4: Download current measurement folder
+            logger.info(f"[CYCLE] Step 4/7: Downloading measurement data from {unit_id}")
+            location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
+            project = db.query(Project).filter_by(id=action.project_id).first()
+
+            if location and project:
+                session_timestamp = datetime.utcnow().strftime("%Y-%m-%d-%H%M")
+                location_type_dir = "sound" if action.device_type == "slm" else "vibration"
+                destination_path = (
+                    f"data/Projects/{project.id}/{location_type_dir}/"
+                    f"{location.name}/session-{session_timestamp}/"
+                )
+
+                try:
+                    download_response = await self.device_controller.download_files(
+                        unit_id,
+                        action.device_type,
+                        destination_path,
+                        files=None,
+                    )
+                    result["steps"]["download"] = {"success": True, "response": download_response}
+                    logger.info(f"[CYCLE] Download complete")
+                except Exception as e:
+                    logger.error(f"[CYCLE] Download failed: {e}")
+                    result["steps"]["download"] = {"success": False, "error": str(e)}
+            else:
+                result["steps"]["download"] = {"success": False, "error": "Project or location not found"}
+
+            # Close out the old recording session
+            active_session = db.query(MonitoringSession).filter(
+                and_(
+                    MonitoringSession.location_id == action.location_id,
+                    MonitoringSession.unit_id == unit_id,
+                    MonitoringSession.status == "recording",
+                )
+            ).first()
+
+            if active_session:
+                active_session.stopped_at = datetime.utcnow()
+                active_session.status = "completed"
+                active_session.duration_seconds = int(
+                    (active_session.stopped_at - active_session.started_at).total_seconds()
+                )
+                result["old_session_id"] = active_session.id
+
+            # Step 5: Wait for device to settle before starting new measurement
+            logger.info(f"[CYCLE] Step 5/7: Waiting 30s for device to settle...")
+            await asyncio.sleep(30)
+
+            # Step 6: Start new measurement cycle
+            logger.info(f"[CYCLE] Step 6/7: Starting new measurement on {unit_id}")
+            try:
+                cycle_response = await self.device_controller.start_cycle(
+                    unit_id,
+                    action.device_type,
+                    sync_clock=True,
+                )
+                result["steps"]["start"] = {"success": True, "response": cycle_response}
+
+                # Create new recording session
+                new_session = MonitoringSession(
+                    id=str(uuid.uuid4()),
+                    project_id=action.project_id,
+                    location_id=action.location_id,
+                    unit_id=unit_id,
+                    session_type="sound" if action.device_type == "slm" else "vibration",
+                    started_at=datetime.utcnow(),
+                    status="recording",
+                    session_metadata=json.dumps({
+                        "scheduled_action_id": action.id,
+                        "cycle_response": cycle_response,
+                        "action_type": "cycle",
+                    }),
+                )
+                db.add(new_session)
+                result["new_session_id"] = new_session.id
+
+                logger.info(f"[CYCLE] New measurement started, session {new_session.id}")
+
+            except Exception as e:
+                logger.error(f"[CYCLE] Start failed: {e}")
+                result["steps"]["start"] = {"success": False, "error": str(e)}
+                raise  # Re-raise to mark the action as failed
+
+        finally:
+            # Step 7: Re-enable background polling (always runs, even on failure)
+            if result.get("polling_paused") and polling_was_enabled:
+                logger.info(f"[CYCLE] Step 7/7: Re-enabling background polling for {unit_id}")
+                try:
+                    if action.device_type == "slm":
+                        from backend.services.slmm_client import get_slmm_client
+                        slmm = get_slmm_client()
+                        await slmm.update_device_polling_config(unit_id, poll_enabled=True)
+                        logger.info(f"[CYCLE] Background polling re-enabled for {unit_id}")
+                except Exception as e:
+                    logger.error(f"[CYCLE] Failed to re-enable polling: {e}")
+                    # Don't raise - cycle completed, just log the error
+
+        logger.info(f"[CYCLE] === Cycle complete for {unit_id} ===")
+        return result
+
+    # ========================================================================
+    # Recurring Schedule Generation
+    # ========================================================================
+
+    async def generate_recurring_actions(self) -> int:
+        """
+        Generate ScheduledActions from all enabled recurring schedules.
+
+        Runs once per hour to generate actions for the next 7 days.
+
+        Returns:
+            Total number of actions generated
+        """
+        db = SessionLocal()
+        total_generated = 0
+
+        try:
+            from backend.services.recurring_schedule_service import get_recurring_schedule_service
+
+            service = get_recurring_schedule_service(db)
+            schedules = service.get_enabled_schedules()
+
+            if not schedules:
+                logger.debug("No enabled recurring schedules found")
+                return 0
+
+            logger.info(f"Generating actions for {len(schedules)} recurring schedule(s)")
+
+            for schedule in schedules:
+                try:
+                    # Auto-disable one-off schedules whose end time has passed
+                    if schedule.schedule_type == "one_off" and schedule.end_datetime:
+                        if schedule.end_datetime <= datetime.utcnow():
+                            schedule.enabled = False
+                            schedule.next_occurrence = None
+                            db.commit()
+                            logger.info(f"Auto-disabled completed one-off schedule: {schedule.name}")
+                            continue
+
+                    actions = service.generate_actions_for_schedule(schedule, horizon_days=7)
+                    total_generated += len(actions)
+                except Exception as e:
+                    logger.error(f"Error generating actions for schedule {schedule.id}: {e}")
+
+            if total_generated > 0:
+                logger.info(f"Generated {total_generated} scheduled actions from recurring schedules")
+
+        except Exception as e:
+            logger.error(f"Error in generate_recurring_actions: {e}", exc_info=True)
+        finally:
+            db.close()
+
+        return total_generated
+
+    async def cleanup_old_actions(self, retention_days: int = 30) -> int:
+        """
+        Remove old completed/failed actions to prevent database bloat.
+
+        Args:
+            retention_days: Keep actions newer than this many days
+
+        Returns:
+            Number of actions cleaned up
+        """
+        db = SessionLocal()
+        cleaned = 0
+
+        try:
+            cutoff = datetime.utcnow() - timedelta(days=retention_days)
+
+            old_actions = db.query(ScheduledAction).filter(
+                and_(
+                    ScheduledAction.execution_status.in_(["completed", "failed", "cancelled"]),
+                    ScheduledAction.executed_at < cutoff,
+                )
+            ).all()
+
+            cleaned = len(old_actions)
+            for action in old_actions:
+                db.delete(action)
+
+            if cleaned > 0:
+                db.commit()
+                logger.info(f"Cleaned up {cleaned} old scheduled actions (>{retention_days} days)")
+
+        except Exception as e:
+            logger.error(f"Error cleaning up old actions: {e}")
+            db.rollback()
+        finally:
+            db.close()
+
+        return cleaned
+
     # ========================================================================
     # Manual Execution (for testing/debugging)
     # ========================================================================
|||||||
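The cleanup above deletes only actions that are both in a terminal state and older than the retention cutoff. A minimal sketch of that predicate over plain values instead of the real `ScheduledAction` model (the helper name `is_purgeable` is ours, not the codebase's):

```python
from datetime import datetime, timedelta
from typing import Optional

TERMINAL_STATUSES = {"completed", "failed", "cancelled"}

def is_purgeable(status: str, executed_at: datetime,
                 retention_days: int = 30,
                 now: Optional[datetime] = None) -> bool:
    # Mirrors the cleanup query: terminal status AND executed before the cutoff
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=retention_days)
    return status in TERMINAL_STATUSES and executed_at < cutoff
```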
backend/services/slm_status_sync.py (new file, 129 lines)
@@ -0,0 +1,129 @@
+"""
+SLM Status Synchronization Service
+
+Syncs SLM device status from SLMM backend to Terra-View's Emitter table.
+This bridges SLMM's polling data with Terra-View's status snapshot system.
+
+SLMM tracks device reachability via background polling. This service
+fetches that data and creates/updates Emitter records so SLMs appear
+correctly in the dashboard status snapshot.
+"""
+
+import logging
+from datetime import datetime, timezone
+from typing import Dict, Any
+
+from backend.database import get_db_session
+from backend.models import Emitter
+from backend.services.slmm_client import get_slmm_client, SLMMClientError
+
+logger = logging.getLogger(__name__)
+
+
+async def sync_slm_status_to_emitters() -> Dict[str, Any]:
+    """
+    Fetch SLM status from SLMM and sync to Terra-View's Emitter table.
+
+    For each device in SLMM's polling status:
+    - If last_success exists, create/update Emitter with that timestamp
+    - If not reachable, update Emitter with last known timestamp (or None)
+
+    Returns:
+        Dict with synced_count, error_count, errors list
+    """
+    client = get_slmm_client()
+    synced = 0
+    errors = []
+
+    try:
+        # Get polling status from SLMM
+        status_response = await client.get_polling_status()
+
+        # Handle nested response structure
+        data = status_response.get("data", status_response)
+        devices = data.get("devices", [])
+
+        if not devices:
+            logger.debug("No SLM devices in SLMM polling status")
+            return {"synced_count": 0, "error_count": 0, "errors": []}
+
+        db = get_db_session()
+        try:
+            for device in devices:
+                unit_id = device.get("unit_id")
+                if not unit_id:
+                    continue
+
+                try:
+                    # Get or create Emitter record
+                    emitter = db.query(Emitter).filter(Emitter.id == unit_id).first()
+
+                    # Determine last_seen from SLMM data
+                    last_success_str = device.get("last_success")
+                    is_reachable = device.get("is_reachable", False)
+
+                    if last_success_str:
+                        # Parse ISO format timestamp
+                        last_seen = datetime.fromisoformat(
+                            last_success_str.replace("Z", "+00:00")
+                        )
+                        # Convert to naive UTC for consistency with existing code
+                        if last_seen.tzinfo:
+                            last_seen = last_seen.astimezone(timezone.utc).replace(tzinfo=None)
+                    elif is_reachable:
+                        # Device is reachable but no last_success yet (first poll or just started)
+                        # Use current time so it shows as OK, not Missing
+                        last_seen = datetime.utcnow()
+                    else:
+                        last_seen = None
+
+                    # Status will be recalculated by snapshot.py based on time thresholds
+                    # Just store a provisional status here
+                    status = "OK" if is_reachable else "Missing"
+
+                    # Store last error message if available
+                    last_error = device.get("last_error") or ""
+
+                    if emitter:
+                        # Update existing record
+                        emitter.last_seen = last_seen
+                        emitter.status = status
+                        emitter.unit_type = "slm"
+                        emitter.last_file = last_error
+                    else:
+                        # Create new record
+                        emitter = Emitter(
+                            id=unit_id,
+                            unit_type="slm",
+                            last_seen=last_seen,
+                            last_file=last_error,
+                            status=status,
+                        )
+                        db.add(emitter)
+
+                    synced += 1
+
+                except Exception as e:
+                    errors.append(f"{unit_id}: {str(e)}")
+                    logger.error(f"Error syncing SLM {unit_id}: {e}")
+
+            db.commit()
+
+        finally:
+            db.close()
+
+        if synced > 0:
+            logger.info(f"Synced {synced} SLM device(s) to Emitter table")
+
+    except SLMMClientError as e:
+        logger.warning(f"Could not reach SLMM for status sync: {e}")
+        errors.append(f"SLMM unreachable: {str(e)}")
+    except Exception as e:
+        logger.error(f"Error in SLM status sync: {e}", exc_info=True)
+        errors.append(str(e))
+
+    return {
+        "synced_count": synced,
+        "error_count": len(errors),
+        "errors": errors,
+    }
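The sync normalizes SLMM's `last_success` timestamps ("Z"-suffixed ISO 8601) to naive UTC before storing them. That step, isolated into a hypothetical helper:

```python
from datetime import datetime, timezone
from typing import Optional

def to_naive_utc(last_success: Optional[str]) -> Optional[datetime]:
    """Parse an ISO-8601 timestamp (possibly 'Z'-suffixed) into naive UTC."""
    if not last_success:
        return None
    dt = datetime.fromisoformat(last_success.replace("Z", "+00:00"))
    if dt.tzinfo:
        # Drop tzinfo after converting, matching the sync's storage convention
        dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
    return dt
```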
@@ -9,13 +9,14 @@ that handles TCP/FTP communication with Rion NL-43/NL-53 devices.
 """
 
 import httpx
+import os
 from typing import Optional, Dict, Any, List
 from datetime import datetime
 import json
 
 
-# SLMM backend base URLs
-SLMM_BASE_URL = "http://localhost:8100"
+# SLMM backend base URLs - use environment variable if set (for Docker)
+SLMM_BASE_URL = os.environ.get("SLMM_BASE_URL", "http://localhost:8100")
 SLMM_API_BASE = f"{SLMM_BASE_URL}/api/nl43"
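The changed constant above resolves the SLMM base URL from the environment with a localhost fallback. The lookup as a pure function over an env mapping (the function itself is illustrative, not part of the client):

```python
from typing import Mapping

def resolve_base_url(env: Mapping[str, str]) -> str:
    # Docker sets SLMM_BASE_URL; bare-metal dev falls back to localhost
    return env.get("SLMM_BASE_URL", "http://localhost:8100")
```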
@@ -108,7 +109,71 @@ class SLMMClient:
                 f"SLMM operation failed: {error_detail}"
             )
         except Exception as e:
-            raise SLMMClientError(f"Unexpected error: {str(e)}")
+            error_msg = str(e) if str(e) else type(e).__name__
+            raise SLMMClientError(f"Unexpected error: {error_msg}")
+
+    async def _download_request(
+        self,
+        endpoint: str,
+        data: Dict[str, Any],
+        unit_id: str,
+    ) -> Dict[str, Any]:
+        """
+        Make a download request to SLMM that returns binary file content (not JSON).
+
+        Saves the file locally and returns metadata about the download.
+        """
+        url = f"{self.api_base}{endpoint}"
+
+        try:
+            async with httpx.AsyncClient(timeout=httpx.Timeout(300.0)) as client:
+                response = await client.post(url, json=data)
+                response.raise_for_status()
+
+                # Determine filename from Content-Disposition header or generate one
+                content_disp = response.headers.get("content-disposition", "")
+                filename = None
+                if "filename=" in content_disp:
+                    filename = content_disp.split("filename=")[-1].strip('" ')
+
+                if not filename:
+                    remote_path = data.get("remote_path", "download")
+                    base = os.path.basename(remote_path.rstrip("/"))
+                    filename = f"{base}.zip" if not base.endswith(".zip") else base
+
+                # Save to local downloads directory
+                download_dir = os.path.join("data", "downloads", unit_id)
+                os.makedirs(download_dir, exist_ok=True)
+                local_path = os.path.join(download_dir, filename)
+
+                with open(local_path, "wb") as f:
+                    f.write(response.content)
+
+                return {
+                    "success": True,
+                    "local_path": local_path,
+                    "filename": filename,
+                    "size_bytes": len(response.content),
+                }
+
+        except httpx.ConnectError as e:
+            raise SLMMConnectionError(
+                f"Cannot connect to SLMM backend at {self.base_url}. "
+                f"Is SLMM running? Error: {str(e)}"
+            )
+        except httpx.HTTPStatusError as e:
+            error_detail = "Unknown error"
+            try:
+                error_data = e.response.json()
+                error_detail = error_data.get("detail", str(error_data))
+            except Exception:
+                error_detail = e.response.text or str(e)
+            raise SLMMDeviceError(f"SLMM download failed: {error_detail}")
+        except (SLMMConnectionError, SLMMDeviceError):
+            raise
+        except Exception as e:
+            error_msg = str(e) if str(e) else type(e).__name__
+            raise SLMMClientError(f"Download error: {error_msg}")
 
     # ========================================================================
     # Unit Management
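`_download_request` picks the saved filename from the `Content-Disposition` header when present, otherwise deriving a `.zip` name from `remote_path`. The same logic as a standalone, testable helper (the name `pick_filename` is ours):

```python
import os

def pick_filename(content_disposition: str, remote_path: str) -> str:
    """Prefer the server-supplied filename; otherwise derive one from remote_path."""
    filename = None
    if "filename=" in content_disposition:
        filename = content_disposition.split("filename=")[-1].strip('" ')
    if not filename:
        base = os.path.basename(remote_path.rstrip("/"))
        filename = base if base.endswith(".zip") else f"{base}.zip"
    return filename
```

Note this naive split does not handle RFC 6266 `filename*=` encoded parameters; it matches what the client code itself does.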
@@ -276,6 +341,124 @@ class SLMMClient:
         """
         return await self._request("POST", f"/{unit_id}/reset")
 
+    # ========================================================================
+    # Store/Index Management
+    # ========================================================================
+
+    async def get_index_number(self, unit_id: str) -> Dict[str, Any]:
+        """
+        Get current store/index number from device.
+
+        Args:
+            unit_id: Unit identifier
+
+        Returns:
+            Dict with current index_number (store name)
+        """
+        return await self._request("GET", f"/{unit_id}/index-number")
+
+    async def set_index_number(
+        self,
+        unit_id: str,
+        index_number: int,
+    ) -> Dict[str, Any]:
+        """
+        Set store/index number on device.
+
+        Args:
+            unit_id: Unit identifier
+            index_number: New index number to set
+
+        Returns:
+            Confirmation response
+        """
+        return await self._request(
+            "PUT",
+            f"/{unit_id}/index-number",
+            data={"index_number": index_number},
+        )
+
+    async def check_overwrite_status(self, unit_id: str) -> Dict[str, Any]:
+        """
+        Check if data exists at the current store index.
+
+        Args:
+            unit_id: Unit identifier
+
+        Returns:
+            Dict with:
+            - overwrite_status: "None" (safe) or "Exist" (would overwrite)
+            - will_overwrite: bool
+            - safe_to_store: bool
+        """
+        return await self._request("GET", f"/{unit_id}/overwrite-check")
+
+    async def increment_index(self, unit_id: str, max_attempts: int = 100) -> Dict[str, Any]:
+        """
+        Find and set the next available (unused) store/index number.
+
+        Checks the current index - if it would overwrite existing data,
+        increments until finding an unused index number.
+
+        Args:
+            unit_id: Unit identifier
+            max_attempts: Maximum number of indices to try before giving up
+
+        Returns:
+            Dict with old_index, new_index, and attempts_made
+        """
+        # Get current index
+        current = await self.get_index_number(unit_id)
+        old_index = current.get("index_number", 0)
+
+        # Check if current index is safe
+        overwrite_check = await self.check_overwrite_status(unit_id)
+        if overwrite_check.get("safe_to_store", False):
+            # Current index is safe, no need to increment
+            return {
+                "success": True,
+                "old_index": old_index,
+                "new_index": old_index,
+                "unit_id": unit_id,
+                "already_safe": True,
+                "attempts_made": 0,
+            }
+
+        # Need to find an unused index
+        attempts = 0
+        test_index = old_index + 1
+
+        while attempts < max_attempts:
+            # Set the new index
+            await self.set_index_number(unit_id, test_index)
+
+            # Check if this index is safe
+            overwrite_check = await self.check_overwrite_status(unit_id)
+            attempts += 1
+
+            if overwrite_check.get("safe_to_store", False):
+                return {
+                    "success": True,
+                    "old_index": old_index,
+                    "new_index": test_index,
+                    "unit_id": unit_id,
+                    "already_safe": False,
+                    "attempts_made": attempts,
+                }
+
+            # Try next index (wrap around at 9999)
+            test_index = (test_index + 1) % 10000
+
+            # Avoid infinite loops if we've wrapped around
+            if test_index == old_index:
+                break
+
+        # Could not find a safe index
+        raise SLMMDeviceError(
+            f"Could not find unused store index for {unit_id} after {attempts} attempts. "
+            f"Consider downloading and clearing data from the device."
+        )
+
     # ========================================================================
     # Device Settings
     # ========================================================================
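`increment_index` is a linear probe with wrap-around at 9999. A device-free sketch of the search, where the `is_safe` callback stands in for `check_overwrite_status` and the helper name is ours:

```python
from typing import Callable, Optional

def find_free_index(current: int, is_safe: Callable[[int], bool],
                    max_attempts: int = 100) -> Optional[int]:
    """Probe indices after `current`, wrapping at 9999, until one is safe."""
    if is_safe(current):
        return current
    test = (current + 1) % 10000
    for _ in range(max_attempts):
        if is_safe(test):
            return test
        test = (test + 1) % 10000
        if test == current:  # full wrap: every index is occupied
            break
    return None
```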
@@ -359,9 +542,130 @@ class SLMMClient:
         return await self._request("GET", f"/{unit_id}/settings")
 
     # ========================================================================
-    # Data Download (Future)
+    # FTP Control
     # ========================================================================
 
+    async def enable_ftp(self, unit_id: str) -> Dict[str, Any]:
+        """
+        Enable FTP server on device.
+
+        Must be called before downloading files. FTP and TCP can work in tandem.
+
+        Args:
+            unit_id: Unit identifier
+
+        Returns:
+            Dict with status message
+        """
+        return await self._request("POST", f"/{unit_id}/ftp/enable")
+
+    async def disable_ftp(self, unit_id: str) -> Dict[str, Any]:
+        """
+        Disable FTP server on device.
+
+        Args:
+            unit_id: Unit identifier
+
+        Returns:
+            Dict with status message
+        """
+        return await self._request("POST", f"/{unit_id}/ftp/disable")
+
+    async def get_ftp_status(self, unit_id: str) -> Dict[str, Any]:
+        """
+        Get FTP server status on device.
+
+        Args:
+            unit_id: Unit identifier
+
+        Returns:
+            Dict with ftp_enabled status
+        """
+        return await self._request("GET", f"/{unit_id}/ftp/status")
+
+    # ========================================================================
+    # Data Download
+    # ========================================================================
+
+    async def download_file(
+        self,
+        unit_id: str,
+        remote_path: str,
+    ) -> Dict[str, Any]:
+        """
+        Download a single file from unit via FTP.
+
+        Args:
+            unit_id: Unit identifier
+            remote_path: Path on device to download (e.g., "/NL43_DATA/measurement.wav")
+
+        Returns:
+            Dict with local_path, filename, size_bytes
+        """
+        return await self._download_request(
+            f"/{unit_id}/ftp/download",
+            {"remote_path": remote_path},
+            unit_id,
+        )
+
+    async def download_folder(
+        self,
+        unit_id: str,
+        remote_path: str,
+    ) -> Dict[str, Any]:
+        """
+        Download an entire folder from unit via FTP as a ZIP archive.
+
+        Useful for downloading complete measurement sessions (e.g., Auto_0000 folders).
+
+        Args:
+            unit_id: Unit identifier
+            remote_path: Folder path on device to download (e.g., "/NL43_DATA/Auto_0000")
+
+        Returns:
+            Dict with local_path, folder_name, size_bytes
+        """
+        return await self._download_request(
+            f"/{unit_id}/ftp/download-folder",
+            {"remote_path": remote_path},
+            unit_id,
+        )
+
+    async def download_current_measurement(
+        self,
+        unit_id: str,
+    ) -> Dict[str, Any]:
+        """
+        Download the current measurement folder based on the device's index number.
+
+        This is the recommended method for scheduled downloads - it automatically
+        determines which folder to download based on the device's current store index.
+
+        Args:
+            unit_id: Unit identifier
+
+        Returns:
+            Dict with local_path, folder_name, file_count, zip_size_bytes, index_number
+        """
+        # Get current index number from device
+        index_info = await self.get_index_number(unit_id)
+        index_number_raw = index_info.get("index_number", 0)
+
+        # Convert to int - device returns string like "0000" or "0001"
+        try:
+            index_number = int(index_number_raw)
+        except (ValueError, TypeError):
+            index_number = 0
+
+        # Format as Auto_XXXX folder name
+        folder_name = f"Auto_{index_number:04d}"
+        remote_path = f"/NL-43/{folder_name}"
+
+        # Download the folder
+        result = await self.download_folder(unit_id, remote_path)
+        result["index_number"] = index_number
+        return result
+
     async def download_files(
         self,
         unit_id: str,
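`download_current_measurement` turns the device's store index (returned as a string like "0007") into the on-device folder path. That mapping, as a small assumed helper:

```python
def measurement_folder(index_number_raw) -> str:
    """Format the device's store index as its Auto_XXXX folder path."""
    try:
        index_number = int(index_number_raw)
    except (ValueError, TypeError):
        # Malformed or missing index: fall back to the first folder
        index_number = 0
    return f"/NL-43/Auto_{index_number:04d}"
```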
@@ -369,23 +673,153 @@ class SLMMClient:
         files: Optional[List[str]] = None,
     ) -> Dict[str, Any]:
         """
-        Download files from unit via FTP.
-
-        NOTE: This endpoint doesn't exist in SLMM yet. Will need to implement.
+        Download measurement files from unit via FTP.
+
+        This method automatically determines the current measurement folder and downloads it.
+        The destination_path parameter is logged for reference, but the actual download
+        location is managed by SLMM (data/downloads/{unit_id}/).
 
         Args:
             unit_id: Unit identifier
-            destination_path: Local path to save files
-            files: List of filenames to download, or None for all
+            destination_path: Reference path (for logging/metadata, not used by SLMM)
+            files: Ignored - always downloads the current measurement folder
 
         Returns:
-            Dict with downloaded files list and metadata
+            Dict with download result including local_path, folder_name, etc.
         """
-        data = {
-            "destination_path": destination_path,
-            "files": files or "all",
-        }
-        return await self._request("POST", f"/{unit_id}/ftp/download", data=data)
+        # Use the new method that automatically determines what to download
+        result = await self.download_current_measurement(unit_id)
+        result["requested_destination"] = destination_path
+        return result
+
+    # ========================================================================
+    # Cycle Commands (for scheduled automation)
+    # ========================================================================
+
+    async def start_cycle(
+        self,
+        unit_id: str,
+        sync_clock: bool = True,
+    ) -> Dict[str, Any]:
+        """
+        Execute complete start cycle on device via SLMM.
+
+        This handles the full pre-recording workflow:
+        1. Sync device clock to server time
+        2. Find next safe index (with overwrite protection)
+        3. Start measurement
+
+        Args:
+            unit_id: Unit identifier
+            sync_clock: Whether to sync device clock to server time
+
+        Returns:
+            Dict with clock_synced, old_index, new_index, started, etc.
+        """
+        return await self._request(
+            "POST",
+            f"/{unit_id}/start-cycle",
+            data={"sync_clock": sync_clock},
+        )
+
+    async def stop_cycle(
+        self,
+        unit_id: str,
+        download: bool = True,
+        download_path: Optional[str] = None,
+    ) -> Dict[str, Any]:
+        """
+        Execute complete stop cycle on device via SLMM.
+
+        This handles the full post-recording workflow:
+        1. Stop measurement
+        2. Enable FTP
+        3. Download measurement folder (if download=True)
+        4. Verify download
+
+        Args:
+            unit_id: Unit identifier
+            download: Whether to download measurement data
+            download_path: Custom path for downloaded ZIP (optional)
+
+        Returns:
+            Dict with stopped, ftp_enabled, download_success, local_path, etc.
+        """
+        data = {"download": download}
+        if download_path:
+            data["download_path"] = download_path
+        return await self._request(
+            "POST",
+            f"/{unit_id}/stop-cycle",
+            data=data,
+        )
+
+    # ========================================================================
+    # Polling Status (for device monitoring/alerts)
+    # ========================================================================
+
+    async def get_polling_status(self) -> Dict[str, Any]:
+        """
+        Get global polling status from SLMM.
+
+        Returns device reachability information for all polled devices.
+        Used by DeviceStatusMonitor to detect offline/online transitions.
+
+        Returns:
+            Dict with devices list containing:
+            - unit_id
+            - is_reachable
+            - consecutive_failures
+            - last_poll_attempt
+            - last_success
+            - last_error
+        """
+        try:
+            async with httpx.AsyncClient(timeout=self.timeout) as client:
+                response = await client.get(f"{self.base_url}/api/nl43/_polling/status")
+                response.raise_for_status()
+                return response.json()
+        except httpx.ConnectError:
+            raise SLMMConnectionError("Cannot connect to SLMM for polling status")
+        except Exception as e:
+            raise SLMMClientError(f"Failed to get polling status: {str(e)}")
+
+    async def get_device_polling_config(self, unit_id: str) -> Dict[str, Any]:
+        """
+        Get polling configuration for a specific device.
+
+        Args:
+            unit_id: Unit identifier
+
+        Returns:
+            Dict with poll_enabled and poll_interval_seconds
+        """
+        return await self._request("GET", f"/{unit_id}/polling/config")
+
+    async def update_device_polling_config(
+        self,
+        unit_id: str,
+        poll_enabled: Optional[bool] = None,
+        poll_interval_seconds: Optional[int] = None,
+    ) -> Dict[str, Any]:
+        """
+        Update polling configuration for a device.
+
+        Args:
+            unit_id: Unit identifier
+            poll_enabled: Enable/disable polling
+            poll_interval_seconds: Polling interval (10-3600)
+
+        Returns:
+            Updated config
+        """
+        config = {}
+        if poll_enabled is not None:
+            config["poll_enabled"] = poll_enabled
+        if poll_interval_seconds is not None:
+            config["poll_interval_seconds"] = poll_interval_seconds
+
+        return await self._request("PUT", f"/{unit_id}/polling/config", data=config)
+
     # ========================================================================
     # Health Check
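`update_device_polling_config` sends a partial update: only fields the caller explicitly set are included, so SLMM leaves the rest unchanged. The payload construction, extracted as a sketch:

```python
from typing import Optional, Dict, Any

def build_polling_payload(poll_enabled: Optional[bool] = None,
                          poll_interval_seconds: Optional[int] = None) -> Dict[str, Any]:
    """Include only explicitly provided fields in the PUT body."""
    config: Dict[str, Any] = {}
    if poll_enabled is not None:
        config["poll_enabled"] = poll_enabled
    if poll_interval_seconds is not None:
        config["poll_interval_seconds"] = poll_interval_seconds
    return config
```

Note the `is not None` checks matter: `poll_enabled=False` must still be sent, so a plain truthiness test would be wrong.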
@@ -36,6 +36,10 @@ async def sync_slm_to_slmm(unit: RosterUnit) -> bool:
         logger.warning(f"SLM {unit.id} has no host configured, skipping SLMM sync")
         return False
 
+    # Disable polling if unit is benched (deployed=False) or retired
+    # Only actively deployed units should be polled
+    should_poll = unit.deployed and not unit.retired
+
     try:
         async with httpx.AsyncClient(timeout=5.0) as client:
             response = await client.put(
@@ -47,8 +51,8 @@ async def sync_slm_to_slmm(unit: RosterUnit) -> bool:
                     "ftp_enabled": True,
                     "ftp_username": "USER",  # Default NL43 credentials
                     "ftp_password": "0000",
-                    "poll_enabled": not unit.retired,  # Disable polling for retired units
-                    "poll_interval_seconds": 60,  # Default interval
+                    "poll_enabled": should_poll,  # Disable polling for benched or retired units
+                    "poll_interval_seconds": 3600,  # Default to 1 hour polling
                 }
             )
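The new `should_poll` rule in the hunk above is a two-flag predicate; spelled out, since the old rule (`not unit.retired`) kept polling benched units:

```python
def should_poll(deployed: bool, retired: bool) -> bool:
    # Only actively deployed, non-retired units get polled
    return deployed and not retired
```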
@@ -80,6 +80,18 @@ def emit_status_snapshot():
|
|||||||
age = "N/A"
|
age = "N/A"
|
||||||
last_seen = None
|
last_seen = None
|
||||||
fname = ""
|
fname = ""
|
||||||
|
elif r.out_for_calibration:
|
||||||
|
# Out for calibration units get separated later
|
||||||
|
status = "Out for Calibration"
|
||||||
|
age = "N/A"
|
||||||
|
last_seen = None
|
||||||
|
fname = ""
|
||||||
|
elif getattr(r, 'allocated', False) and not r.deployed:
|
||||||
|
# Allocated: staged for an upcoming job, not yet physically deployed
|
||||||
|
status = "Allocated"
|
||||||
|
age = "N/A"
|
||||||
|
last_seen = None
|
||||||
|
fname = ""
|
||||||
else:
|
else:
|
||||||
if e:
|
if e:
|
||||||
last_seen = ensure_utc(e.last_seen)
|
last_seen = ensure_utc(e.last_seen)
|
||||||
@@ -103,11 +115,15 @@ def emit_status_snapshot():
|
|||||||
"deployed": r.deployed,
|
"deployed": r.deployed,
|
||||||
"note": r.note or "",
|
"note": r.note or "",
|
||||||
"retired": r.retired,
|
"retired": r.retired,
|
||||||
|
"out_for_calibration": r.out_for_calibration or False,
|
||||||
|
"allocated": getattr(r, 'allocated', False) or False,
|
||||||
|
"allocated_to_project_id": getattr(r, 'allocated_to_project_id', None) or "",
|
||||||
# Device type and type-specific fields
|
# Device type and type-specific fields
|
||||||
"device_type": r.device_type or "seismograph",
|
"device_type": r.device_type or "seismograph",
|
||||||
"last_calibrated": r.last_calibrated.isoformat() if r.last_calibrated else None,
|
"last_calibrated": r.last_calibrated.isoformat() if r.last_calibrated else None,
|
||||||
"next_calibration_due": r.next_calibration_due.isoformat() if r.next_calibration_due else None,
|
"next_calibration_due": r.next_calibration_due.isoformat() if r.next_calibration_due else None,
|
||||||
"deployed_with_modem_id": r.deployed_with_modem_id,
|
"deployed_with_modem_id": r.deployed_with_modem_id,
|
||||||
|
"deployed_with_unit_id": r.deployed_with_unit_id,
|
||||||
"ip_address": r.ip_address,
|
"ip_address": r.ip_address,
|
||||||
"phone_number": r.phone_number,
|
"phone_number": r.phone_number,
|
||||||
"hardware_model": r.hardware_model,
|
"hardware_model": r.hardware_model,
|
||||||
@@ -132,11 +148,15 @@ def emit_status_snapshot():
|
|||||||
"deployed": False, # default
|
"deployed": False, # default
|
||||||
"note": "",
|
"note": "",
|
||||||
"retired": False,
|
"retired": False,
|
||||||
|
"out_for_calibration": False,
|
||||||
|
"allocated": False,
|
||||||
|
"allocated_to_project_id": "",
|
||||||
# Device type and type-specific fields (defaults for unknown units)
|
# Device type and type-specific fields (defaults for unknown units)
|
||||||
"device_type": "seismograph", # default
|
"device_type": "seismograph", # default
|
||||||
"last_calibrated": None,
|
"last_calibrated": None,
|
||||||
"next_calibration_due": None,
|
"next_calibration_due": None,
|
||||||
"deployed_with_modem_id": None,
|
"deployed_with_modem_id": None,
|
||||||
|
"deployed_with_unit_id": None,
|
||||||
"ip_address": None,
|
"ip_address": None,
|
||||||
"phone_number": None,
|
"phone_number": None,
|
||||||
"hardware_model": None,
|
"hardware_model": None,
|
||||||
@@ -146,15 +166,48 @@ def emit_status_snapshot():
|
|||||||
"coordinates": "",
|
"coordinates": "",
|
||||||
}
|
}
|
||||||
|
|
||||||
|
# --- Derive modem status from paired devices ---
|
||||||
|
# Modems don't have their own check-in system, so we inherit status
|
||||||
|
# from whatever device they're paired with (seismograph or SLM)
|
||||||
|
# Check both directions: modem.deployed_with_unit_id OR device.deployed_with_modem_id
|
||||||
|
for unit_id, unit_data in units.items():
|
||||||
|
if unit_data.get("device_type") == "modem" and not unit_data.get("retired"):
|
||||||
|
paired_unit_id = None
|
||||||
|
roster_unit = roster.get(unit_id)
|
||||||
|
|
||||||
|
# First, check if modem has deployed_with_unit_id set
|
||||||
|
if roster_unit and roster_unit.deployed_with_unit_id:
|
||||||
|
+            paired_unit_id = roster_unit.deployed_with_unit_id
+        else:
+            # Fallback: check if any device has this modem in deployed_with_modem_id
+            for other_id, other_roster in roster.items():
+                if other_roster.deployed_with_modem_id == unit_id:
+                    paired_unit_id = other_id
+                    break
+
+        if paired_unit_id:
+            paired_unit = units.get(paired_unit_id)
+            if paired_unit:
+                # Inherit status from paired device
+                unit_data["status"] = paired_unit.get("status", "Missing")
+                unit_data["age"] = paired_unit.get("age", "N/A")
+                unit_data["last"] = paired_unit.get("last")
+                unit_data["derived_from"] = paired_unit_id
+
     # Separate buckets for UI
     active_units = {
         uid: u for uid, u in units.items()
-        if not u["retired"] and u["deployed"] and uid not in ignored
+        if not u["retired"] and not u["out_for_calibration"] and u["deployed"] and uid not in ignored
     }

     benched_units = {
         uid: u for uid, u in units.items()
-        if not u["retired"] and not u["deployed"] and uid not in ignored
+        if not u["retired"] and not u["out_for_calibration"] and not u["allocated"] and not u["deployed"] and uid not in ignored
     }

+    allocated_units = {
+        uid: u for uid, u in units.items()
+        if not u["retired"] and not u["out_for_calibration"] and u["allocated"] and not u["deployed"] and uid not in ignored
+    }
+
     retired_units = {
@@ -162,6 +215,11 @@ def emit_status_snapshot():
         if u["retired"]
     }

+    out_for_calibration_units = {
+        uid: u for uid, u in units.items()
+        if u["out_for_calibration"]
+    }
+
     # Unknown units - emitters that aren't in the roster and aren't ignored
     unknown_units = {
         uid: u for uid, u in units.items()
@@ -173,13 +231,17 @@ def emit_status_snapshot():
         "units": units,
         "active": active_units,
         "benched": benched_units,
+        "allocated": allocated_units,
         "retired": retired_units,
+        "out_for_calibration": out_for_calibration_units,
         "unknown": unknown_units,
         "summary": {
-            "total": len(active_units) + len(benched_units),
+            "total": len(active_units) + len(benched_units) + len(allocated_units),
             "active": len(active_units),
             "benched": len(benched_units),
+            "allocated": len(allocated_units),
             "retired": len(retired_units),
+            "out_for_calibration": len(out_for_calibration_units),
             "unknown": len(unknown_units),
             # Status counts only for deployed units (active_units)
             "ok": sum(1 for u in active_units.values() if u["status"] == "OK"),
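The bucket predicates above are easy to get wrong when a new flag like `out_for_calibration` is added; a runnable sketch of the same comprehensions (only the boolean field names come from the diff, the unit data is invented) shows that every non-retired, non-calibration, non-ignored unit lands in exactly one of active/allocated/benched:

```python
# Invented sample data; only the boolean field names come from the diff.
units = {
    "BE1234": {"retired": False, "out_for_calibration": False, "allocated": False, "deployed": True},
    "BE9012": {"retired": False, "out_for_calibration": False, "allocated": True, "deployed": False},
    "MDM003": {"retired": False, "out_for_calibration": False, "allocated": False, "deployed": False},
    "BE7890": {"retired": True, "out_for_calibration": False, "allocated": False, "deployed": False},
}
ignored = set()

active_units = {
    uid: u for uid, u in units.items()
    if not u["retired"] and not u["out_for_calibration"] and u["deployed"] and uid not in ignored
}
allocated_units = {
    uid: u for uid, u in units.items()
    if not u["retired"] and not u["out_for_calibration"] and u["allocated"]
    and not u["deployed"] and uid not in ignored
}
benched_units = {
    uid: u for uid, u in units.items()
    if not u["retired"] and not u["out_for_calibration"] and not u["allocated"]
    and not u["deployed"] and uid not in ignored
}

# The three buckets partition the eligible units: no overlaps, nothing dropped.
eligible = {uid for uid, u in units.items()
            if not u["retired"] and not u["out_for_calibration"] and uid not in ignored}
assert set(active_units) | set(allocated_units) | set(benched_units) == eligible
assert len(active_units) + len(allocated_units) + len(benched_units) == len(eligible)
```

Note that `retired_units` and `out_for_calibration_units` use looser predicates, so a unit that is both retired and out for calibration appears in both of those views.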
BIN  backend/static/icons/favicon-16.png  (new file; 424 B)
BIN  backend/static/icons/favicon-32.png  (new file; 1.1 KiB)
BIN  (updated image; 1.9 KiB before, 7.7 KiB after)
BIN  (updated image; 2.2 KiB before, 9.2 KiB after)
BIN  (updated image; 2.2 KiB before, 10 KiB after)
BIN  (updated image; 2.9 KiB before, 15 KiB after)
BIN  (updated image; 5.8 KiB before, 44 KiB after)
BIN  (updated image; 7.8 KiB before, 68 KiB after)
BIN  (updated image; 1.1 KiB before, 3.2 KiB after)
BIN  (updated image; 1.4 KiB before, 5.0 KiB after)
BIN  backend/static/terra-view-logo-dark.png  (new file; 13 KiB)
BIN  backend/static/terra-view-logo-dark@2x.png  (new file; 57 KiB)
BIN  backend/static/terra-view-logo-light.png  (new file; 14 KiB)
BIN  backend/static/terra-view-logo-light@2x.png  (new file; 49 KiB)
backend/templates_config.py  (new file, 90 lines)
@@ -0,0 +1,90 @@
"""
Shared Jinja2 templates configuration.

All routers should import `templates` from this module to get consistent
filter and global function registration.
"""

import json as _json
from fastapi.templating import Jinja2Templates

# Import timezone utilities
from backend.utils.timezone import (
    format_local_datetime, format_local_time,
    get_user_timezone, get_timezone_abbreviation
)


def jinja_local_datetime(dt, fmt="%Y-%m-%d %H:%M"):
    """Jinja filter to convert UTC datetime to local timezone."""
    return format_local_datetime(dt, fmt)


def jinja_local_time(dt):
    """Jinja filter to format time in local timezone."""
    return format_local_time(dt)


def jinja_timezone_abbr():
    """Jinja global to get current timezone abbreviation."""
    return get_timezone_abbreviation()


# Create templates instance
templates = Jinja2Templates(directory="templates")


def jinja_local_date(dt, fmt="%m-%d-%y"):
    """Jinja filter: format a UTC datetime as a local date string (e.g. 02-19-26)."""
    return format_local_datetime(dt, fmt)


def jinja_fromjson(s):
    """Jinja filter: parse a JSON string into a dict (returns {} on failure)."""
    if not s:
        return {}
    try:
        return _json.loads(s)
    except Exception:
        return {}


def jinja_same_date(dt1, dt2) -> bool:
    """Jinja global: True if two datetimes fall on the same local date."""
    if not dt1 or not dt2:
        return False
    try:
        d1 = format_local_datetime(dt1, "%Y-%m-%d")
        d2 = format_local_datetime(dt2, "%Y-%m-%d")
        return d1 == d2
    except Exception:
        return False


def jinja_log_tail_display(s):
    """Jinja filter: decode a JSON-encoded log tail array into a plain-text string."""
    if not s:
        return ""
    try:
        lines = _json.loads(s)
        if isinstance(lines, list):
            return "\n".join(str(l) for l in lines)
        return str(s)
    except Exception:
        return str(s)


def jinja_local_datetime_input(dt):
    """Jinja filter: format UTC datetime as local YYYY-MM-DDTHH:MM for <input type='datetime-local'>."""
    return format_local_datetime(dt, "%Y-%m-%dT%H:%M")


# Register Jinja filters and globals
templates.env.filters["local_datetime"] = jinja_local_datetime
templates.env.filters["local_time"] = jinja_local_time
templates.env.filters["local_date"] = jinja_local_date
templates.env.filters["local_datetime_input"] = jinja_local_datetime_input
templates.env.filters["fromjson"] = jinja_fromjson
templates.env.globals["timezone_abbr"] = jinja_timezone_abbr
templates.env.globals["get_user_timezone"] = get_user_timezone
templates.env.globals["same_date"] = jinja_same_date
templates.env.filters["log_tail_display"] = jinja_log_tail_display
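Of the filters above, `log_tail_display` has the least obvious fallback behavior (JSON array becomes newline-joined text, anything else passes through). A standalone sketch of the same logic, with the FastAPI and timezone dependencies stripped out so it runs on its own:

```python
import json


def log_tail_display(s):
    """Decode a JSON-encoded array of log lines into plain text; fall back to str(s)."""
    if not s:
        return ""
    try:
        lines = json.loads(s)
        if isinstance(lines, list):
            return "\n".join(str(l) for l in lines)
        return str(s)
    except Exception:
        return str(s)


# A JSON array becomes newline-joined text; anything else passes through unchanged.
assert log_tail_display('["boot ok", "heartbeat sent"]') == "boot ok\nheartbeat sent"
assert log_tail_display("plain text, not JSON") == "plain text, not JSON"
assert log_tail_display("") == ""
```

This is why the template can pipe `agent.log_tail` through the filter without caring whether the watcher sent a JSON array or raw text.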
backend/utils/__init__.py  (new file, 1 line)
@@ -0,0 +1 @@
# Utils package
backend/utils/timezone.py  (new file, 173 lines)
@@ -0,0 +1,173 @@
"""
Timezone utilities for Terra-View.

Provides consistent timezone handling throughout the application.
All database times are stored in UTC; this module converts for display.
"""

from datetime import datetime
from zoneinfo import ZoneInfo
from typing import Optional

from backend.database import SessionLocal
from backend.models import UserPreferences


# Default timezone if none set
DEFAULT_TIMEZONE = "America/New_York"


def get_user_timezone() -> str:
    """
    Get the user's configured timezone from preferences.

    Returns:
        Timezone string (e.g., "America/New_York")
    """
    db = SessionLocal()
    try:
        prefs = db.query(UserPreferences).filter_by(id=1).first()
        if prefs and prefs.timezone:
            return prefs.timezone
        return DEFAULT_TIMEZONE
    finally:
        db.close()


def get_timezone_info(tz_name: str = None) -> ZoneInfo:
    """
    Get ZoneInfo object for the specified or user's timezone.

    Args:
        tz_name: Timezone name, or None to use user preference

    Returns:
        ZoneInfo object
    """
    if tz_name is None:
        tz_name = get_user_timezone()
    try:
        return ZoneInfo(tz_name)
    except Exception:
        return ZoneInfo(DEFAULT_TIMEZONE)


def utc_to_local(dt: datetime, tz_name: str = None) -> datetime:
    """
    Convert a UTC datetime to local timezone.

    Args:
        dt: Datetime in UTC (naive or aware)
        tz_name: Target timezone, or None to use user preference

    Returns:
        Datetime in local timezone
    """
    if dt is None:
        return None

    tz = get_timezone_info(tz_name)

    # Assume naive datetime is UTC
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=ZoneInfo("UTC"))

    return dt.astimezone(tz)


def local_to_utc(dt: datetime, tz_name: str = None) -> datetime:
    """
    Convert a local datetime to UTC.

    Args:
        dt: Datetime in local timezone (naive or aware)
        tz_name: Source timezone, or None to use user preference

    Returns:
        Datetime in UTC (naive, for database storage)
    """
    if dt is None:
        return None

    tz = get_timezone_info(tz_name)

    # Assume naive datetime is in local timezone
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=tz)

    # Convert to UTC and strip tzinfo for database storage
    return dt.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)


def format_local_datetime(dt: datetime, fmt: str = "%Y-%m-%d %H:%M", tz_name: str = None) -> str:
    """
    Format a UTC datetime as local time string.

    Args:
        dt: Datetime in UTC
        fmt: strftime format string
        tz_name: Target timezone, or None to use user preference

    Returns:
        Formatted datetime string in local time
    """
    if dt is None:
        return "N/A"

    local_dt = utc_to_local(dt, tz_name)
    return local_dt.strftime(fmt)


def format_local_time(dt: datetime, tz_name: str = None) -> str:
    """
    Format a UTC datetime as local time (HH:MM format).

    Args:
        dt: Datetime in UTC
        tz_name: Target timezone

    Returns:
        Time string in HH:MM format
    """
    return format_local_datetime(dt, "%H:%M", tz_name)


def format_local_date(dt: datetime, tz_name: str = None) -> str:
    """
    Format a UTC datetime as local date (YYYY-MM-DD format).

    Args:
        dt: Datetime in UTC
        tz_name: Target timezone

    Returns:
        Date string
    """
    return format_local_datetime(dt, "%Y-%m-%d", tz_name)


def get_timezone_abbreviation(tz_name: str = None) -> str:
    """
    Get the abbreviation for a timezone (e.g., EST, EDT, PST).

    Args:
        tz_name: Timezone name, or None to use user preference

    Returns:
        Timezone abbreviation
    """
    tz = get_timezone_info(tz_name)
    now = datetime.now(tz)
    return now.strftime("%Z")


# Common US timezone choices for settings dropdown
TIMEZONE_CHOICES = [
    ("America/New_York", "Eastern Time (ET)"),
    ("America/Chicago", "Central Time (CT)"),
    ("America/Denver", "Mountain Time (MT)"),
    ("America/Los_Angeles", "Pacific Time (PT)"),
    ("America/Anchorage", "Alaska Time (AKT)"),
    ("Pacific/Honolulu", "Hawaii Time (HT)"),
    ("UTC", "UTC"),
]
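The subtle part of the module above is its naive-datetime convention: a naive datetime means UTC going into `utc_to_local`, but means local going into `local_to_utc`, and `local_to_utc` strips tzinfo again for storage. A self-contained sketch of those two functions, with the database session lookup removed and the timezone passed explicitly, demonstrates the round trip:

```python
from datetime import datetime
from zoneinfo import ZoneInfo


def utc_to_local(dt, tz):
    """Naive input is treated as UTC (same convention as the module)."""
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=ZoneInfo("UTC"))
    return dt.astimezone(tz)


def local_to_utc(dt, tz):
    """Naive input is treated as local; result is naive UTC for storage."""
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=tz)
    return dt.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)


tz = ZoneInfo("America/New_York")
stored = datetime(2025, 6, 15, 18, 30)      # naive UTC, as stored in the DB
local = utc_to_local(stored, tz)
assert local.hour == 14                     # 18:30 UTC is 14:30 EDT in June
assert local_to_utc(local, tz) == stored    # round-trips back to naive UTC
```

Because the conversion consults the current DST rules, the same stored time formats as EDT in June and EST in January, which is exactly what `get_timezone_abbreviation` surfaces in templates.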
(deleted snapshot metadata file)
@@ -1,10 +0,0 @@
-{
-    "filename": "snapshot_20251216_201738.db",
-    "created_at": "20251216_201738",
-    "created_at_iso": "2025-12-16T20:17:38.638982",
-    "description": "Auto-backup before restore",
-    "size_bytes": 57344,
-    "size_mb": 0.05,
-    "original_db_size_bytes": 57344,
-    "type": "manual"
-}

(deleted snapshot metadata file)
@@ -1,9 +0,0 @@
-{
-    "filename": "snapshot_uploaded_20251216_201732.db",
-    "created_at": "20251216_201732",
-    "created_at_iso": "2025-12-16T20:17:32.574205",
-    "description": "Uploaded: snapshot_20251216_200259.db",
-    "size_bytes": 77824,
-    "size_mb": 0.07,
-    "type": "uploaded"
-}
@@ -1,9 +1,7 @@
 services:

+  # --- TERRA-VIEW PRODUCTION ---
-  terra-view:
+  terra-view-prod:
     build: .
-    container_name: terra-view
     ports:
       - "8001:8001"
     volumes:
@@ -24,34 +22,11 @@ services:
       retries: 3
       start_period: 40s

-  # --- TERRA-VIEW DEVELOPMENT ---
-  # terra-view-dev:
-  #   build: .
-  #   container_name: terra-view-dev
-  #   ports:
-  #     - "1001:8001"
-  #   volumes:
-  #     - ./data-dev:/app/data
-  #   environment:
-  #     - PYTHONUNBUFFERED=1
-  #     - ENVIRONMENT=development
-  #     - SLMM_BASE_URL=http://slmm:8100
-  #   restart: unless-stopped
-  #   depends_on:
-  #     - slmm
-  #   healthcheck:
-  #     test: ["CMD", "curl", "-f", "http://localhost:8001/health"]
-  #     interval: 30s
-  #     timeout: 10s
-  #     retries: 3
-  #     start_period: 40s
-
   # --- SLMM (Sound Level Meter Manager) ---
   slmm:
     build:
       context: ../slmm
       dockerfile: Dockerfile
-    container_name: slmm
     network_mode: host
     volumes:
       - ../slmm/data:/app/data
@@ -59,6 +34,8 @@ services:
       - PYTHONUNBUFFERED=1
       - PORT=8100
       - CORS_ORIGINS=*
+      - TCP_IDLE_TTL=-1
+      - TCP_MAX_AGE=-1
     restart: unless-stopped
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8100/health"]
@@ -69,4 +46,3 @@ services:

 volumes:
   data:
-  data-dev:
migrate_watcher_agents.py  (new file, 37 lines)
@@ -0,0 +1,37 @@
"""
Migration: add watcher_agents table.

Safe to run multiple times (idempotent).
"""

import sqlite3
import os

DB_PATH = os.path.join(os.path.dirname(__file__), "data", "seismo.db")


def migrate():
    con = sqlite3.connect(DB_PATH)
    cur = con.cursor()

    cur.execute("""
        CREATE TABLE IF NOT EXISTS watcher_agents (
            id TEXT PRIMARY KEY,
            source_type TEXT NOT NULL,
            version TEXT,
            last_seen DATETIME,
            status TEXT NOT NULL DEFAULT 'unknown',
            ip_address TEXT,
            log_tail TEXT,
            update_pending INTEGER NOT NULL DEFAULT 0,
            update_version TEXT
        )
    """)

    con.commit()
    con.close()
    print("Migration complete: watcher_agents table ready.")


if __name__ == "__main__":
    migrate()
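The migration's idempotency claim rests entirely on `CREATE TABLE IF NOT EXISTS`. Running the same DDL twice against an in-memory database (schema copied from the script) confirms the second run is a no-op and that the column defaults apply:

```python
import sqlite3

# DDL copied verbatim from migrate_watcher_agents.py.
DDL = """
CREATE TABLE IF NOT EXISTS watcher_agents (
    id TEXT PRIMARY KEY,
    source_type TEXT NOT NULL,
    version TEXT,
    last_seen DATETIME,
    status TEXT NOT NULL DEFAULT 'unknown',
    ip_address TEXT,
    log_tail TEXT,
    update_pending INTEGER NOT NULL DEFAULT 0,
    update_version TEXT
)
"""

con = sqlite3.connect(":memory:")
con.execute(DDL)
con.execute(DDL)  # second run: no error, no schema change

con.execute("INSERT INTO watcher_agents (id, source_type) VALUES ('watcher-1', 'slm')")
row = con.execute(
    "SELECT status, update_pending FROM watcher_agents WHERE id = 'watcher-1'"
).fetchone()
assert row == ("unknown", 0)  # defaults applied to omitted columns
```

This matches how a fresh heartbeat can insert only `id` and `source_type` and still get a well-formed row.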
rebuild-dev.sh  (new executable file, 19 lines)
@@ -0,0 +1,19 @@
#!/bin/bash
# Dev rebuild script — increments build number, rebuilds and restarts terra-view
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
BUILD_FILE="$SCRIPT_DIR/build_number.txt"

# Read and increment build number
BUILD_NUMBER=$(cat "$BUILD_FILE" 2>/dev/null || echo "0")
BUILD_NUMBER=$((BUILD_NUMBER + 1))
echo "$BUILD_NUMBER" > "$BUILD_FILE"

echo "Building terra-view dev (build #$BUILD_NUMBER)..."

cd "$SCRIPT_DIR"
docker compose build --build-arg BUILD_NUMBER="$BUILD_NUMBER" terra-view
docker compose up -d terra-view

echo "Done — terra-view v0.6.1-$BUILD_NUMBER is running on :1001"
rebuild-prod.sh  (new executable file, 12 lines)
@@ -0,0 +1,12 @@
#!/bin/bash
# Production rebuild script — rebuilds and restarts terra-view on :8001
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"

echo "Building terra-view production..."
docker compose -f docker-compose.yml build terra-view
docker compose -f docker-compose.yml up -d terra-view

echo "Done — terra-view production is running on :8001"
@@ -7,3 +7,4 @@ jinja2==3.1.2
 aiofiles==23.2.1
 Pillow==10.1.0
 httpx==0.25.2
+openpyxl==3.1.2
@@ -1,6 +1,23 @@
-unit_id,unit_type,deployed,retired,note,project_id,location
-BE1234,series3,true,false,Primary unit at main site,PROJ-001,San Francisco CA
-BE5678,series3,true,false,Backup sensor,PROJ-001,Los Angeles CA
-BE9012,series3,false,false,In maintenance,PROJ-002,Workshop
-BE3456,series3,true,false,,PROJ-003,New York NY
-BE7890,series3,false,true,Decommissioned 2024,,Storage
+unit_id,device_type,unit_type,deployed,retired,note,project_id,location,address,coordinates,last_calibrated,next_calibration_due,deployed_with_modem_id,ip_address,phone_number,hardware_model,slm_host,slm_tcp_port,slm_ftp_port,slm_model,slm_serial_number,slm_frequency_weighting,slm_time_weighting,slm_measurement_range
+# ============================================
+# SEISMOGRAPHS (device_type=seismograph)
+# ============================================
+BE1234,seismograph,series3,true,false,Primary unit at main site,PROJ-001,San Francisco CA,123 Market St,37.7749;-122.4194,2025-06-15,2026-06-15,MDM001,,,,,,,,,,,
+BE5678,seismograph,series3,true,false,Backup sensor,PROJ-001,Los Angeles CA,456 Sunset Blvd,34.0522;-118.2437,2025-03-01,2026-03-01,MDM002,,,,,,,,,,,
+BE9012,seismograph,series4,false,false,In maintenance - needs calibration,PROJ-002,Workshop,789 Industrial Way,,,,,,,,,,,,,,
+BE3456,seismograph,series3,true,false,,PROJ-003,New York NY,101 Broadway,40.7128;-74.0060,2025-01-10,2026-01-10,,,,,,,,,,,
+BE7890,seismograph,series3,false,true,Decommissioned 2024,,Storage,Warehouse B,,,,,,,,,,,,,,,
+# ============================================
+# MODEMS (device_type=modem)
+# ============================================
+MDM001,modem,,true,false,Cradlepoint at SF site,PROJ-001,San Francisco CA,123 Market St,37.7749;-122.4194,,,,,192.168.1.100,+1-555-0101,IBR900,,,,,,,
+MDM002,modem,,true,false,Sierra Wireless at LA site,PROJ-001,Los Angeles CA,456 Sunset Blvd,34.0522;-118.2437,,,,,10.0.0.50,+1-555-0102,RV55,,,,,,,
+MDM003,modem,,false,false,Spare modem in storage,,,Storage,Warehouse A,,,,,,+1-555-0103,IBR600,,,,,,,
+MDM004,modem,,true,false,NYC backup modem,PROJ-003,New York NY,101 Broadway,40.7128;-74.0060,,,,,172.16.0.25,+1-555-0104,IBR1700,,,,,,,
+# ============================================
+# SOUND LEVEL METERS (device_type=slm)
+# ============================================
+SLM001,slm,,true,false,NL-43 at construction site A,PROJ-004,Downtown Site,500 Main St,40.7589;-73.9851,,,,,,,,192.168.10.101,2255,21,NL-43,12345678,A,F,30-130 dB
+SLM002,slm,,true,false,NL-43 at construction site B,PROJ-004,Midtown Site,600 Park Ave,40.7614;-73.9776,,,MDM004,,,,,192.168.10.102,2255,21,NL-43,12345679,A,S,30-130 dB
+SLM003,slm,,false,false,NL-53 spare unit,,,Storage,Warehouse A,,,,,,,,,,,NL-53,98765432,C,F,25-138 dB
+SLM004,slm,,true,false,NL-43 nighttime monitoring,PROJ-005,Residential Area,200 Quiet Lane,40.7484;-73.9857,,,,,,,,10.0.5.50,2255,21,NL-43,11112222,A,S,30-130 dB
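The banner rows (`# ...`) in the sample roster are not valid CSV records, so any loader has to filter them out before `csv.DictReader` sees them. A hypothetical loader sketch with a trimmed-down header (the real import code is not shown in this diff):

```python
import csv
import io

# Trimmed-down sample in the same shape as the roster above.
sample = """unit_id,device_type,unit_type,deployed,retired
# ============================================
# SEISMOGRAPHS (device_type=seismograph)
# ============================================
BE1234,seismograph,series3,true,false
MDM001,modem,,true,false
"""

# Drop comment banners before handing lines to DictReader.
reader = csv.DictReader(
    line for line in io.StringIO(sample) if not line.startswith("#")
)
rows = list(reader)

assert [r["unit_id"] for r in rows] == ["BE1234", "MDM001"]
assert rows[0]["device_type"] == "seismograph"
assert rows[1]["unit_type"] == ""  # modems leave the seismograph-only columns blank
```

The per-device-type column groups (seismograph, modem, SLM) share one wide header, so empty strings for the columns that don't apply are expected, not errors.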
@@ -8,7 +8,7 @@ import sys
 from sqlalchemy import create_engine, text
 from sqlalchemy.orm import sessionmaker

-DATABASE_URL = "sqlite:///data/sfm.db"
+DATABASE_URL = "sqlite:///data/seismo_fleet.db"

 def rename_unit(old_id: str, new_id: str):
     """
@@ -90,14 +90,14 @@ def rename_unit(old_id: str, new_id: str):
     except Exception:
         pass  # Table may not exist

-    # Update recording_sessions table (if exists)
+    # Update monitoring_sessions table (if exists)
     try:
         result = session.execute(
-            text("UPDATE recording_sessions SET unit_id = :new_id WHERE unit_id = :old_id"),
+            text("UPDATE monitoring_sessions SET unit_id = :new_id WHERE unit_id = :old_id"),
             {"new_id": new_id, "old_id": old_id}
         )
         if result.rowcount > 0:
-            print(f"  ✓ Updated recording_sessions ({result.rowcount} rows)")
+            print(f"  ✓ Updated monitoring_sessions ({result.rowcount} rows)")
     except Exception:
         pass  # Table may not exist
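The try/except-per-table pattern in the hunk above (update every table that might reference the unit, skip the ones that don't exist) can be sketched with plain `sqlite3` instead of SQLAlchemy; the helper name and the `events` table here are illustrative, not from the script:

```python
import sqlite3


def rename_unit_everywhere(con, old_id, new_id, tables):
    """Update unit_id in each listed table, tolerating tables that don't exist."""
    updated = {}
    for table in tables:
        try:
            cur = con.execute(
                f"UPDATE {table} SET unit_id = ? WHERE unit_id = ?",
                (new_id, old_id),
            )
            if cur.rowcount > 0:
                updated[table] = cur.rowcount
        except sqlite3.OperationalError:
            pass  # table may not exist in this database
    return updated


con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE monitoring_sessions (unit_id TEXT)")
con.execute("INSERT INTO monitoring_sessions VALUES ('BE1234')")

# 'events' does not exist; the helper skips it instead of failing.
result = rename_unit_everywhere(con, "BE1234", "BE0001",
                                ["monitoring_sessions", "events"])
assert result == {"monitoring_sessions": 1}
```

Swallowing the "no such table" error is what lets one rename script run against databases created at different schema versions, which is exactly the situation the diff's table rename (recording_sessions to monitoring_sessions) creates.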
templates/admin_watchers.html  (new file, 273 lines; listing truncated)
@@ -0,0 +1,273 @@
{% extends "base.html" %}

{% block title %}Watcher Manager — Admin{% endblock %}

{% block content %}
<div class="mb-6">
  <div class="flex items-center gap-3">
    <h1 class="text-3xl font-bold text-gray-900 dark:text-white">Watcher Manager</h1>
    <span class="px-2 py-0.5 text-xs font-semibold bg-orange-100 text-orange-700 dark:bg-orange-900 dark:text-orange-300 rounded-full">Admin</span>
  </div>
  <p class="text-gray-500 dark:text-gray-400 mt-1 text-sm">
    Monitor and manage field watcher agents. Data updates on each heartbeat received.
  </p>
</div>

<!-- Agent cards -->
<div id="agent-list" class="space-y-4">

  {% if not agents %}
  <div class="bg-white dark:bg-slate-800 rounded-xl shadow p-8 text-center">
    <svg class="w-12 h-12 mx-auto text-gray-300 dark:text-gray-600 mb-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
      <path stroke-linecap="round" stroke-linejoin="round" stroke-width="1.5" d="M9.75 17L9 20l-1 1h8l-1-1-.75-3M3 13h18M5 17h14a2 2 0 002-2V5a2 2 0 00-2-2H5a2 2 0 00-2 2v10a2 2 0 002 2z"/>
    </svg>
    <p class="text-gray-500 dark:text-gray-400">No watcher agents have reported in yet.</p>
    <p class="text-sm text-gray-400 dark:text-gray-500 mt-1">Once a watcher sends its first heartbeat it will appear here.</p>
  </div>
  {% endif %}

  {% for agent in agents %}
  <div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg overflow-hidden" id="agent-{{ agent.id | replace(' ', '-') }}">

    <!-- Card header -->
    <div class="flex items-center justify-between px-6 py-4 border-b border-gray-100 dark:border-slate-700">
      <div class="flex items-center gap-3">
        <!-- Status dot -->
        {% if agent.status == 'ok' %}
        <span class="status-dot inline-block w-3 h-3 rounded-full bg-green-500 flex-shrink-0"></span>
        {% elif agent.status == 'pending' %}
        <span class="status-dot inline-block w-3 h-3 rounded-full bg-yellow-400 flex-shrink-0"></span>
        {% elif agent.status in ('missing', 'error') %}
        <span class="status-dot inline-block w-3 h-3 rounded-full bg-red-500 flex-shrink-0"></span>
        {% else %}
        <span class="status-dot inline-block w-3 h-3 rounded-full bg-gray-400 flex-shrink-0"></span>
        {% endif %}

        <div>
          <h2 class="text-lg font-semibold text-gray-900 dark:text-white">{{ agent.id }}</h2>
          <div class="flex items-center gap-3 text-xs text-gray-500 dark:text-gray-400 mt-0.5">
            <span>{{ agent.source_type }}</span>
            {% if agent.version %}
            <span class="bg-gray-100 dark:bg-slate-700 px-1.5 py-0.5 rounded font-mono">v{{ agent.version }}</span>
            {% endif %}
            {% if agent.ip_address %}
            <span>{{ agent.ip_address }}</span>
            {% endif %}
          </div>
        </div>
      </div>

      <div class="flex items-center gap-3">
        <!-- Status badge -->
        {% if agent.status == 'ok' %}
        <span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-green-100 text-green-700 dark:bg-green-900 dark:text-green-300">OK</span>
        {% elif agent.status == 'pending' %}
        <span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-yellow-100 text-yellow-700 dark:bg-yellow-900 dark:text-yellow-300">Pending</span>
        {% elif agent.status == 'missing' %}
        <span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-red-100 text-red-700 dark:bg-red-900 dark:text-red-300">Missing</span>
        {% elif agent.status == 'error' %}
        <span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-red-100 text-red-700 dark:bg-red-900 dark:text-red-300">Error</span>
        {% else %}
        <span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-gray-100 text-gray-600 dark:bg-slate-700 dark:text-gray-400">Unknown</span>
        {% endif %}

        <!-- Trigger Update button -->
        <button
          onclick="triggerUpdate('{{ agent.id }}')"
          class="px-3 py-1.5 text-xs font-medium bg-seismo-orange hover:bg-orange-600 text-white rounded-lg transition-colors"
          id="btn-update-{{ agent.id | replace(' ', '-') }}"
        >
          Trigger Update
        </button>
      </div>
    </div>

    <!-- Meta row -->
    <div class="px-6 py-3 bg-gray-50 dark:bg-slate-800 border-b border-gray-100 dark:border-slate-700 flex flex-wrap gap-6 text-sm">
      <div>
        <span class="text-gray-500 dark:text-gray-400">Last seen</span>
        <span class="last-seen-value ml-2 font-medium text-gray-800 dark:text-gray-200">
          {% if agent.last_seen %}
          {{ agent.last_seen }}
          {% if agent.age_minutes is not none %}
          <span class="text-gray-400 dark:text-gray-500 font-normal">({{ agent.age_minutes }}m ago)</span>
          {% endif %}
          {% else %}
          Never
          {% endif %}
        </span>
      </div>
      <div class="update-pending-indicator flex items-center gap-1.5 text-yellow-600 dark:text-yellow-400 {% if not agent.update_pending %}hidden{% endif %}">
        <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
          <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"/>
        </svg>
        <span class="text-xs font-semibold">Update pending — will apply on next heartbeat</span>
      </div>
    </div>

    <!-- Log tail -->
    {% if agent.log_tail %}
    <div class="px-6 py-4">
      <div class="flex items-center justify-between mb-2">
        <span class="text-xs font-semibold text-gray-500 dark:text-gray-400 uppercase tracking-wide">Log Tail</span>
        <div class="flex items-center gap-3">
          <button onclick="expandLog('{{ agent.id | replace(' ', '-') }}')" id="expand-{{ agent.id | replace(' ', '-') }}" class="text-xs text-gray-400 hover:text-gray-600 dark:hover:text-gray-200">
            Expand
          </button>
          <button onclick="toggleLog('{{ agent.id | replace(' ', '-') }}')" class="text-xs text-gray-400 hover:text-gray-600 dark:hover:text-gray-200">
            Toggle
          </button>
        </div>
      </div>
      <pre id="log-{{ agent.id | replace(' ', '-') }}" class="text-xs font-mono bg-gray-900 text-green-400 rounded-lg p-3 overflow-x-auto max-h-96 overflow-y-auto leading-relaxed hidden">{{ agent.log_tail | log_tail_display }}</pre>
    </div>
    {% else %}
    <div class="px-6 py-4 text-xs text-gray-400 dark:text-gray-500 italic">No log data received yet.</div>
    {% endif %}

  </div>
  {% endfor %}

</div>

<!-- Auto-refresh every 30s -->
<div class="mt-6 text-xs text-gray-400 dark:text-gray-600 text-center">
  Auto-refreshes every 30 seconds — or <a href="/admin/watchers" class="underline hover:text-gray-600 dark:hover:text-gray-400">refresh now</a>
</div>

<script>
function triggerUpdate(agentId) {
  if (!confirm('Trigger update for ' + agentId + '?\n\nThe watcher will self-update on its next heartbeat cycle.')) return;

  const safeId = agentId.replace(/ /g, '-');
  const btn = document.getElementById('btn-update-' + safeId);
  btn.disabled = true;
  btn.textContent = 'Sending...';

  fetch('/api/admin/watchers/' + encodeURIComponent(agentId) + '/trigger-update', {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({})
  })
  .then(r => r.json())
  .then(data => {
    if (data.ok) {
      btn.textContent = 'Update Queued';
      btn.classList.remove('bg-seismo-orange', 'hover:bg-orange-600');
      btn.classList.add('bg-green-600');
      // Show the pending indicator immediately without a reload
      const card = document.getElementById('agent-' + safeId);
      if (card) {
        const indicator = card.querySelector('.update-pending-indicator');
        if (indicator) indicator.classList.remove('hidden');
      }
    } else {
      btn.textContent = 'Error';
      btn.classList.add('bg-red-600');
      btn.disabled = false;
    }
  })
  .catch(() => {
    btn.textContent = 'Failed';
    btn.classList.add('bg-red-600');
    btn.disabled = false;
  });
}

function toggleLog(agentId) {
  const el = document.getElementById('log-' + agentId);
  if (el) el.classList.toggle('hidden');
}

function expandLog(agentId) {
  const el = document.getElementById('log-' + agentId);
  const btn = document.getElementById('expand-' + agentId);
  if (!el) return;
  el.classList.remove('hidden');
  if (el.classList.contains('max-h-96')) {
    el.classList.remove('max-h-96');
    el.style.maxHeight = 'none';
    if (btn) btn.textContent = 'Collapse';
  } else {
    el.classList.add('max-h-96');
    el.style.maxHeight = '';
    if (btn) btn.textContent = 'Expand';
  }
}

// Status colors for dot and badge by status value
const STATUS_DOT = {
  ok: 'bg-green-500',
  pending: 'bg-yellow-400',
  missing: 'bg-red-500',
  error: 'bg-red-500',
};
const STATUS_BADGE_CLASSES = {
  ok: 'bg-green-100 text-green-700 dark:bg-green-900 dark:text-green-300',
  pending: 'bg-yellow-100 text-yellow-700 dark:bg-yellow-900 dark:text-yellow-300',
  missing: 'bg-red-100 text-red-700 dark:bg-red-900 dark:text-red-300',
  error: 'bg-red-100 text-red-700 dark:bg-red-900 dark:text-red-300',
};
const STATUS_BADGE_DEFAULT = 'bg-gray-100 text-gray-600 dark:bg-slate-700 dark:text-gray-400';
const DOT_COLORS = ['bg-green-500', 'bg-yellow-400', 'bg-red-500', 'bg-gray-400'];
const BADGE_COLORS = [
  'bg-green-100', 'text-green-700', 'dark:bg-green-900', 'dark:text-green-300',
  'bg-yellow-100', 'text-yellow-700', 'dark:bg-yellow-900', 'dark:text-yellow-300',
  'bg-red-100', 'text-red-700', 'dark:bg-red-900', 'dark:text-red-300',
  'bg-gray-100', 'text-gray-600', 'dark:bg-slate-700', 'dark:text-gray-400',
];

function patchAgent(card, agent) {
  // Status dot
  const dot = card.querySelector('.status-dot');
  if (dot) {
    dot.classList.remove(...DOT_COLORS);
    dot.classList.add(STATUS_DOT[agent.status] || 'bg-gray-400');
  }

  // Status badge
|
||||||
|
const badge = card.querySelector('.status-badge');
|
||||||
|
if (badge) {
|
||||||
|
badge.classList.remove(...BADGE_COLORS);
|
||||||
|
const label = agent.status ? agent.status.charAt(0).toUpperCase() + agent.status.slice(1) : 'Unknown';
|
||||||
|
badge.textContent = label === 'Ok' ? 'OK' : label;
|
||||||
|
const cls = STATUS_BADGE_CLASSES[agent.status] || STATUS_BADGE_DEFAULT;
|
||||||
|
badge.classList.add(...cls.split(' '));
|
||||||
|
}
|
||||||
|
|
||||||
|
// Last seen / age
|
||||||
|
const lastSeen = card.querySelector('.last-seen-value');
|
||||||
|
if (lastSeen) {
|
||||||
|
if (agent.last_seen) {
|
||||||
|
const age = agent.age_minutes != null
|
||||||
|
? ` <span class="text-gray-400 dark:text-gray-500 font-normal">(${agent.age_minutes}m ago)</span>`
|
||||||
|
: '';
|
||||||
|
lastSeen.innerHTML = agent.last_seen + age;
|
||||||
|
} else {
|
||||||
|
lastSeen.textContent = 'Never';
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Update pending indicator
|
||||||
|
const indicator = card.querySelector('.update-pending-indicator');
|
||||||
|
if (indicator) {
|
||||||
|
indicator.classList.toggle('hidden', !agent.update_pending);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
function liveRefresh() {
|
||||||
|
fetch('/api/admin/watchers')
|
||||||
|
.then(r => r.json())
|
||||||
|
.then(agents => {
|
||||||
|
agents.forEach(agent => {
|
||||||
|
const safeId = agent.id.replace(/ /g, '-');
|
||||||
|
const card = document.getElementById('agent-' + safeId);
|
||||||
|
if (card) patchAgent(card, agent);
|
||||||
|
});
|
||||||
|
})
|
||||||
|
.catch(() => {}); // silently ignore fetch errors
|
||||||
|
}
|
||||||
|
|
||||||
|
setInterval(liveRefresh, 30000);
|
||||||
|
</script>
|
||||||
|
{% endblock %}
|
||||||
@@ -20,6 +20,9 @@

 <!-- PWA Manifest -->
 <link rel="manifest" href="/static/manifest.json">
+<link rel="icon" type="image/png" sizes="32x32" href="/static/icons/favicon-32.png">
+<link rel="icon" type="image/png" sizes="16x16" href="/static/icons/favicon-16.png">
+<link rel="apple-touch-icon" sizes="180x180" href="/static/icons/icon-192.png">
 <meta name="theme-color" content="#f48b1c">
 <meta name="apple-mobile-web-app-capable" content="yes">
 <meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
@@ -68,7 +71,7 @@

 {% block extra_head %}{% endblock %}
 </head>
-<body class="bg-gray-100 dark:bg-gray-900 text-gray-900 dark:text-gray-100">
+<body class="bg-gray-100 dark:bg-slate-800 text-gray-900 dark:text-gray-100">

 <!-- Offline Indicator -->
 <div id="offlineIndicator" class="offline-indicator">
@@ -82,13 +85,13 @@

 <div class="flex h-screen overflow-hidden">
 <!-- Sidebar (Responsive) -->
-<aside id="sidebar" class="sidebar w-64 bg-white dark:bg-slate-800 shadow-lg flex flex-col">
+<aside id="sidebar" class="sidebar w-64 bg-white dark:bg-slate-800 shadow-lg flex flex-col{% if request.query_params.get('embed') == '1' %} hidden{% endif %}">
 <!-- Logo -->
 <div class="p-6 border-b border-gray-200 dark:border-gray-700">
-<h1 class="text-2xl font-bold text-seismo-navy dark:text-seismo-orange">
-Seismo<br>
-<span class="text-seismo-orange dark:text-seismo-burgundy">Fleet Manager</span>
-</h1>
+<a href="/" class="block">
+<img src="/static/terra-view-logo-light.png" srcset="/static/terra-view-logo-light.png 1x, /static/terra-view-logo-light@2x.png 2x" alt="Terra-View" class="block dark:hidden w-44 h-auto">
+<img src="/static/terra-view-logo-dark.png" srcset="/static/terra-view-logo-dark.png 1x, /static/terra-view-logo-dark@2x.png 2x" alt="Terra-View" class="hidden dark:block w-44 h-auto">
+</a>
 <div class="flex items-center justify-between mt-2">
 <p class="text-xs text-gray-500 dark:text-gray-400">v {{ version }}</p>
 {% if environment == 'development' %}
@@ -127,6 +130,20 @@
 Sound Level Meters
 </a>
+
+<a href="/modems" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/modems' %}bg-gray-100 dark:bg-gray-700{% endif %}">
+<svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
+</svg>
+Modems
+</a>
+
+<a href="/pair-devices" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/pair-devices' %}bg-gray-100 dark:bg-gray-700{% endif %}">
+<svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1"></path>
+</svg>
+Pair Devices
+</a>
 <a href="/projects" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path.startswith('/projects') %}bg-gray-100 dark:bg-gray-700{% endif %}">
 <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
 <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10"></path>
@@ -134,6 +151,13 @@
 Projects
 </a>
+
+<a href="/fleet-calendar" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path.startswith('/fleet-calendar') %}bg-gray-100 dark:bg-gray-700{% endif %}">
+<svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"></path>
+</svg>
+Job Planner
+</a>
 <a href="/settings" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/settings' %}bg-gray-100 dark:bg-gray-700{% endif %}">
 <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
 <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M10.325 4.317c.426-1.756 2.924-1.756 3.35 0a1.724 1.724 0 002.573 1.066c1.543-.94 3.31.826 2.37 2.37a1.724 1.724 0 001.065 2.572c1.756.426 1.756 2.924 0 3.35a1.724 1.724 0 00-1.066 2.573c.94 1.543-.826 3.31-2.37 2.37a1.724 1.724 0 00-2.572 1.065c-.426 1.756-2.924 1.756-3.35 0a1.724 1.724 0 00-2.573-1.066c-1.543.94-3.31-.826-2.37-2.37a1.724 1.724 0 00-1.065-2.572c-1.756-.426-1.756-2.924 0-3.35a1.724 1.724 0 001.066-2.573c-.94-1.543.826-3.31 2.37-2.37.996.608 2.296.07 2.572-1.065z"></path>
@@ -169,14 +193,14 @@

 <!-- Main content -->
 <main class="main-content flex-1 overflow-y-auto">
-<div class="p-8">
+<div class="{% if request.query_params.get('embed') == '1' %}p-4{% else %}p-8{% endif %}">
 {% block content %}{% endblock %}
 </div>
 </main>
 </div>

 <!-- Bottom Navigation (Mobile Only) -->
-<nav class="bottom-nav">
+<nav class="bottom-nav{% if request.query_params.get('embed') == '1' %} hidden{% endif %}">
 <div class="grid grid-cols-4 h-16">
 <button id="hamburgerBtn" class="bottom-nav-btn" onclick="toggleMenu()" aria-label="Menu">
 <svg fill="none" stroke="currentColor" viewBox="0 0 24 24">
@@ -374,10 +398,10 @@
 </script>

 <!-- Offline Database -->
-<script src="/static/offline-db.js?v=0.4.3"></script>
+<script src="/static/offline-db.js?v=0.6.1"></script>

 <!-- Mobile JavaScript -->
-<script src="/static/mobile.js?v=0.4.3"></script>
+<script src="/static/mobile.js?v=0.6.1"></script>

 {% block extra_scripts %}{% endblock %}
 </body>
templates/combined_report_preview.html (new file, 315 lines)
@@ -0,0 +1,315 @@
{% extends "base.html" %}

{% block title %}Combined Report Preview - {{ project.name }}{% endblock %}

{% block content %}
<!-- jspreadsheet CSS -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/jspreadsheet-ce@4/dist/jspreadsheet.min.css" />
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/jsuites@5/dist/jsuites.min.css" />

<div class="min-h-screen bg-gray-100 dark:bg-slate-900">
    <!-- Header -->
    <div class="bg-white dark:bg-slate-800 shadow-sm border-b border-gray-200 dark:border-gray-700">
        <div class="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-4">
            <div class="flex flex-col md:flex-row md:items-center md:justify-between gap-4">
                <div>
                    <h1 class="text-2xl font-bold text-gray-900 dark:text-white">Combined Report Preview & Editor</h1>
                    <p class="text-sm text-gray-500 dark:text-gray-400 mt-1">
                        {{ location_data|length }} location{{ 's' if location_data|length != 1 else '' }}
                        {% if time_filter_desc %} | {{ time_filter_desc }}{% endif %}
                        | {{ total_rows }} total row{{ 's' if total_rows != 1 else '' }}
                    </p>
                </div>
                <div class="flex items-center gap-3">
                    <button onclick="downloadCombinedReport()" id="download-btn"
                            class="px-4 py-2 bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 transition-colors flex items-center gap-2 text-sm font-medium">
                        <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-4l-4 4m0 0l-4-4m4 4V4"></path>
                        </svg>
                        Generate Reports (ZIP)
                    </button>
                    <a href="/api/projects/{{ project_id }}/combined-report-wizard"
                       class="px-4 py-2 bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-lg hover:bg-gray-300 dark:hover:bg-gray-600 transition-colors text-sm">
                        ← Back to Config
                    </a>
                </div>
            </div>
        </div>
    </div>

    <div class="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-4 space-y-4">

        <!-- Report Metadata -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-4">
            <div class="grid grid-cols-1 md:grid-cols-3 gap-4">
                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Report Title</label>
                    <input type="text" id="edit-report-title" value="{{ report_title }}"
                           class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
                </div>
                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Project Name</label>
                    <input type="text" id="edit-project-name" value="{{ project_name }}"
                           class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
                </div>
                <div>
                    <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Client Name</label>
                    <input type="text" id="edit-client-name" value="{{ client_name }}"
                           class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
                </div>
            </div>
        </div>

        <!-- Location Tabs + Spreadsheet -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700">

            <!-- Tab Bar -->
            <div class="border-b border-gray-200 dark:border-gray-700 overflow-x-auto">
                <div class="flex min-w-max" id="tab-bar">
                    {% for loc in location_data %}
                    <button onclick="switchTab({{ loop.index0 }})"
                            id="tab-btn-{{ loop.index0 }}"
                            class="tab-btn px-4 py-3 text-sm font-medium whitespace-nowrap border-b-2 transition-colors
                            {% if loop.first %}border-emerald-500 text-emerald-600 dark:text-emerald-400
                            {% else %}border-transparent text-gray-500 dark:text-gray-400 hover:text-gray-700 dark:hover:text-gray-300 hover:border-gray-300{% endif %}">
                        {{ loc.location_name }}
                        <span class="ml-1.5 text-xs px-1.5 py-0.5 rounded-full
                            {% if loop.first %}bg-emerald-100 text-emerald-700 dark:bg-emerald-900/40 dark:text-emerald-400
                            {% else %}bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400{% endif %}"
                            id="tab-count-{{ loop.index0 }}">
                            {{ loc.filtered_count }}
                        </span>
                    </button>
                    {% endfor %}
                </div>
            </div>

            <!-- Spreadsheet Panels -->
            <div class="p-4">
                <div class="flex items-center justify-between mb-3">
                    <h3 class="text-base font-semibold text-gray-900 dark:text-white" id="active-tab-title">
                        {{ location_data[0].location_name if location_data else '' }}
                    </h3>
                    <div class="flex items-center gap-2 text-sm text-gray-500 dark:text-gray-400">
                        <span>Right-click for options</span>
                        <span class="text-gray-300 dark:text-gray-600">|</span>
                        <span>Double-click to edit</span>
                    </div>
                </div>

                {% for loc in location_data %}
                <div id="panel-{{ loop.index0 }}" class="tab-panel {% if not loop.first %}hidden{% endif %} overflow-x-auto">
                    <div id="spreadsheet-{{ loop.index0 }}"></div>
                </div>
                {% endfor %}
            </div>
        </div>

        <!-- Help -->
        <div class="bg-blue-50 dark:bg-blue-900/20 rounded-lg p-4">
            <h3 class="text-sm font-medium text-blue-800 dark:text-blue-300 mb-2">Editing Tips</h3>
            <ul class="text-sm text-blue-700 dark:text-blue-400 list-disc list-inside space-y-1">
                <li>Double-click any cell to edit its value</li>
                <li>Use the Comments column to add notes about specific measurements</li>
                <li>Right-click a row to insert or delete rows</li>
                <li>Press Enter to confirm edits, Escape to cancel</li>
                <li>Switch between location tabs to edit each location's data independently</li>
            </ul>
        </div>

    </div>
</div>

<!-- jspreadsheet JS -->
<script src="https://cdn.jsdelivr.net/npm/jsuites@5/dist/jsuites.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/jspreadsheet-ce@4/dist/index.min.js"></script>

<script>
const allLocationData = {{ locations_json | safe }};
const spreadsheets = {};
let activeTabIdx = 0;

const columnDef = [
    { title: 'Test #', width: 80, type: 'numeric' },
    { title: 'Date', width: 110, type: 'text' },
    { title: 'Time', width: 90, type: 'text' },
    { title: 'LAmax (dBA)', width: 110, type: 'numeric' },
    { title: 'LA01 (dBA)', width: 110, type: 'numeric' },
    { title: 'LA10 (dBA)', width: 110, type: 'numeric' },
    { title: 'Comments', width: 250, type: 'text' },
];

const jssOptions = {
    columns: columnDef,
    allowInsertRow: true,
    allowDeleteRow: true,
    allowInsertColumn: false,
    allowDeleteColumn: false,
    rowDrag: true,
    columnSorting: true,
    search: true,
    pagination: 50,
    paginationOptions: [25, 50, 100, 200],
    defaultColWidth: 100,
    minDimensions: [7, 1],
    tableOverflow: true,
    tableWidth: '100%',
    contextMenu: function(instance, col, row, e) {
        const items = [];
        if (row !== null) {
            items.push({
                title: 'Insert row above',
                onclick: function() { instance.insertRow(1, row, true); }
            });
            items.push({
                title: 'Insert row below',
                onclick: function() { instance.insertRow(1, row + 1, false); }
            });
            items.push({
                title: 'Delete this row',
                onclick: function() { instance.deleteRow(row); }
            });
        }
        return items;
    },
    style: {
        A: 'text-align: center;',
        B: 'text-align: center;',
        C: 'text-align: center;',
        D: 'text-align: right;',
        E: 'text-align: right;',
        F: 'text-align: right;',
    }
};

document.addEventListener('DOMContentLoaded', function() {
    allLocationData.forEach(function(loc, idx) {
        const el = document.getElementById('spreadsheet-' + idx);
        if (!el) return;
        const opts = Object.assign({}, jssOptions, { data: loc.spreadsheet_data });
        spreadsheets[idx] = jspreadsheet(el, opts);
    });
    if (allLocationData.length > 0) {
        switchTab(0);
    }
});

function switchTab(idx) {
    activeTabIdx = idx;

    // Update panels
    document.querySelectorAll('.tab-panel').forEach(function(panel, i) {
        panel.classList.toggle('hidden', i !== idx);
    });

    // Update tab button styles
    document.querySelectorAll('.tab-btn').forEach(function(btn, i) {
        const countBadge = document.getElementById('tab-count-' + i);
        if (i === idx) {
            btn.classList.add('border-emerald-500', 'text-emerald-600', 'dark:text-emerald-400');
            btn.classList.remove('border-transparent', 'text-gray-500', 'dark:text-gray-400');
            if (countBadge) {
                countBadge.classList.add('bg-emerald-100', 'text-emerald-700', 'dark:bg-emerald-900/40', 'dark:text-emerald-400');
                countBadge.classList.remove('bg-gray-100', 'text-gray-500', 'dark:bg-gray-700', 'dark:text-gray-400');
            }
        } else {
            btn.classList.remove('border-emerald-500', 'text-emerald-600', 'dark:text-emerald-400');
            btn.classList.add('border-transparent', 'text-gray-500', 'dark:text-gray-400');
            if (countBadge) {
                countBadge.classList.remove('bg-emerald-100', 'text-emerald-700', 'dark:bg-emerald-900/40', 'dark:text-emerald-400');
                countBadge.classList.add('bg-gray-100', 'text-gray-500', 'dark:bg-gray-700', 'dark:text-gray-400');
            }
        }
    });

    // Update title
    if (allLocationData[idx]) {
        document.getElementById('active-tab-title').textContent = allLocationData[idx].location_name;
    }

    // Refresh jspreadsheet rendering after showing panel
    if (spreadsheets[idx]) {
        try { spreadsheets[idx].updateTable(); } catch(e) {}
    }
}

async function downloadCombinedReport() {
    const btn = document.getElementById('download-btn');
    const originalText = btn.innerHTML;
    btn.disabled = true;
    btn.innerHTML = '<svg class="w-5 h-5 animate-spin" fill="none" viewBox="0 0 24 24"><circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle><path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4z"></path></svg> Generating ZIP...';

    try {
        const locations = allLocationData.map(function(loc, idx) {
            return {
                session_id: loc.session_id || '',
                session_label: loc.session_label || '',
                period_type: loc.period_type || '',
                started_at: loc.started_at || '',
                location_name: loc.location_name,
                spreadsheet_data: spreadsheets[idx] ? spreadsheets[idx].getData() : loc.spreadsheet_data,
            };
        });

        const payload = {
            report_title: document.getElementById('edit-report-title').value || 'Background Noise Study',
            project_name: document.getElementById('edit-project-name').value || '',
            client_name: document.getElementById('edit-client-name').value || '',
            locations: locations,
        };

        const response = await fetch('/api/projects/{{ project_id }}/generate-combined-from-preview', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(payload),
        });

        if (response.ok) {
            const blob = await response.blob();
            const url = window.URL.createObjectURL(blob);
            const a = document.createElement('a');
            a.href = url;

            const contentDisposition = response.headers.get('Content-Disposition');
            let filename = 'combined_reports.zip';
            if (contentDisposition) {
                const match = contentDisposition.match(/filename="(.+)"/);
                if (match) filename = match[1];
            }

            a.download = filename;
            document.body.appendChild(a);
            a.click();
            window.URL.revokeObjectURL(url);
            a.remove();
        } else {
            const error = await response.json();
            alert('Error generating report: ' + (error.detail || 'Unknown error'));
        }
    } catch (error) {
        alert('Error generating report: ' + error.message);
    } finally {
        btn.disabled = false;
        btn.innerHTML = originalText;
    }
}
</script>

<style>
/* Dark mode jspreadsheet styles */
.dark .jexcel { background-color: #1e293b; color: #e2e8f0; }
.dark .jexcel thead td { background-color: #334155 !important; color: #e2e8f0 !important; border-color: #475569 !important; }
.dark .jexcel tbody td { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel tbody td:hover { background-color: #334155; }
.dark .jexcel tbody tr:nth-child(even) td { background-color: #0f172a; }
.dark .jexcel_pagination { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_pagination a { color: #e2e8f0; }
.dark .jexcel_search { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_search input { background-color: #334155; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_content { background-color: #1e293b; }
.dark .jexcel_contextmenu { background-color: #1e293b; border-color: #475569; }
.dark .jexcel_contextmenu a { color: #e2e8f0; }
.dark .jexcel_contextmenu a:hover { background-color: #334155; }
.jexcel_content { max-height: 600px; overflow: auto; }
</style>
{% endblock %}
templates/combined_report_wizard.html (new file, 393 lines)
@@ -0,0 +1,393 @@
{% extends "base.html" %}

{% block title %}Combined Report Wizard - {{ project.name }}{% endblock %}

{% block content %}
<div class="min-h-screen bg-gray-100 dark:bg-slate-900">
    <!-- Header -->
    <div class="bg-white dark:bg-slate-800 shadow-sm border-b border-gray-200 dark:border-gray-700">
        <div class="max-w-4xl mx-auto px-4 sm:px-6 lg:px-8 py-4">
            <div class="flex flex-col md:flex-row md:items-center md:justify-between gap-4">
                <div>
                    <h1 class="text-2xl font-bold text-gray-900 dark:text-white">Combined Report Wizard</h1>
                    <p class="text-sm text-gray-500 dark:text-gray-400 mt-1">{{ project.name }}</p>
                </div>
                <a href="/projects/{{ project_id }}"
                   class="px-4 py-2 bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-lg hover:bg-gray-300 dark:hover:bg-gray-600 transition-colors text-sm w-fit">
                    ← Back to Project
                </a>
            </div>
        </div>
    </div>

    <div class="max-w-4xl mx-auto px-4 sm:px-6 lg:px-8 py-6 space-y-6">

        <!-- Report Settings Card -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-6">
            <h2 class="text-lg font-semibold text-gray-900 dark:text-white mb-4">Report Settings</h2>

            <!-- Template Selection -->
            <div class="flex items-end gap-2 mb-4">
                <div class="flex-1">
                    <label for="template-select" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                        Load Template
                    </label>
                    <select id="template-select" onchange="applyTemplate()"
                            class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
                        <option value="">-- Select a template --</option>
                    </select>
                </div>
                <button type="button" onclick="saveAsTemplate()"
                        class="px-3 py-2 text-sm bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-md hover:bg-gray-300 dark:hover:bg-gray-600"
                        title="Save current settings as template">
                    <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7H5a2 2 0 00-2 2v9a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-3m-1 4l-3 3m0 0l-3-3m3 3V4"></path>
                    </svg>
                </button>
            </div>

            <!-- Report Title -->
            <div class="mb-4">
                <label for="report-title" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                    Report Title
                </label>
                <input type="text" id="report-title" value="Background Noise Study"
                       class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
            </div>

            <!-- Project and Client -->
            <div class="grid grid-cols-1 sm:grid-cols-2 gap-4">
                <div>
                    <label for="report-project" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                        Project Name
                    </label>
                    <input type="text" id="report-project" value="{{ project.name }}"
                           class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
                </div>
                <div>
                    <label for="report-client" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
                        Client Name
                    </label>
                    <input type="text" id="report-client" value="{{ project.client_name if project.client_name else '' }}"
                           class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
                </div>
            </div>
        </div>

        <!-- Sessions Card -->
        <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-6 overflow-hidden">
            <div class="flex items-center justify-between mb-1">
                <h2 class="text-lg font-semibold text-gray-900 dark:text-white">Monitoring Sessions</h2>
                <div class="flex gap-3 text-sm">
                    <button type="button" onclick="selectAllSessions()" class="text-emerald-600 dark:text-emerald-400 hover:underline">Select All</button>
                    <button type="button" onclick="deselectAllSessions()" class="text-gray-500 dark:text-gray-400 hover:underline">Deselect All</button>
                </div>
            </div>
            <p class="text-sm text-gray-500 dark:text-gray-400 mb-4">
|
||||||
|
<span id="selected-count">0</span> session(s) selected — each selected session becomes one sheet in the ZIP.
|
||||||
|
Change the period type per session to control how stats are bucketed (Day vs Night).
|
||||||
|
</p>
|
||||||
|
|
||||||
|
{% if locations %}
|
||||||
|
{% for loc in locations %}
|
||||||
|
{% set loc_name = loc.name %}
|
||||||
|
{% set sessions = loc.sessions %}
|
||||||
|
<div class="border border-gray-200 dark:border-gray-700 rounded-lg mb-3 overflow-hidden">
|
||||||
|
<!-- Location header / toggle -->
|
||||||
|
<button type="button"
|
||||||
|
onclick="toggleLocation('loc-{{ loop.index }}')"
|
||||||
|
class="w-full flex items-center justify-between px-4 py-3 bg-gray-50 dark:bg-slate-700/50 hover:bg-gray-100 dark:hover:bg-slate-700 transition-colors text-left">
|
||||||
|
<div class="flex items-center gap-3">
|
||||||
|
<svg id="chevron-loc-{{ loop.index }}" class="w-4 h-4 text-gray-400 transition-transform" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||||
|
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
|
||||||
|
</svg>
|
||||||
|
<span class="font-medium text-gray-900 dark:text-white text-sm">{{ loc_name }}</span>
|
||||||
|
<span class="text-xs text-gray-400 dark:text-gray-500">{{ sessions|length }} session{{ 's' if sessions|length != 1 else '' }}</span>
|
||||||
|
</div>
|
||||||
|
<div class="flex items-center gap-3 text-xs" onclick="event.stopPropagation()">
|
||||||
|
<button type="button" onclick="selectLocation('loc-{{ loop.index }}')"
|
||||||
|
class="text-emerald-600 dark:text-emerald-400 hover:underline">All</button>
|
||||||
|
<button type="button" onclick="deselectLocation('loc-{{ loop.index }}')"
|
||||||
|
class="text-gray-400 hover:underline">None</button>
|
||||||
|
</div>
|
||||||
|
</button>
|
||||||
|
|
||||||
|
<!-- Session rows -->
|
||||||
|
<div id="loc-{{ loop.index }}" class="divide-y divide-gray-100 dark:divide-gray-700/50">
|
||||||
|
{% for s in sessions %}
|
||||||
|
{% set pt_colors = {
|
||||||
|
'weekday_day': 'bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300',
|
||||||
|
'weekday_night': 'bg-indigo-100 text-indigo-800 dark:bg-indigo-900/30 dark:text-indigo-300',
|
||||||
|
'weekend_day': 'bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300',
|
||||||
|
'weekend_night': 'bg-purple-100 text-purple-800 dark:bg-purple-900/30 dark:text-purple-300',
|
||||||
|
} %}
|
||||||
|
{% set pt_labels = {
|
||||||
|
'weekday_day': 'Weekday Day',
|
||||||
|
'weekday_night': 'Weekday Night',
|
||||||
|
'weekend_day': 'Weekend Day',
|
||||||
|
'weekend_night': 'Weekend Night',
|
||||||
|
} %}
|
||||||
|
<div class="flex items-center gap-3 px-4 py-3 hover:bg-gray-50 dark:hover:bg-slate-700/30 transition-colors">
|
||||||
|
<!-- Checkbox -->
|
||||||
|
<input type="checkbox"
|
||||||
|
class="session-cb loc-{{ loop.index }}-cb h-4 w-4 text-emerald-600 border-gray-300 dark:border-gray-600 rounded focus:ring-emerald-500 shrink-0"
|
||||||
|
value="{{ s.session_id }}"
|
||||||
|
checked
|
||||||
|
onchange="updateSelectionStats()">
|
||||||
|
|
||||||
|
<!-- Date/day info -->
|
||||||
|
<div class="min-w-0 flex-1">
|
||||||
|
<div class="flex flex-wrap items-center gap-2">
|
||||||
|
<span class="text-sm font-medium text-gray-900 dark:text-white">
|
||||||
|
{{ s.day_of_week }} {{ s.date_display }}
|
||||||
|
</span>
|
||||||
|
{% if s.session_label %}
|
||||||
|
<span class="text-xs text-gray-400 dark:text-gray-500 truncate">{{ s.session_label }}</span>
|
||||||
|
{% endif %}
|
||||||
|
{% if s.status == 'recording' %}
|
||||||
|
<span class="px-1.5 py-0.5 text-xs bg-red-100 text-red-700 dark:bg-red-900/30 dark:text-red-300 rounded-full flex items-center gap-1">
|
||||||
|
<span class="w-1.5 h-1.5 bg-red-500 rounded-full animate-pulse"></span>Recording
|
||||||
|
</span>
|
||||||
|
{% endif %}
|
||||||
|
</div>
|
||||||
|
<div class="flex items-center gap-3 mt-0.5 text-xs text-gray-400 dark:text-gray-500">
|
||||||
|
{% if s.started_at %}
|
||||||
|
<span>{{ s.started_at }}</span>
|
||||||
|
{% endif %}
|
||||||
|
{% if s.duration_h is not none %}
|
||||||
|
<span>{{ s.duration_h }}h {{ s.duration_m }}m</span>
|
||||||
|
{% endif %}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Period type dropdown -->
|
||||||
|
<div class="relative shrink-0" id="wiz-period-wrap-{{ s.session_id }}">
|
||||||
|
<button type="button"
|
||||||
|
onclick="toggleWizPeriodMenu('{{ s.session_id }}')"
|
||||||
|
id="wiz-period-badge-{{ s.session_id }}"
|
||||||
|
class="px-2 py-0.5 text-xs font-medium rounded-full flex items-center gap-1 transition-colors {{ pt_colors.get(s.period_type, 'bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400') }}"
|
||||||
|
title="Click to change period type">
|
||||||
|
<span id="wiz-period-label-{{ s.session_id }}">{{ pt_labels.get(s.period_type, 'Set period') }}</span>
|
||||||
|
<svg class="w-3 h-3 opacity-60 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||||
|
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
|
||||||
|
</svg>
|
||||||
|
</button>
|
||||||
|
<div id="wiz-period-menu-{{ s.session_id }}"
|
||||||
|
class="hidden absolute right-0 top-full mt-1 z-20 bg-white dark:bg-slate-700 border border-gray-200 dark:border-gray-600 rounded-lg shadow-lg min-w-[160px] py-1">
|
||||||
|
{% for pt, pt_label in [('weekday_day','Weekday Day'),('weekday_night','Weekday Night'),('weekend_day','Weekend Day'),('weekend_night','Weekend Night')] %}
|
||||||
|
<button type="button"
|
||||||
|
onclick="setWizPeriodType('{{ s.session_id }}', '{{ pt }}')"
|
||||||
|
class="w-full text-left px-3 py-1.5 text-xs hover:bg-gray-100 dark:hover:bg-slate-600 text-gray-700 dark:text-gray-300 {% if s.period_type == pt %}font-bold{% endif %}">
|
||||||
|
{{ pt_label }}
|
||||||
|
</button>
|
||||||
|
{% endfor %}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
{% endfor %}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
{% endfor %}
|
||||||
|
{% else %}
|
||||||
|
<div class="text-center py-10 text-gray-500 dark:text-gray-400">
|
||||||
|
<svg class="w-12 h-12 mx-auto mb-3 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||||
|
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19V6l12-3v13M9 19c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zm12-3c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zM9 10l12-3"></path>
|
||||||
|
</svg>
|
||||||
|
<p>No monitoring sessions found.</p>
|
||||||
|
<p class="text-sm mt-1">Upload data files to create sessions first.</p>
|
||||||
|
</div>
|
||||||
|
{% endif %}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Footer Buttons -->
|
||||||
|
<div class="flex flex-col sm:flex-row items-center justify-between gap-3 pb-6">
|
||||||
|
<a href="/projects/{{ project_id }}"
|
||||||
|
class="w-full sm:w-auto px-6 py-2.5 border border-gray-300 dark:border-gray-600 text-gray-700 dark:text-gray-300 bg-white dark:bg-gray-700 rounded-lg hover:bg-gray-50 dark:hover:bg-gray-600 transition-colors text-center text-sm font-medium">
|
||||||
|
Cancel
|
||||||
|
</a>
|
||||||
|
<button type="button" onclick="gotoPreview()" id="preview-btn"
|
||||||
|
{% if not locations %}disabled{% endif %}
|
||||||
|
class="w-full sm:w-auto px-6 py-2.5 bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 transition-colors text-sm font-medium flex items-center justify-center gap-2 disabled:opacity-50 disabled:cursor-not-allowed">
|
||||||
|
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||||
|
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15 12a3 3 0 11-6 0 3 3 0 016 0z"></path>
|
||||||
|
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M2.458 12C3.732 7.943 7.523 5 12 5c4.478 0 8.268 2.943 9.542 7-1.274 4.057-5.064 7-9.542 7-4.477 0-8.268-2.943-9.542-7z"></path>
|
||||||
|
</svg>
|
||||||
|
Preview & Edit →
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<script>
const PROJECT_ID = '{{ project_id }}';

const PERIOD_COLORS = {
    weekday_day: 'bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300',
    weekday_night: 'bg-indigo-100 text-indigo-800 dark:bg-indigo-900/30 dark:text-indigo-300',
    weekend_day: 'bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300',
    weekend_night: 'bg-purple-100 text-purple-800 dark:bg-purple-900/30 dark:text-purple-300',
};
const PERIOD_LABELS = {
    weekday_day: 'Weekday Day',
    weekday_night: 'Weekday Night',
    weekend_day: 'Weekend Day',
    weekend_night: 'Weekend Night',
};
const ALL_PERIOD_BADGE_CLASSES = [
    'bg-gray-100','text-gray-500','dark:bg-gray-700','dark:text-gray-400',
    ...new Set(Object.values(PERIOD_COLORS).flatMap(s => s.split(' ')))
];

// ── Location accordion ────────────────────────────────────────────

function toggleLocation(locId) {
    const body = document.getElementById(locId);
    const chevron = document.getElementById('chevron-' + locId);
    body.classList.toggle('hidden');
    chevron.style.transform = body.classList.contains('hidden') ? 'rotate(-90deg)' : '';
}

function selectLocation(locId) {
    document.querySelectorAll('.' + locId + '-cb').forEach(cb => cb.checked = true);
    updateSelectionStats();
}

function deselectLocation(locId) {
    document.querySelectorAll('.' + locId + '-cb').forEach(cb => cb.checked = false);
    updateSelectionStats();
}

function selectAllSessions() {
    document.querySelectorAll('.session-cb').forEach(cb => cb.checked = true);
    updateSelectionStats();
}

function deselectAllSessions() {
    document.querySelectorAll('.session-cb').forEach(cb => cb.checked = false);
    updateSelectionStats();
}

function updateSelectionStats() {
    const count = document.querySelectorAll('.session-cb:checked').length;
    document.getElementById('selected-count').textContent = count;
    document.getElementById('preview-btn').disabled = count === 0;
}

// ── Period type dropdown (wizard) ─────────────────────────────────

function toggleWizPeriodMenu(sessionId) {
    const menu = document.getElementById('wiz-period-menu-' + sessionId);
    document.querySelectorAll('[id^="wiz-period-menu-"]').forEach(m => {
        if (m.id !== 'wiz-period-menu-' + sessionId) m.classList.add('hidden');
    });
    menu.classList.toggle('hidden');
}

document.addEventListener('click', function(e) {
    if (!e.target.closest('[id^="wiz-period-wrap-"]')) {
        document.querySelectorAll('[id^="wiz-period-menu-"]').forEach(m => m.classList.add('hidden'));
    }
});

async function setWizPeriodType(sessionId, periodType) {
    document.getElementById('wiz-period-menu-' + sessionId).classList.add('hidden');
    const badge = document.getElementById('wiz-period-badge-' + sessionId);
    badge.disabled = true;
    try {
        const resp = await fetch(`/api/projects/${PROJECT_ID}/sessions/${sessionId}`, {
            method: 'PATCH',
            headers: {'Content-Type': 'application/json'},
            body: JSON.stringify({period_type: periodType}),
        });
        if (!resp.ok) throw new Error(await resp.text());
        ALL_PERIOD_BADGE_CLASSES.forEach(c => badge.classList.remove(c));
        const colorStr = PERIOD_COLORS[periodType] || 'bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400';
        badge.classList.add(...colorStr.split(' ').filter(Boolean));
        document.getElementById('wiz-period-label-' + sessionId).textContent = PERIOD_LABELS[periodType] || periodType;
    } catch(err) {
        alert('Failed to update period type: ' + err.message);
    } finally {
        badge.disabled = false;
    }
}

// ── Template management ───────────────────────────────────────────

let reportTemplates = [];

async function loadTemplates() {
    try {
        const resp = await fetch('/api/report-templates?project_id=' + PROJECT_ID);
        if (resp.ok) {
            reportTemplates = await resp.json();
            populateTemplateDropdown();
        }
    } catch(e) { console.error('Error loading templates:', e); }
}

function populateTemplateDropdown() {
    const select = document.getElementById('template-select');
    if (!select) return;
    select.innerHTML = '<option value="">-- Select a template --</option>';
    reportTemplates.forEach(t => {
        const opt = document.createElement('option');
        opt.value = t.id;
        opt.textContent = t.name;
        opt.dataset.config = JSON.stringify(t);
        select.appendChild(opt);
    });
}

function applyTemplate() {
    const select = document.getElementById('template-select');
    const opt = select.options[select.selectedIndex];
    if (!opt.value) return;
    const t = JSON.parse(opt.dataset.config);
    if (t.report_title) document.getElementById('report-title').value = t.report_title;
}

async function saveAsTemplate() {
    const name = prompt('Enter a name for this template:');
    if (!name) return;
    const data = {
        name,
        project_id: PROJECT_ID,
        report_title: document.getElementById('report-title').value || 'Background Noise Study',
    };
    try {
        const resp = await fetch('/api/report-templates', {
            method: 'POST',
            headers: {'Content-Type': 'application/json'},
            body: JSON.stringify(data),
        });
        if (resp.ok) { alert('Template saved!'); loadTemplates(); }
        else alert('Failed to save template');
    } catch(e) { alert('Error: ' + e.message); }
}

// ── Navigate to preview ───────────────────────────────────────────

function gotoPreview() {
    const checked = Array.from(document.querySelectorAll('.session-cb:checked')).map(cb => cb.value);
    if (checked.length === 0) {
        alert('Please select at least one session.');
        return;
    }
    const params = new URLSearchParams({
        report_title: document.getElementById('report-title').value || 'Background Noise Study',
        project_name: document.getElementById('report-project').value || '',
        client_name: document.getElementById('report-client').value || '',
        selected_sessions: checked.join(','),
    });
    window.location.href = `/api/projects/${PROJECT_ID}/combined-report-preview?${params.toString()}`;
}

// ── Init ─────────────────────────────────────────────────────────

document.addEventListener('DOMContentLoaded', function() {
    updateSelectionStats();
    loadTemplates();
});
</script>
{% endblock %}
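The badge update in `setWizPeriodType` relies on `ALL_PERIOD_BADGE_CLASSES` containing every class any period badge can carry, so stale colors are always removed before the new ones are added. The derivation can be exercised standalone (values copied from the script above; `nextBadgeClasses` is an illustrative helper, not part of the template):

```javascript
// Tailwind class strings per period type (copied from the template script).
const PERIOD_COLORS = {
  weekday_day: 'bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300',
  weekday_night: 'bg-indigo-100 text-indigo-800 dark:bg-indigo-900/30 dark:text-indigo-300',
  weekend_day: 'bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300',
  weekend_night: 'bg-purple-100 text-purple-800 dark:bg-purple-900/30 dark:text-purple-300',
};

// Neutral fallback classes plus every color class, deduplicated via Set.
const ALL_PERIOD_BADGE_CLASSES = [
  'bg-gray-100', 'text-gray-500', 'dark:bg-gray-700', 'dark:text-gray-400',
  ...new Set(Object.values(PERIOD_COLORS).flatMap(s => s.split(' '))),
];

// Hypothetical pure version of the class swap: strip every known badge
// class, keep unrelated layout classes, then add the new period's set.
function nextBadgeClasses(current, periodType) {
  const keep = current.filter(c => !ALL_PERIOD_BADGE_CLASSES.includes(c));
  const fallback = 'bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400';
  const next = (PERIOD_COLORS[periodType] || fallback).split(' ');
  return [...keep, ...next];
}

// e.g. nextBadgeClasses(['px-2', 'bg-blue-100'], 'weekend_day')
```

Because every color class appears in the combined list, switching `weekday_day` to `weekend_day` cannot leave a blue class behind on the badge.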
@@ -27,10 +27,10 @@
      hx-swap="none"
      hx-on::after-request="updateDashboard(event)">
 
-    <div class="grid grid-cols-1 md:grid-cols-3 gap-6 mb-8">
+    <div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6 mb-8">
 
         <!-- Fleet Summary Card -->
-        <div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="fleet-summary-card">
+        <div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="fleet-summary-card">
             <div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-summary')">
                 <h2 class="text-lg font-semibold text-gray-900 dark:text-white">Fleet Summary</h2>
                 <div class="flex items-center gap-2">
@@ -57,6 +57,10 @@
                     <span class="text-gray-600 dark:text-gray-400">Benched</span>
                     <span id="benched-units" class="text-3xl md:text-2xl font-bold text-gray-600 dark:text-gray-400">--</span>
                 </div>
+                <div class="flex justify-between items-center">
+                    <span class="text-orange-600 dark:text-orange-400">Allocated</span>
+                    <span id="allocated-units" class="text-3xl md:text-2xl font-bold text-orange-500 dark:text-orange-400">--</span>
+                </div>
                 <div class="border-t border-gray-200 dark:border-gray-700 pt-3 mt-3">
                     <p class="text-xs text-gray-500 dark:text-gray-500 mb-2 italic">By Device Type:</p>
                     <div class="flex justify-between items-center mb-1">
@@ -118,7 +122,7 @@
         </div>
 
         <!-- Recent Alerts Card -->
-        <div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="recent-alerts-card">
+        <div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="recent-alerts-card">
             <div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-alerts')">
                 <h2 class="text-lg font-semibold text-gray-900 dark:text-white">Recent Alerts</h2>
                 <div class="flex items-center gap-2">
@@ -138,7 +142,7 @@
         </div>
 
         <!-- Recently Called In Units Card -->
-        <div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="recent-callins-card">
+        <div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="recent-callins-card">
             <div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-callins')">
                 <h2 class="text-lg font-semibold text-gray-900 dark:text-white">Recent Call-Ins</h2>
                 <div class="flex items-center gap-2">
@@ -162,10 +166,95 @@
             </div>
         </div>
 
+        <!-- Today's Scheduled Actions Card -->
+        <div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="todays-actions-card">
+            <div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('todays-actions')">
+                <h2 class="text-lg font-semibold text-gray-900 dark:text-white">Today's Schedule</h2>
+                <div class="flex items-center gap-2">
+                    <svg class="w-6 h-6 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
+                            d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z">
+                        </path>
+                    </svg>
+                    <svg class="w-5 h-5 text-gray-500 transition-transform md:hidden chevron" id="todays-actions-chevron" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
+                    </svg>
+                </div>
+            </div>
+            <div class="card-content" id="todays-actions-content"
+                hx-get="/dashboard/todays-actions"
+                hx-trigger="load, every 30s"
+                hx-swap="innerHTML">
+                <p class="text-sm text-gray-500 dark:text-gray-400">Loading scheduled actions...</p>
+            </div>
+        </div>
+
+    </div>
+
+    <!-- Dashboard Filters -->
+    <div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-4 mb-4" id="dashboard-filters-card">
+        <div class="flex items-center justify-between mb-3">
+            <h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300">Filter Dashboard</h3>
+            <button onclick="resetFilters()" class="text-xs text-gray-500 hover:text-seismo-orange dark:hover:text-seismo-orange transition-colors">
+                Reset Filters
+            </button>
+        </div>
+
+        <div class="flex flex-wrap gap-6">
+            <!-- Device Type Filters -->
+            <div class="flex flex-col gap-1">
+                <span class="text-xs text-gray-500 dark:text-gray-400 font-medium uppercase tracking-wide">Device Type</span>
+                <div class="flex gap-4">
+                    <label class="flex items-center gap-1.5 cursor-pointer">
+                        <input type="checkbox" id="filter-seismograph" checked
+                            class="rounded border-gray-300 text-blue-600 focus:ring-blue-500 dark:border-gray-600 dark:bg-slate-800"
+                            onchange="applyFilters()">
+                        <span class="text-sm text-gray-700 dark:text-gray-300">Seismographs</span>
+                    </label>
+                    <label class="flex items-center gap-1.5 cursor-pointer">
+                        <input type="checkbox" id="filter-slm" checked
+                            class="rounded border-gray-300 text-purple-600 focus:ring-purple-500 dark:border-gray-600 dark:bg-slate-800"
+                            onchange="applyFilters()">
+                        <span class="text-sm text-gray-700 dark:text-gray-300">SLMs</span>
+                    </label>
+                    <label class="flex items-center gap-1.5 cursor-pointer">
+                        <input type="checkbox" id="filter-modem" checked
+                            class="rounded border-gray-300 text-cyan-600 focus:ring-cyan-500 dark:border-gray-600 dark:bg-slate-800"
+                            onchange="applyFilters()">
+                        <span class="text-sm text-gray-700 dark:text-gray-300">Modems</span>
+                    </label>
+                </div>
+            </div>
+
+            <!-- Status Filters -->
+            <div class="flex flex-col gap-1">
+                <span class="text-xs text-gray-500 dark:text-gray-400 font-medium uppercase tracking-wide">Status</span>
+                <div class="flex gap-4">
+                    <label class="flex items-center gap-1.5 cursor-pointer">
+                        <input type="checkbox" id="filter-ok" checked
+                            class="rounded border-gray-300 text-green-600 focus:ring-green-500 dark:border-gray-600 dark:bg-slate-800"
+                            onchange="applyFilters()">
+                        <span class="text-sm text-green-600 dark:text-green-400">OK</span>
+                    </label>
+                    <label class="flex items-center gap-1.5 cursor-pointer">
+                        <input type="checkbox" id="filter-pending" checked
+                            class="rounded border-gray-300 text-yellow-600 focus:ring-yellow-500 dark:border-gray-600 dark:bg-slate-800"
+                            onchange="applyFilters()">
+                        <span class="text-sm text-yellow-600 dark:text-yellow-400">Pending</span>
+                    </label>
+                    <label class="flex items-center gap-1.5 cursor-pointer">
+                        <input type="checkbox" id="filter-missing" checked
+                            class="rounded border-gray-300 text-red-600 focus:ring-red-500 dark:border-gray-600 dark:bg-slate-800"
+                            onchange="applyFilters()">
+                        <span class="text-sm text-red-600 dark:text-red-400">Missing</span>
+                    </label>
+                </div>
+            </div>
+        </div>
+    </div>
     </div>
 
     <!-- Fleet Map -->
-    <div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6 mb-8" id="fleet-map-card">
+    <div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6 mb-8" id="fleet-map-card">
         <div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-map')">
             <h2 class="text-xl font-semibold text-gray-900 dark:text-white">Fleet Map</h2>
             <div class="flex items-center gap-2">
@@ -181,7 +270,7 @@
     </div>
 
     <!-- Recent Photos Section -->
-    <div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6 mb-8" id="recent-photos-card">
+    <div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6 mb-8" id="recent-photos-card">
         <div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-photos')">
             <h2 class="text-xl font-semibold text-gray-900 dark:text-white">Recent Photos</h2>
             <div class="flex items-center gap-2">
@@ -201,7 +290,7 @@
     </div>
 
     <!-- Fleet Status Section with Tabs -->
-    <div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="fleet-status-card">
+    <div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="fleet-status-card">
 
        <div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-status')">
            <h2 class="text-xl font-semibold text-gray-900 dark:text-white">Fleet Status</h2>
@@ -279,6 +368,255 @@
 
 
     <script>
+    // ===== Dashboard Filtering System =====
+    let currentSnapshotData = null; // Store latest snapshot data for re-filtering
+
+    // Filter state - tracks which device types and statuses to show
+    const filters = {
+        deviceTypes: {
+            seismograph: true,
+            sound_level_meter: true,
+            modem: true
+        },
+        statuses: {
+            OK: true,
+            Pending: true,
+            Missing: true
+        }
+    };
+
+    // Load saved filter preferences from localStorage
+    function loadFilterPreferences() {
+        const saved = localStorage.getItem('dashboardFilters');
+        if (saved) {
+            try {
+                const parsed = JSON.parse(saved);
+                if (parsed.deviceTypes) Object.assign(filters.deviceTypes, parsed.deviceTypes);
+                if (parsed.statuses) Object.assign(filters.statuses, parsed.statuses);
+            } catch (e) {
+                console.error('Error loading filter preferences:', e);
+            }
+        }
+
+        // Sync checkboxes with loaded state
+        const seismoCheck = document.getElementById('filter-seismograph');
+        const slmCheck = document.getElementById('filter-slm');
+        const modemCheck = document.getElementById('filter-modem');
+        const okCheck = document.getElementById('filter-ok');
+        const pendingCheck = document.getElementById('filter-pending');
+        const missingCheck = document.getElementById('filter-missing');
+
+        if (seismoCheck) seismoCheck.checked = filters.deviceTypes.seismograph;
+        if (slmCheck) slmCheck.checked = filters.deviceTypes.sound_level_meter;
+        if (modemCheck) modemCheck.checked = filters.deviceTypes.modem;
+        if (okCheck) okCheck.checked = filters.statuses.OK;
+        if (pendingCheck) pendingCheck.checked = filters.statuses.Pending;
+        if (missingCheck) missingCheck.checked = filters.statuses.Missing;
+    }
+
+    // Save filter preferences to localStorage
+    function saveFilterPreferences() {
+        localStorage.setItem('dashboardFilters', JSON.stringify(filters));
+    }
+
+    // Apply filters - called when any checkbox changes
+    function applyFilters() {
+        // Update filter state from checkboxes
+        const seismoCheck = document.getElementById('filter-seismograph');
+        const slmCheck = document.getElementById('filter-slm');
+        const modemCheck = document.getElementById('filter-modem');
+        const okCheck = document.getElementById('filter-ok');
+        const pendingCheck = document.getElementById('filter-pending');
+        const missingCheck = document.getElementById('filter-missing');
+
+        if (seismoCheck) filters.deviceTypes.seismograph = seismoCheck.checked;
+        if (slmCheck) filters.deviceTypes.sound_level_meter = slmCheck.checked;
+        if (modemCheck) filters.deviceTypes.modem = modemCheck.checked;
+        if (okCheck) filters.statuses.OK = okCheck.checked;
+        if (pendingCheck) filters.statuses.Pending = pendingCheck.checked;
+        if (missingCheck) filters.statuses.Missing = missingCheck.checked;
+
+        saveFilterPreferences();
+
+        // Re-render with current data and filters
+        if (currentSnapshotData) {
+            renderFilteredDashboard(currentSnapshotData);
+        }
+    }
+
+    // Reset all filters to show everything
+    function resetFilters() {
+        filters.deviceTypes = { seismograph: true, sound_level_meter: true, modem: true };
+        filters.statuses = { OK: true, Pending: true, Missing: true };
+
+        // Update all checkboxes
+        const checkboxes = [
+            'filter-seismograph', 'filter-slm', 'filter-modem',
+            'filter-ok', 'filter-pending', 'filter-missing'
+        ];
+        checkboxes.forEach(id => {
+            const el = document.getElementById(id);
+            if (el) el.checked = true;
+        });
+
+        saveFilterPreferences();
+
+        if (currentSnapshotData) {
+            renderFilteredDashboard(currentSnapshotData);
+        }
+    }
+
+    // Check if a unit passes the current filters
+    function unitPassesFilter(unit) {
+        const deviceType = unit.device_type || 'seismograph';
+        const status = unit.status || 'Missing';
+
+        // Check device type filter
|
||||||
|
if (!filters.deviceTypes[deviceType]) {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check status filter
|
||||||
|
if (!filters.statuses[status]) {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get display label for device type
|
||||||
|
function getDeviceTypeLabel(deviceType) {
|
||||||
|
switch(deviceType) {
|
||||||
|
case 'sound_level_meter': return 'SLM';
|
||||||
|
case 'modem': return 'Modem';
|
||||||
|
default: return 'Seismograph';
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
+// Render dashboard with filtered data
+function renderFilteredDashboard(data) {
+    // Filter active units for alerts
+    const filteredActive = {};
+    Object.entries(data.active || {}).forEach(([id, unit]) => {
+        if (unitPassesFilter(unit)) {
+            filteredActive[id] = unit;
+        }
+    });
+
+    // Update alerts with filtered data
+    updateAlertsFiltered(filteredActive);
+
+    // Update map with filtered data
+    updateFleetMapFiltered(data.units);
+}
+
+// Update the Recent Alerts section with filtering
+function updateAlertsFiltered(filteredActive) {
+    const alertsList = document.getElementById('alerts-list');
+    const missingUnits = Object.entries(filteredActive).filter(([_, u]) => u.status === 'Missing' && u.device_type !== 'modem');
+
+    if (!missingUnits.length) {
+        // Check if this is because of filters or genuinely no alerts
+        const anyMissing = currentSnapshotData && Object.values(currentSnapshotData.active || {}).some(u => u.status === 'Missing');
+        if (anyMissing) {
+            alertsList.innerHTML = '<p class="text-sm text-gray-500 dark:text-gray-400">No alerts match current filters</p>';
+        } else {
+            alertsList.innerHTML = '<p class="text-sm text-green-600 dark:text-green-400">All units reporting normally</p>';
+        }
+    } else {
+        let alertsHtml = '';
+        missingUnits.forEach(([id, unit]) => {
+            const deviceLabel = getDeviceTypeLabel(unit.device_type);
+            alertsHtml += `
+                <div class="flex items-start space-x-2 text-sm">
+                    <span class="w-2 h-2 rounded-full bg-red-500 mt-1.5"></span>
+                    <div>
+                        <a href="/unit/${id}" class="font-medium text-red-600 dark:text-red-400 hover:underline">${id}</a>
+                        <span class="text-xs text-gray-500 ml-1">(${deviceLabel})</span>
+                        <p class="text-gray-600 dark:text-gray-400">Missing for ${unit.age}</p>
+                    </div>
+                </div>`;
+        });
+        alertsList.innerHTML = alertsHtml;
+    }
+}
+
+// Update map with filtered data
+function updateFleetMapFiltered(allUnits) {
+    if (!fleetMap) return;
+
+    // Clear existing markers
+    fleetMarkers.forEach(marker => fleetMap.removeLayer(marker));
+    fleetMarkers = [];
+
+    // Get deployed units with coordinates that pass the filter
+    const deployedUnits = Object.entries(allUnits || {})
+        .filter(([_, u]) => u.deployed && u.coordinates && unitPassesFilter(u));
+
+    if (deployedUnits.length === 0) {
+        return;
+    }
+
+    const bounds = [];
+
+    deployedUnits.forEach(([id, unit]) => {
+        const coords = parseLocation(unit.coordinates);
+        if (coords) {
+            const [lat, lon] = coords;
+
+            // Color based on status
+            const markerColor = unit.status === 'OK' ? 'green' :
+                unit.status === 'Pending' ? 'orange' : 'red';
+
+            // Different marker style per device type
+            const deviceType = unit.device_type || 'seismograph';
+            let radius = 8;
+            let weight = 2;
+
+            if (deviceType === 'modem') {
+                radius = 6;
+                weight = 2;
+            } else if (deviceType === 'sound_level_meter') {
+                radius = 8;
+                weight = 3;
+            }
+
+            const marker = L.circleMarker([lat, lon], {
+                radius: radius,
+                fillColor: markerColor,
+                color: '#fff',
+                weight: weight,
+                opacity: 1,
+                fillOpacity: 0.8
+            }).addTo(fleetMap);
+
+            // Popup with device type
+            const deviceLabel = getDeviceTypeLabel(deviceType);
+
+            marker.bindPopup(`
+                <div class="p-2">
+                    <h3 class="font-bold text-lg">${id}</h3>
+                    <p class="text-sm text-gray-600">${deviceLabel}</p>
+                    <p class="text-sm">Status: <span style="color: ${markerColor}">${unit.status}</span></p>
+                    ${unit.note ? `<p class="text-sm text-gray-600">${unit.note}</p>` : ''}
+                    <a href="/unit/${id}" class="text-blue-600 hover:underline text-sm">View Details</a>
+                </div>
+            `);
+
+            fleetMarkers.push(marker);
+            bounds.push([lat, lon]);
+        }
+    });
+
+    // Only fit bounds on initial load, not on subsequent updates
+    // This preserves the user's current map view when auto-refreshing
+    if (bounds.length > 0 && !fleetMapInitialized) {
+        const padding = window.innerWidth < 768 ? [20, 20] : [50, 50];
+        fleetMap.fitBounds(bounds, { padding: padding });
+        fleetMapInitialized = true;
+    }
+}
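The `unitPassesFilter` predicate added above is a plain lookup into the two filter maps, so it can be exercised without any DOM. A minimal standalone sketch (the `filters` shape mirrors `resetFilters()`; the unit objects here are hypothetical):

```javascript
// Standalone sketch of the dashboard's filter predicate.
// `filters` mirrors the shape set by resetFilters(); units are made up.
const filters = {
  deviceTypes: { seismograph: true, sound_level_meter: true, modem: false },
  statuses: { OK: true, Pending: true, Missing: true },
};

function unitPassesFilter(unit) {
  // Missing fields fall back to the same defaults as the dashboard code.
  const deviceType = unit.device_type || 'seismograph';
  const status = unit.status || 'Missing';
  return Boolean(filters.deviceTypes[deviceType]) && Boolean(filters.statuses[status]);
}

console.log(unitPassesFilter({ device_type: 'modem', status: 'OK' })); // false: modems unchecked
console.log(unitPassesFilter({ status: 'Missing' }));                  // true: defaults to seismograph
```

Because unknown `device_type` or `status` values index into the maps and come back `undefined`, a unit with an unrecognized type is hidden rather than shown, which matches how the dashboard code behaves.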
// Toggle card collapse/expand (mobile only)
function toggleCard(cardName) {
    // Only work on mobile
@@ -316,7 +654,7 @@ function toggleCard(cardName) {
// Restore card states from localStorage on page load
function restoreCardStates() {
    const cardStates = JSON.parse(localStorage.getItem('dashboardCardStates') || '{}');
-    const cardNames = ['fleet-summary', 'recent-alerts', 'recent-callins', 'fleet-map', 'fleet-status'];
+    const cardNames = ['fleet-summary', 'recent-alerts', 'recent-callins', 'todays-actions', 'fleet-map', 'fleet-status'];

    cardNames.forEach(cardName => {
        const content = document.getElementById(`${cardName}-content`);
@@ -343,8 +681,17 @@ if (document.readyState === 'loading') {
function updateDashboard(event) {
    try {
+        // Only process responses from /api/status-snapshot
+        const requestUrl = event.detail.xhr.responseURL || event.detail.pathInfo?.requestPath;
+        if (!requestUrl || !requestUrl.includes('/api/status-snapshot')) {
+            return; // Ignore responses from other endpoints (like /dashboard/todays-actions)
+        }
+
        const data = JSON.parse(event.detail.xhr.response);
+
+        // Store data for filter re-application
+        currentSnapshotData = data;
+
        // Update "Last updated" timestamp with timezone
        const now = new Date();
        const timezone = localStorage.getItem('timezone') || 'America/New_York';
@@ -356,17 +703,19 @@ function updateDashboard(event) {
            timeZoneName: 'short'
        });

-        // ===== Fleet summary numbers =====
+        // ===== Fleet summary numbers (always unfiltered) =====
        document.getElementById('total-units').textContent = data.summary?.total ?? 0;
        document.getElementById('deployed-units').textContent = data.summary?.active ?? 0;
        document.getElementById('benched-units').textContent = data.summary?.benched ?? 0;
+        document.getElementById('allocated-units').textContent = data.summary?.allocated ?? 0;
        document.getElementById('status-ok').textContent = data.summary?.ok ?? 0;
        document.getElementById('status-pending').textContent = data.summary?.pending ?? 0;
        document.getElementById('status-missing').textContent = data.summary?.missing ?? 0;

-        // ===== Device type counts =====
+        // ===== Device type counts (always unfiltered) =====
        let seismoCount = 0;
        let slmCount = 0;
+        let modemCount = 0;
        Object.values(data.units || {}).forEach(unit => {
            if (unit.retired) return; // Don't count retired units
            const deviceType = unit.device_type || 'seismograph';
@@ -374,46 +723,26 @@ function updateDashboard(event) {
                seismoCount++;
            } else if (deviceType === 'sound_level_meter') {
                slmCount++;
+            } else if (deviceType === 'modem') {
+                modemCount++;
            }
        });
        document.getElementById('seismo-count').textContent = seismoCount;
        document.getElementById('slm-count').textContent = slmCount;

-        // ===== Alerts =====
-        const alertsList = document.getElementById('alerts-list');
-        // Only show alerts for deployed units that are MISSING (not pending)
-        const missingUnits = Object.entries(data.active).filter(([_, u]) => u.status === 'Missing');
-
-        if (!missingUnits.length) {
-            alertsList.innerHTML =
-                '<p class="text-sm text-green-600 dark:text-green-400">✓ All units reporting normally</p>';
-        } else {
-            let alertsHtml = '';
-
-            missingUnits.forEach(([id, unit]) => {
-                alertsHtml += `
-                    <div class="flex items-start space-x-2 text-sm">
-                        <span class="w-2 h-2 rounded-full bg-red-500 mt-1.5"></span>
-                        <div>
-                            <a href="/unit/${id}" class="font-medium text-red-600 dark:text-red-400 hover:underline">${id}</a>
-                            <p class="text-gray-600 dark:text-gray-400">Missing for ${unit.age}</p>
-                        </div>
-                    </div>`;
-            });
-
-            alertsList.innerHTML = alertsHtml;
-        }
-
-        // ===== Update Fleet Map =====
-        updateFleetMap(data);
+        // ===== Apply filters and render map + alerts =====
+        renderFilteredDashboard(data);

    } catch (err) {
        console.error("Dashboard update error:", err);
    }
}

-// Handle tab switching
+// Handle tab switching and initialize components
document.addEventListener('DOMContentLoaded', function() {
+    // Load filter preferences
+    loadFilterPreferences();
+
    const tabButtons = document.querySelectorAll('.tab-button');

    tabButtons.forEach(button => {
@@ -453,64 +782,6 @@ function initFleetMap() {
    }, 100);
}

-function updateFleetMap(data) {
-    if (!fleetMap) return;
-
-    // Clear existing markers
-    fleetMarkers.forEach(marker => fleetMap.removeLayer(marker));
-    fleetMarkers = [];
-
-    // Get deployed units with coordinates data
-    const deployedUnits = Object.entries(data.units).filter(([_, u]) => u.deployed && u.coordinates);
-
-    if (deployedUnits.length === 0) {
-        return;
-    }
-
-    const bounds = [];
-
-    deployedUnits.forEach(([id, unit]) => {
-        const coords = parseLocation(unit.coordinates);
-        if (coords) {
-            const [lat, lon] = coords;
-
-            // Create marker with custom color based on status
-            const markerColor = unit.status === 'OK' ? 'green' : unit.status === 'Pending' ? 'orange' : 'red';
-
-            const marker = L.circleMarker([lat, lon], {
-                radius: 8,
-                fillColor: markerColor,
-                color: '#fff',
-                weight: 2,
-                opacity: 1,
-                fillOpacity: 0.8
-            }).addTo(fleetMap);
-
-            // Add popup with unit info
-            marker.bindPopup(`
-                <div class="p-2">
-                    <h3 class="font-bold text-lg">${id}</h3>
-                    <p class="text-sm">Status: <span style="color: ${markerColor}">${unit.status}</span></p>
-                    <p class="text-sm">Type: ${unit.device_type}</p>
-                    ${unit.note ? `<p class="text-sm text-gray-600">${unit.note}</p>` : ''}
-                    <a href="/unit/${id}" class="text-blue-600 hover:underline text-sm">View Details →</a>
-                </div>
-            `);
-
-            fleetMarkers.push(marker);
-            bounds.push([lat, lon]);
-        }
-    });
-
-    // Fit map to show all markers
-    if (bounds.length > 0) {
-        // Use different padding for mobile vs desktop
-        const padding = window.innerWidth < 768 ? [20, 20] : [50, 50];
-        fleetMap.fitBounds(bounds, { padding: padding });
-        fleetMapInitialized = true;
-    }
-}
-
function parseLocation(location) {
    if (!location) return null;
templates/fleet_calendar.html — 2466 lines (new file)
templates/modems.html — 102 lines (new file)
@@ -0,0 +1,102 @@
+{% extends "base.html" %}
+
+{% block title %}Field Modems - Terra-View{% endblock %}
+
+{% block content %}
+<div class="mb-8">
+    <h1 class="text-3xl font-bold text-gray-900 dark:text-white flex items-center">
+        <svg class="w-8 h-8 mr-3 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
+        </svg>
+        Field Modems
+    </h1>
+    <p class="text-gray-600 dark:text-gray-400 mt-1">Manage network connectivity devices for field equipment</p>
+</div>
+
+<!-- Summary Stats -->
+<div class="grid grid-cols-1 md:grid-cols-4 gap-6 mb-8"
+     hx-get="/api/modem-dashboard/stats"
+     hx-trigger="load, every 30s"
+     hx-swap="innerHTML">
+    <!-- Stats will be loaded here -->
+    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
+    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
+    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
+    <div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
+</div>
+
+<!-- Modem List -->
+<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
+    <div class="flex items-center justify-between mb-6">
+        <h2 class="text-xl font-semibold text-gray-900 dark:text-white">All Modems</h2>
+        <div class="flex items-center gap-4">
+            <!-- Search -->
+            <div class="relative">
+                <input type="text"
+                       id="modem-search"
+                       placeholder="Search modems..."
+                       class="pl-9 pr-4 py-2 text-sm border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-seismo-orange focus:border-transparent"
+                       hx-get="/api/modem-dashboard/units"
+                       hx-trigger="keyup changed delay:300ms"
+                       hx-target="#modem-list"
+                       hx-include="[name='search']"
+                       name="search">
+                <svg class="w-4 h-4 absolute left-3 top-1/2 transform -translate-y-1/2 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"></path>
+                </svg>
+            </div>
+            <a href="/roster?device_type=modem" class="text-sm text-seismo-orange hover:underline">
+                Add modem in roster
+            </a>
+        </div>
+    </div>
+
+    <div id="modem-list"
+         hx-get="/api/modem-dashboard/units"
+         hx-trigger="load, every 30s"
+         hx-swap="innerHTML">
+        <p class="text-gray-500 dark:text-gray-400">Loading modems...</p>
+    </div>
+</div>
+
+<script>
+// Ping a modem and show result
+async function pingModem(modemId) {
+    const btn = document.getElementById(`ping-btn-${modemId}`);
+    const resultDiv = document.getElementById(`ping-result-${modemId}`);
+
+    // Show loading state
+    const originalText = btn.textContent;
+    btn.textContent = 'Pinging...';
+    btn.disabled = true;
+    resultDiv.classList.remove('hidden');
+    resultDiv.className = 'mt-2 text-xs text-gray-500';
+    resultDiv.textContent = 'Testing connection...';
+
+    try {
+        const response = await fetch(`/api/modem-dashboard/${modemId}/ping`);
+        const data = await response.json();
+
+        if (data.status === 'success') {
+            resultDiv.className = 'mt-2 text-xs text-green-600 dark:text-green-400';
+            resultDiv.innerHTML = `<span class="inline-block w-2 h-2 bg-green-500 rounded-full mr-1"></span>Online (${data.response_time_ms}ms)`;
+        } else {
+            resultDiv.className = 'mt-2 text-xs text-red-600 dark:text-red-400';
+            resultDiv.innerHTML = `<span class="inline-block w-2 h-2 bg-red-500 rounded-full mr-1"></span>${data.detail || 'Offline'}`;
+        }
+    } catch (error) {
+        resultDiv.className = 'mt-2 text-xs text-red-600 dark:text-red-400';
+        resultDiv.textContent = 'Error: ' + error.message;
+    }
+
+    // Restore button
+    btn.textContent = originalText;
+    btn.disabled = false;
+
+    // Hide result after 10 seconds
+    setTimeout(() => {
+        resultDiv.classList.add('hidden');
+    }, 10000);
+}
+</script>
+{% endblock %}
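The `pingModem` helper above mixes the fetch logic with DOM updates. As a rough sketch of just the network half, separated so it can run without a page (the function name `pingStatus` and the injectable `fetchImpl` parameter are editor inventions; the endpoint and response fields are taken from the template above):

```javascript
// Sketch of the ping flow from the modems template, minus DOM wiring.
// `pingStatus` and `fetchImpl` are hypothetical names; the endpoint and
// the `status` / `response_time_ms` / `detail` fields come from the template.
async function pingStatus(modemId, fetchImpl = fetch) {
  try {
    const response = await fetchImpl(`/api/modem-dashboard/${modemId}/ping`);
    const data = await response.json();
    if (data.status === 'success') {
      return `Online (${data.response_time_ms}ms)`;
    }
    return data.detail || 'Offline';
  } catch (error) {
    return 'Error: ' + error.message;
  }
}

// Example with a stubbed fetch, so no server is needed:
const fakeFetch = async () => ({ json: async () => ({ status: 'success', response_time_ms: 42 }) });
pingStatus('M-001', fakeFetch).then(msg => console.log(msg)); // logs "Online (42ms)"
```

Injecting the fetch function keeps the success, offline, and network-error branches testable in isolation, while the template's version keeps the same branches but writes the result into `ping-result-${modemId}` instead of returning it.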
@@ -70,7 +70,7 @@
|
|||||||
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
|
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
|
||||||
Settings
|
Settings
|
||||||
</button>
|
</button>
|
||||||
{% if assigned_unit %}
|
{% if assigned_unit and connection_mode == 'connected' %}
|
||||||
<button onclick="switchTab('command')"
|
<button onclick="switchTab('command')"
|
||||||
data-tab="command"
|
data-tab="command"
|
||||||
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
|
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
|
||||||
@@ -80,7 +80,7 @@
|
|||||||
<button onclick="switchTab('sessions')"
|
<button onclick="switchTab('sessions')"
|
||||||
data-tab="sessions"
|
data-tab="sessions"
|
||||||
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
|
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
|
||||||
Recording Sessions
|
Monitoring Sessions
|
||||||
</button>
|
</button>
|
||||||
<button onclick="switchTab('data')"
|
<button onclick="switchTab('data')"
|
||||||
data-tab="data"
|
data-tab="data"
|
||||||
@@ -123,7 +123,7 @@
|
|||||||
{% endif %}
|
{% endif %}
|
||||||
<div>
|
<div>
|
||||||
<div class="text-sm text-gray-600 dark:text-gray-400">Created</div>
|
<div class="text-sm text-gray-600 dark:text-gray-400">Created</div>
|
||||||
<div class="text-gray-900 dark:text-white">{{ location.created_at.strftime('%Y-%m-%d %H:%M') if location.created_at else 'N/A' }}</div>
|
<div class="text-gray-900 dark:text-white">{{ location.created_at|local_datetime if location.created_at else 'N/A' }}</div>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
@@ -150,7 +150,7 @@
|
|||||||
{% if assignment %}
|
{% if assignment %}
|
||||||
<div>
|
<div>
|
||||||
<div class="text-sm text-gray-600 dark:text-gray-400">Assigned Since</div>
|
<div class="text-sm text-gray-600 dark:text-gray-400">Assigned Since</div>
|
||||||
<div class="text-gray-900 dark:text-white">{{ assignment.assigned_at.strftime('%Y-%m-%d %H:%M') if assignment.assigned_at else 'N/A' }}</div>
|
<div class="text-gray-900 dark:text-white">{{ assignment.assigned_at|local_datetime if assignment.assigned_at else 'N/A' }}</div>
|
||||||
</div>
|
</div>
|
||||||
{% if assignment.notes %}
|
{% if assignment.notes %}
|
||||||
<div>
|
<div>
|
||||||
@@ -214,23 +214,54 @@
|
|||||||
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
|
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
|
||||||
<div class="flex items-center justify-between">
|
<div class="flex items-center justify-between">
|
||||||
<div>
|
<div>
|
||||||
|
{% if connection_mode == 'connected' %}
|
||||||
<p class="text-sm text-gray-600 dark:text-gray-400">Active Session</p>
|
<p class="text-sm text-gray-600 dark:text-gray-400">Active Session</p>
|
||||||
<p class="text-lg font-semibold text-gray-900 dark:text-white mt-2">
|
<p class="text-lg font-semibold text-gray-900 dark:text-white mt-2">
|
||||||
{% if active_session %}
|
{% if active_session %}
|
||||||
<span class="text-green-600 dark:text-green-400">Recording</span>
|
<span class="text-green-600 dark:text-green-400">Monitoring</span>
|
||||||
{% else %}
|
{% else %}
|
||||||
<span class="text-gray-500">Idle</span>
|
<span class="text-gray-500">Idle</span>
|
||||||
{% endif %}
|
{% endif %}
|
||||||
</p>
|
</p>
|
||||||
|
{% else %}
|
||||||
|
<p class="text-sm text-gray-600 dark:text-gray-400">Mode</p>
|
||||||
|
<p class="text-lg font-semibold mt-2">
|
||||||
|
<span class="text-amber-600 dark:text-amber-400">Offline / Manual</span>
|
||||||
|
</p>
|
||||||
|
{% endif %}
|
||||||
</div>
|
</div>
|
||||||
<div class="w-12 h-12 bg-purple-100 dark:bg-purple-900/30 rounded-lg flex items-center justify-center">
|
<div class="w-12 h-12 bg-purple-100 dark:bg-purple-900/30 rounded-lg flex items-center justify-center">
|
||||||
|
{% if connection_mode == 'connected' %}
|
||||||
<svg class="w-6 h-6 text-purple-600 dark:text-purple-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
<svg class="w-6 h-6 text-purple-600 dark:text-purple-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z"></path>
|
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z"></path>
|
||||||
</svg>
|
</svg>
|
||||||
|
{% else %}
|
||||||
|
<svg class="w-6 h-6 text-amber-600 dark:text-amber-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||||
|
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-8l-4-4m0 0L8 8m4-4v12"></path>
|
||||||
|
</svg>
|
||||||
|
{% endif %}
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
|
{% if connection_mode == 'connected' and assigned_unit %}
|
||||||
|
<!-- Live Status Row (connected NRLs only) -->
|
||||||
|
<div class="mt-6">
|
||||||
|
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
|
||||||
|
<div class="flex items-center justify-between mb-4">
|
||||||
|
<h3 class="text-lg font-semibold text-gray-900 dark:text-white">Live Status</h3>
|
||||||
|
<span class="text-xs text-gray-500 dark:text-gray-400">{{ assigned_unit.id }}</span>
|
||||||
|
</div>
|
||||||
|
<div id="nrl-live-status"
|
||||||
|
hx-get="/api/projects/{{ project_id }}/nrl/{{ location_id }}/live-status"
|
||||||
|
hx-trigger="load, every 30s"
|
||||||
|
hx-swap="innerHTML">
|
||||||
|
<div class="text-center py-4 text-gray-500 text-sm">Loading status…</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
{% endif %}
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<!-- Settings Tab -->
|
<!-- Settings Tab -->
|
||||||
@@ -281,8 +312,8 @@
|
|||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<!-- Command Center Tab -->
|
<!-- Command Center Tab (connected NRLs only) -->
|
||||||
{% if assigned_unit %}
|
{% if assigned_unit and connection_mode == 'connected' %}
|
||||||
<div id="command-tab" class="tab-panel hidden">
|
<div id="command-tab" class="tab-panel hidden">
|
||||||
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
|
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
|
||||||
<h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-6">
|
<h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-6">
|
||||||
@@ -302,11 +333,11 @@
|
|||||||
</div>
|
</div>
|
||||||
{% endif %}
|
{% endif %}
|
||||||
|
|
||||||
<!-- Recording Sessions Tab -->
|
<!-- Monitoring Sessions Tab -->
|
||||||
<div id="sessions-tab" class="tab-panel hidden">
|
<div id="sessions-tab" class="tab-panel hidden">
|
||||||
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
|
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
|
||||||
<div class="flex items-center justify-between mb-6">
|
<div class="flex items-center justify-between mb-6">
|
||||||
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Recording Sessions</h2>
|
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Monitoring Sessions</h2>
|
||||||
{% if assigned_unit %}
|
{% if assigned_unit %}
|
||||||
<button onclick="openScheduleModal()"
|
<button onclick="openScheduleModal()"
|
||||||
class="px-4 py-2 bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors">
|
class="px-4 py-2 bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors">
|
||||||
@@ -329,8 +360,51 @@
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
<div class="flex items-center justify-between mb-6">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Data Files</h2>
-<div class="text-sm text-gray-500">
+<div class="flex items-center gap-3">
-<span class="font-medium">{{ file_count }}</span> files
+<span class="text-sm text-gray-500"><span class="font-medium">{{ file_count }}</span> files</span>
+  <button onclick="toggleUploadPanel()"
+          class="px-3 py-1.5 text-sm bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors flex items-center gap-1.5">
+    <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+      <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-8l-4-4m0 0L8 8m4-4v12"></path>
+    </svg>
+    Upload Data
+  </button>
+</div>
+</div>
+
+<!-- Upload Panel -->
+<div id="upload-panel" class="hidden mb-6 p-4 border-2 border-dashed border-gray-300 dark:border-gray-600 rounded-xl bg-gray-50 dark:bg-gray-800/50">
+  <p class="text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Upload SD Card Data</p>
+  <p class="text-xs text-gray-500 dark:text-gray-400 mb-3">
+    Select a ZIP file, or select all files from inside an <code class="bg-gray-200 dark:bg-gray-700 px-1 rounded">Auto_####</code> folder. File types (.rnd, .rnh) are auto-detected.
+  </p>
+  <input type="file" id="upload-input" multiple
+         accept=".zip,.rnd,.rnh,.RND,.RNH"
+         class="block w-full text-sm text-gray-500 dark:text-gray-400
+                file:mr-3 file:py-1.5 file:px-3 file:rounded-lg file:border-0
+                file:text-sm file:font-medium file:bg-seismo-orange file:text-white
+                hover:file:bg-seismo-navy file:cursor-pointer" />
+  <div class="flex items-center gap-3 mt-3">
+    <button id="upload-btn" onclick="submitUpload()"
+            class="px-4 py-1.5 text-sm bg-green-600 text-white rounded-lg hover:bg-green-700 transition-colors">
+      Import Files
+    </button>
+    <button id="upload-cancel-btn" onclick="toggleUploadPanel()"
+            class="px-4 py-1.5 text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white transition-colors">
+      Cancel
+    </button>
+    <span id="upload-status" class="text-sm hidden"></span>
+  </div>
+  <!-- Progress bar (hidden until upload starts) -->
+  <div id="upload-progress-wrap" class="hidden mt-3">
+    <div class="flex justify-between text-xs text-gray-500 dark:text-gray-400 mb-1">
+      <span id="upload-progress-label">Uploading…</span>
+    </div>
+    <div class="w-full bg-gray-200 dark:bg-gray-700 rounded-full h-2">
+      <div id="upload-progress-bar"
+           class="bg-green-500 h-2 rounded-full transition-all duration-300"
+           style="width: 0%"></div>
+    </div>
</div>
</div>

@@ -559,5 +633,112 @@ document.getElementById('assign-modal')?.addEventListener('click', function(e) {
closeAssignModal();
}
});
+
+// ── Upload Data ─────────────────────────────────────────────────────────────
+
+function toggleUploadPanel() {
+  const panel = document.getElementById('upload-panel');
+  const status = document.getElementById('upload-status');
+  panel.classList.toggle('hidden');
+  // Reset state when reopening
+  if (!panel.classList.contains('hidden')) {
+    status.textContent = '';
+    status.className = 'text-sm hidden';
+    document.getElementById('upload-input').value = '';
+    document.getElementById('upload-progress-wrap').classList.add('hidden');
+    document.getElementById('upload-progress-bar').style.width = '0%';
+  }
+}
+
+function submitUpload() {
+  const input = document.getElementById('upload-input');
+  const status = document.getElementById('upload-status');
+  const btn = document.getElementById('upload-btn');
+  const cancelBtn = document.getElementById('upload-cancel-btn');
+  const progressWrap = document.getElementById('upload-progress-wrap');
+  const progressBar = document.getElementById('upload-progress-bar');
+  const progressLabel = document.getElementById('upload-progress-label');
+
+  if (!input.files.length) {
+    alert('Please select files to upload.');
+    return;
+  }
+
+  const fileCount = input.files.length;
+  const formData = new FormData();
+  for (const file of input.files) {
+    formData.append('files', file);
+  }
+
+  // Disable controls and show progress bar
+  btn.disabled = true;
+  btn.textContent = 'Uploading\u2026';
+  btn.classList.add('opacity-60', 'cursor-not-allowed');
+  cancelBtn.disabled = true;
+  cancelBtn.classList.add('opacity-40', 'cursor-not-allowed');
+  status.className = 'text-sm hidden';
+  progressWrap.classList.remove('hidden');
+  progressBar.style.width = '0%';
+  progressLabel.textContent = `Uploading ${fileCount} file${fileCount !== 1 ? 's' : ''}\u2026`;
+
+  const xhr = new XMLHttpRequest();
+
+  xhr.upload.addEventListener('progress', (e) => {
+    if (e.lengthComputable) {
+      const pct = Math.round((e.loaded / e.total) * 100);
+      progressBar.style.width = pct + '%';
+      progressLabel.textContent = `Uploading ${fileCount} file${fileCount !== 1 ? 's' : ''}\u2026 ${pct}%`;
+    }
+  });
+
+  xhr.upload.addEventListener('load', () => {
+    progressBar.style.width = '100%';
+    progressLabel.textContent = 'Processing files on server\u2026';
+  });
+
+  xhr.addEventListener('load', () => {
+    progressWrap.classList.add('hidden');
+    btn.disabled = false;
+    btn.textContent = 'Import Files';
+    btn.classList.remove('opacity-60', 'cursor-not-allowed');
+    cancelBtn.disabled = false;
+    cancelBtn.classList.remove('opacity-40', 'cursor-not-allowed');
+
+    try {
+      const data = JSON.parse(xhr.responseText);
+      if (xhr.status >= 200 && xhr.status < 300) {
+        const parts = [`Imported ${data.files_imported} file${data.files_imported !== 1 ? 's' : ''}`];
+        if (data.leq_files || data.lp_files) {
+          parts.push(`(${data.leq_files} Leq, ${data.lp_files} Lp)`);
+        }
+        if (data.store_name) parts.push(`\u2014 ${data.store_name}`);
+        status.textContent = parts.join(' ');
+        status.className = 'text-sm text-green-600 dark:text-green-400';
+        input.value = '';
+        htmx.trigger(document.getElementById('data-files-list'), 'load');
+      } else {
+        status.textContent = `Error: ${data.detail || 'Upload failed'}`;
+        status.className = 'text-sm text-red-600 dark:text-red-400';
+      }
+    } catch {
+      status.textContent = 'Error: Unexpected server response';
+      status.className = 'text-sm text-red-600 dark:text-red-400';
+    }
+  });
+
+  xhr.addEventListener('error', () => {
+    progressWrap.classList.add('hidden');
+    btn.disabled = false;
+    btn.textContent = 'Import Files';
+    btn.classList.remove('opacity-60', 'cursor-not-allowed');
+    cancelBtn.disabled = false;
+    cancelBtn.classList.remove('opacity-40', 'cursor-not-allowed');
+    status.textContent = 'Error: Network error during upload';
+    status.className = 'text-sm text-red-600 dark:text-red-400';
+  });
+
+  xhr.open('POST', `/api/projects/${projectId}/nrl/${locationId}/upload-data`);
+  xhr.send(formData);
+}
</script>
{% endblock %}
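The success handler in `submitUpload()` above assumes the upload endpoint returns JSON carrying `files_imported`, optional `leq_files`/`lp_files` counts, an optional `store_name`, and a `detail` message on error. A minimal sketch of that payload shape, with field names taken from the frontend code; the mapping of `.rnd` files to the Leq count and `.rnh` files to the Lp count is an assumption for illustration, not something this diff confirms:

```python
# Hypothetical sketch of the JSON contract parsed by submitUpload().
# Field names come from the frontend above; the .rnd -> Leq / .rnh -> Lp
# classification is an assumed convention, not confirmed by this diff.
from pathlib import Path

def build_upload_response(filenames, store_name=None):
    leq = sum(1 for f in filenames if Path(f).suffix.lower() == ".rnd")
    lp = sum(1 for f in filenames if Path(f).suffix.lower() == ".rnh")
    return {
        "files_imported": len(filenames),
        "leq_files": leq,
        "lp_files": lp,
        "store_name": store_name,
    }
```

On an error status, the frontend instead expects a body like `{"detail": "…"}`, falling back to a generic "Upload failed" message when `detail` is absent.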
templates/pair_devices.html (new file, 566 lines)
@@ -0,0 +1,566 @@
{% extends "base.html" %}

{% block title %}Pair Devices - Terra-View{% endblock %}

{% block content %}
<div class="max-w-7xl mx-auto">
  <!-- Header -->
  <div class="mb-6">
    <h1 class="text-2xl font-bold text-gray-900 dark:text-white">Pair Devices</h1>
    <p class="mt-1 text-sm text-gray-600 dark:text-gray-400">
      Select a recorder (seismograph or SLM) and a modem to create a bidirectional pairing.
    </p>
  </div>

  <!-- Selection Summary Bar -->
  <div id="selection-bar" class="mb-6 p-4 bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
    <div class="flex items-center justify-between flex-wrap gap-4">
      <div class="flex items-center gap-6">
        <div class="flex items-center gap-2">
          <span class="text-sm text-gray-600 dark:text-gray-400">Recorder:</span>
          <span id="selected-recorder" class="font-mono font-medium text-gray-900 dark:text-white">None selected</span>
        </div>
        <svg class="w-5 h-5 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
          <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M14 5l7 7m0 0l-7 7m7-7H3"></path>
        </svg>
        <div class="flex items-center gap-2">
          <span class="text-sm text-gray-600 dark:text-gray-400">Modem:</span>
          <span id="selected-modem" class="font-mono font-medium text-gray-900 dark:text-white">None selected</span>
        </div>
      </div>
      <div class="flex items-center gap-3">
        <button id="clear-selection-btn"
                onclick="clearSelection()"
                class="px-4 py-2 text-sm font-medium text-gray-700 dark:text-gray-300 bg-gray-100 dark:bg-gray-700 rounded-lg hover:bg-gray-200 dark:hover:bg-gray-600 disabled:opacity-50 disabled:cursor-not-allowed"
                disabled>
          Clear
        </button>
        <button id="pair-btn"
                onclick="pairDevices()"
                class="px-4 py-2 text-sm font-medium text-white bg-seismo-orange rounded-lg hover:bg-orange-600 disabled:opacity-50 disabled:cursor-not-allowed"
                disabled>
          Pair Devices
        </button>
      </div>
    </div>
  </div>

  <!-- Two Column Layout -->
  <div class="grid grid-cols-1 lg:grid-cols-2 gap-6">
    <!-- Left Column: Recorders (Seismographs + SLMs) -->
    <div class="bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
      <div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
        <div class="flex items-center justify-between mb-3">
          <h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
            <svg class="w-5 h-5 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
              <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
            </svg>
            Recorders
            <span id="recorder-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ recorders|length }})</span>
          </h2>
        </div>
        <!-- Recorder Search & Filters -->
        <div class="space-y-2">
          <input type="text" id="recorder-search" placeholder="Search by ID..."
                 class="w-full px-3 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white text-sm focus:ring-2 focus:ring-seismo-orange focus:border-seismo-orange"
                 oninput="filterRecorders()">
          <div class="flex items-center gap-4">
            <label class="flex items-center gap-2 cursor-pointer">
              <input type="checkbox" id="recorder-hide-paired" onchange="filterRecorders()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
              <span class="text-xs text-gray-600 dark:text-gray-400">Hide paired</span>
            </label>
            <label class="flex items-center gap-2 cursor-pointer">
              <input type="checkbox" id="recorder-deployed-only" onchange="filterRecorders()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
              <span class="text-xs text-gray-600 dark:text-gray-400">Deployed only</span>
            </label>
          </div>
        </div>
      </div>
      <div class="max-h-[600px] overflow-y-auto">
        <div id="recorders-list" class="divide-y divide-gray-200 dark:divide-gray-700">
          {% for unit in recorders %}
          <div class="device-row recorder-row p-3 hover:bg-gray-50 dark:hover:bg-gray-700/50 cursor-pointer transition-colors"
               data-id="{{ unit.id }}"
               data-deployed="{{ unit.deployed|lower }}"
               data-paired-with="{{ unit.deployed_with_modem_id or '' }}"
               data-device-type="{{ unit.device_type }}"
               onclick="selectRecorder('{{ unit.id }}')">
            <div class="flex items-center justify-between">
              <div class="flex items-center gap-3">
                <div class="w-8 h-8 rounded-full flex items-center justify-center
                            {% if unit.device_type == 'slm' %}bg-purple-100 dark:bg-purple-900/30 text-purple-600 dark:text-purple-400
                            {% else %}bg-blue-100 dark:bg-blue-900/30 text-blue-600 dark:text-blue-400{% endif %}">
                  {% if unit.device_type == 'slm' %}
                  <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15.536 8.464a5 5 0 010 7.072m2.828-9.9a9 9 0 010 12.728M5.586 15H4a1 1 0 01-1-1v-4a1 1 0 011-1h1.586l4.707-4.707C10.923 3.663 12 4.109 12 5v14c0 .891-1.077 1.337-1.707.707L5.586 15z"></path>
                  </svg>
                  {% else %}
                  <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
                  </svg>
                  {% endif %}
                </div>
                <div>
                  <div class="font-mono font-medium text-gray-900 dark:text-white">{{ unit.id }}</div>
                  <div class="text-xs text-gray-500 dark:text-gray-400">
                    {{ unit.device_type|capitalize }}
                    {% if not unit.deployed %}<span class="text-yellow-600 dark:text-yellow-400">(Benched)</span>{% endif %}
                  </div>
                </div>
              </div>
              <div class="flex items-center gap-2">
                {% if unit.deployed_with_modem_id %}
                <span class="px-2 py-1 text-xs rounded-full bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400">
                  → {{ unit.deployed_with_modem_id }}
                </span>
                {% endif %}
                <div class="w-5 h-5 rounded-full border-2 border-gray-300 dark:border-gray-600 flex items-center justify-center selection-indicator">
                  <svg class="w-3 h-3 text-seismo-orange hidden check-icon" fill="currentColor" viewBox="0 0 20 20">
                    <path fill-rule="evenodd" d="M16.707 5.293a1 1 0 010 1.414l-8 8a1 1 0 01-1.414 0l-4-4a1 1 0 011.414-1.414L8 12.586l7.293-7.293a1 1 0 011.414 0z" clip-rule="evenodd"></path>
                  </svg>
                </div>
              </div>
            </div>
          </div>
          {% else %}
          <div class="p-8 text-center text-gray-500 dark:text-gray-400">
            No recorders found in roster
          </div>
          {% endfor %}
        </div>
      </div>
    </div>

    <!-- Right Column: Modems -->
    <div class="bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
      <div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
        <div class="flex items-center justify-between mb-3">
          <h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
            <svg class="w-5 h-5 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
              <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
            </svg>
            Modems
            <span id="modem-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ modems|length }})</span>
          </h2>
        </div>
        <!-- Modem Search & Filters -->
        <div class="space-y-2">
          <input type="text" id="modem-search" placeholder="Search by ID, IP, or phone..."
                 class="w-full px-3 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white text-sm focus:ring-2 focus:ring-seismo-orange focus:border-seismo-orange"
                 oninput="filterModems()">
          <div class="flex items-center gap-4">
            <label class="flex items-center gap-2 cursor-pointer">
              <input type="checkbox" id="modem-hide-paired" onchange="filterModems()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
              <span class="text-xs text-gray-600 dark:text-gray-400">Hide paired</span>
            </label>
            <label class="flex items-center gap-2 cursor-pointer">
              <input type="checkbox" id="modem-deployed-only" onchange="filterModems()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
              <span class="text-xs text-gray-600 dark:text-gray-400">Deployed only</span>
            </label>
          </div>
        </div>
      </div>
      <div class="max-h-[600px] overflow-y-auto">
        <div id="modems-list" class="divide-y divide-gray-200 dark:divide-gray-700">
          {% for unit in modems %}
          <div class="device-row modem-row p-3 hover:bg-gray-50 dark:hover:bg-gray-700/50 cursor-pointer transition-colors"
               data-id="{{ unit.id }}"
               data-deployed="{{ unit.deployed|lower }}"
               data-paired-with="{{ unit.deployed_with_unit_id or '' }}"
               data-ip="{{ unit.ip_address or '' }}"
               data-phone="{{ unit.phone_number or '' }}"
               onclick="selectModem('{{ unit.id }}')">
            <div class="flex items-center justify-between">
              <div class="flex items-center gap-3">
                <div class="w-8 h-8 rounded-full bg-amber-100 dark:bg-amber-900/30 flex items-center justify-center text-amber-600 dark:text-amber-400">
                  <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
                  </svg>
                </div>
                <div>
                  <div class="font-mono font-medium text-gray-900 dark:text-white">{{ unit.id }}</div>
                  <div class="text-xs text-gray-500 dark:text-gray-400">
                    {% if unit.ip_address %}<span class="font-mono">{{ unit.ip_address }}</span>{% endif %}
                    {% if unit.phone_number %}{% if unit.ip_address %} · {% endif %}{{ unit.phone_number }}{% endif %}
                    {% if not unit.ip_address and not unit.phone_number %}Modem{% endif %}
                    {% if not unit.deployed %}<span class="text-yellow-600 dark:text-yellow-400">(Benched)</span>{% endif %}
                  </div>
                </div>
              </div>
              <div class="flex items-center gap-2">
                {% if unit.deployed_with_unit_id %}
                <span class="px-2 py-1 text-xs rounded-full bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400">
                  ← {{ unit.deployed_with_unit_id }}
                </span>
                {% endif %}
                <div class="w-5 h-5 rounded-full border-2 border-gray-300 dark:border-gray-600 flex items-center justify-center selection-indicator">
                  <svg class="w-3 h-3 text-seismo-orange hidden check-icon" fill="currentColor" viewBox="0 0 20 20">
                    <path fill-rule="evenodd" d="M16.707 5.293a1 1 0 010 1.414l-8 8a1 1 0 01-1.414 0l-4-4a1 1 0 011.414-1.414L8 12.586l7.293-7.293a1 1 0 011.414 0z" clip-rule="evenodd"></path>
                  </svg>
                </div>
              </div>
            </div>
          </div>
          {% else %}
          <div class="p-8 text-center text-gray-500 dark:text-gray-400">
            No modems found in roster
          </div>
          {% endfor %}
        </div>
      </div>
    </div>
  </div>

  <!-- Existing Pairings Section -->
  <div class="mt-8 bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
    <div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
      <h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
        <svg class="w-5 h-5 text-green-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
          <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1"></path>
        </svg>
        Existing Pairings
        <span id="pairing-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ pairings|length }})</span>
      </h2>
    </div>
    <div class="max-h-[400px] overflow-y-auto">
      <div id="pairings-list" class="divide-y divide-gray-200 dark:divide-gray-700">
        {% for pairing in pairings %}
        <div class="pairing-row p-3 flex items-center justify-between hover:bg-gray-50 dark:hover:bg-gray-700/50">
          <div class="flex items-center gap-4">
            <div class="flex items-center gap-2">
              <span class="px-2 py-1 text-sm font-mono rounded bg-blue-100 dark:bg-blue-900/30 text-blue-700 dark:text-blue-400">
                {{ pairing.recorder_id }}
              </span>
              <span class="text-xs text-gray-500 dark:text-gray-400">{{ pairing.recorder_type }}</span>
            </div>
            <svg class="w-5 h-5 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
              <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7h12m0 0l-4-4m4 4l-4 4m0 6H4m0 0l4 4m-4-4l4-4"></path>
            </svg>
            <div class="flex items-center gap-2">
              <span class="px-2 py-1 text-sm font-mono rounded bg-amber-100 dark:bg-amber-900/30 text-amber-700 dark:text-amber-400">
                {{ pairing.modem_id }}
              </span>
              {% if pairing.modem_ip %}
              <span class="text-xs font-mono text-gray-500 dark:text-gray-400">{{ pairing.modem_ip }}</span>
              {% endif %}
            </div>
          </div>
          <button onclick="unpairDevices('{{ pairing.recorder_id }}', '{{ pairing.modem_id }}')"
                  class="p-2 text-red-600 dark:text-red-400 hover:bg-red-100 dark:hover:bg-red-900/30 rounded-lg transition-colors"
                  title="Unpair devices">
            <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
              <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path>
            </svg>
          </button>
        </div>
        {% else %}
        <div class="p-8 text-center text-gray-500 dark:text-gray-400">
          No pairings found. Select a recorder and modem above to create one.
        </div>
        {% endfor %}
      </div>
    </div>
  </div>
</div>

<!-- Toast notification -->
<div id="toast" class="fixed bottom-4 right-4 px-4 py-3 rounded-lg shadow-lg transform translate-y-full opacity-0 transition-all duration-300 z-50"></div>

<script>
let selectedRecorder = null;
let selectedModem = null;

function selectRecorder(id) {
  // Deselect previous
  document.querySelectorAll('.recorder-row').forEach(row => {
    row.classList.remove('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
    row.querySelector('.selection-indicator').classList.remove('border-seismo-orange', 'bg-seismo-orange');
    row.querySelector('.selection-indicator').classList.add('border-gray-300', 'dark:border-gray-600');
    row.querySelector('.check-icon').classList.add('hidden');
  });

  // Toggle selection
  if (selectedRecorder === id) {
    selectedRecorder = null;
    document.getElementById('selected-recorder').textContent = 'None selected';
  } else {
    selectedRecorder = id;
    document.getElementById('selected-recorder').textContent = id;

    // Highlight selected
    const row = document.querySelector(`.recorder-row[data-id="${id}"]`);
    if (row) {
      row.classList.add('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
      row.querySelector('.selection-indicator').classList.remove('border-gray-300', 'dark:border-gray-600');
      row.querySelector('.selection-indicator').classList.add('border-seismo-orange', 'bg-seismo-orange');
      row.querySelector('.check-icon').classList.remove('hidden');
    }
  }

  updateButtons();
}

function selectModem(id) {
  // Deselect previous
  document.querySelectorAll('.modem-row').forEach(row => {
    row.classList.remove('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
    row.querySelector('.selection-indicator').classList.remove('border-seismo-orange', 'bg-seismo-orange');
    row.querySelector('.selection-indicator').classList.add('border-gray-300', 'dark:border-gray-600');
    row.querySelector('.check-icon').classList.add('hidden');
  });

  // Toggle selection
  if (selectedModem === id) {
    selectedModem = null;
    document.getElementById('selected-modem').textContent = 'None selected';
  } else {
    selectedModem = id;
    document.getElementById('selected-modem').textContent = id;

    // Highlight selected
    const row = document.querySelector(`.modem-row[data-id="${id}"]`);
    if (row) {
      row.classList.add('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
      row.querySelector('.selection-indicator').classList.remove('border-gray-300', 'dark:border-gray-600');
      row.querySelector('.selection-indicator').classList.add('border-seismo-orange', 'bg-seismo-orange');
      row.querySelector('.check-icon').classList.remove('hidden');
    }
  }

  updateButtons();
}

function updateButtons() {
  const pairBtn = document.getElementById('pair-btn');
  const clearBtn = document.getElementById('clear-selection-btn');

  pairBtn.disabled = !(selectedRecorder && selectedModem);
  clearBtn.disabled = !(selectedRecorder || selectedModem);
}

function clearSelection() {
  if (selectedRecorder) selectRecorder(selectedRecorder);
  if (selectedModem) selectModem(selectedModem);
}

function filterRecorders() {
  const searchTerm = document.getElementById('recorder-search').value.toLowerCase().trim();
  const hidePaired = document.getElementById('recorder-hide-paired').checked;
  const deployedOnly = document.getElementById('recorder-deployed-only').checked;

  let visibleRecorders = 0;

  document.querySelectorAll('.recorder-row').forEach(row => {
    const id = row.dataset.id.toLowerCase();
    const pairedWith = row.dataset.pairedWith;
    const deployed = row.dataset.deployed === 'true';

    let show = true;
    if (searchTerm && !id.includes(searchTerm)) show = false;
    if (hidePaired && pairedWith) show = false;
    if (deployedOnly && !deployed) show = false;

    row.style.display = show ? '' : 'none';
    if (show) visibleRecorders++;
  });

  document.getElementById('recorder-count').textContent = `(${visibleRecorders})`;
}

function filterModems() {
  const searchTerm = document.getElementById('modem-search').value.toLowerCase().trim();
  const hidePaired = document.getElementById('modem-hide-paired').checked;
  const deployedOnly = document.getElementById('modem-deployed-only').checked;

  let visibleModems = 0;

  document.querySelectorAll('.modem-row').forEach(row => {
    const id = row.dataset.id.toLowerCase();
    const ip = (row.dataset.ip || '').toLowerCase();
    const phone = (row.dataset.phone || '').toLowerCase();
    const pairedWith = row.dataset.pairedWith;
    const deployed = row.dataset.deployed === 'true';

    let show = true;
    if (searchTerm && !id.includes(searchTerm) && !ip.includes(searchTerm) && !phone.includes(searchTerm)) show = false;
    if (hidePaired && pairedWith) show = false;
    if (deployedOnly && !deployed) show = false;

    row.style.display = show ? '' : 'none';
    if (show) visibleModems++;
  });

  document.getElementById('modem-count').textContent = `(${visibleModems})`;
}

function saveScrollPositions() {
  const recordersList = document.getElementById('recorders-list').parentElement;
  const modemsList = document.getElementById('modems-list').parentElement;
  const pairingsList = document.getElementById('pairings-list').parentElement;

  sessionStorage.setItem('pairDevices_recorderScroll', recordersList.scrollTop);
  sessionStorage.setItem('pairDevices_modemScroll', modemsList.scrollTop);
  sessionStorage.setItem('pairDevices_pairingScroll', pairingsList.scrollTop);

  // Save recorder filter state
  sessionStorage.setItem('pairDevices_recorderSearch', document.getElementById('recorder-search').value);
  sessionStorage.setItem('pairDevices_recorderHidePaired', document.getElementById('recorder-hide-paired').checked);
  sessionStorage.setItem('pairDevices_recorderDeployedOnly', document.getElementById('recorder-deployed-only').checked);

  // Save modem filter state
  sessionStorage.setItem('pairDevices_modemSearch', document.getElementById('modem-search').value);
  sessionStorage.setItem('pairDevices_modemHidePaired', document.getElementById('modem-hide-paired').checked);
  sessionStorage.setItem('pairDevices_modemDeployedOnly', document.getElementById('modem-deployed-only').checked);
}

function restoreScrollPositions() {
  const recorderScroll = sessionStorage.getItem('pairDevices_recorderScroll');
  const modemScroll = sessionStorage.getItem('pairDevices_modemScroll');
  const pairingScroll = sessionStorage.getItem('pairDevices_pairingScroll');

  if (recorderScroll) {
    document.getElementById('recorders-list').parentElement.scrollTop = parseInt(recorderScroll);
  }
  if (modemScroll) {
    document.getElementById('modems-list').parentElement.scrollTop = parseInt(modemScroll);
  }
  if (pairingScroll) {
    document.getElementById('pairings-list').parentElement.scrollTop = parseInt(pairingScroll);
  }

  // Restore recorder filter state
  const recorderSearch = sessionStorage.getItem('pairDevices_recorderSearch');
  const recorderHidePaired = sessionStorage.getItem('pairDevices_recorderHidePaired');
  const recorderDeployedOnly = sessionStorage.getItem('pairDevices_recorderDeployedOnly');

  if (recorderSearch) document.getElementById('recorder-search').value = recorderSearch;
  if (recorderHidePaired === 'true') document.getElementById('recorder-hide-paired').checked = true;
  if (recorderDeployedOnly === 'true') document.getElementById('recorder-deployed-only').checked = true;

  // Restore modem filter state
  const modemSearch = sessionStorage.getItem('pairDevices_modemSearch');
  const modemHidePaired = sessionStorage.getItem('pairDevices_modemHidePaired');
  const modemDeployedOnly = sessionStorage.getItem('pairDevices_modemDeployedOnly');

  if (modemSearch) document.getElementById('modem-search').value = modemSearch;
  if (modemHidePaired === 'true') document.getElementById('modem-hide-paired').checked = true;
  if (modemDeployedOnly === 'true') document.getElementById('modem-deployed-only').checked = true;

  // Apply filters if any were set
  if (recorderSearch || recorderHidePaired === 'true' || recorderDeployedOnly === 'true') {
    filterRecorders();
  }
  if (modemSearch || modemHidePaired === 'true' || modemDeployedOnly === 'true') {
    filterModems();
  }

  // Clear stored values
  sessionStorage.removeItem('pairDevices_recorderScroll');
  sessionStorage.removeItem('pairDevices_modemScroll');
  sessionStorage.removeItem('pairDevices_pairingScroll');
  sessionStorage.removeItem('pairDevices_recorderSearch');
  sessionStorage.removeItem('pairDevices_recorderHidePaired');
  sessionStorage.removeItem('pairDevices_recorderDeployedOnly');
|
||||||
|
sessionStorage.removeItem('pairDevices_modemSearch');
|
||||||
|
sessionStorage.removeItem('pairDevices_modemHidePaired');
|
||||||
|
sessionStorage.removeItem('pairDevices_modemDeployedOnly');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Restore scroll positions on page load
|
||||||
|
document.addEventListener('DOMContentLoaded', restoreScrollPositions);
|
||||||
|
|
||||||
|
async function pairDevices() {
|
||||||
|
if (!selectedRecorder || !selectedModem) return;
|
||||||
|
|
||||||
|
const pairBtn = document.getElementById('pair-btn');
|
||||||
|
pairBtn.disabled = true;
|
||||||
|
pairBtn.textContent = 'Pairing...';
|
||||||
|
|
||||||
|
try {
|
||||||
|
const response = await fetch('/api/roster/pair-devices', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: { 'Content-Type': 'application/json' },
|
||||||
|
body: JSON.stringify({
|
||||||
|
recorder_id: selectedRecorder,
|
||||||
|
modem_id: selectedModem
|
||||||
|
})
|
||||||
|
});
|
||||||
|
|
||||||
|
const result = await response.json();
|
||||||
|
|
||||||
|
if (response.ok) {
|
||||||
|
showToast(`Paired ${selectedRecorder} with ${selectedModem}`, 'success');
|
||||||
|
// Save scroll positions before reload
|
||||||
|
saveScrollPositions();
|
||||||
|
setTimeout(() => window.location.reload(), 500);
|
||||||
|
} else {
|
||||||
|
showToast(result.detail || 'Failed to pair devices', 'error');
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
showToast('Error pairing devices: ' + error.message, 'error');
|
||||||
|
} finally {
|
||||||
|
pairBtn.disabled = false;
|
||||||
|
pairBtn.textContent = 'Pair Devices';
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async function unpairDevices(recorderId, modemId) {
|
||||||
|
if (!confirm(`Unpair ${recorderId} from ${modemId}?`)) return;
|
||||||
|
|
||||||
|
try {
|
||||||
|
const response = await fetch('/api/roster/unpair-devices', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: { 'Content-Type': 'application/json' },
|
||||||
|
body: JSON.stringify({
|
||||||
|
recorder_id: recorderId,
|
||||||
|
modem_id: modemId
|
||||||
|
})
|
||||||
|
});
|
||||||
|
|
||||||
|
const result = await response.json();
|
||||||
|
|
||||||
|
if (response.ok) {
|
||||||
|
showToast(`Unpaired ${recorderId} from ${modemId}`, 'success');
|
||||||
|
// Save scroll positions before reload
|
||||||
|
saveScrollPositions();
|
||||||
|
setTimeout(() => window.location.reload(), 500);
|
||||||
|
} else {
|
||||||
|
showToast(result.detail || 'Failed to unpair devices', 'error');
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
showToast('Error unpairing devices: ' + error.message, 'error');
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
function showToast(message, type = 'info') {
|
||||||
|
const toast = document.getElementById('toast');
|
||||||
|
toast.textContent = message;
|
||||||
|
toast.className = 'fixed bottom-4 right-4 px-4 py-3 rounded-lg shadow-lg transform transition-all duration-300 z-50';
|
||||||
|
|
||||||
|
if (type === 'success') {
|
||||||
|
toast.classList.add('bg-green-500', 'text-white');
|
||||||
|
} else if (type === 'error') {
|
||||||
|
toast.classList.add('bg-red-500', 'text-white');
|
||||||
|
} else {
|
||||||
|
toast.classList.add('bg-gray-800', 'text-white');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Show
|
||||||
|
toast.classList.remove('translate-y-full', 'opacity-0');
|
||||||
|
|
||||||
|
// Hide after 3 seconds
|
||||||
|
setTimeout(() => {
|
||||||
|
toast.classList.add('translate-y-full', 'opacity-0');
|
||||||
|
}, 3000);
|
||||||
|
}
|
||||||
|
</script>
|
||||||
|
|
||||||
|
<style>
|
||||||
|
.bg-seismo-orange\/10 {
|
||||||
|
background-color: rgb(249 115 22 / 0.1);
|
||||||
|
}
|
||||||
|
.dark\:bg-seismo-orange\/20:is(.dark *) {
|
||||||
|
background-color: rgb(249 115 22 / 0.2);
|
||||||
|
}
|
||||||
|
</style>
|
||||||
|
{% endblock %}
|

templates/partials/alerts/alert_dropdown.html (new file, +87 lines)
@@ -0,0 +1,87 @@
<!-- Alert Dropdown Content -->
<!-- Loaded via HTMX into the alert dropdown in the navbar -->

<div class="max-h-96 overflow-y-auto">
  {% if alerts %}
  {% for item in alerts %}
  <div class="p-3 border-b border-gray-200 dark:border-gray-700 hover:bg-gray-50 dark:hover:bg-gray-700/50 transition-colors
              {% if item.alert.severity == 'critical' %}bg-red-50 dark:bg-red-900/20{% endif %}">
    <div class="flex items-start gap-3">
      <!-- Severity icon -->
      {% if item.alert.severity == 'critical' %}
      <span class="text-red-500 flex-shrink-0 mt-0.5">
        <svg class="w-5 h-5" fill="currentColor" viewBox="0 0 20 20">
          <path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
        </svg>
      </span>
      {% elif item.alert.severity == 'warning' %}
      <span class="text-yellow-500 flex-shrink-0 mt-0.5">
        <svg class="w-5 h-5" fill="currentColor" viewBox="0 0 20 20">
          <path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
        </svg>
      </span>
      {% else %}
      <span class="text-blue-500 flex-shrink-0 mt-0.5">
        <svg class="w-5 h-5" fill="currentColor" viewBox="0 0 20 20">
          <path fill-rule="evenodd" d="M18 10a8 8 0 11-16 0 8 8 0 0116 0zm-7-4a1 1 0 11-2 0 1 1 0 012 0zM9 9a1 1 0 000 2v3a1 1 0 001 1h1a1 1 0 100-2v-3a1 1 0 00-1-1H9z" clip-rule="evenodd"/>
        </svg>
      </span>
      {% endif %}

      <div class="flex-1 min-w-0">
        <p class="text-sm font-medium text-gray-900 dark:text-white truncate">
          {{ item.alert.title }}
        </p>
        {% if item.alert.message %}
        <p class="text-xs text-gray-500 dark:text-gray-400 line-clamp-2 mt-0.5">
          {{ item.alert.message }}
        </p>
        {% endif %}
        <p class="text-xs text-gray-400 dark:text-gray-500 mt-1">
          {{ item.time_ago }}
        </p>
      </div>

      <!-- Actions -->
      <div class="flex items-center gap-1 flex-shrink-0">
        <button hx-post="/api/alerts/{{ item.alert.id }}/acknowledge"
                hx-swap="none"
                hx-on::after-request="htmx.trigger('#alert-dropdown-content', 'refresh')"
                class="p-1.5 text-gray-400 hover:text-green-600 dark:hover:text-green-400 rounded transition-colors"
                title="Acknowledge">
          <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M5 13l4 4L19 7"/>
          </svg>
        </button>
        <button hx-post="/api/alerts/{{ item.alert.id }}/dismiss"
                hx-swap="none"
                hx-on::after-request="htmx.trigger('#alert-dropdown-content', 'refresh')"
                class="p-1.5 text-gray-400 hover:text-red-600 dark:hover:text-red-400 rounded transition-colors"
                title="Dismiss">
          <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
          </svg>
        </button>
      </div>
    </div>
  </div>
  {% endfor %}
  {% else %}
  <div class="p-8 text-center">
    <svg class="w-12 h-12 mx-auto mb-3 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
      <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z"/>
    </svg>
    <p class="text-gray-500 dark:text-gray-400 text-sm">No active alerts</p>
    <p class="text-gray-400 dark:text-gray-500 text-xs mt-1">All systems operational</p>
  </div>
  {% endif %}
</div>

<!-- View all link -->
{% if total_count > 0 %}
<div class="p-3 border-t border-gray-200 dark:border-gray-700 text-center bg-gray-50 dark:bg-gray-800/50">
  <a href="/alerts" class="text-sm text-seismo-orange hover:text-seismo-navy dark:hover:text-orange-300 font-medium">
    View all {{ total_count }} alert{{ 's' if total_count != 1 else '' }}
  </a>
</div>
{% endif %}

templates/partials/alerts/alert_list.html (new file, +125 lines)
@@ -0,0 +1,125 @@
<!-- Alert List Partial -->
<!-- Full list of alerts for the alerts page -->

<div class="space-y-3">
  {% if alerts %}
  {% for item in alerts %}
  <div class="bg-white dark:bg-gray-800 rounded-lg border border-gray-200 dark:border-gray-700 p-4
              {% if item.alert.severity == 'critical' and item.alert.status == 'active' %}border-l-4 border-l-red-500{% endif %}
              {% if item.alert.severity == 'warning' and item.alert.status == 'active' %}border-l-4 border-l-yellow-500{% endif %}
              {% if item.alert.status != 'active' %}opacity-60{% endif %}">
    <div class="flex items-start gap-4">
      <!-- Severity icon -->
      <div class="flex-shrink-0">
        {% if item.alert.severity == 'critical' %}
        <div class="w-10 h-10 rounded-full bg-red-100 dark:bg-red-900/30 flex items-center justify-center">
          <svg class="w-5 h-5 text-red-600 dark:text-red-400" fill="currentColor" viewBox="0 0 20 20">
            <path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
          </svg>
        </div>
        {% elif item.alert.severity == 'warning' %}
        <div class="w-10 h-10 rounded-full bg-yellow-100 dark:bg-yellow-900/30 flex items-center justify-center">
          <svg class="w-5 h-5 text-yellow-600 dark:text-yellow-400" fill="currentColor" viewBox="0 0 20 20">
            <path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
          </svg>
        </div>
        {% else %}
        <div class="w-10 h-10 rounded-full bg-blue-100 dark:bg-blue-900/30 flex items-center justify-center">
          <svg class="w-5 h-5 text-blue-600 dark:text-blue-400" fill="currentColor" viewBox="0 0 20 20">
            <path fill-rule="evenodd" d="M18 10a8 8 0 11-16 0 8 8 0 0116 0zm-7-4a1 1 0 11-2 0 1 1 0 012 0zM9 9a1 1 0 000 2v3a1 1 0 001 1h1a1 1 0 100-2v-3a1 1 0 00-1-1H9z" clip-rule="evenodd"/>
          </svg>
        </div>
        {% endif %}
      </div>

      <!-- Content -->
      <div class="flex-1 min-w-0">
        <div class="flex items-center gap-2 mb-1">
          <h3 class="text-base font-semibold text-gray-900 dark:text-white">
            {{ item.alert.title }}
          </h3>
          <!-- Status badge -->
          {% if item.alert.status == 'active' %}
          <span class="px-2 py-0.5 text-xs font-medium rounded-full bg-red-100 text-red-700 dark:bg-red-900/30 dark:text-red-300">
            Active
          </span>
          {% elif item.alert.status == 'acknowledged' %}
          <span class="px-2 py-0.5 text-xs font-medium rounded-full bg-yellow-100 text-yellow-700 dark:bg-yellow-900/30 dark:text-yellow-300">
            Acknowledged
          </span>
          {% elif item.alert.status == 'resolved' %}
          <span class="px-2 py-0.5 text-xs font-medium rounded-full bg-green-100 text-green-700 dark:bg-green-900/30 dark:text-green-300">
            Resolved
          </span>
          {% elif item.alert.status == 'dismissed' %}
          <span class="px-2 py-0.5 text-xs font-medium rounded-full bg-gray-100 text-gray-600 dark:bg-gray-700 dark:text-gray-400">
            Dismissed
          </span>
          {% endif %}
        </div>

        {% if item.alert.message %}
        <p class="text-sm text-gray-600 dark:text-gray-300 mb-2">
          {{ item.alert.message }}
        </p>
        {% endif %}

        <div class="flex items-center gap-4 text-xs text-gray-500 dark:text-gray-400">
          <span>{{ item.time_ago }}</span>
          {% if item.alert.unit_id %}
          <span class="flex items-center gap-1">
            <svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
              <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 3v2m6-2v2M9 19v2m6-2v2M5 9H3m2 6H3m18-6h-2m2 6h-2M7 19h10a2 2 0 002-2V7a2 2 0 00-2-2H7a2 2 0 00-2 2v10a2 2 0 002 2zM9 9h6v6H9V9z"/>
            </svg>
            {{ item.alert.unit_id }}
          </span>
          {% endif %}
          <span class="capitalize">{{ item.alert.alert_type | replace('_', ' ') }}</span>
        </div>
      </div>

      <!-- Actions -->
      {% if item.alert.status == 'active' %}
      <div class="flex items-center gap-2 flex-shrink-0">
        <button hx-post="/api/alerts/{{ item.alert.id }}/acknowledge"
                hx-swap="none"
                hx-on::after-request="htmx.trigger('#alert-list', 'refresh')"
                class="px-3 py-1.5 text-sm bg-gray-100 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-lg hover:bg-gray-200 dark:hover:bg-gray-600 transition-colors">
          Acknowledge
        </button>
        <button hx-post="/api/alerts/{{ item.alert.id }}/resolve"
                hx-swap="none"
                hx-on::after-request="htmx.trigger('#alert-list', 'refresh')"
                class="px-3 py-1.5 text-sm bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-300 rounded-lg hover:bg-green-200 dark:hover:bg-green-900/50 transition-colors">
          Resolve
        </button>
        <button hx-post="/api/alerts/{{ item.alert.id }}/dismiss"
                hx-swap="none"
                hx-on::after-request="htmx.trigger('#alert-list', 'refresh')"
                class="px-3 py-1.5 text-sm text-gray-500 hover:text-red-600 dark:hover:text-red-400 transition-colors"
                title="Dismiss">
          <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
          </svg>
        </button>
      </div>
      {% endif %}
    </div>
  </div>
  {% endfor %}
  {% else %}
  <div class="bg-white dark:bg-gray-800 rounded-lg border border-gray-200 dark:border-gray-700 p-12 text-center">
    <svg class="w-16 h-16 mx-auto mb-4 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
      <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z"/>
    </svg>
    <h3 class="text-lg font-medium text-gray-900 dark:text-white mb-2">No alerts</h3>
    <p class="text-gray-500 dark:text-gray-400">
      {% if status_filter %}
      No {{ status_filter }} alerts found.
      {% else %}
      All systems are operating normally.
      {% endif %}
    </p>
  </div>
  {% endif %}
</div>

templates/partials/dashboard/todays_actions.html (new file, +131 lines)
@@ -0,0 +1,131 @@
<!-- Today's Scheduled Actions - Dashboard Card Content -->

<!-- Summary stats -->
<div class="flex items-center gap-4 mb-4 text-sm">
  {% if pending_count > 0 %}
  <div class="flex items-center gap-1.5">
    <span class="w-2 h-2 bg-yellow-400 rounded-full"></span>
    <span class="text-gray-600 dark:text-gray-400">{{ pending_count }} pending</span>
  </div>
  {% endif %}
  {% if completed_count > 0 %}
  <div class="flex items-center gap-1.5">
    <span class="w-2 h-2 bg-green-400 rounded-full"></span>
    <span class="text-gray-600 dark:text-gray-400">{{ completed_count }} completed</span>
  </div>
  {% endif %}
  {% if failed_count > 0 %}
  <div class="flex items-center gap-1.5">
    <span class="w-2 h-2 bg-red-400 rounded-full"></span>
    <span class="text-gray-600 dark:text-gray-400">{{ failed_count }} failed</span>
  </div>
  {% endif %}
  {% if total_count == 0 %}
  <span class="text-gray-500 dark:text-gray-400">No actions scheduled for today</span>
  {% endif %}
</div>

<!-- Actions list -->
{% if actions %}
<div class="space-y-2 max-h-64 overflow-y-auto">
  {% for item in actions %}
  <div class="flex items-center gap-3 p-2 rounded-lg
              {% if item.action.execution_status == 'pending' %}bg-yellow-50 dark:bg-yellow-900/20
              {% elif item.action.execution_status == 'completed' %}bg-green-50 dark:bg-green-900/20
              {% elif item.action.execution_status == 'failed' %}bg-red-50 dark:bg-red-900/20
              {% else %}bg-gray-50 dark:bg-gray-700/50{% endif %}">

    <!-- Action type icon -->
    <div class="flex-shrink-0">
      {% if item.action.action_type == 'start' %}
      <div class="w-8 h-8 rounded-full bg-green-100 dark:bg-green-900/30 flex items-center justify-center">
        <svg class="w-4 h-4 text-green-600 dark:text-green-400" fill="currentColor" viewBox="0 0 20 20">
          <path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM9.555 7.168A1 1 0 008 8v4a1 1 0 001.555.832l3-2a1 1 0 000-1.664l-3-2z" clip-rule="evenodd"/>
        </svg>
      </div>
      {% elif item.action.action_type == 'stop' %}
      <div class="w-8 h-8 rounded-full bg-red-100 dark:bg-red-900/30 flex items-center justify-center">
        <svg class="w-4 h-4 text-red-600 dark:text-red-400" fill="currentColor" viewBox="0 0 20 20">
          <path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM8 7a1 1 0 00-1 1v4a1 1 0 001 1h4a1 1 0 001-1V8a1 1 0 00-1-1H8z" clip-rule="evenodd"/>
        </svg>
      </div>
      {% elif item.action.action_type == 'download' %}
      <div class="w-8 h-8 rounded-full bg-blue-100 dark:bg-blue-900/30 flex items-center justify-center">
        <svg class="w-4 h-4 text-blue-600 dark:text-blue-400" fill="currentColor" viewBox="0 0 20 20">
          <path fill-rule="evenodd" d="M3 17a1 1 0 011-1h12a1 1 0 110 2H4a1 1 0 01-1-1zm3.293-7.707a1 1 0 011.414 0L9 10.586V3a1 1 0 112 0v7.586l1.293-1.293a1 1 0 111.414 1.414l-3 3a1 1 0 01-1.414 0l-3-3a1 1 0 010-1.414z" clip-rule="evenodd"/>
        </svg>
      </div>
      {% endif %}
    </div>

    <!-- Action details -->
    <div class="flex-1 min-w-0">
      <div class="flex items-center gap-2">
        <span class="font-medium text-sm text-gray-900 dark:text-white capitalize">{{ item.action.action_type }}</span>

        <!-- Status indicator -->
        {% if item.action.execution_status == 'pending' %}
        <span class="text-xs text-yellow-600 dark:text-yellow-400">
          {{ item.action.scheduled_time|local_datetime('%H:%M') }}
        </span>
        {% elif item.action.execution_status == 'completed' %}
        <svg class="w-4 h-4 text-green-500" fill="currentColor" viewBox="0 0 20 20">
          <path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zm3.707-9.293a1 1 0 00-1.414-1.414L9 10.586 7.707 9.293a1 1 0 00-1.414 1.414l2 2a1 1 0 001.414 0l4-4z" clip-rule="evenodd"/>
        </svg>
        {% elif item.action.execution_status == 'failed' %}
        <svg class="w-4 h-4 text-red-500" fill="currentColor" viewBox="0 0 20 20">
          <path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM8.707 7.293a1 1 0 00-1.414 1.414L8.586 10l-1.293 1.293a1 1 0 101.414 1.414L10 11.414l1.293 1.293a1 1 0 001.414-1.414L11.414 10l1.293-1.293a1 1 0 00-1.414-1.414L10 8.586 8.707 7.293z" clip-rule="evenodd"/>
        </svg>
        {% endif %}
      </div>

      <!-- Location/Project info -->
      <div class="text-xs text-gray-500 dark:text-gray-400 truncate">
        {% if item.location %}
        <a href="/projects/{{ item.action.project_id }}/nrl/{{ item.location.id }}"
           class="hover:text-seismo-orange">
          {{ item.location.name }}
        </a>
        {% elif item.project %}
        <a href="/projects/{{ item.project.id }}" class="hover:text-seismo-orange">
          {{ item.project.name }}
        </a>
        {% endif %}
      </div>

      <!-- Result details for completed/failed -->
      {% if item.action.execution_status == 'completed' and item.result %}
      {% if item.result.cycle_response and item.result.cycle_response.downloaded_folder %}
      <div class="text-xs text-green-600 dark:text-green-400">
        {{ item.result.cycle_response.downloaded_folder }}
        {% if item.result.cycle_response.download_success %}downloaded{% endif %}
      </div>
      {% endif %}
      {% elif item.action.execution_status == 'failed' and item.action.error_message %}
      <div class="text-xs text-red-600 dark:text-red-400 truncate" title="{{ item.action.error_message }}">
        {{ item.action.error_message[:50] }}{% if item.action.error_message|length > 50 %}...{% endif %}
      </div>
      {% endif %}
    </div>

    <!-- Time -->
    <div class="flex-shrink-0 text-right">
      {% if item.action.execution_status == 'pending' %}
      <span class="text-xs text-gray-500 dark:text-gray-400">Scheduled</span>
      {% elif item.action.executed_at %}
      <span class="text-xs text-gray-500 dark:text-gray-400">
        {{ item.action.executed_at|local_datetime('%H:%M') }}
      </span>
      {% endif %}
    </div>
  </div>
  {% endfor %}
</div>
{% else %}
<div class="text-center py-6 text-gray-500 dark:text-gray-400">
  <svg class="w-10 h-10 mx-auto mb-2 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"></path>
  </svg>
  <p class="text-sm">No scheduled actions for today</p>
</div>
{% endif %}

@@ -51,7 +51,7 @@
 {% for unit in units %}
 <tr class="hover:bg-gray-50 dark:hover:bg-gray-700 transition-colors"
 data-device-type="{{ unit.device_type }}"
-data-status="{% if unit.deployed %}deployed{% elif unit.retired %}retired{% elif unit.ignored %}ignored{% else %}benched{% endif %}"
+data-status="{% if unit.deployed %}deployed{% elif unit.out_for_calibration %}out_for_calibration{% elif unit.retired %}retired{% elif unit.ignored %}ignored{% elif unit.allocated %}allocated{% else %}benched{% endif %}"
 data-health="{{ unit.status }}"
 data-id="{{ unit.id }}"
 data-type="{{ unit.device_type }}"
@@ -60,7 +60,13 @@
 data-note="{{ unit.note if unit.note else '' }}">
 <td class="px-6 py-4 whitespace-nowrap">
 <div class="flex items-center space-x-2">
-{% if unit.status == 'OK' %}
+{% if unit.out_for_calibration %}
+<span class="w-3 h-3 rounded-full bg-purple-500" title="Out for Calibration"></span>
+{% elif unit.allocated %}
+<span class="w-3 h-3 rounded-full bg-orange-400" title="Allocated"></span>
+{% elif not unit.deployed %}
+<span class="w-3 h-3 rounded-full bg-gray-400 dark:bg-gray-500" title="Benched"></span>
+{% elif unit.status == 'OK' %}
 <span class="w-3 h-3 rounded-full bg-green-500" title="OK"></span>
 {% elif unit.status == 'Pending' %}
 <span class="w-3 h-3 rounded-full bg-yellow-500" title="Pending"></span>
@@ -70,6 +76,10 @@
 
 {% if unit.deployed %}
 <span class="w-2 h-2 rounded-full bg-blue-500" title="Deployed"></span>
+{% elif unit.out_for_calibration %}
+<span class="w-2 h-2 rounded-full bg-purple-400" title="Out for Calibration"></span>
+{% elif unit.allocated %}
+<span class="w-2 h-2 rounded-full bg-orange-400" title="Allocated"></span>
 {% else %}
 <span class="w-2 h-2 rounded-full bg-gray-300 dark:bg-gray-600" title="Benched"></span>
 {% endif %}
@@ -104,14 +114,19 @@
 {% if unit.phone_number %}
 <div>{{ unit.phone_number }}</div>
 {% endif %}
-{% if unit.hardware_model %}
-<div class="text-gray-500 dark:text-gray-500">{{ unit.hardware_model }}</div>
+{% if unit.deployed_with_unit_id %}
+<div>
+<span class="text-gray-500 dark:text-gray-500">Linked:</span>
+<a href="/unit/{{ unit.deployed_with_unit_id }}" class="text-seismo-orange hover:underline font-medium">
+{{ unit.deployed_with_unit_id }}
+</a>
+</div>
 {% endif %}
 {% else %}
-{% if unit.next_calibration_due %}
+{% if unit.last_calibrated %}
 <div>
-<span class="text-gray-500 dark:text-gray-500">Cal Due:</span>
-<span class="font-medium">{{ unit.next_calibration_due }}</span>
+<span class="text-gray-500 dark:text-gray-500">Last Cal:</span>
+<span class="font-medium">{{ unit.last_calibrated }}</span>
 </div>
 {% endif %}
 {% if unit.deployed_with_modem_id %}
@@ -126,7 +141,7 @@
 </div>
 </td>
 <td class="px-6 py-4 whitespace-nowrap">
-<div class="text-sm text-gray-500 dark:text-gray-400">{{ unit.last_seen }}</div>
+<div class="text-sm text-gray-500 dark:text-gray-400 last-seen-cell" data-iso="{{ unit.last_seen }}">{{ unit.last_seen }}</div>
 </td>
 <td class="px-6 py-4 whitespace-nowrap">
 <div class="text-sm
@@ -196,14 +211,20 @@
 <div class="unit-card device-card"
 onclick="openUnitModal('{{ unit.id }}', '{{ unit.status }}', '{{ unit.age }}')"
 data-device-type="{{ unit.device_type }}"
-data-status="{% if unit.deployed %}deployed{% elif unit.retired %}retired{% elif unit.ignored %}ignored{% else %}benched{% endif %}"
+data-status="{% if unit.deployed %}deployed{% elif unit.out_for_calibration %}out_for_calibration{% elif unit.retired %}retired{% elif unit.ignored %}ignored{% elif unit.allocated %}allocated{% else %}benched{% endif %}"
 data-health="{{ unit.status }}"
 data-unit-id="{{ unit.id }}"
 data-age="{{ unit.age }}">
 <!-- Header: Status Dot + Unit ID + Status Badge -->
 <div class="flex items-center justify-between mb-2">
 <div class="flex items-center gap-2">
-{% if unit.status == 'OK' %}
+{% if unit.out_for_calibration %}
+<span class="w-4 h-4 rounded-full bg-purple-500" title="Out for Calibration"></span>
+{% elif unit.allocated %}
+<span class="w-4 h-4 rounded-full bg-orange-400" title="Allocated"></span>
+{% elif not unit.deployed %}
+<span class="w-4 h-4 rounded-full bg-gray-400 dark:bg-gray-500" title="Benched"></span>
+{% elif unit.status == 'OK' %}
 <span class="w-4 h-4 rounded-full bg-green-500" title="OK"></span>
 {% elif unit.status == 'Pending' %}
 <span class="w-4 h-4 rounded-full bg-yellow-500" title="Pending"></span>
@@ -215,12 +236,14 @@
 <span class="font-bold text-lg text-seismo-orange dark:text-seismo-orange">{{ unit.id }}</span>
 </div>
 <span class="px-3 py-1 rounded-full text-xs font-medium
-{% if unit.status == 'OK' %}bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-300
+{% if unit.out_for_calibration %}bg-purple-100 dark:bg-purple-900/30 text-purple-800 dark:text-purple-300
+{% elif unit.allocated %}bg-orange-100 dark:bg-orange-900/30 text-orange-800 dark:text-orange-300
+{% elif unit.status == 'OK' %}bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-300
 {% elif unit.status == 'Pending' %}bg-yellow-100 dark:bg-yellow-900/30 text-yellow-800 dark:text-yellow-300
 {% elif unit.status == 'Missing' %}bg-red-100 dark:bg-red-900/30 text-red-800 dark:text-red-300
 {% else %}bg-gray-100 dark:bg-gray-700 text-gray-600 dark:text-gray-400
 {% endif %}">
-{% if unit.status in ['N/A', 'Unknown'] %}Benched{% else %}{{ unit.status }}{% endif %}
+{% if unit.out_for_calibration %}Out for Cal{% elif unit.allocated %}Allocated{% elif unit.status in ['N/A', 'Unknown'] %}Benched{% else %}{{ unit.status }}{% endif %}
 </span>
 </div>
 
@@ -230,6 +253,10 @@
 <span class="px-2 py-1 rounded-full bg-purple-100 dark:bg-purple-900/30 text-purple-800 dark:text-purple-300 text-xs font-medium">
|
<span class="px-2 py-1 rounded-full bg-purple-100 dark:bg-purple-900/30 text-purple-800 dark:text-purple-300 text-xs font-medium">
|
||||||
Modem
|
Modem
|
||||||
</span>
|
</span>
|
||||||
|
{% elif unit.device_type == 'slm' %}
|
||||||
|
<span class="px-2 py-1 rounded-full bg-orange-100 dark:bg-orange-900/30 text-orange-800 dark:text-orange-300 text-xs font-medium">
|
||||||
|
SLM
|
||||||
|
</span>
|
||||||
{% else %}
|
{% else %}
|
||||||
<span class="px-2 py-1 rounded-full bg-blue-100 dark:bg-blue-900/30 text-blue-800 dark:text-blue-300 text-xs font-medium">
|
<span class="px-2 py-1 rounded-full bg-blue-100 dark:bg-blue-900/30 text-blue-800 dark:text-blue-300 text-xs font-medium">
|
||||||
Seismograph
|
Seismograph
|
||||||
@@ -266,6 +293,10 @@
|
|||||||
<span class="text-xs text-blue-600 dark:text-blue-400">
|
<span class="text-xs text-blue-600 dark:text-blue-400">
|
||||||
⚡ Deployed
|
⚡ Deployed
|
||||||
</span>
|
</span>
|
||||||
|
{% elif unit.out_for_calibration %}
|
||||||
|
<span class="text-xs text-purple-600 dark:text-purple-400">
|
||||||
|
🔧 Out for Calibration
|
||||||
|
</span>
|
||||||
{% else %}
|
{% else %}
|
||||||
<span class="text-xs text-gray-500 dark:text-gray-500">
|
<span class="text-xs text-gray-500 dark:text-gray-500">
|
||||||
📦 Benched
|
📦 Benched
|
||||||
@@ -345,6 +376,39 @@
|
|||||||
</style>
|
</style>
|
||||||
|
|
||||||
<script>
|
<script>
|
||||||
|
(function() {
|
||||||
|
// User's configured timezone from settings (defaults to America/New_York)
|
||||||
|
const userTimezone = '{{ user_timezone | default("America/New_York") }}';
|
||||||
|
|
||||||
|
// Format ISO timestamp to human-readable format in user's timezone
|
||||||
|
function formatLastSeenLocal(isoString) {
|
||||||
|
if (!isoString || isoString === 'Never' || isoString === 'N/A') {
|
||||||
|
return isoString || 'Never';
|
||||||
|
}
|
||||||
|
try {
|
||||||
|
const date = new Date(isoString);
|
||||||
|
if (isNaN(date.getTime())) return isoString;
|
||||||
|
|
||||||
|
// Format in user's configured timezone
|
||||||
|
return date.toLocaleString('en-US', {
|
||||||
|
timeZone: userTimezone,
|
||||||
|
month: 'short',
|
||||||
|
day: 'numeric',
|
||||||
|
hour: 'numeric',
|
||||||
|
minute: '2-digit',
|
||||||
|
hour12: true
|
||||||
|
});
|
||||||
|
} catch (e) {
|
||||||
|
return isoString;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Format all last-seen cells on page load
|
||||||
|
document.querySelectorAll('.last-seen-cell').forEach(cell => {
|
||||||
|
const isoDate = cell.getAttribute('data-iso');
|
||||||
|
cell.textContent = formatLastSeenLocal(isoDate);
|
||||||
|
});
|
||||||
|
|
||||||
// Update timestamp
|
// Update timestamp
|
||||||
const timestampElement = document.getElementById('last-updated');
|
const timestampElement = document.getElementById('last-updated');
|
||||||
if (timestampElement) {
|
if (timestampElement) {
|
||||||
@@ -365,20 +429,23 @@
|
|||||||
};
|
};
|
||||||
return acc;
|
return acc;
|
||||||
}, {});
|
}, {});
|
||||||
|
})();
|
||||||
|
|
||||||
// Sorting state
|
// Sorting state (needs to persist across swaps)
|
||||||
let currentSort = { column: null, direction: 'asc' };
|
if (typeof window.currentSort === 'undefined') {
|
||||||
|
window.currentSort = { column: null, direction: 'asc' };
|
||||||
|
}
|
||||||
|
|
||||||
function sortTable(column) {
|
function sortTable(column) {
|
||||||
const tbody = document.getElementById('roster-tbody');
|
const tbody = document.getElementById('roster-tbody');
|
||||||
const rows = Array.from(tbody.getElementsByTagName('tr'));
|
const rows = Array.from(tbody.getElementsByTagName('tr'));
|
||||||
|
|
||||||
// Determine sort direction
|
// Determine sort direction
|
||||||
if (currentSort.column === column) {
|
if (window.currentSort.column === column) {
|
||||||
currentSort.direction = currentSort.direction === 'asc' ? 'desc' : 'asc';
|
window.currentSort.direction = window.currentSort.direction === 'asc' ? 'desc' : 'asc';
|
||||||
} else {
|
} else {
|
||||||
currentSort.column = column;
|
window.currentSort.column = column;
|
||||||
currentSort.direction = 'asc';
|
window.currentSort.direction = 'asc';
|
||||||
}
|
}
|
||||||
|
|
||||||
// Sort rows
|
// Sort rows
|
||||||
@@ -406,8 +473,8 @@
|
|||||||
bVal = bVal.toLowerCase();
|
bVal = bVal.toLowerCase();
|
||||||
}
|
}
|
||||||
|
|
||||||
if (aVal < bVal) return currentSort.direction === 'asc' ? -1 : 1;
|
if (aVal < bVal) return window.currentSort.direction === 'asc' ? -1 : 1;
|
||||||
if (aVal > bVal) return currentSort.direction === 'asc' ? 1 : -1;
|
if (aVal > bVal) return window.currentSort.direction === 'asc' ? 1 : -1;
|
||||||
return 0;
|
return 0;
|
||||||
});
|
});
|
||||||
|
|
||||||
@@ -443,10 +510,10 @@
|
|||||||
});
|
});
|
||||||
|
|
||||||
// Set current indicator
|
// Set current indicator
|
||||||
if (currentSort.column) {
|
if (window.currentSort.column) {
|
||||||
const indicator = document.querySelector(`.sort-indicator[data-column="${currentSort.column}"]`);
|
const indicator = document.querySelector(`.sort-indicator[data-column="${window.currentSort.column}"]`);
|
||||||
if (indicator) {
|
if (indicator) {
|
||||||
indicator.className = `sort-indicator ${currentSort.direction}`;
|
indicator.className = `sort-indicator ${window.currentSort.direction}`;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
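The one behavioral change in the script above is worth spelling out: the sort state moves from a script-local `let currentSort` to a guarded `window.currentSort` so that, as the updated comment says, it persists when the inline `<script>` is re-executed after a partial content swap. A minimal sketch of the pattern (using `globalThis`, which is `window` in a browser; the `runScript` wrapper is hypothetical, standing in for the inline script re-running):

```javascript
// Stand-in for the page's inline <script>; in the app it re-runs on every swap.
function runScript() {
  // Guard: initialize only once, so sort state survives re-execution.
  // (A bare `let currentSort = ...` here would reset it on every run.)
  if (typeof globalThis.currentSort === 'undefined') {
    globalThis.currentSort = { column: null, direction: 'asc' };
  }
}

runScript();                                 // initial page load
globalThis.currentSort.column = 'status';    // user clicks a column header
globalThis.currentSort.direction = 'desc';

runScript();                                 // content swap re-executes the script
console.log(globalThis.currentSort.column);    // "status"
console.log(globalThis.currentSort.direction); // "desc"
```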
templates/partials/fleet_calendar/available_units.html (new file, 40 lines)
@@ -0,0 +1,40 @@
+<!-- Available Units for Assignment -->
+{% if units %}
+<div class="space-y-1">
+{% for unit in units %}
+<label class="flex items-center gap-3 p-2 hover:bg-gray-50 dark:hover:bg-gray-700 rounded cursor-pointer">
+<input type="checkbox" name="unit_ids" value="{{ unit.id }}"
+class="w-4 h-4 text-blue-600 focus:ring-blue-500 rounded border-gray-300 dark:border-gray-600">
+<span class="font-medium text-gray-900 dark:text-white">{{ unit.id }}</span>
+<span class="text-sm text-gray-500 dark:text-gray-400 flex-1">
+{% if unit.last_calibrated %}
+Cal: {{ unit.last_calibrated }}
+{% else %}
+No cal date
+{% endif %}
+</span>
+{% if unit.calibration_status == 'expiring_soon' %}
+<span class="text-xs px-2 py-0.5 bg-yellow-100 dark:bg-yellow-900/30 text-yellow-700 dark:text-yellow-400 rounded-full">
+Expiring soon
+</span>
+{% endif %}
+{% if unit.deployed %}
+<span class="text-xs px-2 py-0.5 bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400 rounded-full">
+Deployed
+</span>
+{% else %}
+<span class="text-xs px-2 py-0.5 bg-gray-100 dark:bg-gray-700 text-gray-600 dark:text-gray-400 rounded-full">
+Benched
+</span>
+{% endif %}
+</label>
+{% endfor %}
+</div>
+{% else %}
+<p class="text-gray-500 dark:text-gray-400 text-sm py-4 text-center">
+No units available for this date range.
+{% if start_date and end_date %}
+<br><span class="text-xs">All units are either reserved, have expired calibrations, or are retired.</span>
+{% endif %}
+</p>
+{% endif %}
templates/partials/fleet_calendar/day_detail.html (new file, 186 lines)
@@ -0,0 +1,186 @@
+<!-- Day Detail Panel Content -->
+<div class="flex items-center justify-between mb-6">
+<h2 class="text-xl font-semibold text-gray-900 dark:text-white">{{ date_display }}</h2>
+<button onclick="closeDayPanel()" class="text-gray-400 hover:text-gray-600 dark:hover:text-gray-300">
+<svg class="w-6 h-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
+</svg>
+</button>
+</div>
+
+<!-- Summary Stats -->
+<div class="grid grid-cols-2 gap-3 mb-6">
+<div class="bg-green-50 dark:bg-green-900/20 rounded-lg p-3 text-center">
+<p class="text-2xl font-bold text-green-700 dark:text-green-300">{{ day_data.counts.available }}</p>
+<p class="text-xs text-green-600 dark:text-green-400">Available</p>
+</div>
+<div class="bg-blue-50 dark:bg-blue-900/20 rounded-lg p-3 text-center">
+<p class="text-2xl font-bold text-blue-700 dark:text-blue-300">{{ day_data.counts.reserved }}</p>
+<p class="text-xs text-blue-600 dark:text-blue-400">Reserved</p>
+</div>
+<div class="bg-yellow-50 dark:bg-yellow-900/20 rounded-lg p-3 text-center">
+<p class="text-2xl font-bold text-yellow-700 dark:text-yellow-300">{{ day_data.counts.expiring_soon }}</p>
+<p class="text-xs text-yellow-600 dark:text-yellow-400">Expiring Soon</p>
+</div>
+<div class="bg-red-50 dark:bg-red-900/20 rounded-lg p-3 text-center">
+<p class="text-2xl font-bold text-red-700 dark:text-red-300">{{ day_data.counts.expired }}</p>
+<p class="text-xs text-red-600 dark:text-red-400">Cal. Expired</p>
+</div>
+</div>
+
+<!-- Calibration Expiring TODAY - Important alert -->
+{% if day_data.cal_expiring_today %}
+<div class="mb-6 p-3 bg-red-50 dark:bg-red-900/30 border border-red-200 dark:border-red-800 rounded-lg">
+<h3 class="text-sm font-semibold text-red-700 dark:text-red-400 mb-2 flex items-center gap-2">
+<svg class="w-4 h-4" fill="currentColor" viewBox="0 0 20 20">
+<path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
+</svg>
+Calibration Expires Today ({{ day_data.cal_expiring_today|length }})
+</h3>
+<div class="space-y-1">
+{% for unit in day_data.cal_expiring_today %}
+<div class="flex items-center justify-between p-2 bg-white dark:bg-gray-800 rounded text-sm">
+<a href="/unit/{{ unit.id }}" class="font-medium text-red-600 dark:text-red-400 hover:underline">
+{{ unit.id }}
+</a>
+<span class="text-red-500 text-xs">
+Last cal: {{ unit.last_calibrated }}
+</span>
+</div>
+{% endfor %}
+</div>
+</div>
+{% endif %}
+
+<!-- Reservations on this date -->
+{% if day_data.reservations %}
+<div class="mb-6">
+<h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">Reservations</h3>
+{% for res in day_data.reservations %}
+<div class="reservation-bar mb-2" style="background-color: {{ res.color }}20; border-left: 4px solid {{ res.color }};">
+<div class="flex-1">
+<p class="font-medium text-gray-900 dark:text-white">{{ res.name }}</p>
+<p class="text-xs text-gray-500 dark:text-gray-400">
+{{ res.start_date }} - {{ res.end_date }}
+</p>
+</div>
+<div class="text-right">
+<p class="font-semibold text-gray-900 dark:text-white">
+{% if res.assignment_type == 'quantity' %}
+{{ res.assigned_count }}/{{ res.quantity_needed or '?' }}
+{% else %}
+{{ res.assigned_count }} units
+{% endif %}
+</p>
+</div>
+</div>
+{% endfor %}
+</div>
+{% endif %}
+
+<!-- Available Units -->
+{% if day_data.available_units %}
+<div class="mb-6">
+<h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">
+Available Units ({{ day_data.available_units|length }})
+</h3>
+<div class="max-h-48 overflow-y-auto space-y-1">
+{% for unit in day_data.available_units %}
+<div class="flex items-center justify-between p-2 bg-gray-50 dark:bg-gray-700/50 rounded text-sm">
+<a href="/unit/{{ unit.id }}" class="font-medium text-blue-600 dark:text-blue-400 hover:underline">
+{{ unit.id }}
+</a>
+<span class="text-gray-500 dark:text-gray-400 text-xs">
+{% if unit.last_calibrated %}
+Cal: {{ unit.last_calibrated }}
+{% else %}
+No cal date
+{% endif %}
+</span>
+</div>
+{% endfor %}
+</div>
+</div>
+{% endif %}
+
+<!-- Reserved Units -->
+{% if day_data.reserved_units %}
+<div class="mb-6">
+<h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">
+Reserved Units ({{ day_data.reserved_units|length }})
+</h3>
+<div class="max-h-48 overflow-y-auto space-y-1">
+{% for unit in day_data.reserved_units %}
+<div class="flex items-center justify-between p-2 bg-blue-50 dark:bg-blue-900/20 rounded text-sm">
+<a href="/unit/{{ unit.id }}" class="font-medium text-blue-600 dark:text-blue-400 hover:underline">
+{{ unit.id }}
+</a>
+<span class="text-blue-600 dark:text-blue-400 text-xs">
+{{ unit.reservation_name }}
+</span>
+</div>
+{% endfor %}
+</div>
+</div>
+{% endif %}
+
+<!-- Calibration Expired -->
+{% if day_data.expired_units %}
+<div class="mb-6">
+<h3 class="text-sm font-semibold text-red-600 dark:text-red-400 mb-3">
+Calibration Expired ({{ day_data.expired_units|length }})
+</h3>
+<div class="max-h-48 overflow-y-auto space-y-1">
+{% for unit in day_data.expired_units %}
+<div class="flex items-center justify-between p-2 bg-red-50 dark:bg-red-900/20 rounded text-sm">
+<a href="/unit/{{ unit.id }}" class="font-medium text-red-600 dark:text-red-400 hover:underline">
+{{ unit.id }}
+</a>
+<span class="text-red-500 text-xs">
+Expired: {{ unit.expiry_date }}
+</span>
+</div>
+{% endfor %}
+</div>
+</div>
+{% endif %}
+
+<!-- Needs Calibration -->
+{% if day_data.needs_calibration_units %}
+<div class="mb-6">
+<h3 class="text-sm font-semibold text-gray-600 dark:text-gray-400 mb-3">
+Needs Calibration Date ({{ day_data.needs_calibration_units|length }})
+</h3>
+<div class="max-h-32 overflow-y-auto space-y-1">
+{% for unit in day_data.needs_calibration_units %}
+<div class="flex items-center justify-between p-2 bg-gray-100 dark:bg-gray-700/50 rounded text-sm">
+<a href="/unit/{{ unit.id }}" class="font-medium text-gray-600 dark:text-gray-400 hover:underline">
+{{ unit.id }}
+</a>
+<span class="text-gray-400 text-xs">No cal date set</span>
+</div>
+{% endfor %}
+</div>
+</div>
+{% endif %}
+
+<!-- Expiring Soon (informational) -->
+{% if day_data.expiring_soon_units %}
+<div class="mb-6">
+<h3 class="text-sm font-semibold text-yellow-600 dark:text-yellow-400 mb-3">
+Calibration Expiring Soon ({{ day_data.expiring_soon_units|length }})
+</h3>
+<div class="max-h-32 overflow-y-auto space-y-1">
+{% for unit in day_data.expiring_soon_units %}
+<div class="flex items-center justify-between p-2 bg-yellow-50 dark:bg-yellow-900/20 rounded text-sm">
+<a href="/unit/{{ unit.id }}" class="font-medium text-yellow-700 dark:text-yellow-400 hover:underline">
+{{ unit.id }}
+</a>
+<span class="text-yellow-600 text-xs">
+Expires: {{ unit.expiry_date }}
+</span>
+</div>
+{% endfor %}
+</div>
+</div>
+{% endif %}
templates/partials/fleet_calendar/reservations_list.html (new file, 203 lines)
@@ -0,0 +1,203 @@
+<!-- Reservations List -->
+{% if reservations %}
+<div class="space-y-2">
+{% for item in reservations %}
+{% set res = item.reservation %}
+{% set card_id = "res-card-" ~ res.id %}
+{% set detail_id = "res-detail-" ~ res.id %}
+
+<div class="rounded-lg border border-gray-200 dark:border-gray-700"
+style="border-left: 4px solid {{ res.color }};">
+
+<!-- Header row (always visible, clickable) -->
+<div class="res-card-header flex items-center justify-between p-4 cursor-pointer hover:bg-gray-50 dark:hover:bg-gray-700/50 transition-colors select-none"
+data-res-id="{{ res.id }}"
+onclick="toggleResCard('{{ res.id }}')">
+
+<div class="flex-1 min-w-0">
+<div class="flex items-center gap-2 flex-wrap">
+<h3 class="font-semibold text-gray-900 dark:text-white">{{ res.name }}</h3>
+{% if res.device_type == 'slm' %}
+<span class="px-2 py-0.5 text-xs font-medium bg-purple-100 dark:bg-purple-900/30 text-purple-700 dark:text-purple-400 rounded">SLM</span>
+{% else %}
+<span class="px-2 py-0.5 text-xs font-medium bg-blue-100 dark:bg-blue-900/30 text-blue-700 dark:text-blue-400 rounded">Seismograph</span>
+{% endif %}
+{% if item.has_conflicts %}
+<span class="px-2 py-0.5 text-xs font-medium bg-amber-100 dark:bg-amber-900/30 text-amber-700 dark:text-amber-400 rounded"
+title="{{ item.conflict_count }} unit(s) will need a calibration swap during this job">
+{{ item.conflict_count }} cal swap{{ 's' if item.conflict_count != 1 else '' }}
+</span>
+{% endif %}
+</div>
+<p class="text-sm text-gray-500 dark:text-gray-400 mt-0.5">
+{{ res.start_date.strftime('%b %d, %Y') }} –
+{% if res.end_date %}
+{{ res.end_date.strftime('%b %d, %Y') }}
+{% elif res.end_date_tbd %}
+<span class="text-yellow-600 dark:text-yellow-400 font-medium">TBD</span>
+{% if res.estimated_end_date %}
+<span class="text-gray-400">(est. {{ res.estimated_end_date.strftime('%b %d, %Y') }})</span>
+{% endif %}
+{% else %}
+<span class="text-yellow-600 dark:text-yellow-400">Ongoing</span>
+{% endif %}
+</p>
+</div>
+
+<!-- Counts -->
+<div class="flex flex-col items-end gap-1 mx-4 flex-shrink-0">
+{% set full = item.assigned_count == item.location_count and item.location_count > 0 %}
+{% set remaining = item.location_count - item.assigned_count %}
+<!-- Number row -->
+<div class="flex items-baseline gap-2">
+<span class="text-xs text-gray-400 dark:text-gray-500">est. {% if res.estimated_units %}{{ res.estimated_units }}{% else %}—{% endif %}</span>
+<span class="text-gray-300 dark:text-gray-600">·</span>
+<span class="text-base font-bold {% if full %}text-green-600 dark:text-green-400{% elif item.assigned_count == 0 %}text-gray-400 dark:text-gray-500{% else %}text-amber-500 dark:text-amber-400{% endif %}">
+{{ item.assigned_count }}/{{ item.location_count }}
+</span>
+{% if remaining > 0 %}
+<span class="text-xs text-amber-500 dark:text-amber-400 whitespace-nowrap">({{ remaining }} more)</span>
+{% endif %}
+</div>
+<!-- Progress squares -->
+{% if item.location_count > 0 %}
+<div class="flex gap-0.5">
+{% for i in range(item.location_count) %}
+<span class="w-3 h-3 rounded-sm {% if i < item.assigned_count %}{% if full %}bg-green-500{% else %}bg-amber-500{% endif %}{% else %}bg-gray-300 dark:bg-gray-600{% endif %}"></span>
+{% endfor %}
+</div>
+{% endif %}
+</div>
+
+<!-- Action buttons -->
+<div class="flex items-center gap-1 flex-shrink-0">
+<!-- Assign units (always visible) -->
+<button onclick="event.stopPropagation(); openPlanner('{{ res.id }}')"
+class="p-2 text-gray-400 hover:text-green-600 dark:hover:text-green-400 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700"
+title="Assign units">
+<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5H7a2 2 0 00-2 2v12a2 2 0 002 2h10a2 2 0 002-2V7a2 2 0 00-2-2h-2M9 5a2 2 0 002 2h2a2 2 0 002-2M9 5a2 2 0 012-2h2a2 2 0 012 2m-3 7h3m-3 4h3m-6-4h.01M9 16h.01"/>
+</svg>
+</button>
+
+<!-- "..." overflow menu -->
+<div class="relative" onclick="event.stopPropagation()">
+<button onclick="toggleResMenu('{{ res.id }}')"
+class="p-2 text-gray-400 hover:text-gray-600 dark:hover:text-gray-300 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700"
+title="More options">
+<svg class="w-4 h-4" fill="currentColor" viewBox="0 0 24 24">
+<circle cx="5" cy="12" r="1.5"/><circle cx="12" cy="12" r="1.5"/><circle cx="19" cy="12" r="1.5"/>
+</svg>
+</button>
+<div id="res-menu-{{ res.id }}"
+class="hidden absolute right-0 top-8 z-20 w-44 bg-white dark:bg-slate-800 border border-gray-200 dark:border-gray-700 rounded-lg shadow-lg py-1">
+<button onclick="openPromoteModal('{{ res.id }}', '{{ res.name }}'); toggleResMenu('{{ res.id }}')"
+class="w-full text-left px-4 py-2 text-sm text-gray-700 dark:text-gray-300 hover:bg-gray-50 dark:hover:bg-slate-700 flex items-center gap-2">
+<svg class="w-4 h-4 text-emerald-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M5 10l7-7m0 0l7 7m-7-7v18"/>
+</svg>
+Promote to Project
+</button>
+<button onclick="editReservation('{{ res.id }}'); toggleResMenu('{{ res.id }}')"
+class="w-full text-left px-4 py-2 text-sm text-gray-700 dark:text-gray-300 hover:bg-gray-50 dark:hover:bg-slate-700 flex items-center gap-2">
+<svg class="w-4 h-4 text-blue-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z"/>
+</svg>
+Edit
+</button>
+<div class="border-t border-gray-100 dark:border-gray-700 my-1"></div>
+<button onclick="deleteReservation('{{ res.id }}', '{{ res.name }}'); toggleResMenu('{{ res.id }}')"
+class="w-full text-left px-4 py-2 text-sm text-red-600 dark:text-red-400 hover:bg-red-50 dark:hover:bg-red-900/20 flex items-center gap-2">
+<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"/>
+</svg>
+Delete
+</button>
+</div>
+</div>
+
+<!-- Chevron -->
+<svg id="chevron-{{ res.id }}" class="w-4 h-4 text-gray-400 transition-transform duration-200 ml-1 pointer-events-none" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"/>
+</svg>
+</div>
+</div>

+<!-- Expandable detail panel -->
+<div id="{{ detail_id }}" class="hidden border-t border-gray-100 dark:border-gray-700 bg-gray-50 dark:bg-slate-800/60 px-4 py-3">
+
+{% if res.notes %}
+<p class="text-sm text-gray-500 dark:text-gray-400 mb-3 italic">{{ res.notes }}</p>
+{% endif %}
+
+<div class="grid grid-cols-2 gap-x-6 gap-y-1 text-sm mb-3">
+<div class="text-gray-500 dark:text-gray-400">Estimated</div>
+<div class="font-medium {% if res.estimated_units %}text-gray-800 dark:text-gray-200{% else %}text-gray-400 dark:text-gray-500 italic{% endif %}">
+{% if res.estimated_units %}{{ res.estimated_units }} unit{{ 's' if res.estimated_units != 1 else '' }}{% else %}not specified{% endif %}
+</div>
+<div class="text-gray-500 dark:text-gray-400">Locations</div>
+<div class="font-medium text-gray-800 dark:text-gray-200">{{ item.assigned_count }} of {{ item.location_count }} filled</div>
+{% if item.assigned_count < item.location_count %}
+<div class="text-gray-500 dark:text-gray-400">Still needed</div>
+<div class="font-medium text-amber-600 dark:text-amber-400">{{ item.location_count - item.assigned_count }} location{{ 's' if (item.location_count - item.assigned_count) != 1 else '' }} remaining</div>
+{% endif %}
+{% if item.has_conflicts %}
+<div class="text-gray-500 dark:text-gray-400">Cal swaps</div>
+<div class="font-medium text-amber-600 dark:text-amber-400">{{ item.conflict_count }} unit{{ 's' if item.conflict_count != 1 else '' }} will need swapping during job</div>
+{% endif %}
+</div>
+
+{% if item.assigned_units %}
+<p class="text-xs font-semibold uppercase tracking-wide text-gray-400 dark:text-gray-500 mb-2">Monitoring Locations</p>
+<div class="flex flex-col gap-1">
+{% for u in item.assigned_units %}
+<div class="rounded bg-white dark:bg-slate-700 border border-gray-100 dark:border-gray-600 text-sm">
+<div class="flex items-center gap-3 px-3 py-1.5">
+<span class="text-gray-400 dark:text-gray-500 text-xs w-12 flex-shrink-0">Loc. {{ loop.index }}</span>
+<div class="flex flex-col min-w-0">
+{% if u.location_name %}
+<span class="text-xs font-semibold text-gray-700 dark:text-gray-300 truncate">{{ u.location_name }}</span>
+{% endif %}
+<button onclick="openUnitDetailModal('{{ u.id }}')"
+class="font-medium text-blue-600 dark:text-blue-400 hover:underline text-left text-sm">{{ u.id }}</button>
+</div>
+<span class="flex-1"></span>
+{% if u.power_type == 'ac' %}
+<span class="text-xs px-1.5 py-0.5 bg-blue-50 dark:bg-blue-900/20 text-blue-600 dark:text-blue-400 rounded">A/C</span>
+{% elif u.power_type == 'solar' %}
+<span class="text-xs px-1.5 py-0.5 bg-yellow-50 dark:bg-yellow-900/20 text-yellow-600 dark:text-yellow-400 rounded">Solar</span>
+{% endif %}
+{% if u.deployed %}
+<span class="text-xs px-1.5 py-0.5 bg-green-50 dark:bg-green-900/20 text-green-600 dark:text-green-400 rounded">Deployed</span>
+{% else %}
+<span class="text-xs px-1.5 py-0.5 bg-gray-100 dark:bg-gray-600 text-gray-500 dark:text-gray-400 rounded">Benched</span>
+{% endif %}
+{% if u.last_calibrated %}
+<span class="text-xs text-gray-400 dark:text-gray-500">Cal: {{ u.last_calibrated.strftime('%b %d, %Y') }}</span>
+{% endif %}
+</div>
+{% if u.notes %}
+<p class="px-3 pb-1.5 text-xs text-gray-400 dark:text-gray-500 italic">{{ u.notes }}</p>
+{% endif %}
+</div>
+{% endfor %}
+</div>
+{% else %}
+<p class="text-sm text-gray-400 dark:text-gray-500 italic">No units assigned yet. Click the clipboard icon to plan.</p>
+{% endif %}
+</div>
+
+</div>
+{% endfor %}
+</div>
+
+<!-- toggleResCard, deleteReservation, editReservation, openUnitDetailModal defined in fleet_calendar.html -->
+{% else %}
+<div class="text-center py-8">
+<svg class="w-12 h-12 mx-auto text-gray-400 dark:text-gray-500 mb-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"/>
+</svg>
+<p class="text-gray-500 dark:text-gray-400">No jobs yet</p>
+<p class="text-sm text-gray-400 dark:text-gray-500 mt-1">Click "New Job" to start planning a deployment</p>
+</div>
+{% endif %}
127
templates/partials/modem_list.html
Normal file
@@ -0,0 +1,127 @@
<!-- Modem List -->
{% if modems %}
<div class="overflow-x-auto">
  <table class="w-full">
    <thead class="bg-gray-50 dark:bg-slate-700 border-b border-gray-200 dark:border-gray-600">
      <tr>
        <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Unit ID</th>
        <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Status</th>
        <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">IP Address</th>
        <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Phone</th>
        <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Paired Device</th>
        <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Location</th>
        <th class="px-4 py-3 text-right text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Actions</th>
      </tr>
    </thead>
    <tbody class="divide-y divide-gray-200 dark:divide-gray-700">
      {% for modem in modems %}
      <tr class="hover:bg-gray-50 dark:hover:bg-slate-700 transition-colors">
        <td class="px-4 py-3 whitespace-nowrap">
          <div class="flex items-center gap-2">
            <a href="/unit/{{ modem.id }}" class="font-medium text-blue-600 dark:text-blue-400 hover:underline">
              {{ modem.id }}
            </a>
            {% if modem.hardware_model %}
            <span class="text-xs text-gray-500 dark:text-gray-400">({{ modem.hardware_model }})</span>
            {% endif %}
          </div>
        </td>
        <td class="px-4 py-3 whitespace-nowrap">
          {% if modem.status == "retired" %}
          <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-gray-200 text-gray-700 dark:bg-gray-700 dark:text-gray-300">
            Retired
          </span>
          {% elif modem.status == "benched" %}
          <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300">
            Benched
          </span>
          {% elif modem.status == "in_use" %}
          <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-green-100 text-green-800 dark:bg-green-900/30 dark:text-green-300">
            In Use
          </span>
          {% elif modem.status == "spare" %}
          <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300">
            Spare
          </span>
          {% else %}
          <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-gray-100 text-gray-800 dark:bg-gray-700 dark:text-gray-300">
            —
          </span>
          {% endif %}
        </td>
        <td class="px-4 py-3 whitespace-nowrap text-sm">
          {% if modem.ip_address %}
          <span class="font-mono text-gray-900 dark:text-gray-300">{{ modem.ip_address }}</span>
          {% else %}
          <span class="text-gray-400 dark:text-gray-600">—</span>
          {% endif %}
        </td>
        <td class="px-4 py-3 whitespace-nowrap text-sm text-gray-900 dark:text-gray-300">
          {% if modem.phone_number %}
          {{ modem.phone_number }}
          {% else %}
          <span class="text-gray-400 dark:text-gray-600">—</span>
          {% endif %}
        </td>
        <td class="px-4 py-3 whitespace-nowrap text-sm">
          {% if modem.paired_device %}
          <a href="/unit/{{ modem.paired_device.id }}" class="text-blue-600 dark:text-blue-400 hover:underline">
            {{ modem.paired_device.id }}
            <span class="text-gray-500 dark:text-gray-400">({{ modem.paired_device.device_type }})</span>
          </a>
          {% else %}
          <span class="text-gray-400 dark:text-gray-600">None</span>
          {% endif %}
        </td>
        <td class="px-4 py-3 text-sm text-gray-900 dark:text-gray-300">
          {% if modem.project_id %}
          <span class="bg-gray-200 dark:bg-gray-700 px-1.5 py-0.5 rounded text-xs mr-1">{{ modem.project_id }}</span>
          {% endif %}
          {% if modem.location %}
          <span class="truncate max-w-xs inline-block" title="{{ modem.location }}">{{ modem.location }}</span>
          {% elif not modem.project_id %}
          <span class="text-gray-400 dark:text-gray-600">—</span>
          {% endif %}
        </td>
        <td class="px-4 py-3 whitespace-nowrap text-right text-sm">
          <div class="flex items-center justify-end gap-2">
            <button onclick="pingModem('{{ modem.id }}')"
                    id="ping-btn-{{ modem.id }}"
                    class="text-xs px-2 py-1 bg-blue-100 hover:bg-blue-200 text-blue-700 dark:bg-blue-900/30 dark:hover:bg-blue-900/50 dark:text-blue-300 rounded transition-colors">
              Ping
            </button>
            <a href="/unit/{{ modem.id }}" class="text-blue-600 dark:text-blue-400 hover:underline">
              View →
            </a>
          </div>
          <!-- Ping Result (hidden by default) -->
          <div id="ping-result-{{ modem.id }}" class="mt-1 text-xs hidden"></div>
        </td>
      </tr>
      {% endfor %}
    </tbody>
  </table>
</div>

{% if search %}
<div class="mt-4 text-sm text-gray-600 dark:text-gray-400">
  Found {{ modems|length }} modem(s) matching "{{ search }}"
</div>
{% endif %}

{% else %}
<div class="text-center py-12 text-gray-500 dark:text-gray-400">
  <svg class="w-12 h-12 mx-auto mb-3 opacity-50" fill="none" stroke="currentColor" viewBox="0 0 24 24">
    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
  </svg>
  <p>No modems found</p>
  {% if search %}
  <button onclick="document.getElementById('modem-search').value = ''; htmx.trigger('#modem-search', 'keyup');"
          class="mt-3 text-blue-600 dark:text-blue-400 hover:underline">
    Clear search
  </button>
  {% else %}
  <p class="text-sm mt-1">Add modems from the <a href="/roster" class="text-seismo-orange hover:underline">Fleet Roster</a></p>
  {% endif %}
</div>
{% endif %}
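The Ping button above calls a global `pingModem(id)` helper and expects a `ping-result-{{ modem.id }}` element to receive the result, but the helper itself is not defined in this file. A minimal sketch of what such a helper might look like — the endpoint path (`/api/modems/<id>/ping`) and the response shape (`{ ok, latency_ms }`) are assumptions, not confirmed by this diff:

```javascript
// Hypothetical helper assumed by the Ping button in modem_list.html.
// The endpoint URL and JSON response shape are guesses for illustration.

// Pure formatter, kept separate so it can be tested without a DOM.
function formatPingResult(result) {
  if (result.ok) {
    return "Reachable (" + result.latency_ms + " ms)";
  }
  return "Unreachable";
}

async function pingModem(modemId) {
  const btn = document.getElementById("ping-btn-" + modemId);
  const out = document.getElementById("ping-result-" + modemId);
  btn.disabled = true;
  try {
    const resp = await fetch(
      "/api/modems/" + encodeURIComponent(modemId) + "/ping",
      { method: "POST" }
    );
    const data = await resp.json();
    out.textContent = formatPingResult(data);
  } catch (err) {
    out.textContent = "Ping failed: " + err;
  } finally {
    // Reveal the result div (it ships with the `hidden` class).
    out.classList.remove("hidden");
    btn.disabled = false;
  }
}
```

Keeping the formatting logic out of the fetch wrapper makes the textual part testable in isolation; the DOM wiring only touches the two element IDs the template actually renders.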