Compare commits
104 Commits
8431784708...feature/pr
| Author | SHA1 | Date |
|---|---|---|
| | 73a6ff4d20 | |
| | 184f0ddd13 | |
| | e7bd09418b | |
| | 27eeb0fae6 | |
| | 192e15f238 | |
| | 49bc625c1a | |
| | 95fedca8c9 | |
| | e8e155556a | |
| | 33e962e73d | |
| | ac48fb2977 | |
| | 3c4b81cf78 | |
| | d135727ebd | |
| | 64d4423308 | |
| | 4f71d528ce | |
| | 4f56dea4f3 | |
| | 57a85f565b | |
| | e6555ba924 | |
| | 8694282dd0 | |
| | bc02dc9564 | |
| | 0d01715f81 | |
| | b3ec249c5e | |
| | b6e74258f1 | |
| | 1a87ff13c9 | |
| | 22c62c0729 | |
| | 0f47b69c92 | |
| | 76667454b3 | |
| | 0e3f512203 | |
| | 15d962ba42 | |
| | e4d1f0d684 | |
| | b571dc29bc | |
| | e2c841d5d7 | |
| | cc94493331 | |
| | 5a5426cceb | |
| | 66eddd6fe2 | |
| | c77794787c | |
| | 61c84bc71d | |
| | fbf7f2a65d | |
| | 202fcaf91c | |
| | 3a411d0a89 | |
| | 0c2186f5d8 | |
| | c138e8c6a0 | |
| | 1dd396acd8 | |
| | e89a04f58c | |
| | e4ef065db8 | |
| | 86010de60c | |
| | f89f04cd6f | |
| | 67a2faa2d3 | |
| | 14856e61ef | |
| | 2b69518b33 | |
| | 6070d03e83 | |
| | 240552751c | |
| | 015ce0a254 | |
| | ef8c046f31 | |
| | 3637cf5af8 | |
| | 7fde14d882 | |
| | bd3d937a82 | |
| | 291fa8e862 | |
| | 8e292b1aca | |
| | 7516bbea70 | |
| | da4e5f66c5 | |
| | dae2595303 | |
| | 0c4e7aa5e6 | |
| | 229499ccf6 | |
| | fdc4adeaee | |
| | b3bf91880a | |
| | 17b3f91dfc | |
| | 6c1d0bc467 | |
| | abd059983f | |
| | 0f17841218 | |
| | 65362bab21 | |
| | dc77a362ce | |
| | 28942600ab | |
| | 80861997af | |
| | b15d434fce | |
| | 70ef43de11 | |
| | 7b4e12c127 | |
| | 24473c9ca3 | |
| | caabfd0c42 | |
| | ebe60d2b7d | |
| | 842e9d6f61 | |
| | 742a98a8ed | |
| | 3b29c4d645 | |
| | 63d9c59873 | |
| | 794bfc00dc | |
| | 89662d2fa5 | |
| | eb0a99796d | |
| | b47e69e609 | |
| | 1cb25b6c17 | |
| | e515bff1a9 | |
| | f296806fd1 | |
| | 24da5ab79f | |
| | 305540f564 | |
| | 639b485c28 | |
| | d78bafb76e | |
| | 8373cff10d | |
| | 4957a08198 | |
| | 05482bd903 | |
| | 5ee6f5eb28 | |
| | 7ce0f6115d | |
| | 6492fdff82 | |
| | 44d7841852 | |
| | 38c600aca3 | |
| | eeda94926f | |
| | 57be9bf1f1 | |
@@ -1,3 +1,5 @@
docker-compose.override.yml

# Python cache / compiled
__pycache__
*.pyc
@@ -28,6 +30,7 @@ ENV/

# Runtime data (mounted volumes)
data/
data-dev/

# Editors / OS junk
.vscode/

20 .gitignore vendored
@@ -1,3 +1,17 @@
# Terra-View Specifics
# Dev build counter (local only, never commit)
build_number.txt
docker-compose.override.yml

# SQLite database files
*.db
*.db-journal
data/
data-dev/
.aider*
.aider*


# Byte-compiled / optimized / DLL files
__pycache__/
*.py[codz]
@@ -206,10 +220,14 @@ marimo/_static/
marimo/_lsp/
__marimo__/

<<<<<<< HEAD
# Seismo Fleet Manager
# SQLite database files
*.db
*.db-journal
data/
/data/
/data-dev/
.aider*
.aider*
=======
>>>>>>> 0c2186f5d89d948b0357d674c0773a67a67d8027

287 CHANGELOG.md
@@ -1,10 +1,290 @@
# Changelog

All notable changes to Seismo Fleet Manager will be documented in this file.
All notable changes to Terra-View will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.9.3] - 2026-03-28

### Added
- **Monitoring Session Detail Page**: New dedicated page for each session showing session info, data files (with View/Report/Download actions), an editable session panel, and report actions.
- **Session Calendar with Gantt Bars**: Monthly calendar view below the session list, showing each session as a Gantt-style bar. The dim bar represents the full device on/off window; the bright bar highlights the effective recording window. Bars extend edge-to-edge across day cells for sessions spanning midnight.
- **Configurable Period Windows**: Sessions now store `period_start_hour` and `period_end_hour` to define the exact hours that count toward reports, replacing hardcoded day/night defaults. The session edit panel shows a "Required Recording Window" section with a live preview (e.g. "7:00 AM → 7:00 PM") and a Defaults button that auto-fills based on period type.
- **Report Date Field**: Sessions can now store an explicit `report_date` to override the automatic target-date heuristic — useful when a device ran across multiple days but only one specific day's data is needed for the report.
- **Effective Window on Session Info**: Session detail and session cards now show an "Effective" row displaying the computed recording window dates and times in local time.
- **Vibration Project Redesign**: Vibration project detail page is stripped back to project details and monitoring locations only. Each location supports assigning a seismograph and optional modem. Sound-specific tabs (Schedules, Sessions, Data Files, Assigned Units) are hidden for vibration projects.
- **Modem Assignment on Locations**: Vibration monitoring locations now support an optional paired modem alongside the seismograph. The swap endpoint handles both assignments atomically, updating bidirectional pairing fields on both units.
- **Available Modems Endpoint**: New `GET /api/projects/{project_id}/available-modems` endpoint returning all deployed, non-retired modems for use in assignment dropdowns.
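The `period_start_hour` / `period_end_hour` fields described above imply a window computation along these lines. This is an illustrative sketch, not the project's actual implementation; the function name and midnight-wrap rule are assumptions:

```python
from datetime import datetime, timedelta

def effective_window(target_date: datetime, start_hour: int, end_hour: int):
    """Compute the effective recording window for a session on target_date.

    If end_hour <= start_hour the window crosses midnight and ends on the
    following day (e.g. a night session running 19:00 -> 07:00).
    """
    start = target_date.replace(hour=start_hour, minute=0, second=0, microsecond=0)
    end = target_date.replace(hour=end_hour, minute=0, second=0, microsecond=0)
    if end <= start:
        end += timedelta(days=1)  # window spans midnight
    return start, end

# Day session: 7:00 AM -> 7:00 PM on the same date
day_start, day_end = effective_window(datetime(2026, 3, 28), 7, 19)
# Night session: 7:00 PM -> 7:00 AM the next morning
night_start, night_end = effective_window(datetime(2026, 3, 28), 19, 7)
```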

### Fixed
- **Active Assignment Checks**: Unified all `UnitAssignment` "active" checks from `status == "active"` to `assigned_until IS NULL` throughout `project_locations.py` and `projects.py` for consistency with the canonical active definition.

### Changed
- **Sound-Only Endpoint Guards**: FTP browser, RND viewer, Excel report generation, combined report wizard, and data upload endpoints now return HTTP 400 if called on a non-sound-monitoring project.

### Migration Notes
Run on each database before deploying:
```bash
docker compose exec terra-view python3 backend/migrate_add_session_period_hours.py
docker compose exec terra-view python3 backend/migrate_add_session_report_date.py
```
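The migration scripts themselves are not shown in this diff. A typical additive-column migration of the kind they appear to perform might look like the sketch below, under the assumption that they issue a guarded `ALTER TABLE` against the SQLite database; the table, column, and path are illustrative only:

```python
import sqlite3

def add_column_if_missing(db_path: str, table: str, column: str, decl: str) -> bool:
    """Add a column only if it is not already present, so reruns are safe."""
    conn = sqlite3.connect(db_path)
    try:
        # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk)
        cols = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
        if column in cols:
            return False  # already migrated
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")
        conn.commit()
        return True
    finally:
        conn.close()

# e.g. add_column_if_missing("/app/data/seismo_fleet.db",
#                            "monitoring_sessions", "report_date", "TEXT")
```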

---

## [0.9.2] - 2026-03-27

### Added
- **Deployment Records**: Seismographs now track a full deployment history (location, project, dates). Each deployment is logged on the unit detail page with start/end dates, and the fleet calendar service uses this history for availability calculations.
- **Allocated Unit Status**: New `allocated` status for units reserved for an upcoming job but not yet deployed. Allocated units appear in the dashboard summary, roster filters, and devices table with visual indicators.
- **Project Allocation**: Units can be linked to a project via `allocated_to_project_id`. Allocation is shown on the unit detail page and in a new quick-info modal accessible from the fleet calendar and roster.
- **Quick-Info Unit Modal**: Click any unit in the fleet calendar or roster to open a modal showing cal status, project allocation, upcoming jobs, and deployment state — without leaving the page.
- **Cal Date in Planner**: When a unit is selected for a monitoring location slot in the Job Planner, its calibration expiry date is now shown inline so you can spot near-expiry units before committing.
- **Inline Seismograph Editing**: Unit rows in the seismograph dashboard now support inline editing of cal date, notes, and deployment status without navigating to the full detail page.

### Migration Notes
Run on each database before deploying:
```bash
docker compose exec terra-view python3 backend/migrate_add_allocated.py
docker compose exec terra-view python3 backend/migrate_add_deployment_records.py
```

---

## [0.9.1] - 2026-03-20

### Fixed
- **Location slots not persisting**: Empty monitoring location slots (no unit assigned yet) were lost on save/reload. Added `location_slots` JSON column to `job_reservations` to store the full slot list including empty slots.
- **Modems in Recent Alerts**: Modems no longer appear in the dashboard Recent Alerts panel — alerts are for seismographs and SLMs only. Modem status is still tracked internally via paired device inheritance.
- **Series 4 heartbeat `source_id`**: Updated heartbeat endpoint to accept the new `source_id` field from Series 4 units with fallback to the legacy field for backwards compatibility.

### Migration Notes
Run on each database before deploying:
```bash
docker compose exec terra-view python3 backend/migrate_add_location_slots.py
```

---

## [0.9.0] - 2026-03-19

### Added
- **Job Planner**: Full redesign of the Fleet Calendar into a two-tab Job Planner / Calendar interface
  - **Planner tab**: Create and manage job reservations with name, device type, dates, color, estimated units, and monitoring locations
  - **Calendar tab**: 12-month rolling heatmap with colored job bars per day; confirmed jobs solid, planned jobs dashed
- **Monitoring Locations**: Each job has named location slots (filled = unit assigned, empty = needs a unit); progress shown as `2/5` with colored squares that fill as units are assigned
- **Estimated Units**: Separate planning number independent of actual location count; shown prominently on job cards
- **Fleet Summary panel**: Unit counts as clickable filter buttons; unit list shows reservation badges with job name, dates, and color
- **Available Units panel**: Shows units available for the job's date range when assigning
- **Smart color picker**: 18-swatch palette + custom color wheel; new jobs auto-pick a color maximally distant in hue from existing jobs
- **Job card progress**: `est. N · X/Y (Z more)` with filled/empty squares; amber → green when fully assigned
- **Promote to Project**: Promote a planned job to a tracked project directly from the planner form
- **Collapsible job details**: Name, dates, device type, color, project link, and estimated units collapse into a summary header
- **Calendar bar tooltips**: Hover any job bar to see job name and date range
- **Hash-based tab persistence**: `#cal` in URL restores Calendar tab on refresh; device type toggle preserves active tab
- **Auto-scroll to today**: Switching to Calendar tab smooth-scrolls to the current month
- **Upcoming project status**: New `upcoming` status for projects promoted from reservations
- **Job device type**: Reservations carry a device type so they only appear on the correct calendar
- **Project filtering by device type**: Projects only appear on the calendar matching their type (vibration → seismograph, sound → SLM, combined → both)
- **Confirmed/Planned toggles**: Independent show/hide toggles for job bar layers on the calendar
- **Cal expire dots toggle**: Calibration expiry dots off by default, togglable
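The smart color picker's "maximally distant in hue" behavior noted above can be sketched as a search over candidate hues for the one maximizing the minimum circular distance to hues already in use. Illustrative only; the real palette logic is not shown in this diff:

```python
def pick_distant_hue(existing: list[float], candidates: int = 360) -> float:
    """Return the hue (0-360) whose minimum circular distance to all
    existing hues is largest."""
    if not existing:
        return 0.0

    def min_dist(h: float) -> float:
        # circular distance on the 360-degree hue wheel
        return min(min(abs(h - e) % 360, 360 - abs(h - e) % 360) for e in existing)

    return max(range(candidates), key=min_dist)

# With jobs at 0 (red) and 120 (green), the farthest hue is 240 (blue).
```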

### Changed
- **Renamed**: "Fleet Calendar" / "Reservation Planner" → **"Job Planner"** throughout UI and sidebar
- **Project status dropdown**: Inline `<select>` in project header for quick status changes
- **"All Projects" tab**: Shows everything except deleted; default view excludes archived/completed
- **Toast notifications**: All `alert()` dialogs replaced with non-blocking toasts (green = success, red = error)

### Migration Notes
Run on each database before deploying:
```bash
docker compose exec terra-view python3 -c "
import sqlite3
conn = sqlite3.connect('/app/data/seismo_fleet.db')
conn.execute('ALTER TABLE job_reservations ADD COLUMN estimated_units INTEGER')
conn.commit()
conn.close()
"
```

---

## [0.8.0] - 2026-03-18

### Added
- **Watcher Manager**: New admin page (`/admin/watchers`) for monitoring field watcher agents
  - Live status cards per agent showing connectivity, version, IP, last-seen age, and log tail
  - Trigger Update button to queue a self-update on the agent's next heartbeat
  - Expand/collapse log tail with full-log expand mode
  - Live surgical refresh every 30 seconds via `/api/admin/watchers` — no full page reload, open logs stay open

### Changed
- **Watcher status logic**: Agent status now reflects whether Terra-View is hearing from the watcher (ok if seen within 60 minutes, missing otherwise) — previously reflected the worst unit status from the last heartbeat payload, which caused false alarms when units went missing

### Fixed
- **Watcher Manager meta row**: Dark mode background was white due to invalid `dark:bg-slate-850` Tailwind class; corrected to `dark:bg-slate-800`

---

## [0.7.1] - 2026-03-12

### Added
- **"Out for Calibration" Unit Status**: New `out_for_cal` status for units currently away for calibration, with visual indicators in the roster, unit list, and seismograph stats panel
- **Reservation Modal**: Fleet calendar reservation modal is now fully functional for creating and managing device reservations

### Changed
- **Retire Unit Button**: Redesigned to be more visually prominent/destructive to reduce accidental clicks

### Fixed
- **Migration Scripts**: Fixed database path references in several migration scripts
- **Docker Compose**: Removed dev override file from the repository; dev environment config kept separate

### Migration Notes
Run the following migration script once per database before deploying:
```bash
python backend/migrate_add_out_for_calibration.py
```

---

## [0.7.0] - 2026-03-07

### Added
- **Project Status Management**: Projects can now be placed `on_hold` or `archived`, with automatic cancellation of pending scheduled actions
- **Hard Delete Projects**: Support for permanently deleting projects, in addition to soft-delete with auto-pruning
- **Vibration Location Detail**: New dedicated template for vibration project location detail views
- **Vibration Project Isolation**: Vibration projects no longer show SLM-specific project tabs
- **Manual SD Card Data Upload**: Upload offline NRL data directly from SD card via ZIP or multi-file select
  - Accepts `.rnd`/`.rnh` files; parses `.rnh` metadata for session start/stop times, serial number, and store name
  - Creates `MonitoringSession` and `DataFile` records automatically; no unit assignment required
  - Upload panel on NRL detail Data Files tab with inline feedback and auto-refresh via HTMX
- **Standalone SLM Type**: New SLM device mode that operates without a modem (direct IP connection)
- **NL32 Data Support**: Report generator and web viewer now support NL32 measurement data format
- **Combined Report Wizard**: Multi-session combined Excel report generation tool
  - Wizard UI grouped by location with period type badges (day/night)
  - Each selected session produces one `.xlsx` in a ZIP archive
  - Period type filtering: day sessions keep last calendar date (7AM–6:59PM); night sessions span both days (7PM–6:59AM)
- **Combined Report Preview**: Interactive spreadsheet-style preview before generating combined reports
- **Chart Preview**: Live chart preview in the report generator matching final report styling
- **SLM Model Schemas**: Per-model configuration schemas for NL32, NL43, NL53 devices
- **Data Collection Mode**: Projects now store a data collection mode field with UI controls and migration

### Changed
- **MonitoringSession rename**: `RecordingSession` renamed to `MonitoringSession` throughout codebase; DB table renamed from `recording_sessions` to `monitoring_sessions`
  - Migration: `backend/migrate_rename_recording_to_monitoring_sessions.py`
- **Combined Report Split Logic**: Separate days now generate separate `.xlsx` files; NRLs remain one per sheet
- **Mass Upload Parsing**: Smarter file filtering — no longer imports unneeded Lp files or `.xlsx` files
- **SLM Start Time Grace Period**: 15-minute grace window added so data starting at session start time is included
- **NL32 Date Parsing**: Date now read from `start_time` field instead of file metadata
- **Project Data Labels**: Improved Jinja filters and UI label clarity for project data views
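The 15-minute grace period in the Changed list above can be sketched as widening the inclusion check at the session start, so a file stamped exactly at (or slightly before) the start time is not dropped. Names here are assumptions, not the project's API:

```python
from datetime import datetime, timedelta

GRACE = timedelta(minutes=15)  # grace window from the entry above

def file_in_session(file_start: datetime,
                    session_start: datetime,
                    session_end: datetime) -> bool:
    """Accept files beginning up to GRACE before the nominal session start."""
    return session_start - GRACE <= file_start <= session_end
```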

### Fixed
- **Dev/Prod Separation**: Dev server now uses Docker Compose override; production deployment no longer affected by dev config
- **SLM Modal**: Bench/deploy toggle now correctly shown in SLM unit modal
- **Auto-Downloaded Files**: Files downloaded by scheduler now appear in project file listings
- **Duplicate Download**: Removed duplicate file download that occurred following a scheduled stop
- **SLMM Environment Variables**: `TCP_IDLE_TTL` and `TCP_MAX_AGE` now correctly passed to SLMM service via docker-compose

### Technical Details
- `session_label` and `period_type` stored on `monitoring_sessions` table (migration: `migrate_add_session_period_type.py`)
- `device_model` stored on `monitoring_sessions` table (migration: `migrate_add_session_device_model.py`)
- Upload endpoint: `POST /api/projects/{project_id}/nrl/{location_id}/upload-data`
- ZIP filename format: `{session_label}_{project_name}_report.xlsx` (label first)

### Migration Notes
Run the following migration scripts once per database before deploying:
```bash
python backend/migrate_rename_recording_to_monitoring_sessions.py
python backend/migrate_add_session_period_type.py
python backend/migrate_add_session_device_model.py
```

---

## [0.6.1] - 2026-02-16

### Added
- **One-Off Recording Schedules**: Support for scheduling single recordings with specific start and end datetimes
- **Bidirectional Pairing Sync**: Pairing a device with a modem now automatically updates both sides, clearing stale pairings when reassigned
- **Auto-Fill Notes from Modem**: Notes are now copied from modem to paired device when fields are empty
- **SLMM Download Requests**: New `_download_request` method in SLMM client for binary file downloads with local save
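Keeping both sides of a pairing in sync, as the Bidirectional Pairing Sync entry describes, amounts to clearing stale references before writing the new ones. A data-level sketch with a hypothetical `paired_id` field (the real models use ORM records, not dicts):

```python
def pair(units: dict, device_id: str, modem_id: str) -> None:
    """Pair a device and a modem, clearing stale pairings on either side.

    `units` maps unit id -> record dict with a hypothetical 'paired_id' field.
    """
    for uid in (device_id, modem_id):
        old = units[uid].get("paired_id")
        if old and old in units:
            units[old]["paired_id"] = None  # unlink the previous partner
    units[device_id]["paired_id"] = modem_id
    units[modem_id]["paired_id"] = device_id
```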

### Fixed
- **Scheduler Timezone**: One-off scheduler times now use local time instead of UTC
- **Pairing Consistency**: Old device references are properly cleared when a modem is re-paired to a new device

## [0.6.0] - 2026-02-06

### Added
- **Calendar & Reservation Mode**: Fleet calendar view with reservation system for scheduling device deployments
- **Device Pairing Interface**: New two-column pairing page (`/pair-devices`) for linking recorders (seismographs/SLMs) with modems
  - Visual pairing interface with drag-and-drop style interactions
  - Fuzzy-search modem pairing for SLMs
  - Pairing options now accessible from modem page
  - Improved pair status sharing across views
- **Modem Dashboard Enhancements**:
  - Modem model number now a dedicated configuration field with per-model options
  - Direct link to modem login page from unit detail view
  - Modem view converted to list format
- **Seismograph List Improvements**:
  - Enhanced visibility with better filtering and sorting
  - Calibration dates now color-coded for quick status assessment
  - User sets date of previous calibration (not expiry) for clearer workflow
- **SLMM Device Control Lock**: Prevents command flooding to NL-43 devices

### Changed
- **Calibration Date UX**: Users now set the date of the previous calibration rather than upcoming expiry dates; a more intuitive workflow
- **Settings Persistence**: Settings save no longer reloads the page
- **Tab State**: Tab state now persists in URL hash for better navigation
- **Scheduler Management**: Schedule changes now cascade to individual events
- **Dashboard Filtering**: Enhanced dashboard with additional filtering options and SLM status sync
- **SLMM Polling Intervals**: Fixed and improved polling intervals for better responsiveness
- **24-Hour Scheduler Cycle**: Improved cycle handling to prevent issues with scheduled downloads

### Fixed
- **SLM Modal Fields**: Modal now only contains correct device-specific fields
- **IP Address Handling**: IP address correctly passed via modem pairing
- **Mobile Type Display**: Fixed incorrect device type display in roster and device tables
- **SLMM Scheduled Downloads**: Fixed issues with scheduled download operations

## [0.5.1] - 2026-01-27

### Added
- **Dashboard Schedule View**: Today's scheduled actions now display directly on the main dashboard
  - New "Today's Actions" panel showing upcoming and past scheduled events
  - Schedule list partial for project-specific schedule views
  - API endpoint for fetching today's schedule data
- **New Branding Assets**: Complete logo rework for Terra-View
  - New Terra-View logos for light and dark themes
  - Retina-ready (@2x) logo variants
  - Updated favicons (16px and 32px)
  - Refreshed PWA icons (72px through 512px)

### Changed
- **Dashboard Layout**: Reorganized to include schedule information panel
- **Base Template**: Updated to use new Terra-View logos with theme-aware switching

## [0.5.0] - 2026-01-23

_Note: This version was not formally released; changes were included in v0.5.1._

## [0.4.4] - 2026-01-23

### Added
- **Recurring schedules**: New scheduler service, recurring schedule APIs, and schedule templates (calendar/interval/list).
- **Alerts UI + backend**: Alerting service plus dropdown/list templates for surfacing notifications.
- **Report templates + viewers**: CRUD API for report templates, report preview screen, and RND file viewer.
- **SLM tooling**: SLM settings modal and SLM project report generator workflow.

### Changed
- **Project data management**: Unified files view, refreshed FTP browser, and new project header/templates for file/session/unit/assignment lists.
- **Device/SLM sync**: Standardized SLM device types and tightened SLMM sync paths.
- **Docs/scripts**: Cleanup pass and expanded device-type documentation.

### Fixed
- **Scheduler actions**: Strict command definitions so actions run reliably.
- **Project view title**: Resolved JSON string rendering in project headers.

## [0.4.3] - 2026-01-14

### Added
@@ -361,6 +641,11 @@ No database migration required for v0.4.0. All new features use existing databas
- Photo management per unit
- Automated status categorization (OK/Pending/Missing)

[0.7.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.6.1...v0.7.0
[0.6.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.5.1...v0.6.0
[0.5.1]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.5.0...v0.5.1
[0.5.0]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.4...v0.5.0
[0.4.4]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.3...v0.4.4
[0.4.3]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.2...v0.4.3
[0.4.2]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.1...v0.4.2
[0.4.1]: https://github.com/serversdwn/seismo-fleet-manager/compare/v0.4.0...v0.4.1
@@ -1,5 +1,9 @@
FROM python:3.11-slim

# Build number for dev builds (injected via --build-arg)
ARG BUILD_NUMBER=0
ENV BUILD_NUMBER=${BUILD_NUMBER}

# Set working directory
WORKDIR /app

48 README.md
@@ -1,4 +1,4 @@
# Seismo Fleet Manager v0.4.3
# Terra-View v0.9.3
Backend API and HTMX-powered web interface for managing a mixed fleet of seismographs and field modems. Track deployments, monitor health in real time, merge roster intent with incoming telemetry, and control your fleet through a unified database and dashboard.

## Features
@@ -496,6 +496,34 @@ docker compose down -v

## Release Highlights

### v0.8.0 — 2026-03-18
- **Watcher Manager**: Admin page for monitoring field watcher agents with live status cards, log tails, and one-click update triggering
- **Watcher Status Fix**: Agent status now reflects heartbeat connectivity (missing if not heard from in >60 min) rather than unit-level data staleness
- **Live Refresh**: Watcher Manager surgically patches status, last-seen, and pending indicators every 30s without a full page reload

### v0.7.0 — 2026-03-07
- **Project Status Management**: On-hold and archived project states with automatic cancellation of pending actions
- **Manual SD Card Upload**: Upload offline NRL/SLM data directly from SD card (ZIP or multi-file); auto-creates monitoring sessions from `.rnh` metadata
- **Combined Report Wizard**: Multi-session Excel report generation with location grouping, period type filtering, and ZIP download
- **NL32 Support**: Report generator and web viewer now handle NL32 measurement data
- **Chart Preview**: Live chart preview in the report generator matching final output styling
- **Standalone SLM Mode**: SLMs can now be configured without a paired modem (direct IP)
- **Vibration Project Isolation**: Vibration project views no longer show SLM-specific tabs
- **MonitoringSession Rename**: `RecordingSession` renamed to `MonitoringSession` throughout; run migration before deploying

### v0.6.1 — 2026-02-16
- **One-Off Recording Schedules**: Schedule single recordings with specific start/end datetimes
- **Bidirectional Pairing Sync**: Device-modem pairing now updates both sides automatically
- **Scheduler Timezone Fix**: One-off schedule times use local time instead of UTC

### v0.6.0 — 2026-02-06
- **Calendar & Reservation Mode**: Fleet calendar view with device deployment scheduling and reservation system
- **Device Pairing Interface**: New `/pair-devices` page with two-column layout for linking recorders with modems, fuzzy-search, and visual pairing workflow
- **Calibration UX Overhaul**: Users now set date of previous calibration (not expiry); seismograph list enhanced with color-coded calibration status, filtering, and sorting
- **Modem Dashboard**: Model number as dedicated config, modem login links, list view format, and pairing options accessible from modem page
- **SLMM Improvements**: Device control lock prevents command flooding, fixed polling intervals and scheduled downloads
- **UI Polish**: Tab state persists in URL hash, settings save without reload, scheduler changes cascade to events, fixed mobile type display

### v0.4.3 — 2026-01-14
- **Sound Level Meter workflow**: Roster manager surfaces SLM metadata, supports rename actions, and adds return-to-project navigation plus schedule/unit templates for project planning.
- **Project insight panels**: Project dashboards now expose file and session lists so teams can see what each project stores before diving into units.
@@ -571,9 +599,23 @@ MIT

## Version

**Current: 0.4.3** — SLM roster/project view refresh, project insight panels, FTP browser folder downloads, and SLMM sync (2026-01-14)
**Current: 0.8.0** — Watcher Manager admin page, live agent status refresh, watcher connectivity-based status (2026-03-18)

Previous: 0.4.2 — SLM configuration interface with TCP/FTP controls, modem diagnostics, and dashboard endpoints for Sound Level Meters (2026-01-05)
Previous: 0.7.1 — Out-for-calibration status, reservation modal, migration fixes (2026-03-12)

0.7.0 — Project status management, manual SD card upload, combined report wizard, NL32 support, MonitoringSession rename (2026-03-07)

0.6.1 — One-off recording schedules, bidirectional pairing sync, scheduler timezone fix (2026-02-16)

0.6.0 — Calendar & reservation mode, device pairing interface, calibration UX overhaul, modem dashboard enhancements (2026-02-06)

0.5.1 — Dashboard schedule view with today's actions panel, new Terra-View branding and logo rework (2026-01-27)

0.4.4 — Recurring schedules, alerting UI, report templates + RND viewer, and SLM workflow polish (2026-01-23)

0.4.3 — SLM roster/project view refresh, project insight panels, FTP browser folder downloads, and SLMM sync (2026-01-14)

0.4.2 — SLM configuration interface with TCP/FTP controls, modem diagnostics, and dashboard endpoints for Sound Level Meters (2026-01-05)

0.4.1 — Sound Level Meter integration with full management UI for SLM units (2026-01-05)
@@ -18,7 +18,7 @@ from backend.models import (
    MonitoringLocation,
    UnitAssignment,
    ScheduledAction,
    RecordingSession,
    MonitoringSession,
    DataFile,
)
from datetime import datetime

196 backend/main.py
@@ -18,9 +18,10 @@ logging.basicConfig(
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
from backend.database import engine, Base, get_db
|
||||
from backend.routers import roster, units, photos, roster_edit, roster_rename, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard, projects, project_locations, scheduler
|
||||
from backend.routers import roster, units, photos, roster_edit, roster_rename, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard, projects, project_locations, scheduler, modem_dashboard
|
||||
from backend.services.snapshot import emit_status_snapshot
|
||||
from backend.models import IgnoredUnit
|
||||
from backend.utils.timezone import get_user_timezone
|
||||
|
||||
# Create database tables
|
||||
Base.metadata.create_all(bind=engine)
|
||||
@@ -29,7 +30,11 @@ Base.metadata.create_all(bind=engine)
|
||||
ENVIRONMENT = os.getenv("ENVIRONMENT", "production")
|
||||
|
||||
# Initialize FastAPI app
|
||||
VERSION = "0.4.3"
|
||||
VERSION = "0.9.3"
|
||||
if ENVIRONMENT == "development":
|
||||
_build = os.getenv("BUILD_NUMBER", "0")
|
||||
if _build and _build != "0":
|
||||
VERSION = f"{VERSION}-{_build}"
|
||||
app = FastAPI(
|
||||
title="Seismo Fleet Manager",
|
||||
description="Backend API for managing seismograph fleet status",
|
||||
@@ -92,10 +97,14 @@ app.include_router(slmm.router)
app.include_router(slm_ui.router)
app.include_router(slm_dashboard.router)
app.include_router(seismo_dashboard.router)
app.include_router(modem_dashboard.router)

from backend.routers import settings
app.include_router(settings.router)

from backend.routers import watcher_manager
app.include_router(watcher_manager.router)

# Projects system routers
app.include_router(projects.router)
app.include_router(project_locations.router)
@@ -113,6 +122,14 @@ app.include_router(alerts.router)
from backend.routers import recurring_schedules
app.include_router(recurring_schedules.router)

# Fleet Calendar router
from backend.routers import fleet_calendar
app.include_router(fleet_calendar.router)

# Deployment Records router
from backend.routers import deployments
app.include_router(deployments.router)

# Start scheduler service and device status monitor on application startup
from backend.services.scheduler import start_scheduler, stop_scheduler
from backend.services.device_status_monitor import start_device_status_monitor, stop_device_status_monitor
@@ -216,6 +233,73 @@ async def seismographs_page(request: Request):
    return templates.TemplateResponse("seismographs.html", {"request": request})


@app.get("/modems", response_class=HTMLResponse)
async def modems_page(request: Request):
    """Field modems management dashboard"""
    return templates.TemplateResponse("modems.html", {"request": request})


@app.get("/pair-devices", response_class=HTMLResponse)
async def pair_devices_page(request: Request, db: Session = Depends(get_db)):
    """
    Device pairing page - two-column layout for pairing recorders with modems.
    """
    from backend.models import RosterUnit

    # Get all non-retired recorders (seismographs and SLMs)
    recorders = db.query(RosterUnit).filter(
        RosterUnit.retired == False,
        RosterUnit.device_type.in_(["seismograph", "slm", None])  # None defaults to seismograph
    ).order_by(RosterUnit.id).all()

    # Get all non-retired modems
    modems = db.query(RosterUnit).filter(
        RosterUnit.retired == False,
        RosterUnit.device_type == "modem"
    ).order_by(RosterUnit.id).all()

    # Build existing pairings list
    pairings = []
    for recorder in recorders:
        if recorder.deployed_with_modem_id:
            modem = next((m for m in modems if m.id == recorder.deployed_with_modem_id), None)
            pairings.append({
                "recorder_id": recorder.id,
                "recorder_type": (recorder.device_type or "seismograph").upper(),
                "modem_id": recorder.deployed_with_modem_id,
                "modem_ip": modem.ip_address if modem else None
            })

    # Convert to dicts for template
    recorders_data = [
        {
            "id": r.id,
            "device_type": r.device_type or "seismograph",
            "deployed": r.deployed,
            "deployed_with_modem_id": r.deployed_with_modem_id
        }
        for r in recorders
    ]

    modems_data = [
        {
            "id": m.id,
            "deployed": m.deployed,
            "deployed_with_unit_id": m.deployed_with_unit_id,
            "ip_address": m.ip_address,
            "phone_number": m.phone_number
        }
        for m in modems
    ]

    return templates.TemplateResponse("pair_devices.html", {
        "request": request,
        "recorders": recorders_data,
        "modems": modems_data,
        "pairings": pairings
    })


@app.get("/projects", response_class=HTMLResponse)
async def projects_page(request: Request):
    """Projects management and overview"""
@@ -239,7 +323,7 @@ async def nrl_detail_page(
    db: Session = Depends(get_db)
):
    """NRL (Noise Recording Location) detail page with tabs"""
    from backend.models import Project, MonitoringLocation, UnitAssignment, RosterUnit, RecordingSession, DataFile
    from backend.models import Project, MonitoringLocation, UnitAssignment, RosterUnit, MonitoringSession, DataFile
    from sqlalchemy import and_

    # Get project
@@ -271,27 +355,40 @@ async def nrl_detail_page(
    ).first()

    assigned_unit = None
    assigned_modem = None
    if assignment:
        assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
        if assigned_unit and assigned_unit.deployed_with_modem_id:
            assigned_modem = db.query(RosterUnit).filter_by(id=assigned_unit.deployed_with_modem_id).first()

    # Get session count
    session_count = db.query(RecordingSession).filter_by(location_id=location_id).count()
    session_count = db.query(MonitoringSession).filter_by(location_id=location_id).count()

    # Get file count (DataFile links to session, not directly to location)
    file_count = db.query(DataFile).join(
        RecordingSession,
        DataFile.session_id == RecordingSession.id
    ).filter(RecordingSession.location_id == location_id).count()
        MonitoringSession,
        DataFile.session_id == MonitoringSession.id
    ).filter(MonitoringSession.location_id == location_id).count()

    # Check for active session
    active_session = db.query(RecordingSession).filter(
    active_session = db.query(MonitoringSession).filter(
        and_(
            RecordingSession.location_id == location_id,
            RecordingSession.status == "recording"
            MonitoringSession.location_id == location_id,
            MonitoringSession.status == "recording"
        )
    ).first()

    return templates.TemplateResponse("nrl_detail.html", {
    # Parse connection_mode from location_metadata JSON
    import json as _json
    connection_mode = "connected"
    try:
        meta = _json.loads(location.location_metadata or "{}")
        connection_mode = meta.get("connection_mode", "connected")
    except Exception:
        pass

    template = "vibration_location_detail.html" if location.location_type == "vibration" else "nrl_detail.html"
    return templates.TemplateResponse(template, {
        "request": request,
        "project_id": project_id,
        "location_id": location_id,
@@ -299,9 +396,11 @@ async def nrl_detail_page(
        "location": location,
        "assignment": assignment,
        "assigned_unit": assigned_unit,
        "assigned_modem": assigned_modem,
        "session_count": session_count,
        "file_count": file_count,
        "active_session": active_session,
        "connection_mode": connection_mode,
    })
@@ -571,6 +670,7 @@ async def devices_all_partial(request: Request):
            "last_seen": unit_data.get("last", "Never"),
            "deployed": True,
            "retired": False,
            "out_for_calibration": False,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
@@ -580,6 +680,7 @@ async def devices_all_partial(request: Request):
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
@@ -594,6 +695,7 @@ async def devices_all_partial(request: Request):
            "last_seen": unit_data.get("last", "Never"),
            "deployed": False,
            "retired": False,
            "out_for_calibration": False,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
@@ -603,6 +705,59 @@ async def devices_all_partial(request: Request):
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        })

    # Add allocated units
    for unit_id, unit_data in snapshot.get("allocated", {}).items():
        units_list.append({
            "id": unit_id,
            "status": "Allocated",
            "age": "N/A",
            "last_seen": "N/A",
            "deployed": False,
            "retired": False,
            "out_for_calibration": False,
            "allocated": True,
            "allocated_to_project_id": unit_data.get("allocated_to_project_id", ""),
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "project_id": unit_data.get("project_id", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
        })

    # Add out-for-calibration units
    for unit_id, unit_data in snapshot["out_for_calibration"].items():
        units_list.append({
            "id": unit_id,
            "status": "Out for Calibration",
            "age": "N/A",
            "last_seen": "N/A",
            "deployed": False,
            "retired": False,
            "out_for_calibration": True,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
            "address": unit_data.get("address", ""),
            "coordinates": unit_data.get("coordinates", ""),
            "project_id": unit_data.get("project_id", ""),
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
@@ -617,6 +772,7 @@ async def devices_all_partial(request: Request):
            "last_seen": "N/A",
            "deployed": False,
            "retired": True,
            "out_for_calibration": False,
            "ignored": False,
            "note": unit_data.get("note", ""),
            "device_type": unit_data.get("device_type", "seismograph"),
@@ -626,6 +782,7 @@ async def devices_all_partial(request: Request):
            "last_calibrated": unit_data.get("last_calibrated"),
            "next_calibration_due": unit_data.get("next_calibration_due"),
            "deployed_with_modem_id": unit_data.get("deployed_with_modem_id"),
            "deployed_with_unit_id": unit_data.get("deployed_with_unit_id"),
            "ip_address": unit_data.get("ip_address"),
            "phone_number": unit_data.get("phone_number"),
            "hardware_model": unit_data.get("hardware_model"),
@@ -640,6 +797,7 @@ async def devices_all_partial(request: Request):
            "last_seen": "N/A",
            "deployed": False,
            "retired": False,
            "out_for_calibration": False,
            "ignored": True,
            "note": unit_data.get("note", unit_data.get("reason", "")),
            "device_type": unit_data.get("device_type", "unknown"),
@@ -649,6 +807,7 @@ async def devices_all_partial(request: Request):
            "last_calibrated": None,
            "next_calibration_due": None,
            "deployed_with_modem_id": None,
            "deployed_with_unit_id": None,
            "ip_address": None,
            "phone_number": None,
            "hardware_model": None,
@@ -656,22 +815,27 @@ async def devices_all_partial(request: Request):

    # Sort by status category, then by ID
    def sort_key(unit):
        # Priority: deployed (active) -> benched -> retired -> ignored
        # Priority: deployed (active) -> allocated -> benched -> out_for_calibration -> retired -> ignored
        if unit["deployed"]:
            return (0, unit["id"])
        elif not unit["retired"] and not unit["ignored"]:
        elif unit.get("allocated"):
            return (1, unit["id"])
        elif unit["retired"]:
        elif not unit["retired"] and not unit["out_for_calibration"] and not unit["ignored"]:
            return (2, unit["id"])
        else:
        elif unit["out_for_calibration"]:
            return (3, unit["id"])
        elif unit["retired"]:
            return (4, unit["id"])
        else:
            return (5, unit["id"])

    units_list.sort(key=sort_key)

    return templates.TemplateResponse("partials/devices_table.html", {
        "request": request,
        "units": units_list,
        "timestamp": datetime.now().strftime("%H:%M:%S")
        "timestamp": datetime.now().strftime("%H:%M:%S"),
        "user_timezone": get_user_timezone()
    })
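The new branch order in sort_key (deployed → allocated → benched → out for calibration → retired → ignored) is easier to verify in isolation than in the interleaved diff above. A minimal sketch, with made-up unit dicts (the helper `u` is ours, not part of the codebase):

```python
# Reconstructed from the new sort_key in the diff above.
def sort_key(unit):
    if unit["deployed"]:
        return (0, unit["id"])
    elif unit.get("allocated"):
        return (1, unit["id"])
    elif not unit["retired"] and not unit["out_for_calibration"] and not unit["ignored"]:
        return (2, unit["id"])
    elif unit["out_for_calibration"]:
        return (3, unit["id"])
    elif unit["retired"]:
        return (4, unit["id"])
    else:
        return (5, unit["id"])

def u(uid, **flags):
    """Build a minimal unit dict with all status flags off, then apply overrides."""
    base = {"id": uid, "deployed": False, "retired": False,
            "out_for_calibration": False, "ignored": False}
    base.update(flags)
    return base

units = [u("U3", retired=True), u("U1", deployed=True),
         u("U4", ignored=True), u("U2", allocated=True), u("U5")]
units.sort(key=sort_key)
print([x["id"] for x in units])  # ['U1', 'U2', 'U5', 'U3', 'U4']
```

U5 lands in the benched bucket because no flag is set, which matches the snapshot categories the endpoint iterates over.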
backend/migrate_add_allocated.py (new file, 35 lines)
@@ -0,0 +1,35 @@
"""
Migration: Add allocated and allocated_to_project_id columns to roster table.
Run once: python backend/migrate_add_allocated.py
"""
import sqlite3
import os

DB_PATH = os.path.join(os.path.dirname(__file__), '..', 'data', 'seismo_fleet.db')

def run():
    conn = sqlite3.connect(DB_PATH)
    cur = conn.cursor()

    # Check existing columns
    cur.execute("PRAGMA table_info(roster)")
    cols = {row[1] for row in cur.fetchall()}

    if 'allocated' not in cols:
        cur.execute("ALTER TABLE roster ADD COLUMN allocated BOOLEAN DEFAULT 0 NOT NULL")
        print("Added column: allocated")
    else:
        print("Column already exists: allocated")

    if 'allocated_to_project_id' not in cols:
        cur.execute("ALTER TABLE roster ADD COLUMN allocated_to_project_id VARCHAR")
        print("Added column: allocated_to_project_id")
    else:
        print("Column already exists: allocated_to_project_id")

    conn.commit()
    conn.close()
    print("Migration complete.")

if __name__ == '__main__':
    run()
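The `PRAGMA table_info` check this migration (and the ones below) relies on is a general idempotency pattern for SQLite. A minimal sketch against a throwaway in-memory database; the helper name `add_column_if_missing` is ours, not part of the codebase:

```python
import sqlite3

def add_column_if_missing(conn, table, column, ddl_type):
    """Idempotently add a column, mirroring the PRAGMA check in the migration."""
    cur = conn.cursor()
    cur.execute(f"PRAGMA table_info({table})")
    cols = {row[1] for row in cur.fetchall()}  # row[1] is the column name
    if column not in cols:
        cur.execute(f"ALTER TABLE {table} ADD COLUMN {column} {ddl_type}")
        conn.commit()
        return True
    return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE roster (id TEXT PRIMARY KEY)")
first = add_column_if_missing(conn, "roster", "allocated", "BOOLEAN DEFAULT 0 NOT NULL")
second = add_column_if_missing(conn, "roster", "allocated", "BOOLEAN DEFAULT 0 NOT NULL")
print(first, second)  # True False — the second run is a no-op
```

Note that SQLite only allows `ADD COLUMN ... NOT NULL` when a non-NULL default is supplied, which is why the migration pairs `NOT NULL` with `DEFAULT 0`.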
backend/migrate_add_deployment_records.py (new file, 79 lines)
@@ -0,0 +1,79 @@
"""
Migration: Add deployment_records table.

Tracks each time a unit is sent to the field and returned.
The active deployment is the row with actual_removal_date IS NULL.

Run once per database:
    python backend/migrate_add_deployment_records.py
"""

import sqlite3
import os

DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        return

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    try:
        # Check if table already exists
        cursor.execute("""
            SELECT name FROM sqlite_master
            WHERE type='table' AND name='deployment_records'
        """)
        if cursor.fetchone():
            print("✓ deployment_records table already exists, skipping")
            return

        print("Creating deployment_records table...")
        cursor.execute("""
            CREATE TABLE deployment_records (
                id TEXT PRIMARY KEY,
                unit_id TEXT NOT NULL,
                deployed_date DATE,
                estimated_removal_date DATE,
                actual_removal_date DATE,
                project_ref TEXT,
                project_id TEXT,
                location_name TEXT,
                notes TEXT,
                created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)

        cursor.execute("""
            CREATE INDEX idx_deployment_records_unit_id
            ON deployment_records(unit_id)
        """)
        cursor.execute("""
            CREATE INDEX idx_deployment_records_project_id
            ON deployment_records(project_id)
        """)
        # Index for finding active deployments quickly
        cursor.execute("""
            CREATE INDEX idx_deployment_records_active
            ON deployment_records(unit_id, actual_removal_date)
        """)

        conn.commit()
        print("✓ deployment_records table created successfully")
        print("✓ Indexes created")

    except Exception as e:
        conn.rollback()
        print(f"✗ Migration failed: {e}")
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
backend/migrate_add_deployment_type.py (new file, 84 lines)
@@ -0,0 +1,84 @@
"""
Migration script to add deployment_type and deployed_with_unit_id fields to roster table.

deployment_type: tracks what type of device a modem is deployed with:
    - "seismograph" - Modem is connected to a seismograph
    - "slm" - Modem is connected to a sound level meter
    - NULL/empty - Not assigned or unknown

deployed_with_unit_id: stores the ID of the seismograph/SLM this modem is deployed with
    (reverse relationship of deployed_with_modem_id)

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Add deployment_type and deployed_with_unit_id columns to roster table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if roster table exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='roster'")
    table_exists = cursor.fetchone()

    if not table_exists:
        print("Roster table does not exist yet - will be created when app runs")
        conn.close()
        return

    # Check existing columns
    cursor.execute("PRAGMA table_info(roster)")
    columns = [col[1] for col in cursor.fetchall()]

    try:
        # Add deployment_type if not exists
        if 'deployment_type' not in columns:
            print("Adding deployment_type column to roster table...")
            cursor.execute("ALTER TABLE roster ADD COLUMN deployment_type TEXT")
            print("  Added deployment_type column")

            cursor.execute("CREATE INDEX IF NOT EXISTS ix_roster_deployment_type ON roster(deployment_type)")
            print("  Created index on deployment_type")
        else:
            print("deployment_type column already exists")

        # Add deployed_with_unit_id if not exists
        if 'deployed_with_unit_id' not in columns:
            print("Adding deployed_with_unit_id column to roster table...")
            cursor.execute("ALTER TABLE roster ADD COLUMN deployed_with_unit_id TEXT")
            print("  Added deployed_with_unit_id column")

            cursor.execute("CREATE INDEX IF NOT EXISTS ix_roster_deployed_with_unit_id ON roster(deployed_with_unit_id)")
            print("  Created index on deployed_with_unit_id")
        else:
            print("deployed_with_unit_id column already exists")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
backend/migrate_add_estimated_units.py (new file, 62 lines)
@@ -0,0 +1,62 @@
"""
Migration: Add estimated_units to job_reservations

Adds column:
- job_reservations.estimated_units: Estimated number of units for the reservation (nullable integer)
"""

import sqlite3
import sys
from pathlib import Path

# Default database path (matches production pattern)
DB_PATH = "./data/seismo_fleet.db"


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Check if job_reservations table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
        if not cursor.fetchone():
            print("job_reservations table does not exist. Skipping migration.")
            return

        # Get existing columns in job_reservations
        cursor.execute("PRAGMA table_info(job_reservations)")
        existing_cols = {row[1] for row in cursor.fetchall()}

        # Add estimated_units column if it doesn't exist
        if 'estimated_units' not in existing_cols:
            print("Adding estimated_units column to job_reservations...")
            cursor.execute("ALTER TABLE job_reservations ADD COLUMN estimated_units INTEGER")
        else:
            print("estimated_units column already exists. Skipping.")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    db_path = DB_PATH

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
backend/migrate_add_job_reservations.py (new file, 103 lines)
@@ -0,0 +1,103 @@
"""
Migration script to add job reservations for the Fleet Calendar feature.

This creates two tables:
- job_reservations: Track future unit assignments for jobs/projects
- job_reservation_units: Link specific units to reservations

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Create the job_reservations and job_reservation_units tables"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if job_reservations table already exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
    if cursor.fetchone():
        print("Migration already applied - job_reservations table exists")
        conn.close()
        return

    print("Creating job_reservations table...")

    try:
        # Create job_reservations table
        cursor.execute("""
            CREATE TABLE job_reservations (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                project_id TEXT,
                start_date DATE NOT NULL,
                end_date DATE NOT NULL,
                assignment_type TEXT NOT NULL DEFAULT 'quantity',
                device_type TEXT DEFAULT 'seismograph',
                quantity_needed INTEGER,
                notes TEXT,
                color TEXT DEFAULT '#3B82F6',
                created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)
        print("  Created job_reservations table")

        # Create indexes for job_reservations
        cursor.execute("CREATE INDEX idx_job_reservations_project_id ON job_reservations(project_id)")
        print("  Created index on project_id")

        cursor.execute("CREATE INDEX idx_job_reservations_dates ON job_reservations(start_date, end_date)")
        print("  Created index on dates")

        # Create job_reservation_units table
        print("Creating job_reservation_units table...")
        cursor.execute("""
            CREATE TABLE job_reservation_units (
                id TEXT PRIMARY KEY,
                reservation_id TEXT NOT NULL,
                unit_id TEXT NOT NULL,
                assignment_source TEXT DEFAULT 'specific',
                assigned_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                FOREIGN KEY (reservation_id) REFERENCES job_reservations(id),
                FOREIGN KEY (unit_id) REFERENCES roster(id)
            )
        """)
        print("  Created job_reservation_units table")

        # Create indexes for job_reservation_units
        cursor.execute("CREATE INDEX idx_job_reservation_units_reservation_id ON job_reservation_units(reservation_id)")
        print("  Created index on reservation_id")

        cursor.execute("CREATE INDEX idx_job_reservation_units_unit_id ON job_reservation_units(unit_id)")
        print("  Created index on unit_id")

        conn.commit()
        print("\nMigration completed successfully!")
        print("You can now use the Fleet Calendar to manage unit reservations.")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
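As a quick sanity check of the new schema, the two tables can be created in memory and a 'specific' reservation linked to roster units. All IDs and the reservation name below are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce the FK constraints below
conn.executescript("""
CREATE TABLE roster (id TEXT PRIMARY KEY);
CREATE TABLE job_reservations (
    id TEXT PRIMARY KEY,
    name TEXT NOT NULL,
    start_date DATE NOT NULL,
    end_date DATE NOT NULL,
    assignment_type TEXT NOT NULL DEFAULT 'quantity'
);
CREATE TABLE job_reservation_units (
    id TEXT PRIMARY KEY,
    reservation_id TEXT NOT NULL,
    unit_id TEXT NOT NULL,
    FOREIGN KEY (reservation_id) REFERENCES job_reservations(id),
    FOREIGN KEY (unit_id) REFERENCES roster(id)
);
INSERT INTO roster VALUES ('BE1234'), ('BE5678');
INSERT INTO job_reservations (id, name, start_date, end_date, assignment_type)
    VALUES ('res1', 'Downtown blast monitoring', '2026-02-01', '2026-02-14', 'specific');
INSERT INTO job_reservation_units VALUES
    ('jru1', 'res1', 'BE1234'), ('jru2', 'res1', 'BE5678');
""")

units = conn.execute(
    "SELECT unit_id FROM job_reservation_units WHERE reservation_id = 'res1' ORDER BY unit_id"
).fetchall()
print([u[0] for u in units])  # ['BE1234', 'BE5678']
```

A 'quantity' reservation would instead leave job_reservation_units empty and rely on quantity_needed, per the assignment_type default in the schema.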
backend/migrate_add_location_slots.py (new file, 24 lines)
@@ -0,0 +1,24 @@
"""
Migration: Add location_slots column to job_reservations table.
Stores the full ordered slot list (including empty/unassigned slots) as JSON.
Run once per database.
"""
import sqlite3
import os

DB_PATH = os.environ.get("DB_PATH", "/app/data/seismo_fleet.db")

def run():
    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()
    existing = [r[1] for r in cursor.execute("PRAGMA table_info(job_reservations)").fetchall()]
    if "location_slots" not in existing:
        cursor.execute("ALTER TABLE job_reservations ADD COLUMN location_slots TEXT")
        conn.commit()
        print("Added location_slots column to job_reservations.")
    else:
        print("location_slots column already exists, skipping.")
    conn.close()

if __name__ == "__main__":
    run()
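The location_slots column stores the ordered slot list, empty slots included, as JSON text. A round-trip sketch (unit IDs invented):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_reservations (id TEXT PRIMARY KEY, location_slots TEXT)")

# None marks an empty/unassigned slot; the JSON array preserves slot order.
slots = ["BE1234", None, "BE5678"]
conn.execute(
    "INSERT INTO job_reservations (id, location_slots) VALUES (?, ?)",
    ("res1", json.dumps(slots)),
)

raw = conn.execute(
    "SELECT location_slots FROM job_reservations WHERE id = 'res1'"
).fetchone()[0]
print(json.loads(raw))  # ['BE1234', None, 'BE5678']
```

Python None serializes to JSON null and back, so empty slots survive the round trip without a sentinel value.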
backend/migrate_add_oneoff_schedule_fields.py (new file, 73 lines)
@@ -0,0 +1,73 @@
"""
Migration: Add one-off schedule fields to recurring_schedules table

Adds start_datetime and end_datetime columns for one-off recording schedules.

Run this script once to update existing databases:
    python -m backend.migrate_add_oneoff_schedule_fields
"""

import sqlite3
import os

DB_PATH = "data/seismo_fleet.db"


def migrate():
    """Add one-off schedule columns to recurring_schedules table."""
    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        return False

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    try:
        cursor.execute("""
            SELECT name FROM sqlite_master
            WHERE type='table' AND name='recurring_schedules'
        """)
        if not cursor.fetchone():
            print("recurring_schedules table does not exist yet. Will be created on app startup.")
            conn.close()
            return True

        cursor.execute("PRAGMA table_info(recurring_schedules)")
        columns = [row[1] for row in cursor.fetchall()]

        added = False

        if "start_datetime" not in columns:
            print("Adding start_datetime column to recurring_schedules table...")
            cursor.execute("""
                ALTER TABLE recurring_schedules
                ADD COLUMN start_datetime DATETIME NULL
            """)
            added = True

        if "end_datetime" not in columns:
            print("Adding end_datetime column to recurring_schedules table...")
            cursor.execute("""
                ALTER TABLE recurring_schedules
                ADD COLUMN end_datetime DATETIME NULL
            """)
            added = True

        if added:
            conn.commit()
            print("Successfully added one-off schedule columns.")
        else:
            print("One-off schedule columns already exist.")

        conn.close()
        return True

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.close()
        return False


if __name__ == "__main__":
    success = migrate()
    exit(0 if success else 1)
backend/migrate_add_out_for_calibration.py (new file, 54 lines)
@@ -0,0 +1,54 @@
"""
Database Migration: Add out_for_calibration field to roster table

Changes:
- Adds out_for_calibration BOOLEAN column (default FALSE) to roster table
- Safe to run multiple times (idempotent)
- No data loss

Usage:
    python backend/migrate_add_out_for_calibration.py
"""

import sys
import os

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

SQLALCHEMY_DATABASE_URL = "sqlite:///./data/seismo_fleet.db"
engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False})
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def migrate():
    db = SessionLocal()
    try:
        print("=" * 60)
        print("Migration: Add out_for_calibration to roster")
        print("=" * 60)

        # Check if column already exists
        result = db.execute(text("PRAGMA table_info(roster)")).fetchall()
        columns = [row[1] for row in result]

        if "out_for_calibration" in columns:
            print("Column out_for_calibration already exists. Skipping.")
        else:
            db.execute(text("ALTER TABLE roster ADD COLUMN out_for_calibration BOOLEAN DEFAULT FALSE"))
            db.commit()
            print("Added out_for_calibration column to roster table.")

        print("Migration complete.")
    except Exception as e:
        db.rollback()
        print(f"Error: {e}")
        raise
    finally:
        db.close()


if __name__ == "__main__":
    migrate()
53
backend/migrate_add_project_data_collection_mode.py
Normal file
@@ -0,0 +1,53 @@
#!/usr/bin/env python3
"""
Migration: Add data_collection_mode column to projects table.

Values:
    "remote" — units have modems; data pulled via FTP/scheduler automatically
    "manual" — no modem; SD cards retrieved daily and uploaded by hand

All existing projects are backfilled to "manual" (safe conservative default).

Run once inside the Docker container:
    docker exec terra-view python3 backend/migrate_add_project_data_collection_mode.py
"""
from pathlib import Path

DB_PATH = Path("data/seismo_fleet.db")


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # ── 1. Add column (idempotent) ───────────────────────────────────────────
    cur.execute("PRAGMA table_info(projects)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    if "data_collection_mode" not in existing_cols:
        cur.execute("ALTER TABLE projects ADD COLUMN data_collection_mode TEXT DEFAULT 'manual'")
        conn.commit()
        print("✓ Added column data_collection_mode to projects")
    else:
        print("○ Column data_collection_mode already exists — skipping ALTER TABLE")

    # ── 2. Backfill NULLs to 'manual' ────────────────────────────────────────
    cur.execute("UPDATE projects SET data_collection_mode = 'manual' WHERE data_collection_mode IS NULL")
    updated = cur.rowcount
    conn.commit()
    conn.close()

    if updated:
        print(f"✓ Backfilled {updated} project(s) to data_collection_mode='manual'.")
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
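The PRAGMA table_info check is what makes these scripts safe to re-run. A minimal standalone sketch of the same idempotent column-add pattern, using a throwaway in-memory database (the helper name `add_column_idempotent` is illustrative, not part of the codebase):

```python
import sqlite3

def add_column_idempotent(conn: sqlite3.Connection, table: str, ddl: str, column: str) -> bool:
    """Run `ddl` unless PRAGMA table_info shows `column` already exists."""
    cols = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}  # row[1] is the column name
    if column in cols:
        return False  # already applied, nothing to do
    conn.execute(ddl)
    conn.commit()
    return True

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE projects (id TEXT PRIMARY KEY)")
ddl = "ALTER TABLE projects ADD COLUMN data_collection_mode TEXT DEFAULT 'manual'"
first = add_column_idempotent(conn, "projects", ddl, "data_collection_mode")   # adds the column
second = add_column_idempotent(conn, "projects", ddl, "data_collection_mode")  # no-op on re-run
print(first, second)
```

Running the helper twice shows the second invocation is a clean no-op, which is exactly the property the migrations rely on.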
56
backend/migrate_add_project_deleted_at.py
Normal file
@@ -0,0 +1,56 @@
"""
Migration: Add deleted_at column to projects table

Adds columns:
- projects.deleted_at: Timestamp set when status='deleted'; data hard-deleted after 60 days
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='projects'")
        if not cursor.fetchone():
            print("projects table does not exist. Skipping migration.")
            return

        cursor.execute("PRAGMA table_info(projects)")
        existing_cols = {row[1] for row in cursor.fetchall()}

        if 'deleted_at' not in existing_cols:
            print("Adding deleted_at column to projects...")
            cursor.execute("ALTER TABLE projects ADD COLUMN deleted_at DATETIME")
        else:
            print("deleted_at column already exists. Skipping.")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    db_path = "./data/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
71
backend/migrate_add_project_modules.py
Normal file
@@ -0,0 +1,71 @@
"""
Migration: Add project_modules table and seed from existing project_type_id values.

Safe to run multiple times — idempotent.
"""
import sqlite3
import uuid
import os

DB_PATH = os.path.join(os.path.dirname(__file__), "..", "data", "seismo_fleet.db")
DB_PATH = os.path.abspath(DB_PATH)


def run():
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # 1. Create project_modules table if not exists
    cur.execute("""
        CREATE TABLE IF NOT EXISTS project_modules (
            id TEXT PRIMARY KEY,
            project_id TEXT NOT NULL,
            module_type TEXT NOT NULL,
            enabled INTEGER NOT NULL DEFAULT 1,
            created_at TEXT NOT NULL DEFAULT (datetime('now')),
            UNIQUE(project_id, module_type)
        )
    """)
    print(" Table 'project_modules' ready.")

    # 2. Seed modules from existing project_type_id values
    cur.execute("SELECT id, project_type_id FROM projects WHERE project_type_id IS NOT NULL")
    projects = cur.fetchall()

    seeded = 0
    for p in projects:
        pid = p["id"]
        ptype = p["project_type_id"]

        modules_to_add = []
        if ptype == "sound_monitoring":
            modules_to_add = ["sound_monitoring"]
        elif ptype == "vibration_monitoring":
            modules_to_add = ["vibration_monitoring"]
        elif ptype == "combined":
            modules_to_add = ["sound_monitoring", "vibration_monitoring"]

        for module_type in modules_to_add:
            # INSERT OR IGNORE — skip if already exists
            cur.execute("""
                INSERT OR IGNORE INTO project_modules (id, project_id, module_type, enabled)
                VALUES (?, ?, ?, 1)
            """, (str(uuid.uuid4()), pid, module_type))
            if cur.rowcount > 0:
                seeded += 1

    conn.commit()
    print(f" Seeded {seeded} module record(s) from existing project_type_id values.")

    # 3. Make project_type_id nullable (SQLite doesn't support ALTER COLUMN,
    #    but since we're just loosening a constraint this is a no-op in SQLite —
    #    the column already accepts NULL in practice. Nothing to do.)
    print(" project_type_id column is now treated as nullable (legacy field).")

    conn.close()
    print("Migration complete.")


if __name__ == "__main__":
    run()
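The seeding step counts only rows actually inserted, because INSERT OR IGNORE plus the UNIQUE(project_id, module_type) constraint silently drops duplicates and leaves cursor.rowcount at 0. A self-contained sketch of that behavior (the `seed` helper and "p1" id are illustrative only):

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE project_modules (
        id TEXT PRIMARY KEY,
        project_id TEXT NOT NULL,
        module_type TEXT NOT NULL,
        enabled INTEGER NOT NULL DEFAULT 1,
        UNIQUE(project_id, module_type)
    )
""")

def seed(project_id: str, module_type: str) -> bool:
    """True only when a new row was actually inserted (rowcount > 0)."""
    cur = conn.execute(
        "INSERT OR IGNORE INTO project_modules (id, project_id, module_type, enabled) VALUES (?, ?, ?, 1)",
        (str(uuid.uuid4()), project_id, module_type),
    )
    return cur.rowcount > 0

first = seed("p1", "sound_monitoring")      # inserts a row
duplicate = seed("p1", "sound_monitoring")  # UNIQUE(project_id, module_type) blocks it
print(first, duplicate)
```

This is why re-running the migration reports "Seeded 0 module record(s)" rather than piling up duplicates.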
80
backend/migrate_add_project_number.py
Normal file
@@ -0,0 +1,80 @@
"""
Migration script to add project_number field to projects table.

This adds a new column for TMI internal project numbering:
- Format: xxxx-YY (e.g., "2567-23")
- xxxx = incremental project number
- YY = year project was started

Combined with client_name and name (project/site name), this enables
smart searching across all project identifiers.

Run this script once to migrate an existing database.
"""

import sqlite3
import os

# Database path
DB_PATH = "./data/seismo_fleet.db"


def migrate_database():
    """Add project_number column to projects table"""

    if not os.path.exists(DB_PATH):
        print(f"Database not found at {DB_PATH}")
        print("The database will be created automatically when you run the application.")
        return

    print(f"Migrating database: {DB_PATH}")

    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()

    # Check if projects table exists
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='projects'")
    table_exists = cursor.fetchone()

    if not table_exists:
        print("Projects table does not exist yet - will be created when app runs")
        conn.close()
        return

    # Check if project_number column already exists
    cursor.execute("PRAGMA table_info(projects)")
    columns = [col[1] for col in cursor.fetchall()]

    if 'project_number' in columns:
        print("Migration already applied - project_number column exists")
        conn.close()
        return

    print("Adding project_number column to projects table...")

    try:
        cursor.execute("ALTER TABLE projects ADD COLUMN project_number TEXT")
        print(" Added project_number column")

        # Create index for faster searching
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_projects_project_number ON projects(project_number)")
        print(" Created index on project_number")

        # Also add index on client_name if it doesn't exist
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_projects_client_name ON projects(client_name)")
        print(" Created index on client_name")

        conn.commit()
        print("\nMigration completed successfully!")

    except sqlite3.Error as e:
        print(f"\nError during migration: {e}")
        conn.rollback()
        raise

    finally:
        conn.close()


if __name__ == "__main__":
    migrate_database()
127
backend/migrate_add_session_device_model.py
Normal file
@@ -0,0 +1,127 @@
#!/usr/bin/env python3
"""
Migration: Add device_model column to monitoring_sessions table.

Records which physical SLM model produced each session's data (e.g. "NL-43",
"NL-53", "NL-32"). Used by report generation to apply the correct parsing
logic without re-opening files to detect format.

Run once inside the Docker container:
    docker exec terra-view python3 backend/migrate_add_session_device_model.py

Backfill strategy for existing rows:
  1. If session.unit_id is set, use roster.slm_model for that unit.
  2. Else, peek at the first .rnd file in the session: presence of the 'LAeq'
     column header identifies AU2 / NL-32 format.
Sessions where neither hint is available remain NULL — the file-content
fallback in report code handles them transparently.
"""
import csv
import io
from pathlib import Path

DB_PATH = Path("data/seismo_fleet.db")


def _peek_first_row(abs_path: Path) -> dict:
    """Read only the header + first data row of an RND file. Very cheap."""
    try:
        with open(abs_path, "r", encoding="utf-8", errors="replace") as f:
            reader = csv.DictReader(f)
            return next(reader, None) or {}
    except Exception:
        return {}


def _detect_model_from_rnd(abs_path: Path) -> str | None:
    """Return 'NL-32' if file uses AU2 column format, else None."""
    row = _peek_first_row(abs_path)
    if "LAeq" in row:
        return "NL-32"
    return None


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # ── 1. Add column (idempotent) ───────────────────────────────────────────
    cur.execute("PRAGMA table_info(monitoring_sessions)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    if "device_model" not in existing_cols:
        cur.execute("ALTER TABLE monitoring_sessions ADD COLUMN device_model TEXT")
        conn.commit()
        print("✓ Added column device_model to monitoring_sessions")
    else:
        print("○ Column device_model already exists — skipping ALTER TABLE")

    # ── 2. Backfill existing NULL rows ───────────────────────────────────────
    cur.execute(
        "SELECT id, unit_id FROM monitoring_sessions WHERE device_model IS NULL"
    )
    sessions = cur.fetchall()
    print(f"Backfilling {len(sessions)} session(s) with device_model=NULL...")

    updated = skipped = 0
    for row in sessions:
        session_id = row["id"]
        unit_id = row["unit_id"]
        device_model = None

        # Strategy A: look up unit's slm_model from the roster
        if unit_id:
            cur.execute(
                "SELECT slm_model FROM roster WHERE id = ?", (unit_id,)
            )
            unit_row = cur.fetchone()
            if unit_row and unit_row["slm_model"]:
                device_model = unit_row["slm_model"]

        # Strategy B: detect from first .rnd file in the session
        if device_model is None:
            cur.execute(
                """SELECT file_path FROM data_files
                   WHERE session_id = ?
                     AND lower(file_path) LIKE '%.rnd'
                   LIMIT 1""",
                (session_id,),
            )
            file_row = cur.fetchone()
            if file_row:
                abs_path = Path("data") / file_row["file_path"]
                device_model = _detect_model_from_rnd(abs_path)
                # None here means NL-43/NL-53 format (or unreadable file) —
                # leave as NULL so the existing fallback applies.

        if device_model:
            cur.execute(
                "UPDATE monitoring_sessions SET device_model = ? WHERE id = ?",
                (device_model, session_id),
            )
            updated += 1
        else:
            skipped += 1

    conn.commit()
    conn.close()

    print(f"✓ Backfilled {updated} session(s) with a device_model.")
    if skipped:
        print(
            f"  {skipped} session(s) left as NULL "
            "(no unit link and no AU2 file hint — NL-43/NL-53 or unknown; "
            "file-content detection applies at report time)."
        )
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
42
backend/migrate_add_session_period_hours.py
Normal file
@@ -0,0 +1,42 @@
"""
Migration: add period_start_hour and period_end_hour to monitoring_sessions.

Run once:
    python backend/migrate_add_session_period_hours.py

Or inside the container:
    docker exec terra-view python3 backend/migrate_add_session_period_hours.py
"""

import sys
import os
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from backend.database import engine
from sqlalchemy import text


def run():
    with engine.connect() as conn:
        # Check which columns already exist
        result = conn.execute(text("PRAGMA table_info(monitoring_sessions)"))
        existing = {row[1] for row in result}

        added = []
        for col, definition in [
            ("period_start_hour", "INTEGER"),
            ("period_end_hour", "INTEGER"),
        ]:
            if col not in existing:
                conn.execute(text(f"ALTER TABLE monitoring_sessions ADD COLUMN {col} {definition}"))
                added.append(col)
            else:
                print(f"  Column '{col}' already exists — skipping.")

        conn.commit()

    if added:
        print(f"  Added columns: {', '.join(added)}")
    print("Migration complete.")


if __name__ == "__main__":
    run()
131
backend/migrate_add_session_period_type.py
Normal file
@@ -0,0 +1,131 @@
#!/usr/bin/env python3
"""
Migration: Add session_label and period_type columns to monitoring_sessions.

session_label - user-editable display name, e.g. "NRL-1 Sun 2/23 Night"
period_type   - one of: weekday_day | weekday_night | weekend_day | weekend_night
                Auto-derived from started_at when NULL.

Period definitions (used in report stats table):
    weekday_day     Mon-Fri 07:00-22:00  -> Daytime (7AM-10PM)
    weekday_night   Mon-Fri 22:00-07:00  -> Nighttime (10PM-7AM)
    weekend_day     Sat-Sun 07:00-22:00  -> Daytime (7AM-10PM)
    weekend_night   Sat-Sun 22:00-07:00  -> Nighttime (10PM-7AM)

Run once inside the Docker container:
    docker exec terra-view python3 backend/migrate_add_session_period_type.py
"""
from pathlib import Path
from datetime import datetime

DB_PATH = Path("data/seismo_fleet.db")


def _derive_period_type(started_at_str: str) -> str | None:
    """Derive period_type from a started_at ISO datetime string."""
    if not started_at_str:
        return None
    try:
        dt = datetime.fromisoformat(started_at_str)
    except ValueError:
        return None
    is_weekend = dt.weekday() >= 5  # 5=Sat, 6=Sun
    is_night = dt.hour >= 22 or dt.hour < 7
    if is_weekend:
        return "weekend_night" if is_night else "weekend_day"
    else:
        return "weekday_night" if is_night else "weekday_day"


def _build_label(started_at_str: str, location_name: str | None, period_type: str | None) -> str | None:
    """Build a human-readable session label."""
    if not started_at_str:
        return None
    try:
        dt = datetime.fromisoformat(started_at_str)
    except ValueError:
        return None

    day_abbr = dt.strftime("%a")  # Mon, Tue, Sun, etc.
    date_str = dt.strftime("%-m/%-d")  # 2/23

    period_labels = {
        "weekday_day": "Day",
        "weekday_night": "Night",
        "weekend_day": "Day",
        "weekend_night": "Night",
    }
    period_str = period_labels.get(period_type or "", "")

    parts = []
    if location_name:
        parts.append(location_name)
    parts.append(f"{day_abbr} {date_str}")
    if period_str:
        parts.append(period_str)
    return " — ".join(parts)


def migrate():
    import sqlite3

    if not DB_PATH.exists():
        print(f"Database not found at {DB_PATH}. Are you running from /home/serversdown/terra-view?")
        return

    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    cur = conn.cursor()

    # 1. Add columns (idempotent)
    cur.execute("PRAGMA table_info(monitoring_sessions)")
    existing_cols = {row["name"] for row in cur.fetchall()}

    for col, typedef in [("session_label", "TEXT"), ("period_type", "TEXT")]:
        if col not in existing_cols:
            cur.execute(f"ALTER TABLE monitoring_sessions ADD COLUMN {col} {typedef}")
            conn.commit()
            print(f"✓ Added column {col} to monitoring_sessions")
        else:
            print(f"○ Column {col} already exists — skipping ALTER TABLE")

    # 2. Backfill existing rows
    cur.execute(
        """SELECT ms.id, ms.started_at, ms.location_id
           FROM monitoring_sessions ms
           WHERE ms.period_type IS NULL OR ms.session_label IS NULL"""
    )
    sessions = cur.fetchall()
    print(f"Backfilling {len(sessions)} session(s)...")

    updated = 0
    for row in sessions:
        session_id = row["id"]
        started_at = row["started_at"]
        location_id = row["location_id"]

        # Look up location name
        location_name = None
        if location_id:
            cur.execute("SELECT name FROM monitoring_locations WHERE id = ?", (location_id,))
            loc_row = cur.fetchone()
            if loc_row:
                location_name = loc_row["name"]

        period_type = _derive_period_type(started_at)
        label = _build_label(started_at, location_name, period_type)

        cur.execute(
            "UPDATE monitoring_sessions SET period_type = ?, session_label = ? WHERE id = ?",
            (period_type, label, session_id),
        )
        updated += 1

    conn.commit()
    conn.close()
    print(f"✓ Backfilled {updated} session(s).")
    print("Migration complete.")


if __name__ == "__main__":
    migrate()
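The period derivation rules are easy to exercise directly. This standalone copy of the weekday/weekend and day/night logic checks two sample timestamps (2025-02-23 is a Sunday, which is also the date in the "NRL-1 Sun 2/23 Night" label example; the timestamps themselves are invented):

```python
from datetime import datetime

def derive_period_type(dt: datetime) -> str:
    """Same rules as the migration: Sat/Sun are weekend; 22:00 to 07:00 is night."""
    is_weekend = dt.weekday() >= 5  # 5=Sat, 6=Sun
    is_night = dt.hour >= 22 or dt.hour < 7
    if is_weekend:
        return "weekend_night" if is_night else "weekend_day"
    return "weekday_night" if is_night else "weekday_day"

sun_night = derive_period_type(datetime.fromisoformat("2025-02-23T23:30:00"))  # Sunday, 23:30
mon_day = derive_period_type(datetime.fromisoformat("2025-02-24T08:00:00"))    # Monday, 08:00
print(sun_night, mon_day)
```

Note the asymmetry in the night window: a session starting at 23:30 Sunday is classified weekend_night even though most of it runs into Monday morning, since only started_at is consulted.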
41
backend/migrate_add_session_report_date.py
Normal file
@@ -0,0 +1,41 @@
"""
Migration: add report_date to monitoring_sessions.

Run once:
    python backend/migrate_add_session_report_date.py

Or inside the container:
    docker exec terra-view-terra-view-1 python3 backend/migrate_add_session_report_date.py
"""

import sys
import os
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from backend.database import engine
from sqlalchemy import text


def run():
    with engine.connect() as conn:
        # Check which columns already exist
        result = conn.execute(text("PRAGMA table_info(monitoring_sessions)"))
        existing = {row[1] for row in result}

        added = []
        for col, definition in [
            ("report_date", "DATE"),
        ]:
            if col not in existing:
                conn.execute(text(f"ALTER TABLE monitoring_sessions ADD COLUMN {col} {definition}"))
                added.append(col)
            else:
                print(f"  Column '{col}' already exists — skipping.")

        conn.commit()

    if added:
        print(f"  Added columns: {', '.join(added)}")
    print("Migration complete.")


if __name__ == "__main__":
    run()
89
backend/migrate_add_tbd_dates.py
Normal file
@@ -0,0 +1,89 @@
"""
Migration: Add TBD date support to job reservations

Adds columns:
- job_reservations.estimated_end_date: For planning when end is TBD
- job_reservations.end_date_tbd: Boolean flag for TBD end dates
- job_reservation_units.unit_start_date: Unit-specific start (for swaps)
- job_reservation_units.unit_end_date: Unit-specific end (for swaps)
- job_reservation_units.unit_end_tbd: Unit-specific TBD flag
- job_reservation_units.notes: Notes for the assignment

Also makes job_reservations.end_date nullable.
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Check if job_reservations table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
        if not cursor.fetchone():
            print("job_reservations table does not exist. Skipping migration.")
            return

        # Get existing columns in job_reservations
        cursor.execute("PRAGMA table_info(job_reservations)")
        existing_cols = {row[1] for row in cursor.fetchall()}

        # Add new columns to job_reservations if they don't exist
        if 'estimated_end_date' not in existing_cols:
            print("Adding estimated_end_date column to job_reservations...")
            cursor.execute("ALTER TABLE job_reservations ADD COLUMN estimated_end_date DATE")

        if 'end_date_tbd' not in existing_cols:
            print("Adding end_date_tbd column to job_reservations...")
            cursor.execute("ALTER TABLE job_reservations ADD COLUMN end_date_tbd BOOLEAN DEFAULT 0")

        # Get existing columns in job_reservation_units
        cursor.execute("PRAGMA table_info(job_reservation_units)")
        unit_cols = {row[1] for row in cursor.fetchall()}

        # Add new columns to job_reservation_units if they don't exist
        if 'unit_start_date' not in unit_cols:
            print("Adding unit_start_date column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_start_date DATE")

        if 'unit_end_date' not in unit_cols:
            print("Adding unit_end_date column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_end_date DATE")

        if 'unit_end_tbd' not in unit_cols:
            print("Adding unit_end_tbd column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN unit_end_tbd BOOLEAN DEFAULT 0")

        if 'notes' not in unit_cols:
            print("Adding notes column to job_reservation_units...")
            cursor.execute("ALTER TABLE job_reservation_units ADD COLUMN notes TEXT")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    # Default to dev database
    db_path = "./data-dev/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
105
backend/migrate_fix_end_date_nullable.py
Normal file
@@ -0,0 +1,105 @@
"""
Migration: Make job_reservations.end_date nullable for TBD support

SQLite doesn't support ALTER COLUMN, so we need to:
1. Create a new table with the correct schema
2. Copy data
3. Drop old table
4. Rename new table
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Check if job_reservations table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='job_reservations'")
        if not cursor.fetchone():
            print("job_reservations table does not exist. Skipping migration.")
            return

        # Check current schema
        cursor.execute("PRAGMA table_info(job_reservations)")
        columns = cursor.fetchall()
        col_info = {row[1]: row for row in columns}

        # Check if end_date is already nullable (notnull=0)
        if 'end_date' in col_info and col_info['end_date'][3] == 0:
            print("end_date is already nullable. Skipping table recreation.")
            return

        print("Recreating job_reservations table with nullable end_date...")

        # Create new table with correct schema
        cursor.execute("""
            CREATE TABLE job_reservations_new (
                id TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                project_id TEXT,
                start_date DATE NOT NULL,
                end_date DATE,
                estimated_end_date DATE,
                end_date_tbd BOOLEAN DEFAULT 0,
                assignment_type TEXT NOT NULL DEFAULT 'quantity',
                device_type TEXT DEFAULT 'seismograph',
                quantity_needed INTEGER,
                notes TEXT,
                color TEXT DEFAULT '#3B82F6',
                created_at DATETIME,
                updated_at DATETIME
            )
        """)

        # Copy existing data
        cursor.execute("""
            INSERT INTO job_reservations_new
            SELECT
                id, name, project_id, start_date, end_date,
                COALESCE(estimated_end_date, NULL) as estimated_end_date,
                COALESCE(end_date_tbd, 0) as end_date_tbd,
                assignment_type, device_type, quantity_needed, notes, color,
                created_at, updated_at
            FROM job_reservations
        """)

        # Drop old table
        cursor.execute("DROP TABLE job_reservations")

        # Rename new table
        cursor.execute("ALTER TABLE job_reservations_new RENAME TO job_reservations")

        # Recreate index
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_job_reservations_id ON job_reservations (id)")
        cursor.execute("CREATE INDEX IF NOT EXISTS ix_job_reservations_project_id ON job_reservations (project_id)")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    # Default to dev database
    db_path = "./data-dev/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
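The nullable check reads PRAGMA table_info's notnull flag, which sits at index 3 of each row (cid, name, type, notnull, dflt_value, pk). A toy run of the same create/copy/drop/rename sequence on an in-memory database, with a deliberately minimal two-column schema rather than the real one, shows the flag flipping:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_reservations (id TEXT PRIMARY KEY, end_date DATE NOT NULL)")
conn.execute("INSERT INTO job_reservations VALUES ('r1', '2024-06-01')")

def end_date_notnull(conn: sqlite3.Connection) -> int:
    """PRAGMA table_info row layout: (cid, name, type, notnull, dflt_value, pk)."""
    info = {row[1]: row for row in conn.execute("PRAGMA table_info(job_reservations)")}
    return info["end_date"][3]

before = end_date_notnull(conn)  # 1, since the column is declared NOT NULL

# Recreate with end_date nullable, copy data, swap tables (same steps as the script).
conn.execute("CREATE TABLE job_reservations_new (id TEXT PRIMARY KEY, end_date DATE)")
conn.execute("INSERT INTO job_reservations_new SELECT id, end_date FROM job_reservations")
conn.execute("DROP TABLE job_reservations")
conn.execute("ALTER TABLE job_reservations_new RENAME TO job_reservations")
conn.commit()

after = end_date_notnull(conn)  # 0, nullable: TBD reservations can now leave end_date NULL
print(before, after)
```

The existing row survives the swap, which is the whole point of copying before dropping.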
54
backend/migrate_rename_recording_to_monitoring_sessions.py
Normal file
@@ -0,0 +1,54 @@
"""
Migration: Rename recording_sessions table to monitoring_sessions

Renames the table and updates the model name from RecordingSession to MonitoringSession.
Run once per database: python backend/migrate_rename_recording_to_monitoring_sessions.py
"""

import sqlite3
import sys
from pathlib import Path


def migrate(db_path: str):
    """Run the migration."""
    print(f"Migrating database: {db_path}")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='recording_sessions'")
        if not cursor.fetchone():
            cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='monitoring_sessions'")
            if cursor.fetchone():
                print("monitoring_sessions table already exists. Skipping migration.")
            else:
                print("recording_sessions table does not exist. Skipping migration.")
            return

        print("Renaming recording_sessions -> monitoring_sessions...")
        cursor.execute("ALTER TABLE recording_sessions RENAME TO monitoring_sessions")

        conn.commit()
        print("Migration completed successfully!")

    except Exception as e:
        print(f"Migration failed: {e}")
        conn.rollback()
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    db_path = "./data/seismo_fleet.db"

    if len(sys.argv) > 1:
        db_path = sys.argv[1]

    if not Path(db_path).exists():
        print(f"Database not found: {db_path}")
        sys.exit(1)

    migrate(db_path)
@@ -1,4 +1,4 @@
|
||||
from sqlalchemy import Column, String, DateTime, Boolean, Text, Date, Integer
|
||||
from sqlalchemy import Column, String, DateTime, Boolean, Text, Date, Integer, UniqueConstraint
|
||||
from datetime import datetime
|
||||
from backend.database import Base
|
||||
|
||||
@@ -32,6 +32,9 @@ class RosterUnit(Base):
|
||||
device_type = Column(String, default="seismograph") # "seismograph" | "modem" | "slm"
|
||||
deployed = Column(Boolean, default=True)
|
||||
retired = Column(Boolean, default=False)
|
||||
out_for_calibration = Column(Boolean, default=False)
|
||||
allocated = Column(Boolean, default=False) # Staged for an upcoming job, not yet deployed
|
||||
allocated_to_project_id = Column(String, nullable=True) # Which project it's allocated to
|
||||
note = Column(String, nullable=True)
|
||||
project_id = Column(String, nullable=True)
|
||||
location = Column(String, nullable=True) # Legacy field - use address/coordinates instead
|
||||
@@ -50,6 +53,8 @@ class RosterUnit(Base):
|
||||
ip_address = Column(String, nullable=True)
|
||||
phone_number = Column(String, nullable=True)
|
||||
hardware_model = Column(String, nullable=True)
|
||||
deployment_type = Column(String, nullable=True) # "seismograph" | "slm" - what type of device this modem is deployed with
|
||||
deployed_with_unit_id = Column(String, nullable=True) # ID of seismograph/SLM this modem is deployed with
|
||||
|
||||
# Sound Level Meter-specific fields (nullable for seismographs and modems)
|
||||
slm_host = Column(String, nullable=True) # Device IP or hostname
|
||||
@@ -63,6 +68,26 @@ class RosterUnit(Base):
|
||||
slm_last_check = Column(DateTime, nullable=True) # Last communication check
|
||||
|
||||
|
||||
class WatcherAgent(Base):
|
||||
"""
|
||||
Watcher agents: tracks the watcher processes (series3-watcher, thor-watcher)
|
||||
that run on field machines and report unit heartbeats.
|
||||
|
||||
Updated on every heartbeat received from each source_id.
|
||||
"""
|
||||
__tablename__ = "watcher_agents"
|
||||
|
||||
id = Column(String, primary_key=True, index=True) # source_id (hostname)
|
||||
source_type = Column(String, nullable=False) # series3_watcher | series4_watcher
|
||||
version = Column(String, nullable=True) # e.g. "1.4.0"
|
||||
last_seen = Column(DateTime, default=datetime.utcnow)
|
||||
status = Column(String, nullable=False, default="unknown") # ok | pending | missing | error | unknown
|
||||
ip_address = Column(String, nullable=True)
|
||||
log_tail = Column(Text, nullable=True) # last N log lines (JSON array of strings)
|
||||
update_pending = Column(Boolean, default=False) # set True to trigger remote update
|
||||
update_version = Column(String, nullable=True) # target version to update to
|
||||
|
||||
|
||||
class IgnoredUnit(Base):
    """
    Ignored units: units that report but should be filtered out from unknown emitters.
@@ -137,17 +162,31 @@ class Project(Base):
    """
    Projects: top-level organization for monitoring work.
    Type-aware to enable/disable features based on project_type_id.

    Project naming convention:
    - project_number: TMI internal ID format xxxx-YY (e.g., "2567-23")
    - client_name: Client/contractor name (e.g., "PJ Dick")
    - name: Project/site name (e.g., "RKM Hall", "CMU Campus")

    Display format: "2567-23 - PJ Dick - RKM Hall"
    Users can search by any of these fields.
    """
    __tablename__ = "projects"

    id = Column(String, primary_key=True, index=True)  # UUID
    name = Column(String, nullable=False, unique=True)
    project_number = Column(String, nullable=True, index=True)  # TMI ID: xxxx-YY format (e.g., "2567-23")
    name = Column(String, nullable=False, unique=True)  # Project/site name (e.g., "RKM Hall")
    description = Column(Text, nullable=True)
    project_type_id = Column(String, nullable=False)  # FK to ProjectType.id
    status = Column(String, default="active")  # active, completed, archived
    project_type_id = Column(String, nullable=True)  # Legacy FK to ProjectType.id; use ProjectModule for feature flags
    status = Column(String, default="active")  # active, on_hold, completed, archived, deleted

    # Data collection mode: how field data reaches Terra-View.
    # "remote" — units have modems; data pulled via FTP/scheduler automatically
    # "manual" — no modem; SD cards retrieved daily and uploaded by hand
    data_collection_mode = Column(String, default="manual")  # remote | manual

    # Project metadata
    client_name = Column(String, nullable=True)
    client_name = Column(String, nullable=True, index=True)  # Client name (e.g., "PJ Dick")
    site_address = Column(String, nullable=True)
    site_coordinates = Column(String, nullable=True)  # "lat,lon"
    start_date = Column(Date, nullable=True)
@@ -155,6 +194,23 @@ class Project(Base):

    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
    deleted_at = Column(DateTime, nullable=True)  # Set when status='deleted'; hard delete scheduled after 60 days


class ProjectModule(Base):
    """
    Modules enabled on a project. Each module unlocks a set of features/tabs.
    A project can have zero or more modules (sound_monitoring, vibration_monitoring, etc.).
    """
    __tablename__ = "project_modules"

    id = Column(String, primary_key=True, default=lambda: str(__import__('uuid').uuid4()))
    project_id = Column(String, nullable=False, index=True)  # FK to projects.id
    module_type = Column(String, nullable=False)  # sound_monitoring | vibration_monitoring | ...
    enabled = Column(Boolean, default=True, nullable=False)
    created_at = Column(DateTime, default=datetime.utcnow)

    __table_args__ = (UniqueConstraint("project_id", "module_type", name="uq_project_module"),)


class MonitoringLocation(Base):
@@ -218,7 +274,7 @@ class ScheduledAction(Base):
    location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
    unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (nullable if location-based)

    action_type = Column(String, nullable=False)  # start, stop, download, calibrate
    action_type = Column(String, nullable=False)  # start, stop, download, cycle, calibrate
    device_type = Column(String, nullable=False)  # "slm" | "seismograph"

    scheduled_time = Column(DateTime, nullable=False, index=True)
@@ -233,17 +289,21 @@ class ScheduledAction(Base):
    created_at = Column(DateTime, default=datetime.utcnow)


class RecordingSession(Base):
class MonitoringSession(Base):
    """
    Recording sessions: tracks actual monitoring sessions.
    Created when recording starts, updated when it stops.
    Monitoring sessions: tracks actual monitoring sessions.
    Created when monitoring starts, updated when it stops.
    """
    __tablename__ = "recording_sessions"
    __tablename__ = "monitoring_sessions"

    id = Column(String, primary_key=True, index=True)  # UUID
    project_id = Column(String, nullable=False, index=True)  # FK to Project.id
    location_id = Column(String, nullable=False, index=True)  # FK to MonitoringLocation.id
    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id
    unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (nullable for offline uploads)

    # Physical device model that produced this session's data (e.g. "NL-43", "NL-53", "NL-32").
    # Null for older records; report code falls back to file-content detection when null.
    device_model = Column(String, nullable=True)

    session_type = Column(String, nullable=False)  # sound | vibration
    started_at = Column(DateTime, nullable=False)
@@ -251,6 +311,25 @@ class RecordingSession(Base):
    duration_seconds = Column(Integer, nullable=True)
    status = Column(String, default="recording")  # recording, completed, failed

    # Human-readable label auto-derived from date/location, editable by user.
    # e.g. "NRL-1 — Sun 2/23 — Night"
    session_label = Column(String, nullable=True)

    # Period classification for report stats columns.
    # weekday_day | weekday_night | weekend_day | weekend_night
    period_type = Column(String, nullable=True)

    # Effective monitoring window (hours 0–23). Night sessions cross midnight
    # (period_end_hour < period_start_hour). NULL = no filtering applied.
    # e.g. Day: start=7, end=19  Night: start=19, end=7
    period_start_hour = Column(Integer, nullable=True)
    period_end_hour = Column(Integer, nullable=True)

    # For day sessions: the specific calendar date to use for report filtering.
    # Overrides the automatic "last date with daytime rows" heuristic.
    # Null = use heuristic.
    report_date = Column(Date, nullable=True)

    # Snapshot of device configuration at recording time
    session_metadata = Column(Text, nullable=True)  # JSON

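The night-window convention above (a window with period_end_hour < period_start_hour crosses midnight) can be sketched as a small membership test — `hour_in_period` is a hypothetical helper, not a function from the codebase:

```python
def hour_in_period(hour: int, start: int, end: int) -> bool:
    """Return True if `hour` (0-23) falls inside the monitoring window.
    Windows where end < start cross midnight, e.g. 19 -> 7."""
    if start == end:
        return True          # degenerate window: treat as the full day
    if start < end:
        return start <= hour < end          # same-day window, e.g. 7 -> 19
    return hour >= start or hour < end      # crosses midnight, e.g. 19 -> 7

# Day window 7-19 vs. night window 19-7:
assert hour_in_period(12, 7, 19) and not hour_in_period(22, 7, 19)
assert hour_in_period(22, 19, 7) and not hour_in_period(12, 19, 7)
```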
@@ -266,7 +345,7 @@ class DataFile(Base):
    __tablename__ = "data_files"

    id = Column(String, primary_key=True, index=True)  # UUID
    session_id = Column(String, nullable=False, index=True)  # FK to RecordingSession.id
    session_id = Column(String, nullable=False, index=True)  # FK to MonitoringSession.id

    file_path = Column(String, nullable=False)  # Relative to data/Projects/
    file_type = Column(String, nullable=False)  # wav, csv, mseed, json
@@ -310,9 +389,10 @@ class RecurringSchedule(Base):
    """
    Recurring schedule definitions for automated sound monitoring.

    Supports two schedule types:
    Supports three schedule types:
    - "weekly_calendar": Select specific days with start/end times (e.g., Mon/Wed/Fri 7pm-7am)
    - "simple_interval": For 24/7 monitoring with daily stop/download/restart cycles
    - "one_off": Single recording session with specific start and end date/time
    """
    __tablename__ = "recurring_schedules"

@@ -322,7 +402,7 @@ class RecurringSchedule(Base):
    unit_id = Column(String, nullable=True, index=True)  # FK to RosterUnit.id (optional, can use assignment)

    name = Column(String, nullable=False)  # "Weeknight Monitoring", "24/7 Continuous"
    schedule_type = Column(String, nullable=False)  # "weekly_calendar" | "simple_interval"
    schedule_type = Column(String, nullable=False)  # "weekly_calendar" | "simple_interval" | "one_off"
    device_type = Column(String, nullable=False)  # "slm" | "seismograph"

    # Weekly Calendar fields (schedule_type = "weekly_calendar")
@@ -338,7 +418,11 @@ class RecurringSchedule(Base):
    cycle_time = Column(String, nullable=True)  # "00:00" - time to run stop/download/restart
    include_download = Column(Boolean, default=True)  # Download data before restart

    # Automation options (applies to both schedule types)
    # One-Off fields (schedule_type = "one_off")
    start_datetime = Column(DateTime, nullable=True)  # Exact start date+time (stored as UTC)
    end_datetime = Column(DateTime, nullable=True)  # Exact end date+time (stored as UTC)

    # Automation options (applies to all schedule types)
    auto_increment_index = Column(Boolean, default=True)  # Auto-increment store/index number before start
    # When True: prevents "overwrite data?" prompts by using a new index each time

@@ -391,3 +475,119 @@ class Alert(Base):

    created_at = Column(DateTime, default=datetime.utcnow)
    expires_at = Column(DateTime, nullable=True)  # Auto-dismiss after this time


# ============================================================================
# Deployment Records
# ============================================================================

class DeploymentRecord(Base):
    """
    Deployment records: tracks each time a unit is sent to the field and returned.

    Each row represents one deployment. The active deployment is the record
    with actual_removal_date IS NULL. The fleet calendar uses this to show
    units as "In Field" and surface their expected return date.

    project_ref is a freeform string for legacy/vibration jobs like "Fay I-80".
    project_id will be populated once those jobs are migrated to proper Project records.
    """
    __tablename__ = "deployment_records"

    id = Column(String, primary_key=True, index=True)  # UUID
    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit.id

    deployed_date = Column(Date, nullable=True)  # When unit left the yard
    estimated_removal_date = Column(Date, nullable=True)  # Expected return date
    actual_removal_date = Column(Date, nullable=True)  # Filled in when returned; NULL = still out

    # Project linkage: freeform for legacy jobs, FK for proper project records
    project_ref = Column(String, nullable=True)  # e.g. "Fay I-80" (vibration jobs)
    project_id = Column(String, nullable=True, index=True)  # FK to Project.id (when available)

    location_name = Column(String, nullable=True)  # e.g. "North Gate", "VP-001"
    notes = Column(Text, nullable=True)

    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)

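The "active deployment = actual_removal_date IS NULL" convention described above can be sketched in plain Python — `expected_return` and the dict shape are illustrative, not part of the codebase:

```python
from datetime import date
from typing import List, Optional

def expected_return(records: List[dict]) -> Optional[date]:
    """Sketch of how a fleet calendar could surface a unit's expected return:
    the active deployment is the record whose actual_removal_date is None."""
    for rec in records:
        if rec["actual_removal_date"] is None:
            return rec["estimated_removal_date"]
    return None  # no active deployment: unit is in the yard

history = [
    {"actual_removal_date": date(2024, 11, 1), "estimated_removal_date": date(2024, 10, 15)},
    {"actual_removal_date": None, "estimated_removal_date": date(2025, 3, 1)},
]
assert expected_return(history) == date(2025, 3, 1)
```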

# ============================================================================
# Fleet Calendar & Job Reservations
# ============================================================================

class JobReservation(Base):
    """
    Job reservations: reserve units for future jobs/projects.

    Supports two assignment modes:
    - "specific": Pick exact units (SN-001, SN-002, etc.)
    - "quantity": Reserve a number of units (e.g., "need 8 seismographs")

    Used by the Fleet Calendar to visualize unit availability over time.
    """
    __tablename__ = "job_reservations"

    id = Column(String, primary_key=True, index=True)  # UUID
    name = Column(String, nullable=False)  # "Job A - March deployment"
    project_id = Column(String, nullable=True, index=True)  # Optional FK to Project

    # Date range for the reservation
    start_date = Column(Date, nullable=False)
    end_date = Column(Date, nullable=True)  # Nullable = TBD / ongoing
    estimated_end_date = Column(Date, nullable=True)  # For planning when end is TBD
    end_date_tbd = Column(Boolean, default=False)  # True = end date unknown

    # Assignment type: "specific" or "quantity"
    assignment_type = Column(String, nullable=False, default="quantity")

    # For quantity reservations
    device_type = Column(String, default="seismograph")  # seismograph | slm
    quantity_needed = Column(Integer, nullable=True)  # e.g., 8 units
    estimated_units = Column(Integer, nullable=True)

    # Full slot list as JSON: [{"location_name": "North Gate", "unit_id": null}, ...]
    # Includes empty slots (no unit assigned yet). Filled slots are authoritative in JobReservationUnit.
    location_slots = Column(Text, nullable=True)

    # Metadata
    notes = Column(Text, nullable=True)
    color = Column(String, default="#3B82F6")  # For calendar display (blue default)

    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)


class JobReservationUnit(Base):
    """
    Links specific units to job reservations.

    Used when:
    - assignment_type="specific": Units are directly assigned
    - assignment_type="quantity": Units can be filled in later

    Supports unit swaps: same reservation can have multiple units with
    different date ranges (e.g., BE17353 Feb-Jun, then BE18438 Jun-Nov).
    """
    __tablename__ = "job_reservation_units"

    id = Column(String, primary_key=True, index=True)  # UUID
    reservation_id = Column(String, nullable=False, index=True)  # FK to JobReservation
    unit_id = Column(String, nullable=False, index=True)  # FK to RosterUnit

    # Unit-specific date range (for swaps) - defaults to reservation dates if null
    unit_start_date = Column(Date, nullable=True)  # When this specific unit starts
    unit_end_date = Column(Date, nullable=True)  # When this unit ends (swap out date)
    unit_end_tbd = Column(Boolean, default=False)  # True = end unknown (until cal expires or job ends)

    # Track how this assignment was made
    assignment_source = Column(String, default="specific")  # "specific" | "filled" | "swap"
    assigned_at = Column(DateTime, default=datetime.utcnow)
    notes = Column(Text, nullable=True)  # "Replacing BE17353" etc.

    # Power requirements for this deployment slot
    power_type = Column(String, nullable=True)  # "ac" | "solar" | None

    # Location identity
    location_name = Column(String, nullable=True)  # e.g. "North Gate", "Main Entrance"
    slot_index = Column(Integer, nullable=True)  # Order within reservation (0-based)

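The defaulting rule in the comment above (unit-specific dates fall back to the reservation's dates when null) can be sketched as a hypothetical helper — `effective_unit_dates` is not a function from the codebase:

```python
from datetime import date
from typing import Optional, Tuple

def effective_unit_dates(
    unit_start: Optional[date], unit_end: Optional[date],
    res_start: date, res_end: Optional[date],
) -> Tuple[date, Optional[date]]:
    # Unit-specific dates win when set; otherwise fall back to the
    # reservation's dates (the swap case sets its own sub-range).
    return unit_start or res_start, unit_end or res_end

res = (date(2025, 2, 1), date(2025, 11, 1))
# Unassigned sub-range: inherit the reservation's window.
assert effective_unit_dates(None, None, *res) == res
# Swapped-in unit: its own start, reservation's end.
assert effective_unit_dates(date(2025, 6, 1), None, *res) == (date(2025, 6, 1), date(2025, 11, 1))
```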
@@ -1,7 +1,13 @@
from fastapi import APIRouter, Request, Depends
from sqlalchemy.orm import Session
from sqlalchemy import and_
from datetime import datetime, timedelta

from backend.database import get_db
from backend.models import ScheduledAction, MonitoringLocation, Project
from backend.services.snapshot import emit_status_snapshot
from backend.templates_config import templates
from backend.utils.timezone import utc_to_local, local_to_utc, get_user_timezone

router = APIRouter()

@@ -22,3 +28,79 @@ def dashboard_benched(request: Request):
        "partials/benched_table.html",
        {"request": request, "units": snapshot["benched"]}
    )


@router.get("/dashboard/todays-actions")
def dashboard_todays_actions(request: Request, db: Session = Depends(get_db)):
    """
    Get today's scheduled actions for the dashboard card.
    Shows upcoming, completed, and failed actions for today.
    """
    import json
    from zoneinfo import ZoneInfo

    # Get today's date range in local timezone
    tz = ZoneInfo(get_user_timezone())
    now_local = datetime.now(tz)
    today_start_local = now_local.replace(hour=0, minute=0, second=0, microsecond=0)
    today_end_local = today_start_local + timedelta(days=1)

    # Convert to UTC for database query
    today_start_utc = today_start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
    today_end_utc = today_end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

    # Exclude actions from paused/removed projects
    paused_project_ids = [
        p.id for p in db.query(Project.id).filter(
            Project.status.in_(["on_hold", "archived", "deleted"])
        ).all()
    ]

    # Query today's actions
    actions = db.query(ScheduledAction).filter(
        ScheduledAction.scheduled_time >= today_start_utc,
        ScheduledAction.scheduled_time < today_end_utc,
        ScheduledAction.project_id.notin_(paused_project_ids),
    ).order_by(ScheduledAction.scheduled_time.asc()).all()

    # Enrich with location/project info and parse results
    enriched_actions = []
    for action in actions:
        location = None
        project = None
        if action.location_id:
            location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
        if action.project_id:
            project = db.query(Project).filter_by(id=action.project_id).first()

        # Parse module_response for result details
        result_data = None
        if action.module_response:
            try:
                result_data = json.loads(action.module_response)
            except json.JSONDecodeError:
                pass

        enriched_actions.append({
            "action": action,
            "location": location,
            "project": project,
            "result": result_data,
        })

    # Count by status
    pending_count = sum(1 for a in actions if a.execution_status == "pending")
    completed_count = sum(1 for a in actions if a.execution_status == "completed")
    failed_count = sum(1 for a in actions if a.execution_status == "failed")

    return templates.TemplateResponse(
        "partials/dashboard/todays_actions.html",
        {
            "request": request,
            "actions": enriched_actions,
            "pending_count": pending_count,
            "completed_count": completed_count,
            "failed_count": failed_count,
            "total_count": len(actions),
        }
    )

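The local-midnight-to-naive-UTC conversion used in dashboard_todays_actions can be exercised standalone; the timezone and timestamps below are illustrative, and the stripped tzinfo matches comparing against naive UTC DB columns:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")   # assumed user timezone for illustration
now_local = datetime(2025, 6, 15, 14, 30, tzinfo=tz)

# Local midnight-to-midnight window for "today"...
start_local = now_local.replace(hour=0, minute=0, second=0, microsecond=0)
end_local = start_local + timedelta(days=1)

# ...converted to naive UTC datetimes for the DB query.
start_utc = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
end_utc = end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

assert start_utc == datetime(2025, 6, 15, 4, 0)   # EDT is UTC-4 in June
assert end_utc - start_utc == timedelta(hours=24)
```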
backend/routers/deployments.py (new file, 154 lines)
@@ -0,0 +1,154 @@
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from datetime import datetime, date
from typing import Optional
import uuid

from backend.database import get_db
from backend.models import DeploymentRecord, RosterUnit

router = APIRouter(prefix="/api", tags=["deployments"])


def _serialize(record: DeploymentRecord) -> dict:
    return {
        "id": record.id,
        "unit_id": record.unit_id,
        "deployed_date": record.deployed_date.isoformat() if record.deployed_date else None,
        "estimated_removal_date": record.estimated_removal_date.isoformat() if record.estimated_removal_date else None,
        "actual_removal_date": record.actual_removal_date.isoformat() if record.actual_removal_date else None,
        "project_ref": record.project_ref,
        "project_id": record.project_id,
        "location_name": record.location_name,
        "notes": record.notes,
        "created_at": record.created_at.isoformat() if record.created_at else None,
        "updated_at": record.updated_at.isoformat() if record.updated_at else None,
        "is_active": record.actual_removal_date is None,
    }


@router.get("/deployments/{unit_id}")
def get_deployments(unit_id: str, db: Session = Depends(get_db)):
    """Get all deployment records for a unit, newest first."""
    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
    if not unit:
        raise HTTPException(status_code=404, detail=f"Unit {unit_id} not found")

    records = (
        db.query(DeploymentRecord)
        .filter_by(unit_id=unit_id)
        .order_by(DeploymentRecord.deployed_date.desc(), DeploymentRecord.created_at.desc())
        .all()
    )
    return {"deployments": [_serialize(r) for r in records]}


@router.get("/deployments/{unit_id}/active")
def get_active_deployment(unit_id: str, db: Session = Depends(get_db)):
    """Get the current active deployment (actual_removal_date is NULL), or null."""
    record = (
        db.query(DeploymentRecord)
        .filter(
            DeploymentRecord.unit_id == unit_id,
            DeploymentRecord.actual_removal_date.is_(None)
        )
        .order_by(DeploymentRecord.created_at.desc())
        .first()
    )
    return {"deployment": _serialize(record) if record else None}


@router.post("/deployments/{unit_id}")
def create_deployment(unit_id: str, payload: dict, db: Session = Depends(get_db)):
    """
    Create a new deployment record for a unit.

    Body fields (all optional):
        deployed_date (YYYY-MM-DD)
        estimated_removal_date (YYYY-MM-DD)
        project_ref (freeform string)
        project_id (UUID if linked to Project)
        location_name
        notes
    """
    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
    if not unit:
        raise HTTPException(status_code=404, detail=f"Unit {unit_id} not found")

    def parse_date(val) -> Optional[date]:
        if not val:
            return None
        if isinstance(val, date):
            return val
        return date.fromisoformat(str(val))

    record = DeploymentRecord(
        id=str(uuid.uuid4()),
        unit_id=unit_id,
        deployed_date=parse_date(payload.get("deployed_date")),
        estimated_removal_date=parse_date(payload.get("estimated_removal_date")),
        actual_removal_date=None,
        project_ref=payload.get("project_ref"),
        project_id=payload.get("project_id"),
        location_name=payload.get("location_name"),
        notes=payload.get("notes"),
        created_at=datetime.utcnow(),
        updated_at=datetime.utcnow(),
    )
    db.add(record)
    db.commit()
    db.refresh(record)
    return _serialize(record)


@router.put("/deployments/{unit_id}/{deployment_id}")
def update_deployment(unit_id: str, deployment_id: str, payload: dict, db: Session = Depends(get_db)):
    """
    Update a deployment record. Used for:
    - Setting/changing estimated_removal_date
    - Closing a deployment (set actual_removal_date to mark unit returned)
    - Editing project_ref, location_name, notes
    """
    record = db.query(DeploymentRecord).filter_by(id=deployment_id, unit_id=unit_id).first()
    if not record:
        raise HTTPException(status_code=404, detail="Deployment record not found")

    def parse_date(val) -> Optional[date]:
        if val is None:
            return None
        if val == "":
            return None
        if isinstance(val, date):
            return val
        return date.fromisoformat(str(val))

    if "deployed_date" in payload:
        record.deployed_date = parse_date(payload["deployed_date"])
    if "estimated_removal_date" in payload:
        record.estimated_removal_date = parse_date(payload["estimated_removal_date"])
    if "actual_removal_date" in payload:
        record.actual_removal_date = parse_date(payload["actual_removal_date"])
    if "project_ref" in payload:
        record.project_ref = payload["project_ref"]
    if "project_id" in payload:
        record.project_id = payload["project_id"]
    if "location_name" in payload:
        record.location_name = payload["location_name"]
    if "notes" in payload:
        record.notes = payload["notes"]

    record.updated_at = datetime.utcnow()
    db.commit()
    db.refresh(record)
    return _serialize(record)


@router.delete("/deployments/{unit_id}/{deployment_id}")
def delete_deployment(unit_id: str, deployment_id: str, db: Session = Depends(get_db)):
    """Delete a deployment record."""
    record = db.query(DeploymentRecord).filter_by(id=deployment_id, unit_id=unit_id).first()
    if not record:
        raise HTTPException(status_code=404, detail="Deployment record not found")
    db.delete(record)
    db.commit()
    return {"ok": True}
backend/routers/fleet_calendar.py (new file, 928 lines)
@@ -0,0 +1,928 @@
"""
Fleet Calendar Router

API endpoints for the Fleet Calendar feature:
- Calendar page and data
- Job reservation CRUD
- Unit assignment management
- Availability checking
"""

from fastapi import APIRouter, Request, Depends, HTTPException, Query
from fastapi.responses import HTMLResponse, JSONResponse
from sqlalchemy.orm import Session
from datetime import datetime, date, timedelta
from typing import Optional, List
import uuid
import logging

from backend.database import get_db
from backend.models import (
    RosterUnit, JobReservation, JobReservationUnit,
    UserPreferences, Project, MonitoringLocation, UnitAssignment
)
from backend.templates_config import templates
from backend.services.fleet_calendar_service import (
    get_day_summary,
    get_calendar_year_data,
    get_rolling_calendar_data,
    check_calibration_conflicts,
    get_available_units_for_period,
    get_calibration_status
)

router = APIRouter(tags=["fleet-calendar"])
logger = logging.getLogger(__name__)


# ============================================================================
# Calendar Page
# ============================================================================

@router.get("/fleet-calendar", response_class=HTMLResponse)
async def fleet_calendar_page(
    request: Request,
    year: Optional[int] = None,
    month: Optional[int] = None,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Main Fleet Calendar page with rolling 12-month view."""
    today = date.today()

    # Default to current month as the start
    if year is None:
        year = today.year
    if month is None:
        month = today.month

    # Get calendar data for 12 months starting from year/month
    calendar_data = get_rolling_calendar_data(db, year, month, device_type)

    # Get projects for the reservation form dropdown
    projects = db.query(Project).filter(
        Project.status.in_(["active", "upcoming", "on_hold"])
    ).order_by(Project.name).all()

    # Build a serializable list of items with dates for calendar bars
    # Includes both tracked Projects (with dates) and Job Reservations (matching device_type)
    project_colors = ['#3B82F6', '#10B981', '#F59E0B', '#EF4444', '#8B5CF6', '#EC4899', '#06B6D4', '#F97316']
    # Map calendar device_type to project_type_ids
    device_type_to_project_types = {
        "seismograph": ["vibration_monitoring", "combined"],
        "slm": ["sound_monitoring", "combined"],
    }
    relevant_project_types = device_type_to_project_types.get(device_type, [])

    calendar_projects = []
    for i, p in enumerate(projects):
        if p.start_date and p.project_type_id in relevant_project_types:
            calendar_projects.append({
                "id": p.id,
                "name": p.name,
                "start_date": p.start_date.isoformat(),
                "end_date": p.end_date.isoformat() if p.end_date else None,
                "color": project_colors[i % len(project_colors)],
                "confirmed": True,
            })

    # Add job reservations for this device_type as bars
    from sqlalchemy import or_ as _or
    cal_window_end = date(year + ((month + 10) // 12), ((month + 10) % 12) + 1, 1)
    reservations_for_cal = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= cal_window_end,
        _or(
            JobReservation.end_date >= date(year, month, 1),
            JobReservation.end_date.is_(None),
        )
    ).all()
    for res in reservations_for_cal:
        end = res.end_date or res.estimated_end_date
        calendar_projects.append({
            "id": res.id,
            "name": res.name,
            "start_date": res.start_date.isoformat(),
            "end_date": end.isoformat() if end else None,
            "color": res.color,
            "confirmed": bool(res.project_id),
        })

    # Calculate prev/next month navigation
    prev_year, prev_month = (year - 1, 12) if month == 1 else (year, month - 1)
    next_year, next_month = (year + 1, 1) if month == 12 else (year, month + 1)

    return templates.TemplateResponse(
        "fleet_calendar.html",
        {
            "request": request,
            "start_year": year,
            "start_month": month,
            "prev_year": prev_year,
            "prev_month": prev_month,
            "next_year": next_year,
            "next_month": next_month,
            "device_type": device_type,
            "calendar_data": calendar_data,
            "projects": projects,
            "calendar_projects": calendar_projects,
            "today": today.isoformat()
        }
    )


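The cal_window_end month arithmetic in fleet_calendar_page can be checked in isolation — the helper name below is ours, but the expression is the same as the inline one:

```python
from datetime import date

def rolling_window_end(year: int, month: int) -> date:
    # First day of the last (12th) month of a rolling 12-month window
    # starting at (year, month): month + 10 months later, with the sum
    # normalized back into a valid (year, month) pair.
    return date(year + (month + 10) // 12, (month + 10) % 12 + 1, 1)

assert rolling_window_end(2025, 1) == date(2025, 12, 1)   # Jan..Dec 2025
assert rolling_window_end(2025, 3) == date(2026, 2, 1)    # Mar 2025..Feb 2026
```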
# ============================================================================
|
||||
# Calendar Data API
|
||||
# ============================================================================
|
||||
|
||||
@router.get("/api/fleet-calendar/data", response_class=JSONResponse)
|
||||
async def get_calendar_data(
|
||||
year: int,
|
||||
device_type: str = "seismograph",
|
||||
db: Session = Depends(get_db)
|
||||
):
|
||||
"""Get calendar data for a specific year."""
|
||||
return get_calendar_year_data(db, year, device_type)
|
||||
|
||||
|
||||
@router.get("/api/fleet-calendar/day/{date_str}", response_class=HTMLResponse)
|
||||
async def get_day_detail(
|
||||
request: Request,
|
||||
date_str: str,
|
||||
device_type: str = "seismograph",
|
||||
db: Session = Depends(get_db)
|
||||
):
|
||||
"""Get detailed view for a specific day (HTMX partial)."""
|
||||
try:
|
||||
check_date = date.fromisoformat(date_str)
|
||||
except ValueError:
|
||||
raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")
|
||||
|
||||
day_data = get_day_summary(db, check_date, device_type)
|
||||
|
||||
# Get projects for display names
|
||||
projects = {p.id: p for p in db.query(Project).all()}
|
||||
|
||||
return templates.TemplateResponse(
|
||||
"partials/fleet_calendar/day_detail.html",
|
||||
{
|
||||
"request": request,
|
||||
"day_data": day_data,
|
||||
"date_str": date_str,
|
||||
"date_display": check_date.strftime("%B %d, %Y"),
|
||||
"device_type": device_type,
|
||||
"projects": projects
|
||||
}
|
||||
)
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Reservation CRUD
|
||||
# ============================================================================
|
||||
|
||||
@router.post("/api/fleet-calendar/reservations", response_class=JSONResponse)
async def create_reservation(
    request: Request,
    db: Session = Depends(get_db)
):
    """Create a new job reservation."""
    data = await request.json()

    # Validate required fields
    required = ["name", "start_date", "assignment_type"]
    for field in required:
        if field not in data:
            raise HTTPException(status_code=400, detail=f"Missing required field: {field}")

    # Need either end_date or end_date_tbd
    end_date_tbd = data.get("end_date_tbd", False)
    if not end_date_tbd and not data.get("end_date"):
        raise HTTPException(status_code=400, detail="End date is required unless marked as TBD")

    try:
        start_date = date.fromisoformat(data["start_date"])
        end_date = date.fromisoformat(data["end_date"]) if data.get("end_date") else None
        estimated_end_date = date.fromisoformat(data["estimated_end_date"]) if data.get("estimated_end_date") else None
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")

    # Same-day jobs are allowed; only reject ranges that end before they start
    if end_date and end_date < start_date:
        raise HTTPException(status_code=400, detail="End date must be on or after the start date")

    if estimated_end_date and estimated_end_date < start_date:
        raise HTTPException(status_code=400, detail="Estimated end date must be on or after the start date")

    import json as _json
    reservation = JobReservation(
        id=str(uuid.uuid4()),
        name=data["name"],
        project_id=data.get("project_id"),
        start_date=start_date,
        end_date=end_date,
        estimated_end_date=estimated_end_date,
        end_date_tbd=end_date_tbd,
        assignment_type=data["assignment_type"],
        device_type=data.get("device_type", "seismograph"),
        quantity_needed=data.get("quantity_needed"),
        estimated_units=data.get("estimated_units"),
        location_slots=_json.dumps(data["location_slots"]) if data.get("location_slots") is not None else None,
        notes=data.get("notes"),
        color=data.get("color", "#3B82F6")
    )

    db.add(reservation)

    # If specific units were provided, assign them
    if data.get("unit_ids") and data["assignment_type"] == "specific":
        for unit_id in data["unit_ids"]:
            assignment = JobReservationUnit(
                id=str(uuid.uuid4()),
                reservation_id=reservation.id,
                unit_id=unit_id,
                assignment_source="specific"
            )
            db.add(assignment)

    db.commit()

    logger.info(f"Created reservation: {reservation.name} ({reservation.id})")

    return {
        "success": True,
        "reservation_id": reservation.id,
        "message": f"Created reservation: {reservation.name}"
    }


@router.get("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
async def get_reservation(
    reservation_id: str,
    db: Session = Depends(get_db)
):
    """Get a specific reservation with its assigned units."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    # Get assigned units
    assignments = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id
    ).all()

    # Sort assignments by slot_index so order is preserved
    assignments_sorted = sorted(assignments, key=lambda a: (a.slot_index if a.slot_index is not None else 999))
    unit_ids = [a.unit_id for a in assignments_sorted]
    units = db.query(RosterUnit).filter(RosterUnit.id.in_(unit_ids)).all() if unit_ids else []
    units_by_id = {u.id: u for u in units}
    # Build per-unit lookups from assignments
    assignment_map = {a.unit_id: a for a in assignments_sorted}

    import json as _json
    stored_slots = _json.loads(reservation.location_slots) if reservation.location_slots else None

    return {
        "id": reservation.id,
        "name": reservation.name,
        "project_id": reservation.project_id,
        "start_date": reservation.start_date.isoformat(),
        "end_date": reservation.end_date.isoformat() if reservation.end_date else None,
        "estimated_end_date": reservation.estimated_end_date.isoformat() if reservation.estimated_end_date else None,
        "end_date_tbd": reservation.end_date_tbd,
        "assignment_type": reservation.assignment_type,
        "device_type": reservation.device_type,
        "quantity_needed": reservation.quantity_needed,
        "estimated_units": reservation.estimated_units,
        "location_slots": stored_slots,
        "notes": reservation.notes,
        "color": reservation.color,
        "assigned_units": [
            {
                "id": uid,
                "last_calibrated": units_by_id[uid].last_calibrated.isoformat() if uid in units_by_id and units_by_id[uid].last_calibrated else None,
                "deployed": units_by_id[uid].deployed if uid in units_by_id else False,
                "power_type": assignment_map[uid].power_type,
                "notes": assignment_map[uid].notes,
                "location_name": assignment_map[uid].location_name,
                "slot_index": assignment_map[uid].slot_index,
            }
            for uid in unit_ids
        ]
    }


@router.put("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
async def update_reservation(
    reservation_id: str,
    request: Request,
    db: Session = Depends(get_db)
):
    """Update an existing reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    data = await request.json()

    # Update fields if provided
    if "name" in data:
        reservation.name = data["name"]
    if "project_id" in data:
        reservation.project_id = data["project_id"]
    # Parse dates defensively so a malformed value returns a 400, not a 500
    try:
        if "start_date" in data:
            reservation.start_date = date.fromisoformat(data["start_date"])
        if "end_date" in data:
            reservation.end_date = date.fromisoformat(data["end_date"]) if data["end_date"] else None
        if "estimated_end_date" in data:
            reservation.estimated_end_date = date.fromisoformat(data["estimated_end_date"]) if data["estimated_end_date"] else None
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")
    if "end_date_tbd" in data:
        reservation.end_date_tbd = data["end_date_tbd"]
    if "assignment_type" in data:
        reservation.assignment_type = data["assignment_type"]
    if "quantity_needed" in data:
        reservation.quantity_needed = data["quantity_needed"]
    if "estimated_units" in data:
        reservation.estimated_units = data["estimated_units"]
    if "location_slots" in data:
        import json as _json
        reservation.location_slots = _json.dumps(data["location_slots"]) if data["location_slots"] is not None else None
    if "notes" in data:
        reservation.notes = data["notes"]
    if "color" in data:
        reservation.color = data["color"]

    reservation.updated_at = datetime.utcnow()

    db.commit()

    logger.info(f"Updated reservation: {reservation.name} ({reservation.id})")

    return {
        "success": True,
        "message": f"Updated reservation: {reservation.name}"
    }


@router.delete("/api/fleet-calendar/reservations/{reservation_id}", response_class=JSONResponse)
async def delete_reservation(
    reservation_id: str,
    db: Session = Depends(get_db)
):
    """Delete a reservation and its unit assignments."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    # Delete unit assignments first
    db.query(JobReservationUnit).filter_by(reservation_id=reservation_id).delete()

    # Delete the reservation
    db.delete(reservation)
    db.commit()

    logger.info(f"Deleted reservation: {reservation.name} ({reservation_id})")

    return {
        "success": True,
        "message": "Reservation deleted"
    }


# ============================================================================
# Unit Assignment
# ============================================================================

@router.post("/api/fleet-calendar/reservations/{reservation_id}/assign-units", response_class=JSONResponse)
async def assign_units_to_reservation(
    reservation_id: str,
    request: Request,
    db: Session = Depends(get_db)
):
    """Assign specific units to a reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    data = await request.json()
    unit_ids = data.get("unit_ids", [])
    # Optional per-unit dicts keyed by unit_id
    power_types = data.get("power_types", {})
    location_notes = data.get("location_notes", {})
    location_names = data.get("location_names", {})
    # slot_indices: {"BE17354": 0, "BE9441": 1, ...}
    slot_indices = data.get("slot_indices", {})

    # Verify units exist (allow empty list to clear all assignments)
    if unit_ids:
        units = db.query(RosterUnit).filter(RosterUnit.id.in_(unit_ids)).all()
        found_ids = {u.id for u in units}
        missing = set(unit_ids) - found_ids
        if missing:
            raise HTTPException(status_code=404, detail=f"Units not found: {', '.join(missing)}")

    # Full replace: delete all existing assignments for this reservation first
    db.query(JobReservationUnit).filter_by(reservation_id=reservation_id).delete()
    db.flush()

    # Check for conflicts with other reservations and insert new assignments
    conflicts = []
    for unit_id in unit_ids:
        # Check overlapping reservations (skipped when this reservation's end date is TBD)
        if reservation.end_date:
            overlapping = db.query(JobReservation).join(
                JobReservationUnit, JobReservation.id == JobReservationUnit.reservation_id
            ).filter(
                JobReservationUnit.unit_id == unit_id,
                JobReservation.id != reservation_id,
                JobReservation.start_date <= reservation.end_date,
                JobReservation.end_date >= reservation.start_date
            ).first()

            if overlapping:
                conflicts.append({
                    "unit_id": unit_id,
                    "conflict_reservation": overlapping.name,
                    "conflict_dates": f"{overlapping.start_date} - {overlapping.end_date}"
                })
                continue

        # Add assignment
        assignment = JobReservationUnit(
            id=str(uuid.uuid4()),
            reservation_id=reservation_id,
            unit_id=unit_id,
            assignment_source="filled" if reservation.assignment_type == "quantity" else "specific",
            power_type=power_types.get(unit_id),
            notes=location_notes.get(unit_id),
            location_name=location_names.get(unit_id),
            slot_index=slot_indices.get(unit_id),
        )
        db.add(assignment)

    db.commit()

    # Check for calibration conflicts
    cal_conflicts = check_calibration_conflicts(db, reservation_id)

    assigned_count = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id
    ).count()

    return {
        "success": True,
        "assigned_count": assigned_count,
        "conflicts": conflicts,
        "calibration_warnings": cal_conflicts,
        "message": f"Assigned {len(unit_ids) - len(conflicts)} units"
    }


@router.delete("/api/fleet-calendar/reservations/{reservation_id}/units/{unit_id}", response_class=JSONResponse)
async def remove_unit_from_reservation(
    reservation_id: str,
    unit_id: str,
    db: Session = Depends(get_db)
):
    """Remove a unit from a reservation."""
    assignment = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id,
        unit_id=unit_id
    ).first()

    if not assignment:
        raise HTTPException(status_code=404, detail="Unit assignment not found")

    db.delete(assignment)
    db.commit()

    return {
        "success": True,
        "message": f"Removed {unit_id} from reservation"
    }


# ============================================================================
# Availability & Conflicts
# ============================================================================

@router.get("/api/fleet-calendar/availability", response_class=JSONResponse)
async def check_availability(
    start_date: str,
    end_date: str,
    device_type: str = "seismograph",
    exclude_reservation_id: Optional[str] = None,
    db: Session = Depends(get_db)
):
    """Get units available for a specific date range."""
    try:
        start = date.fromisoformat(start_date)
        end = date.fromisoformat(end_date)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")

    available = get_available_units_for_period(
        db, start, end, device_type, exclude_reservation_id
    )

    return {
        "start_date": start_date,
        "end_date": end_date,
        "device_type": device_type,
        "available_units": available,
        "count": len(available)
    }


@router.get("/api/fleet-calendar/reservations/{reservation_id}/conflicts", response_class=JSONResponse)
async def get_reservation_conflicts(
    reservation_id: str,
    db: Session = Depends(get_db)
):
    """Check for calibration conflicts in a reservation."""
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    conflicts = check_calibration_conflicts(db, reservation_id)

    return {
        "reservation_id": reservation_id,
        "reservation_name": reservation.name,
        "conflicts": conflicts,
        "has_conflicts": len(conflicts) > 0
    }


# ============================================================================
# HTMX Partials
# ============================================================================

@router.get("/api/fleet-calendar/reservations-list", response_class=HTMLResponse)
async def get_reservations_list(
    request: Request,
    year: Optional[int] = None,
    month: Optional[int] = None,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Get list of reservations as HTMX partial."""
    from sqlalchemy import or_

    today = date.today()
    if year is None:
        year = today.year
    if month is None:
        month = today.month

    # Calculate 12-month window starting at (year, month)
    start_date = date(year, month, 1)
    # End date is the last day of the 12th month
    end_year = year + ((month + 10) // 12)
    end_month = ((month + 10) % 12) + 1
    if end_month == 12:
        end_date = date(end_year, 12, 31)
    else:
        end_date = date(end_year, end_month + 1, 1) - timedelta(days=1)

    # Filter by device_type and date window
    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= end_date,
        or_(
            JobReservation.end_date >= start_date,
            JobReservation.end_date.is_(None)  # TBD reservations
        )
    ).order_by(JobReservation.start_date).all()

    # Get assignment counts
    reservation_data = []
    for res in reservations:
        assignments = db.query(JobReservationUnit).filter_by(
            reservation_id=res.id
        ).all()
        assigned_count = len(assignments)

        # Enrich assignments with unit details, sorted by slot_index
        assignments_sorted = sorted(assignments, key=lambda a: (a.slot_index if a.slot_index is not None else 999))
        unit_ids = [a.unit_id for a in assignments_sorted]
        units = db.query(RosterUnit).filter(RosterUnit.id.in_(unit_ids)).all() if unit_ids else []
        units_by_id = {u.id: u for u in units}
        assigned_units = [
            {
                "id": a.unit_id,
                "power_type": a.power_type,
                "notes": a.notes,
                "location_name": a.location_name,
                "slot_index": a.slot_index,
                "deployed": units_by_id[a.unit_id].deployed if a.unit_id in units_by_id else False,
                "last_calibrated": units_by_id[a.unit_id].last_calibrated if a.unit_id in units_by_id else None,
            }
            for a in assignments_sorted
        ]

        # Check for calibration conflicts
        conflicts = check_calibration_conflicts(db, res.id)

        location_count = res.quantity_needed or assigned_count
        reservation_data.append({
            "reservation": res,
            "assigned_count": assigned_count,
            "location_count": location_count,
            "assigned_units": assigned_units,
            "has_conflicts": len(conflicts) > 0,
            "conflict_count": len(conflicts)
        })

    return templates.TemplateResponse(
        "partials/fleet_calendar/reservations_list.html",
        {
            "request": request,
            "reservations": reservation_data,
            "year": year,
            "device_type": device_type
        }
    )


@router.get("/api/fleet-calendar/planner-availability", response_class=JSONResponse)
async def get_planner_availability(
    device_type: str = "seismograph",
    start_date: Optional[str] = None,
    end_date: Optional[str] = None,
    exclude_reservation_id: Optional[str] = None,
    db: Session = Depends(get_db)
):
    """Get available units for the reservation planner split-panel UI.

    Dates are optional; if omitted, returns all non-retired units regardless of reservations.
    """
    if start_date and end_date:
        try:
            start = date.fromisoformat(start_date)
            end = date.fromisoformat(end_date)
        except ValueError:
            raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")
        units = get_available_units_for_period(db, start, end, device_type, exclude_reservation_id)
    else:
        # No dates: return all non-retired units of this type, with current reservation info
        from backend.models import RosterUnit as RU
        from datetime import timedelta
        today = date.today()
        all_units = db.query(RU).filter(
            RU.device_type == device_type,
            RU.retired == False
        ).all()

        # Build a map: unit_id -> list of active/upcoming reservations
        active_assignments = db.query(JobReservationUnit).join(
            JobReservation, JobReservationUnit.reservation_id == JobReservation.id
        ).filter(
            JobReservation.device_type == device_type,
            JobReservation.end_date >= today
        ).all()
        unit_reservations = {}
        for assignment in active_assignments:
            res = db.query(JobReservation).filter(JobReservation.id == assignment.reservation_id).first()
            if not res:
                continue
            unit_reservations.setdefault(assignment.unit_id, []).append({
                "reservation_id": res.id,
                "reservation_name": res.name,
                "start_date": res.start_date.isoformat() if res.start_date else None,
                "end_date": res.end_date.isoformat() if res.end_date else None,
                "color": res.color or "#3B82F6"
            })

        units = []
        for u in all_units:
            expiry = (u.last_calibrated + timedelta(days=365)) if u.last_calibrated else None
            units.append({
                "id": u.id,
                "last_calibrated": u.last_calibrated.isoformat() if u.last_calibrated else None,
                "expiry_date": expiry.isoformat() if expiry else None,
                "calibration_status": "needs_calibration" if not u.last_calibrated else "valid",
                "deployed": u.deployed,
                "out_for_calibration": u.out_for_calibration or False,
                "allocated": getattr(u, 'allocated', False) or False,
                "allocated_to_project_id": getattr(u, 'allocated_to_project_id', None) or "",
                "note": u.note or "",
                "reservations": unit_reservations.get(u.id, [])
            })

    # Sort: benched (not deployed) first since they are easier to assign, then deployed, then by ID
    units.sort(key=lambda u: (1 if u["deployed"] else 0, u["id"]))

    return {
        "units": units,
        "start_date": start_date,
        "end_date": end_date,
        "count": len(units)
    }


@router.get("/api/fleet-calendar/unit-quick-info/{unit_id}", response_class=JSONResponse)
async def get_unit_quick_info(unit_id: str, db: Session = Depends(get_db)):
    """Return at-a-glance info for the planner quick-view modal."""
    from backend.models import Emitter
    u = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
    if not u:
        raise HTTPException(status_code=404, detail="Unit not found")

    today = date.today()
    expiry = (u.last_calibrated + timedelta(days=365)) if u.last_calibrated else None

    # Active/upcoming reservations
    assignments = db.query(JobReservationUnit).filter(JobReservationUnit.unit_id == unit_id).all()
    reservations = []
    for a in assignments:
        res = db.query(JobReservation).filter(
            JobReservation.id == a.reservation_id,
            JobReservation.end_date >= today
        ).first()
        if res:
            reservations.append({
                "name": res.name,
                "start_date": res.start_date.isoformat() if res.start_date else None,
                "end_date": res.end_date.isoformat() if res.end_date else None,
                "end_date_tbd": res.end_date_tbd,
                "color": res.color or "#3B82F6",
                "location_name": a.location_name,
            })

    # Last seen from emitter
    emitter = db.query(Emitter).filter(Emitter.unit_type == unit_id).first()

    return {
        "id": u.id,
        "unit_type": u.unit_type,
        "deployed": u.deployed,
        "out_for_calibration": u.out_for_calibration or False,
        "note": u.note or "",
        "project_id": u.project_id or "",
        "address": u.address or u.location or "",
        "coordinates": u.coordinates or "",
        "deployed_with_modem_id": u.deployed_with_modem_id or "",
        "last_calibrated": u.last_calibrated.isoformat() if u.last_calibrated else None,
        "next_calibration_due": u.next_calibration_due.isoformat() if u.next_calibration_due else (expiry.isoformat() if expiry else None),
        "cal_expired": not u.last_calibrated or (expiry and expiry < today),
        "last_seen": emitter.last_seen.isoformat() if emitter and emitter.last_seen else None,
        "reservations": reservations,
    }


@router.get("/api/fleet-calendar/available-units", response_class=HTMLResponse)
async def get_available_units_partial(
    request: Request,
    start_date: str,
    end_date: str,
    device_type: str = "seismograph",
    reservation_id: Optional[str] = None,
    db: Session = Depends(get_db)
):
    """Get available units as HTMX partial for the assignment modal."""
    try:
        start = date.fromisoformat(start_date)
        end = date.fromisoformat(end_date)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid date format")

    available = get_available_units_for_period(
        db, start, end, device_type, reservation_id
    )

    return templates.TemplateResponse(
        "partials/fleet_calendar/available_units.html",
        {
            "request": request,
            "units": available,
            "start_date": start_date,
            "end_date": end_date,
            "device_type": device_type,
            "reservation_id": reservation_id
        }
    )


@router.get("/api/fleet-calendar/month/{year}/{month}", response_class=HTMLResponse)
async def get_month_partial(
    request: Request,
    year: int,
    month: int,
    device_type: str = "seismograph",
    db: Session = Depends(get_db)
):
    """Get a single month calendar as HTMX partial."""
    calendar_data = get_calendar_year_data(db, year, device_type)
    month_data = calendar_data["months"].get(month)

    if not month_data:
        raise HTTPException(status_code=404, detail="Invalid month")

    return templates.TemplateResponse(
        "partials/fleet_calendar/month_grid.html",
        {
            "request": request,
            "year": year,
            "month": month,
            "month_data": month_data,
            "device_type": device_type,
            "today": date.today().isoformat()
        }
    )


# ============================================================================
# Promote Reservation to Project
# ============================================================================

@router.post("/api/fleet-calendar/reservations/{reservation_id}/promote-to-project", response_class=JSONResponse)
async def promote_reservation_to_project(
    reservation_id: str,
    request: Request,
    db: Session = Depends(get_db)
):
    """
    Promote a job reservation to a full project in the projects DB.
    Creates: Project + MonitoringLocations + UnitAssignments.
    """
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation:
        raise HTTPException(status_code=404, detail="Reservation not found")

    data = await request.json()
    project_number = data.get("project_number") or None
    client_name = data.get("client_name") or None

    # Map device_type to project_type_id
    if reservation.device_type == "slm":
        project_type_id = "sound_monitoring"
        location_type = "sound"
    else:
        project_type_id = "vibration_monitoring"
        location_type = "vibration"

    # Check for duplicate project name
    existing = db.query(Project).filter_by(name=reservation.name).first()
    if existing:
        raise HTTPException(status_code=409, detail=f"A project named '{reservation.name}' already exists.")

    # Create the project
    project_id = str(uuid.uuid4())
    project = Project(
        id=project_id,
        name=reservation.name,
        project_number=project_number,
        client_name=client_name,
        project_type_id=project_type_id,
        status="upcoming",
        start_date=reservation.start_date,
        end_date=reservation.end_date,
        description=reservation.notes,
    )
    db.add(project)
    db.flush()

    # Load assignments sorted by slot_index
    assignments = db.query(JobReservationUnit).filter_by(reservation_id=reservation_id).all()
    assignments_sorted = sorted(assignments, key=lambda a: (a.slot_index if a.slot_index is not None else 999))

    locations_created = 0
    units_assigned = 0

    for i, assignment in enumerate(assignments_sorted):
        loc_num = str(i + 1).zfill(3)
        loc_name = assignment.location_name or f"Location {i + 1}"

        location = MonitoringLocation(
            id=str(uuid.uuid4()),
            project_id=project_id,
            location_type=location_type,
            name=loc_name,
            description=assignment.notes,
        )
        db.add(location)
        db.flush()
        locations_created += 1

        if assignment.unit_id:
            unit_assignment = UnitAssignment(
                id=str(uuid.uuid4()),
                unit_id=assignment.unit_id,
                location_id=location.id,
                project_id=project_id,
                device_type=reservation.device_type or "seismograph",
                status="active",
                notes=f"Power: {assignment.power_type}" if assignment.power_type else None,
            )
            db.add(unit_assignment)
            units_assigned += 1

    db.commit()

    logger.info(f"Promoted reservation '{reservation.name}' to project {project_id}")

    return {
        "success": True,
        "project_id": project_id,
        "project_name": reservation.name,
        "locations_created": locations_created,
        "units_assigned": units_assigned,
    }

backend/routers/modem_dashboard.py (new file, 429 lines)
@@ -0,0 +1,429 @@

"""
|
||||
Modem Dashboard Router
|
||||
|
||||
Provides API endpoints for the Field Modems management page.
|
||||
"""
|
||||
|
||||
from fastapi import APIRouter, Request, Depends, Query
|
||||
from fastapi.responses import HTMLResponse
|
||||
from sqlalchemy.orm import Session
|
||||
from datetime import datetime
|
||||
import subprocess
|
||||
import time
|
||||
import logging
|
||||
|
||||
from backend.database import get_db
|
||||
from backend.models import RosterUnit
|
||||
from backend.templates_config import templates
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
router = APIRouter(prefix="/api/modem-dashboard", tags=["modem-dashboard"])
|
||||
|
||||
|
||||
@router.get("/stats", response_class=HTMLResponse)
async def get_modem_stats(request: Request, db: Session = Depends(get_db)):
    """
    Get summary statistics for modem dashboard.
    Returns HTML partial with stat cards.
    """
    # Query all modems
    all_modems = db.query(RosterUnit).filter_by(device_type="modem").all()

    # Get IDs of modems that have devices paired to them
    paired_modem_ids = set()
    devices_with_modems = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id.isnot(None),
        RosterUnit.retired == False
    ).all()
    for device in devices_with_modems:
        if device.deployed_with_modem_id:
            paired_modem_ids.add(device.deployed_with_modem_id)

    # Count categories
    total_count = len(all_modems)
    retired_count = sum(1 for m in all_modems if m.retired)

    # In use = deployed AND paired with a device
    in_use_count = sum(1 for m in all_modems
                       if m.deployed and not m.retired and m.id in paired_modem_ids)

    # Spare = deployed but NOT paired (available for assignment)
    spare_count = sum(1 for m in all_modems
                      if m.deployed and not m.retired and m.id not in paired_modem_ids)

    # Benched = not deployed and not retired
    benched_count = sum(1 for m in all_modems if not m.deployed and not m.retired)

    return templates.TemplateResponse("partials/modem_stats.html", {
        "request": request,
        "total_count": total_count,
        "in_use_count": in_use_count,
        "spare_count": spare_count,
        "benched_count": benched_count,
        "retired_count": retired_count
    })


@router.get("/units", response_class=HTMLResponse)
async def get_modem_units(
    request: Request,
    db: Session = Depends(get_db),
    search: str = Query(None),
    filter_status: str = Query(None),  # "in_use", "spare", "benched", "retired"
):
    """
    Get list of modem units for the dashboard.
    Returns HTML partial with modem cards.
    """
    query = db.query(RosterUnit).filter_by(device_type="modem")

    # Filter by search term if provided
    if search:
        search_term = f"%{search}%"
        query = query.filter(
            (RosterUnit.id.ilike(search_term)) |
            (RosterUnit.ip_address.ilike(search_term)) |
            (RosterUnit.hardware_model.ilike(search_term)) |
            (RosterUnit.phone_number.ilike(search_term)) |
            (RosterUnit.location.ilike(search_term))
        )

    modems = query.order_by(
        RosterUnit.retired.asc(),
        RosterUnit.deployed.desc(),
        RosterUnit.id.asc()
    ).all()

    # Get paired device info for each modem
    paired_devices = {}
    devices_with_modems = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id.isnot(None),
        RosterUnit.retired == False
    ).all()
    for device in devices_with_modems:
        if device.deployed_with_modem_id:
            paired_devices[device.deployed_with_modem_id] = {
                "id": device.id,
                "device_type": device.device_type,
                "deployed": device.deployed
            }

    # Annotate modems with paired device info
    modem_list = []
    for modem in modems:
        paired = paired_devices.get(modem.id)

        # Determine status category
        if modem.retired:
            status = "retired"
        elif not modem.deployed:
            status = "benched"
        elif paired:
            status = "in_use"
        else:
            status = "spare"

        # Apply filter if specified
        if filter_status and status != filter_status:
            continue

        modem_list.append({
            "id": modem.id,
            "ip_address": modem.ip_address,
            "phone_number": modem.phone_number,
            "hardware_model": modem.hardware_model,
            "deployed": modem.deployed,
            "retired": modem.retired,
            "location": modem.location,
            "project_id": modem.project_id,
            "paired_device": paired,
            "status": status
        })

    return templates.TemplateResponse("partials/modem_list.html", {
        "request": request,
        "modems": modem_list
    })


@router.get("/{modem_id}/paired-device")
async def get_paired_device(modem_id: str, db: Session = Depends(get_db)):
    """
    Get the device (SLM/seismograph) that is paired with this modem.
    Returns JSON with device info or null if not paired.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # Find device paired with this modem
    device = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id == modem_id,
        RosterUnit.retired == False
    ).first()

    if device:
        return {
            "paired": True,
            "device": {
                "id": device.id,
                "device_type": device.device_type,
                "deployed": device.deployed,
                "project_id": device.project_id,
                "location": device.location or device.address
            }
        }

    return {"paired": False, "device": None}


@router.get("/{modem_id}/paired-device-html", response_class=HTMLResponse)
async def get_paired_device_html(modem_id: str, request: Request, db: Session = Depends(get_db)):
    """
    Get HTML partial showing the device paired with this modem.
    Used by unit_detail.html for modems.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return HTMLResponse('<p class="text-red-500">Modem not found</p>')

    # Find device paired with this modem
    device = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id == modem_id,
        RosterUnit.retired == False
    ).first()

    return templates.TemplateResponse("partials/modem_paired_device.html", {
        "request": request,
        "modem_id": modem_id,
        "device": device
    })


@router.get("/{modem_id}/ping")
async def ping_modem(modem_id: str, db: Session = Depends(get_db)):
    """
    Test modem connectivity with a simple ping.
    Returns response time and connection status.
    """
    # Get modem from database
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()

    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    if not modem.ip_address:
        return {"status": "error", "detail": f"Modem {modem_id} has no IP address configured"}

    try:
        # Ping the modem (1 packet, 2 second timeout)
        start_time = time.time()
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "2", modem.ip_address],
            capture_output=True,
            text=True,
            timeout=3
        )
        response_time = int((time.time() - start_time) * 1000)  # Convert to milliseconds

        if result.returncode == 0:
            return {
                "status": "success",
                "modem_id": modem_id,
                "ip_address": modem.ip_address,
                "response_time_ms": response_time,
                "message": "Modem is responding"
            }
        else:
            return {
                "status": "error",
                "modem_id": modem_id,
                "ip_address": modem.ip_address,
                "detail": "Modem not responding to ping"
            }

    except subprocess.TimeoutExpired:
        return {
            "status": "error",
            "modem_id": modem_id,
            "ip_address": modem.ip_address,
            "detail": "Ping timeout"
        }
    except Exception as e:
        logger.error(f"Failed to ping modem {modem_id}: {e}")
        return {
            "status": "error",
            "modem_id": modem_id,
            "detail": str(e)
        }


@router.get("/{modem_id}/diagnostics")
async def get_modem_diagnostics(modem_id: str, db: Session = Depends(get_db)):
    """
    Get modem diagnostics (signal strength, data usage, uptime).

    Currently returns placeholders. When ModemManager is available,
    this endpoint will query it for real diagnostics.
    """
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # TODO: Query ModemManager backend when available
    return {
        "status": "unavailable",
        "message": "ModemManager integration not yet available",
        "modem_id": modem_id,
        "signal_strength_dbm": None,
        "data_usage_mb": None,
        "uptime_seconds": None,
        "carrier": None,
        "connection_type": None  # LTE, 5G, etc.
    }


@router.get("/{modem_id}/pairable-devices")
async def get_pairable_devices(
    modem_id: str,
    db: Session = Depends(get_db),
    search: str = Query(None),
    hide_paired: bool = Query(True)
):
    """
    Get list of devices (seismographs and SLMs) that can be paired with this modem.
    Used by the device picker modal in unit_detail.html.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # Query seismographs and SLMs
    query = db.query(RosterUnit).filter(
        RosterUnit.device_type.in_(["seismograph", "sound_level_meter"]),
        RosterUnit.retired == False
    )

    # Filter by search term if provided
    if search:
        search_term = f"%{search}%"
        query = query.filter(
            (RosterUnit.id.ilike(search_term)) |
            (RosterUnit.project_id.ilike(search_term)) |
            (RosterUnit.location.ilike(search_term)) |
            (RosterUnit.address.ilike(search_term)) |
            (RosterUnit.note.ilike(search_term))
        )

    devices = query.order_by(
        RosterUnit.deployed.desc(),
        RosterUnit.device_type.asc(),
        RosterUnit.id.asc()
    ).all()

    # Build device list
    device_list = []
    for device in devices:
        # Skip already paired devices if hide_paired is True
        is_paired_to_other = (
            device.deployed_with_modem_id is not None and
            device.deployed_with_modem_id != modem_id
        )
        is_paired_to_this = device.deployed_with_modem_id == modem_id

        if hide_paired and is_paired_to_other:
            continue

        device_list.append({
            "id": device.id,
            "device_type": device.device_type,
            "deployed": device.deployed,
            "project_id": device.project_id,
            "location": device.location or device.address,
            "note": device.note,
            "paired_modem_id": device.deployed_with_modem_id,
            "is_paired_to_this": is_paired_to_this,
            "is_paired_to_other": is_paired_to_other
        })

    return {"devices": device_list, "modem_id": modem_id}


@router.post("/{modem_id}/pair")
async def pair_device_to_modem(
    modem_id: str,
    db: Session = Depends(get_db),
    device_id: str = Query(..., description="ID of the device to pair")
):
    """
    Pair a device (seismograph or SLM) to this modem.
    Updates the device's deployed_with_modem_id field.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # Find the device
    device = db.query(RosterUnit).filter(
        RosterUnit.id == device_id,
        RosterUnit.device_type.in_(["seismograph", "sound_level_meter"]),
        RosterUnit.retired == False
    ).first()
    if not device:
        return {"status": "error", "detail": f"Device {device_id} not found"}

    # Unpair any device currently paired to this modem
    currently_paired = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id == modem_id
    ).all()
    for paired_device in currently_paired:
        paired_device.deployed_with_modem_id = None

    # Pair the new device
    device.deployed_with_modem_id = modem_id
    db.commit()

    return {
        "status": "success",
        "modem_id": modem_id,
        "device_id": device_id,
        "message": f"Device {device_id} paired to modem {modem_id}"
    }


@router.post("/{modem_id}/unpair")
async def unpair_device_from_modem(modem_id: str, db: Session = Depends(get_db)):
    """
    Unpair any device currently paired to this modem.
    """
    # Check modem exists
    modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
    if not modem:
        return {"status": "error", "detail": f"Modem {modem_id} not found"}

    # Find and unpair device
    device = db.query(RosterUnit).filter(
        RosterUnit.deployed_with_modem_id == modem_id
    ).first()

    if device:
        old_device_id = device.id
        device.deployed_with_modem_id = None
        db.commit()
        return {
            "status": "success",
            "modem_id": modem_id,
            "unpaired_device_id": old_device_id,
            "message": f"Device {old_device_id} unpaired from modem {modem_id}"
        }

    return {
        "status": "success",
        "modem_id": modem_id,
        "message": "No device was paired to this modem"
    }

@@ -14,20 +14,80 @@ from typing import Optional
import uuid
import json

from fastapi import UploadFile, File
import zipfile
import hashlib
import io
from pathlib import Path

from backend.database import get_db
from backend.models import (
    Project,
    ProjectType,
    ProjectModule,
    MonitoringLocation,
    UnitAssignment,
    RosterUnit,
    RecordingSession,
    MonitoringSession,
    DataFile,
)
from backend.templates_config import templates
from backend.utils.timezone import local_to_utc

router = APIRouter(prefix="/api/projects/{project_id}", tags=["project-locations"])


# ============================================================================
# Shared helpers
# ============================================================================

def _require_module(project, module_type: str, db: Session) -> None:
    """Raise 400 if the project does not have the given module enabled."""
    if not project:
        raise HTTPException(status_code=404, detail="Project not found.")
    exists = db.query(ProjectModule).filter_by(
        project_id=project.id, module_type=module_type, enabled=True
    ).first()
    if not exists:
        raise HTTPException(
            status_code=400,
            detail=f"This project does not have the {module_type.replace('_', ' ').title()} module enabled.",
        )


# ============================================================================
# Session period helpers
# ============================================================================

def _derive_period_type(dt: datetime) -> str:
    """
    Classify a session start time into one of four period types.
    Night = 22:00–07:00, Day = 07:00–22:00.
    Weekend = Saturday (5) or Sunday (6).
    """
    is_weekend = dt.weekday() >= 5
    is_night = dt.hour >= 22 or dt.hour < 7
    if is_weekend:
        return "weekend_night" if is_night else "weekend_day"
    return "weekday_night" if is_night else "weekday_day"


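The period rule above can be checked in isolation. The sketch below re-declares the helper (renamed `derive_period_type`, without the leading underscore) so it runs on its own, assuming naive local datetimes as in the router code:

```python
from datetime import datetime

def derive_period_type(dt: datetime) -> str:
    # Weekend = Saturday/Sunday; night = 22:00-07:00 local time
    is_weekend = dt.weekday() >= 5
    is_night = dt.hour >= 22 or dt.hour < 7
    if is_weekend:
        return "weekend_night" if is_night else "weekend_day"
    return "weekday_night" if is_night else "weekday_day"

print(derive_period_type(datetime(2026, 2, 17, 19, 0)))   # Tuesday 7 PM -> weekday_day
print(derive_period_type(datetime(2026, 2, 21, 23, 30)))  # Saturday 11:30 PM -> weekend_night
```

Note the boundary behavior: 07:00 exactly counts as day, 22:00 exactly as night.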
def _build_session_label(dt: datetime, location_name: str, period_type: str) -> str:
    """Build a human-readable session label, e.g. 'NRL-1 — Sun 2/23 — Night'.
    Uses started_at date as-is; user can correct period_type in the wizard.
    """
    day_abbr = dt.strftime("%a")
    date_str = f"{dt.month}/{dt.day}"
    period_str = {
        "weekday_day": "Day",
        "weekday_night": "Night",
        "weekend_day": "Day",
        "weekend_night": "Night",
    }.get(period_type, "")
    parts = [p for p in [location_name, f"{day_abbr} {date_str}", period_str] if p]
    return " — ".join(parts)


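A standalone sketch of the label builder (re-declared here so it runs outside the module; the location name "NRL-1" is just an illustrative value). Note that `strftime("%a")` is locale-dependent; the output below assumes the default C locale:

```python
from datetime import datetime

def build_session_label(dt: datetime, location_name: str, period_type: str) -> str:
    # Mirrors _build_session_label above: "<location> — <Day M/D> — <Day|Night>"
    day_abbr = dt.strftime("%a")
    date_str = f"{dt.month}/{dt.day}"
    period_str = {
        "weekday_day": "Day",
        "weekday_night": "Night",
        "weekend_day": "Day",
        "weekend_night": "Night",
    }.get(period_type, "")
    parts = [p for p in [location_name, f"{day_abbr} {date_str}", period_str] if p]
    return " — ".join(parts)

# 2026-02-22 is a Sunday
print(build_session_label(datetime(2026, 2, 22, 23, 0), "NRL-1", "weekend_night"))
```

Empty components are dropped by the `if p` filter, so an unknown `period_type` simply yields a two-part label.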
# ============================================================================
# Monitoring Locations CRUD
# ============================================================================
@@ -58,11 +118,11 @@ async def get_project_locations(
    # Enrich with assignment info
    locations_data = []
    for location in locations:
        # Get active assignment
        # Get active assignment (active = assigned_until IS NULL)
        assignment = db.query(UnitAssignment).filter(
            and_(
                UnitAssignment.location_id == location.id,
                UnitAssignment.status == "active",
                UnitAssignment.assigned_until == None,
            )
        ).first()

@@ -70,8 +130,8 @@ async def get_project_locations(
        if assignment:
            assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()

        # Count recording sessions
        session_count = db.query(RecordingSession).filter_by(
        # Count monitoring sessions
        session_count = db.query(MonitoringSession).filter_by(
            location_id=location.id
        ).count()

@@ -218,11 +278,11 @@ async def delete_location(
    if not location:
        raise HTTPException(status_code=404, detail="Location not found")

    # Check if location has active assignments
    # Check if location has active assignments (active = assigned_until IS NULL)
    active_assignments = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.status == "active",
            UnitAssignment.assigned_until == None,
        )
    ).count()

@@ -313,18 +373,18 @@ async def assign_unit_to_location(
            detail=f"Unit type '{unit.device_type}' does not match location type '{location.location_type}'",
        )

    # Check if location already has an active assignment
    # Check if location already has an active assignment (active = assigned_until IS NULL)
    existing_assignment = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.status == "active",
            UnitAssignment.assigned_until == None,
        )
    ).first()

    if existing_assignment:
        raise HTTPException(
            status_code=400,
            detail=f"Location already has an active unit assignment ({existing_assignment.unit_id}). Unassign first.",
            detail=f"Location already has an active unit assignment ({existing_assignment.unit_id}). Use swap to replace it.",
        )

    # Create new assignment
@@ -370,19 +430,19 @@ async def unassign_unit(
    if not assignment:
        raise HTTPException(status_code=404, detail="Assignment not found")

    # Check if there are active recording sessions
    active_sessions = db.query(RecordingSession).filter(
    # Check if there are active monitoring sessions
    active_sessions = db.query(MonitoringSession).filter(
        and_(
            RecordingSession.location_id == assignment.location_id,
            RecordingSession.unit_id == assignment.unit_id,
            RecordingSession.status == "recording",
            MonitoringSession.location_id == assignment.location_id,
            MonitoringSession.unit_id == assignment.unit_id,
            MonitoringSession.status == "recording",
        )
    ).count()

    if active_sessions > 0:
        raise HTTPException(
            status_code=400,
            detail="Cannot unassign unit with active recording sessions. Stop recording first.",
            detail="Cannot unassign unit with active monitoring sessions. Stop monitoring first.",
        )

    assignment.status = "completed"
@@ -393,10 +453,120 @@ async def unassign_unit(
    return {"success": True, "message": "Unit unassigned successfully"}


@router.post("/locations/{location_id}/swap")
async def swap_unit_on_location(
    project_id: str,
    location_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Swap the unit assigned to a vibration monitoring location.
    Ends the current active assignment (if any), creates a new one,
    and optionally updates modem pairing on the seismograph.
    Works for first-time assignments too (no current assignment = just create).
    """
    location = db.query(MonitoringLocation).filter_by(
        id=location_id,
        project_id=project_id,
    ).first()
    if not location:
        raise HTTPException(status_code=404, detail="Location not found")

    form_data = await request.form()
    unit_id = form_data.get("unit_id")
    modem_id = form_data.get("modem_id") or None
    notes = form_data.get("notes") or None

    if not unit_id:
        raise HTTPException(status_code=400, detail="unit_id is required")

    # Validate new unit
    unit = db.query(RosterUnit).filter_by(id=unit_id).first()
    if not unit:
        raise HTTPException(status_code=404, detail="Unit not found")

    expected_device_type = "slm" if location.location_type == "sound" else "seismograph"
    if unit.device_type != expected_device_type:
        raise HTTPException(
            status_code=400,
            detail=f"Unit type '{unit.device_type}' does not match location type '{location.location_type}'",
        )

    # End current active assignment if one exists (active = assigned_until IS NULL)
    current = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.assigned_until == None,
        )
    ).first()
    if current:
        current.assigned_until = datetime.utcnow()
        current.status = "completed"

    # Create new assignment
    new_assignment = UnitAssignment(
        id=str(uuid.uuid4()),
        unit_id=unit_id,
        location_id=location_id,
        project_id=project_id,
        device_type=unit.device_type,
        assigned_until=None,
        status="active",
        notes=notes,
    )
    db.add(new_assignment)

    # Update modem pairing on the seismograph if modem provided
    if modem_id:
        modem = db.query(RosterUnit).filter_by(id=modem_id, device_type="modem").first()
        if not modem:
            raise HTTPException(status_code=404, detail=f"Modem '{modem_id}' not found")
        unit.deployed_with_modem_id = modem_id
        modem.deployed_with_unit_id = unit_id
    else:
        # Clear modem pairing if not provided
        unit.deployed_with_modem_id = None

    db.commit()

    return JSONResponse({
        "success": True,
        "assignment_id": new_assignment.id,
        "message": f"Unit '{unit_id}' assigned to '{location.name}'" + (f" with modem '{modem_id}'" if modem_id else ""),
    })


# ============================================================================
# Available Units for Assignment
# ============================================================================

@router.get("/available-modems", response_class=JSONResponse)
async def get_available_modems(
    project_id: str,
    db: Session = Depends(get_db),
):
    """
    Get all deployed, non-retired modems for the modem assignment dropdown.
    """
    modems = db.query(RosterUnit).filter(
        and_(
            RosterUnit.device_type == "modem",
            RosterUnit.deployed == True,
            RosterUnit.retired == False,
        )
    ).order_by(RosterUnit.id).all()

    return [
        {
            "id": m.id,
            "hardware_model": m.hardware_model,
            "ip_address": m.ip_address,
        }
        for m in modems
    ]


@router.get("/available-units", response_class=JSONResponse)
async def get_available_units(
    project_id: str,
@@ -419,9 +589,9 @@ async def get_available_units(
        )
    ).all()

    # Filter out units that already have active assignments
    # Filter out units that already have active assignments (active = assigned_until IS NULL)
    assigned_unit_ids = db.query(UnitAssignment.unit_id).filter(
        UnitAssignment.status == "active"
        UnitAssignment.assigned_until == None
    ).distinct().all()
    assigned_unit_ids = [uid[0] for uid in assigned_unit_ids]

@@ -451,14 +621,12 @@ async def get_nrl_sessions(
    db: Session = Depends(get_db),
):
    """
    Get recording sessions for a specific NRL.
    Get monitoring sessions for a specific NRL.
    Returns HTML partial with session list.
    """
    from backend.models import RecordingSession, RosterUnit

    sessions = db.query(RecordingSession).filter_by(
    sessions = db.query(MonitoringSession).filter_by(
        location_id=location_id
    ).order_by(RecordingSession.started_at.desc()).all()
    ).order_by(MonitoringSession.started_at.desc()).all()

    # Enrich with unit details
    sessions_data = []
@@ -491,14 +659,12 @@ async def get_nrl_files(
    Get data files for a specific NRL.
    Returns HTML partial with file list.
    """
    from backend.models import DataFile, RecordingSession

    # Join DataFile with RecordingSession to filter by location_id
    # Join DataFile with MonitoringSession to filter by location_id
    files = db.query(DataFile).join(
        RecordingSession,
        DataFile.session_id == RecordingSession.id
        MonitoringSession,
        DataFile.session_id == MonitoringSession.id
    ).filter(
        RecordingSession.location_id == location_id
        MonitoringSession.location_id == location_id
    ).order_by(DataFile.created_at.desc()).all()

    # Enrich with session details
@@ -506,7 +672,7 @@ async def get_nrl_files(
    for file in files:
        session = None
        if file.session_id:
            session = db.query(RecordingSession).filter_by(id=file.session_id).first()
            session = db.query(MonitoringSession).filter_by(id=file.session_id).first()

        files_data.append({
            "file": file,
@@ -519,3 +685,324 @@ async def get_nrl_files(
        "location_id": location_id,
        "files": files_data,
    })


# ============================================================================
# Manual SD Card Data Upload
# ============================================================================

def _parse_rnh(content: bytes) -> dict:
    """
    Parse a Rion .rnh metadata file (INI-style with [Section] headers).
    Returns a dict of key metadata fields.
    """
    result = {}
    try:
        text = content.decode("utf-8", errors="replace")
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith("["):
                continue
            if "," in line:
                key, _, value = line.partition(",")
                key = key.strip()
                value = value.strip()
                if key == "Serial Number":
                    result["serial_number"] = value
                elif key == "Store Name":
                    result["store_name"] = value
                elif key == "Index Number":
                    result["index_number"] = value
                elif key == "Measurement Start Time":
                    result["start_time_str"] = value
                elif key == "Measurement Stop Time":
                    result["stop_time_str"] = value
                elif key == "Total Measurement Time":
                    result["total_time_str"] = value
    except Exception:
        pass
    return result


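A minimal, self-contained sketch of the parser's key-extraction loop, run on a hypothetical .rnh snippet (the sample bytes below are invented to match the comma-separated format the parser above expects; only two of the recognized keys are handled here for brevity):

```python
def parse_rnh(content: bytes) -> dict:
    # Skip [Section] headers; split each remaining line at the first comma
    result = {}
    text = content.decode("utf-8", errors="replace")
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("["):
            continue
        if "," in line:
            key, _, value = line.partition(",")
            key, value = key.strip(), value.strip()
            if key == "Serial Number":
                result["serial_number"] = value
            elif key == "Measurement Start Time":
                result["start_time_str"] = value
    return result

# Hypothetical sample content
sample = b"[Header]\nSerial Number,00123456\nMeasurement Start Time,2026/02/17 19:00:19\n"
print(parse_rnh(sample))
# -> {'serial_number': '00123456', 'start_time_str': '2026/02/17 19:00:19'}
```

`str.partition(",")` splits only at the first comma, so values containing commas survive intact.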
def _parse_rnh_datetime(s: str):
    """Parse RNH datetime string: '2026/02/17 19:00:19' -> datetime"""
    from datetime import datetime
    if not s:
        return None
    try:
        return datetime.strptime(s.strip(), "%Y/%m/%d %H:%M:%S")
    except Exception:
        return None


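A standalone version of the datetime helper (re-declared here so it runs on its own), showing the successful-parse and graceful-failure paths:

```python
from datetime import datetime

def parse_rnh_datetime(s: str):
    # RNH timestamps look like '2026/02/17 19:00:19'; return None on any failure
    if not s:
        return None
    try:
        return datetime.strptime(s.strip(), "%Y/%m/%d %H:%M:%S")
    except ValueError:
        return None

print(parse_rnh_datetime("2026/02/17 19:00:19"))  # -> 2026-02-17 19:00:19
print(parse_rnh_datetime("not a date"))           # -> None
```

Returning None instead of raising lets the upload code fall back to `datetime.utcnow()` when the metadata is missing or malformed.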
def _classify_file(filename: str) -> str:
    """Classify a file by name into a DataFile file_type."""
    name = filename.lower()
    if name.endswith(".rnh"):
        return "log"
    if name.endswith(".rnd"):
        return "measurement"
    if name.endswith(".zip"):
        return "archive"
    return "data"


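The extension-to-type mapping above behaves like this standalone sketch (helper re-declared, sample filenames invented for illustration):

```python
def classify_file(filename: str) -> str:
    # Case-insensitive extension match; anything unrecognized falls through to "data"
    name = filename.lower()
    if name.endswith(".rnh"):
        return "log"
    if name.endswith(".rnd"):
        return "measurement"
    if name.endswith(".zip"):
        return "archive"
    return "data"

for fname in ["Auto_0001.rnh", "NL43_Leq_0001.RND", "Auto_0001.zip", "notes.txt"]:
    print(fname, "->", classify_file(fname))
```

Lowercasing first means `.RND` and `.rnd` classify identically.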
@router.post("/nrl/{location_id}/upload-data")
async def upload_nrl_data(
    project_id: str,
    location_id: str,
    db: Session = Depends(get_db),
    files: list[UploadFile] = File(...),
):
    """
    Manually upload SD card data for an offline NRL.

    Accepts either:
    - A single .zip file (the Auto_#### folder zipped) — auto-extracted
    - Multiple .rnd / .rnh files selected directly from the SD card folder

    Creates a MonitoringSession from .rnh metadata and DataFile records
    for each measurement file. No unit assignment required.
    """
    from datetime import datetime

    # Verify project and location exist
    project = db.query(Project).filter_by(id=project_id).first()
    _require_module(project, "sound_monitoring", db)

    location = db.query(MonitoringLocation).filter_by(
        id=location_id, project_id=project_id
    ).first()
    if not location:
        raise HTTPException(status_code=404, detail="Location not found")

    # --- Step 1: Normalize to (filename, bytes) list ---
    file_entries: list[tuple[str, bytes]] = []

    if len(files) == 1 and files[0].filename.lower().endswith(".zip"):
        raw = await files[0].read()
        try:
            with zipfile.ZipFile(io.BytesIO(raw)) as zf:
                for info in zf.infolist():
                    if info.is_dir():
                        continue
                    name = Path(info.filename).name  # strip folder path
                    if not name:
                        continue
                    file_entries.append((name, zf.read(info)))
        except zipfile.BadZipFile:
            raise HTTPException(status_code=400, detail="Uploaded file is not a valid ZIP archive.")
    else:
        for uf in files:
            data = await uf.read()
            file_entries.append((uf.filename, data))

    if not file_entries:
        raise HTTPException(status_code=400, detail="No usable files found in upload.")

    # --- Step 1b: Filter to only relevant files ---
    # Keep: .rnh (metadata) and measurement .rnd files
    # NL-43 generates two .rnd types: _Leq_ (15-min averages, wanted) and _Lp_ (1-sec granular, skip)
    # AU2 (NL-23/older Rion) generates a single Au2_####.rnd per session — always keep those
    # Drop: _Lp_ .rnd, .xlsx, .mp3, and anything else
    def _is_wanted(fname: str) -> bool:
        n = fname.lower()
        if n.endswith(".rnh"):
            return True
        if n.endswith(".rnd"):
            if "_leq_" in n:  # NL-43 Leq file
                return True
            if n.startswith("au2_"):  # AU2 format (NL-23) — always Leq equivalent
                return True
            if "_lp" not in n and "_leq_" not in n:
                # Unknown .rnd format — include it so we don't silently drop data
                return True
        return False

    file_entries = [(fname, fbytes) for fname, fbytes in file_entries if _is_wanted(fname)]

    if not file_entries:
        raise HTTPException(status_code=400, detail="No usable .rnd or .rnh files found. Expected NL-43 _Leq_ files or AU2 format .rnd files.")

    # --- Step 2: Find and parse .rnh metadata ---
    rnh_meta = {}
    for fname, fbytes in file_entries:
        if fname.lower().endswith(".rnh"):
            rnh_meta = _parse_rnh(fbytes)
            break

    # RNH files store local time (no UTC offset). Use local values for period
    # classification / label generation, then convert to UTC for DB storage so
    # the local_datetime Jinja filter displays the correct time.
    started_at_local = _parse_rnh_datetime(rnh_meta.get("start_time_str")) or datetime.utcnow()
    stopped_at_local = _parse_rnh_datetime(rnh_meta.get("stop_time_str"))

    started_at = local_to_utc(started_at_local)
    stopped_at = local_to_utc(stopped_at_local) if stopped_at_local else None

    duration_seconds = None
    if started_at and stopped_at:
        duration_seconds = int((stopped_at - started_at).total_seconds())

    store_name = rnh_meta.get("store_name", "")
    serial_number = rnh_meta.get("serial_number", "")
    index_number = rnh_meta.get("index_number", "")

    # --- Step 3: Create MonitoringSession ---
    # Use local times for period/label so classification reflects the clock at the site.
    period_type = _derive_period_type(started_at_local) if started_at_local else None
    session_label = _build_session_label(started_at_local, location.name, period_type) if started_at_local else None

    session_id = str(uuid.uuid4())
    monitoring_session = MonitoringSession(
        id=session_id,
        project_id=project_id,
        location_id=location_id,
        unit_id=None,
        session_type="sound",
        started_at=started_at,
        stopped_at=stopped_at,
        duration_seconds=duration_seconds,
        status="completed",
        session_label=session_label,
        period_type=period_type,
        session_metadata=json.dumps({
            "source": "manual_upload",
            "store_name": store_name,
            "serial_number": serial_number,
            "index_number": index_number,
        }),
    )
    db.add(monitoring_session)
    db.commit()
    db.refresh(monitoring_session)

# --- Step 4: Write files to disk and create DataFile records ---
|
||||
output_dir = Path("data/Projects") / project_id / session_id
|
||||
output_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
leq_count = 0
|
||||
lp_count = 0
|
||||
metadata_count = 0
|
||||
files_imported = 0
|
||||
|
||||
for fname, fbytes in file_entries:
|
||||
file_type = _classify_file(fname)
|
||||
fname_lower = fname.lower()
|
||||
|
||||
# Track counts for summary
|
||||
if fname_lower.endswith(".rnd"):
|
||||
if "_leq_" in fname_lower:
|
||||
leq_count += 1
|
||||
elif "_lp" in fname_lower:
|
||||
lp_count += 1
|
||||
elif fname_lower.endswith(".rnh"):
|
||||
metadata_count += 1
|
||||
|
||||
# Write to disk
|
||||
dest = output_dir / fname
|
||||
dest.write_bytes(fbytes)
|
||||
|
||||
# Compute checksum
|
||||
checksum = hashlib.sha256(fbytes).hexdigest()
|
||||
|
||||
# Store relative path from data/ dir
|
||||
rel_path = str(dest.relative_to("data"))
|
||||
|
||||
data_file = DataFile(
|
||||
id=str(uuid.uuid4()),
|
||||
session_id=session_id,
|
||||
file_path=rel_path,
|
||||
file_type=file_type,
|
||||
file_size_bytes=len(fbytes),
|
||||
downloaded_at=datetime.utcnow(),
|
||||
checksum=checksum,
|
||||
file_metadata=json.dumps({
|
||||
"source": "manual_upload",
|
||||
"original_filename": fname,
|
||||
"store_name": store_name,
|
||||
}),
|
||||
)
|
||||
db.add(data_file)
|
||||
files_imported += 1
|
||||
|
||||
db.commit()
|
||||
|
||||
return {
|
||||
"success": True,
|
||||
"session_id": session_id,
|
||||
"files_imported": files_imported,
|
||||
"leq_files": leq_count,
|
||||
"lp_files": lp_count,
|
||||
"metadata_files": metadata_count,
|
||||
"store_name": store_name,
|
||||
"started_at": started_at.isoformat() if started_at else None,
|
||||
"stopped_at": stopped_at.isoformat() if stopped_at else None,
|
||||
}
|
||||
|
||||
|
||||
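The per-file bookkeeping in the import loop above (SHA-256 checksum plus a path stored relative to the `data/` root) can be sketched standalone; the temp directory and file name here are illustrative stand-ins, not the real layout:

```python
import hashlib
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())  # stand-in for the data/ directory
dest = root / "Projects" / "p1" / "s1" / "store_Leq_0001.rnd"
dest.parent.mkdir(parents=True, exist_ok=True)

fbytes = b"abc"
dest.write_bytes(fbytes)

# Same two derived values the loop persists on each DataFile row
checksum = hashlib.sha256(fbytes).hexdigest()
rel_path = str(dest.relative_to(root))  # portable path, independent of install dir

print(checksum[:8])  # ba7816bf
```

Storing the relative path keeps DataFile rows valid if the deployment moves the `data/` root.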
# ============================================================================
# NRL Live Status (connected NRLs only)
# ============================================================================


@router.get("/nrl/{location_id}/live-status", response_class=HTMLResponse)
async def get_nrl_live_status(
    project_id: str,
    location_id: str,
    request: Request,
    db: Session = Depends(get_db),
):
    """
    Fetch cached status from SLMM for the unit assigned to this NRL and
    return a compact HTML status card. Used in the NRL overview tab for
    connected NRLs. Gracefully shows an offline message if SLMM is unreachable.
    Sound Monitoring projects only.
    """
    import os
    import httpx

    _require_module(db.query(Project).filter_by(id=project_id).first(), "sound_monitoring", db)

    # Find the assigned unit (active = assigned_until IS NULL)
    assignment = db.query(UnitAssignment).filter(
        and_(
            UnitAssignment.location_id == location_id,
            UnitAssignment.assigned_until == None,
        )
    ).first()

    if not assignment:
        return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
            "request": request,
            "status": None,
            "error": "No unit assigned",
        })

    unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
    if not unit:
        return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
            "request": request,
            "status": None,
            "error": "Assigned unit not found",
        })

    slmm_base = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
    status_data = None
    error_msg = None

    try:
        async with httpx.AsyncClient(timeout=5.0) as client:
            resp = await client.get(f"{slmm_base}/api/nl43/{unit.id}/status")
            if resp.status_code == 200:
                status_data = resp.json()
            else:
                error_msg = f"SLMM returned {resp.status_code}"
    except Exception:
        error_msg = "SLMM unreachable"

    return templates.TemplateResponse("partials/projects/nrl_live_status.html", {
        "request": request,
        "unit": unit,
        "status": status_data,
        "error": error_msg,
    })
@@ -186,6 +186,41 @@ async def create_recurring_schedule(
     created_schedules = []
     base_name = data.get("name", "Unnamed Schedule")
 
+    # Parse one-off datetime fields if applicable
+    one_off_start = None
+    one_off_end = None
+    if data.get("schedule_type") == "one_off":
+        from zoneinfo import ZoneInfo
+
+        tz = ZoneInfo(data.get("timezone", "America/New_York"))
+
+        start_dt_str = data.get("start_datetime")
+        end_dt_str = data.get("end_datetime")
+
+        if not start_dt_str or not end_dt_str:
+            raise HTTPException(status_code=400, detail="One-off schedules require start and end date/time")
+
+        try:
+            start_local = datetime.fromisoformat(start_dt_str).replace(tzinfo=tz)
+            end_local = datetime.fromisoformat(end_dt_str).replace(tzinfo=tz)
+        except ValueError:
+            raise HTTPException(status_code=400, detail="Invalid datetime format")
+
+        duration = end_local - start_local
+        if duration.total_seconds() < 900:
+            raise HTTPException(status_code=400, detail="Duration must be at least 15 minutes")
+        if duration.total_seconds() > 86400:
+            raise HTTPException(status_code=400, detail="Duration cannot exceed 24 hours")
+
+        now_local = datetime.now(tz)
+        if start_local <= now_local:
+            raise HTTPException(status_code=400, detail="Start time must be in the future")
+
+        # Convert to UTC for storage
+        one_off_start = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
+        one_off_end = end_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)
+
     # Create a schedule for each location
     for location in locations:
         # Determine device type from location
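The local-to-UTC storage conversion in the one-off branch above can be exercised in isolation; the date and the `America/New_York` zone match the endpoint's default, and the stored value is deliberately naive UTC:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")

# Attach the wall-clock zone the user entered, then normalize to naive UTC for storage
start_local = datetime.fromisoformat("2024-07-04T09:30").replace(tzinfo=tz)
one_off_start = start_local.astimezone(ZoneInfo("UTC")).replace(tzinfo=None)

print(one_off_start.isoformat())  # 2024-07-04T13:30:00 (EDT is UTC-4 in July)
```

Using `replace(tzinfo=tz)` rather than `astimezone(tz)` matters here: the input string is a wall-clock time in the schedule's zone, not a time to convert.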
@@ -207,6 +242,8 @@ async def create_recurring_schedule(
             include_download=data.get("include_download", True),
             auto_increment_index=data.get("auto_increment_index", True),
             timezone=data.get("timezone", "America/New_York"),
+            start_datetime=one_off_start,
+            end_datetime=one_off_end,
         )
 
         # Generate actions immediately so they appear right away
@@ -330,19 +367,35 @@ async def disable_schedule(
     db: Session = Depends(get_db),
 ):
     """
-    Disable a schedule.
+    Disable a schedule and cancel all its pending actions.
     """
     service = get_recurring_schedule_service(db)
 
+    # Count pending actions before disabling (for response message)
+    from sqlalchemy import and_
+    from backend.models import ScheduledAction
+    pending_count = db.query(ScheduledAction).filter(
+        and_(
+            ScheduledAction.execution_status == "pending",
+            ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%'),
+        )
+    ).count()
+
     schedule = service.disable_schedule(schedule_id)
 
     if not schedule:
         raise HTTPException(status_code=404, detail="Schedule not found")
 
+    message = "Schedule disabled"
+    if pending_count > 0:
+        message += f" and {pending_count} pending action(s) cancelled"
+
     return {
         "success": True,
         "schedule_id": schedule.id,
         "enabled": schedule.enabled,
-        "message": "Schedule disabled",
+        "cancelled_actions": pending_count,
+        "message": message,
     }
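The `LIKE` pattern in the hunk above leans on how `json.dumps` serializes the notes field: default separators put a space after the colon, so the SQL wildcard pattern and the stored substring line up. A quick check of that assumption (sample IDs are illustrative):

```python
import json

schedule_id = "abc-123"
notes = json.dumps({"schedule_id": schedule_id, "origin": "recurring"})

# Same substring the SQL pattern %"schedule_id": "abc-123"% would match
needle = f'"schedule_id": "{schedule_id}"'
print(needle in notes)  # True
```

This is a fragile coupling: if the notes were ever written with `json.dumps(..., separators=(",", ":"))`, the space would disappear and the `LIKE` would silently match nothing.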
@@ -444,6 +497,9 @@ async def get_schedule_list_partial(
     """
     Return HTML partial for schedule list.
     """
+    project = db.query(Project).filter_by(id=project_id).first()
+    project_status = project.status if project else "active"
+
     schedules = db.query(RecurringSchedule).filter_by(
         project_id=project_id
     ).order_by(RecurringSchedule.created_at.desc()).all()
@@ -462,4 +518,5 @@ async def get_schedule_list_partial(
         "request": request,
         "project_id": project_id,
         "schedules": schedule_data,
+        "project_status": project_status,
     })
@@ -2,20 +2,32 @@ from fastapi import APIRouter, Depends
 from sqlalchemy.orm import Session
 from datetime import datetime, timedelta
 from typing import Dict, Any
+import asyncio
+import logging
 import random
 
 from backend.database import get_db
 from backend.services.snapshot import emit_status_snapshot
+from backend.services.slm_status_sync import sync_slm_status_to_emitters
 
 router = APIRouter(prefix="/api", tags=["roster"])
+logger = logging.getLogger(__name__)
 
 
 @router.get("/status-snapshot")
-def get_status_snapshot(db: Session = Depends(get_db)):
+async def get_status_snapshot(db: Session = Depends(get_db)):
     """
     Calls emit_status_snapshot() to get current fleet status.
-    This will be replaced with real Series3 emitter logic later.
+    Syncs SLM status from SLMM before generating snapshot.
     """
+    # Sync SLM status from SLMM (with timeout to prevent blocking)
+    try:
+        await asyncio.wait_for(sync_slm_status_to_emitters(), timeout=2.0)
+    except asyncio.TimeoutError:
+        logger.warning("SLM status sync timed out, using cached data")
+    except Exception as e:
+        logger.warning(f"SLM status sync failed: {e}")
+
     return emit_status_snapshot()
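The bounded sync in `get_status_snapshot` relies on `asyncio.wait_for` cancelling the awaited coroutine and raising `asyncio.TimeoutError` so the endpoint can fall back to cached data. A minimal reproduction (the slow coroutine stands in for `sync_slm_status_to_emitters`):

```python
import asyncio

async def slow_sync():
    await asyncio.sleep(10)  # stand-in for a hung SLMM call

async def snapshot():
    try:
        await asyncio.wait_for(slow_sync(), timeout=0.01)
    except asyncio.TimeoutError:
        return "timed out, using cached data"
    return "synced"

print(asyncio.run(snapshot()))  # timed out, using cached data
```

Note that `wait_for` cancels `slow_sync` on timeout, so a partially completed sync must be safe to abandon mid-flight.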
@@ -92,15 +92,15 @@ async def rename_unit(
     except Exception as e:
         logger.warning(f"Could not update unit_assignments: {e}")
 
-    # Update recording_sessions table (if exists)
+    # Update monitoring_sessions table (if exists)
     try:
-        from backend.models import RecordingSession
-        db.query(RecordingSession).filter(RecordingSession.unit_id == old_id).update(
+        from backend.models import MonitoringSession
+        db.query(MonitoringSession).filter(MonitoringSession.unit_id == old_id).update(
             {"unit_id": new_id},
             synchronize_session=False
         )
     except Exception as e:
-        logger.warning(f"Could not update recording_sessions: {e}")
+        logger.warning(f"Could not update monitoring_sessions: {e}")
 
     # Commit all changes
     db.commit()
@@ -3,11 +3,13 @@ Seismograph Dashboard API Router
 Provides endpoints for the seismograph-specific dashboard
 """
 
-from fastapi import APIRouter, Request, Depends, Query
+from datetime import date, datetime, timedelta
+
+from fastapi import APIRouter, Request, Depends, Query, Form, HTTPException
 from fastapi.responses import HTMLResponse
 from sqlalchemy.orm import Session
 from backend.database import get_db
-from backend.models import RosterUnit
+from backend.models import RosterUnit, UnitHistory, UserPreferences
 from backend.templates_config import templates
 
 router = APIRouter(prefix="/api/seismo-dashboard", tags=["seismo-dashboard"])
@@ -26,7 +28,8 @@ async def get_seismo_stats(request: Request, db: Session = Depends(get_db)):
 
     total = len(seismos)
     deployed = sum(1 for s in seismos if s.deployed)
-    benched = sum(1 for s in seismos if not s.deployed)
+    benched = sum(1 for s in seismos if not s.deployed and not s.out_for_calibration)
+    out_for_calibration = sum(1 for s in seismos if s.out_for_calibration)
 
     # Count modems assigned to deployed seismographs
     with_modem = sum(1 for s in seismos if s.deployed and s.deployed_with_modem_id)
@@ -39,6 +42,7 @@ async def get_seismo_stats(request: Request, db: Session = Depends(get_db)):
         "total": total,
         "deployed": deployed,
         "benched": benched,
+        "out_for_calibration": out_for_calibration,
         "with_modem": with_modem,
         "without_modem": without_modem
     }
@@ -49,10 +53,14 @@ async def get_seismo_stats(request: Request, db: Session = Depends(get_db)):
 async def get_seismo_units(
     request: Request,
     db: Session = Depends(get_db),
-    search: str = Query(None)
+    search: str = Query(None),
+    sort: str = Query("id"),
+    order: str = Query("asc"),
+    status: str = Query(None),
+    modem: str = Query(None)
 ):
     """
-    Returns HTML partial with filterable seismograph unit list
+    Returns HTML partial with filterable and sortable seismograph unit list
     """
     query = db.query(RosterUnit).filter_by(
         device_type="seismograph",
@@ -61,20 +69,160 @@ async def get_seismo_units(
 
     # Apply search filter
     if search:
         search_lower = search.lower()
         query = query.filter(
             (RosterUnit.id.ilike(f"%{search}%")) |
             (RosterUnit.note.ilike(f"%{search}%")) |
             (RosterUnit.address.ilike(f"%{search}%"))
         )
 
-    seismos = query.order_by(RosterUnit.id).all()
+    # Apply status filter
+    if status == "deployed":
+        query = query.filter(RosterUnit.deployed == True)
+    elif status == "benched":
+        query = query.filter(RosterUnit.deployed == False, RosterUnit.out_for_calibration == False)
+    elif status == "out_for_calibration":
+        query = query.filter(RosterUnit.out_for_calibration == True)
+
+    # Apply modem filter
+    if modem == "with":
+        query = query.filter(RosterUnit.deployed_with_modem_id.isnot(None))
+    elif modem == "without":
+        query = query.filter(RosterUnit.deployed_with_modem_id.is_(None))
+
+    # Apply sorting
+    sort_column_map = {
+        "id": RosterUnit.id,
+        "status": RosterUnit.deployed,
+        "modem": RosterUnit.deployed_with_modem_id,
+        "location": RosterUnit.address,
+        "last_calibrated": RosterUnit.last_calibrated,
+        "notes": RosterUnit.note
+    }
+    sort_column = sort_column_map.get(sort, RosterUnit.id)
+
+    if order == "desc":
+        query = query.order_by(sort_column.desc())
+    else:
+        query = query.order_by(sort_column.asc())
+
+    seismos = query.all()
 
     return templates.TemplateResponse(
         "partials/seismo_unit_list.html",
         {
             "request": request,
             "units": seismos,
-            "search": search or ""
+            "search": search or "",
+            "sort": sort,
+            "order": order,
+            "status": status or "",
+            "modem": modem or "",
+            "today": date.today()
         }
     )
+
+
+def _get_calibration_interval(db: Session) -> int:
+    prefs = db.query(UserPreferences).first()
+    if prefs and prefs.calibration_interval_days:
+        return prefs.calibration_interval_days
+    return 365
+
+
+def _row_context(request: Request, unit: RosterUnit) -> dict:
+    return {"request": request, "unit": unit, "today": date.today()}
+
+
+@router.get("/unit/{unit_id}/view-row", response_class=HTMLResponse)
+async def get_seismo_view_row(unit_id: str, request: Request, db: Session = Depends(get_db)):
+    unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
+    if not unit:
+        raise HTTPException(status_code=404, detail="Unit not found")
+    return templates.TemplateResponse("partials/seismo_row_view.html", _row_context(request, unit))
+
+
+@router.get("/unit/{unit_id}/edit-row", response_class=HTMLResponse)
+async def get_seismo_edit_row(unit_id: str, request: Request, db: Session = Depends(get_db)):
+    unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
+    if not unit:
+        raise HTTPException(status_code=404, detail="Unit not found")
+    return templates.TemplateResponse("partials/seismo_row_edit.html", _row_context(request, unit))
+
+
+@router.post("/unit/{unit_id}/quick-update", response_class=HTMLResponse)
+async def quick_update_seismo_unit(
+    unit_id: str,
+    request: Request,
+    db: Session = Depends(get_db),
+    status: str = Form(...),
+    last_calibrated: str = Form(""),
+    note: str = Form(""),
+):
+    unit = db.query(RosterUnit).filter(RosterUnit.id == unit_id).first()
+    if not unit:
+        raise HTTPException(status_code=404, detail="Unit not found")
+
+    # --- Status ---
+    old_deployed = unit.deployed
+    old_out_for_cal = unit.out_for_calibration
+    if status == "deployed":
+        unit.deployed = True
+        unit.out_for_calibration = False
+    elif status == "out_for_calibration":
+        unit.deployed = False
+        unit.out_for_calibration = True
+    else:
+        unit.deployed = False
+        unit.out_for_calibration = False
+
+    if unit.deployed != old_deployed or unit.out_for_calibration != old_out_for_cal:
+        old_status = "deployed" if old_deployed else ("out_for_calibration" if old_out_for_cal else "benched")
+        db.add(UnitHistory(
+            unit_id=unit_id,
+            change_type="deployed_change",
+            field_name="status",
+            old_value=old_status,
+            new_value=status,
+            source="manual",
+        ))
+
+    # --- Last calibrated ---
+    old_cal = unit.last_calibrated
+    if last_calibrated:
+        try:
+            new_cal = datetime.strptime(last_calibrated, "%Y-%m-%d").date()
+        except ValueError:
+            raise HTTPException(status_code=400, detail="Invalid date format. Use YYYY-MM-DD")
+        unit.last_calibrated = new_cal
+        unit.next_calibration_due = new_cal + timedelta(days=_get_calibration_interval(db))
+    else:
+        unit.last_calibrated = None
+        unit.next_calibration_due = None
+
+    if unit.last_calibrated != old_cal:
+        db.add(UnitHistory(
+            unit_id=unit_id,
+            change_type="calibration_status_change",
+            field_name="last_calibrated",
+            old_value=old_cal.strftime("%Y-%m-%d") if old_cal else None,
+            new_value=last_calibrated or None,
+            source="manual",
+        ))

+    # --- Note ---
+    old_note = unit.note
+    unit.note = note or None
+    if unit.note != old_note:
+        db.add(UnitHistory(
+            unit_id=unit_id,
+            change_type="note_change",
+            field_name="note",
+            old_value=old_note,
+            new_value=unit.note,
+            source="manual",
+        ))
+
+    db.commit()
+    db.refresh(unit)
+
+    return templates.TemplateResponse("partials/seismo_row_view.html", _row_context(request, unit))
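The `sort_column_map` lookup in `get_seismo_units` is a whitelist: unknown `sort` values silently fall back to `id` instead of ever reaching the query, which keeps arbitrary user input away from `order_by`. The same pattern in plain Python (sample records are illustrative):

```python
units = [
    {"id": "S3", "address": "9 Pine St"},
    {"id": "S1", "address": "2 Oak Ave"},
    {"id": "S2", "address": "5 Elm Rd"},
]

# Whitelist of user-visible sort keys -> key functions
sort_key_map = {
    "id": lambda u: u["id"],
    "location": lambda u: u["address"],
}

def sorted_units(units, sort="id", order="asc"):
    key = sort_key_map.get(sort, sort_key_map["id"])  # unknown keys fall back to id
    return sorted(units, key=key, reverse=(order == "desc"))

print([u["id"] for u in sorted_units(units, sort="location")])         # ['S1', 'S2', 'S3']
print([u["id"] for u in sorted_units(units, sort="bogus", order="desc")])  # ['S3', 'S2', 'S1']
```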
@@ -167,23 +167,7 @@ async def get_live_view(request: Request, unit_id: str, db: Session = Depends(ge
     measurement_state = state_data.get("measurement_state", "Unknown")
     is_measuring = state_data.get("is_measuring", False)
 
-    # If measuring, sync start time from FTP to database (fixes wrong timestamps)
-    if is_measuring:
-        try:
-            sync_response = await client.post(
-                f"{SLMM_BASE_URL}/api/nl43/{unit_id}/sync-start-time",
-                timeout=10.0
-            )
-            if sync_response.status_code == 200:
-                sync_data = sync_response.json()
-                logger.info(f"Synced start time for {unit_id}: {sync_data.get('message')}")
-            else:
-                logger.warning(f"Failed to sync start time for {unit_id}: {sync_response.status_code}")
-        except Exception as e:
-            # Don't fail the whole request if sync fails
-            logger.warning(f"Could not sync start time for {unit_id}: {e}")
-
-    # Get live status (now with corrected start time)
+    # Get live status (measurement_start_time is already stored in SLMM database)
     status_response = await client.get(
         f"{SLMM_BASE_URL}/api/nl43/{unit_id}/live"
     )
backend/routers/watcher_manager.py (new file, 133 lines)
@@ -0,0 +1,133 @@
"""
Watcher Manager — admin API for series3-watcher and thor-watcher agents.

Endpoints:
    GET  /api/admin/watchers                           — list all watcher agents
    GET  /api/admin/watchers/{agent_id}                — get single agent detail
    POST /api/admin/watchers/{agent_id}/trigger-update — flag agent for update
    POST /api/admin/watchers/{agent_id}/clear-update   — clear update flag
    GET  /api/admin/watchers/{agent_id}/update-check   — polled by watcher on heartbeat

Page:
    GET  /admin/watchers — HTML admin page
"""

from datetime import datetime
from typing import Optional

from fastapi import APIRouter, Depends, HTTPException, Request
from fastapi.responses import HTMLResponse
from pydantic import BaseModel
from sqlalchemy.orm import Session

from backend.database import get_db
from backend.models import WatcherAgent
from backend.templates_config import templates

router = APIRouter(tags=["admin"])


# ── helpers ──────────────────────────────────────────────────────────────────

def _agent_to_dict(agent: WatcherAgent) -> dict:
    last_seen = agent.last_seen
    if last_seen:
        now_utc = datetime.utcnow()
        age_minutes = int((now_utc - last_seen).total_seconds() // 60)
        if age_minutes > 60:
            status = "missing"
        else:
            status = "ok"
    else:
        age_minutes = None
        status = "missing"

    return {
        "id": agent.id,
        "source_type": agent.source_type,
        "version": agent.version,
        "last_seen": last_seen.isoformat() if last_seen else None,
        "age_minutes": age_minutes,
        "status": status,
        "ip_address": agent.ip_address,
        "log_tail": agent.log_tail,
        "update_pending": bool(agent.update_pending),
        "update_version": agent.update_version,
    }


# ── API routes ────────────────────────────────────────────────────────────────

@router.get("/api/admin/watchers")
def list_watchers(db: Session = Depends(get_db)):
    agents = db.query(WatcherAgent).order_by(WatcherAgent.last_seen.desc()).all()
    return [_agent_to_dict(a) for a in agents]


@router.get("/api/admin/watchers/{agent_id}")
def get_watcher(agent_id: str, db: Session = Depends(get_db)):
    agent = db.query(WatcherAgent).filter(WatcherAgent.id == agent_id).first()
    if not agent:
        raise HTTPException(status_code=404, detail="Watcher agent not found")
    return _agent_to_dict(agent)


class TriggerUpdateRequest(BaseModel):
    version: Optional[str] = None  # target version label (informational)


@router.post("/api/admin/watchers/{agent_id}/trigger-update")
def trigger_update(agent_id: str, body: TriggerUpdateRequest, db: Session = Depends(get_db)):
    agent = db.query(WatcherAgent).filter(WatcherAgent.id == agent_id).first()
    if not agent:
        raise HTTPException(status_code=404, detail="Watcher agent not found")
    agent.update_pending = True
    agent.update_version = body.version
    db.commit()
    return {"ok": True, "agent_id": agent_id, "update_pending": True}


@router.post("/api/admin/watchers/{agent_id}/clear-update")
def clear_update(agent_id: str, db: Session = Depends(get_db)):
    agent = db.query(WatcherAgent).filter(WatcherAgent.id == agent_id).first()
    if not agent:
        raise HTTPException(status_code=404, detail="Watcher agent not found")
    agent.update_pending = False
    agent.update_version = None
    db.commit()
    return {"ok": True, "agent_id": agent_id, "update_pending": False}


@router.get("/api/admin/watchers/{agent_id}/update-check")
def update_check(agent_id: str, db: Session = Depends(get_db)):
    """
    Polled by watcher agents on each heartbeat cycle.
    Returns update_available=True when an update has been triggered via the UI.
    Automatically clears the flag after the watcher acknowledges it.
    """
    agent = db.query(WatcherAgent).filter(WatcherAgent.id == agent_id).first()
    if not agent:
        return {"update_available": False}

    pending = bool(agent.update_pending)

    if pending:
        # Clear the flag — the watcher will now self-update
        agent.update_pending = False
        db.commit()

    return {
        "update_available": pending,
        "version": agent.update_version,
    }


# ── HTML page ─────────────────────────────────────────────────────────────────

@router.get("/admin/watchers", response_class=HTMLResponse)
def admin_watchers_page(request: Request, db: Session = Depends(get_db)):
    agents = db.query(WatcherAgent).order_by(WatcherAgent.last_seen.desc()).all()
    agents_data = [_agent_to_dict(a) for a in agents]
    return templates.TemplateResponse("admin_watchers.html", {
        "request": request,
        "agents": agents_data,
    })
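The `update-check` route is a one-shot handshake: the flag reads `True` exactly once, then clears itself, so a watcher that sees `update_available=True` must act on it immediately. The semantics, modelled without the database (`FakeAgent` is a stand-in for the `WatcherAgent` row):

```python
class FakeAgent:
    update_pending = False
    update_version = None

def update_check(agent):
    # Report the pending flag, then clear it so the next poll sees False
    pending = bool(agent.update_pending)
    if pending:
        agent.update_pending = False
    return {"update_available": pending, "version": agent.update_version}

agent = FakeAgent()
agent.update_pending = True
agent.update_version = "1.2.0"

print(update_check(agent))  # {'update_available': True, 'version': '1.2.0'}
print(update_check(agent))  # {'update_available': False, 'version': '1.2.0'}
```

A consequence of clearing on read: if the watcher crashes between the poll and the self-update, the trigger is lost and must be re-issued from the admin page.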
@@ -5,7 +5,7 @@ from datetime import datetime
 from typing import Optional, List
 
 from backend.database import get_db
-from backend.models import Emitter
+from backend.models import Emitter, WatcherAgent
 
 router = APIRouter()
 
@@ -107,6 +107,35 @@ def get_fleet_status(db: Session = Depends(get_db)):
     emitters = db.query(Emitter).all()
     return emitters
 
+# ── Watcher agent upsert helper ───────────────────────────────────────────────
+
+def _upsert_watcher_agent(db: Session, source_id: str, source_type: str,
+                          version: str, ip_address: str, log_tail: str,
+                          status: str) -> None:
+    """Create or update the WatcherAgent row for a given source_id."""
+    agent = db.query(WatcherAgent).filter(WatcherAgent.id == source_id).first()
+    if agent:
+        agent.source_type = source_type
+        agent.version = version
+        agent.last_seen = datetime.utcnow()
+        agent.status = status
+        if ip_address:
+            agent.ip_address = ip_address
+        if log_tail is not None:
+            agent.log_tail = log_tail
+    else:
+        agent = WatcherAgent(
+            id=source_id,
+            source_type=source_type,
+            version=version,
+            last_seen=datetime.utcnow(),
+            status=status,
+            ip_address=ip_address,
+            log_tail=log_tail,
+        )
+        db.add(agent)
+
 
 # series3v1.1 Standardized Heartbeat Schema (multi-unit)
 from fastapi import Request
 
@@ -120,6 +149,11 @@ async def series3_heartbeat(request: Request, db: Session = Depends(get_db)):
 
     source = payload.get("source_id")
     units = payload.get("units", [])
+    version = payload.get("version")
+    log_tail = payload.get("log_tail")  # list of strings or None
+    import json as _json
+    log_tail_str = _json.dumps(log_tail) if log_tail is not None else None
+    client_ip = request.client.host if request.client else None
 
     print("\n=== Series 3 Heartbeat ===")
     print("Source:", source)
@@ -182,13 +216,27 @@ async def series3_heartbeat(request: Request, db: Session = Depends(get_db)):
 
         results.append({"unit": uid, "status": status})
 
+    if source:
+        _upsert_watcher_agent(db, source, "series3_watcher", version,
+                              client_ip, log_tail_str, "ok")
+
     db.commit()
 
+    # Check if an update has been triggered for this agent
+    update_available = False
+    if source:
+        agent = db.query(WatcherAgent).filter(WatcherAgent.id == source).first()
+        if agent and agent.update_pending:
+            update_available = True
+            agent.update_pending = False
+            db.commit()
+
     return {
         "message": "Heartbeat processed",
         "source": source,
         "units_processed": len(results),
-        "results": results
+        "results": results,
+        "update_available": update_available,
     }
@@ -219,8 +267,14 @@ async def series4_heartbeat(request: Request, db: Session = Depends(get_db)):
     """
     payload = await request.json()
 
-    source = payload.get("source", "series4_emitter")
+    # Accept source_id (new standard field) with fallback to legacy "source" key
+    source = payload.get("source_id") or payload.get("source", "series4_emitter")
     units = payload.get("units", [])
+    version = payload.get("version")
+    log_tail = payload.get("log_tail")
+    import json as _json
+    log_tail_str = _json.dumps(log_tail) if log_tail is not None else None
+    client_ip = request.client.host if request.client else None
 
     print("\n=== Series 4 Heartbeat ===")
     print("Source:", source)
@@ -276,11 +330,25 @@ async def series4_heartbeat(request: Request, db: Session = Depends(get_db)):
 
         results.append({"unit": uid, "status": status})
 
+    if source:
+        _upsert_watcher_agent(db, source, "series4_watcher", version,
+                              client_ip, log_tail_str, "ok")
+
     db.commit()
 
+    # Check if an update has been triggered for this agent
+    update_available = False
+    if source:
+        agent = db.query(WatcherAgent).filter(WatcherAgent.id == source).first()
+        if agent and agent.update_pending:
+            update_available = True
+            agent.update_pending = False
+            db.commit()
+
     return {
         "message": "Heartbeat processed",
         "source": source,
         "units_processed": len(results),
-        "results": results
+        "results": results,
+        "update_available": update_available,
     }
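The `source_id`/`source` fallback in the Series 4 heartbeat reads the new standard field first and only falls back to the legacy key when `source_id` is missing or empty (`or` treats `None` and `""` the same way):

```python
def resolve_source(payload: dict) -> str:
    # New standard field first, then legacy key, then the historical default
    return payload.get("source_id") or payload.get("source", "series4_emitter")

print(resolve_source({"source_id": "watcher-07"}))  # watcher-07
print(resolve_source({"source": "legacy-unit"}))    # legacy-unit
print(resolve_source({}))                           # series4_emitter
```

One edge worth knowing: a payload that explicitly sends `"source_id": ""` falls through to the legacy key, which is usually the desired behavior for half-migrated agents.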
@@ -199,7 +199,7 @@ class AlertService:
 
         Args:
             schedule_id: The ScheduledAction or RecurringSchedule ID
-            action_type: start, stop, download
+            action_type: start, stop, download, cycle
            unit_id: Related unit
             error_message: Error from execution
             project_id: Related project
@@ -235,7 +235,7 @@ class AlertService:
 
         Args:
             schedule_id: The ScheduledAction ID
-            action_type: start, stop, download
+            action_type: start, stop, download, cycle
             unit_id: Related unit
             project_id: Related project
             location_id: Related location
@@ -289,6 +289,74 @@ class DeviceController:
         else:
             raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
 
+    # ========================================================================
+    # FTP Control
+    # ========================================================================
+
+    async def enable_ftp(
+        self,
+        unit_id: str,
+        device_type: str,
+    ) -> Dict[str, Any]:
+        """
+        Enable FTP server on device.
+
+        Must be called before downloading files.
+
+        Args:
+            unit_id: Unit identifier
+            device_type: "slm" | "seismograph"
+
+        Returns:
+            Response dict with status
+        """
+        if device_type == "slm":
+            try:
+                return await self.slmm_client.enable_ftp(unit_id)
+            except SLMMClientError as e:
+                raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+        elif device_type == "seismograph":
+            return {
+                "status": "not_implemented",
+                "message": "Seismograph FTP not yet implemented",
+                "unit_id": unit_id,
+            }
+
+        else:
+            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
+    async def disable_ftp(
+        self,
+        unit_id: str,
+        device_type: str,
+    ) -> Dict[str, Any]:
+        """
+        Disable FTP server on device.
+
+        Args:
+            unit_id: Unit identifier
+            device_type: "slm" | "seismograph"
+
+        Returns:
+            Response dict with status
+        """
+        if device_type == "slm":
+            try:
+                return await self.slmm_client.disable_ftp(unit_id)
+            except SLMMClientError as e:
+                raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+        elif device_type == "seismograph":
+            return {
+                "status": "not_implemented",
+                "message": "Seismograph FTP not yet implemented",
+                "unit_id": unit_id,
+            }
+
+        else:
+            raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
     # ========================================================================
     # Device Configuration
     # ========================================================================
725
backend/services/fleet_calendar_service.py
Normal file
@@ -0,0 +1,725 @@
|
||||
"""
|
||||
Fleet Calendar Service
|
||||
|
||||
Business logic for:
|
||||
- Calculating unit availability on any given date
|
||||
- Calibration status tracking (valid, expiring soon, expired)
|
||||
- Job reservation management
|
||||
- Conflict detection (calibration expires mid-job)
|
||||
"""
|
||||
|
||||
from datetime import date, datetime, timedelta
|
||||
from typing import Dict, List, Optional, Tuple
|
||||
from sqlalchemy.orm import Session
|
||||
from sqlalchemy import and_, or_
|
||||
|
||||
from backend.models import (
|
||||
RosterUnit, JobReservation, JobReservationUnit,
|
||||
UserPreferences, Project, DeploymentRecord
|
||||
)
|
||||
|
||||
|
||||
def get_calibration_status(
    unit: RosterUnit,
    check_date: date,
    warning_days: int = 30
) -> str:
    """
    Determine calibration status for a unit on a specific date.

    Returns:
        "valid" - Calibration is good on this date
        "expiring_soon" - Within warning_days of expiry
        "expired" - Calibration has expired
        "needs_calibration" - No calibration date set
    """
    if not unit.last_calibrated:
        return "needs_calibration"

    # Calculate expiry date (1 year from last calibration)
    expiry_date = unit.last_calibrated + timedelta(days=365)

    if check_date >= expiry_date:
        return "expired"
    elif check_date >= expiry_date - timedelta(days=warning_days):
        return "expiring_soon"
    else:
        return "valid"

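The thresholds above can be checked in isolation. This is a hypothetical standalone version of the same logic (no ORM model, just a date), mirroring the service's 365-day expiry rule:

```python
from datetime import date, timedelta

def calibration_status(last_calibrated, check_date, warning_days=30):
    # Mirrors get_calibration_status: expiry is 365 days after calibration,
    # and the expiry day itself already counts as expired
    if last_calibrated is None:
        return "needs_calibration"
    expiry = last_calibrated + timedelta(days=365)
    if check_date >= expiry:
        return "expired"
    if check_date >= expiry - timedelta(days=warning_days):
        return "expiring_soon"
    return "valid"

cal = date(2024, 1, 1)  # 2024 is a leap year, so expiry falls on 2024-12-31
print(calibration_status(cal, date(2024, 6, 1)))    # valid
print(calibration_status(cal, date(2024, 12, 15)))  # expiring_soon
print(calibration_status(cal, date(2024, 12, 31)))  # expired
```

Note the `>=` comparisons: a unit is treated as expired on the expiry date itself, not the day after.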
def get_unit_reservations_on_date(
    db: Session,
    unit_id: str,
    check_date: date
) -> List[JobReservation]:
    """Get all reservations that include this unit on the given date."""
    # Get reservation IDs that have this unit assigned
    assigned_reservation_ids = db.query(JobReservationUnit.reservation_id).filter(
        JobReservationUnit.unit_id == unit_id
    ).subquery()

    # Get reservations that have this unit assigned and whose date range
    # includes check_date
    reservations = db.query(JobReservation).filter(
        JobReservation.id.in_(assigned_reservation_ids),
        JobReservation.start_date <= check_date,
        JobReservation.end_date >= check_date
    ).all()

    return reservations

def get_active_deployment(db: Session, unit_id: str) -> Optional[DeploymentRecord]:
    """Return the active (unreturned) deployment record for a unit, or None."""
    return (
        db.query(DeploymentRecord)
        .filter(
            DeploymentRecord.unit_id == unit_id,
            DeploymentRecord.actual_removal_date == None
        )
        .order_by(DeploymentRecord.created_at.desc())
        .first()
    )

def is_unit_available_on_date(
    db: Session,
    unit: RosterUnit,
    check_date: date,
    warning_days: int = 30
) -> Tuple[bool, str, Optional[str]]:
    """
    Check if a unit is available on a specific date.

    Returns:
        (is_available, status, reservation_name)
        - is_available: True if unit can be assigned to new work
        - status: "available", "reserved", "expired", "retired",
          "needs_calibration", "in_field"
        - reservation_name: Name of blocking reservation or project ref (if any)
    """
    # Check if retired
    if unit.retired:
        return False, "retired", None

    # Check calibration status
    cal_status = get_calibration_status(unit, check_date, warning_days)
    if cal_status == "expired":
        return False, "expired", None
    if cal_status == "needs_calibration":
        return False, "needs_calibration", None

    # Check for an active deployment record (unit is physically in the field)
    active_deployment = get_active_deployment(db, unit.id)
    if active_deployment:
        label = active_deployment.project_ref or "Field deployment"
        return False, "in_field", label

    # Check if already reserved
    reservations = get_unit_reservations_on_date(db, unit.id, check_date)
    if reservations:
        return False, "reserved", reservations[0].name

    # Unit is available (expiring soon is only a warning, not a blocker)
    return True, "available", None

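The checks above run in a fixed precedence: retirement, then calibration problems, then physical deployment, then reservations, with "expiring_soon" treated as a warning rather than a blocker. A minimal sketch of that ordering, using plain dicts in place of the ORM objects (all names here are illustrative, not the service's API):

```python
def availability(unit, cal_status, in_field, reservations):
    # Same precedence as is_unit_available_on_date: retirement and
    # calibration problems block first, then being in the field,
    # then reservations; "expiring_soon" does not block at all.
    if unit.get("retired"):
        return (False, "retired", None)
    if cal_status in ("expired", "needs_calibration"):
        return (False, cal_status, None)
    if in_field:
        return (False, "in_field", unit.get("project_ref") or "Field deployment")
    if reservations:
        return (False, "reserved", reservations[0])
    return (True, "available", None)

print(availability({"retired": False}, "expiring_soon", False, []))
# -> (True, 'available', None): expiring soon still counts as available
```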
def get_day_summary(
    db: Session,
    check_date: date,
    device_type: str = "seismograph"
) -> Dict:
    """
    Get a complete summary of fleet status for a specific day.

    Returns dict with:
        - available_units: List of available unit IDs with calibration info
        - in_field_units: List of units currently deployed in the field
        - reserved_units: List of reserved unit IDs with reservation info
        - expired_units: List of units with expired calibration
        - expiring_soon_units: List of units expiring within warning period
        - needs_calibration_units: List of units with no calibration date
        - cal_expiring_today: List of units whose calibration expires on this day
        - reservations: List of active reservations on this date
        - counts: Summary counts
    """
    # Get user preferences for warning days
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    # Get all non-retired units of the specified device type
    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    available_units = []
    reserved_units = []
    expired_units = []
    expiring_soon_units = []
    needs_calibration_units = []
    in_field_units = []
    cal_expiring_today = []  # Units whose calibration expires ON this day

    for unit in units:
        is_avail, status, reservation_name = is_unit_available_on_date(
            db, unit, check_date, warning_days
        )

        cal_status = get_calibration_status(unit, check_date, warning_days)
        expiry_date = None
        if unit.last_calibrated:
            expiry_date = (unit.last_calibrated + timedelta(days=365)).isoformat()

        unit_info = {
            "id": unit.id,
            "last_calibrated": unit.last_calibrated.isoformat() if unit.last_calibrated else None,
            "expiry_date": expiry_date,
            "calibration_status": cal_status,
            "deployed": unit.deployed,
            "note": unit.note or ""
        }

        # Check if calibration expires ON this specific day
        if unit.last_calibrated:
            unit_expiry_date = unit.last_calibrated + timedelta(days=365)
            if unit_expiry_date == check_date:
                cal_expiring_today.append(unit_info)

        if status == "available":
            available_units.append(unit_info)
            if cal_status == "expiring_soon":
                expiring_soon_units.append(unit_info)
        elif status == "in_field":
            unit_info["project_ref"] = reservation_name
            in_field_units.append(unit_info)
        elif status == "reserved":
            unit_info["reservation_name"] = reservation_name
            reserved_units.append(unit_info)
            if cal_status == "expiring_soon":
                expiring_soon_units.append(unit_info)
        elif status == "expired":
            expired_units.append(unit_info)
        elif status == "needs_calibration":
            needs_calibration_units.append(unit_info)

    # Get active reservations on this date
    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= check_date,
        JobReservation.end_date >= check_date
    ).all()

    reservation_list = []
    for res in reservations:
        # Count assigned units for this reservation
        assigned_count = db.query(JobReservationUnit).filter(
            JobReservationUnit.reservation_id == res.id
        ).count()

        reservation_list.append({
            "id": res.id,
            "name": res.name,
            "start_date": res.start_date.isoformat(),
            "end_date": res.end_date.isoformat(),
            "assignment_type": res.assignment_type,
            "quantity_needed": res.quantity_needed,
            "assigned_count": assigned_count,
            "color": res.color,
            "project_id": res.project_id
        })

    return {
        "date": check_date.isoformat(),
        "device_type": device_type,
        "available_units": available_units,
        "in_field_units": in_field_units,
        "reserved_units": reserved_units,
        "expired_units": expired_units,
        "expiring_soon_units": expiring_soon_units,
        "needs_calibration_units": needs_calibration_units,
        "cal_expiring_today": cal_expiring_today,
        "reservations": reservation_list,
        "counts": {
            "available": len(available_units),
            "in_field": len(in_field_units),
            "reserved": len(reserved_units),
            "expired": len(expired_units),
            "expiring_soon": len(expiring_soon_units),
            "needs_calibration": len(needs_calibration_units),
            "cal_expiring_today": len(cal_expiring_today),
            "total": len(units)
        }
    }

def get_calendar_year_data(
    db: Session,
    year: int,
    device_type: str = "seismograph"
) -> Dict:
    """
    Get calendar data for an entire year.

    For performance, this returns summary counts per day rather than
    full unit lists. Use get_day_summary() for detailed day data.
    """
    # Get user preferences
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    # Get all units
    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    # Get all reservations that overlap with this year.
    # Include TBD reservations (end_date is null) that started before year end.
    year_start = date(year, 1, 1)
    year_end = date(year, 12, 31)

    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= year_end,
        or_(
            JobReservation.end_date >= year_start,
            JobReservation.end_date == None  # TBD reservations
        )
    ).all()

    # Get all unit assignments for these reservations
    reservation_ids = [r.id for r in reservations]
    assignments = db.query(JobReservationUnit).filter(
        JobReservationUnit.reservation_id.in_(reservation_ids)
    ).all() if reservation_ids else []

    # Build a lookup: unit_id -> list of (start_date, end_date, reservation_name).
    # For TBD reservations, use estimated_end_date if available, or a far-future date.
    unit_reservations = {}
    for res in reservations:
        res_assignments = [a for a in assignments if a.reservation_id == res.id]
        for assignment in res_assignments:
            unit_id = assignment.unit_id
            # Use unit-specific dates if set, otherwise use reservation dates
            start_d = assignment.unit_start_date or res.start_date
            if assignment.unit_end_tbd or (assignment.unit_end_date is None and res.end_date_tbd):
                # TBD: use estimated date or far future for availability calculation
                end_d = res.estimated_end_date or date(year + 5, 12, 31)
            else:
                end_d = assignment.unit_end_date or res.end_date or date(year + 5, 12, 31)

            if unit_id not in unit_reservations:
                unit_reservations[unit_id] = []
            unit_reservations[unit_id].append((start_d, end_d, res.name))

    # Build set of unit IDs that have an active deployment record (still in the field)
    unit_ids = [u.id for u in units]
    active_deployments = db.query(DeploymentRecord.unit_id).filter(
        DeploymentRecord.unit_id.in_(unit_ids),
        DeploymentRecord.actual_removal_date == None
    ).all()
    unit_in_field = {row.unit_id for row in active_deployments}

    # Generate data for each month
    months_data = {}

    for month in range(1, 13):
        # Get first and last day of month
        first_day = date(year, month, 1)
        if month == 12:
            last_day = date(year, 12, 31)
        else:
            last_day = date(year, month + 1, 1) - timedelta(days=1)

        days_data = {}
        current_day = first_day

        while current_day <= last_day:
            available = 0
            in_field = 0
            reserved = 0
            expired = 0
            expiring_soon = 0
            needs_cal = 0
            cal_expiring_on_day = 0  # Units whose calibration expires ON this day
            cal_expired_on_day = 0   # Units whose calibration expired as of this day

            for unit in units:
                # Check calibration
                cal_status = get_calibration_status(unit, current_day, warning_days)

                # Check if calibration expires/expired ON this specific day
                if unit.last_calibrated:
                    unit_expiry = unit.last_calibrated + timedelta(days=365)
                    if unit_expiry == current_day:
                        cal_expiring_on_day += 1
                    # Expired yesterday, i.e. this is the first full day of being expired
                    elif unit_expiry == current_day - timedelta(days=1):
                        cal_expired_on_day += 1

                if cal_status == "expired":
                    expired += 1
                    continue
                if cal_status == "needs_calibration":
                    needs_cal += 1
                    continue

                # Check active deployment record (in field)
                if unit.id in unit_in_field:
                    in_field += 1
                    continue

                # Check if reserved
                is_reserved = False
                if unit.id in unit_reservations:
                    for start_d, end_d, _ in unit_reservations[unit.id]:
                        if start_d <= current_day <= end_d:
                            is_reserved = True
                            break

                if is_reserved:
                    reserved += 1
                else:
                    available += 1

                if cal_status == "expiring_soon":
                    expiring_soon += 1

            days_data[current_day.day] = {
                "available": available,
                "in_field": in_field,
                "reserved": reserved,
                "expired": expired,
                "expiring_soon": expiring_soon,
                "needs_calibration": needs_cal,
                "cal_expiring_on_day": cal_expiring_on_day,
                "cal_expired_on_day": cal_expired_on_day
            }

            current_day += timedelta(days=1)

        months_data[month] = {
            "name": first_day.strftime("%B"),
            "short_name": first_day.strftime("%b"),
            "days": days_data,
            "first_weekday": first_day.weekday(),  # 0=Monday, 6=Sunday
            "num_days": last_day.day
        }

    # Also include reservation summary for the year
    reservation_list = []
    for res in reservations:
        assigned_count = len([a for a in assignments if a.reservation_id == res.id])
        reservation_list.append({
            "id": res.id,
            "name": res.name,
            "start_date": res.start_date.isoformat(),
            # TBD reservations (included above) have no end date yet
            "end_date": res.end_date.isoformat() if res.end_date else None,
            "quantity_needed": res.quantity_needed,
            "assigned_count": assigned_count,
            "color": res.color
        })

    return {
        "year": year,
        "device_type": device_type,
        "months": months_data,
        "reservations": reservation_list,
        "total_units": len(units)
    }

def get_rolling_calendar_data(
    db: Session,
    start_year: int,
    start_month: int,
    device_type: str = "seismograph"
) -> Dict:
    """
    Get calendar data for 12 months starting from a specific month/year.

    This supports the rolling calendar view where users can scroll through
    months one at a time, viewing any 12-month window.
    """
    # Get user preferences
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    # Get all units
    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    # Calculate the date range: the window ends 11 months after the start
    # month (e.g. Nov 2024 -> Oct 2025)
    first_date = date(start_year, start_month, 1)
    end_year = start_year + ((start_month + 10) // 12)
    end_month = ((start_month + 10) % 12) + 1
    if end_month == 12:
        last_date = date(end_year, 12, 31)
    else:
        last_date = date(end_year, end_month + 1, 1) - timedelta(days=1)

    # Get all reservations that overlap with this 12-month range
    reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= last_date,
        or_(
            JobReservation.end_date >= first_date,
            JobReservation.end_date == None  # TBD reservations
        )
    ).all()

    # Get all unit assignments for these reservations
    reservation_ids = [r.id for r in reservations]
    assignments = db.query(JobReservationUnit).filter(
        JobReservationUnit.reservation_id.in_(reservation_ids)
    ).all() if reservation_ids else []

    # Build a lookup: unit_id -> list of (start_date, end_date, reservation_name)
    unit_reservations = {}
    for res in reservations:
        res_assignments = [a for a in assignments if a.reservation_id == res.id]
        for assignment in res_assignments:
            unit_id = assignment.unit_id
            start_d = assignment.unit_start_date or res.start_date
            if assignment.unit_end_tbd or (assignment.unit_end_date is None and res.end_date_tbd):
                end_d = res.estimated_end_date or date(start_year + 5, 12, 31)
            else:
                end_d = assignment.unit_end_date or res.end_date or date(start_year + 5, 12, 31)

            if unit_id not in unit_reservations:
                unit_reservations[unit_id] = []
            unit_reservations[unit_id].append((start_d, end_d, res.name))

    # Build set of unit IDs that have an active deployment record (still in the field)
    unit_ids = [u.id for u in units]
    active_deployments = db.query(DeploymentRecord.unit_id).filter(
        DeploymentRecord.unit_id.in_(unit_ids),
        DeploymentRecord.actual_removal_date == None
    ).all()
    unit_in_field = {row.unit_id for row in active_deployments}

    # Generate data for each of the 12 months
    months_data = []

    for i in range(12):
        # Calculate this month's year and month
        m_year = start_year + ((start_month - 1 + i) // 12)
        m_month = ((start_month - 1 + i) % 12) + 1

        first_day = date(m_year, m_month, 1)
        if m_month == 12:
            last_day = date(m_year, 12, 31)
        else:
            last_day = date(m_year, m_month + 1, 1) - timedelta(days=1)

        days_data = {}
        current_day = first_day

        while current_day <= last_day:
            available = 0
            reserved = 0
            expired = 0
            expiring_soon = 0
            needs_cal = 0
            cal_expiring_on_day = 0
            cal_expired_on_day = 0

            for unit in units:
                cal_status = get_calibration_status(unit, current_day, warning_days)

                if unit.last_calibrated:
                    unit_expiry = unit.last_calibrated + timedelta(days=365)
                    if unit_expiry == current_day:
                        cal_expiring_on_day += 1
                    elif unit_expiry == current_day - timedelta(days=1):
                        cal_expired_on_day += 1

                if cal_status == "expired":
                    expired += 1
                    continue
                if cal_status == "needs_calibration":
                    needs_cal += 1
                    continue

                is_reserved = False
                if unit.id in unit_reservations:
                    for start_d, end_d, _ in unit_reservations[unit.id]:
                        if start_d <= current_day <= end_d:
                            is_reserved = True
                            break

                if is_reserved:
                    reserved += 1
                else:
                    available += 1

                if cal_status == "expiring_soon":
                    expiring_soon += 1

            days_data[current_day.day] = {
                "available": available,
                "reserved": reserved,
                "expired": expired,
                "expiring_soon": expiring_soon,
                "needs_calibration": needs_cal,
                "cal_expiring_on_day": cal_expiring_on_day,
                "cal_expired_on_day": cal_expired_on_day
            }

            current_day += timedelta(days=1)

        months_data.append({
            "year": m_year,
            "month": m_month,
            "name": first_day.strftime("%B"),
            "short_name": first_day.strftime("%b"),
            "year_short": first_day.strftime("%y"),
            "days": days_data,
            "first_weekday": first_day.weekday(),
            "num_days": last_day.day
        })

    return {
        "start_year": start_year,
        "start_month": start_month,
        "device_type": device_type,
        "months": months_data,
        "total_units": len(units)
    }

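The month arithmetic in the loop above, `(start_month - 1 + i)` as a zero-based month offset split back into `(year, month)`, can be verified in isolation:

```python
def window(start_year, start_month):
    # Zero-based month offset, then split back into (year, month);
    # i = 0..11 yields a full 12-month span ending 11 months after start
    return [(start_year + (start_month - 1 + i) // 12,
             (start_month - 1 + i) % 12 + 1)
            for i in range(12)]

print(window(2024, 11)[:3])  # [(2024, 11), (2024, 12), (2025, 1)]
print(window(2024, 11)[-1])  # (2025, 10)
```

A January start stays within one calendar year, which is why the code's `end_month` formula also reduces to December of the same year for `start_month == 1`.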
def check_calibration_conflicts(
    db: Session,
    reservation_id: str
) -> List[Dict]:
    """
    Check if any units assigned to a reservation will have their
    calibration expire during the reservation period.

    Returns list of conflicts with unit info and expiry date.
    """
    reservation = db.query(JobReservation).filter_by(id=reservation_id).first()
    if not reservation or not reservation.end_date:
        # No such reservation, or a TBD reservation with no end date to check
        return []

    # Get assigned units
    assigned = db.query(JobReservationUnit).filter_by(
        reservation_id=reservation_id
    ).all()

    conflicts = []
    for assignment in assigned:
        unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
        if not unit or not unit.last_calibrated:
            continue

        expiry_date = unit.last_calibrated + timedelta(days=365)

        # Check if expiry falls within the reservation period
        # (exclusive of the start date, inclusive of the end date)
        if reservation.start_date < expiry_date <= reservation.end_date:
            conflicts.append({
                "unit_id": unit.id,
                "last_calibrated": unit.last_calibrated.isoformat(),
                "expiry_date": expiry_date.isoformat(),
                "reservation_name": reservation.name,
                "days_into_job": (expiry_date - reservation.start_date).days
            })

    return conflicts

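The half-open comparison `start_date < expiry_date <= end_date` deliberately excludes units expiring on the job's first day (calibration treats the expiry day itself as already expired, so those units never pass the availability check) while flagging expiry on any later day of the job, including the last. A small standalone check of that window:

```python
from datetime import date

def expires_mid_job(start, end, expiry):
    # Half-open interval: expiry on the start day is not a mid-job
    # conflict, expiry on the end day is
    return start < expiry <= end

job_start, job_end = date(2025, 3, 1), date(2025, 3, 14)
print(expires_mid_job(job_start, job_end, date(2025, 3, 10)))  # True
print(expires_mid_job(job_start, job_end, date(2025, 3, 1)))   # False
```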
def get_available_units_for_period(
    db: Session,
    start_date: date,
    end_date: date,
    device_type: str = "seismograph",
    exclude_reservation_id: Optional[str] = None
) -> List[Dict]:
    """
    Get units that are free for the entire specified period.

    A unit is included if it is:
    - Not retired
    - Not assigned to any other reservation that overlaps the period
    - Not currently deployed in the field

    Each entry carries a calibration_status (evaluated at end_date) so
    callers can warn about or filter out expiring/expired units.
    """
    prefs = db.query(UserPreferences).filter_by(id=1).first()
    warning_days = prefs.calibration_warning_days if prefs else 30

    units = db.query(RosterUnit).filter(
        RosterUnit.device_type == device_type,
        RosterUnit.retired == False
    ).all()

    # Get reservations that overlap with this period
    overlapping_reservations = db.query(JobReservation).filter(
        JobReservation.device_type == device_type,
        JobReservation.start_date <= end_date,
        JobReservation.end_date >= start_date
    )

    if exclude_reservation_id:
        overlapping_reservations = overlapping_reservations.filter(
            JobReservation.id != exclude_reservation_id
        )

    overlapping_reservations = overlapping_reservations.all()

    # Get all units assigned to overlapping reservations
    reserved_unit_ids = set()
    for res in overlapping_reservations:
        assigned = db.query(JobReservationUnit).filter_by(
            reservation_id=res.id
        ).all()
        for a in assigned:
            reserved_unit_ids.add(a.unit_id)

    # Get units with active deployment records (still in the field)
    unit_ids = [u.id for u in units]
    active_deps = db.query(DeploymentRecord.unit_id).filter(
        DeploymentRecord.unit_id.in_(unit_ids),
        DeploymentRecord.actual_removal_date == None
    ).all()
    in_field_unit_ids = {row.unit_id for row in active_deps}

    available_units = []
    for unit in units:
        # Skip units that are already reserved or currently in the field
        if unit.id in reserved_unit_ids:
            continue
        if unit.id in in_field_unit_ids:
            continue

        if unit.last_calibrated:
            expiry_date = unit.last_calibrated + timedelta(days=365)
            cal_status = get_calibration_status(unit, end_date, warning_days)
        else:
            expiry_date = None
            cal_status = "needs_calibration"

        available_units.append({
            "id": unit.id,
            "last_calibrated": unit.last_calibrated.isoformat() if unit.last_calibrated else None,
            "expiry_date": expiry_date.isoformat() if expiry_date else None,
            "calibration_status": cal_status,
            "deployed": unit.deployed,
            "out_for_calibration": unit.out_for_calibration or False,
            "note": unit.note or ""
        })

    return available_units

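The reservation query above uses the standard closed-interval overlap test (`A.start <= B.end and A.end >= B.start`), which also catches ranges that merely share an endpoint. A minimal illustration:

```python
from datetime import date

def overlaps(a_start, a_end, b_start, b_end):
    # Two closed date ranges overlap iff each starts on or before the
    # other's end
    return a_start <= b_end and a_end >= b_start

print(overlaps(date(2025, 1, 1), date(2025, 1, 10),
               date(2025, 1, 10), date(2025, 1, 20)))  # True
print(overlaps(date(2025, 1, 1), date(2025, 1, 9),
               date(2025, 1, 10), date(2025, 1, 20)))  # False
```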
@@ -15,7 +15,7 @@ from zoneinfo import ZoneInfo
 from sqlalchemy.orm import Session
 from sqlalchemy import and_

-from backend.models import RecurringSchedule, ScheduledAction, MonitoringLocation, UnitAssignment
+from backend.models import RecurringSchedule, ScheduledAction, MonitoringLocation, UnitAssignment, Project

 logger = logging.getLogger(__name__)

@@ -49,6 +49,8 @@ class RecurringScheduleService:
         include_download: bool = True,
         auto_increment_index: bool = True,
         timezone: str = "America/New_York",
+        start_datetime: datetime = None,
+        end_datetime: datetime = None,
     ) -> RecurringSchedule:
         """
         Create a new recurring schedule.

@@ -57,7 +59,7 @@ class RecurringScheduleService:
             project_id: Project ID
             location_id: Monitoring location ID
             name: Schedule name
-            schedule_type: "weekly_calendar" or "simple_interval"
+            schedule_type: "weekly_calendar", "simple_interval", or "one_off"
             device_type: "slm" or "seismograph"
             unit_id: Specific unit (optional, can use assignment)
             weekly_pattern: Dict of day patterns for weekly_calendar

@@ -66,6 +68,8 @@ class RecurringScheduleService:
             include_download: Whether to download data on cycle
             auto_increment_index: Whether to auto-increment store index before start
             timezone: Timezone for schedule times
+            start_datetime: Start date+time in UTC (one_off only)
+            end_datetime: End date+time in UTC (one_off only)

         Returns:
             Created RecurringSchedule

@@ -85,6 +89,8 @@ class RecurringScheduleService:
             auto_increment_index=auto_increment_index,
             enabled=True,
             timezone=timezone,
+            start_datetime=start_datetime,
+            end_datetime=end_datetime,
         )

         # Calculate next occurrence

@@ -169,8 +175,25 @@ class RecurringScheduleService:
         return self.update_schedule(schedule_id, enabled=True)

     def disable_schedule(self, schedule_id: str) -> Optional[RecurringSchedule]:
-        """Disable a schedule."""
-        return self.update_schedule(schedule_id, enabled=False)
+        """Disable a schedule and cancel its pending actions."""
+        schedule = self.update_schedule(schedule_id, enabled=False)
+        if schedule:
+            # Cancel all pending actions generated by this schedule
+            pending_actions = self.db.query(ScheduledAction).filter(
+                and_(
+                    ScheduledAction.execution_status == "pending",
+                    ScheduledAction.notes.like(f'%"schedule_id": "{schedule_id}"%'),
+                )
+            ).all()
+
+            for action in pending_actions:
+                action.execution_status = "cancelled"
+
+            if pending_actions:
+                self.db.commit()
+                logger.info(f"Cancelled {len(pending_actions)} pending actions for disabled schedule {schedule.name}")
+
+        return schedule

     def generate_actions_for_schedule(
         self,

@@ -196,6 +219,8 @@ class RecurringScheduleService:
             actions = self._generate_weekly_calendar_actions(schedule, horizon_days)
         elif schedule.schedule_type == "simple_interval":
             actions = self._generate_interval_actions(schedule, horizon_days)
+        elif schedule.schedule_type == "one_off":
+            actions = self._generate_one_off_actions(schedule)
         else:
             logger.warning(f"Unknown schedule type: {schedule.schedule_type}")
             return []

@@ -307,10 +332,12 @@ class RecurringScheduleService:
                 )
                 actions.append(start_action)

-                # Create STOP action
+                # Create STOP action (stop_cycle handles download when include_download is True)
                 stop_notes = json.dumps({
                     "schedule_name": schedule.name,
                     "schedule_id": schedule.id,
                     "schedule_type": "weekly_calendar",
+                    "include_download": schedule.include_download,
                 })
                 stop_action = ScheduledAction(
                     id=str(uuid.uuid4()),

@@ -325,27 +352,6 @@ class RecurringScheduleService:
                 )
                 actions.append(stop_action)

-                # Create DOWNLOAD action if enabled (1 minute after stop)
-                if schedule.include_download:
-                    download_time = end_utc + timedelta(minutes=1)
-                    download_notes = json.dumps({
-                        "schedule_name": schedule.name,
-                        "schedule_id": schedule.id,
-                        "schedule_type": "weekly_calendar",
-                    })
-                    download_action = ScheduledAction(
-                        id=str(uuid.uuid4()),
-                        project_id=schedule.project_id,
-                        location_id=schedule.location_id,
-                        unit_id=unit_id,
-                        action_type="download",
-                        device_type=schedule.device_type,
-                        scheduled_time=download_time,
-                        execution_status="pending",
-                        notes=download_notes,
-                    )
-                    actions.append(download_action)
-
         return actions

     def _generate_interval_actions(

@@ -384,61 +390,71 @@ class RecurringScheduleService:
|
||||
if cycle_utc <= now_utc:
|
||||
continue
|
||||
|
||||
# Check if action already exists
|
||||
if self._action_exists(schedule.project_id, schedule.location_id, "stop", cycle_utc):
|
||||
# Check if cycle action already exists
|
||||
if self._action_exists(schedule.project_id, schedule.location_id, "cycle", cycle_utc):
|
||||
continue
|
||||
|
||||
# Build notes with metadata
|
||||
stop_notes = json.dumps({
|
||||
# Build notes with metadata for cycle action
|
||||
cycle_notes = json.dumps({
|
||||
"schedule_name": schedule.name,
|
||||
"schedule_id": schedule.id,
|
||||
"cycle_type": "daily",
|
||||
"include_download": schedule.include_download,
|
||||
"auto_increment_index": schedule.auto_increment_index,
|
||||
})
|
||||
|
||||
# Create STOP action
|
||||
stop_action = ScheduledAction(
|
||||
# Create single CYCLE action that handles stop -> download -> start
|
||||
# The scheduler's _execute_cycle method handles the full workflow with delays
|
||||
cycle_action = ScheduledAction(
|
||||
id=str(uuid.uuid4()),
|
||||
project_id=schedule.project_id,
|
||||
location_id=schedule.location_id,
|
||||
unit_id=unit_id,
|
||||
action_type="stop",
|
||||
action_type="cycle",
|
||||
device_type=schedule.device_type,
|
||||
scheduled_time=cycle_utc,
|
||||
execution_status="pending",
|
||||
notes=stop_notes,
|
||||
notes=cycle_notes,
|
||||
)
|
||||
actions.append(stop_action)
|
||||
actions.append(cycle_action)
|
||||
|
||||
# Create DOWNLOAD action if enabled (1 minute after stop)
|
||||
if schedule.include_download:
|
||||
download_time = cycle_utc + timedelta(minutes=1)
|
||||
download_notes = json.dumps({
|
||||
"schedule_name": schedule.name,
|
||||
"schedule_id": schedule.id,
|
||||
"cycle_type": "daily",
|
||||
})
|
||||
download_action = ScheduledAction(
|
||||
id=str(uuid.uuid4()),
|
||||
project_id=schedule.project_id,
|
||||
location_id=schedule.location_id,
|
||||
unit_id=unit_id,
|
||||
action_type="download",
|
||||
device_type=schedule.device_type,
|
||||
scheduled_time=download_time,
|
||||
execution_status="pending",
|
||||
notes=download_notes,
|
||||
)
|
||||
actions.append(download_action)
|
||||
return actions
|
||||
|
||||
# Create START action (2 minutes after stop, or 1 minute after download)
|
||||
start_offset = 2 if schedule.include_download else 1
|
||||
start_time = cycle_utc + timedelta(minutes=start_offset)
|
||||
    def _generate_one_off_actions(
        self,
        schedule: RecurringSchedule,
    ) -> List[ScheduledAction]:
        """
        Generate start and stop actions for a one-off recording.

        Unlike recurring types, this generates exactly one start and one stop action
        using the schedule's start_datetime and end_datetime directly.
        """
        if not schedule.start_datetime or not schedule.end_datetime:
            logger.warning(f"One-off schedule {schedule.id} missing start/end datetime")
            return []

        actions = []
        now_utc = datetime.utcnow()
        unit_id = self._resolve_unit_id(schedule)

        # Skip if end time has already passed
        if schedule.end_datetime <= now_utc:
            return []

        # Check if actions already exist for this schedule
        if self._action_exists(schedule.project_id, schedule.location_id, "start", schedule.start_datetime):
            return []

        # Create START action (only if start time hasn't passed)
        if schedule.start_datetime > now_utc:
            start_notes = json.dumps({
                "schedule_name": schedule.name,
                "schedule_id": schedule.id,
                "cycle_type": "daily",
                "schedule_type": "one_off",
                "auto_increment_index": schedule.auto_increment_index,
            })

            start_action = ScheduledAction(
                id=str(uuid.uuid4()),
                project_id=schedule.project_id,
@@ -446,12 +462,33 @@ class RecurringScheduleService:
                unit_id=unit_id,
                action_type="start",
                device_type=schedule.device_type,
                scheduled_time=start_time,
                scheduled_time=schedule.start_datetime,
                execution_status="pending",
                notes=start_notes,
            )
            actions.append(start_action)

        # Create STOP action
        stop_notes = json.dumps({
            "schedule_name": schedule.name,
            "schedule_id": schedule.id,
            "schedule_type": "one_off",
            "include_download": schedule.include_download,
        })

        stop_action = ScheduledAction(
            id=str(uuid.uuid4()),
            project_id=schedule.project_id,
            location_id=schedule.location_id,
            unit_id=unit_id,
            action_type="stop",
            device_type=schedule.device_type,
            scheduled_time=schedule.end_datetime,
            execution_status="pending",
            notes=stop_notes,
        )
        actions.append(stop_action)

        return actions

    def _calculate_next_occurrence(self, schedule: RecurringSchedule) -> Optional[datetime]:
@@ -494,6 +531,13 @@ class RecurringScheduleService:
        if cycle_utc > now_utc:
            return cycle_utc

        elif schedule.schedule_type == "one_off":
            if schedule.start_datetime and schedule.start_datetime > now_utc:
                return schedule.start_datetime
            elif schedule.end_datetime and schedule.end_datetime > now_utc:
                return schedule.end_datetime
            return None

        return None

    def _resolve_unit_id(self, schedule: RecurringSchedule) -> Optional[str]:
@@ -550,8 +594,16 @@ class RecurringScheduleService:
        return self.db.query(RecurringSchedule).filter_by(project_id=project_id).all()

    def get_enabled_schedules(self) -> List[RecurringSchedule]:
        """Get all enabled recurring schedules."""
        return self.db.query(RecurringSchedule).filter_by(enabled=True).all()
        """Get all enabled recurring schedules for projects that are not on hold or deleted."""
        active_project_ids = [
            p.id for p in self.db.query(Project.id).filter(
                Project.status.notin_(["on_hold", "archived", "deleted"])
            ).all()
        ]
        return self.db.query(RecurringSchedule).filter(
            RecurringSchedule.enabled == True,
            RecurringSchedule.project_id.in_(active_project_ids),
        ).all()


def get_recurring_schedule_service(db: Session) -> RecurringScheduleService:
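The `get_enabled_schedules` change above filters schedules through a project-status allowlist in two steps: collect the ids of non-excluded projects, then keep only schedules whose project is in that set. A minimal in-memory sketch of the same pattern, using plain dicts instead of the SQLAlchemy models (the dict shapes are illustrative, not the real table columns):

```python
# Hypothetical stand-ins for the Project / RecurringSchedule tables.
projects = [
    {"id": "p1", "status": "active"},
    {"id": "p2", "status": "on_hold"},
    {"id": "p3", "status": "archived"},
]
schedules = [
    {"id": "s1", "project_id": "p1", "enabled": True},
    {"id": "s2", "project_id": "p2", "enabled": True},
    {"id": "s3", "project_id": "p1", "enabled": False},
]

EXCLUDED_STATUSES = {"on_hold", "archived", "deleted"}

def get_enabled_schedules(projects, schedules):
    # Step 1: ids of projects not excluded (mirrors Project.status.notin_(...))
    active_project_ids = {p["id"] for p in projects if p["status"] not in EXCLUDED_STATUSES}
    # Step 2: enabled schedules whose project is active (mirrors .in_(active_project_ids))
    return [s for s in schedules if s["enabled"] and s["project_id"] in active_project_ids]

print([s["id"] for s in get_enabled_schedules(projects, schedules)])  # ['s1']
```

The same two-step shape reappears below in the scheduler's pending-action query.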
@@ -21,7 +21,7 @@ from sqlalchemy.orm import Session
from sqlalchemy import and_

from backend.database import SessionLocal
from backend.models import ScheduledAction, RecordingSession, MonitoringLocation, Project, RecurringSchedule
from backend.models import ScheduledAction, MonitoringSession, MonitoringLocation, Project, RecurringSchedule
from backend.services.device_controller import get_device_controller, DeviceControllerError
from backend.services.alert_service import get_alert_service
import uuid
@@ -107,10 +107,19 @@ class SchedulerService:
        try:
            # Find pending actions that are due
            now = datetime.utcnow()

            # Only execute actions for active/completed projects (not on_hold, archived, or deleted)
            active_project_ids = [
                p.id for p in db.query(Project.id).filter(
                    Project.status.notin_(["on_hold", "archived", "deleted"])
                ).all()
            ]

            pending_actions = db.query(ScheduledAction).filter(
                and_(
                    ScheduledAction.execution_status == "pending",
                    ScheduledAction.scheduled_time <= now,
                    ScheduledAction.project_id.in_(active_project_ids),
                )
            ).order_by(ScheduledAction.scheduled_time).all()

@@ -185,6 +194,8 @@ class SchedulerService:
            response = await self._execute_stop(action, unit_id, db)
        elif action.action_type == "download":
            response = await self._execute_download(action, unit_id, db)
        elif action.action_type == "cycle":
            response = await self._execute_cycle(action, unit_id, db)
        else:
            raise Exception(f"Unknown action type: {action.action_type}")
@@ -261,7 +272,7 @@ class SchedulerService:
        )

        # Create recording session
        session = RecordingSession(
        session = MonitoringSession(
            id=str(uuid.uuid4()),
            project_id=action.project_id,
            location_id=action.location_id,
@@ -293,9 +304,20 @@ class SchedulerService:
        stop_cycle handles:
        1. Stop measurement
        2. Enable FTP
        3. Download measurement folder
        4. Verify download
        3. Download measurement folder to SLMM local storage

        After stop_cycle, if download succeeded, this method fetches the ZIP
        from SLMM and extracts it into Terra-View's project directory, creating
        DataFile records for each file.
        """
        import hashlib
        import io
        import os
        import zipfile
        import httpx
        from pathlib import Path
        from backend.models import DataFile

        # Parse notes for download preference
        include_download = True
        try:
@@ -306,7 +328,7 @@ class SchedulerService:
            pass  # Notes is plain text, not JSON

        # Execute the full stop cycle via device controller
        # SLMM handles stop, FTP enable, and download
        # SLMM handles stop, FTP enable, and download to SLMM-local storage
        cycle_response = await self.device_controller.stop_cycle(
            unit_id,
            action.device_type,
@@ -314,11 +336,11 @@ class SchedulerService:
        )

        # Find and update the active recording session
        active_session = db.query(RecordingSession).filter(
        active_session = db.query(MonitoringSession).filter(
            and_(
                RecordingSession.location_id == action.location_id,
                RecordingSession.unit_id == unit_id,
                RecordingSession.status == "recording",
                MonitoringSession.location_id == action.location_id,
                MonitoringSession.unit_id == unit_id,
                MonitoringSession.status == "recording",
            )
        ).first()

@@ -338,10 +360,81 @@ class SchedulerService:
        except json.JSONDecodeError:
            pass

        db.commit()

        # If SLMM downloaded the folder successfully, fetch the ZIP from SLMM
        # and extract it into Terra-View's project directory, creating DataFile records
        files_created = 0
        if include_download and cycle_response.get("download_success") and active_session:
            folder_name = cycle_response.get("downloaded_folder")  # e.g. "Auto_0058"
            remote_path = f"/NL-43/{folder_name}"

            try:
                SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
                async with httpx.AsyncClient(timeout=600.0) as client:
                    zip_response = await client.post(
                        f"{SLMM_BASE_URL}/api/nl43/{unit_id}/ftp/download-folder",
                        json={"remote_path": remote_path}
                    )

                if zip_response.is_success and len(zip_response.content) > 22:
                    base_dir = Path(f"data/Projects/{action.project_id}/{active_session.id}/{folder_name}")
                    base_dir.mkdir(parents=True, exist_ok=True)

                    file_type_map = {
                        '.wav': 'audio', '.mp3': 'audio',
                        '.csv': 'data', '.txt': 'data', '.json': 'data', '.dat': 'data',
                        '.rnd': 'data', '.rnh': 'data',
                        '.log': 'log',
                        '.zip': 'archive',
                        '.jpg': 'image', '.jpeg': 'image', '.png': 'image',
                        '.pdf': 'document',
                    }

                    with zipfile.ZipFile(io.BytesIO(zip_response.content)) as zf:
                        for zip_info in zf.filelist:
                            if zip_info.is_dir():
                                continue
                            file_data = zf.read(zip_info.filename)
                            file_path = base_dir / zip_info.filename
                            file_path.parent.mkdir(parents=True, exist_ok=True)
                            with open(file_path, 'wb') as f:
                                f.write(file_data)
                            checksum = hashlib.sha256(file_data).hexdigest()
                            ext = os.path.splitext(zip_info.filename)[1].lower()
                            data_file = DataFile(
                                id=str(uuid.uuid4()),
                                session_id=active_session.id,
                                file_path=str(file_path.relative_to("data")),
                                file_type=file_type_map.get(ext, 'data'),
                                file_size_bytes=len(file_data),
                                downloaded_at=datetime.utcnow(),
                                checksum=checksum,
                                file_metadata=json.dumps({
                                    "source": "stop_cycle",
                                    "remote_path": remote_path,
                                    "unit_id": unit_id,
                                    "folder_name": folder_name,
                                    "relative_path": zip_info.filename,
                                }),
                            )
                            db.add(data_file)
                            files_created += 1

                    db.commit()
                    logger.info(f"Created {files_created} DataFile records for session {active_session.id} from {folder_name}")
                else:
                    logger.warning(f"ZIP from SLMM for {folder_name} was empty or failed, skipping DataFile creation")

            except Exception as e:
                logger.error(f"Failed to extract ZIP and create DataFile records for {folder_name}: {e}")
                # Don't fail the stop action — the device was stopped successfully

        return {
            "status": "stopped",
            "session_id": active_session.id if active_session else None,
            "cycle_response": cycle_response,
            "files_created": files_created,
        }

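The extraction loop above walks a ZIP's entries, skips directory entries, and fingerprints each file with SHA-256 before writing a DataFile record. A self-contained sketch of that loop against an in-memory archive (the `> 22` guard in the handler works because an empty ZIP archive is exactly 22 bytes: just the end-of-central-directory record):

```python
import hashlib
import io
import zipfile

# The empty-archive guard threshold: an empty ZIP is exactly 22 bytes.
empty = io.BytesIO()
zipfile.ZipFile(empty, "w").close()
assert len(empty.getvalue()) == 22

# Build a tiny archive standing in for the downloaded measurement folder.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("Auto_0058/leq.csv", "t,leq\n0,54.2\n")
    zf.writestr("Auto_0058/sub/", "")  # directory entry: must be skipped

records = []
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    for zip_info in zf.filelist:
        if zip_info.is_dir():
            continue
        file_data = zf.read(zip_info.filename)
        records.append({
            "relative_path": zip_info.filename,
            "size_bytes": len(file_data),
            "checksum": hashlib.sha256(file_data).hexdigest(),
        })

print([r["relative_path"] for r in records])  # ['Auto_0058/leq.csv']
```

The dict entries here mirror a few DataFile columns; the real handler also writes each file to disk and maps extensions to a `file_type`.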
    async def _execute_download(
@@ -350,7 +443,14 @@ class SchedulerService:
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a 'download' action."""
        """Execute a 'download' action.

        This handles standalone download actions (not part of stop_cycle).
        The workflow is:
        1. Enable FTP on device
        2. Download current measurement folder
        3. (Optionally disable FTP - left enabled for now)
        """
        # Get project and location info for file path
        location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
        project = db.query(Project).filter_by(id=action.project_id).first()
@@ -358,8 +458,8 @@ class SchedulerService:
        if not location or not project:
            raise Exception("Project or location not found")

        # Build destination path
        # Example: data/Projects/{project-id}/sound/{location-name}/session-{timestamp}/
        # Build destination path (for logging/metadata reference)
        # Actual download location is managed by SLMM (data/downloads/{unit_id}/)
        session_timestamp = datetime.utcnow().strftime("%Y-%m-%d-%H%M")
        location_type_dir = "sound" if action.device_type == "slm" else "vibration"

@@ -368,12 +468,23 @@ class SchedulerService:
            f"{location.name}/session-{session_timestamp}/"
        )

        # Download files via device controller
        # Step 1: Disable FTP first to reset any stale connection state
        # Then enable FTP on device
        logger.info(f"Resetting FTP on {unit_id} for download (disable then enable)")
        try:
            await self.device_controller.disable_ftp(unit_id, action.device_type)
        except Exception as e:
            logger.warning(f"FTP disable failed (may already be off): {e}")
        await self.device_controller.enable_ftp(unit_id, action.device_type)

        # Step 2: Download current measurement folder
        # The slmm_client.download_files() now automatically determines the correct
        # folder based on the device's current index number
        response = await self.device_controller.download_files(
            unit_id,
            action.device_type,
            destination_path,
            files=None,  # Download all files
            files=None,  # Download all files in current measurement folder
        )

        # TODO: Create DataFile records for downloaded files
@@ -384,6 +495,200 @@ class SchedulerService:
            "device_response": response,
        }

    async def _execute_cycle(
        self,
        action: ScheduledAction,
        unit_id: str,
        db: Session,
    ) -> Dict[str, Any]:
        """Execute a full 'cycle' action: stop -> download -> start.

        This combines stop, download, and start into a single action with
        appropriate delays between steps to ensure device stability.

        Workflow:
        0. Pause background polling to prevent command conflicts
        1. Stop measurement (wait 10s)
        2. Disable FTP to reset state (wait 10s)
        3. Enable FTP (wait 10s)
        4. Download current measurement folder
        5. Wait 30s for device to settle
        6. Start new measurement cycle
        7. Re-enable background polling

        Total time: ~70-90 seconds depending on download size
        """
        logger.info(f"[CYCLE] === Starting full cycle for {unit_id} ===")

        result = {
            "status": "cycle_complete",
            "steps": {},
            "old_session_id": None,
            "new_session_id": None,
            "polling_paused": False,
        }

        # Step 0: Pause background polling for this device to prevent command conflicts
        # NL-43 devices only support one TCP connection at a time
        logger.info(f"[CYCLE] Step 0: Pausing background polling for {unit_id}")
        polling_was_enabled = False
        try:
            if action.device_type == "slm":
                # Get current polling state to restore later
                from backend.services.slmm_client import get_slmm_client
                slmm = get_slmm_client()
                try:
                    polling_config = await slmm.get_device_polling_config(unit_id)
                    polling_was_enabled = polling_config.get("poll_enabled", False)
                except Exception:
                    polling_was_enabled = True  # Assume enabled if can't check

                # Disable polling during cycle
                await slmm.update_device_polling_config(unit_id, poll_enabled=False)
                result["polling_paused"] = True
                logger.info(f"[CYCLE] Background polling paused for {unit_id}")
        except Exception as e:
            logger.warning(f"[CYCLE] Failed to pause polling (continuing anyway): {e}")

        try:
            # Step 1: Stop measurement
            logger.info(f"[CYCLE] Step 1/7: Stopping measurement on {unit_id}")
            try:
                stop_response = await self.device_controller.stop_recording(unit_id, action.device_type)
                result["steps"]["stop"] = {"success": True, "response": stop_response}
                logger.info(f"[CYCLE] Measurement stopped, waiting 10s...")
            except Exception as e:
                logger.warning(f"[CYCLE] Stop failed (may already be stopped): {e}")
                result["steps"]["stop"] = {"success": False, "error": str(e)}

            await asyncio.sleep(10)

            # Step 2: Disable FTP to reset any stale state
            logger.info(f"[CYCLE] Step 2/7: Disabling FTP on {unit_id}")
            try:
                await self.device_controller.disable_ftp(unit_id, action.device_type)
                result["steps"]["ftp_disable"] = {"success": True}
                logger.info(f"[CYCLE] FTP disabled, waiting 10s...")
            except Exception as e:
                logger.warning(f"[CYCLE] FTP disable failed (may already be off): {e}")
                result["steps"]["ftp_disable"] = {"success": False, "error": str(e)}

            await asyncio.sleep(10)

            # Step 3: Enable FTP
            logger.info(f"[CYCLE] Step 3/7: Enabling FTP on {unit_id}")
            try:
                await self.device_controller.enable_ftp(unit_id, action.device_type)
                result["steps"]["ftp_enable"] = {"success": True}
                logger.info(f"[CYCLE] FTP enabled, waiting 10s...")
            except Exception as e:
                logger.error(f"[CYCLE] FTP enable failed: {e}")
                result["steps"]["ftp_enable"] = {"success": False, "error": str(e)}
                # Continue anyway - download will fail but we can still try to start

            await asyncio.sleep(10)

            # Step 4: Download current measurement folder
            logger.info(f"[CYCLE] Step 4/7: Downloading measurement data from {unit_id}")
            location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
            project = db.query(Project).filter_by(id=action.project_id).first()

            if location and project:
                session_timestamp = datetime.utcnow().strftime("%Y-%m-%d-%H%M")
                location_type_dir = "sound" if action.device_type == "slm" else "vibration"
                destination_path = (
                    f"data/Projects/{project.id}/{location_type_dir}/"
                    f"{location.name}/session-{session_timestamp}/"
                )

                try:
                    download_response = await self.device_controller.download_files(
                        unit_id,
                        action.device_type,
                        destination_path,
                        files=None,
                    )
                    result["steps"]["download"] = {"success": True, "response": download_response}
                    logger.info(f"[CYCLE] Download complete")
                except Exception as e:
                    logger.error(f"[CYCLE] Download failed: {e}")
                    result["steps"]["download"] = {"success": False, "error": str(e)}
            else:
                result["steps"]["download"] = {"success": False, "error": "Project or location not found"}

            # Close out the old recording session
            active_session = db.query(MonitoringSession).filter(
                and_(
                    MonitoringSession.location_id == action.location_id,
                    MonitoringSession.unit_id == unit_id,
                    MonitoringSession.status == "recording",
                )
            ).first()

            if active_session:
                active_session.stopped_at = datetime.utcnow()
                active_session.status = "completed"
                active_session.duration_seconds = int(
                    (active_session.stopped_at - active_session.started_at).total_seconds()
                )
                result["old_session_id"] = active_session.id

            # Step 5: Wait for device to settle before starting new measurement
            logger.info(f"[CYCLE] Step 5/7: Waiting 30s for device to settle...")
            await asyncio.sleep(30)

            # Step 6: Start new measurement cycle
            logger.info(f"[CYCLE] Step 6/7: Starting new measurement on {unit_id}")
            try:
                cycle_response = await self.device_controller.start_cycle(
                    unit_id,
                    action.device_type,
                    sync_clock=True,
                )
                result["steps"]["start"] = {"success": True, "response": cycle_response}

                # Create new recording session
                new_session = MonitoringSession(
                    id=str(uuid.uuid4()),
                    project_id=action.project_id,
                    location_id=action.location_id,
                    unit_id=unit_id,
                    session_type="sound" if action.device_type == "slm" else "vibration",
                    started_at=datetime.utcnow(),
                    status="recording",
                    session_metadata=json.dumps({
                        "scheduled_action_id": action.id,
                        "cycle_response": cycle_response,
                        "action_type": "cycle",
                    }),
                )
                db.add(new_session)
                result["new_session_id"] = new_session.id

                logger.info(f"[CYCLE] New measurement started, session {new_session.id}")

            except Exception as e:
                logger.error(f"[CYCLE] Start failed: {e}")
                result["steps"]["start"] = {"success": False, "error": str(e)}
                raise  # Re-raise to mark the action as failed

        finally:
            # Step 7: Re-enable background polling (always runs, even on failure)
            if result.get("polling_paused") and polling_was_enabled:
                logger.info(f"[CYCLE] Step 7/7: Re-enabling background polling for {unit_id}")
                try:
                    if action.device_type == "slm":
                        from backend.services.slmm_client import get_slmm_client
                        slmm = get_slmm_client()
                        await slmm.update_device_polling_config(unit_id, poll_enabled=True)
                        logger.info(f"[CYCLE] Background polling re-enabled for {unit_id}")
                except Exception as e:
                    logger.error(f"[CYCLE] Failed to re-enable polling: {e}")
                    # Don't raise - cycle completed, just log the error

        logger.info(f"[CYCLE] === Cycle complete for {unit_id} ===")
        return result

    # ========================================================================
    # Recurring Schedule Generation
    # ========================================================================
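The cycle method above wraps its device work in try/finally so that background polling is always restored to its prior state, even when the start step fails and re-raises. A minimal standalone sketch of that pause, work, always-restore shape (the `polling` dict and `set_polling` helper are stand-ins for the SLMM polling-config API, not real code):

```python
import asyncio

polling = {"enabled": True}

async def set_polling(enabled: bool):
    # Stand-in for slmm.update_device_polling_config(unit_id, poll_enabled=...)
    polling["enabled"] = enabled

async def run_cycle(fail: bool = False):
    was_enabled = polling["enabled"]
    await set_polling(False)           # Step 0: pause polling before touching the device
    try:
        if fail:
            raise RuntimeError("start failed")  # mirrors the re-raised start error
        return "cycle_complete"
    finally:
        if was_enabled:
            await set_polling(True)    # Step 7: always restore, even on failure

print(asyncio.run(run_cycle()))  # cycle_complete
print(polling["enabled"])        # True
```

Capturing `was_enabled` before pausing is what prevents the cycle from force-enabling polling on devices where it was deliberately off.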
@@ -414,6 +719,15 @@ class SchedulerService:

        for schedule in schedules:
            try:
                # Auto-disable one-off schedules whose end time has passed
                if schedule.schedule_type == "one_off" and schedule.end_datetime:
                    if schedule.end_datetime <= datetime.utcnow():
                        schedule.enabled = False
                        schedule.next_occurrence = None
                        db.commit()
                        logger.info(f"Auto-disabled completed one-off schedule: {schedule.name}")
                        continue

                actions = service.generate_actions_for_schedule(schedule, horizon_days=7)
                total_generated += len(actions)
            except Exception as e:
backend/services/slm_status_sync.py (new file, 129 lines)
@@ -0,0 +1,129 @@
"""
SLM Status Synchronization Service

Syncs SLM device status from SLMM backend to Terra-View's Emitter table.
This bridges SLMM's polling data with Terra-View's status snapshot system.

SLMM tracks device reachability via background polling. This service
fetches that data and creates/updates Emitter records so SLMs appear
correctly in the dashboard status snapshot.
"""

import logging
from datetime import datetime, timezone
from typing import Dict, Any

from backend.database import get_db_session
from backend.models import Emitter
from backend.services.slmm_client import get_slmm_client, SLMMClientError

logger = logging.getLogger(__name__)


async def sync_slm_status_to_emitters() -> Dict[str, Any]:
    """
    Fetch SLM status from SLMM and sync to Terra-View's Emitter table.

    For each device in SLMM's polling status:
    - If last_success exists, create/update Emitter with that timestamp
    - If not reachable, update Emitter with last known timestamp (or None)

    Returns:
        Dict with synced_count, error_count, errors list
    """
    client = get_slmm_client()
    synced = 0
    errors = []

    try:
        # Get polling status from SLMM
        status_response = await client.get_polling_status()

        # Handle nested response structure
        data = status_response.get("data", status_response)
        devices = data.get("devices", [])

        if not devices:
            logger.debug("No SLM devices in SLMM polling status")
            return {"synced_count": 0, "error_count": 0, "errors": []}

        db = get_db_session()
        try:
            for device in devices:
                unit_id = device.get("unit_id")
                if not unit_id:
                    continue

                try:
                    # Get or create Emitter record
                    emitter = db.query(Emitter).filter(Emitter.id == unit_id).first()

                    # Determine last_seen from SLMM data
                    last_success_str = device.get("last_success")
                    is_reachable = device.get("is_reachable", False)

                    if last_success_str:
                        # Parse ISO format timestamp
                        last_seen = datetime.fromisoformat(
                            last_success_str.replace("Z", "+00:00")
                        )
                        # Convert to naive UTC for consistency with existing code
                        if last_seen.tzinfo:
                            last_seen = last_seen.astimezone(timezone.utc).replace(tzinfo=None)
                    elif is_reachable:
                        # Device is reachable but no last_success yet (first poll or just started)
                        # Use current time so it shows as OK, not Missing
                        last_seen = datetime.utcnow()
                    else:
                        last_seen = None

                    # Status will be recalculated by snapshot.py based on time thresholds
                    # Just store a provisional status here
                    status = "OK" if is_reachable else "Missing"

                    # Store last error message if available
                    last_error = device.get("last_error") or ""

                    if emitter:
                        # Update existing record
                        emitter.last_seen = last_seen
                        emitter.status = status
                        emitter.unit_type = "slm"
                        emitter.last_file = last_error
                    else:
                        # Create new record
                        emitter = Emitter(
                            id=unit_id,
                            unit_type="slm",
                            last_seen=last_seen,
                            last_file=last_error,
                            status=status
                        )
                        db.add(emitter)

                    synced += 1

                except Exception as e:
                    errors.append(f"{unit_id}: {str(e)}")
                    logger.error(f"Error syncing SLM {unit_id}: {e}")

            db.commit()

        finally:
            db.close()

        if synced > 0:
            logger.info(f"Synced {synced} SLM device(s) to Emitter table")

    except SLMMClientError as e:
        logger.warning(f"Could not reach SLMM for status sync: {e}")
        errors.append(f"SLMM unreachable: {str(e)}")
    except Exception as e:
        logger.error(f"Error in SLM status sync: {e}", exc_info=True)
        errors.append(str(e))

    return {
        "synced_count": synced,
        "error_count": len(errors),
        "errors": errors
    }
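The timestamp normalization in the sync loop above handles SLMM's ISO-8601 strings, which may carry a trailing "Z" that `datetime.fromisoformat()` cannot parse before Python 3.11; the "Z" is rewritten to "+00:00" and the aware result converted to naive UTC for consistency with the rest of the codebase. Extracted as a standalone sketch (`to_naive_utc` is a hypothetical helper name):

```python
from datetime import datetime, timezone

def to_naive_utc(last_success_str: str) -> datetime:
    # "Z" -> "+00:00" so fromisoformat() accepts it on older Pythons
    last_seen = datetime.fromisoformat(last_success_str.replace("Z", "+00:00"))
    # Normalize any offset to UTC, then drop tzinfo to get a naive UTC datetime
    if last_seen.tzinfo:
        last_seen = last_seen.astimezone(timezone.utc).replace(tzinfo=None)
    return last_seen

print(to_naive_utc("2024-05-01T12:30:00Z"))  # 2024-05-01 12:30:00
```

Note the round-trip also works for non-UTC offsets: "2024-05-01T14:30:00+02:00" normalizes to the same naive 12:30 UTC.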
@@ -109,7 +109,71 @@ class SLMMClient:
                f"SLMM operation failed: {error_detail}"
            )
        except Exception as e:
            raise SLMMClientError(f"Unexpected error: {str(e)}")
            error_msg = str(e) if str(e) else type(e).__name__
            raise SLMMClientError(f"Unexpected error: {error_msg}")

    async def _download_request(
        self,
        endpoint: str,
        data: Dict[str, Any],
        unit_id: str,
    ) -> Dict[str, Any]:
        """
        Make a download request to SLMM that returns binary file content (not JSON).

        Saves the file locally and returns metadata about the download.
        """
        url = f"{self.api_base}{endpoint}"

        try:
            async with httpx.AsyncClient(timeout=httpx.Timeout(300.0)) as client:
                response = await client.post(url, json=data)
                response.raise_for_status()

                # Determine filename from Content-Disposition header or generate one
                content_disp = response.headers.get("content-disposition", "")
                filename = None
                if "filename=" in content_disp:
                    filename = content_disp.split("filename=")[-1].strip('" ')

                if not filename:
                    remote_path = data.get("remote_path", "download")
                    base = os.path.basename(remote_path.rstrip("/"))
                    filename = f"{base}.zip" if not base.endswith(".zip") else base

                # Save to local downloads directory
                download_dir = os.path.join("data", "downloads", unit_id)
                os.makedirs(download_dir, exist_ok=True)
                local_path = os.path.join(download_dir, filename)

                with open(local_path, "wb") as f:
                    f.write(response.content)

                return {
                    "success": True,
                    "local_path": local_path,
                    "filename": filename,
                    "size_bytes": len(response.content),
                }

        except httpx.ConnectError as e:
            raise SLMMConnectionError(
                f"Cannot connect to SLMM backend at {self.base_url}. "
                f"Is SLMM running? Error: {str(e)}"
            )
        except httpx.HTTPStatusError as e:
            error_detail = "Unknown error"
            try:
                error_data = e.response.json()
                error_detail = error_data.get("detail", str(error_data))
            except Exception:
                error_detail = e.response.text or str(e)
            raise SLMMDeviceError(f"SLMM download failed: {error_detail}")
        except (SLMMConnectionError, SLMMDeviceError):
            raise
        except Exception as e:
            error_msg = str(e) if str(e) else type(e).__name__
            raise SLMMClientError(f"Download error: {error_msg}")

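The filename resolution inside `_download_request` above prefers the Content-Disposition header and falls back to the remote path, appending `.zip` when needed. Pulled out as a standalone, testable sketch (`resolve_filename` is a hypothetical helper, not part of the client):

```python
import os

def resolve_filename(content_disp: str, remote_path: str) -> str:
    filename = None
    # Prefer the server-supplied name from Content-Disposition
    if "filename=" in content_disp:
        filename = content_disp.split("filename=")[-1].strip('" ')
    # Fall back to the last path component, ensuring a .zip suffix
    if not filename:
        base = os.path.basename(remote_path.rstrip("/"))
        filename = f"{base}.zip" if not base.endswith(".zip") else base
    return filename

print(resolve_filename('attachment; filename="Auto_0058.zip"', ""))  # Auto_0058.zip
print(resolve_filename("", "/NL-43/Auto_0058"))                      # Auto_0058.zip
```

The naive `split("filename=")` parse is adequate for SLMM's own responses; a general HTTP client would want a proper header parser.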
    # ========================================================================
    # Unit Management
@@ -478,9 +542,130 @@ class SLMMClient:
        return await self._request("GET", f"/{unit_id}/settings")

    # ========================================================================
    # Data Download (Future)
    # FTP Control
    # ========================================================================

    async def enable_ftp(self, unit_id: str) -> Dict[str, Any]:
        """
        Enable FTP server on device.

        Must be called before downloading files. FTP and TCP can work in tandem.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with status message
        """
        return await self._request("POST", f"/{unit_id}/ftp/enable")

    async def disable_ftp(self, unit_id: str) -> Dict[str, Any]:
        """
        Disable FTP server on device.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with status message
        """
        return await self._request("POST", f"/{unit_id}/ftp/disable")

    async def get_ftp_status(self, unit_id: str) -> Dict[str, Any]:
        """
        Get FTP server status on device.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with ftp_enabled status
        """
        return await self._request("GET", f"/{unit_id}/ftp/status")

    # ========================================================================
    # Data Download
    # ========================================================================

    async def download_file(
        self,
        unit_id: str,
        remote_path: str,
    ) -> Dict[str, Any]:
        """
        Download a single file from unit via FTP.

        Args:
            unit_id: Unit identifier
            remote_path: Path on device to download (e.g., "/NL43_DATA/measurement.wav")

        Returns:
            Dict with local_path, filename, size_bytes
        """
        return await self._download_request(
            f"/{unit_id}/ftp/download",
            {"remote_path": remote_path},
            unit_id,
        )

    async def download_folder(
        self,
        unit_id: str,
        remote_path: str,
    ) -> Dict[str, Any]:
        """
        Download an entire folder from unit via FTP as a ZIP archive.

        Useful for downloading complete measurement sessions (e.g., Auto_0000 folders).

        Args:
            unit_id: Unit identifier
            remote_path: Folder path on device to download (e.g., "/NL43_DATA/Auto_0000")

        Returns:
            Dict with local_path, folder_name, size_bytes
        """
        return await self._download_request(
            f"/{unit_id}/ftp/download-folder",
            {"remote_path": remote_path},
            unit_id,
        )

    async def download_current_measurement(
        self,
        unit_id: str,
    ) -> Dict[str, Any]:
        """
        Download the current measurement folder based on device's index number.

        This is the recommended method for scheduled downloads - it automatically
        determines which folder to download based on the device's current store index.

        Args:
            unit_id: Unit identifier

        Returns:
            Dict with local_path, folder_name, file_count, zip_size_bytes, index_number
        """
        # Get current index number from device
        index_info = await self.get_index_number(unit_id)
        index_number_raw = index_info.get("index_number", 0)

        # Convert to int - device returns string like "0000" or "0001"
        try:
            index_number = int(index_number_raw)
        except (ValueError, TypeError):
            index_number = 0

        # Format as Auto_XXXX folder name
        folder_name = f"Auto_{index_number:04d}"
        remote_path = f"/NL-43/{folder_name}"

        # Download the folder
        result = await self.download_folder(unit_id, remote_path)
        result["index_number"] = index_number
        return result
|
||||
async def download_files(
|
||||
self,
|
||||
unit_id: str,
|
||||
@@ -488,23 +673,24 @@ class SLMMClient:
         files: Optional[List[str]] = None,
     ) -> Dict[str, Any]:
         """
-        Download files from unit via FTP.
+        Download measurement files from unit via FTP.
 
-        NOTE: This endpoint doesn't exist in SLMM yet. Will need to implement.
+        This method automatically determines the current measurement folder and downloads it.
+        The destination_path parameter is logged for reference but actual download location
+        is managed by SLMM (data/downloads/{unit_id}/).
 
         Args:
             unit_id: Unit identifier
-            destination_path: Local path to save files
-            files: List of filenames to download, or None for all
+            destination_path: Reference path (for logging/metadata, not used by SLMM)
+            files: Ignored - always downloads the current measurement folder
 
         Returns:
-            Dict with downloaded files list and metadata
+            Dict with download result including local_path, folder_name, etc.
         """
-        data = {
-            "destination_path": destination_path,
-            "files": files or "all",
-        }
-        return await self._request("POST", f"/{unit_id}/ftp/download", data=data)
+        # Use the new method that automatically determines what to download
+        result = await self.download_current_measurement(unit_id)
+        result["requested_destination"] = destination_path
+        return result
 
     # ========================================================================
     # Cycle Commands (for scheduled automation)
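The index-to-folder mapping used by `download_current_measurement` can be exercised on its own. A minimal sketch; `measurement_folder` is a hypothetical helper extracted here for illustration, not part of the client:

```python
def measurement_folder(index_number_raw) -> str:
    # The device reports its store index as a string like "0000" or "0001";
    # tolerate missing/bad input by falling back to 0, then format the
    # Auto_XXXX folder path that the FTP download targets.
    try:
        index_number = int(index_number_raw)
    except (ValueError, TypeError):
        index_number = 0
    return f"/NL-43/Auto_{index_number:04d}"

print(measurement_folder("0007"))  # -> /NL-43/Auto_0007
print(measurement_folder(None))    # -> /NL-43/Auto_0000
```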
@@ -36,6 +36,10 @@ async def sync_slm_to_slmm(unit: RosterUnit) -> bool:
         logger.warning(f"SLM {unit.id} has no host configured, skipping SLMM sync")
         return False
 
+    # Disable polling if unit is benched (deployed=False) or retired
+    # Only actively deployed units should be polled
+    should_poll = unit.deployed and not unit.retired
+
     try:
         async with httpx.AsyncClient(timeout=5.0) as client:
             response = await client.put(
@@ -47,8 +51,8 @@ async def sync_slm_to_slmm(unit: RosterUnit) -> bool:
                 "ftp_enabled": True,
                 "ftp_username": "USER",  # Default NL43 credentials
                 "ftp_password": "0000",
-                "poll_enabled": not unit.retired,  # Disable polling for retired units
-                "poll_interval_seconds": 60,  # Default interval
+                "poll_enabled": should_poll,  # Disable polling for benched or retired units
+                "poll_interval_seconds": 3600,  # Default to 1 hour polling
             }
         )
@@ -80,6 +80,18 @@ def emit_status_snapshot():
             age = "N/A"
             last_seen = None
             fname = ""
+        elif r.out_for_calibration:
+            # Out for calibration units get separated later
+            status = "Out for Calibration"
+            age = "N/A"
+            last_seen = None
+            fname = ""
+        elif getattr(r, 'allocated', False) and not r.deployed:
+            # Allocated: staged for an upcoming job, not yet physically deployed
+            status = "Allocated"
+            age = "N/A"
+            last_seen = None
+            fname = ""
         else:
             if e:
                 last_seen = ensure_utc(e.last_seen)
@@ -103,11 +115,15 @@ def emit_status_snapshot():
             "deployed": r.deployed,
             "note": r.note or "",
             "retired": r.retired,
             "out_for_calibration": r.out_for_calibration or False,
+            "allocated": getattr(r, 'allocated', False) or False,
+            "allocated_to_project_id": getattr(r, 'allocated_to_project_id', None) or "",
             # Device type and type-specific fields
             "device_type": r.device_type or "seismograph",
+            "last_calibrated": r.last_calibrated.isoformat() if r.last_calibrated else None,
+            "next_calibration_due": r.next_calibration_due.isoformat() if r.next_calibration_due else None,
             "deployed_with_modem_id": r.deployed_with_modem_id,
             "deployed_with_unit_id": r.deployed_with_unit_id,
             "ip_address": r.ip_address,
             "phone_number": r.phone_number,
             "hardware_model": r.hardware_model,
@@ -132,11 +148,15 @@ def emit_status_snapshot():
             "deployed": False,  # default
             "note": "",
             "retired": False,
             "out_for_calibration": False,
+            "allocated": False,
+            "allocated_to_project_id": "",
             # Device type and type-specific fields (defaults for unknown units)
             "device_type": "seismograph",  # default
+            "last_calibrated": None,
+            "next_calibration_due": None,
             "deployed_with_modem_id": None,
             "deployed_with_unit_id": None,
             "ip_address": None,
             "phone_number": None,
             "hardware_model": None,
@@ -146,15 +166,48 @@ def emit_status_snapshot():
             "coordinates": "",
         }
 
+    # --- Derive modem status from paired devices ---
+    # Modems don't have their own check-in system, so we inherit status
+    # from whatever device they're paired with (seismograph or SLM)
+    # Check both directions: modem.deployed_with_unit_id OR device.deployed_with_modem_id
+    for unit_id, unit_data in units.items():
+        if unit_data.get("device_type") == "modem" and not unit_data.get("retired"):
+            paired_unit_id = None
+            roster_unit = roster.get(unit_id)
+
+            # First, check if modem has deployed_with_unit_id set
+            if roster_unit and roster_unit.deployed_with_unit_id:
+                paired_unit_id = roster_unit.deployed_with_unit_id
+            else:
+                # Fallback: check if any device has this modem in deployed_with_modem_id
+                for other_id, other_roster in roster.items():
+                    if other_roster.deployed_with_modem_id == unit_id:
+                        paired_unit_id = other_id
+                        break
+
+            if paired_unit_id:
+                paired_unit = units.get(paired_unit_id)
+                if paired_unit:
+                    # Inherit status from paired device
+                    unit_data["status"] = paired_unit.get("status", "Missing")
+                    unit_data["age"] = paired_unit.get("age", "N/A")
+                    unit_data["last"] = paired_unit.get("last")
+                    unit_data["derived_from"] = paired_unit_id
+
     # Separate buckets for UI
     active_units = {
         uid: u for uid, u in units.items()
-        if not u["retired"] and u["deployed"] and uid not in ignored
+        if not u["retired"] and not u["out_for_calibration"] and u["deployed"] and uid not in ignored
     }
 
     benched_units = {
         uid: u for uid, u in units.items()
-        if not u["retired"] and not u["deployed"] and uid not in ignored
+        if not u["retired"] and not u["out_for_calibration"] and not u["allocated"] and not u["deployed"] and uid not in ignored
     }
 
+    allocated_units = {
+        uid: u for uid, u in units.items()
+        if not u["retired"] and not u["out_for_calibration"] and u["allocated"] and not u["deployed"] and uid not in ignored
+    }
+
     retired_units = {
@@ -162,6 +215,11 @@ def emit_status_snapshot():
         if u["retired"]
     }
 
+    out_for_calibration_units = {
+        uid: u for uid, u in units.items()
+        if u["out_for_calibration"]
+    }
+
     # Unknown units - emitters that aren't in the roster and aren't ignored
     unknown_units = {
         uid: u for uid, u in units.items()
@@ -173,13 +231,17 @@ def emit_status_snapshot():
         "units": units,
         "active": active_units,
         "benched": benched_units,
+        "allocated": allocated_units,
         "retired": retired_units,
+        "out_for_calibration": out_for_calibration_units,
         "unknown": unknown_units,
         "summary": {
-            "total": len(active_units) + len(benched_units),
+            "total": len(active_units) + len(benched_units) + len(allocated_units),
             "active": len(active_units),
             "benched": len(benched_units),
+            "allocated": len(allocated_units),
             "retired": len(retired_units),
+            "out_for_calibration": len(out_for_calibration_units),
             "unknown": len(unknown_units),
             # Status counts only for deployed units (active_units)
             "ok": sum(1 for u in active_units.values() if u["status"] == "OK"),
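The bucket filters above are meant to partition non-retired, in-service units so each lands in exactly one of active, benched, or allocated. A minimal sketch of the same partitioning over plain dicts (the sample units and their flags are invented; field names follow the snapshot structure):

```python
# Invented sample units mirroring the flags the snapshot filters test.
units = {
    "BE1234": {"retired": False, "out_for_calibration": False, "allocated": False, "deployed": True},
    "BE9012": {"retired": False, "out_for_calibration": False, "allocated": False, "deployed": False},
    "SLM003": {"retired": False, "out_for_calibration": False, "allocated": True, "deployed": False},
    "BE7890": {"retired": True, "out_for_calibration": False, "allocated": False, "deployed": False},
    "SLM001": {"retired": False, "out_for_calibration": True, "allocated": False, "deployed": False},
}
ignored = set()

active = {u for u, d in units.items()
          if not d["retired"] and not d["out_for_calibration"] and d["deployed"] and u not in ignored}
benched = {u for u, d in units.items()
           if not d["retired"] and not d["out_for_calibration"] and not d["allocated"]
           and not d["deployed"] and u not in ignored}
allocated = {u for u, d in units.items()
             if not d["retired"] and not d["out_for_calibration"] and d["allocated"]
             and not d["deployed"] and u not in ignored}

# Buckets are disjoint; retired and out-for-calibration units fall in none of them.
print(sorted(active), sorted(benched), sorted(allocated))  # -> ['BE1234'] ['BE9012'] ['SLM003']
```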
BIN  backend/static/icons/favicon-16.png  (new file)  Size: 424 B
BIN  backend/static/icons/favicon-32.png  (new file)  Size: 1.1 KiB
BIN  (modified)  Size: 1.9 KiB -> 7.7 KiB
BIN  (modified)  Size: 2.2 KiB -> 9.2 KiB
BIN  (modified)  Size: 2.2 KiB -> 10 KiB
BIN  (modified)  Size: 2.9 KiB -> 15 KiB
BIN  (modified)  Size: 5.8 KiB -> 44 KiB
BIN  (modified)  Size: 7.8 KiB -> 68 KiB
BIN  (modified)  Size: 1.1 KiB -> 3.2 KiB
BIN  (modified)  Size: 1.4 KiB -> 5.0 KiB
BIN  backend/static/terra-view-logo-dark.png  (new file)  Size: 13 KiB
BIN  backend/static/terra-view-logo-dark@2x.png  (new file)  Size: 57 KiB
BIN  backend/static/terra-view-logo-light.png  (new file)  Size: 14 KiB
BIN  backend/static/terra-view-logo-light@2x.png  (new file)  Size: 49 KiB
@@ -5,6 +5,7 @@ All routers should import `templates` from this module to get consistent
 filter and global function registration.
 """
 
+import json as _json
 from fastapi.templating import Jinja2Templates
 
 # Import timezone utilities
@@ -32,8 +33,58 @@ def jinja_timezone_abbr():
 # Create templates instance
 templates = Jinja2Templates(directory="templates")
 
+def jinja_local_date(dt, fmt="%m-%d-%y"):
+    """Jinja filter: format a UTC datetime as a local date string (e.g. 02-19-26)."""
+    return format_local_datetime(dt, fmt)
+
+
+def jinja_fromjson(s):
+    """Jinja filter: parse a JSON string into a dict (returns {} on failure)."""
+    if not s:
+        return {}
+    try:
+        return _json.loads(s)
+    except Exception:
+        return {}
+
+
+def jinja_same_date(dt1, dt2) -> bool:
+    """Jinja global: True if two datetimes fall on the same local date."""
+    if not dt1 or not dt2:
+        return False
+    try:
+        d1 = format_local_datetime(dt1, "%Y-%m-%d")
+        d2 = format_local_datetime(dt2, "%Y-%m-%d")
+        return d1 == d2
+    except Exception:
+        return False
+
+
+def jinja_log_tail_display(s):
+    """Jinja filter: decode a JSON-encoded log tail array into a plain-text string."""
+    if not s:
+        return ""
+    try:
+        lines = _json.loads(s)
+        if isinstance(lines, list):
+            return "\n".join(str(l) for l in lines)
+        return str(s)
+    except Exception:
+        return str(s)
+
+
+def jinja_local_datetime_input(dt):
+    """Jinja filter: format UTC datetime as local YYYY-MM-DDTHH:MM for <input type='datetime-local'>."""
+    return format_local_datetime(dt, "%Y-%m-%dT%H:%M")
+
+
 # Register Jinja filters and globals
 templates.env.filters["local_datetime"] = jinja_local_datetime
 templates.env.filters["local_time"] = jinja_local_time
+templates.env.filters["local_date"] = jinja_local_date
+templates.env.filters["local_datetime_input"] = jinja_local_datetime_input
+templates.env.filters["fromjson"] = jinja_fromjson
 templates.env.globals["timezone_abbr"] = jinja_timezone_abbr
 templates.env.globals["get_user_timezone"] = get_user_timezone
+templates.env.globals["same_date"] = jinja_same_date
+templates.env.filters["log_tail_display"] = jinja_log_tail_display
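The `log_tail_display` filter decodes the JSON-encoded log tail stored per agent. A standalone sketch of the same decode behavior, using the stdlib `json` directly instead of the module-level `_json` alias:

```python
import json

def log_tail_display(s):
    # Same shape as the Jinja filter: a JSON array of lines becomes
    # newline-joined text; empty input gives "", and anything that fails
    # to parse (or isn't a list) falls back to str(s).
    if not s:
        return ""
    try:
        lines = json.loads(s)
        if isinstance(lines, list):
            return "\n".join(str(l) for l in lines)
        return str(s)
    except Exception:
        return str(s)

print(log_tail_display('["boot ok", "heartbeat sent"]'))
```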
@@ -1,10 +0,0 @@
-{
-  "filename": "snapshot_20251216_201738.db",
-  "created_at": "20251216_201738",
-  "created_at_iso": "2025-12-16T20:17:38.638982",
-  "description": "Auto-backup before restore",
-  "size_bytes": 57344,
-  "size_mb": 0.05,
-  "original_db_size_bytes": 57344,
-  "type": "manual"
-}
@@ -1,9 +0,0 @@
-{
-  "filename": "snapshot_uploaded_20251216_201732.db",
-  "created_at": "20251216_201732",
-  "created_at_iso": "2025-12-16T20:17:32.574205",
-  "description": "Uploaded: snapshot_20251216_200259.db",
-  "size_bytes": 77824,
-  "size_mb": 0.07,
-  "type": "uploaded"
-}
@@ -1,9 +1,7 @@
 services:
 
   # --- TERRA-VIEW PRODUCTION ---
-  terra-view-prod:
+  terra-view:
     build: .
     container_name: terra-view
     ports:
       - "8001:8001"
     volumes:
@@ -24,34 +22,11 @@ services:
       retries: 3
       start_period: 40s
 
-  # --- TERRA-VIEW DEVELOPMENT ---
-  # terra-view-dev:
-  #   build: .
-  #   container_name: terra-view-dev
-  #   ports:
-  #     - "1001:8001"
-  #   volumes:
-  #     - ./data-dev:/app/data
-  #   environment:
-  #     - PYTHONUNBUFFERED=1
-  #     - ENVIRONMENT=development
-  #     - SLMM_BASE_URL=http://slmm:8100
-  #   restart: unless-stopped
-  #   depends_on:
-  #     - slmm
-  #   healthcheck:
-  #     test: ["CMD", "curl", "-f", "http://localhost:8001/health"]
-  #     interval: 30s
-  #     timeout: 10s
-  #     retries: 3
-  #     start_period: 40s
-
   # --- SLMM (Sound Level Meter Manager) ---
   slmm:
     build:
       context: ../slmm
       dockerfile: Dockerfile
     container_name: slmm
     network_mode: host
     volumes:
       - ../slmm/data:/app/data
@@ -59,6 +34,8 @@ services:
       - PYTHONUNBUFFERED=1
       - PORT=8100
       - CORS_ORIGINS=*
+      - TCP_IDLE_TTL=-1
+      - TCP_MAX_AGE=-1
     restart: unless-stopped
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8100/health"]
@@ -69,4 +46,3 @@ services:
 
 volumes:
   data:
-  data-dev:
migrate_watcher_agents.py  (new file, 37 lines)
@@ -0,0 +1,37 @@
"""
Migration: add watcher_agents table.

Safe to run multiple times (idempotent).
"""

import sqlite3
import os

DB_PATH = os.path.join(os.path.dirname(__file__), "data", "seismo.db")


def migrate():
    con = sqlite3.connect(DB_PATH)
    cur = con.cursor()

    cur.execute("""
        CREATE TABLE IF NOT EXISTS watcher_agents (
            id TEXT PRIMARY KEY,
            source_type TEXT NOT NULL,
            version TEXT,
            last_seen DATETIME,
            status TEXT NOT NULL DEFAULT 'unknown',
            ip_address TEXT,
            log_tail TEXT,
            update_pending INTEGER NOT NULL DEFAULT 0,
            update_version TEXT
        )
    """)

    con.commit()
    con.close()
    print("Migration complete: watcher_agents table ready.")


if __name__ == "__main__":
    migrate()
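The `CREATE TABLE IF NOT EXISTS` guard is what makes the migration idempotent: a second run is a no-op rather than an error. A quick sketch of that property against an in-memory database:

```python
import sqlite3

con = sqlite3.connect(":memory:")
ddl = """
CREATE TABLE IF NOT EXISTS watcher_agents (
    id TEXT PRIMARY KEY,
    source_type TEXT NOT NULL,
    version TEXT,
    last_seen DATETIME,
    status TEXT NOT NULL DEFAULT 'unknown',
    ip_address TEXT,
    log_tail TEXT,
    update_pending INTEGER NOT NULL DEFAULT 0,
    update_version TEXT
)
"""
con.execute(ddl)
con.execute(ddl)  # re-running the migration is a no-op, not an error

# PRAGMA table_info yields one row per column; index 1 is the column name.
cols = [row[1] for row in con.execute("PRAGMA table_info(watcher_agents)")]
print(cols)
```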
rebuild-dev.sh  (new executable file, 19 lines)
@@ -0,0 +1,19 @@
#!/bin/bash
# Dev rebuild script — increments build number, rebuilds and restarts terra-view
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
BUILD_FILE="$SCRIPT_DIR/build_number.txt"

# Read and increment build number
BUILD_NUMBER=$(cat "$BUILD_FILE" 2>/dev/null || echo "0")
BUILD_NUMBER=$((BUILD_NUMBER + 1))
echo "$BUILD_NUMBER" > "$BUILD_FILE"

echo "Building terra-view dev (build #$BUILD_NUMBER)..."

cd "$SCRIPT_DIR"
docker compose build --build-arg BUILD_NUMBER="$BUILD_NUMBER" terra-view
docker compose up -d terra-view

echo "Done — terra-view v0.6.1-$BUILD_NUMBER is running on :1001"
rebuild-prod.sh  (new executable file, 12 lines)
@@ -0,0 +1,12 @@
#!/bin/bash
# Production rebuild script — rebuilds and restarts terra-view on :8001
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"

echo "Building terra-view production..."
docker compose -f docker-compose.yml build terra-view
docker compose -f docker-compose.yml up -d terra-view

echo "Done — terra-view production is running on :8001"
@@ -1,6 +1,23 @@
-unit_id,unit_type,deployed,retired,note,project_id,location
-BE1234,series3,true,false,Primary unit at main site,PROJ-001,San Francisco CA
-BE5678,series3,true,false,Backup sensor,PROJ-001,Los Angeles CA
-BE9012,series3,false,false,In maintenance,PROJ-002,Workshop
-BE3456,series3,true,false,,PROJ-003,New York NY
-BE7890,series3,false,true,Decommissioned 2024,,Storage
+unit_id,device_type,unit_type,deployed,retired,note,project_id,location,address,coordinates,last_calibrated,next_calibration_due,deployed_with_modem_id,ip_address,phone_number,hardware_model,slm_host,slm_tcp_port,slm_ftp_port,slm_model,slm_serial_number,slm_frequency_weighting,slm_time_weighting,slm_measurement_range
+# ============================================
+# SEISMOGRAPHS (device_type=seismograph)
+# ============================================
+BE1234,seismograph,series3,true,false,Primary unit at main site,PROJ-001,San Francisco CA,123 Market St,37.7749;-122.4194,2025-06-15,2026-06-15,MDM001,,,,,,,,,,,
+BE5678,seismograph,series3,true,false,Backup sensor,PROJ-001,Los Angeles CA,456 Sunset Blvd,34.0522;-118.2437,2025-03-01,2026-03-01,MDM002,,,,,,,,,,,
+BE9012,seismograph,series4,false,false,In maintenance - needs calibration,PROJ-002,Workshop,789 Industrial Way,,,,,,,,,,,,,,
+BE3456,seismograph,series3,true,false,,PROJ-003,New York NY,101 Broadway,40.7128;-74.0060,2025-01-10,2026-01-10,,,,,,,,,,,
+BE7890,seismograph,series3,false,true,Decommissioned 2024,,Storage,Warehouse B,,,,,,,,,,,,,,,
+# ============================================
+# MODEMS (device_type=modem)
+# ============================================
+MDM001,modem,,true,false,Cradlepoint at SF site,PROJ-001,San Francisco CA,123 Market St,37.7749;-122.4194,,,,,192.168.1.100,+1-555-0101,IBR900,,,,,,,
+MDM002,modem,,true,false,Sierra Wireless at LA site,PROJ-001,Los Angeles CA,456 Sunset Blvd,34.0522;-118.2437,,,,,10.0.0.50,+1-555-0102,RV55,,,,,,,
+MDM003,modem,,false,false,Spare modem in storage,,,Storage,Warehouse A,,,,,,+1-555-0103,IBR600,,,,,,,
+MDM004,modem,,true,false,NYC backup modem,PROJ-003,New York NY,101 Broadway,40.7128;-74.0060,,,,,172.16.0.25,+1-555-0104,IBR1700,,,,,,,
+# ============================================
+# SOUND LEVEL METERS (device_type=slm)
+# ============================================
+SLM001,slm,,true,false,NL-43 at construction site A,PROJ-004,Downtown Site,500 Main St,40.7589;-73.9851,,,,,,,,192.168.10.101,2255,21,NL-43,12345678,A,F,30-130 dB
+SLM002,slm,,true,false,NL-43 at construction site B,PROJ-004,Midtown Site,600 Park Ave,40.7614;-73.9776,,,MDM004,,,,,192.168.10.102,2255,21,NL-43,12345679,A,S,30-130 dB
+SLM003,slm,,false,false,NL-53 spare unit,,,Storage,Warehouse A,,,,,,,,,,,NL-53,98765432,C,F,25-138 dB
+SLM004,slm,,true,false,NL-43 nighttime monitoring,PROJ-005,Residential Area,200 Quiet Lane,40.7484;-73.9857,,,,,,,,10.0.5.50,2255,21,NL-43,11112222,A,S,30-130 dB
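The roster CSV above uses `#` section headers, which plain `csv.DictReader` would happily parse as data rows. A minimal import sketch that filters them out first (the sample below is a hypothetical trimmed-down version of the file; the real header has many more columns):

```python
import csv
import io

# Trimmed sample: header plus one seismograph row and one modem row.
SAMPLE = """unit_id,device_type,unit_type,deployed,retired,note
# ============================================
# SEISMOGRAPHS (device_type=seismograph)
# ============================================
BE1234,seismograph,series3,true,false,Primary unit at main site
MDM001,modem,,true,false,Cradlepoint at SF site
"""

# Feed DictReader a generator that drops the '#' section-header lines,
# which are not standard CSV and would otherwise become data rows.
rows = csv.DictReader(line for line in io.StringIO(SAMPLE) if not line.startswith("#"))
units = {r["unit_id"]: r for r in rows}
print(sorted(units))  # -> ['BE1234', 'MDM001']
```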
@@ -8,7 +8,7 @@ import sys
 from sqlalchemy import create_engine, text
 from sqlalchemy.orm import sessionmaker
 
-DATABASE_URL = "sqlite:///data/sfm.db"
+DATABASE_URL = "sqlite:///data/seismo_fleet.db"
 
 def rename_unit(old_id: str, new_id: str):
     """
@@ -90,14 +90,14 @@ def rename_unit(old_id: str, new_id: str):
     except Exception:
         pass  # Table may not exist
 
-    # Update recording_sessions table (if exists)
+    # Update monitoring_sessions table (if exists)
     try:
         result = session.execute(
-            text("UPDATE recording_sessions SET unit_id = :new_id WHERE unit_id = :old_id"),
+            text("UPDATE monitoring_sessions SET unit_id = :new_id WHERE unit_id = :old_id"),
             {"new_id": new_id, "old_id": old_id}
         )
         if result.rowcount > 0:
-            print(f"  ✓ Updated recording_sessions ({result.rowcount} rows)")
+            print(f"  ✓ Updated monitoring_sessions ({result.rowcount} rows)")
     except Exception:
         pass  # Table may not exist
273
templates/admin_watchers.html
Normal file
@@ -0,0 +1,273 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Watcher Manager — Admin{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="mb-6">
|
||||
<div class="flex items-center gap-3">
|
||||
<h1 class="text-3xl font-bold text-gray-900 dark:text-white">Watcher Manager</h1>
|
||||
<span class="px-2 py-0.5 text-xs font-semibold bg-orange-100 text-orange-700 dark:bg-orange-900 dark:text-orange-300 rounded-full">Admin</span>
|
||||
</div>
|
||||
<p class="text-gray-500 dark:text-gray-400 mt-1 text-sm">
|
||||
Monitor and manage field watcher agents. Data updates on each heartbeat received.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Agent cards -->
|
||||
<div id="agent-list" class="space-y-4">
|
||||
|
||||
{% if not agents %}
|
||||
<div class="bg-white dark:bg-slate-800 rounded-xl shadow p-8 text-center">
|
||||
<svg class="w-12 h-12 mx-auto text-gray-300 dark:text-gray-600 mb-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="1.5" d="M9.75 17L9 20l-1 1h8l-1-1-.75-3M3 13h18M5 17h14a2 2 0 002-2V5a2 2 0 00-2-2H5a2 2 0 00-2 2v10a2 2 0 002 2z"/>
|
||||
</svg>
|
||||
<p class="text-gray-500 dark:text-gray-400">No watcher agents have reported in yet.</p>
|
||||
<p class="text-sm text-gray-400 dark:text-gray-500 mt-1">Once a watcher sends its first heartbeat it will appear here.</p>
|
||||
</div>
|
||||
{% endif %}
|
||||
|
||||
{% for agent in agents %}
|
||||
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg overflow-hidden" id="agent-{{ agent.id | replace(' ', '-') }}">
|
||||
|
||||
<!-- Card header -->
|
||||
<div class="flex items-center justify-between px-6 py-4 border-b border-gray-100 dark:border-slate-700">
|
||||
<div class="flex items-center gap-3">
|
||||
<!-- Status dot -->
|
||||
{% if agent.status == 'ok' %}
|
||||
<span class="status-dot inline-block w-3 h-3 rounded-full bg-green-500 flex-shrink-0"></span>
|
||||
{% elif agent.status == 'pending' %}
|
||||
<span class="status-dot inline-block w-3 h-3 rounded-full bg-yellow-400 flex-shrink-0"></span>
|
||||
{% elif agent.status in ('missing', 'error') %}
|
||||
<span class="status-dot inline-block w-3 h-3 rounded-full bg-red-500 flex-shrink-0"></span>
|
||||
{% else %}
|
||||
<span class="status-dot inline-block w-3 h-3 rounded-full bg-gray-400 flex-shrink-0"></span>
|
||||
{% endif %}
|
||||
|
||||
<div>
|
||||
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">{{ agent.id }}</h2>
|
||||
<div class="flex items-center gap-3 text-xs text-gray-500 dark:text-gray-400 mt-0.5">
|
||||
<span>{{ agent.source_type }}</span>
|
||||
{% if agent.version %}
|
||||
<span class="bg-gray-100 dark:bg-slate-700 px-1.5 py-0.5 rounded font-mono">v{{ agent.version }}</span>
|
||||
{% endif %}
|
||||
{% if agent.ip_address %}
|
||||
<span>{{ agent.ip_address }}</span>
|
||||
{% endif %}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="flex items-center gap-3">
|
||||
<!-- Status badge -->
|
||||
{% if agent.status == 'ok' %}
|
||||
<span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-green-100 text-green-700 dark:bg-green-900 dark:text-green-300">OK</span>
|
||||
{% elif agent.status == 'pending' %}
|
||||
<span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-yellow-100 text-yellow-700 dark:bg-yellow-900 dark:text-yellow-300">Pending</span>
|
||||
{% elif agent.status == 'missing' %}
|
||||
<span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-red-100 text-red-700 dark:bg-red-900 dark:text-red-300">Missing</span>
|
||||
{% elif agent.status == 'error' %}
|
||||
<span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-red-100 text-red-700 dark:bg-red-900 dark:text-red-300">Error</span>
|
||||
{% else %}
|
||||
<span class="status-badge px-2.5 py-1 text-xs font-semibold rounded-full bg-gray-100 text-gray-600 dark:bg-slate-700 dark:text-gray-400">Unknown</span>
|
||||
{% endif %}
|
||||
|
||||
<!-- Trigger Update button -->
|
||||
<button
|
||||
onclick="triggerUpdate('{{ agent.id }}')"
|
||||
class="px-3 py-1.5 text-xs font-medium bg-seismo-orange hover:bg-orange-600 text-white rounded-lg transition-colors"
|
||||
id="btn-update-{{ agent.id | replace(' ', '-') }}"
|
||||
>
|
||||
Trigger Update
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Meta row -->
|
||||
<div class="px-6 py-3 bg-gray-50 dark:bg-slate-800 border-b border-gray-100 dark:border-slate-700 flex flex-wrap gap-6 text-sm">
|
||||
<div>
|
||||
<span class="text-gray-500 dark:text-gray-400">Last seen</span>
|
||||
<span class="last-seen-value ml-2 font-medium text-gray-800 dark:text-gray-200">
|
||||
{% if agent.last_seen %}
|
||||
{{ agent.last_seen }}
|
||||
{% if agent.age_minutes is not none %}
|
||||
<span class="text-gray-400 dark:text-gray-500 font-normal">({{ agent.age_minutes }}m ago)</span>
|
||||
{% endif %}
|
||||
{% else %}
|
||||
Never
|
||||
{% endif %}
|
||||
</span>
|
||||
</div>
|
||||
<div class="update-pending-indicator flex items-center gap-1.5 text-yellow-600 dark:text-yellow-400 {% if not agent.update_pending %}hidden{% endif %}">
|
||||
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"/>
|
||||
</svg>
|
||||
<span class="text-xs font-semibold">Update pending — will apply on next heartbeat</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Log tail -->
|
||||
{% if agent.log_tail %}
|
||||
<div class="px-6 py-4">
|
||||
<div class="flex items-center justify-between mb-2">
|
||||
<span class="text-xs font-semibold text-gray-500 dark:text-gray-400 uppercase tracking-wide">Log Tail</span>
|
||||
<div class="flex items-center gap-3">
|
||||
<button onclick="expandLog('{{ agent.id | replace(' ', '-') }}')" id="expand-{{ agent.id | replace(' ', '-') }}" class="text-xs text-gray-400 hover:text-gray-600 dark:hover:text-gray-200">
|
||||
Expand
|
||||
</button>
|
||||
<button onclick="toggleLog('{{ agent.id | replace(' ', '-') }}')" class="text-xs text-gray-400 hover:text-gray-600 dark:hover:text-gray-200">
|
||||
Toggle
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
<pre id="log-{{ agent.id | replace(' ', '-') }}" class="text-xs font-mono bg-gray-900 text-green-400 rounded-lg p-3 overflow-x-auto max-h-96 overflow-y-auto leading-relaxed hidden">{{ agent.log_tail | log_tail_display }}</pre>
|
||||
</div>
|
||||
{% else %}
|
||||
<div class="px-6 py-4 text-xs text-gray-400 dark:text-gray-500 italic">No log data received yet.</div>
|
||||
{% endif %}
|
||||
|
||||
</div>
|
||||
{% endfor %}
|
||||
|
||||
</div>
|
||||
|
||||
<!-- Auto-refresh every 30s -->
|
||||
<div class="mt-6 text-xs text-gray-400 dark:text-gray-600 text-center">
|
||||
Auto-refreshes every 30 seconds — or <a href="/admin/watchers" class="underline hover:text-gray-600 dark:hover:text-gray-400">refresh now</a>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
function triggerUpdate(agentId) {
|
||||
if (!confirm('Trigger update for ' + agentId + '?\n\nThe watcher will self-update on its next heartbeat cycle.')) return;
|
||||
|
||||
const safeId = agentId.replace(/ /g, '-');
|
||||
const btn = document.getElementById('btn-update-' + safeId);
|
||||
btn.disabled = true;
|
||||
btn.textContent = 'Sending...';
|
||||
|
||||
fetch('/api/admin/watchers/' + encodeURIComponent(agentId) + '/trigger-update', {
|
||||
method: 'POST',
|
||||
headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify({})
|
||||
})
|
||||
.then(r => r.json())
|
||||
.then(data => {
|
||||
if (data.ok) {
|
||||
btn.textContent = 'Update Queued';
|
||||
btn.classList.remove('bg-seismo-orange', 'hover:bg-orange-600');
|
||||
btn.classList.add('bg-green-600');
|
||||
// Show the pending indicator immediately without a reload
|
||||
const card = document.getElementById('agent-' + safeId);
|
||||
if (card) {
|
||||
const indicator = card.querySelector('.update-pending-indicator');
|
||||
if (indicator) indicator.classList.remove('hidden');
|
||||
}
|
||||
} else {
|
||||
btn.textContent = 'Error';
|
||||
btn.classList.add('bg-red-600');
|
||||
btn.disabled = false;
|
||||
}
|
||||
})
|
||||
.catch(() => {
|
||||
btn.textContent = 'Failed';
|
||||
btn.classList.add('bg-red-600');
|
||||
btn.disabled = false;
|
||||
});
|
||||
}
|
||||
|
||||
function toggleLog(agentId) {
|
||||
const el = document.getElementById('log-' + agentId);
|
||||
if (el) el.classList.toggle('hidden');
|
||||
}
|
||||
|
||||
function expandLog(agentId) {
|
||||
const el = document.getElementById('log-' + agentId);
|
||||
const btn = document.getElementById('expand-' + agentId);
|
||||
if (!el) return;
|
||||
el.classList.remove('hidden');
|
||||
if (el.classList.contains('max-h-96')) {
|
||||
el.classList.remove('max-h-96');
|
||||
el.style.maxHeight = 'none';
|
||||
if (btn) btn.textContent = 'Collapse';
|
||||
} else {
|
||||
el.classList.add('max-h-96');
|
||||
el.style.maxHeight = '';
|
||||
if (btn) btn.textContent = 'Expand';
|
||||
}
|
||||
}
|
||||
|
||||
// Status colors for dot and badge by status value
|
||||
const STATUS_DOT = {
|
||||
ok: 'bg-green-500',
|
||||
pending: 'bg-yellow-400',
|
||||
missing: 'bg-red-500',
|
||||
error: 'bg-red-500',
|
||||
};
|
||||
const STATUS_BADGE_CLASSES = {
|
||||
ok: 'bg-green-100 text-green-700 dark:bg-green-900 dark:text-green-300',
|
||||
pending: 'bg-yellow-100 text-yellow-700 dark:bg-yellow-900 dark:text-yellow-300',
|
||||
missing: 'bg-red-100 text-red-700 dark:bg-red-900 dark:text-red-300',
|
||||
error: 'bg-red-100 text-red-700 dark:bg-red-900 dark:text-red-300',
|
||||
};
|
||||
const STATUS_BADGE_DEFAULT = 'bg-gray-100 text-gray-600 dark:bg-slate-700 dark:text-gray-400';
|
||||
const DOT_COLORS = ['bg-green-500', 'bg-yellow-400', 'bg-red-500', 'bg-gray-400'];
|
||||
const BADGE_COLORS = [
|
||||
'bg-green-100', 'text-green-700', 'dark:bg-green-900', 'dark:text-green-300',
|
||||
'bg-yellow-100', 'text-yellow-700', 'dark:bg-yellow-900', 'dark:text-yellow-300',
|
||||
'bg-red-100', 'text-red-700', 'dark:bg-red-900', 'dark:text-red-300',
|
||||
'bg-gray-100', 'text-gray-600', 'dark:bg-slate-700', 'dark:text-gray-400',
|
||||
];
|
||||
|

function patchAgent(card, agent) {
  // Status dot
  const dot = card.querySelector('.status-dot');
  if (dot) {
    dot.classList.remove(...DOT_COLORS);
    dot.classList.add(STATUS_DOT[agent.status] || 'bg-gray-400');
  }

  // Status badge
  const badge = card.querySelector('.status-badge');
  if (badge) {
    badge.classList.remove(...BADGE_COLORS);
    const label = agent.status ? agent.status.charAt(0).toUpperCase() + agent.status.slice(1) : 'Unknown';
    badge.textContent = label === 'Ok' ? 'OK' : label;
    const cls = STATUS_BADGE_CLASSES[agent.status] || STATUS_BADGE_DEFAULT;
    badge.classList.add(...cls.split(' '));
  }

  // Last seen / age
  const lastSeen = card.querySelector('.last-seen-value');
  if (lastSeen) {
    if (agent.last_seen) {
      const age = agent.age_minutes != null
        ? ` <span class="text-gray-400 dark:text-gray-500 font-normal">(${agent.age_minutes}m ago)</span>`
        : '';
      lastSeen.innerHTML = agent.last_seen + age;
    } else {
      lastSeen.textContent = 'Never';
    }
  }

  // Update pending indicator
  const indicator = card.querySelector('.update-pending-indicator');
  if (indicator) {
    indicator.classList.toggle('hidden', !agent.update_pending);
  }
}

function liveRefresh() {
  fetch('/api/admin/watchers')
    .then(r => r.json())
    .then(agents => {
      agents.forEach(agent => {
        const safeId = agent.id.replace(/ /g, '-');
        const card = document.getElementById('agent-' + safeId);
        if (card) patchAgent(card, agent);
      });
    })
    .catch(() => {}); // silently ignore fetch errors
}

setInterval(liveRefresh, 30000);
</script>
{% endblock %}
@@ -20,6 +20,9 @@

<!-- PWA Manifest -->
<link rel="manifest" href="/static/manifest.json">
+<link rel="icon" type="image/png" sizes="32x32" href="/static/icons/favicon-32.png">
+<link rel="icon" type="image/png" sizes="16x16" href="/static/icons/favicon-16.png">
+<link rel="apple-touch-icon" sizes="180x180" href="/static/icons/icon-192.png">
<meta name="theme-color" content="#f48b1c">
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
@@ -68,7 +71,7 @@

{% block extra_head %}{% endblock %}
</head>
-<body class="bg-gray-100 dark:bg-gray-900 text-gray-900 dark:text-gray-100">
+<body class="bg-gray-100 dark:bg-slate-800 text-gray-900 dark:text-gray-100">

<!-- Offline Indicator -->
<div id="offlineIndicator" class="offline-indicator">
@@ -82,13 +85,13 @@

<div class="flex h-screen overflow-hidden">
  <!-- Sidebar (Responsive) -->
-  <aside id="sidebar" class="sidebar w-64 bg-white dark:bg-slate-800 shadow-lg flex flex-col">
+  <aside id="sidebar" class="sidebar w-64 bg-white dark:bg-slate-800 shadow-lg flex flex-col{% if request.query_params.get('embed') == '1' %} hidden{% endif %}">
    <!-- Logo -->
    <div class="p-6 border-b border-gray-200 dark:border-gray-700">
-      <h1 class="text-2xl font-bold text-seismo-navy dark:text-seismo-orange">
-        Seismo<br>
-        <span class="text-seismo-orange dark:text-seismo-burgundy">Fleet Manager</span>
-      </h1>
+      <a href="/" class="block">
+        <img src="/static/terra-view-logo-light.png" srcset="/static/terra-view-logo-light.png 1x, /static/terra-view-logo-light@2x.png 2x" alt="Terra-View" class="block dark:hidden w-44 h-auto">
+        <img src="/static/terra-view-logo-dark.png" srcset="/static/terra-view-logo-dark.png 1x, /static/terra-view-logo-dark@2x.png 2x" alt="Terra-View" class="hidden dark:block w-44 h-auto">
+      </a>
      <div class="flex items-center justify-between mt-2">
        <p class="text-xs text-gray-500 dark:text-gray-400">v {{ version }}</p>
        {% if environment == 'development' %}
@@ -127,6 +130,20 @@
  Sound Level Meters
</a>

+<a href="/modems" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/modems' %}bg-gray-100 dark:bg-gray-700{% endif %}">
+  <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
+  </svg>
+  Modems
+</a>
+
+<a href="/pair-devices" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/pair-devices' %}bg-gray-100 dark:bg-gray-700{% endif %}">
+  <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1"></path>
+  </svg>
+  Pair Devices
+</a>
+
<a href="/projects" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path.startswith('/projects') %}bg-gray-100 dark:bg-gray-700{% endif %}">
  <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10"></path>
@@ -134,6 +151,13 @@
  Projects
</a>

+<a href="/fleet-calendar" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path.startswith('/fleet-calendar') %}bg-gray-100 dark:bg-gray-700{% endif %}">
+  <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"></path>
+  </svg>
+  Job Planner
+</a>
+
<a href="/settings" class="flex items-center px-4 py-3 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700 {% if request.url.path == '/settings' %}bg-gray-100 dark:bg-gray-700{% endif %}">
  <svg class="w-5 h-5 mr-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M10.325 4.317c.426-1.756 2.924-1.756 3.35 0a1.724 1.724 0 002.573 1.066c1.543-.94 3.31.826 2.37 2.37a1.724 1.724 0 001.065 2.572c1.756.426 1.756 2.924 0 3.35a1.724 1.724 0 00-1.066 2.573c.94 1.543-.826 3.31-2.37 2.37a1.724 1.724 0 00-2.572 1.065c-.426 1.756-2.924 1.756-3.35 0a1.724 1.724 0 00-2.573-1.066c-1.543.94-3.31-.826-2.37-2.37a1.724 1.724 0 00-1.065-2.572c-1.756-.426-1.756-2.924 0-3.35a1.724 1.724 0 001.066-2.573c-.94-1.543.826-3.31 2.37-2.37.996.608 2.296.07 2.572-1.065z"></path>
@@ -169,14 +193,14 @@

<!-- Main content -->
<main class="main-content flex-1 overflow-y-auto">
-  <div class="p-8">
+  <div class="{% if request.query_params.get('embed') == '1' %}p-4{% else %}p-8{% endif %}">
    {% block content %}{% endblock %}
  </div>
</main>
</div>

<!-- Bottom Navigation (Mobile Only) -->
-<nav class="bottom-nav">
+<nav class="bottom-nav{% if request.query_params.get('embed') == '1' %} hidden{% endif %}">
  <div class="grid grid-cols-4 h-16">
    <button id="hamburgerBtn" class="bottom-nav-btn" onclick="toggleMenu()" aria-label="Menu">
      <svg fill="none" stroke="currentColor" viewBox="0 0 24 24">
@@ -374,10 +398,10 @@
</script>

<!-- Offline Database -->
-<script src="/static/offline-db.js?v=0.4.3"></script>
+<script src="/static/offline-db.js?v=0.6.1"></script>

<!-- Mobile JavaScript -->
-<script src="/static/mobile.js?v=0.4.3"></script>
+<script src="/static/mobile.js?v=0.6.1"></script>

{% block extra_scripts %}{% endblock %}
</body>
templates/combined_report_preview.html (new file, 315 lines)
@@ -0,0 +1,315 @@
{% extends "base.html" %}

{% block title %}Combined Report Preview - {{ project.name }}{% endblock %}

{% block content %}
<!-- jspreadsheet CSS -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/jspreadsheet-ce@4/dist/jspreadsheet.min.css" />
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/jsuites@5/dist/jsuites.min.css" />

<div class="min-h-screen bg-gray-100 dark:bg-slate-900">
  <!-- Header -->
  <div class="bg-white dark:bg-slate-800 shadow-sm border-b border-gray-200 dark:border-gray-700">
    <div class="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-4">
      <div class="flex flex-col md:flex-row md:items-center md:justify-between gap-4">
        <div>
          <h1 class="text-2xl font-bold text-gray-900 dark:text-white">Combined Report Preview & Editor</h1>
          <p class="text-sm text-gray-500 dark:text-gray-400 mt-1">
            {{ location_data|length }} location{{ 's' if location_data|length != 1 else '' }}
            {% if time_filter_desc %} | {{ time_filter_desc }}{% endif %}
            | {{ total_rows }} total row{{ 's' if total_rows != 1 else '' }}
          </p>
        </div>
        <div class="flex items-center gap-3">
          <button onclick="downloadCombinedReport()" id="download-btn"
                  class="px-4 py-2 bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 transition-colors flex items-center gap-2 text-sm font-medium">
            <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
              <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-4l-4 4m0 0l-4-4m4 4V4"></path>
            </svg>
            Generate Reports (ZIP)
          </button>
          <a href="/api/projects/{{ project_id }}/combined-report-wizard"
             class="px-4 py-2 bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-lg hover:bg-gray-300 dark:hover:bg-gray-600 transition-colors text-sm">
            ← Back to Config
          </a>
        </div>
      </div>
    </div>
  </div>

  <div class="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-4 space-y-4">

    <!-- Report Metadata -->
    <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-4">
      <div class="grid grid-cols-1 md:grid-cols-3 gap-4">
        <div>
          <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Report Title</label>
          <input type="text" id="edit-report-title" value="{{ report_title }}"
                 class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
        </div>
        <div>
          <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Project Name</label>
          <input type="text" id="edit-project-name" value="{{ project_name }}"
                 class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
        </div>
        <div>
          <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Client Name</label>
          <input type="text" id="edit-client-name" value="{{ client_name }}"
                 class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md bg-white dark:bg-gray-700 text-gray-900 dark:text-white text-sm">
        </div>
      </div>
    </div>

    <!-- Location Tabs + Spreadsheet -->
    <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700">

      <!-- Tab Bar -->
      <div class="border-b border-gray-200 dark:border-gray-700 overflow-x-auto">
        <div class="flex min-w-max" id="tab-bar">
          {% for loc in location_data %}
          <button onclick="switchTab({{ loop.index0 }})"
                  id="tab-btn-{{ loop.index0 }}"
                  class="tab-btn px-4 py-3 text-sm font-medium whitespace-nowrap border-b-2 transition-colors
                         {% if loop.first %}border-emerald-500 text-emerald-600 dark:text-emerald-400
                         {% else %}border-transparent text-gray-500 dark:text-gray-400 hover:text-gray-700 dark:hover:text-gray-300 hover:border-gray-300{% endif %}">
            {{ loc.location_name }}
            <span class="ml-1.5 text-xs px-1.5 py-0.5 rounded-full
                         {% if loop.first %}bg-emerald-100 text-emerald-700 dark:bg-emerald-900/40 dark:text-emerald-400
                         {% else %}bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400{% endif %}"
                  id="tab-count-{{ loop.index0 }}">
              {{ loc.filtered_count }}
            </span>
          </button>
          {% endfor %}
        </div>
      </div>

      <!-- Spreadsheet Panels -->
      <div class="p-4">
        <div class="flex items-center justify-between mb-3">
          <h3 class="text-base font-semibold text-gray-900 dark:text-white" id="active-tab-title">
            {{ location_data[0].location_name if location_data else '' }}
          </h3>
          <div class="flex items-center gap-2 text-sm text-gray-500 dark:text-gray-400">
            <span>Right-click for options</span>
            <span class="text-gray-300 dark:text-gray-600">|</span>
            <span>Double-click to edit</span>
          </div>
        </div>

        {% for loc in location_data %}
        <div id="panel-{{ loop.index0 }}" class="tab-panel {% if not loop.first %}hidden{% endif %} overflow-x-auto">
          <div id="spreadsheet-{{ loop.index0 }}"></div>
        </div>
        {% endfor %}
      </div>
    </div>

    <!-- Help -->
    <div class="bg-blue-50 dark:bg-blue-900/20 rounded-lg p-4">
      <h3 class="text-sm font-medium text-blue-800 dark:text-blue-300 mb-2">Editing Tips</h3>
      <ul class="text-sm text-blue-700 dark:text-blue-400 list-disc list-inside space-y-1">
        <li>Double-click any cell to edit its value</li>
        <li>Use the Comments column to add notes about specific measurements</li>
        <li>Right-click a row to insert or delete rows</li>
        <li>Press Enter to confirm edits, Escape to cancel</li>
        <li>Switch between location tabs to edit each location's data independently</li>
      </ul>
    </div>

  </div>
</div>

<!-- jspreadsheet JS -->
<script src="https://cdn.jsdelivr.net/npm/jsuites@5/dist/jsuites.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/jspreadsheet-ce@4/dist/index.min.js"></script>

<script>
const allLocationData = {{ locations_json | safe }};
const spreadsheets = {};
let activeTabIdx = 0;

const columnDef = [
  { title: 'Test #', width: 80, type: 'numeric' },
  { title: 'Date', width: 110, type: 'text' },
  { title: 'Time', width: 90, type: 'text' },
  { title: 'LAmax (dBA)', width: 110, type: 'numeric' },
  { title: 'LA01 (dBA)', width: 110, type: 'numeric' },
  { title: 'LA10 (dBA)', width: 110, type: 'numeric' },
  { title: 'Comments', width: 250, type: 'text' },
];

const jssOptions = {
  columns: columnDef,
  allowInsertRow: true,
  allowDeleteRow: true,
  allowInsertColumn: false,
  allowDeleteColumn: false,
  rowDrag: true,
  columnSorting: true,
  search: true,
  pagination: 50,
  paginationOptions: [25, 50, 100, 200],
  defaultColWidth: 100,
  minDimensions: [7, 1],
  tableOverflow: true,
  tableWidth: '100%',
  contextMenu: function(instance, col, row, e) {
    const items = [];
    if (row !== null) {
      items.push({
        title: 'Insert row above',
        onclick: function() { instance.insertRow(1, row, true); }
      });
      items.push({
        title: 'Insert row below',
        onclick: function() { instance.insertRow(1, row + 1, false); }
      });
      items.push({
        title: 'Delete this row',
        onclick: function() { instance.deleteRow(row); }
      });
    }
    return items;
  },
  style: {
    A: 'text-align: center;',
    B: 'text-align: center;',
    C: 'text-align: center;',
    D: 'text-align: right;',
    E: 'text-align: right;',
    F: 'text-align: right;',
  }
};

document.addEventListener('DOMContentLoaded', function() {
  allLocationData.forEach(function(loc, idx) {
    const el = document.getElementById('spreadsheet-' + idx);
    if (!el) return;
    const opts = Object.assign({}, jssOptions, { data: loc.spreadsheet_data });
    spreadsheets[idx] = jspreadsheet(el, opts);
  });
  if (allLocationData.length > 0) {
    switchTab(0);
  }
});

function switchTab(idx) {
  activeTabIdx = idx;

  // Update panels
  document.querySelectorAll('.tab-panel').forEach(function(panel, i) {
    panel.classList.toggle('hidden', i !== idx);
  });

  // Update tab button styles
  document.querySelectorAll('.tab-btn').forEach(function(btn, i) {
    const countBadge = document.getElementById('tab-count-' + i);
    if (i === idx) {
      btn.classList.add('border-emerald-500', 'text-emerald-600', 'dark:text-emerald-400');
      btn.classList.remove('border-transparent', 'text-gray-500', 'dark:text-gray-400');
      if (countBadge) {
        countBadge.classList.add('bg-emerald-100', 'text-emerald-700', 'dark:bg-emerald-900/40', 'dark:text-emerald-400');
        countBadge.classList.remove('bg-gray-100', 'text-gray-500', 'dark:bg-gray-700', 'dark:text-gray-400');
      }
    } else {
      btn.classList.remove('border-emerald-500', 'text-emerald-600', 'dark:text-emerald-400');
      btn.classList.add('border-transparent', 'text-gray-500', 'dark:text-gray-400');
      if (countBadge) {
        countBadge.classList.remove('bg-emerald-100', 'text-emerald-700', 'dark:bg-emerald-900/40', 'dark:text-emerald-400');
        countBadge.classList.add('bg-gray-100', 'text-gray-500', 'dark:bg-gray-700', 'dark:text-gray-400');
      }
    }
  });

  // Update title
  if (allLocationData[idx]) {
    document.getElementById('active-tab-title').textContent = allLocationData[idx].location_name;
  }

  // Refresh jspreadsheet rendering after showing panel
  if (spreadsheets[idx]) {
    try { spreadsheets[idx].updateTable(); } catch(e) {}
  }
}

async function downloadCombinedReport() {
  const btn = document.getElementById('download-btn');
  const originalText = btn.innerHTML;
  btn.disabled = true;
  btn.innerHTML = '<svg class="w-5 h-5 animate-spin" fill="none" viewBox="0 0 24 24"><circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle><path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4z"></path></svg> Generating ZIP...';

  try {
    const locations = allLocationData.map(function(loc, idx) {
      return {
        session_id: loc.session_id || '',
        session_label: loc.session_label || '',
        period_type: loc.period_type || '',
        started_at: loc.started_at || '',
        location_name: loc.location_name,
        spreadsheet_data: spreadsheets[idx] ? spreadsheets[idx].getData() : loc.spreadsheet_data,
      };
    });

    const payload = {
      report_title: document.getElementById('edit-report-title').value || 'Background Noise Study',
      project_name: document.getElementById('edit-project-name').value || '',
      client_name: document.getElementById('edit-client-name').value || '',
      locations: locations,
    };

    const response = await fetch('/api/projects/{{ project_id }}/generate-combined-from-preview', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    });

    if (response.ok) {
      const blob = await response.blob();
      const url = window.URL.createObjectURL(blob);
      const a = document.createElement('a');
      a.href = url;

      const contentDisposition = response.headers.get('Content-Disposition');
      let filename = 'combined_reports.zip';
      if (contentDisposition) {
        const match = contentDisposition.match(/filename="(.+)"/);
        if (match) filename = match[1];
      }

      a.download = filename;
      document.body.appendChild(a);
      a.click();
      window.URL.revokeObjectURL(url);
      a.remove();
    } else {
      const error = await response.json();
      alert('Error generating report: ' + (error.detail || 'Unknown error'));
    }
  } catch (error) {
    alert('Error generating report: ' + error.message);
  } finally {
    btn.disabled = false;
    btn.innerHTML = originalText;
  }
}
</script>

<style>
/* Dark mode jspreadsheet styles */
.dark .jexcel { background-color: #1e293b; color: #e2e8f0; }
.dark .jexcel thead td { background-color: #334155 !important; color: #e2e8f0 !important; border-color: #475569 !important; }
.dark .jexcel tbody td { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel tbody td:hover { background-color: #334155; }
.dark .jexcel tbody tr:nth-child(even) td { background-color: #0f172a; }
.dark .jexcel_pagination { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_pagination a { color: #e2e8f0; }
.dark .jexcel_search { background-color: #1e293b; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_search input { background-color: #334155; color: #e2e8f0; border-color: #475569; }
.dark .jexcel_content { background-color: #1e293b; }
.dark .jexcel_contextmenu { background-color: #1e293b; border-color: #475569; }
.dark .jexcel_contextmenu a { color: #e2e8f0; }
.dark .jexcel_contextmenu a:hover { background-color: #334155; }
.jexcel_content { max-height: 600px; overflow: auto; }
</style>
{% endblock %}
templates/combined_report_wizard.html (new file, 393 lines)
@@ -0,0 +1,393 @@
{% extends "base.html" %}

{% block title %}Combined Report Wizard - {{ project.name }}{% endblock %}

{% block content %}
<div class="min-h-screen bg-gray-100 dark:bg-slate-900">
  <!-- Header -->
  <div class="bg-white dark:bg-slate-800 shadow-sm border-b border-gray-200 dark:border-gray-700">
    <div class="max-w-4xl mx-auto px-4 sm:px-6 lg:px-8 py-4">
      <div class="flex flex-col md:flex-row md:items-center md:justify-between gap-4">
        <div>
          <h1 class="text-2xl font-bold text-gray-900 dark:text-white">Combined Report Wizard</h1>
          <p class="text-sm text-gray-500 dark:text-gray-400 mt-1">{{ project.name }}</p>
        </div>
        <a href="/projects/{{ project_id }}"
           class="px-4 py-2 bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-lg hover:bg-gray-300 dark:hover:bg-gray-600 transition-colors text-sm w-fit">
          ← Back to Project
        </a>
      </div>
    </div>
  </div>

  <div class="max-w-4xl mx-auto px-4 sm:px-6 lg:px-8 py-6 space-y-6">

    <!-- Report Settings Card -->
    <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-6">
      <h2 class="text-lg font-semibold text-gray-900 dark:text-white mb-4">Report Settings</h2>

      <!-- Template Selection -->
      <div class="flex items-end gap-2 mb-4">
        <div class="flex-1">
          <label for="template-select" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
            Load Template
          </label>
          <select id="template-select" onchange="applyTemplate()"
                  class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
            <option value="">-- Select a template --</option>
          </select>
        </div>
        <button type="button" onclick="saveAsTemplate()"
                class="px-3 py-2 text-sm bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 rounded-md hover:bg-gray-300 dark:hover:bg-gray-600"
                title="Save current settings as template">
          <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7H5a2 2 0 00-2 2v9a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-3m-1 4l-3 3m0 0l-3-3m3 3V4"></path>
          </svg>
        </button>
      </div>

      <!-- Report Title -->
      <div class="mb-4">
        <label for="report-title" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
          Report Title
        </label>
        <input type="text" id="report-title" value="Background Noise Study"
               class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
      </div>

      <!-- Project and Client -->
      <div class="grid grid-cols-1 sm:grid-cols-2 gap-4">
        <div>
          <label for="report-project" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
            Project Name
          </label>
          <input type="text" id="report-project" value="{{ project.name }}"
                 class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
        </div>
        <div>
          <label for="report-client" class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
            Client Name
          </label>
          <input type="text" id="report-client" value="{{ project.client_name if project.client_name else '' }}"
                 class="block w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-md shadow-sm bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-emerald-500 focus:border-emerald-500 sm:text-sm">
        </div>
      </div>
    </div>

    <!-- Sessions Card -->
    <div class="bg-white dark:bg-slate-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700 p-6 overflow-hidden">
      <div class="flex items-center justify-between mb-1">
        <h2 class="text-lg font-semibold text-gray-900 dark:text-white">Monitoring Sessions</h2>
        <div class="flex gap-3 text-sm">
          <button type="button" onclick="selectAllSessions()" class="text-emerald-600 dark:text-emerald-400 hover:underline">Select All</button>
          <button type="button" onclick="deselectAllSessions()" class="text-gray-500 dark:text-gray-400 hover:underline">Deselect All</button>
        </div>
      </div>
      <p class="text-sm text-gray-500 dark:text-gray-400 mb-4">
        <span id="selected-count">0</span> session(s) selected — each selected session becomes one sheet in the ZIP.
        Change the period type per session to control how stats are bucketed (Day vs Night).
      </p>

      {% if locations %}
      {% for loc in locations %}
      {% set loc_name = loc.name %}
      {% set sessions = loc.sessions %}
      <div class="border border-gray-200 dark:border-gray-700 rounded-lg mb-3 overflow-hidden">
        <!-- Location header / toggle -->
        <button type="button"
                onclick="toggleLocation('loc-{{ loop.index }}')"
                class="w-full flex items-center justify-between px-4 py-3 bg-gray-50 dark:bg-slate-700/50 hover:bg-gray-100 dark:hover:bg-slate-700 transition-colors text-left">
          <div class="flex items-center gap-3">
            <svg id="chevron-loc-{{ loop.index }}" class="w-4 h-4 text-gray-400 transition-transform" fill="none" stroke="currentColor" viewBox="0 0 24 24">
              <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
            </svg>
            <span class="font-medium text-gray-900 dark:text-white text-sm">{{ loc_name }}</span>
            <span class="text-xs text-gray-400 dark:text-gray-500">{{ sessions|length }} session{{ 's' if sessions|length != 1 else '' }}</span>
          </div>
          <div class="flex items-center gap-3 text-xs" onclick="event.stopPropagation()">
            <button type="button" onclick="selectLocation('loc-{{ loop.index }}')"
                    class="text-emerald-600 dark:text-emerald-400 hover:underline">All</button>
            <button type="button" onclick="deselectLocation('loc-{{ loop.index }}')"
                    class="text-gray-400 hover:underline">None</button>
          </div>
        </button>

        <!-- Session rows -->
        <div id="loc-{{ loop.index }}" class="divide-y divide-gray-100 dark:divide-gray-700/50">
          {% for s in sessions %}
          {% set pt_colors = {
            'weekday_day': 'bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300',
            'weekday_night': 'bg-indigo-100 text-indigo-800 dark:bg-indigo-900/30 dark:text-indigo-300',
            'weekend_day': 'bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300',
            'weekend_night': 'bg-purple-100 text-purple-800 dark:bg-purple-900/30 dark:text-purple-300',
          } %}
          {% set pt_labels = {
            'weekday_day': 'Weekday Day',
            'weekday_night': 'Weekday Night',
            'weekend_day': 'Weekend Day',
            'weekend_night': 'Weekend Night',
          } %}
          <div class="flex items-center gap-3 px-4 py-3 hover:bg-gray-50 dark:hover:bg-slate-700/30 transition-colors">
            <!-- Checkbox -->
            <input type="checkbox"
                   class="session-cb loc-{{ loop.index }}-cb h-4 w-4 text-emerald-600 border-gray-300 dark:border-gray-600 rounded focus:ring-emerald-500 shrink-0"
                   value="{{ s.session_id }}"
                   checked
                   onchange="updateSelectionStats()">

            <!-- Date/day info -->
            <div class="min-w-0 flex-1">
              <div class="flex flex-wrap items-center gap-2">
                <span class="text-sm font-medium text-gray-900 dark:text-white">
                  {{ s.day_of_week }} {{ s.date_display }}
                </span>
                {% if s.session_label %}
                <span class="text-xs text-gray-400 dark:text-gray-500 truncate">{{ s.session_label }}</span>
                {% endif %}
                {% if s.status == 'recording' %}
                <span class="px-1.5 py-0.5 text-xs bg-red-100 text-red-700 dark:bg-red-900/30 dark:text-red-300 rounded-full flex items-center gap-1">
                  <span class="w-1.5 h-1.5 bg-red-500 rounded-full animate-pulse"></span>Recording
                </span>
                {% endif %}
              </div>
              <div class="flex items-center gap-3 mt-0.5 text-xs text-gray-400 dark:text-gray-500">
                {% if s.started_at %}
                <span>{{ s.started_at }}</span>
                {% endif %}
                {% if s.duration_h is not none %}
                <span>{{ s.duration_h }}h {{ s.duration_m }}m</span>
                {% endif %}
              </div>
            </div>

            <!-- Period type dropdown -->
            <div class="relative shrink-0" id="wiz-period-wrap-{{ s.session_id }}">
              <button type="button"
                      onclick="toggleWizPeriodMenu('{{ s.session_id }}')"
                      id="wiz-period-badge-{{ s.session_id }}"
                      class="px-2 py-0.5 text-xs font-medium rounded-full flex items-center gap-1 transition-colors {{ pt_colors.get(s.period_type, 'bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400') }}"
                      title="Click to change period type">
                <span id="wiz-period-label-{{ s.session_id }}">{{ pt_labels.get(s.period_type, 'Set period') }}</span>
                <svg class="w-3 h-3 opacity-60 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                  <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
                </svg>
              </button>
              <div id="wiz-period-menu-{{ s.session_id }}"
                   class="hidden absolute right-0 top-full mt-1 z-20 bg-white dark:bg-slate-700 border border-gray-200 dark:border-gray-600 rounded-lg shadow-lg min-w-[160px] py-1">
                {% for pt, pt_label in [('weekday_day','Weekday Day'),('weekday_night','Weekday Night'),('weekend_day','Weekend Day'),('weekend_night','Weekend Night')] %}
                <button type="button"
                        onclick="setWizPeriodType('{{ s.session_id }}', '{{ pt }}')"
                        class="w-full text-left px-3 py-1.5 text-xs hover:bg-gray-100 dark:hover:bg-slate-600 text-gray-700 dark:text-gray-300 {% if s.period_type == pt %}font-bold{% endif %}">
                  {{ pt_label }}
                </button>
                {% endfor %}
              </div>
            </div>
          </div>
          {% endfor %}
        </div>
      </div>
      {% endfor %}
||||
{% else %}
|
||||
<div class="text-center py-10 text-gray-500 dark:text-gray-400">
|
||||
<svg class="w-12 h-12 mx-auto mb-3 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19V6l12-3v13M9 19c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zm12-3c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zM9 10l12-3"></path>
|
||||
</svg>
|
||||
<p>No monitoring sessions found.</p>
|
||||
<p class="text-sm mt-1">Upload data files to create sessions first.</p>
|
||||
</div>
|
||||
{% endif %}
|
||||
</div>
|
||||
|
||||
<!-- Footer Buttons -->
|
||||
<div class="flex flex-col sm:flex-row items-center justify-between gap-3 pb-6">
|
||||
<a href="/projects/{{ project_id }}"
|
||||
class="w-full sm:w-auto px-6 py-2.5 border border-gray-300 dark:border-gray-600 text-gray-700 dark:text-gray-300 bg-white dark:bg-gray-700 rounded-lg hover:bg-gray-50 dark:hover:bg-gray-600 transition-colors text-center text-sm font-medium">
|
||||
Cancel
|
||||
</a>
|
||||
<button type="button" onclick="gotoPreview()" id="preview-btn"
|
||||
{% if not locations %}disabled{% endif %}
|
||||
class="w-full sm:w-auto px-6 py-2.5 bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 transition-colors text-sm font-medium flex items-center justify-center gap-2 disabled:opacity-50 disabled:cursor-not-allowed">
|
||||
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15 12a3 3 0 11-6 0 3 3 0 016 0z"></path>
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M2.458 12C3.732 7.943 7.523 5 12 5c4.478 0 8.268 2.943 9.542 7-1.274 4.057-5.064 7-9.542 7-4.477 0-8.268-2.943-9.542-7z"></path>
|
||||
</svg>
|
||||
Preview & Edit →
|
||||
</button>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script>
const PROJECT_ID = '{{ project_id }}';

const PERIOD_COLORS = {
weekday_day: 'bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300',
weekday_night: 'bg-indigo-100 text-indigo-800 dark:bg-indigo-900/30 dark:text-indigo-300',
weekend_day: 'bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300',
weekend_night: 'bg-purple-100 text-purple-800 dark:bg-purple-900/30 dark:text-purple-300',
};
const PERIOD_LABELS = {
weekday_day: 'Weekday Day',
weekday_night: 'Weekday Night',
weekend_day: 'Weekend Day',
weekend_night: 'Weekend Night',
};
const ALL_PERIOD_BADGE_CLASSES = [
'bg-gray-100','text-gray-500','dark:bg-gray-700','dark:text-gray-400',
...new Set(Object.values(PERIOD_COLORS).flatMap(s => s.split(' ')))
];

// ── Location accordion ────────────────────────────────────────────

function toggleLocation(locId) {
const body = document.getElementById(locId);
const chevron = document.getElementById('chevron-' + locId);
body.classList.toggle('hidden');
chevron.style.transform = body.classList.contains('hidden') ? 'rotate(-90deg)' : '';
}

function selectLocation(locId) {
document.querySelectorAll('.' + locId + '-cb').forEach(cb => cb.checked = true);
updateSelectionStats();
}

function deselectLocation(locId) {
document.querySelectorAll('.' + locId + '-cb').forEach(cb => cb.checked = false);
updateSelectionStats();
}

function selectAllSessions() {
document.querySelectorAll('.session-cb').forEach(cb => cb.checked = true);
updateSelectionStats();
}

function deselectAllSessions() {
document.querySelectorAll('.session-cb').forEach(cb => cb.checked = false);
updateSelectionStats();
}

function updateSelectionStats() {
const count = document.querySelectorAll('.session-cb:checked').length;
document.getElementById('selected-count').textContent = count;
document.getElementById('preview-btn').disabled = count === 0;
}

// ── Period type dropdown (wizard) ─────────────────────────────────

function toggleWizPeriodMenu(sessionId) {
const menu = document.getElementById('wiz-period-menu-' + sessionId);
document.querySelectorAll('[id^="wiz-period-menu-"]').forEach(m => {
if (m.id !== 'wiz-period-menu-' + sessionId) m.classList.add('hidden');
});
menu.classList.toggle('hidden');
}

document.addEventListener('click', function(e) {
if (!e.target.closest('[id^="wiz-period-wrap-"]')) {
document.querySelectorAll('[id^="wiz-period-menu-"]').forEach(m => m.classList.add('hidden'));
}
});

async function setWizPeriodType(sessionId, periodType) {
document.getElementById('wiz-period-menu-' + sessionId).classList.add('hidden');
const badge = document.getElementById('wiz-period-badge-' + sessionId);
badge.disabled = true;
try {
const resp = await fetch(`/api/projects/${PROJECT_ID}/sessions/${sessionId}`, {
method: 'PATCH',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({period_type: periodType}),
});
if (!resp.ok) throw new Error(await resp.text());
ALL_PERIOD_BADGE_CLASSES.forEach(c => badge.classList.remove(c));
const colorStr = PERIOD_COLORS[periodType] || 'bg-gray-100 text-gray-500 dark:bg-gray-700 dark:text-gray-400';
badge.classList.add(...colorStr.split(' ').filter(Boolean));
document.getElementById('wiz-period-label-' + sessionId).textContent = PERIOD_LABELS[periodType] || periodType;
} catch(err) {
alert('Failed to update period type: ' + err.message);
} finally {
badge.disabled = false;
}
}

// ── Template management ───────────────────────────────────────────

let reportTemplates = [];

async function loadTemplates() {
try {
const resp = await fetch('/api/report-templates?project_id=' + PROJECT_ID);
if (resp.ok) {
reportTemplates = await resp.json();
populateTemplateDropdown();
}
} catch(e) { console.error('Error loading templates:', e); }
}

function populateTemplateDropdown() {
const select = document.getElementById('template-select');
if (!select) return;
select.innerHTML = '<option value="">-- Select a template --</option>';
reportTemplates.forEach(t => {
const opt = document.createElement('option');
opt.value = t.id;
opt.textContent = t.name;
opt.dataset.config = JSON.stringify(t);
select.appendChild(opt);
});
}

function applyTemplate() {
const select = document.getElementById('template-select');
const opt = select.options[select.selectedIndex];
if (!opt.value) return;
const t = JSON.parse(opt.dataset.config);
if (t.report_title) document.getElementById('report-title').value = t.report_title;
}

async function saveAsTemplate() {
const name = prompt('Enter a name for this template:');
if (!name) return;
const data = {
name,
project_id: PROJECT_ID,
report_title: document.getElementById('report-title').value || 'Background Noise Study',
};
try {
const resp = await fetch('/api/report-templates', {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify(data),
});
if (resp.ok) { alert('Template saved!'); loadTemplates(); }
else alert('Failed to save template');
} catch(e) { alert('Error: ' + e.message); }
}

// ── Navigate to preview ───────────────────────────────────────────

function gotoPreview() {
const checked = Array.from(document.querySelectorAll('.session-cb:checked')).map(cb => cb.value);
if (checked.length === 0) {
alert('Please select at least one session.');
return;
}
const params = new URLSearchParams({
report_title: document.getElementById('report-title').value || 'Background Noise Study',
project_name: document.getElementById('report-project').value || '',
client_name: document.getElementById('report-client').value || '',
selected_sessions: checked.join(','),
});
window.location.href = `/api/projects/${PROJECT_ID}/combined-report-preview?${params.toString()}`;
}

// ── Init ─────────────────────────────────────────────────────────

document.addEventListener('DOMContentLoaded', function() {
updateSelectionStats();
loadTemplates();
});
</script>
{% endblock %}
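The wizard script resolves a session badge's label and color classes from its `period_type`, falling back to a neutral "Set period" badge for unknown values. A standalone sketch of that lookup (the `resolveBadge` helper is illustrative and not part of the codebase; dark-mode class variants are omitted for brevity):

```javascript
// Mirrors PERIOD_LABELS / PERIOD_COLORS from the wizard script.
// resolveBadge is a hypothetical helper for illustration only.
const PERIOD_LABELS = {
  weekday_day: 'Weekday Day',
  weekday_night: 'Weekday Night',
  weekend_day: 'Weekend Day',
  weekend_night: 'Weekend Night',
};
const PERIOD_COLORS = {
  weekday_day: 'bg-blue-100 text-blue-800',
  weekday_night: 'bg-indigo-100 text-indigo-800',
  weekend_day: 'bg-amber-100 text-amber-800',
  weekend_night: 'bg-purple-100 text-purple-800',
};

function resolveBadge(periodType) {
  return {
    // Unknown period types fall through to the raw value, as in
    // setWizPeriodType's `PERIOD_LABELS[periodType] || periodType`.
    label: PERIOD_LABELS[periodType] || periodType || 'Set period',
    classes: (PERIOD_COLORS[periodType] || 'bg-gray-100 text-gray-500').split(' '),
  };
}
```

This keeps the label and color fallbacks in one place, so the badge text and its Tailwind classes can never disagree about whether a period type is recognized.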
@@ -27,10 +27,10 @@
hx-swap="none"
hx-on::after-request="updateDashboard(event)">

<div class="grid grid-cols-1 md:grid-cols-3 gap-6 mb-8">
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6 mb-8">

<!-- Fleet Summary Card -->
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="fleet-summary-card">
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="fleet-summary-card">
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-summary')">
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">Fleet Summary</h2>
<div class="flex items-center gap-2">
@@ -57,6 +57,10 @@
<span class="text-gray-600 dark:text-gray-400">Benched</span>
<span id="benched-units" class="text-3xl md:text-2xl font-bold text-gray-600 dark:text-gray-400">--</span>
</div>
<div class="flex justify-between items-center">
<span class="text-orange-600 dark:text-orange-400">Allocated</span>
<span id="allocated-units" class="text-3xl md:text-2xl font-bold text-orange-500 dark:text-orange-400">--</span>
</div>
<div class="border-t border-gray-200 dark:border-gray-700 pt-3 mt-3">
<p class="text-xs text-gray-500 dark:text-gray-500 mb-2 italic">By Device Type:</p>
<div class="flex justify-between items-center mb-1">
@@ -118,7 +122,7 @@
</div>

<!-- Recent Alerts Card -->
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="recent-alerts-card">
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="recent-alerts-card">
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-alerts')">
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">Recent Alerts</h2>
<div class="flex items-center gap-2">
@@ -138,7 +142,7 @@
</div>

<!-- Recently Called In Units Card -->
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="recent-callins-card">
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="recent-callins-card">
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-callins')">
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">Recent Call-Ins</h2>
<div class="flex items-center gap-2">
@@ -162,10 +166,95 @@
</div>
</div>

<!-- Today's Scheduled Actions Card -->
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="todays-actions-card">
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('todays-actions')">
<h2 class="text-lg font-semibold text-gray-900 dark:text-white">Today's Schedule</h2>
<div class="flex items-center gap-2">
<svg class="w-6 h-6 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z">
</path>
</svg>
<svg class="w-5 h-5 text-gray-500 transition-transform md:hidden chevron" id="todays-actions-chevron" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path>
</svg>
</div>
</div>
<div class="card-content" id="todays-actions-content"
hx-get="/dashboard/todays-actions"
hx-trigger="load, every 30s"
hx-swap="innerHTML">
<p class="text-sm text-gray-500 dark:text-gray-400">Loading scheduled actions...</p>
</div>
</div>

</div>

<!-- Dashboard Filters -->
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-4 mb-4" id="dashboard-filters-card">
<div class="flex items-center justify-between mb-3">
<h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300">Filter Dashboard</h3>
<button onclick="resetFilters()" class="text-xs text-gray-500 hover:text-seismo-orange dark:hover:text-seismo-orange transition-colors">
Reset Filters
</button>
</div>

<div class="flex flex-wrap gap-6">
<!-- Device Type Filters -->
<div class="flex flex-col gap-1">
<span class="text-xs text-gray-500 dark:text-gray-400 font-medium uppercase tracking-wide">Device Type</span>
<div class="flex gap-4">
<label class="flex items-center gap-1.5 cursor-pointer">
<input type="checkbox" id="filter-seismograph" checked
class="rounded border-gray-300 text-blue-600 focus:ring-blue-500 dark:border-gray-600 dark:bg-slate-800"
onchange="applyFilters()">
<span class="text-sm text-gray-700 dark:text-gray-300">Seismographs</span>
</label>
<label class="flex items-center gap-1.5 cursor-pointer">
<input type="checkbox" id="filter-slm" checked
class="rounded border-gray-300 text-purple-600 focus:ring-purple-500 dark:border-gray-600 dark:bg-slate-800"
onchange="applyFilters()">
<span class="text-sm text-gray-700 dark:text-gray-300">SLMs</span>
</label>
<label class="flex items-center gap-1.5 cursor-pointer">
<input type="checkbox" id="filter-modem" checked
class="rounded border-gray-300 text-cyan-600 focus:ring-cyan-500 dark:border-gray-600 dark:bg-slate-800"
onchange="applyFilters()">
<span class="text-sm text-gray-700 dark:text-gray-300">Modems</span>
</label>
</div>
</div>

<!-- Status Filters -->
<div class="flex flex-col gap-1">
<span class="text-xs text-gray-500 dark:text-gray-400 font-medium uppercase tracking-wide">Status</span>
<div class="flex gap-4">
<label class="flex items-center gap-1.5 cursor-pointer">
<input type="checkbox" id="filter-ok" checked
class="rounded border-gray-300 text-green-600 focus:ring-green-500 dark:border-gray-600 dark:bg-slate-800"
onchange="applyFilters()">
<span class="text-sm text-green-600 dark:text-green-400">OK</span>
</label>
<label class="flex items-center gap-1.5 cursor-pointer">
<input type="checkbox" id="filter-pending" checked
class="rounded border-gray-300 text-yellow-600 focus:ring-yellow-500 dark:border-gray-600 dark:bg-slate-800"
onchange="applyFilters()">
<span class="text-sm text-yellow-600 dark:text-yellow-400">Pending</span>
</label>
<label class="flex items-center gap-1.5 cursor-pointer">
<input type="checkbox" id="filter-missing" checked
class="rounded border-gray-300 text-red-600 focus:ring-red-500 dark:border-gray-600 dark:bg-slate-800"
onchange="applyFilters()">
<span class="text-sm text-red-600 dark:text-red-400">Missing</span>
</label>
</div>
</div>
</div>
</div>

<!-- Fleet Map -->
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6 mb-8" id="fleet-map-card">
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6 mb-8" id="fleet-map-card">
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-map')">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Fleet Map</h2>
<div class="flex items-center gap-2">
@@ -181,7 +270,7 @@
</div>

<!-- Recent Photos Section -->
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6 mb-8" id="recent-photos-card">
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6 mb-8" id="recent-photos-card">
<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('recent-photos')">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Recent Photos</h2>
<div class="flex items-center gap-2">
@@ -201,7 +290,7 @@
</div>

<!-- Fleet Status Section with Tabs -->
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-800 p-6" id="fleet-status-card">
<div class="rounded-xl shadow-lg bg-white dark:bg-slate-700 p-6" id="fleet-status-card">

<div class="flex items-center justify-between mb-4 cursor-pointer md:cursor-default" onclick="toggleCard('fleet-status')">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Fleet Status</h2>
@@ -279,6 +368,255 @@

<script>
// ===== Dashboard Filtering System =====
let currentSnapshotData = null; // Store latest snapshot data for re-filtering

// Filter state - tracks which device types and statuses to show
const filters = {
deviceTypes: {
seismograph: true,
sound_level_meter: true,
modem: true
},
statuses: {
OK: true,
Pending: true,
Missing: true
}
};

// Load saved filter preferences from localStorage
function loadFilterPreferences() {
const saved = localStorage.getItem('dashboardFilters');
if (saved) {
try {
const parsed = JSON.parse(saved);
if (parsed.deviceTypes) Object.assign(filters.deviceTypes, parsed.deviceTypes);
if (parsed.statuses) Object.assign(filters.statuses, parsed.statuses);
} catch (e) {
console.error('Error loading filter preferences:', e);
}
}

// Sync checkboxes with loaded state
const seismoCheck = document.getElementById('filter-seismograph');
const slmCheck = document.getElementById('filter-slm');
const modemCheck = document.getElementById('filter-modem');
const okCheck = document.getElementById('filter-ok');
const pendingCheck = document.getElementById('filter-pending');
const missingCheck = document.getElementById('filter-missing');

if (seismoCheck) seismoCheck.checked = filters.deviceTypes.seismograph;
if (slmCheck) slmCheck.checked = filters.deviceTypes.sound_level_meter;
if (modemCheck) modemCheck.checked = filters.deviceTypes.modem;
if (okCheck) okCheck.checked = filters.statuses.OK;
if (pendingCheck) pendingCheck.checked = filters.statuses.Pending;
if (missingCheck) missingCheck.checked = filters.statuses.Missing;
}

// Save filter preferences to localStorage
function saveFilterPreferences() {
localStorage.setItem('dashboardFilters', JSON.stringify(filters));
}

// Apply filters - called when any checkbox changes
function applyFilters() {
// Update filter state from checkboxes
const seismoCheck = document.getElementById('filter-seismograph');
const slmCheck = document.getElementById('filter-slm');
const modemCheck = document.getElementById('filter-modem');
const okCheck = document.getElementById('filter-ok');
const pendingCheck = document.getElementById('filter-pending');
const missingCheck = document.getElementById('filter-missing');

if (seismoCheck) filters.deviceTypes.seismograph = seismoCheck.checked;
if (slmCheck) filters.deviceTypes.sound_level_meter = slmCheck.checked;
if (modemCheck) filters.deviceTypes.modem = modemCheck.checked;
if (okCheck) filters.statuses.OK = okCheck.checked;
if (pendingCheck) filters.statuses.Pending = pendingCheck.checked;
if (missingCheck) filters.statuses.Missing = missingCheck.checked;

saveFilterPreferences();

// Re-render with current data and filters
if (currentSnapshotData) {
renderFilteredDashboard(currentSnapshotData);
}
}

// Reset all filters to show everything
function resetFilters() {
filters.deviceTypes = { seismograph: true, sound_level_meter: true, modem: true };
filters.statuses = { OK: true, Pending: true, Missing: true };

// Update all checkboxes
const checkboxes = [
'filter-seismograph', 'filter-slm', 'filter-modem',
'filter-ok', 'filter-pending', 'filter-missing'
];
checkboxes.forEach(id => {
const el = document.getElementById(id);
if (el) el.checked = true;
});

saveFilterPreferences();

if (currentSnapshotData) {
renderFilteredDashboard(currentSnapshotData);
}
}

// Check if a unit passes the current filters
function unitPassesFilter(unit) {
const deviceType = unit.device_type || 'seismograph';
const status = unit.status || 'Missing';

// Check device type filter
if (!filters.deviceTypes[deviceType]) {
return false;
}

// Check status filter
if (!filters.statuses[status]) {
return false;
}

return true;
}
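The `unitPassesFilter` predicate above reads the module-level `filters` object and applies two independent gates: device type, then status, with `'seismograph'` and `'Missing'` as the defaults for units missing those fields. A minimal sketch of the same logic with `filters` passed in explicitly (a hypothetical refactor for testability, not how the dashboard code is structured):

```javascript
// Pure variant of unitPassesFilter: filters is a parameter rather than
// a module-level object, so the predicate can be exercised in isolation.
function unitPassesFilterPure(unit, filters) {
  const deviceType = unit.device_type || 'seismograph'; // same defaults as the dashboard
  const status = unit.status || 'Missing';
  return Boolean(filters.deviceTypes[deviceType]) && Boolean(filters.statuses[status]);
}

// Example filter state: hide modems and Missing units.
const exampleFilters = {
  deviceTypes: { seismograph: true, sound_level_meter: true, modem: false },
  statuses: { OK: true, Pending: true, Missing: false },
};
```

With these example filters, a modem is hidden even when its status box is checked, and a unit with no fields at all defaults to a Missing seismograph and is hidden too.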
// Get display label for device type
function getDeviceTypeLabel(deviceType) {
switch(deviceType) {
case 'sound_level_meter': return 'SLM';
case 'modem': return 'Modem';
default: return 'Seismograph';
}
}

// Render dashboard with filtered data
function renderFilteredDashboard(data) {
// Filter active units for alerts
const filteredActive = {};
Object.entries(data.active || {}).forEach(([id, unit]) => {
if (unitPassesFilter(unit)) {
filteredActive[id] = unit;
}
});

// Update alerts with filtered data
updateAlertsFiltered(filteredActive);

// Update map with filtered data
updateFleetMapFiltered(data.units);
}

// Update the Recent Alerts section with filtering
function updateAlertsFiltered(filteredActive) {
const alertsList = document.getElementById('alerts-list');
const missingUnits = Object.entries(filteredActive).filter(([_, u]) => u.status === 'Missing' && u.device_type !== 'modem');

if (!missingUnits.length) {
// Check if this is because of filters or genuinely no alerts
const anyMissing = currentSnapshotData && Object.values(currentSnapshotData.active || {}).some(u => u.status === 'Missing');
if (anyMissing) {
alertsList.innerHTML = '<p class="text-sm text-gray-500 dark:text-gray-400">No alerts match current filters</p>';
} else {
alertsList.innerHTML = '<p class="text-sm text-green-600 dark:text-green-400">All units reporting normally</p>';
}
} else {
let alertsHtml = '';
missingUnits.forEach(([id, unit]) => {
const deviceLabel = getDeviceTypeLabel(unit.device_type);
alertsHtml += `
<div class="flex items-start space-x-2 text-sm">
<span class="w-2 h-2 rounded-full bg-red-500 mt-1.5"></span>
<div>
<a href="/unit/${id}" class="font-medium text-red-600 dark:text-red-400 hover:underline">${id}</a>
<span class="text-xs text-gray-500 ml-1">(${deviceLabel})</span>
<p class="text-gray-600 dark:text-gray-400">Missing for ${unit.age}</p>
</div>
</div>`;
});
alertsList.innerHTML = alertsHtml;
}
}

// Update map with filtered data
function updateFleetMapFiltered(allUnits) {
if (!fleetMap) return;

// Clear existing markers
fleetMarkers.forEach(marker => fleetMap.removeLayer(marker));
fleetMarkers = [];

// Get deployed units with coordinates that pass the filter
const deployedUnits = Object.entries(allUnits || {})
.filter(([_, u]) => u.deployed && u.coordinates && unitPassesFilter(u));

if (deployedUnits.length === 0) {
return;
}

const bounds = [];

deployedUnits.forEach(([id, unit]) => {
const coords = parseLocation(unit.coordinates);
if (coords) {
const [lat, lon] = coords;

// Color based on status
const markerColor = unit.status === 'OK' ? 'green' :
unit.status === 'Pending' ? 'orange' : 'red';

// Different marker style per device type
const deviceType = unit.device_type || 'seismograph';
let radius = 8;
let weight = 2;

if (deviceType === 'modem') {
radius = 6;
weight = 2;
} else if (deviceType === 'sound_level_meter') {
radius = 8;
weight = 3;
}

const marker = L.circleMarker([lat, lon], {
radius: radius,
fillColor: markerColor,
color: '#fff',
weight: weight,
opacity: 1,
fillOpacity: 0.8
}).addTo(fleetMap);

// Popup with device type
const deviceLabel = getDeviceTypeLabel(deviceType);

marker.bindPopup(`
<div class="p-2">
<h3 class="font-bold text-lg">${id}</h3>
<p class="text-sm text-gray-600">${deviceLabel}</p>
<p class="text-sm">Status: <span style="color: ${markerColor}">${unit.status}</span></p>
${unit.note ? `<p class="text-sm text-gray-600">${unit.note}</p>` : ''}
<a href="/unit/${id}" class="text-blue-600 hover:underline text-sm">View Details</a>
</div>
`);

fleetMarkers.push(marker);
bounds.push([lat, lon]);
}
});

// Only fit bounds on initial load, not on subsequent updates
// This preserves the user's current map view when auto-refreshing
if (bounds.length > 0 && !fleetMapInitialized) {
const padding = window.innerWidth < 768 ? [20, 20] : [50, 50];
fleetMap.fitBounds(bounds, { padding: padding });
fleetMapInitialized = true;
}
}
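The marker branches in `updateFleetMapFiltered` encode two independent rules: status selects the fill color, device type selects the circle radius and border weight. A sketch factoring that into a pure helper (the `markerStyle` name is illustrative, not a function in the codebase):

```javascript
// Status picks the fill color; device type picks radius and border weight,
// mirroring the branches in updateFleetMapFiltered.
function markerStyle(unit) {
  const color = unit.status === 'OK' ? 'green'
              : unit.status === 'Pending' ? 'orange' : 'red';
  const deviceType = unit.device_type || 'seismograph';
  if (deviceType === 'modem') return { color, radius: 6, weight: 2 };
  if (deviceType === 'sound_level_meter') return { color, radius: 8, weight: 3 };
  return { color, radius: 8, weight: 2 }; // seismograph default
}
```

The returned object maps directly onto `L.circleMarker`'s `fillColor`, `radius`, and `weight` options, so the style rules stay in one testable place.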
// Toggle card collapse/expand (mobile only)
function toggleCard(cardName) {
// Only work on mobile
@@ -316,7 +654,7 @@ function toggleCard(cardName) {
// Restore card states from localStorage on page load
function restoreCardStates() {
const cardStates = JSON.parse(localStorage.getItem('dashboardCardStates') || '{}');
const cardNames = ['fleet-summary', 'recent-alerts', 'recent-callins', 'fleet-map', 'fleet-status'];
const cardNames = ['fleet-summary', 'recent-alerts', 'recent-callins', 'todays-actions', 'fleet-map', 'fleet-status'];

cardNames.forEach(cardName => {
const content = document.getElementById(`${cardName}-content`);
@@ -343,8 +681,17 @@ if (document.readyState === 'loading') {

function updateDashboard(event) {
try {
// Only process responses from /api/status-snapshot
const requestUrl = event.detail.xhr.responseURL || event.detail.pathInfo?.requestPath;
if (!requestUrl || !requestUrl.includes('/api/status-snapshot')) {
return; // Ignore responses from other endpoints (like /dashboard/todays-actions)
}

const data = JSON.parse(event.detail.xhr.response);

// Store data for filter re-application
currentSnapshotData = data;

// Update "Last updated" timestamp with timezone
const now = new Date();
const timezone = localStorage.getItem('timezone') || 'America/New_York';
@@ -356,17 +703,19 @@ function updateDashboard(event) {
timeZoneName: 'short'
});

// ===== Fleet summary numbers =====
// ===== Fleet summary numbers (always unfiltered) =====
document.getElementById('total-units').textContent = data.summary?.total ?? 0;
document.getElementById('deployed-units').textContent = data.summary?.active ?? 0;
document.getElementById('benched-units').textContent = data.summary?.benched ?? 0;
document.getElementById('allocated-units').textContent = data.summary?.allocated ?? 0;
document.getElementById('status-ok').textContent = data.summary?.ok ?? 0;
document.getElementById('status-pending').textContent = data.summary?.pending ?? 0;
document.getElementById('status-missing').textContent = data.summary?.missing ?? 0;

// ===== Device type counts =====
// ===== Device type counts (always unfiltered) =====
let seismoCount = 0;
let slmCount = 0;
let modemCount = 0;
Object.values(data.units || {}).forEach(unit => {
if (unit.retired) return; // Don't count retired units
const deviceType = unit.device_type || 'seismograph';
@@ -374,46 +723,26 @@ function updateDashboard(event) {
seismoCount++;
} else if (deviceType === 'sound_level_meter') {
slmCount++;
} else if (deviceType === 'modem') {
modemCount++;
}
});
document.getElementById('seismo-count').textContent = seismoCount;
document.getElementById('slm-count').textContent = slmCount;

// ===== Alerts =====
const alertsList = document.getElementById('alerts-list');
// Only show alerts for deployed units that are MISSING (not pending)
const missingUnits = Object.entries(data.active).filter(([_, u]) => u.status === 'Missing');

if (!missingUnits.length) {
alertsList.innerHTML =
'<p class="text-sm text-green-600 dark:text-green-400">✓ All units reporting normally</p>';
} else {
let alertsHtml = '';

missingUnits.forEach(([id, unit]) => {
alertsHtml += `
<div class="flex items-start space-x-2 text-sm">
<span class="w-2 h-2 rounded-full bg-red-500 mt-1.5"></span>
<div>
<a href="/unit/${id}" class="font-medium text-red-600 dark:text-red-400 hover:underline">${id}</a>
<p class="text-gray-600 dark:text-gray-400">Missing for ${unit.age}</p>
</div>
</div>`;
});

alertsList.innerHTML = alertsHtml;
}

// ===== Update Fleet Map =====
updateFleetMap(data);
// ===== Apply filters and render map + alerts =====
renderFilteredDashboard(data);

} catch (err) {
console.error("Dashboard update error:", err);
}
}
// Handle tab switching
|
||||
// Handle tab switching and initialize components
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
// Load filter preferences
|
||||
loadFilterPreferences();
|
||||
|
||||
const tabButtons = document.querySelectorAll('.tab-button');
|
||||
|
||||
tabButtons.forEach(button => {
|
||||
@@ -453,64 +782,6 @@ function initFleetMap() {
|
||||
}, 100);
|
||||
}
|

function updateFleetMap(data) {
if (!fleetMap) return;

// Clear existing markers
fleetMarkers.forEach(marker => fleetMap.removeLayer(marker));
fleetMarkers = [];

// Get deployed units with coordinates data
const deployedUnits = Object.entries(data.units).filter(([_, u]) => u.deployed && u.coordinates);

if (deployedUnits.length === 0) {
return;
}

const bounds = [];

deployedUnits.forEach(([id, unit]) => {
const coords = parseLocation(unit.coordinates);
if (coords) {
const [lat, lon] = coords;

// Create marker with custom color based on status
const markerColor = unit.status === 'OK' ? 'green' : unit.status === 'Pending' ? 'orange' : 'red';

const marker = L.circleMarker([lat, lon], {
radius: 8,
fillColor: markerColor,
color: '#fff',
weight: 2,
opacity: 1,
fillOpacity: 0.8
}).addTo(fleetMap);

// Add popup with unit info
marker.bindPopup(`
<div class="p-2">
<h3 class="font-bold text-lg">${id}</h3>
<p class="text-sm">Status: <span style="color: ${markerColor}">${unit.status}</span></p>
<p class="text-sm">Type: ${unit.device_type}</p>
${unit.note ? `<p class="text-sm text-gray-600">${unit.note}</p>` : ''}
<a href="/unit/${id}" class="text-blue-600 hover:underline text-sm">View Details →</a>
</div>
`);

fleetMarkers.push(marker);
bounds.push([lat, lon]);
}
});

// Fit map to show all markers
if (bounds.length > 0) {
// Use different padding for mobile vs desktop
const padding = window.innerWidth < 768 ? [20, 20] : [50, 50];
fleetMap.fitBounds(bounds, { padding: padding });
fleetMapInitialized = true;
}
}

function parseLocation(location) {
if (!location) return null;
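The diff cuts off before the body of `parseLocation`. A minimal sketch of what such a parser might look like, assuming coordinates are stored as `"lat, lon"` strings; this is an assumption for illustration, since the actual storage format is not shown in this diff:

```javascript
// Hypothetical sketch of parseLocation (the real body is truncated above).
// Assumes "lat, lon" strings, e.g. "45.42, -75.69".
function parseLocation(location) {
  if (!location) return null;
  const parts = location.split(',').map(s => parseFloat(s.trim()));
  if (parts.length !== 2 || parts.some(Number.isNaN)) return null;
  const [lat, lon] = parts;
  // Reject out-of-range coordinates so bad data never reaches the map
  if (Math.abs(lat) > 90 || Math.abs(lon) > 180) return null;
  return [lat, lon];
}
```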
templates/fleet_calendar.html (new file, 2466 lines)

templates/modems.html (new file, 102 lines)
@@ -0,0 +1,102 @@
{% extends "base.html" %}

{% block title %}Field Modems - Terra-View{% endblock %}

{% block content %}
<div class="mb-8">
<h1 class="text-3xl font-bold text-gray-900 dark:text-white flex items-center">
<svg class="w-8 h-8 mr-3 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
</svg>
Field Modems
</h1>
<p class="text-gray-600 dark:text-gray-400 mt-1">Manage network connectivity devices for field equipment</p>
</div>

<!-- Summary Stats -->
<div class="grid grid-cols-1 md:grid-cols-4 gap-6 mb-8"
hx-get="/api/modem-dashboard/stats"
hx-trigger="load, every 30s"
hx-swap="innerHTML">
<!-- Stats will be loaded here -->
<div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
<div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
<div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
<div class="animate-pulse bg-gray-200 dark:bg-gray-700 h-24 rounded-xl"></div>
</div>

<!-- Modem List -->
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
<div class="flex items-center justify-between mb-6">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">All Modems</h2>
<div class="flex items-center gap-4">
<!-- Search -->
<div class="relative">
<input type="text"
id="modem-search"
placeholder="Search modems..."
class="pl-9 pr-4 py-2 text-sm border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-seismo-orange focus:border-transparent"
hx-get="/api/modem-dashboard/units"
hx-trigger="keyup changed delay:300ms"
hx-target="#modem-list"
hx-include="[name='search']"
name="search">
<svg class="w-4 h-4 absolute left-3 top-1/2 transform -translate-y-1/2 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"></path>
</svg>
</div>
<a href="/roster?device_type=modem" class="text-sm text-seismo-orange hover:underline">
Add modem in roster
</a>
</div>
</div>

<div id="modem-list"
hx-get="/api/modem-dashboard/units"
hx-trigger="load, every 30s"
hx-swap="innerHTML">
<p class="text-gray-500 dark:text-gray-400">Loading modems...</p>
</div>
</div>

<script>
// Ping a modem and show result
async function pingModem(modemId) {
const btn = document.getElementById(`ping-btn-${modemId}`);
const resultDiv = document.getElementById(`ping-result-${modemId}`);

// Show loading state
const originalText = btn.textContent;
btn.textContent = 'Pinging...';
btn.disabled = true;
resultDiv.classList.remove('hidden');
resultDiv.className = 'mt-2 text-xs text-gray-500';
resultDiv.textContent = 'Testing connection...';

try {
const response = await fetch(`/api/modem-dashboard/${modemId}/ping`);
const data = await response.json();

if (data.status === 'success') {
resultDiv.className = 'mt-2 text-xs text-green-600 dark:text-green-400';
resultDiv.innerHTML = `<span class="inline-block w-2 h-2 bg-green-500 rounded-full mr-1"></span>Online (${data.response_time_ms}ms)`;
} else {
resultDiv.className = 'mt-2 text-xs text-red-600 dark:text-red-400';
resultDiv.innerHTML = `<span class="inline-block w-2 h-2 bg-red-500 rounded-full mr-1"></span>${data.detail || 'Offline'}`;
}
} catch (error) {
resultDiv.className = 'mt-2 text-xs text-red-600 dark:text-red-400';
resultDiv.textContent = 'Error: ' + error.message;
}

// Restore button
btn.textContent = originalText;
btn.disabled = false;

// Hide result after 10 seconds
setTimeout(() => {
resultDiv.classList.add('hidden');
}, 10000);
}
</script>
{% endblock %}
@@ -70,7 +70,7 @@
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
Settings
</button>
{% if assigned_unit %}
{% if assigned_unit and connection_mode == 'connected' %}
<button onclick="switchTab('command')"
data-tab="command"
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
@@ -80,7 +80,7 @@
<button onclick="switchTab('sessions')"
data-tab="sessions"
class="tab-button px-4 py-3 border-b-2 border-transparent font-medium text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white hover:border-gray-300 dark:hover:border-gray-600 transition-colors">
Recording Sessions
Monitoring Sessions
</button>
<button onclick="switchTab('data')"
data-tab="data"
@@ -214,23 +214,54 @@
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
<div class="flex items-center justify-between">
<div>
{% if connection_mode == 'connected' %}
<p class="text-sm text-gray-600 dark:text-gray-400">Active Session</p>
<p class="text-lg font-semibold text-gray-900 dark:text-white mt-2">
{% if active_session %}
<span class="text-green-600 dark:text-green-400">Recording</span>
<span class="text-green-600 dark:text-green-400">Monitoring</span>
{% else %}
<span class="text-gray-500">Idle</span>
{% endif %}
</p>
{% else %}
<p class="text-sm text-gray-600 dark:text-gray-400">Mode</p>
<p class="text-lg font-semibold mt-2">
<span class="text-amber-600 dark:text-amber-400">Offline / Manual</span>
</p>
{% endif %}
</div>
<div class="w-12 h-12 bg-purple-100 dark:bg-purple-900/30 rounded-lg flex items-center justify-center">
{% if connection_mode == 'connected' %}
<svg class="w-6 h-6 text-purple-600 dark:text-purple-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z"></path>
</svg>
{% else %}
<svg class="w-6 h-6 text-amber-600 dark:text-amber-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-8l-4-4m0 0L8 8m4-4v12"></path>
</svg>
{% endif %}
</div>
</div>
</div>
</div>

{% if connection_mode == 'connected' and assigned_unit %}
<!-- Live Status Row (connected NRLs only) -->
<div class="mt-6">
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
<div class="flex items-center justify-between mb-4">
<h3 class="text-lg font-semibold text-gray-900 dark:text-white">Live Status</h3>
<span class="text-xs text-gray-500 dark:text-gray-400">{{ assigned_unit.id }}</span>
</div>
<div id="nrl-live-status"
hx-get="/api/projects/{{ project_id }}/nrl/{{ location_id }}/live-status"
hx-trigger="load, every 30s"
hx-swap="innerHTML">
<div class="text-center py-4 text-gray-500 text-sm">Loading status…</div>
</div>
</div>
</div>
{% endif %}
</div>

<!-- Settings Tab -->
@@ -281,8 +312,8 @@
</div>
</div>

<!-- Command Center Tab -->
{% if assigned_unit %}
<!-- Command Center Tab (connected NRLs only) -->
{% if assigned_unit and connection_mode == 'connected' %}
<div id="command-tab" class="tab-panel hidden">
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white mb-6">
@@ -302,11 +333,11 @@
</div>
{% endif %}

<!-- Recording Sessions Tab -->
<!-- Monitoring Sessions Tab -->
<div id="sessions-tab" class="tab-panel hidden">
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
<div class="flex items-center justify-between mb-6">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Recording Sessions</h2>
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Monitoring Sessions</h2>
{% if assigned_unit %}
<button onclick="openScheduleModal()"
class="px-4 py-2 bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors">
@@ -329,8 +360,51 @@
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
<div class="flex items-center justify-between mb-6">
<h2 class="text-xl font-semibold text-gray-900 dark:text-white">Data Files</h2>
<div class="text-sm text-gray-500">
<span class="font-medium">{{ file_count }}</span> files
<div class="flex items-center gap-3">
<span class="text-sm text-gray-500"><span class="font-medium">{{ file_count }}</span> files</span>
<button onclick="toggleUploadPanel()"
class="px-3 py-1.5 text-sm bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors flex items-center gap-1.5">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-8l-4-4m0 0L8 8m4-4v12"></path>
</svg>
Upload Data
</button>
</div>
</div>

<!-- Upload Panel -->
<div id="upload-panel" class="hidden mb-6 p-4 border-2 border-dashed border-gray-300 dark:border-gray-600 rounded-xl bg-gray-50 dark:bg-gray-800/50">
<p class="text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Upload SD Card Data</p>
<p class="text-xs text-gray-500 dark:text-gray-400 mb-3">
Select a ZIP file, or select all files from inside an <code class="bg-gray-200 dark:bg-gray-700 px-1 rounded">Auto_####</code> folder. File types (.rnd, .rnh) are auto-detected.
</p>
<input type="file" id="upload-input" multiple
accept=".zip,.rnd,.rnh,.RND,.RNH"
class="block w-full text-sm text-gray-500 dark:text-gray-400
file:mr-3 file:py-1.5 file:px-3 file:rounded-lg file:border-0
file:text-sm file:font-medium file:bg-seismo-orange file:text-white
hover:file:bg-seismo-navy file:cursor-pointer" />
<div class="flex items-center gap-3 mt-3">
<button id="upload-btn" onclick="submitUpload()"
class="px-4 py-1.5 text-sm bg-green-600 text-white rounded-lg hover:bg-green-700 transition-colors">
Import Files
</button>
<button id="upload-cancel-btn" onclick="toggleUploadPanel()"
class="px-4 py-1.5 text-sm text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-white transition-colors">
Cancel
</button>
<span id="upload-status" class="text-sm hidden"></span>
</div>
<!-- Progress bar (hidden until upload starts) -->
<div id="upload-progress-wrap" class="hidden mt-3">
<div class="flex justify-between text-xs text-gray-500 dark:text-gray-400 mb-1">
<span id="upload-progress-label">Uploading…</span>
</div>
<div class="w-full bg-gray-200 dark:bg-gray-700 rounded-full h-2">
<div id="upload-progress-bar"
class="bg-green-500 h-2 rounded-full transition-all duration-300"
style="width: 0%"></div>
</div>
</div>
</div>
@@ -559,5 +633,112 @@ document.getElementById('assign-modal')?.addEventListener('click', function(e) {
closeAssignModal();
}
});

// ── Upload Data ─────────────────────────────────────────────────────────────

function toggleUploadPanel() {
const panel = document.getElementById('upload-panel');
const status = document.getElementById('upload-status');
panel.classList.toggle('hidden');
// Reset state when reopening
if (!panel.classList.contains('hidden')) {
status.textContent = '';
status.className = 'text-sm hidden';
document.getElementById('upload-input').value = '';
document.getElementById('upload-progress-wrap').classList.add('hidden');
document.getElementById('upload-progress-bar').style.width = '0%';
}
}

function submitUpload() {
const input = document.getElementById('upload-input');
const status = document.getElementById('upload-status');
const btn = document.getElementById('upload-btn');
const cancelBtn = document.getElementById('upload-cancel-btn');
const progressWrap = document.getElementById('upload-progress-wrap');
const progressBar = document.getElementById('upload-progress-bar');
const progressLabel = document.getElementById('upload-progress-label');

if (!input.files.length) {
alert('Please select files to upload.');
return;
}

const fileCount = input.files.length;
const formData = new FormData();
for (const file of input.files) {
formData.append('files', file);
}

// Disable controls and show progress bar
btn.disabled = true;
btn.textContent = 'Uploading\u2026';
btn.classList.add('opacity-60', 'cursor-not-allowed');
cancelBtn.disabled = true;
cancelBtn.classList.add('opacity-40', 'cursor-not-allowed');
status.className = 'text-sm hidden';
progressWrap.classList.remove('hidden');
progressBar.style.width = '0%';
progressLabel.textContent = `Uploading ${fileCount} file${fileCount !== 1 ? 's' : ''}\u2026`;

const xhr = new XMLHttpRequest();

xhr.upload.addEventListener('progress', (e) => {
if (e.lengthComputable) {
const pct = Math.round((e.loaded / e.total) * 100);
progressBar.style.width = pct + '%';
progressLabel.textContent = `Uploading ${fileCount} file${fileCount !== 1 ? 's' : ''}\u2026 ${pct}%`;
}
});

xhr.upload.addEventListener('load', () => {
progressBar.style.width = '100%';
progressLabel.textContent = 'Processing files on server\u2026';
});

xhr.addEventListener('load', () => {
progressWrap.classList.add('hidden');
btn.disabled = false;
btn.textContent = 'Import Files';
btn.classList.remove('opacity-60', 'cursor-not-allowed');
cancelBtn.disabled = false;
cancelBtn.classList.remove('opacity-40', 'cursor-not-allowed');

try {
const data = JSON.parse(xhr.responseText);
if (xhr.status >= 200 && xhr.status < 300) {
const parts = [`Imported ${data.files_imported} file${data.files_imported !== 1 ? 's' : ''}`];
if (data.leq_files || data.lp_files) {
parts.push(`(${data.leq_files} Leq, ${data.lp_files} Lp)`);
}
if (data.store_name) parts.push(`\u2014 ${data.store_name}`);
status.textContent = parts.join(' ');
status.className = 'text-sm text-green-600 dark:text-green-400';
input.value = '';
htmx.trigger(document.getElementById('data-files-list'), 'load');
} else {
status.textContent = `Error: ${data.detail || 'Upload failed'}`;
status.className = 'text-sm text-red-600 dark:text-red-400';
}
} catch {
status.textContent = 'Error: Unexpected server response';
status.className = 'text-sm text-red-600 dark:text-red-400';
}
});

xhr.addEventListener('error', () => {
progressWrap.classList.add('hidden');
btn.disabled = false;
btn.textContent = 'Import Files';
btn.classList.remove('opacity-60', 'cursor-not-allowed');
cancelBtn.disabled = false;
cancelBtn.classList.remove('opacity-40', 'cursor-not-allowed');
status.textContent = 'Error: Network error during upload';
status.className = 'text-sm text-red-600 dark:text-red-400';
});

xhr.open('POST', `/api/projects/${projectId}/nrl/${locationId}/upload-data`);
xhr.send(formData);
}
</script>
{% endblock %}
templates/pair_devices.html (new file, 566 lines)
@@ -0,0 +1,566 @@
{% extends "base.html" %}

{% block title %}Pair Devices - Terra-View{% endblock %}

{% block content %}
<div class="max-w-7xl mx-auto">
<!-- Header -->
<div class="mb-6">
<h1 class="text-2xl font-bold text-gray-900 dark:text-white">Pair Devices</h1>
<p class="mt-1 text-sm text-gray-600 dark:text-gray-400">
Select a recorder (seismograph or SLM) and a modem to create a bidirectional pairing.
</p>
</div>

<!-- Selection Summary Bar -->
<div id="selection-bar" class="mb-6 p-4 bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
<div class="flex items-center justify-between flex-wrap gap-4">
<div class="flex items-center gap-6">
<div class="flex items-center gap-2">
<span class="text-sm text-gray-600 dark:text-gray-400">Recorder:</span>
<span id="selected-recorder" class="font-mono font-medium text-gray-900 dark:text-white">None selected</span>
</div>
<svg class="w-5 h-5 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M14 5l7 7m0 0l-7 7m7-7H3"></path>
</svg>
<div class="flex items-center gap-2">
<span class="text-sm text-gray-600 dark:text-gray-400">Modem:</span>
<span id="selected-modem" class="font-mono font-medium text-gray-900 dark:text-white">None selected</span>
</div>
</div>
<div class="flex items-center gap-3">
<button id="clear-selection-btn"
onclick="clearSelection()"
class="px-4 py-2 text-sm font-medium text-gray-700 dark:text-gray-300 bg-gray-100 dark:bg-gray-700 rounded-lg hover:bg-gray-200 dark:hover:bg-gray-600 disabled:opacity-50 disabled:cursor-not-allowed"
disabled>
Clear
</button>
<button id="pair-btn"
onclick="pairDevices()"
class="px-4 py-2 text-sm font-medium text-white bg-seismo-orange rounded-lg hover:bg-orange-600 disabled:opacity-50 disabled:cursor-not-allowed"
disabled>
Pair Devices
</button>
</div>
</div>
</div>

<!-- Two Column Layout -->
<div class="grid grid-cols-1 lg:grid-cols-2 gap-6">
<!-- Left Column: Recorders (Seismographs + SLMs) -->
<div class="bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
<div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
<div class="flex items-center justify-between mb-3">
<h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
<svg class="w-5 h-5 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
</svg>
Recorders
<span id="recorder-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ recorders|length }})</span>
</h2>
</div>
<!-- Recorder Search & Filters -->
<div class="space-y-2">
<input type="text" id="recorder-search" placeholder="Search by ID..."
class="w-full px-3 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white text-sm focus:ring-2 focus:ring-seismo-orange focus:border-seismo-orange"
oninput="filterRecorders()">
<div class="flex items-center gap-4">
<label class="flex items-center gap-2 cursor-pointer">
<input type="checkbox" id="recorder-hide-paired" onchange="filterRecorders()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
<span class="text-xs text-gray-600 dark:text-gray-400">Hide paired</span>
</label>
<label class="flex items-center gap-2 cursor-pointer">
<input type="checkbox" id="recorder-deployed-only" onchange="filterRecorders()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
<span class="text-xs text-gray-600 dark:text-gray-400">Deployed only</span>
</label>
</div>
</div>
</div>
<div class="max-h-[600px] overflow-y-auto">
<div id="recorders-list" class="divide-y divide-gray-200 dark:divide-gray-700">
{% for unit in recorders %}
<div class="device-row recorder-row p-3 hover:bg-gray-50 dark:hover:bg-gray-700/50 cursor-pointer transition-colors"
data-id="{{ unit.id }}"
data-deployed="{{ unit.deployed|lower }}"
data-paired-with="{{ unit.deployed_with_modem_id or '' }}"
data-device-type="{{ unit.device_type }}"
onclick="selectRecorder('{{ unit.id }}')">
<div class="flex items-center justify-between">
<div class="flex items-center gap-3">
<div class="w-8 h-8 rounded-full flex items-center justify-center
{% if unit.device_type == 'slm' %}bg-purple-100 dark:bg-purple-900/30 text-purple-600 dark:text-purple-400
{% else %}bg-blue-100 dark:bg-blue-900/30 text-blue-600 dark:text-blue-400{% endif %}">
{% if unit.device_type == 'slm' %}
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15.536 8.464a5 5 0 010 7.072m2.828-9.9a9 9 0 010 12.728M5.586 15H4a1 1 0 01-1-1v-4a1 1 0 011-1h1.586l4.707-4.707C10.923 3.663 12 4.109 12 5v14c0 .891-1.077 1.337-1.707.707L5.586 15z"></path>
</svg>
{% else %}
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
</svg>
{% endif %}
</div>
<div>
<div class="font-mono font-medium text-gray-900 dark:text-white">{{ unit.id }}</div>
<div class="text-xs text-gray-500 dark:text-gray-400">
{{ unit.device_type|capitalize }}
{% if not unit.deployed %}<span class="text-yellow-600 dark:text-yellow-400">(Benched)</span>{% endif %}
</div>
</div>
</div>
<div class="flex items-center gap-2">
{% if unit.deployed_with_modem_id %}
<span class="px-2 py-1 text-xs rounded-full bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400">
→ {{ unit.deployed_with_modem_id }}
</span>
{% endif %}
<div class="w-5 h-5 rounded-full border-2 border-gray-300 dark:border-gray-600 flex items-center justify-center selection-indicator">
<svg class="w-3 h-3 text-seismo-orange hidden check-icon" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M16.707 5.293a1 1 0 010 1.414l-8 8a1 1 0 01-1.414 0l-4-4a1 1 0 011.414-1.414L8 12.586l7.293-7.293a1 1 0 011.414 0z" clip-rule="evenodd"></path>
</svg>
</div>
</div>
</div>
</div>
{% else %}
<div class="p-8 text-center text-gray-500 dark:text-gray-400">
No recorders found in roster
</div>
{% endfor %}
</div>
</div>
</div>
||||
<!-- Right Column: Modems -->
|
||||
<div class="bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
|
||||
<div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
|
||||
<div class="flex items-center justify-between mb-3">
|
||||
<h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
|
||||
<svg class="w-5 h-5 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
|
||||
</svg>
|
||||
Modems
|
||||
<span id="modem-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ modems|length }})</span>
|
||||
</h2>
|
||||
</div>
|
||||
<!-- Modem Search & Filters -->
|
||||
<div class="space-y-2">
|
||||
<input type="text" id="modem-search" placeholder="Search by ID, IP, or phone..."
|
||||
class="w-full px-3 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white text-sm focus:ring-2 focus:ring-seismo-orange focus:border-seismo-orange"
|
||||
oninput="filterModems()">
|
||||
<div class="flex items-center gap-4">
|
||||
<label class="flex items-center gap-2 cursor-pointer">
|
||||
<input type="checkbox" id="modem-hide-paired" onchange="filterModems()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
|
||||
<span class="text-xs text-gray-600 dark:text-gray-400">Hide paired</span>
|
||||
</label>
|
||||
<label class="flex items-center gap-2 cursor-pointer">
|
||||
<input type="checkbox" id="modem-deployed-only" onchange="filterModems()" class="rounded border-gray-300 dark:border-gray-600 text-seismo-orange focus:ring-seismo-orange">
|
||||
<span class="text-xs text-gray-600 dark:text-gray-400">Deployed only</span>
|
||||
</label>
|
||||
</div>
</div>
</div>
<div class="max-h-[600px] overflow-y-auto">
<div id="modems-list" class="divide-y divide-gray-200 dark:divide-gray-700">
{% for unit in modems %}
<div class="device-row modem-row p-3 hover:bg-gray-50 dark:hover:bg-gray-700/50 cursor-pointer transition-colors"
data-id="{{ unit.id }}"
data-deployed="{{ unit.deployed|lower }}"
data-paired-with="{{ unit.deployed_with_unit_id or '' }}"
data-ip="{{ unit.ip_address or '' }}"
data-phone="{{ unit.phone_number or '' }}"
onclick="selectModem('{{ unit.id }}')">
<div class="flex items-center justify-between">
<div class="flex items-center gap-3">
<div class="w-8 h-8 rounded-full bg-amber-100 dark:bg-amber-900/30 flex items-center justify-center text-amber-600 dark:text-amber-400">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
</svg>
</div>
<div>
<div class="font-mono font-medium text-gray-900 dark:text-white">{{ unit.id }}</div>
<div class="text-xs text-gray-500 dark:text-gray-400">
{% if unit.ip_address %}<span class="font-mono">{{ unit.ip_address }}</span>{% endif %}
{% if unit.phone_number %}{% if unit.ip_address %} · {% endif %}{{ unit.phone_number }}{% endif %}
{% if not unit.ip_address and not unit.phone_number %}Modem{% endif %}
{% if not unit.deployed %}<span class="text-yellow-600 dark:text-yellow-400">(Benched)</span>{% endif %}
</div>
</div>
</div>
<div class="flex items-center gap-2">
{% if unit.deployed_with_unit_id %}
<span class="px-2 py-1 text-xs rounded-full bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400">
← {{ unit.deployed_with_unit_id }}
</span>
{% endif %}
<div class="w-5 h-5 rounded-full border-2 border-gray-300 dark:border-gray-600 flex items-center justify-center selection-indicator">
<svg class="w-3 h-3 text-seismo-orange hidden check-icon" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M16.707 5.293a1 1 0 010 1.414l-8 8a1 1 0 01-1.414 0l-4-4a1 1 0 011.414-1.414L8 12.586l7.293-7.293a1 1 0 011.414 0z" clip-rule="evenodd"></path>
</svg>
</div>
</div>
</div>
</div>
{% else %}
<div class="p-8 text-center text-gray-500 dark:text-gray-400">
No modems found in roster
</div>
{% endfor %}
</div>
</div>
</div>
</div>

<!-- Existing Pairings Section -->
<div class="mt-8 bg-white dark:bg-slate-800 rounded-lg shadow border border-gray-200 dark:border-gray-700">
<div class="px-4 py-3 border-b border-gray-200 dark:border-gray-700">
<h2 class="text-lg font-semibold text-gray-900 dark:text-white flex items-center gap-2">
<svg class="w-5 h-5 text-green-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1"></path>
</svg>
Existing Pairings
<span id="pairing-count" class="text-sm font-normal text-gray-500 dark:text-gray-400">({{ pairings|length }})</span>
</h2>
</div>
<div class="max-h-[400px] overflow-y-auto">
<div id="pairings-list" class="divide-y divide-gray-200 dark:divide-gray-700">
{% for pairing in pairings %}
<div class="pairing-row p-3 flex items-center justify-between hover:bg-gray-50 dark:hover:bg-gray-700/50">
<div class="flex items-center gap-4">
<div class="flex items-center gap-2">
<span class="px-2 py-1 text-sm font-mono rounded bg-blue-100 dark:bg-blue-900/30 text-blue-700 dark:text-blue-400">
{{ pairing.recorder_id }}
</span>
<span class="text-xs text-gray-500 dark:text-gray-400">{{ pairing.recorder_type }}</span>
</div>
<svg class="w-5 h-5 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7h12m0 0l-4-4m4 4l-4 4m0 6H4m0 0l4 4m-4-4l4-4"></path>
</svg>
<div class="flex items-center gap-2">
<span class="px-2 py-1 text-sm font-mono rounded bg-amber-100 dark:bg-amber-900/30 text-amber-700 dark:text-amber-400">
{{ pairing.modem_id }}
</span>
{% if pairing.modem_ip %}
<span class="text-xs font-mono text-gray-500 dark:text-gray-400">{{ pairing.modem_ip }}</span>
{% endif %}
</div>
</div>
<button onclick="unpairDevices('{{ pairing.recorder_id }}', '{{ pairing.modem_id }}')"
class="p-2 text-red-600 dark:text-red-400 hover:bg-red-100 dark:hover:bg-red-900/30 rounded-lg transition-colors"
title="Unpair devices">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path>
</svg>
</button>
</div>
{% else %}
<div class="p-8 text-center text-gray-500 dark:text-gray-400">
No pairings found. Select a recorder and modem above to create one.
</div>
{% endfor %}
</div>
</div>
</div>
</div>

<!-- Toast notification -->
<div id="toast" class="fixed bottom-4 right-4 px-4 py-3 rounded-lg shadow-lg transform translate-y-full opacity-0 transition-all duration-300 z-50"></div>

<script>
let selectedRecorder = null;
let selectedModem = null;

function selectRecorder(id) {
// Deselect previous
document.querySelectorAll('.recorder-row').forEach(row => {
row.classList.remove('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
row.querySelector('.selection-indicator').classList.remove('border-seismo-orange', 'bg-seismo-orange');
row.querySelector('.selection-indicator').classList.add('border-gray-300', 'dark:border-gray-600');
row.querySelector('.check-icon').classList.add('hidden');
});

// Toggle selection
if (selectedRecorder === id) {
selectedRecorder = null;
document.getElementById('selected-recorder').textContent = 'None selected';
} else {
selectedRecorder = id;
document.getElementById('selected-recorder').textContent = id;

// Highlight selected
const row = document.querySelector(`.recorder-row[data-id="${id}"]`);
if (row) {
row.classList.add('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
row.querySelector('.selection-indicator').classList.remove('border-gray-300', 'dark:border-gray-600');
row.querySelector('.selection-indicator').classList.add('border-seismo-orange', 'bg-seismo-orange');
row.querySelector('.check-icon').classList.remove('hidden');
}
}

updateButtons();
}

function selectModem(id) {
// Deselect previous
document.querySelectorAll('.modem-row').forEach(row => {
row.classList.remove('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
row.querySelector('.selection-indicator').classList.remove('border-seismo-orange', 'bg-seismo-orange');
row.querySelector('.selection-indicator').classList.add('border-gray-300', 'dark:border-gray-600');
row.querySelector('.check-icon').classList.add('hidden');
});

// Toggle selection
if (selectedModem === id) {
selectedModem = null;
document.getElementById('selected-modem').textContent = 'None selected';
} else {
selectedModem = id;
document.getElementById('selected-modem').textContent = id;

// Highlight selected
const row = document.querySelector(`.modem-row[data-id="${id}"]`);
if (row) {
row.classList.add('bg-seismo-orange/10', 'dark:bg-seismo-orange/20');
row.querySelector('.selection-indicator').classList.remove('border-gray-300', 'dark:border-gray-600');
row.querySelector('.selection-indicator').classList.add('border-seismo-orange', 'bg-seismo-orange');
row.querySelector('.check-icon').classList.remove('hidden');
}
}

updateButtons();
}

function updateButtons() {
const pairBtn = document.getElementById('pair-btn');
const clearBtn = document.getElementById('clear-selection-btn');

pairBtn.disabled = !(selectedRecorder && selectedModem);
clearBtn.disabled = !(selectedRecorder || selectedModem);
}

function clearSelection() {
if (selectedRecorder) selectRecorder(selectedRecorder);
if (selectedModem) selectModem(selectedModem);
}

function filterRecorders() {
const searchTerm = document.getElementById('recorder-search').value.toLowerCase().trim();
const hidePaired = document.getElementById('recorder-hide-paired').checked;
const deployedOnly = document.getElementById('recorder-deployed-only').checked;

let visibleRecorders = 0;

document.querySelectorAll('.recorder-row').forEach(row => {
const id = row.dataset.id.toLowerCase();
const pairedWith = row.dataset.pairedWith;
const deployed = row.dataset.deployed === 'true';

let show = true;
if (searchTerm && !id.includes(searchTerm)) show = false;
if (hidePaired && pairedWith) show = false;
if (deployedOnly && !deployed) show = false;

row.style.display = show ? '' : 'none';
if (show) visibleRecorders++;
});

document.getElementById('recorder-count').textContent = `(${visibleRecorders})`;
}

function filterModems() {
const searchTerm = document.getElementById('modem-search').value.toLowerCase().trim();
const hidePaired = document.getElementById('modem-hide-paired').checked;
const deployedOnly = document.getElementById('modem-deployed-only').checked;

let visibleModems = 0;

document.querySelectorAll('.modem-row').forEach(row => {
const id = row.dataset.id.toLowerCase();
const ip = (row.dataset.ip || '').toLowerCase();
const phone = (row.dataset.phone || '').toLowerCase();
const pairedWith = row.dataset.pairedWith;
const deployed = row.dataset.deployed === 'true';

let show = true;
if (searchTerm && !id.includes(searchTerm) && !ip.includes(searchTerm) && !phone.includes(searchTerm)) show = false;
if (hidePaired && pairedWith) show = false;
if (deployedOnly && !deployed) show = false;

row.style.display = show ? '' : 'none';
if (show) visibleModems++;
});

document.getElementById('modem-count').textContent = `(${visibleModems})`;
}

function saveScrollPositions() {
const recordersList = document.getElementById('recorders-list').parentElement;
const modemsList = document.getElementById('modems-list').parentElement;
const pairingsList = document.getElementById('pairings-list').parentElement;

sessionStorage.setItem('pairDevices_recorderScroll', recordersList.scrollTop);
sessionStorage.setItem('pairDevices_modemScroll', modemsList.scrollTop);
sessionStorage.setItem('pairDevices_pairingScroll', pairingsList.scrollTop);

// Save recorder filter state
sessionStorage.setItem('pairDevices_recorderSearch', document.getElementById('recorder-search').value);
sessionStorage.setItem('pairDevices_recorderHidePaired', document.getElementById('recorder-hide-paired').checked);
sessionStorage.setItem('pairDevices_recorderDeployedOnly', document.getElementById('recorder-deployed-only').checked);

// Save modem filter state
sessionStorage.setItem('pairDevices_modemSearch', document.getElementById('modem-search').value);
sessionStorage.setItem('pairDevices_modemHidePaired', document.getElementById('modem-hide-paired').checked);
sessionStorage.setItem('pairDevices_modemDeployedOnly', document.getElementById('modem-deployed-only').checked);
}

function restoreScrollPositions() {
// Restore recorder filter state
const recorderSearch = sessionStorage.getItem('pairDevices_recorderSearch');
const recorderHidePaired = sessionStorage.getItem('pairDevices_recorderHidePaired');
const recorderDeployedOnly = sessionStorage.getItem('pairDevices_recorderDeployedOnly');

if (recorderSearch) document.getElementById('recorder-search').value = recorderSearch;
if (recorderHidePaired === 'true') document.getElementById('recorder-hide-paired').checked = true;
if (recorderDeployedOnly === 'true') document.getElementById('recorder-deployed-only').checked = true;

// Restore modem filter state
const modemSearch = sessionStorage.getItem('pairDevices_modemSearch');
const modemHidePaired = sessionStorage.getItem('pairDevices_modemHidePaired');
const modemDeployedOnly = sessionStorage.getItem('pairDevices_modemDeployedOnly');

if (modemSearch) document.getElementById('modem-search').value = modemSearch;
if (modemHidePaired === 'true') document.getElementById('modem-hide-paired').checked = true;
if (modemDeployedOnly === 'true') document.getElementById('modem-deployed-only').checked = true;

// Apply filters if any were set
if (recorderSearch || recorderHidePaired === 'true' || recorderDeployedOnly === 'true') {
filterRecorders();
}
if (modemSearch || modemHidePaired === 'true' || modemDeployedOnly === 'true') {
filterModems();
}

// Restore scroll positions after filtering, so the list heights match the saved offsets
const recorderScroll = sessionStorage.getItem('pairDevices_recorderScroll');
const modemScroll = sessionStorage.getItem('pairDevices_modemScroll');
const pairingScroll = sessionStorage.getItem('pairDevices_pairingScroll');

if (recorderScroll) {
document.getElementById('recorders-list').parentElement.scrollTop = parseInt(recorderScroll, 10);
}
if (modemScroll) {
document.getElementById('modems-list').parentElement.scrollTop = parseInt(modemScroll, 10);
}
if (pairingScroll) {
document.getElementById('pairings-list').parentElement.scrollTop = parseInt(pairingScroll, 10);
}

// Clear stored values
sessionStorage.removeItem('pairDevices_recorderScroll');
sessionStorage.removeItem('pairDevices_modemScroll');
sessionStorage.removeItem('pairDevices_pairingScroll');
sessionStorage.removeItem('pairDevices_recorderSearch');
sessionStorage.removeItem('pairDevices_recorderHidePaired');
sessionStorage.removeItem('pairDevices_recorderDeployedOnly');
sessionStorage.removeItem('pairDevices_modemSearch');
sessionStorage.removeItem('pairDevices_modemHidePaired');
sessionStorage.removeItem('pairDevices_modemDeployedOnly');
}

// Restore scroll positions on page load
document.addEventListener('DOMContentLoaded', restoreScrollPositions);

async function pairDevices() {
if (!selectedRecorder || !selectedModem) return;

const pairBtn = document.getElementById('pair-btn');
pairBtn.disabled = true;
pairBtn.textContent = 'Pairing...';

try {
const response = await fetch('/api/roster/pair-devices', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
recorder_id: selectedRecorder,
modem_id: selectedModem
})
});

const result = await response.json();

if (response.ok) {
showToast(`Paired ${selectedRecorder} with ${selectedModem}`, 'success');
// Save scroll positions before reload
saveScrollPositions();
setTimeout(() => window.location.reload(), 500);
} else {
showToast(result.detail || 'Failed to pair devices', 'error');
}
} catch (error) {
showToast('Error pairing devices: ' + error.message, 'error');
} finally {
pairBtn.disabled = false;
pairBtn.textContent = 'Pair Devices';
}
}

async function unpairDevices(recorderId, modemId) {
if (!confirm(`Unpair ${recorderId} from ${modemId}?`)) return;

try {
const response = await fetch('/api/roster/unpair-devices', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
recorder_id: recorderId,
modem_id: modemId
})
});

const result = await response.json();

if (response.ok) {
showToast(`Unpaired ${recorderId} from ${modemId}`, 'success');
// Save scroll positions before reload
saveScrollPositions();
setTimeout(() => window.location.reload(), 500);
} else {
showToast(result.detail || 'Failed to unpair devices', 'error');
}
} catch (error) {
showToast('Error unpairing devices: ' + error.message, 'error');
}
}

function showToast(message, type = 'info') {
const toast = document.getElementById('toast');
toast.textContent = message;
toast.className = 'fixed bottom-4 right-4 px-4 py-3 rounded-lg shadow-lg transform transition-all duration-300 z-50';

if (type === 'success') {
toast.classList.add('bg-green-500', 'text-white');
} else if (type === 'error') {
toast.classList.add('bg-red-500', 'text-white');
} else {
toast.classList.add('bg-gray-800', 'text-white');
}

// Show
toast.classList.remove('translate-y-full', 'opacity-0');

// Hide after 3 seconds
setTimeout(() => {
toast.classList.add('translate-y-full', 'opacity-0');
}, 3000);
}
</script>

<style>
.bg-seismo-orange\/10 {
background-color: rgb(249 115 22 / 0.1);
}
.dark\:bg-seismo-orange\/20:is(.dark *) {
background-color: rgb(249 115 22 / 0.2);
}
</style>
{% endblock %}
templates/partials/dashboard/todays_actions.html (new file, 131 lines)
@@ -0,0 +1,131 @@
<!-- Today's Scheduled Actions - Dashboard Card Content -->

<!-- Summary stats -->
<div class="flex items-center gap-4 mb-4 text-sm">
{% if pending_count > 0 %}
<div class="flex items-center gap-1.5">
<span class="w-2 h-2 bg-yellow-400 rounded-full"></span>
<span class="text-gray-600 dark:text-gray-400">{{ pending_count }} pending</span>
</div>
{% endif %}
{% if completed_count > 0 %}
<div class="flex items-center gap-1.5">
<span class="w-2 h-2 bg-green-400 rounded-full"></span>
<span class="text-gray-600 dark:text-gray-400">{{ completed_count }} completed</span>
</div>
{% endif %}
{% if failed_count > 0 %}
<div class="flex items-center gap-1.5">
<span class="w-2 h-2 bg-red-400 rounded-full"></span>
<span class="text-gray-600 dark:text-gray-400">{{ failed_count }} failed</span>
</div>
{% endif %}
{% if total_count == 0 %}
<span class="text-gray-500 dark:text-gray-400">No actions scheduled for today</span>
{% endif %}
</div>

<!-- Actions list -->
{% if actions %}
<div class="space-y-2 max-h-64 overflow-y-auto">
{% for item in actions %}
<div class="flex items-center gap-3 p-2 rounded-lg
{% if item.action.execution_status == 'pending' %}bg-yellow-50 dark:bg-yellow-900/20
{% elif item.action.execution_status == 'completed' %}bg-green-50 dark:bg-green-900/20
{% elif item.action.execution_status == 'failed' %}bg-red-50 dark:bg-red-900/20
{% else %}bg-gray-50 dark:bg-gray-700/50{% endif %}">

<!-- Action type icon -->
<div class="flex-shrink-0">
{% if item.action.action_type == 'start' %}
<div class="w-8 h-8 rounded-full bg-green-100 dark:bg-green-900/30 flex items-center justify-center">
<svg class="w-4 h-4 text-green-600 dark:text-green-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM9.555 7.168A1 1 0 008 8v4a1 1 0 001.555.832l3-2a1 1 0 000-1.664l-3-2z" clip-rule="evenodd"/>
</svg>
</div>
{% elif item.action.action_type == 'stop' %}
<div class="w-8 h-8 rounded-full bg-red-100 dark:bg-red-900/30 flex items-center justify-center">
<svg class="w-4 h-4 text-red-600 dark:text-red-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM8 7a1 1 0 00-1 1v4a1 1 0 001 1h4a1 1 0 001-1V8a1 1 0 00-1-1H8z" clip-rule="evenodd"/>
</svg>
</div>
{% elif item.action.action_type == 'download' %}
<div class="w-8 h-8 rounded-full bg-blue-100 dark:bg-blue-900/30 flex items-center justify-center">
<svg class="w-4 h-4 text-blue-600 dark:text-blue-400" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M3 17a1 1 0 011-1h12a1 1 0 110 2H4a1 1 0 01-1-1zm3.293-7.707a1 1 0 011.414 0L9 10.586V3a1 1 0 112 0v7.586l1.293-1.293a1 1 0 111.414 1.414l-3 3a1 1 0 01-1.414 0l-3-3a1 1 0 010-1.414z" clip-rule="evenodd"/>
</svg>
</div>
{% endif %}
</div>

<!-- Action details -->
<div class="flex-1 min-w-0">
<div class="flex items-center gap-2">
<span class="font-medium text-sm text-gray-900 dark:text-white capitalize">{{ item.action.action_type }}</span>

<!-- Status indicator -->
{% if item.action.execution_status == 'pending' %}
<span class="text-xs text-yellow-600 dark:text-yellow-400">
{{ item.action.scheduled_time|local_datetime('%H:%M') }}
</span>
{% elif item.action.execution_status == 'completed' %}
<svg class="w-4 h-4 text-green-500" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zm3.707-9.293a1 1 0 00-1.414-1.414L9 10.586 7.707 9.293a1 1 0 00-1.414 1.414l2 2a1 1 0 001.414 0l4-4z" clip-rule="evenodd"/>
</svg>
{% elif item.action.execution_status == 'failed' %}
<svg class="w-4 h-4 text-red-500" fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM8.707 7.293a1 1 0 00-1.414 1.414L8.586 10l-1.293 1.293a1 1 0 101.414 1.414L10 11.414l1.293 1.293a1 1 0 001.414-1.414L11.414 10l1.293-1.293a1 1 0 00-1.414-1.414L10 8.586 8.707 7.293z" clip-rule="evenodd"/>
</svg>
{% endif %}
</div>

<!-- Location/Project info -->
<div class="text-xs text-gray-500 dark:text-gray-400 truncate">
{% if item.location %}
<a href="/projects/{{ item.action.project_id }}/nrl/{{ item.location.id }}"
class="hover:text-seismo-orange">
{{ item.location.name }}
</a>
{% elif item.project %}
<a href="/projects/{{ item.project.id }}" class="hover:text-seismo-orange">
{{ item.project.name }}
</a>
{% endif %}
</div>

<!-- Result details for completed/failed -->
{% if item.action.execution_status == 'completed' and item.result %}
{% if item.result.cycle_response and item.result.cycle_response.downloaded_folder %}
<div class="text-xs text-green-600 dark:text-green-400">
{{ item.result.cycle_response.downloaded_folder }}
{% if item.result.cycle_response.download_success %}downloaded{% endif %}
</div>
{% endif %}
{% elif item.action.execution_status == 'failed' and item.action.error_message %}
<div class="text-xs text-red-600 dark:text-red-400 truncate" title="{{ item.action.error_message }}">
{{ item.action.error_message[:50] }}{% if item.action.error_message|length > 50 %}...{% endif %}
</div>
{% endif %}
</div>

<!-- Time -->
<div class="flex-shrink-0 text-right">
{% if item.action.execution_status == 'pending' %}
<span class="text-xs text-gray-500 dark:text-gray-400">Scheduled</span>
{% elif item.action.executed_at %}
<span class="text-xs text-gray-500 dark:text-gray-400">
{{ item.action.executed_at|local_datetime('%H:%M') }}
</span>
{% endif %}
</div>
</div>
{% endfor %}
</div>
{% else %}
<div class="text-center py-6 text-gray-500 dark:text-gray-400">
<svg class="w-10 h-10 mx-auto mb-2 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"></path>
</svg>
<p class="text-sm">No scheduled actions for today</p>
</div>
{% endif %}
@@ -51,7 +51,7 @@
|
||||
{% for unit in units %}
|
||||
<tr class="hover:bg-gray-50 dark:hover:bg-gray-700 transition-colors"
|
||||
data-device-type="{{ unit.device_type }}"
|
||||
data-status="{% if unit.deployed %}deployed{% elif unit.retired %}retired{% elif unit.ignored %}ignored{% else %}benched{% endif %}"
|
||||
data-status="{% if unit.deployed %}deployed{% elif unit.out_for_calibration %}out_for_calibration{% elif unit.retired %}retired{% elif unit.ignored %}ignored{% elif unit.allocated %}allocated{% else %}benched{% endif %}"
|
||||
data-health="{{ unit.status }}"
|
||||
data-id="{{ unit.id }}"
|
||||
data-type="{{ unit.device_type }}"
|
||||
@@ -60,7 +60,13 @@
|
||||
data-note="{{ unit.note if unit.note else '' }}">
|
||||
<td class="px-6 py-4 whitespace-nowrap">
|
||||
<div class="flex items-center space-x-2">
|
||||
{% if unit.status == 'OK' %}
|
||||
{% if unit.out_for_calibration %}
|
||||
<span class="w-3 h-3 rounded-full bg-purple-500" title="Out for Calibration"></span>
|
||||
{% elif unit.allocated %}
|
||||
<span class="w-3 h-3 rounded-full bg-orange-400" title="Allocated"></span>
|
||||
{% elif not unit.deployed %}
|
||||
<span class="w-3 h-3 rounded-full bg-gray-400 dark:bg-gray-500" title="Benched"></span>
|
||||
{% elif unit.status == 'OK' %}
|
||||
<span class="w-3 h-3 rounded-full bg-green-500" title="OK"></span>
|
||||
{% elif unit.status == 'Pending' %}
|
||||
<span class="w-3 h-3 rounded-full bg-yellow-500" title="Pending"></span>
|
||||
@@ -70,6 +76,10 @@
|
||||
|
||||
{% if unit.deployed %}
|
||||
<span class="w-2 h-2 rounded-full bg-blue-500" title="Deployed"></span>
|
||||
{% elif unit.out_for_calibration %}
|
||||
<span class="w-2 h-2 rounded-full bg-purple-400" title="Out for Calibration"></span>
|
||||
{% elif unit.allocated %}
|
||||
<span class="w-2 h-2 rounded-full bg-orange-400" title="Allocated"></span>
|
||||
{% else %}
|
||||
<span class="w-2 h-2 rounded-full bg-gray-300 dark:bg-gray-600" title="Benched"></span>
|
||||
{% endif %}
|
||||
@@ -104,14 +114,19 @@
|
||||
{% if unit.phone_number %}
|
||||
<div>{{ unit.phone_number }}</div>
|
||||
{% endif %}
|
||||
{% if unit.hardware_model %}
|
||||
<div class="text-gray-500 dark:text-gray-500">{{ unit.hardware_model }}</div>
|
||||
{% if unit.deployed_with_unit_id %}
|
||||
<div>
|
||||
<span class="text-gray-500 dark:text-gray-500">Linked:</span>
|
||||
<a href="/unit/{{ unit.deployed_with_unit_id }}" class="text-seismo-orange hover:underline font-medium">
|
||||
{{ unit.deployed_with_unit_id }}
|
||||
</a>
|
||||
</div>
|
||||
{% endif %}
|
||||
{% else %}
|
||||
{% if unit.next_calibration_due %}
|
||||
{% if unit.last_calibrated %}
|
||||
<div>
|
||||
<span class="text-gray-500 dark:text-gray-500">Cal Due:</span>
|
||||
<span class="font-medium">{{ unit.next_calibration_due }}</span>
|
||||
<span class="text-gray-500 dark:text-gray-500">Last Cal:</span>
|
||||
<span class="font-medium">{{ unit.last_calibrated }}</span>
|
||||
</div>
|
||||
{% endif %}
|
||||
{% if unit.deployed_with_modem_id %}
|
||||
@@ -126,7 +141,7 @@
|
||||
</div>
|
||||
</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap">
|
||||
<div class="text-sm text-gray-500 dark:text-gray-400">{{ unit.last_seen }}</div>
|
||||
<div class="text-sm text-gray-500 dark:text-gray-400 last-seen-cell" data-iso="{{ unit.last_seen }}">{{ unit.last_seen }}</div>
|
||||
</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap">
|
||||
<div class="text-sm
|
||||
@@ -196,14 +211,20 @@
|
||||
<div class="unit-card device-card"
|
||||
onclick="openUnitModal('{{ unit.id }}', '{{ unit.status }}', '{{ unit.age }}')"
|
||||
data-device-type="{{ unit.device_type }}"
|
||||
data-status="{% if unit.deployed %}deployed{% elif unit.retired %}retired{% elif unit.ignored %}ignored{% else %}benched{% endif %}"
|
||||
data-status="{% if unit.deployed %}deployed{% elif unit.out_for_calibration %}out_for_calibration{% elif unit.retired %}retired{% elif unit.ignored %}ignored{% elif unit.allocated %}allocated{% else %}benched{% endif %}"
|
||||
data-health="{{ unit.status }}"
|
||||
data-unit-id="{{ unit.id }}"
|
||||
data-age="{{ unit.age }}">
|
||||
<!-- Header: Status Dot + Unit ID + Status Badge -->
|
||||
<div class="flex items-center justify-between mb-2">
|
||||
<div class="flex items-center gap-2">
|
||||
{% if unit.status == 'OK' %}
|
||||
{% if unit.out_for_calibration %}
|
||||
<span class="w-4 h-4 rounded-full bg-purple-500" title="Out for Calibration"></span>
|
||||
{% elif unit.allocated %}
|
||||
<span class="w-4 h-4 rounded-full bg-orange-400" title="Allocated"></span>
|
||||
{% elif not unit.deployed %}
|
||||
<span class="w-4 h-4 rounded-full bg-gray-400 dark:bg-gray-500" title="Benched"></span>
|
||||
{% elif unit.status == 'OK' %}
|
||||
<span class="w-4 h-4 rounded-full bg-green-500" title="OK"></span>
|
||||
{% elif unit.status == 'Pending' %}
|
||||
<span class="w-4 h-4 rounded-full bg-yellow-500" title="Pending"></span>
|
||||
@@ -215,12 +236,14 @@
|
||||
<span class="font-bold text-lg text-seismo-orange dark:text-seismo-orange">{{ unit.id }}</span>
|
||||
</div>
|
||||
<span class="px-3 py-1 rounded-full text-xs font-medium
|
||||
{% if unit.status == 'OK' %}bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-300
|
||||
{% if unit.out_for_calibration %}bg-purple-100 dark:bg-purple-900/30 text-purple-800 dark:text-purple-300
|
||||
{% elif unit.allocated %}bg-orange-100 dark:bg-orange-900/30 text-orange-800 dark:text-orange-300
|
||||
{% elif unit.status == 'OK' %}bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-300
|
||||
{% elif unit.status == 'Pending' %}bg-yellow-100 dark:bg-yellow-900/30 text-yellow-800 dark:text-yellow-300
|
||||
{% elif unit.status == 'Missing' %}bg-red-100 dark:bg-red-900/30 text-red-800 dark:text-red-300
|
||||
{% else %}bg-gray-100 dark:bg-gray-700 text-gray-600 dark:text-gray-400
|
||||
{% endif %}">
|
||||
{% if unit.status in ['N/A', 'Unknown'] %}Benched{% else %}{{ unit.status }}{% endif %}
|
||||
{% if unit.out_for_calibration %}Out for Cal{% elif unit.allocated %}Allocated{% elif unit.status in ['N/A', 'Unknown'] %}Benched{% else %}{{ unit.status }}{% endif %}
|
||||
</span>
|
||||
</div>
|
||||
|
||||
@@ -230,6 +253,10 @@
|
||||
<span class="px-2 py-1 rounded-full bg-purple-100 dark:bg-purple-900/30 text-purple-800 dark:text-purple-300 text-xs font-medium">
|
||||
Modem
|
||||
</span>
|
||||
{% elif unit.device_type == 'slm' %}
|
||||
<span class="px-2 py-1 rounded-full bg-orange-100 dark:bg-orange-900/30 text-orange-800 dark:text-orange-300 text-xs font-medium">
|
||||
SLM
|
||||
</span>
|
||||
{% else %}
|
||||
<span class="px-2 py-1 rounded-full bg-blue-100 dark:bg-blue-900/30 text-blue-800 dark:text-blue-300 text-xs font-medium">
|
||||
Seismograph
|
||||
@@ -266,6 +293,10 @@
|
||||
<span class="text-xs text-blue-600 dark:text-blue-400">
|
||||
⚡ Deployed
|
||||
</span>
|
||||
{% elif unit.out_for_calibration %}
|
||||
<span class="text-xs text-purple-600 dark:text-purple-400">
|
||||
🔧 Out for Calibration
|
||||
</span>
|
||||
{% else %}
|
||||
<span class="text-xs text-gray-500 dark:text-gray-500">
|
||||
📦 Benched
|
||||
@@ -345,6 +376,39 @@
|
||||
</style>
|
||||
|
||||
<script>
|
||||
(function() {
|
||||
// User's configured timezone from settings (defaults to America/New_York)
|
||||
const userTimezone = '{{ user_timezone | default("America/New_York") }}';
|
||||
|
||||
// Format ISO timestamp to human-readable format in user's timezone
|
||||
function formatLastSeenLocal(isoString) {
|
||||
if (!isoString || isoString === 'Never' || isoString === 'N/A') {
|
||||
return isoString || 'Never';
|
||||
}
|
||||
try {
|
||||
const date = new Date(isoString);
|
||||
if (isNaN(date.getTime())) return isoString;
|
||||
|
||||
// Format in user's configured timezone
|
||||
return date.toLocaleString('en-US', {
|
||||
timeZone: userTimezone,
|
||||
month: 'short',
|
||||
day: 'numeric',
|
||||
hour: 'numeric',
|
||||
minute: '2-digit',
|
||||
hour12: true
|
||||
});
|
||||
} catch (e) {
|
||||
return isoString;
|
||||
}
|
||||
}

    // Format all last-seen cells on page load
    document.querySelectorAll('.last-seen-cell').forEach(cell => {
        const isoDate = cell.getAttribute('data-iso');
        cell.textContent = formatLastSeenLocal(isoDate);
    });

    // Update timestamp
    const timestampElement = document.getElementById('last-updated');
    if (timestampElement) {
@@ -365,20 +429,23 @@
        };
        return acc;
    }, {});
})();

-// Sorting state
-let currentSort = { column: null, direction: 'asc' };
+// Sorting state (needs to persist across swaps)
+if (typeof window.currentSort === 'undefined') {
+    window.currentSort = { column: null, direction: 'asc' };
+}

function sortTable(column) {
    const tbody = document.getElementById('roster-tbody');
    const rows = Array.from(tbody.getElementsByTagName('tr'));

    // Determine sort direction
-   if (currentSort.column === column) {
-       currentSort.direction = currentSort.direction === 'asc' ? 'desc' : 'asc';
+   if (window.currentSort.column === column) {
+       window.currentSort.direction = window.currentSort.direction === 'asc' ? 'desc' : 'asc';
    } else {
-       currentSort.column = column;
-       currentSort.direction = 'asc';
+       window.currentSort.column = column;
+       window.currentSort.direction = 'asc';
    }

    // Sort rows
@@ -406,8 +473,8 @@
        bVal = bVal.toLowerCase();
    }

-   if (aVal < bVal) return currentSort.direction === 'asc' ? -1 : 1;
-   if (aVal > bVal) return currentSort.direction === 'asc' ? 1 : -1;
+   if (aVal < bVal) return window.currentSort.direction === 'asc' ? -1 : 1;
+   if (aVal > bVal) return window.currentSort.direction === 'asc' ? 1 : -1;
    return 0;
});

@@ -443,10 +510,10 @@
    });

    // Set current indicator
-   if (currentSort.column) {
-       const indicator = document.querySelector(`.sort-indicator[data-column="${currentSort.column}"]`);
+   if (window.currentSort.column) {
+       const indicator = document.querySelector(`.sort-indicator[data-column="${window.currentSort.column}"]`);
        if (indicator) {
-           indicator.className = `sort-indicator ${currentSort.direction}`;
+           indicator.className = `sort-indicator ${window.currentSort.direction}`;
        }
    }
}
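Editor's aside on why this change moves the state onto `window`: a `let` declaration lives in the script's own scope, so if the script re-runs after a partial swap, the sort state is recreated from scratch. Guarding initialization on a `window` property keeps it alive. A minimal sketch of the pattern (the `fakeWindow` object stands in for the browser's `window`; the swap mechanics themselves are assumed, not shown):

```javascript
// Guarded initialization: re-running this does NOT reset existing state.
function initSortState(root) {
  if (typeof root.currentSort === 'undefined') {
    root.currentSort = { column: null, direction: 'asc' };
  }
  return root.currentSort;
}

const fakeWindow = {}; // stand-in for the browser's global window object

const first = initSortState(fakeWindow);
first.column = 'unit_id';
first.direction = 'desc';

// Simulate the script executing again after a content swap:
const second = initSortState(fakeWindow);
console.log(second.column, second.direction); // prints: unit_id desc
```

The unguarded `let currentSort = …` version would have handed back a fresh `{ column: null, direction: 'asc' }` on every re-run, losing the user's chosen sort.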
templates/partials/fleet_calendar/available_units.html (new file, 40 lines)
@@ -0,0 +1,40 @@
<!-- Available Units for Assignment -->
{% if units %}
<div class="space-y-1">
    {% for unit in units %}
    <label class="flex items-center gap-3 p-2 hover:bg-gray-50 dark:hover:bg-gray-700 rounded cursor-pointer">
        <input type="checkbox" name="unit_ids" value="{{ unit.id }}"
               class="w-4 h-4 text-blue-600 focus:ring-blue-500 rounded border-gray-300 dark:border-gray-600">
        <span class="font-medium text-gray-900 dark:text-white">{{ unit.id }}</span>
        <span class="text-sm text-gray-500 dark:text-gray-400 flex-1">
            {% if unit.last_calibrated %}
            Cal: {{ unit.last_calibrated }}
            {% else %}
            No cal date
            {% endif %}
        </span>
        {% if unit.calibration_status == 'expiring_soon' %}
        <span class="text-xs px-2 py-0.5 bg-yellow-100 dark:bg-yellow-900/30 text-yellow-700 dark:text-yellow-400 rounded-full">
            Expiring soon
        </span>
        {% endif %}
        {% if unit.deployed %}
        <span class="text-xs px-2 py-0.5 bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400 rounded-full">
            Deployed
        </span>
        {% else %}
        <span class="text-xs px-2 py-0.5 bg-gray-100 dark:bg-gray-700 text-gray-600 dark:text-gray-400 rounded-full">
            Benched
        </span>
        {% endif %}
    </label>
    {% endfor %}
</div>
{% else %}
<p class="text-gray-500 dark:text-gray-400 text-sm py-4 text-center">
    No units available for this date range.
    {% if start_date and end_date %}
    <br><span class="text-xs">All units are either reserved, have expired calibrations, or are retired.</span>
    {% endif %}
</p>
{% endif %}
templates/partials/fleet_calendar/day_detail.html (new file, 186 lines)
@@ -0,0 +1,186 @@
<!-- Day Detail Panel Content -->
<div class="flex items-center justify-between mb-6">
    <h2 class="text-xl font-semibold text-gray-900 dark:text-white">{{ date_display }}</h2>
    <button onclick="closeDayPanel()" class="text-gray-400 hover:text-gray-600 dark:hover:text-gray-300">
        <svg class="w-6 h-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
        </svg>
    </button>
</div>

<!-- Summary Stats -->
<div class="grid grid-cols-2 gap-3 mb-6">
    <div class="bg-green-50 dark:bg-green-900/20 rounded-lg p-3 text-center">
        <p class="text-2xl font-bold text-green-700 dark:text-green-300">{{ day_data.counts.available }}</p>
        <p class="text-xs text-green-600 dark:text-green-400">Available</p>
    </div>
    <div class="bg-blue-50 dark:bg-blue-900/20 rounded-lg p-3 text-center">
        <p class="text-2xl font-bold text-blue-700 dark:text-blue-300">{{ day_data.counts.reserved }}</p>
        <p class="text-xs text-blue-600 dark:text-blue-400">Reserved</p>
    </div>
    <div class="bg-yellow-50 dark:bg-yellow-900/20 rounded-lg p-3 text-center">
        <p class="text-2xl font-bold text-yellow-700 dark:text-yellow-300">{{ day_data.counts.expiring_soon }}</p>
        <p class="text-xs text-yellow-600 dark:text-yellow-400">Expiring Soon</p>
    </div>
    <div class="bg-red-50 dark:bg-red-900/20 rounded-lg p-3 text-center">
        <p class="text-2xl font-bold text-red-700 dark:text-red-300">{{ day_data.counts.expired }}</p>
        <p class="text-xs text-red-600 dark:text-red-400">Cal. Expired</p>
    </div>
</div>

<!-- Calibration Expiring TODAY - Important alert -->
{% if day_data.cal_expiring_today %}
<div class="mb-6 p-3 bg-red-50 dark:bg-red-900/30 border border-red-200 dark:border-red-800 rounded-lg">
    <h3 class="text-sm font-semibold text-red-700 dark:text-red-400 mb-2 flex items-center gap-2">
        <svg class="w-4 h-4" fill="currentColor" viewBox="0 0 20 20">
            <path fill-rule="evenodd" d="M8.257 3.099c.765-1.36 2.722-1.36 3.486 0l5.58 9.92c.75 1.334-.213 2.98-1.742 2.98H4.42c-1.53 0-2.493-1.646-1.743-2.98l5.58-9.92zM11 13a1 1 0 11-2 0 1 1 0 012 0zm-1-8a1 1 0 00-1 1v3a1 1 0 002 0V6a1 1 0 00-1-1z" clip-rule="evenodd"/>
        </svg>
        Calibration Expires Today ({{ day_data.cal_expiring_today|length }})
    </h3>
    <div class="space-y-1">
        {% for unit in day_data.cal_expiring_today %}
        <div class="flex items-center justify-between p-2 bg-white dark:bg-gray-800 rounded text-sm">
            <a href="/unit/{{ unit.id }}" class="font-medium text-red-600 dark:text-red-400 hover:underline">
                {{ unit.id }}
            </a>
            <span class="text-red-500 text-xs">
                Last cal: {{ unit.last_calibrated }}
            </span>
        </div>
        {% endfor %}
    </div>
</div>
{% endif %}

<!-- Reservations on this date -->
{% if day_data.reservations %}
<div class="mb-6">
    <h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">Reservations</h3>
    {% for res in day_data.reservations %}
    <div class="reservation-bar mb-2" style="background-color: {{ res.color }}20; border-left: 4px solid {{ res.color }};">
        <div class="flex-1">
            <p class="font-medium text-gray-900 dark:text-white">{{ res.name }}</p>
            <p class="text-xs text-gray-500 dark:text-gray-400">
                {{ res.start_date }} - {{ res.end_date }}
            </p>
        </div>
        <div class="text-right">
            <p class="font-semibold text-gray-900 dark:text-white">
                {% if res.assignment_type == 'quantity' %}
                {{ res.assigned_count }}/{{ res.quantity_needed or '?' }}
                {% else %}
                {{ res.assigned_count }} units
                {% endif %}
            </p>
        </div>
    </div>
    {% endfor %}
</div>
{% endif %}

<!-- Available Units -->
{% if day_data.available_units %}
<div class="mb-6">
    <h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">
        Available Units ({{ day_data.available_units|length }})
    </h3>
    <div class="max-h-48 overflow-y-auto space-y-1">
        {% for unit in day_data.available_units %}
        <div class="flex items-center justify-between p-2 bg-gray-50 dark:bg-gray-700/50 rounded text-sm">
            <a href="/unit/{{ unit.id }}" class="font-medium text-blue-600 dark:text-blue-400 hover:underline">
                {{ unit.id }}
            </a>
            <span class="text-gray-500 dark:text-gray-400 text-xs">
                {% if unit.last_calibrated %}
                Cal: {{ unit.last_calibrated }}
                {% else %}
                No cal date
                {% endif %}
            </span>
        </div>
        {% endfor %}
    </div>
</div>
{% endif %}

<!-- Reserved Units -->
{% if day_data.reserved_units %}
<div class="mb-6">
    <h3 class="text-sm font-semibold text-gray-700 dark:text-gray-300 mb-3">
        Reserved Units ({{ day_data.reserved_units|length }})
    </h3>
    <div class="max-h-48 overflow-y-auto space-y-1">
        {% for unit in day_data.reserved_units %}
        <div class="flex items-center justify-between p-2 bg-blue-50 dark:bg-blue-900/20 rounded text-sm">
            <a href="/unit/{{ unit.id }}" class="font-medium text-blue-600 dark:text-blue-400 hover:underline">
                {{ unit.id }}
            </a>
            <span class="text-blue-600 dark:text-blue-400 text-xs">
                {{ unit.reservation_name }}
            </span>
        </div>
        {% endfor %}
    </div>
</div>
{% endif %}

<!-- Calibration Expired -->
{% if day_data.expired_units %}
<div class="mb-6">
    <h3 class="text-sm font-semibold text-red-600 dark:text-red-400 mb-3">
        Calibration Expired ({{ day_data.expired_units|length }})
    </h3>
    <div class="max-h-48 overflow-y-auto space-y-1">
        {% for unit in day_data.expired_units %}
        <div class="flex items-center justify-between p-2 bg-red-50 dark:bg-red-900/20 rounded text-sm">
            <a href="/unit/{{ unit.id }}" class="font-medium text-red-600 dark:text-red-400 hover:underline">
                {{ unit.id }}
            </a>
            <span class="text-red-500 text-xs">
                Expired: {{ unit.expiry_date }}
            </span>
        </div>
        {% endfor %}
    </div>
</div>
{% endif %}

<!-- Needs Calibration -->
{% if day_data.needs_calibration_units %}
<div class="mb-6">
    <h3 class="text-sm font-semibold text-gray-600 dark:text-gray-400 mb-3">
        Needs Calibration Date ({{ day_data.needs_calibration_units|length }})
    </h3>
    <div class="max-h-32 overflow-y-auto space-y-1">
        {% for unit in day_data.needs_calibration_units %}
        <div class="flex items-center justify-between p-2 bg-gray-100 dark:bg-gray-700/50 rounded text-sm">
            <a href="/unit/{{ unit.id }}" class="font-medium text-gray-600 dark:text-gray-400 hover:underline">
                {{ unit.id }}
            </a>
            <span class="text-gray-400 text-xs">No cal date set</span>
        </div>
        {% endfor %}
    </div>
</div>
{% endif %}

<!-- Expiring Soon (informational) -->
{% if day_data.expiring_soon_units %}
<div class="mb-6">
    <h3 class="text-sm font-semibold text-yellow-600 dark:text-yellow-400 mb-3">
        Calibration Expiring Soon ({{ day_data.expiring_soon_units|length }})
    </h3>
    <div class="max-h-32 overflow-y-auto space-y-1">
        {% for unit in day_data.expiring_soon_units %}
        <div class="flex items-center justify-between p-2 bg-yellow-50 dark:bg-yellow-900/20 rounded text-sm">
            <a href="/unit/{{ unit.id }}" class="font-medium text-yellow-700 dark:text-yellow-400 hover:underline">
                {{ unit.id }}
            </a>
            <span class="text-yellow-600 text-xs">
                Expires: {{ unit.expiry_date }}
            </span>
        </div>
        {% endfor %}
    </div>
</div>
{% endif %}
templates/partials/fleet_calendar/reservations_list.html (new file, 203 lines)
@@ -0,0 +1,203 @@
<!-- Reservations List -->
{% if reservations %}
<div class="space-y-2">
    {% for item in reservations %}
    {% set res = item.reservation %}
    {% set card_id = "res-card-" ~ res.id %}
    {% set detail_id = "res-detail-" ~ res.id %}

    <div class="rounded-lg border border-gray-200 dark:border-gray-700"
         style="border-left: 4px solid {{ res.color }};">

        <!-- Header row (always visible, clickable) -->
        <div class="res-card-header flex items-center justify-between p-4 cursor-pointer hover:bg-gray-50 dark:hover:bg-gray-700/50 transition-colors select-none"
             data-res-id="{{ res.id }}"
             onclick="toggleResCard('{{ res.id }}')">

            <div class="flex-1 min-w-0">
                <div class="flex items-center gap-2 flex-wrap">
                    <h3 class="font-semibold text-gray-900 dark:text-white">{{ res.name }}</h3>
                    {% if res.device_type == 'slm' %}
                    <span class="px-2 py-0.5 text-xs font-medium bg-purple-100 dark:bg-purple-900/30 text-purple-700 dark:text-purple-400 rounded">SLM</span>
                    {% else %}
                    <span class="px-2 py-0.5 text-xs font-medium bg-blue-100 dark:bg-blue-900/30 text-blue-700 dark:text-blue-400 rounded">Seismograph</span>
                    {% endif %}
                    {% if item.has_conflicts %}
                    <span class="px-2 py-0.5 text-xs font-medium bg-amber-100 dark:bg-amber-900/30 text-amber-700 dark:text-amber-400 rounded"
                          title="{{ item.conflict_count }} unit(s) will need a calibration swap during this job">
                        {{ item.conflict_count }} cal swap{{ 's' if item.conflict_count != 1 else '' }}
                    </span>
                    {% endif %}
                </div>
                <p class="text-sm text-gray-500 dark:text-gray-400 mt-0.5">
                    {{ res.start_date.strftime('%b %d, %Y') }} –
                    {% if res.end_date %}
                    {{ res.end_date.strftime('%b %d, %Y') }}
                    {% elif res.end_date_tbd %}
                    <span class="text-yellow-600 dark:text-yellow-400 font-medium">TBD</span>
                    {% if res.estimated_end_date %}
                    <span class="text-gray-400">(est. {{ res.estimated_end_date.strftime('%b %d, %Y') }})</span>
                    {% endif %}
                    {% else %}
                    <span class="text-yellow-600 dark:text-yellow-400">Ongoing</span>
                    {% endif %}
                </p>
            </div>

            <!-- Counts -->
            <div class="flex flex-col items-end gap-1 mx-4 flex-shrink-0">
                {% set full = item.assigned_count == item.location_count and item.location_count > 0 %}
                {% set remaining = item.location_count - item.assigned_count %}
                <!-- Number row -->
                <div class="flex items-baseline gap-2">
                    <span class="text-xs text-gray-400 dark:text-gray-500">est. {% if res.estimated_units %}{{ res.estimated_units }}{% else %}—{% endif %}</span>
                    <span class="text-gray-300 dark:text-gray-600">·</span>
                    <span class="text-base font-bold {% if full %}text-green-600 dark:text-green-400{% elif item.assigned_count == 0 %}text-gray-400 dark:text-gray-500{% else %}text-amber-500 dark:text-amber-400{% endif %}">
                        {{ item.assigned_count }}/{{ item.location_count }}
                    </span>
                    {% if remaining > 0 %}
                    <span class="text-xs text-amber-500 dark:text-amber-400 whitespace-nowrap">({{ remaining }} more)</span>
                    {% endif %}
                </div>
                <!-- Progress squares -->
                {% if item.location_count > 0 %}
                <div class="flex gap-0.5">
                    {% for i in range(item.location_count) %}
                    <span class="w-3 h-3 rounded-sm {% if i < item.assigned_count %}{% if full %}bg-green-500{% else %}bg-amber-500{% endif %}{% else %}bg-gray-300 dark:bg-gray-600{% endif %}"></span>
                    {% endfor %}
                </div>
                {% endif %}
            </div>

            <!-- Action buttons -->
            <div class="flex items-center gap-1 flex-shrink-0">
                <!-- Assign units (always visible) -->
                <button onclick="event.stopPropagation(); openPlanner('{{ res.id }}')"
                        class="p-2 text-gray-400 hover:text-green-600 dark:hover:text-green-400 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700"
                        title="Assign units">
                    <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5H7a2 2 0 00-2 2v12a2 2 0 002 2h10a2 2 0 002-2V7a2 2 0 00-2-2h-2M9 5a2 2 0 002 2h2a2 2 0 002-2M9 5a2 2 0 012-2h2a2 2 0 012 2m-3 7h3m-3 4h3m-6-4h.01M9 16h.01"/>
                    </svg>
                </button>

                <!-- "..." overflow menu -->
                <div class="relative" onclick="event.stopPropagation()">
                    <button onclick="toggleResMenu('{{ res.id }}')"
                            class="p-2 text-gray-400 hover:text-gray-600 dark:hover:text-gray-300 rounded-lg hover:bg-gray-100 dark:hover:bg-gray-700"
                            title="More options">
                        <svg class="w-4 h-4" fill="currentColor" viewBox="0 0 24 24">
                            <circle cx="5" cy="12" r="1.5"/><circle cx="12" cy="12" r="1.5"/><circle cx="19" cy="12" r="1.5"/>
                        </svg>
                    </button>
                    <div id="res-menu-{{ res.id }}"
                         class="hidden absolute right-0 top-8 z-20 w-44 bg-white dark:bg-slate-800 border border-gray-200 dark:border-gray-700 rounded-lg shadow-lg py-1">
                        <button onclick="openPromoteModal('{{ res.id }}', '{{ res.name }}'); toggleResMenu('{{ res.id }}')"
                                class="w-full text-left px-4 py-2 text-sm text-gray-700 dark:text-gray-300 hover:bg-gray-50 dark:hover:bg-slate-700 flex items-center gap-2">
                            <svg class="w-4 h-4 text-emerald-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M5 10l7-7m0 0l7 7m-7-7v18"/>
                            </svg>
                            Promote to Project
                        </button>
                        <button onclick="editReservation('{{ res.id }}'); toggleResMenu('{{ res.id }}')"
                                class="w-full text-left px-4 py-2 text-sm text-gray-700 dark:text-gray-300 hover:bg-gray-50 dark:hover:bg-slate-700 flex items-center gap-2">
                            <svg class="w-4 h-4 text-blue-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z"/>
                            </svg>
                            Edit
                        </button>
                        <div class="border-t border-gray-100 dark:border-gray-700 my-1"></div>
                        <button onclick="deleteReservation('{{ res.id }}', '{{ res.name }}'); toggleResMenu('{{ res.id }}')"
                                class="w-full text-left px-4 py-2 text-sm text-red-600 dark:text-red-400 hover:bg-red-50 dark:hover:bg-red-900/20 flex items-center gap-2">
                            <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"/>
                            </svg>
                            Delete
                        </button>
                    </div>
                </div>

                <!-- Chevron -->
                <svg id="chevron-{{ res.id }}" class="w-4 h-4 text-gray-400 transition-transform duration-200 ml-1 pointer-events-none" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"/>
                </svg>
            </div>
        </div>

        <!-- Expandable detail panel -->
        <div id="{{ detail_id }}" class="hidden border-t border-gray-100 dark:border-gray-700 bg-gray-50 dark:bg-slate-800/60 px-4 py-3">

            {% if res.notes %}
            <p class="text-sm text-gray-500 dark:text-gray-400 mb-3 italic">{{ res.notes }}</p>
            {% endif %}

            <div class="grid grid-cols-2 gap-x-6 gap-y-1 text-sm mb-3">
                <div class="text-gray-500 dark:text-gray-400">Estimated</div>
                <div class="font-medium {% if res.estimated_units %}text-gray-800 dark:text-gray-200{% else %}text-gray-400 dark:text-gray-500 italic{% endif %}">
                    {% if res.estimated_units %}{{ res.estimated_units }} unit{{ 's' if res.estimated_units != 1 else '' }}{% else %}not specified{% endif %}
                </div>
                <div class="text-gray-500 dark:text-gray-400">Locations</div>
                <div class="font-medium text-gray-800 dark:text-gray-200">{{ item.assigned_count }} of {{ item.location_count }} filled</div>
                {% if item.assigned_count < item.location_count %}
                <div class="text-gray-500 dark:text-gray-400">Still needed</div>
                <div class="font-medium text-amber-600 dark:text-amber-400">{{ item.location_count - item.assigned_count }} location{{ 's' if (item.location_count - item.assigned_count) != 1 else '' }} remaining</div>
                {% endif %}
                {% if item.has_conflicts %}
                <div class="text-gray-500 dark:text-gray-400">Cal swaps</div>
                <div class="font-medium text-amber-600 dark:text-amber-400">{{ item.conflict_count }} unit{{ 's' if item.conflict_count != 1 else '' }} will need swapping during job</div>
                {% endif %}
            </div>

            {% if item.assigned_units %}
            <p class="text-xs font-semibold uppercase tracking-wide text-gray-400 dark:text-gray-500 mb-2">Monitoring Locations</p>
            <div class="flex flex-col gap-1">
                {% for u in item.assigned_units %}
                <div class="rounded bg-white dark:bg-slate-700 border border-gray-100 dark:border-gray-600 text-sm">
                    <div class="flex items-center gap-3 px-3 py-1.5">
                        <span class="text-gray-400 dark:text-gray-500 text-xs w-12 flex-shrink-0">Loc. {{ loop.index }}</span>
                        <div class="flex flex-col min-w-0">
                            {% if u.location_name %}
                            <span class="text-xs font-semibold text-gray-700 dark:text-gray-300 truncate">{{ u.location_name }}</span>
                            {% endif %}
                            <button onclick="openUnitDetailModal('{{ u.id }}')"
                                    class="font-medium text-blue-600 dark:text-blue-400 hover:underline text-left text-sm">{{ u.id }}</button>
                        </div>
                        <span class="flex-1"></span>
                        {% if u.power_type == 'ac' %}
                        <span class="text-xs px-1.5 py-0.5 bg-blue-50 dark:bg-blue-900/20 text-blue-600 dark:text-blue-400 rounded">A/C</span>
                        {% elif u.power_type == 'solar' %}
                        <span class="text-xs px-1.5 py-0.5 bg-yellow-50 dark:bg-yellow-900/20 text-yellow-600 dark:text-yellow-400 rounded">Solar</span>
                        {% endif %}
                        {% if u.deployed %}
                        <span class="text-xs px-1.5 py-0.5 bg-green-50 dark:bg-green-900/20 text-green-600 dark:text-green-400 rounded">Deployed</span>
                        {% else %}
                        <span class="text-xs px-1.5 py-0.5 bg-gray-100 dark:bg-gray-600 text-gray-500 dark:text-gray-400 rounded">Benched</span>
                        {% endif %}
                        {% if u.last_calibrated %}
                        <span class="text-xs text-gray-400 dark:text-gray-500">Cal: {{ u.last_calibrated.strftime('%b %d, %Y') }}</span>
                        {% endif %}
                    </div>
                    {% if u.notes %}
                    <p class="px-3 pb-1.5 text-xs text-gray-400 dark:text-gray-500 italic">{{ u.notes }}</p>
                    {% endif %}
                </div>
                {% endfor %}
            </div>
            {% else %}
            <p class="text-sm text-gray-400 dark:text-gray-500 italic">No units assigned yet. Click the clipboard icon to plan.</p>
            {% endif %}
        </div>

    </div>
    {% endfor %}
</div>

<!-- toggleResCard, deleteReservation, editReservation, openUnitDetailModal defined in fleet_calendar.html -->
{% else %}
<div class="text-center py-8">
    <svg class="w-12 h-12 mx-auto text-gray-400 dark:text-gray-500 mb-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"/>
    </svg>
    <p class="text-gray-500 dark:text-gray-400">No jobs yet</p>
    <p class="text-sm text-gray-400 dark:text-gray-500 mt-1">Click "New Job" to start planning a deployment</p>
</div>
{% endif %}
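Editor's aside: the counts column in this template derives a "full" flag, a remaining count, and one colored square per monitoring location, all inline in Jinja. The same logic as a standalone sketch (the `reservationProgress` helper is hypothetical, not part of the codebase):

```javascript
// Hypothetical sketch of the fill/remaining logic the template computes inline:
// "full" only when every location has a unit AND there is at least one location;
// squares are green when full, amber for a partial fill, gray for empty slots.
function reservationProgress(assignedCount, locationCount) {
  const full = assignedCount === locationCount && locationCount > 0;
  const remaining = locationCount - assignedCount;
  // One square per location, mirroring {% for i in range(item.location_count) %}
  const squares = Array.from({ length: locationCount }, (_, i) =>
    i < assignedCount ? (full ? 'green' : 'amber') : 'gray'
  );
  return { full, remaining, squares };
}

console.log(reservationProgress(2, 5));
// -> { full: false, remaining: 3, squares: ['amber','amber','gray','gray','gray'] }
```

The `locationCount > 0` guard matters: a job with zero locations should not read as "full" just because 0 === 0.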
templates/partials/modem_list.html (new file, 127 lines)
@@ -0,0 +1,127 @@
<!-- Modem List -->
{% if modems %}
<div class="overflow-x-auto">
    <table class="w-full">
        <thead class="bg-gray-50 dark:bg-slate-700 border-b border-gray-200 dark:border-gray-600">
            <tr>
                <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Unit ID</th>
                <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Status</th>
                <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">IP Address</th>
                <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Phone</th>
                <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Paired Device</th>
                <th class="px-4 py-3 text-left text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Location</th>
                <th class="px-4 py-3 text-right text-xs font-medium text-gray-700 dark:text-gray-300 uppercase tracking-wider">Actions</th>
            </tr>
        </thead>
        <tbody class="divide-y divide-gray-200 dark:divide-gray-700">
            {% for modem in modems %}
            <tr class="hover:bg-gray-50 dark:hover:bg-slate-700 transition-colors">
                <td class="px-4 py-3 whitespace-nowrap">
                    <div class="flex items-center gap-2">
                        <a href="/unit/{{ modem.id }}" class="font-medium text-blue-600 dark:text-blue-400 hover:underline">
                            {{ modem.id }}
                        </a>
                        {% if modem.hardware_model %}
                        <span class="text-xs text-gray-500 dark:text-gray-400">({{ modem.hardware_model }})</span>
                        {% endif %}
                    </div>
                </td>
                <td class="px-4 py-3 whitespace-nowrap">
                    {% if modem.status == "retired" %}
                    <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-gray-200 text-gray-700 dark:bg-gray-700 dark:text-gray-300">
                        Retired
                    </span>
                    {% elif modem.status == "benched" %}
                    <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-300">
                        Benched
                    </span>
                    {% elif modem.status == "in_use" %}
                    <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-green-100 text-green-800 dark:bg-green-900/30 dark:text-green-300">
                        In Use
                    </span>
                    {% elif modem.status == "spare" %}
                    <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300">
                        Spare
                    </span>
                    {% else %}
                    <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-gray-100 text-gray-800 dark:bg-gray-700 dark:text-gray-300">
                        —
                    </span>
                    {% endif %}
                </td>
                <td class="px-4 py-3 whitespace-nowrap text-sm">
                    {% if modem.ip_address %}
                    <span class="font-mono text-gray-900 dark:text-gray-300">{{ modem.ip_address }}</span>
                    {% else %}
                    <span class="text-gray-400 dark:text-gray-600">—</span>
                    {% endif %}
                </td>
                <td class="px-4 py-3 whitespace-nowrap text-sm text-gray-900 dark:text-gray-300">
                    {% if modem.phone_number %}
                    {{ modem.phone_number }}
                    {% else %}
                    <span class="text-gray-400 dark:text-gray-600">—</span>
                    {% endif %}
                </td>
                <td class="px-4 py-3 whitespace-nowrap text-sm">
                    {% if modem.paired_device %}
                    <a href="/unit/{{ modem.paired_device.id }}" class="text-blue-600 dark:text-blue-400 hover:underline">
                        {{ modem.paired_device.id }}
                        <span class="text-gray-500 dark:text-gray-400">({{ modem.paired_device.device_type }})</span>
                    </a>
                    {% else %}
                    <span class="text-gray-400 dark:text-gray-600">None</span>
                    {% endif %}
                </td>
                <td class="px-4 py-3 text-sm text-gray-900 dark:text-gray-300">
                    {% if modem.project_id %}
                    <span class="bg-gray-200 dark:bg-gray-700 px-1.5 py-0.5 rounded text-xs mr-1">{{ modem.project_id }}</span>
                    {% endif %}
                    {% if modem.location %}
                    <span class="truncate max-w-xs inline-block" title="{{ modem.location }}">{{ modem.location }}</span>
                    {% elif not modem.project_id %}
                    <span class="text-gray-400 dark:text-gray-600">—</span>
                    {% endif %}
                </td>
                <td class="px-4 py-3 whitespace-nowrap text-right text-sm">
                    <div class="flex items-center justify-end gap-2">
                        <button onclick="pingModem('{{ modem.id }}')"
                                id="ping-btn-{{ modem.id }}"
                                class="text-xs px-2 py-1 bg-blue-100 hover:bg-blue-200 text-blue-700 dark:bg-blue-900/30 dark:hover:bg-blue-900/50 dark:text-blue-300 rounded transition-colors">
                            Ping
                        </button>
                        <a href="/unit/{{ modem.id }}" class="text-blue-600 dark:text-blue-400 hover:underline">
                            View →
                        </a>
                    </div>
                    <!-- Ping Result (hidden by default) -->
                    <div id="ping-result-{{ modem.id }}" class="mt-1 text-xs hidden"></div>
                </td>
            </tr>
            {% endfor %}
        </tbody>
    </table>
</div>

{% if search %}
<div class="mt-4 text-sm text-gray-600 dark:text-gray-400">
    Found {{ modems|length }} modem(s) matching "{{ search }}"
</div>
{% endif %}

{% else %}
<div class="text-center py-12 text-gray-500 dark:text-gray-400">
    <svg class="w-12 h-12 mx-auto mb-3 opacity-50" fill="none" stroke="currentColor" viewBox="0 0 24 24">
        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
    </svg>
    <p>No modems found</p>
    {% if search %}
    <button onclick="document.getElementById('modem-search').value = ''; htmx.trigger('#modem-search', 'keyup');"
            class="mt-3 text-blue-600 dark:text-blue-400 hover:underline">
        Clear search
    </button>
    {% else %}
    <p class="text-sm mt-1">Add modems from the <a href="/roster" class="text-seismo-orange hover:underline">Fleet Roster</a></p>
    {% endif %}
</div>
{% endif %}
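Editor's aside: each row renders a `#ping-btn-{{ modem.id }}` button and a hidden `#ping-result-{{ modem.id }}` element, but the `pingModem()` handler itself is defined elsewhere and not shown in this diff. A minimal sketch of how such a handler might drive those elements; everything here is an assumption (the `/api/modems/:id/ping` endpoint, its `{ ok, latency_ms }` response shape, and the injected `fetchFn`/`doc` parameters, which exist only to make the sketch testable outside a browser):

```javascript
// Hypothetical handler driving the per-row elements rendered by the template.
// fetchFn/doc are injected stand-ins for window.fetch and document.
async function pingModem(modemId, { fetchFn, doc }) {
  const btn = doc.getElementById(`ping-btn-${modemId}`);
  const result = doc.getElementById(`ping-result-${modemId}`);
  btn.disabled = true; // prevent double-clicks while the ping is in flight
  try {
    const res = await fetchFn(`/api/modems/${modemId}/ping`, { method: 'POST' });
    const data = await res.json();
    result.textContent = data.ok
      ? `Reachable (${data.latency_ms} ms)`
      : 'Unreachable';
  } catch (e) {
    result.textContent = 'Ping failed';
  } finally {
    result.classList.remove('hidden'); // reveal the per-row result element
    btn.disabled = false;
  }
}
```

The `finally` block is the point: the result element is unhidden and the button re-enabled whether the request succeeds, returns a failure, or throws.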
templates/partials/modem_paired_device.html (new file, 64 lines)
@@ -0,0 +1,64 @@
|
||||
<!-- Paired Device Info for Modem Detail Page -->
{% if device %}
<div class="flex items-center gap-4 p-4 bg-green-50 dark:bg-green-900/20 rounded-lg">
    <div class="bg-green-100 dark:bg-green-900/30 p-3 rounded-lg">
        {% if device.device_type == "slm" or device.device_type == "sound_level_meter" %}
        <svg class="w-6 h-6 text-green-600 dark:text-green-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
        </svg>
        {% else %}
        <svg class="w-6 h-6 text-green-600 dark:text-green-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 3v2m6-2v2M9 19v2m6-2v2M5 9H3m2 6H3m18-6h-2m2 6h-2M7 19h10a2 2 0 002-2V7a2 2 0 00-2-2H7a2 2 0 00-2 2v10a2 2 0 002 2zM9 9h6v6H9V9z"></path>
        </svg>
        {% endif %}
    </div>
    <div class="flex-1">
        <p class="text-sm text-gray-500 dark:text-gray-400">Currently paired with</p>
        <a href="/unit/{{ device.id }}" class="text-lg font-semibold text-green-700 dark:text-green-400 hover:underline">
            {{ device.id }}
        </a>
        <div class="flex items-center gap-2 mt-1 text-sm text-gray-600 dark:text-gray-400">
            <span class="capitalize">{{ device.device_type | replace("_", " ") }}</span>
            {% if device.project_id %}
            <span class="text-gray-400">|</span>
            <span>{{ device.project_id }}</span>
            {% endif %}
            {% if device.deployed %}
            <span class="px-1.5 py-0.5 bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-300 text-xs rounded">Deployed</span>
            {% else %}
            <span class="px-1.5 py-0.5 bg-amber-100 dark:bg-amber-900/30 text-amber-700 dark:text-amber-300 text-xs rounded">Benched</span>
            {% endif %}
        </div>
    </div>
    <div class="flex items-center gap-2">
        <button onclick="openModemPairDeviceModal()"
                class="px-3 py-1.5 text-sm bg-gray-100 hover:bg-gray-200 dark:bg-gray-700 dark:hover:bg-gray-600 text-gray-700 dark:text-gray-300 rounded-lg transition-colors">
            Edit Pairing
        </button>
        <a href="/unit/{{ device.id }}" class="text-gray-400 hover:text-seismo-orange transition-colors">
            <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5l7 7-7 7"></path>
            </svg>
        </a>
    </div>
</div>
{% else %}
<div class="flex items-center gap-4 p-4 bg-gray-50 dark:bg-gray-800 rounded-lg">
    <div class="bg-gray-200 dark:bg-gray-700 p-3 rounded-lg">
        <svg class="w-6 h-6 text-gray-500 dark:text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M18.364 18.364A9 9 0 005.636 5.636m12.728 12.728A9 9 0 015.636 5.636m12.728 12.728L5.636 5.636"></path>
        </svg>
    </div>
    <div class="flex-1">
        <p class="text-gray-600 dark:text-gray-400">No device currently paired</p>
        <p class="text-sm text-gray-500 dark:text-gray-500">This modem is available for assignment</p>
    </div>
    <button onclick="openModemPairDeviceModal()"
            class="px-4 py-2 text-sm bg-seismo-orange hover:bg-orange-600 text-white rounded-lg transition-colors flex items-center gap-2">
        <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1"></path>
        </svg>
        Pair Device
    </button>
</div>
{% endif %}
templates/partials/modem_picker.html (new file, 128 lines)
@@ -0,0 +1,128 @@
{#
    Modem Picker Component
    A reusable HTMX-based autocomplete for selecting modems.

    Usage: include "partials/modem_picker.html" with context

    Variables available in context:
    - selected_modem_id: Pre-selected modem ID (optional)
    - selected_modem_display: Display text for pre-selected modem (optional)
    - input_name: Name attribute for the hidden input (default: "deployed_with_modem_id")
    - picker_id: Unique ID suffix for multiple pickers on same page (default: "")
#}

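For instance, a page hosting two pickers might look like this (a sketch assuming Jinja2's `{% with %}` scoping and its default include-with-context behavior; the `-swap` suffix and `replacement_modem_id` name are illustrative, not part of the codebase):

```jinja
{# Default picker: hidden input named "deployed_with_modem_id" #}
{% include "partials/modem_picker.html" %}

{# Second picker on the same page: unique ID suffix and custom input name #}
{% with picker_id="-swap", input_name="replacement_modem_id" %}
    {% include "partials/modem_picker.html" %}
{% endwith %}
```

Because every element ID in the partial is suffixed with `picker_id`, the two instances do not collide.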
{% set picker_id = picker_id|default("") %}
{% set input_name = input_name|default("deployed_with_modem_id") %}
{% set selected_modem_id = selected_modem_id|default("") %}
{% set selected_modem_display = selected_modem_display|default("") %}

<div class="modem-picker relative" id="modem-picker-container{{ picker_id }}">
    <!-- Hidden input for form submission (stores modem ID) -->
    <input type="hidden"
           name="{{ input_name }}"
           id="modem-picker-value{{ picker_id }}"
           value="{{ selected_modem_id }}">

    <!-- Search Input -->
    <div class="relative">
        <input type="text"
               id="modem-picker-search{{ picker_id }}"
               placeholder="Search by modem ID, IP, or note..."
               class="w-full px-4 py-2 pr-10 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-seismo-orange focus:border-seismo-orange"
               autocomplete="off"
               value="{{ selected_modem_display }}"
               hx-get="/api/roster/search/modems"
               hx-trigger="keyup changed delay:300ms, focus"
               hx-target="#modem-picker-dropdown{{ picker_id }}"
               hx-vals='{"picker_id": "{{ picker_id }}"}'
               name="q"
               onfocus="showModemDropdown('{{ picker_id }}')"
               oninput="handleModemSearchInput('{{ picker_id }}', this.value)">

        <!-- Search icon -->
        <div class="absolute inset-y-0 right-0 flex items-center pr-3 pointer-events-none">
            <svg class="w-5 h-5 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"></path>
            </svg>
        </div>

        <!-- Clear button (shown when modem is selected) -->
        <button type="button"
                id="modem-picker-clear{{ picker_id }}"
                class="absolute inset-y-0 right-8 flex items-center pr-1 {{ 'hidden' if not selected_modem_id else '' }}"
                onclick="clearModemSelection('{{ picker_id }}')"
                title="Clear selection">
            <svg class="w-4 h-4 text-gray-400 hover:text-gray-600 dark:hover:text-gray-300" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path>
            </svg>
        </button>
    </div>

    <!-- Dropdown Results Container -->
    <div id="modem-picker-dropdown{{ picker_id }}"
         class="hidden absolute z-50 w-full mt-1 bg-white dark:bg-slate-800 border border-gray-300 dark:border-gray-600 rounded-lg shadow-lg max-h-60 overflow-y-auto">
        <!-- Results loaded via HTMX -->
    </div>
</div>

<script>
{# Modem picker functions - defined once, work for any picker_id #}
if (typeof selectModem === 'undefined') {
    function selectModem(modemId, displayText, pickerId = '') {
        const valueInput = document.getElementById('modem-picker-value' + pickerId);
        const searchInput = document.getElementById('modem-picker-search' + pickerId);
        const dropdown = document.getElementById('modem-picker-dropdown' + pickerId);
        const clearBtn = document.getElementById('modem-picker-clear' + pickerId);

        if (valueInput) valueInput.value = modemId;
        if (searchInput) searchInput.value = displayText;
        if (dropdown) dropdown.classList.add('hidden');
        if (clearBtn) clearBtn.classList.remove('hidden');
    }

    function clearModemSelection(pickerId = '') {
        const valueInput = document.getElementById('modem-picker-value' + pickerId);
        const searchInput = document.getElementById('modem-picker-search' + pickerId);
        const clearBtn = document.getElementById('modem-picker-clear' + pickerId);

        if (valueInput) valueInput.value = '';
        if (searchInput) {
            searchInput.value = '';
            searchInput.focus();
        }
        if (clearBtn) clearBtn.classList.add('hidden');
    }

    function showModemDropdown(pickerId = '') {
        const dropdown = document.getElementById('modem-picker-dropdown' + pickerId);
        if (dropdown) dropdown.classList.remove('hidden');
    }

    function hideModemDropdown(pickerId = '') {
        const dropdown = document.getElementById('modem-picker-dropdown' + pickerId);
        if (dropdown) dropdown.classList.add('hidden');
    }

    function handleModemSearchInput(pickerId, value) {
        const valueInput = document.getElementById('modem-picker-value' + pickerId);
        const clearBtn = document.getElementById('modem-picker-clear' + pickerId);

        // If user clears the search box, also clear the hidden value
        if (!value.trim()) {
            if (valueInput) valueInput.value = '';
            if (clearBtn) clearBtn.classList.add('hidden');
        }
    }

    // Close dropdown when clicking outside
    document.addEventListener('click', function(event) {
        const pickers = document.querySelectorAll('.modem-picker');
        pickers.forEach(picker => {
            if (!picker.contains(event.target)) {
                const dropdown = picker.querySelector('[id^="modem-picker-dropdown"]');
                if (dropdown) dropdown.classList.add('hidden');
            }
        });
    });
}
</script>
templates/partials/modem_search_results.html (new file, 61 lines)
@@ -0,0 +1,61 @@
{#
    Modem Search Results Partial
    Rendered by /api/roster/search/modems endpoint for HTMX dropdown.

    Variables:
    - modems: List of modem dicts with id, ip_address, phone_number, note, deployed, display
    - query: The search query string
    - show_empty: Boolean - show "no results" message
#}

{% set picker_id = request.query_params.get('picker_id', '') %}

{% if modems %}
{% for modem in modems %}
<div class="px-4 py-3 hover:bg-gray-100 dark:hover:bg-gray-700 cursor-pointer border-b border-gray-100 dark:border-gray-700 last:border-0 transition-colors"
     onclick="selectModem('{{ modem.id }}', '{{ modem.display|e }}', '{{ picker_id }}')">
    <div class="flex items-start justify-between gap-2">
        <div class="flex-1 min-w-0">
            <div class="font-medium text-gray-900 dark:text-white truncate">
                <span class="text-seismo-orange font-semibold">{{ modem.id }}</span>
                {% if modem.ip_address %}
                <span class="text-gray-400 mx-1">-</span>
                <span class="text-gray-600 dark:text-gray-400 font-mono text-sm">{{ modem.ip_address }}</span>
                {% endif %}
            </div>
            {% if modem.note %}
            <div class="text-sm text-gray-500 dark:text-gray-400 truncate">
                {{ modem.note }}
            </div>
            {% endif %}
        </div>
        <div class="flex items-center gap-2">
            {% if not modem.deployed %}
            <span class="flex-shrink-0 text-xs px-2 py-0.5 bg-gray-100 dark:bg-gray-600 text-gray-600 dark:text-gray-300 rounded">
                Benched
            </span>
            {% endif %}
        </div>
    </div>
</div>
{% endfor %}
{% endif %}

{% if show_empty %}
<div class="px-4 py-6 text-center text-gray-500 dark:text-gray-400">
    <svg class="w-8 h-8 mx-auto mb-2 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"></path>
    </svg>
    <p class="text-sm">No modems found matching "{{ query }}"</p>
</div>
{% endif %}

{% if not modems and not show_empty %}
<div class="px-4 py-6 text-center text-gray-500 dark:text-gray-400">
    <svg class="w-8 h-8 mx-auto mb-2 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"></path>
    </svg>
    <p class="text-sm">Start typing to search modems...</p>
    <p class="text-xs mt-1">Search by modem ID, IP address, or note</p>
</div>
{% endif %}
templates/partials/modem_stats.html (new file, 63 lines)
@@ -0,0 +1,63 @@
<!-- Modem summary stat cards -->

<!-- Total Modems Card -->
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
    <div class="flex items-center justify-between">
        <div>
            <p class="text-sm text-gray-600 dark:text-gray-400 font-medium">Total Modems</p>
            <p class="text-3xl font-bold text-gray-900 dark:text-white mt-1">{{ total_count }}</p>
        </div>
        <div class="bg-blue-100 dark:bg-blue-900/30 p-3 rounded-lg">
            <svg class="w-8 h-8 text-blue-600 dark:text-blue-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
            </svg>
        </div>
    </div>
</div>

<!-- In Use Card (paired with device) -->
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
    <div class="flex items-center justify-between">
        <div>
            <p class="text-sm text-gray-600 dark:text-gray-400 font-medium">In Use</p>
            <p class="text-3xl font-bold text-green-600 dark:text-green-400 mt-1">{{ in_use_count }}</p>
        </div>
        <div class="bg-green-100 dark:bg-green-900/30 p-3 rounded-lg">
            <svg class="w-8 h-8 text-green-600 dark:text-green-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z"></path>
            </svg>
        </div>
    </div>
    <p class="text-xs text-gray-500 dark:text-gray-400 mt-2">Paired with a device</p>
</div>

<!-- Spare Card (deployed but not paired) -->
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
    <div class="flex items-center justify-between">
        <div>
            <p class="text-sm text-gray-600 dark:text-gray-400 font-medium">Spare</p>
            <p class="text-3xl font-bold text-seismo-orange mt-1">{{ spare_count }}</p>
        </div>
        <div class="bg-orange-100 dark:bg-orange-900/30 p-3 rounded-lg">
            <svg class="w-8 h-8 text-seismo-orange" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M5 8h14M5 8a2 2 0 110-4h14a2 2 0 110 4M5 8v10a2 2 0 002 2h10a2 2 0 002-2V8m-9 4h4"></path>
            </svg>
        </div>
    </div>
    <p class="text-xs text-gray-500 dark:text-gray-400 mt-2">Available for assignment</p>
</div>

<!-- Benched Card -->
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
    <div class="flex items-center justify-between">
        <div>
            <p class="text-sm text-gray-600 dark:text-gray-400 font-medium">Benched</p>
            <p class="text-3xl font-bold text-gray-500 dark:text-gray-400 mt-1">{{ benched_count }}</p>
        </div>
        <div class="bg-gray-200 dark:bg-gray-700 p-3 rounded-lg">
            <svg class="w-8 h-8 text-gray-600 dark:text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M20 13V6a2 2 0 00-2-2H6a2 2 0 00-2 2v7m16 0v5a2 2 0 01-2 2H6a2 2 0 01-2-2v-5m16 0h-2.586a1 1 0 00-.707.293l-2.414 2.414a1 1 0 01-.707.293h-3.172a1 1 0 01-.707-.293l-2.414-2.414A1 1 0 006.586 13H4"></path>
            </svg>
        </div>
    </div>
</div>
templates/partials/project_create_modal.html (new file, 301 lines)
@@ -0,0 +1,301 @@
{#
    Quick Create Project Modal
    Allows inline creation of a new project from the project picker.

    Include this modal in pages that use the project picker.
#}

<div id="quickCreateProjectModal" class="hidden fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
    <div class="bg-white dark:bg-slate-800 rounded-xl shadow-2xl max-w-md w-full mx-4">
        <div class="p-6 border-b border-gray-200 dark:border-gray-700">
            <div class="flex justify-between items-center">
                <h2 class="text-xl font-bold text-gray-900 dark:text-white">Create New Project</h2>
                <button type="button" onclick="closeCreateProjectModal()" class="text-gray-400 hover:text-gray-600 dark:hover:text-gray-300">
                    <svg class="w-6 h-6" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path>
                    </svg>
                </button>
            </div>
        </div>

        <form id="quickCreateProjectForm" class="p-6 space-y-4">
            <!-- Hidden field to track which picker opened this modal -->
            <input type="hidden" id="qcp-picker-id" value="">

            <div>
                <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">
                    Project Number
                    <span class="text-gray-400 font-normal">(xxxx-YY)</span>
                </label>
                <input type="text"
                       name="project_number"
                       id="qcp-project-number"
                       pattern="\d{4}-\d{2}"
                       placeholder="2567-23"
                       class="w-full px-4 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-seismo-orange">
                <p class="text-xs text-gray-500 dark:text-gray-400 mt-1">TMI internal project number (optional)</p>
            </div>

            <div>
                <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">
                    Client Name <span class="text-red-500">*</span>
                </label>
                <input type="text"
                       name="client_name"
                       id="qcp-client-name"
                       required
                       placeholder="PJ Dick, Turner Construction, etc."
                       class="w-full px-4 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-seismo-orange">
            </div>

            <div>
                <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">
                    Project Name <span class="text-red-500">*</span>
                </label>
                <input type="text"
                       name="name"
                       id="qcp-project-name"
                       required
                       placeholder="RKM Hall, CMU Campus, Building 7, etc."
                       class="w-full px-4 py-2 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-seismo-orange">
                <p class="text-xs text-gray-500 dark:text-gray-400 mt-1">Site or building name</p>
            </div>

            <div>
                <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">
                    Modules
                    <span class="text-gray-400 font-normal">(optional)</span>
                </label>
                <div class="grid grid-cols-2 gap-2">
                    <label class="flex items-center gap-2 p-2.5 border border-gray-200 dark:border-gray-700 rounded-lg cursor-pointer hover:border-orange-400 has-[:checked]:border-orange-400 has-[:checked]:bg-orange-50 dark:has-[:checked]:bg-orange-900/20 transition-colors">
                        <input type="checkbox" name="module_sound" value="1" class="accent-seismo-orange">
                        <div>
                            <p class="text-sm font-medium text-gray-900 dark:text-white leading-tight">Sound</p>
                            <p class="text-xs text-gray-500 dark:text-gray-400">SLMs, sessions, reports</p>
                        </div>
                    </label>
                    <label class="flex items-center gap-2 p-2.5 border border-gray-200 dark:border-gray-700 rounded-lg cursor-pointer hover:border-blue-400 has-[:checked]:border-blue-400 has-[:checked]:bg-blue-50 dark:has-[:checked]:bg-blue-900/20 transition-colors">
                        <input type="checkbox" name="module_vibration" value="1" class="accent-blue-500">
                        <div>
                            <p class="text-sm font-medium text-gray-900 dark:text-white leading-tight">Vibration</p>
                            <p class="text-xs text-gray-500 dark:text-gray-400">Seismographs, modems</p>
                        </div>
                    </label>
                </div>
            </div>

            <div>
                <label class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2">
                    Data Collection <span class="text-red-500">*</span>
                </label>
                <div class="grid grid-cols-2 gap-3">
                    <label class="flex items-start gap-3 p-3 border-2 border-seismo-orange bg-orange-50 dark:bg-orange-900/20 rounded-lg cursor-pointer" id="qcp-mode-manual-label">
                        <input type="radio" name="data_collection_mode" value="manual" checked
                               onchange="qcpUpdateModeStyles()"
                               class="mt-0.5 accent-seismo-orange shrink-0">
                        <div>
                            <p class="text-sm font-medium text-gray-900 dark:text-white">Manual</p>
                            <p class="text-xs text-gray-500 dark:text-gray-400">SD card retrieved daily</p>
                        </div>
                    </label>
                    <label class="flex items-start gap-3 p-3 border-2 border-gray-300 dark:border-gray-600 rounded-lg cursor-pointer" id="qcp-mode-remote-label">
                        <input type="radio" name="data_collection_mode" value="remote"
                               onchange="qcpUpdateModeStyles()"
                               class="mt-0.5 accent-seismo-orange shrink-0">
                        <div>
                            <p class="text-sm font-medium text-gray-900 dark:text-white">Remote</p>
                            <p class="text-xs text-gray-500 dark:text-gray-400">Modem, data pulled via FTP</p>
                        </div>
                    </label>
                </div>
            </div>

            <div id="qcp-error" class="hidden p-3 bg-red-100 dark:bg-red-900/30 text-red-700 dark:text-red-300 rounded-lg text-sm">
            </div>

            <div class="flex gap-3 pt-2">
                <button type="submit"
                        id="qcp-submit-btn"
                        class="flex-1 px-4 py-2 bg-seismo-orange hover:bg-orange-600 text-white rounded-lg font-medium transition-colors flex items-center justify-center gap-2">
                    <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 4v16m8-8H4"></path>
                    </svg>
                    Create & Select
                </button>
                <button type="button"
                        onclick="closeCreateProjectModal()"
                        class="px-4 py-2 bg-gray-300 dark:bg-gray-600 hover:bg-gray-400 dark:hover:bg-gray-500 text-gray-700 dark:text-white rounded-lg font-medium transition-colors">
                    Cancel
                </button>
            </div>
        </form>
    </div>
</div>

<script>
function qcpUpdateModeStyles() {
    const manualChecked = document.querySelector('input[name="data_collection_mode"][value="manual"]')?.checked;
    const manualLabel = document.getElementById('qcp-mode-manual-label');
    const remoteLabel = document.getElementById('qcp-mode-remote-label');
    if (!manualLabel || !remoteLabel) return;
    if (manualChecked) {
        manualLabel.classList.add('border-seismo-orange', 'bg-orange-50', 'dark:bg-orange-900/20');
        manualLabel.classList.remove('border-gray-300', 'dark:border-gray-600');
        remoteLabel.classList.remove('border-seismo-orange', 'bg-orange-50', 'dark:bg-orange-900/20');
        remoteLabel.classList.add('border-gray-300', 'dark:border-gray-600');
    } else {
        remoteLabel.classList.add('border-seismo-orange', 'bg-orange-50', 'dark:bg-orange-900/20');
        remoteLabel.classList.remove('border-gray-300', 'dark:border-gray-600');
        manualLabel.classList.remove('border-seismo-orange', 'bg-orange-50', 'dark:bg-orange-900/20');
        manualLabel.classList.add('border-gray-300', 'dark:border-gray-600');
    }
}

// Quick create project modal functions
if (typeof openCreateProjectModal === 'undefined') {
    function openCreateProjectModal(searchQuery, pickerId = '') {
        const modal = document.getElementById('quickCreateProjectModal');
        const pickerIdInput = document.getElementById('qcp-picker-id');
        const projectNumInput = document.getElementById('qcp-project-number');
        const clientNameInput = document.getElementById('qcp-client-name');
        const projectNameInput = document.getElementById('qcp-project-name');
        const errorDiv = document.getElementById('qcp-error');

        // Store which picker opened this
        if (pickerIdInput) pickerIdInput.value = pickerId;

        // Reset form
        document.getElementById('quickCreateProjectForm').reset();
        qcpUpdateModeStyles();
        if (errorDiv) errorDiv.classList.add('hidden');

        // Try to parse the search query intelligently
        if (searchQuery) {
            // Check if it looks like a project number (xxxx-YY pattern)
            const projectNumMatch = searchQuery.match(/(\d{4}-\d{2})/);
            if (projectNumMatch) {
                if (projectNumInput) projectNumInput.value = projectNumMatch[1];
                // If there's more after the number, use it as client name
                const remainder = searchQuery.replace(projectNumMatch[1], '').replace(/^[\s\-]+/, '').trim();
                if (remainder && clientNameInput) clientNameInput.value = remainder;
            } else {
                // Not a project number - assume it's client or project name
                // If short (likely a name fragment), put it in client name
                if (clientNameInput) clientNameInput.value = searchQuery;
            }
        }

        // Show modal
        if (modal) modal.classList.remove('hidden');

        // Focus the first empty required field
        if (clientNameInput && !clientNameInput.value) {
            clientNameInput.focus();
        } else if (projectNameInput) {
            projectNameInput.focus();
        }
    }

    function closeCreateProjectModal() {
        const modal = document.getElementById('quickCreateProjectModal');
        if (modal) modal.classList.add('hidden');
    }

    // Handle quick create form submission
    document.getElementById('quickCreateProjectForm')?.addEventListener('submit', async function(e) {
        e.preventDefault();

        const submitBtn = document.getElementById('qcp-submit-btn');
        const errorDiv = document.getElementById('qcp-error');
        const pickerId = document.getElementById('qcp-picker-id')?.value || '';

        // Show loading state
        const originalBtnText = submitBtn.innerHTML;
        submitBtn.disabled = true;
        submitBtn.innerHTML = `
            <svg class="w-5 h-5 animate-spin" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"></path>
            </svg>
            Creating...
        `;
        if (errorDiv) errorDiv.classList.add('hidden');

        const formData = new FormData(this);

        try {
            const response = await fetch('/api/projects/create', {
                method: 'POST',
                body: formData
            });

            const result = await response.json();

            if (response.ok && result.success) {
                const projectId = result.project_id;

                // Add selected modules
                const moduleMap = { module_sound: 'sound_monitoring', module_vibration: 'vibration_monitoring' };
                for (const [field, moduleType] of Object.entries(moduleMap)) {
                    if (formData.get(field)) {
                        await fetch(`/api/projects/${projectId}/modules`, {
                            method: 'POST',
                            headers: {'Content-Type': 'application/json'},
                            body: JSON.stringify({ module_type: moduleType }),
                        });
                    }
                }

                // Build display text from form values
                const parts = [];
                const projectNumber = formData.get('project_number');
                const clientName = formData.get('client_name');
                const projectName = formData.get('name');

                if (projectNumber) parts.push(projectNumber);
                if (clientName) parts.push(clientName);
                if (projectName) parts.push(projectName);

                const displayText = parts.join(' - ');

                // Select the newly created project in the picker
                selectProject(projectId, displayText, pickerId);

                // Close modal
                closeCreateProjectModal();
            } else {
                // Show error
                if (errorDiv) {
                    errorDiv.textContent = result.detail || result.message || 'Failed to create project';
                    errorDiv.classList.remove('hidden');
                }
            }
        } catch (error) {
            if (errorDiv) {
                errorDiv.textContent = `Error: ${error.message}`;
                errorDiv.classList.remove('hidden');
            }
        } finally {
            // Restore button
            submitBtn.disabled = false;
            submitBtn.innerHTML = originalBtnText;
        }
    });

    // Close modal on backdrop click
    document.getElementById('quickCreateProjectModal')?.addEventListener('click', function(e) {
        if (e.target === this) {
            closeCreateProjectModal();
        }
    });

    // Close modal on Escape key
    document.addEventListener('keydown', function(e) {
        if (e.key === 'Escape') {
            const modal = document.getElementById('quickCreateProjectModal');
            if (modal && !modal.classList.contains('hidden')) {
                closeCreateProjectModal();
            }
        }
    });
}
</script>
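The query-parsing step inside `openCreateProjectModal` can be sketched in isolation as a pure function (the name `parseProjectQuery` is ours, for illustration only; it mirrors the regex and fallback logic of the modal above, without touching the DOM):

```javascript
// Hypothetical standalone version of the search-query parsing used by the
// quick-create modal. A "xxxx-YY" token (e.g. "2567-23") is treated as the
// TMI project number; any remaining text becomes the client name.
function parseProjectQuery(searchQuery) {
    const projectNumMatch = searchQuery.match(/(\d{4}-\d{2})/);
    if (projectNumMatch) {
        // Strip the number and any leading separators from what remains.
        const remainder = searchQuery
            .replace(projectNumMatch[1], '')
            .replace(/^[\s\-]+/, '')
            .trim();
        return { projectNumber: projectNumMatch[1], clientName: remainder };
    }
    // No number found: assume the whole query is a client-name fragment.
    return { projectNumber: '', clientName: searchQuery };
}

console.log(parseProjectQuery('2567-23 PJ Dick'));
console.log(parseProjectQuery('Turner Construction'));
```

Keeping this logic in a pure function like this would also make it unit-testable outside the browser.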
templates/partials/project_picker.html (new file, 128 lines)
@@ -0,0 +1,128 @@
{#
    Project Picker Component
    A reusable HTMX-based autocomplete for selecting projects.

    Usage: include "partials/project_picker.html" with context

    Variables available in context:
    - selected_project_id: Pre-selected project UUID (optional)
    - selected_project_display: Display text for pre-selected project (optional)
    - input_name: Name attribute for the hidden input (default: "project_id")
    - picker_id: Unique ID suffix for multiple pickers on same page (default: "")
#}

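As a sketch, an edit form could pre-seed the picker before including it (assuming Jinja2 `{% with %}` scoping; `unit.project_id` and `unit.project_display` are hypothetical context values, not names defined in this changeset):

```jinja
{% with selected_project_id=unit.project_id,
        selected_project_display=unit.project_display %}
    {% include "partials/project_picker.html" %}
{% endwith %}
```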
{% set picker_id = picker_id|default("") %}
{% set input_name = input_name|default("project_id") %}
{% set selected_project_id = selected_project_id|default("") %}
{% set selected_project_display = selected_project_display|default("") %}

<div class="project-picker relative" id="project-picker-container{{ picker_id }}">
    <!-- Hidden input for form submission (stores project UUID) -->
    <input type="hidden"
           name="{{ input_name }}"
           id="project-picker-value{{ picker_id }}"
           value="{{ selected_project_id }}">

    <!-- Search Input -->
    <div class="relative">
        <input type="text"
               id="project-picker-search{{ picker_id }}"
               placeholder="Search by project number, client, or name..."
               class="w-full px-4 py-2 pr-10 rounded-lg border border-gray-300 dark:border-gray-600 bg-white dark:bg-slate-700 text-gray-900 dark:text-white focus:ring-2 focus:ring-seismo-orange focus:border-seismo-orange"
               autocomplete="off"
               value="{{ selected_project_display }}"
               hx-get="/api/projects/search"
               hx-trigger="keyup changed delay:300ms, focus"
               hx-target="#project-picker-dropdown{{ picker_id }}"
               hx-vals='{"picker_id": "{{ picker_id }}"}'
               name="q"
               onfocus="showProjectDropdown('{{ picker_id }}')"
               oninput="handleProjectSearchInput('{{ picker_id }}', this.value)">

        <!-- Search icon -->
        <div class="absolute inset-y-0 right-0 flex items-center pr-3 pointer-events-none">
            <svg class="w-5 h-5 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"></path>
            </svg>
        </div>

        <!-- Clear button (shown when project is selected) -->
        <button type="button"
                id="project-picker-clear{{ picker_id }}"
                class="absolute inset-y-0 right-8 flex items-center pr-1 {{ 'hidden' if not selected_project_id else '' }}"
                onclick="clearProjectSelection('{{ picker_id }}')"
                title="Clear selection">
            <svg class="w-4 h-4 text-gray-400 hover:text-gray-600 dark:hover:text-gray-300" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path>
            </svg>
        </button>
    </div>

    <!-- Dropdown Results Container -->
    <div id="project-picker-dropdown{{ picker_id }}"
         class="hidden absolute z-50 w-full mt-1 bg-white dark:bg-slate-800 border border-gray-300 dark:border-gray-600 rounded-lg shadow-lg max-h-60 overflow-y-auto">
        <!-- Results loaded via HTMX -->
    </div>
</div>

<script>
// Project picker functions - defined once, work for any picker_id
if (typeof selectProject === 'undefined') {
    function selectProject(projectId, displayText, pickerId = '') {
        const valueInput = document.getElementById('project-picker-value' + pickerId);
        const searchInput = document.getElementById('project-picker-search' + pickerId);
        const dropdown = document.getElementById('project-picker-dropdown' + pickerId);
        const clearBtn = document.getElementById('project-picker-clear' + pickerId);

        if (valueInput) valueInput.value = projectId;
        if (searchInput) searchInput.value = displayText;
        if (dropdown) dropdown.classList.add('hidden');
        if (clearBtn) clearBtn.classList.remove('hidden');
    }

    function clearProjectSelection(pickerId = '') {
        const valueInput = document.getElementById('project-picker-value' + pickerId);
        const searchInput = document.getElementById('project-picker-search' + pickerId);
        const clearBtn = document.getElementById('project-picker-clear' + pickerId);

        if (valueInput) valueInput.value = '';
        if (searchInput) {
            searchInput.value = '';
            searchInput.focus();
        }
        if (clearBtn) clearBtn.classList.add('hidden');
    }

    function showProjectDropdown(pickerId = '') {
        const dropdown = document.getElementById('project-picker-dropdown' + pickerId);
        if (dropdown) dropdown.classList.remove('hidden');
    }

    function hideProjectDropdown(pickerId = '') {
        const dropdown = document.getElementById('project-picker-dropdown' + pickerId);
|
||||
if (dropdown) dropdown.classList.add('hidden');
|
||||
}
|
||||
|
||||
function handleProjectSearchInput(pickerId, value) {
|
||||
const valueInput = document.getElementById('project-picker-value' + pickerId);
|
||||
const clearBtn = document.getElementById('project-picker-clear' + pickerId);
|
||||
|
||||
// If user clears the search box, also clear the hidden value
|
||||
if (!value.trim()) {
|
||||
if (valueInput) valueInput.value = '';
|
||||
if (clearBtn) clearBtn.classList.add('hidden');
|
||||
}
|
||||
}
|
||||
|
||||
// Close dropdown when clicking outside
|
||||
document.addEventListener('click', function(event) {
|
||||
const pickers = document.querySelectorAll('.project-picker');
|
||||
pickers.forEach(picker => {
|
||||
if (!picker.contains(event.target)) {
|
||||
const dropdown = picker.querySelector('[id^="project-picker-dropdown"]');
|
||||
if (dropdown) dropdown.classList.add('hidden');
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
</script>
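The search input above targets `/api/projects/search`, whose implementation is not part of this diff. As a rough sketch of the matching logic such an endpoint might apply before rendering the results partial, assuming the field names documented there (`project_number`, `client_name`, `name`) and a `show_create` flag that is true when nothing matches:

```python
# Hypothetical server-side matching for /api/projects/search.
# The real endpoint is not shown in this diff; dict keys mirror the
# variables documented in project_search_results.html.

def search_projects(projects, query, limit=10):
    """Case-insensitive substring match on number, client, and name."""
    q = (query or "").strip().lower()
    if not q:
        return [], False  # empty query: no results, no create option
    matches = [
        p for p in projects
        if q in (p.get("project_number") or "").lower()
        or q in (p.get("client_name") or "").lower()
        or q in (p.get("name") or "").lower()
    ]
    show_create = not matches  # offer "Create new project" when nothing matches
    return matches[:limit], show_create
```

The `show_create` rule matches what the partial renders: the green "Create new project" row only appears when the result list is empty.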
69	templates/partials/project_search_results.html	Normal file
@@ -0,0 +1,69 @@
<!--
    Project Search Results Partial
    Rendered by /api/projects/search endpoint for HTMX dropdown.

    Variables:
    - projects: List of project dicts with id, project_number, client_name, name, display, status
    - query: The search query string
    - show_create: Boolean - show "Create new project" option when no matches
-->

{% set picker_id = request.query_params.get('picker_id', '') %}

{% if projects %}
{% for project in projects %}
<div class="px-4 py-3 hover:bg-gray-100 dark:hover:bg-gray-700 cursor-pointer border-b border-gray-100 dark:border-gray-700 last:border-0 transition-colors"
     onclick="selectProject('{{ project.id }}', '{{ project.display|e }}', '{{ picker_id }}')">
    <div class="flex items-start justify-between gap-2">
        <div class="flex-1 min-w-0">
            <div class="font-medium text-gray-900 dark:text-white truncate">
                {% if project.project_number %}
                <span class="text-seismo-orange font-semibold">{{ project.project_number }}</span>
                {% if project.client_name or project.name %}
                <span class="text-gray-400 mx-1">-</span>
                {% endif %}
                {% endif %}
                {% if project.client_name %}
                <span>{{ project.client_name }}</span>
                {% endif %}
            </div>
            {% if project.name %}
            <div class="text-sm text-gray-500 dark:text-gray-400 truncate">
                {{ project.name }}
            </div>
            {% endif %}
        </div>
        {% if project.status == 'completed' %}
        <span class="flex-shrink-0 text-xs px-2 py-0.5 bg-gray-100 dark:bg-gray-600 text-gray-600 dark:text-gray-300 rounded">
            Completed
        </span>
        {% endif %}
    </div>
</div>
{% endfor %}
{% endif %}

{% if show_create %}
<div class="px-4 py-3 hover:bg-green-50 dark:hover:bg-green-900/30 cursor-pointer border-t border-gray-200 dark:border-gray-600 transition-colors"
     onclick="openCreateProjectModal('{{ query|e }}', '{{ picker_id }}')">
    <div class="flex items-center gap-2 text-green-600 dark:text-green-400">
        <svg class="w-5 h-5 flex-shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 4v16m8-8H4"></path>
        </svg>
        <span class="font-medium">Create new project "{{ query }}"</span>
    </div>
    <p class="text-xs text-gray-500 dark:text-gray-400 mt-1 ml-7">
        No matching projects found. Click to create a new one.
    </p>
</div>
{% endif %}

{% if not projects and not show_create %}
<div class="px-4 py-6 text-center text-gray-500 dark:text-gray-400">
    <svg class="w-8 h-8 mx-auto mb-2 text-gray-300 dark:text-gray-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"></path>
    </svg>
    <p class="text-sm">Start typing to search projects...</p>
    <p class="text-xs mt-1">Search by project number, client name, or project name</p>
</div>
{% endif %}
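The partial passes `project.display` into `selectProject` but the diff never shows where `display` is built. Given how the markup joins number, client, and name with a dash separator, a plausible (purely hypothetical) server-side helper would be:

```python
# Hypothetical construction of the `display` string used by the search
# results partial; mirrors the "number - client - name" layout the
# template renders, skipping missing parts.

def project_display(p):
    parts = [p.get("project_number"), p.get("client_name"), p.get("name")]
    return " - ".join(x for x in parts if x)
```

Whatever the real implementation, `display` must be a single plain string, since it is escaped with `|e` and embedded in the inline `onclick` handler.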
188	templates/partials/projects/file_list.html	Normal file
@@ -0,0 +1,188 @@
<!-- File List for NRL - Simple flat list of files with session info -->
{% if files %}
<div class="divide-y divide-gray-200 dark:divide-gray-700">
    {% for file_data in files %}
    {% set file = file_data.file %}
    {% set session = file_data.session %}

    <div class="flex items-center gap-3 px-4 py-3 hover:bg-gray-50 dark:hover:bg-gray-800/50 transition-colors group">
        <!-- File Icon -->
        {% if file.file_type == 'audio' %}
        <svg class="w-6 h-6 text-blue-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19V6l12-3v13M9 19c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zm12-3c0 1.105-1.343 2-3 2s-3-.895-3-2 1.343-2 3-2 3 .895 3 2zM9 10l12-3"></path>
        </svg>
        {% elif file.file_type == 'archive' %}
        <svg class="w-6 h-6 text-purple-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M20 13V6a2 2 0 00-2-2H6a2 2 0 00-2 2v7m16 0v5a2 2 0 01-2 2H6a2 2 0 01-2-2v-5m16 0h-2.586a1 1 0 00-.707.293l-2.414 2.414a1 1 0 01-.707.293h-3.172a1 1 0 01-.707-.293l-2.414-2.414A1 1 0 006.586 13H4"></path>
        </svg>
        {% elif file.file_type == 'log' %}
        <svg class="w-6 h-6 text-gray-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12h6m-6 4h6m2 5H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z"></path>
        </svg>
        {% elif file.file_type == 'image' %}
        <svg class="w-6 h-6 text-green-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16l4.586-4.586a2 2 0 012.828 0L16 16m-2-2l1.586-1.586a2 2 0 012.828 0L20 14m-6-6h.01M6 20h12a2 2 0 002-2V6a2 2 0 00-2-2H6a2 2 0 00-2 2v12a2 2 0 002 2z"></path>
        </svg>
        {% elif file.file_type == 'measurement' %}
        <svg class="w-6 h-6 text-emerald-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
        </svg>
        {% else %}
        <svg class="w-6 h-6 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
            <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12h6m-6 4h6m2 5H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z"></path>
        </svg>
        {% endif %}

        <!-- File Info -->
        <div class="flex-1 min-w-0">
            <div class="flex items-center gap-2">
                <div class="font-medium text-gray-900 dark:text-white truncate">
                    {{ file.file_path.split('/')[-1] if file.file_path else 'Unknown' }}
                </div>
            </div>
            <div class="text-xs text-gray-500 dark:text-gray-400 mt-0.5">
                <!-- File Type Badge -->
                <span class="px-1.5 py-0.5 rounded font-medium
                    {% if file.file_type == 'audio' %}bg-blue-100 text-blue-700 dark:bg-blue-900/30 dark:text-blue-300
                    {% elif file.file_type == 'data' %}bg-green-100 text-green-700 dark:bg-green-900/30 dark:text-green-300
                    {% elif file.file_type == 'measurement' %}bg-emerald-100 text-emerald-700 dark:bg-emerald-900/30 dark:text-emerald-300
                    {% elif file.file_type == 'log' %}bg-gray-100 text-gray-700 dark:bg-gray-700 dark:text-gray-300
                    {% elif file.file_type == 'archive' %}bg-purple-100 text-purple-700 dark:bg-purple-900/30 dark:text-purple-300
                    {% elif file.file_type == 'image' %}bg-pink-100 text-pink-700 dark:bg-pink-900/30 dark:text-pink-300
                    {% else %}bg-gray-100 text-gray-700 dark:bg-gray-700 dark:text-gray-300{% endif %}">
                    {{ file.file_type or 'unknown' }}
                </span>

                {# Leq vs Lp badge for RND files #}
                {% if file.file_path and '_Leq_' in file.file_path %}
                <span class="px-1.5 py-0.5 rounded font-medium bg-blue-100 text-blue-700 dark:bg-blue-900/30 dark:text-blue-300">
                    Leq (15-min avg)
                </span>
                {% elif file.file_path and '_Lp' in file.file_path and file.file_path.endswith('.rnd') %}
                <span class="px-1.5 py-0.5 rounded font-medium bg-orange-100 text-orange-700 dark:bg-orange-900/30 dark:text-orange-300">
                    Lp (instant)
                </span>
                {% endif %}

                <!-- File Size -->
                <span class="mx-1">•</span>
                {% if file.file_size_bytes %}
                    {% if file.file_size_bytes < 1024 %}
                        {{ file.file_size_bytes }} B
                    {% elif file.file_size_bytes < 1048576 %}
                        {{ "%.1f"|format(file.file_size_bytes / 1024) }} KB
                    {% elif file.file_size_bytes < 1073741824 %}
                        {{ "%.1f"|format(file.file_size_bytes / 1048576) }} MB
                    {% else %}
                        {{ "%.2f"|format(file.file_size_bytes / 1073741824) }} GB
                    {% endif %}
                {% else %}
                    Unknown size
                {% endif %}

                <!-- Session Info -->
                {% if session %}
                <span class="mx-1">•</span>
                <span class="text-gray-400">Session: {{ session.started_at|local_datetime if session.started_at else 'Unknown' }}</span>
                {% endif %}

                <!-- Download Time -->
                {% if file.downloaded_at %}
                <span class="mx-1">•</span>
                {{ file.downloaded_at|local_datetime }}
                {% endif %}

                <!-- Checksum Indicator -->
                {% if file.checksum %}
                <span class="mx-1" title="SHA256: {{ file.checksum[:16] }}...">
                    <svg class="w-3 h-3 inline text-green-600" fill="currentColor" viewBox="0 0 20 20">
                        <path fill-rule="evenodd" d="M2.166 4.999A11.954 11.954 0 0010 1.944 11.954 11.954 0 0017.834 5c.11.65.166 1.32.166 2.001 0 5.225-3.34 9.67-8 11.317C5.34 16.67 2 12.225 2 7c0-.682.057-1.35.166-2.001zm11.541 3.708a1 1 0 00-1.414-1.414L9 10.586 7.707 9.293a1 1 0 00-1.414 1.414l2 2a1 1 0 001.414 0l4-4z" clip-rule="evenodd"></path>
                    </svg>
                </span>
                {% endif %}
            </div>
        </div>

        <!-- Action Buttons -->
        <div class="opacity-0 group-hover:opacity-100 transition-opacity flex items-center gap-2">
            {% if file.file_type == 'measurement' or (file.file_path and file.file_path.endswith('.rnd')) %}
            <a href="/api/projects/{{ project_id }}/files/{{ file.id }}/view-rnd"
               onclick="event.stopPropagation();"
               class="px-3 py-1 text-xs bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 transition-colors flex items-center">
                <svg class="w-4 h-4 inline mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z"></path>
                </svg>
                View
            </a>
            {% endif %}
            {# Only show Report button for Leq files #}
            {% if file.file_path and '_Leq_' in file.file_path %}
            <a href="/api/projects/{{ project_id }}/files/{{ file.id }}/generate-report"
               onclick="event.stopPropagation();"
               class="px-3 py-1 text-xs bg-blue-600 text-white rounded-lg hover:bg-blue-700 transition-colors flex items-center"
               title="Generate Excel Report">
                <svg class="w-4 h-4 inline mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 17v-2m3 2v-4m3 4v-6m2 10H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z"></path>
                </svg>
                Report
            </a>
            {% endif %}
            <button onclick="event.stopPropagation(); downloadFile('{{ file.id }}')"
                    class="px-3 py-1 text-xs bg-seismo-orange text-white rounded-lg hover:bg-seismo-navy transition-colors">
                <svg class="w-4 h-4 inline mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-4l-4 4m0 0l-4-4m4 4V4"></path>
                </svg>
                Download
            </button>
            <button onclick="event.stopPropagation(); confirmDeleteFile('{{ file.id }}', '{{ file.file_path.split('/')[-1] if file.file_path else 'Unknown' }}')"
                    class="px-3 py-1 text-xs bg-red-600 text-white rounded-lg hover:bg-red-700 transition-colors"
                    title="Delete this file">
                <svg class="w-4 h-4 inline" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"></path>
                </svg>
            </button>
        </div>
    </div>
    {% endfor %}
</div>
{% else %}
<!-- Empty State -->
<div class="px-6 py-12 text-center">
    <svg class="w-16 h-16 mx-auto mb-4 text-gray-400" fill="none" stroke="currentColor" viewBox="0 0 24 24">
        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 21h10a2 2 0 002-2V9.414a1 1 0 00-.293-.707l-5.414-5.414A1 1 0 0012.586 3H7a2 2 0 00-2 2v14a2 2 0 002 2z"></path>
    </svg>
    <p class="text-gray-500 dark:text-gray-400 mb-2">No data files yet</p>
    <p class="text-sm text-gray-400 dark:text-gray-500">
        Files appear here after an FTP download from a connected meter, or after uploading SD card data manually.
    </p>
</div>
{% endif %}

<script>
function downloadFile(fileId) {
    window.location.href = `/api/projects/{{ project_id }}/files/${fileId}/download`;
}

function confirmDeleteFile(fileId, fileName) {
    if (confirm(`Are you sure you want to delete "${fileName}"?\n\nThis action cannot be undone.`)) {
        deleteFile(fileId);
    }
}

async function deleteFile(fileId) {
    try {
        const response = await fetch(`/api/projects/{{ project_id }}/files/${fileId}`, {
            method: 'DELETE'
        });

        if (response.ok) {
            window.location.reload();
        } else {
            const data = await response.json();
            alert(`Failed to delete file: ${data.detail || 'Unknown error'}`);
        }
    } catch (error) {
        alert(`Error deleting file: ${error.message}`);
    }
}
</script>
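The nested `{% if %}` chain above formats `file.file_size_bytes` inline. The same thresholds (1024, 1048576, 1073741824) expressed as a small helper, which could be registered as a Jinja filter to replace the inline logic (a sketch; no such filter exists in this diff):

```python
# Byte-size formatting mirroring the template's inline branches:
# B below 1 KiB, then one-decimal KB/MB, two-decimal GB.

def human_size(size_bytes):
    if size_bytes is None:
        return "Unknown size"
    if size_bytes < 1024:
        return f"{size_bytes} B"
    if size_bytes < 1048576:
        return f"{size_bytes / 1024:.1f} KB"
    if size_bytes < 1073741824:
        return f"{size_bytes / 1048576:.1f} MB"
    return f"{size_bytes / 1073741824:.2f} GB"
```

Note one subtle difference from the template: the Jinja version uses `{% if file.file_size_bytes %}`, so a size of exactly 0 bytes would render as "Unknown size" there, while this helper would print "0 B".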
89	templates/partials/projects/nrl_live_status.html	Normal file
@@ -0,0 +1,89 @@
<!-- Live Status Card content for connected NRLs -->
{% if error and not status %}
<div class="flex items-center gap-3 text-gray-500 dark:text-gray-400">
    <svg class="w-5 h-5 text-amber-500 flex-shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
        <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z"></path>
    </svg>
    <span class="text-sm">{{ error }}</span>
</div>
{% elif status %}
<div class="grid grid-cols-2 sm:grid-cols-4 gap-4">

    <!-- Measurement State -->
    <div class="flex flex-col">
        <span class="text-xs text-gray-500 dark:text-gray-400 mb-1">State</span>
        {% set state = status.get('measurement_state', 'unknown') if status is mapping else 'unknown' %}
        {% if state in ('measuring', 'recording') %}
        <span class="inline-flex items-center gap-1.5 text-sm font-semibold text-green-600 dark:text-green-400">
            <span class="w-2 h-2 bg-green-500 rounded-full animate-pulse"></span>
            Measuring
        </span>
        {% elif state == 'paused' %}
        <span class="inline-flex items-center gap-1.5 text-sm font-semibold text-yellow-600 dark:text-yellow-400">
            <span class="w-2 h-2 bg-yellow-500 rounded-full"></span>
            Paused
        </span>
        {% elif state == 'stopped' %}
        <span class="inline-flex items-center gap-1.5 text-sm font-semibold text-gray-600 dark:text-gray-400">
            <span class="w-2 h-2 bg-gray-400 rounded-full"></span>
            Stopped
        </span>
        {% else %}
        <span class="text-sm text-gray-500 dark:text-gray-400 capitalize">{{ state }}</span>
        {% endif %}
    </div>

    <!-- Lp (instantaneous) -->
    <div class="flex flex-col">
        <span class="text-xs text-gray-500 dark:text-gray-400 mb-1">Lp (dB)</span>
        {% set lp = status.get('lp') if status is mapping else None %}
        <span class="text-xl font-bold text-gray-900 dark:text-white">
            {% if lp is not none %}{{ "%.1f"|format(lp) }}{% else %}—{% endif %}
        </span>
    </div>

    <!-- Battery -->
    <div class="flex flex-col">
        <span class="text-xs text-gray-500 dark:text-gray-400 mb-1">Battery</span>
        {% set batt = status.get('battery_level') if status is mapping else None %}
        {% if batt is not none %}
        <span class="text-sm font-semibold
            {% if batt >= 60 %}text-green-600 dark:text-green-400
            {% elif batt >= 30 %}text-yellow-600 dark:text-yellow-400
            {% else %}text-red-600 dark:text-red-400{% endif %}">
            {{ batt }}%
        </span>
        {% else %}
        <span class="text-sm text-gray-500">—</span>
        {% endif %}
    </div>

    <!-- Last Seen -->
    <div class="flex flex-col">
        <span class="text-xs text-gray-500 dark:text-gray-400 mb-1">Last Seen</span>
        {% set last_seen = status.get('last_seen') if status is mapping else None %}
        {% if last_seen %}
        <span class="text-sm text-gray-700 dark:text-gray-300">{{ last_seen|local_datetime }}</span>
        {% else %}
        <span class="text-sm text-gray-500">—</span>
        {% endif %}
    </div>

</div>

{% if unit %}
<div class="mt-3 pt-3 border-t border-gray-100 dark:border-gray-700 flex items-center justify-between">
    <span class="text-xs text-gray-400 dark:text-gray-500">
        Unit: {{ unit.id }}
        {% if unit.slm_model %} • {{ unit.slm_model }}{% endif %}
    </span>
    <a href="/slm/{{ unit.id }}"
       class="text-xs text-seismo-orange hover:text-seismo-navy transition-colors">
        Open Unit →
    </a>
</div>
{% endif %}

{% else %}
<div class="text-sm text-gray-500 dark:text-gray-400">No status data available.</div>
{% endif %}
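The live status template tolerates a loosely-shaped `status` payload (it guards every lookup with `status is mapping`) and derives the battery colour from fixed thresholds. The same rules as plain Python, useful as a reference for whatever produces the payload (a sketch with the keys the template reads: `measurement_state`, `lp`, `battery_level`, `last_seen`):

```python
# Threshold and state rules mirrored from nrl_live_status.html.

def battery_class(level):
    """>= 60 green, >= 30 yellow, else red; None renders as an em dash."""
    if level is None:
        return None
    if level >= 60:
        return "green"
    if level >= 30:
        return "yellow"
    return "red"

def measurement_label(status):
    """Both 'measuring' and 'recording' collapse to the Measuring badge."""
    state = status.get("measurement_state", "unknown") if isinstance(status, dict) else "unknown"
    if state in ("measuring", "recording"):
        return "Measuring"
    return state.capitalize()
```

Collapsing `recording` into the same green "Measuring" badge keeps the card simple; the distinction, if needed, is still available on the unit page linked below the grid.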
@@ -11,8 +11,12 @@
|
||||
{% endif %}
|
||||
</p>
|
||||
</div>
|
||||
{% if project.status == 'active' %}
|
||||
{% if project.status == 'upcoming' %}
|
||||
<span class="px-3 py-1 text-xs font-medium bg-purple-100 text-purple-800 dark:bg-purple-900/30 dark:text-purple-300 rounded-full">Upcoming</span>
|
||||
{% elif project.status == 'active' %}
|
||||
<span class="px-3 py-1 text-xs font-medium bg-green-100 text-green-800 dark:bg-green-900/30 dark:text-green-300 rounded-full">Active</span>
|
||||
{% elif project.status == 'on_hold' %}
|
||||
<span class="px-3 py-1 text-xs font-medium bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-400 rounded-full">On Hold</span>
|
||||
{% elif project.status == 'completed' %}
|
||||
<span class="px-3 py-1 text-xs font-medium bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-300 rounded-full">Completed</span>
|
||||
{% elif project.status == 'archived' %}
|
||||
@@ -48,14 +52,14 @@
|
||||
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-lg p-6">
|
||||
<div class="flex items-center justify-between mb-4">
|
||||
<h3 class="text-lg font-semibold text-gray-900 dark:text-white">
|
||||
{% if project_type and project_type.id == 'sound_monitoring' %}
|
||||
{% if 'sound_monitoring' in modules and 'vibration_monitoring' not in modules %}
|
||||
NRLs
|
||||
{% else %}
|
||||
Locations
|
||||
{% endif %}
|
||||
</h3>
|
||||
<button onclick="openLocationModal('{% if project_type and project_type.id == 'sound_monitoring' %}sound{% elif project_type and project_type.id == 'vibration_monitoring' %}vibration{% else %}{% endif %}')" class="text-sm text-seismo-orange hover:text-seismo-navy">
|
||||
{% if project_type and project_type.id == 'sound_monitoring' %}
|
||||
<button onclick="openLocationModal('{% if 'sound_monitoring' in modules and 'vibration_monitoring' not in modules %}sound{% elif 'vibration_monitoring' in modules and 'sound_monitoring' not in modules %}vibration{% endif %}')" class="text-sm text-seismo-orange hover:text-seismo-navy">
|
||||
{% if 'sound_monitoring' in modules and 'vibration_monitoring' not in modules %}
|
||||
Add NRL
|
||||
{% else %}
|
||||
Add Location
|
||||
@@ -63,7 +67,7 @@
|
||||
</button>
|
||||
</div>
|
||||
<div id="project-locations"
|
||||
hx-get="/api/projects/{{ project.id }}/locations{% if project_type and project_type.id == 'sound_monitoring' %}?location_type=sound{% endif %}"
|
||||
hx-get="/api/projects/{{ project.id }}/locations{% if 'sound_monitoring' in modules and 'vibration_monitoring' not in modules %}?location_type=sound{% endif %}"
|
||||
hx-trigger="load"
|
||||
hx-swap="innerHTML">
|
||||
<div class="animate-pulse space-y-3">
|
||||
|
||||
@@ -3,21 +3,71 @@
|
||||
<div>
|
||||
<h1 class="text-3xl font-bold text-gray-900 dark:text-white mb-2">{{ project.name }}</h1>
|
||||
<div class="flex items-center gap-4">
|
||||
<span class="inline-flex items-center px-3 py-1 rounded-full text-sm font-medium
|
||||
{% if project.status == 'active' %}bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-200
|
||||
{% elif project.status == 'completed' %}bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-200
|
||||
{% else %}bg-gray-100 text-gray-800 dark:bg-gray-700 dark:text-gray-200{% endif %}">
|
||||
{{ project.status|title }}
|
||||
<div class="relative inline-block">
|
||||
<select onchange="quickUpdateStatus(this.value)"
|
||||
class="appearance-none cursor-pointer inline-flex items-center pl-3 pr-7 py-1 rounded-full text-sm font-medium border-0 focus:ring-2 focus:ring-offset-1 focus:ring-blue-500
|
||||
{% if project.status == 'upcoming' %}bg-purple-100 text-purple-800 dark:bg-purple-900 dark:text-purple-200
|
||||
{% elif project.status == 'active' %}bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-200
|
||||
{% elif project.status == 'on_hold' %}bg-amber-100 text-amber-800 dark:bg-amber-900 dark:text-amber-200
|
||||
{% elif project.status == 'completed' %}bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-200
|
||||
{% else %}bg-gray-100 text-gray-800 dark:bg-gray-700 dark:text-gray-200{% endif %}">
|
||||
<option value="upcoming" {% if project.status == 'upcoming' %}selected{% endif %}>Upcoming</option>
|
||||
<option value="active" {% if project.status == 'active' %}selected{% endif %}>Active</option>
|
||||
<option value="on_hold" {% if project.status == 'on_hold' %}selected{% endif %}>On Hold</option>
|
||||
<option value="completed" {% if project.status == 'completed' %}selected{% endif %}>Completed</option>
|
||||
<option value="archived" {% if project.status == 'archived' %}selected{% endif %}>Archived</option>
|
||||
</select>
|
||||
<span class="pointer-events-none absolute right-2 top-1/2 -translate-y-1/2 text-current opacity-60">
|
||||
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"/>
|
||||
</svg>
|
||||
</span>
|
||||
</div>
|
||||
<!-- Module badges -->
|
||||
<div id="module-badges" class="flex items-center gap-1.5 flex-wrap">
|
||||
{% for m in modules %}
|
||||
<span class="inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-xs font-medium
|
||||
{% if m == 'sound_monitoring' %}bg-orange-100 text-orange-800 dark:bg-orange-900/40 dark:text-orange-300
|
||||
{% elif m == 'vibration_monitoring' %}bg-blue-100 text-blue-800 dark:bg-blue-900/40 dark:text-blue-300
|
||||
{% else %}bg-gray-100 text-gray-700 dark:bg-gray-700 dark:text-gray-300{% endif %}">
|
||||
{% if m == 'sound_monitoring' %}
|
||||
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15.536 8.464a5 5 0 010 7.072M12 6v12M9 8.464a5 5 0 000 7.072"/></svg>
|
||||
Sound Monitoring
|
||||
{% elif m == 'vibration_monitoring' %}
|
||||
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M22 12h-4l-3 9L9 3l-3 9H2"/></svg>
|
||||
Vibration Monitoring
|
||||
{% else %}{{ m }}{% endif %}
|
||||
<button onclick="removeModule('{{ m }}')" class="ml-0.5 hover:text-red-500 transition-colors" title="Remove module">
|
||||
<svg class="w-2.5 h-2.5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2.5" d="M6 18L18 6M6 6l12 12"/></svg>
|
||||
</button>
|
||||
</span>
|
||||
{% endfor %}
|
||||
<button onclick="openAddModuleModal()" class="inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-xs font-medium border border-dashed border-gray-400 dark:border-gray-600 text-gray-500 dark:text-gray-400 hover:border-orange-400 hover:text-orange-500 transition-colors">
|
||||
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 4v16m8-8H4"/></svg>
|
||||
Add Module
|
||||
</button>
|
||||
</div>
|
||||
{% if project.data_collection_mode == 'remote' %}
|
||||
<span class="inline-flex items-center gap-1 px-2.5 py-0.5 rounded-full text-xs font-medium bg-blue-100 text-blue-800 dark:bg-blue-900/40 dark:text-blue-300">
|
||||
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8.111 16.404a5.5 5.5 0 017.778 0M12 20h.01m-7.08-7.071c3.904-3.905 10.236-3.905 14.141 0M1.394 9.393c5.857-5.857 15.355-5.857 21.213 0"></path>
|
||||
</svg>
|
||||
Remote
|
||||
</span>
|
||||
{% else %}
|
||||
<span class="inline-flex items-center gap-1 px-2.5 py-0.5 rounded-full text-xs font-medium bg-amber-100 text-amber-800 dark:bg-amber-900/40 dark:text-amber-300">
|
||||
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M5 8h14M5 8a2 2 0 110-4h14a2 2 0 110 4M5 8v10a2 2 0 002 2h10a2 2 0 002-2V8m-9 4h4"></path>
|
||||
</svg>
|
||||
Manual
|
||||
</span>
|
||||
{% if project_type %}
|
||||
<span class="text-gray-500 dark:text-gray-400">{{ project_type.name }}</span>
|
||||
{% endif %}
|
||||
</div>
|
||||
</div>
|
||||
<!-- Project Actions -->
|
||||
<div class="flex items-center gap-3">
|
||||
{% if project_type and project_type.id == 'sound_monitoring' %}
|
||||
<a href="/api/projects/{{ project.id }}/generate-combined-report"
|
||||
{% if 'sound_monitoring' in modules %}
|
||||
<a href="/api/projects/{{ project.id }}/combined-report-wizard"
|
||||
class="px-4 py-2 bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 transition-colors flex items-center gap-2 text-sm">
|
||||
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 17v-2m3 2v-4m3 4v-6m2 10H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z"></path>
|
||||
@@ -28,3 +78,69 @@
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Add Module Modal -->
|
||||
<div id="add-module-modal" class="hidden fixed inset-0 z-50 flex items-center justify-center bg-black/60 backdrop-blur-sm">
|
||||
<div class="bg-white dark:bg-slate-800 rounded-xl shadow-2xl w-full max-w-sm mx-4 p-6">
|
||||
<div class="flex items-center justify-between mb-4">
|
||||
<h3 class="text-lg font-semibold text-gray-900 dark:text-white">Add Module</h3>
|
||||
<button onclick="closeAddModuleModal()" class="text-gray-400 hover:text-gray-600 dark:hover:text-gray-200">
|
||||
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/></svg>
|
||||
</button>
|
||||
</div>
|
||||
<div id="add-module-options" class="space-y-2">
|
||||
<!-- Populated by JS -->
|
||||
</div>
|
||||
<p id="add-module-none" class="hidden text-sm text-gray-500 dark:text-gray-400 text-center py-4">All available modules are already enabled.</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
const _MODULE_META = {
|
||||
sound_monitoring: { name: "Sound Monitoring", color: "orange", icon: "M15.536 8.464a5 5 0 010 7.072M12 6v12M9 8.464a5 5 0 000 7.072" },
|
||||
vibration_monitoring: { name: "Vibration Monitoring", color: "blue", icon: "M22 12h-4l-3 9L9 3l-3 9H2" },
|
||||
};
|
||||
|
||||
async function openAddModuleModal() {
  const resp = await fetch(`/api/projects/${projectId}/modules`);
  const data = await resp.json();
  const container = document.getElementById('add-module-options');
  const none = document.getElementById('add-module-none');
  container.innerHTML = '';
  if (!data.available || data.available.length === 0) {
    none.classList.remove('hidden');
  } else {
    none.classList.add('hidden');
    data.available.forEach(m => {
      const meta = _MODULE_META[m.module_type] || { name: m.module_type, color: 'gray' };
      const btn = document.createElement('button');
      btn.className = `w-full text-left px-4 py-3 rounded-lg border border-gray-200 dark:border-gray-700 hover:border-${meta.color}-400 hover:bg-${meta.color}-50 dark:hover:bg-${meta.color}-900/20 transition-colors flex items-center gap-3`;
      btn.innerHTML = `<span class="flex-1 font-medium text-gray-900 dark:text-white">${meta.name}</span>`;
      btn.onclick = () => addModule(m.module_type);
      container.appendChild(btn);
    });
  }
  document.getElementById('add-module-modal').classList.remove('hidden');
}

function closeAddModuleModal() {
  document.getElementById('add-module-modal').classList.add('hidden');
}

async function addModule(moduleType) {
  const resp = await fetch(`/api/projects/${projectId}/modules`, {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({ module_type: moduleType }),
  });
  // Don't reload on failure, or the error is silently masked.
  if (!resp.ok) {
    alert('Failed to add module.');
    return;
  }
  closeAddModuleModal();
  window.location.reload();
}

async function removeModule(moduleType) {
  const meta = _MODULE_META[moduleType] || { name: moduleType };
  if (!confirm(`Remove the ${meta.name} module? The data will not be deleted, but the related tabs will be hidden.`)) return;
  const resp = await fetch(`/api/projects/${projectId}/modules/${moduleType}`, { method: 'DELETE' });
  // Don't reload on failure, or the error is silently masked.
  if (!resp.ok) {
    alert('Failed to remove module.');
    return;
  }
  window.location.reload();
}
</script>

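One caveat with the module buttons above: class names such as `hover:border-${meta.color}-400` are assembled at runtime, and Tailwind's JIT compiler only generates classes it finds verbatim in the scanned source files, so these variants may be missing from the compiled CSS. A minimal sketch of a safelist entry that would cover them, assuming a standard `tailwind.config.js` and that only the colors used in `_MODULE_META` (plus the `gray` fallback) ever occur:

```javascript
// tailwind.config.js (sketch; paths and colors are assumptions, adjust to the project)
module.exports = {
  content: ['./templates/**/*.html'],
  safelist: [
    // One set of entries per color the runtime template strings can produce.
    ...['orange', 'blue', 'gray'].flatMap((c) => [
      `hover:border-${c}-400`,
      `hover:bg-${c}-50`,
      `dark:hover:bg-${c}-900/20`,
    ]),
  ],
};
```

Alternatively, the full class strings could be stored per module in `_MODULE_META` so they appear literally in the source and need no safelisting.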
@@ -34,6 +34,10 @@
<span class="px-2 py-1 text-xs font-medium bg-green-100 text-green-800 dark:bg-green-900/30 dark:text-green-400 rounded-full">
  Active
</span>
{% elif item.project.status == 'on_hold' %}
<span class="px-2 py-1 text-xs font-medium bg-amber-100 text-amber-800 dark:bg-amber-900/30 dark:text-amber-400 rounded-full">
  On Hold
</span>
{% elif item.project.status == 'completed' %}
<span class="px-2 py-1 text-xs font-medium bg-blue-100 text-blue-800 dark:bg-blue-900/30 dark:text-blue-400 rounded-full">
  Completed