diff --git a/PROJECTS_SYSTEM_IMPLEMENTATION.md b/PROJECTS_SYSTEM_IMPLEMENTATION.md
new file mode 100644
index 0000000..b2f8566
--- /dev/null
+++ b/PROJECTS_SYSTEM_IMPLEMENTATION.md
@@ -0,0 +1,546 @@
+# Projects System Implementation - Terra-View
+
+## Overview
+
+The Projects system has been successfully scaffolded in Terra-View. This document provides a complete overview of what has been built, how it works, and what needs to be completed.
+
+## ✅ Completed Components
+
+### 1. Database Schema
+
+**Location**: `/backend/models.py`
+
+Seven new tables have been added:
+
+- **ProjectType**: Template definitions for project types (Sound, Vibration, Combined)
+- **Project**: Top-level project organization with type reference
+- **MonitoringLocation**: Generic locations (NRLs for sound, monitoring points for vibration)
+- **UnitAssignment**: Links devices to locations
+- **ScheduledAction**: Automated recording control schedules
+- **RecordingSession**: Tracks actual recording/monitoring sessions
+- **DataFile**: File references for downloaded data
+
+**Key Features**:
+- Type-aware design (project_type_id determines features)
+- Flexible metadata fields (JSON columns for type-specific data)
+- Denormalized fields for efficient queries
+- Proper indexing on foreign keys
+
+### 2. Service Layer
+
+#### SLMM Client (`/backend/services/slmm_client.py`)
+- Clean wrapper for all SLMM API operations
+- Methods for starting, stopping, pausing, and resuming recording, getting status, and configuring devices
+- Error handling with custom exceptions
+- Singleton pattern for easy access (usage sketch below)
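+
+A minimal usage sketch, assuming the accessor is named `get_slmm_client` and the methods are async (neither is confirmed by this document):
+
+```python
+# Hypothetical usage of the SLMM client singleton; names and signatures are assumptions.
+from backend.services.slmm_client import get_slmm_client, SLMMError  # assumed names
+
+async def start_nrl_recording(unit_id: str) -> dict:
+    client = get_slmm_client()            # singleton accessor (assumed)
+    try:
+        status = await client.get_status(unit_id)
+        if status.get("recording"):
+            return status                 # already recording, nothing to do
+        return await client.start_recording(unit_id)
+    except SLMMError as exc:              # custom exception raised on SLMM failures
+        raise RuntimeError(f"SLMM call failed for {unit_id}: {exc}") from exc
+```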
+
+#### Device Controller (`/backend/services/device_controller.py`)
+- Routes commands to the appropriate backend (SLMM for SLMs, SFM for seismographs); see the sketch below
+- Unified interface across device types
+- Ready for future SFM implementation
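+
+A sketch of the routing idea (illustrative only; the controller's real interface is not shown here, and the `device_type` values come from the models below):
+
+```python
+# Illustrative dispatch by device_type; module and function names are assumptions.
+async def start_recording(unit_id: str, device_type: str) -> dict:
+    if device_type == "sound_level_meter":
+        from backend.services.slmm_client import get_slmm_client  # assumed accessor
+        return await get_slmm_client().start_recording(unit_id)
+    if device_type == "seismograph":
+        # Placeholder until backend/services/sfm_client.py exists
+        raise NotImplementedError("SFM client not implemented yet")
+    raise ValueError(f"Unknown device_type: {device_type}")
+```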
+
+#### Scheduler Service (`/backend/services/scheduler.py`)
+- Background task that checks for pending scheduled actions every 60 seconds (loop sketch below)
+- Executes actions by calling device controller
+- Creates/updates recording sessions
+- Tracks execution status and errors
+- Manual execution support for testing
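+
+The loop is roughly this shape (a sketch under assumptions; the real implementation in `/backend/services/scheduler.py` may differ, and the dispatch to the device controller is abbreviated to a comment):
+
+```python
+# Illustrative 60-second polling loop over pending ScheduledAction rows.
+import asyncio
+from datetime import datetime
+
+from backend.database import SessionLocal
+from backend.models import ScheduledAction
+
+async def run_scheduler(check_interval: int = 60):
+    while True:
+        db = SessionLocal()
+        try:
+            due = db.query(ScheduledAction).filter(
+                ScheduledAction.execution_status == "pending",
+                ScheduledAction.scheduled_time <= datetime.utcnow(),
+            ).all()
+            for action in due:
+                # Dispatch through the device controller here, then record the outcome.
+                action.executed_at = datetime.utcnow()
+                action.execution_status = "completed"
+            db.commit()
+        finally:
+            db.close()
+        await asyncio.sleep(check_interval)
+```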
+
+### 3. API Routers
+
+#### Projects Router (`/backend/routers/projects.py`)
+Endpoints:
+- `GET /api/projects/list` - Project list with stats
+- `GET /api/projects/stats` - Overview statistics
+- `POST /api/projects/create` - Create new project
+- `GET /api/projects/{id}` - Get project details
+- `PUT /api/projects/{id}` - Update project
+- `DELETE /api/projects/{id}` - Archive project
+- `GET /api/projects/{id}/dashboard` - Project dashboard data
+- `GET /api/projects/types/list` - Get project type templates
+
+#### Project Locations Router (`/backend/routers/project_locations.py`)
+Endpoints:
+- `GET /api/projects/{id}/locations` - List locations
+- `POST /api/projects/{id}/locations/create` - Create location
+- `PUT /api/projects/{id}/locations/{location_id}` - Update location
+- `DELETE /api/projects/{id}/locations/{location_id}` - Delete location
+- `GET /api/projects/{id}/assignments` - List unit assignments
+- `POST /api/projects/{id}/locations/{location_id}/assign` - Assign unit (example below)
+- `POST /api/projects/{id}/assignments/{assignment_id}/unassign` - Unassign unit
+- `GET /api/projects/{id}/available-units` - Get units available for assignment
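+
+For example, creating a location and assigning a unit from a script (a sketch using `requests`; the form fields match the router, and the base URL assumes the default local port):
+
+```python
+# Create an NRL and assign a unit to it via the Terra-View API (form-encoded, per the router).
+import requests
+
+BASE = "http://localhost:8001/api/projects"
+project_id = "your-project-uuid"  # placeholder
+
+loc = requests.post(f"{BASE}/{project_id}/locations/create", data={
+    "location_type": "sound",
+    "name": "NRL-001",
+    "description": "Receptor near the site entrance",
+    "coordinates": "40.7128,-74.0060",
+}).json()
+
+requests.post(
+    f"{BASE}/{project_id}/locations/{loc['location_id']}/assign",
+    data={"unit_id": "nl43-001", "notes": "Initial deployment"},
+).raise_for_status()
+```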
+
+#### Scheduler Router (`/backend/routers/scheduler.py`)
+Endpoints:
+- `GET /api/projects/{id}/scheduler/actions` - List scheduled actions
+- `POST /api/projects/{id}/scheduler/actions/create` - Create action
+- `POST /api/projects/{id}/scheduler/schedule-session` - Schedule recording session (example below)
+- `PUT /api/projects/{id}/scheduler/actions/{action_id}` - Update action
+- `POST /api/projects/{id}/scheduler/actions/{action_id}/cancel` - Cancel action
+- `DELETE /api/projects/{id}/scheduler/actions/{action_id}` - Delete action
+- `POST /api/projects/{id}/scheduler/actions/{action_id}/execute` - Manual execution
+- `GET /api/projects/{id}/scheduler/status` - Scheduler status
+- `POST /api/projects/{id}/scheduler/execute-pending` - Trigger pending executions
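+
+For example, scheduling a one-hour session (a sketch using `requests`; `schedule-session` creates the paired start and stop actions):
+
+```python
+# Schedule a recording session; the endpoint creates start and stop ScheduledActions.
+import requests
+
+BASE = "http://localhost:8001/api/projects"
+project_id = "your-project-uuid"    # placeholder
+location_id = "your-location-uuid"  # placeholder
+
+resp = requests.post(f"{BASE}/{project_id}/scheduler/schedule-session", data={
+    "location_id": location_id,
+    "start_time": "2024-01-16T08:00:00",  # ISO 8601, parsed with datetime.fromisoformat
+    "duration_minutes": 60,
+    "notes": "Morning baseline measurement",
+})
+print(resp.json())  # contains start_action_id and stop_action_id
+```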
+
+### 4. Frontend
+
+#### Main Page
+**Location**: `/templates/projects/overview.html`
+
+Features:
+- Summary statistics cards (projects, locations, assignments, sessions)
+- Tabbed interface (All, Active, Completed, Archived)
+- Project cards grid layout
+- Create project modal with two-step flow:
+ 1. Select project type (Sound/Vibration/Combined)
+ 2. Fill project details
+- HTMX-powered dynamic updates
+
+#### Navigation
+**Location**: `/templates/base.html` (updated)
+- "Projects" link added to sidebar
+- Active state highlighting
+
+### 5. Application Integration
+
+**Location**: `/backend/main.py`
+
+- Routers registered
+- Page route added (`/projects`)
+- Scheduler service starts on application startup
+- Scheduler stops on application shutdown
+
+### 6. Database Initialization
+
+**Script**: `/backend/init_projects_db.py` (run with `python -m backend.init_projects_db`)
+
+- Creates all project tables
+- Populates ProjectType with default templates
+- ✅ Successfully executed - database is ready
+
+---
+
+## 📁 File Organization
+
+```
+terra-view/
+├── backend/
+│ ├── models.py [✅ Updated]
+│ ├── init_projects_db.py [✅ Created]
+│ ├── main.py [✅ Updated]
+│ ├── routers/
+│ │ ├── projects.py [✅ Created]
+│ │ ├── project_locations.py [✅ Created]
+│ │ └── scheduler.py [✅ Created]
+│ └── services/
+│ ├── slmm_client.py [✅ Created]
+│ ├── device_controller.py [✅ Created]
+│ └── scheduler.py [✅ Created]
+├── templates/
+│ ├── base.html [✅ Updated]
+│ ├── projects/
+│ │ └── overview.html [✅ Created]
+│ └── partials/
+│ └── projects/ [📁 Created, empty]
+└── data/
+ └── seismo_fleet.db [✅ Tables created]
+```
+
+---
+
+## 🔨 What Still Needs to be Built
+
+### 1. Frontend Templates (Partials)
+
+**Directory**: `/templates/partials/projects/`
+
+**Required Files**:
+
+#### `project_stats.html`
+Stats cards for overview page:
+- Total/Active/Completed projects
+- Total locations
+- Assigned units
+- Active sessions
+
+#### `project_list.html`
+Project cards grid:
+- Project name, type, status
+- Location count, unit count
+- Active session indicator
+- Link to project dashboard
+
+#### `project_dashboard.html`
+Main project dashboard panel with tabs:
+- Summary stats
+- Active locations and assignments
+- Upcoming scheduled actions
+- Recent sessions
+
+#### `location_list.html`
+Location cards/table:
+- Location name, type, coordinates
+- Assigned unit (if any)
+- Session count
+- Assign/unassign button
+
+#### `assignment_list.html`
+Unit assignment table:
+- Unit ID, device type
+- Location name
+- Assignment dates
+- Status
+- Unassign button
+
+#### `scheduler_agenda.html`
+Calendar/agenda view:
+- Scheduled actions sorted by time
+- Action type (start/stop/download)
+- Location and unit
+- Status indicator
+- Cancel/execute buttons
+
+### 2. Project Dashboard Page
+
+**Location**: `/templates/projects/project_dashboard.html`
+
+Full project detail page with:
+- Header with project name, type, status
+- Tab navigation (Dashboard, Scheduler, Locations, Units, Data, Settings)
+- Tab content areas
+- Modals for adding locations, scheduling sessions
+
+### 3. Additional UI Components
+
+- Project type selection cards (with icons)
+- Location creation modal
+- Unit assignment modal
+- Schedule session modal (with date/time picker)
+- Data file browser
+
+### 4. SLMM Enhancements
+
+**Location**: `/slmm/app/routers.py` (SLMM repo)
+
+New endpoint needed:
+```python
+POST /api/nl43/{unit_id}/ftp/download
+```
+
+This endpoint should (a hedged sketch follows the list):
+- Accept destination_path and files list
+- Connect to SLM via FTP
+- Download specified files
+- Save to Terra-View's `data/Projects/` directory
+- Return file list with metadata
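+
+A possible shape for the endpoint (a hedged sketch only; it assumes SLMM can already resolve each unit's FTP host and credentials, which is not confirmed here):
+
+```python
+# Hypothetical SLMM endpoint sketch using ftplib; lookup_ftp_credentials is an assumed helper.
+import os
+from ftplib import FTP
+from fastapi import APIRouter
+from pydantic import BaseModel
+
+router = APIRouter()
+
+class FTPDownloadRequest(BaseModel):
+    destination_path: str  # e.g. Terra-View's data/Projects/... directory
+    files: list[str]       # file names on the SLM's FTP server
+
+@router.post("/api/nl43/{unit_id}/ftp/download")
+def ftp_download(unit_id: str, req: FTPDownloadRequest):
+    host, user, password = lookup_ftp_credentials(unit_id)  # assumed helper in SLMM
+    os.makedirs(req.destination_path, exist_ok=True)
+    downloaded = []
+    with FTP(host) as ftp:
+        ftp.login(user, password)
+        for name in req.files:
+            local_path = os.path.join(req.destination_path, name)
+            with open(local_path, "wb") as fh:
+                ftp.retrbinary(f"RETR {name}", fh.write)
+            downloaded.append({"file": name, "size_bytes": os.path.getsize(local_path)})
+    return {"unit_id": unit_id, "files": downloaded}
+```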
+
+### 5. SFM Client (Future)
+
+**Location**: `/backend/services/sfm_client.py` (to be created)
+
+Similar to the SLMM client, but for seismographs (skeleton sketch below):
+- Get seismograph status
+- Start/stop recording
+- Download data files
+- Integrate with device controller
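+
+A possible skeleton (purely illustrative; no SFM API exists yet, so every name here is an assumption):
+
+```python
+# Hypothetical SFM client skeleton mirroring the SLMM client's role.
+class SFMClientError(Exception):
+    """Raised when an SFM operation fails."""
+
+class SFMClient:
+    def __init__(self, base_url: str = "http://localhost:8200"):  # port is a placeholder
+        self.base_url = base_url
+
+    async def get_status(self, unit_id: str) -> dict:
+        raise NotImplementedError("SFM integration not implemented yet")
+
+    async def start_recording(self, unit_id: str) -> dict:
+        raise NotImplementedError
+
+    async def stop_recording(self, unit_id: str) -> dict:
+        raise NotImplementedError
+
+    async def download_files(self, unit_id: str, destination_path: str) -> list:
+        raise NotImplementedError
+```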
+
+---
+
+## 🚀 Testing the System
+
+### 1. Start Terra-View
+
+```bash
+cd /home/serversdown/tmi/terra-view
+# Start Terra-View (however you normally start it)
+```
+
+Verify in logs:
+```
+Starting scheduler service...
+Scheduler service started
+```
+
+### 2. Navigate to Projects
+
+Open browser: `http://localhost:8001/projects`
+
+You should see:
+- Summary stats cards (all zeros initially)
+- Tabs (All Projects, Active, Completed, Archived)
+- "New Project" button
+
+### 3. Create a Project
+
+1. Click "New Project"
+2. Select a project type (e.g., "Sound Monitoring")
+3. Fill in details:
+ - Name: "Test Sound Project"
+ - Client: "Test Client"
+ - Start Date: Today
+4. Submit (or create the project directly via the API, as sketched below)
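+
+The same project can be created directly against the API (a sketch using `requests`; the form fields match the projects router):
+
+```python
+# Create a project via the API instead of the modal (form-encoded, per the router).
+import requests
+
+resp = requests.post("http://localhost:8001/api/projects/create", data={
+    "name": "Test Sound Project",
+    "project_type_id": "sound_monitoring",
+    "client_name": "Test Client",
+    "start_date": "2024-01-15",  # ISO date, parsed with datetime.fromisoformat
+})
+print(resp.json())  # contains project_id on success
+```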
+
+### 4. Test API Endpoints
+
+```bash
+# Get project types
+curl http://localhost:8001/api/projects/types/list
+
+# Get projects list
+curl http://localhost:8001/api/projects/list
+
+# Get project stats
+curl http://localhost:8001/api/projects/stats
+```
+
+### 5. Test Scheduler Status
+
+```bash
+curl http://localhost:8001/api/projects/{project_id}/scheduler/status
+```
+
+---
+
+## 📋 Dataflow Examples
+
+### Creating and Scheduling a Recording Session
+
+1. **User creates project** → Project record in DB
+2. **User adds NRL** → MonitoringLocation record
+3. **User assigns SLM to NRL** → UnitAssignment record
+4. **User schedules recording** → 2 ScheduledAction records (start + stop)
+5. **Scheduler runs every minute** → Checks for pending actions
+6. **Start action time arrives** → Scheduler calls SLMM via device controller
+7. **SLMM sends TCP command to SLM** → Recording starts
+8. **RecordingSession created** → Tracks the session
+9. **Stop action time arrives** → Scheduler stops recording
+10. **Session updated** → stopped_at, duration_seconds filled
+11. **User triggers download** → Files copied to `data/Projects/{project_id}/sound/{nrl_name}/`
+12. **DataFile records created** → Track file references
+
+---
+
+## 🎨 UI Design Patterns
+
+### Established Patterns (from SLM dashboard):
+
+1. **Stats Cards**: 4-column grid, auto-refresh every 30s
+2. **Sidebar Lists**: Searchable, filterable, auto-refresh
+3. **Main Panel**: Large central area for details
+4. **Modals**: Centered, overlay background
+5. **HTMX**: All dynamic updates, minimal JavaScript
+6. **Tailwind**: Consistent styling with dark mode support
+
+### Color Scheme:
+
+- Primary: `seismo-orange` (#f48b1c)
+- Secondary: `seismo-navy` (#142a66)
+- Accent: `seismo-burgundy` (#7d234d)
+
+---
+
+## 🔧 Configuration
+
+### Environment Variables
+
+- `SLMM_BASE_URL`: SLMM backend URL (default: http://localhost:8100)
+- `ENVIRONMENT`: "development" or "production"
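+
+The defaults above are typically read at import time, along the lines of this sketch (the exact lookup in `slmm_client.py` is not shown here):
+
+```python
+# Illustrative environment lookup with the documented defaults.
+import os
+
+SLMM_BASE_URL = os.getenv("SLMM_BASE_URL", "http://localhost:8100")
+ENVIRONMENT = os.getenv("ENVIRONMENT", "development")
+```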
+
+### Scheduler Settings
+
+Located in `/backend/services/scheduler.py`:
+- `check_interval`: 60 seconds (adjust as needed)
+
+---
+
+## 📚 Next Steps
+
+### Immediate (Get Basic UI Working):
+1. Create partial templates (stats, lists)
+2. Test creating projects via UI
+3. Implement project dashboard page
+
+### Short-term (Core Features):
+4. Add location management UI
+5. Add unit assignment UI
+6. Add scheduler UI (agenda view)
+
+### Medium-term (Data Flow):
+7. Implement SLMM download endpoint
+8. Test full recording workflow
+9. Add file browser for downloaded data
+
+### Long-term (Complete System):
+10. Implement SFM client for seismographs
+11. Add data visualization
+12. Add project reporting
+13. Add user authentication
+
+---
+
+## 🐛 Known Issues / TODOs
+
+1. **Partial templates missing**: Need to create HTML templates for all partials
+2. **SLMM download endpoint**: Needs implementation in SLMM backend
+3. **Project dashboard page**: Not yet created
+4. **SFM integration**: Placeholder only, needs real implementation
+5. **File download tracking**: DataFile records not yet created after downloads
+6. **Error handling**: Need better user-facing error messages
+7. **Validation**: Form validation could be improved
+8. **Testing**: No automated tests yet
+
+---
+
+## 📖 API Documentation
+
+### Project Type Object
+```json
+{
+ "id": "sound_monitoring",
+ "name": "Sound Monitoring",
+ "description": "...",
+ "icon": "volume-2",
+ "supports_sound": true,
+ "supports_vibration": false
+}
+```
+
+### Project Object
+```json
+{
+ "id": "uuid",
+ "name": "Project Name",
+ "description": "...",
+ "project_type_id": "sound_monitoring",
+ "status": "active",
+ "client_name": "Client Inc",
+ "site_address": "123 Main St",
+ "site_coordinates": "40.7128,-74.0060",
+ "start_date": "2024-01-15",
+ "end_date": null,
+ "created_at": "2024-01-15T10:00:00",
+ "updated_at": "2024-01-15T10:00:00"
+}
+```
+
+### MonitoringLocation Object
+```json
+{
+ "id": "uuid",
+ "project_id": "uuid",
+ "location_type": "sound",
+ "name": "NRL-001",
+ "description": "...",
+ "coordinates": "40.7128,-74.0060",
+ "address": "123 Main St",
+ "location_metadata": "{...}",
+ "created_at": "2024-01-15T10:00:00"
+}
+```
+
+### UnitAssignment Object
+```json
+{
+ "id": "uuid",
+ "unit_id": "nl43-001",
+ "location_id": "uuid",
+ "project_id": "uuid",
+ "device_type": "sound_level_meter",
+ "assigned_at": "2024-01-15T10:00:00",
+ "assigned_until": null,
+ "status": "active",
+ "notes": "..."
+}
+```
+
+### ScheduledAction Object
+```json
+{
+ "id": "uuid",
+ "project_id": "uuid",
+ "location_id": "uuid",
+ "unit_id": "nl43-001",
+ "action_type": "start",
+ "device_type": "sound_level_meter",
+ "scheduled_time": "2024-01-16T08:00:00",
+ "executed_at": null,
+ "execution_status": "pending",
+ "module_response": null,
+ "error_message": null
+}
+```
+
+---
+
+## 🎓 Architecture Decisions
+
+### Why Project Types?
+Allows the system to scale to different monitoring scenarios (air quality, multi-hazard, etc.) without code changes. Just add a new ProjectType record and the UI adapts.
+
+### Why Generic MonitoringLocation?
+Instead of separate NRL and MonitoringPoint tables, one table with a `location_type` discriminator keeps the schema clean and allows for combined projects.
+
+### Why Denormalized Fields?
+Fields like `project_id` in UnitAssignment (already reachable through the location) let common queries skip a join.
+
+### Why Scheduler in Terra-View?
+Terra-View is the orchestration layer. SLMM only handles device communication. Keeping scheduling logic in Terra-View allows for complex workflows across multiple device types.
+
+### Why JSON Metadata Columns?
+Type-specific fields (like ambient_conditions for sound projects) don't apply to all location types. JSON columns provide flexibility without cluttering the schema.
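+
+For example, using the metadata shapes suggested in the model comments:
+
+```python
+# Round-trip type-specific metadata through the Text column (shapes from models.py comments).
+import json
+import uuid
+
+from backend.database import SessionLocal
+from backend.models import MonitoringLocation
+
+db = SessionLocal()
+loc = MonitoringLocation(
+    id=str(uuid.uuid4()),
+    project_id="your-project-uuid",  # placeholder
+    location_type="sound",
+    name="NRL-002",
+    location_metadata=json.dumps({"ambient_conditions": "urban", "expected_sources": ["traffic"]}),
+)
+db.add(loc)
+db.commit()
+
+meta = json.loads(loc.location_metadata or "{}")
+print(meta["ambient_conditions"])  # -> "urban"
+db.close()
+```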
+
+---
+
+## 💡 Tips for Continuing Development
+
+1. **Follow Existing Patterns**: Look at the SLM dashboard code for reference
+2. **Use HTMX Aggressively**: Minimize JavaScript, let HTMX handle updates
+3. **Keep Routers Thin**: Move business logic to service layer
+4. **Return HTML Partials**: Most endpoints should return HTML, not JSON
+5. **Test Incrementally**: Build one partial at a time and test in browser
+6. **Check Logs**: Scheduler logs execution attempts
+7. **Use Browser DevTools**: Network tab shows HTMX requests
+
+---
+
+## 📞 Support
+
+For questions or issues:
+1. Check this document first
+2. Review existing dashboards (SLM, Seismographs) for patterns
+3. Check logs for scheduler execution details
+4. Test API endpoints with curl to isolate issues
+
+---
+
+## ✅ Checklist for Completion
+
+- [x] Database schema designed
+- [x] Models created
+- [x] Migration script run successfully
+- [x] Service layer complete (SLMM client, device controller, scheduler)
+- [x] API routers created (projects, locations, scheduler)
+- [x] Navigation updated
+- [x] Main overview page created
+- [x] Routes registered in main.py
+- [x] Scheduler service integrated
+- [ ] Partial templates created
+- [ ] Project dashboard page created
+- [ ] Location management UI
+- [ ] Unit assignment UI
+- [ ] Scheduler UI (agenda view)
+- [ ] SLMM download endpoint implemented
+- [ ] Full workflow tested end-to-end
+- [ ] SFM client implemented (future)
+
+---
+
+**Last Updated**: 2026-01-12
+
+**Database Status**: ✅ Initialized
+
+**Backend Status**: ✅ Complete
+
+**Frontend Status**: 🟡 Partial (overview page only)
+
+**Ready for Testing**: ✅ Yes (basic functionality)
diff --git a/backend/init_projects_db.py b/backend/init_projects_db.py
new file mode 100644
index 0000000..68802c7
--- /dev/null
+++ b/backend/init_projects_db.py
@@ -0,0 +1,108 @@
+#!/usr/bin/env python3
+"""
+Database initialization script for Projects system.
+
+This script creates the new project management tables and populates
+the project_types table with default templates.
+
+Usage:
+ python -m backend.init_projects_db
+"""
+
+from sqlalchemy.orm import Session
+from backend.database import engine, SessionLocal
+from backend.models import (
+ Base,
+ ProjectType,
+ Project,
+ MonitoringLocation,
+ UnitAssignment,
+ ScheduledAction,
+ RecordingSession,
+ DataFile,
+)
+from datetime import datetime
+
+
+def init_project_types(db: Session):
+ """Initialize default project types."""
+ project_types = [
+ {
+ "id": "sound_monitoring",
+ "name": "Sound Monitoring",
+ "description": "Noise monitoring projects with sound level meters and NRLs (Noise Recording Locations)",
+ "icon": "volume-2", # Lucide icon name
+ "supports_sound": True,
+ "supports_vibration": False,
+ },
+ {
+ "id": "vibration_monitoring",
+ "name": "Vibration Monitoring",
+ "description": "Seismic/vibration monitoring projects with seismographs and monitoring points",
+ "icon": "activity", # Lucide icon name
+ "supports_sound": False,
+ "supports_vibration": True,
+ },
+ {
+ "id": "combined",
+ "name": "Combined Monitoring",
+ "description": "Full-spectrum monitoring with both sound and vibration capabilities",
+ "icon": "layers", # Lucide icon name
+ "supports_sound": True,
+ "supports_vibration": True,
+ },
+ ]
+
+ for pt_data in project_types:
+ existing = db.query(ProjectType).filter_by(id=pt_data["id"]).first()
+ if not existing:
+ pt = ProjectType(**pt_data)
+ db.add(pt)
+ print(f"✓ Created project type: {pt_data['name']}")
+ else:
+ print(f" Project type already exists: {pt_data['name']}")
+
+ db.commit()
+
+
+def create_tables():
+ """Create all tables defined in models."""
+ print("Creating project management tables...")
+ Base.metadata.create_all(bind=engine)
+ print("✓ Tables created successfully")
+
+
+def main():
+ print("=" * 60)
+ print("Terra-View Projects System - Database Initialization")
+ print("=" * 60)
+ print()
+
+ # Create tables
+ create_tables()
+ print()
+
+ # Initialize project types
+ db = SessionLocal()
+ try:
+ print("Initializing project types...")
+ init_project_types(db)
+ print()
+ print("=" * 60)
+ print("✓ Database initialization complete!")
+ print("=" * 60)
+ print()
+ print("Next steps:")
+ print(" 1. Restart Terra-View to load new routes")
+ print(" 2. Navigate to /projects to create your first project")
+ print(" 3. Check documentation for API endpoints")
+ except Exception as e:
+ print(f"✗ Error during initialization: {e}")
+ db.rollback()
+ raise
+ finally:
+ db.close()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/backend/main.py b/backend/main.py
index 6442ab3..0312ab7 100644
--- a/backend/main.py
+++ b/backend/main.py
@@ -18,7 +18,7 @@ logging.basicConfig(
logger = logging.getLogger(__name__)
from backend.database import engine, Base, get_db
-from backend.routers import roster, units, photos, roster_edit, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard
+from backend.routers import roster, units, photos, roster_edit, dashboard, dashboard_tabs, activity, slmm, slm_ui, slm_dashboard, seismo_dashboard, projects, project_locations, scheduler
from backend.services.snapshot import emit_status_snapshot
from backend.models import IgnoredUnit
@@ -95,6 +95,27 @@ app.include_router(seismo_dashboard.router)
from backend.routers import settings
app.include_router(settings.router)
+# Projects system routers
+app.include_router(projects.router)
+app.include_router(project_locations.router)
+app.include_router(scheduler.router)
+
+# Start scheduler service on application startup
+from backend.services.scheduler import start_scheduler, stop_scheduler
+
+@app.on_event("startup")
+async def startup_event():
+ """Initialize services on app startup"""
+ logger.info("Starting scheduler service...")
+ await start_scheduler()
+ logger.info("Scheduler service started")
+
+@app.on_event("shutdown")
+def shutdown_event():
+ """Clean up services on app shutdown"""
+ logger.info("Stopping scheduler service...")
+ stop_scheduler()
+ logger.info("Scheduler service stopped")
# Legacy routes from the original backend
@@ -142,6 +163,21 @@ async def seismographs_page(request: Request):
return templates.TemplateResponse("seismographs.html", {"request": request})
+@app.get("/projects", response_class=HTMLResponse)
+async def projects_page(request: Request):
+ """Projects management and overview"""
+ return templates.TemplateResponse("projects/overview.html", {"request": request})
+
+
+@app.get("/projects/{project_id}", response_class=HTMLResponse)
+async def project_detail_page(request: Request, project_id: str):
+ """Project detail dashboard"""
+ return templates.TemplateResponse("projects/detail.html", {
+ "request": request,
+ "project_id": project_id
+ })
+
+
# ===== PWA ROUTES =====
@app.get("/sw.js")
diff --git a/backend/models.py b/backend/models.py
index b872a40..723c1dc 100644
--- a/backend/models.py
+++ b/backend/models.py
@@ -108,3 +108,170 @@ class UserPreferences(Base):
status_ok_threshold_hours = Column(Integer, default=12)
status_pending_threshold_hours = Column(Integer, default=24)
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+# ============================================================================
+# Project Management System
+# ============================================================================
+
+class ProjectType(Base):
+ """
+ Project type templates: defines available project types and their capabilities.
+ Pre-populated with: sound_monitoring, vibration_monitoring, combined.
+ """
+ __tablename__ = "project_types"
+
+ id = Column(String, primary_key=True) # sound_monitoring, vibration_monitoring, combined
+ name = Column(String, nullable=False, unique=True) # "Sound Monitoring", "Vibration Monitoring"
+ description = Column(Text, nullable=True)
+ icon = Column(String, nullable=True) # Icon identifier for UI
+ supports_sound = Column(Boolean, default=False) # Enables SLM features
+ supports_vibration = Column(Boolean, default=False) # Enables seismograph features
+ created_at = Column(DateTime, default=datetime.utcnow)
+
+
+class Project(Base):
+ """
+ Projects: top-level organization for monitoring work.
+ Type-aware to enable/disable features based on project_type_id.
+ """
+ __tablename__ = "projects"
+
+ id = Column(String, primary_key=True, index=True) # UUID
+ name = Column(String, nullable=False, unique=True)
+ description = Column(Text, nullable=True)
+ project_type_id = Column(String, nullable=False) # FK to ProjectType.id
+ status = Column(String, default="active") # active, completed, archived
+
+ # Project metadata
+ client_name = Column(String, nullable=True)
+ site_address = Column(String, nullable=True)
+ site_coordinates = Column(String, nullable=True) # "lat,lon"
+ start_date = Column(Date, nullable=True)
+ end_date = Column(Date, nullable=True)
+
+ created_at = Column(DateTime, default=datetime.utcnow)
+ updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+class MonitoringLocation(Base):
+ """
+ Monitoring locations: generic location for monitoring activities.
+ Can be NRL (Noise Recording Location) for sound projects,
+ or monitoring point for vibration projects.
+ """
+ __tablename__ = "monitoring_locations"
+
+ id = Column(String, primary_key=True, index=True) # UUID
+ project_id = Column(String, nullable=False, index=True) # FK to Project.id
+ location_type = Column(String, nullable=False) # "sound" | "vibration"
+
+ name = Column(String, nullable=False) # NRL-001, VP-North, etc.
+ description = Column(Text, nullable=True)
+ coordinates = Column(String, nullable=True) # "lat,lon"
+ address = Column(String, nullable=True)
+
+ # Type-specific metadata stored as JSON
+ # For sound: {"ambient_conditions": "urban", "expected_sources": ["traffic"]}
+ # For vibration: {"ground_type": "bedrock", "depth": "10m"}
+ location_metadata = Column(Text, nullable=True)
+
+ created_at = Column(DateTime, default=datetime.utcnow)
+ updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+class UnitAssignment(Base):
+ """
+ Unit assignments: links devices (SLMs or seismographs) to monitoring locations.
+ Supports temporary assignments with assigned_until.
+ """
+ __tablename__ = "unit_assignments"
+
+ id = Column(String, primary_key=True, index=True) # UUID
+ unit_id = Column(String, nullable=False, index=True) # FK to RosterUnit.id
+ location_id = Column(String, nullable=False, index=True) # FK to MonitoringLocation.id
+
+ assigned_at = Column(DateTime, default=datetime.utcnow)
+ assigned_until = Column(DateTime, nullable=True) # Null = indefinite
+ status = Column(String, default="active") # active, completed, cancelled
+ notes = Column(Text, nullable=True)
+
+ # Denormalized for efficient queries
+ device_type = Column(String, nullable=False) # sound_level_meter | seismograph
+ project_id = Column(String, nullable=False, index=True) # FK to Project.id
+
+ created_at = Column(DateTime, default=datetime.utcnow)
+
+
+class ScheduledAction(Base):
+ """
+ Scheduled actions: automation for recording start/stop/download.
+ Terra-View executes these by calling SLMM or SFM endpoints.
+ """
+ __tablename__ = "scheduled_actions"
+
+ id = Column(String, primary_key=True, index=True) # UUID
+ project_id = Column(String, nullable=False, index=True) # FK to Project.id
+ location_id = Column(String, nullable=False, index=True) # FK to MonitoringLocation.id
+ unit_id = Column(String, nullable=True, index=True) # FK to RosterUnit.id (nullable if location-based)
+
+ action_type = Column(String, nullable=False) # start, stop, download, calibrate
+ device_type = Column(String, nullable=False) # sound_level_meter | seismograph
+
+ scheduled_time = Column(DateTime, nullable=False, index=True)
+ executed_at = Column(DateTime, nullable=True)
+ execution_status = Column(String, default="pending") # pending, completed, failed, cancelled
+
+ # Response from device module (SLMM or SFM)
+ module_response = Column(Text, nullable=True) # JSON
+ error_message = Column(Text, nullable=True)
+
+ notes = Column(Text, nullable=True)
+ created_at = Column(DateTime, default=datetime.utcnow)
+
+
+class RecordingSession(Base):
+ """
+ Recording sessions: tracks actual monitoring sessions.
+ Created when recording starts, updated when it stops.
+ """
+ __tablename__ = "recording_sessions"
+
+ id = Column(String, primary_key=True, index=True) # UUID
+ project_id = Column(String, nullable=False, index=True) # FK to Project.id
+ location_id = Column(String, nullable=False, index=True) # FK to MonitoringLocation.id
+ unit_id = Column(String, nullable=False, index=True) # FK to RosterUnit.id
+
+ session_type = Column(String, nullable=False) # sound | vibration
+ started_at = Column(DateTime, nullable=False)
+ stopped_at = Column(DateTime, nullable=True)
+ duration_seconds = Column(Integer, nullable=True)
+ status = Column(String, default="recording") # recording, completed, failed
+
+ # Snapshot of device configuration at recording time
+ session_metadata = Column(Text, nullable=True) # JSON
+
+ created_at = Column(DateTime, default=datetime.utcnow)
+ updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
+
+
+class DataFile(Base):
+ """
+ Data files: references to recorded data files.
+ Terra-View tracks file metadata; actual files stored in data/Projects/ directory.
+ """
+ __tablename__ = "data_files"
+
+ id = Column(String, primary_key=True, index=True) # UUID
+ session_id = Column(String, nullable=False, index=True) # FK to RecordingSession.id
+
+ file_path = Column(String, nullable=False) # Relative to data/Projects/
+ file_type = Column(String, nullable=False) # wav, csv, mseed, json
+ file_size_bytes = Column(Integer, nullable=True)
+ downloaded_at = Column(DateTime, nullable=True)
+ checksum = Column(String, nullable=True) # SHA256 or MD5
+
+ # Additional file metadata
+ file_metadata = Column(Text, nullable=True) # JSON
+
+ created_at = Column(DateTime, default=datetime.utcnow)
diff --git a/backend/routers/project_locations.py b/backend/routers/project_locations.py
new file mode 100644
index 0000000..0f999c9
--- /dev/null
+++ b/backend/routers/project_locations.py
@@ -0,0 +1,404 @@
+"""
+Project Locations Router
+
+Handles monitoring locations (NRLs for sound, monitoring points for vibration)
+and unit assignments within projects.
+"""
+
+from fastapi import APIRouter, Request, Depends, HTTPException, Query
+from fastapi.templating import Jinja2Templates
+from fastapi.responses import HTMLResponse, JSONResponse
+from sqlalchemy.orm import Session
+from sqlalchemy import and_, or_
+from datetime import datetime
+from typing import Optional
+import uuid
+import json
+
+from backend.database import get_db
+from backend.models import (
+ Project,
+ ProjectType,
+ MonitoringLocation,
+ UnitAssignment,
+ RosterUnit,
+ RecordingSession,
+)
+
+router = APIRouter(prefix="/api/projects/{project_id}", tags=["project-locations"])
+templates = Jinja2Templates(directory="templates")
+
+
+# ============================================================================
+# Monitoring Locations CRUD
+# ============================================================================
+
+@router.get("/locations", response_class=HTMLResponse)
+async def get_project_locations(
+ project_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+ location_type: Optional[str] = Query(None),
+):
+ """
+ Get all monitoring locations for a project.
+ Returns HTML partial with location list.
+ """
+ project = db.query(Project).filter_by(id=project_id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ query = db.query(MonitoringLocation).filter_by(project_id=project_id)
+
+ # Filter by type if provided
+ if location_type:
+ query = query.filter_by(location_type=location_type)
+
+ locations = query.order_by(MonitoringLocation.name).all()
+
+ # Enrich with assignment info
+ locations_data = []
+ for location in locations:
+ # Get active assignment
+ assignment = db.query(UnitAssignment).filter(
+ and_(
+ UnitAssignment.location_id == location.id,
+ UnitAssignment.status == "active",
+ )
+ ).first()
+
+ assigned_unit = None
+ if assignment:
+ assigned_unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
+
+ # Count recording sessions
+ session_count = db.query(RecordingSession).filter_by(
+ location_id=location.id
+ ).count()
+
+ locations_data.append({
+ "location": location,
+ "assignment": assignment,
+ "assigned_unit": assigned_unit,
+ "session_count": session_count,
+ })
+
+ return templates.TemplateResponse("partials/projects/location_list.html", {
+ "request": request,
+ "project": project,
+ "locations": locations_data,
+ })
+
+
+@router.post("/locations/create")
+async def create_location(
+ project_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+):
+ """
+ Create a new monitoring location within a project.
+ """
+ project = db.query(Project).filter_by(id=project_id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ form_data = await request.form()
+
+ location = MonitoringLocation(
+ id=str(uuid.uuid4()),
+ project_id=project_id,
+ location_type=form_data.get("location_type"),
+ name=form_data.get("name"),
+ description=form_data.get("description"),
+ coordinates=form_data.get("coordinates"),
+ address=form_data.get("address"),
+ location_metadata=form_data.get("location_metadata"), # JSON string
+ )
+
+ db.add(location)
+ db.commit()
+ db.refresh(location)
+
+ return JSONResponse({
+ "success": True,
+ "location_id": location.id,
+ "message": f"Location '{location.name}' created successfully",
+ })
+
+
+@router.put("/locations/{location_id}")
+async def update_location(
+ project_id: str,
+ location_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+):
+ """
+ Update a monitoring location.
+ """
+ location = db.query(MonitoringLocation).filter_by(
+ id=location_id,
+ project_id=project_id,
+ ).first()
+
+ if not location:
+ raise HTTPException(status_code=404, detail="Location not found")
+
+ data = await request.json()
+
+ # Update fields if provided
+ if "name" in data:
+ location.name = data["name"]
+ if "description" in data:
+ location.description = data["description"]
+ if "coordinates" in data:
+ location.coordinates = data["coordinates"]
+ if "address" in data:
+ location.address = data["address"]
+ if "location_metadata" in data:
+ location.location_metadata = data["location_metadata"]
+
+ location.updated_at = datetime.utcnow()
+
+ db.commit()
+
+ return {"success": True, "message": "Location updated successfully"}
+
+
+@router.delete("/locations/{location_id}")
+async def delete_location(
+ project_id: str,
+ location_id: str,
+ db: Session = Depends(get_db),
+):
+ """
+ Delete a monitoring location.
+ """
+ location = db.query(MonitoringLocation).filter_by(
+ id=location_id,
+ project_id=project_id,
+ ).first()
+
+ if not location:
+ raise HTTPException(status_code=404, detail="Location not found")
+
+ # Check if location has active assignments
+ active_assignments = db.query(UnitAssignment).filter(
+ and_(
+ UnitAssignment.location_id == location_id,
+ UnitAssignment.status == "active",
+ )
+ ).count()
+
+ if active_assignments > 0:
+ raise HTTPException(
+ status_code=400,
+ detail="Cannot delete location with active unit assignments. Unassign units first.",
+ )
+
+ db.delete(location)
+ db.commit()
+
+ return {"success": True, "message": "Location deleted successfully"}
+
+
+# ============================================================================
+# Unit Assignments
+# ============================================================================
+
+@router.get("/assignments", response_class=HTMLResponse)
+async def get_project_assignments(
+ project_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+ status: Optional[str] = Query("active"),
+):
+ """
+ Get all unit assignments for a project.
+ Returns HTML partial with assignment list.
+ """
+ query = db.query(UnitAssignment).filter_by(project_id=project_id)
+
+ if status:
+ query = query.filter_by(status=status)
+
+ assignments = query.order_by(UnitAssignment.assigned_at.desc()).all()
+
+ # Enrich with unit and location details
+ assignments_data = []
+ for assignment in assignments:
+ unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
+ location = db.query(MonitoringLocation).filter_by(id=assignment.location_id).first()
+
+ assignments_data.append({
+ "assignment": assignment,
+ "unit": unit,
+ "location": location,
+ })
+
+ return templates.TemplateResponse("partials/projects/assignment_list.html", {
+ "request": request,
+ "project_id": project_id,
+ "assignments": assignments_data,
+ })
+
+
+@router.post("/locations/{location_id}/assign")
+async def assign_unit_to_location(
+ project_id: str,
+ location_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+):
+ """
+ Assign a unit to a monitoring location.
+ """
+ location = db.query(MonitoringLocation).filter_by(
+ id=location_id,
+ project_id=project_id,
+ ).first()
+
+ if not location:
+ raise HTTPException(status_code=404, detail="Location not found")
+
+ form_data = await request.form()
+ unit_id = form_data.get("unit_id")
+
+ # Verify unit exists and matches location type
+ unit = db.query(RosterUnit).filter_by(id=unit_id).first()
+ if not unit:
+ raise HTTPException(status_code=404, detail="Unit not found")
+
+ # Check device type matches location type
+ expected_device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"
+ if unit.device_type != expected_device_type:
+ raise HTTPException(
+ status_code=400,
+ detail=f"Unit type '{unit.device_type}' does not match location type '{location.location_type}'",
+ )
+
+ # Check if location already has an active assignment
+ existing_assignment = db.query(UnitAssignment).filter(
+ and_(
+ UnitAssignment.location_id == location_id,
+ UnitAssignment.status == "active",
+ )
+ ).first()
+
+ if existing_assignment:
+ raise HTTPException(
+ status_code=400,
+ detail=f"Location already has an active unit assignment ({existing_assignment.unit_id}). Unassign first.",
+ )
+
+ # Create new assignment
+ assigned_until_str = form_data.get("assigned_until")
+ assigned_until = datetime.fromisoformat(assigned_until_str) if assigned_until_str else None
+
+ assignment = UnitAssignment(
+ id=str(uuid.uuid4()),
+ unit_id=unit_id,
+ location_id=location_id,
+ project_id=project_id,
+ device_type=unit.device_type,
+ assigned_until=assigned_until,
+ status="active",
+ notes=form_data.get("notes"),
+ )
+
+ db.add(assignment)
+ db.commit()
+ db.refresh(assignment)
+
+ return JSONResponse({
+ "success": True,
+ "assignment_id": assignment.id,
+ "message": f"Unit '{unit_id}' assigned to '{location.name}'",
+ })
+
+
+@router.post("/assignments/{assignment_id}/unassign")
+async def unassign_unit(
+ project_id: str,
+ assignment_id: str,
+ db: Session = Depends(get_db),
+):
+ """
+ Unassign a unit from a location.
+ """
+ assignment = db.query(UnitAssignment).filter_by(
+ id=assignment_id,
+ project_id=project_id,
+ ).first()
+
+ if not assignment:
+ raise HTTPException(status_code=404, detail="Assignment not found")
+
+ # Check if there are active recording sessions
+ active_sessions = db.query(RecordingSession).filter(
+ and_(
+ RecordingSession.location_id == assignment.location_id,
+ RecordingSession.unit_id == assignment.unit_id,
+ RecordingSession.status == "recording",
+ )
+ ).count()
+
+ if active_sessions > 0:
+ raise HTTPException(
+ status_code=400,
+ detail="Cannot unassign unit with active recording sessions. Stop recording first.",
+ )
+
+ assignment.status = "completed"
+ assignment.assigned_until = datetime.utcnow()
+
+ db.commit()
+
+ return {"success": True, "message": "Unit unassigned successfully"}
+
+
+# ============================================================================
+# Available Units for Assignment
+# ============================================================================
+
+@router.get("/available-units", response_class=JSONResponse)
+async def get_available_units(
+ project_id: str,
+ location_type: str = Query(...),
+ db: Session = Depends(get_db),
+):
+ """
+ Get list of available units for assignment to a location.
+ Filters by device type matching the location type.
+ """
+ # Determine required device type
+ required_device_type = "sound_level_meter" if location_type == "sound" else "seismograph"
+
+ # Get all units of the required type that are deployed and not retired
+ all_units = db.query(RosterUnit).filter(
+ and_(
+ RosterUnit.device_type == required_device_type,
+ RosterUnit.deployed == True,
+ RosterUnit.retired == False,
+ )
+ ).all()
+
+ # Filter out units that already have active assignments
+ assigned_unit_ids = db.query(UnitAssignment.unit_id).filter(
+ UnitAssignment.status == "active"
+ ).distinct().all()
+ assigned_unit_ids = [uid[0] for uid in assigned_unit_ids]
+
+ available_units = [
+ {
+ "id": unit.id,
+ "device_type": unit.device_type,
+ "location": unit.address or unit.location,
+ "model": unit.slm_model if unit.device_type == "sound_level_meter" else unit.unit_type,
+ }
+ for unit in all_units
+ if unit.id not in assigned_unit_ids
+ ]
+
+ return available_units
diff --git a/backend/routers/projects.py b/backend/routers/projects.py
new file mode 100644
index 0000000..7701982
--- /dev/null
+++ b/backend/routers/projects.py
@@ -0,0 +1,359 @@
+"""
+Projects Router
+
+Provides API endpoints for the Projects system:
+- Project CRUD operations
+- Project dashboards
+- Project statistics
+- Type-aware features
+"""
+
+from fastapi import APIRouter, Request, Depends, HTTPException, Query
+from fastapi.templating import Jinja2Templates
+from fastapi.responses import HTMLResponse, JSONResponse
+from sqlalchemy.orm import Session
+from sqlalchemy import func, and_
+from datetime import datetime, timedelta
+from typing import Optional
+import uuid
+import json
+
+from backend.database import get_db
+from backend.models import (
+ Project,
+ ProjectType,
+ MonitoringLocation,
+ UnitAssignment,
+ RecordingSession,
+ ScheduledAction,
+ RosterUnit,
+)
+
+router = APIRouter(prefix="/api/projects", tags=["projects"])
+templates = Jinja2Templates(directory="templates")
+
+
+# ============================================================================
+# Project List & Overview
+# ============================================================================
+
+@router.get("/list", response_class=HTMLResponse)
+async def get_projects_list(
+ request: Request,
+ db: Session = Depends(get_db),
+ status: Optional[str] = Query(None),
+ project_type_id: Optional[str] = Query(None),
+ view: Optional[str] = Query(None),
+):
+ """
+ Get list of all projects.
+ Returns HTML partial with project cards.
+ """
+ query = db.query(Project)
+
+ # Filter by status if provided
+ if status:
+ query = query.filter(Project.status == status)
+
+ # Filter by project type if provided
+ if project_type_id:
+ query = query.filter(Project.project_type_id == project_type_id)
+
+ projects = query.order_by(Project.created_at.desc()).all()
+
+ # Enrich each project with stats
+ projects_data = []
+ for project in projects:
+ # Get project type
+ project_type = db.query(ProjectType).filter_by(id=project.project_type_id).first()
+
+ # Count locations
+ location_count = db.query(func.count(MonitoringLocation.id)).filter_by(
+ project_id=project.id
+ ).scalar()
+
+ # Count assigned units
+ unit_count = db.query(func.count(UnitAssignment.id)).filter(
+ and_(
+ UnitAssignment.project_id == project.id,
+ UnitAssignment.status == "active",
+ )
+ ).scalar()
+
+ # Count active sessions
+ active_session_count = db.query(func.count(RecordingSession.id)).filter(
+ and_(
+ RecordingSession.project_id == project.id,
+ RecordingSession.status == "recording",
+ )
+ ).scalar()
+
+ projects_data.append({
+ "project": project,
+ "project_type": project_type,
+ "location_count": location_count,
+ "unit_count": unit_count,
+ "active_session_count": active_session_count,
+ })
+
+ template_name = "partials/projects/project_list.html"
+ if view == "compact":
+ template_name = "partials/projects/project_list_compact.html"
+
+ return templates.TemplateResponse(template_name, {
+ "request": request,
+ "projects": projects_data,
+ })
+
+
+@router.get("/stats", response_class=HTMLResponse)
+async def get_projects_stats(request: Request, db: Session = Depends(get_db)):
+ """
+ Get summary statistics for projects overview.
+ Returns HTML partial with stat cards.
+ """
+ # Count projects by status
+ total_projects = db.query(func.count(Project.id)).scalar()
+ active_projects = db.query(func.count(Project.id)).filter_by(status="active").scalar()
+ completed_projects = db.query(func.count(Project.id)).filter_by(status="completed").scalar()
+
+ # Count total locations across all projects
+ total_locations = db.query(func.count(MonitoringLocation.id)).scalar()
+
+ # Count assigned units
+ assigned_units = db.query(func.count(UnitAssignment.id)).filter_by(
+ status="active"
+ ).scalar()
+
+ # Count active recording sessions
+ active_sessions = db.query(func.count(RecordingSession.id)).filter_by(
+ status="recording"
+ ).scalar()
+
+ return templates.TemplateResponse("partials/projects/project_stats.html", {
+ "request": request,
+ "total_projects": total_projects,
+ "active_projects": active_projects,
+ "completed_projects": completed_projects,
+ "total_locations": total_locations,
+ "assigned_units": assigned_units,
+ "active_sessions": active_sessions,
+ })
+
+
+# ============================================================================
+# Project CRUD
+# ============================================================================
+
+@router.post("/create")
+async def create_project(request: Request, db: Session = Depends(get_db)):
+ """
+ Create a new project.
+ Expects form data with project details.
+ """
+ form_data = await request.form()
+
+ project = Project(
+ id=str(uuid.uuid4()),
+ name=form_data.get("name"),
+ description=form_data.get("description"),
+ project_type_id=form_data.get("project_type_id"),
+ status="active",
+ client_name=form_data.get("client_name"),
+ site_address=form_data.get("site_address"),
+ site_coordinates=form_data.get("site_coordinates"),
+ start_date=datetime.fromisoformat(form_data.get("start_date")) if form_data.get("start_date") else None,
+ end_date=datetime.fromisoformat(form_data.get("end_date")) if form_data.get("end_date") else None,
+ )
+
+ db.add(project)
+ db.commit()
+ db.refresh(project)
+
+ return JSONResponse({
+ "success": True,
+ "project_id": project.id,
+ "message": f"Project '{project.name}' created successfully",
+ })
+
+
+@router.get("/{project_id}")
+async def get_project(project_id: str, db: Session = Depends(get_db)):
+ """
+ Get project details by ID.
+ Returns JSON with full project data.
+ """
+ project = db.query(Project).filter_by(id=project_id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ project_type = db.query(ProjectType).filter_by(id=project.project_type_id).first()
+
+ return {
+ "id": project.id,
+ "name": project.name,
+ "description": project.description,
+ "project_type_id": project.project_type_id,
+ "project_type_name": project_type.name if project_type else None,
+ "status": project.status,
+ "client_name": project.client_name,
+ "site_address": project.site_address,
+ "site_coordinates": project.site_coordinates,
+ "start_date": project.start_date.isoformat() if project.start_date else None,
+ "end_date": project.end_date.isoformat() if project.end_date else None,
+ "created_at": project.created_at.isoformat(),
+ "updated_at": project.updated_at.isoformat(),
+ }
+
+
+@router.put("/{project_id}")
+async def update_project(
+ project_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+):
+ """
+ Update project details.
+ Expects JSON body with fields to update.
+ """
+ project = db.query(Project).filter_by(id=project_id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ data = await request.json()
+
+ # Update fields if provided
+ if "name" in data:
+ project.name = data["name"]
+ if "description" in data:
+ project.description = data["description"]
+ if "status" in data:
+ project.status = data["status"]
+ if "client_name" in data:
+ project.client_name = data["client_name"]
+ if "site_address" in data:
+ project.site_address = data["site_address"]
+ if "site_coordinates" in data:
+ project.site_coordinates = data["site_coordinates"]
+ if "start_date" in data:
+ project.start_date = datetime.fromisoformat(data["start_date"]) if data["start_date"] else None
+ if "end_date" in data:
+ project.end_date = datetime.fromisoformat(data["end_date"]) if data["end_date"] else None
+
+ project.updated_at = datetime.utcnow()
+
+ db.commit()
+
+ return {"success": True, "message": "Project updated successfully"}
+
+
+@router.delete("/{project_id}")
+async def delete_project(project_id: str, db: Session = Depends(get_db)):
+ """
+ Delete a project (soft delete by archiving).
+ """
+ project = db.query(Project).filter_by(id=project_id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ project.status = "archived"
+ project.updated_at = datetime.utcnow()
+
+ db.commit()
+
+ return {"success": True, "message": "Project archived successfully"}
+
+
+# ============================================================================
+# Project Dashboard Data
+# ============================================================================
+
+@router.get("/{project_id}/dashboard", response_class=HTMLResponse)
+async def get_project_dashboard(
+ project_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+):
+ """
+ Get project dashboard data.
+ Returns HTML partial with project summary.
+ """
+ project = db.query(Project).filter_by(id=project_id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ project_type = db.query(ProjectType).filter_by(id=project.project_type_id).first()
+
+ # Get locations
+ locations = db.query(MonitoringLocation).filter_by(project_id=project_id).all()
+
+ # Get assigned units with details
+ assignments = db.query(UnitAssignment).filter(
+ and_(
+ UnitAssignment.project_id == project_id,
+ UnitAssignment.status == "active",
+ )
+ ).all()
+
+ assigned_units = []
+ for assignment in assignments:
+ unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
+ if unit:
+ assigned_units.append({
+ "assignment": assignment,
+ "unit": unit,
+ })
+
+ # Get active recording sessions
+ active_sessions = db.query(RecordingSession).filter(
+ and_(
+ RecordingSession.project_id == project_id,
+ RecordingSession.status == "recording",
+ )
+ ).all()
+
+ # Get completed sessions count
+ completed_sessions_count = db.query(func.count(RecordingSession.id)).filter(
+ and_(
+ RecordingSession.project_id == project_id,
+ RecordingSession.status == "completed",
+ )
+ ).scalar()
+
+ # Get upcoming scheduled actions
+ upcoming_actions = db.query(ScheduledAction).filter(
+ and_(
+ ScheduledAction.project_id == project_id,
+ ScheduledAction.execution_status == "pending",
+ ScheduledAction.scheduled_time > datetime.utcnow(),
+ )
+ ).order_by(ScheduledAction.scheduled_time).limit(5).all()
+
+ return templates.TemplateResponse("partials/projects/project_dashboard.html", {
+ "request": request,
+ "project": project,
+ "project_type": project_type,
+ "locations": locations,
+ "assigned_units": assigned_units,
+ "active_sessions": active_sessions,
+ "completed_sessions_count": completed_sessions_count,
+ "upcoming_actions": upcoming_actions,
+ })
+
+
+# ============================================================================
+# Project Types
+# ============================================================================
+
+@router.get("/types/list", response_class=HTMLResponse)
+async def get_project_types(request: Request, db: Session = Depends(get_db)):
+ """
+ Get all available project types.
+ Returns HTML partial with project type cards.
+ """
+ project_types = db.query(ProjectType).all()
+
+ return templates.TemplateResponse("partials/projects/project_type_cards.html", {
+ "request": request,
+ "project_types": project_types,
+ })
diff --git a/backend/routers/scheduler.py b/backend/routers/scheduler.py
new file mode 100644
index 0000000..caf64cf
--- /dev/null
+++ b/backend/routers/scheduler.py
@@ -0,0 +1,409 @@
+"""
+Scheduler Router
+
+Handles scheduled actions for automated recording control.
+"""
+
+from fastapi import APIRouter, Request, Depends, HTTPException, Query
+from fastapi.templating import Jinja2Templates
+from fastapi.responses import HTMLResponse, JSONResponse
+from sqlalchemy.orm import Session
+from sqlalchemy import and_, or_
+from datetime import datetime, timedelta
+from typing import Optional
+import uuid
+import json
+
+from backend.database import get_db
+from backend.models import (
+ Project,
+ ScheduledAction,
+ MonitoringLocation,
+ UnitAssignment,
+ RosterUnit,
+)
+from backend.services.scheduler import get_scheduler
+
+router = APIRouter(prefix="/api/projects/{project_id}/scheduler", tags=["scheduler"])
+templates = Jinja2Templates(directory="templates")
+
+
+# ============================================================================
+# Scheduled Actions List
+# ============================================================================
+
+@router.get("/actions", response_class=HTMLResponse)
+async def get_scheduled_actions(
+ project_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+ status: Optional[str] = Query(None),
+ start_date: Optional[str] = Query(None),
+ end_date: Optional[str] = Query(None),
+):
+ """
+ Get scheduled actions for a project.
+ Returns HTML partial with agenda/calendar view.
+ """
+ query = db.query(ScheduledAction).filter_by(project_id=project_id)
+
+ # Filter by status
+ if status:
+ query = query.filter_by(execution_status=status)
+ else:
+ # By default, show pending and upcoming completed/failed
+ query = query.filter(
+ or_(
+ ScheduledAction.execution_status == "pending",
+ and_(
+ ScheduledAction.execution_status.in_(["completed", "failed"]),
+ ScheduledAction.scheduled_time >= datetime.utcnow() - timedelta(days=7),
+ ),
+ )
+ )
+
+ # Filter by date range
+ if start_date:
+ query = query.filter(ScheduledAction.scheduled_time >= datetime.fromisoformat(start_date))
+ if end_date:
+ query = query.filter(ScheduledAction.scheduled_time <= datetime.fromisoformat(end_date))
+
+ actions = query.order_by(ScheduledAction.scheduled_time).all()
+
+ # Enrich with location and unit details
+ actions_data = []
+ for action in actions:
+ location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
+
+ unit = None
+ if action.unit_id:
+ unit = db.query(RosterUnit).filter_by(id=action.unit_id).first()
+ else:
+ # Get from assignment
+ assignment = db.query(UnitAssignment).filter(
+ and_(
+ UnitAssignment.location_id == action.location_id,
+ UnitAssignment.status == "active",
+ )
+ ).first()
+ if assignment:
+ unit = db.query(RosterUnit).filter_by(id=assignment.unit_id).first()
+
+ actions_data.append({
+ "action": action,
+ "location": location,
+ "unit": unit,
+ })
+
+ return templates.TemplateResponse("partials/projects/scheduler_agenda.html", {
+ "request": request,
+ "project_id": project_id,
+ "actions": actions_data,
+ })
+
+
+# ============================================================================
+# Create Scheduled Action
+# ============================================================================
+
+@router.post("/actions/create")
+async def create_scheduled_action(
+ project_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+):
+ """
+ Create a new scheduled action.
+ """
+ project = db.query(Project).filter_by(id=project_id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ form_data = await request.form()
+
+ location_id = form_data.get("location_id")
+ location = db.query(MonitoringLocation).filter_by(
+ id=location_id,
+ project_id=project_id,
+ ).first()
+
+ if not location:
+ raise HTTPException(status_code=404, detail="Location not found")
+
+ # Determine device type from location
+ device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"
+
+ # Get unit_id (optional - can be determined from assignment at execution time)
+ unit_id = form_data.get("unit_id")
+
+ action = ScheduledAction(
+ id=str(uuid.uuid4()),
+ project_id=project_id,
+ location_id=location_id,
+ unit_id=unit_id,
+ action_type=form_data.get("action_type"),
+ device_type=device_type,
+ scheduled_time=datetime.fromisoformat(form_data.get("scheduled_time")),
+ execution_status="pending",
+ notes=form_data.get("notes"),
+ )
+
+ db.add(action)
+ db.commit()
+ db.refresh(action)
+
+ return JSONResponse({
+ "success": True,
+ "action_id": action.id,
+ "message": f"Scheduled action '{action.action_type}' created for {action.scheduled_time}",
+ })
+
+
+# ============================================================================
+# Schedule Recording Session
+# ============================================================================
+
+@router.post("/schedule-session")
+async def schedule_recording_session(
+ project_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+):
+ """
+ Schedule a complete recording session (start + stop).
+ Creates two scheduled actions: start and stop.
+ """
+ project = db.query(Project).filter_by(id=project_id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ form_data = await request.form()
+
+ location_id = form_data.get("location_id")
+ location = db.query(MonitoringLocation).filter_by(
+ id=location_id,
+ project_id=project_id,
+ ).first()
+
+ if not location:
+ raise HTTPException(status_code=404, detail="Location not found")
+
+ device_type = "sound_level_meter" if location.location_type == "sound" else "seismograph"
+ unit_id = form_data.get("unit_id")
+
+ start_time = datetime.fromisoformat(form_data.get("start_time"))
+ duration_minutes = int(form_data.get("duration_minutes", 60))
+ stop_time = start_time + timedelta(minutes=duration_minutes)
+
+ # Create START action
+ start_action = ScheduledAction(
+ id=str(uuid.uuid4()),
+ project_id=project_id,
+ location_id=location_id,
+ unit_id=unit_id,
+ action_type="start",
+ device_type=device_type,
+ scheduled_time=start_time,
+ execution_status="pending",
+ notes=form_data.get("notes"),
+ )
+
+ # Create STOP action
+ stop_action = ScheduledAction(
+ id=str(uuid.uuid4()),
+ project_id=project_id,
+ location_id=location_id,
+ unit_id=unit_id,
+ action_type="stop",
+ device_type=device_type,
+ scheduled_time=stop_time,
+ execution_status="pending",
+ notes=f"Auto-stop after {duration_minutes} minutes",
+ )
+
+ db.add(start_action)
+ db.add(stop_action)
+ db.commit()
+
+ return JSONResponse({
+ "success": True,
+ "start_action_id": start_action.id,
+ "stop_action_id": stop_action.id,
+ "message": f"Recording session scheduled from {start_time} to {stop_time}",
+ })
+
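+# Example request (sketch only; ids are placeholders, start_time is ISO-8601, and
+# duration_minutes falls back to 60 when omitted):
+#
+#   curl -X POST "<terra-view-host>/api/projects/<project-id>/scheduler/schedule-session" \
+#        -F "location_id=<location-id>" \
+#        -F "start_time=2025-06-01T09:00:00" \
+#        -F "duration_minutes=120"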
+
+# ============================================================================
+# Update/Cancel Scheduled Action
+# ============================================================================
+
+@router.put("/actions/{action_id}")
+async def update_scheduled_action(
+ project_id: str,
+ action_id: str,
+ request: Request,
+ db: Session = Depends(get_db),
+):
+ """
+ Update a scheduled action (only if not yet executed).
+ """
+ action = db.query(ScheduledAction).filter_by(
+ id=action_id,
+ project_id=project_id,
+ ).first()
+
+ if not action:
+ raise HTTPException(status_code=404, detail="Action not found")
+
+ if action.execution_status != "pending":
+ raise HTTPException(
+ status_code=400,
+ detail="Cannot update action that has already been executed",
+ )
+
+ data = await request.json()
+
+ if "scheduled_time" in data:
+ action.scheduled_time = datetime.fromisoformat(data["scheduled_time"])
+ if "notes" in data:
+ action.notes = data["notes"]
+
+ db.commit()
+
+ return {"success": True, "message": "Action updated successfully"}
+
+
+@router.post("/actions/{action_id}/cancel")
+async def cancel_scheduled_action(
+ project_id: str,
+ action_id: str,
+ db: Session = Depends(get_db),
+):
+ """
+ Cancel a pending scheduled action.
+ """
+ action = db.query(ScheduledAction).filter_by(
+ id=action_id,
+ project_id=project_id,
+ ).first()
+
+ if not action:
+ raise HTTPException(status_code=404, detail="Action not found")
+
+ if action.execution_status != "pending":
+ raise HTTPException(
+ status_code=400,
+ detail="Can only cancel pending actions",
+ )
+
+ action.execution_status = "cancelled"
+ db.commit()
+
+ return {"success": True, "message": "Action cancelled successfully"}
+
+
+@router.delete("/actions/{action_id}")
+async def delete_scheduled_action(
+ project_id: str,
+ action_id: str,
+ db: Session = Depends(get_db),
+):
+ """
+ Delete a scheduled action (only if pending or cancelled).
+ """
+ action = db.query(ScheduledAction).filter_by(
+ id=action_id,
+ project_id=project_id,
+ ).first()
+
+ if not action:
+ raise HTTPException(status_code=404, detail="Action not found")
+
+ if action.execution_status not in ["pending", "cancelled"]:
+ raise HTTPException(
+ status_code=400,
+ detail="Cannot delete action that has been executed",
+ )
+
+ db.delete(action)
+ db.commit()
+
+ return {"success": True, "message": "Action deleted successfully"}
+
+
+# ============================================================================
+# Manual Execution
+# ============================================================================
+
+@router.post("/actions/{action_id}/execute")
+async def execute_action_now(
+ project_id: str,
+ action_id: str,
+ db: Session = Depends(get_db),
+):
+ """
+ Manually trigger execution of a scheduled action (for testing/debugging).
+ """
+ action = db.query(ScheduledAction).filter_by(
+ id=action_id,
+ project_id=project_id,
+ ).first()
+
+ if not action:
+ raise HTTPException(status_code=404, detail="Action not found")
+
+ if action.execution_status != "pending":
+ raise HTTPException(
+ status_code=400,
+ detail="Action is not pending",
+ )
+
+ # Execute via scheduler service
+ scheduler = get_scheduler()
+ result = await scheduler.execute_action_by_id(action_id)
+
+ # Refresh from DB to get updated status
+ db.refresh(action)
+
+ return JSONResponse({
+ "success": result.get("success", False),
+ "result": result,
+ "action": {
+ "id": action.id,
+ "execution_status": action.execution_status,
+ "executed_at": action.executed_at.isoformat() if action.executed_at else None,
+ "error_message": action.error_message,
+ },
+ })
+
+
+# ============================================================================
+# Scheduler Status
+# ============================================================================
+
+@router.get("/status")
+async def get_scheduler_status():
+ """
+ Get scheduler service status.
+ """
+ scheduler = get_scheduler()
+
+ return {
+ "running": scheduler.running,
+ "check_interval": scheduler.check_interval,
+ }
+
+
+@router.post("/execute-pending")
+async def trigger_pending_execution():
+ """
+ Manually trigger execution of all pending actions (for testing).
+ """
+ scheduler = get_scheduler()
+ results = await scheduler.execute_pending_actions()
+
+ return {
+ "success": True,
+ "executed_count": len(results),
+ "results": results,
+ }
diff --git a/backend/routers/slm_dashboard.py b/backend/routers/slm_dashboard.py
index 3d9c0df..d2e4408 100644
--- a/backend/routers/slm_dashboard.py
+++ b/backend/routers/slm_dashboard.py
@@ -10,6 +10,7 @@ from fastapi.responses import HTMLResponse
from sqlalchemy.orm import Session
from sqlalchemy import func
from datetime import datetime, timedelta
+import asyncio
import httpx
import logging
import os
@@ -60,7 +61,9 @@ async def get_slm_stats(request: Request, db: Session = Depends(get_db)):
 async def get_slm_units(
     request: Request,
     db: Session = Depends(get_db),
-    search: str = Query(None)
+    search: str = Query(None),
+    project: str = Query(None),
+    include_measurement: bool = Query(False),
):
"""
Get list of SLM units for the sidebar.
@@ -77,10 +84,39 @@ async def get_slm_units(
(RosterUnit.address.like(search_term))
)
- # Only show deployed units by default
- units = query.filter_by(deployed=True, retired=False).order_by(RosterUnit.id).all()
+ units = query.order_by(
+ RosterUnit.retired.asc(),
+ RosterUnit.deployed.desc(),
+ RosterUnit.id.asc()
+ ).all()
- return templates.TemplateResponse("partials/slm_unit_list.html", {
+ one_hour_ago = datetime.utcnow() - timedelta(hours=1)
+ for unit in units:
+ unit.is_recent = bool(unit.slm_last_check and unit.slm_last_check > one_hour_ago)
+
+ if include_measurement:
+ async def fetch_measurement_state(client: httpx.AsyncClient, unit_id: str) -> str | None:
+ try:
+ response = await client.get(f"{SLMM_BASE_URL}/api/nl43/{unit_id}/measurement-state")
+ if response.status_code == 200:
+ return response.json().get("measurement_state")
+ except Exception:
+ return None
+ return None
+
+ deployed_units = [unit for unit in units if unit.deployed and not unit.retired]
+ if deployed_units:
+ async with httpx.AsyncClient(timeout=3.0) as client:
+ tasks = [fetch_measurement_state(client, unit.id) for unit in deployed_units]
+ results = await asyncio.gather(*tasks, return_exceptions=True)
+
+ for unit, state in zip(deployed_units, results):
+ if isinstance(state, Exception):
+ unit.measurement_state = None
+ else:
+ unit.measurement_state = state
+
+ return templates.TemplateResponse("partials/slm_device_list.html", {
"request": request,
"units": units
})
diff --git a/backend/services/device_controller.py b/backend/services/device_controller.py
new file mode 100644
index 0000000..a9aa80d
--- /dev/null
+++ b/backend/services/device_controller.py
@@ -0,0 +1,384 @@
+"""
+Device Controller Service
+
+Routes device operations to the appropriate backend module:
+- SLMM for sound level meters
+- SFM for seismographs (future implementation)
+
+This abstraction allows Projects system to work with any device type
+without knowing the underlying communication protocol.
+"""
+
+from typing import Dict, Any, Optional, List
+from backend.services.slmm_client import get_slmm_client, SLMMClientError
+
+
+class DeviceControllerError(Exception):
+ """Base exception for device controller errors."""
+ pass
+
+
+class UnsupportedDeviceTypeError(DeviceControllerError):
+ """Raised when device type is not supported."""
+ pass
+
+
+class DeviceController:
+ """
+ Unified interface for controlling all device types.
+
+ Routes commands to appropriate backend module based on device_type.
+
+ Usage:
+ controller = DeviceController()
+ await controller.start_recording("nl43-001", "sound_level_meter", config={})
+ await controller.stop_recording("seismo-042", "seismograph")
+ """
+
+ def __init__(self):
+ self.slmm_client = get_slmm_client()
+
+ # ========================================================================
+ # Recording Control
+ # ========================================================================
+
+ async def start_recording(
+ self,
+ unit_id: str,
+ device_type: str,
+ config: Optional[Dict[str, Any]] = None,
+ ) -> Dict[str, Any]:
+ """
+ Start recording on a device.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+ config: Device-specific recording configuration
+
+ Returns:
+ Response dict from device module
+
+ Raises:
+ UnsupportedDeviceTypeError: Device type not supported
+ DeviceControllerError: Operation failed
+ """
+ if device_type == "sound_level_meter":
+ try:
+ return await self.slmm_client.start_recording(unit_id, config)
+ except SLMMClientError as e:
+ raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+ elif device_type == "seismograph":
+ # TODO: Implement SFM client for seismograph control
+ # For now, return a placeholder response
+ return {
+ "status": "not_implemented",
+ "message": "Seismograph recording control not yet implemented",
+ "unit_id": unit_id,
+ }
+
+ else:
+ raise UnsupportedDeviceTypeError(
+ f"Device type '{device_type}' is not supported. "
+ f"Supported types: sound_level_meter, seismograph"
+ )
+
+ async def stop_recording(
+ self,
+ unit_id: str,
+ device_type: str,
+ ) -> Dict[str, Any]:
+ """
+ Stop recording on a device.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+
+ Returns:
+ Response dict from device module
+ """
+ if device_type == "sound_level_meter":
+ try:
+ return await self.slmm_client.stop_recording(unit_id)
+ except SLMMClientError as e:
+ raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+ elif device_type == "seismograph":
+ # TODO: Implement SFM client
+ return {
+ "status": "not_implemented",
+ "message": "Seismograph recording control not yet implemented",
+ "unit_id": unit_id,
+ }
+
+ else:
+ raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
+ async def pause_recording(
+ self,
+ unit_id: str,
+ device_type: str,
+ ) -> Dict[str, Any]:
+ """
+ Pause recording on a device.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+
+ Returns:
+ Response dict from device module
+ """
+ if device_type == "sound_level_meter":
+ try:
+ return await self.slmm_client.pause_recording(unit_id)
+ except SLMMClientError as e:
+ raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+ elif device_type == "seismograph":
+ return {
+ "status": "not_implemented",
+ "message": "Seismograph pause not yet implemented",
+ "unit_id": unit_id,
+ }
+
+ else:
+ raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
+ async def resume_recording(
+ self,
+ unit_id: str,
+ device_type: str,
+ ) -> Dict[str, Any]:
+ """
+ Resume paused recording on a device.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+
+ Returns:
+ Response dict from device module
+ """
+ if device_type == "sound_level_meter":
+ try:
+ return await self.slmm_client.resume_recording(unit_id)
+ except SLMMClientError as e:
+ raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+ elif device_type == "seismograph":
+ return {
+ "status": "not_implemented",
+ "message": "Seismograph resume not yet implemented",
+ "unit_id": unit_id,
+ }
+
+ else:
+ raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
+ # ========================================================================
+ # Status & Monitoring
+ # ========================================================================
+
+ async def get_device_status(
+ self,
+ unit_id: str,
+ device_type: str,
+ ) -> Dict[str, Any]:
+ """
+ Get current device status.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+
+ Returns:
+ Status dict from device module
+ """
+ if device_type == "sound_level_meter":
+ try:
+ return await self.slmm_client.get_unit_status(unit_id)
+ except SLMMClientError as e:
+ raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+ elif device_type == "seismograph":
+ # TODO: Implement SFM status check
+ return {
+ "status": "not_implemented",
+ "message": "Seismograph status not yet implemented",
+ "unit_id": unit_id,
+ }
+
+ else:
+ raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
+ async def get_live_data(
+ self,
+ unit_id: str,
+ device_type: str,
+ ) -> Dict[str, Any]:
+ """
+ Get live data from device.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+
+ Returns:
+ Live data dict from device module
+ """
+ if device_type == "sound_level_meter":
+ try:
+ return await self.slmm_client.get_live_data(unit_id)
+ except SLMMClientError as e:
+ raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+ elif device_type == "seismograph":
+ return {
+ "status": "not_implemented",
+ "message": "Seismograph live data not yet implemented",
+ "unit_id": unit_id,
+ }
+
+ else:
+ raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
+ # ========================================================================
+ # Data Download
+ # ========================================================================
+
+ async def download_files(
+ self,
+ unit_id: str,
+ device_type: str,
+ destination_path: str,
+ files: Optional[List[str]] = None,
+ ) -> Dict[str, Any]:
+ """
+ Download data files from device.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+ destination_path: Local path to save files
+ files: List of filenames, or None for all
+
+ Returns:
+ Download result with file list
+ """
+ if device_type == "sound_level_meter":
+ try:
+ return await self.slmm_client.download_files(
+ unit_id,
+ destination_path,
+ files,
+ )
+ except SLMMClientError as e:
+ raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+ elif device_type == "seismograph":
+ # TODO: Implement SFM file download
+ return {
+ "status": "not_implemented",
+ "message": "Seismograph file download not yet implemented",
+ "unit_id": unit_id,
+ }
+
+ else:
+ raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
+ # ========================================================================
+ # Device Configuration
+ # ========================================================================
+
+ async def update_device_config(
+ self,
+ unit_id: str,
+ device_type: str,
+ config: Dict[str, Any],
+ ) -> Dict[str, Any]:
+ """
+ Update device configuration.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+ config: Configuration parameters
+
+ Returns:
+ Updated config from device module
+ """
+ if device_type == "sound_level_meter":
+ try:
+ return await self.slmm_client.update_unit_config(
+ unit_id,
+ host=config.get("host"),
+ tcp_port=config.get("tcp_port"),
+ ftp_port=config.get("ftp_port"),
+ ftp_username=config.get("ftp_username"),
+ ftp_password=config.get("ftp_password"),
+ )
+ except SLMMClientError as e:
+ raise DeviceControllerError(f"SLMM error: {str(e)}")
+
+ elif device_type == "seismograph":
+ return {
+ "status": "not_implemented",
+ "message": "Seismograph config update not yet implemented",
+ "unit_id": unit_id,
+ }
+
+ else:
+ raise UnsupportedDeviceTypeError(f"Unsupported device type: {device_type}")
+
+ # ========================================================================
+ # Health Check
+ # ========================================================================
+
+ async def check_device_connectivity(
+ self,
+ unit_id: str,
+ device_type: str,
+ ) -> bool:
+ """
+ Check if device is reachable.
+
+ Args:
+ unit_id: Unit identifier
+ device_type: "sound_level_meter" | "seismograph"
+
+ Returns:
+ True if device is reachable, False otherwise
+ """
+ if device_type == "sound_level_meter":
+ try:
+ status = await self.slmm_client.get_unit_status(unit_id)
+ return status.get("last_seen") is not None
+            except Exception:
+ return False
+
+ elif device_type == "seismograph":
+ # TODO: Implement SFM connectivity check
+ return False
+
+ else:
+ return False
+
+
+# Singleton instance
+_default_controller: Optional[DeviceController] = None
+
+
+def get_device_controller() -> DeviceController:
+ """
+ Get the default device controller instance.
+
+ Returns:
+ DeviceController instance
+ """
+ global _default_controller
+ if _default_controller is None:
+ _default_controller = DeviceController()
+ return _default_controller
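+
+
+# Usage sketch (illustrative only; "nl43-001" is a placeholder unit id and the calls
+# must run inside an async context):
+#
+#   controller = get_device_controller()
+#   if await controller.check_device_connectivity("nl43-001", "sound_level_meter"):
+#       await controller.start_recording("nl43-001", "sound_level_meter", config={})
+#       status = await controller.get_device_status("nl43-001", "sound_level_meter")
+#       await controller.stop_recording("nl43-001", "sound_level_meter")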
diff --git a/backend/services/scheduler.py b/backend/services/scheduler.py
new file mode 100644
index 0000000..678f8ec
--- /dev/null
+++ b/backend/services/scheduler.py
@@ -0,0 +1,355 @@
+"""
+Scheduler Service
+
+Executes scheduled actions for Projects system.
+Monitors pending scheduled actions and executes them by calling device modules (SLMM/SFM).
+
+This service runs as a background task in FastAPI, checking for pending actions
+every minute and executing them when their scheduled time arrives.
+"""
+
+import asyncio
+import json
+from datetime import datetime, timedelta
+from typing import Optional, List, Dict, Any
+from sqlalchemy.orm import Session
+from sqlalchemy import and_
+
+from backend.database import SessionLocal
+from backend.models import ScheduledAction, RecordingSession, MonitoringLocation, Project
+from backend.services.device_controller import get_device_controller, DeviceControllerError
+import uuid
+
+
+class SchedulerService:
+ """
+ Service for executing scheduled actions.
+
+ Usage:
+ scheduler = SchedulerService()
+ await scheduler.start() # Start background loop
+ scheduler.stop() # Stop background loop
+ """
+
+ def __init__(self, check_interval: int = 60):
+ """
+ Initialize scheduler.
+
+ Args:
+ check_interval: Seconds between checks for pending actions (default: 60)
+ """
+ self.check_interval = check_interval
+ self.running = False
+ self.task: Optional[asyncio.Task] = None
+ self.device_controller = get_device_controller()
+
+ async def start(self):
+ """Start the scheduler background task."""
+ if self.running:
+ print("Scheduler is already running")
+ return
+
+ self.running = True
+ self.task = asyncio.create_task(self._run_loop())
+ print(f"Scheduler started (checking every {self.check_interval}s)")
+
+ def stop(self):
+ """Stop the scheduler background task."""
+ self.running = False
+ if self.task:
+ self.task.cancel()
+ print("Scheduler stopped")
+
+ async def _run_loop(self):
+ """Main scheduler loop."""
+ while self.running:
+ try:
+ await self.execute_pending_actions()
+ except Exception as e:
+ print(f"Scheduler error: {e}")
+ # Continue running even if there's an error
+
+ await asyncio.sleep(self.check_interval)
+
+ async def execute_pending_actions(self) -> List[Dict[str, Any]]:
+ """
+ Find and execute all pending scheduled actions that are due.
+
+ Returns:
+ List of execution results
+ """
+ db = SessionLocal()
+ results = []
+
+ try:
+ # Find pending actions that are due
+ now = datetime.utcnow()
+ pending_actions = db.query(ScheduledAction).filter(
+ and_(
+ ScheduledAction.execution_status == "pending",
+ ScheduledAction.scheduled_time <= now,
+ )
+ ).order_by(ScheduledAction.scheduled_time).all()
+
+ if not pending_actions:
+ return []
+
+ print(f"Found {len(pending_actions)} pending action(s) to execute")
+
+ for action in pending_actions:
+ result = await self._execute_action(action, db)
+ results.append(result)
+
+ db.commit()
+
+ except Exception as e:
+ print(f"Error executing pending actions: {e}")
+ db.rollback()
+ finally:
+ db.close()
+
+ return results
+
+ async def _execute_action(
+ self,
+ action: ScheduledAction,
+ db: Session,
+ ) -> Dict[str, Any]:
+ """
+ Execute a single scheduled action.
+
+ Args:
+ action: ScheduledAction to execute
+ db: Database session
+
+ Returns:
+ Execution result dict
+ """
+ print(f"Executing action {action.id}: {action.action_type} for unit {action.unit_id}")
+
+ result = {
+ "action_id": action.id,
+ "action_type": action.action_type,
+ "unit_id": action.unit_id,
+ "scheduled_time": action.scheduled_time.isoformat(),
+ "success": False,
+ "error": None,
+ }
+
+ try:
+ # Determine which unit to use
+ # If unit_id is specified, use it; otherwise get from location assignment
+ unit_id = action.unit_id
+ if not unit_id:
+ # Get assigned unit from location
+ from backend.models import UnitAssignment
+ assignment = db.query(UnitAssignment).filter(
+ and_(
+ UnitAssignment.location_id == action.location_id,
+ UnitAssignment.status == "active",
+ )
+ ).first()
+
+ if not assignment:
+ raise Exception(f"No active unit assigned to location {action.location_id}")
+
+ unit_id = assignment.unit_id
+
+ # Execute the action based on type
+ if action.action_type == "start":
+ response = await self._execute_start(action, unit_id, db)
+ elif action.action_type == "stop":
+ response = await self._execute_stop(action, unit_id, db)
+ elif action.action_type == "download":
+ response = await self._execute_download(action, unit_id, db)
+ else:
+ raise Exception(f"Unknown action type: {action.action_type}")
+
+ # Mark action as completed
+ action.execution_status = "completed"
+ action.executed_at = datetime.utcnow()
+ action.module_response = json.dumps(response)
+
+ result["success"] = True
+ result["response"] = response
+
+ print(f"✓ Action {action.id} completed successfully")
+
+ except Exception as e:
+ # Mark action as failed
+ action.execution_status = "failed"
+ action.executed_at = datetime.utcnow()
+ action.error_message = str(e)
+
+ result["error"] = str(e)
+
+ print(f"✗ Action {action.id} failed: {e}")
+
+ return result
+
+ async def _execute_start(
+ self,
+ action: ScheduledAction,
+ unit_id: str,
+ db: Session,
+ ) -> Dict[str, Any]:
+ """Execute a 'start' action."""
+ # Start recording via device controller
+ response = await self.device_controller.start_recording(
+ unit_id,
+ action.device_type,
+ config={}, # TODO: Load config from action.notes or metadata
+ )
+
+ # Create recording session
+ session = RecordingSession(
+ id=str(uuid.uuid4()),
+ project_id=action.project_id,
+ location_id=action.location_id,
+ unit_id=unit_id,
+ session_type="sound" if action.device_type == "sound_level_meter" else "vibration",
+ started_at=datetime.utcnow(),
+ status="recording",
+ session_metadata=json.dumps({"scheduled_action_id": action.id}),
+ )
+ db.add(session)
+
+ return {
+ "status": "started",
+ "session_id": session.id,
+ "device_response": response,
+ }
+
+ async def _execute_stop(
+ self,
+ action: ScheduledAction,
+ unit_id: str,
+ db: Session,
+ ) -> Dict[str, Any]:
+ """Execute a 'stop' action."""
+ # Stop recording via device controller
+ response = await self.device_controller.stop_recording(
+ unit_id,
+ action.device_type,
+ )
+
+ # Find and update the active recording session
+ active_session = db.query(RecordingSession).filter(
+ and_(
+ RecordingSession.location_id == action.location_id,
+ RecordingSession.unit_id == unit_id,
+ RecordingSession.status == "recording",
+ )
+ ).first()
+
+ if active_session:
+ active_session.stopped_at = datetime.utcnow()
+ active_session.status = "completed"
+ active_session.duration_seconds = int(
+ (active_session.stopped_at - active_session.started_at).total_seconds()
+ )
+
+ return {
+ "status": "stopped",
+ "session_id": active_session.id if active_session else None,
+ "device_response": response,
+ }
+
+ async def _execute_download(
+ self,
+ action: ScheduledAction,
+ unit_id: str,
+ db: Session,
+ ) -> Dict[str, Any]:
+ """Execute a 'download' action."""
+ # Get project and location info for file path
+ location = db.query(MonitoringLocation).filter_by(id=action.location_id).first()
+ project = db.query(Project).filter_by(id=action.project_id).first()
+
+ if not location or not project:
+ raise Exception("Project or location not found")
+
+ # Build destination path
+ # Example: data/Projects/{project-id}/sound/{location-name}/session-{timestamp}/
+ session_timestamp = datetime.utcnow().strftime("%Y-%m-%d-%H%M")
+ location_type_dir = "sound" if action.device_type == "sound_level_meter" else "vibration"
+
+ destination_path = (
+ f"data/Projects/{project.id}/{location_type_dir}/"
+ f"{location.name}/session-{session_timestamp}/"
+ )
+
+ # Download files via device controller
+ response = await self.device_controller.download_files(
+ unit_id,
+ action.device_type,
+ destination_path,
+ files=None, # Download all files
+ )
+
+ # TODO: Create DataFile records for downloaded files
+
+ return {
+ "status": "downloaded",
+ "destination_path": destination_path,
+ "device_response": response,
+ }
+
+ # ========================================================================
+ # Manual Execution (for testing/debugging)
+ # ========================================================================
+
+ async def execute_action_by_id(self, action_id: str) -> Dict[str, Any]:
+ """
+ Manually execute a specific action by ID.
+
+ Args:
+ action_id: ScheduledAction ID
+
+ Returns:
+ Execution result
+ """
+ db = SessionLocal()
+ try:
+ action = db.query(ScheduledAction).filter_by(id=action_id).first()
+ if not action:
+ return {"success": False, "error": "Action not found"}
+
+ result = await self._execute_action(action, db)
+ db.commit()
+ return result
+
+ except Exception as e:
+ db.rollback()
+ return {"success": False, "error": str(e)}
+ finally:
+ db.close()
+
+
+# Singleton instance
+_scheduler_instance: Optional[SchedulerService] = None
+
+
+def get_scheduler() -> SchedulerService:
+ """
+ Get the scheduler singleton instance.
+
+ Returns:
+ SchedulerService instance
+ """
+ global _scheduler_instance
+ if _scheduler_instance is None:
+ _scheduler_instance = SchedulerService()
+ return _scheduler_instance
+
+
+async def start_scheduler():
+ """Start the global scheduler instance."""
+ scheduler = get_scheduler()
+ await scheduler.start()
+
+
+def stop_scheduler():
+ """Stop the global scheduler instance."""
+ scheduler = get_scheduler()
+ scheduler.stop()
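+
+
+# Wiring sketch (illustrative; assumes the FastAPI "app" object in the main entrypoint,
+# adjust to the project's actual startup/shutdown hooks):
+#
+#   @app.on_event("startup")
+#   async def _start_projects_scheduler():
+#       await start_scheduler()
+#
+#   @app.on_event("shutdown")
+#   async def _stop_projects_scheduler():
+#       stop_scheduler()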
diff --git a/backend/services/slmm_client.py b/backend/services/slmm_client.py
new file mode 100644
index 0000000..f04badf
--- /dev/null
+++ b/backend/services/slmm_client.py
@@ -0,0 +1,423 @@
+"""
+SLMM API Client Wrapper
+
+Provides a clean interface for Terra-View to interact with the SLMM backend.
+All SLM operations should go through this client instead of direct HTTP calls.
+
+SLMM (Sound Level Meter Manager) is a separate service running on port 8100
+that handles TCP/FTP communication with Rion NL-43/NL-53 devices.
+"""
+
+import httpx
+from typing import Optional, Dict, Any, List
+from datetime import datetime
+import json
+
+
+# SLMM backend base URLs
+SLMM_BASE_URL = "http://localhost:8100"
+SLMM_API_BASE = f"{SLMM_BASE_URL}/api/nl43"
+
+
+class SLMMClientError(Exception):
+ """Base exception for SLMM client errors."""
+ pass
+
+
+class SLMMConnectionError(SLMMClientError):
+ """Raised when cannot connect to SLMM backend."""
+ pass
+
+
+class SLMMDeviceError(SLMMClientError):
+ """Raised when device operation fails."""
+ pass
+
+
+class SLMMClient:
+ """
+ Client for interacting with SLMM backend.
+
+ Usage:
+ client = SLMMClient()
+ units = await client.get_all_units()
+ status = await client.get_unit_status("nl43-001")
+ await client.start_recording("nl43-001", config={...})
+ """
+
+ def __init__(self, base_url: str = SLMM_BASE_URL, timeout: float = 30.0):
+ self.base_url = base_url
+ self.api_base = f"{base_url}/api/nl43"
+ self.timeout = timeout
+
+ async def _request(
+ self,
+ method: str,
+ endpoint: str,
+ data: Optional[Dict] = None,
+ params: Optional[Dict] = None,
+ ) -> Dict[str, Any]:
+ """
+ Make an HTTP request to SLMM backend.
+
+ Args:
+ method: HTTP method (GET, POST, PUT, DELETE)
+ endpoint: API endpoint (e.g., "/units", "/{unit_id}/status")
+ data: JSON body for POST/PUT requests
+ params: Query parameters
+
+ Returns:
+ Response JSON as dict
+
+ Raises:
+ SLMMConnectionError: Cannot reach SLMM
+ SLMMDeviceError: Device operation failed
+ """
+ url = f"{self.api_base}{endpoint}"
+
+ try:
+ async with httpx.AsyncClient(timeout=self.timeout) as client:
+ response = await client.request(
+ method=method,
+ url=url,
+ json=data,
+ params=params,
+ )
+ response.raise_for_status()
+
+ # Handle empty responses
+ if not response.content:
+ return {}
+
+ return response.json()
+
+ except httpx.ConnectError as e:
+ raise SLMMConnectionError(
+ f"Cannot connect to SLMM backend at {self.base_url}. "
+ f"Is SLMM running? Error: {str(e)}"
+ )
+ except httpx.HTTPStatusError as e:
+ error_detail = "Unknown error"
+ try:
+ error_data = e.response.json()
+ error_detail = error_data.get("detail", str(error_data))
+            except Exception:
+ error_detail = e.response.text or str(e)
+
+ raise SLMMDeviceError(
+ f"SLMM operation failed: {error_detail}"
+ )
+ except Exception as e:
+ raise SLMMClientError(f"Unexpected error: {str(e)}")
+
+ # ========================================================================
+ # Unit Management
+ # ========================================================================
+
+ async def get_all_units(self) -> List[Dict[str, Any]]:
+ """
+ Get all configured SLM units from SLMM.
+
+ Returns:
+ List of unit dicts with id, config, and status
+ """
+ # SLMM doesn't have a /units endpoint yet, so we'll need to add this
+ # For now, return empty list or implement when SLMM endpoint is ready
+ try:
+ response = await self._request("GET", "/units")
+ return response.get("units", [])
+ except SLMMClientError:
+ # Endpoint may not exist yet
+ return []
+
+ async def get_unit_config(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Get unit configuration from SLMM cache.
+
+ Args:
+ unit_id: Unit identifier (e.g., "nl43-001")
+
+ Returns:
+ Config dict with host, tcp_port, ftp_port, etc.
+ """
+ return await self._request("GET", f"/{unit_id}/config")
+
+ async def update_unit_config(
+ self,
+ unit_id: str,
+ host: Optional[str] = None,
+ tcp_port: Optional[int] = None,
+ ftp_port: Optional[int] = None,
+ ftp_username: Optional[str] = None,
+ ftp_password: Optional[str] = None,
+ ) -> Dict[str, Any]:
+ """
+ Update unit configuration in SLMM cache.
+
+ Args:
+ unit_id: Unit identifier
+ host: Device IP address
+ tcp_port: TCP control port (default: 2255)
+ ftp_port: FTP data port (default: 21)
+ ftp_username: FTP username
+ ftp_password: FTP password
+
+ Returns:
+ Updated config
+ """
+ config = {}
+ if host is not None:
+ config["host"] = host
+ if tcp_port is not None:
+ config["tcp_port"] = tcp_port
+ if ftp_port is not None:
+ config["ftp_port"] = ftp_port
+ if ftp_username is not None:
+ config["ftp_username"] = ftp_username
+ if ftp_password is not None:
+ config["ftp_password"] = ftp_password
+
+ return await self._request("PUT", f"/{unit_id}/config", data=config)
+
+ # ========================================================================
+ # Status & Monitoring
+ # ========================================================================
+
+ async def get_unit_status(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Get cached status snapshot from SLMM.
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Status dict with measurement_state, lp, leq, battery, etc.
+ """
+ return await self._request("GET", f"/{unit_id}/status")
+
+ async def get_live_data(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Request fresh data from device (DOD command).
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Live data snapshot
+ """
+ return await self._request("GET", f"/{unit_id}/live")
+
+ # ========================================================================
+ # Recording Control
+ # ========================================================================
+
+ async def start_recording(
+ self,
+ unit_id: str,
+ config: Optional[Dict[str, Any]] = None,
+ ) -> Dict[str, Any]:
+ """
+ Start recording on a unit.
+
+ Args:
+ unit_id: Unit identifier
+ config: Optional recording config (interval, settings, etc.)
+
+ Returns:
+ Response from SLMM with success status
+ """
+ return await self._request("POST", f"/{unit_id}/start", data=config or {})
+
+ async def stop_recording(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Stop recording on a unit.
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Response from SLMM
+ """
+ return await self._request("POST", f"/{unit_id}/stop")
+
+ async def pause_recording(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Pause recording on a unit.
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Response from SLMM
+ """
+ return await self._request("POST", f"/{unit_id}/pause")
+
+ async def resume_recording(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Resume paused recording on a unit.
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Response from SLMM
+ """
+ return await self._request("POST", f"/{unit_id}/resume")
+
+ async def reset_data(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Reset measurement data on a unit.
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Response from SLMM
+ """
+ return await self._request("POST", f"/{unit_id}/reset")
+
+ # ========================================================================
+ # Device Settings
+ # ========================================================================
+
+ async def get_frequency_weighting(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Get frequency weighting setting (A, C, or Z).
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Dict with current weighting
+ """
+ return await self._request("GET", f"/{unit_id}/frequency-weighting")
+
+ async def set_frequency_weighting(
+ self,
+ unit_id: str,
+ weighting: str,
+ ) -> Dict[str, Any]:
+ """
+ Set frequency weighting (A, C, or Z).
+
+ Args:
+ unit_id: Unit identifier
+ weighting: "A", "C", or "Z"
+
+ Returns:
+ Confirmation response
+ """
+ return await self._request(
+ "PUT",
+ f"/{unit_id}/frequency-weighting",
+ data={"weighting": weighting},
+ )
+
+ async def get_time_weighting(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Get time weighting setting (F, S, or I).
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Dict with current time weighting
+ """
+ return await self._request("GET", f"/{unit_id}/time-weighting")
+
+ async def set_time_weighting(
+ self,
+ unit_id: str,
+ weighting: str,
+ ) -> Dict[str, Any]:
+ """
+ Set time weighting (F=Fast, S=Slow, I=Impulse).
+
+ Args:
+ unit_id: Unit identifier
+ weighting: "F", "S", or "I"
+
+ Returns:
+ Confirmation response
+ """
+ return await self._request(
+ "PUT",
+ f"/{unit_id}/time-weighting",
+ data={"weighting": weighting},
+ )
+
+ async def get_all_settings(self, unit_id: str) -> Dict[str, Any]:
+ """
+ Get all device settings.
+
+ Args:
+ unit_id: Unit identifier
+
+ Returns:
+ Dict with all settings
+ """
+ return await self._request("GET", f"/{unit_id}/settings")
+
+ # ========================================================================
+ # Data Download (Future)
+ # ========================================================================
+
+ async def download_files(
+ self,
+ unit_id: str,
+ destination_path: str,
+ files: Optional[List[str]] = None,
+ ) -> Dict[str, Any]:
+ """
+ Download files from unit via FTP.
+
+ NOTE: This endpoint doesn't exist in SLMM yet. Will need to implement.
+
+ Args:
+ unit_id: Unit identifier
+ destination_path: Local path to save files
+ files: List of filenames to download, or None for all
+
+ Returns:
+ Dict with downloaded files list and metadata
+ """
+ data = {
+ "destination_path": destination_path,
+ "files": files or "all",
+ }
+ return await self._request("POST", f"/{unit_id}/ftp/download", data=data)
+
+ # ========================================================================
+ # Health Check
+ # ========================================================================
+
+ async def health_check(self) -> bool:
+ """
+ Check if SLMM backend is reachable.
+
+ Returns:
+ True if SLMM is responding, False otherwise
+ """
+ try:
+ async with httpx.AsyncClient(timeout=5.0) as client:
+ response = await client.get(f"{self.base_url}/health")
+ return response.status_code == 200
+        except Exception:
+ return False
+
+
+# Singleton instance for convenience
+_default_client: Optional[SLMMClient] = None
+
+
+def get_slmm_client() -> SLMMClient:
+ """
+ Get the default SLMM client instance.
+
+ Returns:
+ SLMMClient instance
+ """
+ global _default_client
+ if _default_client is None:
+ _default_client = SLMMClient()
+ return _default_client
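+
+
+# Usage sketch (illustrative only; "nl43-001" is a placeholder unit id):
+#
+#   client = get_slmm_client()
+#   if await client.health_check():
+#       status = await client.get_unit_status("nl43-001")
+#       await client.set_frequency_weighting("nl43-001", "A")
+#       await client.start_recording("nl43-001", config={})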
diff --git a/docker-compose.yml b/docker-compose.yml
index ddb9e1d..bda2c14 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -47,13 +47,13 @@ services:
# --- SLMM (Sound Level Meter Manager) ---
slmm:
build:
- context: ../../slmm
+ context: ../slmm
dockerfile: Dockerfile
container_name: slmm
ports:
- "8100:8100"
volumes:
- - ../../slmm/data:/app/data
+ - ../slmm/data:/app/data
environment:
- PYTHONUNBUFFERED=1
- PORT=8100
diff --git a/templates/base.html b/templates/base.html
index 645646d..670b9b7 100644
--- a/templates/base.html
+++ b/templates/base.html
@@ -127,7 +127,7 @@
             Sound Level Meters
             [one-line change to the adjacent nav link; markup not recovered]
diff --git a/templates/partials/projects/project_dashboard.html b/templates/partials/projects/project_dashboard.html
new file mode 100644
index 0000000..9fbe084
--- /dev/null
+++ b/templates/partials/projects/project_dashboard.html
@@ -0,0 +1,108 @@
+{# project_dashboard.html and related Projects/SLM partials (markup not recovered):
+   - Dashboard header: project type badge (or "Project" fallback), project name,
+     description, client name
+   - Stat cards: Locations, Assigned Units, Active Sessions, Completed Sessions
+   - Locations list: name, address, coordinates; empty state "No locations added yet."
+   - Assigned Units list: unit id, model, address; empty state "No units assigned yet."
+   - Upcoming Actions list: action type, scheduled time; empty state "No scheduled actions."
+   - Projects overview cards: Total Projects, Active Projects, Total Locations,
+     Active Sessions; empty states "No projects found" / "No active sound monitoring projects"
+   - SLM device list: unit address or location; empty state "No sound level meters found",
+     "Add units from the Fleet Roster"
+   - Page subtitles for the Projects pages and the SLM live view ("No unit selected")
+#}