# Compare commits

**30 commits** (`142998251c...v1.4.2`):

`0bea6ca4ea`, `d2a8c2d928`, `3303e22843`, `2456fd0ee8`, `d2fd3b7182`, `1d94c5dd04`, `814b6f915e`, `9cfdebe553`, `f773e1dac9`, `326658ed26`, `504ee1d470`, `e67b6eb89f`, `1b8c63025f`, `0807e09047`, `00956c022a`, `9b20d93f4c`, `c133932b29`, `d404bf6542`, `0d5fa7677f`, `44476248c3`, `fa56b93c8f`, `58ba506f54`, `62a4ca2b1c`, `f29943f8e4`, `35e3292f01`, `73204ee92e`, `47718e7cad`, `9074277ff3`, `551fdae106`, `a03d4a1f05`
**.gitignore** (vendored, 11 lines changed)

```
@@ -1,3 +1,8 @@
# --------------------------
# s3-agent files
# --------------------------
config.ini

# -------------------------
# Python ignores
# -------------------------
@@ -17,13 +22,15 @@ env/
# Distribution / packaging
build/
dist/
Output/
*.egg-info/
*.spec

# -------------------------
# Logs + runtime artifacts
# -------------------------
# keep the log folders but ignore their contents
emitter_logs/*
!emitter_logs/.gitkeep
agent_logs/*
!agent_logs/.gitkeep

*.log
```

Note: `.gitignore` does not support trailing inline comments — a line like `!agent_logs/.gitkeep # keep the folder` is treated as one literal pattern and the negation silently fails, so the comments are moved to their own line.
**BUILDING.md** (new file, 117 lines)

# Building & Releasing Series 3 Watcher

## Prerequisites (Win7 VM — do this once)

- Python 3.7.2 (or 3.8.10 if SP1 is installed)
- Inno Setup 6 — installed at `C:\Program Files (x86)\Inno Setup 6\`
- PyInstaller, pystray, Pillow — installed automatically by `build.bat`

The Win7 VM is the build machine. All builds must happen there to ensure compatibility with the production DL2 computer.

---

## First-Time Install on a New Machine

Do this when setting up a brand-new machine that has never had the watcher before.

**Step 1 — Build the .exe (on the Win7 VM)**

1. Copy the `series3-watcher/` folder to the VM (shared folder, USB, etc.)
2. Double-click `build.bat`
3. Wait for it to finish — output: `dist\series3-watcher.exe`

**Step 2 — Build the installer (on the Win7 VM)**

1. Open `installer.iss` in Inno Setup Compiler
2. Click **Build → Compile**
3. Output: `Output\series3-watcher-setup.exe`

**Step 3 — Create a Gitea release**

1. On your main machine, go to `https://gitea.serversdown.net/serversdown/series3-watcher`
2. Click **Releases → New Release**
3. Set the tag to match the version in `series3_watcher.py` (e.g. `v1.4.1`)
4. Upload **both** files as release assets:
   - `dist\series3-watcher.exe` — used by the auto-updater on existing installs
   - `Output\series3-watcher-setup.exe` — used for fresh installs

**Step 4 — Install on the target machine**

1. Download `series3-watcher-setup.exe` from the Gitea release
2. Run it on the target machine — installs to `C:\Program Files\Series3Watcher\`
3. The watcher launches automatically after install (or on next login)
4. The Setup Wizard appears on first run — fill in the Terra-View URL and Blastware path

---

## Releasing an Update (existing machines auto-update)

Do this for any code change — bug fix, new feature, etc.

**Step 1 — Bump the version**

In `series3_watcher.py`, update the `VERSION` string:

```python
VERSION = "1.4.2"  # increment appropriately
```

Also update `installer.iss`:

```
AppVersion=1.4.2
```
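These two version strings are maintained by hand and must stay in sync. A small pre-build check could catch drift; the sketch below is a hypothetical helper (not part of the repo), and assumes the file layouts shown above:

```python
import re

# Hypothetical pre-build helper: read both hand-maintained version strings
# so a build script can refuse to run when they disagree.
def read_versions(watcher_path="series3_watcher.py", iss_path="installer.iss"):
    with open(watcher_path) as f:
        watcher = re.search(r'VERSION\s*=\s*"([^"]+)"', f.read()).group(1)
    with open(iss_path) as f:
        iss = re.search(r'AppVersion=(\S+)', f.read()).group(1)
    return watcher, iss
```

A build wrapper could then assert `read_versions()` returns a matching pair before invoking PyInstaller.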
**Step 2 — Build the .exe (on the Win7 VM)**

1. Pull the latest code to the VM
2. Double-click `build.bat`
3. Output: `dist\series3-watcher.exe`

> For hotfixes you can skip Inno Setup — existing machines only need the `.exe`.
> Only rebuild the installer if you need a fresh install package for a new machine.

**Step 3 — Create a Gitea release**

1. Go to `https://gitea.serversdown.net/serversdown/series3-watcher`
2. Click **Releases → New Release**
3. The tag must match the new version exactly (e.g. `v1.4.2`) — the auto-updater compares this tag against its own version to decide whether to update
4. Upload `dist\series3-watcher.exe` as a release asset
5. Optionally upload `Output\series3-watcher-setup.exe` if you rebuilt the installer

**Step 4 — Done**

Existing installs check Gitea every ~5 minutes. When they see the new tag they download `series3-watcher.exe`, swap it in place, and relaunch silently. No user action is required on the target machine.
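The tag comparison the updater performs can be illustrated with a short sketch. This is illustrative only, not the watcher's actual code; it assumes tags follow the `vMAJOR.MINOR.PATCH` form used in this guide:

```python
def is_newer(tag, current):
    """True if a release tag like 'v1.4.2' is newer than the running
    VERSION string. Numeric tuple comparison, so 'v1.10.0' > 'v1.9.9'."""
    parse = lambda s: tuple(int(p) for p in s.lstrip("v").split("."))
    return parse(tag) > parse(current)
```

Comparing numeric tuples rather than raw strings is what makes a tag mismatch (`v1.4.2` vs `1.4.1`) a reliable update trigger.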
---

## Version Numbering

Follows Semantic Versioning: `MAJOR.MINOR.PATCH`

| Change type | Example |
|-------------|---------|
| Bug fix / text change | `1.4.1 → 1.4.2` |
| New feature | `1.4.x → 1.5.0` |
| Breaking change | `1.x.x → 2.0.0` |

---

## Files That Go in the Gitea Release

| File | Required for | Notes |
|------|-------------|-------|
| `dist\series3-watcher.exe` | Auto-updates on existing machines | Always upload this |
| `Output\series3-watcher-setup.exe` | Fresh installs on new machines | Only needed for new deployments |

---

## Files That Are NOT Committed to Git

- `dist/` — PyInstaller output
- `Output/` — Inno Setup output
- `build/` — PyInstaller temp files
- `*.spec` — PyInstaller spec file
- `config.ini` — machine-specific, never commit
- `agent_logs/` — log files
**CHANGELOG.md** (111 lines changed)

# Changelog

All notable changes to **Series 3 Watcher** will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

---

## [1.4.2] - 2026-03-17

### Changed
- Tray icon color now reflects watcher + API health rather than unit ages — green = API OK, amber = API disabled, red = API failing, purple = watcher error.
- Status menu text updated to show `Running — API OK | N unit(s) | scan Xm ago`.
- Units submenu removed from the tray — status tracking for individual units is handled by terra-view, not the watcher.
- Unit list is still logged to the console and log file for debugging, but no OK/Pending/Missing judgement is applied.
- `watcher_status` field added to the heartbeat payload so terra-view receives accurate watcher health data.

## [1.4.1] - 2026-03-17

### Fixed
- `config.ini` now saves to `AppData\Local\Series3Watcher\` instead of `Program Files` — fixes the permission-denied error on first-run wizard save.
- Config path resolution in both `series3_tray.py` and `series3_watcher.py` updated to use `sys.frozen` + `LOCALAPPDATA` when running as a PyInstaller `.exe`.
- Status menu item now uses a callable so it updates every time the menu opens — it previously showed a stale "Starting..." while the tooltip correctly showed current status.
- Settings dialog now opens in its own thread — fixes unresponsive tabs and text fields while the watcher loop is running.
- Tray icon reverted to a plain colored dot — the custom icon graphic was unreadable at 16 px tray size. The `.ico` file is still used for the `.exe` file icon.
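The `sys.frozen` + `LOCALAPPDATA` resolution described in the fix above amounts to something like the following sketch. It is illustrative, not the repo's exact code, and the source-tree fallback branch is an assumption:

```python
import os
import sys

def config_dir():
    """Directory for config.ini: AppData when frozen, script dir otherwise."""
    if getattr(sys, "frozen", False):  # set by PyInstaller in a bundled .exe
        base = os.environ.get("LOCALAPPDATA", os.path.expanduser("~"))
        return os.path.join(base, "Series3Watcher")
    # Running from source: keep config.ini next to the script (assumed fallback).
    return os.path.dirname(os.path.abspath(__file__))
```

Writing under `%LOCALAPPDATA%` avoids the `Program Files` permission error because that tree is user-writable without elevation.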
### Changed
- Terra-View URL field in the settings wizard now accepts the base URL only (e.g. `http://192.168.x.x:8000`) — the `/api/series3/heartbeat` endpoint is appended automatically.
- Test Connection button now hits the `/health` endpoint instead of posting a fake heartbeat — no database side effects.
- "terra-view URL" label capitalized to "Terra-View URL".
- Default log path updated to `AppData\Local\Series3Watcher\agent_logs\series3_watcher.log`.
- Installer now creates the `agent_logs\` folder on install.
- `BUILDING.md` added — a step-by-step guide for building, releasing, and updating.

## [1.4.0] - 2026-03-12

### Added
- `series3_tray.py` — system tray launcher using `pystray` + `Pillow`. Color-coded icon (green=OK, amber=Pending, red=Missing, purple=Error, grey=Starting). The right-click menu shows live status, unit count, last scan age, Open Log Folder, and Exit.
- `run_watcher(state, stop_event)` in `series3_watcher.py` for background-thread use by the tray. A shared `state` dict is updated on every scan cycle with status, unit list, last scan time, and last error.
- Interruptible sleep in the watcher loop — tray exit is immediate, with no waiting out the full scan interval.

### Changed
- `main()` now calls `run_watcher()` — standalone behavior unchanged.
- `requirements.txt` updated to document the tray dependencies (`pystray`, `Pillow`); the watcher itself remains stdlib-only.

---

## [1.3.0] - 2026-03-12

### Changed
- Renamed the program to "series3-watcher" and the main script to `series3_watcher.py` — this better reflects what it does (watches for activity) rather than implying active data emission.
- Default `SOURCE_TYPE` updated to `series3_watcher`.
- Default log filename updated to `series3_watcher.log`.

---

## [1.2.1] - 2026-03-03

### Changed
- Renamed the program to "series3-agent" to align with the s4/thor agent and to better represent the program's functionality.
- All instances of "emitter" changed to "agent".
- `config.ini` added to `.gitignore` and replaced with a template example file.
- `README.md` updated to reflect the changes.

---

## [1.2.0] - 2025-12-04

### Changed
- Removed the roster CSV dependency and all Dropbox refresh/hot-reload logic; the heartbeat now only enumerates `.MLG` files.
- Added a `MAX_EVENT_AGE_DAYS` filter to ignore stale events and log when no recent activity exists.
- Simplified heartbeat output/logging to show detected units only; logging hardened so it can never crash the agent.

---

## [1.1.1] - 2025-12-02

### Added
- Example `config.ini` now ships with API heartbeat settings enabled (`API_ENABLED`, `API_URL`, `API_INTERVAL_SECONDS`, `SOURCE_ID`, `SOURCE_TYPE`).

---

## [1.1.0] - 2025-12-01

### Added
- Standardized SFM telemetry payload builder and periodic HTTP heartbeat POST via `urllib`.
- Config support for the API heartbeat (`API_ENABLED`, `API_URL`, `API_INTERVAL_SECONDS`, `SOURCE_ID`, `SOURCE_TYPE`); the payload includes file path/size metadata.

### Changed
- Refactored the scanner to retain file paths and a header sniff cache; reformatted logging/ANSI handling.

---

## [1.0.1] - 2025-11-20

### Added
- `API_URL` config key and `report_to_server` per-unit POST hook (adds a `requests` dependency).

### Changed
- Example `config.ini` roster URL updated; merged into `main`.

---

## [1.0.0] - 2025-11-17

### Added
- **Automatic roster refresh** from Dropbox at a configurable interval (`ROSTER_REFRESH_MIN_SECONDS`).
- **Hot-reload** of the roster file without restarting the script.
- **Failsafe reload:** if the new roster is missing or invalid, the previous good roster is retained.
- **Atomic roster downloads** (write to a temp file, then replace in place) to avoid partial/corrupted CSVs.
- **Startup config echo** printing `WATCH_PATH`, `ROSTER_FILE`, and `ROSTER_URL` status.
- **Active / Bench / Ignored** unit categories for clearer fleet status mapping.

### Fixed
- Removed a stray `note=note_suffix` bug in the Unexpected Units section.
- Removed a duplicate `import time`.
- Removed a duplicate roster load during startup (the roster now loads once).
- Cleaned indentation for Python 3.8 compatibility.

### Changed
- Reset versioning from legacy `v5.9 beta` to **v1.0.0** (clean semver baseline).
- Main script normalized as `series3_emitter.py` (later renamed to `series3_agent.py` in v1.2.1).

---

[Unreleased]: https://example.com/compare/v1.2.0...HEAD
[1.2.0]: https://example.com/releases/v1.2.0
[1.1.1]: https://example.com/releases/v1.1.1
[1.1.0]: https://example.com/releases/v1.1.0
[1.0.1]: https://example.com/releases/v1.0.1
[1.0.0]: https://example.com/releases/v1.0.0
**README.md** (169 lines changed)

# Series 3 Watcher v1.4.2

Monitors Instantel **Series 3 (Minimate)** call-in activity on a Blastware server. Runs as a **system tray app** that starts automatically on login, reports heartbeats to terra-view, and self-updates from Gitea.

---

## Deployment (Recommended — Installer)

The easiest way to deploy to a field machine is the pre-built Windows installer.

1. Download `series3-watcher-setup.exe` from the [latest release](https://gitea.serversdown.net/serversdown/series3-watcher/releases) on Gitea.
2. Run the installer on the target machine. It installs to `C:\Program Files\Series3Watcher\` and adds a shortcut to the user's Startup folder.
3. On first launch the **Setup Wizard** opens automatically — fill in the terra-view URL and Blastware path, then click **Save & Start**.
4. A coloured dot appears in the system tray. Done.

The watcher will auto-start on every login from that point on.

### Auto-Updates

The watcher checks [Gitea](https://gitea.serversdown.net/serversdown/series3-watcher) for a newer release approximately every 5 minutes. When a newer `.exe` is found it downloads it silently, swaps the file, and relaunches — no user action required.

Updates can also be pushed remotely from terra-view → **Settings → Developer → Watcher Manager**.

---

## Building & Releasing

See [BUILDING.md](BUILDING.md) for the full step-by-step process covering:

- First-time build and installer creation
- Publishing a release to Gitea
- Releasing hotfix updates (the auto-updater picks them up automatically)

---

## Running Without the Installer (Dev / Debug)

```
pip install -r requirements.txt
python series3_tray.py      # tray app (recommended)
python series3_watcher.py   # console-only, no tray
```

`config.ini` must exist in the same directory. Copy `config-template.ini` to `config.ini` and edit it, or just run `series3_tray.py` — the wizard will create it on first run.

---

## Configuration

All settings live in `config.ini`. The Setup Wizard covers every field, but here's the reference:

### API / terra-view

| Key | Description |
|-----|-------------|
| `API_ENABLED` | `true` to send heartbeats to terra-view |
| `API_URL` | Terra-View base URL, e.g. `http://192.168.1.10:8000` — the `/api/series3/heartbeat` endpoint is appended automatically |
| `API_INTERVAL_SECONDS` | How often to POST (default `300`) |
| `SOURCE_ID` | Identifier for this machine (defaults to hostname) |
| `SOURCE_TYPE` | Always `series3_watcher` |

### Paths

| Key | Description |
|-----|-------------|
| `SERIES3_PATH` | Blastware autocall folder, e.g. `C:\Blastware 10\Event\autocall home` |
| `MAX_EVENT_AGE_DAYS` | Ignore `.MLG` files older than this (default `365`) |
| `LOG_FILE` | Path to the log file |

### Scanning

| Key | Description |
|-----|-------------|
| `SCAN_INTERVAL_SECONDS` | How often to scan the folder (default `300`) |
| `OK_HOURS` | Age threshold for OK status (default `12`) |
| `MISSING_HOURS` | Age threshold for Missing status (default `24`) |
| `MLG_HEADER_BYTES` | Bytes to read from each `.MLG` header for the unit ID (default `2048`) |
| `RECENT_WARN_DAYS` | Log unsniffable files newer than this window |
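The way the two age thresholds combine can be sketched as follows. This is a minimal illustration of the OK / Pending / Missing bands, not the watcher's exact code, and the boundary handling (inclusive `<=`) is an assumption:

```python
OK_HOURS = 12       # default from the table above
MISSING_HOURS = 24  # default from the table above

def classify(age_hours):
    """Map a unit's time since last call-in to its status band."""
    if age_hours <= OK_HOURS:
        return "OK"
    if age_hours <= MISSING_HOURS:
        return "Pending"
    return "Missing"
```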
### Logging

| Key | Description |
|-----|-------------|
| `ENABLE_LOGGING` | `true` / `false` |
| `LOG_RETENTION_DAYS` | Auto-clear the log after this many days (default `30`) |
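Putting the reference tables together, a minimal `config.ini` might look like the fragment below. All values are placeholders chosen for illustration:

```ini
[agent]
; API / terra-view
API_ENABLED = true
API_URL = http://192.168.1.10:8000
API_INTERVAL_SECONDS = 300
SOURCE_ID = DL2
SOURCE_TYPE = series3_watcher

; Paths
SERIES3_PATH = C:\Blastware 10\Event\autocall home
MAX_EVENT_AGE_DAYS = 365

; Scanning
SCAN_INTERVAL_SECONDS = 300
OK_HOURS = 12
MISSING_HOURS = 24

; Logging
ENABLE_LOGGING = true
LOG_RETENTION_DAYS = 30
```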
---

## Tray Icon

| Colour | Meaning |
|--------|---------|
| Grey | Starting / no scan yet |
| Green | All detected units OK |
| Yellow | At least one unit Pending |
| Red | At least one unit Missing, or error |

Right-click the icon for: status, per-unit list, Settings, Open Log Folder, Exit.
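Drawing the coloured dot is straightforward with the same `Pillow` dependency the tray already uses. This is an illustrative sketch only; the status names, RGB values, and `make_dot` helper are assumptions, and the real drawing code lives in `series3_tray.py`:

```python
# Assumed mapping from the table above to RGB fill colours.
STATUS_COLORS = {
    "starting": (128, 128, 128),  # grey
    "ok":       (0, 170, 0),      # green
    "pending":  (230, 180, 0),    # yellow
    "missing":  (200, 0, 0),      # red
}

def make_dot(status, size=16):
    """Return a solid-colour square image suitable for a 16 px tray icon."""
    from PIL import Image  # Pillow, listed in requirements.txt for tray mode
    return Image.new("RGB", (size, size),
                     STATUS_COLORS.get(status, STATUS_COLORS["starting"]))
```

An image like this is what gets handed to `pystray.Icon` as its `icon` argument.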
---

## terra-view Integration

When `API_ENABLED = true`, the watcher POSTs a telemetry payload to terra-view on each heartbeat interval. terra-view updates the emitter table and tracks the watcher process itself (version, last seen, log tail) in the Watcher Manager.

To view connected watchers: **Settings → Developer → Watcher Manager**.
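The heartbeat mechanism can be sketched with stdlib `urllib`, which is what the watcher uses. The helper names here are illustrative; only the base-URL-plus-`/api/series3/heartbeat` behaviour is documented above:

```python
import json
import urllib.request

def heartbeat_url(base_url):
    # The endpoint is appended automatically to the configured base URL.
    return base_url.rstrip("/") + "/api/series3/heartbeat"

def post_heartbeat(base_url, payload, timeout=10):
    """POST a JSON heartbeat payload and return the HTTP status code."""
    req = urllib.request.Request(
        heartbeat_url(base_url),
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status
```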
---

## Requirements

- Windows 7 or later
- Python 3.8 (only needed if running from source — not needed with the installer)
- Blastware 10 event folder accessible on the local machine

---

## Versioning

Follows **Semantic Versioning**. Current release: **v1.4.2**.
See `CHANGELOG.md` for full history.

---

## License

Private / internal — Terra-Mechanics Inc.
**(removed file, 26 lines)** — `@@ -1,26 +0,0 @@`

# Series 3 Emitter — v1_0 (py38-safe) for DL2

**Target**: Windows 7 + Python 3.8.10
**Baseline**: v5_4 (no logic changes)

## Files
- `series3_emitter_v1_0_py38.py` — main script (py38-safe)
- `config.ini` — your config (already included)
- `series3_roster.csv` — your roster (already included; auto-updates from a Dropbox URL)
- `requirements.txt` — none beyond stdlib

## Install
1. Create `C:\SeismoEmitter\` on DL2
2. Extract this ZIP into that folder
3. Open CMD:

```cmd
cd C:\SeismoEmitter
python series3_emitter_v1_0_py38.py
```

(If the console shows escape codes on Win7, set `COLORIZE = False` in `config.ini`.)

## Quick validation
- Heartbeat prints Local/UTC timestamps
- One line per active roster unit with OK/Pending/Missing, Age, Last, File
- Unexpected-units block shows `.MLG` files not in the roster
- `emitter.log` rotates per `LOG_RETENTION_DAYS`
**build.bat** (new file, 20 lines)

```bat
@echo off
echo Building series3-watcher.exe...
pip install pyinstaller pystray Pillow

REM Check whether icon.ico exists alongside this script.
REM If it does, embed it as the .exe icon AND bundle it as a data file
REM so the tray overlay can load it at runtime.
if exist "%~dp0icon.ico" (
    pyinstaller --onefile --windowed --name series3-watcher ^
        --icon="%~dp0icon.ico" ^
        --add-data "%~dp0icon.ico;." ^
        series3_tray.py
) else (
    echo [INFO] icon.ico not found -- building without custom icon.
    pyinstaller --onefile --windowed --name series3-watcher series3_tray.py
)

echo.
echo Done. Check dist\series3-watcher.exe
pause
```
**config.ini** (template)

```ini
@@ -1,9 +1,15 @@
[agent]

; --- API Heartbeat Settings ---
API_ENABLED = true
API_URL =
API_INTERVAL_SECONDS = 300
SOURCE_ID =             ; computer that is running the agent
SOURCE_TYPE = series3_watcher

; Paths
SERIES3_PATH = C:\Blastware 10\Event\autocall home
ROSTER_FILE = C:\SeismoEmitter\series3_roster.csv
ROSTER_URL = https://www.dropbox.com/URL
ROSTER_REFRESH_MIN_SECONDS = 0
MAX_EVENT_AGE_DAYS = 365

; Scanning
@@ -13,12 +19,9 @@ MISSING_HOURS = 24

; Logging
ENABLE_LOGGING = True
LOG_FILE = C:\Users\%USERNAME%\AppData\Local\Series3Watcher\agent_logs\series3_watcher.log
LOG_RETENTION_DAYS = 30

; Console colors
COLORIZE = FALSE

; .MLG parsing
MLG_HEADER_BYTES = 2048 ; used for unit-id extraction
```
**installer.iss** (new file, 41 lines)

```ini
; Inno Setup script for Series 3 Watcher
; Run through Inno Setup Compiler after building dist\series3-watcher.exe

[Setup]
AppName=Series 3 Watcher
AppVersion=1.4.2
AppPublisher=Terra-Mechanics Inc.
DefaultDirName={pf}\Series3Watcher
DefaultGroupName=Series 3 Watcher
OutputBaseFilename=series3-watcher-setup
Compression=lzma
SolidCompression=yes
; Require admin rights so we can write to Program Files
PrivilegesRequired=admin

[Tasks]
Name: "desktopicon"; Description: "Create a &desktop icon"; GroupDescription: "Additional icons:"; Flags: unchecked

[Dirs]
; Create the agent_logs folder so the watcher can write logs on first run
Name: "{app}\agent_logs"

[Files]
; Main executable — built by build.bat / PyInstaller
Source: "dist\series3-watcher.exe"; DestDir: "{app}"; Flags: ignoreversion

[Icons]
; Start Menu shortcut
Name: "{group}\Series 3 Watcher"; Filename: "{app}\series3-watcher.exe"
; Start Menu uninstall shortcut
Name: "{group}\Uninstall Series 3 Watcher"; Filename: "{uninstallexe}"
; Desktop shortcut (optional — controlled by [Tasks] above)
Name: "{commondesktop}\Series 3 Watcher"; Filename: "{app}\series3-watcher.exe"; Tasks: desktopicon
; Startup folder shortcut so the tray app launches on login
Name: "{userstartup}\Series 3 Watcher"; Filename: "{app}\series3-watcher.exe"

[Run]
; Offer to launch the app after install (unchecked by default)
Filename: "{app}\series3-watcher.exe"; \
  Description: "Launch Series 3 Watcher"; \
  Flags: nowait postinstall skipifsilent unchecked
```
**requirements.txt** — `@@ -1 +1,5 @@`

```
# series3_watcher.py — stdlib only, no external packages required.

# series3_tray.py — required for system tray mode:
pystray>=0.19.5
Pillow>=9.0.0
```
@@ -1,459 +0,0 @@
|
||||
"""
|
||||
Series 3 Emitter — v1.0.0 (Stable Baseline, SemVer Reset)
|
||||
|
||||
Environment:
|
||||
- Python 3.8 (Windows 7 compatible)
|
||||
- Runs on DL2 with Blastware 10 event path
|
||||
|
||||
Key Features:
|
||||
- Atomic roster downloads from Dropbox (no partial files)
|
||||
- Automatic roster refresh from Dropbox at configurable interval
|
||||
- Automatic hot-reload into memory when roster CSV changes
|
||||
- Failsafe reload: keeps previous roster if new file is invalid or empty
|
||||
- Config-driven paths, intervals, and logging
|
||||
- Compact console heartbeat with status per unit
|
||||
- Logging with retention auto-clean (days configurable)
|
||||
- Safe .MLG header sniff for unit IDs (BE#### / BA####)
|
||||
|
||||
Changelog:
|
||||
- Reset to semantic versioning (from legacy v5.9 beta)
|
||||
- Fixed stray `note=note_suffix` bug in Unexpected Units block
|
||||
- Removed duplicate imports and redundant roster load at startup
|
||||
- Added startup config echo (paths + URL status)
|
||||
"""
|
||||
|
||||
import os
|
||||
import re
|
||||
import csv
|
||||
import time
|
||||
import configparser
|
||||
import urllib.request
|
||||
from datetime import datetime, timezone, timedelta
|
||||
from typing import Dict, Any, Optional, Tuple, Set, List
|
||||
|
||||
# ---------------- Config ----------------
|
||||
def load_config(path: str) -> Dict[str, Any]:
|
||||
"""Load INI with tolerant inline comments and a required [emitter] section."""
|
||||
cp = configparser.ConfigParser(inline_comment_prefixes=(';', '#'))
|
||||
cp.optionxform = str # preserve key case
|
||||
with open(path, "r", encoding="utf-8") as f:
|
||||
txt = f.read()
|
||||
# Ensure we have a section header
|
||||
if not re.search(r'^\s*\[', txt, flags=re.M):
|
||||
txt = "[emitter]\n" + txt
|
||||
cp.read_string(txt)
|
||||
sec = cp["emitter"]
|
||||
|
||||
def get_str(k: str, dflt: str) -> str:
|
||||
return sec.get(k, dflt).strip()
|
||||
|
||||
def get_int(k: str, dflt: int) -> int:
|
||||
try:
|
||||
return int(sec.get(k, str(dflt)).strip())
|
||||
except Exception:
|
||||
return dflt
|
||||
|
||||
def get_bool(k: str, dflt: bool) -> bool:
|
||||
v = sec.get(k, None)
|
||||
if v is None:
|
||||
return dflt
|
||||
return v.strip().lower() in ("1","true","on","yes","y")
|
||||
|
||||
return {
|
||||
"WATCH_PATH": get_str("SERIES3_PATH", r"C:\Blastware 10\Event\autocall home"),
|
||||
"ROSTER_FILE": get_str("ROSTER_FILE", r"C:\SeismoEmitter\series3_roster.csv"),
|
||||
"ROSTER_URL": get_str("ROSTER_URL", ""),
|
||||
"ROSTER_REFRESH_MIN_SECONDS": get_int("ROSTER_REFRESH_MIN_SECONDS", 300),
|
||||
"SCAN_INTERVAL": get_int("SCAN_INTERVAL_SECONDS", 300),
|
||||
"OK_HOURS": float(get_int("OK_HOURS", 12)),
|
||||
"MISSING_HOURS": float(get_int("MISSING_HOURS", 24)),
|
||||
"ENABLE_LOGGING": get_bool("ENABLE_LOGGING", True),
|
||||
"LOG_FILE": get_str("LOG_FILE", r"C:\SeismoEmitter\emitter_logs\series3_emitter.log"),
|
||||
"LOG_RETENTION_DAYS": get_int("LOG_RETENTION_DAYS", 30),
|
||||
"COLORIZE": get_bool("COLORIZE", False), # Win7 default off
|
||||
"MLG_HEADER_BYTES": max(256, min(get_int("MLG_HEADER_BYTES", 2048), 65536)),
|
||||
"RECENT_WARN_DAYS": get_int("RECENT_WARN_DAYS", 30),
|
||||
}
|
||||
|
||||
# --------------- ANSI helpers ---------------
|
||||
def ansi(enabled: bool, code: str) -> str:
|
||||
return code if enabled else ""
|
||||
|
||||
# --------------- Logging --------------------
|
||||
def log_message(path: str, enabled: bool, msg: str) -> None:
|
||||
if not enabled:
|
||||
return
|
||||
try:
|
||||
d = os.path.dirname(path) or "."
|
||||
if not os.path.exists(d):
|
||||
os.makedirs(d)
|
||||
with open(path, "a", encoding="utf-8") as f:
|
||||
f.write("{} {}\n".format(datetime.now(timezone.utc).isoformat(), msg))
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def clear_logs_if_needed(log_file: str, enabled: bool, retention_days: int) -> None:
|
||||
if not enabled or retention_days <= 0:
|
||||
return
|
||||
stamp_file = os.path.join(os.path.dirname(log_file) or ".", "last_clean.txt")
|
||||
now = datetime.now(timezone.utc)
|
||||
last = None
|
||||
try:
|
||||
if os.path.exists(stamp_file):
|
||||
with open(stamp_file, "r", encoding="utf-8") as f:
|
||||
last = datetime.fromisoformat(f.read().strip())
|
||||
except Exception:
|
||||
last = None
|
||||
if (last is None) or (now - last > timedelta(days=retention_days)):
|
||||
try:
|
||||
if os.path.exists(log_file):
|
||||
open(log_file, "w", encoding="utf-8").close()
|
||||
with open(stamp_file, "w", encoding="utf-8") as f:
|
||||
f.write(now.isoformat())
|
||||
print("Log cleared on {}".format(now.astimezone().strftime("%Y-%m-%d %H:%M:%S")))
|
||||
log_message(log_file, enabled, "Logs auto-cleared")
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# --------------- Roster ---------------------
|
||||
def normalize_id(uid: str) -> str:
|
||||
if uid is None:
|
||||
return ""
|
||||
return uid.replace(" ", "").strip().upper()
|
||||
|
||||
def load_roster(path: str) -> Tuple[Set[str], Set[str], Set[str], Dict[str, str]]:
    """CSV tolerant of commas in notes: device_id, active, notes...

    Returns: active, bench, ignored, notes_by_unit
    """
    active: Set[str] = set()
    bench: Set[str] = set()
    ignored: Set[str] = set()
    notes_by_unit: Dict[str, str] = {}

    if not os.path.exists(path):
        print("[WARN] Roster not found:", path)
        return active, bench, ignored, notes_by_unit
    try:
        with open(path, "r", encoding="utf-8-sig", newline="") as f:
            rdr = csv.reader(f)
            try:
                headers = next(rdr)
            except StopIteration:
                return active, bench, ignored, notes_by_unit
            headers = [(h or "").strip().lower() for h in headers]

            def idx_of(name: str, fallbacks: List[str]) -> Optional[int]:
                if name in headers:
                    return headers.index(name)
                for fb in fallbacks:
                    if fb in headers:
                        return headers.index(fb)
                return None

            i_id = idx_of("device_id", ["unitid", "id"])
            i_ac = idx_of("active", [])
            i_no = idx_of("notes", ["note", "location"])
            if i_id is None or i_ac is None:
                print("[WARN] Roster missing device_id/active columns")
                return active, bench, ignored, notes_by_unit
            for row in rdr:
                if len(row) <= max(i_id, i_ac):
                    continue
                uid = normalize_id(row[i_id])
                if not uid:
                    continue
                if i_no is not None:
                    # Notes may contain commas; rejoin the trailing columns.
                    extra = row[i_no:]
                    note = ",".join([c or "" for c in extra]).strip().rstrip(",")
                    notes_by_unit[uid] = note
                flag = (row[i_ac] or "").strip().lower()
                if flag in ("yes", "y", "true", "1", "on"):
                    active.add(uid)
                elif flag in ("no", "n", "off", "0"):
                    bench.add(uid)
                elif flag in ("ignore", "retired", "old"):
                    ignored.add(uid)

    except Exception as e:
        print("[WARN] Roster read error:", e)
    return active, bench, ignored, notes_by_unit

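The flag values accepted by `load_roster` map each row to one of three buckets (active, bench, ignored). A standalone sketch of that classification (the `classify` helper is illustrative only):

```python
def classify(flag):
    """Bucket a roster 'active' column value the way load_roster does."""
    flag = (flag or "").strip().lower()
    if flag in ("yes", "y", "true", "1", "on"):
        return "active"
    if flag in ("no", "n", "off", "0"):
        return "bench"
    if flag in ("ignore", "retired", "old"):
        return "ignored"
    return "unknown"

print(classify("Yes"))      # active
print(classify(" 0 "))      # bench
print(classify("retired"))  # ignored
```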
# --------------- .MLG sniff ------------------
UNIT_BYTES_RE = re.compile(rb"(?:^|[^A-Z])(BE|BA)\d{4,5}(?:[^0-9]|$)")


def sniff_unit_from_mlg(path: str, header_bytes: int) -> Optional[str]:
    """Return BE####/BA#### from header bytes, or None."""
    try:
        with open(path, "rb") as f:
            chunk = f.read(max(256, min(header_bytes, 65536)))
        m = UNIT_BYTES_RE.search(chunk)
        if not m:
            return None
        raw = m.group(0)
        cleaned = re.sub(rb"[^A-Z0-9]", b"", raw)
        try:
            return cleaned.decode("ascii").upper()
        except Exception:
            return None
    except Exception:
        return None

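To see what `UNIT_BYTES_RE` actually matches, here is a small demo on a synthetic header (the byte string is made up; real .MLG headers differ):

```python
import re

UNIT_BYTES_RE = re.compile(rb"(?:^|[^A-Z])(BE|BA)\d{4,5}(?:[^0-9]|$)")

header = b"\x00\x01 unit=BE12345;rev=2 \xff"
m = UNIT_BYTES_RE.search(header)
# The match includes the guard bytes ('=' and ';'); strip non-alphanumerics
# exactly as sniff_unit_from_mlg does.
uid = re.sub(rb"[^A-Z0-9]", b"", m.group(0)).decode("ascii")
print(uid)  # BE12345
```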
# --------------- Scan helpers ---------------
def fmt_last(ts: float) -> str:
    return datetime.fromtimestamp(ts, tz=timezone.utc).astimezone().strftime("%Y-%m-%d %H:%M:%S")


def fmt_age(now_epoch: float, mtime: float) -> str:
    mins = int((now_epoch - mtime) // 60)
    if mins < 0:
        mins = 0
    return "{}h {}m".format(mins // 60, mins % 60)


def scan_latest(watch: str, header_bytes: int,
                cache: Dict[str, Tuple[float, str]],
                recent_cutoff: Optional[float] = None,
                logger=None) -> Dict[str, Dict[str, Any]]:
    """Return newest .MLG per unit: {uid: {'mtime': float, 'fname': str}}"""
    latest: Dict[str, Dict[str, Any]] = {}
    if not os.path.exists(watch):
        print("[WARN] Watch path not found:", watch)
        return latest
    try:
        with os.scandir(watch) as it:
            for e in it:
                if (not e.is_file()) or (not e.name.lower().endswith(".mlg")):
                    continue
                fpath = e.path
                try:
                    mtime = e.stat().st_mtime
                except Exception:
                    continue
                cached = cache.get(fpath)
                if cached is not None and cached[0] == mtime:
                    uid = cached[1]
                else:
                    uid = sniff_unit_from_mlg(fpath, header_bytes)
                    if not uid:
                        # Flag recent files whose header could not be sniffed
                        if (recent_cutoff is not None) and (mtime >= recent_cutoff):
                            if logger:
                                logger(f"[unsniffable-recent] {fpath}")
                        continue  # skip file if no unit ID found in header
                    cache[fpath] = (mtime, uid)
                if (uid not in latest) or (mtime > latest[uid]["mtime"]):
                    latest[uid] = {"mtime": mtime, "fname": e.name}
    except Exception as ex:
        print("[WARN] Scan error:", ex)
    return latest

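`scan_latest` keeps only the newest file per unit via the `mtime` comparison above. The core selection logic, shown on synthetic data (unit IDs, timestamps, and filenames are invented):

```python
files = [
    ("BE1001", 1700000000.0, "a.MLG"),
    ("BE1001", 1700003600.0, "b.MLG"),  # newer event for the same unit
    ("BA2002", 1700001000.0, "c.MLG"),
]

latest = {}
for uid, mtime, fname in files:
    # Keep the entry only if this file is newer than what we have.
    if uid not in latest or mtime > latest[uid]["mtime"]:
        latest[uid] = {"mtime": mtime, "fname": fname}

print(latest["BE1001"]["fname"])  # b.MLG
print(latest["BA2002"]["fname"])  # c.MLG
```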
# --- Roster fetch (Dropbox/HTTPS) helper ---
def refresh_roster_from_url(url: str, dest: str, min_seconds: int,
                            state: dict, logger=None):
    now = time.time()

    # Throttle fetches; only pull if enough time has elapsed
    if now - state.get("t", 0) < max(0, int(min_seconds or 0)):
        return

    try:
        with urllib.request.urlopen(url, timeout=15) as r:
            data = r.read()
        if data and data.strip():
            with open(dest, "wb") as f:
                f.write(data)
            state["t"] = now
            if logger:
                logger(f"[roster] refreshed from {url} at "
                       f"{datetime.now().strftime('%Y-%m-%d %H:%M:%S')} "
                       f"-> {dest} ({len(data)} bytes)")
    except Exception as e:
        if logger:
            logger(f"[roster-fetch-error] {e}")


# --- Config helper: case-insensitive key lookup ---
def cfg_get(cfg: dict, key: str, default=None):
    return cfg.get(key, cfg.get(key.lower(), cfg.get(key.upper(), default)))

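`cfg_get` tries the key as given, then lower-case, then upper-case, before falling back to the default. A quick demonstration (the sample dict and URL are illustrative):

```python
def cfg_get(cfg, key, default=None):
    """Case-insensitive config lookup: as-given, lower, upper, then default."""
    return cfg.get(key, cfg.get(key.lower(), cfg.get(key.upper(), default)))

cfg = {"roster_url": "https://example.invalid/roster.csv"}
print(cfg_get(cfg, "ROSTER_URL"))          # found via the lower-case fallback
print(cfg_get(cfg, "API_URL", "not set"))  # not set
```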
# --------------- Main loop ------------------
def main() -> None:
    here = os.path.dirname(__file__) or "."
    cfg = load_config(os.path.join(here, "config.ini"))

    WATCH_PATH = cfg["WATCH_PATH"]
    ROSTER_FILE = cfg["ROSTER_FILE"]
    SCAN_INTERVAL = int(cfg["SCAN_INTERVAL"])
    OK_HOURS = float(cfg["OK_HOURS"])
    MISSING_HOURS = float(cfg["MISSING_HOURS"])
    ENABLE_LOGGING = bool(cfg["ENABLE_LOGGING"])
    LOG_FILE = cfg["LOG_FILE"]
    LOG_RETENTION_DAYS = int(cfg["LOG_RETENTION_DAYS"])
    COLORIZE = bool(cfg["COLORIZE"])
    MLG_HEADER_BYTES = int(cfg["MLG_HEADER_BYTES"])

    C_OK = ansi(COLORIZE, "\033[92m")
    C_PEN = ansi(COLORIZE, "\033[93m")
    C_MIS = ansi(COLORIZE, "\033[91m")
    C_UNX = ansi(COLORIZE, "\033[95m")
    C_RST = ansi(COLORIZE, "\033[0m")

    # --- Dropbox roster refresh (pull CSV to local cache) ---
    roster_state = {}
    url = str(cfg_get(cfg, "ROSTER_URL", "") or "")

    # Startup config echo (helps debugging); also written to the log file
    print(f"[CFG] WATCH_PATH={WATCH_PATH} ROSTER_FILE={ROSTER_FILE} ROSTER_URL={'set' if url else 'not set'}")
    log_message(LOG_FILE, ENABLE_LOGGING,
                f"[cfg] WATCH_PATH={WATCH_PATH} ROSTER_FILE={ROSTER_FILE} ROSTER_URL={'set' if url else 'not set'}")

    if url.lower().startswith("http"):
        refresh_roster_from_url(
            url,
            ROSTER_FILE,
            int(cfg_get(cfg, "ROSTER_REFRESH_MIN_SECONDS", 300)),
            roster_state,
            lambda m: log_message(LOG_FILE, ENABLE_LOGGING, m),
        )

    # Cache for scanning
    sniff_cache: Dict[str, Tuple[float, str]] = {}

    # Always load the (possibly refreshed) local roster
    try:
        active, bench, ignored, notes_by_unit = load_roster(ROSTER_FILE)
    except Exception as ex:
        log_message(LOG_FILE, ENABLE_LOGGING, f"[WARN] roster load failed: {ex}")
        active = set()
        bench = set()
        ignored = set()
        notes_by_unit = {}

    # Track roster file modification time
    try:
        roster_mtime = os.path.getmtime(ROSTER_FILE)
    except Exception:
        roster_mtime = None

    while True:
        try:
            now_local = datetime.now().isoformat()
            now_utc = datetime.now(timezone.utc).isoformat()
            print("-" * 110)
            print("Heartbeat @ {} (Local) | {} (UTC)".format(now_local, now_utc))
            print("-" * 110)

            # Periodically refresh roster file from Dropbox
            if url.lower().startswith("http"):
                refresh_roster_from_url(
                    url,
                    ROSTER_FILE,
                    int(cfg_get(cfg, "ROSTER_REFRESH_MIN_SECONDS", 300)),
                    roster_state,
                    lambda m: log_message(LOG_FILE, ENABLE_LOGGING, m),
                )

            # Reload roster into memory if the file changed
            try:
                m = os.path.getmtime(ROSTER_FILE)
            except Exception:
                m = None

            if m is not None and m != roster_mtime:
                roster_mtime = m
                try:
                    new_active, new_bench, new_ignored, new_notes_by_unit = load_roster(ROSTER_FILE)
                    if new_active or new_bench or new_ignored:
                        active, bench, ignored, notes_by_unit = new_active, new_bench, new_ignored, new_notes_by_unit
                        print(f"[ROSTER] Reloaded: {len(active)} active unit(s) from {ROSTER_FILE}")
                        log_message(LOG_FILE, ENABLE_LOGGING,
                                    f"[roster] reloaded {len(active)} active units")
                    else:
                        print("[ROSTER] Reload skipped — no valid active units in new file")
                        log_message(LOG_FILE, ENABLE_LOGGING,
                                    "[roster] reload skipped — roster parse failed or empty")
                except Exception as ex:
                    print(f"[ROSTER] Reload failed, keeping previous roster: {ex}")
                    log_message(LOG_FILE, ENABLE_LOGGING,
                                f"[roster] reload failed, keeping previous roster: {ex}")

            clear_logs_if_needed(LOG_FILE, ENABLE_LOGGING, LOG_RETENTION_DAYS)
            recent_cutoff = time.time() - (float(cfg.get("RECENT_WARN_DAYS", 30)) * 86400)
            logger = lambda m: log_message(LOG_FILE, ENABLE_LOGGING, m)
            latest = scan_latest(WATCH_PATH, MLG_HEADER_BYTES, sniff_cache, recent_cutoff, logger)
            now_epoch = time.time()

            for uid in sorted(active):
                info = latest.get(uid)
                note = notes_by_unit.get(uid, "")
                note_suffix = f" [{note}]" if note else ""
                if info is not None:
                    age_hours = (now_epoch - info["mtime"]) / 3600.0
                    if age_hours > MISSING_HOURS:
                        status, col = "Missing", C_MIS
                    elif age_hours > OK_HOURS:
                        status, col = "Pending", C_PEN
                    else:
                        status, col = "OK", C_OK
                    line = ("{col}{uid:<8} {status:<8} Age: {age:<7} Last: {last} (File: {fname}){note}{rst}"
                            .format(col=col, uid=uid, status=status,
                                    age=fmt_age(now_epoch, info["mtime"]),
                                    last=fmt_last(info["mtime"]), fname=info["fname"],
                                    note=note_suffix, rst=C_RST))
                else:
                    line = "{col}{uid:<8} Missing  Age: N/A     Last: ---{note}{rst}".format(
                        col=C_MIS, uid=uid, note=note_suffix, rst=C_RST)
                print(line)
                log_message(LOG_FILE, ENABLE_LOGGING, line)

            # Bench units (rostered but not active in the field)
            print("\nBench Units (rostered, not active):")
            for uid in sorted(bench):
                info = latest.get(uid)
                note = notes_by_unit.get(uid, "")
                note_suffix = f" [{note}]" if note else ""
                if info:
                    line = f"{uid:<8} Bench    Last: {fmt_last(info['mtime'])} (File: {info['fname']}){note_suffix}"
                else:
                    line = f"{uid:<8} Bench    Last: ---{note_suffix}"
                print(line)
                log_message(LOG_FILE, ENABLE_LOGGING, "[bench] " + line)

            # Ignored units (retired, broken, or do-not-care); reporting currently disabled
            # ignored_detected = [u for u in latest.keys() if u in ignored]
            # if ignored_detected:
            #     print("\nIgnored Units:")
            #     for uid in sorted(ignored_detected):
            #         info = latest[uid]
            #         note = notes_by_unit.get(uid, "")
            #         note_suffix = f" [{note}]" if note else ""
            #         line = f"{uid:<8} Ignored  Last: {fmt_last(info['mtime'])} (File: {info['fname']}){note_suffix}"
            #         print(line)
            #         log_message(LOG_FILE, ENABLE_LOGGING, "[ignored] " + line)

            unexpected = [
                u for u in latest.keys()
                if u not in active and u not in bench and u not in ignored and u not in notes_by_unit
            ]
            if unexpected:
                print("\nUnexpected Units Detected:")
                for uid in sorted(unexpected):
                    info = latest[uid]
                    line = ("{col}{uid:<8} Age: -       Last: {last} (File: {fname}){rst}"
                            .format(col=C_UNX, uid=uid, last=fmt_last(info["mtime"]),
                                    fname=info["fname"], rst=C_RST))
                    print(line)
                    log_message(LOG_FILE, ENABLE_LOGGING, "[unexpected] " + line)

        except KeyboardInterrupt:
            print("\nStopping...")
            break
        except Exception as e:
            err = "[loop-error] {}".format(e)
            print(err)
            log_message(LOG_FILE, ENABLE_LOGGING, err)
        time.sleep(SCAN_INTERVAL)


if __name__ == "__main__":
    main()

series3_tray.py (new file, 416 lines)
@@ -0,0 +1,416 @@
"""
|
||||
Series 3 Watcher — System Tray Launcher v1.4.2
|
||||
Requires: pystray, Pillow, tkinter (stdlib)
|
||||
|
||||
Run with: pythonw series3_tray.py (no console window)
|
||||
or: python series3_tray.py (with console, for debugging)
|
||||
|
||||
Put a shortcut to this in shell:startup for auto-start on login.
|
||||
|
||||
Python 3.8 compatible — no walrus operators, no f-string = specifier,
|
||||
no match statements, no 3.9+ syntax.
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import subprocess
|
||||
import tempfile
|
||||
import threading
|
||||
import urllib.request
|
||||
import urllib.error
|
||||
from datetime import datetime
|
||||
|
||||
import pystray
|
||||
from PIL import Image, ImageDraw
|
||||
|
||||
import series3_watcher as watcher
|
||||
|
||||
|
||||
# --------------- Auto-updater ---------------

GITEA_BASE = "https://gitea.serversdown.net"
GITEA_USER = "serversdown"
GITEA_REPO = "series3-watcher"
GITEA_API_URL = "{}/api/v1/repos/{}/{}/releases?limit=1&page=1".format(
    GITEA_BASE, GITEA_USER, GITEA_REPO
)

# Populated from watcher version string at startup
_CURRENT_VERSION = getattr(watcher, "VERSION", "0.0.0")


def _version_tuple(v):
    """Convert '1.4.0' -> (1, 4, 0) for comparison. Non-numeric parts -> 0."""
    parts = []
    for p in str(v).lstrip("v").split(".")[:3]:
        try:
            parts.append(int(p))
        except ValueError:
            parts.append(0)
    while len(parts) < 3:
        parts.append(0)
    return tuple(parts)

def check_for_update():
    """
    Query Gitea for the latest release.
    Returns (tag, download_url) if an update is available, else (None, None).
    """
    import json as _json
    try:
        req = urllib.request.Request(
            GITEA_API_URL,
            headers={"User-Agent": "series3-watcher/{}".format(_CURRENT_VERSION)},
        )
        with urllib.request.urlopen(req, timeout=8) as resp:
            releases = _json.loads(resp.read().decode("utf-8"))
        if not releases:
            return None, None
        latest = releases[0]
        tag = latest.get("tag_name", "")
        if _version_tuple(tag) <= _version_tuple(_CURRENT_VERSION):
            return None, None
        # Find the .exe asset
        assets = latest.get("assets", [])
        for asset in assets:
            name = asset.get("name", "")
            if name.lower().endswith(".exe"):
                return tag, asset.get("browser_download_url")
        return tag, None
    except Exception:
        return None, None

def apply_update(download_url):
    """
    Download new .exe to a temp file, write a swap .bat, launch it, exit.
    The bat waits for us to exit, then swaps the files and relaunches.
    """
    exe_path = os.path.abspath(sys.executable if getattr(sys, "frozen", False) else sys.argv[0])

    try:
        tmp_fd, tmp_path = tempfile.mkstemp(suffix=".exe", prefix="s3w_update_")
        os.close(tmp_fd)

        req = urllib.request.Request(
            download_url,
            headers={"User-Agent": "series3-watcher/{}".format(_CURRENT_VERSION)},
        )
        with urllib.request.urlopen(req, timeout=60) as resp:
            with open(tmp_path, "wb") as f:
                f.write(resp.read())

        bat_fd, bat_path = tempfile.mkstemp(suffix=".bat", prefix="s3w_swap_")
        os.close(bat_fd)

        bat_content = (
            "@echo off\r\n"
            "ping 127.0.0.1 -n 4 > nul\r\n"
            "copy /Y \"{new}\" \"{exe}\"\r\n"
            "start \"\" \"{exe}\"\r\n"
            "del \"%~f0\"\r\n"
        ).format(new=tmp_path, exe=exe_path)

        with open(bat_path, "w") as f:
            f.write(bat_content)

        subprocess.Popen(
            ["cmd", "/C", bat_path],
            creationflags=subprocess.CREATE_NO_WINDOW if hasattr(subprocess, "CREATE_NO_WINDOW") else 0,
        )
        return True
    except Exception:
        return False

# --------------- Paths ---------------

# Executable location — used for bundled assets (icon.ico etc.)
if getattr(sys, "frozen", False):
    HERE = os.path.dirname(os.path.abspath(sys.executable))
else:
    HERE = os.path.dirname(os.path.abspath(__file__))

# config.ini lives in AppData so normal users can write it without UAC issues.
# Fall back to the exe directory when running from source (dev mode).
if getattr(sys, "frozen", False):
    _appdata = os.environ.get("LOCALAPPDATA") or os.environ.get("APPDATA") or HERE
    CONFIG_DIR = os.path.join(_appdata, "Series3Watcher")
    os.makedirs(CONFIG_DIR, exist_ok=True)
else:
    CONFIG_DIR = HERE
CONFIG_PATH = os.path.join(CONFIG_DIR, "config.ini")

# --------------- Icon drawing ---------------

COLORS = {
    "ok": (60, 200, 80),          # green
    "pending": (230, 180, 0),     # amber
    "missing": (210, 40, 40),     # red
    "error": (160, 40, 200),      # purple
    "starting": (120, 120, 120),  # grey
}

ICON_SIZE = 64


def make_icon(status):
    """Draw a plain colored circle for the system tray — clean and readable at 16px."""
    color = COLORS.get(status, COLORS["starting"])
    img = Image.new("RGBA", (ICON_SIZE, ICON_SIZE), (0, 0, 0, 0))
    draw = ImageDraw.Draw(img)
    margin = 6
    draw.ellipse(
        [margin, margin, ICON_SIZE - margin, ICON_SIZE - margin],
        fill=color,
    )
    return img

# --------------- First-run check ---------------

def ensure_config():
    """
    If config.ini is missing, launch the first-run wizard.
    Returns True if config is ready, False if user cancelled.
    """
    if os.path.exists(CONFIG_PATH):
        return True

    # Import here to avoid pulling in tkinter unless needed
    from settings_dialog import show_dialog
    saved = show_dialog(CONFIG_PATH, wizard=True)
    if not saved:
        _show_cancel_message()
        return False
    return True


def _show_cancel_message():
    """Show a plain messagebox telling the user the app cannot start."""
    try:
        import tkinter as tk
        from tkinter import messagebox
        root = tk.Tk()
        root.withdraw()
        messagebox.showwarning(
            "Series 3 Watcher",
            "No configuration was saved.\nThe application will now exit.",
        )
        root.destroy()
    except Exception:
        pass

# --------------- Tray app ---------------

class WatcherTray:
    def __init__(self):
        self.state = {}
        self.stop_event = threading.Event()
        self._watcher_thread = None
        self._icon = None
        # Lock guards _rebuild_menu calls from the updater thread
        self._menu_lock = threading.Lock()

    # --- Watcher thread management ---

    def _start_watcher(self):
        self.stop_event.clear()
        self._watcher_thread = threading.Thread(
            target=watcher.run_watcher,
            args=(self.state, self.stop_event),
            daemon=True,
            name="watcher",
        )
        self._watcher_thread.start()

    def _stop_watcher(self):
        self.stop_event.set()
        if self._watcher_thread is not None:
            self._watcher_thread.join(timeout=10)
            self._watcher_thread = None

    def _restart_watcher(self):
        """Stop any running watcher and start a fresh one."""
        self._stop_watcher()
        self.stop_event = threading.Event()
        self.state["status"] = "starting"
        self.state["units"] = []
        self.state["last_scan"] = None
        self.state["last_error"] = None
        self._start_watcher()

    # --- Menu item callbacks ---

    def _open_settings(self, icon, item):
        """Open the settings dialog in its own thread so the tray stays responsive."""
        def _run():
            from settings_dialog import show_dialog
            saved = show_dialog(CONFIG_PATH, wizard=False)
            if saved:
                self._restart_watcher()
        threading.Thread(target=_run, daemon=True, name="settings-dialog").start()

    def _open_logs(self, icon, item):
        log_dir = self.state.get("log_dir")
        if not log_dir:
            log_dir = HERE
        if os.path.exists(log_dir):
            subprocess.Popen(["explorer", log_dir])
        else:
            parent = os.path.dirname(log_dir)
            if os.path.exists(parent):
                subprocess.Popen(["explorer", parent])
            else:
                subprocess.Popen(["explorer", HERE])

    def _exit(self, icon, item):
        self.stop_event.set()
        icon.stop()

    # --- Dynamic menu text helpers ---

    def _status_text(self):
        status = self.state.get("status", "starting")
        last_err = self.state.get("last_error")
        last_scan = self.state.get("last_scan")
        api_status = self.state.get("api_status", "disabled")
        unit_count = len(self.state.get("units", []))

        if status == "error":
            return "Error — {}".format(last_err or "unknown")
        if status == "starting":
            return "Starting..."

        # Scan age
        if last_scan is not None:
            age_secs = int((datetime.now() - last_scan).total_seconds())
            age_str = "{}s ago".format(age_secs) if age_secs < 60 else "{}m ago".format(age_secs // 60)
        else:
            age_str = "never"

        # API status label
        if api_status == "ok":
            api_str = "API OK"
        elif api_status == "fail":
            api_str = "API FAIL"
        else:
            api_str = "API off"

        return "Running — {} | {} unit(s) | scan {}".format(api_str, unit_count, age_str)

    def _tray_status(self):
        """Return the icon status key based on watcher + API health."""
        status = self.state.get("status", "starting")
        if status == "error":
            return "error"
        if status == "starting":
            return "starting"
        api_status = self.state.get("api_status", "disabled")
        if api_status == "fail":
            return "missing"   # red — API failing
        if api_status == "disabled":
            return "pending"   # amber — running but not reporting
        return "ok"            # green — running and API good

    def _build_menu(self):
        return pystray.Menu(
            pystray.MenuItem(lambda item: self._status_text(), None, enabled=False),
            pystray.Menu.SEPARATOR,
            pystray.MenuItem("Settings...", self._open_settings),
            pystray.MenuItem("Open Log Folder", self._open_logs),
            pystray.Menu.SEPARATOR,
            pystray.MenuItem("Exit", self._exit),
        )

    # --- Icon / menu update loop ---

    def _icon_updater(self):
        """Periodically refresh the tray icon color and menu to match watcher state."""
        last_status = None
        update_check_counter = 0  # check for updates every ~5 min (30 * 10s ticks)

        while not self.stop_event.is_set():
            icon_status = self._tray_status()

            if self._icon is not None:
                # Always rebuild menu every cycle so status text stays fresh
                with self._menu_lock:
                    self._icon.menu = self._build_menu()

                if icon_status != last_status:
                    self._icon.icon = make_icon(icon_status)
                    self._icon.title = "Series 3 Watcher — {}".format(self._status_text())
                    last_status = icon_status

            # Check if terra-view signalled an update via heartbeat response
            if self.state.get("update_available"):
                self.state["update_available"] = False
                self._do_update()
                return  # exit loop; swap bat will relaunch

            # Periodic Gitea update check (every ~5 min)
            update_check_counter += 1
            if update_check_counter >= 30:
                update_check_counter = 0
                tag, url = check_for_update()
                if tag and url:
                    self._do_update(url)
                    return  # exit loop; swap bat will relaunch

            self.stop_event.wait(timeout=10)

    def _do_update(self, download_url=None):
        """Notify tray icon then apply update. If url is None, fetch it first."""
        if download_url is None:
            _, download_url = check_for_update()
        if not download_url:
            return

        if self._icon is not None:
            self._icon.title = "Series 3 Watcher — Updating..."
            self._icon.icon = make_icon("starting")

        success = apply_update(download_url)
        if success:
            self.stop_event.set()
            if self._icon is not None:
                self._icon.stop()
        # If the update failed, just keep running silently

    # --- Entry point ---

    def run(self):
        self._start_watcher()

        icon_img = make_icon("starting")
        self._icon = pystray.Icon(
            name="series3_watcher",
            icon=icon_img,
            title="Series 3 Watcher — Starting...",
            menu=self._build_menu(),
        )

        updater = threading.Thread(
            target=self._icon_updater, daemon=True, name="icon-updater"
        )
        updater.start()

        self._icon.run()

# --------------- Entry point ---------------

def main():
    if not ensure_config():
        sys.exit(0)

    app = WatcherTray()
    app.run()


if __name__ == "__main__":
    main()

series3_watcher.py (new file, 471 lines)
@@ -0,0 +1,471 @@
"""
|
||||
Series 3 Watcher — v1.4.2
|
||||
|
||||
Environment:
|
||||
- Python 3.8 (Windows 7 compatible)
|
||||
- Runs on DL2 with Blastware 10 event path
|
||||
|
||||
Key Features:
|
||||
- Config-driven paths, intervals, and logging
|
||||
- Compact console heartbeat with status per unit
|
||||
- Logging with retention auto-clean (days configurable)
|
||||
- Safe .MLG header sniff for unit IDs (BE#### / BA####)
|
||||
- Standardized SFM Telemetry JSON payload (source-agnostic)
|
||||
- Periodic HTTP heartbeat POST to SFM backend
|
||||
- Tray-friendly: run_watcher(state, stop_event) for background thread use
|
||||
"""
|
||||
|
||||
import os
|
||||
import re
|
||||
import sys
|
||||
import time
|
||||
import json
|
||||
import threading
|
||||
import configparser
|
||||
import urllib.request
|
||||
import urllib.error
|
||||
from datetime import datetime, timezone, timedelta
|
||||
from typing import Dict, Any, Optional, Tuple
|
||||
from socket import gethostname
|
||||
|
||||
|
||||
# ---------------- Config ----------------
def load_config(path: str) -> Dict[str, Any]:
    """Load INI with tolerant inline comments and a required [agent] section."""
    cp = configparser.ConfigParser(inline_comment_prefixes=(";", "#"))
    cp.optionxform = str  # preserve key case
    with open(path, "r", encoding="utf-8") as f:
        txt = f.read()
    # Ensure we have a section header
    if not re.search(r"^\s*\[", txt, flags=re.M):
        txt = "[agent]\n" + txt
    cp.read_string(txt)
    sec = cp["agent"]

    def get_str(k: str, dflt: str) -> str:
        return sec.get(k, dflt).strip()

    def get_int(k: str, dflt: int) -> int:
        try:
            return int(sec.get(k, str(dflt)).strip())
        except Exception:
            return dflt

    def get_bool(k: str, dflt: bool) -> bool:
        v = sec.get(k, None)
        if v is None:
            return dflt
        return v.strip().lower() in ("1", "true", "on", "yes", "y")

    return {
        "WATCH_PATH": get_str("SERIES3_PATH", r"C:\Blastware 10\Event\autocall home"),
        "SCAN_INTERVAL": get_int("SCAN_INTERVAL_SECONDS", 300),
        "OK_HOURS": float(get_int("OK_HOURS", 12)),
        "MISSING_HOURS": float(get_int("MISSING_HOURS", 24)),
        "ENABLE_LOGGING": get_bool("ENABLE_LOGGING", True),
        "LOG_FILE": get_str("LOG_FILE", os.path.join(
            os.environ.get("LOCALAPPDATA") or os.environ.get("APPDATA") or "C:\\",
            "Series3Watcher", "agent_logs", "series3_watcher.log"
        )),
        "LOG_RETENTION_DAYS": get_int("LOG_RETENTION_DAYS", 30),
        "MLG_HEADER_BYTES": max(256, min(get_int("MLG_HEADER_BYTES", 2048), 65536)),
        "RECENT_WARN_DAYS": get_int("RECENT_WARN_DAYS", 30),
        "MAX_EVENT_AGE_DAYS": get_int("MAX_EVENT_AGE_DAYS", 365),

        # API heartbeat / SFM telemetry
        "API_ENABLED": get_bool("API_ENABLED", False),
        "API_URL": get_str("API_URL", ""),
        "API_INTERVAL_SECONDS": get_int("API_INTERVAL_SECONDS", 300),
        "SOURCE_ID": get_str("SOURCE_ID", gethostname()),
        "SOURCE_TYPE": get_str("SOURCE_TYPE", "series3_watcher"),
    }

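`load_config` leans on two `configparser` behaviors: inline comment stripping via `inline_comment_prefixes`, and `optionxform = str` to preserve key case, plus the injected `[agent]` header for section-less files. A minimal reproduction (the sample INI text is illustrative):

```python
import configparser
import re

raw = "SCAN_INTERVAL_SECONDS = 60  ; one minute\nENABLE_LOGGING = yes\n"

# Inject a section header if the file has none, as load_config does.
if not re.search(r"^\s*\[", raw, flags=re.M):
    raw = "[agent]\n" + raw

cp = configparser.ConfigParser(inline_comment_prefixes=(";", "#"))
cp.optionxform = str  # keep SCAN_INTERVAL_SECONDS upper-case
cp.read_string(raw)

print(cp["agent"]["SCAN_INTERVAL_SECONDS"])  # 60  (inline comment stripped)
print(cp["agent"]["ENABLE_LOGGING"])         # yes
```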
# --------------- Logging --------------------
def log_message(path: str, enabled: bool, msg: str) -> None:
    if not enabled:
        return
    try:
        d = os.path.dirname(path) or "."
        if not os.path.exists(d):
            os.makedirs(d)
        with open(path, "a", encoding="utf-8") as f:
            f.write("{} {}\n".format(datetime.now(timezone.utc).isoformat(), msg))
    except Exception:
        # Logging must never crash the watcher
        pass

def clear_logs_if_needed(log_file: str, enabled: bool, retention_days: int) -> None:
    if not enabled or retention_days <= 0:
        return
    stamp_file = os.path.join(os.path.dirname(log_file) or ".", "last_clean.txt")
    now = datetime.now(timezone.utc)
    last = None
    try:
        if os.path.exists(stamp_file):
            with open(stamp_file, "r", encoding="utf-8") as f:
                last = datetime.fromisoformat(f.read().strip())
    except Exception:
        last = None
    if (last is None) or (now - last > timedelta(days=retention_days)):
        try:
            if os.path.exists(log_file):
                open(log_file, "w", encoding="utf-8").close()
            with open(stamp_file, "w", encoding="utf-8") as f:
                f.write(now.isoformat())
            print("Log cleared on {}".format(now.astimezone().strftime("%Y-%m-%d %H:%M:%S")))
            log_message(log_file, enabled, "Logs auto-cleared")
        except Exception:
            pass

# --------------- .MLG sniff ------------------
UNIT_BYTES_RE = re.compile(rb"(?:^|[^A-Z])(BE|BA)\d{4,5}(?:[^0-9]|$)")


def sniff_unit_from_mlg(path: str, header_bytes: int) -> Optional[str]:
    """Return BE####/BA#### from header bytes, or None."""
    try:
        with open(path, "rb") as f:
            chunk = f.read(max(256, min(header_bytes, 65536)))
        m = UNIT_BYTES_RE.search(chunk)
        if not m:
            return None
        raw = m.group(0)
        cleaned = re.sub(rb"[^A-Z0-9]", b"", raw)
        try:
            return cleaned.decode("ascii").upper()
        except Exception:
            return None
    except Exception:
        return None

# --------------- Scan helpers ---------------
def fmt_last(ts: float) -> str:
    return datetime.fromtimestamp(ts, tz=timezone.utc).astimezone().strftime("%Y-%m-%d %H:%M:%S")


def fmt_age(now_epoch: float, mtime: float) -> str:
    mins = int((now_epoch - mtime) // 60)
    if mins < 0:
        mins = 0
    return "{}h {}m".format(mins // 60, mins % 60)

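`fmt_age` floors the age to whole minutes and clamps negative values (clock skew) to zero before splitting into hours and minutes. A standalone copy to show the arithmetic:

```python
def fmt_age(now_epoch, mtime):
    """Format the age of a file in whole hours and minutes, clamped at zero."""
    mins = int((now_epoch - mtime) // 60)
    if mins < 0:
        mins = 0
    return "{}h {}m".format(mins // 60, mins % 60)

print(fmt_age(10000.0, 0.0))  # 166 minutes -> "2h 46m"
print(fmt_age(0.0, 60.0))     # file "from the future" clamps to "0h 0m"
```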
def scan_latest(
    watch: str,
    header_bytes: int,
    cache: Dict[str, Tuple[float, str]],
    recent_cutoff: Optional[float],
    max_age_days: int,
    logger=None,
) -> Dict[str, Dict[str, Any]]:
    """
    Return newest .MLG per unit, only for files newer than max_age_days:
    {uid: {'mtime': float, 'fname': str, 'path': str}}
    """
    latest: Dict[str, Dict[str, Any]] = {}
    if not os.path.exists(watch):
        print("[WARN] Watch path not found:", watch)
        return latest

    now_ts = time.time()
    max_age_days = max(1, int(max_age_days))  # sanity floor
    max_age_seconds = max_age_days * 86400.0

    try:
        with os.scandir(watch) as it:
            for e in it:
                if (not e.is_file()) or (not e.name.lower().endswith(".mlg")):
                    continue
                fpath = e.path
                try:
                    mtime = e.stat().st_mtime
                except Exception:
                    continue

                # Skip very old events (beyond retention window)
                age_seconds = now_ts - mtime
                if age_seconds < 0:
                    age_seconds = 0
                if age_seconds > max_age_seconds:
                    continue  # too old, ignore this file

                cached = cache.get(fpath)
                if cached is not None and cached[0] == mtime:
                    uid = cached[1]
                else:
                    uid = sniff_unit_from_mlg(fpath, header_bytes)
                    if not uid:
                        # If unsniffable but very recent, log for later inspection
                        if (recent_cutoff is not None) and (mtime >= recent_cutoff):
                            if logger:
                                logger("[unsniffable-recent] {}".format(fpath))
                        continue  # skip file if no unit ID found in header
                    cache[fpath] = (mtime, uid)

                if (uid not in latest) or (mtime > latest[uid]["mtime"]):
                    latest[uid] = {"mtime": mtime, "fname": e.name, "path": fpath}
    except Exception as ex:
        print("[WARN] Scan error:", ex)
    return latest

# --- API heartbeat / SFM telemetry helpers ---
VERSION = "1.4.2"


def _read_log_tail(log_file: str, n: int = 25) -> Optional[list]:
    """Return the last n lines of the log file as a list of strings, or None on failure."""
    if not log_file:
        return None
    try:
        with open(log_file, "r", errors="replace") as f:
            lines = f.readlines()
        return [l.rstrip("\n") for l in lines[-n:]]
    except Exception:
        return None


def send_api_payload(payload: dict, api_url: str) -> Optional[dict]:
    """POST payload to API. Returns parsed JSON response dict, or None on failure."""
    if not api_url:
        return None
    data = json.dumps(payload).encode("utf-8")
    req = urllib.request.Request(api_url, data=data, headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=5) as res:
            print("[API] POST success: {}".format(res.status))
            try:
                return json.loads(res.read().decode("utf-8"))
            except Exception:
                return None
    except urllib.error.URLError as e:
        print("[API] POST failed: {}".format(e))
        return None


def build_sfm_payload(units_dict: Dict[str, Dict[str, Any]], cfg: Dict[str, Any]) -> dict:
    """
    Build SFM Telemetry JSON v1 payload from latest-unit dict.
    Schema is source-agnostic and future-proof.
    """
    now_iso = datetime.now(timezone.utc).isoformat()
    now_ts = time.time()

    payload = {
        "source_id": cfg.get("SOURCE_ID", gethostname()),
        "source_type": cfg.get("SOURCE_TYPE", "series3_watcher"),
        "timestamp": now_iso,
        "units": [],
    }

    for unit_id, info in units_dict.items():
        mtime = info.get("mtime")
        if mtime is not None:
            last_event_iso = datetime.fromtimestamp(mtime, tz=timezone.utc).isoformat()
            age_minutes = int(max(0, (now_ts - mtime) // 60))
        else:
            last_event_iso = None
            age_minutes = None

        file_path = info.get("path")
        file_size = None
        if file_path:
            try:
                file_size = os.path.getsize(file_path)
            except Exception:
                file_size = None

        payload["units"].append(
            {
                "unit_id": unit_id,
                "last_event_time": last_event_iso,
                "age_minutes": age_minutes,
                "observation_method": "mlg_scan",
                "event_metadata": {
                    "file_name": info.get("fname"),
                    "file_path": file_path,
                    "file_size_bytes": file_size,
                    "event_number": None,
                    "event_type": None,
                },
            }
        )

    return payload


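The payload shape produced above can be sketched standalone with toy data. The `source_id` and unit values below are hypothetical, and the file-size lookup is omitted for brevity:

```python
from datetime import datetime, timezone

now_ts = 1_700_000_000.0  # hypothetical "now" epoch
units_dict = {"BE1234": {"mtime": now_ts - 3600, "fname": "EV001.MLG"}}

payload = {
    "source_id": "DL2-PC",  # hypothetical hostname fallback
    "source_type": "series3_watcher",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "units": [],
}
for unit_id, info in units_dict.items():
    payload["units"].append({
        "unit_id": unit_id,
        "last_event_time": datetime.fromtimestamp(info["mtime"], tz=timezone.utc).isoformat(),
        "age_minutes": int((now_ts - info["mtime"]) // 60),  # 3600 s -> 60 min
        "observation_method": "mlg_scan",
    })

print(payload["units"][0]["age_minutes"])  # 60
```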
# --------------- Watcher loop (tray-friendly) ----------------
def run_watcher(state: Dict[str, Any], stop_event: threading.Event) -> None:
    """
    Main watcher loop. Runs in a background thread when launched from the tray.

    state dict is written on every scan cycle:
        state["status"]     — "running" | "error" | "starting"
        state["api_status"] — "ok" | "fail" | "disabled"
        state["units"]      — list of dicts: {uid, age_hours, last, fname}
        state["last_scan"]  — datetime of last successful scan (or None)
        state["last_api"]   — datetime of last successful API POST (or None)
        state["last_error"] — last error string (or None)
        state["log_dir"]    — directory containing the log file
        state["cfg"]        — loaded config dict
    """
    if getattr(sys, "frozen", False):
        here = os.path.dirname(os.path.abspath(sys.executable))
        _appdata = os.environ.get("LOCALAPPDATA") or os.environ.get("APPDATA") or here
        config_dir = os.path.join(_appdata, "Series3Watcher")
    else:
        here = os.path.dirname(os.path.abspath(__file__)) or "."
        config_dir = here
    config_path = os.path.join(config_dir, "config.ini")

    state["status"] = "starting"
    state["units"] = []
    state["last_scan"] = None
    state["last_error"] = None
    state["log_dir"] = None
    state["cfg"] = {}

    try:
        cfg = load_config(config_path)
    except Exception as e:
        state["status"] = "error"
        state["last_error"] = "Config load failed: {}".format(e)
        return

    state["cfg"] = cfg
    state["log_dir"] = os.path.dirname(cfg["LOG_FILE"]) or here

    WATCH_PATH = cfg["WATCH_PATH"]
    SCAN_INTERVAL = int(cfg["SCAN_INTERVAL"])
    ENABLE_LOGGING = bool(cfg["ENABLE_LOGGING"])
    LOG_FILE = cfg["LOG_FILE"]
    LOG_RETENTION_DAYS = int(cfg["LOG_RETENTION_DAYS"])
    MLG_HEADER_BYTES = int(cfg["MLG_HEADER_BYTES"])
    RECENT_WARN_DAYS = int(cfg["RECENT_WARN_DAYS"])
    MAX_EVENT_AGE_DAYS = int(cfg["MAX_EVENT_AGE_DAYS"])

    print(
        "[CFG] WATCH_PATH={} SCAN_INTERVAL={}s MAX_EVENT_AGE_DAYS={} API_ENABLED={}".format(
            WATCH_PATH, SCAN_INTERVAL, MAX_EVENT_AGE_DAYS, bool(cfg.get("API_ENABLED", False))
        )
    )
    log_message(
        LOG_FILE,
        ENABLE_LOGGING,
        "[cfg] WATCH_PATH={} SCAN_INTERVAL={} MAX_EVENT_AGE_DAYS={} API_ENABLED={}".format(
            WATCH_PATH, SCAN_INTERVAL, MAX_EVENT_AGE_DAYS, bool(cfg.get("API_ENABLED", False))
        ),
    )

    sniff_cache: Dict[str, Tuple[float, str]] = {}
    last_api_ts: float = 0.0

    while not stop_event.is_set():
        try:
            now_local = datetime.now().isoformat()
            now_utc = datetime.now(timezone.utc).isoformat()
            print("-" * 110)
            print("Heartbeat @ {} (Local) | {} (UTC)".format(now_local, now_utc))
            print("-" * 110)

            clear_logs_if_needed(LOG_FILE, ENABLE_LOGGING, LOG_RETENTION_DAYS)
            recent_cutoff = time.time() - (float(RECENT_WARN_DAYS) * 86400)
            logger_fn = lambda m: log_message(LOG_FILE, ENABLE_LOGGING, m)

            latest = scan_latest(
                WATCH_PATH,
                MLG_HEADER_BYTES,
                sniff_cache,
                recent_cutoff,
                MAX_EVENT_AGE_DAYS,
                logger_fn,
            )
            now_epoch = time.time()

            # Log detected units to console and log file (info only, no status judgement)
            unit_list = []
            if latest:
                print("\nDetected Units (within last {} days):".format(MAX_EVENT_AGE_DAYS))
                for uid in sorted(latest.keys()):
                    info = latest[uid]
                    age_hours = (now_epoch - info["mtime"]) / 3600.0
                    unit_list.append({
                        "uid": uid,
                        "age_hours": age_hours,
                        "last": fmt_last(info["mtime"]),
                        "fname": info["fname"],
                    })
                    line = (
                        "{uid:<8} Age: {age:<7} Last: {last} (File: {fname})".format(
                            uid=uid,
                            age=fmt_age(now_epoch, info["mtime"]),
                            last=fmt_last(info["mtime"]),
                            fname=info["fname"],
                        )
                    )
                    print(line)
                    log_message(LOG_FILE, ENABLE_LOGGING, line)
            else:
                print("\nNo recent .MLG activity found within last {} days.".format(MAX_EVENT_AGE_DAYS))
                log_message(
                    LOG_FILE,
                    ENABLE_LOGGING,
                    "[info] no recent MLG activity within {} days".format(MAX_EVENT_AGE_DAYS),
                )

            # Update shared state for tray — status reflects watcher health, not unit ages
            state["status"] = "running"
            state["units"] = unit_list
            state["last_scan"] = datetime.now()
            state["last_error"] = None

            # ---- API heartbeat to SFM ----
            if cfg.get("API_ENABLED", False):
                now_ts = time.time()
                interval = int(cfg.get("API_INTERVAL_SECONDS", 300))
                if now_ts - last_api_ts >= interval:
                    hb_payload = build_sfm_payload(latest, cfg)
                    hb_payload["version"] = VERSION
                    hb_payload["watcher_status"] = state.get("status", "unknown")
                    hb_payload["log_tail"] = _read_log_tail(cfg.get("LOG_FILE", ""), 25)
                    response = send_api_payload(hb_payload, cfg.get("API_URL", ""))
                    last_api_ts = now_ts
                    if response is not None:
                        state["api_status"] = "ok"
                        state["last_api"] = datetime.now()
                        if response.get("update_available"):
                            state["update_available"] = True
                    else:
                        state["api_status"] = "fail"
            else:
                state["api_status"] = "disabled"

        except Exception as e:
            err = "[loop-error] {}".format(e)
            print(err)
            log_message(LOG_FILE, ENABLE_LOGGING, err)
            state["status"] = "error"
            state["last_error"] = str(e)

        # Interruptible sleep: wake immediately if stop_event fires
        stop_event.wait(timeout=SCAN_INTERVAL)


# --------------- Main (standalone) ------------------
def main() -> None:
    state = {}
    stop_event = threading.Event()
    try:
        run_watcher(state, stop_event)
    except KeyboardInterrupt:
        print("\nStopping...")
        stop_event.set()


if __name__ == "__main__":
    main()
526 settings_dialog.py Normal file
@@ -0,0 +1,526 @@
"""
|
||||
Series 3 Watcher — Settings Dialog v1.4.2
|
||||
|
||||
Provides a Tkinter settings dialog that doubles as a first-run wizard.
|
||||
|
||||
Public API:
|
||||
show_dialog(config_path, wizard=False) -> bool
|
||||
Returns True if the user saved, False if they cancelled.
|
||||
|
||||
Python 3.8 compatible — no walrus operators, no f-string = specifier,
|
||||
no match statements, no 3.9+ syntax.
|
||||
No external dependencies beyond stdlib + tkinter.
|
||||
"""
|
||||
|
||||
import os
import configparser
import tkinter as tk
from tkinter import ttk, filedialog, messagebox
from socket import gethostname


# --------------- Defaults (mirror config-template.ini) ---------------

DEFAULTS = {
    "API_ENABLED": "true",
    "API_URL": "",
    "API_INTERVAL_SECONDS": "300",
    "SOURCE_ID": "",  # empty = use hostname at runtime
    "SOURCE_TYPE": "series3_watcher",
    "SERIES3_PATH": r"C:\Blastware 10\Event\autocall home",
    "MAX_EVENT_AGE_DAYS": "365",
    "SCAN_INTERVAL_SECONDS": "300",
    "OK_HOURS": "12",
    "MISSING_HOURS": "24",
    "MLG_HEADER_BYTES": "2048",
    "ENABLE_LOGGING": "true",
    "LOG_FILE": os.path.join(
        os.environ.get("LOCALAPPDATA") or os.environ.get("APPDATA") or "C:\\",
        "Series3Watcher", "agent_logs", "series3_watcher.log"
    ),
    "LOG_RETENTION_DAYS": "30",
}


# --------------- Config I/O ---------------

def _load_config(config_path):
    """
    Load existing config.ini. Returns a flat dict of string values.
    Falls back to DEFAULTS for any missing key.
    """
    values = dict(DEFAULTS)
    if not os.path.exists(config_path):
        return values

    cp = configparser.ConfigParser(inline_comment_prefixes=(";", "#"))
    cp.optionxform = str
    try:
        cp.read(config_path, encoding="utf-8")
    except Exception:
        return values

    # Accept either [agent] section or a bare file
    section = None
    if cp.has_section("agent"):
        section = "agent"
    elif cp.sections():
        section = cp.sections()[0]

    if section:
        for k in DEFAULTS:
            if cp.has_option(section, k):
                values[k] = cp.get(section, k).strip()

    return values


def _save_config(config_path, values):
    """Write all values to config_path under [agent] section."""
    cp = configparser.ConfigParser()
    cp.optionxform = str
    cp["agent"] = {}
    for k, v in values.items():
        cp["agent"][k] = v

    config_dir = os.path.dirname(config_path)
    if config_dir and not os.path.exists(config_dir):
        os.makedirs(config_dir)

    with open(config_path, "w", encoding="utf-8") as f:
        cp.write(f)


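A minimal round-trip through the two helpers above can be sketched with a throwaway config file (a temp directory stands in for the real config path):

```python
import configparser
import os
import tempfile

# Write then re-read a minimal [agent] section, mirroring _save_config/_load_config.
path = os.path.join(tempfile.mkdtemp(), "config.ini")

cp = configparser.ConfigParser()
cp.optionxform = str  # preserve key case, as the dialog does
cp["agent"] = {"API_ENABLED": "true", "SCAN_INTERVAL_SECONDS": "300"}
with open(path, "w", encoding="utf-8") as f:
    cp.write(f)

cp2 = configparser.ConfigParser(inline_comment_prefixes=(";", "#"))
cp2.optionxform = str
cp2.read(path, encoding="utf-8")
print(cp2.get("agent", "SCAN_INTERVAL_SECONDS"))  # 300
```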
# --------------- Spinbox helper ---------------

def _make_spinbox(parent, from_, to, width=8):
    """Create a ttk.Spinbox; fall back to tk.Spinbox on older ttk."""
    try:
        sb = ttk.Spinbox(parent, from_=from_, to=to, width=width)
    except AttributeError:
        # ttk.Spinbox was added in Python 3.7 and is not available everywhere
        sb = tk.Spinbox(parent, from_=from_, to=to, width=width)
    return sb


# --------------- Field helpers ---------------

def _add_label_entry(frame, row, label_text, var, hint=None, readonly=False):
    """Add a label + entry row to a grid frame. Returns the Entry widget."""
    tk.Label(frame, text=label_text, anchor="w").grid(
        row=row, column=0, sticky="w", padx=(8, 4), pady=4
    )
    state = "readonly" if readonly else "normal"
    entry = ttk.Entry(frame, textvariable=var, width=42, state=state)
    entry.grid(row=row, column=1, sticky="ew", padx=(0, 8), pady=4)
    if hint and not var.get():
        # Show placeholder hint in grey; clear on focus
        entry.config(foreground="grey")
        entry.insert(0, hint)

        def _on_focus_in(event, e=entry, h=hint, v=var):
            if e.get() == h:
                e.delete(0, tk.END)
                e.config(foreground="black")

        def _on_focus_out(event, e=entry, h=hint, v=var):
            if not e.get():
                e.config(foreground="grey")
                e.insert(0, h)
                v.set("")

        entry.bind("<FocusIn>", _on_focus_in)
        entry.bind("<FocusOut>", _on_focus_out)
    return entry


def _add_label_spinbox(frame, row, label_text, var, from_, to):
    """Add a label + spinbox row to a grid frame. Returns the Spinbox widget."""
    tk.Label(frame, text=label_text, anchor="w").grid(
        row=row, column=0, sticky="w", padx=(8, 4), pady=4
    )
    sb = _make_spinbox(frame, from_=from_, to=to, width=8)
    sb.grid(row=row, column=1, sticky="w", padx=(0, 8), pady=4)
    sb.delete(0, tk.END)
    sb.insert(0, var.get())

    def _on_change(*args):
        var.set(sb.get())

    sb.config(command=_on_change)
    sb.bind("<KeyRelease>", _on_change)
    return sb


def _add_label_check(frame, row, label_text, var):
    """Add a checkbox row to a grid frame."""
    cb = ttk.Checkbutton(frame, text=label_text, variable=var)
    cb.grid(row=row, column=0, columnspan=2, sticky="w", padx=(8, 8), pady=4)
    return cb


def _add_label_browse_entry(frame, row, label_text, var, browse_fn):
    """Add a label + entry + Browse button row."""
    tk.Label(frame, text=label_text, anchor="w").grid(
        row=row, column=0, sticky="w", padx=(8, 4), pady=4
    )
    inner = tk.Frame(frame)
    inner.grid(row=row, column=1, sticky="ew", padx=(0, 8), pady=4)
    inner.columnconfigure(0, weight=1)

    entry = ttk.Entry(inner, textvariable=var, width=36)
    entry.grid(row=0, column=0, sticky="ew")
    btn = ttk.Button(inner, text="Browse...", command=browse_fn, width=9)
    btn.grid(row=0, column=1, padx=(4, 0))
    return entry


# --------------- Main dialog class ---------------

class SettingsDialog:
    def __init__(self, parent, config_path, wizard=False):
        self.config_path = config_path
        self.wizard = wizard
        self.saved = False

        self.root = parent
        if wizard:
            self.root.title("Series 3 Watcher — Setup")
        else:
            self.root.title("Series 3 Watcher — Settings")
        self.root.resizable(False, False)

        # Center on screen
        self.root.update_idletasks()

        self._values = _load_config(config_path)
        self._build_vars()
        self._build_ui()

        # Make dialog modal
        self.root.grab_set()
        self.root.protocol("WM_DELETE_WINDOW", self._on_cancel)


    # --- Variable setup ---

    def _build_vars(self):
        v = self._values

        # Connection
        self.var_api_enabled = tk.BooleanVar(value=v["API_ENABLED"].lower() in ("1", "true", "yes", "on"))
        # Strip the fixed endpoint suffix so the dialog shows just the base URL
        _raw_url = v["API_URL"]
        _suffix = "/api/series3/heartbeat"
        if _raw_url.endswith(_suffix):
            _raw_url = _raw_url[:-len(_suffix)]
        self.var_api_url = tk.StringVar(value=_raw_url)
        self.var_api_interval = tk.StringVar(value=v["API_INTERVAL_SECONDS"])
        self.var_source_id = tk.StringVar(value=v["SOURCE_ID"])
        self.var_source_type = tk.StringVar(value=v["SOURCE_TYPE"])

        # Paths
        self.var_series3_path = tk.StringVar(value=v["SERIES3_PATH"])
        self.var_max_event_age_days = tk.StringVar(value=v["MAX_EVENT_AGE_DAYS"])
        self.var_log_file = tk.StringVar(value=v["LOG_FILE"])

        # Scanning
        self.var_scan_interval = tk.StringVar(value=v["SCAN_INTERVAL_SECONDS"])
        self.var_ok_hours = tk.StringVar(value=v["OK_HOURS"])
        self.var_missing_hours = tk.StringVar(value=v["MISSING_HOURS"])
        self.var_mlg_header_bytes = tk.StringVar(value=v["MLG_HEADER_BYTES"])

        # Logging
        self.var_enable_logging = tk.BooleanVar(value=v["ENABLE_LOGGING"].lower() in ("1", "true", "yes", "on"))
        self.var_log_retention_days = tk.StringVar(value=v["LOG_RETENTION_DAYS"])


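The base-URL handling above (strip the fixed endpoint suffix on load, re-append it on save) can be sketched standalone; the sample host below is hypothetical:

```python
SUFFIX = "/api/series3/heartbeat"

def to_display(stored_url):
    # Strip the fixed endpoint so only the base URL is shown for editing.
    return stored_url[:-len(SUFFIX)] if stored_url.endswith(SUFFIX) else stored_url

def to_stored(display_url):
    # Re-append the fixed endpoint when saving a non-empty base URL.
    return display_url.rstrip("/") + SUFFIX if display_url else ""

stored = "http://192.168.1.50:8000/api/series3/heartbeat"  # hypothetical saved value
base = to_display(stored)
print(base)                        # http://192.168.1.50:8000
print(to_stored(base) == stored)   # True
```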
    # --- UI construction ---

    def _build_ui(self):
        outer = tk.Frame(self.root, padx=10, pady=8)
        outer.pack(fill="both", expand=True)

        if self.wizard:
            welcome = (
                "Welcome to Series 3 Watcher!\n\n"
                "No configuration file was found. Please review the settings below\n"
                "and click \"Save & Start\" when you are ready."
            )
            lbl = tk.Label(
                outer, text=welcome, justify="left",
                wraplength=460, fg="#1a5276", font=("TkDefaultFont", 9, "bold"),
            )
            lbl.pack(fill="x", pady=(0, 8))

        # Notebook
        nb = ttk.Notebook(outer)
        nb.pack(fill="both", expand=True)

        self._build_tab_connection(nb)
        self._build_tab_paths(nb)
        self._build_tab_scanning(nb)
        self._build_tab_logging(nb)

        # Buttons
        btn_frame = tk.Frame(outer)
        btn_frame.pack(fill="x", pady=(10, 0))

        save_label = "Save & Start" if self.wizard else "Save"
        btn_save = ttk.Button(btn_frame, text=save_label, command=self._on_save, width=14)
        btn_save.pack(side="right", padx=(4, 0))

        btn_cancel = ttk.Button(btn_frame, text="Cancel", command=self._on_cancel, width=10)
        btn_cancel.pack(side="right")

    def _tab_frame(self, nb, title):
        """Create a new tab in nb; return its content frame."""
        outer = tk.Frame(nb, padx=4, pady=4)
        nb.add(outer, text=title)
        outer.columnconfigure(1, weight=1)
        return outer


    def _build_tab_connection(self, nb):
        f = self._tab_frame(nb, "Connection")

        _add_label_check(f, 0, "API Enabled", self.var_api_enabled)

        # URL row — entry + Test button in an inner frame
        tk.Label(f, text="Terra-View URL", anchor="w").grid(
            row=1, column=0, sticky="w", padx=(8, 4), pady=4
        )
        url_frame = tk.Frame(f)
        url_frame.grid(row=1, column=1, sticky="ew", padx=(0, 8), pady=4)
        url_frame.columnconfigure(0, weight=1)

        url_entry = ttk.Entry(url_frame, textvariable=self.var_api_url, width=32)
        url_entry.grid(row=0, column=0, sticky="ew")

        # Placeholder hint behaviour
        _hint = "http://192.168.x.x:8000"
        if not self.var_api_url.get():
            url_entry.config(foreground="grey")
            url_entry.insert(0, _hint)

        def _on_focus_in(e):
            if url_entry.get() == _hint:
                url_entry.delete(0, tk.END)
                url_entry.config(foreground="black")

        def _on_focus_out(e):
            if not url_entry.get():
                url_entry.config(foreground="grey")
                url_entry.insert(0, _hint)
                self.var_api_url.set("")

        url_entry.bind("<FocusIn>", _on_focus_in)
        url_entry.bind("<FocusOut>", _on_focus_out)

        self._test_btn = ttk.Button(url_frame, text="Test", width=6,
                                    command=self._test_connection)
        self._test_btn.grid(row=0, column=1, padx=(4, 0))

        self._test_status = tk.Label(url_frame, text="", anchor="w", width=20)
        self._test_status.grid(row=0, column=2, padx=(6, 0))

        _add_label_spinbox(f, 2, "API Interval (sec)", self.var_api_interval, 30, 3600)

        source_id_hint = "Defaults to hostname ({})".format(gethostname())
        _add_label_entry(f, 3, "Source ID", self.var_source_id, hint=source_id_hint)

        _add_label_entry(f, 4, "Source Type", self.var_source_type, readonly=True)


    def _test_connection(self):
        """GET the Terra-View /health endpoint and show the result."""
        import urllib.request
        import urllib.error

        self._test_status.config(text="Testing...", foreground="grey")
        self._test_btn.config(state="disabled")
        self.root.update_idletasks()

        raw = self.var_api_url.get().strip()
        if not raw or raw == "http://192.168.x.x:8000":
            self._test_status.config(text="Enter a URL first", foreground="orange")
            self._test_btn.config(state="normal")
            return

        url = raw.rstrip("/") + "/health"

        try:
            req = urllib.request.Request(url)
            with urllib.request.urlopen(req, timeout=5) as resp:
                if resp.status == 200:
                    self._test_status.config(text="Connected!", foreground="green")
                else:
                    self._test_status.config(
                        text="HTTP {}".format(resp.status), foreground="orange"
                    )
        except urllib.error.URLError as e:
            reason = str(e.reason) if hasattr(e, "reason") else str(e)
            self._test_status.config(text="Failed: {}".format(reason[:30]), foreground="red")
        except Exception as e:
            self._test_status.config(text="Error: {}".format(str(e)[:30]), foreground="red")
        finally:
            self._test_btn.config(state="normal")


    def _build_tab_paths(self, nb):
        f = self._tab_frame(nb, "Paths")

        def browse_series3():
            d = filedialog.askdirectory(
                title="Select Blastware Event Folder",
                initialdir=self.var_series3_path.get() or "C:\\",
            )
            if d:
                self.var_series3_path.set(d.replace("/", "\\"))

        _add_label_browse_entry(f, 0, "Series3 Path", self.var_series3_path, browse_series3)
        _add_label_spinbox(f, 1, "Max Event Age (days)", self.var_max_event_age_days, 1, 3650)

        def browse_log():
            p = filedialog.asksaveasfilename(
                title="Select Log File",
                defaultextension=".log",
                filetypes=[("Log files", "*.log"), ("Text files", "*.txt"), ("All files", "*.*")],
                initialfile=os.path.basename(self.var_log_file.get() or "series3_watcher.log"),
                initialdir=os.path.dirname(self.var_log_file.get() or "C:\\"),
            )
            if p:
                self.var_log_file.set(p.replace("/", "\\"))

        _add_label_browse_entry(f, 2, "Log File", self.var_log_file, browse_log)


    def _build_tab_scanning(self, nb):
        f = self._tab_frame(nb, "Scanning")
        _add_label_spinbox(f, 0, "Scan Interval (sec)", self.var_scan_interval, 10, 3600)
        _add_label_spinbox(f, 1, "OK Hours", self.var_ok_hours, 1, 168)
        _add_label_spinbox(f, 2, "Missing Hours", self.var_missing_hours, 1, 168)
        _add_label_spinbox(f, 3, "MLG Header Bytes", self.var_mlg_header_bytes, 256, 65536)

    def _build_tab_logging(self, nb):
        f = self._tab_frame(nb, "Logging")
        _add_label_check(f, 0, "Enable Logging", self.var_enable_logging)
        _add_label_spinbox(f, 1, "Log Retention (days)", self.var_log_retention_days, 1, 365)


    # --- Validation helpers ---

    def _get_int_var(self, var, name, min_val, max_val, default):
        """Parse a StringVar as int; return the value if in range, else show an error and return None."""
        raw = var.get().strip()
        try:
            val = int(raw)
        except ValueError:
            messagebox.showerror(
                "Validation Error",
                "{} must be an integer (got: {!r}).".format(name, raw),
            )
            return None
        if val < min_val or val > max_val:
            messagebox.showerror(
                "Validation Error",
                "{} must be between {} and {} (got {}).".format(name, min_val, max_val, val),
            )
            return None
        return val


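The range check above can be exercised without Tk; this standalone sketch mirrors the logic but returns None instead of showing a messagebox:

```python
def get_int(raw, min_val, max_val):
    # Mirror of _get_int_var's validation, minus the messagebox error dialogs.
    try:
        val = int(raw.strip())
    except ValueError:
        return None
    if val < min_val or val > max_val:
        return None
    return val

print(get_int(" 300 ", 30, 3600))  # 300
print(get_int("abc", 30, 3600))    # None (not an integer)
print(get_int("10", 30, 3600))     # None (below minimum)
```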
    # --- Save / Cancel ---

    def _on_save(self):
        # Validate numeric fields before writing
        checks = [
            (self.var_api_interval, "API Interval", 30, 3600, 300),
            (self.var_max_event_age_days, "Max Event Age Days", 1, 3650, 365),
            (self.var_scan_interval, "Scan Interval", 10, 3600, 300),
            (self.var_ok_hours, "OK Hours", 1, 168, 12),
            (self.var_missing_hours, "Missing Hours", 1, 168, 24),
            (self.var_mlg_header_bytes, "MLG Header Bytes", 256, 65536, 2048),
            (self.var_log_retention_days, "Log Retention Days", 1, 365, 30),
        ]
        int_values = {}
        for var, name, mn, mx, dflt in checks:
            result = self._get_int_var(var, name, mn, mx, dflt)
            if result is None:
                return  # validation failed; keep dialog open
            int_values[name] = result

        # Resolve source_id placeholder
        source_id = self.var_source_id.get().strip()
        # Strip placeholder hint if user left it
        if source_id.startswith("Defaults to hostname"):
            source_id = ""

        # Resolve api_url — append the fixed endpoint, strip placeholder
        api_url = self.var_api_url.get().strip()
        if api_url == "http://192.168.x.x:8000" or not api_url:
            api_url = ""
        else:
            api_url = api_url.rstrip("/") + "/api/series3/heartbeat"

        values = {
            "API_ENABLED": "true" if self.var_api_enabled.get() else "false",
            "API_URL": api_url,
            "API_INTERVAL_SECONDS": str(int_values["API Interval"]),
            "SOURCE_ID": source_id,
            "SOURCE_TYPE": self.var_source_type.get().strip() or "series3_watcher",
            "SERIES3_PATH": self.var_series3_path.get().strip(),
            "MAX_EVENT_AGE_DAYS": str(int_values["Max Event Age Days"]),
            "SCAN_INTERVAL_SECONDS": str(int_values["Scan Interval"]),
            "OK_HOURS": str(int_values["OK Hours"]),
            "MISSING_HOURS": str(int_values["Missing Hours"]),
            "MLG_HEADER_BYTES": str(int_values["MLG Header Bytes"]),
            "ENABLE_LOGGING": "true" if self.var_enable_logging.get() else "false",
            "LOG_FILE": self.var_log_file.get().strip(),
            "LOG_RETENTION_DAYS": str(int_values["Log Retention Days"]),
        }

        try:
            _save_config(self.config_path, values)
        except Exception as e:
            messagebox.showerror("Save Error", "Could not write config.ini:\n{}".format(e))
            return

        self.saved = True
        self.root.destroy()

    def _on_cancel(self):
        self.saved = False
        self.root.destroy()


# --------------- Public API ---------------

def show_dialog(config_path, wizard=False):
    """
    Open the settings dialog.

    Parameters
    ----------
    config_path : str
        Absolute path to config.ini (read if it exists, written on Save).
    wizard : bool
        If True, shows first-run welcome message and "Save & Start" button.

    Returns
    -------
    bool
        True if the user saved, False if they cancelled.
    """
    root = tk.Tk()
    root.withdraw()  # hide blank root window

    # Create a Toplevel that acts as the dialog window
    top = tk.Toplevel(root)
    top.deiconify()

    dlg = SettingsDialog(top, config_path, wizard=wizard)

    # Center after build
    top.update_idletasks()
    w = top.winfo_reqwidth()
    h = top.winfo_reqheight()
    sw = top.winfo_screenwidth()
    sh = top.winfo_screenheight()
    x = (sw - w) // 2
    y = (sh - h) // 2
    top.geometry("{}x{}+{}+{}".format(w, h, x, y))

    root.wait_window(top)
    root.destroy()

    return dlg.saved