Compare commits: `4448c74f6c...main` (27 commits)
Commits (SHA1): `6a0422a6fc`, `1078576023`, `8074bf0fee`, `de02f9cccf`, `da446cb2e3`, `51d1aa917a`, `b8032e0578`, `3f142ce1c0`, `88adcbcb81`, `8e985154a7`, `f8f590b19b`, `58a35a3afd`, `45f4fb5a68`, `99d66453fe`, `41606d2f31`, `8d06492dbc`, `6be434e65f`, `6d99f86502`, `5eb5499034`, `0db3780e65`, `d7a0e1b501`, `154a11d057`, `faa869d03b`, `fa9873cf4a`, `a684d3e642`, `22d4023ea0`, `a5a21a6c32`
**.gitignore** (vendored) — 28 lines changed
```diff
@@ -1 +1,27 @@
-/bridge/captures
+/bridges/captures/
+
+/manuals/
+
+# Python bytecode
+__pycache__/
+*.py[cod]
+
+# Virtual environments
+.venv/
+venv/
+env/
+
+# Editor / OS
+.vscode/
+*.swp
+.DS_Store
+Thumbs.db
+
+# Analyzer outputs
+*.report
+claude_export_*.md
+
+# Frame database
+*.db
+*.db-wal
+*.db-shm
```
**CHANGELOG.md** — new file, 84 lines
# Changelog

All notable changes to seismo-relay are documented here.

---

## v0.5.0 — 2026-03-31

### Added

- **Console tab in `seismo_lab.py`** — direct device connection without the bridge subprocess.
  - Serial and TCP transport selectable via radio buttons.
  - Four one-click commands: POLL, Serial #, Full Config, Event Index.
  - Colour-coded scrolling output: TX (blue), RX raw hex (teal), parsed/decoded (green), errors (red).
  - Save Log and Send to Analyzer buttons; logs auto-saved to `bridges/captures/console_<ts>.log`.
  - Queue/`after(100)` pattern — no UI blocking or performance impact.
- **`minimateplus` package** — clean Python client library for the MiniMate Plus S3 protocol.
  - `SerialTransport` and `TcpTransport` (for Sierra Wireless RV50/RV55 cellular modems).
  - `MiniMateProtocol` — DLE frame parser/builder, two-step paged reads, checksum validation.
  - `MiniMateClient` — high-level client: `connect()`, `get_serial()`, `get_config()`, `get_events()`.
- **TCP/cellular transport** (`TcpTransport`) — connects to field units via Sierra Wireless RV50/RV55 modems over cellular.
  - `read_until_idle(idle_gap=1.5s)` to handle the modem's data-forwarding buffer delay.
  - Confirmed working end-to-end: TCP → RV50/RV55 → RS-232 → MiniMate Plus.
- **`bridges/tcp_serial_bridge.py`** — local TCP-to-serial bridge for bench testing `TcpTransport` without a cellular modem.
- **SFM REST server** (`sfm/server.py`) — FastAPI server with device info, event list, and event record endpoints over both serial and TCP.
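The idle-gap read mentioned above can be sketched in a few lines. This is a minimal stand-in for the idea, not the library's actual implementation — the `read_chunk` callable and parameter names here are illustrative:

```python
import time

def read_until_idle(read_chunk, idle_gap=1.5, max_total=30.0):
    """Accumulate bytes until the line has been quiet for `idle_gap` seconds.

    Cellular modems like the RV50/RV55 forward serial data in buffered
    bursts, so "no data for a while" is the only reliable end-of-response
    signal. `read_chunk` is any callable returning currently-available
    bytes (possibly b"").
    """
    buf = bytearray()
    start = time.monotonic()
    last_rx = start
    while time.monotonic() - start < max_total:
        data = read_chunk()
        if data:
            buf += data
            last_rx = time.monotonic()
        elif buf and time.monotonic() - last_rx >= idle_gap:
            break  # had data, then the line went quiet — response complete
        else:
            time.sleep(0.01)
    return bytes(buf)

# Simulated transport: two bursts, then silence.
bursts = [b"\x41\x10\x02", b"\x00\x10\xf7"]
result = read_until_idle(lambda: bursts.pop(0) if bursts else b"",
                         idle_gap=0.05, max_total=2.0)
print(result.hex())  # → 4110020010f7
```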
### Fixed

- `protocol.py` `startup()` used a hardcoded `POLL_RECV_TIMEOUT = 10.0` constant, ignoring the configurable `self._recv_timeout`. Fixed to use `self._recv_timeout` throughout.
- `sfm/server.py` now retries once on `ProtocolError` for TCP connections, to handle cold-boot timing on the first connect.

### Protocol / Documentation

- **Sierra Wireless RV50/RV55 modem config** — confirmed required ACEmanager settings: Quiet Mode = Enable, Data Forwarding Timeout = 1, TCP Connect Response Delay = 0. With Quiet Mode disabled, the modem injects `RING\r\nCONNECT\r\n` onto the serial line, breaking the S3 handshake.
- **Calibration year** confirmed at SUB FE (Full Config) destuffed payload offset 0x56–0x57 (uint16 BE). `0x07E7` = 2023, `0x07E9` = 2025.
- **`"Operating System"` boot string** — 16-byte UART boot message captured on cold start before the unit enters DLE-framed mode. The parser handles it correctly by scanning for DLE+STX.
- The RV50/RV55 sends `RING`/`CONNECT` over TCP to the calling client even with Quiet Mode enabled — this is normal behaviour; the parser discards it.
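The calibration-year finding translates directly to a `struct` read. A sketch under the stated layout only — it assumes you already hold the destuffed SUB FE payload as bytes:

```python
import struct

def calibration_year(full_config: bytes) -> int:
    """Calibration year: uint16 big-endian at destuffed-payload offset 0x56."""
    (year,) = struct.unpack_from(">H", full_config, 0x56)
    return year

# Synthetic payload with 0x07E7 (= 2023) at offset 0x56-0x57.
payload = bytearray(0xA6)              # 166-byte config block
payload[0x56:0x58] = b"\x07\xe7"
print(calibration_year(bytes(payload)))  # → 2023
```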
---

## v0.4.0 — 2026-03-12

### Added

- **`seismo_lab.py`** — combined Bridge + Analyzer GUI. Single window with two tabs; starting the bridge auto-wires live mode in the Analyzer.
- **`frame_db.py`** — SQLite frame database. Captures accumulate over time; the Query DB tab searches across all sessions.
- **`bridges/s3-bridge/proxy.py`** — bridge proxy module.
- Large BW→S3 write frame checksum algorithm confirmed and implemented (`SUM8` of payload `[2:-1]` skipping `0x10` bytes, plus constant `0x10`, mod 256).
- SUB `A4` identified as a composite container frame with embedded inner frames; `_extract_a4_inner_frames()` and `_diff_a4_payloads()` reduce diff noise from 2300 to 17 meaningful entries.
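The checksum rule stated above maps to a one-liner. A sketch built only from that description (the function name is illustrative):

```python
def bw_write_checksum(payload: bytes) -> int:
    """Large BW→S3 write frame checksum: 8-bit sum of payload[2:-1],
    skipping 0x10 (DLE) bytes, plus the constant 0x10, modulo 256."""
    total = sum(b for b in payload[2:-1] if b != 0x10)
    return (total + 0x10) % 256

# Example: bytes at [2:-1] are 01 02 10 03; the 0x10 is skipped.
frame = bytes([0x10, 0x00, 0x01, 0x02, 0x10, 0x03, 0xFF])
print(hex(bw_write_checksum(frame)))  # → 0x16
```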
### Fixed

- BAD CHK false positives on BW POLL frames — the BW frame terminator `03 41` was being included in the de-stuffed payload. Fixed to strip it correctly.
- Aux Trigger read location confirmed at SUB FE offset `0x0109`.

---

## v0.3.0 — 2026-03-09

### Added

- Record time confirmed at SUB E5 page 2 offset `+0x28` as float32 BE.
- Trigger Sample Width confirmed at BW→S3 write frame SUB `0x82`, destuffed payload offset `[22]`.
- Mode-gating documented: several settings only appear on the wire when the appropriate mode is active.

### Fixed

- `0x082A` mystery resolved — it is the fixed E5 payload length (2090 bytes), not a record-time field.
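The record-time finding is another fixed-offset decode. A sketch under the stated layout (function name and buffer are illustrative):

```python
import struct

def record_time(e5_page2: bytes) -> float:
    """Record time: float32 big-endian at offset +0x28 of the SUB E5 page-2 payload."""
    (seconds,) = struct.unpack_from(">f", e5_page2, 0x28)
    return seconds

# Synthetic page-2 payload with 2.0 s packed at +0x28.
page2 = bytearray(0x40)
struct.pack_into(">f", page2, 0x28, 2.0)
print(record_time(bytes(page2)))  # → 2.0
```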
---

## v0.2.0 — 2026-03-01

### Added

- Channel config float layout fully confirmed: trigger level, alarm level, and unit string per channel (IEEE 754 BE floats).
- Blastware `.set` file format decoded — a little-endian binary struct mirroring the wire payload.
- Operator manual (716U0101 Rev 15) added as a cross-reference source.

---

## v0.1.0 — 2026-02-26

### Added

- Initial `s3_bridge.py` serial bridge — a transparent RS-232 tap between Blastware and MiniMate Plus.
- `s3_parser.py` — deterministic DLE state-machine frame extractor.
- `s3_analyzer.py` — session parser, frame differ, Claude export.
- `gui_bridge.py` and `gui_analyzer.py` — Tkinter GUIs.
- DLE framing confirmed: `DLE+STX` / `DLE+ETX`, `0x41` = ACK (not STX), DLE stuffing rule.
- Response SUB rule confirmed: `response_SUB = 0xFF - request_SUB`.
- Year `0x07CB` = 1995 confirmed as the MiniMate factory RTC default.
- Full write command family documented (SUBs `68`–`83`).
**README.md** — new file, 251 lines

# seismo-relay `v0.5.0`

A ground-up replacement for **Blastware** — Instantel's aging Windows-only
software for managing MiniMate Plus seismographs.

Built in Python. Runs on Windows. Connects to instruments over direct RS-232
or a cellular modem (Sierra Wireless RV50 / RV55).

> **Status:** Active development. Core read pipeline working (device info,
> config, event index). Event download and write commands in progress.
> See [CHANGELOG.md](CHANGELOG.md) for version history.

---

## What's in here
```
seismo-relay/
├── seismo_lab.py            ← Main GUI (Bridge + Analyzer + Console tabs)
│
├── minimateplus/            ← MiniMate Plus client library
│   ├── transport.py         ← SerialTransport and TcpTransport
│   ├── protocol.py          ← DLE frame layer (read/write/parse)
│   ├── client.py            ← High-level client (connect, get_config, etc.)
│   ├── framing.py           ← Frame builder/parser primitives
│   └── models.py            ← DeviceInfo, EventRecord, etc.
│
├── sfm/                     ← SFM REST API server (FastAPI)
│   └── server.py            ← /device/info, /device/events, /device/event
│
├── bridges/
│   ├── s3-bridge/
│   │   └── s3_bridge.py     ← RS-232 serial bridge (capture tool)
│   ├── tcp_serial_bridge.py ← Local TCP↔serial bridge (bench testing)
│   ├── gui_bridge.py        ← Standalone bridge GUI (legacy)
│   └── raw_capture.py       ← Simple raw capture tool
│
├── parsers/
│   ├── s3_parser.py         ← DLE frame extractor
│   ├── s3_analyzer.py       ← Session parser, differ, Claude export
│   ├── gui_analyzer.py      ← Standalone analyzer GUI (legacy)
│   └── frame_db.py          ← SQLite frame database
│
└── docs/
    └── instantel_protocol_reference.md ← Reverse-engineered protocol spec
```

---

## Quick start

### Seismo Lab (main GUI)

The all-in-one tool. Three tabs: **Bridge**, **Analyzer**, **Console**.

```
python seismo_lab.py
```

### SFM REST server

Exposes MiniMate Plus commands as a REST API for integration with other systems.

```
cd sfm
uvicorn server:app --reload
```

**Endpoints:**

| Method | URL | Description |
|--------|-----|-------------|
| `GET` | `/device/info?port=COM5` | Device info via serial |
| `GET` | `/device/info?host=1.2.3.4&tcp_port=9034` | Device info via cellular modem |
| `GET` | `/device/events?port=COM5` | Event index |
| `GET` | `/device/event?port=COM5&index=0` | Single event record |
---

## Seismo Lab tabs

### Bridge tab

Captures live RS-232 traffic between Blastware and the seismograph. Sits in
the middle as a transparent pass-through while logging everything to disk.

```
Blastware → COM4 (virtual) ↔ s3_bridge ↔ COM5 (physical) → MiniMate Plus
```

Set your COM ports and log directory, then hit **Start Bridge**. Use
**Add Mark** to annotate the capture at specific moments (e.g. "changed
trigger level"). When the bridge starts, the Analyzer tab automatically wires
up to the live files and starts updating in real time.

### Analyzer tab

Parses raw captures into DLE-framed protocol sessions, diffs consecutive
sessions to show exactly which bytes changed, and lets you query across all
historical captures via the built-in SQLite database.

- **Inventory** — all frames in a session; click to drill in
- **Hex Dump** — full payload hex dump with changed-byte annotations
- **Diff** — byte-level before/after diff between sessions
- **Full Report** — plain-text session report
- **Query DB** — search across all captures by SUB, direction, or byte value

Use **Export for Claude** to generate a self-contained `.md` report for
AI-assisted field mapping.

### Console tab

Direct connection to a MiniMate Plus — no bridge, no Blastware. Useful for
diagnosing field units over cellular without a full capture session.

**Connection:** choose Serial (COM port + baud) or TCP (IP + port for a
cellular modem).

**Commands:**

| Button | What it does |
|--------|-------------|
| POLL | Startup handshake — confirms the unit is alive and identifies the model |
| Serial # | Reads the unit serial number |
| Full Config | Reads the full 166-byte config block (firmware version, channel scales, etc.) |
| Event Index | Reads the stored event list |

Output is colour-coded: TX in blue, raw RX bytes in teal, decoded fields in
green, errors in red. **Save Log** writes a timestamped `.log` file to
`bridges/captures/`. **Send to Analyzer** injects the captured bytes into the
Analyzer tab for deeper inspection.
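The Console's non-blocking output follows the standard Tk queue-polling pattern: a worker thread posts `(tag, text)` tuples, and the UI thread drains them on a 100 ms `after()` timer. A GUI-free sketch of the drain step (names here are illustrative, not the actual implementation):

```python
import queue

def drain_console_queue(q: queue.Queue, emit) -> int:
    """Drain all pending (tag, text) messages without blocking.

    In the GUI this runs inside root.after(100, ...), with `emit` inserting
    into the Text widget using a colour tag (tx / rx / decoded / error).
    """
    handled = 0
    while True:
        try:
            tag, text = q.get_nowait()
        except queue.Empty:
            return handled
        emit(tag, text)
        handled += 1

q = queue.Queue()
q.put(("tx", "POLL"))
q.put(("rx", "41 10 02 ..."))
lines = []
print(drain_console_queue(q, lambda tag, text: lines.append(f"[{tag}] {text}")))  # → 2
```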
---

## Connecting over cellular (RV50 / RV55 modems)

Field units connect via Sierra Wireless RV50 or RV55 cellular modems. Use
TCP mode in the Console or SFM:

```
# Console tab
Transport: TCP
Host:      <modem public IP>
Port:      9034   ← Device Port in ACEmanager (call-up mode)
```

```python
# In code
from minimateplus.transport import TcpTransport
from minimateplus.client import MiniMateClient

client = MiniMateClient(transport=TcpTransport("1.2.3.4", 9034))
info = client.connect()
```

### Required ACEmanager settings (Serial tab)

These must match exactly — a single wrong setting causes the unit to beep
on connect but never respond:

| Setting | Value | Why |
|---------|-------|-----|
| Configure Serial Port | `38400,8N1` | Must match the MiniMate baud rate |
| Flow Control | `None` | Hardware flow control blocks unit TX if pins are unconnected |
| **Quiet Mode** | **Enable** | **Critical.** Disabled → modem injects `RING`/`CONNECT` onto the serial line, corrupting the S3 handshake |
| Data Forwarding Timeout | `1` (= 0.1 s) | Lower latency; `5` works but is sluggish |
| TCP Connect Response Delay | `0` | Non-zero silently drops the first POLL frame |
| TCP Idle Timeout | `2` (minutes) | Prevents premature disconnect |
| DB9 Serial Echo | `Disable` | Echo corrupts the data stream |
---

## minimateplus library

```python
from minimateplus import MiniMateClient
from minimateplus.transport import SerialTransport, TcpTransport

# Serial
client = MiniMateClient(port="COM5")

# TCP (cellular modem)
client = MiniMateClient(transport=TcpTransport("1.2.3.4", 9034), timeout=30.0)

with client:
    info = client.connect()        # DeviceInfo — model, serial, firmware
    serial = client.get_serial()   # Serial number string
    config = client.get_config()   # Full config block (bytes)
    events = client.get_events()   # Event index
```
---

## Protocol quick-reference

| Term | Value | Meaning |
|------|-------|---------|
| DLE | `0x10` | Data Link Escape |
| STX | `0x02` | Start of frame |
| ETX | `0x03` | End of frame |
| ACK | `0x41` (`'A'`) | Frame-start marker sent before every frame |
| DLE stuffing | `10 10` on wire | Literal `0x10` in payload |

**S3-side frame** (seismograph → Blastware): `ACK DLE+STX [payload] CHK DLE+ETX`

**De-stuffed payload header:**

```
[0]  CMD      0x10 = BW request, 0x00 = S3 response
[1]  ?        unknown (0x00 BW / 0x10 S3)
[2]  SUB      Command/response identifier ← the key field
[3]  PAGE_HI  Page address high byte
[4]  PAGE_LO  Page address low byte
[5+] DATA     Payload content
```

**Response SUB rule:** `response_SUB = 0xFF - request_SUB`
Example: request SUB `0x08` (Event Index) → response SUB `0xF7`

Full protocol documentation: [`docs/instantel_protocol_reference.md`](docs/instantel_protocol_reference.md)
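The rules in this quick-reference can be exercised with a few helper functions. This is a sketch built only from the documented DLE-stuffing rule, header offsets, and response-SUB complement — not the library's actual parser:

```python
DLE, STX, ETX, ACK = 0x10, 0x02, 0x03, 0x41

def destuff(raw: bytes) -> bytes:
    """Collapse DLE stuffing: a literal 0x10 travels as `10 10` on the wire."""
    out, i = bytearray(), 0
    while i < len(raw):
        out.append(raw[i])
        # After a DLE, a second DLE is the stuffing byte — drop it.
        i += 2 if raw[i] == DLE and i + 1 < len(raw) and raw[i + 1] == DLE else 1
    return bytes(out)

def parse_header(payload: bytes) -> dict:
    """Split a de-stuffed payload into the header fields listed above."""
    return {
        "cmd": payload[0],                      # 0x10 = BW request, 0x00 = S3 response
        "sub": payload[2],                      # the key field
        "page": (payload[3] << 8) | payload[4],
        "data": payload[5:],
    }

def expected_response_sub(request_sub: int) -> int:
    return 0xFF - request_sub

print(hex(expected_response_sub(0x08)))    # → 0xf7
print(destuff(b"\x01\x10\x10\x02").hex())  # → 011002
h = parse_header(bytes([0x10, 0x00, 0x08, 0x01, 0x02, 0xAA]))
print(h["sub"], hex(h["page"]))            # → 8 0x102
```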
---

## Requirements

```
pip install pyserial fastapi uvicorn
```

Python 3.10+. Tkinter is included with the standard Python installer on
Windows (make sure "tcl/tk and IDLE" is checked during install).

---

## Virtual COM ports (bridge capture)

The bridge needs two COM ports on the same PC — one that Blastware connects
to, and one wired to the seismograph. Use a virtual COM port pair
(**com0com** or **VSPD**) to give Blastware a port to talk to.

```
Blastware → COM4 (virtual) ↔ s3_bridge.py ↔ COM5 (physical) → MiniMate Plus
```

---

## Roadmap

- [ ] Event download — pull waveform records from the unit (SUBs `1E` → `0A` → `0C` → `5A`)
- [ ] Write commands — push config changes to the unit (compliance setup, channel config, trigger settings)
- [ ] ACH inbound server — accept call-home connections from field units
- [ ] Modem manager — push standard configs to the RV50/RV55 fleet via the Sierra Wireless API
- [ ] Full Blastware parity — complete read/write/download cycle without Blastware
```diff
@@ -13,6 +13,7 @@ Requires only the stdlib (Tkinter is bundled on Windows/Python).
 from __future__ import annotations
 
+import datetime
 import os
 import queue
 import subprocess
@@ -125,11 +126,22 @@ class BridgeGUI(tk.Tk):
         args = [sys.executable, BRIDGE_PATH, "--bw", bw, "--s3", s3, "--baud", baud, "--logdir", logdir]
+
+        ts = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
+
         raw_bw = self.raw_bw_var.get().strip()
         raw_s3 = self.raw_s3_var.get().strip()
+
+        # If the user left the default generic name, replace it with a timestamped
+        # one so each session gets its own file.
         if raw_bw:
+            if os.path.basename(raw_bw) in ("raw_bw.bin", "raw_bw"):
+                raw_bw = os.path.join(os.path.dirname(raw_bw) or logdir, f"raw_bw_{ts}.bin")
+                self.raw_bw_var.set(raw_bw)
             args += ["--raw-bw", raw_bw]
         if raw_s3:
+            if os.path.basename(raw_s3) in ("raw_s3.bin", "raw_s3"):
+                raw_s3 = os.path.join(os.path.dirname(raw_s3) or logdir, f"raw_s3_{ts}.bin")
+                self.raw_s3_var.set(raw_s3)
             args += ["--raw-s3", raw_s3]
         try:
```
```diff
@@ -35,7 +35,7 @@ from typing import Optional
 import serial
 
-VERSION = "v0.5.0"
+VERSION = "v0.5.1"
 
 DLE = 0x10
 STX = 0x02
@@ -345,14 +345,25 @@ def main() -> int:
     ts = _dt.datetime.now().strftime("%Y%m%d_%H%M%S")
     log_path = os.path.join(args.logdir, f"s3_session_{ts}.log")
     bin_path = os.path.join(args.logdir, f"s3_session_{ts}.bin")
-    logger = SessionLogger(log_path, bin_path, raw_bw_path=args.raw_bw, raw_s3_path=args.raw_s3)
+
+    # If raw tap flags were passed without a path (bare --raw-bw / --raw-s3),
+    # or if the sentinel value "auto" is used, generate a timestamped name.
+    # If a specific path was provided, use it as-is (caller's responsibility).
+    raw_bw_path = args.raw_bw
+    raw_s3_path = args.raw_s3
+    if raw_bw_path in (None, "", "auto"):
+        raw_bw_path = os.path.join(args.logdir, f"raw_bw_{ts}.bin") if args.raw_bw is not None else None
+    if raw_s3_path in (None, "", "auto"):
+        raw_s3_path = os.path.join(args.logdir, f"raw_s3_{ts}.bin") if args.raw_s3 is not None else None
+
+    logger = SessionLogger(log_path, bin_path, raw_bw_path=raw_bw_path, raw_s3_path=raw_s3_path)
 
     print(f"[LOG] Writing hex log to {log_path}")
     print(f"[LOG] Writing binary log to {bin_path}")
-    if args.raw_bw:
-        print(f"[LOG] Raw tap BW->S3 -> {args.raw_bw}")
-    if args.raw_s3:
-        print(f"[LOG] Raw tap S3->BW -> {args.raw_s3}")
+    if raw_bw_path:
+        print(f"[LOG] Raw tap BW->S3 -> {raw_bw_path}")
+    if raw_s3_path:
+        print(f"[LOG] Raw tap S3->BW -> {raw_s3_path}")
 
     logger.log_info(f"s3_bridge {VERSION} start")
     logger.log_info(f"BW={args.bw} S3={args.s3} baud={args.baud}")
```
**bridges/tcp_serial_bridge.py** — new file, 205 lines
```python
"""
tcp_serial_bridge.py — Local TCP-to-serial bridge for bench testing TcpTransport.

Listens on a TCP port and, when a client connects, opens a serial port and
bridges bytes bidirectionally. This lets you test the SFM server's TCP
endpoint (?host=127.0.0.1&tcp_port=12345) against a locally-attached MiniMate
Plus without needing a field modem.

The bridge simulates an RV55 cellular modem in transparent TCP passthrough mode:
- No handshake bytes on connect
- Raw bytes forwarded in both directions
- One connection at a time (a new connection closes any existing serial session)

Usage:
    python bridges/tcp_serial_bridge.py --serial COM5 --tcp-port 12345

Then in another window:
    python -m uvicorn sfm.server:app --port 8200
    curl "http://localhost:8200/device/info?host=127.0.0.1&tcp_port=12345"

Or just hit http://localhost:8200/device/info?host=127.0.0.1&tcp_port=12345
in a browser.

Requirements:
    pip install pyserial
"""

from __future__ import annotations

import argparse
import logging
import socket
import sys
import threading
import time

try:
    import serial  # type: ignore
except ImportError:
    print("pyserial required: pip install pyserial", file=sys.stderr)
    sys.exit(1)

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)-7s %(message)s",
    datefmt="%H:%M:%S",
)
log = logging.getLogger("tcp_serial_bridge")


# ── Constants ─────────────────────────────────────────────────────────────────

DEFAULT_BAUD = 38_400
DEFAULT_TCP_PORT = 12345
CHUNK = 256            # bytes per read call
SERIAL_TIMEOUT = 0.02  # serial read timeout (s) — non-blocking in practice
TCP_TIMEOUT = 0.02     # socket recv timeout (s)
BOOT_DELAY = 8.0       # seconds to wait after opening the serial port before
                       # forwarding data — unit cold boot (beep + OS init)
                       # takes 5-10 s from first RS-232 line assertion.
                       # Set to 0 if the unit was already running before connect.


# ── Bridge session ────────────────────────────────────────────────────────────

def _pipe_tcp_to_serial(sock: socket.socket, ser: serial.Serial, stop: threading.Event) -> None:
    """Forward bytes from TCP socket → serial port."""
    sock.settimeout(TCP_TIMEOUT)
    while not stop.is_set():
        try:
            data = sock.recv(CHUNK)
            if not data:
                log.info("TCP peer closed connection")
                stop.set()
                break
            log.debug("TCP→SER %d bytes: %s", len(data), data.hex())
            ser.write(data)
        except socket.timeout:
            pass
        except OSError as exc:
            if not stop.is_set():
                log.warning("TCP read error: %s", exc)
            stop.set()
            break


def _pipe_serial_to_tcp(sock: socket.socket, ser: serial.Serial, stop: threading.Event) -> None:
    """Forward bytes from serial port → TCP socket."""
    while not stop.is_set():
        try:
            data = ser.read(CHUNK)
            if data:
                log.debug("SER→TCP %d bytes: %s", len(data), data.hex())
                try:
                    sock.sendall(data)
                except OSError as exc:
                    if not stop.is_set():
                        log.warning("TCP send error: %s", exc)
                    stop.set()
                    break
        except serial.SerialException as exc:
            if not stop.is_set():
                log.warning("Serial read error: %s", exc)
            stop.set()
            break


def _run_session(conn: socket.socket, addr: tuple, serial_port: str, baud: int, boot_delay: float) -> None:
    """Handle one TCP client connection."""
    peer = f"{addr[0]}:{addr[1]}"
    log.info("Connection from %s", peer)

    try:
        ser = serial.Serial(
            port=serial_port,
            baudrate=baud,
            bytesize=8,
            parity="N",
            stopbits=1,
            timeout=SERIAL_TIMEOUT,
        )
    except serial.SerialException as exc:
        log.error("Cannot open serial port %s: %s", serial_port, exc)
        conn.close()
        return

    log.info("Opened %s at %d baud — waiting %.1fs for unit boot", serial_port, baud, boot_delay)
    ser.reset_input_buffer()
    ser.reset_output_buffer()

    if boot_delay > 0:
        time.sleep(boot_delay)
        ser.reset_input_buffer()  # discard any boot noise

    log.info("Bridge active: TCP %s ↔ %s", peer, serial_port)

    stop = threading.Event()
    t_tcp_to_ser = threading.Thread(
        target=_pipe_tcp_to_serial, args=(conn, ser, stop), daemon=True
    )
    t_ser_to_tcp = threading.Thread(
        target=_pipe_serial_to_tcp, args=(conn, ser, stop), daemon=True
    )
    t_tcp_to_ser.start()
    t_ser_to_tcp.start()

    stop.wait()  # block until either thread sets the stop flag

    log.info("Session ended, cleaning up")
    try:
        conn.close()
    except OSError:
        pass
    try:
        ser.close()
    except OSError:
        pass

    t_tcp_to_ser.join(timeout=2.0)
    t_ser_to_tcp.join(timeout=2.0)
    log.info("Session with %s closed", peer)


# ── Server ────────────────────────────────────────────────────────────────────

def run_bridge(serial_port: str, baud: int, tcp_port: int, boot_delay: float) -> None:
    """Accept TCP connections forever and bridge each one to the serial port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", tcp_port))
    srv.listen(1)
    log.info(
        "Listening on TCP :%d — will bridge to %s at %d baud",
        tcp_port, serial_port, baud,
    )
    log.info("Send test: curl 'http://localhost:8200/device/info?host=127.0.0.1&tcp_port=%d'", tcp_port)

    try:
        while True:
            conn, addr = srv.accept()
            # Handle one session at a time (synchronous) — matches modem behaviour
            _run_session(conn, addr, serial_port, baud, boot_delay)
    except KeyboardInterrupt:
        log.info("Shutting down")
    finally:
        srv.close()


# ── Entry point ───────────────────────────────────────────────────────────────

if __name__ == "__main__":
    ap = argparse.ArgumentParser(description="TCP-to-serial bridge for bench testing TcpTransport")
    ap.add_argument("--serial", default="COM5", help="Serial port (default: COM5)")
    ap.add_argument("--baud", type=int, default=DEFAULT_BAUD, help="Baud rate (default: 38400)")
    ap.add_argument("--tcp-port", type=int, default=DEFAULT_TCP_PORT, help="TCP listen port (default: 12345)")
    ap.add_argument("--boot-delay", type=float, default=BOOT_DELAY,
                    help="Seconds to wait after opening serial before forwarding (default: 8.0). "
                         "Set to 0 if unit is already powered on.")
    ap.add_argument("--debug", action="store_true", help="Show individual byte transfers")
    args = ap.parse_args()

    if args.debug:
        logging.getLogger().setLevel(logging.DEBUG)

    run_bridge(args.serial, args.baud, args.tcp_port, args.boot_delay)
```
*(One file diff suppressed because it is too large.)*
**minimateplus/__init__.py** — new file, 27 lines
```python
"""
minimateplus — Instantel MiniMate Plus protocol library.

Provides a clean Python API for communicating with MiniMate Plus seismographs
over RS-232 serial (direct cable) or TCP (modem / ACH Auto Call Home).

Typical usage (serial):
    from minimateplus import MiniMateClient

    with MiniMateClient("COM5") as device:
        info = device.connect()
        events = device.get_events()

Typical usage (TCP / modem):
    from minimateplus import MiniMateClient
    from minimateplus.transport import TcpTransport

    with MiniMateClient(transport=TcpTransport("203.0.113.5", 12345)) as device:
        info = device.connect()
"""

from .client import MiniMateClient
from .models import DeviceInfo, Event
from .transport import SerialTransport, TcpTransport

__version__ = "0.1.0"
__all__ = ["MiniMateClient", "DeviceInfo", "Event", "SerialTransport", "TcpTransport"]
```
**minimateplus/client.py** — new file, 483 lines
"""
client.py — MiniMateClient: the top-level public API for the library.

Combines transport, protocol, and model decoding into a single easy-to-use
class. This is the only layer that the SFM server (sfm/server.py) imports
directly.

Design: stateless per-call (connect → do work → disconnect).
The client does not hold an open connection between calls. This keeps the
first implementation simple and matches Blastware's observed behaviour.
Persistent connections can be added later without changing the public API.

Example (serial):
    from minimateplus import MiniMateClient

    with MiniMateClient("COM5") as device:
        info = device.connect()       # POLL handshake + identity read
        events = device.get_events()  # download all events

Example (TCP / modem):
    from minimateplus import MiniMateClient
    from minimateplus.transport import TcpTransport

    transport = TcpTransport("203.0.113.5", port=12345)
    with MiniMateClient(transport=transport) as device:
        info = device.connect()
"""

from __future__ import annotations

import logging
import struct
from typing import Optional

from .framing import S3Frame
from .models import (
    DeviceInfo,
    Event,
    PeakValues,
    ProjectInfo,
    Timestamp,
)
from .protocol import MiniMateProtocol, ProtocolError
from .protocol import (
    SUB_SERIAL_NUMBER,
    SUB_FULL_CONFIG,
    SUB_EVENT_INDEX,
    SUB_EVENT_HEADER,
    SUB_WAVEFORM_RECORD,
)
from .transport import SerialTransport, BaseTransport

log = logging.getLogger(__name__)


# ── MiniMateClient ────────────────────────────────────────────────────────────

class MiniMateClient:
    """
    High-level client for a single MiniMate Plus device.

    Args:
        port: Serial port name (e.g. "COM5", "/dev/ttyUSB0").
            Not required when a pre-built transport is provided.
        baud: Baud rate (default 38400, ignored when transport is provided).
        timeout: Per-request receive timeout in seconds (default 15.0).
        transport: Pre-built transport (SerialTransport or TcpTransport).
            If None, a SerialTransport is constructed from port/baud.
    """

    def __init__(
        self,
        port: str = "",
        baud: int = 38_400,
        timeout: float = 15.0,
        transport: Optional[BaseTransport] = None,
    ) -> None:
        self.port = port
        self.baud = baud
        self.timeout = timeout
        self._transport: Optional[BaseTransport] = transport
        self._proto: Optional[MiniMateProtocol] = None

    # ── Connection lifecycle ──────────────────────────────────────────────────

    def open(self) -> None:
        """Open the transport connection."""
        if self._transport is None:
            self._transport = SerialTransport(self.port, self.baud)
        if not self._transport.is_connected:
            self._transport.connect()
        self._proto = MiniMateProtocol(self._transport, recv_timeout=self.timeout)

    def close(self) -> None:
        """Close the transport connection."""
        if self._transport and self._transport.is_connected:
            self._transport.disconnect()
        self._proto = None

    @property
    def is_open(self) -> bool:
        return bool(self._transport and self._transport.is_connected)

    # ── Context manager ───────────────────────────────────────────────────────

    def __enter__(self) -> "MiniMateClient":
        self.open()
        return self

    def __exit__(self, *_) -> None:
        self.close()

    # ── Public API ────────────────────────────────────────────────────────────

    def connect(self) -> DeviceInfo:
        """
        Perform the startup handshake and read device identity.

        Opens the connection if not already open.

        Reads:
            1. POLL handshake (startup)
            2. SUB 15 — serial number
            3. SUB 01 — full config block (firmware, model strings)

        Returns:
            Populated DeviceInfo.

        Raises:
            ProtocolError: on any communication failure.
        """
        if not self.is_open:
            self.open()

        proto = self._require_proto()

        log.info("connect: POLL startup")
        proto.startup()

        log.info("connect: reading serial number (SUB 15)")
        sn_data = proto.read(SUB_SERIAL_NUMBER)
        device_info = _decode_serial_number(sn_data)

        log.info("connect: reading full config (SUB 01)")
        cfg_data = proto.read(SUB_FULL_CONFIG)
        _decode_full_config_into(cfg_data, device_info)

        log.info("connect: %s", device_info)
        return device_info

    def get_events(self, include_waveforms: bool = True) -> list[Event]:
        """
        Download all stored events from the device.

        For each event in the index:
            1. SUB 1E — event header (timestamp, sample rate)
            2. SUB 0C — full waveform record (peak values, project strings)

        Raw ADC waveform samples (SUB 5A bulk stream) are NOT downloaded
        here — they can be large. Pass include_waveforms=True to also
        download them (not yet implemented, reserved for a future call).

        Args:
            include_waveforms: Reserved. Currently ignored.

        Returns:
            List of Event objects, one per stored record on the device.

        Raises:
            ProtocolError: on any communication failure.
        """
        proto = self._require_proto()

        log.info("get_events: reading event index (SUB 08)")
        index_data = proto.read(SUB_EVENT_INDEX)
        event_count = _decode_event_count(index_data)
        log.info("get_events: %d event(s) found", event_count)

        events: list[Event] = []
        for i in range(event_count):
            log.info("get_events: downloading event %d/%d", i + 1, event_count)
            ev = self._download_event(proto, i)
            if ev:
                events.append(ev)

        return events

    # ── Internal helpers ──────────────────────────────────────────────────────

    def _require_proto(self) -> MiniMateProtocol:
        if self._proto is None:
            raise RuntimeError("MiniMateClient is not connected. Call open() first.")
        return self._proto

    def _download_event(
        self, proto: MiniMateProtocol, index: int
    ) -> Optional[Event]:
        """Download header + waveform record for one event by index."""
        ev = Event(index=index)

        # SUB 1E — event header (timestamp, sample rate).
        #
        # The two-step event-header read passes the event index at payload[5]
        # of the data-request frame (consistent with all other reads).
        # This limits addressing to events 0–255 without a multi-byte scheme;
        # the MiniMate Plus stores up to ~1000 events, so high indices may need
        # a revised approach once we have captured event-download frames.
        try:
            from .framing import build_bw_frame
            from .protocol import _expected_rsp_sub, SUB_EVENT_HEADER

            # Step 1 — probe (offset=0)
            probe_frame = build_bw_frame(SUB_EVENT_HEADER, 0)
            proto._send(probe_frame)
            _probe_rsp = proto._recv_one(expected_sub=_expected_rsp_sub(SUB_EVENT_HEADER))

            # Step 2 — data request (offset = event index, clamped to 0xFF)
            event_offset = min(index, 0xFF)
            data_frame = build_bw_frame(SUB_EVENT_HEADER, event_offset)
            proto._send(data_frame)
            data_rsp = proto._recv_one(expected_sub=_expected_rsp_sub(SUB_EVENT_HEADER))

            _decode_event_header_into(data_rsp.data, ev)
        except ProtocolError as exc:
            log.warning("event %d: header read failed: %s", index, exc)
            return ev  # Return partial event rather than losing it entirely

        # SUB 0C — full waveform record (peak values, project strings).
        try:
            wf_data = proto.read(SUB_WAVEFORM_RECORD)
            _decode_waveform_record_into(wf_data, ev)
        except ProtocolError as exc:
            log.warning("event %d: waveform record read failed: %s", index, exc)

        return ev


# ── Decoder functions ─────────────────────────────────────────────────────────
#
# Pure functions: bytes → model field population.
# Kept here (not in models.py) to isolate protocol knowledge from data shapes.

def _decode_serial_number(data: bytes) -> DeviceInfo:
    """
    Decode SUB EA (SERIAL_NUMBER_RESPONSE) payload into a new DeviceInfo.

    Layout (10 bytes total per §7.2):
        bytes 0–7: serial string, null-terminated, null-padded ("BE18189\\x00")
        byte 8:    unit-specific trailing byte (purpose unknown ❓)
        byte 9:    firmware minor version (0x11 = 17) ✅

    Returns:
        New DeviceInfo with serial, firmware_minor, serial_trail_0 populated.
    """
    if len(data) < 9:
        # Short payload — gracefully degrade
        serial = data.rstrip(b"\x00").decode("ascii", errors="replace")
        return DeviceInfo(serial=serial, firmware_minor=0)

    serial = data[:8].rstrip(b"\x00").decode("ascii", errors="replace")
    trail_0 = data[8] if len(data) > 8 else None
    fw_minor = data[9] if len(data) > 9 else 0

    return DeviceInfo(
        serial=serial,
        firmware_minor=fw_minor,
        serial_trail_0=trail_0,
    )


def _decode_full_config_into(data: bytes, info: DeviceInfo) -> None:
    """
    Decode SUB FE (FULL_CONFIG_RESPONSE) payload into an existing DeviceInfo.

    The FE response arrives as a composite S3 outer frame whose data section
    contains inner DLE-framed sub-frames. Because of this nesting the §7.3
    fixed offsets (0x34, 0x3C, 0x44, 0x6D) are unreliable — they assume a
    clean non-nested payload starting at byte 0.

    Instead we search the whole byte array for known ASCII patterns. The
    strings are long enough to be unique in any reasonable payload.

    Modifies info in-place.
    """
    def _extract(needle: bytes, max_len: int = 32) -> Optional[str]:
        """Return the null-terminated ASCII string that starts with *needle*."""
        pos = data.find(needle)
        if pos < 0:
            return None
        end = pos
        while end < len(data) and data[end] != 0 and (end - pos) < max_len:
            end += 1
        s = data[pos:end].decode("ascii", errors="replace").strip()
        return s or None

    # ── Manufacturer and model are straightforward literal matches ────────────
    info.manufacturer = _extract(b"Instantel")
    info.model = _extract(b"MiniMate Plus")

    # ── Firmware version: "S3xx.xx" — scan for the 'S3' prefix ───────────────
    for i in range(len(data) - 5):
        if data[i] == ord('S') and data[i + 1] == ord('3') and chr(data[i + 2]).isdigit():
            end = i
            while end < len(data) and data[end] not in (0, 0x20) and (end - i) < 12:
                end += 1
            candidate = data[i:end].decode("ascii", errors="replace").strip()
            if "." in candidate and len(candidate) >= 5:
                info.firmware_version = candidate
                break

    # ── DSP version: numeric "xx.xx" — search for known prefixes ─────────────
    for prefix in (b"10.", b"11.", b"12.", b"9.", b"8."):
        pos = data.find(prefix)
        if pos < 0:
            continue
        end = pos
        while end < len(data) and data[end] not in (0, 0x20) and (end - pos) < 8:
            end += 1
        candidate = data[pos:end].decode("ascii", errors="replace").strip()
        # Accept only strings that look like "digits.digits"
        if "." in candidate and all(c in "0123456789." for c in candidate):
            info.dsp_version = candidate
            break


def _decode_event_count(data: bytes) -> int:
    """
    Extract stored event count from SUB F7 (EVENT_INDEX_RESPONSE) payload.

    Layout per §7.4 (offsets from data section start):
        +00: 00 58 09    — total index size or record count ❓
        +03: 00 00 00 01 — possibly stored event count = 1 ❓

    We use bytes +03..+06 interpreted as uint32 BE as the event count.
    This is inferred (🔶) — the exact meaning of the first 3 bytes is unclear.
    """
    if len(data) < 7:
        log.warning("event index payload too short (%d bytes), assuming 0 events", len(data))
        return 0

    # Try the uint32 at +3 first
    count = struct.unpack_from(">I", data, 3)[0]

    # Sanity check: MiniMate Plus manual says max ~1000 events
    if count > 1000:
        log.warning(
            "event count %d looks unreasonably large — clamping to 0", count
        )
        return 0

    return count


def _decode_event_header_into(data: bytes, event: Event) -> None:
    """
    Decode SUB E1 (EVENT_HEADER_RESPONSE) into an existing Event.

    The 6-byte timestamp is at the start of the data payload.
    Sample rate location is not yet confirmed — left as None for now.

    Modifies event in-place.
    """
    if len(data) < 6:
        log.warning("event header payload too short (%d bytes)", len(data))
        return
    try:
        event.timestamp = Timestamp.from_bytes(data[:6])
    except ValueError as exc:
        log.warning("event header timestamp decode failed: %s", exc)


def _decode_waveform_record_into(data: bytes, event: Event) -> None:
    """
    Decode SUB F3 (FULL_WAVEFORM_RECORD) data into an existing Event.

    Peak values are stored as IEEE 754 big-endian floats. Confirmed
    positions per §7.5 (search for the known float bytes in the payload).

    This decoder is intentionally conservative — it searches for the
    canonical 4×float32 pattern rather than relying on a fixed offset,
    since the exact field layout is only partially confirmed.

    Modifies event in-place.
    """
    # Attempt to extract four consecutive IEEE 754 BE floats from the
    # known region of the payload (offsets are 🔶 INFERRED from captured data)
    try:
        peak_values = _extract_peak_floats(data)
        if peak_values:
            event.peak_values = peak_values
    except Exception as exc:
        log.warning("waveform record peak decode failed: %s", exc)

    # Project strings — search for known ASCII labels
    try:
        project_info = _extract_project_strings(data)
        if project_info:
            event.project_info = project_info
    except Exception as exc:
        log.warning("waveform record project strings decode failed: %s", exc)


def _extract_peak_floats(data: bytes) -> Optional[PeakValues]:
    """
    Scan the waveform record payload for four sequential float32 BE values
    corresponding to Tran, Vert, Long, MicL peak values.

    The exact offset is not confirmed (🔶), so we do a heuristic scan:
    look for four consecutive 4-byte groups where each decodes as a
    plausible PPV value (0 < v < 100 in/s or psi).

    Returns PeakValues if a plausible group is found, else None.
    """
    # Require at least 16 bytes for 4 floats
    if len(data) < 16:
        return None

    for start in range(0, len(data) - 15, 4):
        try:
            vals = struct.unpack_from(">4f", data, start)
        except struct.error:
            continue

        # All four values should be non-negative and within plausible PPV range
        if all(0.0 <= v < 100.0 for v in vals):
            tran, vert, long_, micl = vals
            # MicL (psi) is typically much smaller than geo values
            # Simple sanity: at least two non-zero values
            if sum(v > 0 for v in vals) >= 2:
                log.debug(
                    "peak floats at offset %d: T=%.4f V=%.4f L=%.4f M=%.6f",
                    start, tran, vert, long_, micl
                )
                return PeakValues(
                    tran=tran, vert=vert, long=long_, micl=micl
                )
    return None


def _extract_project_strings(data: bytes) -> Optional[ProjectInfo]:
    """
    Search the waveform record payload for known ASCII label strings
    ("Project:", "Client:", "User Name:", "Seis Loc:", "Extended Notes")
    and extract the associated value strings that follow them.

    Layout (per §7.5): each entry is [label ~16 bytes][value ~32 bytes],
    null-padded. We find the label, then read the next non-null chars.
    """
    def _find_string_after(needle: bytes, max_value_len: int = 64) -> Optional[str]:
        pos = data.find(needle)
        if pos < 0:
            return None
        # Skip the label (including null padding) until we find a non-null value
        # The value starts at pos+len(needle), but may have a gap of null bytes
        value_start = pos + len(needle)
        # Skip nulls
        while value_start < len(data) and data[value_start] == 0:
            value_start += 1
        if value_start >= len(data):
            return None
        # Read until null terminator or max_value_len
        end = value_start
        while end < len(data) and data[end] != 0 and (end - value_start) < max_value_len:
            end += 1
        value = data[value_start:end].decode("ascii", errors="replace").strip()
        return value or None

    project = _find_string_after(b"Project:")
    client = _find_string_after(b"Client:")
    operator = _find_string_after(b"User Name:")
    location = _find_string_after(b"Seis Loc:")
    notes = _find_string_after(b"Extended Notes")

    if not any([project, client, operator, location, notes]):
        return None

    return ProjectInfo(
        project=project,
        client=client,
        operator=operator,
        sensor_location=location,
        notes=notes,
    )
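The heuristic scan in `_extract_peak_floats` can be exercised in isolation. A self-contained sketch of the same technique — the payload bytes here are fabricated for illustration, since the real SUB F3 layout is only partially confirmed:

```python
import struct

def scan_peaks(data: bytes):
    """Find four consecutive plausible float32 BE values (stride 4, range-checked)."""
    for start in range(0, len(data) - 15, 4):
        vals = struct.unpack_from(">4f", data, start)
        # Plausible PPV range and at least two non-zero channels
        if all(0.0 <= v < 100.0 for v in vals) and sum(v > 0 for v in vals) >= 2:
            return vals
    return None

# Fabricated payload: one implausible float (-50.0), then Tran/Vert/Long/MicL.
# The -50.0 window fails the range check, so the scan advances to the real group.
payload = struct.pack(">f", -50.0) + struct.pack(">4f", 0.25, 0.5, 0.125, 0.0625)
assert scan_peaks(payload) == (0.25, 0.5, 0.125, 0.0625)
```

The test values are exact powers of two so the float32 round-trip compares equal; arbitrary decimals (e.g. 0.004) would not survive `pack`/`unpack` exactly.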
276
minimateplus/framing.py
Normal file
@@ -0,0 +1,276 @@
"""
framing.py — DLE frame codec for the Instantel MiniMate Plus RS-232 protocol.

Wire format:
    BW→S3 (our requests):   [ACK=0x41] [STX=0x02] [stuffed payload+chk] [ETX=0x03]
    S3→BW (device replies): [DLE=0x10] [STX=0x02] [stuffed payload+chk] [ETX=0x03]

The terminating ETX is bare in both directions (see S3FrameParser below).
The ACK 0x41 byte often precedes S3 frames too — it is silently discarded
by the streaming parser.

De-stuffed payload layout:
    BW→S3 request frame:
        [0]    CMD    0x10 (BW request marker)
        [1]    flags  0x00
        [2]    SUB    command sub-byte
        [3]    0x00   always zero in captured frames
        [4]    0x00   always zero in captured frames
        [5]    OFFSET two-step offset: 0x00 = length-probe, DATA_LEN = data-request
        [6-15] zero padding (total de-stuffed payload = 16 bytes)

    S3→BW response frame:
        [0]  CMD     0x00 (S3 response marker)
        [1]  flags   0x10
        [2]  SUB     response sub-byte (= 0xFF - request SUB)
        [3]  PAGE_HI high byte of page address (always 0x00 in observed frames)
        [4]  PAGE_LO low byte (always 0x00 in observed frames)
        [5+] data    payload data section (composite inner frames for large responses)

DLE stuffing rule: any 0x10 byte in the payload is doubled on the wire (0x10 → 0x10 0x10).
This applies to the checksum byte too.

Confirmed from live captures (s3_parser.py validation + raw_bw.bin / raw_s3.bin).
"""

from __future__ import annotations

from dataclasses import dataclass
from typing import Optional

# ── Protocol byte constants ───────────────────────────────────────────────────

DLE = 0x10  # Data Link Escape
STX = 0x02  # Start of text
ETX = 0x03  # End of text
ACK = 0x41  # Acknowledgement / frame-start marker (BW side)

BW_CMD = 0x10    # CMD byte value in BW→S3 frames
S3_CMD = 0x00    # CMD byte value in S3→BW frames
S3_FLAGS = 0x10  # flags byte value in S3→BW frames

# BW read-command payload size: 5 header bytes + 11 padding bytes = 16 total.
# Confirmed from captured raw_bw.bin: all read-command frames carry exactly 16
# de-stuffed bytes (excluding the appended checksum).
_BW_PAYLOAD_SIZE = 16


# ── DLE stuffing / de-stuffing ────────────────────────────────────────────────

def dle_stuff(data: bytes) -> bytes:
    """Escape literal 0x10 bytes: 0x10 → 0x10 0x10."""
    out = bytearray()
    for b in data:
        if b == DLE:
            out.append(DLE)
        out.append(b)
    return bytes(out)


def dle_unstuff(data: bytes) -> bytes:
    """Remove DLE stuffing: 0x10 0x10 → 0x10."""
    out = bytearray()
    i = 0
    while i < len(data):
        b = data[i]
        if b == DLE and i + 1 < len(data) and data[i + 1] == DLE:
            out.append(DLE)
            i += 2
        else:
            out.append(b)
            i += 1
    return bytes(out)


# ── Checksum ─────────────────────────────────────────────────────────────────

def checksum(payload: bytes) -> int:
    """SUM8: sum of all de-stuffed payload bytes, mod 256."""
    return sum(payload) & 0xFF


# ── BW→S3 frame builder ───────────────────────────────────────────────────────

def build_bw_frame(sub: int, offset: int = 0) -> bytes:
    """
    Build a BW→S3 read-command frame.

    The payload is always 16 de-stuffed bytes:
        [BW_CMD, 0x00, sub, 0x00, 0x00, offset, 0x00 × 10]

    Confirmed from BW capture analysis: payload[3] and payload[4] are always
    0x00 across all observed read commands. The two-step offset lives at
    payload[5]: 0x00 for the length-probe step, DATA_LEN for the data-fetch step.

    Wire output: [ACK] [STX] dle_stuff(payload + checksum) [ETX]

    Args:
        sub: SUB command byte (e.g. 0x01 = FULL_CONFIG_READ)
        offset: Value placed at payload[5].
            Pass 0 for the probe step; pass DATA_LENGTHS[sub] for the data step.

    Returns:
        Complete frame bytes ready to write to the serial port / socket.
    """
    payload = bytes([BW_CMD, 0x00, sub, 0x00, 0x00, offset]) + bytes(_BW_PAYLOAD_SIZE - 6)
    chk = checksum(payload)
    wire = bytes([ACK, STX]) + dle_stuff(payload + bytes([chk])) + bytes([ETX])
    return wire


# ── Pre-built POLL frames ─────────────────────────────────────────────────────
#
# POLL (SUB 0x5B) uses the same two-step pattern as all other reads — the
# hardcoded length 0x30 lives at payload[5], exactly as in build_bw_frame().

POLL_PROBE = build_bw_frame(0x5B, 0x00)  # length-probe POLL (offset = 0)
POLL_DATA = build_bw_frame(0x5B, 0x30)   # data-request POLL (offset = 0x30)


# ── S3 response dataclass ─────────────────────────────────────────────────────

@dataclass
class S3Frame:
    """A fully parsed and de-stuffed S3→BW response frame."""
    sub: int        # response SUB byte (e.g. 0xA4 = POLL_RESPONSE)
    page_hi: int    # PAGE_HI from header (= data length on step-2 length response)
    page_lo: int    # PAGE_LO from header
    data: bytes     # payload data section (payload[5:], checksum already stripped)
    checksum_valid: bool

    @property
    def page_key(self) -> int:
        """Combined 16-bit page address / length: (page_hi << 8) | page_lo."""
        return (self.page_hi << 8) | self.page_lo


# ── Streaming S3 frame parser ─────────────────────────────────────────────────

class S3FrameParser:
    """
    Incremental byte-stream parser for S3→BW response frames.

    Feed incoming bytes with feed(). Complete, valid frames are returned
    immediately and also accumulated in self.frames.

    State machine:
        IDLE         — scanning for DLE (0x10)
        SEEN_DLE     — saw DLE, waiting for STX (0x02) to start a frame
        IN_FRAME     — collecting de-stuffed payload bytes; bare ETX ends frame
        IN_FRAME_DLE — inside frame, saw DLE; DLE continues stuffing;
                       DLE+ETX is treated as literal data (NOT a frame end),
                       which lets inner-frame terminators pass through intact

    Wire format confirmed from captures:
        [DLE=0x10] [STX=0x02] [stuffed payload+chk] [bare ETX=0x03]
    The ETX is NOT preceded by a DLE on the wire. DLE+ETX sequences that
    appear inside the payload are inner-frame terminators and must be
    treated as literal data.

    ACK (0x41) bytes and arbitrary non-DLE bytes in IDLE state are silently
    discarded (covers device boot string "Operating System" and keepalive ACKs).
    """

    _IDLE = 0
    _SEEN_DLE = 1
    _IN_FRAME = 2
    _IN_FRAME_DLE = 3

    def __init__(self) -> None:
        self._state = self._IDLE
        self._body = bytearray()  # accumulates de-stuffed frame bytes
        self.frames: list[S3Frame] = []

    def reset(self) -> None:
        self._state = self._IDLE
        self._body.clear()

    def feed(self, data: bytes) -> list[S3Frame]:
        """
        Process a chunk of incoming bytes.

        Returns a list of S3Frame objects completed during this call.
        All completed frames are also appended to self.frames.
        """
        completed: list[S3Frame] = []
        for b in data:
            frame = self._step(b)
            if frame is not None:
                completed.append(frame)
                self.frames.append(frame)
        return completed

    def _step(self, b: int) -> Optional[S3Frame]:
        """Process one byte. Returns a completed S3Frame or None."""

        if self._state == self._IDLE:
            if b == DLE:
                self._state = self._SEEN_DLE
            # ACK, boot strings, garbage — silently ignored

        elif self._state == self._SEEN_DLE:
            if b == STX:
                self._body.clear()
                self._state = self._IN_FRAME
            else:
                # Stray DLE not followed by STX — back to idle
                self._state = self._IDLE

        elif self._state == self._IN_FRAME:
            if b == DLE:
                self._state = self._IN_FRAME_DLE
            elif b == ETX:
                # Bare ETX = real frame terminator (confirmed from captures)
                frame = self._finalise()
                self._state = self._IDLE
                return frame
            else:
                self._body.append(b)

        elif self._state == self._IN_FRAME_DLE:
            if b == DLE:
                # DLE DLE → literal 0x10 in payload
                self._body.append(DLE)
                self._state = self._IN_FRAME
            elif b == ETX:
                # DLE+ETX inside a frame is an inner-frame terminator, NOT
                # the outer frame end. Treat as literal data and continue.
                self._body.append(DLE)
                self._body.append(ETX)
                self._state = self._IN_FRAME
            else:
                # Unexpected DLE + byte — treat both as literal data and continue
                self._body.append(DLE)
                self._body.append(b)
                self._state = self._IN_FRAME

        return None

    def _finalise(self) -> Optional[S3Frame]:
        """
        Called when a bare ETX terminator is seen. Validates the checksum and
        builds an S3Frame.
        Returns None if the frame is too short or structurally invalid.
        """
        body = bytes(self._body)

        # Minimum valid frame: 5-byte header + at least 1 checksum byte = 6
        if len(body) < 6:
            return None

        raw_payload = body[:-1]  # everything except the trailing checksum byte
        chk_received = body[-1]
        chk_computed = checksum(raw_payload)

        if len(raw_payload) < 5:
            return None

        # Validate CMD byte — we only accept S3→BW response frames here
        if raw_payload[0] != S3_CMD:
            return None

        return S3Frame(
            sub=raw_payload[2],
            page_hi=raw_payload[3],
            page_lo=raw_payload[4],
            data=raw_payload[5:],
            checksum_valid=(chk_received == chk_computed),
        )
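The framing rules above (16-byte read payload, SUM8 checksum, DLE doubling) can be checked in a few lines. A self-contained sketch that reproduces `build_bw_frame` for the POLL data request and asserts the wire-level properties the module documents:

```python
DLE, STX, ETX, ACK = 0x10, 0x02, 0x03, 0x41

def dle_stuff(data: bytes) -> bytes:
    # Double every literal 0x10 byte (the stuffing rule above)
    out = bytearray()
    for b in data:
        if b == DLE:
            out.append(DLE)
        out.append(b)
    return bytes(out)

def build_bw_frame(sub: int, offset: int = 0) -> bytes:
    # [BW_CMD, 0x00, sub, 0x00, 0x00, offset] + 10 padding bytes = 16 total
    payload = bytes([0x10, 0x00, sub, 0x00, 0x00, offset]) + bytes(10)
    chk = sum(payload) & 0xFF  # SUM8 over de-stuffed bytes
    return bytes([ACK, STX]) + dle_stuff(payload + bytes([chk])) + bytes([ETX])

frame = build_bw_frame(0x5B, 0x30)                # POLL data request
assert frame[:2] == bytes([ACK, STX]) and frame[-1] == ETX
assert frame[2:4] == bytes([DLE, DLE])            # CMD byte 0x10 is stuffed on the wire
assert frame[-2] == (0x10 + 0x5B + 0x30) & 0xFF   # checksum byte precedes ETX
```

Note the CMD byte itself is 0x10, so every read-command frame carries at least one stuffed pair; total wire length here is 21 bytes (3 framing bytes + 17 payload/checksum bytes + 1 stuffed duplicate).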
215
minimateplus/models.py
Normal file
@@ -0,0 +1,215 @@
"""
models.py — Plain-Python data models for the MiniMate Plus protocol library.

All models are intentionally simple dataclasses with no protocol logic.
They represent *decoded* device data — the client layer translates raw frame
bytes into these objects, and the SFM API layer serialises them to JSON.

Notes on certainty:
    Fields marked ✅ are confirmed from captured data.
    Fields marked 🔶 are strongly inferred but not formally proven.
    Fields marked ❓ are present in the captured payload but not yet decoded.
See docs/instantel_protocol_reference.md for full derivation details.
"""

from __future__ import annotations

import struct
from dataclasses import dataclass, field
from typing import Optional


# ── Timestamp ─────────────────────────────────────────────────────────────────
@dataclass
class Timestamp:
    """
    6-byte event timestamp decoded from the MiniMate Plus wire format.

    Wire layout: [flag:1] [year:2 BE] [unknown:1] [month:1] [day:1]

    The year 1995 is the device's factory-default RTC date — it appears
    whenever the battery has been disconnected. Treat 1995 as "clock not set".
    """
    raw: bytes         # raw 6-byte sequence for round-tripping
    flag: int          # byte 0 — validity/type flag (usually 0x01) 🔶
    year: int          # bytes 1–2 big-endian uint16 ✅
    unknown_byte: int  # byte 3 — likely hours/minutes ❓
    month: int         # byte 4 ✅
    day: int           # byte 5 ✅

    @classmethod
    def from_bytes(cls, data: bytes) -> "Timestamp":
        """
        Decode a 6-byte timestamp sequence.

        Args:
            data: exactly 6 bytes from the device payload.

        Returns:
            Decoded Timestamp.

        Raises:
            ValueError: if data is not exactly 6 bytes.
        """
        if len(data) != 6:
            raise ValueError(f"Timestamp requires exactly 6 bytes, got {len(data)}")
        flag = data[0]
        year = struct.unpack_from(">H", data, 1)[0]
        unknown_byte = data[3]
        month = data[4]
        day = data[5]
        return cls(
            raw=bytes(data),
            flag=flag,
            year=year,
            unknown_byte=unknown_byte,
            month=month,
            day=day,
        )

    @property
    def clock_set(self) -> bool:
        """False when year == 1995 (factory default / battery-lost state)."""
        return self.year != 1995

    def __str__(self) -> str:
        if not self.clock_set:
            return f"CLOCK_NOT_SET ({self.year}-{self.month:02d}-{self.day:02d})"
        return f"{self.year}-{self.month:02d}-{self.day:02d}"

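As a quick check of the wire layout above, here is a standalone sketch that mirrors `Timestamp.from_bytes` (reimplemented so it runs without the package; the byte values are illustrative, not a captured frame):

```python
import struct

def decode_timestamp(data: bytes) -> dict:
    # Mirrors Timestamp.from_bytes: [flag:1] [year:2 BE] [unknown:1] [month:1] [day:1]
    if len(data) != 6:
        raise ValueError(f"Timestamp requires exactly 6 bytes, got {len(data)}")
    year = struct.unpack_from(">H", data, 1)[0]
    return {
        "flag": data[0],
        "year": year,
        "unknown_byte": data[3],
        "month": data[4],
        "day": data[5],
        "clock_set": year != 1995,  # 1995 = factory-default RTC date
    }

ts = decode_timestamp(bytes([0x01, 0x07, 0xE8, 0x00, 0x03, 0x1F]))
# 0x07E8 = 2024, so this decodes as 2024-03-31 with the clock set
```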
# ── Device identity ───────────────────────────────────────────────────────────

@dataclass
class DeviceInfo:
    """
    Combined device identity information gathered during the startup sequence.

    Populated from three response SUBs:
    - SUB EA (SERIAL_NUMBER_RESPONSE): serial, firmware_minor
    - SUB FE (FULL_CONFIG_RESPONSE): serial (repeat), firmware_version,
      dsp_version, manufacturer, model
    - SUB A4 (POLL_RESPONSE): manufacturer (repeat), model (repeat)

    All string fields are stripped of null padding before storage.
    """

    # ── From SUB EA (SERIAL_NUMBER_RESPONSE) ─────────────────────────────────
    serial: str                           # e.g. "BE18189" ✅
    firmware_minor: int                   # 0x11 = 17 for S337.17 ✅
    serial_trail_0: Optional[int] = None  # unit-specific byte — purpose unknown ❓

    # ── From SUB FE (FULL_CONFIG_RESPONSE) ────────────────────────────────────
    firmware_version: Optional[str] = None  # e.g. "S337.17" ✅
    dsp_version: Optional[str] = None       # e.g. "10.72" ✅
    manufacturer: Optional[str] = None      # e.g. "Instantel" ✅
    model: Optional[str] = None             # e.g. "MiniMate Plus" ✅

    def __str__(self) -> str:
        fw = self.firmware_version or f"?.{self.firmware_minor}"
        mdl = self.model or "MiniMate Plus"
        return f"{mdl} S/N:{self.serial} FW:{fw}"

# ── Channel threshold / scaling ───────────────────────────────────────────────

@dataclass
class ChannelConfig:
    """
    Per-channel threshold and scaling values from SUB E5 / SUB 71.

    Floats are stored in the device in imperial units (in/s for geo channels,
    psi for MicL). Unit strings embedded in the payload confirm this.

    Certainty: ✅ CONFIRMED for trigger_level, alarm_level, unit strings.
    """
    label: str            # e.g. "Tran", "Vert", "Long", "MicL" ✅
    trigger_level: float  # in/s (geo) or psi (MicL) ✅
    alarm_level: float    # in/s (geo) or psi (MicL) ✅
    max_range: float      # full-scale calibration constant (e.g. 6.206) 🔶
    unit_label: str       # e.g. "in./s" or "psi" ✅


# ── Peak values for one event ─────────────────────────────────────────────────

@dataclass
class PeakValues:
    """
    Per-channel peak particle velocity / pressure for a single event.

    Extracted from the Full Waveform Record (SUB F3), stored as IEEE 754
    big-endian floats in the device's native units (in/s / psi).
    """
    tran: Optional[float] = None  # Transverse PPV (in/s) ✅
    vert: Optional[float] = None  # Vertical PPV (in/s) ✅
    long: Optional[float] = None  # Longitudinal PPV (in/s) ✅
    micl: Optional[float] = None  # Air overpressure (psi) 🔶 (units uncertain)


# ── Project / operator metadata ───────────────────────────────────────────────

@dataclass
class ProjectInfo:
    """
    Operator-supplied project and location strings from the Full Waveform
    Record (SUB F3) and compliance config block (SUB E5 / SUB 71).

    All fields are optional — they may be blank if the operator did not fill
    them in through Blastware.
    """
    setup_name: Optional[str] = None       # "Standard Recording Setup"
    project: Optional[str] = None          # project description
    client: Optional[str] = None           # client name ✅ confirmed offset
    operator: Optional[str] = None         # operator / user name
    sensor_location: Optional[str] = None  # sensor location string
    notes: Optional[str] = None            # extended notes

# ── Event ─────────────────────────────────────────────────────────────────────

@dataclass
class Event:
    """
    A single seismic event record downloaded from the device.

    Populated progressively across several request/response pairs:
    1. SUB 1E (EVENT_HEADER) → index, timestamp, sample_rate
    2. SUB 0C (FULL_WAVEFORM_RECORD) → peak_values, project_info, record_type
    3. SUB 5A (BULK_WAVEFORM_STREAM) → raw_samples (downloaded on demand)

    Fields not yet retrieved are None.
    """
    # ── Identity ──────────────────────────────────────────────────────────────
    index: int  # 0-based event number on device

    # ── From EVENT_HEADER (SUB 1E) ────────────────────────────────────────────
    timestamp: Optional[Timestamp] = None  # 6-byte timestamp ✅
    sample_rate: Optional[int] = None      # samples/sec (e.g. 1024) 🔶

    # ── From FULL_WAVEFORM_RECORD (SUB F3) ───────────────────────────────────
    peak_values: Optional[PeakValues] = None
    project_info: Optional[ProjectInfo] = None
    record_type: Optional[str] = None  # e.g. "Histogram", "Waveform" 🔶

    # ── From BULK_WAVEFORM_STREAM (SUB 5A) ───────────────────────────────────
    # Raw ADC samples keyed by channel label. Not fetched unless explicitly
    # requested (large data transfer — up to several MB per event).
    raw_samples: Optional[dict] = None  # {"Tran": [...], "Vert": [...], ...}

    def __str__(self) -> str:
        ts = str(self.timestamp) if self.timestamp else "no timestamp"
        ppv = ""
        if self.peak_values:
            pv = self.peak_values
            parts = []
            if pv.tran is not None:
                parts.append(f"T={pv.tran:.4f}")
            if pv.vert is not None:
                parts.append(f"V={pv.vert:.4f}")
            if pv.long is not None:
                parts.append(f"L={pv.long:.4f}")
            if pv.micl is not None:
                parts.append(f"M={pv.micl:.6f}")
            ppv = " [" + ", ".join(parts) + " in/s]"
        return f"Event#{self.index} {ts}{ppv}"
317 minimateplus/protocol.py Normal file
@@ -0,0 +1,317 @@
"""
protocol.py — High-level MiniMate Plus request/response protocol.

Implements the request/response patterns documented in
docs/instantel_protocol_reference.md on top of:
- minimateplus.framing   — DLE codec, frame builder, S3 streaming parser
- minimateplus.transport — byte I/O (SerialTransport / TcpTransport)

This module knows nothing about pyserial or TCP — it only calls
transport.write() and transport.read_until_idle().

Key patterns implemented:
- POLL startup handshake (two-step, special payload[5] format)
- Generic two-step paged read (probe ack → data fetch with a hardcoded length)
- Response timeout + checksum validation
- Boot-string drain (device sends "Operating System" ASCII before framing)

All public methods raise ProtocolError on timeout, bad checksum, or
unexpected response SUB.
"""

from __future__ import annotations

import logging
import time
from typing import Optional

from .framing import (
    S3Frame,
    S3FrameParser,
    build_bw_frame,
    POLL_PROBE,
    POLL_DATA,
)
from .transport import BaseTransport

log = logging.getLogger(__name__)

# ── Constants ─────────────────────────────────────────────────────────────────

# Response SUB = 0xFF - Request SUB (confirmed pattern, no known exceptions
# among read commands; one write-path exception documented for SUB 1C→6E).
def _expected_rsp_sub(req_sub: int) -> int:
    return (0xFF - req_sub) & 0xFF


# SUB byte constants (request side) — see protocol reference §5.1
SUB_POLL = 0x5B
SUB_SERIAL_NUMBER = 0x15
SUB_FULL_CONFIG = 0x01
SUB_EVENT_INDEX = 0x08
SUB_CHANNEL_CONFIG = 0x06
SUB_TRIGGER_CONFIG = 0x1C
SUB_EVENT_HEADER = 0x1E
SUB_WAVEFORM_HEADER = 0x0A
SUB_WAVEFORM_RECORD = 0x0C
SUB_BULK_WAVEFORM = 0x5A
SUB_COMPLIANCE = 0x1A
SUB_UNKNOWN_2E = 0x2E

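The 0xFF-complement rule reproduces the response SUBs quoted in the models.py docstrings (EA, FE, A4, F3), which can be checked directly:

```python
def expected_rsp_sub(req_sub: int) -> int:
    # Response SUB = 0xFF - request SUB (standalone copy of _expected_rsp_sub)
    return (0xFF - req_sub) & 0xFF

pairs = {
    0x5B: 0xA4,  # POLL            -> POLL_RESPONSE
    0x15: 0xEA,  # SERIAL_NUMBER   -> SERIAL_NUMBER_RESPONSE
    0x01: 0xFE,  # FULL_CONFIG     -> FULL_CONFIG_RESPONSE
    0x0C: 0xF3,  # WAVEFORM_RECORD -> FULL_WAVEFORM_RECORD response
}
ok = all(expected_rsp_sub(req) == rsp for req, rsp in pairs.items())
```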
# Hardcoded data lengths for the two-step read protocol.
#
# The S3 probe response page_key is always 0x0000 — it does NOT carry the
# data length back to us. Instead, each SUB has a fixed known payload size
# confirmed from BW capture analysis (offset at payload[5] of the data-request
# frame).
#
# Key: request SUB byte. Value: offset/length byte sent in the data-request.
# Entries marked 🔶 are inferred from captured frames and may need adjustment.
DATA_LENGTHS: dict[int, int] = {
    SUB_POLL: 0x30,            # POLL startup data block ✅
    SUB_SERIAL_NUMBER: 0x0A,   # 10-byte serial number block ✅
    SUB_FULL_CONFIG: 0x98,     # 152-byte full config block ✅
    SUB_EVENT_INDEX: 0x58,     # 88-byte event index ✅
    SUB_TRIGGER_CONFIG: 0x2C,  # 44-byte trigger config 🔶
    SUB_UNKNOWN_2E: 0x1A,      # 26 bytes, purpose TBD 🔶
    0x09: 0xCA,                # 202 bytes, purpose TBD 🔶
    # SUB_COMPLIANCE (0x1A) uses a multi-step sequence with a 2090-byte total;
    # NOT handled here — requires specialised read logic.
}

# Default timeout values (seconds).
# MiniMate Plus is a slow device — keep these generous.
DEFAULT_RECV_TIMEOUT = 10.0
POLL_RECV_TIMEOUT = 10.0

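A quick sanity check that the hex values in the length table agree with the decimal byte counts stated in its comments (the table is copied here with keys inlined so it runs standalone):

```python
# DATA_LENGTHS with the SUB constants inlined, copied from the table above.
data_lengths = {
    0x5B: 0x30,  # POLL startup data block
    0x15: 0x0A,  # serial number block (10 bytes)
    0x01: 0x98,  # full config block (152 bytes)
    0x08: 0x58,  # event index (88 bytes)
    0x1C: 0x2C,  # trigger config (44 bytes)
    0x2E: 0x1A,  # purpose TBD (26 bytes)
    0x09: 0xCA,  # purpose TBD (202 bytes)
}
```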
# ── Exceptions ────────────────────────────────────────────────────────────────

class ProtocolError(Exception):
    """Raised when the device violates the expected protocol."""


class TimeoutError(ProtocolError):  # noqa: A001 — intentionally shadows the builtin
    """Raised when no response is received within the allowed time."""


class ChecksumError(ProtocolError):
    """Raised when a received frame has a bad checksum."""


class UnexpectedResponse(ProtocolError):
    """Raised when the response SUB doesn't match what we requested."""

# ── MiniMateProtocol ──────────────────────────────────────────────────────────

class MiniMateProtocol:
    """
    Protocol state machine for one open connection to a MiniMate Plus device.

    Does not own the transport — transport lifetime is managed by MiniMateClient.

    Typical usage (via MiniMateClient — not directly):
        proto = MiniMateProtocol(transport)
        proto.startup()  # POLL handshake, drain boot string
        data = proto.read(SUB_FULL_CONFIG)
        sn_data = proto.read(SUB_SERIAL_NUMBER)
    """

    def __init__(
        self,
        transport: BaseTransport,
        recv_timeout: float = DEFAULT_RECV_TIMEOUT,
    ) -> None:
        self._transport = transport
        self._recv_timeout = recv_timeout
        self._parser = S3FrameParser()

    # ── Public API ────────────────────────────────────────────────────────────

    def startup(self) -> S3Frame:
        """
        Perform the POLL startup handshake and return the POLL data frame.

        Steps (matching §6 Session Startup Sequence):
        1. Drain any boot-string bytes ("Operating System" ASCII)
        2. Send POLL_PROBE (SUB 5B, offset=0x00)
        3. Receive probe ack (page_key is 0x0000; data length 0x30 is hardcoded)
        4. Send POLL_DATA (SUB 5B, offset=0x30)
        5. Receive data frame with "Instantel" + "MiniMate Plus" strings

        Returns:
            The data-phase POLL response S3Frame.

        Raises:
            ProtocolError: if either POLL step fails.
        """
        log.debug("startup: draining boot string")
        self._drain_boot_string()

        log.debug("startup: POLL probe")
        self._send(POLL_PROBE)
        probe_rsp = self._recv_one(
            expected_sub=_expected_rsp_sub(SUB_POLL),
            timeout=self._recv_timeout,
        )
        log.debug(
            "startup: POLL probe response page_key=0x%04X", probe_rsp.page_key
        )

        log.debug("startup: POLL data request")
        self._send(POLL_DATA)
        data_rsp = self._recv_one(
            expected_sub=_expected_rsp_sub(SUB_POLL),
            timeout=self._recv_timeout,
        )
        log.debug("startup: POLL data received, %d bytes", len(data_rsp.data))
        return data_rsp

    def read(self, sub: int) -> bytes:
        """
        Execute a two-step paged read and return the data payload bytes.

        Step 1: send probe frame (offset=0x00) → device sends a short ack
        Step 2: send data-request (offset=DATA_LEN) → device sends the data block

        The S3 probe response does NOT carry the data length — page_key is always
        0x0000 in observed frames. DATA_LENGTHS holds the known fixed lengths
        derived from BW capture analysis.

        Args:
            sub: Request SUB byte (e.g. SUB_FULL_CONFIG = 0x01).

        Returns:
            De-stuffed data payload bytes (payload[5:] of the response frame,
            with the checksum already stripped by the parser).

        Raises:
            ProtocolError: on timeout, wrong response SUB, or when *sub* has
                no DATA_LENGTHS entry (add it there).
        """
        rsp_sub = _expected_rsp_sub(sub)

        # Step 1 — probe (offset = 0)
        log.debug("read SUB=0x%02X: probe", sub)
        self._send(build_bw_frame(sub, 0))
        _probe = self._recv_one(expected_sub=rsp_sub)  # ack; page_key always 0

        # Look up the hardcoded data length for this SUB
        if sub not in DATA_LENGTHS:
            raise ProtocolError(
                f"No known data length for SUB=0x{sub:02X}. "
                "Add it to DATA_LENGTHS in protocol.py."
            )
        length = DATA_LENGTHS[sub]
        log.debug("read SUB=0x%02X: data request offset=0x%02X", sub, length)

        if length == 0:
            log.warning("read SUB=0x%02X: DATA_LENGTHS entry is zero", sub)
            return b""

        # Step 2 — data-request (offset = length)
        self._send(build_bw_frame(sub, length))
        data_rsp = self._recv_one(expected_sub=rsp_sub)

        log.debug("read SUB=0x%02X: received %d data bytes", sub, len(data_rsp.data))
        return data_rsp.data

    def send_keepalive(self) -> None:
        """
        Send a single POLL_PROBE keepalive without waiting for a response.

        Blastware sends these every ~80ms during idle. Useful if you need to
        hold the session open between real requests.
        """
        self._send(POLL_PROBE)

    # ── Internal helpers ──────────────────────────────────────────────────────

    def _send(self, frame: bytes) -> None:
        """Write a pre-built frame to the transport."""
        log.debug("TX %d bytes: %s", len(frame), frame.hex())
        self._transport.write(frame)

    def _recv_one(
        self,
        expected_sub: Optional[int] = None,
        timeout: Optional[float] = None,
    ) -> S3Frame:
        """
        Read bytes from the transport until one complete S3 frame is parsed.

        Feeds bytes through the streaming S3FrameParser. Keeps reading until
        a frame arrives or the deadline expires.

        Args:
            expected_sub: If provided, raises UnexpectedResponse if the
                received frame's SUB doesn't match.
            timeout: Seconds to wait. Defaults to self._recv_timeout.

        Returns:
            The first complete S3Frame received.

        Raises:
            TimeoutError: if no frame arrives within the timeout.
            UnexpectedResponse: if expected_sub is set and doesn't match.

        Note: bad checksums are logged, not raised — see _validate_frame.
        """
        deadline = time.monotonic() + (timeout or self._recv_timeout)
        self._parser.reset()

        while time.monotonic() < deadline:
            chunk = self._transport.read(256)
            if chunk:
                log.debug("RX %d bytes: %s", len(chunk), chunk.hex())
                frames = self._parser.feed(chunk)
                if frames:
                    frame = frames[0]
                    self._validate_frame(frame, expected_sub)
                    return frame
            else:
                time.sleep(0.005)

        raise TimeoutError(
            f"No S3 frame received within {timeout or self._recv_timeout:.1f}s"
            + (f" (expected SUB 0x{expected_sub:02X})" if expected_sub is not None else "")
        )

    @staticmethod
    def _validate_frame(frame: S3Frame, expected_sub: Optional[int]) -> None:
        """Validate SUB; log but do not raise on bad checksum.

        S3 response checksums frequently fail SUM8 validation due to inner-frame
        delimiter bytes being captured as the checksum byte. The original
        s3_parser.py deliberately never validates S3 checksums for exactly this
        reason. We log the mismatch and continue.
        """
        if not frame.checksum_valid:
            # Treat checksum mismatches as informational only (see docstring).
            log.debug("S3 frame SUB=0x%02X: checksum mismatch (ignoring)", frame.sub)
        if expected_sub is not None and frame.sub != expected_sub:
            raise UnexpectedResponse(
                f"Expected SUB=0x{expected_sub:02X}, got 0x{frame.sub:02X}"
            )

    def _drain_boot_string(self, drain_ms: int = 200) -> None:
        """
        Read and discard any boot-string bytes ("Operating System") the device
        may send before entering binary protocol mode.

        We simply read with a short timeout and throw the bytes away. The
        S3FrameParser's IDLE state already handles non-frame bytes gracefully,
        but it's cleaner to drain them explicitly before the first real frame.
        """
        deadline = time.monotonic() + (drain_ms / 1000)
        discarded = 0
        while time.monotonic() < deadline:
            chunk = self._transport.read(256)
            if chunk:
                discarded += len(chunk)
            else:
                time.sleep(0.005)
        if discarded:
            log.debug("drain_boot_string: discarded %d bytes", discarded)
420 minimateplus/transport.py Normal file
@@ -0,0 +1,420 @@
"""
transport.py — Serial and TCP transport layer for the MiniMate Plus protocol.

Provides a thin I/O abstraction so that protocol.py never imports pyserial or
socket directly. Two concrete implementations:

    SerialTransport — direct RS-232 cable connection (pyserial)
    TcpTransport    — TCP socket to a modem or ACH relay (stdlib socket)

The MiniMate Plus protocol bytes are identical over both transports. TCP is used
when field units call home via the ACH (Auto Call Home) server, or when SFM
"calls up" a unit by connecting to the modem's IP address directly.

Field hardware: Sierra Wireless RV55 / RX55 (4G LTE) cellular modem, replacing
the older 3G-only Raven X (now decommissioned). All run ALEOS firmware with an
ACEmanager web UI. Serial port must be configured 38400,8N1, no flow control,
Data Forwarding Timeout = 1 s.

Typical usage:
    from minimateplus.transport import SerialTransport, TcpTransport

    # Direct serial connection
    with SerialTransport("COM5") as t:
        t.write(frame_bytes)

    # Modem / ACH TCP connection (Blastware port 12345)
    with TcpTransport("192.168.1.50", 12345) as t:
        t.write(frame_bytes)
"""

from __future__ import annotations

import socket
import time
from abc import ABC, abstractmethod
from typing import Optional

# pyserial is the only non-stdlib dependency in this project.
# Import lazily so unit-tests that mock the transport can run without it.
try:
    import serial  # type: ignore
except ImportError:  # pragma: no cover
    serial = None  # type: ignore

# ── Abstract base ─────────────────────────────────────────────────────────────

class BaseTransport(ABC):
    """Common interface for all transport implementations."""

    @abstractmethod
    def connect(self) -> None:
        """Open the underlying connection."""

    @abstractmethod
    def disconnect(self) -> None:
        """Close the underlying connection."""

    @property
    @abstractmethod
    def is_connected(self) -> bool:
        """True while the connection is open."""

    @abstractmethod
    def write(self, data: bytes) -> None:
        """Write *data* bytes to the wire."""

    @abstractmethod
    def read(self, n: int) -> bytes:
        """
        Read up to *n* bytes. Returns immediately with whatever is available
        (may return fewer than *n* bytes, or b"" if nothing is ready).
        """

    # ── Context manager ───────────────────────────────────────────────────────

    def __enter__(self) -> "BaseTransport":
        self.connect()
        return self

    def __exit__(self, *_) -> None:
        self.disconnect()

    # ── Higher-level read helpers ─────────────────────────────────────────────

    def read_until_idle(
        self,
        timeout: float = 2.0,
        idle_gap: float = 0.05,
        chunk: int = 256,
    ) -> bytes:
        """
        Read bytes until the line goes quiet.

        Keeps reading in *chunk*-sized bursts. Returns when either:
        - *timeout* seconds have elapsed since the read started, or
        - *idle_gap* seconds pass with no new bytes (line went quiet).

        This mirrors how Blastware behaves: it waits for the seismograph to
        stop transmitting rather than counting bytes.

        Args:
            timeout: Hard deadline (seconds) from the moment read starts.
            idle_gap: How long to wait after the last byte before declaring done.
            chunk: How many bytes to request per low-level read() call.

        Returns:
            All bytes received as a single bytes object (may be b"" if nothing
            arrived within *timeout*).
        """
        buf = bytearray()
        deadline = time.monotonic() + timeout
        last_rx = None

        while time.monotonic() < deadline:
            got = self.read(chunk)
            if got:
                buf.extend(got)
                last_rx = time.monotonic()
            else:
                # Nothing ready — check idle gap
                if last_rx is not None and (time.monotonic() - last_rx) >= idle_gap:
                    break
                time.sleep(0.005)

        return bytes(buf)

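The idle-gap behaviour can be exercised with a scripted stand-in transport. The loop below copies read_until_idle's logic so it runs standalone; FakeTransport is a test double for illustration, not part of the library:

```python
import time

class FakeTransport:
    """Test double: hands out queued chunks, then returns b"" forever."""
    def __init__(self, chunks):
        self._chunks = list(chunks)

    def read(self, n):
        return self._chunks.pop(0) if self._chunks else b""

def read_until_idle(t, timeout=1.0, idle_gap=0.05, chunk=256):
    # Same algorithm as BaseTransport.read_until_idle
    buf = bytearray()
    deadline = time.monotonic() + timeout
    last_rx = None
    while time.monotonic() < deadline:
        got = t.read(chunk)
        if got:
            buf.extend(got)
            last_rx = time.monotonic()
        else:
            if last_rx is not None and (time.monotonic() - last_rx) >= idle_gap:
                break  # line went quiet after receiving something
            time.sleep(0.005)
    return bytes(buf)

data = read_until_idle(FakeTransport([b"\x01\x02", b"\x03"]))
```

Because the fake goes silent after its second chunk, the call returns `b"\x01\x02\x03"` roughly `idle_gap` after the last byte, well before the hard 1-second deadline.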
    def read_exact(self, n: int, timeout: float = 2.0) -> bytes:
        """
        Read exactly *n* bytes or raise TimeoutError.

        Useful when the caller already knows the expected response length
        (e.g. fixed-size ACK packets).
        """
        buf = bytearray()
        deadline = time.monotonic() + timeout
        while len(buf) < n:
            if time.monotonic() >= deadline:
                raise TimeoutError(
                    f"read_exact: wanted {n} bytes, got {len(buf)} "
                    f"after {timeout:.1f}s"
                )
            got = self.read(n - len(buf))
            if got:
                buf.extend(got)
            else:
                time.sleep(0.005)
        return bytes(buf)

# ── Serial transport ──────────────────────────────────────────────────────────

# Default baud rate confirmed from Blastware / MiniMate Plus documentation.
DEFAULT_BAUD = 38_400

# pyserial serial port config matching the MiniMate Plus RS-232 spec:
# 8 data bits, no parity, 1 stop bit (8N1).
_SERIAL_BYTESIZE = 8  # serial.EIGHTBITS
_SERIAL_PARITY = "N"  # serial.PARITY_NONE
_SERIAL_STOPBITS = 1  # serial.STOPBITS_ONE

class SerialTransport(BaseTransport):
    """
    pyserial-backed transport for a direct RS-232 cable connection.

    The port is opened with a very short read timeout (10 ms) so that
    read() returns quickly and the caller can implement its own framing /
    timeout logic without blocking the whole process.

    Args:
        port: COM port name (e.g. "COM5" on Windows, "/dev/ttyUSB0" on Linux).
        baud: Baud rate (default 38400).
        rts_cts: Enable RTS/CTS hardware flow control (default False — MiniMate
            typically uses no flow control).
    """

    # Internal read timeout (seconds). Short so read() is non-blocking in practice.
    _READ_TIMEOUT = 0.01

    def __init__(
        self,
        port: str,
        baud: int = DEFAULT_BAUD,
        rts_cts: bool = False,
    ) -> None:
        if serial is None:
            raise ImportError(
                "pyserial is required for SerialTransport. "
                "Install it with: pip install pyserial"
            )
        self.port = port
        self.baud = baud
        self.rts_cts = rts_cts
        self._ser: Optional[serial.Serial] = None

    # ── BaseTransport interface ───────────────────────────────────────────────

    def connect(self) -> None:
        """Open the serial port. Raises serial.SerialException on failure."""
        if self._ser and self._ser.is_open:
            return  # Already open — idempotent
        self._ser = serial.Serial(
            port=self.port,
            baudrate=self.baud,
            bytesize=_SERIAL_BYTESIZE,
            parity=_SERIAL_PARITY,
            stopbits=_SERIAL_STOPBITS,
            timeout=self._READ_TIMEOUT,
            rtscts=self.rts_cts,
            xonxoff=False,
            dsrdtr=False,
        )
        # Flush any stale bytes left in device / OS buffers from a previous session
        self._ser.reset_input_buffer()
        self._ser.reset_output_buffer()

    def disconnect(self) -> None:
        """Close the serial port. Safe to call even if already closed."""
        if self._ser:
            try:
                self._ser.close()
            except Exception:
                pass
            self._ser = None

    @property
    def is_connected(self) -> bool:
        return bool(self._ser and self._ser.is_open)

    def write(self, data: bytes) -> None:
        """
        Write *data* to the serial port.

        Raises:
            RuntimeError: if not connected.
            serial.SerialException: on I/O error.
        """
        if not self.is_connected:
            raise RuntimeError("SerialTransport.write: not connected")
        self._ser.write(data)  # type: ignore[union-attr]
        self._ser.flush()  # type: ignore[union-attr]

    def read(self, n: int) -> bytes:
        """
        Read up to *n* bytes from the serial port.

        Returns b"" immediately if no data is available (non-blocking in
        practice thanks to the 10 ms read timeout).

        Raises:
            RuntimeError: if not connected.
        """
        if not self.is_connected:
            raise RuntimeError("SerialTransport.read: not connected")
        return self._ser.read(n)  # type: ignore[union-attr]

# ── Extras ────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def flush_input(self) -> None:
|
||||||
|
"""Discard any unread bytes in the OS receive buffer."""
|
||||||
|
if self.is_connected:
|
||||||
|
self._ser.reset_input_buffer() # type: ignore[union-attr]
|
||||||
|
|
||||||
|
def __repr__(self) -> str:
|
||||||
|
state = "open" if self.is_connected else "closed"
|
||||||
|
return f"SerialTransport({self.port!r}, baud={self.baud}, {state})"
|
||||||
|
|
||||||
|
|
||||||
|
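Both transports expose the same connect / write / read / disconnect surface, so command-and-reply exchanges can be written once and reused over serial or TCP. A minimal sketch; the `transact` helper is illustrative, not part of this module:

```python
def transact(transport, frame: bytes, timeout: float = 2.0) -> bytes:
    """Send one command frame and return the bytes the unit sends back.

    Hypothetical helper: works against any transport exposing the
    connect/write/read_until_idle/disconnect surface shown above.
    """
    transport.connect()
    try:
        transport.write(frame)
        return transport.read_until_idle(timeout=timeout)
    finally:
        transport.disconnect()
```

Connecting and disconnecting per exchange keeps the sketch simple; a real session would hold the link open across polls.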
# ── TCP transport ─────────────────────────────────────────────────────────────

# Default TCP port for Blastware modem communications / ACH relay.
# Confirmed from field setup: Blastware → Communication Setup → TCP/IP uses 12345.
DEFAULT_TCP_PORT = 12345


class TcpTransport(BaseTransport):
    """
    TCP socket transport for MiniMate Plus units in the field.

    The protocol bytes over TCP are identical to RS-232 — TCP is simply a
    different physical layer. The modem (Sierra Wireless RV55 / RX55, or older
    Raven X) bridges the unit's RS-232 serial port to a TCP socket transparently.
    No application-layer handshake or framing is added.

    Two usage scenarios:

    "Call up" (outbound): SFM connects to the unit's modem IP directly.
        TcpTransport(host="203.0.113.5", port=12345)

    "Call home" / ACH relay: The unit has already dialled in to the office
        ACH server, which bridged the modem to a TCP socket. In this case
        the host/port identifies the relay's listening socket, not the modem.
        (ACH inbound mode is handled by a separate AchServer — not this class.)

    IMPORTANT — modem data forwarding delay:
        Sierra Wireless (and Raven) modems buffer RS-232 bytes for up to 1 second
        before forwarding them as a TCP segment ("Data Forwarding Timeout" in
        ACEmanager). read_until_idle() is overridden to use idle_gap=1.5 s rather
        than the serial default of 0.05 s — without this, the parser would declare
        a frame complete mid-stream during the modem's buffering pause.

    Args:
        host: IP address or hostname of the modem / ACH relay.
        port: TCP port number (default 12345).
        connect_timeout: Seconds to wait for the TCP handshake (default 10.0).
    """

    # Internal recv timeout — short so read() returns promptly if no data.
    _RECV_TIMEOUT = 0.01

    def __init__(
        self,
        host: str,
        port: int = DEFAULT_TCP_PORT,
        connect_timeout: float = 10.0,
    ) -> None:
        self.host = host
        self.port = port
        self.connect_timeout = connect_timeout
        self._sock: Optional[socket.socket] = None

    # ── BaseTransport interface ───────────────────────────────────────────────

    def connect(self) -> None:
        """
        Open a TCP connection to host:port.

        Idempotent — does nothing if already connected.

        Raises:
            OSError / socket.timeout: if the connection cannot be established.
        """
        if self._sock is not None:
            return  # Already connected — idempotent
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(self.connect_timeout)
        sock.connect((self.host, self.port))
        # Switch to a short timeout so read() is non-blocking in practice
        sock.settimeout(self._RECV_TIMEOUT)
        self._sock = sock

    def disconnect(self) -> None:
        """Close the TCP socket. Safe to call even if already closed."""
        if self._sock:
            try:
                self._sock.shutdown(socket.SHUT_RDWR)
            except OSError:
                pass
            try:
                self._sock.close()
            except OSError:
                pass
        self._sock = None

    @property
    def is_connected(self) -> bool:
        return self._sock is not None

    def write(self, data: bytes) -> None:
        """
        Send all bytes to the peer.

        Raises:
            RuntimeError: if not connected.
            OSError: on network I/O error.
        """
        if not self.is_connected:
            raise RuntimeError("TcpTransport.write: not connected")
        self._sock.sendall(data)  # type: ignore[union-attr]

    def read(self, n: int) -> bytes:
        """
        Read up to *n* bytes from the socket.

        Returns b"" immediately if no data is available (non-blocking in
        practice thanks to the short socket timeout).

        Raises:
            RuntimeError: if not connected.
        """
        if not self.is_connected:
            raise RuntimeError("TcpTransport.read: not connected")
        try:
            return self._sock.recv(n)  # type: ignore[union-attr]
        except socket.timeout:
            return b""

    def read_until_idle(
        self,
        timeout: float = 2.0,
        idle_gap: float = 1.5,
        chunk: int = 256,
    ) -> bytes:
        """
        TCP-aware version of read_until_idle.

        Overrides the BaseTransport default to use a much longer idle_gap (1.5 s
        vs 0.05 s for serial). This is necessary because the Raven modem (and
        similar cellular modems) buffer serial-port bytes for up to 1 second
        before forwarding them over TCP ("Data Forwarding Timeout" setting).

        If read_until_idle returned after a 50 ms quiet period, it would trigger
        mid-frame while the modem is still accumulating bytes — causing frame
        parse failures on every call.

        Args:
            timeout: Hard deadline from the first byte (default 2.0 s — callers
                typically pass a longer value for S3 frames).
            idle_gap: Quiet-line threshold (default 1.5 s to survive modem
                buffering). Pass a smaller value only if you are
                connecting directly to a unit's Ethernet port with no
                modem buffering in the path.
            chunk: Bytes per low-level recv() call.
        """
        return super().read_until_idle(timeout=timeout, idle_gap=idle_gap, chunk=chunk)

    def __repr__(self) -> str:
        state = "connected" if self.is_connected else "disconnected"
        return f"TcpTransport({self.host!r}, port={self.port}, {state})"
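The override above delegates to `BaseTransport.read_until_idle`, whose body is outside this hunk. A plausible sketch of that accumulate-until-quiet loop, consistent with the documented semantics (hard deadline measured from the first byte, quiet-gap cutoff); the exact names and polling interval are assumptions, not the real base-class code:

```python
import time

def read_until_idle(read, timeout: float = 2.0, idle_gap: float = 0.05,
                    chunk: int = 256) -> bytes:
    """Accumulate bytes until the line stays quiet for idle_gap seconds.

    `read(n)` returns up to n bytes, or b"" when nothing is waiting.
    The hard deadline is measured from the first byte received.
    """
    buf = bytearray()
    start = time.monotonic()
    first = last = None
    while True:
        data = read(chunk)
        now = time.monotonic()
        if data:
            buf.extend(data)
            if first is None:
                first = now
            last = now
        if first is not None:
            if now - last >= idle_gap:    # line went quiet: frame complete
                break
            if now - first >= timeout:    # hard deadline
                break
        elif now - start >= timeout:      # nothing ever arrived
            break
        time.sleep(0.005)                 # avoid a busy spin
    return bytes(buf)
```

With this shape, the TCP subclass only needs to pass a larger `idle_gap` to survive the modem's buffering pause.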
parsers/bw_frames.jsonl (98 lines, new file)
@@ -0,0 +1,98 @@
{"index": 0, "start_offset": 0, "end_offset": 21, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 1, "start_offset": 21, "end_offset": 42, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 2, "start_offset": 42, "end_offset": 63, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 3, "start_offset": 63, "end_offset": 84, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 4, "start_offset": 84, "end_offset": 105, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 5, "start_offset": 105, "end_offset": 126, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 6, "start_offset": 126, "end_offset": 147, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 7, "start_offset": 147, "end_offset": 168, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 8, "start_offset": 168, "end_offset": 189, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 9, "start_offset": 189, "end_offset": 210, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 10, "start_offset": 210, "end_offset": 231, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 11, "start_offset": 231, "end_offset": 252, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 12, "start_offset": 252, "end_offset": 273, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 13, "start_offset": 273, "end_offset": 294, "payload_len": 17, "payload_hex": "1000150000000000000000000000000025", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 14, "start_offset": 294, "end_offset": 315, "payload_len": 17, "payload_hex": "10001500000a000000000000000000002f", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 15, "start_offset": 315, "end_offset": 427, "payload_len": 108, "payload_hex": "10006800005a00000000000000000000005809000000010107cb00061e00010107cb00140000000000173b00000000000000000000000000000100000000000100000000000000010001000000000000000000000000000000000064000000000000001effdc0000100200c8", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 16, "start_offset": 427, "end_offset": 448, "payload_len": 17, "payload_hex": "1000730000000000000000000000000083", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 17, "start_offset": 448, "end_offset": 1497, "payload_len": 1045, "payload_hex": "1000710010040000000000000000000000082a6400001004100400003c0000be800000000040400000001003000f000000073dbb457a3db956e1000100015374616e64617264205265636f7264696e672053657475702e7365740000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000050726f6a6563743a0000000000000000000000000000544553542000000000000000000000000000000000000000000000000000000000000000000000000000436c69656e743a000000000000000000000000000000436c6175646520746573743200000000000000000000000000000000000000000000000000000000000055736572204e616d653a00000000000000000000000054657272612d4d656368616e69637320496e632e202d20422e204861727269736f6e000000000000000053656973204c6f633a000000000000000000000000004c6f636174696f6e202331202d20427269616e7320486f75736500000000000000000000000000000000457874656e646564204e6f74657300000000000000000a0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000007", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 18, "start_offset": 1497, "end_offset": 2574, "payload_len": 1073, "payload_hex": "1000710010040000001004000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000015472616e000000010050000f0028001510021003011004001003000040c697fd00003f19999a696e2e00400000002f730000000156657274000000010050000f0028001510021003011004001003000040c697fd00003f19999a696e2e00400000002f73000000014c6f6e67000000010050000f0028001510021003011004001003000040c697fd00003f19999a696e2e00400000002f73000000004d69634c000000100200c80032000a000a1002d501db000500003d38560800003c1374bc707369003cac0831284c29000010025472616e320000010050000f0028001510021003011004001003000040c697fd00003f000000696e2e0040000
0002f73000000100256657274320000010050000f0028001510021003011004001003000040c697fd00003f000000696e2e00400000002f7300000010024c6f6e67320000010050000f0028001510021003011004001003000040c697fd00003f000000696e2e00400000002f73000000004d69634c1002", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 19, "start_offset": 2574, "end_offset": 2641, "payload_len": 63, "payload_hex": "10007100002c00000800000000000000320000100200c80032000a000a1002d501db000500003d38560800003c23d70a707369003cac0831284c29007cea32", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 20, "start_offset": 2641, "end_offset": 2662, "payload_len": 17, "payload_hex": "1000720000000000000000000000000082", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 21, "start_offset": 2662, "end_offset": 2711, "payload_len": 45, "payload_hex": "10008200001c00000000000000000000001ad5000001080affffffffffffffffffffffffffffffffffff00009e", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 22, "start_offset": 2711, "end_offset": 2732, "payload_len": 17, "payload_hex": "1000830000000000000000000000000093", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 23, "start_offset": 2732, "end_offset": 2957, "payload_len": 221, "payload_hex": "1000690000ca0000000000000000000000c8080000010001000100010001000100010010020001001e0010020001000a000a4576656e742053756d6d617279205265706f7274000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000002580000801018c76af", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 24, "start_offset": 2957, "end_offset": 2978, "payload_len": 17, "payload_hex": "1000740000000000000000000000000084", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 25, "start_offset": 2978, "end_offset": 2999, "payload_len": 17, "payload_hex": "1000720000000000000000000000000082", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 26, "start_offset": 2999, "end_offset": 3020, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 27, "start_offset": 3020, "end_offset": 3041, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 28, "start_offset": 3041, "end_offset": 3062, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 29, "start_offset": 3062, "end_offset": 3083, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 30, "start_offset": 3083, "end_offset": 3104, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 31, "start_offset": 3104, "end_offset": 3125, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 32, "start_offset": 3125, "end_offset": 3146, "payload_len": 17, "payload_hex": "1000150000000000000000000000000025", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 33, "start_offset": 3146, "end_offset": 3167, "payload_len": 17, "payload_hex": "10001500000a000000000000000000002f", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 34, "start_offset": 3167, "end_offset": 3188, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 35, "start_offset": 3188, "end_offset": 3209, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 36, "start_offset": 3209, "end_offset": 3230, "payload_len": 17, "payload_hex": "1000080000000000000000000000000018", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 37, "start_offset": 3230, "end_offset": 3251, "payload_len": 17, "payload_hex": "1000080000580000000000000000000070", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 38, "start_offset": 3251, "end_offset": 3272, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 39, "start_offset": 3272, "end_offset": 3293, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 40, "start_offset": 3293, "end_offset": 3314, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 41, "start_offset": 3314, "end_offset": 3335, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 42, "start_offset": 3335, "end_offset": 3356, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 43, "start_offset": 3356, "end_offset": 3377, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 44, "start_offset": 3377, "end_offset": 3398, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 45, "start_offset": 3398, "end_offset": 3419, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 46, "start_offset": 3419, "end_offset": 3440, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 47, "start_offset": 3440, "end_offset": 3461, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 48, "start_offset": 3461, "end_offset": 3482, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 49, "start_offset": 3482, "end_offset": 3503, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 50, "start_offset": 3503, "end_offset": 3524, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 51, "start_offset": 3524, "end_offset": 3545, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 52, "start_offset": 3545, "end_offset": 3566, "payload_len": 17, "payload_hex": "1000150000000000000000000000000025", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 53, "start_offset": 3566, "end_offset": 3587, "payload_len": 17, "payload_hex": "10001500000a000000000000000000002f", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 54, "start_offset": 3587, "end_offset": 3608, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 55, "start_offset": 3608, "end_offset": 3629, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 56, "start_offset": 3629, "end_offset": 3650, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 57, "start_offset": 3650, "end_offset": 3671, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 58, "start_offset": 3671, "end_offset": 3692, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 59, "start_offset": 3692, "end_offset": 3713, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 60, "start_offset": 3713, "end_offset": 3734, "payload_len": 17, "payload_hex": "1000150000000000000000000000000025", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 61, "start_offset": 3734, "end_offset": 3755, "payload_len": 17, "payload_hex": "10001500000a000000000000000000002f", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 62, "start_offset": 3755, "end_offset": 3776, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 63, "start_offset": 3776, "end_offset": 3797, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 64, "start_offset": 3797, "end_offset": 3818, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 65, "start_offset": 3818, "end_offset": 3839, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 66, "start_offset": 3839, "end_offset": 3860, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 67, "start_offset": 3860, "end_offset": 3881, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 68, "start_offset": 3881, "end_offset": 3902, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 69, "start_offset": 3902, "end_offset": 3923, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 70, "start_offset": 3923, "end_offset": 3944, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 71, "start_offset": 3944, "end_offset": 3965, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 72, "start_offset": 3965, "end_offset": 3986, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 73, "start_offset": 3986, "end_offset": 4007, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 74, "start_offset": 4007, "end_offset": 4028, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 75, "start_offset": 4028, "end_offset": 4049, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 76, "start_offset": 4049, "end_offset": 4070, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 77, "start_offset": 4070, "end_offset": 4091, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 78, "start_offset": 4091, "end_offset": 4112, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 79, "start_offset": 4112, "end_offset": 4133, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 80, "start_offset": 4133, "end_offset": 4154, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 81, "start_offset": 4154, "end_offset": 4175, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 82, "start_offset": 4175, "end_offset": 4196, "payload_len": 17, "payload_hex": "10002e000000000000000000000000003e", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 83, "start_offset": 4196, "end_offset": 4217, "payload_len": 17, "payload_hex": "10002e00001a0000000000000000000058", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 84, "start_offset": 4217, "end_offset": 4238, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 85, "start_offset": 4238, "end_offset": 4259, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 86, "start_offset": 4259, "end_offset": 4280, "payload_len": 17, "payload_hex": "10001a000000000000000000006400008e", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 87, "start_offset": 4280, "end_offset": 4302, "payload_len": 18, "payload_hex": "10001a001004000000000000000064000092", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 88, "start_offset": 4302, "end_offset": 4325, "payload_len": 19, "payload_hex": "10001a00100400000010040000000064000096", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 89, "start_offset": 4325, "end_offset": 4346, "payload_len": 17, "payload_hex": "10001a00002a00000800000000640000c0", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 90, "start_offset": 4346, "end_offset": 4367, "payload_len": 17, "payload_hex": "1000090000000000000000000000000019", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
|
||||||
|
{"index": 91, "start_offset": 4367, "end_offset": 4388, "payload_len": 17, "payload_hex": "1000090000ca00000000000000000000e3", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
|
||||||
|
{"index": 92, "start_offset": 4388, "end_offset": 4409, "payload_len": 17, "payload_hex": "1000080000000000000000000000000018", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
|
||||||
|
{"index": 93, "start_offset": 4409, "end_offset": 4430, "payload_len": 17, "payload_hex": "1000080000580000000000000000000070", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
|
||||||
|
{"index": 94, "start_offset": 4430, "end_offset": 4451, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
|
||||||
|
{"index": 95, "start_offset": 4451, "end_offset": 4472, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
|
||||||
|
{"index": 96, "start_offset": 4472, "end_offset": 4493, "payload_len": 17, "payload_hex": "1000080000000000000000000000000018", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
|
||||||
|
{"index": 97, "start_offset": 4493, "end_offset": 4514, "payload_len": 17, "payload_hex": "1000080000580000000000000000000070", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
|
||||||
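The `checksum_valid` fields above are all `null`, but the listed frames share a visible pattern: the final payload byte matches a modulo-256 sum of the preceding bytes when embedded `0x10` bytes after the first are skipped (consistent with DLE-stuffed framing, where escape bytes are not covered by the checksum). A minimal sketch of that hypothesis — derived only from the frames shown here, not from any protocol specification:

```python
def trailer_checksum_ok(payload_hex: str) -> bool:
    """Hypothesis: final byte == (sum of preceding bytes) & 0xFF,
    where embedded 0x10 (DLE) bytes after the first are excluded.
    Derived from the captured frames above; unverified against a spec."""
    data = bytes.fromhex(payload_hex)
    body, trailer = data[:-1], data[-1]
    # Leading 0x10 counts; later 0x10 bytes are treated as stuffing.
    total = body[0] + sum(b for b in body[1:] if b != 0x10)
    return total & 0xFF == trailer


frames = [
    "10002e000000000000000000000000003e",    # index 82
    "10001a001004000000000000000064000092",  # index 87 (embedded 0x10)
    "1000090000ca00000000000000000000e3",    # index 91
]
print(all(trailer_checksum_ok(h) for h in frames))  # → True
```

If this holds on larger captures, the analyzer's `checksum_type` field could be populated instead of left `null`.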
337
parsers/frame_db.py
Normal file
@@ -0,0 +1,337 @@
#!/usr/bin/env python3
"""
frame_db.py — SQLite frame database for Instantel protocol captures.

Schema:
    captures    — one row per ingested capture pair (deduped by SHA256)
    frames      — one row per parsed frame
    byte_values — one row per (frame, offset, value) for fast indexed queries

Usage:
    db = FrameDB()          # opens default DB at ~/.seismo_lab/frames.db
    db = FrameDB(path)      # custom path
    cap_id = db.ingest(sessions, s3_path, bw_path)
    rows = db.query_frames(sub=0xF7, direction="S3")
    rows = db.query_by_byte(offset=85, value=0x0A)
"""

from __future__ import annotations

import hashlib
import os
import sqlite3
import struct
from pathlib import Path
from typing import Optional

# ─────────────────────────────────────────────────────────────────────────────
# DB location
# ─────────────────────────────────────────────────────────────────────────────

DEFAULT_DB_DIR = Path.home() / ".seismo_lab"
DEFAULT_DB_PATH = DEFAULT_DB_DIR / "frames.db"


# ─────────────────────────────────────────────────────────────────────────────
# Schema
# ─────────────────────────────────────────────────────────────────────────────

_DDL = """
PRAGMA journal_mode=WAL;
PRAGMA foreign_keys=ON;

CREATE TABLE IF NOT EXISTS captures (
    id           INTEGER PRIMARY KEY AUTOINCREMENT,
    timestamp    TEXT NOT NULL,          -- ISO-8601 ingest time
    s3_path      TEXT,
    bw_path      TEXT,
    capture_hash TEXT NOT NULL UNIQUE,   -- SHA256 of s3_blob+bw_blob
    notes        TEXT DEFAULT ''
);

CREATE TABLE IF NOT EXISTS frames (
    id          INTEGER PRIMARY KEY AUTOINCREMENT,
    capture_id  INTEGER NOT NULL REFERENCES captures(id) ON DELETE CASCADE,
    session_idx INTEGER NOT NULL,
    direction   TEXT NOT NULL,           -- 'BW' or 'S3'
    sub         INTEGER,                 -- NULL if malformed
    page_key    INTEGER,
    sub_name    TEXT,
    payload     BLOB NOT NULL,
    payload_len INTEGER NOT NULL,
    checksum_ok INTEGER                  -- 1/0/NULL
);

CREATE INDEX IF NOT EXISTS idx_frames_capture  ON frames(capture_id);
CREATE INDEX IF NOT EXISTS idx_frames_sub      ON frames(sub);
CREATE INDEX IF NOT EXISTS idx_frames_page_key ON frames(page_key);
CREATE INDEX IF NOT EXISTS idx_frames_dir      ON frames(direction);

CREATE TABLE IF NOT EXISTS byte_values (
    id       INTEGER PRIMARY KEY AUTOINCREMENT,
    frame_id INTEGER NOT NULL REFERENCES frames(id) ON DELETE CASCADE,
    offset   INTEGER NOT NULL,
    value    INTEGER NOT NULL
);

CREATE INDEX IF NOT EXISTS idx_bv_frame   ON byte_values(frame_id);
CREATE INDEX IF NOT EXISTS idx_bv_offset  ON byte_values(offset);
CREATE INDEX IF NOT EXISTS idx_bv_value   ON byte_values(value);
CREATE INDEX IF NOT EXISTS idx_bv_off_val ON byte_values(offset, value);
"""


# ─────────────────────────────────────────────────────────────────────────────
# Helpers
# ─────────────────────────────────────────────────────────────────────────────

def _sha256_blobs(s3_blob: bytes, bw_blob: bytes) -> str:
    h = hashlib.sha256()
    h.update(s3_blob)
    h.update(bw_blob)
    return h.hexdigest()


def _interp_bytes(data: bytes, offset: int) -> dict:
    """
    Return multi-interpretation dict for 1–4 bytes starting at offset.
    Used in the GUI's byte interpretation panel.
    """
    result: dict = {}
    remaining = len(data) - offset
    if remaining <= 0:
        return result

    b1 = data[offset]
    result["uint8"] = b1
    result["int8"] = b1 if b1 < 128 else b1 - 256

    if remaining >= 2:
        u16be = struct.unpack_from(">H", data, offset)[0]
        u16le = struct.unpack_from("<H", data, offset)[0]
        result["uint16_be"] = u16be
        result["uint16_le"] = u16le

    if remaining >= 4:
        f32be = struct.unpack_from(">f", data, offset)[0]
        f32le = struct.unpack_from("<f", data, offset)[0]
        u32be = struct.unpack_from(">I", data, offset)[0]
        u32le = struct.unpack_from("<I", data, offset)[0]
        result["float32_be"] = round(f32be, 6)
        result["float32_le"] = round(f32le, 6)
        result["uint32_be"] = u32be
        result["uint32_le"] = u32le

    return result


# ─────────────────────────────────────────────────────────────────────────────
# FrameDB class
# ─────────────────────────────────────────────────────────────────────────────

class FrameDB:
    def __init__(self, path: Optional[Path] = None) -> None:
        if path is None:
            path = DEFAULT_DB_PATH
        path = Path(path)
        path.parent.mkdir(parents=True, exist_ok=True)
        self.path = path
        self._con = sqlite3.connect(str(path), check_same_thread=False)
        self._con.row_factory = sqlite3.Row
        self._init_schema()

    def _init_schema(self) -> None:
        self._con.executescript(_DDL)
        self._con.commit()

    def close(self) -> None:
        self._con.close()

    # ── Ingest ────────────────────────────────────────────────────────────

    def ingest(
        self,
        sessions: list,            # list[Session] from s3_analyzer
        s3_path: Optional[Path],
        bw_path: Optional[Path],
        notes: str = "",
    ) -> Optional[int]:
        """
        Ingest a list of sessions into the DB.
        Returns capture_id, or None if already ingested (duplicate hash).
        """
        import datetime

        s3_blob = s3_path.read_bytes() if s3_path and s3_path.exists() else b""
        bw_blob = bw_path.read_bytes() if bw_path and bw_path.exists() else b""
        cap_hash = _sha256_blobs(s3_blob, bw_blob)

        # Dedup check
        row = self._con.execute(
            "SELECT id FROM captures WHERE capture_hash=?", (cap_hash,)
        ).fetchone()
        if row:
            return None  # already in DB

        ts = datetime.datetime.now().isoformat(timespec="seconds")
        cur = self._con.execute(
            "INSERT INTO captures (timestamp, s3_path, bw_path, capture_hash, notes) "
            "VALUES (?, ?, ?, ?, ?)",
            (ts, str(s3_path) if s3_path else None,
             str(bw_path) if bw_path else None,
             cap_hash, notes)
        )
        cap_id = cur.lastrowid

        for sess in sessions:
            for af in sess.all_frames:
                frame_id = self._insert_frame(cap_id, af)
                self._insert_byte_values(frame_id, af.frame.payload)

        self._con.commit()
        return cap_id

    def _insert_frame(self, cap_id: int, af) -> int:
        """Insert one AnnotatedFrame; return its rowid."""
        sub = af.header.sub if af.header else None
        page_key = af.header.page_key if af.header else None
        chk_ok = None
        if af.frame.checksum_valid is True:
            chk_ok = 1
        elif af.frame.checksum_valid is False:
            chk_ok = 0

        cur = self._con.execute(
            "INSERT INTO frames "
            "(capture_id, session_idx, direction, sub, page_key, sub_name, payload, payload_len, checksum_ok) "
            "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
            (cap_id, af.session_idx, af.source,
             sub, page_key, af.sub_name,
             af.frame.payload, len(af.frame.payload), chk_ok)
        )
        return cur.lastrowid

    def _insert_byte_values(self, frame_id: int, payload: bytes) -> None:
        """Insert one row per byte in payload into byte_values."""
        rows = [(frame_id, i, b) for i, b in enumerate(payload)]
        self._con.executemany(
            "INSERT INTO byte_values (frame_id, offset, value) VALUES (?, ?, ?)",
            rows
        )

    # ── Queries ───────────────────────────────────────────────────────────

    def list_captures(self) -> list[sqlite3.Row]:
        return self._con.execute(
            "SELECT id, timestamp, s3_path, bw_path, notes, "
            "  (SELECT COUNT(*) FROM frames WHERE capture_id=captures.id) AS frame_count "
            "FROM captures ORDER BY id DESC"
        ).fetchall()

    def query_frames(
        self,
        capture_id: Optional[int] = None,
        direction: Optional[str] = None,   # "BW" or "S3"
        sub: Optional[int] = None,
        page_key: Optional[int] = None,
        limit: int = 500,
    ) -> list[sqlite3.Row]:
        """
        Query frames table with optional filters.
        Returns rows with: id, capture_id, session_idx, direction, sub, page_key,
        sub_name, payload, payload_len, checksum_ok
        """
        clauses = []
        params = []

        if capture_id is not None:
            clauses.append("capture_id=?"); params.append(capture_id)
        if direction is not None:
            clauses.append("direction=?"); params.append(direction)
        if sub is not None:
            clauses.append("sub=?"); params.append(sub)
        if page_key is not None:
            clauses.append("page_key=?"); params.append(page_key)

        where = ("WHERE " + " AND ".join(clauses)) if clauses else ""
        sql = f"SELECT * FROM frames {where} ORDER BY id LIMIT ?"
        params.append(limit)

        return self._con.execute(sql, params).fetchall()

    def query_by_byte(
        self,
        offset: int,
        value: Optional[int] = None,
        capture_id: Optional[int] = None,
        direction: Optional[str] = None,
        sub: Optional[int] = None,
        limit: int = 500,
    ) -> list[sqlite3.Row]:
        """
        Return frames that have a specific byte at a specific offset.
        Joins byte_values -> frames for indexed lookup.
        """
        clauses = ["bv.offset=?"]
        params = [offset]

        if value is not None:
            clauses.append("bv.value=?"); params.append(value)
        if capture_id is not None:
            clauses.append("f.capture_id=?"); params.append(capture_id)
        if direction is not None:
            clauses.append("f.direction=?"); params.append(direction)
        if sub is not None:
            clauses.append("f.sub=?"); params.append(sub)

        where = "WHERE " + " AND ".join(clauses)
        sql = (
            f"SELECT f.*, bv.offset AS q_offset, bv.value AS q_value "
            f"FROM byte_values bv "
            f"JOIN frames f ON f.id=bv.frame_id "
            f"{where} "
            f"ORDER BY f.id LIMIT ?"
        )
        params.append(limit)
        return self._con.execute(sql, params).fetchall()

    def get_frame_payload(self, frame_id: int) -> Optional[bytes]:
        row = self._con.execute(
            "SELECT payload FROM frames WHERE id=?", (frame_id,)
        ).fetchone()
        return bytes(row["payload"]) if row else None

    def get_distinct_subs(self, capture_id: Optional[int] = None) -> list[int]:
        if capture_id is not None:
            rows = self._con.execute(
                "SELECT DISTINCT sub FROM frames WHERE capture_id=? AND sub IS NOT NULL ORDER BY sub",
                (capture_id,)
            ).fetchall()
        else:
            rows = self._con.execute(
                "SELECT DISTINCT sub FROM frames WHERE sub IS NOT NULL ORDER BY sub"
            ).fetchall()
        return [r[0] for r in rows]

    def get_distinct_offsets(self, capture_id: Optional[int] = None) -> list[int]:
        if capture_id is not None:
            rows = self._con.execute(
                "SELECT DISTINCT bv.offset FROM byte_values bv "
                "JOIN frames f ON f.id=bv.frame_id WHERE f.capture_id=? ORDER BY bv.offset",
                (capture_id,)
            ).fetchall()
        else:
            rows = self._con.execute(
                "SELECT DISTINCT offset FROM byte_values ORDER BY offset"
            ).fetchall()
        return [r[0] for r in rows]

    def interpret_offset(self, payload: bytes, offset: int) -> dict:
        """Return multi-format interpretation of bytes starting at offset."""
        return _interp_bytes(payload, offset)

    def get_stats(self) -> dict:
        captures = self._con.execute("SELECT COUNT(*) FROM captures").fetchone()[0]
        frames = self._con.execute("SELECT COUNT(*) FROM frames").fetchone()[0]
        bv_rows = self._con.execute("SELECT COUNT(*) FROM byte_values").fetchone()[0]
        return {"captures": captures, "frames": frames, "byte_value_rows": bv_rows}
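The `byte_values` table trades storage (one row per payload byte) for query speed: with the `(offset, value)` composite index, a question like "which frames carry 0x0A at offset 85?" becomes an index seek instead of a scan over every payload blob. A minimal standalone sketch of the same join, using toy data rather than the full schema above:

```python
import sqlite3

# Cut-down version of the frames/byte_values pattern.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE frames (id INTEGER PRIMARY KEY, payload BLOB);
CREATE TABLE byte_values (
    frame_id INTEGER REFERENCES frames(id),
    offset   INTEGER,
    value    INTEGER
);
CREATE INDEX idx_bv_off_val ON byte_values(offset, value);
""")

# Explode each payload into one byte_values row per byte.
for fid, payload in [(1, bytes([0x10, 0x2E, 0x00])),
                     (2, bytes([0x10, 0x1A, 0x64]))]:
    con.execute("INSERT INTO frames VALUES (?, ?)", (fid, payload))
    con.executemany("INSERT INTO byte_values VALUES (?, ?, ?)",
                    [(fid, i, b) for i, b in enumerate(payload)])

# Which frames carry byte 0x1A at offset 1? Resolved via the index.
rows = con.execute(
    "SELECT f.id FROM byte_values bv "
    "JOIN frames f ON f.id = bv.frame_id "
    "WHERE bv.offset = ? AND bv.value = ?", (1, 0x1A)
).fetchall()
print([r[0] for r in rows])  # → [2]
```

This is the shape of the SQL that `query_by_byte()` builds dynamically; the real method layers optional capture/direction/SUB filters on top of the same join.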
940
parsers/gui_analyzer.py
Normal file
@@ -0,0 +1,940 @@
#!/usr/bin/env python3
"""
gui_analyzer.py — Tkinter GUI for s3_analyzer.

Layout:
┌─────────────────────────────────────────────────────────┐
│ [S3 file: ___________ Browse]  [BW file: ___ Browse]    │
│ [Analyze] [Live mode toggle]            Status: Idle    │
├──────────────────┬──────────────────────────────────────┤
│ Session list     │ Detail panel (tabs)                  │
│ ─ Session 0      │ Inventory | Hex Dump | Diff          │
│   └ POLL (BW)    │                                      │
│   └ POLL_RESP    │ (content of selected tab)            │
│ ─ Session 1      │                                      │
│   └ ...          │                                      │
└──────────────────┴──────────────────────────────────────┘
│ Status bar                                              │
└─────────────────────────────────────────────────────────┘
"""

from __future__ import annotations

import queue
import sys
import threading
import time
import tkinter as tk
from pathlib import Path
from tkinter import filedialog, font, messagebox, ttk
from typing import Optional

sys.path.insert(0, str(Path(__file__).parent))
from s3_analyzer import (  # noqa: E402
    AnnotatedFrame,
    FrameDiff,
    Session,
    annotate_frames,
    diff_sessions,
    format_hex_dump,
    parse_bw,
    parse_s3,
    render_session_report,
    split_into_sessions,
    write_claude_export,
)
from frame_db import FrameDB, DEFAULT_DB_PATH  # noqa: E402


# ──────────────────────────────────────────────────────────────────────────────
# Colour palette (dark-ish terminal feel)
# ──────────────────────────────────────────────────────────────────────────────
BG = "#1e1e1e"
BG2 = "#252526"
BG3 = "#2d2d30"
FG = "#d4d4d4"
FG_DIM = "#6a6a6a"
ACCENT = "#569cd6"
ACCENT2 = "#4ec9b0"
RED = "#f44747"
YELLOW = "#dcdcaa"
GREEN = "#4caf50"
ORANGE = "#ce9178"

COL_BW = "#9cdcfe"    # BW frames
COL_S3 = "#4ec9b0"    # S3 frames
COL_DIFF = "#f44747"  # Changed bytes
COL_KNOW = "#4caf50"  # Known-field annotations
COL_HEAD = "#569cd6"  # Section headers

MONO = ("Consolas", 9)
MONO_SM = ("Consolas", 8)


# ──────────────────────────────────────────────────────────────────────────────
# State container
# ──────────────────────────────────────────────────────────────────────────────

class AnalyzerState:
    def __init__(self) -> None:
        self.sessions: list[Session] = []
        self.diffs: list[Optional[list[FrameDiff]]] = []  # diffs[i] = diff of session i vs i-1
        self.s3_path: Optional[Path] = None
        self.bw_path: Optional[Path] = None
        self.last_capture_id: Optional[int] = None


# ──────────────────────────────────────────────────────────────────────────────
# Main GUI
# ──────────────────────────────────────────────────────────────────────────────

class AnalyzerGUI(tk.Tk):
    def __init__(self) -> None:
        super().__init__()
        self.title("S3 Protocol Analyzer")
        self.configure(bg=BG)
        self.minsize(1050, 600)

        self.state = AnalyzerState()
        self._live_thread: Optional[threading.Thread] = None
        self._live_stop = threading.Event()
        self._live_q: queue.Queue[str] = queue.Queue()
        self._db = FrameDB()

        self._build_widgets()
        self._poll_live_queue()

    # ── widget construction ────────────────────────────────────────────────

    def _build_widgets(self) -> None:
        self._build_toolbar()
        self._build_panes()
        self._build_statusbar()

    def _build_toolbar(self) -> None:
        bar = tk.Frame(self, bg=BG2, pady=4)
        bar.pack(side=tk.TOP, fill=tk.X)

        pad = {"padx": 5, "pady": 2}

        # S3 file
        tk.Label(bar, text="S3 raw:", bg=BG2, fg=FG, font=MONO).pack(side=tk.LEFT, **pad)
        self.s3_var = tk.StringVar()
        tk.Entry(bar, textvariable=self.s3_var, width=28, bg=BG3, fg=FG,
                 insertbackground=FG, relief="flat", font=MONO).pack(side=tk.LEFT, **pad)
        tk.Button(bar, text="Browse", bg=BG3, fg=FG, relief="flat",
                  activebackground=ACCENT, cursor="hand2",
                  command=lambda: self._browse_file(self.s3_var, "raw_s3.bin")
                  ).pack(side=tk.LEFT, **pad)

        tk.Label(bar, text=" BW raw:", bg=BG2, fg=FG, font=MONO).pack(side=tk.LEFT, **pad)
        self.bw_var = tk.StringVar()
        tk.Entry(bar, textvariable=self.bw_var, width=28, bg=BG3, fg=FG,
                 insertbackground=FG, relief="flat", font=MONO).pack(side=tk.LEFT, **pad)
        tk.Button(bar, text="Browse", bg=BG3, fg=FG, relief="flat",
                  activebackground=ACCENT, cursor="hand2",
                  command=lambda: self._browse_file(self.bw_var, "raw_bw.bin")
                  ).pack(side=tk.LEFT, **pad)

        # Buttons
        tk.Frame(bar, bg=BG2, width=10).pack(side=tk.LEFT)
        self.analyze_btn = tk.Button(bar, text="Analyze", bg=ACCENT, fg="#ffffff",
                                     relief="flat", padx=10, cursor="hand2",
                                     font=("Consolas", 9, "bold"),
                                     command=self._run_analyze)
        self.analyze_btn.pack(side=tk.LEFT, **pad)

        self.live_btn = tk.Button(bar, text="Live: OFF", bg=BG3, fg=FG,
                                  relief="flat", padx=10, cursor="hand2",
                                  font=MONO, command=self._toggle_live)
        self.live_btn.pack(side=tk.LEFT, **pad)

        self.export_btn = tk.Button(bar, text="Export for Claude", bg=ORANGE, fg="#000000",
                                    relief="flat", padx=10, cursor="hand2",
                                    font=("Consolas", 9, "bold"),
                                    command=self._run_export, state="disabled")
        self.export_btn.pack(side=tk.LEFT, **pad)

        self.status_var = tk.StringVar(value="Idle")
        tk.Label(bar, textvariable=self.status_var, bg=BG2, fg=FG_DIM,
                 font=MONO, anchor="w").pack(side=tk.LEFT, padx=10)

    def _build_panes(self) -> None:
        pane = tk.PanedWindow(self, orient=tk.HORIZONTAL, bg=BG,
                              sashwidth=4, sashrelief="flat")
        pane.pack(fill=tk.BOTH, expand=True, padx=0, pady=0)

        # ── Left: session/frame tree ──────────────────────────────────────
        left = tk.Frame(pane, bg=BG2, width=260)
        pane.add(left, minsize=200)

        tk.Label(left, text="Sessions", bg=BG2, fg=ACCENT,
                 font=("Consolas", 9, "bold"), anchor="w", padx=6).pack(fill=tk.X)

        tree_frame = tk.Frame(left, bg=BG2)
        tree_frame.pack(fill=tk.BOTH, expand=True)

        style = ttk.Style()
        style.theme_use("clam")
        style.configure("Treeview",
                        background=BG2, foreground=FG, fieldbackground=BG2,
                        font=MONO_SM, rowheight=18, borderwidth=0)
        style.configure("Treeview.Heading",
                        background=BG3, foreground=ACCENT, font=MONO_SM)
        style.map("Treeview", background=[("selected", BG3)],
                  foreground=[("selected", "#ffffff")])

        self.tree = ttk.Treeview(tree_frame, columns=("info",), show="tree headings",
                                 selectmode="browse")
        self.tree.heading("#0", text="Frame")
        self.tree.heading("info", text="Info")
        self.tree.column("#0", width=160, stretch=True)
        self.tree.column("info", width=80, stretch=False)

        vsb = ttk.Scrollbar(tree_frame, orient="vertical", command=self.tree.yview)
        self.tree.configure(yscrollcommand=vsb.set)
        vsb.pack(side=tk.RIGHT, fill=tk.Y)
        self.tree.pack(fill=tk.BOTH, expand=True)

        self.tree.tag_configure("session", foreground=ACCENT, font=("Consolas", 9, "bold"))
        self.tree.tag_configure("bw_frame", foreground=COL_BW)
        self.tree.tag_configure("s3_frame", foreground=COL_S3)
        self.tree.tag_configure("bad_chk", foreground=RED)
        self.tree.tag_configure("malformed", foreground=RED)

        self.tree.bind("<<TreeviewSelect>>", self._on_tree_select)

        # ── Right: detail notebook ────────────────────────────────────────
        right = tk.Frame(pane, bg=BG)
        pane.add(right, minsize=600)

        style.configure("TNotebook", background=BG2, borderwidth=0)
        style.configure("TNotebook.Tab", background=BG3, foreground=FG,
                        font=MONO, padding=[8, 2])
        style.map("TNotebook.Tab", background=[("selected", BG)],
                  foreground=[("selected", ACCENT)])

        self.nb = ttk.Notebook(right)
        self.nb.pack(fill=tk.BOTH, expand=True)

        # Tab: Inventory
        self.inv_text = self._make_text_tab("Inventory")
        # Tab: Hex Dump
        self.hex_text = self._make_text_tab("Hex Dump")
        # Tab: Diff
        self.diff_text = self._make_text_tab("Diff")
        # Tab: Full Report (raw text)
        self.report_text = self._make_text_tab("Full Report")
        # Tab: Query (DB)
        self._build_query_tab()

        # Tag colours for rich text in all tabs
        for w in (self.inv_text, self.hex_text, self.diff_text, self.report_text):
            w.tag_configure("head", foreground=COL_HEAD, font=("Consolas", 9, "bold"))
            w.tag_configure("bw", foreground=COL_BW)
            w.tag_configure("s3", foreground=COL_S3)
            w.tag_configure("changed", foreground=COL_DIFF)
            w.tag_configure("known", foreground=COL_KNOW)
            w.tag_configure("dim", foreground=FG_DIM)
            w.tag_configure("normal", foreground=FG)
            w.tag_configure("warn", foreground=YELLOW)
            w.tag_configure("addr", foreground=ORANGE)

    def _make_text_tab(self, title: str) -> tk.Text:
        frame = tk.Frame(self.nb, bg=BG)
        self.nb.add(frame, text=title)
        w = tk.Text(frame, bg=BG, fg=FG, font=MONO, state="disabled",
                    relief="flat", wrap="none", insertbackground=FG,
                    selectbackground=BG3, selectforeground="#ffffff")
        vsb = ttk.Scrollbar(frame, orient="vertical", command=w.yview)
        hsb = ttk.Scrollbar(frame, orient="horizontal", command=w.xview)
        w.configure(yscrollcommand=vsb.set, xscrollcommand=hsb.set)
        vsb.pack(side=tk.RIGHT, fill=tk.Y)
        hsb.pack(side=tk.BOTTOM, fill=tk.X)
        w.pack(fill=tk.BOTH, expand=True)
        return w

    def _build_query_tab(self) -> None:
        """Build the Query tab: filter controls + results table + interpretation panel."""
        frame = tk.Frame(self.nb, bg=BG)
        self.nb.add(frame, text="Query DB")

        # ── Filter row ────────────────────────────────────────────────────
        filt = tk.Frame(frame, bg=BG2, pady=4)
        filt.pack(side=tk.TOP, fill=tk.X)

        pad = {"padx": 4, "pady": 2}

        # Capture filter
        tk.Label(filt, text="Capture:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=0, sticky="e", **pad)
        self._q_capture_var = tk.StringVar(value="All")
        self._q_capture_cb = ttk.Combobox(filt, textvariable=self._q_capture_var,
                                          width=18, font=MONO_SM, state="readonly")
        self._q_capture_cb.grid(row=0, column=1, sticky="w", **pad)

        # Direction filter
        tk.Label(filt, text="Dir:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=2, sticky="e", **pad)
        self._q_dir_var = tk.StringVar(value="All")
        self._q_dir_cb = ttk.Combobox(filt, textvariable=self._q_dir_var,
                                      values=["All", "BW", "S3"],
                                      width=6, font=MONO_SM, state="readonly")
        self._q_dir_cb.grid(row=0, column=3, sticky="w", **pad)

        # SUB filter
        tk.Label(filt, text="SUB:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=4, sticky="e", **pad)
        self._q_sub_var = tk.StringVar(value="All")
        self._q_sub_cb = ttk.Combobox(filt, textvariable=self._q_sub_var,
                                      width=12, font=MONO_SM, state="readonly")
        self._q_sub_cb.grid(row=0, column=5, sticky="w", **pad)

        # Byte offset filter
        tk.Label(filt, text="Offset:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=6, sticky="e", **pad)
        self._q_offset_var = tk.StringVar(value="")
        tk.Entry(filt, textvariable=self._q_offset_var, width=8, bg=BG3, fg=FG,
                 font=MONO_SM, insertbackground=FG, relief="flat").grid(row=0, column=7, sticky="w", **pad)

        # Value filter
        tk.Label(filt, text="Value:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=8, sticky="e", **pad)
        self._q_value_var = tk.StringVar(value="")
        tk.Entry(filt, textvariable=self._q_value_var, width=8, bg=BG3, fg=FG,
                 font=MONO_SM, insertbackground=FG, relief="flat").grid(row=0, column=9, sticky="w", **pad)

        # Run / Refresh buttons
        tk.Button(filt, text="Run Query", bg=ACCENT, fg="#ffffff", relief="flat",
                  padx=8, cursor="hand2", font=("Consolas", 8, "bold"),
                  command=self._run_db_query).grid(row=0, column=10, padx=8)
        tk.Button(filt, text="Refresh dropdowns", bg=BG3, fg=FG, relief="flat",
                  padx=6, cursor="hand2", font=MONO_SM,
                  command=self._refresh_query_dropdowns).grid(row=0, column=11, padx=4)

        # DB stats label
        self._q_stats_var = tk.StringVar(value="DB: —")
        tk.Label(filt, textvariable=self._q_stats_var, bg=BG2, fg=FG_DIM,
                 font=MONO_SM).grid(row=0, column=12, padx=12, sticky="w")

        # ── Results table ─────────────────────────────────────────────────
        res_frame = tk.Frame(frame, bg=BG)
        res_frame.pack(side=tk.TOP, fill=tk.BOTH, expand=True)

        # Results treeview
        cols = ("cap", "sess", "dir", "sub", "sub_name", "page", "len", "chk")
        self._q_tree = ttk.Treeview(res_frame, columns=cols,
                                    show="headings", selectmode="browse")
        col_cfg = [
            ("cap", "Cap", 40),
            ("sess", "Sess", 40),
            ("dir", "Dir", 40),
            ("sub", "SUB", 50),
            ("sub_name", "Name", 160),
            ("page", "Page", 60),
            ("len", "Len", 50),
            ("chk", "Chk", 50),
        ]
        for cid, heading, width in col_cfg:
            self._q_tree.heading(cid, text=heading, anchor="w")
            self._q_tree.column(cid, width=width, stretch=(cid == "sub_name"))

        q_vsb = ttk.Scrollbar(res_frame, orient="vertical", command=self._q_tree.yview)
        q_hsb = ttk.Scrollbar(res_frame, orient="horizontal", command=self._q_tree.xview)
        self._q_tree.configure(yscrollcommand=q_vsb.set, xscrollcommand=q_hsb.set)
        q_vsb.pack(side=tk.RIGHT, fill=tk.Y)
        q_hsb.pack(side=tk.BOTTOM, fill=tk.X)
        self._q_tree.pack(side=tk.LEFT, fill=tk.BOTH, expand=True)

        self._q_tree.tag_configure("bw_row", foreground=COL_BW)
        self._q_tree.tag_configure("s3_row", foreground=COL_S3)
        self._q_tree.tag_configure("bad_row", foreground=RED)

        # ── Interpretation panel (below results) ──────────────────────────
        interp_frame = tk.Frame(frame, bg=BG2, height=120)
        interp_frame.pack(side=tk.BOTTOM, fill=tk.X)
        interp_frame.pack_propagate(False)

        tk.Label(interp_frame, text="Byte interpretation (click a row, enter offset):",
                 bg=BG2, fg=ACCENT, font=MONO_SM, anchor="w", padx=6).pack(fill=tk.X)

        interp_inner = tk.Frame(interp_frame, bg=BG2)
        interp_inner.pack(fill=tk.X, padx=6, pady=2)

        tk.Label(interp_inner, text="Offset:", bg=BG2, fg=FG, font=MONO_SM).pack(side=tk.LEFT)
        self._interp_offset_var = tk.StringVar(value="5")
        tk.Entry(interp_inner, textvariable=self._interp_offset_var,
                 width=6, bg=BG3, fg=FG, font=MONO_SM,
                 insertbackground=FG, relief="flat").pack(side=tk.LEFT, padx=4)
        tk.Button(interp_inner, text="Interpret", bg=BG3, fg=FG, relief="flat",
                  cursor="hand2", font=MONO_SM,
                  command=self._run_interpret).pack(side=tk.LEFT, padx=4)

        self._interp_text = tk.Text(interp_frame, bg=BG2, fg=FG, font=MONO_SM,
                                    height=4, relief="flat", state="disabled",
                                    insertbackground=FG)
        self._interp_text.pack(fill=tk.X, padx=6, pady=2)
        self._interp_text.tag_configure("label", foreground=FG_DIM)
        self._interp_text.tag_configure("value", foreground=YELLOW)

        # Store frame rows by tree iid -> db row
        self._q_rows: dict[str, object] = {}
        self._q_capture_rows: list = [None]
        self._q_sub_values: list = [None]
        self._q_tree.bind("<<TreeviewSelect>>", self._on_q_select)

        # Init dropdowns
        self._refresh_query_dropdowns()

    def _refresh_query_dropdowns(self) -> None:
        """Reload capture and SUB dropdowns from the DB."""
        try:
            captures = self._db.list_captures()
            cap_labels = ["All"] + [
                f"#{r['id']} {r['timestamp'][:16]} ({r['frame_count']} frames)"
                for r in captures
            ]
            self._q_capture_cb["values"] = cap_labels
            self._q_capture_rows = [None] + [r["id"] for r in captures]

            subs = self._db.get_distinct_subs()
            sub_labels = ["All"] + [f"0x{s:02X}" for s in subs]
|
||||||
|
self._q_sub_cb["values"] = sub_labels
|
||||||
|
self._q_sub_values = [None] + subs
|
||||||
|
|
||||||
|
stats = self._db.get_stats()
|
||||||
|
self._q_stats_var.set(
|
||||||
|
f"DB: {stats['captures']} captures | {stats['frames']} frames"
|
||||||
|
)
|
||||||
|
except Exception as exc:
|
||||||
|
self._q_stats_var.set(f"DB error: {exc}")
|
||||||
|
|
||||||
|
def _parse_hex_or_int(self, s: str) -> Optional[int]:
|
||||||
|
"""Parse '0x1F', '31', or '' into int or None."""
|
||||||
|
s = s.strip()
|
||||||
|
if not s:
|
||||||
|
return None
|
||||||
|
try:
|
||||||
|
return int(s, 0)
|
||||||
|
except ValueError:
|
||||||
|
return None
|
||||||
|
|
||||||
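The `_parse_hex_or_int` helper relies on Python's base-0 `int()` parsing, which accepts `0x`, `0o`, and `0b` prefixes as well as plain decimal. A standalone sketch of the same behaviour (a hypothetical free function, not the GUI method itself):

```python
from typing import Optional

def parse_hex_or_int(s: str) -> Optional[int]:
    """Parse '0x1F', '31', or '' into an int, or None on failure."""
    s = s.strip()
    if not s:
        return None
    try:
        return int(s, 0)  # base 0: honours 0x/0o/0b prefixes
    except ValueError:
        return None
```

Note that base-0 parsing rejects decimal literals with leading zeros (`"010"` raises `ValueError`), which is acceptable for a filter field.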
    def _run_db_query(self) -> None:
        """Execute query with current filter values and populate results tree."""
        # Resolve capture_id
        cap_idx = self._q_capture_cb.current()
        cap_id = self._q_capture_rows[cap_idx] if cap_idx > 0 else None

        # Direction
        dir_val = self._q_dir_var.get()
        direction = dir_val if dir_val != "All" else None

        # SUB
        sub_idx = self._q_sub_cb.current()
        sub = self._q_sub_values[sub_idx] if sub_idx > 0 else None

        # Offset / value
        offset = self._parse_hex_or_int(self._q_offset_var.get())
        value = self._parse_hex_or_int(self._q_value_var.get())

        try:
            if offset is not None:
                rows = self._db.query_by_byte(
                    offset=offset, value=value,
                    capture_id=cap_id, direction=direction, sub=sub
                )
            else:
                rows = self._db.query_frames(
                    capture_id=cap_id, direction=direction, sub=sub
                )
        except Exception as exc:
            messagebox.showerror("Query error", str(exc))
            return

        # Populate tree
        self._q_tree.delete(*self._q_tree.get_children())
        self._q_rows.clear()

        for row in rows:
            sub_hex = f"0x{row['sub']:02X}" if row["sub"] is not None else "—"
            page_hex = f"0x{row['page_key']:04X}" if row["page_key"] is not None else "—"
            chk_str = {1: "OK", 0: "BAD", None: "—"}.get(row["checksum_ok"], "—")
            tag = "bw_row" if row["direction"] == "BW" else "s3_row"
            if row["checksum_ok"] == 0:
                tag = "bad_row"

            iid = str(row["id"])
            self._q_tree.insert("", tk.END, iid=iid, tags=(tag,), values=(
                row["capture_id"],
                row["session_idx"],
                row["direction"],
                sub_hex,
                row["sub_name"] or "",
                page_hex,
                row["payload_len"],
                chk_str,
            ))
            self._q_rows[iid] = row

        self.sb_var.set(f"Query returned {len(rows)} rows")

    def _on_q_select(self, _event: tk.Event) -> None:
        """When a DB result row is selected, auto-run interpret at current offset."""
        self._run_interpret()

    def _run_interpret(self) -> None:
        """Show multi-format byte interpretation for the selected row + offset."""
        sel = self._q_tree.selection()
        if not sel:
            return
        iid = sel[0]
        row = self._q_rows.get(iid)
        if row is None:
            return

        offset = self._parse_hex_or_int(self._interp_offset_var.get())
        if offset is None:
            return

        payload = bytes(row["payload"])
        interp = self._db.interpret_offset(payload, offset)

        w = self._interp_text
        w.configure(state="normal")
        w.delete("1.0", tk.END)

        sub_hex = f"0x{row['sub']:02X}" if row["sub"] is not None else "??"
        w.insert(tk.END, f"Frame #{row['id']} [{row['direction']}] SUB={sub_hex} "
                         f"offset={offset} (0x{offset:04X})\n", "label")

        label_order = [
            ("uint8", "uint8 "),
            ("int8", "int8 "),
            ("uint16_be", "uint16 BE "),
            ("uint16_le", "uint16 LE "),
            ("uint32_be", "uint32 BE "),
            ("uint32_le", "uint32 LE "),
            ("float32_be", "float32 BE "),
            ("float32_le", "float32 LE "),
        ]
        line = ""
        for key, label in label_order:
            if key in interp:
                val = interp[key]
                if isinstance(val, float):
                    val_str = f"{val:.6g}"
                else:
                    val_str = str(val)
                if key.startswith("uint") or key.startswith("int"):
                    val_str += f" (0x{int(val) & 0xFFFFFFFF:X})"
                chunk = f"{label}: {val_str}"
                line += f" {chunk:<30}"
                if len(line) > 80:
                    w.insert(tk.END, line + "\n", "value")
                    line = ""
        if line:
            w.insert(tk.END, line + "\n", "value")

        w.configure(state="disabled")
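`interpret_offset` is implemented in the DB helper, so its exact return shape is defined there; a minimal stand-in built on `struct.unpack_from`, assuming the same key names as the `label_order` table above:

```python
import struct

def interpret_offset(payload: bytes, offset: int) -> dict:
    """Decode the bytes at `offset` in every width/endianness that fits."""
    out: dict = {}
    n = len(payload)
    if offset < 0 or offset >= n:
        return out
    out["uint8"] = payload[offset]
    out["int8"] = struct.unpack_from("<b", payload, offset)[0]
    if offset + 2 <= n:
        out["uint16_be"] = struct.unpack_from(">H", payload, offset)[0]
        out["uint16_le"] = struct.unpack_from("<H", payload, offset)[0]
    if offset + 4 <= n:
        out["uint32_be"] = struct.unpack_from(">I", payload, offset)[0]
        out["uint32_le"] = struct.unpack_from("<I", payload, offset)[0]
        out["float32_be"] = struct.unpack_from(">f", payload, offset)[0]
        out["float32_le"] = struct.unpack_from("<f", payload, offset)[0]
    return out
```

Showing both byte orders side by side is what makes unknown device fields fall out quickly: one of the two usually produces a plausible magnitude.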
    def _build_statusbar(self) -> None:
        bar = tk.Frame(self, bg=BG3, height=20)
        bar.pack(side=tk.BOTTOM, fill=tk.X)
        self.sb_var = tk.StringVar(value="Ready")
        tk.Label(bar, textvariable=self.sb_var, bg=BG3, fg=FG_DIM,
                 font=MONO_SM, anchor="w", padx=6).pack(fill=tk.X)

    # ── file picking ───────────────────────────────────────────────────────

    def _browse_file(self, var: tk.StringVar, default_name: str) -> None:
        path = filedialog.askopenfilename(
            title=f"Select {default_name}",
            filetypes=[("Binary files", "*.bin"), ("All files", "*.*")],
            initialfile=default_name,
        )
        if path:
            var.set(path)

    # ── analysis ──────────────────────────────────────────────────────────

    def _run_analyze(self) -> None:
        s3_path = Path(self.s3_var.get().strip()) if self.s3_var.get().strip() else None
        bw_path = Path(self.bw_var.get().strip()) if self.bw_var.get().strip() else None

        if not s3_path or not bw_path:
            messagebox.showerror("Missing files", "Please select both S3 and BW raw files.")
            return
        if not s3_path.exists():
            messagebox.showerror("File not found", f"S3 file not found:\n{s3_path}")
            return
        if not bw_path.exists():
            messagebox.showerror("File not found", f"BW file not found:\n{bw_path}")
            return

        self.state.s3_path = s3_path
        self.state.bw_path = bw_path
        self._do_analyze(s3_path, bw_path)

    def _run_export(self) -> None:
        if not self.state.sessions:
            messagebox.showinfo("Export", "Run Analyze first.")
            return

        outdir = self.state.s3_path.parent if self.state.s3_path else Path(".")
        out_path = write_claude_export(
            self.state.sessions,
            self.state.diffs,
            outdir,
            self.state.s3_path,
            self.state.bw_path,
        )

        self.sb_var.set(f"Exported: {out_path.name}")
        if messagebox.askyesno(
            "Export complete",
            f"Saved to:\n{out_path}\n\nOpen the folder?",
        ):
            import subprocess
            subprocess.Popen(["explorer", str(out_path.parent)])

    def _do_analyze(self, s3_path: Path, bw_path: Path) -> None:
        self.status_var.set("Parsing...")
        self.update_idletasks()

        s3_blob = s3_path.read_bytes()
        bw_blob = bw_path.read_bytes()

        s3_frames = annotate_frames(parse_s3(s3_blob, trailer_len=0), "S3")
        bw_frames = annotate_frames(parse_bw(bw_blob, trailer_len=0, validate_checksum=True), "BW")

        sessions = split_into_sessions(bw_frames, s3_frames)

        diffs: list[Optional[list[FrameDiff]]] = [None]
        for i in range(1, len(sessions)):
            diffs.append(diff_sessions(sessions[i - 1], sessions[i]))

        self.state.sessions = sessions
        self.state.diffs = diffs

        n_s3 = sum(len(s.s3_frames) for s in sessions)
        n_bw = sum(len(s.bw_frames) for s in sessions)
        self.status_var.set(
            f"{len(sessions)} sessions | BW: {n_bw} frames  S3: {n_s3} frames"
        )
        self.sb_var.set(f"Loaded: {s3_path.name} + {bw_path.name}")

        self.export_btn.configure(state="normal")
        self._rebuild_tree()

        # Auto-ingest into DB (deduped by SHA256 — fast no-op on re-analyze)
        try:
            cap_id = self._db.ingest(sessions, s3_path, bw_path)
            if cap_id is not None:
                self.state.last_capture_id = cap_id
                self._refresh_query_dropdowns()
                # Pre-select this capture in the Query tab
                cap_labels = list(self._q_capture_cb["values"])
                # Find label that starts with #<cap_id>
                for i, lbl in enumerate(cap_labels):
                    if lbl.startswith(f"#{cap_id} "):
                        self._q_capture_cb.current(i)
                        break
            # else: already ingested — no change to dropdown selection
        except Exception as exc:
            self.sb_var.set(f"DB ingest error: {exc}")
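The auto-ingest comment above mentions SHA256 dedup; the `ingest` schema itself is project-specific, but the identity check can be sketched as hashing the two raw capture blobs (a hypothetical helper, length-prefixed so the pair is unambiguous):

```python
import hashlib

def capture_digest(s3_blob: bytes, bw_blob: bytes) -> str:
    """Stable identity for a capture pair: SHA256 over both raw blobs.

    The length prefix keeps (b"ab", b"c") distinct from (b"a", b"bc").
    """
    h = hashlib.sha256()
    h.update(len(s3_blob).to_bytes(8, "big"))
    h.update(s3_blob)
    h.update(bw_blob)
    return h.hexdigest()
```

Looking the digest up in a `UNIQUE` column before inserting is what makes re-running Analyze on the same files a fast no-op.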
    # ── tree building ──────────────────────────────────────────────────────

    def _rebuild_tree(self) -> None:
        self.tree.delete(*self.tree.get_children())

        for sess in self.state.sessions:
            is_complete = any(
                af.header is not None and af.header.sub == 0x74
                for af in sess.bw_frames
            )
            label = f"Session {sess.index}"
            if not is_complete:
                label += " [partial]"
            n_diff = len(self.state.diffs[sess.index] or [])
            diff_info = f"{n_diff} changes" if n_diff > 0 else ""
            sess_id = self.tree.insert("", tk.END, text=label,
                                       values=(diff_info,), tags=("session",))

            for af in sess.all_frames:
                src_tag = "bw_frame" if af.source == "BW" else "s3_frame"
                sub_hex = f"{af.header.sub:02X}" if af.header else "??"
                label_text = f"[{af.source}] {sub_hex} {af.sub_name}"
                extra = ""
                tags = (src_tag,)
                if af.frame.checksum_valid is False:
                    extra = "BAD CHK"
                    tags = ("bad_chk",)
                elif af.header is None:
                    tags = ("malformed",)
                    label_text = f"[{af.source}] MALFORMED"
                self.tree.insert(sess_id, tk.END, text=label_text,
                                 values=(extra,), tags=tags,
                                 iid=f"frame_{sess.index}_{af.frame.index}_{af.source}")

        # Expand all sessions
        for item in self.tree.get_children():
            self.tree.item(item, open=True)

    # ── tree selection → detail panel ─────────────────────────────────────

    def _on_tree_select(self, _event: tk.Event) -> None:
        sel = self.tree.selection()
        if not sel:
            return
        iid = sel[0]

        # Determine if it's a session node or a frame node
        if iid.startswith("frame_"):
            # frame_<sessidx>_<frameidx>_<source>
            parts = iid.split("_")
            sess_idx = int(parts[1])
            frame_idx = int(parts[2])
            source = parts[3]
            self._show_frame_detail(sess_idx, frame_idx, source)
        else:
            # Session node — show session summary
            # Find session index from text
            text = self.tree.item(iid, "text")
            try:
                idx = int(text.split()[1])
                self._show_session_detail(idx)
            except (IndexError, ValueError):
                pass

    def _find_frame(self, sess_idx: int, frame_idx: int, source: str) -> Optional[AnnotatedFrame]:
        if sess_idx >= len(self.state.sessions):
            return None
        sess = self.state.sessions[sess_idx]
        pool = sess.bw_frames if source == "BW" else sess.s3_frames
        for af in pool:
            if af.frame.index == frame_idx:
                return af
        return None

    # ── detail renderers ──────────────────────────────────────────────────

    def _clear_all_tabs(self) -> None:
        for w in (self.inv_text, self.hex_text, self.diff_text, self.report_text):
            self._text_clear(w)

    def _show_session_detail(self, sess_idx: int) -> None:
        if sess_idx >= len(self.state.sessions):
            return
        sess = self.state.sessions[sess_idx]
        diffs = self.state.diffs[sess_idx]

        self._clear_all_tabs()

        # ── Inventory tab ────────────────────────────────────────────────
        w = self.inv_text
        self._text_clear(w)
        self._tw(w, f"SESSION {sess.index}", "head"); self._tn(w)
        n_bw, n_s3 = len(sess.bw_frames), len(sess.s3_frames)
        self._tw(w, f"Frames: {n_bw + n_s3} (BW: {n_bw}, S3: {n_s3})\n", "normal")
        if n_bw != n_s3:
            self._tw(w, "  WARNING: BW/S3 count mismatch\n", "warn")
        self._tn(w)

        for seq_i, af in enumerate(sess.all_frames):
            src_tag = "bw" if af.source == "BW" else "s3"
            sub_hex = f"{af.header.sub:02X}" if af.header else "??"
            page_str = f" (page {af.header.page_key:04X})" if af.header and af.header.page_key != 0 else ""
            chk = ""
            if af.frame.checksum_valid is False:
                chk = " [BAD CHECKSUM]"
            elif af.frame.checksum_valid is True:
                chk = f" [{af.frame.checksum_type}]"
            self._tw(w, f"  [{af.source}] #{seq_i:<3} ", src_tag)
            self._tw(w, f"SUB={sub_hex} ", "addr")
            self._tw(w, f"{af.sub_name:<30}", src_tag)
            self._tw(w, f"{page_str} len={len(af.frame.payload)}", "dim")
            if chk:
                self._tw(w, chk, "warn" if af.frame.checksum_valid is False else "dim")
            self._tn(w)

        # ── Diff tab ─────────────────────────────────────────────────────
        w = self.diff_text
        self._text_clear(w)
        if diffs is None:
            self._tw(w, "(No previous session to diff against)\n", "dim")
        elif not diffs:
            self._tw(w, f"DIFF vs SESSION {sess_idx - 1}\n", "head"); self._tn(w)
            self._tw(w, "  No changes detected.\n", "dim")
        else:
            self._tw(w, f"DIFF vs SESSION {sess_idx - 1}\n", "head"); self._tn(w)
            for fd in diffs:
                page_str = f" (page {fd.page_key:04X})" if fd.page_key != 0 else ""
                self._tw(w, f"\n  SUB {fd.sub:02X} ({fd.sub_name}){page_str}:\n", "addr")
                for bd in fd.diffs:
                    before_s = f"{bd.before:02x}" if bd.before >= 0 else "--"
                    after_s = f"{bd.after:02x}" if bd.after >= 0 else "--"
                    self._tw(w, f"    [{bd.payload_offset:3d}] 0x{bd.payload_offset:04X}: ", "dim")
                    self._tw(w, f"{before_s} -> {after_s}", "changed")
                    if bd.field_name:
                        self._tw(w, f"  [{bd.field_name}]", "known")
                    self._tn(w)

        # ── Full Report tab ───────────────────────────────────────────────
        report_text = render_session_report(sess, diffs, sess_idx - 1 if sess_idx > 0 else None)
        w = self.report_text
        self._text_clear(w)
        self._tw(w, report_text, "normal")

        # Switch to Inventory tab
        self.nb.select(0)

    def _show_frame_detail(self, sess_idx: int, frame_idx: int, source: str) -> None:
        af = self._find_frame(sess_idx, frame_idx, source)
        if af is None:
            return

        self._clear_all_tabs()
        src_tag = "bw" if source == "BW" else "s3"
        sub_hex = f"{af.header.sub:02X}" if af.header else "??"

        # ── Inventory tab — single frame summary ─────────────────────────
        w = self.inv_text
        self._tw(w, f"[{af.source}] Frame #{af.frame.index}\n", src_tag)
        self._tw(w, f"Session {sess_idx} | ", "dim")
        self._tw(w, f"SUB={sub_hex} {af.sub_name}\n", "addr")
        if af.header:
            self._tw(w, f"  OFFSET: {af.header.page_key:04X}  ", "dim")
            self._tw(w, f"CMD={af.header.cmd:02X} FLAGS={af.header.flags:02X}\n", "dim")
        self._tn(w)
        self._tw(w, f"Payload bytes: {len(af.frame.payload)}\n", "dim")
        if af.frame.checksum_valid is False:
            self._tw(w, "  BAD CHECKSUM\n", "warn")
        elif af.frame.checksum_valid is True:
            self._tw(w, f"  Checksum: {af.frame.checksum_type} {af.frame.checksum_hex}\n", "dim")
        self._tn(w)

        # Protocol header breakdown
        p = af.frame.payload
        if len(p) >= 5:
            self._tw(w, "Header breakdown:\n", "head")
            self._tw(w, f"  [0] CMD       = {p[0]:02x}\n", "dim")
            self._tw(w, f"  [1] ?         = {p[1]:02x}\n", "dim")
            self._tw(w, f"  [2] SUB       = {p[2]:02x} ({af.sub_name})\n", src_tag)
            self._tw(w, f"  [3] OFFSET_HI = {p[3]:02x}\n", "dim")
            self._tw(w, f"  [4] OFFSET_LO = {p[4]:02x}\n", "dim")
            if len(p) > 5:
                self._tw(w, f"  [5..] data    = {len(p) - 5} bytes\n", "dim")

        # ── Hex Dump tab ─────────────────────────────────────────────────
        w = self.hex_text
        self._tw(w, f"[{af.source}] SUB={sub_hex} {af.sub_name}\n", src_tag)
        self._tw(w, f"Payload ({len(af.frame.payload)} bytes):\n", "dim")
        self._tn(w)
        dump_lines = format_hex_dump(af.frame.payload, indent=" ")
        self._tw(w, "\n".join(dump_lines) + "\n", "normal")

        # Annotate known field offsets within this frame
        diffs_for_sess = self.state.diffs[sess_idx] if sess_idx < len(self.state.diffs) else None
        if diffs_for_sess and af.header:
            page_key = af.header.page_key
            matching = [fd for fd in diffs_for_sess
                        if fd.sub == af.header.sub and fd.page_key == page_key]
            if matching:
                self._tn(w)
                self._tw(w, "Changed bytes in this frame (vs prev session):\n", "head")
                for bd in matching[0].diffs:
                    before_s = f"{bd.before:02x}" if bd.before >= 0 else "--"
                    after_s = f"{bd.after:02x}" if bd.after >= 0 else "--"
                    self._tw(w, f"  [{bd.payload_offset:3d}] 0x{bd.payload_offset:04X}: ", "dim")
                    self._tw(w, f"{before_s} -> {after_s}", "changed")
                    if bd.field_name:
                        self._tw(w, f"  [{bd.field_name}]", "known")
                    self._tn(w)

        # Switch to Hex Dump tab for frame selection
        self.nb.select(1)
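`format_hex_dump` comes from the parser module; a compatible sketch with the signature assumed from the call above (offset, hex, and printable-ASCII columns, one list entry per line):

```python
def format_hex_dump(data: bytes, indent: str = "", width: int = 16) -> list[str]:
    """Classic hex dump: offset, hex bytes, ASCII gutter."""
    lines = []
    for off in range(0, len(data), width):
        chunk = data[off:off + width]
        hex_part = " ".join(f"{b:02x}" for b in chunk)
        ascii_part = "".join(chr(b) if 0x20 <= b < 0x7F else "." for b in chunk)
        # pad the hex column so the ASCII gutter lines up on short final rows
        lines.append(f"{indent}{off:04x}  {hex_part:<{width * 3 - 1}}  {ascii_part}")
    return lines
```

Returning a list of strings (rather than one blob) is what lets the caller join with `"\n"` or tag individual lines.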
    # ── live mode ─────────────────────────────────────────────────────────

    def _toggle_live(self) -> None:
        if self._live_thread and self._live_thread.is_alive():
            self._live_stop.set()
            self.live_btn.configure(text="Live: OFF", bg=BG3, fg=FG)
            self.status_var.set("Live stopped")
        else:
            s3_path = Path(self.s3_var.get().strip()) if self.s3_var.get().strip() else None
            bw_path = Path(self.bw_var.get().strip()) if self.bw_var.get().strip() else None
            if not s3_path or not bw_path:
                messagebox.showerror("Missing files", "Select both raw files before starting live mode.")
                return
            self.state.s3_path = s3_path
            self.state.bw_path = bw_path
            self._live_stop.clear()
            self._live_thread = threading.Thread(
                target=self._live_worker, args=(s3_path, bw_path), daemon=True)
            self._live_thread.start()
            self.live_btn.configure(text="Live: ON", bg=GREEN, fg="#000000")
            self.status_var.set("Live mode running...")

    def _live_worker(self, s3_path: Path, bw_path: Path) -> None:
        s3_buf = bytearray()
        bw_buf = bytearray()
        s3_pos = bw_pos = 0

        while not self._live_stop.is_set():
            changed = False
            if s3_path.exists():
                with s3_path.open("rb") as fh:
                    fh.seek(s3_pos)
                    nb = fh.read()
                    if nb:
                        s3_buf.extend(nb); s3_pos += len(nb); changed = True
            if bw_path.exists():
                with bw_path.open("rb") as fh:
                    fh.seek(bw_pos)
                    nb = fh.read()
                    if nb:
                        bw_buf.extend(nb); bw_pos += len(nb); changed = True

            if changed:
                self._live_q.put("refresh")

            time.sleep(0.1)

    def _poll_live_queue(self) -> None:
        try:
            while True:
                msg = self._live_q.get_nowait()
                if msg == "refresh" and self.state.s3_path and self.state.bw_path:
                    self._do_analyze(self.state.s3_path, self.state.bw_path)
        except queue.Empty:
            pass
        finally:
            self.after(150, self._poll_live_queue)

    # ── text helpers ──────────────────────────────────────────────────────

    def _text_clear(self, w: tk.Text) -> None:
        w.configure(state="normal")
        w.delete("1.0", tk.END)
        # leave enabled for further inserts

    def _tw(self, w: tk.Text, text: str, tag: str = "normal") -> None:
        """Insert text with a colour tag."""
        w.configure(state="normal")
        w.insert(tk.END, text, tag)

    def _tn(self, w: tk.Text) -> None:
        """Insert newline."""
        w.configure(state="normal")
        w.insert(tk.END, "\n")
        w.configure(state="disabled")


# ──────────────────────────────────────────────────────────────────────────────
# Entry point
# ──────────────────────────────────────────────────────────────────────────────

def main() -> None:
    app = AnalyzerGUI()
    app.mainloop()


if __name__ == "__main__":
    main()
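The live worker tails both capture files by remembering a byte offset and reading only the delta on each pass. The same pattern in isolation (a hypothetical helper, assuming an append-only writer such as the bridge logger):

```python
from pathlib import Path

def tail_new_bytes(path: Path, pos: int) -> tuple[bytes, int]:
    """Return (new_bytes, new_pos) for an append-only file, starting at pos."""
    if not path.exists():
        return b"", pos
    with path.open("rb") as fh:
        fh.seek(pos)       # skip everything already consumed
        nb = fh.read()     # read only the freshly appended tail
    return nb, pos + len(nb)
```

Reopening the file on every poll (instead of holding it open) keeps the reader robust against the writer rotating or recreating the file.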
BIN   parsers/raw_bw.bin (new file; binary file not shown)
BIN   parsers/raw_s3.bin (new file; binary file not shown)
1204  parsers/s3_analyzer.py (new file; diff suppressed because it is too large)
@@ -1,232 +1,386 @@
 #!/usr/bin/env python3
 """
-s3_parse.py — parse Instantel/Series3-like DLE-framed serial captures from a raw .bin logger.
+s3_parser.py — Unified Instantel frame parser (S3 + BW).
 
-Assumptions (based on your HxD patterns):
-- Frames are delimited by DLE STX (0x10 0x02) ... DLE ETX (0x10 0x03)
-- Inside payload, a literal 0x10 is escaped as 0x10 0x10
-- After ETX, there may be a trailer (often CRC16, maybe + seq/flags)
+Modes:
+- s3: DLE STX (10 02) ... DLE ETX (10 03)
+- bw: ACK+STX (41 02) ... ETX (03)
+
+Stuffing:
+- Literal 0x10 in payload is stuffed as 10 10 in both directions.
+
+Checksums:
+- BW frames appear to use more than one checksum style depending on message type.
+  Small frames often validate with 1-byte SUM8.
+  Large config/write frames appear to use a 2-byte CRC16 variant.
+
+In BW mode we therefore validate candidate ETX positions using AUTO checksum matching:
+- SUM8 (1 byte)
+- CRC16 variants (2 bytes), both little/big endian
+If any match, we accept the ETX as a real frame terminator.
 """
 
 from __future__ import annotations
 
 import argparse
 import json
 from dataclasses import dataclass
 from pathlib import Path
-from typing import List, Optional, Tuple
+from typing import Callable, Dict, List, Optional, Tuple
 
 DLE = 0x10
 STX = 0x02
 ETX = 0x03
-EOT = 0x04
+ACK = 0x41
+
+__version__ = "0.2.2"
+
+# How the capture was produced:
+# - Raw serial captures include DLE+ETX (`0x10 0x03`).
+# - The s3_bridge `.bin` logger strips the DLE byte from ETX, so frames end with a
+#   bare `0x03`. See docs/instantel_protocol_reference.md §Appendix A.
+ETX_MODE_AUTO = "auto"
+ETX_MODE_RAW = "raw"            # expect DLE+ETX
+ETX_MODE_STRIPPED = "stripped"  # expect bare ETX
 
 
 @dataclass
 class Frame:
     index: int
     start_offset: int
     end_offset: int
-    payload_raw: bytes        # as captured between STX..ETX, still escaped
-    payload: bytes            # unescaped
-    trailer: bytes            # bytes immediately after ETX (length chosen by user)
-    crc_match: Optional[str]  # best-guess CRC type if verified, else None
+    payload_raw: bytes  # de-stuffed bytes between STX..ETX (includes checksum bytes at end)
+    payload: bytes      # payload without checksum bytes
+    trailer: bytes
+    checksum_valid: Optional[bool]
+    checksum_type: Optional[str]
+    checksum_hex: Optional[str]
 
 
-def unescape_dle(payload_escaped: bytes) -> bytes:
-    """Convert DLE-stuffing: 0x10 0x10 => 0x10 (literal DLE)."""
-    out = bytearray()
-    i = 0
-    n = len(payload_escaped)
-    while i < n:
-        b = payload_escaped[i]
-        if b == DLE:
-            if i + 1 < n and payload_escaped[i + 1] == DLE:
-                out.append(DLE)
-                i += 2
-                continue
-            # If we see a single DLE not followed by DLE inside payload,
-            # keep it as-is (conservative) — could be real data or malformed capture.
-        out.append(b)
-        i += 1
-    return bytes(out)
-
-
-# ---- CRC helpers (we don't know which one yet, so we try a few) ----
+# ------------------------
+# Checksum / CRC helpers
+# ------------------------
+
+
+def checksum8_sum(data: bytes) -> int:
+    """SUM8: sum(payload) & 0xFF"""
+    return sum(data) & 0xFF
+
+
 def crc16_ibm(data: bytes) -> int:
-    # CRC-16/IBM (aka ARC) poly=0xA001 (reflected 0x8005), init=0x0000
+    # CRC-16/IBM (aka ARC) poly=0xA001, init=0x0000, refin/refout true
     crc = 0x0000
     for b in data:
         crc ^= b
         for _ in range(8):
-            if crc & 1:
-                crc = (crc >> 1) ^ 0xA001
-            else:
-                crc >>= 1
+            crc = (crc >> 1) ^ 0xA001 if (crc & 1) else (crc >> 1)
     return crc & 0xFFFF
 
 
 def crc16_ccitt_false(data: bytes) -> int:
-    # CRC-16/CCITT-FALSE poly=0x1021, init=0xFFFF, no reflection
+    # CRC-16/CCITT-FALSE poly=0x1021, init=0xFFFF, refin/refout false
     crc = 0xFFFF
     for b in data:
         crc ^= (b << 8)
         for _ in range(8):
-            if crc & 0x8000:
-                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
-            else:
-                crc = (crc << 1) & 0xFFFF
+            crc = ((crc << 1) ^ 0x1021) & 0xFFFF if (crc & 0x8000) else (crc << 1) & 0xFFFF
     return crc
 
 
 def crc16_x25(data: bytes) -> int:
-    # CRC-16/X-25 poly=0x1021, init=0xFFFF, refin/refout true, xorout=0xFFFF
+    # CRC-16/X-25 poly=0x8408 (reflected), init=0xFFFF, xorout=0xFFFF
     crc = 0xFFFF
     for b in data:
         crc ^= b
         for _ in range(8):
-            if crc & 1:
-                crc = (crc >> 1) ^ 0x8408
-            else:
-                crc >>= 1
+            crc = (crc >> 1) ^ 0x8408 if (crc & 1) else (crc >> 1)
     return (crc ^ 0xFFFF) & 0xFFFF
 
 
-CRC_FUNCS = {
-    "CRC-16/IBM": crc16_ibm,
-    "CRC-16/CCITT-FALSE": crc16_ccitt_false,
-    "CRC-16/X-25": crc16_x25,
+CRC16_FUNCS: Dict[str, Callable[[bytes], int]] = {
+    "CRC16_IBM": crc16_ibm,
+    "CRC16_CCITT_FALSE": crc16_ccitt_false,
+    "CRC16_X25": crc16_x25,
 }
 
 
-def parse_frames(blob: bytes, trailer_len: int) -> List[Frame]:
+def _try_validate_sum8(body: bytes) -> Optional[Tuple[bytes, bytes, str]]:
+    """
+    body = payload + chk8
+    Returns (payload, chk_bytes, type) if valid, else None
+    """
+    if len(body) < 1:
+        return None
+    payload = body[:-1]
+    chk = body[-1]
+    if checksum8_sum(payload) == chk:
+        return payload, bytes([chk]), "SUM8"
+    return None
+
+
+def _try_validate_sum8_large(body: bytes) -> Optional[Tuple[bytes, bytes, str]]:
+    """
+    Large BW->S3 write frame checksum (SUBs 68, 69, 71, 82, 1A with data).
+
+    Formula: (sum(b for b in payload[2:-1] if b != 0x10) + 0x10) & 0xFF
+    - Starts from byte [2], skipping CMD (0x10) and DLE (0x10) at [0][1]
+    - Skips all 0x10 bytes in the covered range
+    - Adds 0x10 as a constant offset
+    - body[-1] is the checksum byte
+
+    Confirmed across 20 frames from two independent captures (2026-03-12).
+    """
+    if len(body) < 3:
+        return None
+    payload = body[:-1]
+    chk = body[-1]
+    calc = (sum(b for b in payload[2:] if b != 0x10) + 0x10) & 0xFF
+    if calc == chk:
+        return payload, bytes([chk]), "SUM8_LARGE"
+    return None
|
|
||||||
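A worked example of the large-frame formula, using made-up bytes (chosen only to show that 0x10 bytes inside the covered range are excluded from the sum; these are not bytes from a real capture):

```python
# Hypothetical body: CMD (0x10), DLE (0x10), three data bytes, then checksum.
data = bytes([0x10, 0x10, 0x68, 0x10, 0x02])

# Sum over data[2:], skipping 0x10 bytes, plus the 0x10 offset:
# 0x68 + 0x02 + 0x10 = 0x7A
chk = (sum(b for b in data[2:] if b != 0x10) + 0x10) & 0xFF
body = data + bytes([chk])

# Validate the same way _try_validate_sum8_large does:
payload = body[:-1]
calc = (sum(b for b in payload[2:] if b != 0x10) + 0x10) & 0xFF
assert calc == body[-1]
```

Note the in-range 0x10 at `data[3]` contributes nothing to the sum, while the constant 0x10 offset is always added.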
+def _try_validate_crc16(body: bytes) -> Optional[Tuple[bytes, bytes, str]]:
+    """
+    body = payload + crc16(2 bytes)
+    Try multiple CRC16 types and both endian interpretations.
+    Returns (payload, chk_bytes, type) if valid, else None
+    """
+    if len(body) < 2:
+        return None
+    payload = body[:-2]
+    chk_bytes = body[-2:]
+
+    given_le = int.from_bytes(chk_bytes, "little", signed=False)
+    given_be = int.from_bytes(chk_bytes, "big", signed=False)
+
+    for name, fn in CRC16_FUNCS.items():
+        calc = fn(payload)
+        if calc == given_le:
+            return payload, chk_bytes, f"{name}_LE"
+        if calc == given_be:
+            return payload, chk_bytes, f"{name}_BE"
+    return None
+
+
+def validate_bw_body_auto(body: bytes) -> Optional[Tuple[bytes, bytes, str]]:
+    """
+    Try to interpret the tail of body as a checksum in several ways.
+    Return (payload, checksum_bytes, checksum_type) if any match; else None.
+    """
+    # Prefer plain SUM8 first (small frames: POLL, read commands)
+    hit = _try_validate_sum8(body)
+    if hit:
+        return hit
+
+    # Large BW->S3 write frames (SUBs 68, 69, 71, 82, 1A with data)
+    hit = _try_validate_sum8_large(body)
+    if hit:
+        return hit
+
+    # Then CRC16 variants
+    hit = _try_validate_crc16(body)
+    if hit:
+        return hit
+
+    return None
+
+
+# ------------------------
+# S3 MODE (DLE framed)
+# ------------------------
+
+def parse_s3(blob: bytes, trailer_len: int) -> List[Frame]:
     frames: List[Frame] = []
 
-    STATE_IDLE = 0
-    STATE_IN_FRAME = 1
-    STATE_AFTER_DLE = 2
+    IDLE = 0
+    IN_FRAME = 1
+    AFTER_DLE = 2
 
-    state = STATE_IDLE
-    payload_raw = bytearray()
+    state = IDLE
+    body = bytearray()
     start_offset = 0
     idx = 0
 
     i = 0
     n = len(blob)
 
-    print(">>> CLEAN RAW STATE MACHINE ACTIVE <<<")
-
     while i < n:
         b = blob[i]
 
-        if state == STATE_IDLE:
-            # look for DLE STX
+        if state == IDLE:
             if b == DLE and i + 1 < n and blob[i + 1] == STX:
-                print("FRAME START at", i)
                 start_offset = i
-                payload_raw = bytearray()
-                state = STATE_IN_FRAME
+                body.clear()
+                state = IN_FRAME
                 i += 2
                 continue
 
-        elif state == STATE_IN_FRAME:
+        elif state == IN_FRAME:
             if b == DLE:
-                state = STATE_AFTER_DLE
+                state = AFTER_DLE
                 i += 1
                 continue
-            else:
-                payload_raw.append(b)
+            body.append(b)
 
-        elif state == STATE_AFTER_DLE:
+        else:  # AFTER_DLE
             if b == DLE:
-                # escaped literal DLE
-                payload_raw.append(DLE)
-                state = STATE_IN_FRAME
+                body.append(DLE)
+                state = IN_FRAME
                 i += 1
                 continue
 
-            elif b == ETX:
-                print("FRAME END at", i)
-                # end of frame
+            if b == ETX:
                 end_offset = i + 1
 
-                # capture trailer
                 trailer_start = i + 1
                 trailer_end = trailer_start + trailer_len
                 trailer = blob[trailer_start:trailer_end]
 
+                # For S3 mode we don't assume checksum type here yet.
                 frames.append(Frame(
                     index=idx,
                     start_offset=start_offset,
                     end_offset=end_offset,
-                    payload_raw=bytes(payload_raw),
-                    payload=bytes(payload_raw),
+                    payload_raw=bytes(body),
+                    payload=bytes(body),
                     trailer=trailer,
-                    crc_match=None
+                    checksum_valid=None,
+                    checksum_type=None,
+                    checksum_hex=None
                 ))
 
                 idx += 1
-                state = STATE_IDLE
+                state = IDLE
                 i = trailer_end
                 continue
 
-            else:
-                # unexpected sequence: DLE followed by non-DLE/non-ETX
-                # treat both bytes as data (robust recovery)
-                payload_raw.append(DLE)
-                payload_raw.append(b)
-                state = STATE_IN_FRAME
-                i += 1
-                continue
+            # Unexpected DLE + byte → treat as literal data
+            body.append(DLE)
+            body.append(b)
+            state = IN_FRAME
+            i += 1
+            continue
 
         i += 1
 
-    print("Frames parsed:", len(frames))
     return frames
 
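The escaping rule the S3 state machine undoes (DLE DLE collapses to a literal 0x10) can be illustrated with a minimal stuff/unstuff pair. This is a sketch with hypothetical payload bytes, not the parser itself, and it ignores trailers for brevity:

```python
DLE, STX, ETX = 0x10, 0x02, 0x03

def stuff(payload: bytes) -> bytes:
    """Frame a payload as DLE STX <escaped payload> DLE ETX."""
    out = bytearray([DLE, STX])
    for b in payload:
        if b == DLE:
            out.append(DLE)  # escape a literal 0x10 by doubling it
        out.append(b)
    out += bytes([DLE, ETX])
    return bytes(out)

def unstuff(frame: bytes) -> bytes:
    """Inverse: strip DLE STX / DLE ETX and collapse DLE DLE pairs."""
    inner = frame[2:-2]
    out = bytearray()
    i = 0
    while i < len(inner):
        if inner[i] == DLE and i + 1 < len(inner) and inner[i + 1] == DLE:
            out.append(DLE)
            i += 2
        else:
            out.append(inner[i])
            i += 1
    return bytes(out)

payload = bytes([0x01, 0x10, 0x03, 0x10])
assert unstuff(stuff(payload)) == payload
```

The round-trip shows why a bare 0x03 inside the payload is harmless in S3 mode: the frame only ends on the two-byte sequence DLE ETX, never on ETX alone.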
-def best_crc_match(payload: bytes, trailer: bytes, little_endian: bool) -> Optional[str]:
-    """Try to interpret first 2 trailer bytes as CRC16 and see which algorithm matches."""
-    if len(trailer) < 2:
-        return None
-    given = int.from_bytes(trailer[:2], byteorder="little" if little_endian else "big", signed=False)
-    matches = []
-    for name, fn in CRC_FUNCS.items():
-        calc = fn(payload)
-        if calc == given:
-            matches.append(name)
-    if len(matches) == 1:
-        return matches[0]
-    if len(matches) > 1:
-        return " / ".join(matches)
-    return None
+# ------------------------
+# BW MODE (ACK+STX framed, bare ETX)
+# ------------------------
+
+def parse_bw(blob: bytes, trailer_len: int, validate_checksum: bool) -> List[Frame]:
+    frames: List[Frame] = []
+
+    IDLE = 0
+    IN_FRAME = 1
+    AFTER_DLE = 2
+
+    state = IDLE
+    body = bytearray()
+    start_offset = 0
+    idx = 0
+
+    i = 0
+    n = len(blob)
+
+    while i < n:
+        b = blob[i]
+
+        if state == IDLE:
+            # Frame start signature: ACK + STX
+            if b == ACK and i + 1 < n and blob[i + 1] == STX:
+                start_offset = i
+                body.clear()
+                state = IN_FRAME
+                i += 2
+                continue
+            i += 1
+            continue
+
+        if state == IN_FRAME:
+            if b == DLE:
+                state = AFTER_DLE
+                i += 1
+                continue
+
+            if b == ETX:
+                # Candidate end-of-frame.
+                # Accept ETX if the next bytes look like a real next-frame start (ACK+STX),
+                # or we're at EOF. This prevents chopping on in-payload 0x03.
+                next_is_start = (i + 2 < n and blob[i + 1] == ACK and blob[i + 2] == STX)
+                at_eof = (i == n - 1)
+
+                if not (next_is_start or at_eof):
+                    # Not a real boundary -> payload byte
+                    body.append(ETX)
+                    i += 1
+                    continue
+
+                trailer_start = i + 1
+                trailer_end = trailer_start + trailer_len
+                trailer = blob[trailer_start:trailer_end]
+
+                chk_valid = None
+                chk_type = None
+                chk_hex = None
+                payload = bytes(body)
+
+                if validate_checksum:
+                    hit = validate_bw_body_auto(payload)
+                    if hit:
+                        payload, chk_bytes, chk_type = hit
+                        chk_valid = True
+                        chk_hex = chk_bytes.hex()
+                    else:
+                        chk_valid = False
+
+                frames.append(Frame(
+                    index=idx,
+                    start_offset=start_offset,
+                    end_offset=i + 1,
+                    payload_raw=bytes(body),
+                    payload=payload,
+                    trailer=trailer,
+                    checksum_valid=chk_valid,
+                    checksum_type=chk_type,
+                    checksum_hex=chk_hex
+                ))
+                idx += 1
+                state = IDLE
+                i = trailer_end
+                continue
+
+            # Normal byte
+            body.append(b)
+            i += 1
+            continue
+
+        # AFTER_DLE: DLE XX => literal XX for any XX (full DLE stuffing)
+        body.append(b)
+        state = IN_FRAME
+        i += 1
+
+    return frames
 
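The subtle part of BW mode is the boundary rule: a bare ETX (0x03) ends a frame only when it is the last byte of the capture or is immediately followed by the ACK+STX start signature. That test can be exercised in isolation (a sketch using the usual ASCII control codes for ACK/STX/ETX, with hypothetical capture bytes):

```python
ACK, STX, ETX = 0x06, 0x02, 0x03

def is_frame_end(blob: bytes, i: int) -> bool:
    """True if blob[i] is an ETX that is a real BW frame boundary."""
    if blob[i] != ETX:
        return False
    n = len(blob)
    next_is_start = i + 2 < n and blob[i + 1] == ACK and blob[i + 2] == STX
    at_eof = i == n - 1
    return next_is_start or at_eof

# ACK STX 'A' 03 'B' 03 ACK STX — the first 0x03 sits mid-payload,
# the second is followed by the next frame's ACK+STX signature.
blob = bytes([0x06, 0x02, 0x41, 0x03, 0x42, 0x03, 0x06, 0x02])
assert not is_frame_end(blob, 3)  # in-payload 0x03: next bytes are not ACK+STX
assert is_frame_end(blob, 5)      # followed by ACK+STX -> real boundary
```

Without this lookahead, any 0x03 in binary config data would chop the frame early, which is exactly the failure mode the comment in `parse_bw` describes.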
+# ------------------------
+# CLI
+# ------------------------
+
 def main() -> None:
-    ap = argparse.ArgumentParser(description="Parse DLE-framed serial capture .bin into frames (and guess CRC).")
-    ap.add_argument("binfile", type=Path, help="Path to capture .bin file")
-    ap.add_argument("--trailer-len", type=int, default=2, help="Bytes to capture after DLE ETX (default: 2)")
-    ap.add_argument("--crc", action="store_true", help="Attempt CRC match using first 2 trailer bytes")
-    ap.add_argument("--crc-endian", choices=["little", "big"], default="little", help="CRC endian when reading trailer")
-
-    ap.add_argument("--out", type=Path, default=None, help="Write JSONL output to this file")
+    ap = argparse.ArgumentParser(description="Parse Instantel S3/BW binary captures.")
+    ap.add_argument("binfile", type=Path)
+    ap.add_argument("--mode", choices=["s3", "bw"], default="s3")
+    ap.add_argument("--trailer-len", type=int, default=0)
+    ap.add_argument("--no-checksum", action="store_true")
+    ap.add_argument("--out", type=Path, default=None)
     args = ap.parse_args()
 
+    print(f"s3_parser v{__version__}")
+
     blob = args.binfile.read_bytes()
-    frames = parse_frames(blob, trailer_len=args.trailer_len)
 
-    little = (args.crc_endian == "little")
-    if args.crc:
-        for f in frames:
-            f.crc_match = best_crc_match(f.payload, f.trailer, little_endian=little)
+    if args.mode == "s3":
+        frames = parse_s3(blob, args.trailer_len)
+    else:
+        frames = parse_bw(blob, args.trailer_len, validate_checksum=not args.no_checksum)
 
-    # Summary
-    total = len(frames)
-    crc_hits = sum(1 for f in frames if f.crc_match) if args.crc else 0
-    print(f"Frames found: {total}")
-    if args.crc:
-        print(f"CRC matches: {crc_hits} ({(crc_hits/total*100.0):.1f}%)" if total else "CRC matches: 0")
-
-    # Emit JSONL
+    print("Frames found:", len(frames))
+
     def to_hex(b: bytes) -> str:
         return b.hex()
 
@@ -239,7 +393,9 @@ def main() -> None:
             "payload_len": len(f.payload),
             "payload_hex": to_hex(f.payload),
             "trailer_hex": to_hex(f.trailer),
-            "crc_match": f.crc_match,
+            "checksum_valid": f.checksum_valid,
+            "checksum_type": f.checksum_type,
+            "checksum_hex": f.checksum_hex,
         }
         lines.append(json.dumps(obj))
 
@@ -247,11 +403,11 @@ def main() -> None:
         args.out.write_text("\n".join(lines) + "\n", encoding="utf-8")
         print(f"Wrote: {args.out}")
     else:
-        # Print first few only (avoid spewing your terminal)
         for line in lines[:10]:
             print(line)
         if len(lines) > 10:
             print(f"... ({len(lines) - 10} more)")
 
 
 if __name__ == "__main__":
     main()
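The reworked CLI surface can be exercised without a capture file by parsing a sample argv. `cap.bin` is a hypothetical filename; `argparse` converts it with `type=Path` without touching the filesystem:

```python
import argparse
from pathlib import Path

# Same argument set as the new main():
ap = argparse.ArgumentParser(description="Parse Instantel S3/BW binary captures.")
ap.add_argument("binfile", type=Path)
ap.add_argument("--mode", choices=["s3", "bw"], default="s3")
ap.add_argument("--trailer-len", type=int, default=0)
ap.add_argument("--no-checksum", action="store_true")
ap.add_argument("--out", type=Path, default=None)

args = ap.parse_args(["cap.bin", "--mode", "bw", "--trailer-len", "2"])
assert args.mode == "bw"
assert args.trailer_len == 2
assert args.no_checksum is False  # checksum validation stays on by default
```

Note the defaults moved between versions: `--trailer-len` now defaults to 0 (was 2), and CRC guessing flags were replaced by `--mode` plus `--no-checksum`.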
Binary file not shown.
File diff suppressed because one or more lines are too long.

seismo_lab.py — new file, 1538 lines. File diff suppressed because it is too large.

sfm/__init__.py — new file, empty.

sfm/server.py — new file, 351 lines:
@@ -0,0 +1,351 @@
"""
sfm/server.py — Seismograph Field Module REST API

Wraps the minimateplus library in a small FastAPI service.
Terra-view proxies /api/sfm/* to this service (same pattern as SLMM at :8100).

Default port: 8200

Endpoints
---------
GET  /health              Service heartbeat — no device I/O
GET  /device/info         POLL + serial number + full config read
GET  /device/events       Download all stored events (headers + peak values)
POST /device/connect      Explicit connect/identify (same as /device/info)
GET  /device/event/{idx}  Single event by index (header + waveform record)

Transport query params (supply one set):
  Serial (direct RS-232 cable):
    port      — serial port name (e.g. COM5, /dev/ttyUSB0)
    baud      — baud rate (default 38400)

  TCP (modem / ACH Auto Call Home):
    host      — IP address or hostname of the modem or ACH relay
    tcp_port  — TCP port number (default 12345, Blastware default)

Each call opens the connection, does its work, then closes it.
(Stateless / reconnect-per-call, matching Blastware's observed behaviour.)

Run with:
    python -m uvicorn sfm.server:app --host 0.0.0.0 --port 8200 --reload
or:
    python sfm/server.py
"""

from __future__ import annotations

import logging
import sys
from typing import Optional

# FastAPI / Pydantic
try:
    from fastapi import FastAPI, HTTPException, Query
    from fastapi.responses import JSONResponse
    import uvicorn
except ImportError:
    print(
        "fastapi and uvicorn are required for the SFM server.\n"
        "Install them with: pip install fastapi uvicorn",
        file=sys.stderr,
    )
    sys.exit(1)

from minimateplus import MiniMateClient
from minimateplus.protocol import ProtocolError
from minimateplus.models import DeviceInfo, Event, PeakValues, ProjectInfo, Timestamp
from minimateplus.transport import TcpTransport, DEFAULT_TCP_PORT

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)-7s %(name)s %(message)s",
    datefmt="%H:%M:%S",
)
log = logging.getLogger("sfm.server")

# ── FastAPI app ────────────────────────────────────────────────────────────────

app = FastAPI(
    title="Seismograph Field Module (SFM)",
    description=(
        "REST API for Instantel MiniMate Plus seismographs.\n"
        "Implements the minimateplus RS-232 protocol library.\n"
        "Proxied by terra-view at /api/sfm/*."
    ),
    version="0.1.0",
)


# ── Serialisers ────────────────────────────────────────────────────────────────
# Plain dict helpers — avoids a Pydantic dependency in the library layer.

def _serialise_timestamp(ts: Optional[Timestamp]) -> Optional[dict]:
    if ts is None:
        return None
    return {
        "year": ts.year,
        "month": ts.month,
        "day": ts.day,
        "clock_set": ts.clock_set,
        "display": str(ts),
    }


def _serialise_peak_values(pv: Optional[PeakValues]) -> Optional[dict]:
    if pv is None:
        return None
    return {
        "tran_in_s": pv.tran,
        "vert_in_s": pv.vert,
        "long_in_s": pv.long,
        "micl_psi": pv.micl,
    }


def _serialise_project_info(pi: Optional[ProjectInfo]) -> Optional[dict]:
    if pi is None:
        return None
    return {
        "setup_name": pi.setup_name,
        "project": pi.project,
        "client": pi.client,
        "operator": pi.operator,
        "sensor_location": pi.sensor_location,
        "notes": pi.notes,
    }


def _serialise_device_info(info: DeviceInfo) -> dict:
    return {
        "serial": info.serial,
        "firmware_version": info.firmware_version,
        "firmware_minor": info.firmware_minor,
        "dsp_version": info.dsp_version,
        "manufacturer": info.manufacturer,
        "model": info.model,
    }


def _serialise_event(ev: Event) -> dict:
    return {
        "index": ev.index,
        "timestamp": _serialise_timestamp(ev.timestamp),
        "sample_rate": ev.sample_rate,
        "record_type": ev.record_type,
        "peak_values": _serialise_peak_values(ev.peak_values),
        "project_info": _serialise_project_info(ev.project_info),
    }


# ── Transport factory ─────────────────────────────────────────────────────────

def _build_client(
    port: Optional[str],
    baud: int,
    host: Optional[str],
    tcp_port: int,
) -> MiniMateClient:
    """
    Return a MiniMateClient configured for either serial or TCP transport.

    TCP takes priority if *host* is supplied; otherwise *port* (serial) is used.
    Raises HTTPException(422) if neither is provided.
    """
    if host:
        # TCP / modem / ACH path — use a longer timeout to survive cold boots
        # (unit takes 5-15s to wake from RS-232 line assertion over cellular)
        transport = TcpTransport(host, port=tcp_port)
        log.debug("TCP transport: %s:%d", host, tcp_port)
        return MiniMateClient(transport=transport, timeout=30.0)
    elif port:
        # Direct serial path
        log.debug("Serial transport: %s baud=%d", port, baud)
        return MiniMateClient(port, baud)
    else:
        raise HTTPException(
            status_code=422,
            detail=(
                "Specify either 'port' (serial, e.g. ?port=COM5) "
                "or 'host' (TCP, e.g. ?host=192.168.1.50&tcp_port=12345)"
            ),
        )


def _is_tcp(host: Optional[str]) -> bool:
    return bool(host)


def _run_with_retry(fn, *, is_tcp: bool):
    """
    Call fn() and, for TCP connections only, retry once on ProtocolError.

    Rationale: when a MiniMate Plus is cold (just had its serial lines asserted
    by the modem or a local bridge), it takes 5-10 seconds to boot before it
    will respond to POLL_PROBE. The first request may time out during that boot
    window; a single automatic retry is enough to recover once the unit is up.

    Serial connections are NOT retried — a timeout there usually means a real
    problem (wrong port, wrong baud, cable unplugged).
    """
    try:
        return fn()
    except ProtocolError as exc:
        if not is_tcp:
            raise
        log.info("TCP poll timed out (unit may have been cold) — retrying once")
        return fn()  # let any second failure propagate normally
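The retry-once behaviour of `_run_with_retry` can be demonstrated standalone with a stub that fails on its first call. This is a sketch: `ProtocolError` is stood in for by a local exception class, and the serial-number string is made up.

```python
class ProtocolError(Exception):
    """Stand-in for minimateplus.protocol.ProtocolError."""

def _run_with_retry(fn, *, is_tcp: bool):
    # Same shape as the server helper, minus logging.
    try:
        return fn()
    except ProtocolError:
        if not is_tcp:
            raise  # serial failures are real problems: no retry
        return fn()  # one retry for TCP; a second failure propagates

calls = {"n": 0}

def cold_device():
    # Simulate a unit that is still booting on the first POLL.
    calls["n"] += 1
    if calls["n"] == 1:
        raise ProtocolError("POLL timeout")
    return "serial=BE12345"  # hypothetical reply

assert _run_with_retry(cold_device, is_tcp=True) == "serial=BE12345"
assert calls["n"] == 2  # first call failed, retry succeeded
```

With `is_tcp=False` the same stub would raise on the first failure, matching the rule that serial timeouts indicate wiring or configuration problems rather than a cold unit.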


# ── Endpoints ──────────────────────────────────────────────────────────────────

@app.get("/health")
def health() -> dict:
    """Service heartbeat. No device I/O."""
    return {"status": "ok", "service": "sfm", "version": "0.1.0"}


@app.get("/device/info")
def device_info(
    port: Optional[str] = Query(None, description="Serial port (e.g. COM5, /dev/ttyUSB0)"),
    baud: int = Query(38400, description="Serial baud rate (default 38400)"),
    host: Optional[str] = Query(None, description="TCP host — modem IP or ACH relay (e.g. 203.0.113.5)"),
    tcp_port: int = Query(DEFAULT_TCP_PORT, description=f"TCP port (default {DEFAULT_TCP_PORT})"),
) -> dict:
    """
    Connect to the device, perform the POLL startup handshake, and return
    identity information (serial number, firmware version, model).

    Supply either *port* (serial) or *host* (TCP/modem).
    Equivalent to POST /device/connect — provided as GET for convenience.
    """
    log.info("GET /device/info port=%s host=%s tcp_port=%d", port, host, tcp_port)

    try:
        def _do():
            with _build_client(port, baud, host, tcp_port) as client:
                return client.connect()
        info = _run_with_retry(_do, is_tcp=_is_tcp(host))
    except HTTPException:
        raise
    except ProtocolError as exc:
        raise HTTPException(status_code=502, detail=f"Protocol error: {exc}") from exc
    except OSError as exc:
        raise HTTPException(status_code=502, detail=f"Connection error: {exc}") from exc
    except Exception as exc:
        raise HTTPException(status_code=500, detail=f"Device error: {exc}") from exc

    return _serialise_device_info(info)


@app.post("/device/connect")
def device_connect(
    port: Optional[str] = Query(None, description="Serial port (e.g. COM5)"),
    baud: int = Query(38400, description="Serial baud rate"),
    host: Optional[str] = Query(None, description="TCP host — modem IP or ACH relay"),
    tcp_port: int = Query(DEFAULT_TCP_PORT, description=f"TCP port (default {DEFAULT_TCP_PORT})"),
) -> dict:
    """
    Connect to the device and return identity. POST variant for terra-view
    compatibility with the SLMM proxy pattern.
    """
    return device_info(port=port, baud=baud, host=host, tcp_port=tcp_port)


@app.get("/device/events")
def device_events(
    port: Optional[str] = Query(None, description="Serial port (e.g. COM5)"),
    baud: int = Query(38400, description="Serial baud rate"),
    host: Optional[str] = Query(None, description="TCP host — modem IP or ACH relay"),
    tcp_port: int = Query(DEFAULT_TCP_PORT, description=f"TCP port (default {DEFAULT_TCP_PORT})"),
) -> dict:
    """
    Connect to the device, read the event index, and download all stored
    events (event headers + full waveform records with peak values).

    Supply either *port* (serial) or *host* (TCP/modem).

    This does NOT download raw ADC waveform samples — those are large and
    fetched separately via GET /device/event/{idx}/waveform (future endpoint).
    """
    log.info("GET /device/events port=%s host=%s", port, host)

    try:
        def _do():
            with _build_client(port, baud, host, tcp_port) as client:
                return client.connect(), client.get_events()
        info, events = _run_with_retry(_do, is_tcp=_is_tcp(host))
    except HTTPException:
        raise
    except ProtocolError as exc:
        raise HTTPException(status_code=502, detail=f"Protocol error: {exc}") from exc
    except OSError as exc:
        raise HTTPException(status_code=502, detail=f"Connection error: {exc}") from exc
    except Exception as exc:
        raise HTTPException(status_code=500, detail=f"Device error: {exc}") from exc

    return {
        "device": _serialise_device_info(info),
        "event_count": len(events),
        "events": [_serialise_event(ev) for ev in events],
    }


@app.get("/device/event/{index}")
def device_event(
    index: int,
    port: Optional[str] = Query(None, description="Serial port (e.g. COM5)"),
    baud: int = Query(38400, description="Serial baud rate"),
    host: Optional[str] = Query(None, description="TCP host — modem IP or ACH relay"),
    tcp_port: int = Query(DEFAULT_TCP_PORT, description=f"TCP port (default {DEFAULT_TCP_PORT})"),
) -> dict:
    """
    Download a single event by index (0-based).

    Supply either *port* (serial) or *host* (TCP/modem).
    Performs: POLL startup → event index → event header → waveform record.
    """
    log.info("GET /device/event/%d port=%s host=%s", index, port, host)

    try:
        def _do():
            with _build_client(port, baud, host, tcp_port) as client:
                client.connect()
                return client.get_events()
        events = _run_with_retry(_do, is_tcp=_is_tcp(host))
    except HTTPException:
        raise
    except ProtocolError as exc:
        raise HTTPException(status_code=502, detail=f"Protocol error: {exc}") from exc
    except OSError as exc:
        raise HTTPException(status_code=502, detail=f"Connection error: {exc}") from exc
    except Exception as exc:
        raise HTTPException(status_code=500, detail=f"Device error: {exc}") from exc

    matching = [ev for ev in events if ev.index == index]
    if not matching:
        raise HTTPException(
            status_code=404,
            detail=f"Event index {index} not found on device",
        )

    return _serialise_event(matching[0])


# ── Entry point ────────────────────────────────────────────────────────────────

if __name__ == "__main__":
    import argparse

    ap = argparse.ArgumentParser(description="SFM — Seismograph Field Module API server")
    ap.add_argument("--host", default="0.0.0.0", help="Bind address (default: 0.0.0.0)")
    ap.add_argument("--port", type=int, default=8200, help="Port (default: 8200)")
    ap.add_argument("--reload", action="store_true", help="Enable auto-reload (dev mode)")
    args = ap.parse_args()

    log.info("Starting SFM server on %s:%d", args.host, args.port)
    uvicorn.run(
        "sfm.server:app",
        host=args.host,
        port=args.port,
        reload=args.reload,
    )