Compare commits: `0ad1505cc5...main` (38 commits)

Commits: 8074bf0fee, de02f9cccf, da446cb2e3, 51d1aa917a, b8032e0578, 3f142ce1c0, 88adcbcb81, 8e985154a7, f8f590b19b, 58a35a3afd, 45f4fb5a68, 99d66453fe, 41606d2f31, 8d06492dbc, 6be434e65f, 6d99f86502, 5eb5499034, 0db3780e65, d7a0e1b501, 154a11d057, faa869d03b, fa9873cf4a, a684d3e642, 22d4023ea0, a5a21a6c32, 4448c74f6c, feceb7b482, 3acb49da0c, 927aad6c1f, 9c0753f5d3, 50be6410fe, 8ca40d52a4, 9db55ffcee, 967a5b2dad, 088e81b55d, 6e6c9874f0, 43c9c8b3a3, 413fc53a39
27  .gitignore (vendored, new file)
@@ -0,0 +1,27 @@
/bridges/captures/
/manuals/

# Python bytecode
__pycache__/
*.py[cod]

# Virtual environments
.venv/
venv/
env/

# Editor / OS
.vscode/
*.swp
.DS_Store
Thumbs.db

# Analyzer outputs
*.report
claude_export_*.md

# Frame database
*.db
*.db-wal
*.db-shm
278  README.md (new file)
@@ -0,0 +1,278 @@
# seismo-relay

Tools for capturing and reverse-engineering the RS-232 serial protocol between
**Blastware** software and **Instantel MiniMate Plus** seismographs.

Built for Windows, stdlib-only (plus `pyserial` for the bridge).

---

## What's in here

```
seismo-relay/
├── bridges/
│   ├── s3-bridge/
│   │   └── s3_bridge.py   ← The serial bridge (core capture tool)
│   ├── gui_bridge.py      ← Tkinter GUI wrapper for s3_bridge
│   └── raw_capture.py     ← Simpler raw-only capture tool
└── parsers/
    ├── s3_parser.py       ← Low-level DLE frame extractor
    ├── s3_analyzer.py     ← Protocol analyzer (sessions, diffs, exports)
    ├── gui_analyzer.py    ← Tkinter GUI for the analyzer
    └── frame_db.py        ← SQLite frame database
```

---

## How it all fits together

The workflow has two phases: **capture**, then **analyze**.

```
Blastware PC
      │
Virtual COM (e.g. COM4)
      │
s3_bridge.py  ←─── sits in the middle, forwards all bytes both ways
      │            writes raw_bw.bin and raw_s3.bin
Physical COM (e.g. COM5)
      │
MiniMate Plus seismograph
```
After capturing, you point the analyzer at the two `.bin` files to inspect
what happened.

---

## Part 1 — The Bridge

### `s3_bridge.py` — Serial bridge

Transparently forwards bytes between Blastware and the seismograph while
logging everything to disk. Blastware operates normally and has no idea the
bridge is there.

**Run it:**
```
python bridges/s3-bridge/s3_bridge.py --bw COM4 --s3 COM5 --logdir captures/ --raw-bw captures/raw_bw.bin --raw-s3 captures/raw_s3.bin
```

**Key flags:**

| Flag | Default | Description |
|------|---------|-------------|
| `--bw` | required | COM port connected to Blastware |
| `--s3` | required | COM port connected to the seismograph |
| `--baud` | 38400 | Baud rate (match your device) |
| `--logdir` | `.` | Where to write log/bin files |
| `--raw-bw` | off | Also write a flat raw file for BW→S3 traffic |
| `--raw-s3` | off | Also write a flat raw file for S3→BW traffic |

**Output files (in `--logdir`):**
- `s3_session_<timestamp>.bin` — structured binary log with timestamps
  and direction tags (record format: `[type:1][ts_us:8][len:4][payload]`)
- `s3_session_<timestamp>.log` — human-readable hex dump (text)
- `raw_bw.bin` — flat BW→S3 byte stream (if `--raw-bw` used)
- `raw_s3.bin` — flat S3→BW byte stream (if `--raw-s3` used)

> The analyzer needs `raw_bw.bin` + `raw_s3.bin`. Always use `--raw-bw` and
> `--raw-s3` when capturing.
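The structured `.bin` record container described above (`[type:1][ts_us:8][len:4][payload]`, little-endian) can be walked with stdlib `struct` alone. A minimal reader sketch — the type names follow the bridge's record types, the file name is illustrative:

```python
import struct

# Record types as documented in s3_bridge.py
REC_NAMES = {0x01: "BW->S3", 0x02: "S3->BW", 0x03: "MARK", 0x04: "INFO"}

def read_records(path):
    """Yield (rec_type, ts_us, payload) tuples from a structured session log."""
    with open(path, "rb") as fh:
        while True:
            header = fh.read(13)  # 1 type byte + 8 ts_us + 4 len, little-endian
            if len(header) < 13:
                break  # clean EOF (or a truncated trailing record)
            rec_type, ts_us, length = struct.unpack("<BQI", header)
            yield rec_type, ts_us, fh.read(length)

# Example: one-line summary per record
# for rec_type, ts_us, payload in read_records("s3_session_20240101_120000.bin"):
#     print(REC_NAMES.get(rec_type, "?"), ts_us, payload[:16].hex(" "))
```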
**Interactive commands** (type while the bridge is running):
- `m` + Enter → prompts for a label and inserts a MARK record into the log
- `q` + Enter → quit

---

### `gui_bridge.py` — Bridge GUI

A simple point-and-click wrapper around `s3_bridge.py`. Easier than the
command line if you don't want to type flags every time.

```
python bridges/gui_bridge.py
```

Set your COM ports, log directory, and tick the raw tap checkboxes before
hitting **Start**. The **Add Mark** button lets you annotate the capture
at any point (e.g. "changed record time to 13s").

---

## Part 2 — The Analyzer

After capturing, you have `raw_bw.bin` (bytes Blastware sent) and `raw_s3.bin`
(bytes the seismograph replied with). The analyzer parses these into protocol
frames, groups them into sessions, and helps you figure out what each byte means.

### What's a "session"?

Each time you open the settings dialog in Blastware and click Apply/OK, that's
one session — a complete read/modify/write cycle. The bridge detects session
boundaries by watching for the final write-confirm packet (SUB `0x74`).

Each session contains a sequence of request/response frame pairs:
- Blastware sends a **request** (BW→S3): "give me your config block"
- The seismograph sends a **response** (S3→BW): "here it is"
- At the end, Blastware sends the modified settings back in a series of write packets

The analyzer lines these up and diffs consecutive sessions to show you exactly
which bytes changed.
---

### `gui_analyzer.py` — Analyzer GUI

```
python parsers/gui_analyzer.py
```

This is the main tool. It has five tabs:

#### Toolbar
- **S3 raw / BW raw** — browse to your `raw_s3.bin` and `raw_bw.bin` files
- **Analyze** — parse and load the captures
- **Live: OFF/ON** — watch the files grow in real time while the bridge is running
- **Export for Claude** — generate a self-contained `.md` report for AI-assisted analysis

#### Inventory tab
Shows all frames in the selected session — direction, SUB command, page,
length, and checksum status. Click any frame in the left tree to drill in.

#### Hex Dump tab
Full hex dump of the selected frame's payload. If the frame had changed bytes
vs the previous session, those are listed below the dump with before/after values
and field names where known.

#### Diff tab
Side-by-side byte-level diff between the current session and the previous one.
Only SUBs (command types) that actually changed are shown.

#### Full Report tab
Raw text version of the session report — useful for copying into notes.

#### Query DB tab
Search across all your captured sessions using the built-in database.

---

### `s3_analyzer.py` — Analyzer (command line)

If you prefer the terminal:

```
python parsers/s3_analyzer.py --s3 raw_s3.bin --bw raw_bw.bin
```

**Flags:**

| Flag | Description |
|------|-------------|
| `--s3` | Path to `raw_s3.bin` |
| `--bw` | Path to `raw_bw.bin` |
| `--live` | Tail files in real time (poll mode) |
| `--export` | Also write a `claude_export_<ts>.md` file |
| `--outdir` | Where to write `.report` files (default: same folder as input) |
| `--poll` | Live-mode poll interval in seconds (default: 0.05) |

Writes one `.report` file per session and prints a summary to the console.
---

## The Frame Database

Every time you click **Analyze**, the frames are automatically saved to a
SQLite database at:

```
C:\Users\<you>\.seismo_lab\frames.db
```

This accumulates captures over time so you can query across sessions and dates.

### Query DB tab

Use the filter bar to search:
- **Capture** — narrow to a specific capture (timestamp shown)
- **Dir** — BW (requests) or S3 (responses) only
- **SUB** — filter by command type (e.g. `0xF7` = EVENT_INDEX_RESPONSE)
- **Offset** — filter to frames that have a specific byte offset
- **Value** — combined with Offset: "show frames where byte 85 = 0x0A"

Click any result row, then use the **Byte interpretation** panel at the bottom
to see what that offset's bytes look like as uint8, int8, uint16 BE/LE,
uint32 BE/LE, and float32 BE/LE simultaneously.

This is the main tool for mapping unknown fields — change one setting in
Blastware, capture before and after, then query for frames where that offset
moved, and you can pin down exactly which byte controls what.
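The multi-width decoding the Byte interpretation panel performs is easy to reproduce offline. A sketch with stdlib `struct` (the helper name and sample bytes are mine, for illustration):

```python
import struct

def interpret(buf: bytes, offset: int) -> dict:
    """Decode the bytes at `offset` under several common widths/endiannesses."""
    out = {
        "uint8": buf[offset],
        "int8": struct.unpack_from("b", buf, offset)[0],
    }
    if offset + 2 <= len(buf):
        out["uint16_be"] = struct.unpack_from(">H", buf, offset)[0]
        out["uint16_le"] = struct.unpack_from("<H", buf, offset)[0]
    if offset + 4 <= len(buf):
        out["uint32_be"] = struct.unpack_from(">I", buf, offset)[0]
        out["uint32_le"] = struct.unpack_from("<I", buf, offset)[0]
        out["float32_be"] = struct.unpack_from(">f", buf, offset)[0]
        out["float32_le"] = struct.unpack_from("<f", buf, offset)[0]
    return out

# e.g. interpret(b"\x00\x0a\x00\x00", 1)["uint16_le"] == 10
```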
---

## Export for Claude

The **Export for Claude** button (orange, in the toolbar) generates a single
`.md` file containing:

1. Protocol background and known field map
2. Capture summary (session count, frame counts, what changed)
3. Per-diff tables — before/after bytes for every changed offset, with field
   names where known
4. Full hex dumps of all frames in the baseline session

Paste this file into a Claude conversation to get help mapping unknown fields,
interpreting data structures, or understanding sequences.

---

## Protocol quick-reference

| Term | Value | Meaning |
|------|-------|---------|
| DLE | `0x10` | Data Link Escape |
| STX | `0x02` | Start of frame |
| ETX | `0x03` | End of frame |
| ACK | `0x41` | Frame start marker (BW side) |
| DLE stuffing | `10 10` on wire | Literal `0x10` in payload |

**S3-side frame** (seismograph → Blastware): `DLE STX [payload] DLE ETX`
**BW-side frame** (Blastware → seismograph): `ACK STX [payload] ETX`

**De-stuffed payload header** (first 5 bytes after de-stuffing):
```
[0]  CMD        0x10 = BW request, 0x00 = S3 response
[1]  ?          0x00 (BW) or 0x10 (S3)
[2]  SUB        Command/response identifier ← the key field
[3]  OFFSET_HI  Page address high byte
[4]  OFFSET_LO  Page address low byte
[5+] DATA       Payload content
```

**Response SUB rule:** `response_SUB = 0xFF - request_SUB`
Example: request SUB `0x08` → response SUB `0xF7`
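A minimal sketch of de-stuffing an S3-side frame and reading the header fields laid out above. The frame layout follows the quick-reference; the helper names are mine, and the lone-DLE handling is a simplification:

```python
DLE, STX, ETX = 0x10, 0x02, 0x03

def destuff_s3_frame(raw: bytes) -> bytes:
    """Strip DLE STX ... DLE ETX framing and collapse stuffed 10 10 -> 10."""
    if not (raw[:2] == bytes([DLE, STX]) and raw[-2:] == bytes([DLE, ETX])):
        raise ValueError("not a DLE-framed S3 frame")
    body, out, i = raw[2:-2], bytearray(), 0
    while i < len(body):
        if body[i] == DLE and i + 1 < len(body) and body[i + 1] == DLE:
            out.append(DLE)  # stuffed literal 0x10
            i += 2
        else:
            out.append(body[i])
            i += 1
    return bytes(out)

def parse_header(payload: bytes) -> dict:
    """Split a de-stuffed payload into the documented header fields."""
    return {
        "cmd": payload[0],                          # 0x10 = BW request, 0x00 = S3 response
        "sub": payload[2],                          # the key field
        "offset": (payload[3] << 8) | payload[4],   # OFFSET_HI / OFFSET_LO
        "data": payload[5:],
    }

# Response SUB complement rule from the quick-reference:
assert 0xFF - 0x08 == 0xF7
```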
---

## Requirements

```
pip install pyserial
```

Python 3.10+. Everything else is stdlib (Tkinter, sqlite3, struct, hashlib).

Tkinter is included with the standard Python installer on Windows. If it's
missing, reinstall Python and make sure "tcl/tk and IDLE" is checked.

---

## Virtual COM ports

The bridge needs two COM ports on the same PC — one that Blastware connects to,
and one wired to the actual seismograph. On Windows, use a virtual COM port pair
(e.g. **com0com** or **VSPD**) to give Blastware a port to talk to while the
bridge sits in the middle.

```
Blastware → COM4 (virtual) ↔ s3_bridge ↔ COM5 (physical) → MiniMate
```
226  bridges/gui_bridge.py (new file)
@@ -0,0 +1,226 @@
#!/usr/bin/env python3
"""
gui_bridge.py — simple Tk GUI wrapper for s3_bridge.py (Windows-friendly).

Features:
- Select BW and S3 COM ports, baud, log directory.
- Optional raw taps (BW->S3, S3->BW).
- Start/Stop buttons spawn/terminate s3_bridge as a subprocess.
- Live stdout view from the bridge process.

Requires only the stdlib (Tkinter is bundled on Windows/Python).
"""

from __future__ import annotations

import datetime
import os
import queue
import subprocess
import sys
import threading
import tkinter as tk
from tkinter import filedialog, messagebox, scrolledtext, simpledialog

SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
BRIDGE_PATH = os.path.join(SCRIPT_DIR, "s3-bridge", "s3_bridge.py")


class BridgeGUI(tk.Tk):
    def __init__(self) -> None:
        super().__init__()
        self.title("S3 Bridge GUI")
        self.process: subprocess.Popen | None = None
        self.stdout_q: queue.Queue[str] = queue.Queue()
        self._build_widgets()
        self._poll_stdout()

    def _build_widgets(self) -> None:
        pad = {"padx": 6, "pady": 4}

        # Row 0: Ports
        tk.Label(self, text="BW COM:").grid(row=0, column=0, sticky="e", **pad)
        self.bw_var = tk.StringVar(value="COM4")
        tk.Entry(self, textvariable=self.bw_var, width=10).grid(row=0, column=1, sticky="w", **pad)

        tk.Label(self, text="S3 COM:").grid(row=0, column=2, sticky="e", **pad)
        self.s3_var = tk.StringVar(value="COM5")
        tk.Entry(self, textvariable=self.s3_var, width=10).grid(row=0, column=3, sticky="w", **pad)

        # Row 1: Baud
        tk.Label(self, text="Baud:").grid(row=1, column=0, sticky="e", **pad)
        self.baud_var = tk.StringVar(value="38400")
        tk.Entry(self, textvariable=self.baud_var, width=10).grid(row=1, column=1, sticky="w", **pad)

        # Row 1: Logdir chooser
        tk.Label(self, text="Log dir:").grid(row=1, column=2, sticky="e", **pad)
        self.logdir_var = tk.StringVar(value=".")
        tk.Entry(self, textvariable=self.logdir_var, width=24).grid(row=1, column=3, sticky="we", **pad)
        tk.Button(self, text="Browse", command=self._choose_dir).grid(row=1, column=4, sticky="w", **pad)

        # Row 2: Raw taps
        self.raw_bw_var = tk.StringVar(value="")
        self.raw_s3_var = tk.StringVar(value="")
        tk.Checkbutton(self, text="Save BW->S3 raw", command=self._toggle_raw_bw, onvalue="1", offvalue="").grid(row=2, column=0, sticky="w", **pad)
        tk.Entry(self, textvariable=self.raw_bw_var, width=28).grid(row=2, column=1, columnspan=3, sticky="we", **pad)
        tk.Button(self, text="...", command=lambda: self._choose_file(self.raw_bw_var, "bw")).grid(row=2, column=4, **pad)

        tk.Checkbutton(self, text="Save S3->BW raw", command=self._toggle_raw_s3, onvalue="1", offvalue="").grid(row=3, column=0, sticky="w", **pad)
        tk.Entry(self, textvariable=self.raw_s3_var, width=28).grid(row=3, column=1, columnspan=3, sticky="we", **pad)
        tk.Button(self, text="...", command=lambda: self._choose_file(self.raw_s3_var, "s3")).grid(row=3, column=4, **pad)

        # Row 4: Status + buttons
        self.status_var = tk.StringVar(value="Idle")
        tk.Label(self, textvariable=self.status_var, anchor="w").grid(row=4, column=0, columnspan=5, sticky="we", **pad)

        tk.Button(self, text="Start", command=self.start_bridge, width=12).grid(row=5, column=0, columnspan=2, **pad)
        tk.Button(self, text="Stop", command=self.stop_bridge, width=12).grid(row=5, column=2, columnspan=2, **pad)
        self.mark_btn = tk.Button(self, text="Add Mark", command=self.add_mark, width=12, state="disabled")
        self.mark_btn.grid(row=5, column=4, **pad)

        # Row 6: Log view
        self.log_view = scrolledtext.ScrolledText(self, height=20, width=90, state="disabled")
        self.log_view.grid(row=6, column=0, columnspan=5, sticky="nsew", **pad)

        # Grid weights
        for c in range(5):
            self.grid_columnconfigure(c, weight=1)
        self.grid_rowconfigure(6, weight=1)

    def _choose_dir(self) -> None:
        path = filedialog.askdirectory()
        if path:
            self.logdir_var.set(path)

    def _choose_file(self, var: tk.StringVar, direction: str) -> None:
        filename = filedialog.asksaveasfilename(
            title=f"Raw tap file for {direction}",
            defaultextension=".bin",
            filetypes=[("Binary", "*.bin"), ("All files", "*.*")],
        )
        if filename:
            var.set(filename)

    def _toggle_raw_bw(self) -> None:
        if not self.raw_bw_var.get():
            # default name
            self.raw_bw_var.set(os.path.join(self.logdir_var.get(), "raw_bw.bin"))

    def _toggle_raw_s3(self) -> None:
        if not self.raw_s3_var.get():
            self.raw_s3_var.set(os.path.join(self.logdir_var.get(), "raw_s3.bin"))

    def start_bridge(self) -> None:
        if self.process and self.process.poll() is None:
            messagebox.showinfo("Bridge", "Bridge is already running.")
            return

        bw = self.bw_var.get().strip()
        s3 = self.s3_var.get().strip()
        baud = self.baud_var.get().strip()
        logdir = self.logdir_var.get().strip() or "."

        if not bw or not s3:
            messagebox.showerror("Error", "Please enter both BW and S3 COM ports.")
            return

        args = [sys.executable, BRIDGE_PATH, "--bw", bw, "--s3", s3, "--baud", baud, "--logdir", logdir]

        ts = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")

        raw_bw = self.raw_bw_var.get().strip()
        raw_s3 = self.raw_s3_var.get().strip()

        # If the user left the default generic name, replace with a timestamped one
        # so each session gets its own file.
        if raw_bw:
            if os.path.basename(raw_bw) in ("raw_bw.bin", "raw_bw"):
                raw_bw = os.path.join(os.path.dirname(raw_bw) or logdir, f"raw_bw_{ts}.bin")
                self.raw_bw_var.set(raw_bw)
            args += ["--raw-bw", raw_bw]
        if raw_s3:
            if os.path.basename(raw_s3) in ("raw_s3.bin", "raw_s3"):
                raw_s3 = os.path.join(os.path.dirname(raw_s3) or logdir, f"raw_s3_{ts}.bin")
                self.raw_s3_var.set(raw_s3)
            args += ["--raw-s3", raw_s3]

        try:
            self.process = subprocess.Popen(
                args,
                stdout=subprocess.PIPE,
                stderr=subprocess.STDOUT,
                stdin=subprocess.PIPE,
                text=True,
                bufsize=1,
            )
        except Exception as e:
            messagebox.showerror("Error", f"Failed to start bridge: {e}")
            return

        threading.Thread(target=self._reader_thread, daemon=True).start()
        self.status_var.set("Running...")
        self._append_log("== Bridge started ==\n")
        self.mark_btn.configure(state="normal")

    def stop_bridge(self) -> None:
        if self.process and self.process.poll() is None:
            self.process.terminate()
            try:
                self.process.wait(timeout=3)
            except subprocess.TimeoutExpired:
                self.process.kill()
        self.status_var.set("Stopped")
        self._append_log("== Bridge stopped ==\n")
        self.mark_btn.configure(state="disabled")

    def _reader_thread(self) -> None:
        if not self.process or not self.process.stdout:
            return
        for line in self.process.stdout:
            self.stdout_q.put(line)
        self.stdout_q.put("<<process-exit>>")

    def add_mark(self) -> None:
        if not self.process or not self.process.stdin or self.process.poll() is not None:
            return
        label = simpledialog.askstring("Mark", "Enter label for mark:", parent=self)
        if label is None or label.strip() == "":
            return
        try:
            # Mimic CLI behavior: send 'm' + Enter, then label + Enter
            self.process.stdin.write("m\n")
            self.process.stdin.write(label.strip() + "\n")
            self.process.stdin.flush()
            self._append_log(f"[GUI] Mark sent: {label.strip()}\n")
        except Exception as e:
            messagebox.showerror("Error", f"Failed to send mark: {e}")

    def _poll_stdout(self) -> None:
        try:
            while True:
                line = self.stdout_q.get_nowait()
                if line == "<<process-exit>>":
                    self.status_var.set("Stopped")
                    self.mark_btn.configure(state="disabled")
                    break
                self._append_log(line)
        except queue.Empty:
            pass
        finally:
            self.after(100, self._poll_stdout)

    def _append_log(self, text: str) -> None:
        self.log_view.configure(state="normal")
        self.log_view.insert(tk.END, text)
        self.log_view.see(tk.END)
        self.log_view.configure(state="disabled")


def main() -> int:
    app = BridgeGUI()
    app.mainloop()
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
157  bridges/raw_capture.py (new file)
@@ -0,0 +1,157 @@
#!/usr/bin/env python3
"""
raw_capture.py — minimal serial logger for raw byte collection.

Opens a single COM port, streams all bytes to a timestamped binary file,
and does no parsing or forwarding. Useful when you just need the raw
wire data without DLE framing or Blastware bridging.

Record format (little-endian):
  [ts_us:8][len:4][payload:len]
Exactly one record type is used, so there is no type byte.
"""

from __future__ import annotations

import argparse
import datetime as _dt
import os
import signal
import sys
import time
from typing import Optional

import serial


def now_ts() -> str:
    t = _dt.datetime.now()
    return t.strftime("%H:%M:%S.") + f"{int(t.microsecond/1000):03d}"


def pack_u32_le(n: int) -> bytes:
    return bytes((n & 0xFF, (n >> 8) & 0xFF, (n >> 16) & 0xFF, (n >> 24) & 0xFF))


def pack_u64_le(n: int) -> bytes:
    out = []
    for i in range(8):
        out.append((n >> (8 * i)) & 0xFF)
    return bytes(out)


def open_serial(port: str, baud: int, timeout: float) -> serial.Serial:
    return serial.Serial(
        port=port,
        baudrate=baud,
        bytesize=serial.EIGHTBITS,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,
        timeout=timeout,
        write_timeout=timeout,
    )


class RawWriter:
    def __init__(self, path: str):
        self.path = path
        self._fh = open(path, "ab", buffering=0)

    def write(self, payload: bytes, ts_us: Optional[int] = None) -> None:
        if ts_us is None:
            ts_us = int(time.time() * 1_000_000)
        header = pack_u64_le(ts_us) + pack_u32_le(len(payload))
        self._fh.write(header)
        if payload:
            self._fh.write(payload)

    def close(self) -> None:
        try:
            self._fh.flush()
        finally:
            self._fh.close()


def capture_loop(port: serial.Serial, writer: RawWriter, stop_flag: "StopFlag", status_every_s: float) -> None:
    last_status = time.monotonic()
    bytes_written = 0

    while not stop_flag.is_set():
        try:
            n = port.in_waiting
            chunk = port.read(n if n and n < 4096 else (4096 if n else 1))
        except serial.SerialException as e:
            print(f"[{now_ts()}] [ERROR] serial exception: {e!r}", file=sys.stderr)
            break

        if chunk:
            writer.write(chunk)
            bytes_written += len(chunk)

        if status_every_s > 0:
            now = time.monotonic()
            if now - last_status >= status_every_s:
                print(f"[{now_ts()}] captured {bytes_written} bytes", flush=True)
                last_status = now

        if not chunk:
            time.sleep(0.002)


class StopFlag:
    def __init__(self):
        self._set = False

    def set(self):
        self._set = True

    def is_set(self) -> bool:
        return self._set


def main() -> int:
    ap = argparse.ArgumentParser(description="Raw serial capture to timestamped binary file (no forwarding).")
    ap.add_argument("--port", default="COM5", help="Serial port to capture (default: COM5)")
    ap.add_argument("--baud", type=int, default=38400, help="Baud rate (default: 38400)")
    ap.add_argument("--timeout", type=float, default=0.05, help="Serial read timeout in seconds (default: 0.05)")
    ap.add_argument("--logdir", default=".", help="Directory to write captures (default: .)")
    ap.add_argument("--status-every", type=float, default=5.0, help="Seconds between progress lines (0 disables)")
    args = ap.parse_args()

    os.makedirs(args.logdir, exist_ok=True)
    ts = _dt.datetime.now().strftime("%Y%m%d_%H%M%S")
    bin_path = os.path.join(args.logdir, f"raw_capture_{ts}.bin")

    print(f"[INFO] Opening {args.port} @ {args.baud}...")
    try:
        ser = open_serial(args.port, args.baud, args.timeout)
    except Exception as e:
        print(f"[ERROR] failed to open port: {e!r}", file=sys.stderr)
        return 2

    writer = RawWriter(bin_path)
    print(f"[INFO] Writing raw bytes to {bin_path}")
    print("[INFO] Press Ctrl+C to stop.")

    stop = StopFlag()

    def handle_sigint(sig, frame):
        stop.set()

    signal.signal(signal.SIGINT, handle_sigint)

    try:
        capture_loop(ser, writer, stop, args.status_every)
    finally:
        writer.close()
        try:
            ser.close()
        except Exception:
            pass
        print(f"[INFO] Capture stopped. Total bytes written: {os.path.getsize(bin_path)}")

    return 0


if __name__ == "__main__":
    raise SystemExit(main())
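The headerless `[ts_us:8][len:4][payload]` records that raw_capture.py writes can be read back with stdlib `struct`. A sketch (the function name is mine):

```python
import struct

def read_raw_capture(path):
    """Yield (ts_us, payload) tuples from a raw_capture .bin file."""
    with open(path, "rb") as fh:
        while True:
            header = fh.read(12)  # 8-byte ts_us + 4-byte length, little-endian
            if len(header) < 12:
                break  # EOF (or a truncated final record)
            ts_us, length = struct.unpack("<QI", header)
            yield ts_us, fh.read(length)
```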
@@ -1,29 +1,25 @@
 #!/usr/bin/env python3
 """
-s3_bridge.py — S3 <-> Blastware serial bridge with frame-aware session logging
-Version: v0.4.0
+s3_bridge.py — S3 <-> Blastware serial bridge with raw binary capture + DLE-aware text framing
+Version: v0.5.1

-Key features:
-- Low CPU: avoids per-byte console printing
-- Forwards bytes immediately (true bridge)
-- Frame-aware logging: buffers per direction until ETX (0x03), then logs full frame on one line
-- Also logs plain ASCII bursts (e.g., "Operating System") cleanly
-- Dual log output: hex text log (.log) AND raw binary log (.bin) written simultaneously
-- Interactive annotation: type 'm' + Enter to stamp a [MARK] into both logs mid-capture
-- Binary sentinel markers: out-of-band FF FF FF FF <len> <label> in .bin for programmatic correlation
-- Auto-marks on session start and end
+What's new vs v0.4.0:
+- .bin is now a TRUE raw capture stream with direction + timestamps (record container format).
+- .log remains human-friendly and frame-oriented, but frame detection is now DLE-aware:
+    - frame start = 0x10 0x02 (DLE STX)
+    - frame end   = 0x10 0x03 (DLE ETX)
+  (No longer splits on bare 0x03.)
+- Marks/Info are stored as proper record types in .bin (no unsafe sentinel bytes).
+- Optional raw taps: use --raw-bw / --raw-s3 to also dump byte-for-byte traffic per direction
+  with no headers (for tools that just need a flat stream).

 Usage examples:
   python s3_bridge.py
   python s3_bridge.py --bw COM5 --s3 COM4 --baud 38400
   python s3_bridge.py --quiet

 Annotation:
   While running, type 'm' and press Enter. You will be prompted for a label.
   The mark is written to the .log as:
     [HH:MM:SS.mmm] >>> MARK: your label here
-  And to the .bin as an out-of-band sentinel (never valid frame data):
-    FF FF FF FF <1-byte length> <label bytes>
+
+BIN record format (little-endian):
+  [type:1][ts_us:8][len:4][payload:len]
+Types:
+  0x01 BW->S3 bytes
+  0x02 S3->BW bytes
+  0x03 MARK (utf-8)
+  0x04 INFO (utf-8)
 """

 from __future__ import annotations
@@ -39,68 +35,94 @@ from typing import Optional

 import serial

-VERSION = "v0.4.0"
+VERSION = "v0.5.1"

 DLE = 0x10
 STX = 0x02
 ETX = 0x03
 ACK = 0x41

-# Sentinel prefix for binary markers. Four 0xFF bytes can never appear in
-# valid Instantel DLE-framed data (0xFF is not a legal protocol byte in any
-# framing position), so this sequence is unambiguously out-of-band.
-BIN_MARK_SENTINEL = b"\xFF\xFF\xFF\xFF"
+REC_BW = 0x01
+REC_S3 = 0x02
+REC_MARK = 0x03
+REC_INFO = 0x04


 def now_ts() -> str:
     # Local time with milliseconds, like [13:37:06.239]
     t = _dt.datetime.now()
     return t.strftime("%H:%M:%S.") + f"{int(t.microsecond/1000):03d}"


+def now_us() -> int:
+    # Wall-clock microseconds (fine for correlation). If you want monotonic, we can switch.
+    return int(time.time() * 1_000_000)
+
+
 def bytes_to_hex(b: bytes) -> str:
     return " ".join(f"{x:02X}" for x in b)


 def looks_like_text(b: bytes) -> bool:
     # Heuristic: mostly printable ASCII plus spaces
     if not b:
         return False
     printable = 0
     for x in b:
-        if x in (9, 10, 13):  # \t \n \r
+        if x in (9, 10, 13):
             printable += 1
         elif 32 <= x <= 126:
             printable += 1
     return (printable / len(b)) >= 0.90


+def pack_u32_le(n: int) -> bytes:
+    return bytes((n & 0xFF, (n >> 8) & 0xFF, (n >> 16) & 0xFF, (n >> 24) & 0xFF))
+
+
+def pack_u64_le(n: int) -> bytes:
+    out = []
+    for i in range(8):
+        out.append((n >> (8 * i)) & 0xFF)
+    return bytes(out)
+
+
 class SessionLogger:
-    def __init__(self, path: str, bin_path: str):
+    def __init__(self, path: str, bin_path: str, raw_bw_path: Optional[str] = None, raw_s3_path: Optional[str] = None):
         self.path = path
         self.bin_path = bin_path
         self._fh = open(path, "a", buffering=1, encoding="utf-8", errors="replace")
         self._bin_fh = open(bin_path, "ab", buffering=0)
         self._lock = threading.Lock()
+        # Optional pure-byte taps (no headers). BW=Blastware tx, S3=device tx.
+        self._raw_bw = open(raw_bw_path, "ab", buffering=0) if raw_bw_path else None
+        self._raw_s3 = open(raw_s3_path, "ab", buffering=0) if raw_s3_path else None

     def log_line(self, line: str) -> None:
         with self._lock:
             self._fh.write(line + "\n")

-    def log_raw(self, data: bytes) -> None:
-        """Write raw bytes directly to the binary log."""
+    def bin_write_record(self, rec_type: int, payload: bytes, ts_us: Optional[int] = None) -> None:
+        if ts_us is None:
+            ts_us = now_us()
+        header = bytes([rec_type]) + pack_u64_le(ts_us) + pack_u32_le(len(payload))
         with self._lock:
-            self._bin_fh.write(data)
+            self._bin_fh.write(header)
+            if payload:
+                self._bin_fh.write(payload)
+            # Raw taps: write only the payload bytes (no headers)
+            if rec_type == REC_BW and self._raw_bw:
+                self._raw_bw.write(payload)
+            if rec_type == REC_S3 and self._raw_s3:
+                self._raw_s3.write(payload)

     def log_mark(self, label: str) -> None:
         """
         Write an annotation mark to both logs simultaneously.

         .log — visually distinct line: [TS] >>> MARK: label
-        .bin — out-of-band sentinel: FF FF FF FF <len> <label utf-8, max 255 bytes>
         """
         ts = now_ts()
-        label_bytes = label.encode("utf-8", errors="replace")[:255]
-        sentinel = BIN_MARK_SENTINEL + bytes([len(label_bytes)]) + label_bytes
-        with self._lock:
-            self._fh.write(f"[{ts}] >>> MARK: {label}\n")
-            self._bin_fh.write(sentinel)
+        self.log_line(f"[{ts}] >>> MARK: {label}")
+        self.bin_write_record(REC_MARK, label.encode("utf-8", errors="replace"))

+    def log_info(self, msg: str) -> None:
+        ts = now_ts()
+        self.log_line(f"[{ts}] [INFO] {msg}")
+        self.bin_write_record(REC_INFO, msg.encode("utf-8", errors="replace"))

     def close(self) -> None:
         with self._lock:
@@ -110,53 +132,93 @@ class SessionLogger:
|
||||
finally:
|
||||
self._fh.close()
|
||||
self._bin_fh.close()
|
||||
if self._raw_bw:
|
||||
self._raw_bw.close()
|
||||
if self._raw_s3:
|
||||
self._raw_s3.close()
|
||||
|
||||
|
||||
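The `.bin` record layout written by `bin_write_record` above is a 1-byte record type, a u64 little-endian microsecond timestamp, a u32 little-endian payload length, then the payload. A minimal reader for that stream can be sketched as follows; `iter_bin_records` is an illustrative helper name, not part of the repo:

```python
import struct
from typing import Iterator, Tuple

def iter_bin_records(data: bytes) -> Iterator[Tuple[int, int, bytes]]:
    """Yield (rec_type, ts_us, payload) from a .bin capture.

    Layout mirrors bin_write_record(): 1-byte record type,
    u64 little-endian microsecond timestamp, u32 little-endian
    payload length, then the payload bytes.
    """
    i = 0
    while i + 13 <= len(data):  # 13 = 1 (type) + 8 (ts) + 4 (len)
        rec_type = data[i]
        ts_us, length = struct.unpack_from("<QI", data, i + 1)
        i += 13
        yield rec_type, ts_us, data[i : i + length]
        i += length
```

The `REC_*` type constants are defined elsewhere in the file and are not assumed here; the reader treats the type byte opaquely.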
 
-class FrameAssembler:
+class DLEFrameSniffer:
     """
-    Maintains a rolling buffer of bytes for one direction and emits complete frames.
-    We treat ETX=0x03 as an end-of-frame marker.
+    DLE-aware sniffer for logging only.
+    Extracts:
+      - ACK bytes (0x41) as single-byte events
+      - DLE-framed blocks starting at 10 02 and ending at 10 03
+      - Occasional ASCII bursts (e.g. "Operating System") outside framing
+    It does NOT modify bytes; it just segments them for the .log.
     """
     def __init__(self):
         self.buf = bytearray()
 
-    def push(self, chunk: bytes) -> list[bytes]:
+    def push(self, chunk: bytes) -> list[tuple[str, bytes]]:
         if chunk:
             self.buf.extend(chunk)
 
-        frames: list[bytes] = []
+        events: list[tuple[str, bytes]] = []
+
+        # Opportunistically peel off leading ACK(s) when idle-ish.
+        # We do this only when an ACK is not inside a frame (frames start with DLE).
+        while self.buf and self.buf[0] == ACK:
+            events.append(("ACK", bytes([ACK])))
+            del self.buf[0]
 
+        # Try to parse frames: find DLE STX then scan for DLE ETX
         while True:
-            try:
-                etx_i = self.buf.index(0x03)
-            except ValueError:
+            # Find start of frame
+            start = self._find_dle_stx(self.buf)
+            if start is None:
+                # No frame start. Maybe text?
+                txt = bytes(self.buf)
+                if looks_like_text(txt):
+                    events.append(("TEXT", txt))
+                    self.buf.clear()
                 break
 
-            # include ETX byte
-            frame = bytes(self.buf[: etx_i + 1])
-            del self.buf[: etx_i + 1]
+            # Emit any leading text before the frame
+            if start > 0:
+                leading = bytes(self.buf[:start])
+                if looks_like_text(leading):
+                    events.append(("TEXT", leading))
+                else:
+                    # Unknown junk; still preserve in log as RAW so you can see it
+                    events.append(("RAW", leading))
+                del self.buf[:start]
 
-            # ignore empty noise
-            if frame:
-                frames.append(frame)
+            # Now buf starts with DLE STX
+            end = self._find_dle_etx(self.buf)
+            if end is None:
+                break  # need more bytes
 
-        return frames
+            frame = bytes(self.buf[:end])
+            del self.buf[:end]
 
-    def drain_as_text_if_any(self) -> Optional[bytes]:
-        """
-        If buffer contains non-framed data (no ETX) and looks like text, emit it.
-        Useful for things like "Operating System" that come as raw ASCII.
-        """
-        if not self.buf:
-            return None
-        b = bytes(self.buf)
-        if looks_like_text(b):
-            self.buf.clear()
-            return b
+            events.append(("FRAME", frame))
+
+            # peel off any ACKs that may immediately follow
+            while self.buf and self.buf[0] == ACK:
+                events.append(("ACK", bytes([ACK])))
+                del self.buf[0]
+
+        return events
+
+    @staticmethod
+    def _find_dle_stx(b: bytearray) -> Optional[int]:
+        for i in range(len(b) - 1):
+            if b[i] == DLE and b[i + 1] == STX:
+                return i
+        return None
+
+    @staticmethod
+    def _find_dle_etx(b: bytearray) -> Optional[int]:
+        # Find first occurrence of DLE ETX after the initial DLE STX.
+        # Return index *after* ETX (slice end).
+        for i in range(2, len(b) - 1):
+            if b[i] == DLE and b[i + 1] == ETX:
+                return i + 2
+        return None
 
 
 def open_serial(port: str, baud: int) -> serial.Serial:
     # timeout keeps read() from blocking forever, enabling clean Ctrl+C shutdown
     return serial.Serial(
         port=port,
         baudrate=baud,
@@ -170,6 +232,7 @@ def open_serial(port: str, baud: int) -> serial.Serial:
 
 def forward_loop(
     name: str,
+    rec_type: int,
     src: serial.Serial,
     dst: serial.Serial,
     logger: SessionLogger,
@@ -177,22 +240,24 @@ def forward_loop(
     quiet: bool,
     status_every_s: float,
 ) -> None:
-    assembler = FrameAssembler()
+    sniffer = DLEFrameSniffer()
     last_status = time.monotonic()
 
     while not stop.is_set():
         try:
             n = src.in_waiting
-            if n:
-                chunk = src.read(n if n < 4096 else 4096)
-            else:
-                chunk = src.read(1)  # will return b"" after timeout
+            chunk = src.read(n if n and n < 4096 else (4096 if n else 1))
         except serial.SerialException as e:
             logger.log_line(f"[{now_ts()}] [ERROR] {name} serial exception: {e!r}")
             break
 
         if chunk:
-            # forward immediately
+            ts = now_us()
+
+            # 1) RAW BIN CAPTURE (absolute truth)
+            logger.bin_write_record(rec_type, chunk, ts_us=ts)
+
+            # 2) Forward immediately (bridge behavior)
             try:
                 dst.write(chunk)
             except serial.SerialTimeoutException:
@@ -201,55 +266,42 @@ def forward_loop(
                 logger.log_line(f"[{now_ts()}] [ERROR] {name} dst write exception: {e!r}")
                 break
 
-            # frame-aware logging
-            frames = assembler.push(chunk)
-            for frame in frames:
-                # Some devices send leading STX separately; we still log as-is.
-                logger.log_line(f"[{now_ts()}] [{name}] {bytes_to_hex(frame)}")
-                logger.log_raw(frame)
+            # 3) Human-friendly .log segmentation (DLE-aware)
+            for kind, payload in sniffer.push(chunk):
+                if kind == "ACK":
+                    logger.log_line(f"[{now_ts()}] [{name}] [ACK] 41")
+                elif kind == "FRAME":
+                    logger.log_line(f"[{now_ts()}] [{name}] {bytes_to_hex(payload)}")
+                elif kind == "TEXT":
+                    try:
+                        s = payload.decode("ascii", errors="replace").strip("\r\n")
+                    except Exception:
+                        s = repr(payload)
+                    logger.log_line(f"[{now_ts()}] [{name}] [TEXT] {s}")
+                else:  # RAW
+                    logger.log_line(f"[{now_ts()}] [{name}] [RAW] {bytes_to_hex(payload)}")
 
-            # If we have non-ETX data that looks like text, flush it as TEXT
-            text = assembler.drain_as_text_if_any()
-            if text is not None:
-                try:
-                    s = text.decode("ascii", errors="replace").strip("\r\n")
-                except Exception:
-                    s = repr(text)
-                logger.log_line(f"[{now_ts()}] [{name}] [TEXT] {s}")
-                logger.log_raw(text)
 
         # minimal console heartbeat (cheap)
         if not quiet and status_every_s > 0:
             now = time.monotonic()
             if (now - last_status) >= status_every_s:
                 print(f"[{now_ts()}] {name} alive")
                 last_status = now
 
         # tiny sleep only when idle to avoid spin
         if not chunk:
             time.sleep(0.002)
 
 
 def annotation_loop(logger: SessionLogger, stop: threading.Event) -> None:
     """
     Runs on the main thread (or a dedicated thread) reading stdin.
     Type 'm' + Enter to trigger an annotation prompt.
     Any other non-empty input is ignored with a hint.
     Bare Enter (empty line) is silently ignored to prevent accidental marks.
     """
     print("[MARK] Type 'm' + Enter to annotate the capture. Ctrl+C to stop.")
     while not stop.is_set():
         try:
             line = input()
-        except EOFError:
-            # stdin closed (e.g. piped input exhausted)
-            break
-        except KeyboardInterrupt:
+        except (EOFError, KeyboardInterrupt):
             break
 
         line = line.strip()
         if not line:
-            continue  # bare Enter — ignore silently
+            continue
 
         if line.lower() == "m":
             try:
@@ -269,10 +321,12 @@ def annotation_loop(logger: SessionLogger, stop: threading.Event) -> None:
 
 def main() -> int:
     ap = argparse.ArgumentParser()
-    ap.add_argument("--bw", default="COM5", help="Blastware-side COM port (default: COM5)")
-    ap.add_argument("--s3", default="COM4", help="S3-side COM port (default: COM4)")
+    ap.add_argument("--bw", default="COM4", help="Blastware-side COM port (default: COM4)")
+    ap.add_argument("--s3", default="COM5", help="S3-side COM port (default: COM5)")
     ap.add_argument("--baud", type=int, default=38400, help="Baud rate (default: 38400)")
     ap.add_argument("--logdir", default=".", help="Directory to write session logs into (default: .)")
+    ap.add_argument("--raw-bw", default=None, help="Optional file to append raw bytes sent from BW->S3 (no headers)")
+    ap.add_argument("--raw-s3", default=None, help="Optional file to append raw bytes sent from S3->BW (no headers)")
     ap.add_argument("--quiet", action="store_true", help="No console heartbeat output")
     ap.add_argument("--status-every", type=float, default=0.0, help="Seconds between console heartbeat lines (default: 0 = off)")
     args = ap.parse_args()
@@ -291,12 +345,28 @@ def main() -> int:
     ts = _dt.datetime.now().strftime("%Y%m%d_%H%M%S")
     log_path = os.path.join(args.logdir, f"s3_session_{ts}.log")
     bin_path = os.path.join(args.logdir, f"s3_session_{ts}.bin")
-    logger = SessionLogger(log_path, bin_path)
+
+    # If raw tap flags were passed without a path (bare --raw-bw / --raw-s3),
+    # or if the sentinel value "auto" is used, generate a timestamped name.
+    # If a specific path was provided, use it as-is (caller's responsibility).
+    raw_bw_path = args.raw_bw
+    raw_s3_path = args.raw_s3
+    if raw_bw_path in (None, "", "auto"):
+        raw_bw_path = os.path.join(args.logdir, f"raw_bw_{ts}.bin") if args.raw_bw is not None else None
+    if raw_s3_path in (None, "", "auto"):
+        raw_s3_path = os.path.join(args.logdir, f"raw_s3_{ts}.bin") if args.raw_s3 is not None else None
+
+    logger = SessionLogger(log_path, bin_path, raw_bw_path=raw_bw_path, raw_s3_path=raw_s3_path)
 
     print(f"[LOG] Writing hex log to {log_path}")
     print(f"[LOG] Writing binary log to {bin_path}")
+    if raw_bw_path:
+        print(f"[LOG] Raw tap BW->S3 -> {raw_bw_path}")
+    if raw_s3_path:
+        print(f"[LOG] Raw tap S3->BW -> {raw_s3_path}")
 
-    logger.log_line(f"[{now_ts()}] [INFO] s3_bridge {VERSION} start")
-    logger.log_line(f"[{now_ts()}] [INFO] BW={args.bw} S3={args.s3} baud={args.baud}")
+    logger.log_info(f"s3_bridge {VERSION} start")
+    logger.log_info(f"BW={args.bw} S3={args.s3} baud={args.baud}")
+    logger.log_mark(f"SESSION START — BW={args.bw} S3={args.s3} baud={args.baud}")
 
     stop = threading.Event()
@@ -309,16 +379,15 @@ def main() -> int:
     t1 = threading.Thread(
         target=forward_loop,
         name="BW_to_S3",
-        args=("BW->S3", bw, s3, logger, stop, args.quiet, args.status_every),
+        args=("BW->S3", REC_BW, bw, s3, logger, stop, args.quiet, args.status_every),
         daemon=True,
     )
     t2 = threading.Thread(
         target=forward_loop,
         name="S3_to_BW",
-        args=("S3->BW", s3, bw, logger, stop, args.quiet, args.status_every),
+        args=("S3->BW", REC_S3, s3, bw, logger, stop, args.quiet, args.status_every),
         daemon=True,
     )
     # Annotation loop runs in its own daemon thread so it doesn't block shutdown
     t_ann = threading.Thread(
         target=annotation_loop,
         name="Annotator",
@@ -335,12 +404,11 @@ def main() -> int:
             time.sleep(0.05)
     finally:
         print("\n[INFO] Ctrl+C detected, shutting down...")
-        logger.log_line(f"[{now_ts()}] [INFO] shutdown requested")
+        logger.log_info("shutdown requested")
 
         stop.set()
         t1.join(timeout=1.0)
         t2.join(timeout=1.0)
         # t_ann is daemon — don't join, it may be blocked on input()
 
         try:
             bw.close()
@@ -352,8 +420,7 @@ def main() -> int:
             pass
 
         logger.log_mark("SESSION END")
-        logger.log_line(f"[{now_ts()}] [INFO] ports closed, session end")
-        print("[LOG] Closing session log")
+        logger.log_info("ports closed, session end")
         logger.close()
 
     return 0
205
bridges/tcp_serial_bridge.py
Normal file
@@ -0,0 +1,205 @@
"""
tcp_serial_bridge.py — Local TCP-to-serial bridge for bench testing TcpTransport.

Listens on a TCP port and, when a client connects, opens a serial port and
bridges bytes bidirectionally. This lets you test the SFM server's TCP
endpoint (?host=127.0.0.1&tcp_port=12345) against a locally-attached MiniMate
Plus without needing a field modem.

The bridge simulates an RV55 cellular modem in transparent TCP passthrough mode:
  - No handshake bytes on connect
  - Raw bytes forwarded in both directions
  - One connection at a time (new connection closes any existing serial session)

Usage:
    python bridges/tcp_serial_bridge.py --serial COM5 --tcp-port 12345

Then in another window:
    python -m uvicorn sfm.server:app --port 8200
    curl "http://localhost:8200/device/info?host=127.0.0.1&tcp_port=12345"

Or just hit http://localhost:8200/device/info?host=127.0.0.1&tcp_port=12345
in a browser.

Requirements:
    pip install pyserial
"""

from __future__ import annotations

import argparse
import logging
import select
import socket
import sys
import threading
import time

try:
    import serial  # type: ignore
except ImportError:
    print("pyserial required: pip install pyserial", file=sys.stderr)
    sys.exit(1)

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)-7s %(message)s",
    datefmt="%H:%M:%S",
)
log = logging.getLogger("tcp_serial_bridge")

# ── Constants ─────────────────────────────────────────────────────────────────

DEFAULT_BAUD = 38_400
DEFAULT_TCP_PORT = 12345
CHUNK = 256            # bytes per read call
SERIAL_TIMEOUT = 0.02  # serial read timeout (s) — non-blocking in practice
TCP_TIMEOUT = 0.02     # socket recv timeout (s)
BOOT_DELAY = 8.0       # seconds to wait after opening serial port before
                       # forwarding data — unit cold-boot (beep + OS init)
                       # takes 5-10s from first RS-232 line assertion.
                       # Set to 0 if unit was already running before connect.


# ── Bridge session ─────────────────────────────────────────────────────────────

def _pipe_tcp_to_serial(sock: socket.socket, ser: serial.Serial, stop: threading.Event) -> None:
    """Forward bytes from TCP socket → serial port."""
    sock.settimeout(TCP_TIMEOUT)
    while not stop.is_set():
        try:
            data = sock.recv(CHUNK)
            if not data:
                log.info("TCP peer closed connection")
                stop.set()
                break
            log.debug("TCP→SER %d bytes: %s", len(data), data.hex())
            ser.write(data)
        except socket.timeout:
            pass
        except OSError as exc:
            if not stop.is_set():
                log.warning("TCP read error: %s", exc)
            stop.set()
            break


def _pipe_serial_to_tcp(sock: socket.socket, ser: serial.Serial, stop: threading.Event) -> None:
    """Forward bytes from serial port → TCP socket."""
    while not stop.is_set():
        try:
            data = ser.read(CHUNK)
            if data:
                log.debug("SER→TCP %d bytes: %s", len(data), data.hex())
                try:
                    sock.sendall(data)
                except OSError as exc:
                    if not stop.is_set():
                        log.warning("TCP send error: %s", exc)
                    stop.set()
                    break
        except serial.SerialException as exc:
            if not stop.is_set():
                log.warning("Serial read error: %s", exc)
            stop.set()
            break


def _run_session(conn: socket.socket, addr: tuple, serial_port: str, baud: int, boot_delay: float) -> None:
    """Handle one TCP client connection."""
    peer = f"{addr[0]}:{addr[1]}"
    log.info("Connection from %s", peer)

    try:
        ser = serial.Serial(
            port=serial_port,
            baudrate=baud,
            bytesize=8,
            parity="N",
            stopbits=1,
            timeout=SERIAL_TIMEOUT,
        )
    except serial.SerialException as exc:
        log.error("Cannot open serial port %s: %s", serial_port, exc)
        conn.close()
        return

    log.info("Opened %s at %d baud — waiting %.1fs for unit boot", serial_port, baud, boot_delay)
    ser.reset_input_buffer()
    ser.reset_output_buffer()

    if boot_delay > 0:
        time.sleep(boot_delay)
        ser.reset_input_buffer()  # discard any boot noise

    log.info("Bridge active: TCP %s ↔ %s", peer, serial_port)

    stop = threading.Event()
    t_tcp_to_ser = threading.Thread(
        target=_pipe_tcp_to_serial, args=(conn, ser, stop), daemon=True
    )
    t_ser_to_tcp = threading.Thread(
        target=_pipe_serial_to_tcp, args=(conn, ser, stop), daemon=True
    )
    t_tcp_to_ser.start()
    t_ser_to_tcp.start()

    stop.wait()  # block until either thread sets the stop flag

    log.info("Session ended, cleaning up")
    try:
        conn.close()
    except OSError:
        pass
    try:
        ser.close()
    except OSError:
        pass

    t_tcp_to_ser.join(timeout=2.0)
    t_ser_to_tcp.join(timeout=2.0)
    log.info("Session with %s closed", peer)


# ── Server ────────────────────────────────────────────────────────────────────

def run_bridge(serial_port: str, baud: int, tcp_port: int, boot_delay: float) -> None:
    """Accept TCP connections forever and bridge each one to the serial port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", tcp_port))
    srv.listen(1)
    log.info(
        "Listening on TCP :%d — will bridge to %s at %d baud",
        tcp_port, serial_port, baud,
    )
    log.info("Send test: curl 'http://localhost:8200/device/info?host=127.0.0.1&tcp_port=%d'", tcp_port)

    try:
        while True:
            conn, addr = srv.accept()
            # Handle one session at a time (synchronous) — matches modem behaviour
            _run_session(conn, addr, serial_port, baud, boot_delay)
    except KeyboardInterrupt:
        log.info("Shutting down")
    finally:
        srv.close()


# ── Entry point ────────────────────────────────────────────────────────────────

if __name__ == "__main__":
    ap = argparse.ArgumentParser(description="TCP-to-serial bridge for bench testing TcpTransport")
    ap.add_argument("--serial", default="COM5", help="Serial port (default: COM5)")
    ap.add_argument("--baud", type=int, default=DEFAULT_BAUD, help="Baud rate (default: 38400)")
    ap.add_argument("--tcp-port", type=int, default=DEFAULT_TCP_PORT, help="TCP listen port (default: 12345)")
    ap.add_argument("--boot-delay", type=float, default=BOOT_DELAY,
                    help="Seconds to wait after opening serial before forwarding (default: 8.0). "
                         "Set to 0 if unit is already powered on.")
    ap.add_argument("--debug", action="store_true", help="Show individual byte transfers")
    args = ap.parse_args()

    if args.debug:
        logging.getLogger().setLevel(logging.DEBUG)

    run_bridge(args.serial, args.baud, args.tcp_port, args.boot_delay)
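A quick way to smoke-test the bridge above is a raw socket probe: connect, send a few bytes, and read back whatever the serial side returns. The `probe` helper below is illustrative, not part of the repo; the host and port are whatever you passed to `--tcp-port`:

```python
import socket

def probe(host: str, port: int, payload: bytes, timeout: float = 2.0) -> bytes:
    """Send payload to the bridge and return whatever comes back (or b"")."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(payload)
        s.settimeout(timeout)
        try:
            return s.recv(256)
        except socket.timeout:
            return b""
```

Because the bridge is a transparent passthrough, whatever bytes `probe` sends appear verbatim on the serial side, and any device response comes straight back on the socket.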
@@ -1,7 +1,7 @@
 # Instantel MiniMate Plus — Blastware RS-232 Protocol Reference
 ### "The Rosetta Stone"
 > Reverse-engineered via RS-232 serial bridge sniffing between Blastware software and an Instantel MiniMate Plus seismograph (S/N: BE18189).
-> All findings derived from live packet capture. No vendor documentation was used.
+> Cross-referenced against Instantel MiniMate Plus Operator Manual (716U0101 Rev 15) from v0.18 onward.
 > **Certainty Ratings:** ✅ CONFIRMED | 🔶 INFERRED | ❓ SPECULATIVE
 > Certainty ratings apply only to protocol semantics, not to capture tooling behavior.

@@ -11,10 +11,10 @@

 | Date | Section | Change |
 |---|---|---|
-| 2026-02-25 | Initial | Document created from first hex dump analysis |
-| 2026-02-25 | §2 Frame Structure | **CORRECTED:** Frame uses DLE-STX (`0x10 0x02`) and DLE-ETX (`0x10 0x03`), not bare `0x02`/`0x03`. `0x41` confirmed as ACK not STX. DLE stuffing rule added. |
-| 2026-02-25 | §8 Timestamp | **UPDATED:** Year `0x07CB = 1995` confirmed as MiniMate hardware default date when RTC battery is disconnected. Not an encoding error. Confidence upgraded from ❓ to 🔶. |
-| 2026-02-25 | §10 DLE Stuffing | **UPGRADED:** Section upgraded from ❓ SPECULATIVE to ✅ CONFIRMED. Full stuffing rules and parser state machine documented. |
+| 2026-02-26 | Initial | Document created from first hex dump analysis |
+| 2026-02-26 | §2 Frame Structure | **CORRECTED:** Frame uses DLE-STX (`0x10 0x02`) and DLE-ETX (`0x10 0x03`), not bare `0x02`/`0x03`. `0x41` confirmed as ACK not STX. DLE stuffing rule added. |
+| 2026-02-26 | §8 Timestamp | **UPDATED:** Year `0x07CB = 1995` confirmed as MiniMate hardware default date when RTC battery is disconnected. Not an encoding error. Confidence upgraded from ❓ to 🔶. |
+| 2026-02-26 | §10 DLE Stuffing | **UPGRADED:** Section upgraded from ❓ SPECULATIVE to ✅ CONFIRMED. Full stuffing rules and parser state machine documented. |
 | 2026-02-26 | §11 Checksum | **UPDATED:** Frame builder and parser rewritten to handle DLE framing and byte stuffing correctly. |
 | 2026-02-26 | §14 Open Questions | DLE question removed (resolved). Timestamp year question removed (resolved). |
 | 2026-02-26 | §7.2 Serial Number Response | **CORRECTED:** Trailing bytes are `0x79 0x11` only (2 bytes, not 3). `0x20` was misidentified as a trailing byte — it is the frame checksum. |
@@ -30,7 +30,35 @@
 | 2026-02-26 | §15 → Appendix A | **RENAMED:** Binary log format section moved to Appendix A with explicit note that it describes tooling behavior, not protocol. |
 | 2026-02-26 | Header | **ADDED:** Certainty legend clarification — ratings apply to protocol semantics only, not tooling behavior. |
+| 2026-02-26 | §7.6 Channel Config Float Layout | **NEW SECTION:** Trigger level confirmed as IEEE 754 BE float in in/s. Alarm level identified as adjacent float = 1.0 in/s. Unit string `"in./s"` embedded inline. `0x082A` removed as trigger level candidate. |
 | 2026-02-25 | Appendix A | **UPDATED:** v0.4.0 — annotation markers added. `.bin` sentinel format documented. Parser caveat added for SUB `5A` raw ADC payloads. |
+| 2026-03-01 | §7.6 Channel Config Float Layout | **UPGRADED:** Alarm level offset fully confirmed via controlled capture (alarm 1.0→2.0, trigger 0.5→0.6). Complete per-channel layout documented. Three-channel repetition confirmed (Tran, Vert, Long). Certainty upgraded to ✅ CONFIRMED. |
+| 2026-03-01 | §7.7 `.set` File Format | **NEW SECTION:** Blastware save-to-disk format decoded. Little-endian binary struct matching wire protocol payload. Full per-channel block layout mapped. Record time confirmed as uint32 at +16. MicL unit string confirmed as `"psi\0"`. `0x082A` mystery noted — not obviously record time, needs one more capture to resolve. |
+| 2026-03-02 | §7.4 Event Index Block | **CONFIRMED:** Backlight and power save offsets independently confirmed via device-set capture (backlight=100=0x64 at +75, power-save=30=0x1E at +83). On-device change visible in S3→BW read response — no Blastware write involved. Offsets are ✅ CONFIRMED. |
+| 2026-03-02 | §7.4 Event Index Block | **NEW:** `Monitoring LCD Cycle` identified at offsets +84/+85 as uint16 BE. Default value = 65500 (0xFFDC) = effectively disabled / maximum. Confirmed from operator manual §3.13.1g. |
+| 2026-03-02 | §7.4 Event Index Block | **UPDATED:** Backlight confirmed as uint8 range 0–255 seconds per operator manual §3.13.1e ("adjustable timer, 0 to 255 seconds"). Power save unit confirmed as minutes per operator manual §3.13.1f. |
+| 2026-03-02 | Global | **NEW SOURCE:** Operator manual (716U0101 Rev 15) added as reference. Cross-referencing settings definitions, ranges, and units. Header updated. |
+| 2026-03-02 | §14 Open Questions | Float 6.2061 in/s mystery: manual confirms only two geo ranges (1.25 in/s and 10.0 in/s). 6.2061 is NOT a user-selectable range → likely internal ADC full-scale calibration constant or hardware range ceiling. Downgraded to LOW priority. |
+| 2026-03-02 | §14 Open Questions | `0x082A` hypothesis refined: 2090 decimal. At 1024 sps, 2 sec record = 2048 samples. Possible that 0x082A = total samples including 0.25s pre-trigger (256 samples) at some adjusted rate. Needs capture with different record time. |
+| 2026-03-02 | §14 Open Questions | **NEW items added:** Trigger sample width (default=2), Auto Window (1-9 sec), Aux Trigger (enabled/disabled) — all confirmed settings from operator manual not yet mapped in protocol. |
+| 2026-03-02 | §14 Open Questions | Monitoring LCD Cycle resolved — removed from open questions. |
+| 2026-03-02 | Appendix A | **CORRECTED:** Previous entry stated logger strips DLE from ETX. This was wrong — it applied to an older logger version. `s3_bridge v0.5.0` confirmed to preserve raw wire bytes including `0x10 0x03` intact. HxD inspection of new capture confirmed `10 03` present in S3→BW record payloads. |
+| 2026-03-02 | Appendix A | **UPDATED:** New capture architecture: two flat raw wire dumps per session (`raw_s3.bin`, `raw_bw.bin`), one per direction, no record wrapper. Replaces structured `.bin` format for parser input. |
+| 2026-03-02 | Appendix A | **PARSER:** Deterministic DLE state machine implemented (`s3_parser.py`). Three states: `IDLE → IN_FRAME → AFTER_DLE`. Replaces heuristic global scanning. Properly handles DLE stuffing (`10 10` → literal `10`). Only complete STX→ETX pairs counted as frames. |
+| 2026-03-02 | Appendix A | **VALIDATED:** `raw_bw.bin` yields 7 complete frames via state machine. `raw_s3.bin` contains large structured responses (first frame payload ~3922 bytes). Both files confirmed lossless. BW bare `0x02` pattern confirmed as asymmetric framing (BW sends bare STX, S3 sends DLE+STX). |
+| 2026-03-09 | §7.6, §Appendix B | **CONFIRMED:** Record time located in SUB E5 data page2 at payload offset `+0x28` as **float32 BE**. Confirmed via two controlled captures: 7 sec = `40 E0 00 00`, 13 sec = `41 50 00 00`. Geo range (only 1.25 or 10.0 in/s) eliminates ambiguity — 7 and 13 are not valid geo range values. |
+| 2026-03-09 | §7.5, §14 | **CORRECTED:** The byte `0x0A` appearing after the "Extended Notes" null-padded label in the E5 payload is **NOT** record time. It is an unknown field that equals 10 and does not change when record time changes. False lead closed. |
+| 2026-03-09 | §14 | **RESOLVED:** `0x082A` mystery closed — confirmed as fixed-size E5 payload length (2090 bytes), not a record-time-derived sample count. Value is constant regardless of record time or other settings. |
+| 2026-03-09 | §7.8, §14, Appendix B | **NEW — Trigger Sample Width confirmed:** Located in BW→S3 write frame SUB `0x82`, destuffed payload offset `[22]`, uint8. Confirmed via BW-side capture (`raw_bw.bin`) diffing two sessions: Width=4 → `0x04`, Width=3 → `0x03`. Setting is **transmitted only on BW→S3 write** (SUB `0x82`), invisible in S3-side compliance dumps. |
+| 2026-03-09 | §14, Appendix B | **CONFIRMED — Mode gating is a real protocol behavior:** Several settings are only transmitted (and possibly only interpreted by the device) when the required mode is active. Trigger Sample Width is only sent when in Compliance/Single-Shot/Fixed Record Time mode. Auto Window is only relevant when Record Stop Mode = Auto — attempting to capture it in Fixed mode produced no change on the wire (F7 and D1 blocks identical before/after). This is an architectural property, not a gap in the capture methodology. Future capture attempts for mode-gated settings must first activate the appropriate mode. |
+| 2026-03-09 | §14 | **UPDATED — Auto Window:** Capture attempted (Auto Window 3→9) in Fixed record time mode. No change observed in any S3-side frame (F7, D1, E5 all identical). Confirmed mode-gated behind Record Stop Mode = Auto. Not capturable without switching modes — deferred. |
+| 2026-03-11 | §14, Appendix B | **CONFIRMED — Aux Trigger read location:** SUB `FE` (FULL_CONFIG_RESPONSE), destuffed payload offset `0x0109`, uint8. `0x00` = disabled, `0x01` = enabled. Confirmed via controlled capture: changed Aux Trigger in Blastware, sent to unit, re-read config. FE diff showed clean isolated flip at `0x0109` with only 3 other bytes changing (likely counters/checksums at `0x0033`, `0x00C0`, `0x04ED`). |
+| 2026-03-11 | §14, Appendix B | **PARTIAL — Aux Trigger write path:** Write command not yet isolated. The BW→S3 write appears to occur inside the A4 (POLL_RESPONSE) stream via inner frame handshaking — multiple WRITE_CONFIRM_RESPONSE inner frames (SUBs `7C`, `7D`, `8B`, `8C`, `8D`, `8E`, `96`, `97`) appeared in A4 after the write, and the TRIGGER_CONFIG_RESPONSE (SUB `E3`) inner frames were removed. Write command itself not yet captured in a clean session — likely SUB `15` or embedded in the partial session 0. Write path deferred for a future clean capture. |
+| 2026-03-11 | §4, §14 | **NEW — SUB A4 is a composite container frame:** A4 (POLL_RESPONSE) payload contains multiple embedded inner frames using the same DLE framing (10 02 start, 10 03 end, 10 10 stuffing). Phase-shift diffing issue resolved in s3_analyzer.py by adding `_extract_a4_inner_frames()` and `_diff_a4_payloads()` — diff count reduced from 2300 → 17 meaningful entries. |
+| 2026-03-11 | §14 | **NEW — SUB `6E` response anomaly:** BW sends SUB `1C` (TRIGGER_CONFIG_READ) and S3 responds with SUB `6E` — does NOT follow the `0xFF - SUB` rule (`0xFF - 0x1C = 0xE3`). Only known exception to the response pairing rule observed to date. SUB `6E` payload starts with ASCII string `"Long2"`. |
+| 2026-03-12 | §11 | **CONFIRMED — BW→S3 large-frame checksum algorithm:** SUBs `68`, `69`, `71`, `82`, and `1A` (with data) use: `chk = (sum(b for b in payload[2:-1] if b != 0x10) + 0x10) % 256` — SUM8 of payload bytes `[2:-1]` skipping all `0x10` bytes, plus `0x10` as a constant, mod 256. Validated across 20 frames from two independent captures with differing string content (checksums differ between sessions, both validate correctly). Small frames (POLL, read commands) continue to use plain SUM8 of `payload[0:-1]`. The two formulas are consistent: small frames have exactly one `0x10` (CMD at `[0]`), which the large-frame formula's `[2:]` start and `+0x10` constant account for. |
+| 2026-03-12 | §11 | **RESOLVED — BAD CHK false positives on BW POLL frames:** Parser bug — BW frame terminator (`03 41`, ETX+ACK) was being included in the de-stuffed payload instead of being stripped as framing. BW frames end with bare `0x03` (not `10 03`). Fix: strip trailing `03 41` from BW payloads before checksum computation. |
+| 2026-03-30 | §3, §5.1 | **CONFIRMED — BW→S3 two-step read offset is at payload[5], NOT payload[3:4].** All BW read-command frames have `payload[3] = 0x00` and `payload[4] = 0x00` unconditionally. The two-step offset byte lives at `payload[5]`: `0x00` for the length-probe step, `DATA_LEN` for the data-fetch step. Validated against all captured frames in `bridges/captures/3-11-26/raw_bw_*.bin` — every frame is an exact bit-for-bit match when built with offset at `[5]`. The `page_hi`/`page_lo` framing in the docstring was a misattribution from the S3-side response layout (where `[3]`/`[4]` ARE page bytes). |
+| 2026-03-30 | §4, §5.2 | **CONFIRMED — S3 probe response page_key is always 0x0000.** The S3 response to a length-probe step does NOT carry the data length back in `page_hi`/`page_lo`. Both bytes are `0x00` in every observed probe response. Data lengths for each SUB are fixed constants (see §5.1 table). The `minimateplus` library now uses a hardcoded `DATA_LENGTHS` dict rather than trying to read the length from the probe response. |
+| 2026-03-31 | §12 TCP Transport | **NEW SECTION — TCP/modem transport confirmed transparent from Blastware Operator Manual (714U0301 Rev 22).** Key facts confirmed: (1) Protocol bytes over TCP are bit-for-bit identical to RS-232 — no handshake framing. (2) No ENQ byte on TCP connect (`Enable ENQ on TCP Connect: 0-Disable` in Raven ACEmanager). (3) Raven modem `Data Forwarding Timeout = 1 second` — modem buffers serial bytes up to 1s before forwarding over TCP; `TcpTransport.read_until_idle` uses `idle_gap=1.5s` to compensate. (4) TCP port is user-configurable (12335 in manual example; user's install uses 12345). (5) Baud rate over serial link to modem is 38400,8N1 regardless of TCP path. (6) ACH (Auto Call Home) = INBOUND to server (unit calls home); "call up" = OUTBOUND from client (Blastware/SFM connects to modem IP). `TcpTransport` implements outbound (call-up) mode. |
|
||||
|
||||
---
|
||||
|
||||
@@ -274,7 +302,9 @@ Write commands are initiated by Blastware (`BW->S3`) and use SUB bytes in the `0
|
||||
|
||||
## 7. Known Data Payloads

### 7.1 Poll Response (SUB A4) — Device Identity Block / Composite Container

> ⚠️ **SUB A4 is a composite container frame.** The large A4 payload (~3600+ bytes) contains multiple embedded inner sub-frames using the same DLE framing as the outer protocol (`10 02` start, `10 03` end, `10 10` stuffing). Inner frames carry WRITE_CONFIRM_RESPONSE and TRIGGER_CONFIG_RESPONSE sub-frames among others. Flat byte-by-byte diffing of A4 is unreliable due to phase shifting — use inner-frame-aware diffing (`_diff_a4_payloads()` in s3_analyzer.py). Confirmed 2026-03-11.

Two-step read. Data payload = 0x30 bytes.
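
The composite-container scan described above can be sketched as follows. This is a simplified stand-in for `_extract_a4_inner_frames()` in s3_analyzer.py (whose actual signature and edge-case handling may differ), assuming inner frames never nest:

```python
DLE, STX, ETX = 0x10, 0x02, 0x03

def extract_inner_frames(payload: bytes) -> list[bytes]:
    """Extract DLE-framed inner frames (10 02 ... 10 03, with 10 10
    stuffing) embedded in a SUB A4 composite payload.  Returns the
    de-stuffed body of each inner frame."""
    frames, i, n = [], 0, len(payload)
    while i < n - 1:
        if payload[i] == DLE and payload[i + 1] == STX:  # inner-frame start
            i += 2
            body = bytearray()
            while i < n - 1:
                if payload[i] == DLE:
                    nxt = payload[i + 1]
                    if nxt == ETX:                # 10 03 -> end of inner frame
                        frames.append(bytes(body))
                        i += 2
                        break
                    if nxt == DLE:                # 10 10 -> literal 0x10
                        body.append(DLE)
                        i += 2
                        continue
                body.append(payload[i])
                i += 1
        else:
            i += 1
    return frames
```

Note that a real payload can contain a `10 02` that is actually part of stuffed data; the analyzer's version may resolve that ambiguity differently.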
### 7.4 Event Index Response (SUB F7) — 0x58 bytes

> ✅ **2026-03-02 — CONFIRMED:** Backlight and power save offsets confirmed via two independent captures with device-set values. Offsets are from the start of the **data section** (after the 16-byte protocol header).

**Layout (offsets relative to data section start):**

```
Offset +00: 00 58 09            — Total index size or record count ❓
Offset +03: 00 00 00 01         — Possibly stored event count = 1 ❓
Offset +07: 01 07 CB 00 06 1E   — Timestamp of event 1 (see §8)
Offset +0D: 01 07 CB 00 14 00   — Timestamp of event 2 (see §8)
Offset +13: 00 00 00 17 3B      — Unknown ❓
Offset +4B: [backlight]         — BACKLIGHT ON TIME ✅ CONFIRMED
Offset +4C: 00                  — padding (backlight is uint8, not uint16)
Offset +53: [power_save]        — POWER SAVING TIMEOUT ✅ CONFIRMED
Offset +54: [lcd_hi] [lcd_lo]   — MONITORING LCD CYCLE (uint16 BE) ✅ CONFIRMED
```

| Offset | Size | Type | Known values | Meaning | Certainty |
|---|---|---|---|---|---|
| +4B | 1 | uint8 | 250, 100 | **BACKLIGHT ON TIME** (0–255 seconds per manual) | ✅ CONFIRMED |
| +4C | 1 | — | 0x00 | Padding / high byte of potential uint16 | 🔶 INFERRED |
| +53 | 1 | uint8 | 10, 30 | **POWER SAVING TIMEOUT** (minutes) | ✅ CONFIRMED |
| +54..+55 | 2 | uint16 BE | 0xFFDC = 65500 | **MONITORING LCD CYCLE** (seconds; 65500 ≈ disabled/max) | ✅ CONFIRMED |

**Confirmation captures:**

| Capture | Backlight (+4B) | Power Save (+53) | LCD Cycle (+54/55) |
|---|---|---|---|
| `20260301_160702` (BW-written) | `0xFA` = 250 | `0x0A` = 10 min | `0xFF 0xDC` = 65500 |
| `20260302_144606` (device-set) | `0x64` = 100 | `0x1E` = 30 min | `0xFF 0xDC` = 65500 |

> 📖 **Manual cross-reference (716U0101 Rev 15, §3.13.1):**
> - Backlight On Time: "adjustable timer, from 0 to 255 seconds" (§3.13.1e)
> - Power Saving Timeout: "automatically turns the Minimate Plus off" — stored in minutes (§3.13.1f)
> - Monitoring LCD Cycle: "cycles off for the time period... set to zero to turn off" — 65500 = effectively disabled (§3.13.1g)

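A minimal decoder for the three confirmed fields, assuming `data` is the F7 data section with the 16-byte protocol header already stripped. The function and key names are mine, not from the repo:

```python
import struct

def parse_display_settings(data: bytes) -> dict:
    """Pull the confirmed display/power fields out of a SUB F7 data
    section (offsets relative to the data-section start)."""
    backlight = data[0x4B]                               # seconds, uint8
    power_save = data[0x53]                              # minutes, uint8
    (lcd_cycle,) = struct.unpack_from(">H", data, 0x54)  # seconds, uint16 BE
    return {
        "backlight_s": backlight,
        "power_save_min": power_save,
        "lcd_cycle_s": lcd_cycle,   # 65500 ≈ disabled/max
    }
```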
### 7.5 Full Waveform Record (SUB F3) — 0xD2 bytes × 2 pages

> ✅ **2026-02-26 — UPDATED:** Project strings field layout confirmed by diffing the compliance setup write payload (SUB `71`). A client field change `"Hello Claude"` → `"Claude test2"` isolated the exact byte position.

### 7.6 Channel Config Float Layout (SUB E5 / SUB 71)

> ✅ **CONFIRMED — 2026-03-01** from controlled captures (sessions `193237` and `151147`). Trigger changed `0.500 → 0.200`, then `0.200 → 0.600`. Alarm changed `1.0 → 2.0`. All positions confirmed.

The SUB `1A` read response (`E5`) and SUB `71` write block contain per-channel threshold and scaling values packed as **IEEE 754 big-endian floats**, with inline unit strings. This layout repeats **once per geophone channel** (Tran, Vert, Long — 3×):

```
[00 00] [max_range float] [00 00] [trigger float] ["in.\0"] [alarm float] ["/s\0\0"] [00 01] [chan_label...]
        40 C6 97 FD               3F 19 99 9A     69 6E 2E  40 00 00 00   2F 73 00 00
        = 6.206                   = 0.600 in/s    "in."     = 2.000 in/s  "/s"
```

| Field | Example bytes | Decoded | Certainty |
|---|---|---|---|
| `[00 00]` | `00 00` | Separator / padding | 🔶 INFERRED |
| Max range float | `40 C6 97 FD` | 6.206 — full-scale range in in/s | 🔶 INFERRED |
| `[00 00]` | `00 00` | Separator / padding | 🔶 INFERRED |
| **Trigger level** | `3F 19 99 9A` | **0.600 in/s** — IEEE 754 BE float | ✅ CONFIRMED |
| Unit string | `69 6E 2E 00` | `"in.\0"` | ✅ CONFIRMED |
| **Alarm level** | `40 00 00 00` | **2.000 in/s** — IEEE 754 BE float | ✅ CONFIRMED |
| Unit string | `2F 73 00 00` | `"/s\0\0"` | ✅ CONFIRMED |
| `[00 01]` | `00 01` | Unknown flag / separator | 🔶 INFERRED |
| Channel label | e.g. `56 65 72 74` | `"Vert"` — identifies which channel | ✅ CONFIRMED |

**State transitions observed across captures:**

| Capture | Trigger | Alarm | Notes |
|---|---|---|---|
| `193237` (read) | `3F000000` = 0.500 | `3F800000` = 1.000 | Device state before any change |
| `193237` (write 1) | `3E4CCCCD` = 0.200 | `3F800000` = 1.000 | Trigger changed only |
| `151147` (write 1) | `3E4CCCCD` = 0.200 | `40000000` = 2.000 | Alarm changed, trigger carried over |
| `151147` (write 2) | `3F19999A` = 0.600 | `40000000` = 2.000 | Trigger changed, alarm carried over |

Values are stored natively in **imperial units (in/s)** — the unit strings `"in."` and `"/s"` embedded inline confirm this regardless of display locale.

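Under the offsets implied by the layout above (pads at +00 and +06, floats at +02, +08, and +16), one channel block can be decoded as follows. The function name and the INFERRED pad positions are assumptions:

```python
import struct

def parse_channel_block(block: bytes) -> dict:
    """Decode one per-channel threshold block from a SUB E5/71 payload.
    Offsets are relative to the block start and assume the layout:
    [00 00][max_range][00 00][trigger]["in.\\0"][alarm]["/s\\0\\0"].
    All floats are IEEE 754 big-endian."""
    (max_range,) = struct.unpack_from(">f", block, 2)    # after first 00 00 pad
    (trigger,)   = struct.unpack_from(">f", block, 8)    # after second pad
    unit1 = block[12:16].rstrip(b"\0").decode("ascii")   # "in."
    (alarm,)     = struct.unpack_from(">f", block, 16)
    unit2 = block[20:24].rstrip(b"\0").decode("ascii")   # "/s"
    return {"max_range": max_range, "trigger": trigger,
            "alarm": alarm, "units": unit1 + unit2}
```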
### 7.6.1 Record Time (SUB E5 data page2 `+0x28`)

> ✅ **CONFIRMED — 2026-03-09** from two controlled captures (record time 7 → 13 s, `raw_s3-3-9-26_2.bin` and `raw_s3-3-9-26_3.bin`).

Record time is stored as a **32-bit IEEE 754 float, big-endian** at offset `+0x28` from the start of the E5 data page 2 payload.

| Record Time | float32 BE bytes | Decoded |
|---|---|---|
| 7 seconds | `40 E0 00 00` | 7.0 |
| 10 seconds | `41 20 00 00` | 10.0 |
| 13 seconds | `41 50 00 00` | 13.0 |

**Disambiguation note:** Max geo range (also a float in this block) only takes the values 1.25 or 10.0 in/s. The values 7 and 13 are not valid geo range selections, confirming this field is record time.

**`0x0A` after the "Extended Notes" label:** The byte `0x0A` that appears after the null-padded "Extended Notes" string in the E5 payload is **not** record time. It is an unknown field that equals 10 and is invariant across record time changes. Do not confuse it with the record time float at `+0x28`.

> ✅ **`0x082A` (= 2090) — RESOLVED:** This value is the fixed payload length of the E5 response block (2090 bytes). It is constant regardless of record time, trigger level, or any other setting. It appears in the E5 frame header as the declared data length for the paged read, not as a settings field.

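The encoding round-trips with `struct`; `encode_record_time` and `decode_record_time` are illustrative names, not repo functions:

```python
import struct

def encode_record_time(seconds: float) -> bytes:
    """Record time as stored at E5 data page 2 offset +0x28: float32 BE."""
    return struct.pack(">f", seconds)

def decode_record_time(page2: bytes) -> float:
    """Read the record time float back out of an E5 data page 2 payload."""
    (value,) = struct.unpack_from(">f", page2, 0x28)
    return value
```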
---

### 7.7 Blastware `.set` File Format

> 🔶 **INFERRED — 2026-03-01** from `Standard_Recording_Setup.set` cross-referenced against known wire payloads.

Blastware's "save setup to disk" feature produces a binary `.set` file that is structurally identical to the wire-protocol payload, but with **all multi-byte values in little-endian byte order** (Windows-native) rather than the big-endian order used on the wire. No DLE framing, no checksums — a raw struct dump.

**File layout (2522 bytes observed):**

```
0x0000  Header / metadata block (~40 bytes) — partially decoded
0x002A  "Standard Recording Setup.set\0" — setup filename, null-padded
0x0078  Project strings block — same layout as SUB 71 wire payload
        "Project:\0" + value, "Client:\0" + value, "User Name:\0" + value,
        "Seis Loc:\0" + value, "Extended Notes\0" + value
0x06A0  Channel records block — one record per channel (geo×3 + mic×1 + duplicates)
0x0820  Device info block — serial number, firmware, model strings
0x08C0  Event index / timestamp block
0x0910  Histogram / reporting config
0x09D0  Trailer (10 bytes)
```

**Per-channel record layout (little-endian, ~46 bytes per channel):**

```
offset  size  type     value (Tran example)  meaning
+00     2     uint16   0x0001                channel type (1=geophone, 0=mic)
+02     4     char[4]  "Tran"                channel label
+06     2     uint16   0x0000                padding
+08     2     uint16   0x0001                unknown
+0A     2     uint16   0x0050 = 80           unknown (sensitivity? gain?)
+0C     2     uint16   0x000F = 15           unknown
+0E     2     uint16   0x0028 = 40           unknown
+10     2     uint16   0x0015 = 21           unknown
+12     4     bytes    03 02 04 01           flags (recording mode etc.)
+16     4     uint32   0x00000003            record time in seconds ✅ CONFIRMED
+1A     4     float32  6.2061                max range (in/s for geo, psi for mic)
+1E     2     —        00 00                 padding
+20     4     float32  0.6000                trigger level ✅ CONFIRMED
+24     4     char[4]  "in.\0" / "psi\0"     unit string (geo vs mic)
+28     4     float32  2.0000                alarm level ✅ CONFIRMED
+2C     4     char[4]  "/s\0\0" / varies     unit string 2
```
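
The per-channel record above maps onto a single `struct` format string. This parser is a sketch (the field names are mine, and the unknown uint16s are discarded):

```python
import struct

# Little-endian, 0x30 bytes; fields follow the per-channel layout table above.
CHANNEL_RECORD = struct.Struct("<H4s6H4sIf2xf4sf4s")

def parse_set_channel(rec: bytes) -> dict:
    """Decode one channel record from a Blastware .set file."""
    (ctype, label, _pad, u1, u2, u3, u4, u5, flags,
     rec_time, max_range, trigger, unit1, alarm, unit2) = \
        CHANNEL_RECORD.unpack(rec[:CHANNEL_RECORD.size])
    return {
        "type": "geophone" if ctype == 1 else "mic",
        "label": label.rstrip(b"\0").decode("ascii"),
        "record_time_s": rec_time,
        "max_range": max_range,
        "trigger": trigger,
        "alarm": alarm,
        "units": (unit1.rstrip(b"\0") + unit2.rstrip(b"\0")).decode("ascii"),
    }
```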

**MicL channel differences:**

- `channel_type` = 0 (vs 1 for geophones)
- trigger = 0.009, alarm = 0.021 (in psi)
- unit string = `"psi\0"` instead of `"in.\0"` — **confirms MicL units are psi** ✅

**Endianness summary:**

| Context | Byte order | Example (0.6 in/s trigger) |
|---|---|---|
| `.set` file | Little-endian | `9A 99 19 3F` |
| Wire protocol (SUB 71 / E5) | Big-endian | `3F 19 99 9A` |

> ✅ **`0x082A` — resolved (see §7.6.1):** `0x082A` = 2090 is the fixed payload length of the E5 response block, not a settings field. It does not correspond to record time (`0x00000003` = 3 s in this `.set` file) or to any of the unknown uint16 fields at +0A through +10.

---
### 7.8 Trigger / Advanced Config Write Frame (BW→S3 SUB `0x82`)

> ✅ **CONFIRMED — 2026-03-09** from a controlled BW-side capture diff (Trigger Sample Width 4 → 3).

SUB `0x82` is the BW→S3 write command for the advanced trigger configuration block. It is the write counterpart to the S3→BW read response SUB `0xD1` (note that `0xFF − 0x82 = 0x7D` is a separate SUB; the `2E`/`D1` read pair is distinct from this write path). The `0x82` write frame is only visible in `raw_bw.bin` — it does not appear in S3-side compliance dumps.

**Destuffed BW write frame layout (47 raw bytes → 46 destuffed):**

```
offset    value      meaning
[00]      0x10       addr (literal 0x10 after destuffing)
[01]      0x00       unknown
[02]      0x82       SUB: advanced config write
[03]      0x00       unknown
[04]      0x00       unknown
[05]      0x1C       length = 28 bytes (payload size)
[06..10]  00..       header/padding
[11..16]  00..       header/padding
[17]      0x1A       unknown (constant 26 = 0x1A)
[18]      0xD5       unknown (constant)
[19]      0x00       unknown
[20]      0x00       unknown
[21]      0x10       literal 0x10 (stuffed in raw frame as 10 10)
[22]      0x04/0x03  Trigger Sample Width ✅ CONFIRMED (uint8, samples)
[23]      0x0A       unknown (constant 10; NOT Auto Window)
[24..43]  0xFF..     padding
[44]      0x00       unknown
[45]      checksum
```

**Confirmed Trigger Sample Width values:**

| Width setting | Byte [22] |
|---|---|
| 4 samples | `0x04` |
| 3 samples | `0x03` |
| 2 samples (default) | `0x02` (expected — not yet captured) |

**Known constants in this frame:** `[17]=0x1A`, `[18]=0xD5`, `[23]=0x0A`. These do not change with Trigger Sample Width changes. Byte `[23]` = 10 was initially a candidate for Auto Window (range 1–9) but cannot be Auto Window, because 10 is outside the valid range.

**Mode gating:** This write frame is only transmitted when Blastware performs a Send To Unit operation in Compliance / Single-Shot / Fixed Record Time mode. The frame is absent from other session types.

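Combining §11's large-frame checksum with the layout above, a destuffed `0x82` payload can be validated and its Trigger Sample Width extracted. A sketch; the error handling is my own:

```python
def parse_adv_config_write(payload: bytes) -> int:
    """Return Trigger Sample Width (byte [22]) from a destuffed SUB 0x82
    write payload, after checking the SUB byte and the large-frame
    checksum (sum of [2:-1] skipping 0x10 bytes, plus 0x10, mod 256)."""
    if payload[2] != 0x82:
        raise ValueError("not an advanced-config write frame")
    chk = (sum(b for b in payload[2:-1] if b != 0x10) + 0x10) & 0xFF
    if chk != payload[-1]:
        raise ValueError("bad checksum")
    return payload[22]
```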
---
### 7.9 Mode Gating — Protocol Architecture Note

> ✅ **CONFIRMED — 2026-03-09** from controlled captures and null-change experiments.

Several settings are **mode-gated**: the device only transmits (reads) or accepts (writes) certain fields when the appropriate operating mode is active. This is an architectural property of the protocol, not a gap in capture methodology.

**Observed mode gating:**

| Setting | Gate Condition | Evidence |
|---|---|---|
| Trigger Sample Width | Compliance / Single-Shot / Fixed Record Time mode | Not visible in S3-side reads; only in the BW write frame (SUB `0x82`) when the mode is active |
| Auto Window | Record Stop Mode = Auto | Capture of a 3 → 9 change in Fixed mode produced zero wire change in all frames (F7, D1, E5 all identical) |

**Implication for captures:** To map a mode-gated setting, you must first activate the gating mode on the device, then perform the compliance dump or write capture. Changing the setting value while in the wrong mode will produce no observable wire change.

**Suspected mode-gated settings not yet captured:**

- Auto Window (requires Record Stop Mode = Auto)
- Auxiliary Trigger (unknown gate condition)

---

### 7.5 Full Waveform Record (SUB F3) — 0xD2 bytes × 2 pages (continued)

Peak values as IEEE 754 big-endian floats:

```
Tran: 3D BB 45 7A = 0.0916 (in/s — unit config dependent)
Vert: 3D B9 56 E1 = 0.0907
```

---

## 11. Checksum Reference Implementation

> ⚠️ **Updated 2026-02-26** — Rewritten for correct DLE framing and byte stuffing.
> ⚠️ **Updated 2026-03-12** — BW→S3 large-frame checksum algorithm confirmed. Two distinct formulas apply, depending on frame direction and size.

### Checksum Overview

| Direction | Frame type | Formula | Coverage |
|---|---|---|---|
| S3→BW | All frames | `sum(payload) & 0xFF` | All de-stuffed payload bytes `[0:-1]` |
| BW→S3 | Small frames (POLL, read cmds) | `sum(payload) & 0xFF` | All de-stuffed payload bytes `[0:-1]` |
| BW→S3 | Large write frames (SUB `68`,`69`,`71`,`82`,`1A`+data) | See formula below | De-stuffed payload bytes `[2:-1]`, skipping `0x10` bytes, plus constant |

### BW→S3 Large-Frame Checksum Formula

```python
def calc_checksum_bw_large(payload: bytes) -> int:
    """
    Checksum for large BW→S3 write frames (SUB 68, 69, 71, 82, 1A with data).

    Formula: sum all bytes in payload[2:-1], skipping 0x10 bytes, add 0x10, mod 256.
    Confirmed across 20 frames from two independent captures (2026-03-12).
    """
    return (sum(b for b in payload[2:-1] if b != 0x10) + 0x10) & 0xFF
```

**Why this formula:** The CMD byte at `payload[0]` is always `0x10` (DLE). The byte at `payload[1]` is always `0x00`. Starting from `payload[2]` skips both. All `0x10` bytes in the data section are excluded from the sum, then `0x10` is added back as a constant — effectively treating DLE as a transparent/invisible byte in the checksum. This is consistent with `0x10` being a framing/control character in the protocol.

**Consistency check:** For small frames, `payload[0]` = `0x10` and there are no other `0x10` bytes in the payload. The large-frame formula applied to a small frame gives `sum(payload[2:-1]) + 0x10 = sum(payload[0:-1])` — identical to the plain SUM8. The two formulas converge for frames without embedded `0x10` data bytes.

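The consistency claim can be checked numerically on a hypothetical small read command (the SUB byte `0x05` is chosen arbitrarily here):

```python
# Hypothetical small BW read command; last byte is the checksum.
payload = bytes([0x10, 0x00, 0x05, 0x00, 0x00, 0x00, 0x15])

# Small-frame rule: plain SUM8 over payload[0:-1].
plain_sum8 = sum(payload[:-1]) & 0xFF

# Large-frame rule: sum [2:-1] skipping 0x10 bytes, add 0x10 back.
large_rule = (sum(b for b in payload[2:-1] if b != 0x10) + 0x10) & 0xFF

assert plain_sum8 == large_rule == payload[-1] == 0x15
```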
```python
DLE = 0x10
STX = 0x02
ETX = 0x03


def destuff(data: bytes) -> bytes:
    """
    Collapse DLE byte stuffing in a frame body: 10 10 -> 10.
    """
    out = []
    i = 0
    while i < len(data):
        if data[i] == DLE and i + 1 < len(data) and data[i + 1] == DLE:
            out.append(DLE)
            i += 2
        else:
            out.append(data[i])
            i += 1
    return bytes(out)


def calc_checksum_s3(payload: bytes) -> int:
    """
    Standard SUM8: used for all S3→BW frames and small BW→S3 frames.
    Sum of all payload bytes (excluding the checksum byte itself), mod 256.
    """
    return sum(payload) & 0xFF


def calc_checksum_bw_large(payload: bytes) -> int:
    """
    Large BW→S3 write frame checksum (SUB 68, 69, 71, 82, 1A with data).
    Sum payload[2:-1] skipping 0x10 bytes, add 0x10, mod 256.
    """
    return (sum(b for b in payload[2:-1] if b != 0x10) + 0x10) & 0xFF


# Backwards-compatible alias
def calc_checksum(payload: bytes) -> int:
    return calc_checksum_s3(payload)


def build_frame(payload: bytes) -> bytes:
    """
    Build a complete on-wire frame from a raw payload.
    """
```

| Channels | Tran, Vert, Long, MicL (4 channels) |
| Sample Rate | ~1024 sps (🔶 INFERRED) |
| Bridge Config | COM5 (Blastware) ↔ COM4 (Device), 38400 baud |
| Capture Tool | s3_bridge v0.4.0 |

---

## 14. TCP / Modem Transport

> ✅ **CONFIRMED — 2026-03-31** from Blastware Operator Manual 714U0301 Rev 22 §4.4 and ACEmanager Raven modem configuration screenshots.

The MiniMate Plus protocol is **fully transport-agnostic at the byte level**. The same DLE-framed S3/BW frame stream that flows over RS-232 is transmitted unmodified over a TCP socket. No additional framing, handshake bytes, or session tokens are added at the application layer.

---

### 14.1 Two Usage Modes

**"Call Up" (Outbound TCP — SFM connects to modem)**

Blastware or SFM opens a TCP connection to the modem's static IP address on its device port. The modem bridges the TCP socket to its RS-232 serial port, which is wired directly to the MiniMate Plus. From the protocol's perspective this is identical to a direct serial connection.

```
SFM ──TCP──► Raven modem ──RS-232──► MiniMate Plus
     (static IP, port N)  (38400,8N1)
```

This is the mode implemented by `TcpTransport(host, port)`. Typical call:

```
GET /device/info?host=203.0.113.5&tcp_port=12345
```

**"Call Home" / ACH (Inbound TCP — unit calls the server)**

The MiniMate Plus is configured with an IP address and port. On an event trigger or at a scheduled time it powers up its modem, which establishes a TCP connection outbound to the server. Blastware (or a future SFM ACH listener) accepts the incoming connection. After the unit connects, the PC has a configurable "Wait for Connection" window in which to send the first command before the unit times out and hangs up.

```
MiniMate Plus ──RS-232──► Raven modem ──TCP──► ACH server (listening)
                                               (static office IP, port N)
```

`TcpTransport` is a **client** (outbound connect only). A separate `AchServer` listener component is needed for this mode — not yet implemented.

---

### 14.2 No Application-Layer Handshake on TCP Connect

✅ **Confirmed from an ACEmanager configuration screenshot:**

```
Enable ENQ on TCP Connect: 0-Disable
```

When a TCP connection is established (in either direction), **no ENQ byte or other handshake marker is sent** by the modem before the protocol stream starts. The first byte from either side is a raw protocol byte — for an SFM-initiated call-up, SFM sends POLL_PROBE immediately after `connect()`.

No banner, no "CONNECT" string, no Telnet negotiation preamble. The Raven modem's TCP dialog is configured with:

| ACEmanager Setting | Value | Meaning |
|---|---|---|
| TCP Auto Answer | 2 — Telnet Server | TCP mode (transparent pass-through, not actually Telnet) |
| Telnet Echo Mode | 0 — No Echo | No echo of received bytes |
| Enable ENQ on TCP Connect | 0 — Disable | No ENQ byte on connect |
| TCP Connect Response Delay | 0 | No delay before first byte |
| TCP Idle Timeout | 0 | No modem-level idle disconnect |

---

### 14.3 Modem Serial Port Configuration

> **Hardware note:** The Raven X modem shown in the Blastware manual is 3G-only and no longer operational (3G network shutdown). The current field hardware is the **Sierra Wireless RV55** (and the newer RX55). Both run ALEOS firmware and have an identical ACEmanager web UI — the settings below apply to all three generations.

The modem's RS-232 port (wired to the MiniMate Plus) must be configured as:

| ACEmanager Setting | Value |
|---|---|
| Configure Serial Port | **38400,8N1** |
| Flow Control | None |
| DB9 Serial Echo | OFF |
| Data Forwarding Timeout | **1 second** (S50=1) |
| Data Forwarding Character | 0 (disabled) |

The **Data Forwarding Timeout** is the most protocol-critical setting. The modem **accumulates bytes from the RS-232 port for up to 1 second** before forwarding them as a TCP segment. This means:

- A large S3 response frame may arrive as multiple TCP segments with up to 1-second gaps between them.
- A `read_until_idle` implementation with `idle_gap < 1.0 s` will **incorrectly declare the frame complete mid-stream**.
- `TcpTransport.read_until_idle` overrides the default `idle_gap=0.05 s` to `idle_gap=1.5 s` to compensate.

If connecting to a unit over a direct Ethernet connection (no serial modem in the path), the 1.5 s idle gap will still work but will feel slower. In that case you can pass `idle_gap=0.1` explicitly.

---
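
A minimal sketch of the idle-gap read described above; the real `TcpTransport.read_until_idle` signature and defaults may differ:

```python
import socket
import time

def read_until_idle(sock: socket.socket, idle_gap: float = 1.5,
                    overall_timeout: float = 30.0) -> bytes:
    """Read from a TCP socket until no bytes arrive for `idle_gap` seconds.
    idle_gap must exceed the modem's 1 s Data Forwarding Timeout, or a
    multi-segment frame will be cut short."""
    buf = bytearray()
    deadline = time.monotonic() + overall_timeout
    sock.settimeout(idle_gap)
    while time.monotonic() < deadline:
        try:
            chunk = sock.recv(4096)
        except socket.timeout:
            break                  # idle gap elapsed — response complete
        if not chunk:
            break                  # peer closed the connection
        buf.extend(chunk)
    return bytes(buf)
```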

### 14.4 Connection Timeouts on the Unit Side

The MiniMate Plus firmware has two relevant timeouts, configurable via Blastware's Call Home Setup dialog:

| Timeout | Description | Impact |
|---|---|---|
| **Wait for Connection** | Seconds after TCP connect during which the unit waits for the first BW frame. If nothing arrives, the unit terminates the session. | SFM must send POLL_PROBE within this window after `connect()`. The default appears short (≈15–30 s). |
| **Serial Idle Time** | Seconds of inactivity after which the unit terminates the connection. | SFM must complete its work and disconnect cleanly — or send periodic keep-alive frames — within this window. |

For our `TcpTransport` + `MiniMateProtocol` stack, both timeouts are satisfied automatically because `connect()` is immediately followed by `protocol.poll()`, which sends POLL_PROBE, and the full session (POLL + read + disconnect) typically completes in under 30 seconds.

---

### 14.5 Port Numbers

The TCP port is **user-configurable** in both Blastware and the modem. There is no universally fixed port.

| Setting location | Value in manual example | Value in user's install |
|---|---|---|
| Blastware TCP Communication dialog | 12335 | 12345 |
| Raven ACEmanager Destination Port | 12349 (UDP example) | varies |

`TcpTransport` defaults to `DEFAULT_TCP_PORT = 12345`, which matches the user's install. This can be overridden by the `port` argument or the `tcp_port` query parameter in the SFM server.

---

### 14.6 ACH Session Lifecycle (Call Home Mode — Future)

When the unit calls home under ACH, the session lifecycle from the unit's perspective is:

1. Unit triggers (event or scheduled time)
2. Unit powers up the modem, dials / connects TCP to the server IP:port
3. Unit waits through the "Wait for Connection" window for the first BW frame from the server
4. Server sends POLL_PROBE → unit responds with POLL_RESPONSE (same as serial)
5. Server reads serial number, full config, and events as needed
6. Server disconnects (or the unit disconnects on Serial Idle Time expiry)
7. Unit powers the modem down and returns to monitor mode

Step 4 onward is **identical to the serial/call-up protocol**. The only difference from our perspective is that we are the **listener** rather than the **connector**. A future `AchServer` class will accept the incoming TCP connection and hand the socket to `TcpTransport` for processing.

---
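
A possible shape for the future `AchServer` listener; this is hypothetical (nothing like it exists in the repo yet), and the function name and defaults are assumptions:

```python
import socket

def run_ach_listener(bind_host: str = "0.0.0.0", port: int = 12345):
    """Accept one inbound call-home connection and return the connected
    socket, which can then be driven exactly like an outbound call-up
    session (send POLL_PROBE within the unit's Wait-for-Connection window)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((bind_host, port))
    srv.listen(1)
    conn, peer = srv.accept()      # blocks until a unit calls home
    srv.close()
    return conn, peer
```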
## Appendix A — s3_bridge Capture Format

> ✅ **CONFIRMED — 2026-02-26**
> ⚠️ **Updated for v0.4.0 — annotation markers added.**

> ⚠️ **This behavior is not part of the Instantel protocol. It is an artifact of the bridge logger implementation.**

### A.1 Binary modifications

The `.bin` files produced by `s3_bridge` are **not raw wire bytes**. The logger makes one modification:

| Wire sequence | In .bin file | Notes |
|---|---|---|

> ⚠️ This means checksums cannot be verified on frames where the stuffed payload ends in `0x10` — that trailing `0x10` would normally be the DLE prefix of ETX, but the logger strips it, making the frame boundary ambiguous in that edge case. In practice this has not been observed in captured data.

### A.2 Annotation markers (v0.4.0+)

When the operator types `m` + Enter during a capture, both files receive a marker at that timestamp.

**`.log` format:**

```
[HH:MM:SS.mmm] >>> MARK: label text here
```

The `>>>` prefix never appears in frame log lines (which use `[direction]`) and is trivially skippable by a parser.

**`.bin` format — out-of-band sentinel:**

```
FF FF FF FF <len: 1 byte> <label: len bytes, UTF-8>
```

The four `0xFF` sentinel bytes are chosen because `0xFF` is not a valid byte in any Instantel framing position:

- Not a valid ACK (`0x41`), DLE (`0x10`), STX (`0x02`), or ETX (`0x03`)
- The `0xFF - SUB` response pattern produces values like `0xA4`, `0xEA`, `0xFE` — never a bare `0xFF` in the framing layer

**⚠️ Parser caveat — SUB `5A` raw ADC payloads:**

The sentinel assumption is robust for the framing layer, but the raw ADC sample data in SUB `5A` bulk waveform streams is less constrained. High-amplitude samples could theoretically produce `FF FF FF FF` within the data portion of a frame. **Do not scan the entire `.bin` file as a flat byte stream for sentinels.** Instead:

1. Parse frame boundaries first (walk `0x41` ACK → `0x10 0x02` STX → ... → bare `0x03` ETX)
2. Only scan for `FF FF FF FF` in the **gaps between frames** — sentinels are always written between complete frames, never mid-frame
3. Any `FF FF FF FF` appearing inside a frame boundary is ADC data, not a marker

Session start and end are automatically marked in both files.

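The frame-aware scan can be sketched as follows. This handles only S3-style `10 02 … 10 03` frames; a full version would need extra cases for BW frames (bare `03 41` terminator) and for stuffed `10 10 03` sequences:

```python
def extract_markers(blob: bytes) -> list[str]:
    """Pull annotation markers out of an s3_bridge .bin capture, scanning
    for the FF FF FF FF sentinel only in the gaps between frames so that
    SUB 5A ADC data cannot be misread as a marker."""
    markers, i, n = [], 0, len(blob)
    while i < n:
        if blob[i:i + 4] == b"\xFF\xFF\xFF\xFF":   # sentinel between frames
            length = blob[i + 4]
            markers.append(blob[i + 5:i + 5 + length].decode("utf-8"))
            i += 5 + length
        elif blob[i:i + 2] == b"\x10\x02":         # S3 frame start — skip to 10 03
            end = blob.find(b"\x10\x03", i + 2)
            i = n if end < 0 else end + 2
        else:
            i += 1
    return markers
```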
---
|
||||
|
||||
## 14. Open Questions / Still Needs Cracking
|
||||
|
||||

| Question | Priority | Added | Notes |
|---|---|---|---|
| Byte at timestamp offset 3 — hours, minutes, or padding? | MEDIUM | 2026-02-26 | |
| `trail[0]` in serial number response — unit-specific byte, derivation unknown. `trail[1]` resolved as firmware minor version. | MEDIUM | 2026-02-26 | |
| Full channel ID mapping in SUB `5A` stream (01/02/03/04 → which sensor?) | MEDIUM | 2026-02-26 | |
| Exact byte boundaries of project string fields in SUB `71` write frame — padding rules unconfirmed | MEDIUM | 2026-02-26 | |
| Purpose of SUB `09` / response `F6` — 202-byte read block | MEDIUM | 2026-02-26 | |
| Purpose of SUB `2E` / response `D1` — 26-byte read block | MEDIUM | 2026-02-26 | |
| Full field mapping of SUB `1A` / response `E5` — channel scaling / compliance config block | MEDIUM | 2026-02-26 | |
| `0x082A` in channel config block — **RESOLVED: fixed E5 payload length (2090 bytes).** Constant regardless of all settings; not trigger, alarm, or record time. | RESOLVED | 2026-03-01 | Resolved 2026-03-09 |
| **Record time in wire protocol** — float32 BE at E5 data page2 `+0x28`. **RESOLVED.** See §7.6.1. | RESOLVED | 2026-03-09 | Confirmed via 7→13 sec captures |
| Geophone alarm level float offset confirmation — value match suggests `3F 80 00 00` at known position; needs a change capture to confirm. | LOW | 2026-02-26 | |
| Unknown uint16 fields at channel block +0A (=80), +0C (=15), +0E (=40), +10 (=21) — manual describes "Sensitive (Gain=8) / Normal (Gain=1)" per-channel range; 80/15/40/21 might encode gain, sensitivity, or ADC config. | LOW | 2026-03-01 | |
| Full trigger configuration field mapping (SUB `1C` / write `82`) | LOW | 2026-02-26 | |
| Whether SUB `24`/`25` are distinct from SUB `5A` or redundant | LOW | 2026-02-26 | |
| Meaning of `0x07 E7` field in config block | LOW | 2026-02-26 | |
| **Trigger Sample Width** — **RESOLVED:** BW→S3 write frame SUB `0x82`, destuffed payload offset `[22]`, uint8. Width=4 → `0x04`, Width=3 → `0x03`. Confirmed via BW-side capture diff. Only visible in `raw_bw.bin` write traffic, not in S3-side compliance reads. | RESOLVED | 2026-03-02 | Confirmed 2026-03-09 |
| **Auto Window** — "1 to 9 seconds" per manual (§3.13.1b). **Mode-gated:** only transmitted/active when Record Stop Mode = Auto. Capture attempted in Fixed mode (3→9 change) — no wire change observed in any frame. Deferred pending mode switch. | LOW | 2026-03-02 | Updated 2026-03-09 |
| **Auxiliary Trigger read location** — **RESOLVED:** SUB `FE` offset `0x0109`, uint8, `0x00`=disabled, `0x01`=enabled. Confirmed 2026-03-11 via controlled toggle capture. | RESOLVED | 2026-03-02 | Resolved 2026-03-11 |
| **Auxiliary Trigger write path** — Write command not yet captured in a clean session. Inner frame handshake visible in A4 (multiple WRITE_CONFIRM_RESPONSE SUBs appear, TRIGGER_CONFIG_RESPONSE removed), but the BW→S3 write command itself was in a partial session. Likely SUB `15` or similar. Deferred for clean capture. | LOW | 2026-03-11 | NEW |
| **SUB `6E` response to SUB `1C`** — S3 responds to TRIGGER_CONFIG_READ (SUB `1C`) with SUB `6E`, NOT `0xE3` as the `0xFF - SUB` rule would predict. Only known exception to the response pairing rule observed to date. Payload starts with ASCII `"Long2"`. Purpose unknown. | LOW | 2026-03-11 | NEW |
| **Max Geo Range float 6.2061 in/s** — NOT a user-selectable range (manual only shows 1.25 and 10.0 in/s). Likely internal ADC full-scale constant or hardware range ceiling. Not worth capturing. | LOW | 2026-02-26 | Downgraded 2026-03-02 |
| MicL channel units — **RESOLVED: psi**, confirmed from `.set` file unit string `"psi\0"` | RESOLVED | 2026-03-01 | |
| Backlight offset — **RESOLVED: +4B in event index data**, uint8, seconds | RESOLVED | 2026-03-02 | |
| Power save offset — **RESOLVED: +53 in event index data**, uint8, minutes | RESOLVED | 2026-03-02 | |
| Monitoring LCD Cycle — **RESOLVED: +54/+55 in event index data**, uint16 BE, seconds (65500 = disabled) | RESOLVED | 2026-03-02 | |

---

*All findings reverse-engineered from live RS-232 bridge captures. No Instantel proprietary documentation was referenced or used.*

---
## Appendix B — Operator Manual Cross-Reference (716U0101 Rev 15)

> Added 2026-03-02. Cross-referencing confirms setting names, ranges, units, and behavior for fields found in protocol captures. The manual does NOT describe the wire protocol — it describes the user-facing device interface. Use it to infer data types, ranges, and semantics of protocol fields.

| Setting Name (Manual) | Manual Location | Protocol Location | Type | Range / Notes |
|---|---|---|---|---|
| Backlight On Time | §3.13.1e | Event Index +4B | uint8 | 0–255 seconds |
| Power Saving Timeout | §3.13.1f | Event Index +53 | uint8 | minutes (user sets 1–60+) |
| Monitoring LCD Cycle | §3.13.1g | Event Index +54/55 | uint16 BE | seconds; 0=off; 65500≈disabled |
| Trigger Level (Geo) | §3.8.6 | Channel block, float | float32 BE | 0.005–10.000 in/s |
| Alarm Level (Geo) | §3.9.9 | Channel block, float | float32 BE | higher than trigger level |
| Trigger Level (Mic) | §3.8.6 | Channel block, float | float32 BE | 100–148 dB in 1 dB steps |
| Alarm Level (Mic) | §3.9.10 | Channel block, float | float32 BE | higher than mic trigger |
| Record Time | §3.8.9 | E5 data page2 `+0x28` (wire); `.set` +16 (file) | float32 BE (wire); uint32 LE (file) | 1–105 seconds (menu label `<105`); confirmed 7→`40E00000`, 10→`41200000`, 13→`41500000` |
| Max Geo Range | §3.8.4 | Channel block, float | float32 BE | 1.25 or 10.0 in/s (user); 6.2061 in protocol = internal constant |
| Microphone Units | §3.9.7 | Inline unit string | char[4] | `"psi\0"`, `"pa.\0"`, `"dB\0\0"` |
| Sample Rate | §3.8.2 | Unknown — needs capture | — | 1024, 2048, 4096 (compliance); up to 65536 (advanced) |
| Record Mode | §3.8.1 | Unknown | — | Single Shot, Continuous, Manual, Histogram, Histogram Combo |
| Trigger Sample Width | §3.13.1h | BW→S3 SUB `0x82` write frame, destuffed `[22]`, uint8 | uint8 | Default=2; confirmed 4=`0x04`, 3=`0x03`. **BW-side write only** — not visible in S3 compliance reads. Mode-gated: only sent in Compliance/Single-Shot/Fixed mode. |
| Auto Window | §3.13.1b | **Mode-gated — NOT YET MAPPED** | uint8? | 1–9 seconds; only active when Record Stop Mode = Auto. Capture in Fixed mode produced no wire change. |
| Auxiliary Trigger | §3.13.1d | SUB `FE` (FULL_CONFIG_RESPONSE) offset `0x0109` (read); write path not yet isolated | uint8 (bool) | `0x00`=disabled, `0x01`=enabled; confirmed 2026-03-11 |
| Password | §3.13.1c | Unknown | — | 4-key sequence |
| Serial Connection | §3.9.11 | Unknown | — | Direct / Via Modem |
| Baud Rate | §3.9.12 | Unknown | — | 38400 for direct |
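The Record Time row doubles as a self-check: the confirmed byte patterns are exactly IEEE 754 big-endian float32, so they can be reproduced with the stdlib alone:

```python
import struct

def record_time_bytes(seconds: float) -> str:
    """Encode a record time as the float32 BE byte pattern observed at
    E5 data page 2 +0x28 (hex, uppercase)."""
    return struct.pack(">f", seconds).hex().upper()

# Byte patterns confirmed in the 7 -> 13 second change captures:
assert record_time_bytes(7.0)  == "40E00000"
assert record_time_bytes(10.0) == "41200000"
assert record_time_bytes(13.0) == "41500000"
```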

---

## Appendix C — Logger & Parser Validation (2026-03-02)
> Documents the logger integrity verification and parser refactor completed 2026-03-02. Tooling behavior only — not protocol semantics.

### C.1 Logger Validation

**Concern:** Earlier sessions noted that the `s3_bridge` logger may have been stripping `0x10` from `DLE ETX` sequences, producing bare `0x03` terminators in the capture file.

**Resolution:** HxD inspection of a new capture produced by `s3_bridge v0.5.0` confirmed that `10 03` sequences are present intact inside S3→BW record payloads. The `forward_loop` function writes raw bytes to the `.bin` before any sniffer or framing logic runs — there is no ETX stripping in v0.5.0.

The earlier stripping behavior applied to a previous logger version. v0.5.0 is confirmed lossless with respect to wire bytes.

**Confirmed wire framing:**
- Frame start: `0x10 0x02` (DLE STX) ✅
- Frame end: `0x10 0x03` (DLE ETX) ✅
- DLE stuffing: `0x10 0x10` in payload = literal `0x10` ✅

### C.2 Capture Architecture (Current)

As of 2026-03-02 the capture pipeline produces two flat raw wire dump files per session:

| File | Contents |
|---|---|
| `raw_s3.bin` | All bytes transmitted by S3 (device → Blastware), in order |
| `raw_bw.bin` | All bytes transmitted by BW (Blastware → device), in order |

No record headers, no timestamps, no framing logic applied by the dumper. Files are flat concatenations of `serial.read()` chunks. Frame boundaries must be recovered by the parser.

### C.3 Parser Design — DLE State Machine

A deterministic state machine replaces all prior heuristic scanning.

**States:**

```
STATE_IDLE       — scanning for frame start
STATE_IN_FRAME   — consuming payload bytes
STATE_AFTER_DLE  — last byte was 0x10, awaiting qualifier
```

**Transitions:**

| Current State | Byte | Action | Next State |
|---|---|---|---|
| IDLE | `10 02` | Begin new frame | IN_FRAME |
| IDLE | any | Discard | IDLE |
| IN_FRAME | `!= 10` | Append to payload | IN_FRAME |
| IN_FRAME | `10` | — | AFTER_DLE |
| AFTER_DLE | `10` | Append literal `0x10` | IN_FRAME |
| AFTER_DLE | `03` | Frame complete, emit | IDLE |
| AFTER_DLE | other | Treat as payload (recovery) | IN_FRAME |

**Properties:**
- Does not scan globally for `10 02`
- Only complete STX→ETX pairs are emitted as frames
- Incomplete trailing frames at EOF are discarded (expected at capture boundaries)
- DLE stuffing handled correctly
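The transition table translates almost line-for-line into code. A sketch of the same machine (illustrative only — `s3_parser.py` is the authoritative implementation):

```python
IDLE, IN_FRAME, AFTER_DLE = range(3)

def parse_frames(raw: bytes) -> list[bytes]:
    """Extract de-stuffed payloads of complete STX->ETX frames using the
    DLE state machine described above. An incomplete trailing frame at
    EOF is discarded, matching the documented properties."""
    frames: list[bytes] = []
    payload = bytearray()
    state = IDLE
    prev = None                              # previous byte while in IDLE
    for b in raw:
        if state == IDLE:
            if prev == 0x10 and b == 0x02:   # DLE STX — begin new frame
                payload = bytearray()
                state = IN_FRAME
            prev = b
        elif state == IN_FRAME:
            if b == 0x10:
                state = AFTER_DLE
            else:
                payload.append(b)
        else:  # AFTER_DLE
            if b == 0x10:                    # stuffed literal 0x10
                payload.append(0x10)
                state = IN_FRAME
            elif b == 0x03:                  # DLE ETX — frame complete
                frames.append(bytes(payload))
                state = IDLE
                prev = None
            else:                            # recovery: treat as payload
                payload.append(b)
                state = IN_FRAME
    return frames
```

For example, `parse_frames(b"\x41\x10\x02\x01\x10\x10\x02\x10\x03")` yields one frame with the de-stuffed payload `01 10 02`, while a trailing `10 02 ...` without an ETX yields nothing extra.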

### C.4 Observed Traffic (Validation Captures)

**`raw_bw.bin`** (Blastware → S3):
- 7 complete frames via state machine
- Mostly small command/control frames, several zero-length payloads
- Bare `0x02` used as STX (asymmetric — BW does not use DLE STX)
- Contains project metadata strings: `"Standard Recording Setup.set"`, `"Claude test2"`, `"Location #1 - Brians House"`

**`raw_s3.bin`** (S3 → Blastware):
- First frame payload ~3922 bytes (large structured response)
- Repeated `"Instantel"` / `"MiniMate Plus"` / `"BE18189"` strings throughout
- Multiple medium-length structured frames
- DLE+ETX confirmed intact

### C.5 Key Lessons

1. **Global byte counting ≠ frame counting.** `0x10 0x02` appears inside payloads. Only state machine transitions produce valid frame boundaries.
2. **STX count ≠ frame count.** Only STX→ETX pairs within proper state transitions count.
3. **EOF mid-frame is normal.** Capture termination during active traffic produces an incomplete trailing frame. Not an error.
4. **Layer separation.** The parser extracts frames only. Decoding block IDs, validating checksums, and interpreting semantics are responsibilities of a separate protocol decoder layer above it.

### C.6 Parser Layer Architecture

```
raw_s3.bin / raw_bw.bin
        ↓
DLE Frame Parser (s3_parser.py)   <- framing only
        ↓
Protocol Decoder (future)         <- SUB IDs, block layout, checksums
        ↓
Semantic Interpretation           <- settings, events, responses
```

---

*All findings reverse-engineered from live RS-232 bridge captures.*
*Cross-referenced from 2026-03-02 with Instantel MiniMate Plus Operator Manual (716U0101 Rev 15).*
*This is a living document — append changelog entries and timestamps as new findings are confirmed or corrected.*
139	docs/instantel_protocol_session_summary_2-26_3-1.md	Normal file
@@ -0,0 +1,139 @@
# Instantel MiniMate Plus — RS-232 Protocol RE

**Session Summary: Chat Compacted 2026-03-01**

Device: MiniMate Plus S/N BE18189
FW S338.17 / DSP 10.72
Capture: 38400 baud, COM4/COM5

---

# Session 1 — Protocol Foundations & Write Command Discovery

**2026-02-27**

## Frame Structure Confirmed
- DLE framing: `ACK (0x41)` + `DLE+STX (0x10 0x02)` … payload … checksum … `DLE+ETX (0x10 0x03)`
- DLE byte stuffing: `0x10` in payload → `0x10 0x10` on wire
- Checksum: 8-bit sum of de-stuffed payload bytes, mod 256
- Payload structure:
  `CMD | DLE | ADDR | FLAGS | SUB | OFFSET_HI | OFFSET_LO | data…`
- All BW→S3 requests use `CMD=0x02`
- All responses use CMD matching the DLE prefix
- Response `SUB = 0xFF − Request SUB`
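With those rules, wrapping a de-stuffed payload into a wire frame is mechanical. A hedged sketch (whether the checksum byte itself is subject to DLE stuffing on the real wire is an assumption here, not yet confirmed by a capture whose checksum happens to be `0x10`):

```python
DLE, STX, ETX = 0x10, 0x02, 0x03

def build_frame(payload: bytes) -> bytes:
    """Wrap a de-stuffed payload in DLE framing: DLE STX, stuffed payload,
    checksum (8-bit sum of the de-stuffed payload, mod 256), DLE ETX.
    Sketch only -- the checksum byte is assumed to be stuffed like payload."""
    checksum = sum(payload) % 256
    body = bytearray()
    for b in payload + bytes([checksum]):
        body.append(b)
        if b == DLE:            # stuff literal 0x10 as 0x10 0x10
            body.append(DLE)
    return bytes([DLE, STX]) + bytes(body) + bytes([DLE, ETX])
```

For example, payload `02 10 01` has checksum `0x13` and frames as `10 02 02 10 10 01 13 10 03` — the embedded `0x10` doubled, the frame delimiters left bare.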

## Session Startup Sequence
Device boot prints ASCII **“Operating System”** before binary protocol mode.

Blastware init sequence:
1. POLL (SUB 5B)
2. Channel config (06)
3. Serial (15)
4. Full config (01)
5. Event index (08)
6. Event headers (1E)
7. Waveform records (0C)
8. Bulk stream (5A)

## Write Commands Discovered

| SUB (Req) | SUB (Resp) | Function |
|---|---|---|
| 0x71 | 0x8E | Trigger config write |
| 0x72 | 0x8D | Trigger config page 2 |
| 0x73 | 0x8C | Unknown write |
| 0x74 | 0x8B | Unknown write |
| 0x82 | 0x7D | Unknown write (post config) |
| 0x83 | 0x7C | Unknown write (terminal) |
| 0x68 | 0x97 | Event index write? |
| 0x09 | 0xF6 | Unknown read |
| 0x1A | 0xE5 | Unknown multi-page read |
| 0x2E | 0xD1 | Unknown short read |
---

# Session 2 — Trigger & Alarm Level Floats
**2026-03-01 ~20:51**

## Key Findings
- Trigger & alarm levels are IEEE‑754 single‑precision **big‑endian floats**
- Trigger level change verified (0.5 → 0.2 in/s)
- Alarm level verified (1.0 → 2.0 in/s)
- Unit strings embedded inline (`"psi"`, `"in./s"`)
- `0x082A` ruled out as trigger candidate

## SUB 71 Float Offsets

| Offset | Field | Value | Encoding |
|---|---|---|---|
| d[32..35] | MicL trigger | 0.0450 psi | IEEE754 BE |
| d[38..41] | MicL low thresh | 0.0100 psi | IEEE754 BE |
| d[46..49] | MicL alarm | 0.0210 psi | IEEE754 BE |
| d[42..44] | Units | psi\0 | ASCII |
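Given a de-stuffed SUB `71` payload `d`, those offsets read out directly with `struct.unpack_from`. A sketch against a synthetic payload (constructed for illustration — not captured bytes; `read_micl_fields` is a hypothetical helper name):

```python
import struct

def read_micl_fields(d: bytes) -> dict:
    """Pull the MicL floats and unit string out of a de-stuffed SUB 71
    payload at the offsets tabulated above (IEEE 754 big-endian float32)."""
    return {
        "trigger":    struct.unpack_from(">f", d, 32)[0],
        "low_thresh": struct.unpack_from(">f", d, 38)[0],
        "alarm":      struct.unpack_from(">f", d, 46)[0],
        "units":      d[42:45].rstrip(b"\x00").decode("ascii"),
    }

# Synthetic 50-byte payload with the fields planted at their offsets.
d = bytearray(50)
struct.pack_into(">f", d, 32, 0.045)
struct.pack_into(">f", d, 38, 0.010)
d[42:45] = b"psi"
struct.pack_into(">f", d, 46, 0.021)
fields = read_micl_fields(bytes(d))
```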

---

# Session 3 — Multi‑Parameter Capture
**2026-03-01 ~20:53**

| Parameter | Change | Result |
|---|---|---|
| Alarm level | 2.0 in/s | Confirmed |
| Trigger level | 0.6 in/s | Confirmed |
| Record time | 3s | Confirmed |
| Sentinels | FF FF FF FF | Write boundaries confirmed |

---

# Session 4 — .set File Decode
**2026-03-01 ~20:55**

## .set Format
- Binary per‑channel structs
- Backlight field at **+0x0C**
- MicL units confirmed as **psi**
- Record time offset confirmed

Unknown uint16 fields:
- +0x0A = 80
- +0x0E = 40
- +0x10 = 21

## Backlight / Power Saving Tests
Changes tested:
- Backlight 15 → 30
- Power save 2 → 5
- Mic dB toggle

Result:
- SUB 71 frames identical
- No new writes after sentinels
- Device confirmed to support settings → offsets unknown

---

# Current State — Pending Capture

Next capture targets:
- Backlight = 250 → search `0xFA`
- Power saving = 10 → search `0x0A`
- Possible encodings:
  - uint16 BE
  - uint32 BE
  - Little‑endian variants

---

# Open Questions

| Question | Priority | Status |
|---|---|---|
| Timestamp byte 3 | MEDIUM | Open |
| Serial response trailing bytes | MEDIUM | Open |
| Channel ID mapping | MEDIUM | Open |
| Write config coverage | MEDIUM | Partial |
| Backlight offsets | HIGH | Active |
| MicL units | LOW | Resolved |
| SUB 24/25 vs 5A | LOW | Open |
| 0x07E7 config field | LOW | Open |

---

All findings reverse‑engineered from RS‑232 captures. No vendor docs used.
27	minimateplus/__init__.py	Normal file
@@ -0,0 +1,27 @@
"""
|
||||
minimateplus — Instantel MiniMate Plus protocol library.
|
||||
|
||||
Provides a clean Python API for communicating with MiniMate Plus seismographs
|
||||
over RS-232 serial (direct cable) or TCP (modem / ACH Auto Call Home).
|
||||
|
||||
Typical usage (serial):
|
||||
from minimateplus import MiniMateClient
|
||||
|
||||
with MiniMateClient("COM5") as device:
|
||||
info = device.connect()
|
||||
events = device.get_events()
|
||||
|
||||
Typical usage (TCP / modem):
|
||||
from minimateplus import MiniMateClient
|
||||
from minimateplus.transport import TcpTransport
|
||||
|
||||
with MiniMateClient(transport=TcpTransport("203.0.113.5", 12345)) as device:
|
||||
info = device.connect()
|
||||
"""
|
||||
|
||||
from .client import MiniMateClient
|
||||
from .models import DeviceInfo, Event
|
||||
from .transport import SerialTransport, TcpTransport
|
||||
|
||||
__version__ = "0.1.0"
|
||||
__all__ = ["MiniMateClient", "DeviceInfo", "Event", "SerialTransport", "TcpTransport"]
|
||||
483	minimateplus/client.py	Normal file
@@ -0,0 +1,483 @@
"""
|
||||
client.py — MiniMateClient: the top-level public API for the library.
|
||||
|
||||
Combines transport, protocol, and model decoding into a single easy-to-use
|
||||
class. This is the only layer that the SFM server (sfm/server.py) imports
|
||||
directly.
|
||||
|
||||
Design: stateless per-call (connect → do work → disconnect).
|
||||
The client does not hold an open connection between calls. This keeps the
|
||||
first implementation simple and matches Blastware's observed behaviour.
|
||||
Persistent connections can be added later without changing the public API.
|
||||
|
||||
Example (serial):
|
||||
from minimateplus import MiniMateClient
|
||||
|
||||
with MiniMateClient("COM5") as device:
|
||||
info = device.connect() # POLL handshake + identity read
|
||||
events = device.get_events() # download all events
|
||||
|
||||
Example (TCP / modem):
|
||||
from minimateplus import MiniMateClient
|
||||
from minimateplus.transport import TcpTransport
|
||||
|
||||
transport = TcpTransport("203.0.113.5", port=12345)
|
||||
with MiniMateClient(transport=transport) as device:
|
||||
info = device.connect()
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import logging
|
||||
import struct
|
||||
from typing import Optional
|
||||
|
||||
from .framing import S3Frame
|
||||
from .models import (
|
||||
DeviceInfo,
|
||||
Event,
|
||||
PeakValues,
|
||||
ProjectInfo,
|
||||
Timestamp,
|
||||
)
|
||||
from .protocol import MiniMateProtocol, ProtocolError
|
||||
from .protocol import (
|
||||
SUB_SERIAL_NUMBER,
|
||||
SUB_FULL_CONFIG,
|
||||
SUB_EVENT_INDEX,
|
||||
SUB_EVENT_HEADER,
|
||||
SUB_WAVEFORM_RECORD,
|
||||
)
|
||||
from .transport import SerialTransport, BaseTransport
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
||||
# ── MiniMateClient ────────────────────────────────────────────────────────────
|
||||
|
||||
class MiniMateClient:
    """
    High-level client for a single MiniMate Plus device.

    Args:
        port: Serial port name (e.g. "COM5", "/dev/ttyUSB0").
            Not required when a pre-built transport is provided.
        baud: Baud rate (default 38400, ignored when transport is provided).
        timeout: Per-request receive timeout in seconds (default 15.0).
        transport: Pre-built transport (SerialTransport or TcpTransport).
            If None, a SerialTransport is constructed from port/baud.
    """

    def __init__(
        self,
        port: str = "",
        baud: int = 38_400,
        timeout: float = 15.0,
        transport: Optional[BaseTransport] = None,
    ) -> None:
        self.port = port
        self.baud = baud
        self.timeout = timeout
        self._transport: Optional[BaseTransport] = transport
        self._proto: Optional[MiniMateProtocol] = None

    # ── Connection lifecycle ──────────────────────────────────────────────────

    def open(self) -> None:
        """Open the transport connection."""
        if self._transport is None:
            self._transport = SerialTransport(self.port, self.baud)
        if not self._transport.is_connected:
            self._transport.connect()
        self._proto = MiniMateProtocol(self._transport, recv_timeout=self.timeout)

    def close(self) -> None:
        """Close the transport connection."""
        if self._transport and self._transport.is_connected:
            self._transport.disconnect()
        self._proto = None

    @property
    def is_open(self) -> bool:
        return bool(self._transport and self._transport.is_connected)

    # ── Context manager ───────────────────────────────────────────────────────

    def __enter__(self) -> "MiniMateClient":
        self.open()
        return self

    def __exit__(self, *_) -> None:
        self.close()

    # ── Public API ────────────────────────────────────────────────────────────

    def connect(self) -> DeviceInfo:
        """
        Perform the startup handshake and read device identity.

        Opens the connection if not already open.

        Reads:
            1. POLL handshake (startup)
            2. SUB 15 — serial number
            3. SUB 01 — full config block (firmware, model strings)

        Returns:
            Populated DeviceInfo.

        Raises:
            ProtocolError: on any communication failure.
        """
        if not self.is_open:
            self.open()

        proto = self._require_proto()

        log.info("connect: POLL startup")
        proto.startup()

        log.info("connect: reading serial number (SUB 15)")
        sn_data = proto.read(SUB_SERIAL_NUMBER)
        device_info = _decode_serial_number(sn_data)

        log.info("connect: reading full config (SUB 01)")
        cfg_data = proto.read(SUB_FULL_CONFIG)
        _decode_full_config_into(cfg_data, device_info)

        log.info("connect: %s", device_info)
        return device_info

    def get_events(self, include_waveforms: bool = True) -> list[Event]:
        """
        Download all stored events from the device.

        For each event in the index:
            1. SUB 1E — event header (timestamp, sample rate)
            2. SUB 0C — full waveform record (peak values, project strings)

        Raw ADC waveform samples (SUB 5A bulk stream) are NOT downloaded
        here — they can be large. include_waveforms is reserved for a
        future call that will also download them; it is currently ignored.

        Args:
            include_waveforms: Reserved. Currently ignored.

        Returns:
            List of Event objects, one per stored record on the device.

        Raises:
            ProtocolError: on any communication failure.
        """
        proto = self._require_proto()

        log.info("get_events: reading event index (SUB 08)")
        index_data = proto.read(SUB_EVENT_INDEX)
        event_count = _decode_event_count(index_data)
        log.info("get_events: %d event(s) found", event_count)

        events: list[Event] = []
        for i in range(event_count):
            log.info("get_events: downloading event %d/%d", i + 1, event_count)
            ev = self._download_event(proto, i)
            if ev:
                events.append(ev)

        return events

    # ── Internal helpers ──────────────────────────────────────────────────────

    def _require_proto(self) -> MiniMateProtocol:
        if self._proto is None:
            raise RuntimeError("MiniMateClient is not connected. Call open() first.")
        return self._proto

    def _download_event(
        self, proto: MiniMateProtocol, index: int
    ) -> Optional[Event]:
        """Download header + waveform record for one event by index."""
        ev = Event(index=index)

        # SUB 1E — event header (timestamp, sample rate).
        #
        # The two-step event-header read passes the event index at payload[5]
        # of the data-request frame (consistent with all other reads).
        # This limits addressing to events 0–255 without a multi-byte scheme;
        # the MiniMate Plus stores up to ~1000 events, so high indices may need
        # a revised approach once we have captured event-download frames.
        try:
            from .framing import build_bw_frame
            from .protocol import _expected_rsp_sub, SUB_EVENT_HEADER

            # Step 1 — probe (offset=0)
            probe_frame = build_bw_frame(SUB_EVENT_HEADER, 0)
            proto._send(probe_frame)
            _probe_rsp = proto._recv_one(expected_sub=_expected_rsp_sub(SUB_EVENT_HEADER))

            # Step 2 — data request (offset = event index, clamped to 0xFF)
            event_offset = min(index, 0xFF)
            data_frame = build_bw_frame(SUB_EVENT_HEADER, event_offset)
            proto._send(data_frame)
            data_rsp = proto._recv_one(expected_sub=_expected_rsp_sub(SUB_EVENT_HEADER))

            _decode_event_header_into(data_rsp.data, ev)
        except ProtocolError as exc:
            log.warning("event %d: header read failed: %s", index, exc)
            return ev  # Return partial event rather than losing it entirely

        # SUB 0C — full waveform record (peak values, project strings).
        try:
            wf_data = proto.read(SUB_WAVEFORM_RECORD)
            _decode_waveform_record_into(wf_data, ev)
        except ProtocolError as exc:
            log.warning("event %d: waveform record read failed: %s", index, exc)

        return ev
# ── Decoder functions ─────────────────────────────────────────────────────────
|
||||
#
|
||||
# Pure functions: bytes → model field population.
|
||||
# Kept here (not in models.py) to isolate protocol knowledge from data shapes.
|
||||
|
||||
def _decode_serial_number(data: bytes) -> DeviceInfo:
|
||||
"""
|
||||
Decode SUB EA (SERIAL_NUMBER_RESPONSE) payload into a new DeviceInfo.
|
||||
|
||||
Layout (10 bytes total per §7.2):
|
||||
bytes 0–7: serial string, null-terminated, null-padded ("BE18189\\x00")
|
||||
byte 8: unit-specific trailing byte (purpose unknown ❓)
|
||||
byte 9: firmware minor version (0x11 = 17) ✅
|
||||
|
||||
Returns:
|
||||
New DeviceInfo with serial, firmware_minor, serial_trail_0 populated.
|
||||
"""
|
||||
if len(data) < 9:
|
||||
# Short payload — gracefully degrade
|
||||
serial = data.rstrip(b"\x00").decode("ascii", errors="replace")
|
||||
return DeviceInfo(serial=serial, firmware_minor=0)
|
||||
|
||||
serial = data[:8].rstrip(b"\x00").decode("ascii", errors="replace")
|
||||
trail_0 = data[8] if len(data) > 8 else None
|
||||
fw_minor = data[9] if len(data) > 9 else 0
|
||||
|
||||
return DeviceInfo(
|
||||
serial=serial,
|
||||
firmware_minor=fw_minor,
|
||||
serial_trail_0=trail_0,
|
||||
)
|
||||
|
||||
|
||||
def _decode_full_config_into(data: bytes, info: DeviceInfo) -> None:
|
||||
"""
|
||||
Decode SUB FE (FULL_CONFIG_RESPONSE) payload into an existing DeviceInfo.
|
||||
|
||||
The FE response arrives as a composite S3 outer frame whose data section
|
||||
contains inner DLE-framed sub-frames. Because of this nesting the §7.3
|
||||
fixed offsets (0x34, 0x3C, 0x44, 0x6D) are unreliable — they assume a
|
||||
clean non-nested payload starting at byte 0.
|
||||
|
||||
Instead we search the whole byte array for known ASCII patterns. The
|
||||
strings are long enough to be unique in any reasonable payload.
|
||||
|
||||
Modifies info in-place.
|
||||
"""
|
||||
def _extract(needle: bytes, max_len: int = 32) -> Optional[str]:
|
||||
"""Return the null-terminated ASCII string that starts with *needle*."""
|
||||
pos = data.find(needle)
|
||||
if pos < 0:
|
||||
return None
|
||||
end = pos
|
||||
while end < len(data) and data[end] != 0 and (end - pos) < max_len:
|
||||
end += 1
|
||||
        s = data[pos:end].decode("ascii", errors="replace").strip()
        return s or None

    # ── Manufacturer and model are straightforward literal matches ────────────
    info.manufacturer = _extract(b"Instantel")
    info.model = _extract(b"MiniMate Plus")

    # ── Firmware version: "S3xx.xx" — scan for the 'S3' prefix ───────────────
    for i in range(len(data) - 5):
        if data[i] == ord('S') and data[i + 1] == ord('3') and chr(data[i + 2]).isdigit():
            end = i
            while end < len(data) and data[end] not in (0, 0x20) and (end - i) < 12:
                end += 1
            candidate = data[i:end].decode("ascii", errors="replace").strip()
            if "." in candidate and len(candidate) >= 5:
                info.firmware_version = candidate
                break

    # ── DSP version: numeric "xx.xx" — search for known prefixes ─────────────
    for prefix in (b"10.", b"11.", b"12.", b"9.", b"8."):
        pos = data.find(prefix)
        if pos < 0:
            continue
        end = pos
        while end < len(data) and data[end] not in (0, 0x20) and (end - pos) < 8:
            end += 1
        candidate = data[pos:end].decode("ascii", errors="replace").strip()
        # Accept only strings that look like "digits.digits"
        if "." in candidate and all(c in "0123456789." for c in candidate):
            info.dsp_version = candidate
            break


def _decode_event_count(data: bytes) -> int:
    """
    Extract stored event count from SUB F7 (EVENT_INDEX_RESPONSE) payload.

    Layout per §7.4 (offsets from data section start):
        +00: 00 58 09     — total index size or record count ❓
        +03: 00 00 00 01  — possibly stored event count = 1 ❓

    We use bytes +03..+06 interpreted as uint32 BE as the event count.
    This is inferred (🔶) — the exact meaning of the first 3 bytes is unclear.
    """
    if len(data) < 7:
        log.warning("event index payload too short (%d bytes), assuming 0 events", len(data))
        return 0

    # Try the uint32 at +3 first
    count = struct.unpack_from(">I", data, 3)[0]

    # Sanity check: MiniMate Plus manual says max ~1000 events
    if count > 1000:
        log.warning(
            "event count %d looks unreasonably large — clamping to 0", count
        )
        return 0

    return count
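For illustration, applying the uint32-BE-at-+3 reading to the example index bytes quoted in the docstring above (`00 58 09 00 00 00 01`) yields a count of 1. A standalone sketch, not part of the module:

```python
import struct

# Index bytes as documented in §7.4: 3 unknown bytes, then uint32 BE count
data = bytes([0x00, 0x58, 0x09, 0x00, 0x00, 0x00, 0x01])
count = struct.unpack_from(">I", data, 3)[0]
assert count == 1
```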

def _decode_event_header_into(data: bytes, event: Event) -> None:
    """
    Decode SUB E1 (EVENT_HEADER_RESPONSE) into an existing Event.

    The 6-byte timestamp is at the start of the data payload.
    Sample rate location is not yet confirmed — left as None for now.

    Modifies event in-place.
    """
    if len(data) < 6:
        log.warning("event header payload too short (%d bytes)", len(data))
        return
    try:
        event.timestamp = Timestamp.from_bytes(data[:6])
    except ValueError as exc:
        log.warning("event header timestamp decode failed: %s", exc)


def _decode_waveform_record_into(data: bytes, event: Event) -> None:
    """
    Decode SUB F3 (FULL_WAVEFORM_RECORD) data into an existing Event.

    Peak values are stored as IEEE 754 big-endian floats. Confirmed
    positions per §7.5 (search for the known float bytes in the payload).

    This decoder is intentionally conservative — it searches for the
    canonical 4×float32 pattern rather than relying on a fixed offset,
    since the exact field layout is only partially confirmed.

    Modifies event in-place.
    """
    # Attempt to extract four consecutive IEEE 754 BE floats from the
    # known region of the payload (offsets are 🔶 INFERRED from captured data)
    try:
        peak_values = _extract_peak_floats(data)
        if peak_values:
            event.peak_values = peak_values
    except Exception as exc:
        log.warning("waveform record peak decode failed: %s", exc)

    # Project strings — search for known ASCII labels
    try:
        project_info = _extract_project_strings(data)
        if project_info:
            event.project_info = project_info
    except Exception as exc:
        log.warning("waveform record project strings decode failed: %s", exc)


def _extract_peak_floats(data: bytes) -> Optional[PeakValues]:
    """
    Scan the waveform record payload for four sequential float32 BE values
    corresponding to Tran, Vert, Long, MicL peak values.

    The exact offset is not confirmed (🔶), so we do a heuristic scan:
    look for four consecutive 4-byte groups where each decodes as a
    plausible PPV value (0 ≤ v < 100 in/s or psi).

    Returns PeakValues if a plausible group is found, else None.
    """
    # Require at least 16 bytes for 4 floats
    if len(data) < 16:
        return None

    for start in range(0, len(data) - 15, 4):
        try:
            vals = struct.unpack_from(">4f", data, start)
        except struct.error:
            continue

        # All four values should be non-negative and within plausible PPV range
        if all(0.0 <= v < 100.0 for v in vals):
            tran, vert, long_, micl = vals
            # MicL (psi) is typically much smaller than geo values
            # Simple sanity: at least two non-zero values
            if sum(v > 0 for v in vals) >= 2:
                log.debug(
                    "peak floats at offset %d: T=%.4f V=%.4f L=%.4f M=%.6f",
                    start, tran, vert, long_, micl
                )
                return PeakValues(
                    tran=tran, vert=vert, long=long_, micl=micl
                )
    return None


def _extract_project_strings(data: bytes) -> Optional[ProjectInfo]:
    """
    Search the waveform record payload for known ASCII label strings
    ("Project:", "Client:", "User Name:", "Seis Loc:", "Extended Notes")
    and extract the associated value strings that follow them.

    Layout (per §7.5): each entry is [label ~16 bytes][value ~32 bytes],
    null-padded. We find the label, then read the next non-null chars.
    """
    def _find_string_after(needle: bytes, max_value_len: int = 64) -> Optional[str]:
        pos = data.find(needle)
        if pos < 0:
            return None
        # Skip the label (including null padding) until we find a non-null value.
        # The value starts at pos+len(needle), but may have a gap of null bytes.
        value_start = pos + len(needle)
        # Skip nulls
        while value_start < len(data) and data[value_start] == 0:
            value_start += 1
        if value_start >= len(data):
            return None
        # Read until null terminator or max_value_len
        end = value_start
        while end < len(data) and data[end] != 0 and (end - value_start) < max_value_len:
            end += 1
        value = data[value_start:end].decode("ascii", errors="replace").strip()
        return value or None

    project = _find_string_after(b"Project:")
    client = _find_string_after(b"Client:")
    operator = _find_string_after(b"User Name:")
    location = _find_string_after(b"Seis Loc:")
    notes = _find_string_after(b"Extended Notes")

    if not any([project, client, operator, location, notes]):
        return None

    return ProjectInfo(
        project=project,
        client=client,
        operator=operator,
        sensor_location=location,
        notes=notes,
    )
276
minimateplus/framing.py
Normal file
@@ -0,0 +1,276 @@
"""
framing.py — DLE frame codec for the Instantel MiniMate Plus RS-232 protocol.

Wire format:
    BW→S3 (our requests):   [ACK=0x41] [STX=0x02] [stuffed payload+chk] [ETX=0x03]
    S3→BW (device replies): [DLE=0x10] [STX=0x02] [stuffed payload+chk] [ETX=0x03]

The ACK 0x41 byte often precedes S3 frames too — it is silently discarded
by the streaming parser.

De-stuffed payload layout:
    BW→S3 request frame:
        [0] CMD      0x10 (BW request marker)
        [1] flags    0x00
        [2] SUB      command sub-byte
        [3] 0x00     always zero in captured frames
        [4] 0x00     always zero in captured frames
        [5] OFFSET   two-step offset: 0x00 = length-probe, DATA_LEN = data-request
        [6-15]       zero padding (total de-stuffed payload = 16 bytes)

    S3→BW response frame:
        [0] CMD      0x00 (S3 response marker)
        [1] flags    0x10
        [2] SUB      response sub-byte (= 0xFF - request SUB)
        [3] PAGE_HI  high byte of page address (always 0x00 in observed frames)
        [4] PAGE_LO  low byte (always 0x00 in observed frames)
        [5+] data    payload data section (composite inner frames for large responses)

DLE stuffing rule: any 0x10 byte in the payload is doubled on the wire
(0x10 → 0x10 0x10). This applies to the checksum byte too.

Confirmed from live captures (s3_parser.py validation + raw_bw.bin / raw_s3.bin).
"""

from __future__ import annotations

from dataclasses import dataclass
from typing import Optional

# ── Protocol byte constants ───────────────────────────────────────────────────

DLE = 0x10  # Data Link Escape
STX = 0x02  # Start of text
ETX = 0x03  # End of text
ACK = 0x41  # Acknowledgement / frame-start marker (BW side)

BW_CMD = 0x10    # CMD byte value in BW→S3 frames
S3_CMD = 0x00    # CMD byte value in S3→BW frames
S3_FLAGS = 0x10  # flags byte value in S3→BW frames

# BW read-command payload size: 6 header bytes + 10 padding bytes = 16 total.
# Confirmed from captured raw_bw.bin: all read-command frames carry exactly 16
# de-stuffed bytes (excluding the appended checksum).
_BW_PAYLOAD_SIZE = 16


# ── DLE stuffing / de-stuffing ────────────────────────────────────────────────

def dle_stuff(data: bytes) -> bytes:
    """Escape literal 0x10 bytes: 0x10 → 0x10 0x10."""
    out = bytearray()
    for b in data:
        if b == DLE:
            out.append(DLE)
        out.append(b)
    return bytes(out)


def dle_unstuff(data: bytes) -> bytes:
    """Remove DLE stuffing: 0x10 0x10 → 0x10."""
    out = bytearray()
    i = 0
    while i < len(data):
        b = data[i]
        if b == DLE and i + 1 < len(data) and data[i + 1] == DLE:
            out.append(DLE)
            i += 2
        else:
            out.append(b)
            i += 1
    return bytes(out)


# ── Checksum ─────────────────────────────────────────────────────────────────

def checksum(payload: bytes) -> int:
    """SUM8: sum of all de-stuffed payload bytes, mod 256."""
    return sum(payload) & 0xFF
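The stuffing rule and SUM8 checksum can be exercised round-trip. A standalone sanity-check sketch that restates the two helpers above so it runs on its own; the example payload bytes are arbitrary:

```python
DLE = 0x10

def stuff(data: bytes) -> bytes:
    # Double every literal 0x10 byte
    out = bytearray()
    for b in data:
        if b == DLE:
            out.append(DLE)
        out.append(b)
    return bytes(out)

def unstuff(data: bytes) -> bytes:
    # Collapse 0x10 0x10 back to a single 0x10
    out = bytearray()
    i = 0
    while i < len(data):
        if data[i] == DLE and i + 1 < len(data) and data[i + 1] == DLE:
            out.append(DLE)
            i += 2
        else:
            out.append(data[i])
            i += 1
    return bytes(out)

payload = bytes([0x00, 0x10, 0x02, 0x10])   # contains two literal DLEs
wire = stuff(payload)
assert wire == bytes([0x00, 0x10, 0x10, 0x02, 0x10, 0x10])
assert unstuff(wire) == payload
assert sum(payload) & 0xFF == 0x22           # SUM8 checksum of the payload
```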

# ── BW→S3 frame builder ───────────────────────────────────────────────────────

def build_bw_frame(sub: int, offset: int = 0) -> bytes:
    """
    Build a BW→S3 read-command frame.

    The payload is always 16 de-stuffed bytes:
        [BW_CMD, 0x00, sub, 0x00, 0x00, offset, 0x00 × 10]

    Confirmed from BW capture analysis: payload[3] and payload[4] are always
    0x00 across all observed read commands. The two-step offset lives at
    payload[5]: 0x00 for the length-probe step, DATA_LEN for the data-fetch step.

    Wire output: [ACK] [STX] dle_stuff(payload + checksum) [ETX]

    Args:
        sub: SUB command byte (e.g. 0x01 = FULL_CONFIG_READ)
        offset: Value placed at payload[5].
            Pass 0 for the probe step; pass DATA_LENGTHS[sub] for the data step.

    Returns:
        Complete frame bytes ready to write to the serial port / socket.
    """
    payload = bytes([BW_CMD, 0x00, sub, 0x00, 0x00, offset]) + bytes(_BW_PAYLOAD_SIZE - 6)
    chk = checksum(payload)
    wire = bytes([ACK, STX]) + dle_stuff(payload + bytes([chk])) + bytes([ETX])
    return wire
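The builder can be cross-checked by assembling the POLL length-probe frame (SUB 0x5B, offset 0x00) by hand. A standalone sketch; note it stuffs only the payload before appending the checksum, which happens to be safe here because the checksum 0x6B is not 0x10 — the real builder stuffs payload and checksum together:

```python
# 16-byte de-stuffed payload: [CMD=0x10, flags, SUB=0x5B, 0, 0, offset=0] + 10 zeros
payload = bytes([0x10, 0x00, 0x5B, 0x00, 0x00, 0x00]) + bytes(10)
chk = sum(payload) & 0xFF                                 # 0x10 + 0x5B = 0x6B
stuffed = payload.replace(b"\x10", b"\x10\x10") + bytes([chk])
wire = bytes([0x41, 0x02]) + stuffed + bytes([0x03])      # [ACK][STX]...[ETX]
assert chk == 0x6B
assert wire.hex() == "4102" + "1010" + "005b000000" + "00" * 10 + "6b" + "03"
```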

# ── Pre-built POLL frames ─────────────────────────────────────────────────────
#
# POLL (SUB 0x5B) uses the same two-step pattern as all other reads — the
# hardcoded length 0x30 lives at payload[5], exactly as in build_bw_frame().

POLL_PROBE = build_bw_frame(0x5B, 0x00)  # length-probe POLL (offset = 0)
POLL_DATA = build_bw_frame(0x5B, 0x30)   # data-request POLL (offset = 0x30)


# ── S3 response dataclass ─────────────────────────────────────────────────────

@dataclass
class S3Frame:
    """A fully parsed and de-stuffed S3→BW response frame."""
    sub: int        # response SUB byte (e.g. 0xA4 = POLL_RESPONSE)
    page_hi: int    # PAGE_HI from header (= data length on step-2 length response)
    page_lo: int    # PAGE_LO from header
    data: bytes     # payload data section (payload[5:], checksum already stripped)
    checksum_valid: bool

    @property
    def page_key(self) -> int:
        """Combined 16-bit page address / length: (page_hi << 8) | page_lo."""
        return (self.page_hi << 8) | self.page_lo


# ── Streaming S3 frame parser ─────────────────────────────────────────────────

class S3FrameParser:
    """
    Incremental byte-stream parser for S3→BW response frames.

    Feed incoming bytes with feed(). Complete, valid frames are returned
    immediately and also accumulated in self.frames.

    State machine:
        IDLE         — scanning for DLE (0x10)
        SEEN_DLE     — saw DLE, waiting for STX (0x02) to start a frame
        IN_FRAME     — collecting de-stuffed payload bytes; bare ETX ends frame
        IN_FRAME_DLE — inside frame, saw DLE; DLE DLE continues stuffing;
                       DLE+ETX is treated as literal data (NOT a frame end),
                       which lets inner-frame terminators pass through intact

    Wire format confirmed from captures:
        [DLE=0x10] [STX=0x02] [stuffed payload+chk] [bare ETX=0x03]
    The ETX is NOT preceded by a DLE on the wire. DLE+ETX sequences that
    appear inside the payload are inner-frame terminators and must be
    treated as literal data.

    ACK (0x41) bytes and arbitrary non-DLE bytes in IDLE state are silently
    discarded (covers device boot string "Operating System" and keepalive ACKs).
    """

    _IDLE = 0
    _SEEN_DLE = 1
    _IN_FRAME = 2
    _IN_FRAME_DLE = 3

    def __init__(self) -> None:
        self._state = self._IDLE
        self._body = bytearray()  # accumulates de-stuffed frame bytes
        self.frames: list[S3Frame] = []

    def reset(self) -> None:
        self._state = self._IDLE
        self._body.clear()

    def feed(self, data: bytes) -> list[S3Frame]:
        """
        Process a chunk of incoming bytes.

        Returns a list of S3Frame objects completed during this call.
        All completed frames are also appended to self.frames.
        """
        completed: list[S3Frame] = []
        for b in data:
            frame = self._step(b)
            if frame is not None:
                completed.append(frame)
                self.frames.append(frame)
        return completed

    def _step(self, b: int) -> Optional[S3Frame]:
        """Process one byte. Returns a completed S3Frame or None."""

        if self._state == self._IDLE:
            if b == DLE:
                self._state = self._SEEN_DLE
            # ACK, boot strings, garbage — silently ignored

        elif self._state == self._SEEN_DLE:
            if b == STX:
                self._body.clear()
                self._state = self._IN_FRAME
            else:
                # Stray DLE not followed by STX — back to idle
                self._state = self._IDLE

        elif self._state == self._IN_FRAME:
            if b == DLE:
                self._state = self._IN_FRAME_DLE
            elif b == ETX:
                # Bare ETX = real frame terminator (confirmed from captures)
                frame = self._finalise()
                self._state = self._IDLE
                return frame
            else:
                self._body.append(b)

        elif self._state == self._IN_FRAME_DLE:
            if b == DLE:
                # DLE DLE → literal 0x10 in payload
                self._body.append(DLE)
                self._state = self._IN_FRAME
            elif b == ETX:
                # DLE+ETX inside a frame is an inner-frame terminator, NOT
                # the outer frame end. Treat as literal data and continue.
                self._body.append(DLE)
                self._body.append(ETX)
                self._state = self._IN_FRAME
            else:
                # Unexpected DLE + byte — treat both as literal data and continue
                self._body.append(DLE)
                self._body.append(b)
                self._state = self._IN_FRAME

        return None

    def _finalise(self) -> Optional[S3Frame]:
        """
        Called when the bare ETX terminator is seen. Validates the checksum
        and builds an S3Frame.
        Returns None if the frame is too short or structurally invalid.
        """
        body = bytes(self._body)

        # Minimum valid frame: 5-byte header + at least 1 checksum byte = 6
        if len(body) < 6:
            return None

        raw_payload = body[:-1]  # everything except the trailing checksum byte
        chk_received = body[-1]
        chk_computed = checksum(raw_payload)

        if len(raw_payload) < 5:
            return None

        # Validate CMD byte — we only accept S3→BW response frames here
        if raw_payload[0] != S3_CMD:
            return None

        return S3Frame(
            sub=raw_payload[2],
            page_hi=raw_payload[3],
            page_lo=raw_payload[4],
            data=raw_payload[5:],
            checksum_valid=(chk_received == chk_computed),
        )
215
minimateplus/models.py
Normal file
@@ -0,0 +1,215 @@
"""
models.py — Plain-Python data models for the MiniMate Plus protocol library.

All models are intentionally simple dataclasses with no protocol logic.
They represent *decoded* device data — the client layer translates raw frame
bytes into these objects, and the SFM API layer serialises them to JSON.

Notes on certainty:
    Fields marked ✅ are confirmed from captured data.
    Fields marked 🔶 are strongly inferred but not formally proven.
    Fields marked ❓ are present in the captured payload but not yet decoded.
See docs/instantel_protocol_reference.md for full derivation details.
"""

from __future__ import annotations

import struct
from dataclasses import dataclass, field
from typing import Optional


# ── Timestamp ─────────────────────────────────────────────────────────────────

@dataclass
class Timestamp:
    """
    6-byte event timestamp decoded from the MiniMate Plus wire format.

    Wire layout: [flag:1] [year:2 BE] [unknown:1] [month:1] [day:1]

    The year 1995 is the device's factory-default RTC date — it appears
    whenever the battery has been disconnected. Treat 1995 as "clock not set".
    """
    raw: bytes         # raw 6-byte sequence for round-tripping
    flag: int          # byte 0 — validity/type flag (usually 0x01) 🔶
    year: int          # bytes 1–2 big-endian uint16 ✅
    unknown_byte: int  # byte 3 — likely hours/minutes ❓
    month: int         # byte 4 ✅
    day: int           # byte 5 ✅

    @classmethod
    def from_bytes(cls, data: bytes) -> "Timestamp":
        """
        Decode a 6-byte timestamp sequence.

        Args:
            data: exactly 6 bytes from the device payload.

        Returns:
            Decoded Timestamp.

        Raises:
            ValueError: if data is not exactly 6 bytes.
        """
        if len(data) != 6:
            raise ValueError(f"Timestamp requires exactly 6 bytes, got {len(data)}")
        flag = data[0]
        year = struct.unpack_from(">H", data, 1)[0]
        unknown_byte = data[3]
        month = data[4]
        day = data[5]
        return cls(
            raw=bytes(data),
            flag=flag,
            year=year,
            unknown_byte=unknown_byte,
            month=month,
            day=day,
        )

    @property
    def clock_set(self) -> bool:
        """False when year == 1995 (factory default / battery-lost state)."""
        return self.year != 1995

    def __str__(self) -> str:
        if not self.clock_set:
            return f"CLOCK_NOT_SET ({self.year}-{self.month:02d}-{self.day:02d})"
        return f"{self.year}-{self.month:02d}-{self.day:02d}"
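The wire layout can be checked by hand on a made-up byte sequence (the example timestamp 1999-03-15 is hypothetical, not taken from a capture). A standalone sketch:

```python
import struct

# [flag:1] [year:2 BE] [unknown:1] [month:1] [day:1]
raw = bytes([0x01, 0x07, 0xCF, 0x00, 0x03, 0x0F])
flag = raw[0]
year = struct.unpack_from(">H", raw, 1)[0]    # 0x07CF = 1999
month, day = raw[4], raw[5]
assert (flag, year, month, day) == (0x01, 1999, 3, 15)
assert year != 1995                            # clock_set heuristic: 1995 = RTC not set
```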


# ── Device identity ───────────────────────────────────────────────────────────

@dataclass
class DeviceInfo:
    """
    Combined device identity information gathered during the startup sequence.

    Populated from three response SUBs:
        - SUB EA (SERIAL_NUMBER_RESPONSE): serial, firmware_minor
        - SUB FE (FULL_CONFIG_RESPONSE):   serial (repeat), firmware_version,
                                           dsp_version, manufacturer, model
        - SUB A4 (POLL_RESPONSE):          manufacturer (repeat), model (repeat)

    All string fields are stripped of null padding before storage.
    """

    # ── From SUB EA (SERIAL_NUMBER_RESPONSE) ─────────────────────────────────
    serial: str          # e.g. "BE18189" ✅
    firmware_minor: int  # 0x11 = 17 for S337.17 ✅
    serial_trail_0: Optional[int] = None  # unit-specific byte — purpose unknown ❓

    # ── From SUB FE (FULL_CONFIG_RESPONSE) ────────────────────────────────────
    firmware_version: Optional[str] = None  # e.g. "S337.17" ✅
    dsp_version: Optional[str] = None       # e.g. "10.72" ✅
    manufacturer: Optional[str] = None      # e.g. "Instantel" ✅
    model: Optional[str] = None             # e.g. "MiniMate Plus" ✅

    def __str__(self) -> str:
        fw = self.firmware_version or f"?.{self.firmware_minor}"
        mdl = self.model or "MiniMate Plus"
        return f"{mdl} S/N:{self.serial} FW:{fw}"


# ── Channel threshold / scaling ───────────────────────────────────────────────

@dataclass
class ChannelConfig:
    """
    Per-channel threshold and scaling values from SUB E5 / SUB 71.

    Floats are stored in the device in imperial units (in/s for geo channels,
    psi for MicL). Unit strings embedded in the payload confirm this.

    Certainty: ✅ CONFIRMED for trigger_level, alarm_level, unit strings.
    """
    label: str            # e.g. "Tran", "Vert", "Long", "MicL" ✅
    trigger_level: float  # in/s (geo) or psi (MicL) ✅
    alarm_level: float    # in/s (geo) or psi (MicL) ✅
    max_range: float      # full-scale calibration constant (e.g. 6.206) 🔶
    unit_label: str       # e.g. "in./s" or "psi" ✅


# ── Peak values for one event ─────────────────────────────────────────────────

@dataclass
class PeakValues:
    """
    Per-channel peak particle velocity / pressure for a single event.

    Extracted from the Full Waveform Record (SUB F3), stored as IEEE 754
    big-endian floats in the device's native units (in/s / psi).
    """
    tran: Optional[float] = None  # Transverse PPV (in/s) ✅
    vert: Optional[float] = None  # Vertical PPV (in/s) ✅
    long: Optional[float] = None  # Longitudinal PPV (in/s) ✅
    micl: Optional[float] = None  # Air overpressure (psi) 🔶 (units uncertain)


# ── Project / operator metadata ───────────────────────────────────────────────

@dataclass
class ProjectInfo:
    """
    Operator-supplied project and location strings from the Full Waveform
    Record (SUB F3) and compliance config block (SUB E5 / SUB 71).

    All fields are optional — they may be blank if the operator did not fill
    them in through Blastware.
    """
    setup_name: Optional[str] = None       # "Standard Recording Setup"
    project: Optional[str] = None          # project description
    client: Optional[str] = None           # client name ✅ confirmed offset
    operator: Optional[str] = None         # operator / user name
    sensor_location: Optional[str] = None  # sensor location string
    notes: Optional[str] = None            # extended notes


# ── Event ─────────────────────────────────────────────────────────────────────

@dataclass
class Event:
    """
    A single seismic event record downloaded from the device.

    Populated progressively across several request/response pairs:
        1. SUB 1E (EVENT_HEADER)         → index, timestamp, sample_rate
        2. SUB 0C (FULL_WAVEFORM_RECORD) → peak_values, project_info, record_type
        3. SUB 5A (BULK_WAVEFORM_STREAM) → raw_samples (downloaded on demand)

    Fields not yet retrieved are None.
    """
    # ── Identity ──────────────────────────────────────────────────────────────
    index: int  # 0-based event number on device

    # ── From EVENT_HEADER (SUB 1E) ────────────────────────────────────────────
    timestamp: Optional[Timestamp] = None  # 6-byte timestamp ✅
    sample_rate: Optional[int] = None      # samples/sec (e.g. 1024) 🔶

    # ── From FULL_WAVEFORM_RECORD (SUB F3) ───────────────────────────────────
    peak_values: Optional[PeakValues] = None
    project_info: Optional[ProjectInfo] = None
    record_type: Optional[str] = None  # e.g. "Histogram", "Waveform" 🔶

    # ── From BULK_WAVEFORM_STREAM (SUB 5A) ───────────────────────────────────
    # Raw ADC samples keyed by channel label. Not fetched unless explicitly
    # requested (large data transfer — up to several MB per event).
    raw_samples: Optional[dict] = None  # {"Tran": [...], "Vert": [...], ...}

    def __str__(self) -> str:
        ts = str(self.timestamp) if self.timestamp else "no timestamp"
        ppv = ""
        if self.peak_values:
            pv = self.peak_values
            parts = []
            if pv.tran is not None:
                parts.append(f"T={pv.tran:.4f}")
            if pv.vert is not None:
                parts.append(f"V={pv.vert:.4f}")
            if pv.long is not None:
                parts.append(f"L={pv.long:.4f}")
            if pv.micl is not None:
                parts.append(f"M={pv.micl:.6f}")
            ppv = " [" + ", ".join(parts) + " in/s]"
        return f"Event#{self.index} {ts}{ppv}"
317
minimateplus/protocol.py
Normal file
@@ -0,0 +1,317 @@
"""
protocol.py — High-level MiniMate Plus request/response protocol.

Implements the request/response patterns documented in
docs/instantel_protocol_reference.md on top of:
    - minimateplus.framing   — DLE codec, frame builder, S3 streaming parser
    - minimateplus.transport — byte I/O (SerialTransport / future TcpTransport)

This module knows nothing about pyserial or TCP — it only calls
transport.write() and transport.read_until_idle().

Key patterns implemented:
    - POLL startup handshake (two-step, special payload[5] format)
    - Generic two-step paged read (probe → look up fixed length → fetch data)
    - Response timeout + checksum validation
    - Boot-string drain (device sends "Operating System" ASCII before framing)

All public methods raise ProtocolError on timeout, bad checksum, or
unexpected response SUB.
"""

from __future__ import annotations

import logging
import time
from typing import Optional

from .framing import (
    S3Frame,
    S3FrameParser,
    build_bw_frame,
    POLL_PROBE,
    POLL_DATA,
)
from .transport import BaseTransport

log = logging.getLogger(__name__)


# ── Constants ─────────────────────────────────────────────────────────────────

# Response SUB = 0xFF - Request SUB (confirmed pattern, no known exceptions
# among read commands; one write-path exception documented for SUB 1C→6E).
def _expected_rsp_sub(req_sub: int) -> int:
    return (0xFF - req_sub) & 0xFF
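The complement rule reproduces the response SUBs named elsewhere in this library (POLL 0x5B → 0xA4, FULL_CONFIG 0x01 → 0xFE, and so on). A standalone spot-check:

```python
# Response SUB = 0xFF - request SUB, for the SUB pairs documented in this repo
assert (0xFF - 0x5B) & 0xFF == 0xA4   # POLL          → POLL_RESPONSE
assert (0xFF - 0x01) & 0xFF == 0xFE   # FULL_CONFIG   → FULL_CONFIG_RESPONSE
assert (0xFF - 0x15) & 0xFF == 0xEA   # SERIAL_NUMBER → SERIAL_NUMBER_RESPONSE
assert (0xFF - 0x1E) & 0xFF == 0xE1   # EVENT_HEADER  → EVENT_HEADER_RESPONSE
```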
|
||||
|
||||
|
||||
# SUB byte constants (request side) — see protocol reference §5.1
|
||||
SUB_POLL = 0x5B
|
||||
SUB_SERIAL_NUMBER = 0x15
|
||||
SUB_FULL_CONFIG = 0x01
|
||||
SUB_EVENT_INDEX = 0x08
|
||||
SUB_CHANNEL_CONFIG = 0x06
|
||||
SUB_TRIGGER_CONFIG = 0x1C
|
||||
SUB_EVENT_HEADER = 0x1E
|
||||
SUB_WAVEFORM_HEADER = 0x0A
|
||||
SUB_WAVEFORM_RECORD = 0x0C
|
||||
SUB_BULK_WAVEFORM = 0x5A
|
||||
SUB_COMPLIANCE = 0x1A
|
||||
SUB_UNKNOWN_2E = 0x2E
|
||||
|
||||
# Hardcoded data lengths for the two-step read protocol.
|
||||
#
|
||||
# The S3 probe response page_key is always 0x0000 — it does NOT carry the
|
||||
# data length back to us. Instead, each SUB has a fixed known payload size
|
||||
# confirmed from BW capture analysis (offset at payload[5] of the data-request
|
||||
# frame).
|
||||
#
|
||||
# Key: request SUB byte. Value: offset/length byte sent in the data-request.
|
||||
# Entries marked 🔶 are inferred from captured frames and may need adjustment.
|
||||
DATA_LENGTHS: dict[int, int] = {
|
||||
SUB_POLL: 0x30, # POLL startup data block ✅
|
||||
SUB_SERIAL_NUMBER: 0x0A, # 10-byte serial number block ✅
|
||||
SUB_FULL_CONFIG: 0x98, # 152-byte full config block ✅
|
||||
SUB_EVENT_INDEX: 0x58, # 88-byte event index ✅
|
||||
SUB_TRIGGER_CONFIG: 0x2C, # 44-byte trigger config 🔶
|
||||
SUB_UNKNOWN_2E: 0x1A, # 26 bytes, purpose TBD 🔶
|
||||
0x09: 0xCA, # 202 bytes, purpose TBD 🔶
|
||||
# SUB_COMPLIANCE (0x1A) uses a multi-step sequence with a 2090-byte total;
|
||||
# NOT handled here — requires specialised read logic.
|
||||
}
|
||||
|
||||
# Default timeout values (seconds).
|
||||
# MiniMate Plus is a slow device — keep these generous.
|
||||
DEFAULT_RECV_TIMEOUT = 10.0
|
||||
POLL_RECV_TIMEOUT = 10.0
|
||||
|
||||
|
||||
# ── Exception ─────────────────────────────────────────────────────────────────
|
||||
|
||||
class ProtocolError(Exception):
|
||||
"""Raised when the device violates the expected protocol."""
|
||||
|
||||
|
||||
class TimeoutError(ProtocolError):
|
||||
"""Raised when no response is received within the allowed time."""
|
||||
|
||||
|
||||
class ChecksumError(ProtocolError):
|
||||
"""Raised when a received frame has a bad checksum."""
|
||||
|
||||
|
||||
class UnexpectedResponse(ProtocolError):
|
||||
"""Raised when the response SUB doesn't match what we requested."""
|
||||
|
||||
|
||||
# ── MiniMateProtocol ──────────────────────────────────────────────────────────
|
||||
|
||||
class MiniMateProtocol:
|
||||
"""
|
||||
Protocol state machine for one open connection to a MiniMate Plus device.
|
||||
|
||||
Does not own the transport — transport lifetime is managed by MiniMateClient.
|
||||
|
||||
Typical usage (via MiniMateClient — not directly):
|
||||
proto = MiniMateProtocol(transport)
|
||||
proto.startup() # POLL handshake, drain boot string
|
||||
data = proto.read(SUB_FULL_CONFIG)
|
||||
sn_data = proto.read(SUB_SERIAL_NUMBER)
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
transport: BaseTransport,
|
||||
recv_timeout: float = DEFAULT_RECV_TIMEOUT,
|
||||
) -> None:
|
||||
self._transport = transport
|
||||
self._recv_timeout = recv_timeout
|
||||
self._parser = S3FrameParser()
|
||||
|
||||
# ── Public API ────────────────────────────────────────────────────────────
|
||||
|
||||
def startup(self) -> S3Frame:
|
||||
"""
|
||||
Perform the POLL startup handshake and return the POLL data frame.
|
||||
|
||||
Steps (matching §6 Session Startup Sequence):
|
||||
1. Drain any boot-string bytes ("Operating System" ASCII)
|
||||
2. Send POLL_PROBE (SUB 5B, offset=0x00)
|
||||
3. Receive probe ack (page_key is 0x0000; data length 0x30 is hardcoded)
|
||||
4. Send POLL_DATA (SUB 5B, offset=0x30)
|
||||
5. Receive data frame with "Instantel" + "MiniMate Plus" strings
|
||||
|
||||
Returns:
|
||||
The data-phase POLL response S3Frame.
|
||||
|
||||
Raises:
|
||||
ProtocolError: if either POLL step fails.
|
||||
"""
|
||||
log.debug("startup: draining boot string")
|
||||
self._drain_boot_string()
|
||||
|
||||
log.debug("startup: POLL probe")
|
||||
self._send(POLL_PROBE)
|
||||
probe_rsp = self._recv_one(
|
||||
expected_sub=_expected_rsp_sub(SUB_POLL),
|
||||
timeout=self._recv_timeout,
|
||||
)
|
||||
log.debug(
|
||||
"startup: POLL probe response page_key=0x%04X", probe_rsp.page_key
|
||||
)
|
||||
|
||||
log.debug("startup: POLL data request")
|
||||
self._send(POLL_DATA)
|
||||
data_rsp = self._recv_one(
|
||||
expected_sub=_expected_rsp_sub(SUB_POLL),
|
||||
timeout=self._recv_timeout,
|
||||
)
|
||||
log.debug("startup: POLL data received, %d bytes", len(data_rsp.data))
|
||||
return data_rsp
|
||||
|
||||
    def read(self, sub: int) -> bytes:
        """
        Execute a two-step paged read and return the data payload bytes.

        Step 1: send probe frame (offset=0x00) → device sends a short ack
        Step 2: send data-request (offset=DATA_LEN) → device sends the data block

        The S3 probe response does NOT carry the data length — page_key is always
        0x0000 in observed frames. DATA_LENGTHS holds the known fixed lengths
        derived from BW capture analysis.

        Args:
            sub: Request SUB byte (e.g. SUB_FULL_CONFIG = 0x01).

        Returns:
            De-stuffed data payload bytes (payload[5:] of the response frame,
            with the checksum already stripped by the parser).

        Raises:
            TimeoutError: if a response does not arrive within the timeout.
            UnexpectedResponse: if the response SUB does not match.
            ProtocolError: if sub is not in DATA_LENGTHS (caller should add it).

        Note:
            Bad checksums are logged, not raised — see _validate_frame.
        """
        rsp_sub = _expected_rsp_sub(sub)

        # Step 1 — probe (offset = 0)
        log.debug("read SUB=0x%02X: probe", sub)
        self._send(build_bw_frame(sub, 0))
        _probe = self._recv_one(expected_sub=rsp_sub)  # ack; page_key always 0

        # Look up the hardcoded data length for this SUB
        if sub not in DATA_LENGTHS:
            raise ProtocolError(
                f"No known data length for SUB=0x{sub:02X}. "
                "Add it to DATA_LENGTHS in protocol.py."
            )
        length = DATA_LENGTHS[sub]
        log.debug("read SUB=0x%02X: data request offset=0x%02X", sub, length)

        if length == 0:
            log.warning("read SUB=0x%02X: DATA_LENGTHS entry is zero", sub)
            return b""

        # Step 2 — data-request (offset = length)
        self._send(build_bw_frame(sub, length))
        data_rsp = self._recv_one(expected_sub=rsp_sub)

        log.debug("read SUB=0x%02X: received %d data bytes", sub, len(data_rsp.data))
        return data_rsp.data

    def send_keepalive(self) -> None:
        """
        Send a single POLL_PROBE keepalive without waiting for a response.

        Blastware sends these every ~80 ms during idle. Useful if you need to
        hold the session open between real requests.
        """
        self._send(POLL_PROBE)

    # ── Internal helpers ──────────────────────────────────────────────────────

    def _send(self, frame: bytes) -> None:
        """Write a pre-built frame to the transport."""
        log.debug("TX %d bytes: %s", len(frame), frame.hex())
        self._transport.write(frame)

    def _recv_one(
        self,
        expected_sub: Optional[int] = None,
        timeout: Optional[float] = None,
    ) -> S3Frame:
        """
        Read bytes from the transport until one complete S3 frame is parsed.

        Feeds bytes through the streaming S3FrameParser. Keeps reading until
        a frame arrives or the deadline expires.

        Args:
            expected_sub: If provided, raises UnexpectedResponse if the
                received frame's SUB doesn't match.
            timeout: Seconds to wait. Defaults to self._recv_timeout.

        Returns:
            The first complete S3Frame received.

        Raises:
            TimeoutError: if no frame arrives within the timeout.
            UnexpectedResponse: if expected_sub is set and doesn't match.

        Note:
            Checksum mismatches are logged, not raised — see _validate_frame.
        """
        deadline = time.monotonic() + (timeout or self._recv_timeout)
        self._parser.reset()

        while time.monotonic() < deadline:
            chunk = self._transport.read(256)
            if chunk:
                log.debug("RX %d bytes: %s", len(chunk), chunk.hex())
                frames = self._parser.feed(chunk)
                if frames:
                    frame = frames[0]
                    self._validate_frame(frame, expected_sub)
                    return frame
            else:
                time.sleep(0.005)

        raise TimeoutError(
            f"No S3 frame received within {timeout or self._recv_timeout:.1f}s"
            + (f" (expected SUB 0x{expected_sub:02X})" if expected_sub is not None else "")
        )

    @staticmethod
    def _validate_frame(frame: S3Frame, expected_sub: Optional[int]) -> None:
        """Validate SUB; log but do not raise on bad checksum.

        S3 response checksums frequently fail SUM8 validation because inner-frame
        delimiter bytes get captured in the checksum position. The original
        s3_parser.py deliberately never validates S3 checksums for exactly this
        reason. We log a warning and continue.
        """
        if not frame.checksum_valid:
            # Informational only — see the docstring above.
            log.debug("S3 frame SUB=0x%02X: checksum mismatch (ignoring)", frame.sub)
        if expected_sub is not None and frame.sub != expected_sub:
            raise UnexpectedResponse(
                f"Expected SUB=0x{expected_sub:02X}, got 0x{frame.sub:02X}"
            )

    def _drain_boot_string(self, drain_ms: int = 200) -> None:
        """
        Read and discard any boot-string bytes ("Operating System") the device
        may send before entering binary protocol mode.

        We simply read with a short timeout and throw the bytes away. The
        S3FrameParser's IDLE state already handles non-frame bytes gracefully,
        but it's cleaner to drain them explicitly before the first real frame.
        """
        deadline = time.monotonic() + (drain_ms / 1000)
        discarded = 0
        while time.monotonic() < deadline:
            chunk = self._transport.read(256)
            if chunk:
                discarded += len(chunk)
            else:
                time.sleep(0.005)
        if discarded:
            log.debug("drain_boot_string: discarded %d bytes", discarded)
420
minimateplus/transport.py
Normal file
@@ -0,0 +1,420 @@
"""
transport.py — Serial and TCP transport layer for the MiniMate Plus protocol.

Provides a thin I/O abstraction so that protocol.py never imports pyserial or
socket directly. Two concrete implementations:

    SerialTransport — direct RS-232 cable connection (pyserial)
    TcpTransport    — TCP socket to a modem or ACH relay (stdlib socket)

The MiniMate Plus protocol bytes are identical over both transports. TCP is used
when field units call home via the ACH (Auto Call Home) server, or when SFM
"calls up" a unit by connecting to the modem's IP address directly.

Field hardware: Sierra Wireless RV55 / RX55 (4G LTE) cellular modems, replacing
the older 3G-only Raven X (now decommissioned). All run ALEOS firmware with an
ACEmanager web UI. The serial port must be configured 38400,8N1, no flow
control, Data Forwarding Timeout = 1 s.

Typical usage:
    from minimateplus.transport import SerialTransport, TcpTransport

    # Direct serial connection
    with SerialTransport("COM5") as t:
        t.write(frame_bytes)

    # Modem / ACH TCP connection (Blastware port 12345)
    with TcpTransport("192.168.1.50", 12345) as t:
        t.write(frame_bytes)
"""

from __future__ import annotations

import socket
import time
from abc import ABC, abstractmethod
from typing import Optional

# pyserial is the only non-stdlib dependency in this project.
# Import lazily so unit tests that mock the transport can run without it.
try:
    import serial  # type: ignore
except ImportError:  # pragma: no cover
    serial = None  # type: ignore


# ── Abstract base ─────────────────────────────────────────────────────────────

class BaseTransport(ABC):
    """Common interface for all transport implementations."""

    @abstractmethod
    def connect(self) -> None:
        """Open the underlying connection."""

    @abstractmethod
    def disconnect(self) -> None:
        """Close the underlying connection."""

    @property
    @abstractmethod
    def is_connected(self) -> bool:
        """True while the connection is open."""

    @abstractmethod
    def write(self, data: bytes) -> None:
        """Write *data* bytes to the wire."""

    @abstractmethod
    def read(self, n: int) -> bytes:
        """
        Read up to *n* bytes. Returns immediately with whatever is available
        (may return fewer than *n* bytes, or b"" if nothing is ready).
        """

    # ── Context manager ───────────────────────────────────────────────────────

    def __enter__(self) -> "BaseTransport":
        self.connect()
        return self

    def __exit__(self, *_) -> None:
        self.disconnect()

    # ── Higher-level read helpers ─────────────────────────────────────────────

    def read_until_idle(
        self,
        timeout: float = 2.0,
        idle_gap: float = 0.05,
        chunk: int = 256,
    ) -> bytes:
        """
        Read bytes until the line goes quiet.

        Keeps reading in *chunk*-sized bursts. Returns when either:
          - *timeout* seconds have elapsed since the read started, or
          - *idle_gap* seconds pass with no new bytes (line went quiet).

        This mirrors how Blastware behaves: it waits for the seismograph to
        stop transmitting rather than counting bytes.

        Args:
            timeout: Hard deadline (seconds) from the moment the read starts.
            idle_gap: How long to wait after the last byte before declaring done.
            chunk: How many bytes to request per low-level read() call.

        Returns:
            All bytes received as a single bytes object (may be b"" if nothing
            arrived within *timeout*).
        """
        buf = bytearray()
        deadline = time.monotonic() + timeout
        last_rx = None

        while time.monotonic() < deadline:
            got = self.read(chunk)
            if got:
                buf.extend(got)
                last_rx = time.monotonic()
            else:
                # Nothing ready — check the idle gap
                if last_rx is not None and (time.monotonic() - last_rx) >= idle_gap:
                    break
                time.sleep(0.005)

        return bytes(buf)

    def read_exact(self, n: int, timeout: float = 2.0) -> bytes:
        """
        Read exactly *n* bytes or raise TimeoutError.

        Useful when the caller already knows the expected response length
        (e.g. fixed-size ACK packets).
        """
        buf = bytearray()
        deadline = time.monotonic() + timeout
        while len(buf) < n:
            if time.monotonic() >= deadline:
                raise TimeoutError(
                    f"read_exact: wanted {n} bytes, got {len(buf)} "
                    f"after {timeout:.1f}s"
                )
            got = self.read(n - len(buf))
            if got:
                buf.extend(got)
            else:
                time.sleep(0.005)
        return bytes(buf)


# ── Serial transport ──────────────────────────────────────────────────────────

# Default baud rate, confirmed from Blastware / MiniMate Plus documentation.
DEFAULT_BAUD = 38_400

# pyserial port config matching the MiniMate Plus RS-232 spec:
# 8 data bits, no parity, 1 stop bit (8N1).
_SERIAL_BYTESIZE = 8    # serial.EIGHTBITS
_SERIAL_PARITY = "N"    # serial.PARITY_NONE
_SERIAL_STOPBITS = 1    # serial.STOPBITS_ONE


class SerialTransport(BaseTransport):
    """
    pyserial-backed transport for a direct RS-232 cable connection.

    The port is opened with a very short read timeout (10 ms) so that
    read() returns quickly and the caller can implement its own framing /
    timeout logic without blocking the whole process.

    Args:
        port: COM port name (e.g. "COM5" on Windows, "/dev/ttyUSB0" on Linux).
        baud: Baud rate (default 38400).
        rts_cts: Enable RTS/CTS hardware flow control (default False — the
            MiniMate typically uses no flow control).
    """

    # Internal read timeout (seconds). Short so read() is non-blocking in practice.
    _READ_TIMEOUT = 0.01

    def __init__(
        self,
        port: str,
        baud: int = DEFAULT_BAUD,
        rts_cts: bool = False,
    ) -> None:
        if serial is None:
            raise ImportError(
                "pyserial is required for SerialTransport. "
                "Install it with: pip install pyserial"
            )
        self.port = port
        self.baud = baud
        self.rts_cts = rts_cts
        self._ser: Optional[serial.Serial] = None

    # ── BaseTransport interface ───────────────────────────────────────────────

    def connect(self) -> None:
        """Open the serial port. Raises serial.SerialException on failure."""
        if self._ser and self._ser.is_open:
            return  # Already open — idempotent
        self._ser = serial.Serial(
            port=self.port,
            baudrate=self.baud,
            bytesize=_SERIAL_BYTESIZE,
            parity=_SERIAL_PARITY,
            stopbits=_SERIAL_STOPBITS,
            timeout=self._READ_TIMEOUT,
            rtscts=self.rts_cts,
            xonxoff=False,
            dsrdtr=False,
        )
        # Flush any stale bytes left in device / OS buffers from a previous session
        self._ser.reset_input_buffer()
        self._ser.reset_output_buffer()

    def disconnect(self) -> None:
        """Close the serial port. Safe to call even if already closed."""
        if self._ser:
            try:
                self._ser.close()
            except Exception:
                pass
            self._ser = None

    @property
    def is_connected(self) -> bool:
        return bool(self._ser and self._ser.is_open)

    def write(self, data: bytes) -> None:
        """
        Write *data* to the serial port.

        Raises:
            RuntimeError: if not connected.
            serial.SerialException: on I/O error.
        """
        if not self.is_connected:
            raise RuntimeError("SerialTransport.write: not connected")
        self._ser.write(data)  # type: ignore[union-attr]
        self._ser.flush()  # type: ignore[union-attr]

    def read(self, n: int) -> bytes:
        """
        Read up to *n* bytes from the serial port.

        Returns b"" immediately if no data is available (non-blocking in
        practice thanks to the 10 ms read timeout).

        Raises:
            RuntimeError: if not connected.
        """
        if not self.is_connected:
            raise RuntimeError("SerialTransport.read: not connected")
        return self._ser.read(n)  # type: ignore[union-attr]

    # ── Extras ────────────────────────────────────────────────────────────────

    def flush_input(self) -> None:
        """Discard any unread bytes in the OS receive buffer."""
        if self.is_connected:
            self._ser.reset_input_buffer()  # type: ignore[union-attr]

    def __repr__(self) -> str:
        state = "open" if self.is_connected else "closed"
        return f"SerialTransport({self.port!r}, baud={self.baud}, {state})"


# ── TCP transport ─────────────────────────────────────────────────────────────

# Default TCP port for Blastware modem communications / ACH relay.
# Confirmed from field setup: Blastware → Communication Setup → TCP/IP uses 12345.
DEFAULT_TCP_PORT = 12345


class TcpTransport(BaseTransport):
    """
    TCP socket transport for MiniMate Plus units in the field.

    The protocol bytes over TCP are identical to RS-232 — TCP is simply a
    different physical layer. The modem (Sierra Wireless RV55 / RX55, or the
    older Raven X) bridges the unit's RS-232 serial port to a TCP socket
    transparently. No application-layer handshake or framing is added.

    Two usage scenarios:

        "Call up" (outbound): SFM connects to the unit's modem IP directly.
            TcpTransport(host="203.0.113.5", port=12345)

        "Call home" / ACH relay: The unit has already dialled in to the office
            ACH server, which bridged the modem to a TCP socket. In this case
            the host/port identifies the relay's listening socket, not the modem.
            (ACH inbound mode is handled by a separate AchServer — not this class.)

    IMPORTANT — modem data forwarding delay:
        Sierra Wireless (and Raven) modems buffer RS-232 bytes for up to 1 second
        before forwarding them as a TCP segment ("Data Forwarding Timeout" in
        ACEmanager). read_until_idle() is overridden to use idle_gap=1.5 s rather
        than the serial default of 0.05 s — without this, the parser would declare
        a frame complete mid-stream during the modem's buffering pause.

    Args:
        host: IP address or hostname of the modem / ACH relay.
        port: TCP port number (default 12345).
        connect_timeout: Seconds to wait for the TCP handshake (default 10.0).
    """

    # Internal recv timeout — short so read() returns promptly if no data.
    _RECV_TIMEOUT = 0.01

    def __init__(
        self,
        host: str,
        port: int = DEFAULT_TCP_PORT,
        connect_timeout: float = 10.0,
    ) -> None:
        self.host = host
        self.port = port
        self.connect_timeout = connect_timeout
        self._sock: Optional[socket.socket] = None

    # ── BaseTransport interface ───────────────────────────────────────────────

    def connect(self) -> None:
        """
        Open a TCP connection to host:port.

        Idempotent — does nothing if already connected.

        Raises:
            OSError / socket.timeout: if the connection cannot be established.
        """
        if self._sock is not None:
            return  # Already connected — idempotent
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(self.connect_timeout)
        sock.connect((self.host, self.port))
        # Switch to a short timeout so read() is non-blocking in practice
        sock.settimeout(self._RECV_TIMEOUT)
        self._sock = sock

    def disconnect(self) -> None:
        """Close the TCP socket. Safe to call even if already closed."""
        if self._sock:
            try:
                self._sock.shutdown(socket.SHUT_RDWR)
            except OSError:
                pass
            try:
                self._sock.close()
            except OSError:
                pass
            self._sock = None

    @property
    def is_connected(self) -> bool:
        return self._sock is not None

    def write(self, data: bytes) -> None:
        """
        Send all bytes to the peer.

        Raises:
            RuntimeError: if not connected.
            OSError: on network I/O error.
        """
        if not self.is_connected:
            raise RuntimeError("TcpTransport.write: not connected")
        self._sock.sendall(data)  # type: ignore[union-attr]

    def read(self, n: int) -> bytes:
        """
        Read up to *n* bytes from the socket.

        Returns b"" immediately if no data is available (non-blocking in
        practice thanks to the short socket timeout).

        Raises:
            RuntimeError: if not connected.
        """
        if not self.is_connected:
            raise RuntimeError("TcpTransport.read: not connected")
        try:
            return self._sock.recv(n)  # type: ignore[union-attr]
        except socket.timeout:
            return b""

    def read_until_idle(
        self,
        timeout: float = 2.0,
        idle_gap: float = 1.5,
        chunk: int = 256,
    ) -> bytes:
        """
        TCP-aware version of read_until_idle.

        Overrides the BaseTransport default to use a much longer idle_gap (1.5 s
        vs 0.05 s for serial). This is necessary because the Raven modem (and
        similar cellular modems) buffer serial-port bytes for up to 1 second
        before forwarding them over TCP ("Data Forwarding Timeout" setting).

        If read_until_idle returned after a 50 ms quiet period, it would trigger
        mid-frame while the modem is still accumulating bytes — causing frame
        parse failures on every call.

        Args:
            timeout: Hard deadline from the start of the read (default 2.0 s —
                callers typically pass a longer value for S3 frames).
            idle_gap: Quiet-line threshold (default 1.5 s to survive modem
                buffering). Pass a smaller value only if you are
                connecting directly to a unit's Ethernet port with no
                modem buffering in the path.
            chunk: Bytes per low-level recv() call.
        """
        return super().read_until_idle(timeout=timeout, idle_gap=idle_gap, chunk=chunk)

    def __repr__(self) -> str:
        state = "connected" if self.is_connected else "disconnected"
        return f"TcpTransport({self.host!r}, port={self.port}, {state})"
125
parsers/README_s3_parser.md
Normal file
@@ -0,0 +1,125 @@
# s3_parser.py

## Purpose

`s3_parser.py` extracts complete DLE-framed packets from raw serial
capture files produced by the `s3_bridge` logger.

It operates strictly at the **framing layer**. It does **not** decode
higher-level protocol structures.

This parser is designed specifically for Instantel / Series 3-style
serial traffic using:

- `DLE STX` (`0x10 0x02`) to start a frame
- `DLE ETX` (`0x10 0x03`) to end a frame
- DLE byte stuffing (`0x10 0x10` → literal `0x10`)
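The three rules above are enough to write a toy extractor. This sketch is an illustrative re-implementation of the framing state machine, not the actual `s3_parser.py` code (the `extract_frames` name is made up here):

``` python
# Toy DLE frame extractor — illustrates the framing rules only.
DLE, STX, ETX = 0x10, 0x02, 0x03

def extract_frames(raw: bytes) -> list[bytes]:
    frames, payload, in_frame, i = [], bytearray(), False, 0
    while i < len(raw) - 1:
        a, b = raw[i], raw[i + 1]
        if not in_frame and a == DLE and b == STX:
            in_frame, payload = True, bytearray()   # DLE STX: frame start
            i += 2
        elif in_frame and a == DLE and b == DLE:
            payload.append(DLE)                     # DLE DLE → literal 0x10
            i += 2
        elif in_frame and a == DLE and b == ETX:
            frames.append(bytes(payload))           # DLE ETX: frame end
            in_frame = False
            i += 2
        else:
            if in_frame:
                payload.append(a)                   # ordinary payload byte
            i += 1
    return frames  # an incomplete trailing frame is silently dropped
```

A stuffed literal `0x10` round-trips correctly: `extract_frames(b"\x10\x02\x01\x10\x10\x02\x10\x03")` yields the single payload `b"\x01\x10\x02"`.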
------------------------------------------------------------------------

## Design Philosophy

This parser:

- Uses a deterministic state machine (no regex, no global scanning).
- Assumes raw wire framing is preserved (`DLE+ETX` is present).
- Does **not** attempt auto-detection of framing style.
- Extracts only complete `STX → ETX` frame pairs.
- Safely ignores incomplete trailing frames at EOF.

Separation of concerns is intentional:

- **Parser = framing extraction**
- **Decoder = protocol interpretation (future layer)**

Do not add message-level logic here.

------------------------------------------------------------------------

## Input

Raw binary `.bin` files captured from:

- `--raw-bw` tap (Blastware → S3)
- `--raw-s3` tap (S3 → Blastware)

These must preserve raw serial bytes.

------------------------------------------------------------------------

## Usage

Basic frame extraction:

``` bash
python s3_parser.py raw_s3.bin --trailer-len 2
```

Options:

- `--trailer-len N`
  - Number of bytes to capture after `DLE ETX`
  - Often `2` (CRC16)
- `--crc`
  - Attempts CRC16 validation against the first 2 trailer bytes
  - Tries several common CRC16 variants
- `--crc-endian {little|big}`
  - Endianness for interpreting trailer bytes (default: little)
- `--out frames.jsonl`
  - Writes full JSONL output instead of printing a summary

------------------------------------------------------------------------

## Output Format

Each extracted frame produces:

``` json
{
  "index": 0,
  "start_offset": 20,
  "end_offset": 4033,
  "payload_len": 3922,
  "payload_hex": "...",
  "trailer_hex": "000f",
  "crc_match": null
}
```

Where:

- `payload_hex` = unescaped payload bytes (DLE stuffing removed)
- `trailer_hex` = bytes immediately following `DLE ETX`
- `crc_match` = matched CRC algorithm (if `--crc` enabled)

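Downstream tooling can consume the JSONL one record per line. A minimal loader sketch (the `load_frames` helper and the example file path are illustrative, not part of `s3_parser.py`):

``` python
import json

def load_frames(path: str) -> list[dict]:
    """Load extracted frames and restore the raw payload bytes."""
    frames = []
    with open(path, "r", encoding="utf-8") as fh:
        for line in fh:
            if not line.strip():
                continue  # skip blank lines
            rec = json.loads(line)
            # payload_hex is the de-stuffed payload; decode back to bytes
            rec["payload"] = bytes.fromhex(rec["payload_hex"])
            frames.append(rec)
    return frames

# e.g. frames = load_frames("bw_frames.jsonl")
```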
------------------------------------------------------------------------

## Known Behavior

- Frames that start but never receive a matching `DLE ETX` before EOF
  are discarded.
- Embedded `0x10 0x02` inside a payload does not trigger a new frame
  (correct behavior).
- Embedded `0x10 0x10` is correctly unescaped to a single `0x10`.

------------------------------------------------------------------------

## What This Parser Does NOT Do

- It does not decode Instantel message structure.
- It does not interpret block IDs or message types.
- It does not validate protocol-level fields.
- It does not reconstruct multi-frame logical responses.

That is the responsibility of a higher-level decoder.

------------------------------------------------------------------------

## Status

Framing layer verified against:

- `raw_bw.bin` (command/control direction)
- `raw_s3.bin` (device response direction)

State machine validated via start/end instrumentation.
98
parsers/bw_frames.jsonl
Normal file
@@ -0,0 +1,98 @@
{"index": 0, "start_offset": 0, "end_offset": 21, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 1, "start_offset": 21, "end_offset": 42, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 2, "start_offset": 42, "end_offset": 63, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 3, "start_offset": 63, "end_offset": 84, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 4, "start_offset": 84, "end_offset": 105, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 5, "start_offset": 105, "end_offset": 126, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 6, "start_offset": 126, "end_offset": 147, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 7, "start_offset": 147, "end_offset": 168, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 8, "start_offset": 168, "end_offset": 189, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 9, "start_offset": 189, "end_offset": 210, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 10, "start_offset": 210, "end_offset": 231, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 11, "start_offset": 231, "end_offset": 252, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 12, "start_offset": 252, "end_offset": 273, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 13, "start_offset": 273, "end_offset": 294, "payload_len": 17, "payload_hex": "1000150000000000000000000000000025", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 14, "start_offset": 294, "end_offset": 315, "payload_len": 17, "payload_hex": "10001500000a000000000000000000002f", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 15, "start_offset": 315, "end_offset": 427, "payload_len": 108, "payload_hex": "10006800005a00000000000000000000005809000000010107cb00061e00010107cb00140000000000173b00000000000000000000000000000100000000000100000000000000010001000000000000000000000000000000000064000000000000001effdc0000100200c8", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 16, "start_offset": 427, "end_offset": 448, "payload_len": 17, "payload_hex": "1000730000000000000000000000000083", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 17, "start_offset": 448, "end_offset": 1497, "payload_len": 1045, "payload_hex": "1000710010040000000000000000000000082a6400001004100400003c0000be800000000040400000001003000f000000073dbb457a3db956e1000100015374616e64617264205265636f7264696e672053657475702e7365740000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000050726f6a6563743a0000000000000000000000000000544553542000000000000000000000000000000000000000000000000000000000000000000000000000436c69656e743a000000000000000000000000000000436c6175646520746573743200000000000000000000000000000000000000000000000000000000000055736572204e616d653a00000000000000000000000054657272612d4d656368616e69637320496e632e202d20422e204861727269736f6e000000000000000053656973204c6f633a000000000000000000000000004c6f636174696f6e202331202d20427269616e7320486f75736500000000000000000000000000000000457874656e646564204e6f74657300000000000000000a000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000007", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 18, "start_offset": 1497, "end_offset": 2574, "payload_len": 1073, "payload_hex": "1000710010040000001004000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000015472616e000000010050000f0028001510021003011004001003000040c697fd00003f19999a696e2e00400000002f730000000156657274000000010050000f0028001510021003011004001003000040c697fd00003f19999a696e2e00400000002f73000000014c6f6e67000000010050000f0028001510021003011004001003000040c697fd00003f19999a696e2e00400000002f73000000004d69634c000000100200c80032000a000a1002d501db000500003d38560800003c1374bc707369003cac0831284c29000010025472616e320000010050000f0028001510021003011004001003000040c697fd00003f000000696e2e00400000002f73000000100256657274320000010050000f0028001510021003011004001003000040c697fd00003f000000696e2e00400000002f7300000010024c6f6e67320000010050000f0028001510021003011004001003000040c697fd00003f000000696e2e00400000002f73000000004d69634c1002", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 19, "start_offset": 2574, "end_offset": 2641, "payload_len": 63, "payload_hex": "10007100002c00000800000000000000320000100200c80032000a000a1002d501db000500003d38560800003c23d70a707369003cac0831284c29007cea32", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 20, "start_offset": 2641, "end_offset": 2662, "payload_len": 17, "payload_hex": "1000720000000000000000000000000082", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 21, "start_offset": 2662, "end_offset": 2711, "payload_len": 45, "payload_hex": "10008200001c00000000000000000000001ad5000001080affffffffffffffffffffffffffffffffffff00009e", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 22, "start_offset": 2711, "end_offset": 2732, "payload_len": 17, "payload_hex": "1000830000000000000000000000000093", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 23, "start_offset": 2732, "end_offset": 2957, "payload_len": 221, "payload_hex": "1000690000ca0000000000000000000000c8080000010001000100010001000100010010020001001e0010020001000a000a4576656e742053756d6d617279205265706f7274000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000002580000801018c76af", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 24, "start_offset": 2957, "end_offset": 2978, "payload_len": 17, "payload_hex": "1000740000000000000000000000000084", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 25, "start_offset": 2978, "end_offset": 2999, "payload_len": 17, "payload_hex": "1000720000000000000000000000000082", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 26, "start_offset": 2999, "end_offset": 3020, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 27, "start_offset": 3020, "end_offset": 3041, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 28, "start_offset": 3041, "end_offset": 3062, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 29, "start_offset": 3062, "end_offset": 3083, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 30, "start_offset": 3083, "end_offset": 3104, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 31, "start_offset": 3104, "end_offset": 3125, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 32, "start_offset": 3125, "end_offset": 3146, "payload_len": 17, "payload_hex": "1000150000000000000000000000000025", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 33, "start_offset": 3146, "end_offset": 3167, "payload_len": 17, "payload_hex": "10001500000a000000000000000000002f", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 34, "start_offset": 3167, "end_offset": 3188, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 35, "start_offset": 3188, "end_offset": 3209, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 36, "start_offset": 3209, "end_offset": 3230, "payload_len": 17, "payload_hex": "1000080000000000000000000000000018", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 37, "start_offset": 3230, "end_offset": 3251, "payload_len": 17, "payload_hex": "1000080000580000000000000000000070", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 38, "start_offset": 3251, "end_offset": 3272, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 39, "start_offset": 3272, "end_offset": 3293, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 40, "start_offset": 3293, "end_offset": 3314, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 41, "start_offset": 3314, "end_offset": 3335, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 42, "start_offset": 3335, "end_offset": 3356, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 43, "start_offset": 3356, "end_offset": 3377, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 44, "start_offset": 3377, "end_offset": 3398, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 45, "start_offset": 3398, "end_offset": 3419, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 46, "start_offset": 3419, "end_offset": 3440, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 47, "start_offset": 3440, "end_offset": 3461, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 48, "start_offset": 3461, "end_offset": 3482, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 49, "start_offset": 3482, "end_offset": 3503, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 50, "start_offset": 3503, "end_offset": 3524, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 51, "start_offset": 3524, "end_offset": 3545, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 52, "start_offset": 3545, "end_offset": 3566, "payload_len": 17, "payload_hex": "1000150000000000000000000000000025", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 53, "start_offset": 3566, "end_offset": 3587, "payload_len": 17, "payload_hex": "10001500000a000000000000000000002f", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 54, "start_offset": 3587, "end_offset": 3608, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 55, "start_offset": 3608, "end_offset": 3629, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 56, "start_offset": 3629, "end_offset": 3650, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 57, "start_offset": 3650, "end_offset": 3671, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 58, "start_offset": 3671, "end_offset": 3692, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 59, "start_offset": 3692, "end_offset": 3713, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 60, "start_offset": 3713, "end_offset": 3734, "payload_len": 17, "payload_hex": "1000150000000000000000000000000025", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 61, "start_offset": 3734, "end_offset": 3755, "payload_len": 17, "payload_hex": "10001500000a000000000000000000002f", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 62, "start_offset": 3755, "end_offset": 3776, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 63, "start_offset": 3776, "end_offset": 3797, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 64, "start_offset": 3797, "end_offset": 3818, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 65, "start_offset": 3818, "end_offset": 3839, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 66, "start_offset": 3839, "end_offset": 3860, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 67, "start_offset": 3860, "end_offset": 3881, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 68, "start_offset": 3881, "end_offset": 3902, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 69, "start_offset": 3902, "end_offset": 3923, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 70, "start_offset": 3923, "end_offset": 3944, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 71, "start_offset": 3944, "end_offset": 3965, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 72, "start_offset": 3965, "end_offset": 3986, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 73, "start_offset": 3986, "end_offset": 4007, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 74, "start_offset": 4007, "end_offset": 4028, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 75, "start_offset": 4028, "end_offset": 4049, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 76, "start_offset": 4049, "end_offset": 4070, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 77, "start_offset": 4070, "end_offset": 4091, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 78, "start_offset": 4091, "end_offset": 4112, "payload_len": 17, "payload_hex": "10005b000000000000000000000000006b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 79, "start_offset": 4112, "end_offset": 4133, "payload_len": 17, "payload_hex": "10005b000030000000000000000000009b", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 80, "start_offset": 4133, "end_offset": 4154, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 81, "start_offset": 4154, "end_offset": 4175, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 82, "start_offset": 4175, "end_offset": 4196, "payload_len": 17, "payload_hex": "10002e000000000000000000000000003e", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 83, "start_offset": 4196, "end_offset": 4217, "payload_len": 17, "payload_hex": "10002e00001a0000000000000000000058", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 84, "start_offset": 4217, "end_offset": 4238, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 85, "start_offset": 4238, "end_offset": 4259, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 86, "start_offset": 4259, "end_offset": 4280, "payload_len": 17, "payload_hex": "10001a000000000000000000006400008e", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 87, "start_offset": 4280, "end_offset": 4302, "payload_len": 18, "payload_hex": "10001a001004000000000000000064000092", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 88, "start_offset": 4302, "end_offset": 4325, "payload_len": 19, "payload_hex": "10001a00100400000010040000000064000096", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 89, "start_offset": 4325, "end_offset": 4346, "payload_len": 17, "payload_hex": "10001a00002a00000800000000640000c0", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 90, "start_offset": 4346, "end_offset": 4367, "payload_len": 17, "payload_hex": "1000090000000000000000000000000019", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 91, "start_offset": 4367, "end_offset": 4388, "payload_len": 17, "payload_hex": "1000090000ca00000000000000000000e3", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 92, "start_offset": 4388, "end_offset": 4409, "payload_len": 17, "payload_hex": "1000080000000000000000000000000018", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 93, "start_offset": 4409, "end_offset": 4430, "payload_len": 17, "payload_hex": "1000080000580000000000000000000070", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 94, "start_offset": 4430, "end_offset": 4451, "payload_len": 17, "payload_hex": "1000010000000000000000000000000011", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 95, "start_offset": 4451, "end_offset": 4472, "payload_len": 17, "payload_hex": "10000100009800000000000000000000a9", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 96, "start_offset": 4472, "end_offset": 4493, "payload_len": 17, "payload_hex": "1000080000000000000000000000000018", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
{"index": 97, "start_offset": 4493, "end_offset": 4514, "payload_len": 17, "payload_hex": "1000080000580000000000000000000070", "trailer_hex": "", "checksum_valid": null, "checksum_type": null, "checksum_hex": null}
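A pattern worth noting in the short 17-byte records above: the final payload byte always equals the 8-bit sum of every byte before it (e.g. `10 00 72 … 82`, where `0x10 + 0x72 = 0x82`). The records themselves carry `checksum_valid: null`, so this is an observation about the captures, not confirmed Instantel documentation. A minimal stdlib-only sketch that checks the hypothesis against frames copied from the records above:

```python
def additive_checksum_ok(payload_hex: str) -> bool:
    """True if the last byte equals the 8-bit sum of all preceding bytes."""
    raw = bytes.fromhex(payload_hex)
    return sum(raw[:-1]) & 0xFF == raw[-1]

# 17-byte frames taken verbatim from the capture records above
samples = [
    "1000720000000000000000000000000082",
    "10005b000030000000000000000000009b",
    "10001a00002a00000800000000640000c0",
]
print([additive_checksum_ok(s) for s in samples])  # → [True, True, True]
```

If the rule holds across more of the corpus, the analyzer's `checksum_type` field could be populated with this additive scheme instead of being left null.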
337
parsers/frame_db.py
Normal file
@@ -0,0 +1,337 @@
#!/usr/bin/env python3
"""
frame_db.py — SQLite frame database for Instantel protocol captures.

Schema:
    captures    — one row per ingested capture pair (deduped by SHA256)
    frames      — one row per parsed frame
    byte_values — one row per (frame, offset, value) for fast indexed queries

Usage:
    db = FrameDB()        # opens default DB at ~/.seismo_lab/frames.db
    db = FrameDB(path)    # custom path
    cap_id = db.ingest(sessions, s3_path, bw_path)
    rows = db.query_frames(sub=0xF7, direction="S3")
    rows = db.query_by_byte(offset=85, value=0x0A)
"""

from __future__ import annotations

import hashlib
import os
import sqlite3
import struct
from pathlib import Path
from typing import Optional


# ─────────────────────────────────────────────────────────────────────────────
# DB location
# ─────────────────────────────────────────────────────────────────────────────

DEFAULT_DB_DIR = Path.home() / ".seismo_lab"
DEFAULT_DB_PATH = DEFAULT_DB_DIR / "frames.db"


# ─────────────────────────────────────────────────────────────────────────────
# Schema
# ─────────────────────────────────────────────────────────────────────────────

_DDL = """
PRAGMA journal_mode=WAL;
PRAGMA foreign_keys=ON;

CREATE TABLE IF NOT EXISTS captures (
    id           INTEGER PRIMARY KEY AUTOINCREMENT,
    timestamp    TEXT NOT NULL,          -- ISO-8601 ingest time
    s3_path      TEXT,
    bw_path      TEXT,
    capture_hash TEXT NOT NULL UNIQUE,   -- SHA256 of s3_blob+bw_blob
    notes        TEXT DEFAULT ''
);

CREATE TABLE IF NOT EXISTS frames (
    id          INTEGER PRIMARY KEY AUTOINCREMENT,
    capture_id  INTEGER NOT NULL REFERENCES captures(id) ON DELETE CASCADE,
    session_idx INTEGER NOT NULL,
    direction   TEXT NOT NULL,           -- 'BW' or 'S3'
    sub         INTEGER,                 -- NULL if malformed
    page_key    INTEGER,
    sub_name    TEXT,
    payload     BLOB NOT NULL,
    payload_len INTEGER NOT NULL,
    checksum_ok INTEGER                  -- 1/0/NULL
);

CREATE INDEX IF NOT EXISTS idx_frames_capture  ON frames(capture_id);
CREATE INDEX IF NOT EXISTS idx_frames_sub      ON frames(sub);
CREATE INDEX IF NOT EXISTS idx_frames_page_key ON frames(page_key);
CREATE INDEX IF NOT EXISTS idx_frames_dir      ON frames(direction);

CREATE TABLE IF NOT EXISTS byte_values (
    id       INTEGER PRIMARY KEY AUTOINCREMENT,
    frame_id INTEGER NOT NULL REFERENCES frames(id) ON DELETE CASCADE,
    offset   INTEGER NOT NULL,
    value    INTEGER NOT NULL
);

CREATE INDEX IF NOT EXISTS idx_bv_frame   ON byte_values(frame_id);
CREATE INDEX IF NOT EXISTS idx_bv_offset  ON byte_values(offset);
CREATE INDEX IF NOT EXISTS idx_bv_value   ON byte_values(value);
CREATE INDEX IF NOT EXISTS idx_bv_off_val ON byte_values(offset, value);
"""


# ─────────────────────────────────────────────────────────────────────────────
# Helpers
# ─────────────────────────────────────────────────────────────────────────────

def _sha256_blobs(s3_blob: bytes, bw_blob: bytes) -> str:
    h = hashlib.sha256()
    h.update(s3_blob)
    h.update(bw_blob)
    return h.hexdigest()


def _interp_bytes(data: bytes, offset: int) -> dict:
    """
    Return multi-interpretation dict for 1–4 bytes starting at offset.
    Used in the GUI's byte interpretation panel.
    """
    result: dict = {}
    remaining = len(data) - offset
    if remaining <= 0:
        return result

    b1 = data[offset]
    result["uint8"] = b1
    result["int8"] = b1 if b1 < 128 else b1 - 256

    if remaining >= 2:
        u16be = struct.unpack_from(">H", data, offset)[0]
        u16le = struct.unpack_from("<H", data, offset)[0]
        result["uint16_be"] = u16be
        result["uint16_le"] = u16le

    if remaining >= 4:
        f32be = struct.unpack_from(">f", data, offset)[0]
        f32le = struct.unpack_from("<f", data, offset)[0]
        u32be = struct.unpack_from(">I", data, offset)[0]
        u32le = struct.unpack_from("<I", data, offset)[0]
        result["float32_be"] = round(f32be, 6)
        result["float32_le"] = round(f32le, 6)
        result["uint32_be"] = u32be
        result["uint32_le"] = u32le

    return result


# ─────────────────────────────────────────────────────────────────────────────
# FrameDB class
# ─────────────────────────────────────────────────────────────────────────────

class FrameDB:
    def __init__(self, path: Optional[Path] = None) -> None:
        if path is None:
            path = DEFAULT_DB_PATH
        path = Path(path)
        path.parent.mkdir(parents=True, exist_ok=True)
        self.path = path
        self._con = sqlite3.connect(str(path), check_same_thread=False)
        self._con.row_factory = sqlite3.Row
        self._init_schema()

    def _init_schema(self) -> None:
        self._con.executescript(_DDL)
        self._con.commit()

    def close(self) -> None:
        self._con.close()

    # ── Ingest ────────────────────────────────────────────────────────────

    def ingest(
        self,
        sessions: list,              # list[Session] from s3_analyzer
        s3_path: Optional[Path],
        bw_path: Optional[Path],
        notes: str = "",
    ) -> Optional[int]:
        """
        Ingest a list of sessions into the DB.
        Returns capture_id, or None if already ingested (duplicate hash).
        """
        import datetime

        s3_blob = s3_path.read_bytes() if s3_path and s3_path.exists() else b""
        bw_blob = bw_path.read_bytes() if bw_path and bw_path.exists() else b""
        cap_hash = _sha256_blobs(s3_blob, bw_blob)

        # Dedup check
        row = self._con.execute(
            "SELECT id FROM captures WHERE capture_hash=?", (cap_hash,)
        ).fetchone()
        if row:
            return None  # already in DB

        ts = datetime.datetime.now().isoformat(timespec="seconds")
        cur = self._con.execute(
            "INSERT INTO captures (timestamp, s3_path, bw_path, capture_hash, notes) "
            "VALUES (?, ?, ?, ?, ?)",
            (ts, str(s3_path) if s3_path else None,
             str(bw_path) if bw_path else None,
             cap_hash, notes)
        )
        cap_id = cur.lastrowid

        for sess in sessions:
            for af in sess.all_frames:
                frame_id = self._insert_frame(cap_id, af)
                self._insert_byte_values(frame_id, af.frame.payload)

        self._con.commit()
        return cap_id

    def _insert_frame(self, cap_id: int, af) -> int:
        """Insert one AnnotatedFrame; return its rowid."""
        sub = af.header.sub if af.header else None
        page_key = af.header.page_key if af.header else None
        chk_ok = None
        if af.frame.checksum_valid is True:
            chk_ok = 1
        elif af.frame.checksum_valid is False:
            chk_ok = 0

        cur = self._con.execute(
            "INSERT INTO frames "
            "(capture_id, session_idx, direction, sub, page_key, sub_name, payload, payload_len, checksum_ok) "
            "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
            (cap_id, af.session_idx, af.source,
             sub, page_key, af.sub_name,
             af.frame.payload, len(af.frame.payload), chk_ok)
        )
        return cur.lastrowid

    def _insert_byte_values(self, frame_id: int, payload: bytes) -> None:
        """Insert one row per byte in payload into byte_values."""
        rows = [(frame_id, i, b) for i, b in enumerate(payload)]
        self._con.executemany(
            "INSERT INTO byte_values (frame_id, offset, value) VALUES (?, ?, ?)",
            rows
        )

    # ── Queries ───────────────────────────────────────────────────────────

    def list_captures(self) -> list[sqlite3.Row]:
        return self._con.execute(
            "SELECT id, timestamp, s3_path, bw_path, notes, "
            "  (SELECT COUNT(*) FROM frames WHERE capture_id=captures.id) AS frame_count "
            "FROM captures ORDER BY id DESC"
        ).fetchall()

    def query_frames(
        self,
        capture_id: Optional[int] = None,
        direction: Optional[str] = None,   # "BW" or "S3"
        sub: Optional[int] = None,
        page_key: Optional[int] = None,
        limit: int = 500,
    ) -> list[sqlite3.Row]:
        """
        Query frames table with optional filters.
        Returns rows with: id, capture_id, session_idx, direction, sub, page_key,
        sub_name, payload, payload_len, checksum_ok
        """
        clauses = []
        params = []

        if capture_id is not None:
            clauses.append("capture_id=?"); params.append(capture_id)
        if direction is not None:
            clauses.append("direction=?"); params.append(direction)
        if sub is not None:
            clauses.append("sub=?"); params.append(sub)
        if page_key is not None:
            clauses.append("page_key=?"); params.append(page_key)

        where = ("WHERE " + " AND ".join(clauses)) if clauses else ""
        sql = f"SELECT * FROM frames {where} ORDER BY id LIMIT ?"
        params.append(limit)

        return self._con.execute(sql, params).fetchall()

    def query_by_byte(
        self,
        offset: int,
        value: Optional[int] = None,
        capture_id: Optional[int] = None,
        direction: Optional[str] = None,
        sub: Optional[int] = None,
        limit: int = 500,
    ) -> list[sqlite3.Row]:
        """
        Return frames that have a specific byte at a specific offset.
        Joins byte_values -> frames for indexed lookup.
        """
        clauses = ["bv.offset=?"]
        params = [offset]

        if value is not None:
            clauses.append("bv.value=?"); params.append(value)
        if capture_id is not None:
            clauses.append("f.capture_id=?"); params.append(capture_id)
        if direction is not None:
            clauses.append("f.direction=?"); params.append(direction)
        if sub is not None:
            clauses.append("f.sub=?"); params.append(sub)

        where = "WHERE " + " AND ".join(clauses)
        sql = (
            f"SELECT f.*, bv.offset AS q_offset, bv.value AS q_value "
            f"FROM byte_values bv "
            f"JOIN frames f ON f.id=bv.frame_id "
            f"{where} "
            f"ORDER BY f.id LIMIT ?"
        )
        params.append(limit)
        return self._con.execute(sql, params).fetchall()

    def get_frame_payload(self, frame_id: int) -> Optional[bytes]:
        row = self._con.execute(
            "SELECT payload FROM frames WHERE id=?", (frame_id,)
        ).fetchone()
        return bytes(row["payload"]) if row else None

    def get_distinct_subs(self, capture_id: Optional[int] = None) -> list[int]:
        if capture_id is not None:
            rows = self._con.execute(
                "SELECT DISTINCT sub FROM frames WHERE capture_id=? AND sub IS NOT NULL ORDER BY sub",
                (capture_id,)
            ).fetchall()
        else:
            rows = self._con.execute(
                "SELECT DISTINCT sub FROM frames WHERE sub IS NOT NULL ORDER BY sub"
            ).fetchall()
        return [r[0] for r in rows]

    def get_distinct_offsets(self, capture_id: Optional[int] = None) -> list[int]:
        if capture_id is not None:
            rows = self._con.execute(
                "SELECT DISTINCT bv.offset FROM byte_values bv "
                "JOIN frames f ON f.id=bv.frame_id WHERE f.capture_id=? ORDER BY bv.offset",
                (capture_id,)
            ).fetchall()
        else:
            rows = self._con.execute(
                "SELECT DISTINCT offset FROM byte_values ORDER BY offset"
            ).fetchall()
        return [r[0] for r in rows]

    def interpret_offset(self, payload: bytes, offset: int) -> dict:
        """Return multi-format interpretation of bytes starting at offset."""
        return _interp_bytes(payload, offset)

    def get_stats(self) -> dict:
        captures = self._con.execute("SELECT COUNT(*) FROM captures").fetchone()[0]
        frames = self._con.execute("SELECT COUNT(*) FROM frames").fetchone()[0]
        bv_rows = self._con.execute("SELECT COUNT(*) FROM byte_values").fetchone()[0]
        return {"captures": captures, "frames": frames, "byte_value_rows": bv_rows}
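The `byte_values` table trades storage (one row per payload byte) for indexed "which frames have value V at offset O?" lookups via `idx_bv_off_val`, which is exactly the join `query_by_byte` issues. A self-contained sketch of that pattern against an in-memory database — the table and column names mirror the listing above, but the two toy payloads are made up:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE frames (id INTEGER PRIMARY KEY, payload BLOB);
CREATE TABLE byte_values (frame_id INTEGER, offset INTEGER, value INTEGER);
CREATE INDEX idx_bv_off_val ON byte_values(offset, value);
""")

# Index every byte of each payload, the same way _insert_byte_values does
for fid, payload in [(1, bytes([0x10, 0x00, 0x72])), (2, bytes([0x10, 0x00, 0x5B]))]:
    con.execute("INSERT INTO frames VALUES (?, ?)", (fid, payload))
    con.executemany(
        "INSERT INTO byte_values VALUES (?, ?, ?)",
        [(fid, i, b) for i, b in enumerate(payload)],
    )

# "Which frames have byte 0x72 at offset 2?" — served by the (offset, value) index
rows = con.execute(
    "SELECT f.id FROM byte_values bv JOIN frames f ON f.id = bv.frame_id "
    "WHERE bv.offset = ? AND bv.value = ?",
    (2, 0x72),
).fetchall()
print([r[0] for r in rows])  # → [1]
```

The composite `(offset, value)` index lets SQLite answer the WHERE clause without scanning `byte_values`, which matters once captures contribute millions of byte rows.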
940
parsers/gui_analyzer.py
Normal file
@@ -0,0 +1,940 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
gui_analyzer.py — Tkinter GUI for s3_analyzer.
|
||||
|
||||
Layout:
|
||||
┌─────────────────────────────────────────────────────────┐
|
||||
│ [S3 file: ___________ Browse] [BW file: ___ Browse] │
|
||||
│ [Analyze] [Live mode toggle] Status: Idle │
|
||||
├──────────────────┬──────────────────────────────────────┤
|
||||
│ Session list │ Detail panel (tabs) │
|
||||
│ ─ Session 0 │ Inventory | Hex Dump | Diff │
|
||||
│ └ POLL (BW) │ │
|
||||
│ └ POLL_RESP │ (content of selected tab) │
|
||||
│ ─ Session 1 │ │
|
||||
│ └ ... │ │
|
||||
└──────────────────┴──────────────────────────────────────┘
|
||||
│ Status bar │
|
||||
└─────────────────────────────────────────────────────────┘
|
||||
"""
|
||||
|
||||
from __future__ import annotations

import queue
import sys
import threading
import time
import tkinter as tk
from pathlib import Path
from tkinter import filedialog, font, messagebox, ttk
from typing import Optional

sys.path.insert(0, str(Path(__file__).parent))
from s3_analyzer import (  # noqa: E402
    AnnotatedFrame,
    FrameDiff,
    Session,
    annotate_frames,
    diff_sessions,
    format_hex_dump,
    parse_bw,
    parse_s3,
    render_session_report,
    split_into_sessions,
    write_claude_export,
)
from frame_db import FrameDB, DEFAULT_DB_PATH  # noqa: E402


# ──────────────────────────────────────────────────────────────────────────────
# Colour palette (dark-ish terminal feel)
# ──────────────────────────────────────────────────────────────────────────────
BG = "#1e1e1e"
BG2 = "#252526"
BG3 = "#2d2d30"
FG = "#d4d4d4"
FG_DIM = "#6a6a6a"
ACCENT = "#569cd6"
ACCENT2 = "#4ec9b0"
RED = "#f44747"
YELLOW = "#dcdcaa"
GREEN = "#4caf50"
ORANGE = "#ce9178"

COL_BW = "#9cdcfe"    # BW frames
COL_S3 = "#4ec9b0"    # S3 frames
COL_DIFF = "#f44747"  # Changed bytes
COL_KNOW = "#4caf50"  # Known-field annotations
COL_HEAD = "#569cd6"  # Section headers

MONO = ("Consolas", 9)
MONO_SM = ("Consolas", 8)


# ──────────────────────────────────────────────────────────────────────────────
# State container
# ──────────────────────────────────────────────────────────────────────────────

class AnalyzerState:
    def __init__(self) -> None:
        self.sessions: list[Session] = []
        self.diffs: list[Optional[list[FrameDiff]]] = []  # diffs[i] = diff of session i vs i-1
        self.s3_path: Optional[Path] = None
        self.bw_path: Optional[Path] = None
        self.last_capture_id: Optional[int] = None


# ──────────────────────────────────────────────────────────────────────────────
# Main GUI
# ──────────────────────────────────────────────────────────────────────────────

class AnalyzerGUI(tk.Tk):
    def __init__(self) -> None:
        super().__init__()
        self.title("S3 Protocol Analyzer")
        self.configure(bg=BG)
        self.minsize(1050, 600)

        self.state = AnalyzerState()
        self._live_thread: Optional[threading.Thread] = None
        self._live_stop = threading.Event()
        self._live_q: queue.Queue[str] = queue.Queue()
        self._db = FrameDB()

        self._build_widgets()
        self._poll_live_queue()

    # ── widget construction ────────────────────────────────────────────────

    def _build_widgets(self) -> None:
        self._build_toolbar()
        self._build_panes()
        self._build_statusbar()

    def _build_toolbar(self) -> None:
        bar = tk.Frame(self, bg=BG2, pady=4)
        bar.pack(side=tk.TOP, fill=tk.X)

        pad = {"padx": 5, "pady": 2}

        # S3 file
        tk.Label(bar, text="S3 raw:", bg=BG2, fg=FG, font=MONO).pack(side=tk.LEFT, **pad)
        self.s3_var = tk.StringVar()
        tk.Entry(bar, textvariable=self.s3_var, width=28, bg=BG3, fg=FG,
                 insertbackground=FG, relief="flat", font=MONO).pack(side=tk.LEFT, **pad)
        tk.Button(bar, text="Browse", bg=BG3, fg=FG, relief="flat",
                  activebackground=ACCENT, cursor="hand2",
                  command=lambda: self._browse_file(self.s3_var, "raw_s3.bin")
                  ).pack(side=tk.LEFT, **pad)

        tk.Label(bar, text=" BW raw:", bg=BG2, fg=FG, font=MONO).pack(side=tk.LEFT, **pad)
        self.bw_var = tk.StringVar()
        tk.Entry(bar, textvariable=self.bw_var, width=28, bg=BG3, fg=FG,
                 insertbackground=FG, relief="flat", font=MONO).pack(side=tk.LEFT, **pad)
        tk.Button(bar, text="Browse", bg=BG3, fg=FG, relief="flat",
                  activebackground=ACCENT, cursor="hand2",
                  command=lambda: self._browse_file(self.bw_var, "raw_bw.bin")
                  ).pack(side=tk.LEFT, **pad)

        # Buttons
        tk.Frame(bar, bg=BG2, width=10).pack(side=tk.LEFT)
        self.analyze_btn = tk.Button(bar, text="Analyze", bg=ACCENT, fg="#ffffff",
                                     relief="flat", padx=10, cursor="hand2",
                                     font=("Consolas", 9, "bold"),
                                     command=self._run_analyze)
        self.analyze_btn.pack(side=tk.LEFT, **pad)

        self.live_btn = tk.Button(bar, text="Live: OFF", bg=BG3, fg=FG,
                                  relief="flat", padx=10, cursor="hand2",
                                  font=MONO, command=self._toggle_live)
        self.live_btn.pack(side=tk.LEFT, **pad)

        self.export_btn = tk.Button(bar, text="Export for Claude", bg=ORANGE, fg="#000000",
                                    relief="flat", padx=10, cursor="hand2",
                                    font=("Consolas", 9, "bold"),
                                    command=self._run_export, state="disabled")
        self.export_btn.pack(side=tk.LEFT, **pad)

        self.status_var = tk.StringVar(value="Idle")
        tk.Label(bar, textvariable=self.status_var, bg=BG2, fg=FG_DIM,
                 font=MONO, anchor="w").pack(side=tk.LEFT, padx=10)

    def _build_panes(self) -> None:
        pane = tk.PanedWindow(self, orient=tk.HORIZONTAL, bg=BG,
                              sashwidth=4, sashrelief="flat")
        pane.pack(fill=tk.BOTH, expand=True, padx=0, pady=0)

        # ── Left: session/frame tree ──────────────────────────────────────
        left = tk.Frame(pane, bg=BG2, width=260)
        pane.add(left, minsize=200)

        tk.Label(left, text="Sessions", bg=BG2, fg=ACCENT,
                 font=("Consolas", 9, "bold"), anchor="w", padx=6).pack(fill=tk.X)

        tree_frame = tk.Frame(left, bg=BG2)
        tree_frame.pack(fill=tk.BOTH, expand=True)

        style = ttk.Style()
        style.theme_use("clam")
        style.configure("Treeview",
                        background=BG2, foreground=FG, fieldbackground=BG2,
                        font=MONO_SM, rowheight=18, borderwidth=0)
        style.configure("Treeview.Heading",
                        background=BG3, foreground=ACCENT, font=MONO_SM)
        style.map("Treeview", background=[("selected", BG3)],
                  foreground=[("selected", "#ffffff")])

        self.tree = ttk.Treeview(tree_frame, columns=("info",), show="tree headings",
                                 selectmode="browse")
        self.tree.heading("#0", text="Frame")
        self.tree.heading("info", text="Info")
        self.tree.column("#0", width=160, stretch=True)
        self.tree.column("info", width=80, stretch=False)

        vsb = ttk.Scrollbar(tree_frame, orient="vertical", command=self.tree.yview)
        self.tree.configure(yscrollcommand=vsb.set)
        vsb.pack(side=tk.RIGHT, fill=tk.Y)
        self.tree.pack(fill=tk.BOTH, expand=True)

        self.tree.tag_configure("session", foreground=ACCENT, font=("Consolas", 9, "bold"))
        self.tree.tag_configure("bw_frame", foreground=COL_BW)
        self.tree.tag_configure("s3_frame", foreground=COL_S3)
        self.tree.tag_configure("bad_chk", foreground=RED)
        self.tree.tag_configure("malformed", foreground=RED)

        self.tree.bind("<<TreeviewSelect>>", self._on_tree_select)

        # ── Right: detail notebook ────────────────────────────────────────
        right = tk.Frame(pane, bg=BG)
        pane.add(right, minsize=600)

        style.configure("TNotebook", background=BG2, borderwidth=0)
        style.configure("TNotebook.Tab", background=BG3, foreground=FG,
                        font=MONO, padding=[8, 2])
        style.map("TNotebook.Tab", background=[("selected", BG)],
                  foreground=[("selected", ACCENT)])

        self.nb = ttk.Notebook(right)
        self.nb.pack(fill=tk.BOTH, expand=True)

        # Tab: Inventory
        self.inv_text = self._make_text_tab("Inventory")
        # Tab: Hex Dump
        self.hex_text = self._make_text_tab("Hex Dump")
        # Tab: Diff
        self.diff_text = self._make_text_tab("Diff")
        # Tab: Full Report (raw text)
        self.report_text = self._make_text_tab("Full Report")
        # Tab: Query (DB)
        self._build_query_tab()

        # Tag colours for rich text in all tabs
        for w in (self.inv_text, self.hex_text, self.diff_text, self.report_text):
            w.tag_configure("head", foreground=COL_HEAD, font=("Consolas", 9, "bold"))
            w.tag_configure("bw", foreground=COL_BW)
            w.tag_configure("s3", foreground=COL_S3)
            w.tag_configure("changed", foreground=COL_DIFF)
            w.tag_configure("known", foreground=COL_KNOW)
            w.tag_configure("dim", foreground=FG_DIM)
            w.tag_configure("normal", foreground=FG)
            w.tag_configure("warn", foreground=YELLOW)
            w.tag_configure("addr", foreground=ORANGE)

    def _make_text_tab(self, title: str) -> tk.Text:
        frame = tk.Frame(self.nb, bg=BG)
        self.nb.add(frame, text=title)
        w = tk.Text(frame, bg=BG, fg=FG, font=MONO, state="disabled",
                    relief="flat", wrap="none", insertbackground=FG,
                    selectbackground=BG3, selectforeground="#ffffff")
        vsb = ttk.Scrollbar(frame, orient="vertical", command=w.yview)
        hsb = ttk.Scrollbar(frame, orient="horizontal", command=w.xview)
        w.configure(yscrollcommand=vsb.set, xscrollcommand=hsb.set)
        vsb.pack(side=tk.RIGHT, fill=tk.Y)
        hsb.pack(side=tk.BOTTOM, fill=tk.X)
        w.pack(fill=tk.BOTH, expand=True)
        return w

    def _build_query_tab(self) -> None:
        """Build the Query tab: filter controls + results table + interpretation panel."""
        frame = tk.Frame(self.nb, bg=BG)
        self.nb.add(frame, text="Query DB")

        # ── Filter row ────────────────────────────────────────────────────
        filt = tk.Frame(frame, bg=BG2, pady=4)
        filt.pack(side=tk.TOP, fill=tk.X)

        pad = {"padx": 4, "pady": 2}

        # Capture filter
        tk.Label(filt, text="Capture:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=0, sticky="e", **pad)
        self._q_capture_var = tk.StringVar(value="All")
        self._q_capture_cb = ttk.Combobox(filt, textvariable=self._q_capture_var,
                                          width=18, font=MONO_SM, state="readonly")
        self._q_capture_cb.grid(row=0, column=1, sticky="w", **pad)

        # Direction filter
        tk.Label(filt, text="Dir:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=2, sticky="e", **pad)
        self._q_dir_var = tk.StringVar(value="All")
        self._q_dir_cb = ttk.Combobox(filt, textvariable=self._q_dir_var,
                                      values=["All", "BW", "S3"],
                                      width=6, font=MONO_SM, state="readonly")
        self._q_dir_cb.grid(row=0, column=3, sticky="w", **pad)

        # SUB filter
        tk.Label(filt, text="SUB:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=4, sticky="e", **pad)
        self._q_sub_var = tk.StringVar(value="All")
        self._q_sub_cb = ttk.Combobox(filt, textvariable=self._q_sub_var,
                                      width=12, font=MONO_SM, state="readonly")
        self._q_sub_cb.grid(row=0, column=5, sticky="w", **pad)

        # Byte offset filter
        tk.Label(filt, text="Offset:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=6, sticky="e", **pad)
        self._q_offset_var = tk.StringVar(value="")
        tk.Entry(filt, textvariable=self._q_offset_var, width=8, bg=BG3, fg=FG,
                 font=MONO_SM, insertbackground=FG, relief="flat").grid(row=0, column=7, sticky="w", **pad)

        # Value filter
        tk.Label(filt, text="Value:", bg=BG2, fg=FG, font=MONO_SM).grid(row=0, column=8, sticky="e", **pad)
        self._q_value_var = tk.StringVar(value="")
        tk.Entry(filt, textvariable=self._q_value_var, width=8, bg=BG3, fg=FG,
                 font=MONO_SM, insertbackground=FG, relief="flat").grid(row=0, column=9, sticky="w", **pad)

        # Run / Refresh buttons
        tk.Button(filt, text="Run Query", bg=ACCENT, fg="#ffffff", relief="flat",
                  padx=8, cursor="hand2", font=("Consolas", 8, "bold"),
                  command=self._run_db_query).grid(row=0, column=10, padx=8)
        tk.Button(filt, text="Refresh dropdowns", bg=BG3, fg=FG, relief="flat",
                  padx=6, cursor="hand2", font=MONO_SM,
                  command=self._refresh_query_dropdowns).grid(row=0, column=11, padx=4)

        # DB stats label
        self._q_stats_var = tk.StringVar(value="DB: —")
        tk.Label(filt, textvariable=self._q_stats_var, bg=BG2, fg=FG_DIM,
                 font=MONO_SM).grid(row=0, column=12, padx=12, sticky="w")

        # ── Results table ─────────────────────────────────────────────────
        res_frame = tk.Frame(frame, bg=BG)
        res_frame.pack(side=tk.TOP, fill=tk.BOTH, expand=True)

        # Results treeview
        cols = ("cap", "sess", "dir", "sub", "sub_name", "page", "len", "chk")
        self._q_tree = ttk.Treeview(res_frame, columns=cols,
                                    show="headings", selectmode="browse")
        col_cfg = [
            ("cap", "Cap", 40),
            ("sess", "Sess", 40),
            ("dir", "Dir", 40),
            ("sub", "SUB", 50),
            ("sub_name", "Name", 160),
            ("page", "Page", 60),
            ("len", "Len", 50),
            ("chk", "Chk", 50),
        ]
        for cid, heading, width in col_cfg:
            self._q_tree.heading(cid, text=heading, anchor="w")
            self._q_tree.column(cid, width=width, stretch=(cid == "sub_name"))

        q_vsb = ttk.Scrollbar(res_frame, orient="vertical", command=self._q_tree.yview)
        q_hsb = ttk.Scrollbar(res_frame, orient="horizontal", command=self._q_tree.xview)
        self._q_tree.configure(yscrollcommand=q_vsb.set, xscrollcommand=q_hsb.set)
        q_vsb.pack(side=tk.RIGHT, fill=tk.Y)
        q_hsb.pack(side=tk.BOTTOM, fill=tk.X)
        self._q_tree.pack(side=tk.LEFT, fill=tk.BOTH, expand=True)

        self._q_tree.tag_configure("bw_row", foreground=COL_BW)
        self._q_tree.tag_configure("s3_row", foreground=COL_S3)
        self._q_tree.tag_configure("bad_row", foreground=RED)

        # ── Interpretation panel (below results) ──────────────────────────
        interp_frame = tk.Frame(frame, bg=BG2, height=120)
        interp_frame.pack(side=tk.BOTTOM, fill=tk.X)
        interp_frame.pack_propagate(False)

        tk.Label(interp_frame, text="Byte interpretation (click a row, enter offset):",
                 bg=BG2, fg=ACCENT, font=MONO_SM, anchor="w", padx=6).pack(fill=tk.X)

        interp_inner = tk.Frame(interp_frame, bg=BG2)
        interp_inner.pack(fill=tk.X, padx=6, pady=2)

        tk.Label(interp_inner, text="Offset:", bg=BG2, fg=FG, font=MONO_SM).pack(side=tk.LEFT)
        self._interp_offset_var = tk.StringVar(value="5")
        tk.Entry(interp_inner, textvariable=self._interp_offset_var,
                 width=6, bg=BG3, fg=FG, font=MONO_SM,
                 insertbackground=FG, relief="flat").pack(side=tk.LEFT, padx=4)
        tk.Button(interp_inner, text="Interpret", bg=BG3, fg=FG, relief="flat",
                  cursor="hand2", font=MONO_SM,
                  command=self._run_interpret).pack(side=tk.LEFT, padx=4)

        self._interp_text = tk.Text(interp_frame, bg=BG2, fg=FG, font=MONO_SM,
                                    height=4, relief="flat", state="disabled",
                                    insertbackground=FG)
        self._interp_text.pack(fill=tk.X, padx=6, pady=2)
        self._interp_text.tag_configure("label", foreground=FG_DIM)
        self._interp_text.tag_configure("value", foreground=YELLOW)

        # Store frame rows by tree iid -> db row
        self._q_rows: dict[str, object] = {}
        self._q_capture_rows: list = [None]
        self._q_sub_values: list = [None]
        self._q_tree.bind("<<TreeviewSelect>>", self._on_q_select)

        # Init dropdowns
        self._refresh_query_dropdowns()

    def _refresh_query_dropdowns(self) -> None:
        """Reload capture and SUB dropdowns from the DB."""
        try:
            captures = self._db.list_captures()
            cap_labels = ["All"] + [
                f"#{r['id']} {r['timestamp'][:16]} ({r['frame_count']} frames)"
                for r in captures
            ]
            self._q_capture_cb["values"] = cap_labels
            self._q_capture_rows = [None] + [r["id"] for r in captures]

            subs = self._db.get_distinct_subs()
            sub_labels = ["All"] + [f"0x{s:02X}" for s in subs]
            self._q_sub_cb["values"] = sub_labels
            self._q_sub_values = [None] + subs

            stats = self._db.get_stats()
            self._q_stats_var.set(
                f"DB: {stats['captures']} captures | {stats['frames']} frames"
            )
        except Exception as exc:
            self._q_stats_var.set(f"DB error: {exc}")

    def _parse_hex_or_int(self, s: str) -> Optional[int]:
        """Parse '0x1F', '31', or '' into int or None."""
        s = s.strip()
        if not s:
            return None
        try:
            return int(s, 0)
        except ValueError:
            return None

    def _run_db_query(self) -> None:
        """Execute query with current filter values and populate results tree."""
        # Resolve capture_id
        cap_idx = self._q_capture_cb.current()
        cap_id = self._q_capture_rows[cap_idx] if cap_idx > 0 else None

        # Direction
        dir_val = self._q_dir_var.get()
        direction = dir_val if dir_val != "All" else None

        # SUB
        sub_idx = self._q_sub_cb.current()
        sub = self._q_sub_values[sub_idx] if sub_idx > 0 else None

        # Offset / value
        offset = self._parse_hex_or_int(self._q_offset_var.get())
        value = self._parse_hex_or_int(self._q_value_var.get())

        try:
            if offset is not None:
                rows = self._db.query_by_byte(
                    offset=offset, value=value,
                    capture_id=cap_id, direction=direction, sub=sub
                )
            else:
                rows = self._db.query_frames(
                    capture_id=cap_id, direction=direction, sub=sub
                )
        except Exception as exc:
            messagebox.showerror("Query error", str(exc))
            return

        # Populate tree
        self._q_tree.delete(*self._q_tree.get_children())
        self._q_rows.clear()

        for row in rows:
            sub_hex = f"0x{row['sub']:02X}" if row["sub"] is not None else "—"
            page_hex = f"0x{row['page_key']:04X}" if row["page_key"] is not None else "—"
            chk_str = {1: "OK", 0: "BAD", None: "—"}.get(row["checksum_ok"], "—")
            tag = "bw_row" if row["direction"] == "BW" else "s3_row"
            if row["checksum_ok"] == 0:
                tag = "bad_row"

            iid = str(row["id"])
            self._q_tree.insert("", tk.END, iid=iid, tags=(tag,), values=(
                row["capture_id"],
                row["session_idx"],
                row["direction"],
                sub_hex,
                row["sub_name"] or "",
                page_hex,
                row["payload_len"],
                chk_str,
            ))
            self._q_rows[iid] = row

        self.sb_var.set(f"Query returned {len(rows)} rows")

    def _on_q_select(self, _event: tk.Event) -> None:
        """When a DB result row is selected, auto-run interpret at current offset."""
        self._run_interpret()

    def _run_interpret(self) -> None:
        """Show multi-format byte interpretation for the selected row + offset."""
        sel = self._q_tree.selection()
        if not sel:
            return
        iid = sel[0]
        row = self._q_rows.get(iid)
        if row is None:
            return

        offset = self._parse_hex_or_int(self._interp_offset_var.get())
        if offset is None:
            return

        payload = bytes(row["payload"])
        interp = self._db.interpret_offset(payload, offset)

        w = self._interp_text
        w.configure(state="normal")
        w.delete("1.0", tk.END)

        sub_hex = f"0x{row['sub']:02X}" if row["sub"] is not None else "??"
        w.insert(tk.END, f"Frame #{row['id']} [{row['direction']}] SUB={sub_hex} "
                         f"offset={offset} (0x{offset:04X})\n", "label")

        label_order = [
            ("uint8", "uint8 "),
            ("int8", "int8 "),
            ("uint16_be", "uint16 BE "),
            ("uint16_le", "uint16 LE "),
            ("uint32_be", "uint32 BE "),
            ("uint32_le", "uint32 LE "),
            ("float32_be", "float32 BE "),
            ("float32_le", "float32 LE "),
        ]
        line = ""
        for key, label in label_order:
            if key in interp:
                val = interp[key]
                if isinstance(val, float):
                    val_str = f"{val:.6g}"
                else:
                    val_str = str(val)
                if key.startswith("uint") or key.startswith("int"):
                    val_str += f" (0x{int(val) & 0xFFFFFFFF:X})"
                chunk = f"{label}: {val_str}"
                line += f" {chunk:<30}"
                if len(line) > 80:
                    w.insert(tk.END, line + "\n", "value")
                    line = ""
        if line:
            w.insert(tk.END, line + "\n", "value")

        w.configure(state="disabled")

    def _build_statusbar(self) -> None:
        bar = tk.Frame(self, bg=BG3, height=20)
        bar.pack(side=tk.BOTTOM, fill=tk.X)
        self.sb_var = tk.StringVar(value="Ready")
        tk.Label(bar, textvariable=self.sb_var, bg=BG3, fg=FG_DIM,
                 font=MONO_SM, anchor="w", padx=6).pack(fill=tk.X)

    # ── file picking ───────────────────────────────────────────────────────

    def _browse_file(self, var: tk.StringVar, default_name: str) -> None:
        path = filedialog.askopenfilename(
            title=f"Select {default_name}",
            filetypes=[("Binary files", "*.bin"), ("All files", "*.*")],
            initialfile=default_name,
        )
        if path:
            var.set(path)

    # ── analysis ──────────────────────────────────────────────────────────

    def _run_analyze(self) -> None:
        s3_path = Path(self.s3_var.get().strip()) if self.s3_var.get().strip() else None
        bw_path = Path(self.bw_var.get().strip()) if self.bw_var.get().strip() else None

        if not s3_path or not bw_path:
            messagebox.showerror("Missing files", "Please select both S3 and BW raw files.")
            return
        if not s3_path.exists():
            messagebox.showerror("File not found", f"S3 file not found:\n{s3_path}")
            return
        if not bw_path.exists():
            messagebox.showerror("File not found", f"BW file not found:\n{bw_path}")
            return

        self.state.s3_path = s3_path
        self.state.bw_path = bw_path
        self._do_analyze(s3_path, bw_path)

    def _run_export(self) -> None:
        if not self.state.sessions:
            messagebox.showinfo("Export", "Run Analyze first.")
            return

        outdir = self.state.s3_path.parent if self.state.s3_path else Path(".")
        out_path = write_claude_export(
            self.state.sessions,
            self.state.diffs,
            outdir,
            self.state.s3_path,
            self.state.bw_path,
        )

        self.sb_var.set(f"Exported: {out_path.name}")
        if messagebox.askyesno(
            "Export complete",
            f"Saved to:\n{out_path}\n\nOpen the folder?",
        ):
            import subprocess
            subprocess.Popen(["explorer", str(out_path.parent)])

    def _do_analyze(self, s3_path: Path, bw_path: Path) -> None:
        self.status_var.set("Parsing...")
        self.update_idletasks()

        s3_blob = s3_path.read_bytes()
        bw_blob = bw_path.read_bytes()

        s3_frames = annotate_frames(parse_s3(s3_blob, trailer_len=0), "S3")
        bw_frames = annotate_frames(parse_bw(bw_blob, trailer_len=0, validate_checksum=True), "BW")

        sessions = split_into_sessions(bw_frames, s3_frames)

        diffs: list[Optional[list[FrameDiff]]] = [None]
        for i in range(1, len(sessions)):
            diffs.append(diff_sessions(sessions[i - 1], sessions[i]))

        self.state.sessions = sessions
        self.state.diffs = diffs

        n_s3 = sum(len(s.s3_frames) for s in sessions)
        n_bw = sum(len(s.bw_frames) for s in sessions)
        self.status_var.set(
            f"{len(sessions)} sessions | BW: {n_bw} frames S3: {n_s3} frames"
        )
        self.sb_var.set(f"Loaded: {s3_path.name} + {bw_path.name}")

        self.export_btn.configure(state="normal")
        self._rebuild_tree()

        # Auto-ingest into DB (deduped by SHA256 — fast no-op on re-analyze)
        try:
            cap_id = self._db.ingest(sessions, s3_path, bw_path)
            if cap_id is not None:
                self.state.last_capture_id = cap_id
                self._refresh_query_dropdowns()
                # Pre-select this capture in the Query tab
                cap_labels = list(self._q_capture_cb["values"])
                # Find label that starts with #<cap_id>
                for i, lbl in enumerate(cap_labels):
                    if lbl.startswith(f"#{cap_id} "):
                        self._q_capture_cb.current(i)
                        break
            # else: already ingested — no change to dropdown selection
        except Exception as exc:
            self.sb_var.set(f"DB ingest error: {exc}")

    # ── tree building ──────────────────────────────────────────────────────

    def _rebuild_tree(self) -> None:
        self.tree.delete(*self.tree.get_children())

        for sess in self.state.sessions:
            is_complete = any(
                af.header is not None and af.header.sub == 0x74
                for af in sess.bw_frames
            )
            label = f"Session {sess.index}"
            if not is_complete:
                label += " [partial]"
            n_diff = len(self.state.diffs[sess.index] or [])
            diff_info = f"{n_diff} changes" if n_diff > 0 else ""
            sess_id = self.tree.insert("", tk.END, text=label,
                                       values=(diff_info,), tags=("session",))

            for af in sess.all_frames:
                src_tag = "bw_frame" if af.source == "BW" else "s3_frame"
                sub_hex = f"{af.header.sub:02X}" if af.header else "??"
                label_text = f"[{af.source}] {sub_hex} {af.sub_name}"
                extra = ""
                tags = (src_tag,)
                if af.frame.checksum_valid is False:
                    extra = "BAD CHK"
                    tags = ("bad_chk",)
                elif af.header is None:
                    tags = ("malformed",)
                    label_text = f"[{af.source}] MALFORMED"
                self.tree.insert(sess_id, tk.END, text=label_text,
                                 values=(extra,), tags=tags,
                                 iid=f"frame_{sess.index}_{af.frame.index}_{af.source}")

        # Expand all sessions
        for item in self.tree.get_children():
            self.tree.item(item, open=True)

    # ── tree selection → detail panel ─────────────────────────────────────

    def _on_tree_select(self, _event: tk.Event) -> None:
        sel = self.tree.selection()
        if not sel:
            return
        iid = sel[0]

        # Determine if it's a session node or a frame node
        if iid.startswith("frame_"):
            # frame_<sessidx>_<frameidx>_<source>
            parts = iid.split("_")
            sess_idx = int(parts[1])
            frame_idx = int(parts[2])
            source = parts[3]
            self._show_frame_detail(sess_idx, frame_idx, source)
        else:
            # Session node — show session summary
            # Find session index from text
            text = self.tree.item(iid, "text")
            try:
                idx = int(text.split()[1])
                self._show_session_detail(idx)
            except (IndexError, ValueError):
                pass

    def _find_frame(self, sess_idx: int, frame_idx: int, source: str) -> Optional[AnnotatedFrame]:
        if sess_idx >= len(self.state.sessions):
            return None
        sess = self.state.sessions[sess_idx]
        pool = sess.bw_frames if source == "BW" else sess.s3_frames
        for af in pool:
            if af.frame.index == frame_idx:
                return af
        return None

    # ── detail renderers ──────────────────────────────────────────────────

    def _clear_all_tabs(self) -> None:
        for w in (self.inv_text, self.hex_text, self.diff_text, self.report_text):
            self._text_clear(w)

    def _show_session_detail(self, sess_idx: int) -> None:
        if sess_idx >= len(self.state.sessions):
            return
        sess = self.state.sessions[sess_idx]
        diffs = self.state.diffs[sess_idx]

        self._clear_all_tabs()

        # ── Inventory tab ────────────────────────────────────────────────
        w = self.inv_text
        self._text_clear(w)
        self._tw(w, f"SESSION {sess.index}", "head"); self._tn(w)
        n_bw, n_s3 = len(sess.bw_frames), len(sess.s3_frames)
        self._tw(w, f"Frames: {n_bw + n_s3} (BW: {n_bw}, S3: {n_s3})\n", "normal")
        if n_bw != n_s3:
            self._tw(w, " WARNING: BW/S3 count mismatch\n", "warn")
        self._tn(w)

        for seq_i, af in enumerate(sess.all_frames):
            src_tag = "bw" if af.source == "BW" else "s3"
            sub_hex = f"{af.header.sub:02X}" if af.header else "??"
            page_str = f" (page {af.header.page_key:04X})" if af.header and af.header.page_key != 0 else ""
            chk = ""
            if af.frame.checksum_valid is False:
                chk = " [BAD CHECKSUM]"
            elif af.frame.checksum_valid is True:
                chk = f" [{af.frame.checksum_type}]"
            self._tw(w, f" [{af.source}] #{seq_i:<3} ", src_tag)
            self._tw(w, f"SUB={sub_hex} ", "addr")
            self._tw(w, f"{af.sub_name:<30}", src_tag)
            self._tw(w, f"{page_str} len={len(af.frame.payload)}", "dim")
            if chk:
                self._tw(w, chk, "warn" if af.frame.checksum_valid is False else "dim")
            self._tn(w)

        # ── Diff tab ─────────────────────────────────────────────────────
        w = self.diff_text
        self._text_clear(w)
        if diffs is None:
            self._tw(w, "(No previous session to diff against)\n", "dim")
        elif not diffs:
            self._tw(w, f"DIFF vs SESSION {sess_idx - 1}\n", "head"); self._tn(w)
            self._tw(w, " No changes detected.\n", "dim")
        else:
            self._tw(w, f"DIFF vs SESSION {sess_idx - 1}\n", "head"); self._tn(w)
            for fd in diffs:
                page_str = f" (page {fd.page_key:04X})" if fd.page_key != 0 else ""
                self._tw(w, f"\n SUB {fd.sub:02X} ({fd.sub_name}){page_str}:\n", "addr")
                for bd in fd.diffs:
                    before_s = f"{bd.before:02x}" if bd.before >= 0 else "--"
                    after_s = f"{bd.after:02x}" if bd.after >= 0 else "--"
                    self._tw(w, f" [{bd.payload_offset:3d}] 0x{bd.payload_offset:04X}: ", "dim")
                    self._tw(w, f"{before_s} -> {after_s}", "changed")
                    if bd.field_name:
                        self._tw(w, f" [{bd.field_name}]", "known")
                    self._tn(w)

        # ── Full Report tab ───────────────────────────────────────────────
        report_text = render_session_report(sess, diffs, sess_idx - 1 if sess_idx > 0 else None)
        w = self.report_text
        self._text_clear(w)
        self._tw(w, report_text, "normal")

        # Switch to Inventory tab
        self.nb.select(0)

    def _show_frame_detail(self, sess_idx: int, frame_idx: int, source: str) -> None:
        af = self._find_frame(sess_idx, frame_idx, source)
        if af is None:
            return

        self._clear_all_tabs()
        src_tag = "bw" if source == "BW" else "s3"
        sub_hex = f"{af.header.sub:02X}" if af.header else "??"

        # ── Inventory tab — single frame summary ─────────────────────────
        w = self.inv_text
        self._tw(w, f"[{af.source}] Frame #{af.frame.index}\n", src_tag)
        self._tw(w, f"Session {sess_idx} | ", "dim")
        self._tw(w, f"SUB={sub_hex} {af.sub_name}\n", "addr")
        if af.header:
            self._tw(w, f" OFFSET: {af.header.page_key:04X} ", "dim")
            self._tw(w, f"CMD={af.header.cmd:02X} FLAGS={af.header.flags:02X}\n", "dim")
        self._tn(w)
        self._tw(w, f"Payload bytes: {len(af.frame.payload)}\n", "dim")
        if af.frame.checksum_valid is False:
            self._tw(w, " BAD CHECKSUM\n", "warn")
        elif af.frame.checksum_valid is True:
            self._tw(w, f" Checksum: {af.frame.checksum_type} {af.frame.checksum_hex}\n", "dim")
        self._tn(w)

        # Protocol header breakdown
        p = af.frame.payload
        if len(p) >= 5:
            self._tw(w, "Header breakdown:\n", "head")
            self._tw(w, f" [0] CMD = {p[0]:02x}\n", "dim")
            self._tw(w, f" [1] ? = {p[1]:02x}\n", "dim")
            self._tw(w, f" [2] SUB = {p[2]:02x} ({af.sub_name})\n", src_tag)
            self._tw(w, f" [3] OFFSET_HI = {p[3]:02x}\n", "dim")
            self._tw(w, f" [4] OFFSET_LO = {p[4]:02x}\n", "dim")
            if len(p) > 5:
                self._tw(w, f" [5..] data = {len(p) - 5} bytes\n", "dim")

        # ── Hex Dump tab ─────────────────────────────────────────────────
        w = self.hex_text
        self._tw(w, f"[{af.source}] SUB={sub_hex} {af.sub_name}\n", src_tag)
        self._tw(w, f"Payload ({len(af.frame.payload)} bytes):\n", "dim")
        self._tn(w)
        dump_lines = format_hex_dump(af.frame.payload, indent=" ")
        self._tw(w, "\n".join(dump_lines) + "\n", "normal")

        # Annotate known field offsets within this frame
        diffs_for_sess = self.state.diffs[sess_idx] if sess_idx < len(self.state.diffs) else None
        if diffs_for_sess and af.header:
            page_key = af.header.page_key
            matching = [fd for fd in diffs_for_sess
                        if fd.sub == af.header.sub and fd.page_key == page_key]
            if matching:
                self._tn(w)
                self._tw(w, "Changed bytes in this frame (vs prev session):\n", "head")
                for bd in matching[0].diffs:
                    before_s = f"{bd.before:02x}" if bd.before >= 0 else "--"
                    after_s = f"{bd.after:02x}" if bd.after >= 0 else "--"
                    self._tw(w, f" [{bd.payload_offset:3d}] 0x{bd.payload_offset:04X}: ", "dim")
                    self._tw(w, f"{before_s} -> {after_s}", "changed")
                    if bd.field_name:
                        self._tw(w, f" [{bd.field_name}]", "known")
                    self._tn(w)

        # Switch to Hex Dump tab for frame selection
        self.nb.select(1)

# ── live mode ─────────────────────────────────────────────────────────
|
||||
|
||||
def _toggle_live(self) -> None:
|
||||
if self._live_thread and self._live_thread.is_alive():
|
||||
self._live_stop.set()
|
||||
self.live_btn.configure(text="Live: OFF", bg=BG3, fg=FG)
|
||||
self.status_var.set("Live stopped")
|
||||
else:
|
||||
s3_path = Path(self.s3_var.get().strip()) if self.s3_var.get().strip() else None
|
||||
bw_path = Path(self.bw_var.get().strip()) if self.bw_var.get().strip() else None
|
||||
if not s3_path or not bw_path:
|
||||
messagebox.showerror("Missing files", "Select both raw files before starting live mode.")
|
||||
return
|
||||
self.state.s3_path = s3_path
|
||||
self.state.bw_path = bw_path
|
||||
self._live_stop.clear()
|
||||
self._live_thread = threading.Thread(
|
||||
target=self._live_worker, args=(s3_path, bw_path), daemon=True)
|
||||
self._live_thread.start()
|
||||
self.live_btn.configure(text="Live: ON", bg=GREEN, fg="#000000")
|
||||
self.status_var.set("Live mode running...")
|
||||
|
||||
def _live_worker(self, s3_path: Path, bw_path: Path) -> None:
|
||||
s3_buf = bytearray()
|
||||
bw_buf = bytearray()
|
||||
s3_pos = bw_pos = 0
|
||||
|
||||
while not self._live_stop.is_set():
|
||||
changed = False
|
||||
if s3_path.exists():
|
||||
with s3_path.open("rb") as fh:
|
||||
fh.seek(s3_pos)
|
||||
nb = fh.read()
|
||||
if nb:
|
||||
s3_buf.extend(nb); s3_pos += len(nb); changed = True
|
||||
if bw_path.exists():
|
||||
with bw_path.open("rb") as fh:
|
||||
fh.seek(bw_pos)
|
||||
nb = fh.read()
|
||||
if nb:
|
||||
bw_buf.extend(nb); bw_pos += len(nb); changed = True
|
||||
|
||||
if changed:
|
||||
self._live_q.put("refresh")
|
||||
|
||||
time.sleep(0.1)
|
||||
|
||||
def _poll_live_queue(self) -> None:
|
||||
try:
|
||||
while True:
|
||||
msg = self._live_q.get_nowait()
|
||||
if msg == "refresh" and self.state.s3_path and self.state.bw_path:
|
||||
self._do_analyze(self.state.s3_path, self.state.bw_path)
|
||||
except queue.Empty:
|
||||
pass
|
||||
finally:
|
||||
self.after(150, self._poll_live_queue)
|
||||
|
||||
# ── text helpers ──────────────────────────────────────────────────────
|
||||
|
||||
def _text_clear(self, w: tk.Text) -> None:
|
||||
w.configure(state="normal")
|
||||
w.delete("1.0", tk.END)
|
||||
# leave enabled for further inserts
|
||||
|
||||
def _tw(self, w: tk.Text, text: str, tag: str = "normal") -> None:
|
||||
"""Insert text with a colour tag."""
|
||||
w.configure(state="normal")
|
||||
w.insert(tk.END, text, tag)
|
||||
|
||||
def _tn(self, w: tk.Text) -> None:
|
||||
"""Insert newline."""
|
||||
w.configure(state="normal")
|
||||
w.insert(tk.END, "\n")
|
||||
w.configure(state="disabled")
|
||||
|
||||
|
||||
# ──────────────────────────────────────────────────────────────────────────────
|
||||
# Entry point
|
||||
# ──────────────────────────────────────────────────────────────────────────────
|
||||
|
||||
def main() -> None:
|
||||
app = AnalyzerGUI()
|
||||
app.mainloop()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
BIN  parsers/raw_bw.bin (Normal file, binary file not shown)
BIN  parsers/raw_s3.bin (Normal file, binary file not shown)
1204 parsers/s3_analyzer.py (Normal file, file diff suppressed because it is too large)
413  parsers/s3_parser.py (Normal file)
@@ -0,0 +1,413 @@
#!/usr/bin/env python3
"""
s3_parser.py — Unified Instantel frame parser (S3 + BW).

Modes:
  - s3: DLE STX (10 02) ... DLE ETX (10 03)
  - bw: ACK+STX (41 02) ... ETX (03)

Stuffing:
  - A literal 0x10 in the payload is stuffed as 10 10 in both directions.

Checksums:
  - BW frames appear to use more than one checksum style depending on message type.
    Small frames often validate with a 1-byte SUM8.
    Large config/write frames appear to use a 2-byte CRC16 variant.

In BW mode we therefore validate candidate ETX positions using AUTO checksum matching:
  - SUM8 (1 byte)
  - CRC16 variants (2 bytes), both little/big endian
If any match, we accept the ETX as a real frame terminator.
"""

from __future__ import annotations

import argparse
import json
from dataclasses import dataclass
from pathlib import Path
from typing import Callable, Dict, List, Optional, Tuple

DLE = 0x10
STX = 0x02
ETX = 0x03
ACK = 0x41

__version__ = "0.2.2"


@dataclass
class Frame:
    index: int
    start_offset: int
    end_offset: int
    payload_raw: bytes   # de-stuffed bytes between STX..ETX (includes checksum bytes at end)
    payload: bytes       # payload without checksum bytes
    trailer: bytes
    checksum_valid: Optional[bool]
    checksum_type: Optional[str]
    checksum_hex: Optional[str]


# ------------------------
# Checksum / CRC helpers
# ------------------------

def checksum8_sum(data: bytes) -> int:
    """SUM8: sum(payload) & 0xFF"""
    return sum(data) & 0xFF


def crc16_ibm(data: bytes) -> int:
    # CRC-16/IBM (aka ARC): poly=0xA001, init=0x0000, refin/refout=true
    crc = 0x0000
    for b in data:
        crc ^= b
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if (crc & 1) else (crc >> 1)
    return crc & 0xFFFF


def crc16_ccitt_false(data: bytes) -> int:
    # CRC-16/CCITT-FALSE: poly=0x1021, init=0xFFFF, refin/refout=false
    crc = 0xFFFF
    for b in data:
        crc ^= (b << 8)
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) & 0xFFFF if (crc & 0x8000) else (crc << 1) & 0xFFFF
    return crc


def crc16_x25(data: bytes) -> int:
    # CRC-16/X-25: poly=0x8408 (reflected), init=0xFFFF, xorout=0xFFFF
    crc = 0xFFFF
    for b in data:
        crc ^= b
        for _ in range(8):
            crc = (crc >> 1) ^ 0x8408 if (crc & 1) else (crc >> 1)
    return (crc ^ 0xFFFF) & 0xFFFF


CRC16_FUNCS: Dict[str, Callable[[bytes], int]] = {
    "CRC16_IBM": crc16_ibm,
    "CRC16_CCITT_FALSE": crc16_ccitt_false,
    "CRC16_X25": crc16_x25,
}


def _try_validate_sum8(body: bytes) -> Optional[Tuple[bytes, bytes, str]]:
    """
    body = payload + chk8
    Returns (payload, chk_bytes, type) if valid, else None.
    """
    if len(body) < 1:
        return None
    payload = body[:-1]
    chk = body[-1]
    if checksum8_sum(payload) == chk:
        return payload, bytes([chk]), "SUM8"
    return None


def _try_validate_sum8_large(body: bytes) -> Optional[Tuple[bytes, bytes, str]]:
    """
    Large BW->S3 write frame checksum (SUBs 68, 69, 71, 82, 1A with data).

    Formula: (sum(b for b in body[2:-1] if b != 0x10) + 0x10) & 0xFF
      - Starts from byte [2], skipping CMD (0x10) and DLE (0x10) at [0][1]
      - Skips all 0x10 bytes in the covered range
      - Adds 0x10 as a constant offset
      - body[-1] is the checksum byte

    Confirmed across 20 frames from two independent captures (2026-03-12).
    """
    if len(body) < 3:
        return None
    payload = body[:-1]
    chk = body[-1]
    calc = (sum(b for b in payload[2:] if b != 0x10) + 0x10) & 0xFF
    if calc == chk:
        return payload, bytes([chk]), "SUM8_LARGE"
    return None


def _try_validate_crc16(body: bytes) -> Optional[Tuple[bytes, bytes, str]]:
    """
    body = payload + crc16 (2 bytes)
    Try multiple CRC16 types and both endian interpretations.
    Returns (payload, chk_bytes, type) if valid, else None.
    """
    if len(body) < 2:
        return None
    payload = body[:-2]
    chk_bytes = body[-2:]

    given_le = int.from_bytes(chk_bytes, "little", signed=False)
    given_be = int.from_bytes(chk_bytes, "big", signed=False)

    for name, fn in CRC16_FUNCS.items():
        calc = fn(payload)
        if calc == given_le:
            return payload, chk_bytes, f"{name}_LE"
        if calc == given_be:
            return payload, chk_bytes, f"{name}_BE"
    return None


def validate_bw_body_auto(body: bytes) -> Optional[Tuple[bytes, bytes, str]]:
    """
    Try to interpret the tail of body as a checksum in several ways.
    Return (payload, checksum_bytes, checksum_type) if any match; else None.
    """
    # Prefer plain SUM8 first (small frames: POLL, read commands)
    hit = _try_validate_sum8(body)
    if hit:
        return hit

    # Large BW->S3 write frames (SUBs 68, 69, 71, 82, 1A with data)
    hit = _try_validate_sum8_large(body)
    if hit:
        return hit

    # Then CRC16 variants
    hit = _try_validate_crc16(body)
    if hit:
        return hit

    return None
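The checksum helpers above can be exercised in isolation. A minimal sketch that reimplements SUM8 and the CRC-16/IBM (ARC) variant inline, so it runs standalone without importing s3_parser; the sample payload bytes are arbitrary illustration values, and 0xBB3D is the standard ARC check value for the ASCII string "123456789":

```python
def checksum8_sum(data: bytes) -> int:
    """SUM8: sum of all payload bytes, truncated to one byte."""
    return sum(data) & 0xFF


def crc16_ibm(data: bytes) -> int:
    """CRC-16/IBM (ARC): poly 0xA001 (reflected), init 0x0000."""
    crc = 0x0000
    for b in data:
        crc ^= b
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if (crc & 1) else (crc >> 1)
    return crc & 0xFFFF


# SUM8 over a small poll-style payload: 0x10 + 0x05 + 0x64 = 0x79
assert checksum8_sum(b"\x10\x05\x64") == 0x79

# CRC-16/ARC standard check value
assert crc16_ibm(b"123456789") == 0xBB3D
```

This matches how `validate_bw_body_auto` works: it strips the trailing one or two bytes as a candidate checksum and compares against each algorithm in turn.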
# ------------------------
# S3 MODE (DLE framed)
# ------------------------

def parse_s3(blob: bytes, trailer_len: int) -> List[Frame]:
    frames: List[Frame] = []

    IDLE = 0
    IN_FRAME = 1
    AFTER_DLE = 2

    state = IDLE
    body = bytearray()
    start_offset = 0
    idx = 0

    i = 0
    n = len(blob)

    while i < n:
        b = blob[i]

        if state == IDLE:
            if b == DLE and i + 1 < n and blob[i + 1] == STX:
                start_offset = i
                body.clear()
                state = IN_FRAME
                i += 2
                continue

        elif state == IN_FRAME:
            if b == DLE:
                state = AFTER_DLE
                i += 1
                continue
            body.append(b)

        else:  # AFTER_DLE
            if b == DLE:
                body.append(DLE)
                state = IN_FRAME
                i += 1
                continue

            if b == ETX:
                end_offset = i + 1
                trailer_start = i + 1
                trailer_end = trailer_start + trailer_len
                trailer = blob[trailer_start:trailer_end]

                # For S3 mode we don't assume a checksum type here yet.
                frames.append(Frame(
                    index=idx,
                    start_offset=start_offset,
                    end_offset=end_offset,
                    payload_raw=bytes(body),
                    payload=bytes(body),
                    trailer=trailer,
                    checksum_valid=None,
                    checksum_type=None,
                    checksum_hex=None,
                ))

                idx += 1
                state = IDLE
                i = trailer_end
                continue

            # Unexpected DLE + byte -> treat as literal data
            body.append(DLE)
            body.append(b)
            state = IN_FRAME
            i += 1
            continue

        i += 1

    return frames


# ------------------------
# BW MODE (ACK+STX framed, bare ETX)
# ------------------------

def parse_bw(blob: bytes, trailer_len: int, validate_checksum: bool) -> List[Frame]:
    frames: List[Frame] = []

    IDLE = 0
    IN_FRAME = 1
    AFTER_DLE = 2

    state = IDLE
    body = bytearray()
    start_offset = 0
    idx = 0

    i = 0
    n = len(blob)

    while i < n:
        b = blob[i]

        if state == IDLE:
            # Frame start signature: ACK + STX
            if b == ACK and i + 1 < n and blob[i + 1] == STX:
                start_offset = i
                body.clear()
                state = IN_FRAME
                i += 2
                continue
            i += 1
            continue

        if state == IN_FRAME:
            if b == DLE:
                state = AFTER_DLE
                i += 1
                continue

            if b == ETX:
                # Candidate end-of-frame.
                # Accept ETX only if the next bytes look like a real next-frame
                # start (ACK+STX), or we're at EOF. This prevents chopping on
                # an in-payload 0x03.
                next_is_start = (i + 2 < n and blob[i + 1] == ACK and blob[i + 2] == STX)
                at_eof = (i == n - 1)

                if not (next_is_start or at_eof):
                    # Not a real boundary -> payload byte
                    body.append(ETX)
                    i += 1
                    continue

                trailer_start = i + 1
                trailer_end = trailer_start + trailer_len
                trailer = blob[trailer_start:trailer_end]

                chk_valid = None
                chk_type = None
                chk_hex = None
                payload = bytes(body)

                if validate_checksum:
                    hit = validate_bw_body_auto(payload)
                    if hit:
                        payload, chk_bytes, chk_type = hit
                        chk_valid = True
                        chk_hex = chk_bytes.hex()
                    else:
                        chk_valid = False

                frames.append(Frame(
                    index=idx,
                    start_offset=start_offset,
                    end_offset=i + 1,
                    payload_raw=bytes(body),
                    payload=payload,
                    trailer=trailer,
                    checksum_valid=chk_valid,
                    checksum_type=chk_type,
                    checksum_hex=chk_hex,
                ))
                idx += 1
                state = IDLE
                i = trailer_end
                continue

            # Normal byte
            body.append(b)
            i += 1
            continue

        # AFTER_DLE: DLE XX => literal XX for any XX (full DLE stuffing)
        body.append(b)
        state = IN_FRAME
        i += 1

    return frames
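The DLE stuffing both parsers undo (a literal 0x10 on the wire is doubled to 10 10) can be sketched as a stuff/destuff pair. This is an illustration of the byte-level rule only, not part of s3_parser's API:

```python
DLE = 0x10  # data link escape byte


def stuff(payload: bytes) -> bytes:
    """Double every literal DLE byte for transmission."""
    out = bytearray()
    for b in payload:
        out.append(b)
        if b == DLE:
            out.append(DLE)  # escape literal 0x10 as 10 10
    return bytes(out)


def destuff(wire: bytes) -> bytes:
    """Collapse 10 10 pairs back to a single literal 0x10."""
    out = bytearray()
    i = 0
    while i < len(wire):
        out.append(wire[i])
        if wire[i] == DLE and i + 1 < len(wire) and wire[i + 1] == DLE:
            i += 2  # skip the stuffed duplicate
        else:
            i += 1
    return bytes(out)


raw = b"\x01\x10\x02\x10\x10\x03"
assert stuff(b"\x10") == b"\x10\x10"
assert destuff(stuff(raw)) == raw  # round-trip is lossless
```

The round-trip property is why a payload containing the frame-control bytes 0x02/0x03 after a 0x10 cannot be confused with a real DLE STX / DLE ETX sequence.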
# ------------------------
# CLI
# ------------------------

def main() -> None:
    ap = argparse.ArgumentParser(description="Parse Instantel S3/BW binary captures.")
    ap.add_argument("binfile", type=Path)
    ap.add_argument("--mode", choices=["s3", "bw"], default="s3")
    ap.add_argument("--trailer-len", type=int, default=0)
    ap.add_argument("--no-checksum", action="store_true")
    ap.add_argument("--out", type=Path, default=None)

    args = ap.parse_args()

    print(f"s3_parser v{__version__}")

    blob = args.binfile.read_bytes()

    if args.mode == "s3":
        frames = parse_s3(blob, args.trailer_len)
    else:
        frames = parse_bw(blob, args.trailer_len, validate_checksum=not args.no_checksum)

    print("Frames found:", len(frames))

    def to_hex(b: bytes) -> str:
        return b.hex()

    lines = []
    for f in frames:
        obj = {
            "index": f.index,
            "start_offset": f.start_offset,
            "end_offset": f.end_offset,
            "payload_len": len(f.payload),
            "payload_hex": to_hex(f.payload),
            "trailer_hex": to_hex(f.trailer),
            "checksum_valid": f.checksum_valid,
            "checksum_type": f.checksum_type,
            "checksum_hex": f.checksum_hex,
        }
        lines.append(json.dumps(obj))

    if args.out:
        args.out.write_text("\n".join(lines) + "\n", encoding="utf-8")
        print(f"Wrote: {args.out}")
    else:
        for line in lines[:10]:
            print(line)
        if len(lines) > 10:
            print(f"... ({len(lines) - 10} more)")


if __name__ == "__main__":
    main()
1130 seismo_lab.py (Normal file, file diff suppressed because it is too large)
0    sfm/__init__.py (Normal file)
351  sfm/server.py (Normal file)
@@ -0,0 +1,351 @@
"""
sfm/server.py — Seismograph Field Module REST API

Wraps the minimateplus library in a small FastAPI service.
Terra-view proxies /api/sfm/* to this service (same pattern as SLMM at :8100).

Default port: 8200

Endpoints
---------
GET  /health              Service heartbeat — no device I/O
GET  /device/info         POLL + serial number + full config read
GET  /device/events       Download all stored events (headers + peak values)
POST /device/connect      Explicit connect/identify (same as /device/info)
GET  /device/event/{idx}  Single event by index (header + waveform record)

Transport query params (supply one set):
  Serial (direct RS-232 cable):
    port     — serial port name (e.g. COM5, /dev/ttyUSB0)
    baud     — baud rate (default 38400)

  TCP (modem / ACH Auto Call Home):
    host     — IP address or hostname of the modem or ACH relay
    tcp_port — TCP port number (default 12345, Blastware default)

Each call opens the connection, does its work, then closes it.
(Stateless / reconnect-per-call, matching Blastware's observed behaviour.)

Run with:
    python -m uvicorn sfm.server:app --host 0.0.0.0 --port 8200 --reload
or:
    python sfm/server.py
"""

from __future__ import annotations

import logging
import sys
from typing import Optional

# FastAPI / Pydantic
try:
    from fastapi import FastAPI, HTTPException, Query
    from fastapi.responses import JSONResponse
    import uvicorn
except ImportError:
    print(
        "fastapi and uvicorn are required for the SFM server.\n"
        "Install them with: pip install fastapi uvicorn",
        file=sys.stderr,
    )
    sys.exit(1)

from minimateplus import MiniMateClient
from minimateplus.protocol import ProtocolError
from minimateplus.models import DeviceInfo, Event, PeakValues, ProjectInfo, Timestamp
from minimateplus.transport import TcpTransport, DEFAULT_TCP_PORT

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)-7s %(name)s %(message)s",
    datefmt="%H:%M:%S",
)
log = logging.getLogger("sfm.server")

# ── FastAPI app ────────────────────────────────────────────────────────────────

app = FastAPI(
    title="Seismograph Field Module (SFM)",
    description=(
        "REST API for Instantel MiniMate Plus seismographs.\n"
        "Implements the minimateplus RS-232 protocol library.\n"
        "Proxied by terra-view at /api/sfm/*."
    ),
    version="0.1.0",
)


# ── Serialisers ────────────────────────────────────────────────────────────────
# Plain dict helpers — avoids a Pydantic dependency in the library layer.

def _serialise_timestamp(ts: Optional[Timestamp]) -> Optional[dict]:
    if ts is None:
        return None
    return {
        "year": ts.year,
        "month": ts.month,
        "day": ts.day,
        "clock_set": ts.clock_set,
        "display": str(ts),
    }


def _serialise_peak_values(pv: Optional[PeakValues]) -> Optional[dict]:
    if pv is None:
        return None
    return {
        "tran_in_s": pv.tran,
        "vert_in_s": pv.vert,
        "long_in_s": pv.long,
        "micl_psi": pv.micl,
    }


def _serialise_project_info(pi: Optional[ProjectInfo]) -> Optional[dict]:
    if pi is None:
        return None
    return {
        "setup_name": pi.setup_name,
        "project": pi.project,
        "client": pi.client,
        "operator": pi.operator,
        "sensor_location": pi.sensor_location,
        "notes": pi.notes,
    }


def _serialise_device_info(info: DeviceInfo) -> dict:
    return {
        "serial": info.serial,
        "firmware_version": info.firmware_version,
        "firmware_minor": info.firmware_minor,
        "dsp_version": info.dsp_version,
        "manufacturer": info.manufacturer,
        "model": info.model,
    }


def _serialise_event(ev: Event) -> dict:
    return {
        "index": ev.index,
        "timestamp": _serialise_timestamp(ev.timestamp),
        "sample_rate": ev.sample_rate,
        "record_type": ev.record_type,
        "peak_values": _serialise_peak_values(ev.peak_values),
        "project_info": _serialise_project_info(ev.project_info),
    }


# ── Transport factory ─────────────────────────────────────────────────────────

def _build_client(
    port: Optional[str],
    baud: int,
    host: Optional[str],
    tcp_port: int,
) -> MiniMateClient:
    """
    Return a MiniMateClient configured for either serial or TCP transport.

    TCP takes priority if *host* is supplied; otherwise *port* (serial) is used.
    Raises HTTPException(422) if neither is provided.
    """
    if host:
        # TCP / modem / ACH path — use a longer timeout to survive cold boots
        # (unit takes 5-15 s to wake from RS-232 line assertion over cellular)
        transport = TcpTransport(host, port=tcp_port)
        log.debug("TCP transport: %s:%d", host, tcp_port)
        return MiniMateClient(transport=transport, timeout=30.0)
    elif port:
        # Direct serial path
        log.debug("Serial transport: %s baud=%d", port, baud)
        return MiniMateClient(port, baud)
    else:
        raise HTTPException(
            status_code=422,
            detail=(
                "Specify either 'port' (serial, e.g. ?port=COM5) "
                "or 'host' (TCP, e.g. ?host=192.168.1.50&tcp_port=12345)"
            ),
        )


def _is_tcp(host: Optional[str]) -> bool:
    return bool(host)


def _run_with_retry(fn, *, is_tcp: bool):
    """
    Call fn() and, for TCP connections only, retry once on ProtocolError.

    Rationale: when a MiniMate Plus is cold (just had its serial lines asserted
    by the modem or a local bridge), it takes 5-10 seconds to boot before it
    will respond to POLL_PROBE. The first request may time out during that boot
    window; a single automatic retry is enough to recover once the unit is up.

    Serial connections are NOT retried — a timeout there usually means a real
    problem (wrong port, wrong baud, cable unplugged).
    """
    try:
        return fn()
    except ProtocolError:
        if not is_tcp:
            raise
        log.info("TCP poll timed out (unit may have been cold) — retrying once")
        return fn()  # let any second failure propagate normally
# ── Endpoints ──────────────────────────────────────────────────────────────────

@app.get("/health")
def health() -> dict:
    """Service heartbeat. No device I/O."""
    return {"status": "ok", "service": "sfm", "version": "0.1.0"}


@app.get("/device/info")
def device_info(
    port: Optional[str] = Query(None, description="Serial port (e.g. COM5, /dev/ttyUSB0)"),
    baud: int = Query(38400, description="Serial baud rate (default 38400)"),
    host: Optional[str] = Query(None, description="TCP host — modem IP or ACH relay (e.g. 203.0.113.5)"),
    tcp_port: int = Query(DEFAULT_TCP_PORT, description=f"TCP port (default {DEFAULT_TCP_PORT})"),
) -> dict:
    """
    Connect to the device, perform the POLL startup handshake, and return
    identity information (serial number, firmware version, model).

    Supply either *port* (serial) or *host* (TCP/modem).
    Equivalent to POST /device/connect — provided as GET for convenience.
    """
    log.info("GET /device/info port=%s host=%s tcp_port=%d", port, host, tcp_port)

    try:
        def _do():
            with _build_client(port, baud, host, tcp_port) as client:
                return client.connect()
        info = _run_with_retry(_do, is_tcp=_is_tcp(host))
    except HTTPException:
        raise
    except ProtocolError as exc:
        raise HTTPException(status_code=502, detail=f"Protocol error: {exc}") from exc
    except OSError as exc:
        raise HTTPException(status_code=502, detail=f"Connection error: {exc}") from exc
    except Exception as exc:
        raise HTTPException(status_code=500, detail=f"Device error: {exc}") from exc

    return _serialise_device_info(info)


@app.post("/device/connect")
def device_connect(
    port: Optional[str] = Query(None, description="Serial port (e.g. COM5)"),
    baud: int = Query(38400, description="Serial baud rate"),
    host: Optional[str] = Query(None, description="TCP host — modem IP or ACH relay"),
    tcp_port: int = Query(DEFAULT_TCP_PORT, description=f"TCP port (default {DEFAULT_TCP_PORT})"),
) -> dict:
    """
    Connect to the device and return identity. POST variant for terra-view
    compatibility with the SLMM proxy pattern.
    """
    return device_info(port=port, baud=baud, host=host, tcp_port=tcp_port)


@app.get("/device/events")
def device_events(
    port: Optional[str] = Query(None, description="Serial port (e.g. COM5)"),
    baud: int = Query(38400, description="Serial baud rate"),
    host: Optional[str] = Query(None, description="TCP host — modem IP or ACH relay"),
    tcp_port: int = Query(DEFAULT_TCP_PORT, description=f"TCP port (default {DEFAULT_TCP_PORT})"),
) -> dict:
    """
    Connect to the device, read the event index, and download all stored
    events (event headers + full waveform records with peak values).

    Supply either *port* (serial) or *host* (TCP/modem).

    This does NOT download raw ADC waveform samples — those are large and
    fetched separately via GET /device/event/{idx}/waveform (future endpoint).
    """
    log.info("GET /device/events port=%s host=%s", port, host)

    try:
        def _do():
            with _build_client(port, baud, host, tcp_port) as client:
                return client.connect(), client.get_events()
        info, events = _run_with_retry(_do, is_tcp=_is_tcp(host))
    except HTTPException:
        raise
    except ProtocolError as exc:
        raise HTTPException(status_code=502, detail=f"Protocol error: {exc}") from exc
    except OSError as exc:
        raise HTTPException(status_code=502, detail=f"Connection error: {exc}") from exc
    except Exception as exc:
        raise HTTPException(status_code=500, detail=f"Device error: {exc}") from exc

    return {
        "device": _serialise_device_info(info),
        "event_count": len(events),
        "events": [_serialise_event(ev) for ev in events],
    }


@app.get("/device/event/{index}")
def device_event(
    index: int,
    port: Optional[str] = Query(None, description="Serial port (e.g. COM5)"),
    baud: int = Query(38400, description="Serial baud rate"),
    host: Optional[str] = Query(None, description="TCP host — modem IP or ACH relay"),
    tcp_port: int = Query(DEFAULT_TCP_PORT, description=f"TCP port (default {DEFAULT_TCP_PORT})"),
) -> dict:
    """
    Download a single event by index (0-based).

    Supply either *port* (serial) or *host* (TCP/modem).
    Performs: POLL startup → event index → event header → waveform record.
    """
    log.info("GET /device/event/%d port=%s host=%s", index, port, host)

    try:
        def _do():
            with _build_client(port, baud, host, tcp_port) as client:
                client.connect()
                return client.get_events()
        events = _run_with_retry(_do, is_tcp=_is_tcp(host))
    except HTTPException:
        raise
    except ProtocolError as exc:
        raise HTTPException(status_code=502, detail=f"Protocol error: {exc}") from exc
    except OSError as exc:
        raise HTTPException(status_code=502, detail=f"Connection error: {exc}") from exc
    except Exception as exc:
        raise HTTPException(status_code=500, detail=f"Device error: {exc}") from exc

    matching = [ev for ev in events if ev.index == index]
    if not matching:
        raise HTTPException(
            status_code=404,
            detail=f"Event index {index} not found on device",
        )

    return _serialise_event(matching[0])


# ── Entry point ────────────────────────────────────────────────────────────────

if __name__ == "__main__":
    import argparse

    ap = argparse.ArgumentParser(description="SFM — Seismograph Field Module API server")
    ap.add_argument("--host", default="0.0.0.0", help="Bind address (default: 0.0.0.0)")
    ap.add_argument("--port", type=int, default=8200, help="Port (default: 8200)")
    ap.add_argument("--reload", action="store_true", help="Enable auto-reload (dev mode)")
    args = ap.parse_args()

    log.info("Starting SFM server on %s:%d", args.host, args.port)
    uvicorn.run(
        "sfm.server:app",
        host=args.host,
        port=args.port,
        reload=args.reload,
    )