docs updated

This commit is contained in:
serversdwn
2025-11-28 18:05:59 -05:00
parent a83405beb1
commit d9281a1816
12 changed files with 557 additions and 477 deletions


@@ -2,11 +2,106 @@
All notable changes to Project Lyra are organized by component.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/)
and adheres to [Semantic Versioning](https://semver.org/).
# Last Updated: 11-26-25
# Last Updated: 11-28-25
---
## 🧠 Lyra-Core ##############################################################################
## [Project Lyra v0.5.0] - 2025-11-28
### 🔧 Fixed - Critical API Wiring & Integration
After the major architectural rewire (v0.4.x), this release fixes all critical endpoint mismatches and ensures end-to-end system connectivity.
#### Cortex → Intake Integration ✅
- **Fixed** `IntakeClient` to use correct Intake v0.2 API endpoints (see the sketch after this list)
- Changed `GET /context/{session_id}` → `GET /summaries?session_id={session_id}`
- Updated JSON response parsing to extract `summary_text` field
- Fixed environment variable name: `INTAKE_API` → `INTAKE_API_URL`
- Corrected default port: `7083` → `7080`
- Added deprecation warning to `summarize_turn()` method (endpoint removed in Intake v0.2)
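For illustration, a minimal sketch of the corrected client call. The endpoint, query parameter, `summary_text` field, and `INTAKE_API_URL` default are taken from the notes above; the method name, the use of `requests`, and the list-shaped response are assumptions.
```python
import os
import requests


class IntakeClient:
    """Sketch of the Intake v0.2 client wiring described above."""

    def __init__(self) -> None:
        # v0.5.0: env var is INTAKE_API_URL (was INTAKE_API), default port 7080 (was 7083)
        self.base_url = os.getenv("INTAKE_API_URL", "http://intake:7080")

    def get_summaries(self, session_id: str) -> list[str]:
        # Was: GET /context/{session_id}  ->  now: GET /summaries?session_id={session_id}
        resp = requests.get(
            f"{self.base_url}/summaries",
            params={"session_id": session_id},
            timeout=10,
        )
        resp.raise_for_status()
        # Assumes a list of summary objects; each carries its text in `summary_text`.
        return [item["summary_text"] for item in resp.json()]
```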
#### Relay → UI Compatibility ✅
- **Added** OpenAI-compatible endpoint `POST /v1/chat/completions` (usage sketch after this list)
- Accepts standard OpenAI format with `messages[]` array
- Returns OpenAI-compatible response structure with `choices[]`
- Extracts last message content from messages array
- Includes usage metadata (stub values for compatibility)
- **Refactored** Relay to use shared `handleChatRequest()` function
- Both `/chat` and `/v1/chat/completions` use same core logic
- Eliminates code duplication
- Consistent error handling across endpoints
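As a usage illustration, a client-side call against the new endpoint. Relay itself is Node; this sketch uses Python purely for illustration, the Relay host/port is a placeholder, and the exact `choices[]` field access assumes the standard OpenAI chat-completion shape.
```python
import requests

# Standard OpenAI-style request body: Relay reads the last entry in messages[].
payload = {
    "model": "lyra",  # accepted for compatibility; model routing is not described above
    "messages": [
        {"role": "system", "content": "You are Lyra."},
        {"role": "user", "content": "Summarize what changed in v0.5.0."},
    ],
}

# Placeholder host/port for the Relay service.
resp = requests.post("http://relay:8080/v1/chat/completions", json=payload, timeout=60)
resp.raise_for_status()
body = resp.json()

# OpenAI-compatible response: reply text in choices[], stubbed usage metadata.
print(body["choices"][0]["message"]["content"])
print(body.get("usage"))
```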
#### Relay → Intake Connection ✅
- **Fixed** Intake URL fallback in Relay server configuration (sketch after this list)
- Corrected port: `7082` → `7080`
- Updated endpoint: `/summary` → `/add_exchange`
- Now properly sends exchanges to Intake for summarization
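Below is a sketch of the corrected hand-off, written in Python for consistency even though Relay is Node. The `/add_exchange` path and port 7080 come from the notes above; the payload field names are assumptions.
```python
import requests

INTAKE_URL = "http://intake:7080"  # corrected fallback (was port 7082)

# Hypothetical payload shape: one user/assistant exchange tied to a session.
exchange = {
    "session_id": "demo-session",
    "user_message": "What changed in v0.5.0?",
    "assistant_message": "The endpoint wiring between Relay, Cortex, and Intake was fixed.",
}

# Was: POST /summary  ->  now: POST /add_exchange (Intake summarizes asynchronously)
requests.post(f"{INTAKE_URL}/add_exchange", json=exchange, timeout=10)
```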
#### Code Quality & Python Package Structure ✅
- **Added** missing `__init__.py` files to all Cortex subdirectories
- `cortex/llm/__init__.py`
- `cortex/reasoning/__init__.py`
- `cortex/persona/__init__.py`
- `cortex/ingest/__init__.py`
- `cortex/utils/__init__.py`
- Improves package imports and IDE support
- **Removed** unused import in `cortex/router.py`: `from unittest import result`
- **Deleted** empty file `cortex/llm/resolve_llm_url.py` (was 0 bytes, never implemented)
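With the `__init__.py` files in place, the Cortex subdirectories resolve as regular packages. The imports below are illustrative; only `persona/speak.py` is named elsewhere in these notes.
```python
# Regular-package imports now work because each subdirectory has __init__.py.
from cortex.persona import speak        # cortex/persona/speak.py (listed in the pipeline below)
from cortex.reasoning import reasoning  # hypothetical module under cortex/reasoning/
```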
### ✅ Verified Working
Complete end-to-end message flow now operational:
```
UI → Relay (/v1/chat/completions)
Relay → Cortex (/reason)
Cortex → Intake (/summaries) [retrieves context]
Cortex 4-stage pipeline:
1. reflection.py → meta-awareness notes
2. reasoning.py → draft answer
3. refine.py → polished answer
4. persona/speak.py → Lyra personality
Cortex → Relay (returns persona response)
Relay → Intake (/add_exchange) [async summary]
Intake → NeoMem (background memory storage)
Relay → UI (final response)
```
### 📝 Documentation
- **Added** this CHANGELOG entry with comprehensive v0.5.0 notes
- **Updated** README.md to reflect v0.5.0 architecture
- Documented new endpoints
- Updated data flow diagrams
- Clarified Intake v0.2 changes
- Corrected service descriptions
### 🐛 Issues Resolved
- ❌ Cortex could not retrieve context from Intake (wrong endpoint)
- ❌ UI could not send messages to Relay (endpoint mismatch)
- ❌ Relay could not send summaries to Intake (wrong port/endpoint)
- ❌ Python package imports were implicit (missing `__init__.py` files)
### ⚠️ Known Issues (Non-Critical)
- Session management endpoints not implemented in Relay (`GET/POST /sessions/:id`)
- RAG service currently disabled in docker-compose.yml
- Cortex `/ingest` endpoint is a stub returning `{"status": "ok"}`
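For reference, the `/ingest` stub amounts to no more than the following. This is a sketch that assumes Cortex exposes its routes via a FastAPI router, which these notes do not confirm.
```python
from fastapi import APIRouter

router = APIRouter()

@router.post("/ingest")
async def ingest() -> dict:
    # Stub: acknowledges the call without ingesting anything.
    return {"status": "ok"}
```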
### 🎯 Migration Notes
If upgrading from v0.4.x:
1. Pull latest changes from git
2. Verify environment variables in `.env` files:
- Check `INTAKE_API_URL=http://intake:7080` (not `INTAKE_API`)
- Verify all service URLs use correct ports
3. Restart Docker containers: `docker-compose down && docker-compose up -d`
4. Test with a simple message through the UI
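After restarting, a quick smoke test outside the UI can confirm the corrected wiring. This is a sketch: the Relay host/port is a placeholder and the Intake response handling is an assumption.
```python
import requests

# 1. Intake answers on its corrected port with the v0.2 summaries endpoint.
r = requests.get("http://intake:7080/summaries", params={"session_id": "smoke-test"}, timeout=10)
print("Intake:", r.status_code)

# 2. A message through Relay exercises the full Relay -> Cortex -> Intake chain.
r = requests.post(
    "http://relay:8080/v1/chat/completions",  # placeholder host/port for Relay
    json={"messages": [{"role": "user", "content": "ping"}]},
    timeout=60,
)
print("Relay:", r.status_code, r.json().get("choices", [{}])[0])
```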
---
## [Infrastructure v1.0.0] - 2025-11-26
### Changed