Frontmatter
| id | 7964 |
| title | Create ContextAssembler for memory integration |
| state | Closed |
| labels | enhancement, ai |
| assignees | tobiu |
| createdAt | Dec 1, 2025, 12:32 PM |
| updatedAt | Dec 1, 2025, 1:12 PM |
| githubUrl | https://github.com/neomjs/neo/issues/7964 |
| author | tobiu |
| commentsCount | 1 |
| parentIssue | 7961 |
| subIssues | [] |
| subIssuesCompleted | 0 |
| subIssuesTotal | 0 |
| blockedBy | [] |
| blocking | [] |
| closedAt | Dec 1, 2025, 1:12 PM |
Create ContextAssembler for memory integration
tobiu added parent issue #7961 on Dec 1, 2025, 12:32 PM
tobiu assigned to @tobiu on Dec 1, 2025, 12:57 PM

tobiu
Dec 1, 2025, 1:12 PM
Input from Gemini 2.5:

✦ Implemented `Neo.ai.context.Assembler` using the "Thick Client" pattern.
- Refactored `Assembler` to use `ai/services.mjs` directly.
- Updated `KB_QueryService` to self-initialize via `initAsync`.
- Updated `ai/services.mjs` to export `Memory_SummaryService`.
- Verified end-to-end RAG injection from both Memory Core and Knowledge Base.
tobiu closed this issue on Dec 1, 2025, 1:12 PM
Goal: Create the mechanism to assemble the full context for the LLM.

Scope:
- `Neo.ai.context.Assembler`
- (`memory-core`)
- (`memory-core` or `knowledge-base`)