Frontmatter
id: 7964
title: Create ContextAssembler for memory integration
state: Closed
labels: enhancement, ai
assignees: tobiu
createdAt: Dec 1, 2025, 12:32 PM
updatedAt: Dec 1, 2025, 1:12 PM
githubUrl: https://github.com/neomjs/neo/issues/7964
author: tobiu
commentsCount: 1
parentIssue: 7961
subIssues: []
subIssuesCompleted: 0
subIssuesTotal: 0
blockedBy: []
blocking: []
closedAt: Dec 1, 2025, 1:12 PM

Create ContextAssembler for memory integration

Closed · v11.14.0 · enhancement, ai

tobiu commented on Dec 1, 2025, 12:32 PM

Goal: Create the mechanism that assembles the full context for the LLM.

Scope:

  • Implement Neo.ai.context.Assembler.
  • Logic to combine:
    • System Prompt (Identity & Rules).
    • Short-Term Memory (Current Session History via memory-core).
    • Long-Term Memory (RAG results via memory-core or knowledge-base).
    • Current Event (The trigger).
  • Implement token-count estimation to ensure the payload fits the model's context window.

Context: Part of Epic #7961.
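The scope above could be sketched roughly as follows. Everything in this snippet is illustrative, not the actual Neo.ai.context.Assembler implementation: the function names, the ~4-characters-per-token heuristic, and the oldest-first eviction of session history are all assumptions.

```javascript
// Rough heuristic: ~4 characters per token for English text.
// Real implementations would use a proper tokenizer for the target model.
const CHARS_PER_TOKEN = 4;

function estimateTokens(text) {
    return Math.ceil(text.length / CHARS_PER_TOKEN);
}

// Hypothetical assembly logic: combine the four context sources in order,
// evicting the oldest short-term memory entries until the estimate fits.
function assembleContext({systemPrompt, sessionHistory = [], ragResults = [], event, maxTokens}) {
    const history = [...sessionHistory],
          sum     = parts => parts.reduce((n, p) => n + estimateTokens(p), 0),
          fixed   = sum([systemPrompt, ...ragResults, event]); // never evicted

    // Short-term memory is the flexible part of the budget
    while (history.length > 0 && fixed + sum(history) > maxTokens) {
        history.shift(); // drop the oldest session entry first
    }

    // System Prompt → Short-Term Memory → Long-Term Memory (RAG) → Current Event
    return [systemPrompt, ...history, ...ragResults, event].join('\n\n');
}
```

A design note on the eviction order: the system prompt (identity & rules) and the triggering event are treated as non-negotiable, while conversation history degrades gracefully from the oldest turn forward.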
tobiu added the enhancement label on Dec 1, 2025, 12:32 PM
tobiu added the ai label on Dec 1, 2025, 12:32 PM
tobiu added parent issue #7961 on Dec 1, 2025, 12:32 PM
tobiu assigned to @tobiu on Dec 1, 2025, 12:57 PM
tobiu referenced in commit e9d2d4c - "Create ContextAssembler for memory integration #7964" on Dec 1, 2025, 12:58 PM
tobiu commented on Dec 1, 2025, 1:12 PM

Input from Gemini 2.5:

✦ Implemented Neo.ai.context.Assembler using the "Thick Client" pattern.

  • Refactored Assembler to use ai/services.mjs directly.
  • Updated KB_QueryService to self-initialize via initAsync.
  • Updated ai/services.mjs to export Memory_SummaryService.
  • Verified end-to-end RAG injection from both Memory Core and Knowledge Base.
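The "self-initialize via initAsync" change mentioned above might look roughly like the memoized-promise pattern below. This is a hypothetical sketch, not the actual KB_QueryService code; the class body, field names, and placeholder store are all invented for illustration.

```javascript
// Hypothetical sketch of a service that initializes itself on first use,
// so callers never need to coordinate setup explicitly.
class KBQueryService {
    #ready = null;

    // Repeated callers share the same in-flight promise, so the async
    // setup work runs exactly once no matter how many callers race.
    initAsync() {
        this.#ready ??= this.#connect();
        return this.#ready;
    }

    async #connect() {
        // Placeholder for real setup, e.g. opening the vector store
        this.store = {query: q => [`result for: ${q}`]};
    }

    async query(q) {
        await this.initAsync(); // self-initialize lazily
        return this.store.query(q);
    }
}
```

The appeal of this pattern for a "Thick Client" setup is that the Assembler can call `query()` directly without knowing whether the knowledge base has been opened yet.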
tobiu closed this issue on Dec 1, 2025, 1:12 PM