Frontmatter
| id | 9551 |
| title | Examples: Implement unified Data Pipeline showcases |
| state | Closed |
| labels | enhancement, ai, core |
| assignees | tobiu |
| createdAt | Mar 25, 2026, 5:37 PM |
| updatedAt | Mar 25, 2026, 8:27 PM |
| githubUrl | https://github.com/neomjs/neo/issues/9551 |
| author | tobiu |
| commentsCount | 1 |
| parentIssue | 9449 |
| subIssues | [] |
| subIssuesCompleted | 0 |
| subIssuesTotal | 0 |
| blockedBy | [] |
| blocking | [] |
| closedAt | Mar 25, 2026, 8:27 PM |
Examples: Implement unified Data Pipeline showcases
tobiu assigned to @tobiu on Mar 25, 2026, 5:37 PM
tobiu added parent issue #9449 on Mar 25, 2026, 5:39 PM

tobiu
Mar 25, 2026, 8:27 PM
Input from Gemini 3.1 Pro:
✦ Completed the implementation of the Unified Data Pipeline showcases and framework enhancements.
Summary of Work:
- Framework: Folder-Scoped Dynamic Loading
  - Enhanced `Neo.worker.Data` with `loadDataModule({className})`. This method uses strictly scoped dynamic imports (e.g., `../data/connection/${name}.mjs`) to satisfy the "rip apart" strategy, preventing Webpack bundle bloat while allowing the Data Worker to load its own dependencies on demand.
- Core: Pipeline IPC Orchestration
  - Updated `Neo.data.Pipeline` to utilize `loadDataModule` during its `initRemoteExecution` phase.
- Core: Instance Lookup Stabilization
  - Imported `Neo.manager.Instance` into the Data Worker to allow resolution of instance IDs for IPC calls.
  - Refactored `RemoteMethodAccess` to use the `Neo.get(id)` alias for better resilience.
- Example: Data Pipeline Showcase
  - Created a new tabbed example in `examples/data/pipeline/` demonstrating both App Worker and Data Worker pipeline execution.
  - Registered the example in `docs/examples.json`.

Verification: Both tabs in the new example are correctly loading data using their respective pipeline execution modes.
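To illustrate the folder-scoped loading idea described above, here is a minimal sketch. It is not the actual neo.mjs implementation; the function names and the class-name-to-path mapping are assumptions. The point is that a template-literal dynamic import whose variable part is confined to one folder lets a bundler like Webpack scope the generated chunks to that folder instead of pulling in arbitrary modules.

```javascript
// Hypothetical sketch of folder-scoped dynamic loading (NOT the real neo.mjs code).
// Maps a namespaced class name onto a path inside a known folder,
// e.g. 'Neo.data.connection.Fetch' -> './data/connection/Fetch.mjs'
function scopedModulePath(className) {
    const parts = className.split('.');

    if (parts[0] !== 'Neo') {
        throw new Error(`Unsupported namespace: ${className}`);
    }

    return './' + parts.slice(1).join('/') + '.mjs';
}

// A loader would then use a scoped, template-literal dynamic import,
// similar in spirit to: await import(`../data/connection/${name}.mjs`)
async function loadDataModuleSketch(className) {
    return import(scopedModulePath(className));
}
```

Because the variable part of the import expression can only resolve inside the data folder, the bundler does not need to include unrelated framework modules in the Data Worker chunk.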
tobiu closed this issue on Mar 25, 2026, 8:27 PM
Goal
Create new working examples in the `examples/data/` folder to demonstrate the unified Data Pipeline architecture.

Description

Since the data architecture has been overhauled to use a `Pipeline -> Connection -> Parser -> Normalizer` flow, we need to provide concrete, working examples for developers.

These examples should showcase:
- `Fetch` and the new `Rpc` connection.
- `Parser` and `Normalizer` configurations.

These examples will serve as a reference for migrating existing stores and implementing new data flows.