Goal
Investigate the architectural feasibility of moving data validation and sanitization into the Data Worker Pipeline.
Description
Currently, Neo.data.RecordFactory handles validation and type conversion during Record instantiation in the App Worker. While the Store's "Turbo Mode" (Lazy Hydration) defers this cost, performing these operations in the App Worker still consumes time on its execution thread when the data is eventually accessed.
Exploration Points:
- Thread-Agnostic Validation: Can we extract validation logic (maxLength, nullable, etc.) into a shared utility that the Data Worker can run on raw parsed objects?
- Pre-Sanitization: What are the performance trade-offs of "cleaning" raw data in the Data Worker before it crosses the worker boundary?
- Turbo Mode Impact: Does pre-sanitizing data in the Data Worker simplify the "Soft Hydration" logic (resolveField) in Neo.data.Store, or render it redundant?
- Error Reporting: How do we efficiently report validation failures for bulk datasets (e.g., 10k rows) back to the App Worker without bloating the IPC payload?
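As a starting point for the "Thread-Agnostic Validation" question, the rule logic could live in a dependency-free module that both workers can import and run against raw parsed objects, before any Record exists. A minimal sketch (the utility name validateRaw, the rule map, and the field-config shape are all assumptions, not existing neo.mjs API):

```javascript
// Hypothetical shared utility: pure functions with no class framework
// dependency, so the Data Worker and the App Worker can both import it.
const validators = {
    maxLength: (value, limit) =>
        value == null || String(value).length <= limit || `exceeds maxLength ${limit}`,
    nullable: (value, allowed) =>
        allowed || value != null || 'null not allowed'
};

/**
 * Validates one raw parsed object against an array of field configs,
 * e.g. [{name: 'firstname', maxLength: 50, nullable: false}].
 * Returns an array of {field, rule, message} entries; empty array = valid.
 */
function validateRaw(raw, fields) {
    const errors = [];

    fields.forEach(field => {
        Object.entries(field).forEach(([rule, param]) => {
            const check = validators[rule];

            if (check) {
                const result = check(raw[field.name], param);

                if (result !== true) {
                    errors.push({field: field.name, rule, message: result});
                }
            }
        });
    });

    return errors;
}
```

Because the utility only touches plain objects, the Data Worker could run it over a parsed response in bulk, while the App Worker could reuse the exact same rules for single-record edits.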
This is a research-first ticket to determine if a "Pre-Hydration" phase in the Data Worker is a viable architectural evolution.
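For the error-reporting point above, one direction worth benchmarking is a columnar encoding: instead of structured-cloning thousands of nested error objects across the worker boundary, send a small interned table of unique field/code pairs plus flat index arrays. A sketch under that assumption (encodeErrors and decodeErrors are illustrative names, not existing API):

```javascript
// Hypothetical compact encoding for bulk validation failures.
// For 10k failing rows this ships one small string table plus two
// typed arrays, whose buffers could even be transferred rather
// than structured-cloned.
function encodeErrors(errors) {
    const table      = [];        // unique 'field:code' strings
    const tableIndex = new Map(); // 'field:code' -> position in table
    const rows       = [];        // row index per failure
    const codes      = [];        // table position per failure

    errors.forEach(({row, field, code}) => {
        const key = `${field}:${code}`;

        if (!tableIndex.has(key)) {
            tableIndex.set(key, table.length);
            table.push(key);
        }

        rows.push(row);
        codes.push(tableIndex.get(key));
    });

    return {table, rows: Uint32Array.from(rows), codes: Uint16Array.from(codes)};
}

// App Worker side: expand back into per-row error objects on demand.
function decodeErrors({table, rows, codes}) {
    return Array.from(rows, (row, i) => {
        const [field, code] = table[codes[i]].split(':');
        return {row, field, code};
    });
}
```

The research phase would need to compare this against plain structured clone for realistic dataset sizes; the win, if any, comes from deduplicating the repeated field/code strings and from transferable buffers.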