v12.0.0 — MaterializedView architecture + GraphTraversal engine#48
flyingrobots wants to merge 69 commits into `main`.
Conversation
Introduces the NeighborProviderPort abstraction and a single GraphTraversal engine that subsumes LogicalTraversal (in-memory) and DagTraversal/DagPathFinding/DagTopology (bitmap commit DAG). One traversal stack, one set of determinism invariants, one set of bugs to fix.

New files:
- NeighborProviderPort: abstract interface (getNeighbors, hasNode, latencyClass)
- AdjacencyNeighborProvider: in-memory, pre-sorted, sync latency
- GraphTraversal: 11 algorithms (BFS, DFS, shortestPath, isReachable, Dijkstra, A*, bidirectional A*, topoSort, connectedComponent, commonAncestors, weightedLongestPath)
- BitmapNeighborProvider: commit DAG stub wrapping BitmapIndexReader, label='' sentinel, async-local latency
- MinHeap: optional tieBreaker comparator for deterministic PQ ordering

LogicalTraversal deprecated — delegates to GraphTraversal internally, public API unchanged. 147 new tests, 3967 total passing.
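The port surface described above can be sketched as follows. This is an illustrative reduction, not the actual source: the class name, constructor shape, and sync return convention are assumptions — only the three method names (getNeighbors, hasNode, latencyClass) come from the PR description.

```javascript
// Illustrative sketch of the NeighborProviderPort contract. Method
// signatures are assumptions based on this PR's description, not the
// real source (the actual provider may be async).
class AdjacencyProviderSketch {
  /**
   * @param {Map<string, Array<{node: string, label: string}>>} out - pre-sorted out-edges
   * @param {Map<string, Array<{node: string, label: string}>>} inc - pre-sorted in-edges
   */
  constructor(out, inc) {
    this._out = out;
    this._in = inc;
  }

  /** In-memory providers report a cheap latency class. */
  latencyClass() {
    return 'sync';
  }

  hasNode(id) {
    return this._out.has(id) || this._in.has(id);
  }

  /** Neighbors for one direction, optionally filtered by edge label. */
  getNeighbors(id, direction, label) {
    const edges = (direction === 'in' ? this._in : this._out).get(id) || [];
    return label === undefined ? edges : edges.filter((e) => e.label === label);
  }
}

const out = new Map([['a', [{ node: 'b', label: 'dep' }, { node: 'c', label: 'uses' }]]]);
const p = new AdjacencyProviderSketch(out, new Map());
console.log(p.getNeighbors('a', 'out', 'dep')); // [ { node: 'b', label: 'dep' } ]
```

Because GraphTraversal only sees this interface, the same algorithm code runs against the in-memory adjacency maps and the bitmap-backed commit DAG provider.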
Builds the test infrastructure for the MaterializedView overhaul:
1. Fixture DSL (test/helpers/fixtureDsl.js):
- makeFixture() validates edge endpoints, normalizes labels
- toAdjacencyMaps() and makeAdjacencyProvider() projection builders
- 12 canonical fixtures (F1-F12) with documented failure modes
- makeWeightFn() for weighted algorithm tests
- Seeded PRNG helper (test/helpers/seededRng.js)
2. Contract tests (test/contract/NeighborProviderPort.contract.test.js):
- 28 tests run against both AdjacencyNeighborProvider and
mock BitmapNeighborProvider
- Enforces sorting, direction, label filtering, dedup,
unknown node behavior, and proto-pollution safety
3. Per-algorithm fixture tests:
- F1 BFS_LEVEL_SORT_TRAP: catches fake BFS determinism
- F4 DIJKSTRA_EQUAL_COST_PREDECESSOR: proves tie-break rule
- F5 ASTAR_TIE_BREAK: no EPSILON hacks
- F8 TOPO_CYCLE_3: cycle detection with witness
- F9 UNICODE_CODEPOINT_ORDER: no localeCompare
- F10 PROTO_POLLUTION: __proto__/constructor/toString safe
4. Security test (test/unit/security/protoPollution.test.js):
- Verifies Object.prototype unchanged after exercising
proto-like node IDs through all code paths
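The seeded PRNG helper mentioned above is later identified (M21) as Mulberry32. A standard Mulberry32 implementation looks like this — a sketch; the real helper's exported API may differ:

```javascript
// Mulberry32: a tiny deterministic PRNG, suitable for reproducible
// fuzz/equivalence tests. Standard public-domain algorithm; not copied
// from the project's seededRng helper.
function mulberry32(seed) {
  let a = seed >>> 0;
  return function () {
    a = (a + 0x6d2b79f5) | 0;
    let t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296; // float in [0, 1)
  };
}

// Same seed => same sequence, which is what makes fuzz failures replayable.
const rng = mulberry32(42);
const first = rng();
```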
…perty indexes

CBOR-based bitmap index over the logical graph: stable numeric IDs (shardByte<<24 | localId), append-only label registry, per-label Roaring bitmaps for O(1) filtered neighbor lookups, property index shards with LRU caching, and proto-safe serialization throughout.

New modules: fnv1a, computeShardKey, canonicalCbor, ShardIdOverflowError, LogicalBitmapIndexBuilder, PropertyIndexBuilder/Reader, LogicalIndexBuildService. BitmapNeighborProvider gains dual-mode: commit DAG (indexReader) + logical graph (logicalIndex). Third provider added to contract test suite (44 assertions). 18 cross-provider equivalence tests confirm identical traversal results across Adjacency and LogicalBitmap providers for all 9 algorithms.
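The stable-ID packing named above — (shardByte<<24 | localId) — can be shown standalone. Function names here are hypothetical; the 2^24 bound is the overflow condition that ShardIdOverflowError guards in the PR:

```javascript
// Pack a shard byte and a 24-bit local ID into one stable numeric global ID.
// Illustrative helpers, not the project's actual builder code.
const MAX_LOCAL_ID = 1 << 24; // 16,777,216 local IDs per shard

function packGlobalId(shardByte, localId) {
  if (localId >= MAX_LOCAL_ID) {
    throw new RangeError(`localId ${localId} exceeds 24-bit shard capacity`);
  }
  // >>> 0 keeps the result an unsigned 32-bit integer.
  return ((shardByte << 24) | localId) >>> 0;
}

function unpackGlobalId(globalId) {
  return { shardByte: globalId >>> 24, localId: globalId & 0xffffff };
}

console.log(unpackGlobalId(packGlobalId(0xab, 12345))); // { shardByte: 171, localId: 12345 }
```

Keeping IDs numeric and shard-prefixed is what lets per-label Roaring bitmaps store neighbor sets compactly and answer filtered lookups in O(1).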
Source files (7 errors):
- shardKey.js: move JSDoc adjacent to function declaration
- LogicalBitmapIndexBuilder: correct _globalToNode type to Map<string, string>
- PropertyIndexBuilder: cast Object.create(null) to Record<string, unknown>
- PropertyIndexReader: cast runtime-composed IndexStoragePort for readBlob
- LogicalIndexBuildService: cast codec.decode() return to Record

Test files (139 errors):
- Add @param JSDoc to untyped function parameters
- Cast codec.decode() returns via /** @type {Record<string, *>} */
- Cast catch variables via /** @type {*} */
- Widen ({}).polluted accesses via Record<string, unknown> cast
- Type let declarations and {} objects as Record<string, ...>
- Remove unused hooks prop from aStarSearch call
Move _createLogicalIndexFromTree logic into src/domain/services/LogicalIndexReader.js so production code can hydrate bitmap indexes from serialized trees or OID-based storage. fixtureDsl.js now delegates to LogicalIndexReader. Phase 3 Commit 1/6.
…erialized views

Composes LogicalIndexBuildService, LogicalIndexReader, and PropertyIndexReader into a single service with three entry points: build(state), persistIndexTree(), and loadFromOids(). In-memory PropertyReader uses path-as-OID trick for zero-copy shard access during build().
Schema 4 checkpoints embed the materialized view index as a Git subtree under `index/`. createV5() accepts optional indexTree param; loadCheckpoint() partitions `index/`-prefixed entries into indexShardOids for lazy hydration. Backward-compatible: schema 2/3 checkpoints return null indexShardOids.
_setMaterializedState now builds a LogicalIndex + PropertyReader and attaches a BitmapNeighborProvider to _materializedGraph. Index build is cached by stateHash and wrapped in try-catch for resilience. Checkpoints now embed the index tree (schema 4). Integration test validates the full write → materialize → index-available path.
Both methods now check the LogicalIndex/PropertyReader/BitmapNeighborProvider for O(1) lookups before falling through to the linear scan. Fast path is only used when the index confirms the node is alive, ensuring correctness when the cached state has been mutated outside the materialization flow.
…vider

CasSeekCacheAdapter now stores/returns an optional indexTreeOid alongside the state buffer. _materializeWithCeiling persists the bitmap index to Git and records the tree OID in cache metadata; on cache hit, it restores the LogicalIndex + PropertyReader from the stored tree. ObserverView uses the parent graph's BitmapNeighborProvider for O(1) adjacency lookups with post-filter glob matching on visible nodes.
Add applyWithDiff() alongside applyFast/applyWithReceipt — captures
alive-ness transitions (not just op application) during patch reduce.
Extend reduceV5() with { trackDiff: true } option returning merged
PatchDiff across all applied patches.
PatchDiff shape: nodesAdded, nodesRemoved, edgesAdded, edgesRemoved,
propsChanged — only records state transitions, not redundant ops.
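A minimal sketch of the transitions-only rule, using the PatchDiff field names above; the reducer logic is illustrative, not the real applyWithDiff:

```javascript
// Diff two alive-sets and record only state transitions. Re-adding an
// already-alive node or removing an already-dead one contributes nothing.
// Illustrative sketch, not the project's reducer.
function collectNodeDiff(aliveBefore, aliveAfter) {
  const diff = { nodesAdded: [], nodesRemoved: [] };
  for (const id of aliveAfter) {
    if (!aliveBefore.has(id)) diff.nodesAdded.push(id); // dead -> alive
  }
  for (const id of aliveBefore) {
    if (!aliveAfter.has(id)) diff.nodesRemoved.push(id); // alive -> dead
  }
  return diff;
}

// 'c' is a real transition; re-adding 'a' changed nothing, so no entry.
const before = new Set(['a', 'b']);
const after = new Set(['a', 'b', 'c']);
console.log(collectNodeDiff(before, after)); // { nodesAdded: [ 'c' ], nodesRemoved: [] }
```

Recording transitions rather than raw ops is what keeps the downstream incremental index update proportional to real state change, not to patch volume.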
TemporalQuery constructor now accepts optional loadCheckpoint callback. When since > 0 and a checkpoint exists with maxLamport <= since, replay starts from the checkpoint state instead of genesis — O(P_since_checkpoint) instead of O(P_total). Graceful fallback when loadCheckpoint is absent or returns null. WarpGraph.temporal getter wires loadCheckpoint via _loadLatestCheckpoint.
Stateless service computes only dirty shard buffers from a PatchDiff:
- NodeAdd: allocate/reactivate globalId + alive bitmap toggle
- NodeRemove: clear alive bit (stable globalId preserved)
- EdgeAdd: fwd/rev bitmap updates + label registry extension
- EdgeRemove: per-label removal + 'all' bitmap recomputation via OR
- PropSet: props shard update

Wire applyDiff() into MaterializedViewService as the incremental counterpart to build(). Add IncrementalIndexUpdater to ESLint relaxed-complexity block.
When _cachedIndexTree exists (prior successful build), reduceV5 uses trackDiff to capture PatchDiff, which flows through _setMaterializedState → _buildView → MaterializedViewService.applyDiff for O(diff) index updates instead of O(state_size) full rebuilds. Falls back to full build() when no prior tree or on any error. Add _cachedIndexTree field to WarpGraph for tree caching.
…ndex

IncrementalIndexUpdater now correctly filters edges by endpoint alive-ness, matching the full rebuild's edgeVisibleV5 semantics. Adds _purgeNodeEdges for dangling edge elimination on node removal and edge restoration for re-added nodes. MaterializedViewService gains verifyIndex() — sampled cross-provider verification comparing bitmap index neighbors against adjacency ground truth with deterministic seeded PRNG. Includes 42 seeded fuzz equivalence tests (full rebuild vs incremental) and 5 verifyIndex unit tests.
- verify-index: samples alive nodes and cross-checks bitmap index neighbors against adjacency ground truth (--seed, --sample-rate)
- reindex: forces full bitmap index rebuild by clearing cached state
- Registered in COMMANDS map, KNOWN_COMMANDS, help text, and Zod schemas
- Updated API surface snapshot for _maxLamportFromState
…H1, H2)

H1: _handleNodeAdd now throws ShardIdOverflowError when nextLocalId >= 2^24, matching the full-build path and preventing silent globalId collisions.
H2: _handleEdgeRemove now returns early when the edge label is not in the label registry, preventing operations on an "undefined" bucket key.
_biAStarExpand now calls weightFn(neighborId, current, label) when directionForNeighbors === 'in', correctly reflecting the actual edge direction. Asymmetric weight functions now produce correct shortest paths.
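To see why the argument flip matters with an asymmetric weight function — a sketch; only the (from, to, label) argument order comes from this commit's description:

```javascript
// An asymmetric weightFn: cost depends on which endpoint is the source.
// Illustrative helpers, not the project's _biAStarExpand.
const weightFn = (from, to, _label) => (from === 'a' ? 1 : 10);

// Forward frontier follows out-edges current -> neighbor:
function forwardCost(current, neighborId, label) {
  return weightFn(current, neighborId, label);
}

// Backward frontier follows in-edges: the stored edge is neighbor -> current,
// so the call must be flipped to reflect the edge's actual direction.
function backwardCost(current, neighborId, label) {
  return weightFn(neighborId, current, label);
}

// Edge a -> b costs 1 from either frontier; the unflipped call
// weightFn('b', 'a', ...) would wrongly charge 10.
console.log(forwardCost('a', 'b', 'x'));  // 1
console.log(backwardCost('b', 'a', 'x')); // 1
```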
…lements (H4)

snapshotBeforeOp now captures which elements are alive before NodeRemove/EdgeRemove ops. collectNodeRemovals and collectEdgeRemovals only record transitions from alive → dead, preventing redundant removes from producing spurious diff entries that caused wasted incremental index work.
materialize() now accepts schema:4 checkpoints (previously only 2/3). When indexShardOids are present, the bitmap index is hydrated from stored OIDs via _viewService.loadFromOids, avoiding a full rebuild.
…int cache (H6, H7, H8)

H6: Added verifyIndex() and invalidateIndex() public methods to WarpGraph. CLI verify-index and reindex commands now use public API instead of accessing private underscore-prefixed properties. Both include proper try/catch.
H7: _buildView catch block now calls this._logger?.warn() with error details before nulling index state, so operators know when the index is degraded.
H8: createCheckpoint() reuses _cachedIndexTree from the preceding materialize() instead of unconditionally calling _viewService.build(), avoiding O(N) rebuild.
…10, M19, M20)

M1: CheckpointService tree entry sorting uses codepoint comparison instead of localeCompare, matching Git's byte-order requirement.
M4/M5: Already resolved in H6 commit (CLI try/catch).
M10: BitmapNeighborProvider constructor throws when neither indexReader nor logicalIndex is provided.
M19/M20: fixtureDsl NodeRemove/EdgeRemove ops now include all required fields (node, from, to, label) matching accumulateOpDiff contract.
M2: Document O(E) worst-case in IncrementalIndexUpdater re-add scan
M3: mergeDiffs now cancels contradictory add/remove pairs (net diff)
M6: Document maxLamport closed-tick invariant on TemporalQuery
M7: topologicalSort two-pointer queue (O(n) vs O(n^2) shift)
M8: _insertSorted O(n+k) merge (vs O(k*n) splice)
M9: sortAdjacencyMap clones arrays instead of in-place mutation
M11: Document ObserverView cast-safety contracts for LogicalTraversal/QueryBuilder
M12: BFS nextLevel dedup via queued Set (avoids O(E) duplicates)
M14: JSDoc @note on reduceV5 diff-relative-to-initialState invariant
M17: Add missing @param indexTree to CheckpointService.create() JSDoc
M21: Deduplicate Mulberry32 PRNG — equivalence test imports seededRng helper
M22: Exercise propertyReader with actual getNodeProps assertions
L4: Bound verifyIndex seed to 32-bit int range in Zod schema
L5: Reject --sample-rate 0 with .gt(0) validation
L6: Rename verify-index total → totalChecks for clarity
L15: MinHeap this.heap → this._heap (encapsulation)
N2: Fix --sample-rate help placeholder style
N3: Fix checkpoint.methods.js step numbering (4,5,5,6 → 4,5,6,7)
N4: Add comment explaining empty reindexSchema
MEDIUM:
M13: JoinReducer snapshotBeforeOp uses targeted dot→element reverse index
instead of O(N) full entries scan; normalize observedDots to Set
M15: weightedLongestPath reuses adjList from topologicalSort via _returnAdjList
M16: Document commonAncestors self-inclusion behavior in JSDoc
M18: Add _validateStart() to bfs/dfs/shortestPath/A*/biA*/longestPath
LOW:
L1: IncrementalIndexUpdater O(1) lookups via globalToNode/nodeToGlobalMap
L2: _loadLabels prototype pollution guard via Object.create(null)
L3: Document stale receipt in MaterializedViewService.applyDiff
L7: ObserverView matchGlob regex cache (module-level Map)
L8: Document CasSeekCacheAdapter double-read trade-off
L9: Remove redundant await Promise.resolve() wrappers, drop async keyword
L10: Document error handling convention (throw vs return) in class JSDoc
L11: Document bidirectionalAStar fixed out/in direction
L12: Document LogicalTraversal private _materializeGraph access
L13: Document fnv1a NFC normalization note
L14: Tighten shardKey regex to exact SHA-1 (40) or SHA-256 (64)
L16: Fix IncrementalIndexUpdater.test edge remove state/diff mismatch
L17: Fix IncrementalIndexUpdater.test reactivation stale state
L18: Fix equivalence.test observedDots Array → Set
NIT:
N1: Remove identity .map() in _saveProps
N5: Remove redundant dedup Set in commonAncestors
N6: Document canonicalCbor convention (cbor-x, RFC 7049 §3.9)
N7: Verify ShardIdOverflowError options.code forwarding
N8: Replace tautological assertion in verifyIndex.test
N9: Add curly braces in GraphTraversal.test helper
N10: Rename misleading "isolated node" test
N11: Add consistent type annotation in CheckpointService.test
N12: Replace silent early return with assertion in equivalence.test
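The two-pointer queue from M7 above can be sketched with Kahn's algorithm. This is an illustrative standalone version, not the project's topologicalSort:

```javascript
// Kahn's topological sort with a two-pointer queue: dequeue by advancing a
// head index instead of Array.shift(), so the whole drain is O(n + e)
// rather than O(n^2). Illustrative sketch only.
function topoSort(nodes, edges) {
  const indeg = new Map(nodes.map((n) => [n, 0]));
  const adj = new Map(nodes.map((n) => [n, []]));
  for (const [from, to] of edges) {
    adj.get(from).push(to);
    indeg.set(to, indeg.get(to) + 1);
  }
  const queue = nodes.filter((n) => indeg.get(n) === 0);
  let head = 0; // two-pointer: head chases queue.length
  const order = [];
  while (head < queue.length) {
    const n = queue[head++]; // O(1) dequeue; no shift()
    order.push(n);
    for (const m of adj.get(n)) {
      indeg.set(m, indeg.get(m) - 1);
      if (indeg.get(m) === 0) queue.push(m);
    }
  }
  return order.length === nodes.length ? order : null; // null on cycle
}

console.log(topoSort(['a', 'b', 'c'], [['a', 'b'], ['b', 'c']])); // [ 'a', 'b', 'c' ]
```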
…thods

Eliminate implicit `any` types and tighten JSDoc for strict TypeScript checking: add SnapshotBeforeOp typedef, type op params as OpV2, wrap sync returns in Promise.resolve for AdjacencyNeighborProvider, annotate codec decode results, add missing NeighborEdge[] types, fix dangling JSDoc block ordering in query.methods, and add null guards for optional _materializedGraph and _seekCache accesses.
- Fix WarpTypesV2 observedDots type from Dot[] to string[] (matches runtime: encoded dot strings, not Dot objects)
- Remove stale Dot[] double-casts in PatchBuilderV2
- Add JSDoc type annotations to all test helper functions
- Add null/undefined guards for possibly-null map lookups
- Type shardOids as Record<string, string> across index tests
- Cast reduceV5 return to WarpStateV5 when called without options
- Cast unknown err in catch blocks, mock contexts in buildView test
- Convert Set<string> to [...spread] for observedDots in equivalence test
…acade

isReachable, weightedShortestPath, aStarSearch, bidirectionalAStar, topologicalSort, commonAncestors, and weightedLongestPath are now accessible via the public graph.traverse.* API. Previously these required constructing GraphTraversal + NeighborProvider directly. Also fixes AdjacencyNeighborProvider isolated-node visibility by passing aliveNodes from the materialized OR-Set.
…1-L3, N1)

Remove phantom maxDepth from weightedShortestPath/aStarSearch, fix commonAncestors error message, forward signal on 4 methods, bypass assertDirection in bidirectionalAStar, add 6 negative/edge-case tests, fix CHANGELOG formatting, correct stale test comment, and use error code assertion instead of message match.
Support node-weighted graphs (e.g. critical-path scheduling) via a nodeWeightFn(nodeId) => number parameter on weightedShortestPath, aStarSearch, bidirectionalAStar, and weightedLongestPath. Weight is the cost to enter the destination node; start node is free. Internally memoized (O(V_visited) calls). Mutually exclusive with weightFn — providing both throws E_WEIGHT_FN_CONFLICT.
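A sketch of the node-weight semantics: cost is charged on entering the destination node, the start node is free, and lookups are memoized. Helper names here are hypothetical; only the nodeWeightFn(nodeId) => number contract comes from this commit:

```javascript
// Memoize nodeWeightFn so each visited node is priced at most once,
// matching the O(V_visited) call bound described above. Illustrative
// helpers, not the project's traversal internals.
function makeMemoizedNodeWeight(nodeWeightFn) {
  const cache = new Map();
  return (nodeId) => {
    if (!cache.has(nodeId)) cache.set(nodeId, nodeWeightFn(nodeId));
    return cache.get(nodeId);
  };
}

// Under node weights, stepping u -> v costs weight(v) only; the start
// node contributes nothing.
function pathCost(path, nodeWeightFn) {
  const w = makeMemoizedNodeWeight(nodeWeightFn);
  let cost = 0;
  for (let i = 1; i < path.length; i++) cost += w(path[i]); // start is free
  return cost;
}

console.log(pathCost(['a', 'b', 'c'], (id) => id.length)); // 2
```

Making nodeWeightFn and weightFn mutually exclusive (E_WEIGHT_FN_CONFLICT) avoids ambiguity about whether a step's cost belongs to the edge or the node.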
@coderabbitai review please 🥕
✅ Actions performed: Review triggered.
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/domain/utils/roaring.js (1)
108-121: ⚠️ Potential issue | 🟡 Minor

`initError` is not cleared after a successful `initRoaring()` without a `mod` argument.

If auto-initialization fails (setting `initError`), then `initRoaring()` is retried without `mod` and succeeds, `initError` still holds the stale failure. Any subsequent call to `loadRoaring()` — even though `roaringModule` is now populated — would incorrectly append the stale cause. Only the `if (mod)` branch clears `initError`.

🐛 Proposed fix
```diff
 export async function initRoaring(mod) {
   if (mod) {
     roaringModule = mod;
     initError = null;
     return;
   }
   if (!roaringModule) {
     roaringModule = /** @type {RoaringModule} */ (await import('roaring'));
     // Handle both ESM default export and CJS module.exports
     if (roaringModule.default && roaringModule.default.RoaringBitmap32) {
       roaringModule = roaringModule.default;
     }
+    initError = null;
   }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/utils/roaring.js` around lines 108 - 121, The initRoaring function currently only clears initError when a mod is passed; update initRoaring so that after a successful auto-import and normalization of the module (where roaringModule is assigned and possibly replaced with roaringModule.default), you also set initError = null; this ensures that a prior import failure does not leave a stale error that later causes loadRoaring to append an incorrect cause. Refer to the initRoaring function, the initError variable, and roaringModule (and the check for roaringModule.default.RoaringBitmap32) when making the change.
♻️ Duplicate comments (4)
ROADMAP.md (1)
27-31: ⚠️ Potential issue | 🟡 Minor

Version chronology still regresses after M2.

After ROADMAP.md line 26 (v11.1.0), M3–M7 revert to v11.0.0, which breaks sequence in a completed milestone timeline. Please either normalize these versions to the intended order or explicitly annotate why milestone numbering is non-version-ordered.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ROADMAP.md` around lines 27 - 31, The milestone entries M3–M7 currently show v11.0.0 after a v11.1.0 entry, breaking chronological order; update the version strings for the milestones named "GHOST PROTOCOL", "VERIFY OR IT DIDN'T HAPPEN", "CLI DECOMPOSITION", "SAFE ERGONOMICS", and "TRUST V1" so they follow the intended timeline (e.g., bump them to the correct successive versions) or add a concise annotation next to those milestone rows explaining why milestone numbering is not version-ordered so readers understand the intentional discrepancy.

src/domain/services/IncrementalIndexUpdater.js (1)
53-57: LGTM — both previously flagged issues are now addressed.
The `MAX_LOCAL_ID` comment is corrected ("per shard"), and null-prototype normalization for loaded prop bags is in place at lines 511–521.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IncrementalIndexUpdater.js` around lines 53 - 57, No change required — both requested fixes are already applied: ensure computeDirtyShards (and its helper _collectDirtyShardKeys) continues to return an empty object when dirtyKeys.size === 0, confirm the MAX_LOCAL_ID comment now reads "per shard", and keep the null-prototype normalization for loaded prop bags implemented where prop bags are loaded (the normalization block around the existing code you updated at lines ~511–521).

src/domain/warp/query.methods.js (1)
46-54: LGTM — indexed fast-path fallback is now properly handled.

The `try/catch` correctly degrades to the linear scan on any `_propertyReader` or `_logicalIndex` failure, addressing the previously raised resilience concern.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/query.methods.js` around lines 46 - 54, The indexed fast-path try/catch in src/domain/warp/query.methods.js properly falls back to the linear scan on `_propertyReader.getNodeProps(nodeId)` or `_logicalIndex?.isAlive(nodeId)` failures, so no code change is required; keep the existing block that returns `new Map(Object.entries(record))` on success and silently falls through to the linear scan on any exception.

src/domain/services/GraphTraversal.js (1)
388-391: ⚠️ Potential issue | 🟠 Major

`isReachable` still returns `reachable: true` for a non-existent start when `start === goal`.

`_validateStart` is never called in this method. A caller passing a non-existent node ID as both `start` and `goal` receives `{ reachable: true }` instead of the expected `INVALID_START` error, inconsistent with every other start-node API. This was flagged in a prior review and marked as addressed, but the fix is absent from the current code.

🐛 Proposed fix
```diff
   async isReachable({
     start,
     goal,
     direction = 'out',
     options,
     maxNodes = DEFAULT_MAX_NODES,
     maxDepth = DEFAULT_MAX_DEPTH,
     signal,
   }) {
     this._resetStats();
+    await this._validateStart(start);
     if (start === goal) {
       return { reachable: true, stats: this._stats(1) };
     }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/GraphTraversal.js` around lines 388 - 391, The isReachable flow returns reachable: true when start===goal even if the start node doesn't exist because _validateStart is never called; fix by invoking this._validateStart(start) before the early return (after this._resetStats()) and handle/propagate its error the same way other start-node APIs do so that a non-existent start yields the expected INVALID_START response instead of reachable: true; keep the existing use of this._stats(...) and this._resetStats() untouched.
🧹 Nitpick comments (15)
ROADMAP.md (1)
200-200: Minor wording polish for readability.

"Consider generating as build step" reads awkwardly; "Consider generating this as a build step" is clearer.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ROADMAP.md` at line 200, Update the wording for item B76 "WARPGRAPH INVISIBLE API SURFACE DOCS" to make the suggestion clearer: replace the phrase "Consider generating as build step" with "Consider generating this as a build step" in the ROADMAP entry referencing the `WARPGRAPH INVISIBLE API SURFACE DOCS` / `WarpGraph` API Surface block so the sentence reads smoothly and unambiguously.

docker/Dockerfile.deno (1)
25-25: Deduplicate npm script allowlist to prevent config drift.

The same `--allow-scripts` value is repeated in build and runtime commands; keeping it in one variable reduces breakage risk when versions change.

♻️ Proposed refactor
```diff
 FROM denoland/deno:2.1.9
+ARG DENO_ALLOW_SCRIPTS="npm:roaring@2.7.0,npm:cbor-extract@2.2.0"
 USER root
 RUN apt-get update && apt-get install -y --no-install-recommends \
     git \
@@
 RUN deno install \
-  --allow-scripts=npm:roaring@2.7.0,npm:cbor-extract@2.2.0 \
+  --allow-scripts=${DENO_ALLOW_SCRIPTS} \
   --node-modules-dir=auto \
   --entrypoint test/runtime/deno/lifecycle.test.ts \
   test/runtime/deno/checkpoint.test.ts
@@
 USER deno
 ENV GIT_STUNTS_DOCKER=1
-CMD ["deno", "test", "--allow-all", "--allow-scripts=npm:roaring@2.7.0,npm:cbor-extract@2.2.0", "--node-modules-dir=auto", "test/runtime/deno/"]
+ENV DENO_ALLOW_SCRIPTS=${DENO_ALLOW_SCRIPTS}
+CMD ["sh", "-lc", "deno test --allow-all --allow-scripts=${DENO_ALLOW_SCRIPTS} --node-modules-dir=auto test/runtime/deno/"]
```

Also applies to: 34-34
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docker/Dockerfile.deno` at line 25, The repeated --allow-scripts flag (--allow-scripts=npm:roaring@2.7.0,npm:cbor-extract@2.2.0) appears in both the build and runtime Deno commands; extract that exact string into a single variable (e.g., an ARG/ENV like DENO_ALLOW_SCRIPTS) and replace occurrences in the commands so both build and runtime reference the same variable (update the lines where the flag is used, around the current occurrences at the two places showing the flag) to avoid duplicating versions and drift.

test/unit/domain/WarpGraph.query.test.js (1)
104-122: Add a call-count assertion to each fallback test to prove the mock path was actually entered.

Both tests assert only the output of the fallback. Without verifying that the mock (index/provider) was actually invoked, a test where the fast-path is never reached (e.g. due to a wrong field name or a guard that short-circuits before hitting the provider) will still pass — it just runs the linear scan unconditionally and returns the expected value coincidentally. Using `vi.fn()` instead of a plain async arrow gives you a `.toHaveBeenCalled()` assertion to prove the fallback was triggered by an index failure, not by the index path being skipped entirely.

♻️ Suggested improvement (getNodeProps test)
```diff
-      /** @type {any} */ (graph)._propertyReader = {
-        getNodeProps: async () => {
-          throw new Error('simulated index failure');
-        },
-      };
+      const failingReader = vi.fn().mockRejectedValue(new Error('simulated index failure'));
+      /** @type {any} */ (graph)._propertyReader = { getNodeProps: failingReader };

       const props = await graph.getNodeProps('user:alice');
+      expect(failingReader).toHaveBeenCalled();
       expect(props).toBeInstanceOf(Map);
       expect(props.get('name')).toBe('Alice');
```

♻️ Suggested improvement (neighbors test)
```diff
-      /** @type {any} */ (graph)._materializedGraph = {
-        provider: {
-          getNeighbors: async () => {
-            throw new Error('simulated provider failure');
-          },
-        },
-      };
+      const failingProvider = vi.fn().mockRejectedValue(new Error('simulated provider failure'));
+      /** @type {any} */ (graph)._materializedGraph = {
+        provider: { getNeighbors: failingProvider },
+      };

       const result = await graph.neighbors('user:alice', 'outgoing');
+      expect(failingProvider).toHaveBeenCalled();
       expect(result).toEqual([{ … }]);
```

Also applies to: 234-257
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/WarpGraph.query.test.js` around lines 104 - 122, The fallback tests call the mocked index/provider with a plain async function so there's no call-count proof the fallback path ran; replace the inline async thrower for graph._propertyReader.getNodeProps (and similar mocks like graph._logicalIndex.* in the neighbors test) with a spy function (vi.fn(async (...) => { throw new Error('simulated index failure') })) and after invoking graph.getNodeProps (or the neighbors call) add an assertion like expect(theSpy).toHaveBeenCalled() to verify the mock path was entered; reference the mocked symbol graph._propertyReader.getNodeProps (and graph._logicalIndex methods used in the neighbors test) and the tested method graph.getNodeProps to locate where to change the test.

src/domain/services/IncrementalIndexUpdater.js (2)
240-319: `_purgeNodeEdges` and the subsequent `_handleEdgeRemove` loop may double-process edges for removed nodes.

When a node is removed, `_purgeNodeEdges` (Line 83) zeroes its bitmap rows. Then for every edge in `diff.edgesRemoved` that involves the dead node (Lines 94–96), `_handleEdgeRemove` deserializes the already-empty bitmap and calls `_recomputeAllBucket`. This is harmless but performs redundant I/O deserializations and recomputation. If `diff.edgesRemoved` can be large (e.g., a node with thousands of incident edges is removed along with all its edges), this could be measurable.

Consider filtering `diff.edgesRemoved` to skip edges where either endpoint is in `diff.nodesRemoved`, since `_purgeNodeEdges` already handles them.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IncrementalIndexUpdater.js` around lines 240 - 319, The loop that processes diff.edgesRemoved redundantly handles edges whose endpoints were already removed by _purgeNodeEdges; update the code that iterates diff.edgesRemoved (or add an early check in _handleEdgeRemove) to skip any edge where either endpoint ID appears in diff.nodesRemoved so you don't deserialize already-zeroed bitmaps or call _recomputeAllBucket unnecessarily; specifically reference diff.edgesRemoved, diff.nodesRemoved, _handleEdgeRemove, _purgeNodeEdges and _recomputeAllBucket and implement a membership check (or filter) against nodesRemoved before deserializing/recomputing.
107-127: O(E) re-add scan touches every alive edge for any `nodesAdded` batch, regardless of actual edge connectivity.

The comment documents this, but the current short-circuit at Line 114 only skips edges where neither endpoint is in `addedSet`. For a graph with 100K edges and 1 new node, all 100K edges are decoded and checked. Since `_findGlobalId` returns `undefined` for truly new nodes (no pre-existing global ID), the inner body of the loop is never reached for them — the scan is entirely wasted work.

Consider adding a guard before entering the loop that checks whether any node in `diff.nodesAdded` already has a globalId in any shard (i.e., is a re-add vs. a brand-new node), and skips the scan entirely if none are re-adds.

Would you like me to draft the re-add detection guard to short-circuit this scan for fresh-node-only batches?
src/domain/warp/query.methods.js (1)
189-210: Redundant cast on Line 190.
`cachedState` is already typed as `WarpStateV5` in the parameter signature on Line 184 — the `/** @type {...} */ (cachedState)` reassignment on Line 190 to the same variable is a no-op.

♻️ Proposed cleanup
```diff
 function _linearNeighbors(cachedState, nodeId, direction, edgeLabel) {
-  const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (cachedState);
+  const s = cachedState;
   /** @type {Array<...>} */
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/query.methods.js` around lines 189 - 210, The explicit JSDoc cast that reassigns cachedState to s inside _linearNeighbors is redundant; remove the line "const s = /** @type {...} */ (cachedState);" and update the function to use the existing parameter cachedState (or rename the parameter to s if you prefer that local name) wherever s is referenced (e.g., orsetElements(s.edgeAlive), orsetContains(s.nodeAlive)). Ensure types remain correct by keeping the original parameter JSDoc/type on cachedState and remove the unnecessary cast-only variable.

src/domain/services/CheckpointService.js (1)
113-136: `createV5` JSDoc is stale — doesn't document schema:4 or the `index/` subtree.

The tree structure comment (lines 115–123) lists five files but omits the `index/` subtree that is created when `indexTree` is provided. The description still says "Creates a schema:2 checkpoint commit" even though schema:4 is now conditionally emitted.

📝 Proposed doc fix
```diff
 /**
- * Creates a V5 checkpoint commit with full ORSet state.
+ * Creates a V5 checkpoint commit with full ORSet state (schema:2) or with
+ * a materialized index subtree (schema:4 when `indexTree` is provided).
  *
  * V5 Checkpoint Tree Structure:
  * ```
  * <checkpoint_tree>/
  * ├── state.cbor            # AUTHORITATIVE: Full V5 state (ORSets + props)
  * ├── visible.cbor          # CACHE ONLY: Visible projection for fast queries
  * ├── frontier.cbor         # Writer frontiers
  * ├── appliedVV.cbor        # Version vector of dots in state
- * └── provenanceIndex.cbor  # Optional: node-to-patchSha index (HG/IO/2)
+ * ├── provenanceIndex.cbor  # Optional: node-to-patchSha index (HG/IO/2)
+ * └── index/                # Optional: bitmap index shards (schema:4 only)
+ *     ├── meta_<sk>.cbor
+ *     ├── fwd_<sk>.cbor
+ *     └── ...
  * ```
```
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/CheckpointService.js` around lines 113 - 136, The JSDoc for createV5 is stale: update the tree comment and description in the createV5 function to document the conditional schema:4 behavior and the index/ subtree when indexTree is supplied; change the "Creates a schema:2 checkpoint commit" text to reflect that schema:4 is emitted when indexTree is provided, and add the index/ subtree lines (meta_<sk>.cbor, fwd_<sk>.cbor, etc.) and mark provenanceIndex.cbor as one of several optional files so the tree shows both provenanceIndex.cbor and the optional index/ directory.

src/domain/services/CheckpointMessageCodec.js (1)
63-66: Consider replacing the chained equality checks with a range test for forward compatibility.

Adding schema:5 later would require remembering to update this condition again; `schema >= 2` expresses the same invariant without the maintenance burden.

♻️ Proposed refactor
```diff
-  // Add checkpoint version marker for V5 format (schema:2, schema:3, schema:4)
-  if (schema === 2 || schema === 3 || schema === 4) {
+  // Add checkpoint version marker for V5 format (schema >= 2)
+  if (schema >= 2) {
     trailers[TRAILER_KEYS.checkpointVersion] = 'v5';
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/CheckpointMessageCodec.js` around lines 63-66: Replace the explicit equality checks with a range/threshold check: instead of "if (schema === 2 || schema === 3 || schema === 4)" update the condition that sets trailers[TRAILER_KEYS.checkpointVersion] to use "schema >= 2" (referencing the schema variable, trailers object, and TRAILER_KEYS.checkpointVersion in CheckpointMessageCodec.js) so future schema versions automatically get the v5 marker without needing to update this condition.

test/integration/api/materializedView.test.js (1)
17-33: Integration tests couple to private fields — consider exposing narrow inspection APIs.

Lines 30–32 directly access `_materializedGraph`, `_logicalIndex`, and `_propertyReader`. For an integration test suite that's expected to live alongside future API changes, this creates brittleness: any rename or restructuring of these internal fields silently breaks all three tests. A thin public inspection surface (e.g., `graph.getIndexStats()` or a `graph.isIndexReady()` accessor) would decouple the tests from implementation details without sacrificing coverage.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@test/integration/api/materializedView.test.js` around lines 17-33: Tests directly access private internals (_materializedGraph, _logicalIndex, _propertyReader) and should instead use a narrow public inspection API; add a method on the Graph class such as isMaterialized() and/or getIndexStats() (or getMaterializationInfo()) that returns whether materialize() completed and exposes only the needed checks (e.g., provider type, readiness, and existence of getNeighbors) so tests can call graph.isMaterialized() or graph.getIndexStats() after graph.materialize() and assert on those public results; update the failing tests to use these new Graph methods and remove direct references to _materializedGraph, _logicalIndex, and _propertyReader.

src/domain/services/GraphTraversal.js (2)
1007-1065: `weightedLongestPath` stats silently exclude the topological sort discovery phase.

`topologicalSort` is called at Line 1018 (which calls `_resetStats()` internally and traverses the entire reachable subgraph, accumulating `_edgesTraversed`), then `_resetStats()` is called again at Line 1028, wiping those counts. The returned `stats` at Line 1064 only reflect the DP phase, understating total edges traversed and nodes visited for callers relying on the stats for observability or cost accounting.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/GraphTraversal.js` around lines 1007 - 1065, weightedLongestPath currently calls topologicalSort (which accumulates stats) then immediately calls this._resetStats(), wiping the topo discovery metrics; capture and merge the topo phase stats before resetting for the DP phase. Concretely: after calling this.topologicalSort(...) save its returned stats object (or read counters if topologicalSort returns nothing but updates this._stats/_edgesTraversed/_nodesVisited), then call this._resetStats() for the DP, run the DP, compute dpStats via this._stats(sorted.length), and finally merge topoStats + dpStats (summing edgesTraversed/nodesVisited and preserving other fields) to return as the final stats; update references in weightedLongestPath (the call to topologicalSort, the this._resetStats() call, and the final this._stats(...) return) to implement this merge so callers receive combined observability metrics.
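The merge the prompt describes can be reduced to a small sketch. This is not the project's code — the `nodesVisited`/`edgesTraversed` field names are taken from the comment above, and any other fields are illustrative:

```javascript
// Hypothetical helper: combine topo-discovery stats with DP-phase stats
// so weightedLongestPath can report totals across both phases.
function mergeTraversalStats(topoStats, dpStats) {
  return {
    ...dpStats, // preserve DP-specific fields as-is
    nodesVisited: topoStats.nodesVisited + dpStats.nodesVisited,
    edgesTraversed: topoStats.edgesTraversed + dpStats.edgesTraversed,
  };
}
```

The real fix would capture the stats returned (or accumulated) by `topologicalSort` before the second `_resetStats()` call, then merge as above.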
1182-1190: `_reconstructBiPath` backward-half loop has no guard against missing `bwdNext` entries.

`_reconstructPath` (Line 1153) breaks when `predMap.get(current)` returns `undefined`, preventing an infinite loop if the predecessor chain is somehow incomplete. The backward-half loop on Line 1184 lacks an equivalent safeguard — if `bwdNext` is ever missing a link between `meeting` and `goal`, the loop exits silently without reaching `goal`, and the returned path is silently truncated rather than flagged.

🛡️ Proposed defensive guard

```diff
   cur = meeting;
   while (cur !== goal && bwdNext.has(cur)) {
     cur = /** @type {string} */ (bwdNext.get(cur));
+    if (cur === undefined) { break; }
     fwdHalf.push(cur);
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/GraphTraversal.js` around lines 1182-1190: The backward-half loop in _reconstructBiPath can silently truncate the path if bwdNext lacks an entry between meeting and goal; mirror the defensive pattern used in _reconstructPath by capturing the retrieved next link (const next = bwdNext.get(cur)), check if next is undefined, and if so throw a descriptive Error (e.g. `throw new Error("Incomplete backward chain in _reconstructBiPath: missing bwdNext for " + cur)`) instead of proceeding — then push next into fwdHalf and continue; reference symbols: _reconstructBiPath, bwdNext, cur, meeting, goal, fwdHalf.

THE_PROBLEM.md (1)
37-37: Minor wording cleanup.

At Line 37, "at a point in time" is redundant; "at a point" (or just "at a Lamport ceiling") reads tighter.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@THE_PROBLEM.md` at line 37: Update the sentence that reads "**Seek cache** persists CRDT state at a point in time but not adjacency. Same rebuild." to remove the redundant phrase "in time" and tighten wording — e.g., change it to "**Seek cache** persists CRDT state at a point but not adjacency. Same rebuild." or "**Seek cache** persists CRDT state at a Lamport ceiling but not adjacency. Same rebuild." Ensure only the wording is adjusted and the intended meaning remains unchanged.

src/domain/services/LogicalIndexReader.js (1)
205-217: Make shard processing order explicitly deterministic.

At Line 205, `_processShards` says deterministic order, but `meta`/`edges` are currently consumed in caller-provided order. Sorting by `path` here will make decode/order behavior explicit and stable.

♻️ Proposed fix

```diff
   _processShards(items) {
     const Ctor = getRoaringBitmap32();
     const { meta, labels, edges } = classifyShards(items);
+    meta.sort((a, b) => a.path.localeCompare(b.path));
+    edges.sort((a, b) => a.path.localeCompare(b.path));
     for (const { path, buf } of meta) {
       this._decodeMeta(path, buf, Ctor);
     }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/LogicalIndexReader.js` around lines 205-217: The shard processing in _processShards uses caller-provided order for meta and edges which is non-deterministic; before iterating, sort the meta and edges arrays by their path property (e.g., meta.sort((a,b)=>a.path.localeCompare(b.path)) and similarly for edges) so that _decodeMeta is invoked in a stable order and _decodeEdgeShard receives shards deterministically; keep the existing use of Ctor and the call to _decodeLabels unchanged.

test/unit/domain/services/LogicalIndexReader.test.js (1)
16-48: Prefer shared fixture builder to avoid semantic drift.

At Line 17, this local `fixtureToState` duplicates logic already maintained in `test/helpers/fixtureDsl.js`. Keeping two builders risks divergence (e.g., tombstones/lamport behavior) and inconsistent test semantics.

♻️ Proposed refactor

```diff
-import {
-  makeFixture,
-  F7_MULTILABEL_SAME_NEIGHBOR,
-  F10_PROTO_POLLUTION,
-} from '../../../helpers/fixtureDsl.js';
+import {
+  makeFixture,
+  fixtureToState,
+  F7_MULTILABEL_SAME_NEIGHBOR,
+  F10_PROTO_POLLUTION,
+} from '../../../helpers/fixtureDsl.js';

-/**
- * Builds a WarpStateV5 from a fixture (mirrors fixtureDsl._fixtureToState).
- * @param {{nodes: string[], edges: Array<{from: string, to: string, label: string}>, props?: Array<{nodeId: string, key: string, value: unknown}>}} fixture
- */
-function fixtureToState(fixture) {
-  ...
-}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/LogicalIndexReader.test.js` around lines 16-48: Replace the duplicated local builder fixtureToState with the shared helper from test/helpers/fixtureDsl.js: remove the local fixtureToState function and import the canonical builder (e.g., fixtureDsl._fixtureToState or the exported name used by that module) and update tests to call that shared function; ensure you still pass the same fixture shape so createEmptyStateV5/applyOpV2/createDot/createEventId usages are handled by the shared builder to avoid semantic drift.

src/domain/services/TemporalQuery.js (1)
315-318: Consider binary search for checkpoint start index.

`allPatches` is ordered by Lamport, so this lookup can be O(log n) instead of O(n).

Optional refactor

```diff
-  const idx = allPatches.findIndex(
-    ({ patch }) => patch.lamport > ck.maxLamport,
-  );
+  let lo = 0;
+  let hi = allPatches.length;
+  while (lo < hi) {
+    const mid = (lo + hi) >>> 1;
+    if (allPatches[mid].patch.lamport > ck.maxLamport) hi = mid;
+    else lo = mid + 1;
+  }
+  const idx = lo;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/TemporalQuery.js` around lines 315 - 318, Replace the linear findIndex scan for the checkpoint start with a binary search because allPatches is sorted by Lamport; locate the first index where patch.lamport > ck.maxLamport (i.e., the lower_bound/upper_bound behavior), then compute startIdx as before (startIdx = idx < 0 ? allPatches.length : idx). Implement this in the TemporalQuery.js scope where allPatches and ck.maxLamport are used (or extract a helper like findFirstLamportGreater(allPatches, target)) so the lookup becomes O(log n) instead of O(n).
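As a sanity check, the proposed lower-bound loop behaves like this standalone helper (assuming `allPatches` entries shaped `{ patch: { lamport } }`, as in the comment above):

```javascript
// Standalone version of the proposed binary search: first index whose
// patch.lamport exceeds maxLamport (an upper-bound search).
function findFirstLamportGreater(allPatches, maxLamport) {
  let lo = 0;
  let hi = allPatches.length;
  while (lo < hi) {
    const mid = (lo + hi) >>> 1;
    if (allPatches[mid].patch.lamport > maxLamport) hi = mid;
    else lo = mid + 1;
  }
  return lo; // equals allPatches.length when every lamport <= maxLamport
}
```

Note this already yields `allPatches.length` when no element qualifies, so the `idx < 0` fallback used with `findIndex` becomes unnecessary.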
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@ARCHITECTURE.md`:
- Around line 249-255: The docs for MaterializedViewService list lifecycle
methods persist()/load()/update(), but the implementation exposes
persistIndexTree(), loadFromOids(), applyDiff(), and verifyIndex(), causing
mismatch; update ARCHITECTURE.md to reference the actual API names
(MaterializedViewService.persistIndexTree, MaterializedViewService.loadFromOids,
MaterializedViewService.applyDiff and MaterializedViewService.verifyIndex) and
describe their roles (persist index tree, load by object IDs, apply incremental
patch/diff, and verify index state) so the lifecycle doc aligns with the code
and migration guidance is accurate.
In `@docker-compose.test.yml`:
- Line 52: The pinned versions in the deno test command's --allow-scripts flag
(the command string containing
"--allow-scripts=npm:roaring@2.7.0,npm:cbor-extract@2.2.0") must be kept in sync
with package.json; change the docker-compose test command to avoid hardcoded,
out-of-sync pins—either remove the explicit version suffixes (e.g., use
"npm:roaring,npm:cbor-extract") or replace them with a
templated/value-from-package-json approach in CI, and add a clear comment
immediately above the command that instructs maintainers to update these pins
when package.json/package-lock.json bump the packages (or add a CI check to
verify parity). Ensure the change targets the command line string that defines
--allow-scripts so Deno will not skip postinstall scripts.
In `@ROADMAP.md`:
- Around line 212-219: Update the prose references to use canonical "GitHub"
capitalization wherever lowercase "github/workflows/..." or similar appears in
the ROADMAP entries B83–B90 and related file mentions (e.g.,
`.github/workflows/ci.yml`, `.github/workflows/`), replacing instances like
"github" with "GitHub" for naming consistency in the document.
In `@src/domain/services/MaterializedViewService.js`:
- Around line 348-357: The verification currently samples only from
state.nodeAlive (allNodes/sampleNodes) and then calls
logicalIndex.isAlive(nodeId), which only detects “expected alive but index
reports dead”; update verifyIndex to also detect ghost/stale nodes by checking
logicalIndex's live set against state: either include the union of sampled nodes
and logicalIndex's alive nodes (use logicalIndex.isAlive(...) and
state.nodeAlive/orsetElements) or add a second loop over logicalIndex's alive
node IDs and assert they exist in state.nodeAlive and have matching adjacency in
buildGroundTruthAdjacency; ensure the VerifyError(s) and passed counter are
updated consistently when finding these reverse-direction mismatches.
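The two-directional liveness check this entry asks for reduces to something like the following sketch (function and parameter names are hypothetical; the real `verifyIndex` would also compare adjacency):

```javascript
// Hypothetical sketch: verify liveness in both directions so ghost
// nodes (alive in index, gone from state) are caught too.
function verifyLiveness(stateAliveIds, indexAliveIds) {
  const stateSet = new Set(stateAliveIds);
  const indexSet = new Set(indexAliveIds);
  const errors = [];
  for (const id of stateSet) {
    if (!indexSet.has(id)) errors.push(`index reports live node '${id}' as dead`);
  }
  for (const id of indexSet) {
    if (!stateSet.has(id)) errors.push(`index has ghost node '${id}' not in state`);
  }
  return errors;
}
```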
In `@src/domain/services/PropertyIndexReader.js`:
- Around line 97-105: The code silently treats non-array decoded shard payloads
as empty which can hide corruption; in PropertyIndexReader (around the block
using decoded, data, path and this._cache.set) validate that decoded is the
expected Array and if not throw or propagate a descriptive error (including path
and the actual typeof/shape) instead of proceeding to set an empty data cache
entry; keep the existing loop for Array cases and only call
this._cache.set(path, data) after successful validation to fail fast on format
mismatch.
In `@src/domain/services/TemporalQuery.js`:
- Around line 314-320: The checkpoint state (ck.state) is returned by reference
and can be mutated later by joinPatch/applyFast; defensively deep-clone the
checkpoint state before returning to prevent reuse from causing cross-query
mutations. In TemporalQuery.js where you return { state: ck.state, startIdx,
checkpointMaxLamport: ck.maxLamport }, replace ck.state with a deep clone (e.g.,
structuredClone(ck.state) or your repo's deepClone utility / JSON fallback) so
callers receive an immutable copy; keep the rest of the return shape (startIdx
and checkpointMaxLamport) unchanged. Ensure the cloning approach is safe for the
state shape and environment (fallback if structuredClone isn’t available).
In `@test/helpers/fixtureDsl.js`:
- Around line 40-57: makeFixture currently validates edges but not props or
tombstones; update makeFixture to validate props and tombstones by checking each
props entry's nodeId exists in nodeSet and each tombstone target refers to an
existing node id or an existing edge id from normalizedEdges (use
normalizedEdges to build a set of valid edge identifiers), and throw descriptive
Errors when any prop.nodeId or tombstone reference is missing; reference the
functions/variables makeFixture, nodeSet, normalizedEdges, props, and tombstones
when adding these checks so the validation runs early and fails fast.
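One way to sketch that early validation (the fixture shape follows the DSL described above; the tombstone `target` field and the edge-id encoding are illustrative assumptions):

```javascript
// Hypothetical sketch: fail fast when props or tombstones reference
// nodes/edges that don't exist in the fixture.
function validateFixtureRefs({ nodes, edges, props = [], tombstones = [] }) {
  const nodeSet = new Set(nodes);
  const edgeIds = new Set(edges.map((e) => `${e.from}->${e.to}:${e.label}`));
  for (const p of props) {
    if (!nodeSet.has(p.nodeId)) {
      throw new Error(`prop references unknown node '${p.nodeId}'`);
    }
  }
  for (const t of tombstones) {
    if (!nodeSet.has(t.target) && !edgeIds.has(t.target)) {
      throw new Error(`tombstone references unknown target '${t.target}'`);
    }
  }
}
```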
In `@test/unit/domain/services/GraphTraversal.test.js`:
- Around line 623-624: The test's mock async provider wrapper for getNeighbors
is dropping the third argument; update the mock so getNeighbors forwards the
full signature to inner.getNeighbors (e.g., accept the same parameters or use a
rest param and call inner.getNeighbors(...args)) so that any neighbor options
(such as label filtering) are passed through; locate the wrapper around
getNeighbors that references inner.getNeighbors and modify it to forward all
arguments.
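The fix amounts to a rest-parameter pass-through. A minimal sketch (the provider shape is assumed from the port contract described earlier):

```javascript
// Sketch: wrap a sync provider as async without dropping arguments —
// the rest parameter forwards the options (third) argument intact.
function asAsyncProvider(inner) {
  return {
    ...inner,
    getNeighbors: async (...args) => inner.getNeighbors(...args),
  };
}
```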
---
Outside diff comments:
In `@src/domain/utils/roaring.js`:
- Around line 108-121: The initRoaring function currently only clears initError
when a mod is passed; update initRoaring so that after a successful auto-import
and normalization of the module (where roaringModule is assigned and possibly
replaced with roaringModule.default), you also set initError = null; this
ensures that a prior import failure does not leave a stale error that later
causes loadRoaring to append an incorrect cause. Refer to the initRoaring
function, the initError variable, and roaringModule (and the check for
roaringModule.default.RoaringBitmap32) when making the change.
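Reduced to a standalone sketch, the shape of the fix looks like this (state variable names mirror the prompt; the dynamic import of `roaring` is simulated, not real):

```javascript
// Sketch: a cached init that must clear initError on *every* success,
// not only when a module is passed in explicitly.
let initError = null;
let roaringModule = null;

function initRoaring(mod) {
  if (mod) {
    roaringModule = mod;
    initError = null;
    return;
  }
  try {
    // Stand-in for the real auto-import of the 'roaring' package.
    roaringModule = { RoaringBitmap32: class {} };
    initError = null; // the missing reset: success clears any stale error
  } catch (err) {
    initError = err;
  }
}
```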
---
Duplicate comments:
In `@ROADMAP.md`:
- Around line 27-31: The milestone entries M3–M7 currently show v11.0.0 after a
v11.1.0 entry, breaking chronological order; update the version strings for the
milestones named "GHOST PROTOCOL", "VERIFY OR IT DIDN'T HAPPEN", "CLI
DECOMPOSITION", "SAFE ERGONOMICS", and "TRUST V1" so they follow the intended
timeline (e.g., bump them to the correct successive versions) or add a concise
annotation next to those milestone rows explaining why milestone numbering is
not version-ordered so readers understand the intentional discrepancy.
In `@src/domain/services/GraphTraversal.js`:
- Around line 388-391: The isReachable flow returns reachable: true when
start===goal even if the start node doesn't exist because _validateStart is
never called; fix by invoking this._validateStart(start) before the early return
(after this._resetStats()) and handle/propagate its error the same way other
start-node APIs do so that a non-existent start yields the expected
INVALID_START response instead of reachable: true; keep the existing use of
this._stats(...) and this._resetStats() untouched.
In `@src/domain/services/IncrementalIndexUpdater.js`:
- Around line 53-57: No change required — both requested fixes are already
applied: ensure computeDirtyShards (and its helper _collectDirtyShardKeys)
continues to return an empty object when dirtyKeys.size === 0, confirm the
MAX_LOCAL_ID comment now reads "per shard", and keep the null-prototype
normalization for loaded prop bags implemented where prop bags are loaded (the
normalization block around the existing code you updated at lines ~511–521).
In `@src/domain/warp/query.methods.js`:
- Around line 46-54: The indexed fast-path try/catch in
src/domain/warp/query.methods.js properly falls back to the linear scan on
`_propertyReader.getNodeProps(nodeId)` or `_logicalIndex?.isAlive(nodeId)`
failures, so no code change is required; keep the existing block that returns
`new Map(Object.entries(record))` on success and silently falls through to the
linear scan on any exception.
---
Nitpick comments:
In `@docker/Dockerfile.deno`:
- Line 25: The repeated --allow-scripts flag
(--allow-scripts=npm:roaring@2.7.0,npm:cbor-extract@2.2.0) appears in both the
build and runtime Deno commands; extract that exact string into a single
variable (e.g., an ARG/ENV like DENO_ALLOW_SCRIPTS) and replace occurrences in
the commands so both build and runtime reference the same variable (update the
lines where the flag is used, around the current occurrences at the two places
showing the flag) to avoid duplicating versions and drift.
In `@ROADMAP.md`:
- Line 200: Update the wording for item B76 "WARPGRAPH INVISIBLE API SURFACE
DOCS" to make the suggestion clearer: replace the phrase "Consider generating as
build step" with "Consider generating this as a build step" in the ROADMAP entry
referencing the `WARPGRAPH INVISIBLE API SURFACE DOCS` / `WarpGraph` API Surface
block so the sentence reads smoothly and unambiguously.
In `@src/domain/services/CheckpointMessageCodec.js`:
- Around line 63-66: Replace the explicit equality checks with a range/threshold
check: instead of "if (schema === 2 || schema === 3 || schema === 4)" update the
condition that sets trailers[TRAILER_KEYS.checkpointVersion] to use "schema >=
2" (referencing the schema variable, trailers object, and
TRAILER_KEYS.checkpointVersion in CheckpointMessageCodec.js) so future schema
versions automatically get the v5 marker without needing to update this
condition.
In `@src/domain/services/CheckpointService.js`:
- Around line 113-136: The JSDoc for createV5 is stale: update the tree comment
and description in the createV5 function to document the conditional schema:4
behavior and the index/ subtree when indexTree is supplied; change the "Creates
a schema:2 checkpoint commit" text to reflect that schema:4 is emitted when
indexTree is provided, and add the index/ subtree lines (meta_<sk>.cbor,
fwd_<sk>.cbor, etc.) and mark provenanceIndex.cbor as one of several optional
files so the tree shows both provenanceIndex.cbor and the optional index/
directory.
In `@src/domain/services/GraphTraversal.js`:
- Around line 1007-1065: weightedLongestPath currently calls topologicalSort
(which accumulates stats) then immediately calls this._resetStats(), wiping the
topo discovery metrics; capture and merge the topo phase stats before resetting
for the DP phase. Concretely: after calling this.topologicalSort(...) save its
returned stats object (or read counters if topologicalSort returns nothing but
updates this._stats/_edgesTraversed/_nodesVisited), then call this._resetStats()
for the DP, run the DP, compute dpStats via this._stats(sorted.length), and
finally merge topoStats + dpStats (summing edgesTraversed/nodesVisited and
preserving other fields) to return as the final stats; update references in
weightedLongestPath (the call to topologicalSort, the this._resetStats() call,
and the final this._stats(...) return) to implement this merge so callers
receive combined observability metrics.
- Around line 1182-1190: The backward-half loop in _reconstructBiPath can
silently truncate the path if bwdNext lacks an entry between meeting and goal;
mirror the defensive pattern used in _reconstructPath by capturing the retrieved
next link (const next = bwdNext.get(cur)), check if next is undefined, and if so
throw a descriptive Error (e.g. `throw new Error("Incomplete backward chain in
_reconstructBiPath: missing bwdNext for " + cur)`) instead of proceeding—then
push next into fwdHalf and continue; reference symbols: _reconstructBiPath,
bwdNext, cur, meeting, goal, fwdHalf.
In `@src/domain/services/IncrementalIndexUpdater.js`:
- Around line 240-319: The loop that processes diff.edgesRemoved redundantly
handles edges whose endpoints were already removed by _purgeNodeEdges; update
the code that iterates diff.edgesRemoved (or add an early check in
_handleEdgeRemove) to skip any edge where either endpoint ID appears in
diff.nodesRemoved so you don't deserialize already-zeroed bitmaps or call
_recomputeAllBucket unnecessarily; specifically reference diff.edgesRemoved,
diff.nodesRemoved, _handleEdgeRemove, _purgeNodeEdges and _recomputeAllBucket
and implement a membership check (or filter) against nodesRemoved before
deserializing/recomputing.
In `@src/domain/services/LogicalIndexReader.js`:
- Around line 205-217: The shard processing in _processShards uses
caller-provided order for meta and edges which is non-deterministic; before
iterating, sort the meta and edges arrays by their path property (e.g.,
meta.sort((a,b)=>a.path.localeCompare(b.path)) and similarly for edges) so that
_decodeMeta is invoked in a stable order and _decodeEdgeShard receives shards
deterministically; keep the existing use of Ctor and the call to _decodeLabels
unchanged.
In `@src/domain/services/TemporalQuery.js`:
- Around line 315-318: Replace the linear findIndex scan for the checkpoint
start with a binary search because allPatches is sorted by Lamport; locate the
first index where patch.lamport > ck.maxLamport (i.e., the
lower_bound/upper_bound behavior), then compute startIdx as before (startIdx =
idx < 0 ? allPatches.length : idx). Implement this in the TemporalQuery.js scope
where allPatches and ck.maxLamport are used (or extract a helper like
findFirstLamportGreater(allPatches, target)) so the lookup becomes O(log n)
instead of O(n).
In `@src/domain/warp/query.methods.js`:
- Around line 189-210: The explicit JSDoc cast that reassigns cachedState to s
inside _linearNeighbors is redundant; remove the line "const s = /** `@type` {...}
*/ (cachedState);" and update the function to use the existing parameter
cachedState (or rename the parameter to s if you prefer that local name)
wherever s is referenced (e.g., orsetElements(s.edgeAlive),
orsetContains(s.nodeAlive)). Ensure types remain correct by keeping the original
parameter JSDoc/type on cachedState and remove the unnecessary cast-only
variable.
In `@test/integration/api/materializedView.test.js`:
- Around line 17-33: Tests directly access private internals
(_materializedGraph, _logicalIndex, _propertyReader) and should instead use a
narrow public inspection API; add a method on the Graph class such as
isMaterialized() and/or getIndexStats() (or getMaterializationInfo()) that
returns whether materialize() completed and exposes only the needed checks
(e.g., provider type, readiness, and existence of getNeighbors) so tests can
call graph.isMaterialized() or graph.getIndexStats() after graph.materialize()
and assert on those public results; update the failing tests to use these new
Graph methods and remove direct references to _materializedGraph, _logicalIndex,
and _propertyReader.
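A minimal sketch of the public inspection surface this entry proposes — the class shape, `providerType`, and `indexReady` fields are illustrative, mirroring the private fields named above, not existing API:

```javascript
// Hypothetical sketch: narrow read-only accessors so tests stop
// reaching into _materializedGraph / _logicalIndex directly.
class Graph {
  constructor() {
    this._materializedGraph = null;
    this._logicalIndex = null;
  }
  async materialize() {
    this._materializedGraph = { providerType: 'adjacency' };
    this._logicalIndex = { ready: true };
  }
  isMaterialized() {
    return this._materializedGraph !== null;
  }
  getIndexStats() {
    return {
      materialized: this.isMaterialized(),
      providerType: this._materializedGraph?.providerType ?? null,
      indexReady: Boolean(this._logicalIndex && this._logicalIndex.ready),
    };
  }
}
```

Tests then assert on `graph.getIndexStats()` instead of private fields, so internal renames no longer break them.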
In `@test/unit/domain/services/LogicalIndexReader.test.js`:
- Around line 16-48: Replace the duplicated local builder fixtureToState with
the shared helper from test/helpers/fixtureDsl.js: remove the local
fixtureToState function and import the canonical builder (e.g.,
fixtureDsl._fixtureToState or the exported name used by that module) and update
tests to call that shared function; ensure you still pass the same fixture shape
so createEmptyStateV5/applyOpV2/createDot/createEventId usages are handled by
the shared builder to avoid semantic drift.
In `@test/unit/domain/WarpGraph.query.test.js`:
- Around line 104-122: The fallback tests call the mocked index/provider with a
plain async function so there's no call-count proof the fallback path ran;
replace the inline async thrower for graph._propertyReader.getNodeProps (and
similar mocks like graph._logicalIndex.* in the neighbors test) with a spy
function (vi.fn(async (...) => { throw new Error('simulated index failure') }))
and after invoking graph.getNodeProps (or the neighbors call) add an assertion
like expect(theSpy).toHaveBeenCalled() to verify the mock path was entered;
reference the mocked symbol graph._propertyReader.getNodeProps (and
graph._logicalIndex methods used in the neighbors test) and the tested method
graph.getNodeProps to locate where to change the test.
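Without vitest on hand, the spy idea reduces to a hand-rolled `vi.fn` equivalent (names hypothetical):

```javascript
// Minimal hand-rolled spy: records calls so the test can prove the
// failing mock was actually entered before the fallback path ran.
function makeSpy(impl) {
  const spy = (...args) => {
    spy.calls.push(args);
    return impl(...args);
  };
  spy.calls = [];
  return spy;
}
```

In vitest itself this is simply `vi.fn(async () => { throw new Error('simulated index failure'); })` followed by `expect(theSpy).toHaveBeenCalled()` after the call under test.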
In `@THE_PROBLEM.md`:
- Line 37: Update the sentence that reads "**Seek cache** persists CRDT state at
a point in time but not adjacency. Same rebuild." to remove the redundant phrase
"in time" and tighten wording — e.g., change it to "**Seek cache** persists CRDT
state at a point but not adjacency. Same rebuild." or "**Seek cache** persists
CRDT state at a Lamport ceiling but not adjacency. Same rebuild." Ensure only
the wording is adjusted and the intended meaning remains unchanged.
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (44)
- ARCHITECTURE.md
- CHANGELOG.md
- ROADMAP.md
- THE_PROBLEM.md
- bin/cli/commands/verify-index.js
- bin/cli/infrastructure.js
- docker-compose.test.yml
- docker/Dockerfile.deno
- docs/CLI_GUIDE.md
- index.d.ts
- src/domain/services/AdjacencyNeighborProvider.js
- src/domain/services/CheckpointMessageCodec.js
- src/domain/services/CheckpointService.js
- src/domain/services/GraphTraversal.js
- src/domain/services/IncrementalIndexUpdater.js
- src/domain/services/LogicalBitmapIndexBuilder.js
- src/domain/services/LogicalIndexBuildService.js
- src/domain/services/LogicalIndexReader.js
- src/domain/services/MaterializedViewService.js
- src/domain/services/PropertyIndexReader.js
- src/domain/services/TemporalQuery.js
- src/domain/utils/MinHeap.js
- src/domain/utils/roaring.js
- src/domain/utils/shardKey.js
- src/domain/warp/query.methods.js
- test/helpers/fixtureDsl.js
- test/integration/api/materializedView.test.js
- test/unit/cli/verify-index.test.js
- test/unit/domain/WarpGraph.query.test.js
- test/unit/domain/services/AdjacencyNeighborProvider.test.js
- test/unit/domain/services/GraphTraversal.astar.test.js
- test/unit/domain/services/GraphTraversal.test.js
- test/unit/domain/services/IncrementalIndexUpdater.test.js
- test/unit/domain/services/LogicalBitmapIndexBuilder.stability.test.js
- test/unit/domain/services/LogicalBitmapIndexBuilder.test.js
- test/unit/domain/services/LogicalIndexBuildService.determinism.test.js
- test/unit/domain/services/LogicalIndexBuildService.test.js
- test/unit/domain/services/LogicalIndexReader.test.js
- test/unit/domain/services/MaterializedViewService.verify.test.js
- test/unit/domain/services/PropertyIndex.test.js
- test/unit/domain/services/TemporalQuery.checkpoint.test.js
- test/unit/domain/services/noBufferGlobal.test.js
- test/unit/domain/warp/hydrateCheckpointIndex.regression.test.js
- test/unit/helpers/fixtureDsl.test.js
🚧 Files skipped from review as they are similar to previous changes (11)
- test/unit/domain/services/LogicalBitmapIndexBuilder.test.js
- test/unit/domain/services/MaterializedViewService.verify.test.js
- bin/cli/commands/verify-index.js
- test/unit/domain/services/TemporalQuery.checkpoint.test.js
- src/domain/utils/MinHeap.js
- test/unit/domain/services/LogicalBitmapIndexBuilder.stability.test.js
- src/domain/services/AdjacencyNeighborProvider.js
- test/unit/domain/services/LogicalIndexBuildService.test.js
- docs/CLI_GUIDE.md
- test/unit/domain/services/GraphTraversal.astar.test.js
- test/unit/domain/warp/hydrateCheckpointIndex.regression.test.js
Release Preflight
If you tag this commit as …
Addressed the latest unresolved review threads in commits a10d307 and 24a9086.
Validation run:
@coderabbitai review please 🥕
Your fixes look comprehensive and well-validated:

✅ PropertyIndexReader hardening — fail-fast on malformed shard payloads prevents silent corruption

I'll review the changes now. 🥕

✅ Actions performed

Review triggered.

Your fixes look comprehensive:

✅ PropertyIndexReader hardening — fail-fast on malformed shard payloads with regression coverage

I'll review the changes now.

✅ Actions performed

Review triggered.
Actionable comments posted: 6
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
ARCHITECTURE.md (1)
260-308: ⚠️ Potential issue | 🟡 Minor — Align NeighborProviderPort method signature and return type in docs.

Lines 260 and 307 still document `labelFilter` and node-ID return values, but current providers use `getNeighbors(nodeId, direction, options)` and return `NeighborEdge[]` (`{ neighborId, label }`).

Suggested doc patch

```diff
-- `getNeighbors(nodeId, direction, labelFilter)` — returns neighbor node IDs
+- `getNeighbors(nodeId, direction, options)` — returns `NeighborEdge[]` (`{ neighborId, label }`)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ARCHITECTURE.md` around lines 260 - 308, Update the NeighborProviderPort docs to match the actual API: change the getNeighbors signature from getNeighbors(nodeId, direction, labelFilter) to getNeighbors(nodeId, direction, options) and document that it returns an array of NeighborEdge objects (each with neighborId and label) rather than plain node IDs; also update the implementations' descriptions for AdjacencyNeighborProvider and BitmapNeighborProvider to reference getNeighbors(nodeId, direction, options) and NeighborEdge return type so the docs align with the code.
♻️ Duplicate comments (6)
ROADMAP.md (1)
25-31: ⚠️ Potential issue | 🟡 Minor — Version sequence is still non-monotonic in completed milestones.

After Line 26 (`v11.1.0`), Lines 27–31 revert to `v11.0.0`, which breaks chronological ordering in this completed-milestones table. Please normalize these to the intended release sequence.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ROADMAP.md` around lines 25-31: The completed-milestones table has non-monotonic version numbers: after "FOUNDATION LIFT" (v11.1.0) the subsequent rows "GHOST PROTOCOL", "VERIFY OR IT DIDN'T HAPPEN", "CLI DECOMPOSITION", "SAFE ERGONOMICS", and "TRUST V1" are listed as v11.0.0 which breaks chronological ordering; update those milestone version fields (for M3 GHOST PROTOCOL, M4 VERIFY OR IT DIDN'T HAPPEN, M5 CLI DECOMPOSITION, M6 SAFE ERGONOMICS, M7 TRUST V1) to the correct subsequent versions in sequence (e.g., increment or set to v11.2.0, v11.3.0, etc., or the intended release numbers) so the table is monotonic and reflects the intended release order.

src/domain/services/PropertyIndexReader.js (1)
97-105: ⚠️ Potential issue | 🟠 Major — Fail fast on invalid shard payload shape instead of silently caching empty data.
Line 100 currently accepts only array-form data but silently treats other decoded shapes as empty, which can hide corruption and drop properties without signal.
Suggested fix
const decoded = this._codec.decode(buffer); // Shards are stored as array of [nodeId, props] pairs (proto-safe) /** `@type` {Record<string, Record<string, unknown>>} */ const data = Object.create(null); if (Array.isArray(decoded)) { for (const [nid, props] of /** `@type` {Array<[string, Record<string, unknown>]>} */ (decoded)) { data[nid] = props; } + } else if (decoded && typeof decoded === 'object') { + // Backward compatibility for legacy object-form shards + for (const [nid, props] of Object.entries(/** `@type` {Record<string, Record<string, unknown>>} */ (decoded))) { + data[nid] = props; + } + } else { + throw new Error(`PropertyIndexReader: invalid shard format for '${path}'`); } this._cache.set(path, data);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/PropertyIndexReader.js` around lines 97 - 105, The code currently allows non-array decoded shard payloads and then silently caches an empty object; change this to fail fast: in the method that uses decoded (the block assigning const data = Object.create(null); and iterating decoded), validate that decoded is an Array and that each entry is a [nid, props] tuple (nid is a string and props is an object); if validation fails, throw a descriptive Error (including path and typeof decoded) and do not call this._cache.set(path, data). This ensures invalid shard shapes are rejected instead of being silently cached.
src/domain/services/TemporalQuery.js (1)
311-320: ⚠️ Potential issue | 🟠 Major
Clone checkpoint state before replay to avoid cross-query mutation bleed.
Line 319 returns `ck.state` by reference. Since replay mutates `state`, reused checkpoint objects can contaminate subsequent temporal queries.
Suggested fix

```diff
 if (ck && ck.state && ck.maxLamport <= since) {
   const idx = allPatches.findIndex(
     ({ patch }) => patch.lamport > ck.maxLamport,
   );
   const startIdx = idx < 0 ? allPatches.length : idx;
-  return { state: ck.state, startIdx, checkpointMaxLamport: ck.maxLamport };
+  const clonedState = structuredClone(ck.state);
+  return { state: clonedState, startIdx, checkpointMaxLamport: ck.maxLamport };
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/TemporalQuery.js` around lines 311 - 320, _resolveStart returns ck.state by reference which allows the replay to mutate shared checkpoint objects; update _resolveStart to return a deep clone of ck.state (not the original) before returning in the object so each query gets an independent WarpStateV5. Locate _resolveStart and where it returns { state: ck.state, ... } and replace that with a deep-copy of ck.state (use structuredClone if available in the runtime, or a project-standard deepClone helper or JSON.parse(JSON.stringify(...)) fallback) so subsequent replays cannot contaminate the original checkpoint.
ARCHITECTURE.md (1)
254-255: ⚠️ Potential issue | 🟡 Minor
Update MaterializedViewService lifecycle names to current API surface.
Line 254 uses legacy lifecycle names (`persist`/`load`/`update`) and should be aligned with the implemented methods (`persistIndexTree`, `loadFromOids`, `applyDiff`, `verifyIndex`) to avoid migration confusion.
Suggested doc patch

```diff
-- Lifecycle: `build()` → `persist()` → `load()`, with incremental `update()` for hot paths
+- Lifecycle: `build()` → `persistIndexTree()` → `loadFromOids()`, with incremental `applyDiff()` for hot paths; integrity checks via `verifyIndex()`
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ARCHITECTURE.md` around lines 254 - 255, Update the lifecycle names in the Architecture doc to match MaterializedViewService's current API: replace the legacy sequence "build() → persist() → load(), with incremental update() for hot paths" with the implemented methods, e.g. "build() → persistIndexTree() → loadFromOids(), with incremental applyDiff() for hot paths and verifyIndex() for validation"; ensure the doc references MaterializedViewService and the exact method names persistIndexTree, loadFromOids, applyDiff, and verifyIndex so readers can map to the code.
src/domain/services/GraphTraversal.js (1)
388-391: ⚠️ Potential issue | 🟠 Major
Validate `start` before the `start === goal` fast path in `isReachable`.
Line 388 currently skips `_validateStart(start)`, so a non-existent node can incorrectly return `reachable: true` when `start === goal`.
Suggested fix

```diff
 }) {
   this._resetStats();
+  await this._validateStart(start);
   if (start === goal) {
     return { reachable: true, stats: this._stats(1) };
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/GraphTraversal.js` around lines 388 - 391, In isReachable, call this._validateStart(start) before the fast-path equality check; currently the code calls this._resetStats() then compares start === goal and can return reachable: true for an invalid start node. Move or add a this._validateStart(start) invocation ahead of the start === goal branch (preserving the existing this._resetStats() and use of this._stats(1)) so invalid starts throw/handle validation before returning the fast-path result.
test/unit/domain/services/GraphTraversal.test.js (1)
621-626: Mock async provider still forwards only 2 of 3 `getNeighbors` arguments.
Line 623 passes `(nodeId, opts)` but `AdjacencyNeighborProvider.getNeighbors` expects `(nodeId, direction, options)`. Here `opts` captures `direction` and `options` is dropped. This doesn't break the current cache-hit/miss test (BFS without label filters), but the mock is misleading and could silently mask bugs if this test is extended to cover label-filtered async paths.
Suggested fix

```diff
-      getNeighbors: (/** @type {string} */ nodeId, /** @type {*} */ opts) => inner.getNeighbors(nodeId, opts),
+      getNeighbors: (/** @type {string} */ nodeId, /** @type {*} */ direction, /** @type {*} */ options) => inner.getNeighbors(nodeId, direction, options),
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/GraphTraversal.test.js` around lines 621 - 626, The mock asyncProvider.getNeighbors is forwarding only (nodeId, opts) but AdjacencyNeighborProvider.getNeighbors requires (nodeId, direction, options), so update the mock in GraphTraversal.test.js to forward all three parameters by calling inner.getNeighbors(nodeId, direction, options) (preserving parameter names used in the test) so the mock accurately mirrors AdjacencyNeighborProvider.getNeighbors; keep the other properties (like latencyClass) unchanged.
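As an aside on the arity trap flagged above: JavaScript never complains when a wrapper forwards fewer arguments than the callee declares, so the third argument is silently dropped. A minimal, self-contained sketch (all names here are illustrative, not the test's actual mock):

```javascript
// Illustrative inner provider: expects three arguments.
const inner = {
  getNeighbors(nodeId, direction, options) {
    return { nodeId, direction, options };
  },
};

// Buggy mock: `opts` actually captures `direction`; `options` is silently lost.
const buggyGetNeighbors = (nodeId, opts) => inner.getNeighbors(nodeId, opts);

// Fixed mock: forwards all three parameters.
const fixedGetNeighbors = (nodeId, direction, options) =>
  inner.getNeighbors(nodeId, direction, options);

console.log(buggyGetNeighbors('A', 'out', { labelFilter: 'knows' }).options); // undefined
console.log(fixedGetNeighbors('A', 'out', { labelFilter: 'knows' }).options); // { labelFilter: 'knows' }
```

This is why the mismatch stays invisible until a label-filtered path actually reads `options`.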
🧹 Nitpick comments (10)
docker/Dockerfile.deno (1)
5-5: Bump the base image to Deno 2.7.0, the current latest release.
`denoland/deno:2.1.9` is significantly outdated. Version 2.7.0 (released 2026-02-25) includes substantial improvements to npm dependency resolution, peer dependency handling, `bundleDependencies` support, and Node.js API compatibility between 2.1.9 and 2.7.x. While pinning to a specific version is fine for reproducibility, this pin is worth refreshing to benefit from these improvements and reduce dependency on backports for npm tooling fixes.
Note: While the codebase uses `roaring` (a native addon), improvements in npm/Node.js compatibility do not specifically address native addon support. Native addon compatibility depends on Node-API (N-API) availability in the Deno version, which is beyond npm-layer improvements.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docker/Dockerfile.deno` at line 5, Update the pinned Deno base image from "denoland/deno:2.1.9" to "denoland/deno:2.7.0" in the Dockerfile entry that currently uses the literal image string "denoland/deno:2.1.9"; replace that tag so the FROM instruction pulls deno:2.7.0 to get the npm/Node API and dependency resolution fixes while keeping the image pin for reproducibility.
src/domain/services/IncrementalIndexUpdater.js (2)
53-57: `_collectDirtyShardKeys` result is unused beyond the empty-check.
The collected `Set<string>` is only tested for `.size === 0` (line 55). The shard keys themselves aren't used to scope processing. This is functionally correct (empty dirty keys ⟺ empty diff), but a reader might expect the set to drive per-shard filtering.
Consider renaming to `_isDiffEmpty(diff)` returning a boolean, or adding a brief comment clarifying the intent.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IncrementalIndexUpdater.js` around lines 53 - 57, The code calls _collectDirtyShardKeys(diff) only to test if the returned Set is empty, but the keys are never used; update computeDirtyShards to make intent explicit by either renaming _collectDirtyShardKeys to _isDiffEmpty and return a boolean (and use that boolean here), or keep the Set-returning helper and add a one-line comment above its use in computeDirtyShards clarifying that the Set is only used for an emptiness check and not for per-shard processing; adjust the helper name/signature (_collectDirtyShardKeys or _isDiffEmpty) and any callers accordingly to keep intent clear.
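A minimal sketch of the intent-revealing alternative. The diff shape used here is a hypothetical stand-in; the real diff structure in `IncrementalIndexUpdater` may differ:

```javascript
// Hypothetical diff shape for illustration only.
// A boolean predicate states the "is there anything to do?" contract
// directly, instead of building a Set that is only checked for emptiness.
function isDiffEmpty(diff) {
  return (
    diff.addedNodes.length === 0 &&
    diff.removedNodes.length === 0 &&
    diff.addedEdges.length === 0 &&
    diff.removedEdges.length === 0
  );
}

const emptyDiff = { addedNodes: [], removedNodes: [], addedEdges: [], removedEdges: [] };
const dirtyDiff = { addedNodes: ['A'], removedNodes: [], addedEdges: [], removedEdges: [] };
console.log(isDiffEmpty(emptyDiff)); // true
console.log(isDiffEmpty(dirtyDiff)); // false
```

The predicate name carries the intent that the Set-returning helper's call site currently has to infer.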
240-319: `_purgeNodeEdges` is correct but consider extracting the symmetric forward/reverse cleanup.
The forward (lines 252-285) and reverse (lines 287-318) purge blocks are nearly mirror images. A shared helper like `_purgeDirection(deadGid, shardKey, primaryCache, peerCache, primaryDir, peerDir, loadShard)` could reduce the ~70 lines of near-duplicate logic.
Not blocking — the current code is clear and well-commented.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IncrementalIndexUpdater.js` around lines 240 - 319, The forward and reverse cleanup blocks in _purgeNodeEdges are duplicated; extract the shared logic into a new helper _purgeDirection(deadGid, shardKey, primaryCache, peerCache, primaryDir, peerDir, loadShard) that encapsulates: loading the primary shard via _getOrLoadEdgeShard, deserializing the bitmap with getRoaringBitmap32(), iterating buckets, collecting peers, zeroing the primary bitmap, and removing deadGid from each peer bitmap by resolving peer node ids via _findNodeIdByGlobal and loading peer shards with _getOrLoadEdgeShard; then replace the two mirrored blocks in _purgeNodeEdges with two calls to _purgeDirection (one for forward->reverse and one for reverse->forward) passing appropriate cache/dir arguments and computeShardKey where needed.
src/domain/services/MaterializedViewService.js (1)
345-378: `passed` counter is per-direction-check, not per-node — consider documenting this.
A node that passes both `out` and `in` checks increments `passed` by 2, while a node that fails the `isAlive` check produces 1 error and 0 `passed`. This means `passed + failed` doesn't equal `sampled.length * 2` when alive-bit mismatches occur. Not a bug, but the asymmetry could confuse consumers (e.g., CLI verify-index output).
A brief JSDoc note on `VerifyResult.passed` (e.g., "count of successful direction checks, not nodes") would clarify the semantics.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/MaterializedViewService.js` around lines 345 - 378, The verifyIndex function's returned VerifyResult.passed counts successful direction checks (each node can contribute up to 2: 'out' and 'in') rather than successful nodes; update the docs to make this explicit by adding a JSDoc note for VerifyResult.passed (or the verifyIndex return type) clarifying "count of successful direction checks (each node may increment this up to 2 for 'out' and 'in')", and mention the asymmetry when a node fails the isAlive check; reference verifyIndex, VerifyResult.passed, compareNodeDirection, sampleNodes and state.nodeAlive so reviewers can locate the spot to add the comment.
index.d.ts (1)
264-297: Optional: extract shared weighted-options shape to reduce declaration drift.
`dir`, `labelFilter`, and `signal` are repeated across multiple weighted method signatures; a shared type alias would make future updates safer.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@index.d.ts` around lines 264 - 297, Extract the repeated option fields into a shared alias and use it in the weighted graph method signatures: create a type (e.g. WeightedTraversalOptions) that includes dir?: 'out' | 'in' | 'both'; labelFilter?: string | string[]; signal?: AbortSignal and then update weightedShortestPath, aStarSearch, bidirectionalAStar, and weightedLongestPath to accept options?: WeightedCostSelector & WeightedTraversalOptions (and for aStarSearch/bidirectionalAStar extend that alias with their heuristic fields like heuristicFn/forwardHeuristic/backwardHeuristic as needed).
test/unit/domain/services/GraphTraversal.astar.test.js (1)
19-45: Consolidate duplicate tie-break assertions in F5 A* tests.
The two test cases currently validate the same expected path and cost. Consider merging them and using the freed slot for a distinct assertion (e.g., explored count or hook-observed expansion order).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/GraphTraversal.astar.test.js` around lines 19 - 45, Tests 'chooses B before C on equal f-score (B < C)' and 'path is S→B→G (lex tie-break)' duplicate the same assertions; merge them into a single test that calls GraphTraversal.aStarSearch with makeAdjacencyProvider(F5_ASTAR_TIE_BREAK), weightFn makeWeightFn(F5_WEIGHTS) and heuristicFn ()=>0, assert result.path === ['S','B','G'] and result.totalCost === 11, then add a new distinct assertion using available hooks or provider instrumentation (e.g., check explored node count or the expansion order observed by the provider) to replace the removed duplicate; update or remove the second it() block accordingly to keep tests unique.
test/integration/api/materializedView.test.js (1)
30-54: Reduce coupling to underscored internals in integration tests.
These assertions are currently bound to private implementation fields (`_materializedGraph`, `_logicalIndex`, `_propertyReader`). Prefer public API assertions (e.g., behavior through `getNodes`, traversal, or verify/reindex surface) so refactors don't cause brittle test failures.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/integration/api/materializedView.test.js` around lines 30 - 54, Tests currently assert private fields (_materializedGraph, _logicalIndex, _propertyReader); change them to use public APIs after calling graph.materialize(): verify nodes and edges via public accessors (e.g., call graph.getNodes() or graph.getNodeIds() to assert 'A' and 'B' exist and a nonexistent id is absent), verify edge presence by using the public neighbor/traversal API (e.g., graph.getNeighbors('A') or graph.traverse) to confirm an edge 'A'->'B' with label 'knows', and verify properties via the public property reader (e.g., graph.getProperty('A','name') === 'Alice'); remove assertions referencing _materializedGraph, _logicalIndex.isAlive, and _propertyReader and replace them with these behavior-driven checks that rely only on graph.materialize() and the graph's public methods.
test/helpers/fixtureDsl.js (2)
462-466: Dynamic import of `GraphTraversal` on every loop iteration is wasteful.
`await import('../../src/domain/services/GraphTraversal.js')` is re-evaluated for each provider. Move it outside the loop for a small efficiency gain.
Suggested fix

```diff
 export async function runCrossProvider({ fixture, providers, run, assert }) {
   const results = [];
+  const { default: GraphTraversal } = await import('../../src/domain/services/GraphTraversal.js');
   for (const { name, provider } of providers) {
-    const { default: GraphTraversal } = await import('../../src/domain/services/GraphTraversal.js');
     const engine = new GraphTraversal({ provider });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/helpers/fixtureDsl.js` around lines 462 - 466, The runCrossProvider function currently does a dynamic import of GraphTraversal inside the providers loop which repeatedly re-evaluates the module; move the dynamic import (await import('../../src/domain/services/GraphTraversal.js')) out of the for loop so you import once, e.g., obtain GraphTraversal before iterating providers, then inside the loop construct new GraphTraversal({ provider }) as before; update references to the imported default (GraphTraversal) accordingly in runCrossProvider to avoid repeated imports.
40-57: `makeFixture` still does not validate `props[].nodeId` or tombstone targets.
A past review flagged that `props` entries referencing non-existent nodes and tombstone targets for nodes/edges not in the fixture are silently accepted. This can cause tests to pass for the wrong reason when a fixture is malformed. Adding these guards would improve fixture reliability.
Suggested guardrails

```diff
 export function makeFixture({ nodes, edges, props = [], tombstones = {} }) {
   const nodeSet = new Set(nodes);
   const normalizedEdges = edges.map(({ from, to, label }) => ({
     from,
     to,
     label: label ?? '',
   }));
+  const edgeKeySet = new Set(
+    normalizedEdges.map(({ from, to, label }) => `${from}\0${to}\0${label}`)
+  );

   // Validate: all edge endpoints must be in nodes
   for (const { from, to } of normalizedEdges) {
     if (!nodeSet.has(from)) {
       throw new Error(`Edge from '${from}' — node not in fixture.nodes`);
     }
     if (!nodeSet.has(to)) {
       throw new Error(`Edge to '${to}' — node not in fixture.nodes`);
     }
   }
+
+  for (const { nodeId } of props) {
+    if (!nodeSet.has(nodeId)) {
+      throw new Error(`Prop target '${nodeId}' — node not in fixture.nodes`);
+    }
+  }
+
+  for (const nodeId of (tombstones.nodes ?? new Set())) {
+    if (!nodeSet.has(nodeId)) {
+      throw new Error(`Tombstoned node '${nodeId}' — node not in fixture.nodes`);
+    }
+  }
+
+  for (const edgeKey of (tombstones.edges ?? new Set())) {
+    if (!edgeKeySet.has(edgeKey)) {
+      throw new Error(`Tombstoned edge '${edgeKey}' — edge not in fixture.edges`);
+    }
+  }

   return { nodes, edges: normalizedEdges, props, tombstones };
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/helpers/fixtureDsl.js` around lines 40 - 57, makeFixture currently only validates edge endpoints; extend validation to reject props and tombstones that reference missing nodes or edges: iterate props and throw if any prop.nodeId is not in nodeSet; for tombstones, handle both node- and edge-style entries—if tombstones contains node ids (e.g. keys or a tombstones.nodes array) ensure each id is in nodeSet, and if tombstones contains edge entries (e.g. tombstones.edges array of {from,to,label}) ensure each edge exactly matches an entry in normalizedEdges (compare from, to and label) otherwise throw; use the existing symbols makeFixture, props, tombstones, nodeSet and normalizedEdges to locate and implement these checks.
test/unit/domain/services/LogicalIndexReader.test.js (1)
20-48: Consider importing the canonical `fixtureToState` from `fixtureDsl.js` instead of duplicating it.
`test/helpers/fixtureDsl.js` already exports `fixtureToState` (line 553) which handles the same `{ nodes, edges, props }` shape plus tombstone support. This local copy duplicates the core logic. Since no tests here use tombstones, the canonical helper is a drop-in replacement.

```diff
-import {
-  makeFixture,
-  F7_MULTILABEL_SAME_NEIGHBOR,
-  F10_PROTO_POLLUTION,
-} from '../../../helpers/fixtureDsl.js';
+import {
+  makeFixture,
+  fixtureToState,
+  F7_MULTILABEL_SAME_NEIGHBOR,
+  F10_PROTO_POLLUTION,
+} from '../../../helpers/fixtureDsl.js';
```

Then remove the local `fixtureToState` (lines 20–48). The call at line 290 (`fixtureToState({ nodes, edges })`) would need wrapping with `makeFixture` or passing a fixture-shaped object, but the existing usages with `F7_MULTILABEL_SAME_NEIGHBOR` / `F10_PROTO_POLLUTION` work as-is.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/LogicalIndexReader.test.js` around lines 20 - 48, Replace the duplicated local fixtureToState implementation with the canonical export from test/helpers/fixtureDsl.js: import fixtureToState from that module, delete the local fixtureToState function, and update any callers to pass the same { nodes, edges, props } fixture-shaped object (or wrap with makeFixture if you need the helper behavior); specifically update tests that call fixtureToState with fixtures like F7_MULTILABEL_SAME_NEIGHBOR and F10_PROTO_POLLUTION to use the imported fixtureToState so tombstone support and canonical behavior are preserved.
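Stepping back, the determinism invariant that fixtures like F1 BFS_LEVEL_SORT_TRAP are designed to catch can be illustrated with a toy BFS that sorts each node's neighbors before enqueueing, so equal-depth nodes are visited in lexicographic order. The adjacency-map shape here is an assumption for illustration, not the engine's actual provider API:

```javascript
// Deterministic BFS sketch: sorting neighbors before enqueueing makes the
// full visit order stable across runs, regardless of Map insertion order.
function bfs(adjacency, start) {
  const visited = new Set([start]);
  const order = [];
  const queue = [start];
  while (queue.length > 0) {
    const node = queue.shift();
    order.push(node);
    // Copy then sort: the deterministic step a "fake BFS" skips.
    const neighbors = [...(adjacency.get(node) ?? [])].sort();
    for (const n of neighbors) {
      if (!visited.has(n)) {
        visited.add(n);
        queue.push(n);
      }
    }
  }
  return order;
}

// Neighbors of S are deliberately listed out of order.
const g = new Map([
  ['S', ['C', 'B']],
  ['B', ['G']],
  ['C', ['G']],
  ['G', []],
]);
console.log(bfs(g, 'S')); // ['S', 'B', 'C', 'G']
```

Without the sort, the visit order would depend on the insertion order of the adjacency lists, which is exactly the nondeterminism the contract tests forbid.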
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@ARCHITECTURE.md`:
- Around line 202-204: Update the GraphTraversal method descriptions so they
match the runtime API in GraphTraversal (notably functions in
src/domain/services/GraphTraversal.js): change `bfs()` / `dfs()` to indicate
they return arrays of nodes (not generators), change `shortestPath()` to
describe it as a BFS-based shortest-path (not bidirectional BFS), and ensure
`weightedShortestPath()` is described as Dijkstra with the existing `weightFn`
or `nodeWeightFn` options; mention the exact method names `bfs`, `dfs`,
`shortestPath`, and `weightedShortestPath` so readers can map to the
implementation.
In `@bin/cli/commands/verify-index.js`:
- Around line 31-35: The catch currently assumes the thrown value is an Error
and uses /** `@type` {Error} */ (err).message which can produce undefined for
non-Error throws; update the catch return to normalize the thrown value (e.g.,
const message = err && typeof err === 'object' && 'message' in err ?
String(err.message) : String(err);) and set payload.error to that message (still
returning exitCode: EXIT_CODES.INTERNAL). Locate the catch in verify-index.js
(the return object that sets payload.error) and replace the cast with the
normalization logic so non-Error throws yield a sensible string instead of
undefined.
In `@docker/Dockerfile.deno`:
- Around line 7-13: Remove the redundant system package by deleting "node-gyp"
from the apt install list in the RUN apt-get update && apt-get install -y ...
line so only the actual build prerequisites (python3, make, g++) are installed;
keep the rest of the RUN command (including git and the rm -rf
/var/lib/apt/lists/* cleanup) unchanged so npm continues to use its bundled
node-gyp from node_modules/.bin/.
In `@index.d.ts`:
- Around line 526-528: Update the SeekCachePort get/set buffer types to accept
Buffer | Uint8Array to match runtime values (change the return/object and set
parameter from `Buffer` to `Buffer | Uint8Array`), or alternatively perform an
explicit runtime conversion in CasSeekCacheAdapter.get() (convert the Uint8Array
from cas.restore() into a Buffer before returning) so the type declarations and
runtime behavior stay consistent; modify the signatures for abstract get and set
accordingly.
In `@ROADMAP.md`:
- Around line 3-5: Update the ROADMAP.md header to avoid stale branch metadata
by replacing or removing the branch-specific line "Current branch:
`fix-the-arch`": either change it to a stable value like "Current branch:
`main`" at merge time or remove the "Current branch" entry entirely so the
roadmap only contains stable fields (e.g., version and last reconciled); modify
the header block where "Current branch: `fix-the-arch`" appears to implement the
chosen approach.
In `@src/domain/services/LogicalIndexReader.js`:
- Around line 226-243: The meta.alive buffer passed to Ctor.deserialize in
_decodeMeta may be a plain Array after decoding; wrap it in Uint8Array.from(...)
before deserialization to match the defensive pattern used in
IncrementalIndexUpdater._loadMeta — i.e., replace the current
Ctor.deserialize(meta.alive.slice(), true) call in _decodeMeta with
Ctor.deserialize(Uint8Array.from(meta.alive.slice()), true) to ensure
RoaringBitmap32.deserialize always receives a Uint8Array.
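The defensive wrap recommended above guards against a common codec round-trip pitfall: bytes that pass through a JSON-like codec come back as plain Arrays, not typed arrays. A small stdlib-only sketch of the pattern:

```javascript
// Bytes serialized through a JSON-like codec decode to a plain Array,
// which is not a typed-array view and would be rejected by a
// deserializer that expects a Uint8Array.
const stored = Uint8Array.of(1, 2, 3);
const decoded = JSON.parse(JSON.stringify(Array.from(stored)));

console.log(ArrayBuffer.isView(decoded)); // false — plain Array after decode

// The defensive wrap: always normalize to Uint8Array before deserializing.
const bytes = Uint8Array.from(decoded);
console.log(ArrayBuffer.isView(bytes)); // true
console.log(bytes[2]); // 3
```

`Uint8Array.from` is a no-cost normalization when the input is already byte-like, which is why applying it unconditionally (as the suggested `_decodeMeta` fix does) is safe.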
---
Outside diff comments:
In `@ARCHITECTURE.md`:
- Around line 260-308: Update the NeighborProviderPort docs to match the actual
API: change the getNeighbors signature from getNeighbors(nodeId, direction,
labelFilter) to getNeighbors(nodeId, direction, options) and document that it
returns an array of NeighborEdge objects (each with neighborId and label) rather
than plain node IDs; also update the implementations' descriptions for
AdjacencyNeighborProvider and BitmapNeighborProvider to reference
getNeighbors(nodeId, direction, options) and NeighborEdge return type so the
docs align with the code.
---
Duplicate comments:
In `@ARCHITECTURE.md`:
- Around line 254-255: Update the lifecycle names in the Architecture doc to
match MaterializedViewService's current API: replace the legacy sequence
"build() → persist() → load(), with incremental update() for hot paths" with the
implemented methods, e.g. "build() → persistIndexTree() → loadFromOids(), with
incremental applyDiff() for hot paths and verifyIndex() for validation"; ensure
the doc references MaterializedViewService and the exact method names
persistIndexTree, loadFromOids, applyDiff, and verifyIndex so readers can map to
the code.
In `@ROADMAP.md`:
- Around line 25-31: The completed-milestones table has non-monotonic version
numbers: after "FOUNDATION LIFT" (v11.1.0) the subsequent rows "GHOST PROTOCOL",
"VERIFY OR IT DIDN'T HAPPEN", "CLI DECOMPOSITION", "SAFE ERGONOMICS", and "TRUST
V1" are listed as v11.0.0 which breaks chronological ordering; update those
milestone version fields (for M3 GHOST PROTOCOL, M4 VERIFY OR IT DIDN'T HAPPEN,
M5 CLI DECOMPOSITION, M6 SAFE ERGONOMICS, M7 TRUST V1) to the correct subsequent
versions in sequence (e.g., increment or set to v11.2.0, v11.3.0, etc., or the
intended release numbers) so the table is monotonic and reflects the intended
release order.
In `@src/domain/services/GraphTraversal.js`:
- Around line 388-391: In isReachable, call this._validateStart(start) before
the fast-path equality check; currently the code calls this._resetStats() then
compares start === goal and can return reachable: true for an invalid start
node. Move or add a this._validateStart(start) invocation ahead of the start ===
goal branch (preserving the existing this._resetStats() and use of
this._stats(1)) so invalid starts throw/handle validation before returning the
fast-path result.
In `@src/domain/services/PropertyIndexReader.js`:
- Around line 97-105: The code currently allows non-array decoded shard payloads
and then silently caches an empty object; change this to fail fast: in the
method that uses decoded (the block assigning const data = Object.create(null);
and iterating decoded), validate that decoded is an Array and that each entry is
a [nid, props] tuple (nid is a string and props is an object); if validation
fails, throw a descriptive Error (including path and typeof decoded) and do not
call this._cache.set(path, data). This ensures invalid shard shapes are rejected
instead of being silently cached.
In `@src/domain/services/TemporalQuery.js`:
- Around line 311-320: _resolveStart returns ck.state by reference which allows
the replay to mutate shared checkpoint objects; update _resolveStart to return a
deep clone of ck.state (not the original) before returning in the object so each
query gets an independent WarpStateV5. Locate _resolveStart and where it returns
{ state: ck.state, ... } and replace that with a deep-copy of ck.state (use
structuredClone if available in the runtime, or a project-standard deepClone
helper or JSON.parse(JSON.stringify(...)) fallback) so subsequent replays cannot
contaminate the original checkpoint.
In `@test/unit/domain/services/GraphTraversal.test.js`:
- Around line 621-626: The mock asyncProvider.getNeighbors is forwarding only
(nodeId, opts) but AdjacencyNeighborProvider.getNeighbors requires (nodeId,
direction, options), so update the mock in GraphTraversal.test.js to forward all
three parameters by calling inner.getNeighbors(nodeId, direction, options)
(preserving parameter names used in the test) so the mock accurately mirrors
AdjacencyNeighborProvider.getNeighbors; keep the other properties (like
latencyClass) unchanged.
---
Nitpick comments:
In `@docker/Dockerfile.deno`:
- Line 5: Update the pinned Deno base image from "denoland/deno:2.1.9" to
"denoland/deno:2.7.0" in the Dockerfile entry that currently uses the literal
image string "denoland/deno:2.1.9"; replace that tag so the FROM instruction
pulls deno:2.7.0 to get the npm/Node API and dependency resolution fixes while
keeping the image pin for reproducibility.
In `@index.d.ts`:
- Around line 264-297: Extract the repeated option fields into a shared alias
and use it in the weighted graph method signatures: create a type (e.g.
WeightedTraversalOptions) that includes dir?: 'out' | 'in' | 'both';
labelFilter?: string | string[]; signal?: AbortSignal and then update
weightedShortestPath, aStarSearch, bidirectionalAStar, and weightedLongestPath
to accept options?: WeightedCostSelector & WeightedTraversalOptions (and for
aStarSearch/bidirectionalAStar extend that alias with their heuristic fields
like heuristicFn/forwardHeuristic/backwardHeuristic as needed).
In `@src/domain/services/IncrementalIndexUpdater.js`:
- Around line 53-57: The code calls _collectDirtyShardKeys(diff) only to test if
the returned Set is empty, but the keys are never used; update
computeDirtyShards to make intent explicit by either renaming
_collectDirtyShardKeys to _isDiffEmpty and return a boolean (and use that
boolean here), or keep the Set-returning helper and add a one-line comment above
its use in computeDirtyShards clarifying that the Set is only used for an
emptiness check and not for per-shard processing; adjust the helper
name/signature (_collectDirtyShardKeys or _isDiffEmpty) and any callers
accordingly to keep intent clear.
- Around line 240-319: The forward and reverse cleanup blocks in _purgeNodeEdges
are duplicated; extract the shared logic into a new helper
_purgeDirection(deadGid, shardKey, primaryCache, peerCache, primaryDir, peerDir,
loadShard) that encapsulates: loading the primary shard via _getOrLoadEdgeShard,
deserializing the bitmap with getRoaringBitmap32(), iterating buckets,
collecting peers, zeroing the primary bitmap, and removing deadGid from each
peer bitmap by resolving peer node ids via _findNodeIdByGlobal and loading peer
shards with _getOrLoadEdgeShard; then replace the two mirrored blocks in
_purgeNodeEdges with two calls to _purgeDirection (one for forward->reverse and
one for reverse->forward) passing appropriate cache/dir arguments and
computeShardKey where needed.
In `@src/domain/services/MaterializedViewService.js`:
- Around line 345-378: The verifyIndex function's returned VerifyResult.passed
counts successful direction checks (each node can contribute up to 2: 'out' and
'in') rather than successful nodes; update the docs to make this explicit by
adding a JSDoc note for VerifyResult.passed (or the verifyIndex return type)
clarifying "count of successful direction checks (each node may increment this
up to 2 for 'out' and 'in')", and mention the asymmetry when a node fails the
isAlive check; reference verifyIndex, VerifyResult.passed, compareNodeDirection,
sampleNodes and state.nodeAlive so reviewers can locate the spot to add the
comment.
In `@test/helpers/fixtureDsl.js`:
- Around lines 462-466: runCrossProvider performs a dynamic import of GraphTraversal inside the providers loop, re-evaluating the import expression on every iteration. Move the dynamic import (await import('../../src/domain/services/GraphTraversal.js')) above the loop so the module is imported once, then inside the loop construct new GraphTraversal({ provider }) as before. Update references to the imported default (GraphTraversal) in runCrossProvider accordingly.
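The hoisting pattern, sketched with a hypothetical importGraphTraversal stand-in for the real await import(...) so it is demonstrable in isolation:

```javascript
// Hypothetical stand-in for `await import('.../GraphTraversal.js')`,
// instrumented so the hoisting is observable.
let importCount = 0;
async function importGraphTraversal() {
  importCount += 1;
  return {
    default: class GraphTraversal {
      constructor({ provider }) { this.provider = provider; }
    },
  };
}

async function runCrossProvider(providers, run) {
  // Import once, before the loop -- not once per provider.
  const { default: GraphTraversal } = await importGraphTraversal();
  const results = [];
  for (const provider of providers) {
    results.push(await run(new GraphTraversal({ provider })));
  }
  return results;
}
```

Note that the ES module cache already deduplicates the actual module evaluation, so the hoist is mostly about avoiding the repeated await and keeping the loop body free of import machinery.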
- Around lines 40-57: makeFixture currently validates only edge endpoints. Extend validation to reject props and tombstones that reference missing nodes or edges: iterate props and throw if any prop.nodeId is not in nodeSet. For tombstones, handle both node- and edge-style entries: if tombstones contains node ids (e.g. keys or a tombstones.nodes array), ensure each id is in nodeSet; if it contains edge entries (e.g. a tombstones.edges array of {from,to,label}), ensure each edge exactly matches an entry in normalizedEdges (comparing from, to, and label), otherwise throw. Use the existing symbols makeFixture, props, tombstones, nodeSet, and normalizedEdges to locate and implement these checks.
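A hedged sketch of the extra checks, assuming a fixture shape of { nodes, edges, props, tombstones: { nodes, edges } }; the real DSL's normalizedEdges and tombstone shapes may differ:

```javascript
// Sketch of the referential checks makeFixture could gain. Shapes are
// assumptions: adjust to the actual fixtureDsl.js representations.
function validateFixtureRefs({ nodes, edges, props = [], tombstones = {} }) {
  const nodeSet = new Set(nodes);
  const edgeKey = (e) => `${e.from}\u0000${e.to}\u0000${e.label}`;
  const edgeSet = new Set(edges.map(edgeKey));

  for (const prop of props) {
    if (!nodeSet.has(prop.nodeId)) {
      throw new Error(`prop references missing node: ${prop.nodeId}`);
    }
  }
  for (const id of tombstones.nodes ?? []) {
    if (!nodeSet.has(id)) {
      throw new Error(`tombstone references missing node: ${id}`);
    }
  }
  for (const edge of tombstones.edges ?? []) {
    if (!edgeSet.has(edgeKey(edge))) {
      throw new Error(`tombstone references missing edge: ${edge.from}->${edge.to} (${edge.label})`);
    }
  }
}
```

Failing fast here keeps fixture bugs from surfacing later as confusing traversal-test failures.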
In `@test/integration/api/materializedView.test.js`:
- Around lines 30-54: Tests currently assert private fields (_materializedGraph, _logicalIndex, _propertyReader). Change them to use public APIs after calling graph.materialize(): verify nodes via public accessors (e.g. graph.getNodes() or graph.getNodeIds(), asserting 'A' and 'B' exist and a nonexistent id is absent); verify edge presence via the public neighbor/traversal API (e.g. graph.getNeighbors('A') or graph.traverse, confirming an edge 'A'->'B' with label 'knows'); and verify properties via the public property reader (e.g. graph.getProperty('A','name') === 'Alice'). Remove the assertions referencing _materializedGraph, _logicalIndex.isAlive, and _propertyReader and replace them with these behavior-driven checks, which rely only on graph.materialize() and the graph's public methods.
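A sketch of the behavior-driven replacement. The accessor names (getNodeIds, getNeighbors, getProperty) are the comment's own examples and may not match the graph's actual public API; the stub graph stands in for the result of graph.materialize():

```javascript
// Stub standing in for a materialized graph; swap in the real object.
const graph = {
  getNodeIds: () => ['A', 'B'],
  getNeighbors: (id) => (id === 'A' ? [{ to: 'B', label: 'knows' }] : []),
  getProperty: (id, key) => (id === 'A' && key === 'name' ? 'Alice' : undefined),
};

// Behavior-driven checks: no private fields, only public accessors.
function assertMaterializedView(g) {
  const ids = g.getNodeIds();
  if (!ids.includes('A') || !ids.includes('B')) throw new Error('missing nodes');
  if (ids.includes('Z')) throw new Error('nonexistent id reported as present');
  const hasEdge = g.getNeighbors('A').some((e) => e.to === 'B' && e.label === 'knows');
  if (!hasEdge) throw new Error("missing A->B 'knows' edge");
  if (g.getProperty('A', 'name') !== 'Alice') throw new Error('missing property A.name');
}
```

Asserting through the public surface means the tests survive internal refactors (renamed private fields, swapped index structures) as long as observable behavior is unchanged.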
In `@test/unit/domain/services/GraphTraversal.astar.test.js`:
- Around lines 19-45: The tests 'chooses B before C on equal f-score (B < C)' and 'path is S→B→G (lex tie-break)' duplicate the same assertions. Merge them into a single test that calls GraphTraversal.aStarSearch with makeAdjacencyProvider(F5_ASTAR_TIE_BREAK), weightFn makeWeightFn(F5_WEIGHTS), and heuristicFn () => 0, asserting result.path equals ['S','B','G'] and result.totalCost === 11. Then add a distinct assertion using available hooks or provider instrumentation (e.g. the explored node count, or the expansion order observed by the provider) to replace the removed duplicate. Update or remove the second it() block accordingly so each test is unique.
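One way to get a distinct second assertion is a small instrumentation wrapper that records the order in which the traversal asks for neighbors. This is a hypothetical helper, not an existing hook, and it assumes a synchronous getNeighbors/hasNode provider surface:

```javascript
// Wrap a neighbor provider so tests can assert on expansion order.
// Assumes the provider exposes getNeighbors(node, opts) and hasNode(node).
function withExpansionLog(provider) {
  const expanded = [];
  return {
    expanded, // exposed for assertions
    getNeighbors(node, opts) {
      expanded.push(node);
      return provider.getNeighbors(node, opts);
    },
    hasNode: (node) => provider.hasNode(node),
  };
}
```

The merged test can then assert both the result (path and totalCost) and the mechanism (e.g. that 'B' is expanded before 'C'), which is a genuinely different check than the path assertion alone.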
In `@test/unit/domain/services/LogicalIndexReader.test.js`:
- Around lines 20-48: Replace the duplicated local fixtureToState implementation with the canonical export from test/helpers/fixtureDsl.js: import fixtureToState from that module, delete the local function, and update callers to pass the same { nodes, edges, props } fixture-shaped object (or wrap with makeFixture if the helper behavior is needed). In particular, update the tests that call fixtureToState with fixtures like F7_MULTILABEL_SAME_NEIGHBOR and F10_PROTO_POLLUTION to use the imported fixtureToState so tombstone support and canonical behavior are preserved.
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (49)
- ARCHITECTURE.md
- CHANGELOG.md
- ROADMAP.md
- THE_PROBLEM.md
- bin/cli/commands/verify-index.js
- bin/cli/infrastructure.js
- docker-compose.test.yml
- docker/Dockerfile.deno
- docker/Dockerfile.node20
- docker/Dockerfile.node22
- docs/CLI_GUIDE.md
- index.d.ts
- package.json
- src/domain/services/AdjacencyNeighborProvider.js
- src/domain/services/CheckpointMessageCodec.js
- src/domain/services/CheckpointService.js
- src/domain/services/GraphTraversal.js
- src/domain/services/IncrementalIndexUpdater.js
- src/domain/services/LogicalBitmapIndexBuilder.js
- src/domain/services/LogicalIndexBuildService.js
- src/domain/services/LogicalIndexReader.js
- src/domain/services/MaterializedViewService.js
- src/domain/services/PropertyIndexReader.js
- src/domain/services/TemporalQuery.js
- src/domain/utils/MinHeap.js
- src/domain/utils/roaring.js
- src/domain/utils/shardKey.js
- src/domain/warp/checkpoint.methods.js
- src/domain/warp/query.methods.js
- test/helpers/fixtureDsl.js
- test/integration/api/materializedView.test.js
- test/unit/cli/verify-index.test.js
- test/unit/domain/WarpGraph.query.test.js
- test/unit/domain/WarpGraph.test.js
- test/unit/domain/services/AdjacencyNeighborProvider.test.js
- test/unit/domain/services/GraphTraversal.astar.test.js
- test/unit/domain/services/GraphTraversal.test.js
- test/unit/domain/services/IncrementalIndexUpdater.test.js
- test/unit/domain/services/LogicalBitmapIndexBuilder.stability.test.js
- test/unit/domain/services/LogicalBitmapIndexBuilder.test.js
- test/unit/domain/services/LogicalIndexBuildService.determinism.test.js
- test/unit/domain/services/LogicalIndexBuildService.test.js
- test/unit/domain/services/LogicalIndexReader.test.js
- test/unit/domain/services/MaterializedViewService.verify.test.js
- test/unit/domain/services/PropertyIndex.test.js
- test/unit/domain/services/TemporalQuery.checkpoint.test.js
- test/unit/domain/services/noBufferGlobal.test.js
- test/unit/domain/warp/hydrateCheckpointIndex.regression.test.js
- test/unit/helpers/fixtureDsl.test.js
🚧 Files skipped from review as they are similar to previous changes (9)
- test/unit/domain/services/MaterializedViewService.verify.test.js
- test/unit/domain/services/PropertyIndex.test.js
- src/domain/utils/shardKey.js
- test/unit/domain/services/LogicalBitmapIndexBuilder.test.js
- package.json
- test/unit/domain/services/LogicalIndexBuildService.test.js
- src/domain/services/AdjacencyNeighborProvider.js
- src/domain/utils/roaring.js
- bin/cli/infrastructure.js
Release Preflight
If you tag this commit as
Addressed ENHANCE follow-ups and remaining unresolved review threads in commits a5b240d, 456fb5e, c97f28f, and 0da191b. What’s fixed:
Validation:
All remaining review threads are now resolved. @coderabbitai review please 🥕
Your fixes look comprehensive and well-validated. ✅ LogicalIndexReader hardening — byte payload normalization (Buffer/Uint8Array/number[]) before Roaring deserialize prevents runtime errors across environments. I'll review the changes now. 🥕 ✅ Actions performed: Review triggered.
Follow-up fix for CI regression:
Local validation rerun (all green):
@coderabbitai review please 🥕
Restoring I'll review the changes now. 🥕 ✅ Actions performed: Review triggered.
Summary
- LogicalIndexBuildService, LogicalIndexReader, LogicalBitmapIndexBuilder.
- IncrementalIndexUpdater for O(diff) bitmap index updates, wired into the materialize path via JoinReducer.trackDiff.
- NeighborProviderPort. Deterministic tie-breaking everywhere.
- graph.traverse facade — all 11 GraphTraversal algorithms exposed through the public graph.traverse.* API via LogicalTraversal delegation.
- nodeWeightFn — node-weighted graph support for all 4 weighted algorithms. Lazy memoization, mutually exclusive with weightFn.
- TemporalQuery fast-paths when since > 0 via checkpoint coverage.
- ObserverView reuses parent provider.
- verify-index and reindex commands.

Test plan
- tsc --noEmit clean (0 wildcards)

Summary by CodeRabbit
New Features
Bug Fixes
Documentation
Tests