# v0.16: Parallel Fetching, Collection.move, Lazy

New Features:

- Parallel data loading with useFetch() - Fetch multiple endpoints concurrently with `use(useFetch())`, avoiding sequential waterfalls
- Collection.move - Move entities between Collections with a single operation
- Collection.moveWith() - Customize move behavior (e.g., prepend instead of append)
- Lazy - Deferred relationship denormalization for performance and memoization isolation

Other Improvements:

- Direct schema imports - Import schema classes directly without the `schema` namespace
- Denormalization depth limit - Prevent stack overflow in large bidirectional entity graphs; configurable via `Entity.maxEntityDepth` (#3822)
- DevToolsManager exposes `globalThis.__DC_CONTROLLERS__` in dev mode for programmatic store access from Chrome DevTools MCP and Expo MCP. Use the data-client-react skill to enable AI-assisted debugging.
- Remove misleading 'Uncaught Suspense' warning during Next.js SSR
- Fix `sideEffect: false` type being lost with `method: 'POST'` in RestEndpoint
- renderDataHook() automatic cleanup after each test; no manual `cleanup()` calls needed
- renderDataHook() returns per-render `cleanup()` and `allSettled()` for reliable test teardown
Collection.move makes it easy to move entities between Collections with a single operation:

```tsx
import { useController } from '@data-client/react';
import { TaskResource, type Task } from './TaskResource';

export default function TaskCard({ task }: { task: Task }) {
  const ctrl = useController();
  const handleMove = () =>
    ctrl.fetch(
      TaskResource.getList.move,
      { id: task.id },
      {
        id: task.id,
        status: task.status === 'backlog' ? 'in-progress' : 'backlog',
      },
    );
  return (
    <div className="listItem">
      <span style={{ flex: 1 }}>{task.title}</span>
      <button onClick={handleMove}>
        {task.status === 'backlog' ? '\u25bc' : '\u25b2'}
      </button>
    </div>
  );
}
```
Breaking changes:

- path-to-regexp v8 - RestEndpoint.path syntax updated: `/:optional?` → `{/:optional}`, `/:repeat+` → `/*repeat` (typed as `string[]`)
- useFetch() returns UsablePromise - Returns a thenable for `React.use()`; check `.resolved` instead of truthiness
Upgrade with an automated codemod for all breaking changes:
```bash
npx jscodeshift -t https://dataclient.io/codemods/v0.16.js --extensions=ts,tsx,js,jsx src/
```
## Collection.move
Collection.move and RestEndpoint.move make it easy to move entities between collections with a single PATCH request. The entity is automatically removed from collections matching its old state and added to collections matching the new values from the request body.
This is useful for kanban boards, group reassignment, status changes, or any workflow where an item moves between filtered lists.
```ts
const TaskResource = resource({
  path: '/tasks/:id',
  searchParams: {} as { status: string },
  schema: Task,
});

// PATCH /tasks/3 - moves task from 'backlog' to 'in-progress'
await ctrl.fetch(
  TaskResource.getList.move,
  { id: '3' },
  { id: '3', status: 'in-progress' },
);
```
Works for both path parameters and search parameters, and supports Array and Values collections.
### Collection.moveWith()
Collection.moveWith() constructs a custom move schema, analogous to addWith(). The merge function controls how entities are added to their destination collection (e.g., prepending instead of appending), while the remove behavior is automatically derived. The unshift merge function is provided for convenience.
```ts
import { Collection, unshift } from '@data-client/rest';

class MyCollection extends Collection {
  constructor(schema, options) {
    super(schema, options);
    this.move = this.moveWith(unshift);
  }
}
```
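To build intuition for what a merge function does, here is a toy sketch. This is not Data Client's internal signature, just an illustration of the append-vs-prepend choice that `moveWith(unshift)` controls:

```typescript
// Toy sketch (not Data Client internals): a Collection keeps an ordered
// list of primary keys, and a merge function decides where incoming keys go.
type Merge = (existing: string[], incoming: string[]) => string[];

// Default behavior: append the moved entity to the end of the list.
const appendMerge: Merge = (existing, incoming) => [...existing, ...incoming];

// unshift-style behavior: prepend the moved entity instead.
const unshiftMerge: Merge = (existing, incoming) => [...incoming, ...existing];

console.log(appendMerge(['a', 'b'], ['c']));  // [ 'a', 'b', 'c' ]
console.log(unshiftMerge(['a', 'b'], ['c'])); // [ 'c', 'a', 'b' ]
```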
## Parallel data loading
useFetch() now returns a UsablePromise thenable,
making `use(useFetch(endpoint, args))` a drop-in equivalent to useSuspense():
it suspends when data is loading, returns denormalized data when cached, throws on errors,
and re-suspends on invalidation.
Because useFetch() and use() are separate calls, multiple fetches start in parallel.
All useFetch() calls execute before any use() suspends, so every fetch is already in-flight
even if the first use() suspends the component.
This avoids the sequential waterfall that occurs with multiple useSuspense() calls,
where the second fetch cannot begin until the first resolves.
#3755
```tsx
import { use } from 'react';
import { useFetch } from '@data-client/react';
import { PostResource, CommentResource } from './Resources';

function PostWithComments({ id }: { id: number }) {
  // Both fetches start in parallel
  const postPromise = useFetch(PostResource.get, { id });
  const commentsPromise = useFetch(CommentResource.getList, { postId: id });

  // use() reads the results: if the first suspends,
  // the second fetch is already in-flight
  const post = use(postPromise);
  const comments = use(commentsPromise);

  return (
    <article>
      <h3>{post.title}</h3>
      <p>{post.body}</p>
      <h4>Comments</h4>
      {comments.map(comment => (
        <div key={comment.id} className="listItem">
          <strong>{comment.author}</strong>: {comment.text}
        </div>
      ))}
    </article>
  );
}

render(<PostWithComments id={1} />);
```
## Denormalization depth limit & Lazy
When schemas have bidirectional relationships across entity types
(e.g., Department → Building → Department → ...), denormalization traverses every unique
entity in the connected graph. With small datasets this is invisible, but at scale, with thousands
of interconnected entities, the recursion exceeds the JS call stack and throws
`RangeError: Maximum call stack size exceeded`.
#3822
Two complementary features address this: a depth limit as a safety net, and Lazy for precise, per-field control over which relationships are resolved eagerly.
### Denormalization depth limit
Denormalization now enforces a default depth limit of 64 entity hops. Entities beyond
the limit are returned with unresolved ids instead of fully denormalized objects.
A console.error is emitted in development mode when the limit is reached.
The limit can be configured per-Entity with static maxEntityDepth:
```ts
class Department extends Entity {
  static maxEntityDepth = 16;
}
```
This prevents crashes, but it is a blunt instrument: it cuts off all deep paths, including legitimate non-cyclic ones. For fine-grained control, use Lazy.
### Lazy
Lazy wraps a schema to skip eager denormalization of specific relationship fields.
During parent entity denormalization, the field retains its raw normalized value (primary keys).
The relationship can then be resolved on demand via useQuery using the .query accessor.
```ts
import { Entity, Lazy } from '@data-client/rest';

class Department extends Entity {
  id = '';
  name = '';
  buildings: string[] = [];

  static schema = {
    buildings: new Lazy([Building]),
  };
}
```
Unlike the depth limit, Lazy is a targeted opt-in per relationship. Only the
fields you mark as lazy skip eager resolution; the rest of the schema denormalizes normally.
This also improves performance by deferring work for relationships that aren't always
needed, and provides memoization isolation: changes to lazy entities don't invalidate
the parent's denormalized form.
See Lazy documentation for full usage details and #3828 for design discussion.
## Migration guide
This upgrade requires updating all package versions simultaneously.
NPM:

```bash
npm install --save @data-client/react@^0.16.0 @data-client/rest@^0.16.0 @data-client/endpoint@^0.16.0 @data-client/core@^0.16.0 @data-client/vue@^0.16.0 @data-client/test@^0.16.0 @data-client/img@^0.16.0
```

Yarn:

```bash
yarn add @data-client/react@^0.16.0 @data-client/rest@^0.16.0 @data-client/endpoint@^0.16.0 @data-client/core@^0.16.0 @data-client/vue@^0.16.0 @data-client/test@^0.16.0 @data-client/img@^0.16.0
```

pnpm:

```bash
pnpm add @data-client/react@^0.16.0 @data-client/rest@^0.16.0 @data-client/endpoint@^0.16.0 @data-client/core@^0.16.0 @data-client/vue@^0.16.0 @data-client/test@^0.16.0 @data-client/img@^0.16.0
```

esm.sh:

```html
<script type="module">
  import * as dataClientReact from 'https://esm.sh/@data-client/react@^0.16.0';
  import * as dataClientRest from 'https://esm.sh/@data-client/rest@^0.16.0';
  import * as dataClientEndpoint from 'https://esm.sh/@data-client/endpoint@^0.16.0';
  import * as dataClientCore from 'https://esm.sh/@data-client/core@^0.16.0';
  import * as dataClientVue from 'https://esm.sh/@data-client/vue@^0.16.0';
  import * as dataClientTest from 'https://esm.sh/@data-client/test@^0.16.0';
  import * as dataClientImg from 'https://esm.sh/@data-client/img@^0.16.0';
</script>
```
An automated codemod handles all three breaking changes below (path syntax, useFetch() truthiness, and schema namespace imports):
```bash
npx jscodeshift -t https://dataclient.io/codemods/v0.16.js --extensions=ts,tsx,js,jsx src/
```
### path-to-regexp v8
RestEndpoint.path now uses path-to-regexp v8, which changes the syntax for optional parameters, repeating parameters, and special characters.
| v6 syntax | v8 syntax | Description |
|---|---|---|
| `/:optional?` | `{/:optional}` | Optional parameter |
| `/:repeat+` | `/*repeat` | One-or-more repeating (typed as `string[]`) |
| `/:repeat*` | `{/*repeat}` | Zero-or-more repeating (typed as `string[]`) |
| `/:with-dash` | `/:"with-dash"` | Quoted parameter name |
| `/:id(\d+)` | `/:id` | Custom regex removed |
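To make the v8 template semantics concrete, here is a toy expander. It is an illustrative sketch, not path-to-regexp's actual implementation, covering optional `{}` groups, `*` wildcards, and plain (or quoted) `:params`:

```typescript
// Toy sketch, NOT path-to-regexp: expands a v8-style template with
// {optional groups}, *wildcards (array args), and :params (incl. :"quoted").
function fillPath(
  template: string,
  args: Record<string, string | string[] | undefined>,
): string {
  // Optional {...} groups are kept only when every param inside is supplied.
  let out = template.replace(/\{([^}]*)\}/g, (_, group: string) => {
    const params = [...group.matchAll(/[:*]"?([\w-]+)"?/g)].map(m => m[1]);
    return params.every(p => args[p] !== undefined) ? fillPath(group, args) : '';
  });
  // *wildcard params join array values across path segments.
  out = out.replace(/\*(\w+)/g, (_, name: string) =>
    ([] as string[]).concat(args[name] ?? []).join('/'),
  );
  // Plain :params (optionally double-quoted) substitute a single value.
  out = out.replace(/:"?([\w-]+)"?/g, (_, name: string) => String(args[name]));
  return out;
}

console.log(fillPath('/:group/things{/:number}', { group: 'a' }));
// '/a/things'
console.log(fillPath('/:group/things{/:number}', { group: 'a', number: '5' }));
// '/a/things/5'
console.log(fillPath('/files/*path', { path: ['docs', 'reports', 'q4'] }));
// '/files/docs/reports/q4'
```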
#### Optional parameters
The `?` suffix is removed. Wrap the optional segment (including its prefix) in `{}`.
```ts
// Before (v6)
const ep = new RestEndpoint({
  path: '/:group/things/:number?',
});
```

```ts
// After (v8)
const ep = new RestEndpoint({
  path: '/:group/things{/:number}',
});
```
Multiple optional segments with different prefixes:
```ts
// Before (v6)
const ep = new RestEndpoint({
  path: '/:attr1?{-:attr2}?{-:attr3}?',
});
```

```ts
// After (v8)
const ep = new RestEndpoint({
  path: '{/:attr1}{-:attr2}{-:attr3}',
});
```
#### Repeating parameters
The `+` (one-or-more) and `*` (zero-or-more) suffixes are removed. Use `*name` wildcard syntax, wrapped in `{}` for zero-or-more.
Wildcard parameters are now typed as `string[]` (arrays) instead of `string | number`, since they represent multiple path segments.
```ts
// Before (v6)
const ep = new RestEndpoint({
  path: '/files/:path+',
});
ep({ path: 'docs/reports/q4' });

const optEp = new RestEndpoint({
  path: '/files/:path*',
});
```

```ts
// After (v8)
const ep = new RestEndpoint({
  path: '/files/*path',
});
ep({ path: ['docs', 'reports', 'q4'] });

const optEp = new RestEndpoint({
  path: '/files{/*path}',
});
```
#### Quoted parameter names
Parameter names with special characters (dashes, dots) must now be quoted with double quotes:
```ts
// Before (v6)
const ep = new RestEndpoint({
  path: '/:with-dash',
});
```

```ts
// After (v8)
const ep = new RestEndpoint({
  path: '/:"with-dash"',
});
```
#### Custom regex removed

Inline regex patterns like `/:id(\d+)` are no longer supported. Remove the regex and validate on the server instead.
```ts
// Before (v6)
const ep = new RestEndpoint({
  path: '/users/:id(\\d+)',
});
```

```ts
// After (v8)
const ep = new RestEndpoint({
  path: '/users/:id',
});
```
#### Escaping special characters

Characters `{}()*:` and `\` must be escaped with `\\` when used as literals. However, `?` and `+` are
no longer special in v8, so existing escapes for those can be removed:
```ts
// Before (v6): ? had to be escaped
const ep = new RestEndpoint({
  path: '/search\\?q=:q',
});
```

```ts
// After (v8): ? is a plain literal
const ep = new RestEndpoint({
  path: '/search?{q=:q}{&page=:page}',
});
```
#### Finding paths to migrate
Search your codebase for these patterns in `path:` values:

- `?` after a `:param` → optional parameters to convert
- `+` or `*` after a `:param` → repeating parameters to convert
- `(` after a `:param` → custom regex to remove
- Parameter names with `-` or `.` → need quoting
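These patterns can be located with plain grep. A hedged sketch (adjust `src/` and the file globs to your project layout; the patterns may over-match, so review hits inside `path:` strings):

```shell
# Find likely v6 path-to-regexp syntax in TypeScript sources.

# ?, +, *, or ( directly after a :param name:
# optional, repeating, or custom-regex parameters to convert.
grep -rEn ":[A-Za-z_][A-Za-z0-9_]*[?+*(]" src/ --include='*.ts' --include='*.tsx'

# Parameter names containing - or . : need double quotes in v8.
grep -rEn ":[A-Za-z_][A-Za-z0-9_]*[-.][A-Za-z_]" src/ --include='*.ts' --include='*.tsx'
```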
See RestEndpoint.path for full v8 syntax documentation, or path-to-regexp errors for detailed explanations. AI-assisted migration is also available:
Skills:

```bash
npx skills add reactive/data-client --skill path-to-regexp-v8-migration
```

OpenSkills:

```bash
npx openskills install reactive/data-client --skill path-to-regexp-v8-migration
```
Browse all skills on skills.sh
### useFetch() returns UsablePromise

useFetch() now returns a UsablePromise thenable with denormalized data, error handling, and a `.resolved` property. This makes it directly compatible with `React.use()`: `use(useFetch(endpoint, args))` behaves identically to useSuspense().
Previously useFetch() returned undefined when data was valid. Code that checked for a fetch via truthiness must now use .resolved:
```ts
// Before: truthiness indicated a fetch
const promise = useFetch(PostResource.get, { id });
if (promise) {
  // fetch was triggered
}
```

```ts
// After: check .resolved
const promise = useFetch(PostResource.get, { id });
if (!promise.resolved) {
  // fetch is in-flight
}
use(promise); // suspends or returns denormalized data
```
See Parallel data loading above for how this enables concurrent fetches that avoid sequential waterfalls. #3752, #3755
### Direct schema imports
Schema classes can now be imported directly from @data-client/endpoint (or @data-client/rest which re-exports them) instead of requiring the schema namespace. This provides a cleaner import syntax while maintaining full backward compatibility.
```ts
// Namespace import (still supported)
import { schema } from '@data-client/endpoint';

const myUnion = new schema.Union(
  { users: User, groups: Group },
  'type',
);
```

```ts
// Direct import (new)
import { Union } from '@data-client/endpoint';

const myUnion = new Union(
  { users: User, groups: Group },
  'type',
);
```
All schema classes are available as direct exports: Union, Invalidate, Collection, Query, Values, All, and Lazy. The schema namespace export remains available for backward compatibility.
### Upgrade support

As usual, if you have any troubles or questions, feel free to reach out or file a bug report.
