
v0.16: Parallel Fetching, Collection.move, Lazy

· 12 min read
Nathaniel Tucker
Creator of Reactive Data Client

New Features:

  • Collection.move and Collection.moveWith() for moving entities between Collections
  • Parallel data loading: useFetch() now returns a UsablePromise usable with React's use()
  • Lazy schema and a configurable denormalization depth limit

Collection.move makes it easy to move entities between Collections with a single operation:

import { useController } from '@data-client/react';
import { TaskResource, type Task } from './TaskResource';

export default function TaskCard({ task }: { task: Task }) {
  const ctrl = useController();
  const handleMove = () => ctrl.fetch(
    TaskResource.getList.move,
    { id: task.id },
    { id: task.id, status: task.status === 'backlog' ? 'in-progress' : 'backlog' },
  );
  return (
    <div className="listItem">
      <span style={{ flex: 1 }}>{task.title}</span>
      <button onClick={handleMove}>
        {task.status === 'backlog' ? '\u25bc' : '\u25b2'}
      </button>
    </div>
  );
}

Breaking Changes:

  • RestEndpoint.path now uses path-to-regexp v8 syntax
  • useFetch() returns a UsablePromise instead of undefined
  • Schema classes are exported directly (the schema namespace remains available)

Upgrade with an automated codemod for all breaking changes:

npx jscodeshift -t https://dataclient.io/codemods/v0.16.js --extensions=ts,tsx,js,jsx src/

Collection.move

Collection.move and RestEndpoint.move make it easy to move entities between collections with a single PATCH request. The entity is automatically removed from collections matching its old state and added to collections matching the new values from the request body.

This is useful for kanban boards, group reassignment, status changes, or any workflow where an item moves between filtered lists.

import { resource } from '@data-client/rest';

const TaskResource = resource({
  path: '/tasks/:id',
  searchParams: {} as { status: string },
  schema: Task,
});

// PATCH /tasks/3 - moves task from 'backlog' to 'in-progress'
await ctrl.fetch(
  TaskResource.getList.move,
  { id: '3' },
  { id: '3', status: 'in-progress' },
);

Works for both path parameters and search parameters, and supports Array and Values collections.
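The cache behavior can be pictured with a small standalone sketch (plain TypeScript, not the library's implementation): a move filters the entity out of the list whose parameters match its old field values and appends it to the list matching the new ones.

```typescript
// Conceptual model of Collection.move, using status -> id lists as a
// stand-in for cached collections. Names here are illustrative only.
type Lists = Record<string, string[]>;

function conceptualMove(lists: Lists, id: string, from: string, to: string): Lists {
  return {
    ...lists,
    [from]: (lists[from] ?? []).filter(x => x !== id), // drop from old collection
    [to]: [...(lists[to] ?? []), id],                  // append to new collection
  };
}

const after = conceptualMove(
  { backlog: ['1', '3'], 'in-progress': ['2'] },
  '3',
  'backlog',
  'in-progress',
);
// after.backlog: ['1']; after['in-progress']: ['2', '3']
```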

Collection.moveWith()

Collection.moveWith() constructs a custom move schema, analogous to addWith(). The merge function controls how entities are added to their destination collection (e.g., prepending instead of appending), while the remove behavior is automatically derived. The unshift merge function is provided for convenience.

import { Collection, unshift } from '@data-client/rest';

class MyCollection extends Collection {
  constructor(schema, options) {
    super(schema, options);
    this.move = this.moveWith(unshift);
  }
}

Parallel data loading

useFetch() now returns a UsablePromise thenable, making use(useFetch(endpoint, args)) a drop-in equivalent to useSuspense() — it suspends when data is loading, returns denormalized data when cached, throws on errors, and re-suspends on invalidation.

Because useFetch() and use() are separate calls, multiple fetches start in parallel. All useFetch() calls execute before any use() suspends, so every fetch is already in-flight even if the first use() suspends the component. This avoids the sequential waterfall that occurs with multiple useSuspense() calls, where the second fetch cannot begin until the first resolves. #3755

import { use } from 'react';
import { useFetch } from '@data-client/react';
import { PostResource, CommentResource } from './Resources';

function PostWithComments({ id }: { id: number }) {
  // Both fetches start in parallel
  const postPromise = useFetch(PostResource.get, { id });
  const commentsPromise = useFetch(CommentResource.getList, { postId: id });

  // use() reads the results — if the first suspends,
  // the second fetch is already in-flight
  const post = use(postPromise);
  const comments = use(commentsPromise);

  return (
    <article>
      <h3>{post.title}</h3>
      <p>{post.body}</p>
      <h4>Comments</h4>
      {comments.map(comment => (
        <div key={comment.id} className="listItem">
          <strong>{comment.author}</strong>: {comment.text}
        </div>
      ))}
    </article>
  );
}
render(<PostWithComments id={1} />);

Denormalization depth limit & Lazy

When schemas have bidirectional relationships across entity types (e.g., Department → Building → Department → ...), denormalization traverses every unique entity in the connected graph. With small datasets this is invisible, but at scale — thousands of interconnected entities — the recursion exceeds the JS call stack and throws RangeError: Maximum call stack size exceeded. #3822

Two complementary features address this: a depth limit as a safety net, and Lazy for precise, per-field control over which relationships are resolved eagerly.

Denormalization depth limit

Denormalization now enforces a default depth limit of 64 entity hops. Entities beyond the limit are returned with unresolved ids instead of fully denormalized objects. A console.error is emitted in development mode when the limit is reached.

The limit can be configured per-Entity with static maxEntityDepth:

class Department extends Entity {
  static maxEntityDepth = 16;
}

This prevents crashes, but is a blunt instrument — it cuts off all deep paths, including legitimate non-cyclic ones. For fine-grained control, use Lazy.

Lazy

Lazy wraps a schema to skip eager denormalization of specific relationship fields. During parent entity denormalization, the field retains its raw normalized value (primary keys). The relationship can then be resolved on demand via useQuery using the .query accessor.

import { Entity, Lazy } from '@data-client/rest';

class Department extends Entity {
  id = '';
  name = '';
  buildings: string[] = [];

  static schema = {
    // Building is another Entity defined elsewhere
    buildings: new Lazy([Building]),
  };
}

Unlike the depth limit, Lazy is a targeted opt-in per relationship. Only the fields you mark as lazy skip eager resolution — the rest of the schema denormalizes normally. This also improves performance by deferring work for relationships that aren't always needed, and provides memoization isolation — changes to lazy entities don't invalidate the parent's denormalized form.
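Concretely, the difference shows up in the denormalized output: a lazy field keeps the entity's primary keys rather than nested objects. A minimal sketch of the resulting shape (plain data, not actual library output):

```typescript
// With `buildings` wrapped in Lazy, the denormalized Department carries
// raw Building primary keys; the objects are resolved later on demand.
interface DenormalizedDepartment {
  id: string;
  name: string;
  buildings: string[]; // pks, not Building objects
}

const dept: DenormalizedDepartment = {
  id: 'd1',
  name: 'Headquarters',
  buildings: ['b1', 'b2'], // resolution deferred until actually needed
};
```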

See Lazy documentation for full usage details and #3828 for design discussion.

Migration guide

This upgrade requires updating all package versions simultaneously.

npm install --save @data-client/react@^0.16.0 @data-client/rest@^0.16.0 @data-client/endpoint@^0.16.0 @data-client/core@^0.16.0 @data-client/vue@^0.16.0 @data-client/test@^0.16.0 @data-client/img@^0.16.0

An automated codemod handles all three breaking changes below — path syntax, useFetch() truthiness, and schema namespace imports:

npx jscodeshift -t https://dataclient.io/codemods/v0.16.js --extensions=ts,tsx,js,jsx src/

path-to-regexp v8

RestEndpoint.path now uses path-to-regexp v8, which changes the syntax for optional parameters, repeating parameters, and special characters.

v6 syntax    | v8 syntax      | Description
/:optional?  | {/:optional}   | Optional parameter
/:repeat+    | /*repeat       | One-or-more repeating (typed as string[])
/:repeat*    | {/*repeat}     | Zero-or-more repeating (typed as string[])
/:with-dash  | /:"with-dash"  | Quoted parameter name
/:id(\d+)    | /:id           | Custom regex removed

Optional parameters

The ? suffix is removed. Wrap the optional segment (including its prefix) in {}.

Before
const ep = new RestEndpoint({
  path: '/:group/things/:number?',
});
After
const ep = new RestEndpoint({
  path: '/:group/things{/:number}',
});

Multiple optional segments with different prefixes:

Before
const ep = new RestEndpoint({
  path: '/:attr1?{-:attr2}?{-:attr3}?',
});
After
const ep = new RestEndpoint({
  path: '{/:attr1}{-:attr2}{-:attr3}',
});
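For simple trailing optionals, the mechanical rewrite can be sketched as a one-line transform (a rough illustration of the basic case only; multi-prefix paths like the one above need the full codemod):

```typescript
// Convert v6 `/:name?` optional segments to v8 `{/:name}` (simple case only).
function convertOptionalParams(path: string): string {
  return path.replace(/\/:(\w+)\?/g, '{/:$1}');
}

convertOptionalParams('/:group/things/:number?');
// -> '/:group/things{/:number}'
```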

Repeating parameters

+ (one-or-more) and * (zero-or-more) suffixes are removed. Use *name wildcard syntax, wrapped in {} for zero-or-more.

Wildcard parameters are now typed as string[] (arrays) instead of string | number, since they represent multiple path segments.

Before
const ep = new RestEndpoint({
  path: '/files/:path+',
});
ep({ path: 'docs/reports/q4' });

const optEp = new RestEndpoint({
  path: '/files/:path*',
});
After
const ep = new RestEndpoint({
  path: '/files/*path',
});
ep({ path: ['docs', 'reports', 'q4'] });

const optEp = new RestEndpoint({
  path: '/files{/*path}',
});
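Since wildcard values are arrays of path segments, the serialized URL simply joins them with /. A sketch of the mapping (illustrative; the library performs this serialization itself):

```typescript
// ['docs', 'reports', 'q4'] for `*path` expands to /docs/reports/q4
const segments = ['docs', 'reports', 'q4'];
const url = '/files/' + segments.map(encodeURIComponent).join('/');
// url: '/files/docs/reports/q4'
```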

Quoted parameter names

Parameter names with special characters (dashes, dots) must now be quoted with double quotes:

Before
const ep = new RestEndpoint({
  path: '/:with-dash',
});
After
const ep = new RestEndpoint({
  path: '/:"with-dash"',
});

Custom regex removed

Inline regex patterns like /:id(\d+) are no longer supported. Remove the regex — validate on the server instead.

Before
const ep = new RestEndpoint({
  path: '/users/:id(\\d+)',
});
After
const ep = new RestEndpoint({
  path: '/users/:id',
});
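If you relied on the inline regex to reject malformed ids on the client, a small guard before fetching recovers that check (a hypothetical helper, not part of the library):

```typescript
// Replaces the old /:id(\d+) constraint with an explicit client-side check.
function assertNumericId(id: string): string {
  if (!/^\d+$/.test(id)) throw new Error(`Expected a numeric id, got: ${id}`);
  return id;
}

assertNumericId('42'); // ok, returns '42'
// assertNumericId('abc') would throw
```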

Escaping special characters

Characters {}()*: and \ must be escaped with \ when used as literals. However, ? and + are no longer special in v8, so existing escapes for those can be removed:

Before
const ep = new RestEndpoint({
  path: '/search\\?q=:q',
});
After
const ep = new RestEndpoint({
  path: '/search?{q=:q}{&page=:page}',
});
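Escaping a literal segment can be sketched as a single replace over the v8-special characters (an illustrative helper, not library code; apply it only to literal text, never to parameter markers like :id, or it will escape them too):

```typescript
// Escape v8-special characters ({ } ( ) * : and \) in one literal
// path segment before interpolating it into a `path` template string.
function escapeLiteralSegment(segment: string): string {
  return segment.replace(/[{}()*:\\]/g, '\\$&');
}

escapeLiteralSegment('a:b(c)'); // each special char gains a leading backslash
```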

Finding paths to migrate

Search your codebase for these patterns in path: values:

  • ? after a :param — optional parameters to convert
  • + or * after a :param — repeating parameters to convert
  • (\ after a :param — custom regex to remove
  • Parameter names with - or . — need quoting
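These checks can also be automated with a small scanner over your path strings (a rough sketch built from the patterns above; the names are illustrative):

```typescript
// Flag v6 path syntax that needs migration to v8.
const v6Checks: Array<[RegExp, string]> = [
  [/:\w+\?/, 'optional parameter: convert to {/:name}'],
  [/:\w+[+*]/, 'repeating parameter: convert to *name or {/*name}'],
  [/:\w+\(/, 'inline regex: remove it'],
  [/:\w*[-.]/, 'parameter name with - or .: quote it as :"name"'],
];

function flagV6Syntax(path: string): string[] {
  return v6Checks.filter(([re]) => re.test(path)).map(([, note]) => note);
}

flagV6Syntax('/files/:path+'); // -> one 'repeating parameter' note
```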

See RestEndpoint.path for full v8 syntax documentation, or path-to-regexp errors for detailed explanations. AI-assisted migration is also available:

npx skills add reactive/data-client --skill path-to-regexp-v8-migration

Browse all skills on skills.sh

useFetch() returns UsablePromise

useFetch() now returns a UsablePromise thenable with denormalized data, error handling, and a .resolved property. This makes it directly compatible with React.use() — use(useFetch(endpoint, args)) behaves identically to useSuspense().

Previously useFetch() returned undefined when data was valid. Code that checked for a fetch via truthiness must now use .resolved:

Before
const promise = useFetch(PostResource.get, { id });
if (promise) {
  // fetch was triggered
}
After
const promise = useFetch(PostResource.get, { id });
if (!promise.resolved) {
  // fetch is in-flight
}
use(promise); // suspends or returns denormalized data

See Parallel data loading above for how this enables concurrent fetches that avoid sequential waterfalls. #3752, #3755

Direct schema imports

Schema classes can now be imported directly from @data-client/endpoint (or @data-client/rest which re-exports them) instead of requiring the schema namespace. This provides a cleaner import syntax while maintaining full backward compatibility.

Before
import { schema } from '@data-client/endpoint';

const myUnion = new schema.Union(
  { users: User, groups: Group },
  'type',
);
After
import { Union } from '@data-client/endpoint';

const myUnion = new Union(
  { users: User, groups: Group },
  'type',
);

All schema classes are available as direct exports: Union, Invalidate, Collection, Query, Values, All, and Lazy. The schema namespace export remains available for backward compatibility.

Upgrade support

As usual, if you run into any trouble or have questions, feel free to join our Chat or file a bug.