
Configuration

λ Query is highly configurable. Options can be set at the instance level (applies to all queries) or per-query (overrides instance defaults for a single call).

import { createQuery } from "@studiolambda/query";

const query = createQuery({
  expiration: () => 5000,
  stale: true,
});

These options apply globally to all queries made with the instance:

  • expiration ((item: T) => number, default () => 2000): Function returning the cache TTL in milliseconds. Receives the resolved item, allowing dynamic expiration based on the data.
  • fetcher ((key: string, opts: { signal: AbortSignal }) => Promise<T>, default: fetch wrapper): The function that performs the actual data fetching. Receives the cache key and an object with an AbortSignal.
  • stale (boolean, default true): Whether to return stale (expired) data immediately while revalidating in the background.
  • removeOnError (boolean, default false): Whether to remove the cached item if a background refetch fails.
  • fresh (boolean, default false): Whether to always bypass the cache and fetch fresh data.

These configure the underlying infrastructure:

  • itemsCache (Cache<ItemsCacheItem>, default new Map()): Custom cache implementation for resolved items. Must implement get, set, delete, and keys.
  • resolversCache (Cache<ResolversCacheItem>, default new Map()): Custom cache implementation for in-flight resolvers.
  • events (EventTarget, default new EventTarget()): Custom event system for cache lifecycle events.
  • broadcast (BroadcastChannel, default undefined): Cross-tab broadcast channel. When set, mutations, hydrations, and cache invalidations are synchronized across browser tabs.
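
As a sketch, the infrastructure options can be wired in at construction time. The channel name below is arbitrary, chosen only for this example:

```typescript
import { createQuery } from "@studiolambda/query";

// Any name works; every tab that should stay in sync must use the same one.
const channel = new BroadcastChannel("lambda-query-sync");

const query = createQuery({
  events: new EventTarget(),
  broadcast: channel,
});
```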

If you don’t provide a custom fetcher, λ Query uses the global fetch() with the cache key as the URL:

// The default fetcher does approximately this:
async function defaultFetcher(key, { signal }) {
  const response = await fetch(key, { signal });
  if (!response.ok) {
    throw new Error("Unable to fetch the data: " + response.statusText);
  }
  return response.json();
}

This means your cache keys should be valid URLs when using the default fetcher. The signal parameter is an AbortSignal that is triggered when the query is aborted.

Provide a custom fetcher to control how data is loaded:

const query = createQuery({
  async fetcher(key, { signal }) {
    const response = await fetch(`https://api.example.com${key}`, {
      signal,
      headers: {
        Authorization: `Bearer ${getToken()}`,
      },
    });
    if (!response.ok) {
      throw new Error(`API error: ${response.status}`);
    }
    return response.json();
  },
});

You can override the fetcher for a specific query call:

const user = await query.query("current-user", {
  async fetcher(key, { signal }) {
    const response = await fetch("/api/me", { signal });
    return response.json();
  },
});

This is useful when a specific resource needs different fetching logic (e.g., a GraphQL query, a local storage read, or a WebSocket message).
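
A fetcher is just a function with the shape (key, { signal }) => Promise<T>; it does not have to perform an HTTP request at all. As a sketch, the in-memory store below is a hypothetical stand-in for any non-HTTP source such as local storage:

```typescript
// Hypothetical in-memory store standing in for localStorage, a WebSocket, etc.
const store = new Map<string, unknown>([
  ["settings", { theme: "dark" }],
]);

// Matches the fetcher signature: receives the cache key and an AbortSignal.
async function storeFetcher(key: string, _opts: { signal: AbortSignal }) {
  const value = store.get(key);
  if (value === undefined) {
    throw new Error(`No entry for key: ${key}`);
  }
  return value;
}
```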

The expiration function determines how long a cached item is considered fresh. It receives the resolved item, so you can set dynamic expiration based on the data:

const query = createQuery({
  expiration(item) {
    // Cache user profiles for 30 seconds.
    // Cache everything else for 5 seconds.
    if (item?.type === "user") return 30_000;
    return 5_000;
  },
});

The default expiration is 2 seconds (() => 2000).

After an item expires:

  • With stale: true (default): the expired value is returned immediately, and a background refetch starts.
  • With stale: false: the query blocks until the refetch completes.
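
The freshness check these options rely on can be sketched in a few lines; the type and function names here are illustrative, not the library's internals:

```typescript
// Illustrative shape of a cached entry; not λ Query's actual internals.
interface CachedItem<T> {
  value: T;
  storedAt: number; // Timestamp (ms) at which the item was cached.
  ttl: number;      // Result of the expiration() function, in ms.
}

// An item is fresh while fewer than `ttl` milliseconds have elapsed.
function isFresh<T>(item: CachedItem<T>, now = Date.now()): boolean {
  return now - item.storedAt < item.ttl;
}
```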

The stale option controls the SWR behavior:

// Always wait for fresh data (no stale returns).
const strictQuery = createQuery({ stale: false });

// Return stale data while revalidating (default).
const staleQuery = createQuery({ stale: true });

You can also override per-query:

// This specific query should always return fresh data.
const data = await query.query("/api/critical-data", { stale: false });

You can reconfigure an instance at runtime using the configure() method. Only the provided options are updated — everything else keeps its current value:

const query = createQuery({ stale: true });

// Later, change the expiration without affecting other options.
query.configure({
  expiration: () => 10_000,
});

If you need a custom cache implementation (e.g., backed by IndexedDB or a size-limited LRU), implement the Cache interface:

interface Cache<T> {
  get(key: string): T | undefined;
  set(key: string, value: T): void;
  delete(key: string): void;
  keys(): IterableIterator<string>;
}

A standard Map satisfies this interface, which is why it’s used as the default. Pass custom caches at construction time:

const query = createQuery({
  itemsCache: new LRUCache(100), // Your custom LRU implementation.
  resolversCache: new Map(),
});
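
A minimal size-limited LRU satisfying this interface can be sketched on top of Map, which iterates keys in insertion order. The LRUCache class below is a hypothetical implementation, not part of λ Query:

```typescript
// Minimal LRU cache satisfying the Cache<T> interface shown above.
// Map preserves insertion order, so the first key is the least recently used.
class LRUCache<T> {
  private items = new Map<string, T>();

  constructor(private limit: number) {}

  get(key: string): T | undefined {
    const value = this.items.get(key);
    if (value !== undefined) {
      // Re-insert to mark the key as most recently used.
      this.items.delete(key);
      this.items.set(key, value);
    }
    return value;
  }

  set(key: string, value: T): void {
    this.items.delete(key);
    this.items.set(key, value);
    if (this.items.size > this.limit) {
      // Evict the least recently used entry (the first key in the Map).
      const oldest = this.items.keys().next().value as string;
      this.items.delete(oldest);
    }
  }

  delete(key: string): void {
    this.items.delete(key);
  }

  keys(): IterableIterator<string> {
    return this.items.keys();
  }
}
```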