
Commit f005123

Add Readme and tests about cache disabling (#72)
Based on discussion in #63, a decision was made not to break existing correct but potentially confusing behavior, but instead to clearly document that behavior and illustrate patterns for producing different cache behavior, such as per-frame caching. Closes #63, closes #46, closes #44, closes #34.
1 parent fcd9dd7 commit f005123

File tree

3 files changed: +99 -10 lines changed


README.md

+48 -7
@@ -124,9 +124,9 @@ with the original keys `[ 2, 9, 6, 1 ]`:
 
 ## Caching
 
-DataLoader provides a cache for all loads which occur in a single request to
-your application. After `.load()` is called once with a given key, the resulting
-value is cached to eliminate redundant loads.
+DataLoader provides a memoization cache for all loads which occur in a single
+request to your application. After `.load()` is called once with a given key,
+the resulting value is cached to eliminate redundant loads.
 
 In addition to relieving pressure on your data storage, caching results per-request
 also creates fewer objects which may relieve memory pressure on your application:
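The memoized behavior this hunk documents can be observed directly: repeated loads of the same key return the same Promise. A minimal sketch for illustration (not part of the diff), assuming a hypothetical `batchGetUsers` batch function:

```js
var DataLoader = require('dataloader')

// `batchGetUsers` is a hypothetical batch function: keys in, Promise of values out.
var userLoader = new DataLoader(keys => batchGetUsers(keys))

var promise1 = userLoader.load(1)
var promise2 = userLoader.load(1)

// Both calls return the same memoized Promise, so only one load is dispatched.
console.log(promise1 === promise2) // true
```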
@@ -144,7 +144,7 @@ DataLoader caching *does not* replace Redis, Memcache, or any other shared
 application-level cache. DataLoader is first and foremost a data loading mechanism,
 and its cache only serves the purpose of not repeatedly loading the same data in
 the context of a single request to your Application. To do this, it maintains a
-simple in-memory cache (more accurately: `.load()` is a memoized function).
+simple in-memory memoization cache (more accurately: `.load()` is a memoized function).
 
 Avoid multiple requests from different users using the DataLoader instance, which
 could result in cached data incorrectly appearing in each request. Typically,
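A sketch of the per-request pattern this context points toward, assuming an Express-style handler; `app`, `req.userId`, and `batchGetUsers` are hypothetical:

```js
var DataLoader = require('dataloader')

app.get('/profile', (req, res) => {
  // A fresh loader (and therefore a fresh memoization cache) per request keeps
  // one user's cached data out of another user's response.
  var userLoader = new DataLoader(keys => batchGetUsers(keys))
  userLoader.load(req.userId).then(user => res.json(user))
})
```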
@@ -217,6 +217,46 @@ userLoader.load(1).catch(error => {
 });
 ```
 
+#### Disabling Cache
+
+In certain uncommon cases, a DataLoader which *does not* cache may be desirable.
+Calling `new DataLoader(myBatchFn, { cache: false })` will ensure that every
+call to `.load()` will produce a *new* Promise, and requested keys will not be
+saved in memory.
+
+However, when the memoization cache is disabled, your batch function will
+receive an array of keys which may contain duplicates! Each key will be
+associated with each call to `.load()`. Your batch loader should provide a value
+for each instance of the requested key.
+
+For example:
+
+```js
+var myLoader = new DataLoader(keys => {
+  console.log(keys)
+  return someBatchLoadFn(keys)
+}, { cache: false })
+
+myLoader.load('A')
+myLoader.load('B')
+myLoader.load('A')
+
+// > [ 'A', 'B', 'A' ]
+```
+
+More complex cache behavior can be achieved by calling `.clear()` or `.clearAll()`
+rather than disabling the cache completely. For example, this DataLoader will
+provide unique keys to a batch function due to the memoization cache being
+enabled, but will immediately clear its cache when the batch function is called
+so later requests will load new values.
+
+```js
+var myLoader = new DataLoader(keys => {
+  myLoader.clearAll()
+  return someBatchLoadFn(keys)
+})
+```
+
 
 ## API

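Beyond `clearAll()`, the existing `.clear(key)` method supports finer-grained invalidation than disabling the cache, for example after a write. A sketch for illustration, with hypothetical `updateUser` and `batchGetUsers` helpers:

```js
var DataLoader = require('dataloader')

var userLoader = new DataLoader(keys => batchGetUsers(keys))

updateUser(4, { name: 'Zoe' }).then(() => {
  // Drop only the stale key; other cached entries stay memoized.
  userLoader.clear(4)
  return userLoader.load(4) // reaches the batch function again
})
```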
@@ -245,8 +285,9 @@ Create a new `DataLoader` given a batch loading function and options.
 - *maxBatchSize*: Default `Infinity`. Limits the number of items that get
   passed in to the `batchLoadFn`.
 
-- *cache*: Default `true`. Set to `false` to disable caching, instead
-  creating a new Promise and new key in the `batchLoadFn` for every load.
+- *cache*: Default `true`. Set to `false` to disable memoization caching,
+  instead creating a new Promise and new key in the `batchLoadFn` for every
+  load of the same key.
 
 - *cacheKeyFn*: A function to produce a cache key for a given load key.
   Defaults to `key => key`. Useful to provide when JavaScript objects are keys
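The `cacheKeyFn` option above interacts with the memoization cache this commit documents: loads whose keys map to the same cache key share one entry. A sketch, assuming object keys identified by a hypothetical `id` field and a hypothetical `batchGetUsers` batch function:

```js
var DataLoader = require('dataloader')

var userLoader = new DataLoader(keys => batchGetUsers(keys), {
  cacheKeyFn: key => key.id
})

// Both loads memoize against cache key 5, so they resolve to the same Promise.
userLoader.load({ id: 5, from: 'header' })
userLoader.load({ id: 5, from: 'sidebar' })
```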
@@ -413,7 +454,7 @@ let usernameLoader = new DataLoader(names => genUsernames(names).then(users => {
 ## Custom Caches
 
 DataLoader can optionally be provided a custom Map instance to use as its
-cache. More specifically, any object that implements the methods `get()`,
+memoization cache. More specifically, any object that implements the methods `get()`,
 `set()`, `delete()` and `clear()` can be provided. This allows for custom Maps
 which implement various [cache algorithms][] to be provided. By default,
 DataLoader uses the standard [Map][] which simply grows until the DataLoader
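As a sketch of the custom cache hook described above (assuming DataLoader's `cacheMap` constructor option), any object exposing `get()`, `set()`, `delete()` and `clear()` can stand in for the default Map; this wrapper merely logs cache activity, while a real one might enforce an LRU policy:

```js
var DataLoader = require('dataloader')

var loggingCacheMap = {
  map: new Map(),
  get(key) { console.log('cache get', key); return this.map.get(key) },
  set(key, value) { this.map.set(key, value) },
  delete(key) { this.map.delete(key) },
  clear() { this.map.clear() }
}

// `batchGetUsers` is a hypothetical batch function.
var userLoader = new DataLoader(keys => batchGetUsers(keys), {
  cacheMap: loggingCacheMap
})
```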

src/__tests__/dataloader-test.js

+48
@@ -509,6 +509,54 @@ describe('Accepts options', () => {
     );
   });
 
+  it('Keys are repeated in batch when cache disabled', async () => {
+    var [ identityLoader, loadCalls ] = idLoader({ cache: false });
+
+    var [ values1, values2, values3, values4 ] = await Promise.all([
+      identityLoader.load('A'),
+      identityLoader.load('C'),
+      identityLoader.load('D'),
+      identityLoader.loadMany([ 'C', 'D', 'A', 'A', 'B' ]),
+    ]);
+
+    expect(values1).to.equal('A');
+    expect(values2).to.equal('C');
+    expect(values3).to.equal('D');
+    expect(values4).to.deep.equal([ 'C', 'D', 'A', 'A', 'B' ]);
+
+    expect(loadCalls).to.deep.equal([
+      [ 'A', 'C', 'D', 'C', 'D', 'A', 'A', 'B' ]
+    ]);
+  });
+
+  it('Complex cache behavior via clearAll()', async () => {
+    // This loader clears its cache as soon as a batch function is dispatched.
+    var loadCalls = [];
+    var identityLoader = new DataLoader(keys => {
+      identityLoader.clearAll();
+      loadCalls.push(keys);
+      return Promise.resolve(keys);
+    });
+
+    var values1 = await Promise.all([
+      identityLoader.load('A'),
+      identityLoader.load('B'),
+      identityLoader.load('A'),
+    ]);
+
+    expect(values1).to.deep.equal([ 'A', 'B', 'A' ]);
+
+    var values2 = await Promise.all([
+      identityLoader.load('A'),
+      identityLoader.load('B'),
+      identityLoader.load('A'),
+    ]);
+
+    expect(values2).to.deep.equal([ 'A', 'B', 'A' ]);
+
+    expect(loadCalls).to.deep.equal([ [ 'A', 'B' ], [ 'A', 'B' ] ]);
+  });
+
   describe('Accepts object key in custom cacheKey function', () => {
     function cacheKey(key) {
       return Object.keys(key).sort().map(k => k + ':' + key[k]).join();

src/index.d.ts

+3 -3
@@ -92,9 +92,9 @@ declare namespace DataLoader {
     maxBatchSize?: number;
 
     /**
-     * Default `true`. Set to `false` to disable caching,
-     * instead creating a new Promise and new key in
-     * the `batchLoadFn` for every load.
+     * Default `true`. Set to `false` to disable memoization caching,
+     * instead creating a new Promise and new key in the `batchLoadFn` for every
+     * load of the same key.
      */
     cache?: boolean,
