README.md: 13 additions & 11 deletions
````diff
@@ -331,7 +331,6 @@ const stats = await diff({
     newSource: './tests/b.csv',
     keys: ['id'],
     duplicateKeyHandling: 'keepFirstRow', // or 'keepLastRow'
-    duplicateRowBufferSize: 2000,
 }).to('console');
 console.log(stats);
 ```
````
````diff
@@ -349,6 +348,8 @@ const stats = await diff({
 console.log(stats);
 ```
 
+Note that you can specify the size of the buffer if you know that it cannot exceed this quantity; otherwise, you can enable the **duplicateRowBufferOverflow** option,
+which will remove the first entries of the buffer when it exceeds the allocated capacity, to avoid a failure.
 
````
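The overflow behavior described above can be sketched as a bounded buffer that either throws when full or evicts its oldest entries. This is a hypothetical helper written to illustrate the semantics of `duplicateRowBufferSize` and `duplicateRowBufferOverflow`; `DuplicateRowBuffer` is not part of the library's public API.

```typescript
// Illustrative sketch of the duplicate-row buffer semantics (assumption,
// not the library's actual implementation).
class DuplicateRowBuffer<T> {
    private rows: T[] = [];

    constructor(
        private readonly maxSize: number,       // mirrors duplicateRowBufferSize
        private readonly allowOverflow: boolean // mirrors duplicateRowBufferOverflow
    ) {}

    add(row: T): void {
        if (this.rows.length >= this.maxSize) {
            if (!this.allowOverflow) {
                // default behavior: fail instead of silently losing rows
                throw new Error('duplicate row buffer capacity exceeded');
            }
            // overflow enabled: drop the first (oldest) entry to make room
            this.rows.shift();
        }
        this.rows.push(row);
    }

    get entries(): readonly T[] {
        return this.rows;
    }
}
```

With overflow enabled the process keeps running but only the most recent duplicates are retained, which is the trade-off the option makes explicit.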
### Order 2 CSV files and diff them on the console
````diff
@@ -580,16 +581,17 @@ sortDirection| no | ASC | specifies if the column is sorted in ascen
-oldSource             | yes |      | either a string filename, a URL or a SourceOptions
-newSource             | yes |      | either a string filename, a URL or a SourceOptions
-keys                  | yes |      | the list of columns that form the primary key. This is required for comparing the rows. A key can be a string name or a {ColumnDefinition}
-includedColumns       | no  |      | the list of columns to keep from the input sources. If not specified, all columns are selected.
-excludedColumns       | no  |      | the list of columns to exclude from the input sources.
-rowComparer           | no  |      | specifies a custom row comparer.
-duplicateKeyHandling  | no  | fail | specifies how to handle duplicate rows in a source. It will fail by default and throw a UniqueKeyViolationError exception. But you can ignore, keep the first or last row, or even provide your own function that will receive the duplicates and select the best candidate.
-duplicateRowBufferSize| no  | 1000 | specifies the maximum size of the buffer used to accumulate duplicate rows.
+oldSource                 | yes |       | either a string filename, a URL or a SourceOptions
+newSource                 | yes |       | either a string filename, a URL or a SourceOptions
+keys                      | yes |       | the list of columns that form the primary key. This is required for comparing the rows. A key can be a string name or a {ColumnDefinition}
+includedColumns           | no  |       | the list of columns to keep from the input sources. If not specified, all columns are selected.
+excludedColumns           | no  |       | the list of columns to exclude from the input sources.
+rowComparer               | no  |       | specifies a custom row comparer.
+duplicateKeyHandling      | no  | fail  | specifies how to handle duplicate rows in a source. It will fail by default and throw a UniqueKeyViolationError exception. But you can ignore, keep the first or last row, or even provide your own function that will receive the duplicates and select the best candidate.
+duplicateRowBufferSize    | no  | 1000  | specifies the maximum size of the buffer used to accumulate duplicate rows.
+duplicateRowBufferOverflow| no  | false | specifies if we can remove the first entries of the buffer to continue adding new duplicate entries when reaching maximum capacity, to avoid throwing an error and halting the process.
````
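The table above notes that `duplicateKeyHandling` can also be a custom function that receives the accumulated duplicates and selects the best candidate. A minimal sketch of such a selector, assuming a hypothetical row shape with an `updatedAt` column (the library's exact callback signature and row type may differ):

```typescript
// Hypothetical row shape, for illustration only.
interface Row {
    id: string;
    updatedAt: string; // ISO 8601 timestamp
}

// Custom duplicate handler sketch: given the duplicates accumulated in the
// buffer, keep the row with the most recent updatedAt value.
function keepMostRecent(duplicates: Row[]): Row {
    return duplicates.reduce((best, row) =>
        row.updatedAt > best.updatedAt ? row : best
    );
}
```

Because such a function only ever sees the rows held in the buffer, `duplicateRowBufferSize` bounds how many candidates it can choose from.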
````diff
  * specifies the maximum size of the buffer used to accumulate duplicate rows.
+ * Note that the buffer size matters only when you provide a custom function to the duplicateKeyHandling, since it will receive the accumulated duplicates
+ * as an input parameter.
  * @default 1000
  * @see duplicateKeyHandling
  */
 duplicateRowBufferSize?: number;
+/**
+ * specifies if we can remove the first entries of the buffer to continue adding new duplicate entries when reaching maximum capacity,
+ * to avoid throwing an error and halting the process.
+ * Note that the buffer size matters only when you provide a custom function to the duplicateKeyHandling, since it will receive the accumulated duplicates
````