List elements should be batched #3878
base: master
Conversation
When the list of elements grows beyond 8000 items, Lua fails with the obscure error "too many results to unpack". Found the fix here: taskforcesh/bullmq#422
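For context, the failing pattern looks roughly like the snippet below; the ~8000 figure comes from the issue description and matches the stack limit of the Lua 5.1 interpreter embedded in Redis. The `key` and `listElements` names are illustrative, not taken from the script.

```lua
-- Naive approach: spread every element into a single RPUSH call.
-- Once #listElements exceeds the interpreter's unpack/stack limit
-- (around 8000 entries), the script aborts with
-- "too many results to unpack".
redis.call('RPUSH', key, unpack(listElements))
```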
Walkthrough
Introduces a Lua batching helper and updates the list-merging logic to push elements to Redis in chunks via multiple RPUSH calls instead of a single unpack-based RPUSH, altering control flow within the merge step while preserving output semantics.
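A minimal sketch of what such a batching helper and chunked RPUSH could look like is shown below. It is reconstructed from the walkthrough and review comments (1-based (from, to) ranges, final batch clamped, chunk size 7000), not copied from the PR; `rcall` is assumed to be a local alias for `redis.call`, and `key`/`listElements` are placeholder names.

```lua
-- Iterator yielding inclusive 1-based (from, to) ranges of at most batchSize.
local function batches(n, batchSize)
    local i = 0
    return function()
        local from = i * batchSize + 1
        i = i + 1
        if from <= n then
            -- Clamp the last batch so it never runs past the end of the array.
            return from, math.min(from + batchSize - 1, n)
        end
    end
end

-- Push the collected elements in chunks small enough for unpack to handle.
for from, to in batches(#listElements, 7000) do
    rcall('RPUSH', key, unpack(listElements, from, to))
end
```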
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant Caller
    participant TS as get-jobs-by-type.ts
    participant Redis
    Caller->>TS: invoke getJobsByType(...)
    TS->>Redis: EVAL Lua script
    activate Redis
    rect rgba(220,235,255,0.5)
        note over Redis: Collect list elements to merge
        Redis->>Redis: batches(total, batchSize)
        loop For each (from,to) batch
            Redis->>Redis: RPUSH key elements[from..to]
        end
    end
    Redis-->>TS: merged list result
    deactivate Redis
    TS-->>Caller: return result
```
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes

Pre-merge checks and finishing touches
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 0
🧹 Nitpick comments (1)
packages/job-queue-plugin/src/bullmq/scripts/get-jobs-by-type.ts (1)
180-180: Consider applying the same batching pattern for consistency.

While `sortedElements` is typically small (limited by the `take` pagination parameter), a caller could pass a very large `take` value (e.g., to fetch thousands of items in one request), causing this `unpack` to fail with the same error. For defensive programming and consistency with the fix at lines 160-162, consider batching this operation as well.

Apply this diff to add batching:

```diff
-    rcall('RPUSH', tempListKey, unpack(sortedElements))
+    for from, to in batches(#sortedElements, 7000) do
+      rcall('RPUSH', tempListKey, unpack(sortedElements, from, to))
+    end
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
packages/job-queue-plugin/src/bullmq/scripts/get-jobs-by-type.ts (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (20)
- GitHub Check: e2e tests (24.x, mariadb)
- GitHub Check: e2e tests (20.x, mariadb)
- GitHub Check: e2e tests (24.x, sqljs)
- GitHub Check: e2e tests (22.x, postgres)
- GitHub Check: codegen / codegen
- GitHub Check: unit tests (22.x)
- GitHub Check: unit tests (20.x)
- GitHub Check: build (22.x)
- GitHub Check: unit tests (24.x)
- GitHub Check: build (24.x)
- GitHub Check: build (20.x)
- GitHub Check: publish_install (ubuntu-latest, 20.x)
- GitHub Check: publish_install (ubuntu-latest, 24.x)
- GitHub Check: publish_install (macos-latest, 22.x)
- GitHub Check: publish_install (windows-latest, 24.x)
- GitHub Check: publish_install (macos-latest, 20.x)
- GitHub Check: publish_install (windows-latest, 20.x)
- GitHub Check: publish_install (macos-latest, 24.x)
- GitHub Check: publish_install (ubuntu-latest, 22.x)
- GitHub Check: publish_install (windows-latest, 22.x)
🔇 Additional comments (2)
packages/job-queue-plugin/src/bullmq/scripts/get-jobs-by-type.ts (2)
21-32: LGTM! Well-implemented batching iterator.

The `batches` function correctly generates (from, to) ranges for chunking arrays in Lua. The logic handles 1-based indexing properly and ensures the final batch doesn't exceed bounds.
160-162: Excellent fix for the unpack limit issue!

Batching the RPUSH operations with a chunk size of 7000 directly resolves the "too many results to unpack" error when list elements exceed 8000 items. The implementation correctly uses `unpack(listElements, from, to)` to push each batch.
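To make the range arithmetic concrete (the element count here is hypothetical, not from the PR): with 15000 elements and a batch size of 7000, the (from, to) pairs work out as below, so each unpack call sees at most 7000 values.

```lua
-- Standalone illustration of the same 1-based range math.
local n, batchSize = 15000, 7000
for from = 1, n, batchSize do
    local to = math.min(from + batchSize - 1, n)
    print(from, to)   -- 1 7000, then 7001 14000, then 14001 15000
end
```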