WIP: Async and deferred loaders #173
@@ -0,0 +1,25 @@
module GraphQL::Batch
  module Async
    def resolve
      defer # Let other non-async loaders run to completion first.
      @peek_queue_index = 0 # The queue is consumed in super; future peeks will start from the beginning.
      super
    end

    def on_any_loader_wait
      @peek_queue_index ||= 0
      peek_queue = queue[@peek_queue_index..]
      return if peek_queue.empty?
      @peek_queue_index = queue.size # Future peeks start after everything queued so far.
      perform_early(peek_queue)
    end

    def perform_early(keys)
      raise NotImplementedError, "Implement GraphQL::Batch::Async#perform_early to trigger async operations early"
    end

    def perform(keys)
      raise NotImplementedError, "Implement GraphQL::Batch::Async#perform to wait on the async operations"
    end
  end
end
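Taken together, the module lets a loader start non-blocking work as soon as any loader begins waiting, before its own resolve runs. A minimal sketch of a concrete loader built on it; SlowService and its future-like return value are stand-ins for illustration, not part of this PR:

```ruby
class AsyncDataLoader < GraphQL::Batch::Loader
  include GraphQL::Batch::Async

  # Assumed: SlowService.fetch_async(key) starts a non-blocking request and
  # returns a future-like object whose #value blocks until the result arrives.
  def perform_early(keys)
    @futures ||= {}
    keys.each { |key| @futures[key] ||= SlowService.fetch_async(key) }
  end

  def perform(keys)
    perform_early(keys) # start anything that was never peeked early
    keys.each { |key| fulfill(key, @futures[key].value) }
  end
end
```

Field resolvers would call AsyncDataLoader.load(key) as with any other batch loader; the difference is that the requests can be in flight while unrelated loaders resolve. Making perform_early idempotent per key keeps repeated queue peeks safe.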
@@ -42,13 +42,14 @@ def current_executor
       end
     end
 
-    attr_accessor :loader_key, :executor
+    attr_accessor :loader_key, :executor, :deferred
Review thread on the added deferred accessor:
- I'm concerned with […]
- Yes, I agree. I'd like to use […]. WDYT @gmac @swalkinshaw @amomchilov?
- Since this is an internal implementation detail now, why not just call it […]? We could even get rid of this entirely and just filter out loaders that don't include […].
- We can, though I think that's a worse API. I think […].
- Can you put a commit up showing what it looks like?
 
     def initialize
       @loader_key = nil
       @executor = nil
       @queue = nil
       @cache = nil
+      @deferred = false
     end
 
     def load(key)
@@ -66,6 +67,13 @@ def prime(key, value)
       cache[cache_key(key)] ||= ::Promise.resolve(value).tap { |p| p.source = self }
     end
 
+    # Called when any GraphQL::Batch::Loader starts waiting. May be called more than
+    # once per loader if the loader waits multiple times. Will not be called once per promise.
+    #
+    # Use GraphQL::Batch::Async for the common way to use this.
+    def on_any_loader_wait
+    end
+
     def resolve # :nodoc:
       return if resolved?
       load_keys = queue
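Because the hook is a no-op by default, a subclass can also override it directly instead of including GraphQL::Batch::Async. A small illustrative sketch (the instrumentation loader is hypothetical):

```ruby
class InstrumentedLoader < GraphQL::Batch::Loader
  def on_any_loader_wait
    # Fires each time any loader starts waiting, not once per promise.
    puts "#{self.class}: #{queue.size} key(s) queued while the executor waits"
  end

  def perform(keys)
    keys.each { |key| fulfill(key, key) }
  end
end
```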
@@ -88,6 +96,7 @@ def around_perform
     # For Promise#sync
     def wait # :nodoc:
       if executor
+        executor.on_wait
         executor.resolve(self)
       else
         resolve
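The matching executor change is not part of this excerpt. A plausible sketch of what Executor#on_wait could do, assuming the executor tracks its loaders in a hash (an assumption about this PR, not its actual code):

```ruby
class GraphQL::Batch::Executor
  def on_wait
    # Assumption: give every pending loader a chance to start async work
    # before this executor blocks waiting on a promise.
    loaders.each_value do |loader|
      loader.on_any_loader_wait
    end
  end
end
```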
@@ -146,6 +155,13 @@ def finish_resolve(key)
       end
     end
 
+    def defer
+      @deferred = true
+      executor.defer_to_other_loaders
+    ensure
+      @deferred = false
+    end
+
     def cache
       @cache ||= {}
     end
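Executor#defer_to_other_loaders is likewise outside this excerpt. A sketch of the intended behavior, under the same assumption about the executor's loaders hash: resolve every loader that has not deferred before returning control to the deferred one.

```ruby
class GraphQL::Batch::Executor
  def defer_to_other_loaders
    # Assumption: run non-deferred loaders to completion first, so a deferred
    # async loader's perform runs last and its operations get the most time
    # to complete in the background.
    while (loader = loaders.each_value.find { |l| !l.deferred && !l.resolved? })
      loader.resolve
    end
  end
end
```

The ensure block in defer resets the flag even if resolving another loader raises, so a loader never stays marked as deferred.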