How do we adapt gRPC streaming requests?
#1053
Replies: 2 comments
We will add support for streaming requests later. A possible design might look like this:

```cpp
Lazy<void> stream_call(stream<Req, Resp> stream) {
  auto req = co_await stream.recv();  // receive one stream input
  std::vector<Req> reqs =
      co_await stream.batch_recv(100);  // batch-receive input until the request finishes or 100 items arrive
  if (!stream.peer_over()) {            // the client has not closed the stream yet
    reqs = co_await stream.batch_recv();  // batch-receive all remaining input until the client finishes and closes the stream
  }
  auto ec = co_await stream.send(Resp{});  // send response data
  ec = co_await stream.batch_send(std::vector<Resp>{});  // batch-send response data
  // ec = co_await stream.send(Resp{}, EOF_FLAG);  // send response data and close the stream immediately
  co_return;  // or let RAII close the stream call
}

Lazy<void> client_stream_example(coro_rpc_client& cli) {
  stream<Resp, Req> stream = cli.call_stream<stream_call>();
  auto ec = co_await stream.send(Req{});  // send request data
  ec = co_await stream.batch_send(std::vector<Req>{});  // batch-send request data
  stream.close_peer();  // manually close our side of the stream
  std::vector<Resp> resps =
      co_await stream.batch_recv(100);  // batch-receive responses until the server finishes or 100 items arrive
  if (stream.peer_over()) {
    co_return;
  }
  resps = co_await stream.batch_recv();  // batch-receive all remaining responses until the server finishes
  // or close the stream automatically by RAII
}
```
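If this design lands, registering and serving a streaming handler would presumably follow the existing coro_rpc pattern. The sketch below is only a guess built on the proposed (not yet implemented) `stream<>` API above; `coro_rpc_server`, `register_handler`, and `start` are existing coro_rpc APIs, while accepting `stream_call` as a streaming handler is purely an assumption:

```cpp
// Hypothetical server wiring for the proposed streaming design above.
#include <thread>
#include <ylt/coro_rpc/coro_rpc_server.hpp>

int main() {
  coro_rpc::coro_rpc_server server(std::thread::hardware_concurrency(),
                                   /*port=*/8801);
  server.register_handler<stream_call>();  // assumed to accept a streaming handler
  server.start();                          // blocks and serves incoming calls
}
```

On the client side, the usual `coro_rpc_client::connect` would precede the `call_stream` call shown in `client_stream_example`.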
For example, a streaming RPC defined like this in gRPC:

```proto
rpc ServerStreamPing(PingRequest) returns (stream PingReply);
```
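Under the design sketched in the earlier reply, such a server-streaming RPC might map roughly onto the proposed API as shown below. This is only an illustrative sketch; `Lazy`, `stream<>`, and the `PingRequest`/`PingReply` structs are assumptions taken from this thread rather than an existing coro_rpc interface:

```cpp
#include <string>

// Hypothetical message types mirroring the gRPC definition above.
struct PingRequest { std::string msg; };
struct PingReply   { std::string msg; };

// Server-streaming handler: read the single request, then push a stream of replies.
Lazy<void> ServerStreamPing(stream<PingRequest, PingReply> stream) {
  auto req = co_await stream.recv();           // the one PingRequest
  for (int i = 0; i < 5; ++i) {
    co_await stream.send(PingReply{req.msg});  // stream back PingReply messages
  }
  co_return;  // RAII closes the stream when the handler returns
}
```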