Client-side of static invoice server #3618
Conversation
Will go through the commits in a bit more detail before taking this out of draft, but conceptual feedback, feedback on the protocol itself, or feedback on how the code is organized overall is welcome. It does add a significant amount of code to …
Hmmm, I wonder if we shouldn't allow the client to cache N offers rather than only 1. I worry a bit about the privacy implications of having One Offer that gets reused across different contexts.
I think that makes sense, so they would interactively build and cache a few and then randomly(?) return one of them on … It seems reasonable to save for follow-up, although I could adapt the …
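As a rough illustration of that idea, here is a minimal sketch of a cache that stores several interactively built offers and hands out a random unexpired one; the CachedOffer/AsyncReceiveOfferCache names, fields, and the RNG seed parameter are hypothetical, not the PR's actual API.

use std::time::Duration;

// Hypothetical cached entry; a real cache would also track invoice metadata.
struct CachedOffer {
	offer_bytes: Vec<u8>,      // serialized BOLT12 offer
	absolute_expiry: Duration, // seconds since the UNIX epoch
}

struct AsyncReceiveOfferCache {
	offers: Vec<CachedOffer>,
}

impl AsyncReceiveOfferCache {
	// Hand out a random unexpired offer so a single offer isn't reused across
	// unrelated contexts.
	fn random_unexpired_offer(&self, now: Duration, rand_seed: u64) -> Option<&CachedOffer> {
		let unexpired: Vec<&CachedOffer> =
			self.offers.iter().filter(|o| o.absolute_expiry > now).collect();
		if unexpired.is_empty() {
			return None;
		}
		Some(unexpired[(rand_seed as usize) % unexpired.len()])
	}
}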
lightning/src/ln/channelmanager.rs
Outdated
// Expire the offer at the same time as the static invoice so we automatically refresh both
// at the same time.
let offer_and_invoice_absolute_expiry = Duration::from_secs(core::cmp::min(
	offer_paths_absolute_expiry.as_secs(),
	duration_since_epoch.saturating_add(STATIC_INVOICE_DEFAULT_RELATIVE_EXPIRY).as_secs()
));
One thing I want to address eventually (but maybe not in this PR) is that right now we cap the expiry of our offer/static invoice at 2 weeks, which doesn't work well for the "offer in Twitter bio" use case. Probably we can add something to UserConfig for this, and expose a method for users to proactively rotate their offer if it never expires?
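For concreteness, a sketch of the kind of config knob being floated here; the struct name, field, and default below are invented for illustration and are not something this PR adds.

use std::time::Duration;

// Hypothetical config knob: how long interactively built async receive offers
// should stay valid before the client rotates them; `None` could mean "never
// expire", paired with a method to rotate the offer proactively.
pub struct AsyncReceiveOfferConfig {
	pub offer_relative_expiry: Option<Duration>,
}

impl Default for AsyncReceiveOfferConfig {
	fn default() -> Self {
		// Mirrors the current two-week cap mentioned above.
		Self { offer_relative_expiry: Some(Duration::from_secs(14 * 24 * 60 * 60)) }
	}
}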
I wonder if for that we shouldn't try to come up with a scheme to allow the offer to last longer than the static invoice? I mean, ideally an offer lasts at least a few years, but it kinda could, 'cause you just care about the storage server being reliable; you don't care much about the static invoice.
That makes sense. We could include another set of long-lived paths in the OfferPaths message that allows the recipient to refresh their invoice later while keeping the same offer [paths].
I mean, maybe the OfferPaths paths should just be super long-lived? I don't see a strong reason to have a separate concept of long-lived paths.
Even if the offer_paths in the OfferPaths message are super long-lived, we still need a way for the recipient to update their static invoice later. So the additional paths would be for that purpose, is my thinking.
Oh vs just having the original paths be long-lived? I guess we could, but it seems like we could just make all the paths long-lived?
Gotcha. In the current PR, the recipient sends PersistStaticInvoice over the reply path to the OfferPaths message, and that reply path is short-lived.
So we could make that reply path long-lived instead and have the recipient cache that reply path to update their invoice later. Just to confirm, that's what you're suggesting?
Yea, that's what I was thinking. Basically just make it a "multi-reply path"
That makes sense. It means we won't get extra message attempts over alternate blinded paths, but that might be a premature optimization anyway, hard to tell.
Yea, I dunno what to do for the fetch'er, maybe we just expose the whole list?
Makes sense, tho I imagine it would be a rather trivial diff, no?
// Build an HMAC over a fixed IV, the nonce, and a protocol-specific input to
// authenticate the offer-paths context.
const IV_BYTES: &[u8; IV_LEN] = b"LDK Offer Paths~";
let mut hmac = expanded_key.hmac_for_offer();
hmac.input(IV_BYTES);
hmac.input(&nonce.0);
hmac.input(ASYNC_PAYMENTS_OFFER_PATHS_INPUT);
Do we need to include path_absolute_expiry?
I thought the nonce/IV was sufficient but I'm not certain. @TheBlueMatt would it be an improvement to commit to the expiry in the hmac? IIUC the path still can't be re-purposed...
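If committing to the expiry did turn out to be worthwhile, it would roughly be one more input call over the big-endian seconds. A fragment continuing the snippet above, where path_absolute_expiry is assumed to be the Option<Duration> under discussion:

// Sketch: additionally bind the path's absolute expiry (if any) into the
// authentication tag so the expiry can't be swapped out without invalidating it.
if let Some(expiry) = path_absolute_expiry {
	hmac.input(&expiry.as_secs().to_be_bytes());
}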
Going to base this on #3640. Will finish updating the ser macros based on those changes and push updates here after finishing some review.
Added a couple of comments that I found while working on the CI failure in #3593.
5cf3585 to 5455d55
Pushed some updates after moving the async receive offer cache into the new …
New to the codebase but interested in following async payments. From reading the explanation in the commit messages, the protocol/flow between the async recipient and the always-online node to build the static invoice and offer made sense. Overall the code changes look good to me.
/// Handles an OfferPathsRequest from an async recipient, returning the OfferPaths response and
/// instructions for sending it, or None if no response should be sent.
fn handle_offer_paths_request(
	&self, message: OfferPathsRequest, context: AsyncPaymentsContext,
	responder: Option<Responder>,
) -> Option<(OfferPaths, ResponseInstruction)>;
I see it is similar to other message handler traits in the OnionMessenger, but I was wondering why these handle_ methods return Options instead of Results?
Good question, I wrote that code forever ago but I think it was just consistency with the other onion message handler traits at the time. Fine to switch if reviewers prefer, although I might punt since the handle_held_htlc_available instance within the async payments trait is pre-existing...
Self::OfferPathsRequest(_) => OFFER_PATHS_REQ_TLV_TYPE,
Self::OfferPaths(msg) => msg.tlv_type(),
Self::ServeStaticInvoice(msg) => msg.tlv_type(),
Self::StaticInvoicePersisted(_) => INVOICE_PERSISTED_TLV_TYPE,
Why do some use the const directly here, and others get the const set through tlv_type on the msg?
The variants that return consts correspond to messages that don't implement the OnionMessageContents trait, so they don't have the tlv_type method available. Looks like docs are a bit lacking here, but the OnionMessageContents trait implementation seems to only be needed for onion messages that are sent in direct response to other onion messages.
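To make that asymmetry concrete, a simplified sketch of the dispatch; the constant values, struct bodies, and trait definition below are stand-ins for illustration, not the real LDK definitions.

// Simplified illustration; the constants' values and struct bodies are made up.
const OFFER_PATHS_REQ_TLV_TYPE: u64 = 65538;
const INVOICE_PERSISTED_TLV_TYPE: u64 = 65544;

trait OnionMessageContents {
	fn tlv_type(&self) -> u64;
}

// Never sent as a direct reply to another onion message, so no trait impl:
struct OfferPathsRequest;
struct StaticInvoicePersisted;

// Sent as replies, so they implement the trait and report their own TLV type:
struct OfferPaths { tlv: u64 }
struct ServeStaticInvoice { tlv: u64 }

impl OnionMessageContents for OfferPaths {
	fn tlv_type(&self) -> u64 { self.tlv }
}
impl OnionMessageContents for ServeStaticInvoice {
	fn tlv_type(&self) -> u64 { self.tlv }
}

enum AsyncPaymentsMessage {
	OfferPathsRequest(OfferPathsRequest),
	OfferPaths(OfferPaths),
	ServeStaticInvoice(ServeStaticInvoice),
	StaticInvoicePersisted(StaticInvoicePersisted),
}

impl AsyncPaymentsMessage {
	fn tlv_type(&self) -> u64 {
		match self {
			// No OnionMessageContents impl to call, so use the const directly.
			Self::OfferPathsRequest(_) => OFFER_PATHS_REQ_TLV_TYPE,
			Self::StaticInvoicePersisted(_) => INVOICE_PERSISTED_TLV_TYPE,
			// Reply messages expose their TLV type via the trait method.
			Self::OfferPaths(msg) => msg.tlv_type(),
			Self::ServeStaticInvoice(msg) => msg.tlv_type(),
		}
	}
}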
Document that MessageForwardNode must represent a node that supports the onion messages feature in order to be used in blinded reply paths.
Encapsulates logic for fetching peers used in blinded path creation. Reduces duplication and improves reusability across functions.
`OffersMessageFlow` is a mid-level abstraction for handling BOLT12 messages and flow control. It provides utilities to help implement Offer Message Handlers in a cleaner, more modular way. The core motivation is to decouple Onion Messaging logic from `ChannelManager`, reducing its responsibilities and code overhead. This separation improves clarity, maintainability, and lays the groundwork for giving users greater flexibility in customizing their BOLT12 message flows.
These functions will be used in the following commit to replace closure usage in Flow trait functions.
As part of being an async recipient, we need to support interactively building an offer and static invoice with an always-online node that will serve static invoices on our behalf. Add a config field containing blinded message paths that async recipients can use to request blinded paths that will be included in their offer. Payers will forward invoice requests over the paths returned by the server, and receive a static invoice in response if the recipient is offline.
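In sketch form, the kind of config addition this commit describes; the struct and field names below are illustrative placeholders rather than the exact ones added.

// Placeholder type standing in for LDK's blinded message path.
pub struct BlindedMessagePath;

// Hypothetical shape of the config addition: message paths over which an async
// recipient can reach the static invoice server to request offer paths.
pub struct AsyncReceiveConfig {
	pub paths_to_static_invoice_server: Vec<BlindedMessagePath>,
}

impl Default for AsyncReceiveConfig {
	fn default() -> Self {
		// Empty by default: the async receive flow only runs once the user
		// configures at least one path to a static invoice server.
		Self { paths_to_static_invoice_server: Vec::new() }
	}
}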
5455d55 to f8023ca
Rebased on the latest version of #3639
Codecov Report
Attention: Patch coverage is …

@@           Coverage Diff            @@
##             main    #3618     +/-  ##
=========================================
+ Coverage   89.52%   89.69%   +0.16%
=========================================
  Files         157      161       +4
  Lines      125100   129076    +3976
  Branches   125100   129076    +3976
=========================================
+ Hits       112002   115781    +3779
- Misses      10408    10605     +197
  Partials     2690     2690
d1cc154 to 68fd751
Pushed some minor fixes for CI.
Because async recipients are not online to respond to invoice requests, the plan is for another always-online node on the network to serve static invoices on their behalf. The protocol is as follows:
- Recipient is configured with blinded message paths to reach the static invoice server
- On startup, the recipient requests blinded message paths for inclusion in their offer from the static invoice server over the configured paths
- Server replies with offer paths for the recipient
- Recipient builds their offer using these paths and the corresponding static invoice, and replies with the invoice
- Server persists the invoice and confirms that they've persisted it, causing the recipient to cache the interactively built offer for use

At pay-time, the payer sends an invoice request to the static invoice server, who replies with the static invoice after forwarding the invreq to the recipient (to give them a chance to provide a fresh invoice in case they're online). Here we add the requisite trait methods and onion messages to support this protocol.
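For orientation, a minimal sketch of the recipient's view of this setup phase, using the message names that appear in this PR (OfferPathsRequest, OfferPaths, ServeStaticInvoice, StaticInvoicePersisted); the state names themselves are invented for clarity.

// Illustrative recipient-side view of the setup protocol described above.
enum OfferSetupState {
	// Startup: sent OfferPathsRequest to the static invoice server over the
	// configured blinded paths; waiting for offer paths in return.
	AwaitingOfferPaths,
	// Received OfferPaths, built the offer and its static invoice, and replied
	// with ServeStaticInvoice; waiting for the server to confirm persistence.
	AwaitingPersistence,
	// Received StaticInvoicePersisted: the offer is cached and ready to hand out.
	OfferCached,
}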
In future commits, as part of being an async recipient, we will interactively build offers and static invoices with an always-online node that will serve static invoices on our behalf. Once an offer is built and the static invoice is confirmed as persisted by the server, we will use the new offer cache added here to save the invoice metadata and the offer in ChannelManager, though the OffersMessageFlow is responsible for keeping the cache updated.
As an async recipient, we need to interactively build static invoices that an always-online node will serve to payers on our behalf. At the start of this process, on startup we send the always-online node a request for paths to include in our offers, and we refresh the cached offers when they expire.
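A hedged sketch of when that request might be (re-)sent; the helper and struct names below are invented for illustration.

use std::time::Duration;

// Invented helper: decide (e.g. on startup or a timer tick) whether to send a
// new OfferPathsRequest to the static invoice server.
struct CachedOfferSketch { absolute_expiry: Duration }

fn should_request_offer_paths(cache: &[CachedOfferSketch], now: Duration) -> bool {
	// Request paths if we have no cached offers, or every cached offer has expired.
	cache.iter().all(|offer| offer.absolute_expiry <= now)
}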
As an async recipient, we need to interactively build a static invoice that an always-online node will serve to payers on our behalf. As part of this process, the static invoice server sends us blinded message paths to include in our offer so they'll receive invoice requests from senders trying to pay us while we're offline. On receipt of these paths, create an offer and static invoice and send the invoice back to the server so they can provide the invoice to payers.
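A rough sketch of that handling under stated assumptions: all types below are illustrative stand-ins, and the real code builds the offer and static invoice with LDK's builders and signs the invoice with the node's keys.

// Illustrative-only types; not the real LDK definitions.
struct BlindedMessagePath;
struct Offer { paths: Vec<BlindedMessagePath> }
struct StaticInvoice;
struct ServeStaticInvoice { invoice: StaticInvoice }

fn handle_offer_paths(paths: Vec<BlindedMessagePath>) -> (Offer, ServeStaticInvoice) {
	// Embed the server-provided paths in our offer so senders' invoice requests
	// reach the static invoice server while we're offline.
	let offer = Offer { paths };
	// Build the corresponding static invoice and send it back to the server so
	// it can be persisted and later served to payers.
	let invoice = StaticInvoice;
	(offer, ServeStaticInvoice { invoice })
}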
As an async recipient, we need to interactively build a static invoice that an always-online node will serve on our behalf. Once this invoice is built and persisted by the static invoice server, they will send us a confirmation onion message. At this time, cache the corresponding offer and mark it as ready to receive async payments.
As an async recipient, we need to interactively build offers and corresponding static invoices, the latter of which an always-online node will serve to payers on our behalf. Offers may be very long-lived and have a longer expiration than their corresponding static invoice. Therefore, persist a fresh invoice with the static invoice server when the current invoice gets close to expiration.
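For illustration only, a sketch of such an expiry check; the refresh window constant is invented, not a value this PR uses.

use std::time::Duration;

// Invented threshold: refresh the persisted invoice when it is within a day of
// expiring, so the server never ends up serving a stale one.
const INVOICE_REFRESH_WINDOW: Duration = Duration::from_secs(24 * 60 * 60);

fn needs_invoice_refresh(invoice_absolute_expiry: Duration, now: Duration) -> bool {
	invoice_absolute_expiry.saturating_sub(now) < INVOICE_REFRESH_WINDOW
}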
Over the past several commits we've implemented interactively building an async receive offer with a static invoice server that will service invoice requests on our behalf as an async recipient. Here we add an API to retrieve the resulting offers so we can receive payments when we're offline.
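A sketch of how such an accessor could be used; the struct and method names below are hypothetical and may not match the actual API.

// Hypothetical accessor for illustration.
#[derive(Clone)]
struct OfferSketch;

struct AsyncReceiveOfferCacheSketch { offers: Vec<OfferSketch> }

impl AsyncReceiveOfferCacheSketch {
	// Returns the interactively built offers that are ready to hand to payers,
	// e.g. to render as a QR code or paste into a profile.
	fn get_async_receive_offers(&self) -> Vec<OfferSketch> {
		self.offers.clone()
	}
}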
68fd751 to a5e4718
As part of being an async recipient, we need to interactively build an offer and static invoice with an always-online node that will serve static invoices on our behalf in response to invoice requests from payers.
While users could build this invoice manually, the plan is for LDK to automatically build it for them using onion messages. See this doc for more details on the protocol. Here we implement the client side of the linked protocol.
See lightning/bolts#1149 for more information on async payments.