
Client-side of static invoice server #3618


Open
valentinewallace wants to merge 14 commits into main from 2025-02-static-inv-server-client

Conversation

valentinewallace
Contributor

As part of being an async recipient, we need to interactively build an offer and static invoice with an always-online node that will serve static invoices on our behalf in response to invoice requests from payers.

While users could build this invoice manually, the plan is for LDK to automatically build it for them using onion messages. See this doc for more details on the protocol. Here we implement the client side of the linked protocol.

See lightning/bolts#1149 for more information on async payments.

@valentinewallace
Contributor Author

Will go through the commits in a bit more detail before taking this out of draft, but conceptual feedback, feedback on the protocol itself, or feedback on how the code is organized overall is welcome. It does add a significant amount of code to ChannelManager currently.

@TheBlueMatt (Collaborator) left a comment

Hmmm, I wonder if we shouldn't allow the client to cache N offers rather than only 1. I worry a bit about the privacy implications of having One Offer that gets reused across different contexts.

@valentinewallace
Contributor Author

Hmmm, I wonder if we shouldn't allow the client to cache N offers rather than only 1. I worry a bit about the privacy implications of having One Offer that gets reused across different contexts.

I think that makes sense, so they would interactively build and cache a few and then randomly(?) return one of them on get_cached_async_receive_offer?

It seems reasonable to save for follow-up although I could adapt the AsyncReceiveOffer cache struct serialization for this now.
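
For illustration, here is a minimal sketch of what an N-offer cache with pseudo-random selection could look like. The struct name mirrors the PR's AsyncReceiveOfferCache, but the fields and method body below are hypothetical, not the PR's actual code:

use std::time::{Duration, SystemTime, UNIX_EPOCH};

/// Hypothetical cache entry; the real cache also tracks invoice metadata.
struct CachedOffer {
    offer_bytes: Vec<u8>, // stand-in for a BOLT12 Offer
    absolute_expiry: Duration,
}

/// Hypothetical multi-offer cache, per the "cache N offers" suggestion above.
struct AsyncReceiveOfferCache {
    offers: Vec<CachedOffer>,
}

impl AsyncReceiveOfferCache {
    /// Returns one unexpired cached offer, if any. A real implementation might
    /// pick uniformly at random or expose the whole list instead.
    fn get_cached_async_receive_offer(&self) -> Option<&CachedOffer> {
        let now = SystemTime::now().duration_since(UNIX_EPOCH).ok()?;
        let live: Vec<&CachedOffer> =
            self.offers.iter().filter(|o| o.absolute_expiry > now).collect();
        if live.is_empty() {
            return None;
        }
        // Cheap pseudo-random pick that avoids pulling in an RNG dependency.
        let idx = now.subsec_nanos() as usize % live.len();
        Some(live[idx])
    }
}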

Comment on lines 12709 to 12714
// Expire the offer at the same time as the static invoice so we automatically refresh both
// at the same time.
let offer_and_invoice_absolute_expiry = Duration::from_secs(core::cmp::min(
    offer_paths_absolute_expiry.as_secs(),
    duration_since_epoch.saturating_add(STATIC_INVOICE_DEFAULT_RELATIVE_EXPIRY).as_secs()
));
Contributor Author

One thing I want to address eventually (but maybe not in this PR) is that right now we cap the expiry of our offer/static invoice at 2 weeks, which doesn't work well for the "offer in Twitter bio" use case. Probably we can add something to UserConfig for this, and expose a method for users to proactively rotate their offer if it never expires?
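
Purely as a sketch of the idea (async_receive_offer_relative_expiry is a hypothetical UserConfig field, not something in this PR), the capped-expiry computation quoted above could then take a configured value into account:

// Hypothetical: fall back to the 2-week default unless the user opted into a
// longer-lived offer via config.
let relative_expiry = user_config
    .async_receive_offer_relative_expiry // hypothetical field
    .unwrap_or(STATIC_INVOICE_DEFAULT_RELATIVE_EXPIRY);
let offer_and_invoice_absolute_expiry = Duration::from_secs(core::cmp::min(
    offer_paths_absolute_expiry.as_secs(),
    duration_since_epoch.saturating_add(relative_expiry).as_secs(),
));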

Collaborator

I wonder if for that we shouldn't try to come up with a scheme to allow the offer to last longer than the static invoice? Ideally an offer lasts at least a few years, and it kind of could, because you just care about the storage server being reliable; you don't care much about the static invoice.

Contributor Author

That makes sense. We could include another set of long-lived paths in the OfferPaths message that allows the recipient to refresh their invoice later while keeping the same offer [paths].

Collaborator

I mean, maybe the OfferPaths paths should just be super long-lived? I don't see a strong reason to have a separate concept of long-lived paths?

Contributor Author

Even if the OfferPaths offer_paths are super long-lived, we still need a way for the recipient to update their static invoice later. So the additional paths would be for that purpose, is my thinking.

Collaborator

Oh vs just having the original paths be long-lived? I guess we could, but it seems like we could just make all the paths long-lived?

Contributor Author

Gotcha. In the current PR, the recipient sends PersistStaticInvoice over the reply path to the OfferPaths message, and that reply path is short-lived.

So we could make that reply path long-lived instead and have the recipient cache that reply path to update their invoice later. Just to confirm, that's what you're suggesting?

Collaborator

Yea, that's what I was thinking. Basically just make it a "multi-reply path".

Contributor Author

That makes sense. It means we won't get extra message attempts over alternate blinded paths, but that might be a premature optimization anyway, hard to tell.
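
A rough sketch of the shape this could take on the recipient side; every name below is an illustrative stand-in rather than one of the PR's actual types:

/// Stand-in for lightning's BlindedMessagePath type.
#[derive(Clone)]
struct BlindedMessagePath(Vec<u8>);

/// Hypothetical recipient-side state for the "multi-reply path" idea above.
struct OfferWithUpdatePath {
    offer_bytes: Vec<u8>,
    /// Long-lived reply path received alongside OfferPaths; cached so the
    /// recipient can later send an updated invoice for the same offer.
    invoice_update_path: BlindedMessagePath,
}

impl OfferWithUpdatePath {
    /// When the cached static invoice nears expiry, the caller rebuilds it for
    /// the existing offer and we resend it over the cached long-lived path
    /// instead of requesting fresh offer paths.
    fn refresh_invoice(
        &self, fresh_invoice_bytes: Vec<u8>,
        send_onion_message: impl Fn(&BlindedMessagePath, Vec<u8>),
    ) {
        send_onion_message(&self.invoice_update_path, fresh_invoice_bytes);
    }
}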

@TheBlueMatt
Collaborator

TheBlueMatt commented Feb 25, 2025

I think that makes sense, so they would interactively build and cache a few and then randomly(?) return one of them on get_cached_async_receive_offer?

Yea, I dunno what to do for the fetcher, maybe we just expose the whole list?

It seems reasonable to save for follow-up although I could adapt the AsyncReceiveOffer cache struct serialization for this now.

Makes sense, tho I imagine it would be a rather trivial diff, no?

@jkczyz self-requested a review February 27, 2025 18:10
@valentinewallace added the "weekly goal (Someone wants to land this this week)" label Feb 27, 2025
Comment on lines +568 to +592
const IV_BYTES: &[u8; IV_LEN] = b"LDK Offer Paths~";
let mut hmac = expanded_key.hmac_for_offer();
hmac.input(IV_BYTES);
hmac.input(&nonce.0);
hmac.input(ASYNC_PAYMENTS_OFFER_PATHS_INPUT);
Contributor

Do we need to include path_absolute_expiry?

Contributor Author

I thought the nonce/IV was sufficient but I'm not certain. @TheBlueMatt would it be an improvement to commit to the expiry in the hmac? IIUC the path still can't be re-purposed...
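
If committing to the expiry does turn out to be worthwhile, it would presumably just be one more hmac.input(..) over the serialized expiry. A standalone sketch using rust-bitcoin's HMAC engine directly (the key handling and the ASYNC_PAYMENTS_OFFER_PATHS_INPUT value below are placeholders, not LDK's expanded-key derivation):

use std::time::Duration;

use bitcoin::hashes::{sha256, HashEngine, Hmac, HmacEngine};

/// Placeholder for LDK's actual per-message input constant.
const ASYNC_PAYMENTS_OFFER_PATHS_INPUT: &[u8] = b"placeholder offer paths input";

fn offer_paths_hmac(
    key: &[u8; 32], nonce: &[u8; 16], path_absolute_expiry: Duration,
) -> Hmac<sha256::Hash> {
    const IV_BYTES: &[u8; 16] = b"LDK Offer Paths~";
    let mut hmac = HmacEngine::<sha256::Hash>::new(key);
    hmac.input(IV_BYTES);
    hmac.input(nonce);
    hmac.input(ASYNC_PAYMENTS_OFFER_PATHS_INPUT);
    // The addition under discussion: also commit to the path's expiry so the
    // authenticated context is bound to it.
    hmac.input(&path_absolute_expiry.as_secs().to_be_bytes());
    Hmac::from_engine(hmac)
}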

@valentinewallace marked this pull request as ready for review March 4, 2025 21:14
@valentinewallace
Contributor Author

Going to base this on #3640. Will finish updating the ser macros based on those changes and push updates here after finishing some review.

@valentinewallace removed the "weekly goal (Someone wants to land this this week)" label Mar 4, 2025
@vincenzopalazzo (Contributor) left a comment

Added a couple of comments that I found while working on the CI failure in #3593.

@valentinewallace
Contributor Author

Pushed some updates after moving the async receive offer cache into the new OffersMessageFlow struct added in #3639.

@elnosh left a comment

New to the codebase but interested in following async payments. From reading the explanation in the commit messages, the protocol/flow between the async recipient and the always-online node to build the static invoice and offer made sense. Overall the code changes look good to me.

Comment on lines +41 to +44
fn handle_offer_paths_request(
    &self, message: OfferPathsRequest, context: AsyncPaymentsContext,
    responder: Option<Responder>,
) -> Option<(OfferPaths, ResponseInstruction)>;

I see it is similar to other message handler traits in the OnionMessenger, but I was wondering why these handle_ methods return Option instead of Result?

Contributor Author

Good question, I wrote that code forever ago but I think it was just consistency with the other onion message handler traits at the time. Fine to switch if reviewers prefer, although I might punt since the handle_held_htlc_available instance within the async payments trait is pre-existing...
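
For what it's worth, the Option here mostly encodes "nothing to send back". A tiny sketch of how a static invoice server implementation might use it, with local stub types rather than the PR's actual definitions:

// Local stubs standing in for the PR's message and context types.
struct OfferPathsRequest;
struct AsyncPaymentsContext;
struct Responder;
struct OfferPaths;
struct ResponseInstruction;

fn handle_offer_paths_request(
    _message: OfferPathsRequest, _context: AsyncPaymentsContext, responder: Option<Responder>,
) -> Option<(OfferPaths, ResponseInstruction)> {
    // Returning None simply means no response is sent (e.g. no reply path was
    // provided, or this node isn't acting as a static invoice server); the
    // messenger drops the message rather than surfacing an error.
    let _responder = responder?;
    Some((OfferPaths, ResponseInstruction))
}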

Comment on lines +257 to +260
Self::OfferPathsRequest(_) => OFFER_PATHS_REQ_TLV_TYPE,
Self::OfferPaths(msg) => msg.tlv_type(),
Self::ServeStaticInvoice(msg) => msg.tlv_type(),
Self::StaticInvoicePersisted(_) => INVOICE_PERSISTED_TLV_TYPE,

Why do some variants use the const directly here while others get it through tlv_type() on the msg?

Contributor Author
@valentinewallace May 19, 2025

The variants that return consts correspond to messages that don't implement the OnionMessageContents trait, so they don't have the tlv_type method available. Looks like docs are a bit lacking here but the OnionMessageContents trait implementation seems to only be needed for onion messages that are sent in direct response to other onion messages.
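
A stripped-down illustration of that split, using a stub trait and placeholder TLV values rather than the real ones:

/// Stub with the same shape as OnionMessageContents for this example.
trait OnionMessageContents {
    fn tlv_type(&self) -> u64;
}

struct OfferPathsRequest;
struct OfferPaths;
struct ServeStaticInvoice;
struct StaticInvoicePersisted;

// The request/confirmation messages only need module-level constants...
const OFFER_PATHS_REQ_TLV_TYPE: u64 = 1; // placeholder value
const INVOICE_PERSISTED_TLV_TYPE: u64 = 2; // placeholder value

// ...while the messages sent in direct response to another onion message
// implement the trait, so the match quoted above can call msg.tlv_type().
impl OnionMessageContents for OfferPaths {
    fn tlv_type(&self) -> u64 { 3 } // placeholder value
}
impl OnionMessageContents for ServeStaticInvoice {
    fn tlv_type(&self) -> u64 { 4 } // placeholder value
}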

shaavan and others added 7 commits May 20, 2025 16:03

Document that MessageForwardNode must represent a node that supports
the onion messages feature in order to be used in blinded reply paths.

Encapsulates logic for fetching peers used in blinded path creation.
Reduces duplication and improves reusability across functions.

`OffersMessageFlow` is a mid-level abstraction for handling
BOLT12 messages and flow control. It provides utilities to
help implement Offer Message Handlers in a cleaner, more modular
way.

The core motivation is to decouple Onion Messaging logic from
`ChannelManager`, reducing its responsibilities and code overhead.
This separation improves clarity and maintainability, and lays the
groundwork for giving users greater flexibility in customizing
their BOLT12 message flows.

These functions will be used in the following commit to replace closure usage
in Flow trait functions.

As part of being an async recipient, we need to support interactively building
an offer and static invoice with an always-online node that will serve static
invoices on our behalf.

Add a config field containing blinded message paths that async recipients can
use to request blinded paths that will be included in their offer. Payers will
forward invoice requests over the paths returned by the server, and receive a
static invoice in response if the recipient is offline.
@valentinewallace force-pushed the 2025-02-static-inv-server-client branch from 5455d55 to f8023ca on May 21, 2025 00:11
@valentinewallace
Contributor Author

Rebased on the latest version of #3639

codecov bot commented May 21, 2025

Codecov Report

Attention: Patch coverage is 76.42276% with 174 lines in your changes missing coverage. Please review.

Project coverage is 89.69%. Comparing base (78fee88) to head (a5e4718).
Report is 52 commits behind head on main.

Files with missing lines Patch % Lines
lightning/src/ln/channelmanager.rs 48.63% 69 Missing and 6 partials ⚠️
lightning/src/onion_message/async_payments.rs 0.00% 29 Missing ⚠️
lightning/src/onion_message/functional_tests.rs 0.00% 20 Missing ⚠️
lightning/src/offers/flow.rs 96.36% 12 Missing and 6 partials ⚠️
lightning/src/ln/peer_handler.rs 0.00% 17 Missing ⚠️
lightning/src/offers/async_receive_offer_cache.rs 31.81% 15 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3618      +/-   ##
==========================================
+ Coverage   89.52%   89.69%   +0.16%     
==========================================
  Files         157      161       +4     
  Lines      125100   129076    +3976     
  Branches   125100   129076    +3976     
==========================================
+ Hits       112002   115781    +3779     
- Misses      10408    10605     +197     
  Partials     2690     2690              


@valentinewallace force-pushed the 2025-02-static-inv-server-client branch 3 times, most recently from d1cc154 to 68fd751 on May 22, 2025 22:52
@valentinewallace
Contributor Author

Pushed some minor fixes for CI.

Because async recipients are not online to respond to invoice requests,
the plan is for another node on the network that is always-online to serve
static invoices on their behalf.

The protocol is as follows:
- Recipient is configured with blinded message paths to reach the static invoice
  server
- On startup, recipient requests blinded message paths for inclusion in their
  offer from the static invoice server over the configured paths
- Server replies with offer paths for the recipient
- Recipient builds their offer using these paths, builds the corresponding
  static invoice, and replies with the invoice
- Server persists the invoice and confirms that they've persisted it, causing
  the recipient to cache the interactively built offer for use

At pay-time, the payer sends an invoice request to the static invoice server,
who replies with the static invoice after forwarding the invreq to the
recipient (to give them a chance to provide a fresh invoice in case they're
online).

Here we add the requisite trait methods and onion messages to support this
protocol.

In future commits, as part of being an async recipient, we will interactively
build offers and static invoices with an always-online node that will serve
static invoices on our behalf.

Once an offer is built and the static invoice is confirmed as persisted by the
server, we will use the new offer cache added here to save the invoice metadata
and the offer in ChannelManager, though the OffersMessageFlow is responsible
for keeping the cache updated.

As an async recipient, we need to interactively build static invoices that an
always-online node will serve to payers on our behalf.

To start this process, on startup we send the always-online node a request for
paths to include in our offers, and we refresh the cached offers when they
expire.

As an async recipient, we need to interactively build a static invoice that an
always-online node will serve to payers on our behalf.

As part of this process, the static invoice server sends us blinded message
paths to include in our offer so they'll receive invoice requests from senders
trying to pay us while we're offline. On receipt of these paths, create an
offer and static invoice and send the invoice back to the server so they can
provide the invoice to payers.

As an async recipient, we need to interactively build a static invoice that an
always-online node will serve on our behalf.

Once this invoice is built and persisted by the static invoice server, they
will send us a confirmation onion message. At this time, cache the
corresponding offer and mark it as ready to receive async payments.

As an async recipient, we need to interactively build offers and corresponding
static invoices, the latter of which an always-online node will serve to payers
on our behalf.

Offers may be very long-lived and have a longer expiration than their
corresponding static invoice. Therefore, persist a fresh invoice with the
static invoice server when the current invoice gets close to expiration.

Over the past several commits we've implemented interactively building an async
receive offer with a static invoice server that will service invoice requests
on our behalf as an async recipient.

Here we add an API to retrieve the resulting offers so we can receive payments
when we're offline.
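
Pulling the commit messages above together, here is a compressed sketch of the recipient-side exchange. The message names match the PR, but StaticInvoiceServerMessage and the closure-based driver are illustrative only; the real logic is event-driven inside OffersMessageFlow/ChannelManager rather than a linear function:

/// The onion messages in the offer-building protocol (names as in the PR; all
/// fields omitted).
enum StaticInvoiceServerMessage {
    /// Recipient -> server, sent over the configured paths on startup.
    OfferPathsRequest,
    /// Server -> recipient: blinded paths to embed in the recipient's offer.
    OfferPaths,
    /// Recipient -> server: the freshly built static invoice to persist and
    /// serve to payers while the recipient is offline.
    ServeStaticInvoice,
    /// Server -> recipient: confirmation that the invoice was persisted.
    StaticInvoicePersisted,
}

/// Illustrative driver for the recipient side of the protocol.
fn recipient_offer_setup(
    send: impl Fn(StaticInvoiceServerMessage),
    mut recv: impl FnMut() -> StaticInvoiceServerMessage,
    cache_offer: impl Fn(),
) {
    send(StaticInvoiceServerMessage::OfferPathsRequest);
    if let StaticInvoiceServerMessage::OfferPaths = recv() {
        // Build the offer from the returned paths and the corresponding static
        // invoice, then hand the invoice to the server for persistence.
        send(StaticInvoiceServerMessage::ServeStaticInvoice);
    }
    if let StaticInvoiceServerMessage::StaticInvoicePersisted = recv() {
        // Only once persistence is confirmed is the offer cached and handed
        // out to payers.
        cache_offer();
    }
}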
@valentinewallace force-pushed the 2025-02-static-inv-server-client branch from 68fd751 to a5e4718 on May 22, 2025 23:21