Add support for mapping old fields to new ones in TLV read macros #3378

Merged
2 changes: 1 addition & 1 deletion lightning-macros/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "lightning-macros"
-version = "0.1.0"
+version = "0.2.0+git"
authors = ["Elias Rohrer"]
license = "MIT OR Apache-2.0"
repository = "https://github.com/lightningdevkit/rust-lightning/"
222 changes: 221 additions & 1 deletion lightning-macros/src/lib.rs
Contributor:

As async stuff is very different purpose from the serialization stuff, do we want to add ser.rs and async.rs sub-modules (potentially still re-exporting the macros at the crate-level)?

Contributor:

bump.

@@ -18,7 +18,10 @@
#![deny(rustdoc::private_intra_doc_links)]
#![cfg_attr(docsrs, feature(doc_auto_cfg))]

use proc_macro::TokenStream;
extern crate alloc;

use alloc::string::ToString;
use proc_macro::{Delimiter, Group, TokenStream, TokenTree};
use quote::quote;
use syn::spanned::Spanned;
use syn::{parse, ImplItemFn, Token};
@@ -74,3 +74,220 @@ pub fn maybe_await(expr: TokenStream) -> TokenStream {

quoted.into()
}

fn expect_ident(token: &TokenTree, expected_name: Option<&str>) {
	if let TokenTree::Ident(id) = &token {
		if let Some(exp) = expected_name {
			assert_eq!(id.to_string(), exp, "Expected ident {}, got {:?}", exp, token);
		}
	} else {
		panic!("Expected ident {:?}, got {:?}", expected_name, token);
	}
}

fn expect_punct(token: &TokenTree, expected: char) {
	if let TokenTree::Punct(p) = &token {
		assert_eq!(p.as_char(), expected, "Expected punctuation {}, got {}", expected, p);
	} else {
		panic!("Expected punctuation {}, got {:?}", expected, token);
	}
}

fn token_to_stream(token: TokenTree) -> proc_macro::TokenStream {
	proc_macro::TokenStream::from(token)
}
Comment on lines +99 to +101

Contributor:

IMO we can get rid of this helper

Collaborator Author:

It removes a ton of really tall lines because of additional map calls with the function.


/// Processes a list of fields in a variant definition (see the docs for [`skip_legacy_fields!`])
fn process_fields(group: Group) -> proc_macro::TokenStream {
	let mut computed_fields = proc_macro::TokenStream::new();
	if group.delimiter() == Delimiter::Brace {
		let mut fields_stream = group.stream().into_iter().peekable();

		let mut new_fields = proc_macro::TokenStream::new();
		loop {
			// The field list should end with .., at which point we break
			let next_tok = fields_stream.peek();
			if let Some(TokenTree::Punct(_)) = next_tok {
				let dot1 = fields_stream.next().unwrap();
				expect_punct(&dot1, '.');
				let dot2 = fields_stream.next().expect("Missing second trailing .");
				expect_punct(&dot2, '.');
				let trailing_dots = [dot1, dot2];
				new_fields.extend(trailing_dots.into_iter().map(token_to_stream));
				assert!(fields_stream.peek().is_none());
				break;
			}

			// Fields should take the form `ref field_name: ty_info` where `ty_info`
			// may be a single ident or may be a group. We skip the field if `ty_info`
			// is a group where the first token is the ident `legacy`.
			let ref_ident = fields_stream.next().unwrap();
			expect_ident(&ref_ident, Some("ref"));
			let field_name_ident = fields_stream.next().unwrap();
			let co = fields_stream.next().unwrap();
			expect_punct(&co, ':');
			let ty_info = fields_stream.next().unwrap();
			let com = fields_stream.next().unwrap();
			expect_punct(&com, ',');

			if let TokenTree::Group(group) = ty_info {
				let first_group_tok = group.stream().into_iter().next().unwrap();
				if let TokenTree::Ident(ident) = first_group_tok {
					if ident.to_string() == "legacy" {
Contributor:

I'm curious if there's some way to avoid the string allocation by converting ident to a proc_macro2 ident and comparing it to Ident::new("legacy", ident.span())?

Collaborator Author:

I believe Ident::new("legacy", ident.span()) will create a String under the hood. Really Idents need to support fetching the string they likely have, but oh well...


						continue;
					}
				}
			}

			let field = [ref_ident, field_name_ident, com];
			new_fields.extend(field.into_iter().map(token_to_stream));
		}
		let fields_group = Group::new(Delimiter::Brace, new_fields);
		computed_fields.extend(token_to_stream(TokenTree::Group(fields_group)));
	} else {
		computed_fields.extend(token_to_stream(TokenTree::Group(group)));
	}
	computed_fields
}

/// Scans a match statement for legacy fields which should be skipped.
///
/// This is used internally in LDK's TLV serialization logic and is not expected to be used by
/// other crates.
///
/// Wraps a `match self {..}` statement and scans the fields in the match patterns (in the form
/// `ref $field_name: $field_ty`) for types marked `legacy`, skipping those fields.
///
/// Specifically, it expects input like the following, simply dropping `field3` and the
/// `: $field_ty` after each field name.
/// ```ignore
Contributor:

Seems better to avoid ignore if possible, here and below

Collaborator Author:

Hmm, I can't get them to run in doctests 🤷‍♂️


/// match self {
/// 	Enum::Variant {
/// 		ref field1: option,
/// 		ref field2: (option, explicit_type: u64),
/// 		ref field3: (legacy, u64, {}, {}), // will be skipped
/// 		..
/// 	} => expression
/// }
/// ```
#[proc_macro]
pub fn skip_legacy_fields(expr: TokenStream) -> TokenStream {
Member:

Maybe we can break the function into modular parts. Something like process_match_pattern for handling the Enum::Variant part and a process_field for the internal fields. It might help keep things organized and easier to follow!

Collaborator Author:

Sadly, this stuff is just kinda inscrutable :(. I'll add more comments and split it but I'm not sure how much it'll help...


	let mut stream = expr.into_iter();
	let mut res = TokenStream::new();

	// First expect `match self` followed by a `{}` group...
	let match_ident = stream.next().unwrap();
	expect_ident(&match_ident, Some("match"));
	res.extend(proc_macro::TokenStream::from(match_ident));

	let self_ident = stream.next().unwrap();
	expect_ident(&self_ident, Some("self"));
	res.extend(proc_macro::TokenStream::from(self_ident));

	let arms = stream.next().unwrap();
	if let TokenTree::Group(group) = arms {
		let mut new_arms = TokenStream::new();

		let mut arm_stream = group.stream().into_iter().peekable();
		while arm_stream.peek().is_some() {
			// Each arm should contain Enum::Variant { fields } => init
			// We explicitly check the :s, =, and >, as well as an optional trailing ,
			let enum_ident = arm_stream.next().unwrap();
			let co1 = arm_stream.next().unwrap();
			expect_punct(&co1, ':');
			let co2 = arm_stream.next().unwrap();
			expect_punct(&co2, ':');
			let variant_ident = arm_stream.next().unwrap();
			let fields = arm_stream.next().unwrap();
			let eq = arm_stream.next().unwrap();
			expect_punct(&eq, '=');
			let gt = arm_stream.next().unwrap();
			expect_punct(&gt, '>');
			let init = arm_stream.next().unwrap();

			let next_tok = arm_stream.peek();
			if let Some(TokenTree::Punct(_)) = next_tok {
				expect_punct(next_tok.unwrap(), ',');
				arm_stream.next();
			}

			let computed_fields = if let TokenTree::Group(group) = fields {
				process_fields(group)
			} else {
				panic!("Expected a group for the fields in a match arm");
			};

			let arm_pfx = [enum_ident, co1, co2, variant_ident];
			new_arms.extend(arm_pfx.into_iter().map(token_to_stream));
			new_arms.extend(computed_fields);
			let arm_sfx = [eq, gt, init];
			new_arms.extend(arm_sfx.into_iter().map(token_to_stream));
		}

		let new_arm_group = Group::new(Delimiter::Brace, new_arms);
		res.extend(token_to_stream(TokenTree::Group(new_arm_group)));
	} else {
		panic!("Expected `match self {{..}}` and nothing else");
	}

	assert!(stream.next().is_none(), "Expected `match self {{..}}` and nothing else");

	res
}

/// Scans an enum definition for fields initialized with `legacy` types and drops them.
///
/// This is used internally in LDK's TLV serialization logic and is not expected to be used by
/// other crates.
///
/// Is expected to wrap a struct definition like
/// ```ignore
/// drop_legacy_field_definition!(Self {
/// 	field1: $crate::_ignore_arg!(field1, option),
/// 	field2: $crate::_ignore_arg!(field2, (legacy, u64, {})),
/// })
/// ```
/// and will drop fields defined like `field2` with a type starting with `legacy`.
#[proc_macro]
pub fn drop_legacy_field_definition(expr: TokenStream) -> TokenStream {
	let mut st = if let Ok(parsed) = parse::<syn::Expr>(expr) {
		if let syn::Expr::Struct(st) = parsed {
			st
		} else {
			return (quote! {
				compile_error!("drop_legacy_field_definition!() can only be used on struct expressions")
			})
			.into();
		}
	} else {
		return (quote! {
			compile_error!("drop_legacy_field_definition!() can only be used on expressions")
		})
		.into();
	};
	assert!(st.attrs.is_empty());
	assert!(st.qself.is_none());
	assert!(st.dot2_token.is_none());
	assert!(st.rest.is_none());
	let mut new_fields = syn::punctuated::Punctuated::new();
	core::mem::swap(&mut new_fields, &mut st.fields);
	for field in new_fields {
		if let syn::Expr::Macro(syn::ExprMacro { mac, .. }) = &field.expr {
			let macro_name = mac.path.segments.last().unwrap().ident.to_string();
			let is_init = macro_name == "_ignore_arg";
Contributor:

I somehow would think that the ignore_macro would be expanded before the proc_macro would be able to view the macro name, if that makes sense. Any rules of thumb for how the macros get expanded? Tried to read this https://rustc-dev-guide.rust-lang.org/macro-expansion.html but it's a lot to take in

Collaborator Author:

Uhhh, yea, uhhh, no idea? I think its outside to inside? But I never looked, this just works in tests :)


			// Skip `field_name` and `:`, giving us just the type's group
			let ty_tokens = mac.tokens.clone().into_iter().skip(2).next();
			if let Some(proc_macro2::TokenTree::Group(group)) = ty_tokens {
				let first_token = group.stream().into_iter().next();
				if let Some(proc_macro2::TokenTree::Ident(ident)) = first_token {
					if is_init && ident == "legacy" {
						continue;
					}
				}
			}
		}
		st.fields.push(field);
	}
	let out = syn::Expr::Struct(st);
	quote! { #out }.into()
}
2 changes: 1 addition & 1 deletion lightning-transaction-sync/Cargo.toml
@@ -24,7 +24,7 @@ async-interface = []

[dependencies]
lightning = { version = "0.2.0", path = "../lightning", default-features = false, features = ["std"] }
-lightning-macros = { version = "0.1", path = "../lightning-macros", default-features = false }
+lightning-macros = { version = "0.2", path = "../lightning-macros", default-features = false }
bitcoin = { version = "0.32.2", default-features = false }
futures = { version = "0.3", optional = true }
esplora-client = { version = "0.11", default-features = false, optional = true }
1 change: 1 addition & 0 deletions lightning/Cargo.toml
@@ -35,6 +35,7 @@ default = ["std", "grind_signatures"]
[dependencies]
lightning-types = { version = "0.3.0", path = "../lightning-types", default-features = false }
lightning-invoice = { version = "0.34.0", path = "../lightning-invoice", default-features = false }
+lightning-macros = { version = "0.2", path = "../lightning-macros" }

bech32 = { version = "0.11.0", default-features = false }
bitcoin = { version = "0.32.2", default-features = false, features = ["secp-recovery"] }