chore: create and apply rustfmt.toml #213


Open · wants to merge 3 commits into base: master
2 changes: 1 addition & 1 deletion .github/pull_request_template.md
@@ -20,7 +20,7 @@ of the PR were done in a specific way -->

* [ ] I've signed all my commits
* [ ] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
- * [ ] I ran `cargo fmt` and `cargo clippy` before committing
+ * [ ] I ran `cargo +nightly fmt` and `cargo clippy` before committing

#### New Features:

5 changes: 2 additions & 3 deletions .github/workflows/cont_integration.yml
@@ -106,7 +106,6 @@ jobs:
run: cargo check --target wasm32-unknown-unknown --no-default-features --features miniscript/no-std,bdk_chain/hashbrown

fmt:
- needs: prepare
name: Rust fmt
runs-on: ubuntu-latest
steps:
@@ -117,12 +116,12 @@ jobs:
- name: Install Rust toolchain
uses: actions-rs/toolchain@v1
with:
- toolchain: ${{ needs.prepare.outputs.rust_version }}
+ toolchain: nightly
override: true
profile: minimal
components: rustfmt
- name: Check fmt
- run: cargo fmt --all -- --config format_code_in_doc_comments=true --check
+ run: cargo fmt --all --check

clippy_check:
needs: prepare
3 changes: 3 additions & 0 deletions rustfmt.toml
@@ -0,0 +1,3 @@
+ comment_width = 140
+ format_code_in_doc_comments = true
+ wrap_comments = true
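All three options in the new rustfmt.toml are unstable rustfmt settings, which is why the PR also switches CI and the checklist to `cargo +nightly fmt`: `wrap_comments = true` re-wraps `//` and `///` comment prose at `comment_width` (140 here), and `format_code_in_doc_comments = true` additionally formats fenced code inside doc comments. A minimal sketch of the effect (the comment text and `checksum` helper below are hypothetical stand-ins, not code from this repository):

```rust
// Before `cargo +nightly fmt` with this rustfmt.toml (one overlong doc line,
// unformatted code in the doc-comment fence):
//
//   /// Computes a checksum. This sentence keeps going well past the configured comment_width of 140 columns, so rustfmt folds it onto the next line.
//   /// ```
//   /// let x=checksum( "desc" );
//   /// ```
//
// After formatting, the prose is wrapped at 140 columns and the fenced code
// becomes `let x = checksum("desc");`.

/// Toy stand-in so this sketch compiles; not part of the PR.
fn checksum(desc: &str) -> usize {
    desc.len()
}

fn main() {
    println!("{}", checksum("desc"));
}
```

This mirrors the bulk of the diff below: almost every hunk is a long comment re-wrapped to the new width rather than a behavior change.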
6 changes: 3 additions & 3 deletions wallet/examples/compiler.rs
@@ -25,9 +25,9 @@ use bdk_wallet::{KeychainKind, Wallet};

/// Miniscript policy is a high level abstraction of spending conditions. Defined in the
/// rust-miniscript library here https://docs.rs/miniscript/7.0.0/miniscript/policy/index.html
- /// rust-miniscript provides a `compile()` function that can be used to compile any miniscript policy
- /// into a descriptor. This descriptor then in turn can be used in bdk a fully functioning wallet
- /// can be derived from the policy.
+ /// rust-miniscript provides a `compile()` function that can be used to compile any miniscript
+ /// policy into a descriptor. This descriptor then in turn can be used in bdk a fully functioning
+ /// wallet can be derived from the policy.
///
/// This example demonstrates the interaction between a bdk wallet and miniscript policy.
#[allow(clippy::print_stdout)]
8 changes: 4 additions & 4 deletions wallet/examples/policy.rs
@@ -20,11 +20,11 @@ use bdk_wallet::signer::SignersContainer;
///
/// Policy is higher abstraction representation of the wallet descriptor spending condition.
/// This is useful to express complex miniscript spending conditions into more human readable form.
- /// The resulting `Policy` structure can be used to derive spending conditions the wallet is capable
- /// to spend from.
+ /// The resulting `Policy` structure can be used to derive spending conditions the wallet is
+ /// capable to spend from.
///
- /// This example demos a Policy output for a 2of2 multisig between between 2 parties, where the wallet holds
- /// one of the Extend Private key.
+ /// This example demos a Policy output for a 2of2 multisig between between 2 parties, where the
+ /// wallet holds one of the Extend Private key.
#[allow(clippy::print_stdout)]
fn main() -> Result<(), Box<dyn Error>> {
let secp = bitcoin::secp256k1::Secp256k1::new();
3 changes: 2 additions & 1 deletion wallet/src/descriptor/checksum.rs
@@ -19,7 +19,8 @@ use alloc::string::String;

use miniscript::descriptor::checksum::desc_checksum;

- /// Compute the checksum of a descriptor, excludes any existing checksum in the descriptor string from the calculation
+ /// Compute the checksum of a descriptor, excludes any existing checksum in the descriptor string
+ /// from the calculation
pub fn calc_checksum(desc: &str) -> Result<String, DescriptorError> {
if let Some(split) = desc.split_once('#') {
let og_checksum = split.1;
32 changes: 20 additions & 12 deletions wallet/src/descriptor/dsl.rs
@@ -404,7 +404,8 @@ macro_rules! apply_modifier {
/// Macro to write full descriptors with code
///
/// This macro expands to a `Result` of
- /// [`DescriptorTemplateOut`](super::template::DescriptorTemplateOut) and [`DescriptorError`](crate::descriptor::DescriptorError)
+ /// [`DescriptorTemplateOut`](super::template::DescriptorTemplateOut) and
+ /// [`DescriptorError`](crate::descriptor::DescriptorError)
///
/// The syntax is very similar to the normal descriptor syntax, with the exception that modifiers
/// cannot be grouped together. For instance, a descriptor fragment like `sdv:older(144)` has to be
@@ -429,9 +430,10 @@ macro_rules! apply_modifier {
///
/// -------
///
- /// 2-of-3 that becomes a 1-of-3 after a timelock has expired. Both `descriptor_a` and `descriptor_b` are equivalent: the first
- /// syntax is more suitable for a fixed number of items known at compile time, while the other accepts a
- /// [`Vec`] of items, which makes it more suitable for writing dynamic descriptors.
+ /// 2-of-3 that becomes a 1-of-3 after a timelock has expired. Both `descriptor_a` and
+ /// `descriptor_b` are equivalent: the first syntax is more suitable for a fixed number of items
+ /// known at compile time, while the other accepts a [`Vec`] of items, which makes it more suitable
+ /// for writing dynamic descriptors.
///
/// They both produce the descriptor: `wsh(thresh(2,pk(...),s:pk(...),sndv:older(...)))`
///
@@ -672,8 +674,9 @@ macro_rules! fragment_internal {

/// Macro to write descriptor fragments with code
///
- /// This macro will be expanded to an object of type `Result<(Miniscript<DescriptorPublicKey, _>, KeyMap, ValidNetworks), DescriptorError>`. It allows writing
- /// fragments of larger descriptors that can be pieced together using `fragment!(thresh_vec(m, ...))`.
+ /// This macro will be expanded to an object of type `Result<(Miniscript<DescriptorPublicKey, _>,
+ /// KeyMap, ValidNetworks), DescriptorError>`. It allows writing fragments of larger descriptors
+ /// that can be pieced together using `fragment!(thresh_vec(m, ...))`.
///
/// The syntax to write macro fragment is the same as documented for the [`descriptor`] macro.
#[macro_export]
@@ -846,11 +849,13 @@ mod test {
}
}

- // - at least one of each "type" of operator; i.e. one modifier, one leaf_opcode, one leaf_opcode_value, etc.
+ // - at least one of each "type" of operator; i.e. one modifier, one leaf_opcode, one
+ // leaf_opcode_value, etc.
// - mixing up key types that implement IntoDescriptorKey in multi() or thresh()

// expected script for pk and bare manually created
- // expected addresses created with `bitcoin-cli getdescriptorinfo` (for hash) and `bitcoin-cli deriveaddresses`
+ // expected addresses created with `bitcoin-cli getdescriptorinfo` (for hash) and `bitcoin-cli
+ // deriveaddresses`

#[test]
fn test_fixed_legacy_descriptors() {
@@ -1105,7 +1110,8 @@
);
}

- // - verify the valid_networks returned is correctly computed based on the keys present in the descriptor
+ // - verify the valid_networks returned is correctly computed based on the keys present in the
+ // descriptor
#[test]
fn test_valid_networks() {
let xprv = bip32::Xpriv::from_str("tprv8ZgxMBicQKsPcx5nBGsR63Pe8KnRUqmbJNENAfGftF3yuXoMMoVJJcYeUw5eVkm9WBPjWYt6HMWYJNesB5HaNVBaFc1M6dRjWSYnmewUMYy").unwrap();
@@ -1162,7 +1168,8 @@
assert_eq!(key_map.get(&key3).unwrap().to_string(), "tprv8ZgxMBicQKsPdZXrcHNLf5JAJWFAoJ2TrstMRdSKtEggz6PddbuSkvHKM9oKJyFgZV1B7rw8oChspxyYbtmEXYyg1AjfWbL3ho3XHDpHRZf/10/20/30/40/*");
}

- // - verify the ScriptContext is correctly validated (i.e. passing a type that only impl IntoDescriptorKey<Segwitv0> to a pkh() descriptor should throw a compilation error
+ // - verify the ScriptContext is correctly validated (i.e. passing a type that only impl
+ // IntoDescriptorKey<Segwitv0> to a pkh() descriptor should throw a compilation error
#[test]
fn test_script_context_validation() {
// this compiles
@@ -1174,8 +1181,9 @@
assert_eq!(desc.to_string(), "pkh(tpubD6NzVbkrYhZ4WR7a4vY1VT3khMJMeAxVsfq9TBJyJWrNk247zCJtV7AWf6UJP7rAVsn8NNKdJi3gFyKPTmWZS9iukb91xbn2HbFSMQm2igY/0/*)#yrnz9pp2");

// as expected this does not compile due to invalid context
- //let desc_key:DescriptorKey<Segwitv0> = (xprv, path.clone()).into_descriptor_key().unwrap();
- //let (desc, _key_map, _valid_networks) = descriptor!(pkh(desc_key)).unwrap();
+ //let desc_key:DescriptorKey<Segwitv0> = (xprv,
+ // path.clone()).into_descriptor_key().unwrap(); let (desc, _key_map,
+ // _valid_networks) = descriptor!(pkh(desc_key)).unwrap();
}

#[test]
21 changes: 12 additions & 9 deletions wallet/src/descriptor/mod.rs
@@ -69,7 +69,8 @@ pub type HdKeyPaths = BTreeMap<secp256k1::PublicKey, KeySource>;
/// [`psbt::Output`]: bitcoin::psbt::Output
pub type TapKeyOrigins = BTreeMap<XOnlyPublicKey, (Vec<taproot::TapLeafHash>, KeySource)>;

- /// Trait for types which can be converted into an [`ExtendedDescriptor`] and a [`KeyMap`] usable by a wallet in a specific [`Network`]
+ /// Trait for types which can be converted into an [`ExtendedDescriptor`] and a [`KeyMap`] usable by
+ /// a wallet in a specific [`Network`]
pub trait IntoWalletDescriptor {
/// Convert to wallet descriptor
fn into_wallet_descriptor(
@@ -460,10 +461,11 @@ impl DescriptorMeta for ExtendedDescriptor {
// using `for_any_key` should make this stop as soon as we return `true`
self.for_any_key(|key| {
if let DescriptorPublicKey::XPub(xpub) = key {
- // Check if the key matches one entry in our `key_origins`. If it does, `matches()` will
- // return the "prefix" that matched, so we remove that prefix from the full path
- // found in `key_origins` and save it in `derive_path`. We expect this to be a derivation
- // path of length 1 if the key is `wildcard` and an empty path otherwise.
+ // Check if the key matches one entry in our `key_origins`. If it does, `matches()`
+ // will return the "prefix" that matched, so we remove that prefix
+ // from the full path found in `key_origins` and save it in
+ // `derive_path`. We expect this to be a derivation path of length 1
+ // if the key is `wildcard` and an empty path otherwise.
let root_fingerprint = xpub.root_fingerprint(secp);
let derive_path = key_origins
.get_key_value(&root_fingerprint)
@@ -478,10 +480,11 @@
.cloned()
.collect::<DerivationPath>();

- // `derive_path` only contains the replacement index for the wildcard, if present, or
- // an empty path for fixed descriptors. To verify the key we also need the normal steps
- // that come before the wildcard, so we take them directly from `xpub` and then append
- // the final index
+ // `derive_path` only contains the replacement index for the wildcard, if
+ // present, or an empty path for fixed descriptors.
+ // To verify the key we also need the normal steps
+ // that come before the wildcard, so we take them directly from `xpub` and
+ // then append the final index
if verify_key(
xpub,
&xpub.derivation_path.extend(derive_path.clone()),
35 changes: 23 additions & 12 deletions wallet/src/descriptor/policy.rs
@@ -237,7 +237,8 @@ fn mix<T: Clone>(vec: Vec<Vec<T>>) -> Vec<Vec<T>> {

/// Type for a map of sets of [`Condition`] items keyed by each set's index
pub type ConditionMap = BTreeMap<usize, HashSet<Condition>>;
- /// Type for a map of folded sets of [`Condition`] items keyed by a vector of the combined set's indexes
+ /// Type for a map of folded sets of [`Condition`] items keyed by a vector of the combined set's
+ /// indexes
pub type FoldedConditionMap = BTreeMap<Vec<usize>, HashSet<Condition>>;

fn serialize_folded_cond_map<S>(
@@ -363,14 +364,18 @@ impl Satisfaction {
if items.len() >= *m {
let mut map = BTreeMap::new();
let indexes = combinations(items, *m);
- // `indexes` at this point is a Vec<Vec<usize>>, with the "n choose k" of items (m of n)
+ // `indexes` at this point is a Vec<Vec<usize>>, with the "n choose k" of items (m
+ // of n)
indexes
.into_iter()
// .inspect(|x| println!("--- orig --- {:?}", x))
- // we map each of the combinations of elements into a tuple of ([chosen items], [conditions]). unfortunately, those items have potentially more than one
- // condition (think about ORs), so we also use `mix` to expand those, i.e. [[0], [1, 2]] becomes [[0, 1], [0, 2]]. This is necessary to make sure that we
+ // we map each of the combinations of elements into a tuple of ([chosen items],
+ // [conditions]). unfortunately, those items have potentially more than one
+ // condition (think about ORs), so we also use `mix` to expand those, i.e. [[0],
+ // [1, 2]] becomes [[0, 1], [0, 2]]. This is necessary to make sure that we
// consider every possible options and check whether or not they are compatible.
- // since this step can turn one item of the iterator into multiple ones, we use `flat_map()` to expand them out
+ // since this step can turn one item of the iterator into multiple ones, we use
+ // `flat_map()` to expand them out
.flat_map(|i_vec| {
mix(i_vec
.iter()
Expand All @@ -386,7 +391,8 @@ impl Satisfaction {
.collect::<Vec<(Vec<usize>, Vec<Condition>)>>()
})
// .inspect(|x| println!("flat {:?}", x))
- // try to fold all the conditions for this specific combination of indexes/options. if they are not compatible, try_fold will be Err
+ // try to fold all the conditions for this specific combination of
+ // indexes/options. if they are not compatible, try_fold will be Err
.map(|(key, val)| {
(
key,
@@ -503,15 +509,18 @@ impl Condition {
/// Errors that can happen while extracting and manipulating policies
#[derive(Debug, PartialEq, Eq)]
pub enum PolicyError {
- /// Not enough items are selected to satisfy a [`SatisfiableItem::Thresh`] or a [`SatisfiableItem::Multisig`]
+ /// Not enough items are selected to satisfy a [`SatisfiableItem::Thresh`] or a
+ /// [`SatisfiableItem::Multisig`]
NotEnoughItemsSelected(String),
- /// Index out of range for an item to satisfy a [`SatisfiableItem::Thresh`] or a [`SatisfiableItem::Multisig`]
+ /// Index out of range for an item to satisfy a [`SatisfiableItem::Thresh`] or a
+ /// [`SatisfiableItem::Multisig`]
IndexOutOfRange(usize),
/// Can not add to an item that is [`Satisfaction::None`] or [`Satisfaction::Complete`]
AddOnLeaf,
/// Can not add to an item that is [`Satisfaction::PartialComplete`]
AddOnPartialComplete,
- /// Can not merge CSV or timelock values unless both are less than or both are equal or greater than 500_000_000
+ /// Can not merge CSV or timelock values unless both are less than or both are equal or greater
+ /// than 500_000_000
MixedTimelockUnits,
/// Incompatible conditions (not currently used)
IncompatibleConditions,
@@ -642,8 +651,8 @@ impl Policy {
/// create a transaction
///
/// What this means is that for some spending policies the user should select which paths in
- /// the tree it intends to satisfy while signing, because the transaction must be created differently based
- /// on that.
+ /// the tree it intends to satisfy while signing, because the transaction must be created
+ /// differently based on that.
pub fn requires_path(&self) -> bool {
self.get_condition(&BTreeMap::new()).is_err()
}
@@ -1063,7 +1072,8 @@ pub enum BuildSatisfaction<'a> {
/// Current blockchain height
current_height: u32,
/// The highest confirmation height between the inputs
- /// CSV should consider different inputs, but we consider the worst condition for the tx as whole
+ /// CSV should consider different inputs, but we consider the worst condition for the tx as
+ /// whole
input_max_height: u32,
},
}
@@ -1633,6 +1643,7 @@
);
}

+ #[rustfmt::skip]
#[test]
fn test_extract_satisfaction_timelock() {
//const PSBT_POLICY_CONSIDER_TIMELOCK_NOT_EXPIRED: &str = "cHNidP8BAFMBAAAAAdld52uJFGT7Yde0YZdSVh2vL020Zm2exadH5R4GSNScAAAAAAD/////ATrcAAAAAAAAF6kUXv2Fn+YemPP4PUpNR1ZbU16/eRCHAAAAAAABASvI3AAAAAAAACIAILhzvvcBzw/Zfnc9ispRK0PCahxn1F6RHXTZAmw5tqNPAQVSdmNSsmlofCEDeAtjYQk/Vfu4db2+68hyMKjc38+kWl5sP5QH8L42Zsusk3whAvhhP8vi6bSPMZokerDnvffCBs8m6MdEH8+PgUJdZ5mIrJNShyIGAvhhP8vi6bSPMZokerDnvffCBs8m6MdEH8+PgUJdZ5mIDBwu7j4AAACAAAAAACIGA3gLY2EJP1X7uHW9vuvIcjCo3N/PpFpebD+UB/C+NmbLDMkRfC4AAACAAAAAAAAA";
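The final hunk of the diff adds `#[rustfmt::skip]` above `test_extract_satisfaction_timelock`, whose body contains extremely long base64 PSBT constants (one is visible, truncated, in the hunk context above). A plausible reading, sketched below with a truncated stand-in constant rather than the real vector: the attribute exempts a single item from formatting, so deliberately long lines survive `cargo +nightly fmt` with `wrap_comments = true` untouched.

```rust
// `#[rustfmt::skip]` tells rustfmt to leave one item's layout alone; useful for
// hand-aligned or deliberately overlong lines such as base64 PSBT test vectors.
#[rustfmt::skip]
const PSBT_VECTOR: &str = "cHNidP8BAFMBAAAA"; // truncated stand-in, not the real PSBT

fn main() {
    // The attribute affects formatting only, never runtime behavior.
    println!("{}", PSBT_VECTOR.len());
}
```
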