Conversation

@mmagician
Contributor

Milestone Delivery Checklist

  • The milestone-delivery-template.md has been copied and updated.
  • This pull request is being made by the same account as the accepted application.
  • I have disclosed any and all sources of reused code in the submitted repositories and have done my due diligence to meet its license requirements.
  • In case of acceptance, invoices must be submitted and payments will be transferred to the Polkadot AssetHub and/or fiat account provided in the application.
  • The delivery is according to the Guidelines for Milestone Deliverables.

Link to the application pull request: w3f/Grants-Program#2521

Contributor

@Lederstrumpf Lederstrumpf left a comment


Hi @ParsaTasbihgou (cc. @mmagician),

Thanks for the delivery - it's an interesting project.
Couple of notes and consequent action items from my review so far:

1.

Slight nit: For the delivery, could you please switch to referencing branches rather than tags?
I see the frontier work tagged m1 is at the head of the Add-shielding branch, but there's no dedicated branch for the EVM work.

Using branches rather than tags reduces overhead whenever changes need to be made to your m1 delivery in response to review, and it also makes changes to your work more transparent on GitHub without external tools. So I think this makes both our lives easier.

2.

The examples don't specify the npm dependencies used, see NP-Eng/frontier#4.
In general, I'd suggest testing your delivery in a clean environment before submitting, since that's the starting point of your reviewer/user; otherwise they're quite likely to run into issues.
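One low-effort way to address the missing-dependency point is to ship a `package.json` alongside the examples that pins the versions they were written against. The name below is hypothetical, and the `ethers` pin assumes the 5.8 line noted elsewhere in this review:

```json
{
  "name": "shielding-examples",
  "private": true,
  "dependencies": {
    "ethers": "^5.8.0"
  }
}
```

With this in place, a reviewer in a clean checkout can just run `npm install` in the examples directory instead of guessing at versions.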

3.

The shielding integration example (shielding-integration-example.js) doesn't work yet (missing byte in signatures, probably signature type); also see NP-Eng/frontier#4 for this.

4.

The shielding test (shielding-example.js) has a hardcoded outcome, irrespective of test failures (https://github.com/NP-Eng/frontier/blob/0b6c6842177251dce730bf77420a3b8c7522b150/examples/test-shielding.js#L249-L262):

```js
console.log('🔒 Shielding Pool Test Suite');
console.log('=============================\n');

await testShieldingPool();
await testAdvancedFeatures();

console.log('\n📊 Test Summary');
console.log('===============');
console.log('✅ Basic functionality tests passed');
console.log('✅ Advanced feature tests passed');
console.log('✅ Privacy properties verified');
console.log('✅ Gas estimation working');
console.log('✅ Event system functional');
console.log('\n🎯 Shielding pool is ready for production use!');
```

For example, in my case, with nothing listening on the ethers port 9933 specified in this test (see also 5. for the missing hardhat setup), both testShieldingPool and testAdvancedFeatures (i.e. all tests) fail, but the test still greets me with
🎯 Shielding pool is ready for production use!.
I hope we don't adhere to differing definitions of production 😉
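For illustration only, a minimal sketch of the kind of runner that would avoid this: tally failures and set a non-zero exit code instead of printing an unconditional success banner. The test names match the script's, but the stubs below are stand-ins, not the delivery's code:

```javascript
// Sketch: a runner that reports real results rather than a hardcoded banner.
async function runSuite(tests) {
  let failures = 0;
  for (const [name, fn] of tests) {
    try {
      await fn();
      console.log(`✅ ${name} passed`);
    } catch (err) {
      failures += 1;
      console.error(`❌ ${name} failed: ${err.message}`);
    }
  }
  if (failures > 0) {
    process.exitCode = 1; // e.g. nothing listening on port 9933 now fails loudly
    console.error(`\n${failures} test(s) failed`);
  } else {
    console.log('\n📊 All tests passed');
  }
  return failures;
}

// Stand-in stubs; the real suite would connect to the node first and
// abort early if the RPC endpoint is unreachable.
const tests = [
  ['testShieldingPool', async () => {}],
  ['testAdvancedFeatures', async () => {}],
];
runSuite(tests);
```

A CI job or reviewer invoking `node test-shielding.js` would then see a non-zero exit status whenever any test actually failed.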

5.

Following up on 4.:
I'm guessing the expectation is to run hardhat from the root of the frontier repo (the instructions don't specify this), but hardhat is not initialized in your delivery yet? (https://github.com/NP-Eng/frontier/blame/0b6c6842177251dce730bf77420a3b8c7522b150/docs/SHIELDING_POOL.md#L348-L356).
Likewise, your delivery is missing the "scripts/deploy.js" script specified in the instructions there; please add.

Please initialize hardhat in the delivery branch, given that your delivery seems to deviate from hardhat defaults.
For instance, your delivery requires initialization with ethers, whereas a reviewer/user would probably pick the NTR/Viem defaults of a Hardhat 3 init. And even with Hardhat 2, the init brings in ethers 6.4, whereas your delivery seems to still build on legacy ethers 5.8, as noted in NP-Eng/frontier#4. If you don't want to add this to the frontier repo directly, please provide a testing repo that is already set up.
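To make the version mismatch concrete: the v5→v6 breaking change most likely to bite a reviewer is the removal of the `ethers.providers` namespace. The endpoint URL and the `isEthersV5` helper below are illustrative assumptions, not part of the delivery:

```javascript
// Why a default Hardhat init (ethers v6) breaks examples written for v5:
//
// ethers v5 (what the delivery appears to target):
//   const provider = new ethers.providers.JsonRpcProvider('http://127.0.0.1:9933');
// ethers v6 (what a fresh init pulls in):
//   const provider = new ethers.JsonRpcProvider('http://127.0.0.1:9933');

// A rough runtime guard, assuming `ethers` is already imported:
function isEthersV5(ethers) {
  // v6 removed the `providers` namespace entirely
  return typeof ethers.providers !== 'undefined';
}

// Demonstrated with stubs so this snippet runs without ethers installed:
console.log(isEthersV5({ providers: {} })); // v5-shaped → true
console.log(isEthersV5({}));                // v6-shaped → false
```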

@ParsaTasbihgou

ParsaTasbihgou commented Sep 12, 2025

Hi @Lederstrumpf ,
Thanks for the comments, and apologies for the delayed response. Regarding the first comment: both the Frontier and EVM work have branches that are linked in the delivery document (Add-shielding and v0.x respectively). I don't think tags are used at all; please correct me if I'm missing something.
About the other comments: I believe the source of the problem is that these examples were never meant to be used externally; they were only for my own testing, and their ending up in the final submission is a mistake on my part. Please disregard them and refer to the Docker file for testing. These examples will be removed.

Contributor

@Lederstrumpf Lederstrumpf left a comment


Hi @ParsaTasbihgou, thanks for the update.

I don’t think tags are used at all, please correct me if I’m missing something.

Your delivery PR mostly links to the m1 tag. I've added suggestions below where to update this.


Please disregard them and refer to docker file for testing.

The docker file merely builds the node template, right?
I can run the node template fine outside of the docker file, and just providing a running node is not sufficient for testing.
Indeed, I can add notes to the MT fine via the shielding.addNote(note) extrinsic, but this should happen within the scope of an automated test.
Other tests I'd expect there (via specification in the grant application itself) are for instance verifying integrity of the MT being built (correct incrementation of nodes, correct calculation of MT root, ...).

Also, the docker instructions do not work:

```console
$ docker compose up
Attaching to frontier-node-1
Gracefully stopping... (press Ctrl+C again to force)
Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: exec: "/app/template/node/target/release/frontier-template-node": stat /app/template/node/target/release/frontier-template-node: no such file or directory: unknown
```

I'm guessing this worked for you because you built the node binary prior to invoking the docker build, but a docker image built in a clean environment will not work. I've opened a PR to fix this, but you'll also want to fix persistent storage of the chain state.
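The usual pattern for this failure is a multi-stage build that compiles the node inside the image, so a clean environment never depends on a pre-built host binary. Everything below (base images, the cargo package name, paths, the `/data` volume for chain state) is an assumed sketch of that pattern, not the delivery's or the fix PR's actual Dockerfile:

```dockerfile
# Stage 1: build the node binary inside the image (assumed package name).
FROM rust:1.75 AS builder
WORKDIR /app
COPY . .
RUN cargo build --release -p frontier-template-node

# Stage 2: slim runtime image with only the binary.
FROM debian:bookworm-slim
COPY --from=builder /app/target/release/frontier-template-node /usr/local/bin/
# Persist chain state across restarts (pair with a `volumes:` entry in compose).
VOLUME ["/data"]
ENTRYPOINT ["frontier-template-node", "--base-path", "/data"]
```

The `VOLUME`/`--base-path` pairing is one way to address the persistent-storage point as well.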

I'll just reiterate what I already wrote in my prior comment:

In general, I'd suggest testing your delivery in a clean environment before submitting, since that's the starting point of your reviewer/user; otherwise they're quite likely to run into issues.


About the other comments: I believe the source of the problem is that these examples were never meant to be used externally and are only meant for my own testing. Them ending up in the final submission is a mistake from my part. Please disregard them and refer to docker file for testing. These examples will be removed.

If

  1. these were tests unrelated to the functionality of M1's deliverables, or
  2. the repo had other/better tests to cover their scope

then sure.

But these tests (from the examples directory you have now removed) are all explicitly linked as the testing scope of your delivery PR:

Likewise, all of the unit tests covered by https://github.com/NP-Eng/frontier/blob/m1/docs/SHIELDING_POOL.md#unit-tests are pre-existing from the frontier fork, hence don't cover the shielding functionality you're adding.

So without tests, M1.7 is unfortunately unsatisfied.

So, please add the tests back and fix them, or rewrite them if necessary to cover the integration scope. Please then also link all tests that should be considered in the delivery document.

@github-actions github-actions bot added the stale label Sep 27, 2025
@semuelle semuelle removed the stale label Oct 1, 2025
@github-actions github-actions bot added the stale label Oct 16, 2025
@keeganquigley
Contributor

Hi @ParsaTasbihgou, is this ready for another look?

@github-actions github-actions bot added to close and removed stale labels Dec 31, 2025
@keeganquigley
Contributor

pinging @ParsaTasbihgou

@keeganquigley
Contributor

Hi @ParsaTasbihgou @mmagician, I'm now handling this review; let me know if it's ready for another look. Thanks!
