Conversation
Pull request overview
Adds a new `sc-meta source` command group to generate reproducible "source packs" (`*.source.json`) for smart-contract projects, and refactors local-deps logic so it can be reused by the new packer.
Changes:
- Introduces `framework/meta/src/cmd/source.rs`, implementing `sc-meta source pack` to discover contracts, collect source files, and emit a packaged JSON with base64-encoded contents.
- Refactors `local_deps` to expose reusable helpers (`compute_local_deps`, `expand_deps`, `serialize_local_deps_json`) and centralizes the `local_deps.txt` filename.
- Moves the `base64` crate dependency from `multiversx-sc-snippets` to `multiversx-sc-meta` (and updates `Cargo.lock` accordingly).
Reviewed changes
Copilot reviewed 8 out of 9 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| `framework/snippets/Cargo.toml` | Removes the now-unneeded `base64` dependency from the snippets crate. |
| `framework/meta/src/cmd/source.rs` | New source packaging implementation (source pack). |
| `framework/meta/src/cmd/local_deps.rs` | Exposes reusable local dependency computation for source pack. |
| `framework/meta/src/cmd.rs` | Registers the new `source` command module. |
| `framework/meta/src/cli/cli_standalone_main.rs` | Wires the new CLI route for `sc-meta source ...`. |
| `framework/meta/src/cli/cli_args_standalone.rs` | Adds the `source` command and subcommands (`local-deps`, `pack`) plus `PackArgs`. |
| `framework/meta/Cargo.toml` | Adds the `base64` dependency needed by the source packer. |
| `framework/meta-lib/src/cargo_toml/cargo_toml_contents.rs` | Improves panic messages and adds `package_version()`. |
| `Cargo.lock` | Updates the lockfile for the dependency move/addition. |
```rust
let rel = pathdiff::diff_paths(file, project_folder).unwrap();
let path_str = rel.to_string_lossy().replace('\\', "/");
```
`make_entry` unconditionally unwraps `pathdiff::diff_paths(file, project_folder)`. This will panic when a local dependency (or any collected file) is outside `project_folder` (e.g. `path = "../shared"`). Consider falling back to an absolute path (or a module-relative path) when `diff_paths` returns `None`, similar to `module_path`, so source packing works for out-of-tree dependencies.
Suggested change:

```diff
-let rel = pathdiff::diff_paths(file, project_folder).unwrap();
-let path_str = rel.to_string_lossy().replace('\\', "/");
+let path_str = pathdiff::diff_paths(file, project_folder)
+    .unwrap_or_else(|| file.to_path_buf())
+    .to_string_lossy()
+    .replace('\\', "/");
```
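The failure mode and the fallback can be illustrated with the standard library alone. The sketch below uses `Path::strip_prefix` as a stand-in for `pathdiff::diff_paths` (it cannot produce `..` components, but it fails in the same out-of-tree case), and `make_rel_path` is a hypothetical helper, not part of the PR:

```rust
use std::path::Path;

// Hypothetical helper mirroring the suggested fix: prefer a
// project-relative path, fall back to the file's own path when the
// file lives outside the project folder (e.g. `path = "../shared"`).
fn make_rel_path(file: &Path, project_folder: &Path) -> String {
    let rel = file.strip_prefix(project_folder).unwrap_or(file);
    rel.to_string_lossy().replace('\\', "/")
}

fn main() {
    let project = Path::new("/repo/contract");
    // In-tree file: relative path.
    assert_eq!(
        make_rel_path(Path::new("/repo/contract/src/lib.rs"), project),
        "src/lib.rs"
    );
    // Out-of-tree dependency: falls back to the full path instead of panicking.
    assert_eq!(
        make_rel_path(Path::new("/repo/shared/src/lib.rs"), project),
        "/repo/shared/src/lib.rs"
    );
    println!("ok");
}
```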
```rust
let mut dep_map: BTreeMap<PathBuf, LocalDep> = BTreeMap::new();
expand_deps(&contract_dir, contract_dir.clone(), &mut dep_map);

let common_dependency_path = common_path_all(dep_map.keys().map(|p| p.as_path()));

LocalDeps {
    root: contract_dir.clone(),
```
`compute_local_deps` calls `expand_deps(&contract_dir, ...)`, but `expand_deps` computes `child_path = pathdiff::diff_paths(&full_path, root_path).unwrap()`. If any local dependency resolves outside `contract_dir` (common with `path = "../..."`), this will panic and break source pack. Either make `expand_deps` tolerate `diff_paths == None` (e.g. store absolute paths / a separate canonical path field) or allow `compute_local_deps` to accept a workspace root and use that as `root_path`.
Suggested change:

```diff
-let mut dep_map: BTreeMap<PathBuf, LocalDep> = BTreeMap::new();
-expand_deps(&contract_dir, contract_dir.clone(), &mut dep_map);
-let common_dependency_path = common_path_all(dep_map.keys().map(|p| p.as_path()));
-LocalDeps {
-    root: contract_dir.clone(),
+let root_path = contract_dir
+    .ancestors()
+    .last()
+    .unwrap_or(contract_dir.as_path())
+    .to_path_buf();
+let mut dep_map: BTreeMap<PathBuf, LocalDep> = BTreeMap::new();
+expand_deps(&root_path, contract_dir.clone(), &mut dep_map);
+let common_dependency_path = common_path_all(dep_map.keys().map(|p| p.as_path()));
+LocalDeps {
+    root: root_path,
```
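One detail of the suggested diff worth noting: `Path::ancestors` yields the path itself, then each parent, ending at the prefix, so `ancestors().last()` always resolves to the outermost ancestor — the filesystem root for absolute paths. A std-only check (Unix-style paths assumed):

```rust
use std::path::Path;

fn main() {
    let contract_dir = Path::new("/repo/workspace/contract");
    // Ancestors: "/repo/workspace/contract", "/repo/workspace", "/repo", "/".
    // `last()` is therefore the filesystem root, not the workspace root.
    let root_path = contract_dir.ancestors().last().unwrap_or(contract_dir);
    assert_eq!(root_path, Path::new("/"));
    println!("{}", root_path.display());
}
```

With that in mind, passing an explicit workspace root into `compute_local_deps` (the comment's second option) may give more useful relative paths than widening `root_path` all the way to `/`.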
```rust
if let Ok(read_dir) = fs::read_dir(current) {
    for entry in read_dir.flatten() {
        let path = entry.path();
        if path.is_dir() && path.file_name().map(|n| n != "target").unwrap_or(true) {
```
Directory recursion here only skips `target`. Unlike `RelevantDirectories` (which avoids hidden folders), this will traverse hidden directories like `.git` and can become very slow on large repos. Consider also skipping entries whose names start with `.` (and possibly other well-known build/output folders) to keep discovery fast and consistent with the rest of the meta tooling.
Suggested change:

```diff
-if path.is_dir() && path.file_name().map(|n| n != "target").unwrap_or(true) {
+let should_recurse = path.is_dir()
+    && path
+        .file_name()
+        .and_then(|n| n.to_str())
+        .map(|name| name != "target" && !name.starts_with('.'))
+        .unwrap_or(true);
+if should_recurse {
```
```rust
fn collect_recursive(current: &Path, result: &mut Vec<PathBuf>) {
    let entries = match fs::read_dir(current) {
        Ok(e) => e,
        Err(_) => return,
    };
    for entry in entries.flatten() {
        let path = entry.path();
        if path.is_dir() {
            if path.file_name().map(|n| n != "target").unwrap_or(true) {
                collect_recursive(&path, result);
            }
        } else if is_source_file(&path) {
            result.push(path);
```
`collect_recursive` recurses into all subdirectories except `target`, including hidden folders (e.g. `.git`) and other large trees. This can cause significant overhead even though only a few files are ultimately collected. Consider applying the same hidden-folder skip used elsewhere in this crate (and optionally skipping known output directories like `output`) during traversal.
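A std-only sketch of the suggested skip; `should_recurse` and `collect_rs_files` are hypothetical names (the real code uses `is_source_file`), and the temporary-tree setup exists only for demonstration:

```rust
use std::fs;
use std::path::{Path, PathBuf};

// Returns true when a directory should be descended into: skips
// `target`, `output`, and hidden (dot-prefixed) folders such as `.git`.
fn should_recurse(path: &Path) -> bool {
    path.file_name()
        .and_then(|n| n.to_str())
        .map(|name| name != "target" && name != "output" && !name.starts_with('.'))
        .unwrap_or(true)
}

fn collect_rs_files(current: &Path, result: &mut Vec<PathBuf>) {
    let Ok(entries) = fs::read_dir(current) else { return };
    for entry in entries.flatten() {
        let path = entry.path();
        if path.is_dir() {
            if should_recurse(&path) {
                collect_rs_files(&path, result);
            }
        } else if path.extension().map(|e| e == "rs").unwrap_or(false) {
            result.push(path);
        }
    }
}

fn main() {
    // Tiny temporary tree: src/lib.rs is kept, .git/ and target/ are skipped.
    let base = std::env::temp_dir().join("source_pack_demo");
    let _ = fs::remove_dir_all(&base);
    fs::create_dir_all(base.join("src")).unwrap();
    fs::create_dir_all(base.join(".git")).unwrap();
    fs::create_dir_all(base.join("target")).unwrap();
    fs::write(base.join("src/lib.rs"), "// code").unwrap();
    fs::write(base.join(".git/config.rs"), "// not source").unwrap();
    fs::write(base.join("target/gen.rs"), "// build output").unwrap();

    let mut files = Vec::new();
    collect_rs_files(&base, &mut files);
    assert_eq!(files.len(), 1);
    assert!(files[0].ends_with("src/lib.rs"));
    println!("collected {} file(s)", files.len());
}
```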
```rust
pub fn source_pack(args: &PackArgs) {
    let project_folder = if let Some(p) = &args.path {
        Path::new(p).canonicalize().unwrap()
    } else {
        Path::new(".").canonicalize().unwrap()
    };

    let contract_folders = find_contract_folders(&project_folder);
    if contract_folders.is_empty() {
        println!(
            "No contracts found (no multiversx.json) under: {}",
            project_folder.display()
        );
        return;
    }

    for contract_folder in &contract_folders {
        source_pack_contract(&project_folder, contract_folder, args.contract.as_deref());
    }
}
```
The new source pack command introduces non-trivial behavior (contract discovery, dependency expansion, file ordering, base64 encoding) but there are no tests covering the output format/ordering. Adding a small integration/unit test that builds a temporary project structure and asserts the produced `.source.json` schema/order would help prevent regressions.
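A possible starting point, sketched with the standard library only. `find_contract_folders` here is a simplified hypothetical stand-in for the real discovery (a real test would invoke `source_pack` and parse the emitted `*.source.json`), but it shows the shape: build a temporary project, discover contracts by their `multiversx.json` marker, and assert a deterministic order:

```rust
use std::fs;
use std::path::{Path, PathBuf};

// Simplified, hypothetical discovery: a folder is a contract if it
// contains `multiversx.json`. Results are sorted so the test can
// assert a stable order regardless of directory-iteration order.
fn find_contract_folders(root: &Path) -> Vec<PathBuf> {
    let mut found = Vec::new();
    if root.join("multiversx.json").is_file() {
        found.push(root.to_path_buf());
    }
    if let Ok(entries) = fs::read_dir(root) {
        for entry in entries.flatten() {
            let path = entry.path();
            if path.is_dir() {
                found.extend(find_contract_folders(&path));
            }
        }
    }
    found.sort();
    found
}

fn main() {
    // Temporary project with two contract folders.
    let base = std::env::temp_dir().join("source_pack_test");
    let _ = fs::remove_dir_all(&base);
    for name in ["b-contract", "a-contract"] {
        fs::create_dir_all(base.join(name)).unwrap();
        fs::write(base.join(name).join("multiversx.json"), "{}").unwrap();
    }

    let folders = find_contract_folders(&base);
    assert_eq!(folders.len(), 2);
    // Sorted order is stable, which is what a schema/order test would pin down.
    assert!(folders[0].ends_with("a-contract"));
    assert!(folders[1].ends_with("b-contract"));
    println!("discovered {} contracts", folders.len());
}
```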
Contract comparison - from eb3672b to dfffb2e