
Remove SparseDiffTools.jl as dependency#137

Open
mx-p9a wants to merge 2 commits into byuflowlab:master from mx-p9a:update-sparse-diff-tools

Conversation


@mx-p9a mx-p9a commented Feb 11, 2026

Replace SparseDiffTools.jl with DifferentiationInterface.jl

Closes #136. Note: requires byuflowlab/ImplicitAD.jl#24.

Summary

Removes the deprecated SparseDiffTools.jl dependency and replaces it with DifferentiationInterface.jl (v0.7) + SparseConnectivityTracer.jl + SparseMatrixColorings.jl. Also replaces DifferentialEquations.jl with OrdinaryDiffEq in the test environment.

Motivation

SparseDiffTools.jl is deprecated and causes version conflicts with modern SciML packages (specifically SciMLOperators >= 1 required by recent OrdinaryDiffEq). The replacement ecosystem is DifferentiationInterface.jl with sparse AD support via SparseConnectivityTracer.jl and SparseMatrixColorings.jl.

Changes

Project.toml

  • Removed: SparseDiffTools
  • Added: DifferentiationInterface (0.7), SparseConnectivityTracer (1), SparseMatrixColorings (0.4)
  • Updated compat: ImplicitAD (1), SciMLBase (2.117-2)
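Based on the version numbers listed above, the resulting `[compat]` section would look roughly like this (a sketch of the entries touched by this PR, not the complete file):

```toml
[compat]
DifferentiationInterface = "0.7"
ImplicitAD = "1"
SciMLBase = "2.117 - 2"
SparseConnectivityTracer = "1"
SparseMatrixColorings = "0.4"
```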

src/GXBeam.jl

  • Replaced import SparseDiffTools with using DifferentiationInterface, import SparseConnectivityTracer, import SparseMatrixColorings

src/analyses.jl

Three functions were replaced:

  1. jacobian_colors() -- Deleted. This was dead code (never called anywhere, referenced undefined variables).

  2. autodiff_jacobian!() -- Replaced SparseDiffTools.forwarddiff_color_jacobian! with DI's AutoSparse(AutoForwardDiff(); ...) backend using prepare_jacobian / jacobian!. This is actually an improvement: the old code passed colorvec = 1:length(x) (dense coloring, never exploiting sparsity), while the new code automatically detects and exploits the sparsity pattern via SparseConnectivityTracer.

  3. matrixfree_jacobian() -- Replaced SparseDiffTools.JacVec with DI's prepare_pushforward / pushforward! wrapped in a LinearMap. This preserves the exact same interface (an operator supporting mul!) compatible with IterativeSolvers.gmres!.
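For readers unfamiliar with the DI 0.7 API, the two replacements could be sketched roughly as follows. Names like `SPARSE_AD_BACKEND` follow the PR description; treat the exact call signatures as approximate, and note that `constants.resid` is assumed to be a preallocated residual buffer:

```julia
using DifferentiationInterface  # re-exports AutoSparse / AutoForwardDiff from ADTypes
import SparseConnectivityTracer, SparseMatrixColorings
using LinearMaps: LinearMap

# Sparse backend: sparsity detection via SparseConnectivityTracer,
# coloring via SparseMatrixColorings
const SPARSE_AD_BACKEND = AutoSparse(
    AutoForwardDiff();
    sparsity_detector = SparseConnectivityTracer.TracerSparsityDetector(),
    coloring_algorithm = SparseMatrixColorings.GreedyColoringAlgorithm(),
)

# (2) sparse Jacobian: prepare (pattern detection + coloring), then fill in place
function autodiff_jacobian!(jacob, residual!, x, p, constants)
    f! = (r, x) -> residual!(r, x, p, constants)
    y = constants.resid
    prep = prepare_jacobian(f!, y, SPARSE_AD_BACKEND, x)
    jacobian!(f!, y, jacob, prep, SPARSE_AD_BACKEND, x)
    return jacob
end

# (3) matrix-free Jacobian-vector product wrapped in a LinearMap,
# so IterativeSolvers.gmres! sees an operator supporting mul!
function matrixfree_jacobian(residual!, x, p, constants)
    f! = (r, x) -> residual!(r, x, p, constants)
    y = constants.resid
    backend = AutoForwardDiff()
    prep = prepare_pushforward(f!, y, backend, x, (zero(x),))
    return LinearMap(length(x); ismutating=true) do Jv, v
        pushforward!(f!, y, (Jv,), prep, backend, x, (v,))
        return Jv
    end
end
```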

Test environment

  • Replaced DifferentialEquations.jl with OrdinaryDiffEq (the tests only use Rodas4() and DABDF2(), both from OrdinaryDiffEq)
  • Added BenchmarkTools for the benchmark script

Benchmark Results

Benchmark script at benchmark/benchmark.jl using BenchmarkTools.jl (10 samples each):

Static Joined-Wing (40 elements, 141 nonlinear load steps) -- matrixfree_jacobian

|        | Old (SparseDiffTools) | New (DI 0.7) |
|--------|----------------------:|-------------:|
| Median | 266.2 ms              | 257.8 ms     |
| Memory | 536.56 MiB            | 536.56 MiB   |
| Allocs | 57,345                | 57,258       |

Eigenvalue Analysis (40 elements, 18 sweeps x 3 RPMs) -- autodiff_jacobian!

|        | Old (SparseDiffTools) | New (DI 0.7) |
|--------|----------------------:|-------------:|
| Median | 661.1 ms              | 655.3 ms     |
| Memory | 643.70 MiB            | 643.70 MiB   |
| Allocs | 5,481,063             | 5,481,140    |

Time-Domain Wind Turbine (5 elements, 101 time steps) -- matrixfree_jacobian

|        | Old (SparseDiffTools) | New (DI 0.7) |
|--------|----------------------:|-------------:|
| Median | 44.2 ms               | 44.3 ms      |
| Memory | 79.70 MiB             | 79.70 MiB    |
| Allocs | 69,150                | 69,150       |

Performance is identical within measurement noise. Memory usage is unchanged.

Dependency note

This PR requires byuflowlab/ImplicitAD.jl#24, which removes the unused DifferentiationInterface dependency from ImplicitAD (it was listed as a dep but never imported in any source file). Without that change, ImplicitAD's `DI = "0.6"` compat bound prevents resolving DI 0.7.

Test results

All 33 test groups pass, 0 failures.

```julia
function autodiff_jacobian!(jacob, residual!, x, p, constants)
    f! = (r, x) -> residual!(r, x, p, constants)
    y = constants.resid
    prep = prepare_jacobian(f!, y, SPARSE_AD_BACKEND, x)
```

@gdalle gdalle Feb 12, 2026


The whole point of preparation is to do it just once and reuse it several times. Is there any way to store the prep object you get here to avoid recomputing it? This is especially important for sparse AD, where sparsity detection is very expensive.

Member


That's a good point. Storing the prep object shouldn't be too difficult; I'll just need to check what the optimal behavior is. I don't know how the sparsity pattern changes from function call to function call. If I remember correctly, SparseConnectivityTracer captures the full sparsity pattern for a given branch in the code, right? I'll need to check in the code whether that's going to create an overly dense matrix.

Author


That makes sense. I'll defer to @Cardoza2 on how he would like to manage this. Perhaps it could be part of the constants object?

Member


Yeah, that seems like a decent place for it.
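A possible shape for that, assuming a hypothetical `jac_prep` field added to the constants object (struct and field names here are illustrative, not the actual GXBeam code):

```julia
# Hypothetical: cache the DI preparation result alongside the other constants,
# so sparsity detection and coloring run once instead of on every Jacobian call.
mutable struct AnalysisConstants{R}
    resid::R
    jac_prep::Any   # DI preparation object, `nothing` until first use
end

function autodiff_jacobian!(jacob, residual!, x, p, constants)
    f! = (r, x) -> residual!(r, x, p, constants)
    y = constants.resid
    if constants.jac_prep === nothing
        # expensive step: sparsity detection + coloring, done only once
        constants.jac_prep = prepare_jacobian(f!, y, SPARSE_AD_BACKEND, x)
    end
    jacobian!(f!, y, jacob, constants.jac_prep, SPARSE_AD_BACKEND, x)
    return jacob
end
```

One caveat from the discussion above: the cached preparation is only valid while the traced sparsity pattern holds, so the cache would need invalidating if the problem structure can change between calls.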

Author

mx-p9a commented Feb 12, 2026

I suspect the CI failure is because byuflowlab/ImplicitAD.jl#24 needs to be merged first (with a new release).

@Cardoza2
Member

Oh yeah, duh... that makes sense. I'll run the tests on ImplicitAD and make sure things are good over there, then rerun the tests here.



Development

Successfully merging this pull request may close these issues.

Remove deprecated SparseDiffTools.jl as dependency

3 participants