First of all, I really enjoyed the paper and found it very interesting — thanks for sharing this work!
While reading the code, I noticed what seems to be an extra Euler-like update in the sampling loop:
pred_trajectory.append(x0)
delta_lamda = lamda_next - lamda
x = (sigma_next/sigma) * x + sigma_next * (delta_lamda) * x0
This runs after the main coefficient-weighted update, so it might be redundant and could overwrite the intended solver step.
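To make the concern concrete, here is a toy sketch of what I mean (the "main" solver update below uses placeholder coefficients, not the paper's actual ones — only the trailing Euler-like lines are taken from the snippet above):

```python
import math

def step_with_extra_update(x, x0, sigma, sigma_next, lamda, lamda_next):
    """Toy single step: a placeholder solver update followed by the
    extra Euler-like update from the snippet. Variable names mirror
    the snippet; the solver coefficients here are illustrative only."""
    # Main coefficient-weighted update (placeholder exponential-integrator form).
    x = (sigma_next / sigma) * (x + (math.exp(lamda_next - lamda) - 1.0) * x0)
    after_solver = x
    # Extra Euler-like update from the snippet: it rescales x by
    # sigma_next/sigma a second time, compounding on top of the
    # solver step rather than replacing or refining it.
    delta_lamda = lamda_next - lamda
    x = (sigma_next / sigma) * x + sigma_next * delta_lamda * x0
    return after_solver, x

after_solver, final = step_with_extra_update(
    x=1.0, x0=0.5, sigma=1.0, sigma_next=0.8, lamda=-1.0, lamda_next=-0.5
)
print(after_solver, final)  # the two values differ, so the second update changes the result
```

In other words, if both updates run, the effective scaling of `x` per step is `(sigma_next/sigma)**2` plus two separate `x0` contributions, which doesn't look like either a pure solver step or a pure Euler step.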
Is this behavior correct, or should the extra update be removed?
Thanks in advance! :)