[WIP] Implement Tensor::unfold(dim, size, step): 1-dim unfolded window views. #3091
Conversation
This manipulates layout to unfold a window along a dimension. See: https://docs.pytorch.org/docs/2.8/generated/torch.Tensor.unfold.html

TODO: I need help with `backprop.rs::Tensor::backward()`.
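For reference, a minimal sketch of the semantics the PR targets, on a plain 1-D buffer rather than a real tensor (the `unfold_1d` helper below is illustrative, not candle API). It mirrors `torch.Tensor.unfold`: the window count is `(len - size) / step + 1`, and element `k` of window `w` reads `input[w * step + k]`.

```rust
// Illustrative sketch of unfold(dim, size, step) semantics on a 1-D buffer.
// Mirrors torch.Tensor.unfold: windows = (len - size) / step + 1,
// and window w, element k reads input[w * step + k].
fn unfold_1d(input: &[i32], size: usize, step: usize) -> Vec<Vec<i32>> {
    assert!(size <= input.len() && step > 0);
    let windows = (input.len() - size) / step + 1;
    (0..windows)
        .map(|w| (0..size).map(|k| input[w * step + k]).collect())
        .collect()
}

fn main() {
    let x: Vec<i32> = (1..=7).collect();
    // Matches the torch docs example: arange(1., 8.).unfold(0, 2, 2)
    assert_eq!(unfold_1d(&x, 2, 2), vec![vec![1, 2], vec![3, 4], vec![5, 6]]);
    println!("{:?}", unfold_1d(&x, 2, 2));
}
```

In the actual PR this is done without copying, by manipulating strides, but the index arithmetic is the same.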
I'm new to candle and don't know how to write the backprop code; I could use a suggested impl or some pointers.
A heavier-handed approach would be something like `Tensor::with_layout()`, which I understand is a scary API to provide, though it is actually pretty simple to validate: any given layout has an easy-to-compute max and min ravel offset, and the test is just "is that inside the storage size?"
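The validation argument above can be sketched as follows (function names are hypothetical, not candle's `Layout` API): each dimension contributes its extreme index independently, so the min and max linear offsets fall out of one pass over shape and strides.

```rust
// Hypothetical sketch of the "easy to compute max and min ravel offset" check.
// For a layout (shape, strides, start offset), each dimension's last index
// contributes strides[i] * (shape[i] - 1) to one of the two extremes.
fn offset_bounds(shape: &[usize], strides: &[isize], start: usize) -> (isize, isize) {
    let mut min = start as isize;
    let mut max = start as isize;
    for (&d, &s) in shape.iter().zip(strides) {
        let extent = s * (d as isize - 1);
        if extent < 0 { min += extent } else { max += extent }
    }
    (min, max)
}

fn layout_fits(shape: &[usize], strides: &[isize], start: usize, storage_len: usize) -> bool {
    if shape.iter().any(|&d| d == 0) {
        return true; // an empty tensor touches no storage
    }
    let (min, max) = offset_bounds(shape, strides, start);
    min >= 0 && (max as usize) < storage_len
}

fn main() {
    // A contiguous 3x2 layout fits exactly in a storage of 6 elements.
    assert!(layout_fits(&[3, 2], &[2, 1], 0, 6));
    assert!(!layout_fits(&[3, 2], &[2, 1], 0, 5));
    // Negative strides work too: a reversed view starting at offset 2.
    assert!(layout_fits(&[3], &[-1], 2, 3));
    println!("ok");
}
```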
- switched to pytorch's return shape.
- added burn-router
- exposed unfold calculation module.
- ndarray and candle both need either upstream support or work-arounds. candle has a PR in-flight (from me): huggingface/candle#3091
I have a broken impl of backprop; I'm trying to debug it, but it never seems to run: there's an assert in it that never triggers.
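As a possible starting point for the backward pass (a hedged sketch, not the PR's code): since output element `(w, k)` of a 1-D unfold reads `input[w * step + k]`, its gradient accumulates back to that index, and overlapping windows (when `step < size`) sum their contributions.

```rust
// Illustrative backward pass for a 1-D unfold: scatter-add each output
// gradient back to the input position it was read from. With step < size,
// overlapping windows contribute to the same input element.
fn unfold_1d_backward(grad_out: &[Vec<f64>], input_len: usize, step: usize) -> Vec<f64> {
    let mut grad_in = vec![0.0; input_len];
    for (w, window) in grad_out.iter().enumerate() {
        for (k, g) in window.iter().enumerate() {
            grad_in[w * step + k] += g;
        }
    }
    grad_in
}

fn main() {
    // size=2, step=1 over 4 inputs gives 3 windows; the two middle
    // elements are each read twice, so their gradients double up.
    let g = vec![vec![1.0, 1.0], vec![1.0, 1.0], vec![1.0, 1.0]];
    assert_eq!(unfold_1d_backward(&g, 4, 1), vec![1.0, 2.0, 2.0, 1.0]);
}
```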
@ivarflakstad ping?
* [WIP] towards pytorch.unfold()
* torch
* Expand unfold impl.
  - switched to pytorch's return shape.
  - added burn-router
  - exposed unfold calculation module.
  - ndarray and candle both need either upstream support or work-arounds. candle has a PR in-flight (from me): huggingface/candle#3091
* docs
* Implement `unfold` operation for tensor backends.
  - Added `unfold` function for `candle`, `ndarray` backends with a copy-based implementation.
  - Updated function docs and tensor operation traits accordingly.
  - Incorporated window shape calculation for `unfold`.
* Simplify `into_ranges` call for tensor shape calculation.
* Remove redundant field repetition in `UnfoldOpIr` initialization.
* book
* Update `slice` implementation to use step-aware slice arguments in `unfold`.
* rustfmt; fix rebase errors.
* Optimize `unfold4d` implementation for zero-padding and unit-dilation cases. Update imports.
* Refactor `unfold` implementation by introducing `calculate_unfold_shape`.
  - Replaced `calculate_unfold_windows` with `calculate_unfold_shape` across tensor backends.
  - Updated function documentation to reflect the new shape calculation.
  - Added unit tests for `calculate_unfold_shape`.
  - Simplified shape handling in tensor unfold operations.
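A sketch of what a shape helper along these lines computes, following the PyTorch return shape the commits mention (the actual `calculate_unfold_shape` in the PR may differ): the unfolded dimension shrinks to the window count, and a new trailing dimension of length `size` is appended.

```rust
// Hypothetical shape helper matching torch.Tensor.unfold's return shape:
// shape[dim] becomes the window count, and `size` is appended as a new
// trailing dimension.
fn calculate_unfold_shape(shape: &[usize], dim: usize, size: usize, step: usize) -> Vec<usize> {
    assert!(dim < shape.len() && size <= shape[dim] && step > 0);
    let mut out = shape.to_vec();
    out[dim] = (shape[dim] - size) / step + 1;
    out.push(size);
    out
}

fn main() {
    // A length-7 vector unfolded with size=2, step=2 yields 3 windows of 2.
    assert_eq!(calculate_unfold_shape(&[7], 0, 2, 2), vec![3, 2]);
    // Unfolding dim 1 of a [4, 6] tensor with size=3, step=1 yields [4, 4, 3].
    assert_eq!(calculate_unfold_shape(&[4, 6], 1, 3, 1), vec![4, 4, 3]);
}
```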
Hi @crutcher, in your test, instead of: … you should do: … This sets the … With this change …