Hi,
When implementing a linear layer, the following code does not produce the expected output:

```julia
import Torch: tensor

W = collect(reshape(1.0f0:6.0f0, (3, 2)))
x = reshape([1.0f0; 1.0f0], (2, 1))

expected = W * x
output = tensor(W, dev = 0) * tensor(x, dev = 0)
```

I noticed that 2d and nd tensors are treated differently, and to me this looks like the root cause:
https://github.com/FluxML/Torch.jl/blob/master/src/tensor.jl#L143-L149
Is there a good reason to treat them differently?
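For reference, the CPU result the snippet should reproduce can be sketched in NumPy (an illustration, not part of the original report; note that Julia's `reshape` fills column-major, hence `order="F"`):

```python
import numpy as np

# Julia's reshape(1.0f0:6.0f0, (3, 2)) fills column-major,
# giving W = [1 4; 2 5; 3 6], so use Fortran order here.
W = np.reshape(np.arange(1.0, 7.0, dtype=np.float32), (3, 2), order="F")
x = np.ones((2, 1), dtype=np.float32)

expected = W @ x
print(expected.ravel())  # [5. 7. 9.]
```

The GPU-side `tensor(W, dev = 0) * tensor(x, dev = 0)` should match this `(3, 1)` result.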