As things stand, only some layers in Flux have direct functional equivalents in NNlib - `maxpool` and `meanpool` do, for example, while the adaptive versions of those two layers don't. I know that they're only thin wrappers around `maxpool` and `meanpool` themselves, but would it be worth having functions that do the forward pass for these layers? While writing wrapper layers in Metalhead.jl it often feels weird to have entire Flux layers described inside of the forward pass of another layer. An example I'm working on right now is `adaptive_avgmax_pool`, currently written out as:
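A minimal sketch of the pattern (assuming Flux's `AdaptiveMeanPool` and `AdaptiveMaxPool` layers; not necessarily the exact Metalhead.jl code):

```julia
using Flux

# Illustrative sketch only: average the outputs of adaptive mean- and
# max-pooling, constructing Flux layers inside the forward pass.
function adaptive_avgmax_pool(x, output_size = (1, 1))
    return 0.5f0 .* (AdaptiveMeanPool(output_size)(x) .+ AdaptiveMaxPool(output_size)(x))
end
```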
While this works perfectly fine, it feels kinda odd to have a layer inside of a function which is going to be wrapped by a layer 😅 (this may just be a side-effect of having worked with `torch` quite a bit, so I'm not sure if this request is justified or not).
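Concretely, the kind of functional equivalent I have in mind would compute the pooling window and stride from the input and target output sizes and then hand off to NNlib's existing `meanpool`. A rough sketch (the `adaptive_meanpool` name and signature are purely illustrative, not an existing NNlib API):

```julia
using NNlib

# Hypothetical functional version: derive the window and stride from the
# input size and the requested output size, then call NNlib.meanpool.
function adaptive_meanpool(x, outsize)
    insize = size(x)[1:end-2]                # spatial dims (drop channel & batch)
    stride = insize .÷ outsize
    k = insize .- (outsize .- 1) .* stride   # window size matching the target output
    pdims = PoolDims(x, k; stride = stride, padding = 0)
    return meanpool(x, pdims)
end
```

With something like that in NNlib, the Flux layer would just be a thin wrapper around the function, mirroring how `MaxPool`/`maxpool` already relate.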
There are outstanding requests for creating or moving function-based implementations of many layers to NNlib (I should know, I created some of them!). I think everyone agrees it's a good idea, but there is a dearth of dev time to do it.