Custom initialization for nn layers? #1831
Do we have a way to do custom weight initialization for nn layers (conv, linear, norms, etc.)? For example, in tch-rs the configs had `ws_init` and `bs_init`. From what I have seen in the code, these are initialized locally in the functions/modules (for candle). Am I missing something?
We usually provide two ways to create nn layers, e.g. via `conv1d`, which would perform the initialization if required, or via `Conv1d::new`, which would take the weight tensors as inputs. `conv1d` calls `Conv1d::new`, and you can easily duplicate the related code to perform a different initialization if you want, e.g. just copy this function and tweak the "kaiming normal" bit to your liking.
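
For concreteness, here is a minimal sketch of what such a duplicated constructor could look like. It assumes the `candle-nn` `VarBuilder::get_with_hints` API and `Init` enum; exact names and signatures may differ across candle versions, and `conv1d_custom_init` is a hypothetical helper, not part of the library:

```rust
use candle_core::Result;
use candle_nn::{Conv1d, Conv1dConfig, Init, VarBuilder};

/// Hypothetical variant of `candle_nn::conv1d` that takes the weight and
/// bias initializers as parameters instead of hardcoding kaiming normal,
/// mirroring the `ws_init`/`bs_init` configs from tch-rs.
pub fn conv1d_custom_init(
    in_channels: usize,
    out_channels: usize,
    kernel_size: usize,
    cfg: Conv1dConfig,
    ws_init: Init,
    bs_init: Init,
    vb: VarBuilder,
) -> Result<Conv1d> {
    // Same weight shape as the stock constructor, but with a caller-chosen init.
    let ws = vb.get_with_hints(
        (out_channels, in_channels / cfg.groups, kernel_size),
        "weight",
        ws_init,
    )?;
    let bs = vb.get_with_hints(out_channels, "bias", bs_init)?;
    Ok(Conv1d::new(ws, Some(bs), cfg))
}
```

Calling it then looks much like the tch-rs configs, e.g. (again, a sketch under the same assumptions):

```rust
let conv = conv1d_custom_init(
    16, 32, 3,
    Conv1dConfig::default(),
    Init::Randn { mean: 0.0, stdev: 0.02 }, // custom weight init
    Init::Const(0.0),                        // zero-initialized bias
    vb.pp("conv"),
)?;
```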