Let's say I have a custom optimizer named AdaBelief. Now I want to use this optimizer to update the weights and biases of analog layers. How can I do that? One option I found is to wrap it with AnalogOptimizer, like `optimizer = AnalogOptimizer(AdaBelief, model.parameters(), lr=0.5)`. The documentation says: "This class wraps an existing Optimizer, customizing the optimization step for triggering the analog update needed for analog tiles." Is this the correct approach for using a custom optimizer in analog training? How does this AnalogOptimizer class actually work?
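From the documentation snippet quoted above, my current understanding of the wrapping behavior is roughly the following (a minimal pure-Python sketch of the pattern only, not aihwkit's actual implementation; the names `AnalogOptimizerSketch`, `_analog_update`, and `DummyOpt` are mine, not the library's):

```python
class AnalogOptimizerSketch:
    """Wraps an existing optimizer class and hooks an analog update into step().

    Sketch of the pattern described in the docs: the wrapped (digital)
    optimizer runs its normal step, then the analog tiles are triggered
    to apply their in-memory update.
    """

    def __init__(self, optimizer_cls, *args, **kwargs):
        # Instantiate the wrapped optimizer exactly as one would normally,
        # e.g. optimizer_cls(model.parameters(), lr=0.5).
        self.wrapped = optimizer_cls(*args, **kwargs)

    def step(self):
        # Run the wrapped optimizer's usual parameter update first...
        self.wrapped.step()
        # ...then trigger the analog tile update (placeholder here; in
        # aihwkit this is where the analog-specific update would happen).
        self._analog_update()

    def _analog_update(self):
        # Placeholder for the analog-tile update step.
        pass


# Hypothetical minimal optimizer standing in for AdaBelief, just to show usage.
class DummyOpt:
    def __init__(self, params, lr=0.1):
        self.params = list(params)
        self.lr = lr
        self.steps_taken = 0

    def step(self):
        self.steps_taken += 1


opt = AnalogOptimizerSketch(DummyOpt, [1.0, 2.0, 3.0], lr=0.5)
opt.step()  # runs DummyOpt.step(), then the (stub) analog update
```

Is this roughly what AnalogOptimizer does internally, i.e. deferring the digital update to the wrapped optimizer class and then triggering the analog tiles?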