How to use custom optimizer for analog update #542

@afmsaif

Description

Let's say I have a custom optimizer named Adabelief, and I want to use it to update the weights and biases of analog layers. How can I do that? One option I found is to use AnalogOptimizer, e.g. optimizer = AnalogOptimizer(Adabelief, model.parameters(), lr=0.5). The documentation states: "This class wraps an existing Optimizer, customizing the optimization step for triggering the analog update needed for analog tiles." Is this the correct approach for implementing a custom optimizer in analog training? And how does the AnalogOptimizer class actually work?
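From the quoted documentation, AnalogOptimizer wraps an existing optimizer class and customizes its step() so that the analog tile update is triggered as part of each optimization step. A minimal, library-free sketch of one way such a wrapper can work is below: a new class is built dynamically that inherits from the user-supplied optimizer class plus a mixin that hooks extra work into step(). All names here (ToyOptimizer, AnalogHookMixin, wrap_optimizer) are illustrative stand-ins, not aihwkit's actual code.

```python
class ToyOptimizer:
    """Stand-in for a digital optimizer such as Adabelief."""

    def __init__(self, params, lr=0.1):
        self.params = list(params)
        self.lr = lr
        self.steps_taken = 0

    def step(self):
        # A real optimizer would apply the gradient update here.
        self.steps_taken += 1


class AnalogHookMixin:
    """Adds an analog-update hook around the wrapped optimizer's step()."""

    analog_updates = 0

    def step(self):
        super().step()         # run the digital optimizer's update first
        self._analog_update()  # then trigger the (stand-in) analog update

    def _analog_update(self):
        # Placeholder for pushing the update onto the analog tiles.
        self.analog_updates += 1


def wrap_optimizer(optimizer_cls, *args, **kwargs):
    """Dynamically create a subclass combining the mixin with optimizer_cls,
    then instantiate it with the user's arguments."""
    wrapped_cls = type(
        "AnalogWrapped" + optimizer_cls.__name__,
        (AnalogHookMixin, optimizer_cls),
        {},
    )
    return wrapped_cls(*args, **kwargs)


# Usage mirrors the call shape from the question:
opt = wrap_optimizer(ToyOptimizer, [1.0, 2.0], lr=0.5)
opt.step()
print(opt.steps_taken, opt.analog_updates)  # 1 1
```

The point of the dynamic-subclass pattern is that the wrapped object still *is* an instance of the original optimizer class, so all of its hyperparameters and state handling keep working, while the mixin's step() transparently adds the analog update. If aihwkit follows this pattern, passing a custom optimizer class such as Adabelief to AnalogOptimizer should be the intended approach, though the maintainers can confirm the exact mechanics.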

Labels: question (further information is requested)