
[API Compatibility] Support arg closure for paddle.optimizer.optimizer.step #78570

Open
algorithm1832 wants to merge 6 commits into PaddlePaddle:develop from algorithm1832:improvement_optimizer_step

Conversation

@algorithm1832
Contributor

PR Category

User Experience

PR Types

New features

Description

  • Add arg closure for paddle.optimizer.optimizer.step
  • Add EN doc
  • Add test
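For context, the closure protocol this PR adds can be sketched with a hypothetical minimal optimizer in plain Python (MockOptimizer is illustrative only, not Paddle's actual class):

```python
# Hypothetical minimal optimizer illustrating the closure protocol:
# when step() receives a closure, it calls it (in the real Paddle code
# this happens under imperative_base.enable_grad()) and returns its loss.
class MockOptimizer:
    def step(self, closure=None):
        loss = None
        if closure is not None:
            loss = closure()  # re-evaluate the model and recompute the loss
        # ... the parameter update would happen here ...
        return loss

opt = MockOptimizer()

def closure():
    # in real code: clear grads, run the forward pass, call loss.backward()
    return 0.5  # stand-in for the recomputed loss value

print(opt.step())         # no closure: returns None
print(opt.step(closure))  # with closure: returns the loss, 0.5
```

The key behavioral change is that `step` now has a return value: `None` when called without arguments, or whatever the closure returns when one is supplied.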

Used AI Studio

Does this cause any precision changes?

@paddle-bot

paddle-bot bot commented Apr 2, 2026

Your PR has been submitted. Thanks for your contribution!
Please wait for the result of CI firstly. See Paddle CI Manual for details.

@paddle-bot paddle-bot bot added the contributor External developers label Apr 2, 2026
@codecov-commenter

codecov-commenter commented Apr 2, 2026

Codecov Report

❌ Patch coverage is 66.66667% with 7 lines in your changes missing coverage. Please review.
⚠️ Please upload report for BASE (develop@a35f006). Learn more about missing BASE report.

Files with missing lines Patch % Lines
python/paddle/optimizer/adamw.py 57.14% 3 Missing ⚠️
python/paddle/optimizer/optimizer.py 57.14% 3 Missing ⚠️
python/paddle/optimizer/adam.py 85.71% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             develop   #78570   +/-   ##
==========================================
  Coverage           ?   66.66%           
==========================================
  Files              ?        3           
  Lines              ?       21           
  Branches           ?        0           
==========================================
  Hits               ?       14           
  Misses             ?        7           
  Partials           ?        0           

☔ View full report in Codecov by Sentry.

>>> adam.step()
>>> adam.clear_grad()
"""
if closure is not None:

loss = None
if closure is not None:
    with imperative_base.enable_grad():
        loss = closure()

...

return loss

>>> adam.step()
>>> adam.clear_grad()

>>> # With closure

Pick one of the two usages and write each separately:

# usage 1: without closure
optimizer.zero_grad()
output = model(X)
loss = criterion(output, y)
loss.backward()
optimizer.step()

# usage 2: with closure
def closure():
    optimizer.zero_grad()
    output = model(X)
    loss = criterion(output, y)
    loss.backward()
    return loss

loss = optimizer.step(closure)
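A runnable plain-Python analogue of the two usages above, using a hypothetical one-parameter gradient-descent optimizer (TinySGD and loss_and_grad are illustrative stand-ins, not Paddle's API):

```python
# Hypothetical single-parameter SGD contrasting the two call styles.
class TinySGD:
    def __init__(self, lr=0.1):
        self.lr = lr
        self.w = 0.0        # the single parameter
        self.grad = 0.0     # its accumulated gradient

    def zero_grad(self):
        self.grad = 0.0

    def step(self, closure=None):
        loss = None
        if closure is not None:
            loss = closure()          # closure does zero_grad/forward/backward
        self.w -= self.lr * self.grad  # parameter update
        return loss

def loss_and_grad(opt):
    # loss(w) = (w - 2)^2, so d(loss)/dw = 2 * (w - 2)
    loss = (opt.w - 2.0) ** 2
    opt.grad += 2.0 * (opt.w - 2.0)
    return loss

# usage 1: without closure
opt1 = TinySGD()
opt1.zero_grad()
loss1 = loss_and_grad(opt1)
opt1.step()

# usage 2: with closure
opt2 = TinySGD()
def closure():
    opt2.zero_grad()
    return loss_and_grad(opt2)
loss2 = opt2.step(closure)

print(loss1, opt1.w)  # 4.0 0.4
print(loss2, opt2.w)  # 4.0 0.4
```

Both usages perform the same update; the closure form additionally hands the loss back through `step`, which is useful for optimizers that may re-evaluate the objective.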

... out = linear(a)
... loss = paddle.mean(out)
... loss.backward()
... return loss

Update the example code for each of the places where closure was added.

parameters=linear.parameters(),
)

def closure():
Contributor

@zhwesky2010 zhwesky2010 Apr 3, 2026


The unit tests should be more thorough: follow the logic of the example code and test a complete forward/backward pass.

zhwesky2010
zhwesky2010 previously approved these changes Apr 8, 2026
Contributor

@zhwesky2010 zhwesky2010 left a comment


LGTM

@zhwesky2010 zhwesky2010 requested a review from SigureMo April 8, 2026 10:02
SigureMo
SigureMo previously approved these changes Apr 8, 2026
@zhwesky2010
Contributor

Many CI checks haven't passed @algorithm1832

@algorithm1832 algorithm1832 dismissed stale reviews from SigureMo and zhwesky2010 via f5d0741 April 8, 2026 13:20
@algorithm1832
Contributor Author

Coverage didn't pass. First, the tests only cover adam, not adamw or the base optimizer class; second, the branch below isn't covered. I'll take another look.

if paddle.base.dygraph.base.in_to_static_mode():
    self._declarative_step()
    return loss

@algorithm1832
Contributor Author

if paddle.base.dygraph.base.in_to_static_mode():

For the condition above, I tried but couldn't get the code to enter this branch; after conversion to static graph, it seems the original step function is no longer called.

I'd like to ask @SigureMo how to write test coverage for this. Thanks!

@SigureMo
Member

For the condition above, I tried but couldn't get the code to enter this branch; after conversion to static graph, it seems the original step function is no longer called.

Under the current logic this branch is unreachable. If it's only this line that isn't covered, I think it can be exempted; for now, prioritize covering the other lines.

