Commit c7aa29e

fix bug
1 parent 8ed8e07 commit c7aa29e

1 file changed: 3 additions, 5 deletions


src/optimum/rbln/ops/moe.py

Lines changed: 3 additions & 5 deletions
@@ -101,11 +101,9 @@ def custom_moe_glu_fake(
     gate_proj_bias: Optional[Tensor] = None,
     up_proj_bias: Optional[Tensor] = None,
     down_proj_bias: Optional[Tensor] = None,
-
-    gate_proj_scale: Optional[Tensor] = None,
-    up_proj_scale: Optional[Tensor] = None,
-    down_proj_bias: Optional[Tensor] = None,
-
+    # gate_proj_scale: Optional[Tensor] = None,
+    # up_proj_scale: Optional[Tensor] = None,
+    # down_proj_scale: Optional[Tensor] = None,
 ) -> Tensor:
     return torch.empty_like(hidden_states)
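
The bug being fixed is the repeated `down_proj_bias` parameter: the third scale argument had evidently been added under the name `down_proj_bias`, duplicating the bias parameter three lines above it, and Python rejects duplicate parameter names at compile time. The commit comments out the unused scale parameters and corrects the third name to `down_proj_scale` in the comment. A minimal sketch (with a hypothetical function name `f`) showing why the duplicate could not even be imported:

    # Python refuses to compile a function with a repeated parameter name,
    # which is why the duplicated `down_proj_bias` line broke this module.
    src = (
        "def f(down_proj_bias=None, down_proj_bias=None):\n"
        "    pass\n"
    )
    try:
        compile(src, "<sketch>", "exec")
    except SyntaxError as err:
        print(err.msg)  # duplicate argument 'down_proj_bias' in function definition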

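For context, here is a sketch of how the fake (meta) kernel reads after this commit. Only `hidden_states` (used in the body), the bias parameters, and the commented-out scale parameters come from the diff; any leading parameters the real file declares before `gate_proj_bias` are not shown here, so treat the signature as illustrative rather than the actual optimum-rbln API.

    from typing import Optional

    import torch
    from torch import Tensor


    def custom_moe_glu_fake(
        hidden_states: Tensor,
        # ... leading parameters not visible in the diff are omitted ...
        gate_proj_bias: Optional[Tensor] = None,
        up_proj_bias: Optional[Tensor] = None,
        down_proj_bias: Optional[Tensor] = None,
        # gate_proj_scale: Optional[Tensor] = None,
        # up_proj_scale: Optional[Tensor] = None,
        # down_proj_scale: Optional[Tensor] = None,
    ) -> Tensor:
        # A fake (meta) implementation only needs to produce an output with
        # the right shape and dtype for tracing/compilation, not real values.
        return torch.empty_like(hidden_states)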