Commit 247010d

fix argument error (PaddlePaddle#3030)
1 parent: 6ccc10a · commit: 247010d

File tree

1 file changed: +1, −1 lines changed

fastdeploy/model_executor/layers/moe/fused_moe_backend_base.py

Lines changed: 1 addition & 1 deletion
@@ -85,8 +85,8 @@ def init_ep(self, layer: nn.Layer) -> None:
                 layer.top_k,
                 layer.hidden_size,
                 layer.num_experts,
-                layer.moe_config.num_max_dispatch_tokens_per_rank,
                 layer.fd_config.parallel_config.splitwise_role,
+                layer.fd_config.model_config.num_max_dispatch_tokens_per_rank,
                 layer.ep_size,
                 layer.ep_rank,
                 layer.fd_config.model_config.redundant_experts_num,
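The one-line change does two things: it reads num_max_dispatch_tokens_per_rank from layer.fd_config.model_config instead of layer.moe_config, and it passes that value after layer.fd_config.parallel_config.splitwise_role rather than before it. Below is a minimal sketch of the positional-argument mismatch the reordering resolves; the class name EPRunnerSketch and its exact signature are assumptions for illustration only, not FastDeploy's actual API.

# Illustrative only: EPRunnerSketch and its signature are assumptions,
# not the actual runner class constructed in init_ep.
class EPRunnerSketch:
    def __init__(
        self,
        top_k: int,
        hidden_size: int,
        num_experts: int,
        splitwise_role: str,                    # role string from parallel_config
        num_max_dispatch_tokens_per_rank: int,  # integer from model_config
        ep_size: int,
        ep_rank: int,
        redundant_experts_num: int,
    ):
        self.splitwise_role = splitwise_role
        self.num_max_dispatch_tokens_per_rank = num_max_dispatch_tokens_per_rank


# Before the commit, the token count was passed in the slot that expects the
# role string (and vice versa). After the commit the call site matches the
# parameter order:
#
#   EPRunnerSketch(top_k, hidden_size, num_experts,
#                  splitwise_role, num_max_dispatch_tokens_per_rank,
#                  ep_size, ep_rank, redundant_experts_num)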
