The NPU output length is set to 2048 by default. #966
base: master
Conversation
@@ -22,6 +22,19 @@ package genai
typedef int (*callback_function)(const char*, void*);

extern int goCallbackBridge(char* input, void* ptr);

static ov_status_e ov_genai_llm_pipeline_create_npu_output_2048(const char* models_path,
This should not be hardcoded to 2048. It should be updated to take max_prompt_len and min_response_len as variables so that a larger range of sizes can be supported.
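A rough sketch of what a configurable variant could look like, assuming ov_genai_llm_pipeline_create accepts variadic string key/value property pairs (as in the OpenVINO C API) and that MAX_PROMPT_LEN / MIN_RESPONSE_LEN are the relevant NPU properties; the helper name and exact signature here are illustrative, not taken from this PR:

```c
#include <stdio.h>
#include <openvino/genai/c/llm_pipeline.h>

/* Illustrative replacement for the hard-coded helper: the caller picks the
 * prompt/response budget instead of baking 2048 into the function name. */
static ov_status_e ov_genai_llm_pipeline_create_npu(const char* models_path,
                                                    size_t max_prompt_len,
                                                    size_t min_response_len,
                                                    ov_genai_llm_pipeline** pipeline) {
    char max_prompt[32];
    char min_response[32];
    snprintf(max_prompt, sizeof(max_prompt), "%zu", max_prompt_len);
    snprintf(min_response, sizeof(min_response), "%zu", min_response_len);

    /* Pass the NPU static-pipeline properties at creation time; 4 is the
     * number of variadic key/value arguments that follow. */
    return ov_genai_llm_pipeline_create(models_path, "NPU", 4, pipeline,
                                        "MAX_PROMPT_LEN", max_prompt,
                                        "MIN_RESPONSE_LEN", min_response);
}
```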
Hi @gblong1
To my knowledge, the default output length for the NPU is 1024, and it is not dynamically adjustable. For reference, see the following link:
genai npu default output len
The decision to set the default to 2048 was made because during testing, it was observed that some responses were being truncated, which is not the intended behavior. This adjustment ensures that responses remain complete and aligned with expectations.
Thank you.
I agree that it should be made larger. However, it should not be hard-coded; it should be configurable so that it doesn't have to be re-hard-coded to a bigger number later, and so that models which need even more than 2048 (like DeepSeek) will also work well.
Yes, I agree that having a dynamically adjustable output length would be ideal. However, the behavior of the NPU differs from that of the CPU/GPU. While the output length for the CPU/GPU can be set during generation, the NPU requires the output length to be specified at the time of model loading. Once the model is loaded, the output length cannot be modified without reloading the model.
At this stage, making the NPU's output length dynamically adjustable would necessitate reloading the model and involve significant changes. Therefore, I believe setting the output length to 2048 is a practical solution that should meet most common use cases.
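To make the load-time vs. generation-time difference concrete, here is a hedged sketch; the function names follow my reading of the OpenVINO GenAI C API, and the model path, token budgets, and exact property keys should be treated as assumptions rather than code from this PR:

```c
#include <openvino/genai/c/llm_pipeline.h>

/* Hypothetical illustration of the difference described above. */
static ov_status_e load_pipelines(ov_genai_llm_pipeline** cpu_pipe,
                                  ov_genai_llm_pipeline** npu_pipe) {
    /* CPU/GPU: no length properties are needed at load time; the output
     * length (max_new_tokens) can be chosen later, per generate() call,
     * through the generation config. */
    ov_status_e status =
        ov_genai_llm_pipeline_create("/path/to/model", "CPU", 0, cpu_pipe);
    if (status != OK) {
        return status;
    }

    /* NPU: the pipeline is compiled for a fixed prompt/response budget, so
     * MAX_PROMPT_LEN / MIN_RESPONSE_LEN must be supplied here. Changing them
     * later means recreating (reloading) the pipeline. */
    return ov_genai_llm_pipeline_create("/path/to/model", "NPU", 4, npu_pipe,
                                        "MAX_PROMPT_LEN", "1024",
                                        "MIN_RESPONSE_LEN", "2048");
}
```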
Upgrade to GenAI version 2025.2.0.0.dev20250513 for the latest features and improvements.