Different parameters for each custom prompt #2021
wan-robert started this conversation in Ideas
Summary of Feature
Custom prompts currently allow users to specify a model key. It would be very nice if users could also specify model parameters for each prompt.
Why This Is Helpful
Different prompts call for different output lengths and temperatures. A user may want a summary prompt to use a very low temperature and a short output length, while a creative writing prompt benefits from a higher temperature and a longer output length.
Currently, we have to manually adjust the output length and temperature in the settings each time we want to change the LLM's behavior. Letting users save these parameters with the custom prompt would make this much more efficient.
Example
The parameters could be saved alongside the existing properties at the head of the markdown file (the frontmatter), as sketched below.
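A minimal sketch of what that could look like; the property names (`temperature`, `max-tokens`) and the prompt body are only illustrative, assuming the plugin would read them from the frontmatter the same way it already reads the model key:

```markdown
---
model-key: gpt-4o-mini   # already supported: which model to use for this prompt
temperature: 0.2         # hypothetical: low temperature for deterministic summaries
max-tokens: 256          # hypothetical: cap the output length for this prompt
---
Summarize the selected text in three concise bullet points.
```

When the prompt is run, these values could override the global settings, with the global defaults used for anything not specified.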