README.md (+7 −7: 7 additions & 7 deletions)
```diff
@@ -86,13 +86,13 @@ A: Yes, MinimalGPT is designed be responsive and works well on mobile devices. Y
 - **Claude 3 Sonnet**
 - **Claude 3 Haiku**
 - **Claude Vision** activated by having the **Claude** model selected and starting a message with **vision::** and then your prompt
-- **Hugging Face Inference Endpoint**
-- **Max Tokens** - Hugging Face models and their context windows can vary greatly. Use this setting to adjust the maximum number of tokens that can be generated as a response.
-- **Local LLM Model (Via [LM Studio](https://lmstudio.ai/))** users configure the current model name and [LM Studio](https://lmstudio.ai/) API endpoint URL in the settings panel.
-- **Local Model Name**: The name of the model you are hosting locally
-- **Example**: [This DeepSeek Coder Model](https://huggingface.co/LoneStriker/deepseek-coder-7b-instruct-v1.5-GGUF) has a model name of `LoneStriker/deepseek-coder-7b-instruct-v1.5-GGUF`. That is what should be entered into the **Local Model Name** field. This is also displayed directly in **[LM Studio](https://lmstudio.ai/)** for the user.
-- **Local URL**: The API endpoint URL that **[LM Studio](https://lmstudio.ai/)** is running on
-- **Example**: `http://192.168.0.45:1234`
+- **OpenAI Response Formatted APIs** - Supports any API endpoint that returns OpenAI-formatted responses.
+- **([LM Studio Example](https://lmstudio.ai/))** users can configure the current model name and [LM Studio](https://lmstudio.ai/) API endpoint URL in the settings panel.
+- **Model Name**: The name or relevant value for the model field.
+- **Example**: [This DeepSeek Coder Model](https://huggingface.co/LoneStriker/deepseek-coder-7b-instruct-v1.5-GGUF) has a model name of `LoneStriker/deepseek-coder-7b-instruct-v1.5-GGUF`. That is what should be entered into the **Local Model Name** field if using **[LM Studio](https://lmstudio.ai/)**. This is also displayed directly in **[LM Studio](https://lmstudio.ai/)** for the user.
+- **API Endpoint**: The API endpoint URL that **[LM Studio](https://lmstudio.ai/)** is running on, for example `http://192.168.0.45:1234`
+- **API Key**: The API key needed; **[LM Studio](https://lmstudio.ai/)**, for example, uses the value `lm-studio` for the API key.
+- **Max Tokens** - Some models and their context windows can vary greatly. Use this setting to adjust the maximum number of tokens that can be generated as a response. Typically this is roughly half of the maximum input token limit by default, though some models may default to much shorter responses.
 - Switch models mid conversations and maintain context
 - Swipe Gestures for quick settings and conversations access
```
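Taken together, the **API Endpoint**, **Model Name**, **API Key**, and **Max Tokens** settings describe an ordinary OpenAI-style chat completions request. A minimal sketch of what such a request looks like, assuming an OpenAI-compatible `/v1/chat/completions` route (as LM Studio's local server exposes); the helper name and the endpoint/model values below are illustrative, taken from the README's examples:

```python
import json
from urllib import request

def build_chat_request(base_url, model, prompt, api_key="lm-studio", max_tokens=512):
    # Assemble an OpenAI-formatted chat completion request (hypothetical
    # helper; the payload fields follow the OpenAI chat completions schema).
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# Values mirror the README's examples (illustrative, not required):
req = build_chat_request(
    "http://192.168.0.45:1234",
    "LoneStriker/deepseek-coder-7b-instruct-v1.5-GGUF",
    "Hello!",
)
# Actually sending it requires a running server, e.g.:
#   with request.urlopen(req) as resp:
#       print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

Any backend that accepts this request shape and returns OpenAI-formatted responses should work with the settings above.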