docs/docs/walkthroughs/tab-autocomplete.md (10 additions, 0 deletions)
@@ -30,6 +30,16 @@ All of the configuration options available for chat models are available to use
If you aren't yet familiar with the available options, you can learn more in our [overview](../model-setup/overview.md).
### What model should I use?
If you are running the model locally, we recommend `starcoder:3b`.
If you find it to be too slow, you should try `deepseek-coder:1.3b-base`.
If you have a bit more compute, or are running a model in the cloud, you can upgrade to `deepseek-coder:6.7b-base`.
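For example, a local setup might point the autocomplete model at Ollama. The snippet below is a minimal sketch of a `config.json` entry, assuming a `tabAutocompleteModel` field and the `ollama` provider; adjust the provider and model name to match your setup.

```json
{
  "tabAutocompleteModel": {
    "title": "Tab Autocomplete",
    "provider": "ollama",
    "model": "starcoder:3b"
  }
}
```

Swapping in `deepseek-coder:1.3b-base` or `deepseek-coder:6.7b-base` would then only require changing the `model` value.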
Regardless of what you are willing to spend, we do not recommend using GPT or Claude for autocomplete. Learn why [below](#i-want-better-completions-should-i-use-gpt-4).