README.md (+22 −37)
@@ -28,30 +28,19 @@ You can use OpenCommit by simply running it via the CLI like this `oco`. 2 seconds
 npm install -g opencommit
 ```
 
-Alternatively run it via `npx opencommit` or `bunx opencommit`
-
-MacOS may ask to run the command with `sudo` when installing a package globally.
-
-2. Get your API key from [OpenAI](https://platform.openai.com/account/api-keys). Make sure that you add your payment details, so the API works.
+2. Get your API key from [OpenAI](https://platform.openai.com/account/api-keys) or other supported LLM providers (we support them all). Make sure that you add your OpenAI payment details to your account, so the API works.
 
 3. Set the key to OpenCommit config:
 
 ```sh
-oco config set OCO_OPENAI_API_KEY=<your_api_key>
+oco config set OCO_API_KEY=<your_api_key>
 ```
 
 Your API key is stored locally in the `~/.opencommit` config file.
 
 ## Usage
 
-You can call OpenCommit directly to generate a commit message for your staged changes:
-
-```sh
-git add <files...>
-opencommit
-```
-
-You can also use the `oco` shortcut:
+You can call OpenCommit with the `oco` command to generate a commit message for your staged changes:
 
 ```sh
 git add <files...>
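Taken together, the setup steps in this hunk reduce to a short shell session. A minimal sketch, assuming `opencommit` is installed and the key placeholder is replaced with a real value:

```shell
# Sketch of the updated setup flow; the API key value is a placeholder.
npm install -g opencommit                  # or run ad hoc: npx opencommit / bunx opencommit
oco config set OCO_API_KEY=<your_api_key>  # persisted in ~/.opencommit
git add <files...>                         # stage the changes to describe
oco                                        # generate a commit message for the staged diff
```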
@@ -70,22 +59,17 @@ You can also run it with a local model through ollama:
 
 ```sh
 git add <files...>
-oco config set OCO_AI_PROVIDER='ollama'
+oco config set OCO_AI_PROVIDER='ollama' OCO_MODEL='llama3:8b'
 ```
 
-If you want to use a model other than mistral (default), you can do so by setting the `OCO_AI_PROVIDER` environment variable as follows:
-
-```sh
-oco config set OCO_AI_PROVIDER='ollama'
-oco config set OCO_MODEL='llama3:8b'
-```
+The default model is `mistral`.
 
 If you have ollama set up in docker / on another machine with GPUs (not locally), you can change the default endpoint URL.
 
-You can do so by setting the `OCO_OLLAMA_API_URL` environment variable as follows:
+You can do so by setting the `OCO_API_URL` environment variable as follows:
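For the remote-ollama case described above, a sketch — the host is a placeholder, and the port is an assumption (11434 is Ollama's default; your endpoint may differ):

```shell
# Sketch: point OpenCommit at an Ollama server running elsewhere.
oco config set OCO_AI_PROVIDER='ollama' OCO_MODEL='llama3:8b'
oco config set OCO_API_URL='http://<remote-host>:11434'   # hypothetical endpoint
```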
@@ … @@
 OCO_OPENAI_BASE_PATH=<may be used to set proxy path to OpenAI api>
 OCO_DESCRIPTION=<postface a message with ~3 sentences description of the changes>
 OCO_EMOJI=<boolean, add GitMoji>
 OCO_MODEL=<either 'gpt-4o', 'gpt-4', 'gpt-4-turbo', 'gpt-3.5-turbo' (default), 'gpt-3.5-turbo-0125', 'gpt-4-1106-preview', 'gpt-4-turbo-preview', 'gpt-4-0125-preview', or any Anthropic or Ollama model — any string, as long as it is a valid model name>
 OCO_LANGUAGE=<locale, scroll to the bottom to see options>
 
 These are not all the config options, but you get the point.
 
+Global configs are the same as local configs, but they are stored in the global `~/.opencommit` config file and set with the `oco config set` command, e.g. `oco config set OCO_MODEL=gpt-4o`.
 
 ### Global config for all repos
 
@@ -189,26 +172,26 @@ or as a cheaper option:
 
 oco config set OCO_MODEL=gpt-3.5-turbo
 ```
 
-### Switch to Azure OpenAI
+### Switch to other LLM providers with a custom URL
 
 By default OpenCommit uses [OpenAI](https://openai.com).
 
-You could switch to [Azure OpenAI Service](https://learn.microsoft.com/azure/cognitive-services/openai/)🚀
+You could switch to [Azure OpenAI Service](https://learn.microsoft.com/azure/cognitive-services/openai/), Flowise, or Ollama.
 
 ```sh
-opencommit config set OCO_AI_PROVIDER=azure
-```
+oco config set OCO_AI_PROVIDER=azure OCO_API_KEY=<your_azure_api_key> OCO_API_URL=<your_azure_endpoint>
 
-Of course need to set 'OCO_OPENAI_API_KEY'. And also need to set the
-'OPENAI_BASE_PATH' for the endpoint and set the deployment name to
-'model'.
+oco config set OCO_AI_PROVIDER=flowise OCO_API_KEY=<your_flowise_api_key> OCO_API_URL=<your_flowise_endpoint>
+
+oco config set OCO_AI_PROVIDER=ollama OCO_API_KEY=<your_ollama_api_key> OCO_API_URL=<your_ollama_endpoint>
+```
 
 ### Locale configuration
 
 To globally specify the language used to generate commit messages:
 
 ```sh
-# de, German ,Deutsch
+# de, German, Deutsch
 oco config set OCO_LANGUAGE=de
 oco config set OCO_LANGUAGE=German
 oco config set OCO_LANGUAGE=Deutsch
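The three locale commands above are interchangeable spellings of the same choice. A sketch of that aliasing — the mapping shown is illustrative, not OpenCommit's actual lookup table:

```shell
# Hypothetical normalization: several spellings resolve to one locale code.
normalize_language() {
  case "$1" in
    de|German|Deutsch)  echo de ;;
    fr|French|Français) echo fr ;;
    *) echo "$1" ;;   # pass through anything already a code
  esac
}
normalize_language Deutsch   # prints: de
```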
@@ -230,6 +213,8 @@ A prompt for pushing to git is on by default but if you would like to turn it off:
 
 oco config set OCO_GITPUSH=false
 ```
 
+and it will exit right after the commit is confirmed, without asking whether you would like to push to the remote.
+
 ### Switch to `@commitlint`
 
 OpenCommit allows you to choose the prompt module used to generate commit messages. By default, OpenCommit uses its conventional-commit message generator. However, you can switch to the `@commitlint` prompt module if you prefer. This option lets you generate commit messages with respect to the local config.
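The `OCO_GITPUSH` change above, as a sketch of the resulting workflow (assumes `oco` is installed):

```shell
oco config set OCO_GITPUSH=false   # disable the push prompt
git add -A
oco   # commits once confirmed, then exits without offering `git push`
```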
@@ -404,7 +389,7 @@ jobs:
 
 # set openAI api key in repo actions secrets,
 # for openAI keys go to: https://platform.openai.com/account/api-keys
 # for repo secret go to: <your_repo_url>/settings/secrets/actions