🔧 chore(devcontainer.json, README.md, package.json, CohereAdapter.ts, GoogleGeminiAdapter.ts, botservice.ts, openai-wrapper.ts, postMessage.ts): update dependencies and improve code

Updated the dev container Docker image to the latest 1-22-bookworm tag and upgraded dependency packages to their latest versions, improving security and performance. Added comments and function descriptions to make the code easier to read and maintain. Improved token management in OpenAI API calls and tidied up the log output; a brief sketch of the token-handling pattern follows the change summary below.
 @anthropic-ai/sdk                 ^0.21.1  →   ^0.32.1
 @eslint/js                         ^9.3.0  →   ^9.16.0
 @google/generative-ai              ^0.7.1  →   ^0.21.0
 @mattermost/client                 ^9.6.0  →   ^10.2.0
 @mattermost/types                  ^9.6.0  →   ^10.2.0
 @swc/core                         ^1.4.14  →   ^1.10.1
 @swc/helpers                      ^0.5.10  →   ^0.5.15
 @types/node                      ^20.12.7  →  ^22.10.2
 @types/node-fetch                 ^2.6.11  →   ^2.6.12
 @types/ws                         ^8.5.11  →   ^8.5.13
 cohere-ai                          ^7.9.4  →   ^7.15.0
 debug-level                         3.1.4  →     3.2.1
 esbuild                           ^0.20.2  →   ^0.24.0
 eslint                             ^9.0.0  →   ^9.16.0
 form-data                          ^4.0.0  →    ^4.0.1
 openai                            ^4.35.0  →   ^4.76.1
 prettier                           ^3.3.3  →    ^3.4.2
 sharp                             ^0.33.3  →   ^0.33.5
 textlint                          ^14.0.4  →   ^14.4.0
 textlint-rule-preset-jtf-style    ^2.3.14  →    ^3.0.0
 tsx                                ^4.7.2  →   ^4.19.2
 typescript                         ^5.5.3  →    ^5.7.2
 typescript-eslint                  ^7.7.0  →   ^8.18.0
 vitest                             ^1.5.2  →    ^2.1.8
takuya-o committed Dec 12, 2024
1 parent f50687b commit 029231c
Showing 10 changed files with 4,161 additions and 2,223 deletions.
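The token-management improvement mentioned in the commit message comes down to choosing the token-limit field per model family (o1-series models accept only `max_completion_tokens`, while other chat models still take `max_tokens`) and to counting cached prompt tokens when accumulating usage. Below is a minimal TypeScript sketch of that pattern, not the project's actual wrapper; the helper name, its parameters, and the `startsWith('o1')` check are illustrative assumptions based on the diff that follows.

```typescript
import OpenAI from 'openai'

// Hypothetical helper: pick the right token-limit field for the model family
// and read cached prompt tokens from the usage block when the API reports them.
async function createCompletionWithLimit(
  client: OpenAI,
  model: string,
  messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[],
  maxTokens: number,
) {
  const options: OpenAI.Chat.Completions.ChatCompletionCreateParamsNonStreaming = { model, messages }
  if (model.startsWith('o1')) {
    // o1-series models reject `max_tokens`; they expect `max_completion_tokens` instead.
    options.max_completion_tokens = maxTokens
  } else {
    options.max_tokens = maxTokens
  }
  const completion = await client.chat.completions.create(options)
  // `prompt_tokens_details.cached_tokens` is not reported by every model, so default to 0.
  const cachedTokens = completion.usage?.prompt_tokens_details?.cached_tokens ?? 0
  return { completion, cachedTokens }
}
```

The `createChatCompletion` and `continueThread` changes in `dist/botservice.mjs` below apply the same branching and add the cached-token count to the accumulated usage totals.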
5 changes: 2 additions & 3 deletions .devcontainer/devcontainer.json
@@ -3,8 +3,7 @@
{
"name": "Node.js & TypeScript",
// Or use a Dockerfile or Docker Compose file. More info: https://containers.dev/guide/dockerfile
"image": "mcr.microsoft.com/devcontainers/typescript-node:1-20-bookworm"

"image": "mcr.microsoft.com/devcontainers/typescript-node:1-22-bookworm"
// Features to add to the dev container. More info: https://containers.dev/features.
// "features": {},

@@ -22,7 +21,7 @@
}
// After pulling a new image,
// ID=$(docker ps |awk '$2~/^vsc-/{print $1}')
// docker exec -u=root $ID sh -c "apt update && apt install git-secrets connect-proxy"
// docker exec -u=root $ID sh -c "apt update && apt install git-secrets connect-proxy netcat-openbsd"
// docker cp ~/.ssh/config $ID:/home/node/.ssh/
// the commands above are required.
// Check your keys with ssh-add -l
6 changes: 3 additions & 3 deletions README.md
@@ -20,7 +20,7 @@

![A chat window in Mattermost showing the chat between the OpenAI bot and "yGuy"](./mattermost-chat.png)

The bot can talk to you like a regular mattermost user. It's like having chat.openai.com collaboratively built into Mattermost!
The bot can talk to you like a regular mattermost user. It's like having ``chat.openai.com`` collaboratively built into Mattermost!
But that's not all, you can also use it to generate images via Dall-E or diagram visualizations via a yFiles plugin!

Here's how to get the bot running - it's easy if you have a Docker host.
@@ -65,7 +65,7 @@ or when [running the docker image](#using-the-ready-made-docker-image) or when c
| COHERE_API_KEY | no | `0123456789abcdefghijklmno` | The Cohere API key to authenticate. If OPENAI_API_KEY is also set, the original OpenAI is used for vision or image generation. |
| GOOGLE_API_KEY | no | `0123456789abcdefghijklmno` | The Gemini API key to authenticate. If OPENAI_API_KEY is also set, the original OpenAI is used for vision or image generation. The only tested model is 'gemini-1.5-pro-latest'. |
| YFILES_SERVER_URL | no | `http://localhost:3835` | The URL to the yFiles graph service for embedding auto-generated diagrams. |
| NODE_EXTRA_CA_CERTS | no | `/file/to/cert.crt` | a link to a certificate file to pass to node.js for authenticating self-signed certificates |
| NODE_EXTRA_CA_CERTS | no | `/file/to/cert.crt` | a link to a certificate file to pass to ``node.js`` for authenticating self-signed certificates |
| MATTERMOST_BOTNAME | no | `"@chatgpt"` | the name of the bot user in Mattermost, defaults to '@chatgpt' |
| PLUGINS | no | `graph-plugin, image-plugin` | The enabled plugins of the bot. By default all plugins (graph-plugin and image-plugin) are enabled. |
| DEBUG_LEVEL | no | `TRACE` | a debug level used for logging activity, defaults to `INFO` |
@@ -219,7 +219,7 @@ docker compose down


## Deploy to Kubernetes with Helm
The chatgpt-mattermost-bot chart deploys a containerized chatgpt-mattermost-bot instance which will connect to a running mattermost container in the same kubernetes cluster. Chart uses 'mattermost-team-edition' and the 'mattermost' namespace by default. Uses environment variables MATTERMOST_TOKEN and OPENAI_API_KEY.
The chatgpt-mattermost-bot chart deploys a containerized chatgpt-mattermost-bot instance which will connect to a running mattermost container in the same Kubernetes cluster. Chart uses 'mattermost-team-edition' and the 'mattermost' namespace by default. Uses environment variables MATTERMOST_TOKEN and OPENAI_API_KEY.
```bash
helm upgrade chatgpt-mattermost-bot ./helm/chatgpt-mattermost-bot \
--create-namespace \
106 changes: 72 additions & 34 deletions dist/botservice.mjs
@@ -243,7 +243,9 @@ var CohereAdapter = class extends AIAdapter {
} else {
return {
role: "assistant",
content: chat.text
content: chat.text,
refusal: null
// refusal message from the assistant
};
}
}
@@ -261,7 +263,9 @@
const message = {
role: "assistant",
content: null,
tool_calls: openAItoolCalls
tool_calls: openAItoolCalls,
refusal: null
// refusal message from the assistant
};
return message;
}
@@ -365,7 +369,8 @@
// src/adapters/GoogleGeminiAdapter.ts
import {
FinishReason,
GoogleGenerativeAI
GoogleGenerativeAI,
SchemaType
} from "@google/generative-ai";
import { Log as Log4 } from "debug-level";
Log4.options({ json: true, colors: true });
@@ -487,7 +492,9 @@ var GoogleGeminiAdapter = class extends AIAdapter {
role: "assistant",
//this.convertRoleGeminitoOpenAI(candidate.content.role),
content,
tool_calls: toolCalls
tool_calls: toolCalls,
refusal: null
// refusal message from the assistant
}
});
});
@@ -524,7 +531,9 @@
//example:
};
}
const parameters = tool.function.parameters;
let parameters = tool.function.parameters;
this.convertType(tool, parameters);
parameters = this.workaroundObjectNoParameters(parameters);
functionDeclarations.push({
name: tool.function.name,
description: tool.function.description,
@@ -533,10 +542,31 @@
});
return geminiTool;
}
workaroundObjectNoParameters(parameters) {
if (parameters?.type === SchemaType.OBJECT && Object.keys(parameters?.properties).length === 0) {
parameters = void 0;
}
return parameters;
}
convertType(tool, parameters) {
const typeMapping = {
object: SchemaType.OBJECT,
string: SchemaType.STRING,
number: SchemaType.NUMBER,
integer: SchemaType.INTEGER,
boolean: SchemaType.BOOLEAN,
array: SchemaType.ARRAY
};
const paramType = tool.function.parameters?.type;
if (paramType && typeMapping[paramType]) {
parameters.type = typeMapping[paramType];
}
}
createContents(messages) {
const currentMessages = [];
messages.forEach(async (message) => {
switch (message.role) {
// To Google ["user", "model", "function", "system"]
case "system":
currentMessages.push({
role: "user",
@@ -558,6 +588,7 @@
break;
case "tool":
case "function":
//Deprecated
default:
log3.error(`getChatHistory(): ${message.role} not yet support.`, message);
break;
@@ -762,29 +793,12 @@ function registerChatPlugin(plugin) {
});
}
async function continueThread(messages, msgData) {
openAILog.trace(
"messsages: ",
JSON.parse(JSON.stringify(messages)).map(
// deep copy via serialization
(message) => {
if (typeof message.content !== "string") {
message.content?.map((content) => {
const url = shortenString(content.image_url?.url);
if (url) {
;
content.image_url.url = url;
}
return content;
});
}
return message;
}
)
);
logMessages(messages);
const NO_MESSAGE = "Sorry, but it seems I found no valid response.";
const promptTokensDetails = { cached_tokens: 0 };
let aiResponse = {
message: NO_MESSAGE,
usage: { prompt_tokens: 0, completion_tokens: 0, total_tokens: 0 },
usage: { prompt_tokens: 0, completion_tokens: 0, prompt_tokens_details: promptTokensDetails, total_tokens: 0 },
model: ""
};
let maxChainLength = 7;
@@ -797,6 +811,7 @@
if (usage && aiResponse.usage) {
aiResponse.usage.prompt_tokens += usage.prompt_tokens;
aiResponse.usage.completion_tokens += usage.completion_tokens;
aiResponse.usage.prompt_tokens_details.cached_tokens += usage?.prompt_tokens_details?.cached_tokens ? usage.prompt_tokens_details.cached_tokens : 0;
aiResponse.usage.total_tokens += usage.total_tokens;
}
if (responseMessage.function_call) {
@@ -879,6 +894,24 @@
}
return aiResponse;
}
function logMessages(messages) {
openAILog.trace(
"messages: ",
// deep copy via serialization
JSON.parse(JSON.stringify(messages)).map((message) => {
if (typeof message.content !== "string") {
message.content?.forEach((content) => {
const url = shortenString(content.image_url?.url);
if (url) {
;
content.image_url.url = url;
}
});
}
return message;
})
);
}
async function createChatCompletion(messages, functions2 = void 0) {
let useTools = true;
let currentOpenAi = openai;
@@ -902,19 +935,19 @@
const chatCompletionOptions = {
model: currentModel,
messages,
max_tokens: MAX_TOKENS,
//TODO: derive the maximum from the message token count. Responses get longer, but tasks like translation then finish in one call
temperature
};
if (currentModel.indexOf("o1") === 0) {
chatCompletionOptions.max_completion_tokens = MAX_TOKENS;
} else {
chatCompletionOptions.max_tokens = MAX_TOKENS;
}
if (functions2 && useTools) {
if (model.indexOf("gpt-3") >= 0) {
chatCompletionOptions.functions = functions2;
chatCompletionOptions.function_call = "auto";
} else {
chatCompletionOptions.tools = [];
functions2?.forEach((funciton) => {
chatCompletionOptions.tools?.push({ type: "function", function: funciton });
});
chatCompletionOptions.tools = functions2.map((func) => ({ type: "function", function: func }));
chatCompletionOptions.tool_choice = "auto";
}
}
@@ -1390,7 +1423,11 @@ Error: ${e.message}`;
let message = `
${SYSTEM_MESSAGE_HEADER} `;
if (usage) {
message += ` Prompt:${usage.prompt_tokens} Completion:${usage.completion_tokens} Total:${usage.total_tokens}`;
message += ` Prompt:${usage.prompt_tokens} Completion:${usage.completion_tokens} `;
if (usage.prompt_tokens_details?.cached_tokens) {
message += `Cached:${usage.prompt_tokens_details.cached_tokens} `;
}
message += `Total:${usage.total_tokens}`;
}
if (model2) {
message += ` Model:${model2}`;
@@ -1653,10 +1690,11 @@ async function isMessageIgnored(msgData, meId, previousPosts) {
}
function isUnuseImages(meId, previousPosts) {
for (let i = previousPosts.length - 1; i >= 0; i--) {
if (previousPosts[i].props.bot_images === "stopped") {
const post = previousPosts[i];
if (post.props.bot_images === "stopped") {
return true;
}
if (previousPosts[i].user_id === meId || previousPosts[i].message.includes(name)) {
if (post.user_id === meId || post.message.includes(name)) {
return false;
}
}
(Diffs for the remaining changed files are not shown.)