diff --git a/MenuConfig.xml b/MenuConfig.xml index 3920f536a5..4811099275 100644 --- a/MenuConfig.xml +++ b/MenuConfig.xml @@ -109,7 +109,7 @@ - + diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/00 OpenAI.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/00 OpenAI.md new file mode 100644 index 0000000000..90c9010774 --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/00 OpenAI.md @@ -0,0 +1,3 @@ +[OpenAI](https://openai.com/) creates advanced artificial intelligence tools and models for various applications. Their core offering is the [GPT model](https://chatgpt.com/), known for its natural language understanding and generation. + +This help topic explains how to integrate OpenAI with the DevExtreme Chat component. \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/05 Prerequisites.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/05 Prerequisites.md new file mode 100644 index 0000000000..f76f835b1e --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/05 Prerequisites.md @@ -0,0 +1,15 @@ +1. Obtain an API key: + - Register on the [OpenAI](https://platform.openai.com/) website if you do not have an account. + - Log in and go to the [API Keys](https://platform.openai.com/account/api-keys) section and create a new key. + - Save the API key for authenticating requests. + +2. Select a model: + - This example uses the model defined as `deployment = 'gpt-4o-mini'`. + - If necessary, replace the `deployment` variable with any other Chat Completion model like `gpt-3.5-turbo` or `gpt-4`. + +3. Deploy your application: + - Ensure your client web project is set up to integrate the Chat component with OpenAI. 
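If you later route requests through your own backend, a common way to keep the saved key out of source code is to read it from an environment variable. A minimal Node.js sketch (the variable name and helper are hypothetical, not part of the example code):

```javascript
// Read the OpenAI API key from an environment variable
// (OPENAI_API_KEY is an assumed name) instead of hard-coding it.
function getApiKey(env = process.env) {
  const key = env.OPENAI_API_KEY;
  if (!key) {
    // Fail fast with a clear message rather than sending unauthenticated requests.
    throw new Error('Set the OPENAI_API_KEY environment variable before starting the app.');
  }
  return key;
}

// Usage sketch: const chatService = new OpenAI({ apiKey: getApiKey() });
```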
+ +#####See Also##### +- [OpenAI documentation](https://platform.openai.com/docs/introduction) +- [OpenAI Node.js library on GitHub](https://github.com/openai/openai-node) \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/10 Installation.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/10 Installation.md new file mode 100644 index 0000000000..59ae5cfaf7 --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/10 Installation.md @@ -0,0 +1,25 @@ +This example imports packages through ESM CDN: + + import OpenAI from 'https://esm.sh/openai'; + +To install the [openai](https://www.npmjs.com/package/openai) package locally, use npm: + + npm install openai + +#### Additional Packages + +The code uses the following packages to convert markdown to HTML: + +- [`unified`](https://unifiedjs.com/) +- `remark-parse` +- `remark-rehype` +- `rehype-stringify` + +Use npm or CDN to install these packages. They can also be installed locally for customized builds. 
+ + npm install unified remark-parse remark-rehype rehype-stringify + + import { unified } from 'https://esm.sh/unified@11?bundle'; + import remarkParse from 'https://esm.sh/remark-parse@11?bundle'; + import remarkRehype from 'https://esm.sh/remark-rehype@11?bundle'; + import rehypeStringify from 'https://esm.sh/rehype-stringify@10?bundle'; \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/15 Implementation.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/15 Implementation.md new file mode 100644 index 0000000000..f31cbf9859 --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/00 OpenAI/15 Implementation.md @@ -0,0 +1,48 @@ +#### Customizing Parameters + +The code includes the following customizable parameters: + + const deployment = 'gpt-4o-mini'; + const apiKey = 'OPENAI_API_KEY'; + +- `deployment`: Model name. +- `apiKey`: Secret key for accessing the OpenAI API. + +#### OpenAI Initialization + +The code creates an OpenAI instance: + + const chatService = new OpenAI({ + dangerouslyAllowBrowser: true, + apiKey, + }); + +Setting `dangerouslyAllowBrowser: true` lets the browser send requests directly to the OpenAI API. Use this with caution since the key is visible in the client code. For production, route requests to OpenAI through your backend. + +#### Sending and Receiving Messages + +The code accesses the OpenAI Chat Completion API with the `getAIResponse(messages)` function: + + async function getAIResponse(messages) { + const params = { + messages, + model: deployment, + }; + + const response = await chatService.chat.completions.create(params); + + return response.choices[0].message?.content; + } + +The `messages` parameter refers to the dialog history, an array of objects formatted as `{role: 'user'|'assistant', content: '...'}`.
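For illustration, a dialog history in this format might be built up as follows (the message contents are hypothetical):

```javascript
// Dialog history in the format getAIResponse expects:
// { role: 'user' | 'assistant', content: '...' }
const messages = [
  { role: 'user', content: 'What is DevExtreme Chat?' },
  { role: 'assistant', content: 'A UI component for building chat interfaces.' },
];

// Before each request, append the user's latest message to the history:
messages.push({ role: 'user', content: 'How do I connect it to OpenAI?' });

// After the call, append the assistant's reply so the context grows:
// messages.push({ role: 'assistant', content: await getAIResponse(messages) });
```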
+Key parameters: + +- `messages`: The chat history, with the user's latest message at the end. +- `model`: The model in use. + +The function returns the assistant’s new response text. + +`DevExpress.data.CustomStore` and `DevExpress.data.DataSource` manage and render message states. The functions `renderAssistantMessage` and `updateLastMessage` insert or update messages in the interface and adjust button states (copy text, regenerate response). + +The "Regenerate" button allows you to request a new response from the assistant for the last message. The new response replaces the previous one. The `regenerate` function calls `getAIResponse` again. \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/00 Azure OpenAI.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/00 Azure OpenAI.md new file mode 100644 index 0000000000..98140eabb6 --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/00 Azure OpenAI.md @@ -0,0 +1,3 @@ +[Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/) is a Microsoft service that provides access to [OpenAI](https://openai.com/)'s models. + +This help topic explains how to integrate Azure OpenAI with the DevExtreme Chat component. \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/05 Prerequisites.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/05 Prerequisites.md new file mode 100644 index 0000000000..2def973a1d --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/05 Prerequisites.md @@ -0,0 +1,14 @@ +1. [Create an Azure OpenAI Service resource](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource). + +2. Deploy a selected language model: + - Open your Azure OpenAI resource in the portal.
+ - Select **Model deployments**. + - Create a new deployment for the desired model (for example, GPT-4). Note the deployment name for future use. + +3. Obtain API key and endpoint: + - Open your Azure OpenAI resource in the portal. + - Go to the **Keys and Endpoint** section. + - Copy the **API Key** and **Endpoint** for use in the code. + +##### See Also ##### +- [Azure OpenAI Service documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/) \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/10 Installation.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/10 Installation.md new file mode 100644 index 0000000000..2e03a9fa8d --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/10 Installation.md @@ -0,0 +1,25 @@ +This example imports packages through ESM CDN: + + import { AzureOpenAI } from 'https://esm.sh/openai'; + +To install the [openai](https://www.npmjs.com/package/openai) package locally, use npm: + + npm install openai + +#### Additional Packages + +The code uses the following packages to convert markdown to HTML: + +- [`unified`](https://unifiedjs.com/) +- `remark-parse` +- `remark-rehype` +- `rehype-stringify` + +Use npm or CDN to install these packages. They can also be installed locally for customized builds. 
+ + npm install unified remark-parse remark-rehype rehype-stringify + + import { unified } from 'https://esm.sh/unified@11?bundle'; + import remarkParse from 'https://esm.sh/remark-parse@11?bundle'; + import remarkRehype from 'https://esm.sh/remark-rehype@11?bundle'; + import rehypeStringify from 'https://esm.sh/rehype-stringify@10?bundle'; \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/15 Implementation.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/15 Implementation.md new file mode 100644 index 0000000000..9cc8d5a437 --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/05 Azure OpenAI/15 Implementation.md @@ -0,0 +1,62 @@ +#### Customizing Parameters + +The code includes the following customizable parameters: + + const deployment = 'YOUR_MODEL_NAME'; + const apiVersion = 'YOUR_API_VERSION'; + const endpoint = 'YOUR_ENDPOINT'; + const apiKey = 'YOUR_API_KEY'; + +- `deployment`: The name of your model deployment. +- `apiVersion`: API version. Ensure compatibility with your model. +- `endpoint`: Your unique URI from **Keys and Endpoint** in the Azure OpenAI resource, typically in the `https://<your-resource-name>.openai.azure.com/` format, or configured as a proxy if you set up your own access point. +- `apiKey`: Secret key for accessing the Azure OpenAI API. + +#### Azure OpenAI Initialization + +The code creates an Azure OpenAI instance: + + const chatService = new AzureOpenAI({ + dangerouslyAllowBrowser: true, + deployment, + endpoint, + apiVersion, + apiKey, + }); + +Setting `dangerouslyAllowBrowser: true` lets the browser send requests directly to the Azure OpenAI API. Use this with caution since the key is visible in the client code. For production, route requests to Azure OpenAI through your backend. + +#### Sending and Receiving Messages + +The `processMessageSending` function is called when a user sends a new message.
This function does the following: + +- Calls the Azure OpenAI Chat Completion API for a response based on the dialog history. +- Adds the received reply to the message list and displays it in the chat. + +The `getAIResponse` function handles the primary interaction with the API: + + async function getAIResponse(messages) { + const params = { + messages, + model: deployment, + max_tokens: 1000, + temperature: 0.7, + }; + + const response = await chatService.chat.completions.create(params); + + return response.choices[0].message?.content; + } + +Key parameters: + +- `messages`: The chat history (`role: 'user'` for the user, `role: 'assistant'` for the assistant). +- `model`: The model in use. +- `max_tokens` and `temperature`: Control the length and the creativity of the answer. + +The function returns the assistant’s new response text. + +`DevExpress.data.CustomStore` and `DevExpress.data.DataSource` manage and render message states. The functions `renderAssistantMessage` and `updateLastMessage` insert or update messages in the interface and adjust button states (copy text, regenerate response). + +The "Regenerate" button allows you to request a new response from the assistant for the last message. The new response replaces the previous one. The `regenerate` function calls `getAIResponse` again.
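Because each request resends the entire dialog history, long conversations can eventually exceed the model's context window. One mitigation is to cap the history passed to `getAIResponse`; the helper below is a sketch (its name and the limit are hypothetical, not part of the example code):

```javascript
// Keep only the most recent messages so the request stays within budget.
// maxMessages is a hypothetical limit; tune it for your model's context size.
function trimHistory(messages, maxMessages = 20) {
  if (messages.length <= maxMessages) {
    return messages;
  }
  // Drop the oldest entries, keeping the latest maxMessages items.
  return messages.slice(messages.length - maxMessages);
}

// Usage sketch: const reply = await getAIResponse(trimHistory(messages));
```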
\ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/00 Google Dialogflow.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/00 Google Dialogflow.md new file mode 100644 index 0000000000..87f0174637 --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/00 Google Dialogflow.md @@ -0,0 +1,3 @@ +Google [Dialogflow](https://cloud.google.com/products/conversational-agents) is an AI service that allows you to build hybrid conversational agents with both deterministic and generative functionality. + +This help topic explains how to integrate Dialogflow with the DevExtreme Chat component. \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/05 Prerequisites.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/05 Prerequisites.md new file mode 100644 index 0000000000..ff3c2d75ec --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/05 Prerequisites.md @@ -0,0 +1,27 @@ +Before building your application, ensure you have: + +- Google Cloud Platform (GCP) account +Sign up at [Google Cloud Console](http://console.cloud.google.com/) if you do not have an account. + +- Dialogflow CX agent +You need to [create](https://cloud.google.com/dialogflow/cx/docs/concept/agent#create) and configure an agent for Dialogflow. + +Authenticate with Dialogflow through a Service Account and key from Google Cloud: + +Create a service account and key: +1. Go to Google Cloud Console -> IAM & Admin -> Service Accounts. +2. Create a service account or use an existing one. +3. Generate a key (JSON) for this account. +4. Save the JSON file. + +Place the key file `your_json_key.json` in the project root directory or the path specified in your code. 
Ensure the key path matches the parameter in `dialogflow.js`: + + const keyFilePath = path.join(__dirname, './your_json_key.json'); + +Specify project ID (found in the agent settings in GCP): + + const projectId = 'your-project-id'; + +#####See Also##### +- [Dialogflow documentation](https://cloud.google.com/dialogflow/docs) +- [Dialogflow ES documentation](https://cloud.google.com/dialogflow/es/docs) \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/10 Installation.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/10 Installation.md new file mode 100644 index 0000000000..44d5b678e8 --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/10 Installation.md @@ -0,0 +1,27 @@ +To integrate your application with Dialogflow, use the [`@google-cloud/dialogflow`](https://www.npmjs.com/package/@google-cloud/dialogflow) package. + +Install `@google-cloud/dialogflow`: + + npm install @google-cloud/dialogflow + +For server dependencies, install [`express`](https://www.npmjs.com/package/express) and [`body-parser`](https://www.npmjs.com/package/body-parser): + + npm install express body-parser + +#### Additional Packages + +The code uses the following packages to convert markdown to HTML: + +- [`unified`](https://unifiedjs.com/) +- `remark-parse` +- `remark-rehype` +- `rehype-stringify` + +Use npm or CDN to install these packages. They can also be installed locally for customized builds. 
+ + npm install unified remark-parse remark-rehype rehype-stringify + + import { unified } from 'https://esm.sh/unified@11?bundle'; + import remarkParse from 'https://esm.sh/remark-parse@11?bundle'; + import remarkRehype from 'https://esm.sh/remark-rehype@11?bundle'; + import rehypeStringify from 'https://esm.sh/rehype-stringify@10?bundle'; \ No newline at end of file diff --git a/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/15 Implementation.md b/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/15 Implementation.md new file mode 100644 index 0000000000..2f39b4f05a --- /dev/null +++ b/concepts/05 UI Components/Chat/15 Integrate with AI Service/10 Google Dialogflow/15 Implementation.md @@ -0,0 +1,95 @@ +#### Customizing Parameters + +The `dialogflow.js` file includes the following customizable parameters: + + const keyFilePath = path.join(__dirname, '../your_json_key.json'); + const projectId = 'your_project_id'; + +- `keyFilePath`: Path to your service account key. +- `projectId`: Identifier of your GCP project in which the Dialogflow agent is created. + +#### Configure Requests to Dialogflow + +The `dialogflow.js` file contains the `detectIntent(message, sessionId)` function: + + async function detectIntent(message, sessionId) { + const sessionPath = sessionClient.projectAgentSessionPath(projectId, sessionId); + + const params = { + session: sessionPath, + queryInput: { + text: { + text: message, + languageCode: 'en', + }, + }, + }; + + const responses = await sessionClient.detectIntent(params); + const result = responses[0]?.queryResult; + + if (!result || !result.fulfillmentText) { + throw new Error('No response from Dialogflow'); + } + + return result.fulfillmentText; + } + +The function performs the following steps: + +- Forms a request to the Dialogflow API with the received message and `sessionId`. +- Sets the request language to `'en'`. +- Returns `fulfillmentText` as the agent response.
+ +`sessionId` saves the dialog context. In this example (in `index.html`), a random string serves this purpose. + +#### Using a Server + +With Dialogflow, use a server layer to ensure application security and reliability. You cannot use this library directly from the client because Google's CORS policy blocks browser requests. + +When the server receives a POST request at `/webhook`, it calls `detectIntent` and returns the agent's response: + + app.post('/webhook', async (request, response) => { + const { message, sessionId } = request.body; + if (!message || !sessionId) { + return response.status(400).json({ error: 'Message and sessionId are required' }); + } + + try { + const result = await detectIntent(message, sessionId); + response.json({ response: result }); + } catch (error) { + console.error(error); + response.status(500).send('Error processing request'); + } + }); + +#### Sending and Receiving Messages + +The `processMessageSending` function is called when a user sends a new message. This function does the following: + +- Sends a request to the server to get a response from Dialogflow. + +- Adds the received reply to the message list and displays it in the chat. + +The `getAIResponse` function handles the primary interaction with the API: + + async function getAIResponse(messages) { + // Only the latest user message is sent; Dialogflow keeps the context by sessionId. + const text = messages[messages.length - 1].content; + + const response = await fetch(`${YOUR_SERVER_URI}/webhook`, { + method: 'POST', + headers: { + 'Content-Type': 'application/json', + }, + body: JSON.stringify({ message: text, sessionId }), + }); + + const data = await response.json(); + + return data.response; + } + +The user message and `sessionId` are passed in the body of the request. The function returns the assistant’s new response text. + +`DevExpress.data.CustomStore` and `DevExpress.data.DataSource` manage and render message states. The functions `renderAssistantMessage` and `updateLastMessage` insert or update messages in the interface and adjust button states (copy text, regenerate response).
+ +The "Regenerate" button allows you to request a new response from the assistant for the last message. This replaces the previous unsuccessful response. The `regenerate` function calls `getAIResponse` again. \ No newline at end of file