docs/docs/prompts/index.md
15 additions & 10 deletions

@@ -14,7 +14,7 @@ data class with the following properties:
 - `params`: Optional [LLM configuration parameters](prompt-creation/index.md#prompt-parameters) (such as temperature, tool choice, and others).
 
 Although you can instantiate the `Prompt` class directly,
-the recommended way to create prompts is by using the Kotlin DSL,
+the recommended way to create prompts is by using the [Kotlin DSL](prompt-creation/index.md),
 which provides a structured way to define the conversation.
 
 <!--- INCLUDE
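
To make the recommendation above concrete, here is a minimal sketch of building a prompt with the Kotlin DSL, including the optional `params` property mentioned in this hunk. The `prompt(...)` builder, the `system`/`user` message calls, and `LLMParams` follow Koog's prompt API; the prompt id, the temperature value, and the exact import paths are illustrative assumptions rather than content from this change.

```kotlin
import ai.koog.prompt.dsl.prompt
import ai.koog.prompt.params.LLMParams

// A minimal sketch of the Kotlin DSL for building a Prompt.
// The params argument mirrors the optional `params` property described above;
// the temperature value is an arbitrary example.
val greetingPrompt = prompt("greeting", LLMParams(temperature = 0.7)) {
    // System message that sets the assistant's behavior.
    system("You are a friendly assistant that answers briefly.")

    // User message that starts the conversation.
    user("Say hello to the reader of the Koog documentation.")
}
```
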
@@ -95,8 +95,8 @@ Koog allows you to optimize performance and handle failures when running prompts
 ## Prompts in AI agents
 
 In Koog, AI agents maintain and manage prompts during their lifecycle.
-While LLM clients or executors are used for direct prompt execution, agents handle the flow of prompt updates to ensure
-the conversation history is relevant and consistent.
+While LLM clients or executors are used to run prompts, agents handle the flow of prompt updates, ensuring the
+conversation history remains relevant and consistent.
 
 The prompt lifecycle in an agent usually includes several stages:
 
@@ -107,9 +107,10 @@ The prompt lifecycle in an agent usually includes several stages:
 
 ### Initial prompt setup
 
-When you [initialize an agent](../getting-started/#create-and-run-an-agent), you define a [system message](prompt-creation/index.md#system-message) that sets the agent's behavior.
-An initial [user message](prompt-creation/index.md#user-messages) is usually provided as input when you call the agent's `run()` method.
-For example:
+When you [initialize an agent](../getting-started/#create-and-run-an-agent), you define
+a [system message](prompt-creation/index.md#system-message) that sets the agent's behavior.
+Then, when you call the agent's `run()` method, you typically provide an initial [user message](prompt-creation/index.md#user-messages)
+as input. Together, these messages form the agent's initial prompt. For example:
 
 <!--- INCLUDE
 import ai.koog.agents.core.agent.AIAgent
@@ -137,7 +138,7 @@ val result = agent.run("What is Koog?")
 ```
 <!--- KNIT example-prompts-02.kt -->
 
-The agent automatically converts the text prompt to the Prompt object and sends it to the prompt executor:
+In the example, the agent automatically converts the text prompt to the Prompt object and sends it to the prompt executor:
 
 ```mermaid
 flowchart TB
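
As a rough illustration of the flow described in the two hunks above, the sketch below shows the system message being supplied when the agent is constructed and the user message being passed to `run()`, which the agent turns into a Prompt for the executor. The constructor parameter names (`executor`, `systemPrompt`, `llmModel`) and the `simpleOpenAIExecutor`/`OpenAIModels` helpers are taken from Koog's getting-started examples and may differ between Koog versions; treat this as a sketch rather than the exact example elided from this diff.

```kotlin
import ai.koog.agents.core.agent.AIAgent
import ai.koog.prompt.executor.clients.openai.OpenAIModels
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    // The system prompt defined here becomes the first message of the agent's prompt.
    val agent = AIAgent(
        executor = simpleOpenAIExecutor(System.getenv("OPENAI_API_KEY")),
        systemPrompt = "You are a helpful assistant that explains the Koog framework.",
        llmModel = OpenAIModels.Chat.GPT4o,
    )

    // The text passed to run() is appended as the initial user message;
    // the agent then sends the resulting Prompt to the prompt executor.
    val result = agent.run("What is Koog?")
    println(result)
}
```
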
@@ -158,14 +159,18 @@ flowchart TB
 B -->|"result to"| A
 ```
 
+For more [advanced configurations](../complex-workflow-agents.md#4-configure-the-agent), you can also use
 As the agent runs its strategy, [predefined nodes](../nodes-and-components.md) automatically update the prompt.
 For example:
 
-- [`nodeLLMRequest`](../nodes-and-components/#nodellmrequest): Appends the user message and captures the LLM response.
-- [`nodeExecuteTool`](../nodes-and-components/#nodeexecutetool): Adds tool execution results to the conversation history.
-- [`nodeAppendPrompt`](../nodes-and-components/#nodeappendprompt): Inserts specific messages or instructions into the prompt at any point in the workflow.
+- [`nodeLLMRequest`](../nodes-and-components/#nodellmrequest): Appends a user message to the prompt and captures the LLM response.
+- [`nodeLLMSendToolResult`](../nodes-and-components/#nodellmsendtoolresult): Appends tool execution results to the conversation.
+- [`nodeAppendPrompt`](../nodes-and-components/#nodeappendprompt): Inserts specific messages into the prompt at any point in the workflow.
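
The sketch below wires the nodes from the updated bullet list into one plausible tool-calling strategy, to show where each node touches the prompt. The `strategy` builder, `edge`, `forwardTo`, and the `onToolCall`/`onAssistantMessage` conditions follow Koog's strategy DSL; the graph shape, the strategy name, and the import paths are assumptions made for illustration, and the typed `strategy<String, String>` form may vary across Koog versions.

```kotlin
import ai.koog.agents.core.dsl.builder.forwardTo
import ai.koog.agents.core.dsl.builder.strategy
import ai.koog.agents.core.dsl.extension.nodeExecuteTool
import ai.koog.agents.core.dsl.extension.nodeLLMRequest
import ai.koog.agents.core.dsl.extension.nodeLLMSendToolResult
import ai.koog.agents.core.dsl.extension.onAssistantMessage
import ai.koog.agents.core.dsl.extension.onToolCall

// One plausible wiring of the predefined nodes listed above:
// nodeLLMRequest appends the user message and captures the LLM response,
// nodeExecuteTool runs a requested tool, and nodeLLMSendToolResult
// appends the tool result to the conversation before calling the LLM again.
val toolCallingStrategy = strategy<String, String>("tool-calling-sketch") {
    val callLLM by nodeLLMRequest()
    val runTool by nodeExecuteTool()
    val sendToolResult by nodeLLMSendToolResult()

    edge(nodeStart forwardTo callLLM)
    edge(callLLM forwardTo runTool onToolCall { true })
    edge(callLLM forwardTo nodeFinish onAssistantMessage { true })
    edge(runTool forwardTo sendToolResult)
    edge(sendToolResult forwardTo runTool onToolCall { true })
    edge(sendToolResult forwardTo nodeFinish onAssistantMessage { true })
}
```
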
docs/docs/prompts/prompt-creation/multimodal-content.md
1 addition & 1 deletion

@@ -102,7 +102,7 @@ All `ContentPart.Attachment` types accept the following parameters:
 
 #### Attachment content
 
-The AttachmentContent interface defines the type and source of content that is provided as input to the LLM:
+Implementations of the AttachmentContent interface define the type and source of content that is provided as input to the LLM:
 
 - [`AttachmentContent.URL`](https://api.koog.ai/prompt/prompt-model/ai.koog.prompt.message/-attachment-content/-u-r-l/index.html) defines the URL of the provided content:
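
To connect the reworded sentence to the DSL, the sketch below shows one way a URL-backed attachment (an `AttachmentContent.URL` under the hood) can be added to a user message. This is a loose sketch: the `attachments { image(...) }` builder and the `+"..."` text syntax mirror Koog's multimodal examples, but the helper names, the prompt id, and the sample URL are assumptions rather than content from this change.

```kotlin
import ai.koog.prompt.dsl.prompt

// A hedged sketch: a user message with an image attachment referenced by URL,
// which Koog represents with AttachmentContent.URL. The URL is a placeholder.
val describeImagePrompt = prompt("describe-image") {
    system("You are an assistant that describes images accurately.")

    user {
        +"What is shown in this picture?"
        attachments {
            image("https://example.com/sample-diagram.png")
        }
    }
}
```
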