@@ -137,6 +148,16 @@ make setup-procs-funcs CONN=<{conn-placeholder}>
{streamlit-env-var}=true make deploy-streamlit CONN=<{conn-placeholder}>
----

[TIP]
====
*Quick start without Slack:* Use the `only-dbt` target to run steps 1+2+3+5+6 and load sample test data, skipping the Slack connector setup:

[source,bash,subs="attributes+"]
----
make only-dbt CONN=<{conn-placeholder}>
----
====

[NOTE]
====
The Streamlit deployment step is optional and will only proceed if `{streamlit-env-var}` is set to `true`. If the environment variable is not set, or is set to any other value, the deployment is skipped with a warning message.
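The guard described in the note can be sketched as a small shell check. This is an illustration only: `DEPLOY_STREAMLIT` is a hypothetical stand-in for the project's `{streamlit-env-var}` variable, and the messages are invented for the sketch.

```shell
#!/bin/sh
# Hedged sketch of the guard the deploy-streamlit target applies.
# DEPLOY_STREAMLIT is a hypothetical stand-in for {streamlit-env-var}.
should_deploy_streamlit() {
    # Deploy only when the variable is exactly the string "true".
    [ "${1:-}" = "true" ]
}

if should_deploy_streamlit "${DEPLOY_STREAMLIT:-}"; then
    echo "Deploying Streamlit dashboard"
else
    # Any other value (or an unset variable) skips the step with a warning.
    echo "WARNING: skipping Streamlit deployment (variable not set to 'true')" >&2
fi
```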
@@ -192,15 +213,42 @@ Drop test documents (PDFs, Word documents, etc.) in the bronze_zone.DOCUMENTS
+
Manually execute the Tasks available in the following order to build the baseline models and the Cortex AI services on top, including the Cortex Agent:

Once the Cortex Agent has been created, follow the instructions link:https://docs.snowflake.com/en/user-guide/snowflake-cortex/snowflake-intelligence#add-agents[here] to add it to Snowflake Intelligence.

. Loading Sample Test Data (Optional)
+
If you are not using the Slack connector, or want to quickly populate the gold zone tables with sample data for testing, use:
+
[source,bash,subs="attributes+"]
----
make load-test-data CONN=<{conn-placeholder}>
----
+
This uploads the CSV seed files from `data/csv/` to the stage, creates the gold zone tables, and loads the sample data. The tables created are: `incidents`, `active_incidents`, `closed_incidents`, `incident_comment_history`, `incident_attachments`, `document_full_extracts`, and `quaterly_review_metrics`.
+
[NOTE]
====
This target uses `CREATE OR REPLACE TABLE`, so it will overwrite existing gold zone tables. It is intended for initial setup and testing only.
====

. Redeploying the dbt Project (Optional)
+
After pushing changes to the git repository (e.g., updated dbt models, macros, or agent specs), redeploy the dbt project in Snowflake:
+
[source,bash,subs="attributes+"]
----
make redeploy-dbt CONN=<{conn-placeholder}>
----
+
This fetches the latest code from the git repository and adds a new version to the dbt project.

. Streamlit Dashboard (Optional)
+
Optionally, you can deploy a Streamlit dashboard to visualize the incidents and their statuses as a more traditional method of reporting pre-calculated metrics and insights.
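The `load-test-data` step above can be sketched as plain SQL generation. The table list comes from the documentation, but the stage name (`@gold_zone.seed_stage`) and the `COPY INTO` file-format options are assumptions for illustration; the actual statements the Makefile runs may differ.

```shell
#!/bin/sh
# Hedged sketch of the load-test-data flow: one COPY INTO per gold zone
# table, loading a same-named CSV seed file from a stage. The stage name
# and file-format options are illustrative assumptions; the table list
# matches the documentation.
TABLES="incidents active_incidents closed_incidents incident_comment_history \
incident_attachments document_full_extracts quaterly_review_metrics"

gen_load_sql() {
    for t in $TABLES; do
        printf 'COPY INTO gold_zone.%s FROM @gold_zone.seed_stage/%s.csv FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);\n' "$t" "$t"
    done
}

gen_load_sql
```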
Changed file: ascii-docs/cortex_agent.adoc (10 additions & 4 deletions)
@@ -14,13 +14,19 @@ This page describes the approach taken by this project to deploy the Cortex Agent
=== Cortex Agent Deployment

The Cortex Agent is deployed using the macro link:../src/incident_management/macros/create_cortex_agent.sql[`create_cortex_agent`] in the Gold Zone. The macro accepts the following parameters: `agent_name`, `database`, `schema`, `stage_name`, `agent_spec_file`, and `next_version`.

The macro is *idempotent*: it checks whether the agent already exists in the database. If the agent does not exist, it is created with `CREATE OR REPLACE AGENT` and registered with Snowflake Intelligence. If the agent already exists, its live version specification is updated with `ALTER AGENT ... MODIFY LIVE VERSION` instead.

Each deployment is tagged with a version identifier and timestamp in the agent's `COMMENT` metadata, making it easy to track which spec version is deployed and when.

The actual definition of the Cortex Agent is stored in a YAML specification file that must conform to the link:https://docs.snowflake.com/en/sql-reference/sql/create-agent#required-parameters[Cortex Agent Specification], and is expected to be staged in the `gold_zone.agent_specs` stage prior to deployment.

The current spec file is link:../src/cortex_agents/incm360_agent_v200.yml[incm360_agent_v200.yml], loaded during the Snowflake setup process by `03_procs_and_funcs.sql`.

The agent spec is read from the stage at deployment time using a Python UDF (`READ_STAGE_FILE`) created in the `dbt_project_deployments` schema. The YAML content is passed verbatim into the `FROM SPECIFICATION` clause, preserving its original indentation.

This macro does not need to be run on a regular basis; it only needs to be run when the agent's composition changes (tools, tool resources, instructions, etc.). It is invoked via the `incm_deploy_cortex_agent` task in the Cortex services deployment task graph.

Having the Cortex Agent created allows you to use it in Snowflake Intelligence to answer questions about the incident management process. Post installation, you will be required to add it explicitly to Snowflake Intelligence as described link:https://docs.snowflake.com/en/user-guide/snowflake-cortex/snowflake-intelligence#add-agents[here].
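The idempotent create-or-alter branch described above can be sketched as follows. This is a hedged illustration of the documented behavior, not the macro's actual Jinja source; the exact `ALTER AGENT` syntax and the dollar-quoting of the spec payload should be checked against the Snowflake documentation.

```shell
#!/bin/sh
# Hedged sketch of the create_cortex_agent branching: emit CREATE OR REPLACE
# when the agent does not yet exist, otherwise ALTER ... MODIFY LIVE VERSION.
# Statement shapes are illustrative, not the macro's actual source.
agent_deploy_sql() {
    # $1 = agent name, $2 = "exists" or "missing", $3 = spec YAML (verbatim)
    if [ "$2" = "missing" ]; then
        printf 'CREATE OR REPLACE AGENT %s\nFROM SPECIFICATION $$\n%s\n$$;\n' "$1" "$3"
    else
        printf 'ALTER AGENT %s MODIFY LIVE VERSION\nFROM SPECIFICATION $$\n%s\n$$;\n' "$1" "$3"
    fi
}

# Example: first deployment of a hypothetical agent name.
agent_deploy_sql gold_zone.incm360_agent missing "models: []"
```

The YAML is passed through untouched (as the documentation notes, indentation must be preserved), which is why the sketch interpolates it verbatim rather than reflowing it.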