Commit a22e12f (1 parent: 0337ea8)

Updated Metro SDK to 2026.0 release (#2056)

File tree

13 files changed: +240 additions, −72 deletions

metro-ai-suite/metro-sdk-manager/docs/_static/installer/config.js

Lines changed: 154 additions & 7 deletions
@@ -47,6 +47,10 @@ const CONFIG = {
       label: "latest",
       value: "latest"
     },
+    {
+      label: "2026.0",
+      value: "2026.0"
+    },
     {
       label: "2025.2",
       value: "2025.2"
@@ -106,6 +110,52 @@ const CONFIG = {
         "Edge AI Suites - Repo"
       ]
     },
+    {
+      when: {
+        SDK: "METRO_VISION",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      components: [
+        "DLStreamer",
+        "DLStreamer Pipeline Server",
+        "OpenVINO",
+        "OpenVINO Model Server",
+        "Edge AI Libraries - Repo",
+        "Edge AI Suites - Repo"
+      ]
+    },
+    {
+      when: {
+        SDK: "METRO_GENAI",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      components: [
+        "Audio Analyzer Microservice",
+        "Document Ingestion (pgvector)",
+        "Multimodal Embedding Serving",
+        "Visual Data Preparation For Retrieval",
+        "VLM OpenVINO Serving",
+        "Edge AI Libraries - Repo",
+        "Edge AI Suites - Repo"
+      ]
+    },
+    {
+      when: {
+        SDK: "VISUAL_AI_DEMO",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      components: [
+        "DLStreamer Pipeline Server",
+        "Node Red",
+        "Grafana",
+        "MediaMTX",
+        "MQTT Broker",
+        "Edge AI Suites - Repo"
+      ]
+    },
     {
       when: {
         SDK: "METRO_VISION",
@@ -185,6 +235,32 @@ const CONFIG = {
       },
       text: `curl -fsS https://raw.githubusercontent.com/open-edge-platform/edge-ai-suites/refs/heads/release-2025.2.0/metro-ai-suite/metro-sdk-manager/scripts/visual-ai-demo-kit.sh | bash`
     },
+    {
+      when: {
+        SDK: "METRO_VISION",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      text: `curl -fsS https://raw.githubusercontent.com/open-edge-platform/edge-ai-suites/refs/heads/release-2026.0.0/metro-ai-suite/metro-sdk-manager/scripts/metro-vision-ai-sdk.sh | bash`
+    },
+
+    {
+      when: {
+        SDK: "METRO_GENAI",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      text: `curl -fsS https://raw.githubusercontent.com/open-edge-platform/edge-ai-suites/refs/heads/release-2026.0.0/metro-ai-suite/metro-sdk-manager/scripts/metro-gen-ai-sdk.sh | bash`
+    },
+
+    {
+      when: {
+        SDK: "VISUAL_AI_DEMO",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      text: `curl -fsS https://raw.githubusercontent.com/open-edge-platform/edge-ai-suites/refs/heads/release-2026.0.0/metro-ai-suite/metro-sdk-manager/scripts/visual-ai-demo-kit.sh | bash`
+    },
     {
       when: {
         SDK: "METRO_VISION",
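The three 2026.0 install commands added in the hunk above reuse the release-branch URL pattern already used for 2025.2. As a hedged sketch (these helper and constant names are invented for illustration and are not part of the installer), the command string for a given version and script name can be derived like this:

```javascript
// Hypothetical helpers, not part of config.js: they only illustrate the URL
// pattern visible in the diff (version "2026.0" -> branch "release-2026.0.0").
const RAW_BASE = "https://raw.githubusercontent.com/open-edge-platform/edge-ai-suites";

function buildScriptUrl(version, script) {
  // Release branches append a trailing ".0" patch segment to the version.
  const branch = `release-${version}.0`;
  return `${RAW_BASE}/refs/heads/${branch}/metro-ai-suite/metro-sdk-manager/scripts/${script}.sh`;
}

function buildInstallCommand(version, script) {
  // curl -fsS: fail on HTTP errors, suppress the progress meter, still show errors.
  return `curl -fsS ${buildScriptUrl(version, script)} | bash`;
}

// Reproduces the METRO_VISION / UBUNTU / 2026.0 command added above.
console.log(buildInstallCommand("2026.0", "metro-vision-ai-sdk"));
```

Keeping the version-to-branch mapping in one place like this would avoid the repeated literal URLs, at the cost of matching the config's current copy-per-entry style.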
@@ -226,7 +302,7 @@ const CONFIG = {
         VERSION: "2025.2"
       },
       text: `Get Started`,
-      link: `https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/metro-sdk-manager/metro-vision-ai-sdk/get-started.html`
+      link: `https://docs.openedgeplatform.intel.com/2025.2/edge-ai-suites/metro-sdk-manager/metro-vision-ai-sdk/get-started.html`
     },
     {
       when: {
@@ -235,7 +311,7 @@ const CONFIG = {
         VERSION: "2025.2"
       },
       text: `Get Started`,
-      link: `https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/metro-sdk-manager/metro-gen-ai-sdk/get-started.html`
+      link: `https://docs.openedgeplatform.intel.com/2025.2/edge-ai-suites/metro-sdk-manager/metro-gen-ai-sdk/get-started.html`
     },
     {
       when: {
@@ -244,7 +320,34 @@ const CONFIG = {
         VERSION: "2025.2"
       },
       text: `Get Started`,
-      link: `https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/metro-sdk-manager/visual-ai-demo-kit/get-started.html`
+      link: `https://docs.openedgeplatform.intel.com/2025.2/edge-ai-suites/metro-sdk-manager/visual-ai-demo-kit/get-started.html`
+    },
+    {
+      when: {
+        SDK: "METRO_VISION",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      text: `Get Started`,
+      link: `https://docs.openedgeplatform.intel.com/2026.0/edge-ai-suites/metro-sdk-manager/metro-vision-ai-sdk/get-started.html`
+    },
+    {
+      when: {
+        SDK: "METRO_GENAI",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      text: `Get Started`,
+      link: `https://docs.openedgeplatform.intel.com/2026.0/edge-ai-suites/metro-sdk-manager/metro-gen-ai-sdk/get-started.html`
+    },
+    {
+      when: {
+        SDK: "VISUAL_AI_DEMO",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      text: `Get Started`,
+      link: `https://docs.openedgeplatform.intel.com/2026.0/edge-ai-suites/metro-sdk-manager/visual-ai-demo-kit/get-started.html`
     },
     {
       when: {
@@ -287,8 +390,8 @@ const CONFIG = {
         VERSION: "2025.2"
       },
       links: [
-        { text: "DLStreamer", url: "http://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dl-streamer/index.html" },
-        { text: "DLStreamer Pipeline Server", url: "https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/index.html" },
+        { text: "DLStreamer", url: "http://docs.openedgeplatform.intel.com/2025.2/edge-ai-libraries/dl-streamer/index.html" },
+        { text: "DLStreamer Pipeline Server", url: "https://docs.openedgeplatform.intel.com/2025.2/edge-ai-libraries/dlstreamer-pipeline-server/index.html" },
         { text: "OpenVINO", url: "https://docs.openvino.ai/2025/get-started.html" },
         { text: "OpenVINO Model Server", url: "https://docs.openvino.ai/2025/model-server/ovms_what_is_openvino_model_server.html" },
         { text: "Edge AI Libraries", url: "https://docs.openedgeplatform.intel.com/dev/ai-libraries.html"},
@@ -303,7 +406,7 @@ const CONFIG = {
       },
       links: [
         { text: "Audio Analyzer", url: "https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/audio-analyzer/index.html" },
-        { text: "Document Ingestion - pgvector", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/document-ingestion/pgvector/docs/get-started.md" },
+        { text: "Document Ingestion - pgvector", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/document-ingestion/pgvector/docs/user-guide/get-started.md" },
         { text: "Multimodal Embedding Serving", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/multimodal-embedding-serving/docs/user-guide/Overview.md" },
         { text: "Visual Data Preparation For Retrieval", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/visual-data-preparation-for-retrieval/vdms/docs/user-guide/Overview.md" },
         { text: "VLM OpenVINO Serving", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/vlm-openvino-serving/docs/user-guide/Overview.md" },
@@ -324,6 +427,50 @@ const CONFIG = {
         { text: "Edge AI Suites", url: "https://docs.openedgeplatform.intel.com/dev/ai-suite-metro.html"}
       ]
     },
+    {
+      when: {
+        SDK: "METRO_VISION",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      links: [
+        { text: "DLStreamer", url: "http://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dl-streamer/index.html" },
+        { text: "DLStreamer Pipeline Server", url: "https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/index.html" },
+        { text: "OpenVINO", url: "https://docs.openvino.ai/2025/get-started.html" },
+        { text: "OpenVINO Model Server", url: "https://docs.openvino.ai/2025/model-server/ovms_what_is_openvino_model_server.html" },
+        { text: "Edge AI Libraries", url: "https://docs.openedgeplatform.intel.com/dev/ai-libraries.html"},
+        { text: "Edge AI Suites", url: "https://docs.openedgeplatform.intel.com/dev/ai-suite-metro.html"}
+      ]
+    },
+    {
+      when: {
+        SDK: "METRO_GENAI",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      links: [
+        { text: "Audio Analyzer", url: "https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/audio-analyzer/index.html" },
+        { text: "Document Ingestion - pgvector", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/document-ingestion/pgvector/docs/user-guide/get-started.md" },
+        { text: "Multimodal Embedding Serving", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/multimodal-embedding-serving/docs/user-guide/Overview.md" },
+        { text: "Visual Data Preparation For Retrieval", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/visual-data-preparation-for-retrieval/vdms/docs/user-guide/Overview.md" },
+        { text: "VLM OpenVINO Serving", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/vlm-openvino-serving/docs/user-guide/Overview.md" },
+        { text: "Edge AI Libraries", url: "https://docs.openedgeplatform.intel.com/dev/ai-libraries.html"},
+        { text: "Edge AI Suites", url: "https://docs.openedgeplatform.intel.com/dev/ai-suite-metro.html"}
+      ]
+    },
+    {
+      when: {
+        SDK: "VISUAL_AI_DEMO",
+        OP_SYSTEM: "UBUNTU",
+        VERSION: "2026.0"
+      },
+      links: [
+        { text: "DLStreamer", url: "http://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dl-streamer/index.html" },
+        { text: "DLStreamer Pipeline Server", url: "https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/index.html" },
+        { text: "Edge AI Libraries", url: "https://docs.openedgeplatform.intel.com/dev/ai-libraries.html"},
+        { text: "Edge AI Suites", url: "https://docs.openedgeplatform.intel.com/dev/ai-suite-metro.html"}
+      ]
+    },
     {
       when: {
         SDK: "METRO_VISION",
@@ -347,7 +494,7 @@ const CONFIG = {
       },
       links: [
         { text: "Audio Analyzer", url: "https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/audio-analyzer/index.html" },
-        { text: "Document Ingestion - pgvector", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/document-ingestion/pgvector/docs/get-started.md" },
+        { text: "Document Ingestion - pgvector", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/document-ingestion/pgvector/docs/user-guide/get-started.md" },
         { text: "Multimodal Embedding Serving", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/multimodal-embedding-serving/docs/user-guide/Overview.md" },
         { text: "Visual Data Preparation For Retrieval", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/visual-data-preparation-for-retrieval/vdms/docs/user-guide/Overview.md" },
         { text: "VLM OpenVINO Serving", url: "https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/vlm-openvino-serving/docs/user-guide/Overview.md" },
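Every entry this commit adds to config.js is keyed by a `when` object (SDK, OP_SYSTEM, VERSION). A minimal sketch of how such entries could be resolved against the user's current selections follows; this is an assumption about the installer page's lookup behavior, not its actual code, and the sample data is trimmed for brevity:

```javascript
// Trimmed sample entries in the same shape as the config.js additions above.
const entries = [
  {
    when: { SDK: "METRO_VISION", OP_SYSTEM: "UBUNTU", VERSION: "2026.0" },
    components: ["DLStreamer", "OpenVINO"]
  },
  {
    when: { SDK: "METRO_VISION", OP_SYSTEM: "UBUNTU", VERSION: "2025.2" },
    components: ["DLStreamer"]
  }
];

// An entry matches when every key in its `when` object equals the
// corresponding value in the user's selection; the first match wins.
function resolve(entries, selection) {
  return entries.find(entry =>
    Object.entries(entry.when).every(([key, value]) => selection[key] === value)
  );
}

const match = resolve(entries, {
  SDK: "METRO_VISION",
  OP_SYSTEM: "UBUNTU",
  VERSION: "2026.0"
});
console.log(match.components); // -> [ 'DLStreamer', 'OpenVINO' ]
```

Because the first match wins, this keeps each version's component list, install command, and link set independent: adding a 2026.0 release is purely additive and cannot disturb the 2025.2 entries.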

metro-ai-suite/metro-sdk-manager/docs/user-guide/metro-gen-ai-sdk/get-started.md

Lines changed: 26 additions & 7 deletions
@@ -43,40 +43,59 @@ Navigate to the pre-installed question-answering application directory:
 cd $HOME/metro/edge-ai-libraries/sample-applications/chat-question-and-answer
 ```
 
-### Step 2: Configure Environment and Dependencies
+### Step 2: Setup Model Download Service
+
+Configure and start the Model Download service to manage LLM and embedding model downloads:
+
+```bash
+cd $HOME/metro/edge-ai-libraries/microservices/model-download
+export REGISTRY="intel/"
+export TAG=latest
+export HUGGINGFACEHUB_API_TOKEN=<your-huggingface-token>
+source scripts/run_service.sh up --plugins openvino --model-path $HOME/metro/models/
+```
+
+> **Note:** Keep this terminal open while the model download service is running. Open a new terminal to continue with the next steps.
+
+Update the `<your-huggingface-token>` to your Access Token from Hugging Face. To learn more, follow this [guide](https://huggingface.co/docs/hub/en/security-tokens).
+
+### Step 3: Configure Environment and Dependencies
 
 Set up the Python virtual environment and install required dependencies:
 
 ```bash
+cd $HOME/metro/edge-ai-libraries/sample-applications/chat-question-and-answer
 # Configure application environment variables
 export HUGGINGFACEHUB_API_TOKEN=<your-huggingface-token>
 export LLM_MODEL=Qwen/Qwen2.5-7B-Instruct
 export EMBEDDING_MODEL_NAME=Alibaba-NLP/gte-large-en-v1.5
 export RERANKER_MODEL=BAAI/bge-reranker-base
 export DEVICE="CPU"
 export REGISTRY="intel/"
-export TAG=2.0.0
+export TAG=latest
+export MODEL_DOWNLOAD_HOST=localhost
+export MODEL_DOWNLOAD_PORT=8200
 source setup.sh llm=OVMS embed=OVMS
 ```
-Update the <your-huggingface-token> to your Access Token from Hugging Face. To know more, follow this [guide](https://huggingface.co/docs/hub/en/security-tokens).
 
-### Step 3: Deploy the Application
+### Step 4: Deploy the Application
 
 Start the complete Gen AI application stack using Docker Compose:
 
 ```bash
+export ALLOWED_HOSTS="*.intel.com,en.wikipedia.org,*.wikipedia.org,*.github.com"
 docker compose up
 ```
 
-### Step 4: Verify Deployment Status
+### Step 5: Verify Deployment Status
 
-Check that all application components are running correctly:
+Run below command in another terminal to check that all application components are running correctly:
 
 ```bash
 docker ps
 ```
 
-### Step 5: Access the Application Interface
+### Step 6: Access the Application Interface
 
 Open a web browser and navigate to the application dashboard:
 
metro-ai-suite/metro-sdk-manager/docs/user-guide/metro-vision-ai-sdk/get-started.md

Lines changed: 2 additions & 2 deletions
@@ -160,9 +160,9 @@ Profiling and monitoring performance of Metro Vision AI workloads using command-
 \- Comprehensive documentation for Intel's GStreamer-based video analytics framework
 - [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/2026.0/edge-ai-libraries/dlstreamer-pipeline-server/index.html)
 \- RESTful microservice architecture documentation for scalable video analytics deployment
-- [OpenVINO](https://docs.openvino.ai/2025/get-started.html)
+- [OpenVINO](https://docs.openvino.ai/2026/get-started.html)
 \- Complete reference for Intel's cross-platform inference optimization toolkit
-- [OpenVINO Model Server](https://docs.openvino.ai/2025/model-server/ovms_what_is_openvino_model_server.html)
+- [OpenVINO Model Server](https://docs.openvino.ai/2026/model-server/ovms_what_is_openvino_model_server.html)
 \- Model serving infrastructure documentation for production deployments
 - [Edge AI Libraries](https://docs.openedgeplatform.intel.com/2026.0/ai-libraries.html)
 \- Comprehensive development toolkit documentation and API references

0 commit comments