
[Samples]: Add VLM alerts sample in python 1/2#620

Merged
oonyshch merged 35 commits into main from oonyshch/vlm_alerts
Feb 27, 2026
Conversation

@oonyshch
Contributor

@oonyshch oonyshch commented Feb 17, 2026

Description

This PR is intended to:

  • Add vlm_alerts.py sample application
  • Download video if URL provided
  • Export HuggingFace VLM to OpenVINO IR using optimum-cli
  • Build and run GStreamer pipeline with gvagenai
  • Write JSONL metadata output
  • Add requirements.txt for exporter stack
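
The download and export steps listed above could be sketched roughly as follows. This is a hypothetical illustration, not code from the PR: the helper names and the `--weight-format` choice are assumptions, while `optimum-cli export openvino` is the documented Optimum-Intel entry point for producing OpenVINO IR.

```python
"""Hypothetical sketch of the video-download and VLM-export steps.

Helper names and flags are illustrative assumptions, not taken from the PR.
"""
import subprocess
import urllib.request
from pathlib import Path


def fetch_video(source: str, dest: Path) -> Path:
    # Download the input only when a URL is given; local paths pass through.
    if source.startswith(("http://", "https://")):
        urllib.request.urlretrieve(source, dest)
        return dest
    return Path(source)


def export_vlm(model_id: str, out_dir: Path) -> None:
    # Export a HuggingFace VLM to OpenVINO IR via optimum-cli.
    # The int4 weight format here is only an example choice.
    subprocess.run(
        ["optimum-cli", "export", "openvino",
         "--model", model_id,
         "--weight-format", "int4",
         str(out_dir)],
        check=True,
    )
```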

Tested by running all of the above components locally on ARL-H.
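
For the pipeline step, a pipeline string might be assembled as in the sketch below. The `gvagenai` property names (`model-path`, `prompt`) are assumptions for illustration only; `gvametapublish` with `method=file` and `file-format=json-lines` is the usual DL Streamer way to produce JSONL metadata output.

```python
"""Illustrative gst-launch-style pipeline string; gvagenai property names
are assumptions, not confirmed by this PR."""


def build_pipeline(video: str, model_dir: str, prompt: str, out_jsonl: str) -> str:
    # gvametapublish appends one JSON record per frame to out_jsonl,
    # which yields the JSONL metadata output mentioned above.
    return (
        f"filesrc location={video} ! decodebin ! "
        f'gvagenai model-path={model_dir} prompt="{prompt}" ! '
        f"gvametapublish method=file file-format=json-lines file-path={out_jsonl} ! "
        f"fakesink sync=false"
    )
```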

Checklist:

  • I agree to use the MIT license for my code changes.
  • I have not introduced any 3rd party components incompatible with MIT.
  • I have not included any company confidential information, trade secret, password or security token.
  • I have performed a self-review of my code.

Next steps:

  • Modify the genai.cpp file to support displaying VLM inference result on the obtained video.
  • Add support for multiple widths and heights of the processed video.
  • Add list of suggested VLMs.

@oonyshch oonyshch changed the title from "samples: add Python VLM alerts sample using HF Optimum + gvagenai" to "Add VLM alerts sample in python" on Feb 17, 2026
yunowo and others added 7 commits February 26, 2026 15:15

In commit d6b4de5, the text background should be drawn by default with the default display configuration, but the drawing behavior was incorrect when the user did not specify _displ_cfg; this commit fixes that bug.

Signed-off-by: Walid <walid.aly@intel.com>

* Add disable-proxy property to gvametapublish for MQTT connections

* Update gvametaconvert documentation to clarify timestamp properties with timecodestamper links

* Remove disable-proxy property from gvametapublish, add warning in the documentation

* Remove a header

* Revert code changes

---------

Co-authored-by: Tomasz Bujewski <tomasz.bujewski@intel.com>
* Update OpenVINO for Fedora and Ubuntu.

* Update openvino version

* fix windows path

* fix openvino version

* Update OpenVINO references to version 2026.0 across various files

* Update OpenVINO dependency to version 2026.0.0 in control files

* Update OpenVINO dependency to version 2026.0.0 in intel-dlstreamer.spec

* Update OpenVINO repository source to version 2026 in Dockerfile

* Update OpenVINO references and dependencies to version 2026.0.0 in scripts and Dockerfile

* Update copyright years to 2026 in various files

* fix openvino repo paths

* fix repo path in entrypoint script

* Add missing headers

* remove deprecated openvino.runtime namespace and use the openvino namespace directly

* Update OpenVINO version for Windows.

---------

Co-authored-by: nszczygl9 <118973656+nszczygl9@users.noreply.github.com>
@oonyshch oonyshch force-pushed the oonyshch/vlm_alerts branch from a8acd38 to b13008e on February 26, 2026 14:20
@oonyshch oonyshch changed the title from "Add VLM alerts sample in python" to "Add VLM alerts sample in python 1/2" on Feb 26, 2026
@oonyshch oonyshch changed the title from "Add VLM alerts sample in python 1/2" to "[Samples]: Add VLM alerts sample in python 1/2" on Feb 26, 2026
@oonyshch oonyshch marked this pull request as ready for review February 26, 2026 20:30
-Dorc=disabled
-Dgpl=disabled
-Dpython=enabled
-Dpython=enabled
Contributor Author


I know this is out of scope, just noticed misalignment in formatting.

@oonyshch oonyshch requested a review from tjanczak February 27, 2026 09:48
Contributor

@walidbarakat walidbarakat left a comment


Looks good. If this script will be integrated into an automated pipeline, or if we need to showcase our approach to error resiliency, we can add robust error and exception handling; otherwise there is no need.
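
As a sketch of what that hardening could look like, a minimal wrapper around the external-tool calls might be (hypothetical helper, not part of this PR):

```python
"""Hypothetical error-handling wrapper for external tools such as
optimum-cli or gst-launch; names are illustrative, not from the PR."""
import subprocess
import sys


def run_checked(cmd: list[str]) -> None:
    # Fail with a readable message instead of a raw traceback when an
    # external tool is missing or exits with a non-zero status.
    try:
        subprocess.run(cmd, check=True)
    except FileNotFoundError:
        sys.exit(f"required tool not found: {cmd[0]}")
    except subprocess.CalledProcessError as e:
        sys.exit(f"command failed with exit code {e.returncode}: {' '.join(cmd)}")
```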

@oonyshch oonyshch merged commit 088ebb1 into main Feb 27, 2026
29 of 35 checks passed
@oonyshch oonyshch deleted the oonyshch/vlm_alerts branch February 27, 2026 12:02

10 participants