💕 Power Up Patient Monitoring with the Health and Life Sciences AI Suite on a Single Intel Edge Platform #2055
stevenhoenisch started this conversation in General
The preview release of the Health and Life Sciences AI Suite and its Multi-Modal Patient Monitoring app demonstrates how to run multiple AI workloads concurrently on a single Intel‑powered edge device without a discrete GPU so that you, as a medical AI developer, can see Intel® Core™ Ultra processors in action.
The Multi-Modal Patient Monitoring application showcases how a single Intel-powered edge system can simultaneously run several AI workloads within one integrated dashboard. It proves that heterogeneous workloads, from 3D human pose estimation with joint tracking to heart and respiratory rate monitoring, AI-based ECG analysis with 12-lead classification, and medical device simulation, can coexist efficiently on one platform without compromising performance or stability. By accelerating multi-modal AI pipelines with the OpenVINO™ toolkit, the Health and Life Sciences AI Suite empowers you to drive high-performance, AI-powered health care on an Intel® Core™ Ultra platform using CPUs, a built-in GPU, and a Neural Processing Unit (NPU).
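Per-workload device targeting can be sketched as follows. This is a minimal, stdlib-only illustration, not the app's actual code: the workload names and the environment variables other than RPPG_DEVICE (which the app does expose) are hypothetical, and the device strings follow the OpenVINO-style names CPU, GPU, and NPU.

```python
import os

# Inference devices exposed by an Intel Core Ultra platform
# (OpenVINO-style device names).
SUPPORTED_DEVICES = {"CPU", "GPU", "NPU"}

def resolve_device(env_var: str, default: str = "CPU") -> str:
    """Read a per-workload device target (e.g. RPPG_DEVICE) from the
    environment, falling back to CPU for unset or unknown values."""
    device = os.environ.get(env_var, default).upper()
    return device if device in SUPPORTED_DEVICES else default

# Hypothetical mapping of the four monitoring workloads to env vars;
# only RPPG_DEVICE is documented by the app, the rest are illustrative.
targets = {
    "pose_estimation": resolve_device("POSE_DEVICE"),
    "rppg_vitals": resolve_device("RPPG_DEVICE"),
    "ecg_analysis": resolve_device("ECG_DEVICE"),
    "device_simulation": resolve_device("SIM_DEVICE"),
}
print(targets)
```

Spreading the workloads across the CPU, iGPU, and NPU in this fashion is what lets all four pipelines share one platform.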
Multi-Modal Patient Monitoring Application
The Multi-Modal Patient Monitoring application shows off how the suite organizes workflows tailored for healthcare and life sciences and eases integration with medical devices. As a collection of healthcare-focused AI applications, libraries, and benchmarking tools, the suite helps medical AI developers, original equipment manufacturers (OEMs), and original design manufacturers (ODMs) build AI-powered patient monitoring solutions faster.
With the Multi-Modal Patient Monitoring application, you can view four key patient monitoring workloads side-by-side through a graphical dashboard, with each workload displaying its own output stream. Outputs from these workloads are consolidated into a two-by-two layout, showing each stream in its own quadrant while sharing a single Intel Core Ultra CPU, iGPU, and NPU platform. Crucially, consolidating multi-modal AI on one cost-effective edge system helps reduce the bill of materials and simplifies deployment.
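The quadrant composition can be illustrated with a toy sketch. This is pure Python with dummy numeric "frames" standing in for the four workload streams; the real dashboard composes image buffers, so the function name and frame representation here are illustrative only.

```python
from typing import List

Frame = List[List[int]]  # a frame as rows of pixel values (illustrative)

def tile_2x2(frames: List[Frame]) -> Frame:
    """Compose four equally sized frames into one two-by-two grid:
    frames[0] top-left, frames[1] top-right, frames[2] bottom-left,
    frames[3] bottom-right, mirroring the dashboard's quadrant layout."""
    assert len(frames) == 4
    top = [left + right for left, right in zip(frames[0], frames[1])]
    bottom = [left + right for left, right in zip(frames[2], frames[3])]
    return top + bottom

# Four 2x2 dummy "frames" standing in for the four workload streams.
a = [[1, 1], [1, 1]]; b = [[2, 2], [2, 2]]
c = [[3, 3], [3, 3]]; d = [[4, 4], [4, 4]]
grid = tile_2x2([a, b, c, d])
print(grid)  # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```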
Requirements and Setup for Demonstrating the Multi-Modal AI Capabilities of Intel Core Ultra Platforms
The patient monitoring application demonstrates the multi-modal AI capabilities of Intel Core Ultra platforms. End-to-end setup and launch (with a single command) is estimated at about 30 minutes on Ubuntu 24.04 with containerized workloads; see the system requirements. Although secure provisioning (for example, with Polaris Peak integration) isn't part of the initial implementation, the architecture is intended to be extensible for future security integrations.
At a high level, the system is composed of several microservices that work together to ingest patient signals and video, run AI models on Intel hardware, aggregate results, and expose them to a user interface for clinicians. For details and an architecture diagram, see How It Works.
After you set up the Multi-Modal Patient Monitoring app, you can experiment with different RPPG_DEVICE values to compare CPU, GPU, and NPU behavior. You can also replace the sample video or models with your own assets by updating the models and videos volumes and configuration; see, for instance, the Configure Hardware Target section of the Get Started guide.
Conclusion
In summary, the preview release of the Health and Life Sciences AI Suite and its Multi-Modal Patient Monitoring app demonstrates that heterogeneous AI workloads, accelerated by the OpenVINO toolkit, can run concurrently on a single Intel Core Ultra edge platform without a discrete GPU.
💕 Check out the Health and Life Sciences AI Suite in its GitHub repository and power up Multi-Modal Patient Monitoring with the Get Started guide.