
Conversation

@alltheseas (Contributor) commented Oct 17, 2025

Summary

  • Adds the new notedeck_livestreams crate with HLS inline playback, stream filtering, and participant/status views,
    plus the icon (sparkle.svg).
  • Wires the Livestreams UI into Notedeck: updates cargo manifests, sidebar navigation, and the app enum so users can
    launch/return from the livestream viewer.
  • Documents the Linux VAAPI pitfall: misconfigured VAAPI can produce blank video while audio keeps playing; README now
    details environment overrides to force software decode.
  • No Android-specific changes were required—the Android build uses the platform’s MediaCodec pipeline, so the VAAPI
    workaround is Linux-only.

VAAPI Lessons Learned


  - What broke: On some Linux setups, GStreamer auto-selects VAAPI (hardware) decoders such as `vaapih264dec`/
    `vaapipostproc`. When the driver stack is unhappy, those elements produce broken output (leaking desktop contents
    or returning all-zero frames) while audio keeps playing. That's why both Notedeck and `gst-launch-1.0` showed a
    black video even though the audio stream was fine.
  - How we proved it: Forcing a pure software pipeline (`videoconvert ! ximagesink`) immediately produced correct video,
    and the inline player started working once we kept those overrides in place. `ffplay` also worked, confirming the
    stream itself was healthy.
  - Where it happens: Only on Linux desktops using VAAPI. Android builds rely on MediaCodec and are unaffected.

  ———

  ### Quick way to boot without VAAPI

  In the shell where you launch Notedeck (or gst-launch), disable VAAPI before starting the app:

  ```sh
  export GST_VAAPI_DISABLE=1
  export LIBVA_DRIVER_NAME=dummy
  export GST_PLUGIN_FEATURE_RANK=vaapidecodebin:0,vaapih264dec:0,vaapipostproc:0,vaapisink:0

  RUST_LOG=notedeck_livestreams=debug \
  cargo run -p notedeck_chrome --release --features inline-playback -- \
    --debug --datapath ./target/
  ```

  - `GST_VAAPI_DISABLE=1` stops the GStreamer VAAPI plugins from loading.
  - `LIBVA_DRIVER_NAME=dummy` keeps libva from finding hardware drivers.
  - `GST_PLUGIN_FEATURE_RANK=…:0` demotes the VAAPI elements so GStreamer falls back to software decode.
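  If you'd rather not depend on the launching shell, the same overrides can be applied from Rust at process startup, before GStreamer initializes. A minimal sketch (assuming it runs before any GStreamer init and before threads are spawned; this is not existing Notedeck code):

  ```rust
  use std::env;

  /// Demote VAAPI for this process. Must run before GStreamer
  /// initialization, because the variables are read at plugin load time.
  fn disable_vaapi() {
      // `set_var` is marked unsafe on newer Rust editions because it
      // mutates process-global state; it is fine here, before any
      // threads exist. On older editions the `unsafe` block only
      // produces an unused_unsafe warning.
      unsafe {
          env::set_var("GST_VAAPI_DISABLE", "1");
          env::set_var("LIBVA_DRIVER_NAME", "dummy");
          env::set_var(
              "GST_PLUGIN_FEATURE_RANK",
              "vaapidecodebin:0,vaapih264dec:0,vaapipostproc:0,vaapisink:0",
          );
      }
  }

  fn main() {
      disable_vaapi();
      assert_eq!(env::var("GST_VAAPI_DISABLE").as_deref(), Ok("1"));
  }
  ```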

  You can test the same environment with `gst-launch-1.0 playbin … video-sink='videoconvert ! ximagesink'` to confirm
  video renders.

  ———

  ### Booting consistently without VAAPI

  1. Shell profile/script: Add the exports to a helper script or your shell rc file. For example, create
     `start-notedeck.sh`:

     ```sh
     #!/bin/sh
     export GST_VAAPI_DISABLE=1
     export LIBVA_DRIVER_NAME=dummy
     export GST_PLUGIN_FEATURE_RANK=vaapidecodebin:0,vaapih264dec:0,vaapipostproc:0,vaapisink:0

     RUST_LOG=notedeck_livestreams=debug \
     cargo run -p notedeck_chrome --release --features inline-playback -- \
       --debug --datapath ./target/
     ```

     Make it executable (`chmod +x start-notedeck.sh`) and run `./start-notedeck.sh` whenever you want to launch.
  2. System-wide option: If you package Notedeck or run it via a desktop entry, add those environment variables to the
     launch script/service so every invocation uses software decode by default.
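     For a desktop entry, the overrides can be prefixed onto the Exec line via `env`. A hypothetical `.desktop` fragment (the `notedeck` binary name and path are illustrative; adjust to your packaging):

     ```ini
     # Hypothetical notedeck.desktop fragment
     [Desktop Entry]
     Type=Application
     Name=Notedeck
     Exec=env GST_VAAPI_DISABLE=1 LIBVA_DRIVER_NAME=dummy GST_PLUGIN_FEATURE_RANK=vaapidecodebin:0,vaapih264dec:0,vaapipostproc:0,vaapisink:0 notedeck
     ```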
  3. Verifying: Run `gst-launch-1.0 playbin uri=… video-sink='videoconvert ! ximagesink'` from the same environment.
     If you see live video, the overrides are active.

  Keep these variables in place until you confirm updated drivers/GStreamer versions make VAAPI reliable again.

@alltheseas (Contributor, Author)

Has not been tested on:

  • Android
  • macOS
  • Windows

@alltheseas (Contributor, Author)

@jb55 mentions that GStreamer is not the correct approach. @jb55, can you advise on your preferred approach?

@alltheseas (Contributor, Author)

Got it — if GStreamer isn’t a fit, you’ve still got solid ways to add **HLS (.m3u8)** to Notedeck across Android, Linux, macOS, and Windows. Below are pragmatic paths devs commonly take, plus what I’d pick for Notedeck.

---

# 1) “Use the OS player” (fastest, best HW accel, smallest surface)

**Idea:** call the platform’s native media stack for HLS; wrap each in a thin Rust layer; share one Rust API.

* **Android → ExoPlayer (Media3)**
  ExoPlayer has first-class HLS support (including fMP4/TS, multi-rendition, subtitles). You embed a `PlayerView`/`SurfaceView` via JNI and pipe lifecycle + surface events from your Rust UI container. ([Android Developers](https://developer.android.com/media/media3/exoplayer/supported-formats))

* **macOS → AVFoundation/AVPlayer**
  Apple’s AVPlayer plays HLS natively (Apple invented HLS). Use `AVPlayerLayer` in an NSView and bridge via `objc`/`objc2`. HW decode via VideoToolbox “just works.” ([Apple Developer](https://developer.apple.com/documentation/avfoundation/media-playback))

* **Windows → Media Foundation (IMFMediaEngine)**
  Media Foundation’s `IMFMediaEngine` is the modern way to host an HTML5-style media engine (play, pause, seek) in native apps; it supports HLS via the platform’s MF pipeline. Use the `windows` crate for COM. ([Microsoft Learn](https://learn.microsoft.com/en-us/windows/win32/directshow/audio-capabilities))

* **Linux → (pick one)**

  * **libmpv**: embeddable player with HLS via FFmpeg, strong HW accel backends (VAAPI/VDPAU), simple texture callbacks. Good balance of power and integration. ([mpv.io](https://mpv.io/manual/stable/))
  * **FFmpeg directly**: roll your own demux/decode (HLS demuxer in libavformat) and render frames with wgpu/GL. Maximum control, maximum work. ([FFmpeg](https://ffmpeg.org/ffmpeg-formats.html))

**Why this path:** best UX and battery on each OS, minimal third-party dependencies (especially on Android/macOS), and straightforward DRM path later (Widevine/FairPlay) if you ever need it.

---

# 2) “Embed a cross-platform engine” (one codepath for desktop)

* **libmpv** (recommended if you want a unified desktop stack)
  Mature, battle-tested, HLS via FFmpeg, HW accel everywhere, small API surface; Rust bindings exist (`libmpv`, `mpv`, `mpv-client`). You can get decoded frames or let it draw into a window/texture. ([Docs.rs](https://docs.rs/libmpv))

* **libVLC**
  Also cross-platform with HLS support; Rust bindings exist (`vlc-rs`). API is higher-level than FFmpeg; can be simpler to drop in. (Note: libVLC is **LGPL 2.1+**, which is often friendlier to link than GPL; check the modules you enable.) ([GitLab](https://code.videolan.org/videolan/vlc-rs))

**Why this path:** one implementation for Linux/macOS/Windows; good if Notedeck’s UI can host a native child view for the player.

---

# 3) “WebView + hls.js” (lowest native code, great for quick wins)

* Embed a WebView (Edge/WebView2 on Windows, WKWebView on macOS, Android WebView) and run **hls.js** — it transmuxes HLS to MSE/HTML5 `<video>`. Works well for many public HLS streams (CORS-friendly, fMP4 streams work best). ([GitHub](https://github.com/video-dev/hls.js))

**Trade-offs:** added web runtime, trickier fullscreen/input/latency; DRM not trivial; but it’s fast to ship and very portable.

---

# 4) “Full custom with FFmpeg” (maximum control)

* Parse playlists yourself (e.g., the `hls_m3u8` crate) or rely entirely on libavformat’s HLS demuxer, decode with libavcodec, and present frames via wgpu/GL. HW acceleration paths: VAAPI/VDPAU (Linux), D3D11VA (Windows), VideoToolbox (macOS), MediaCodec (Android). You own ABR, buffering, subtitle tracks, A/V sync, etc. ([FFmpeg](https://ffmpeg.org/ffmpeg-formats.html))

**Trade-offs:** biggest engineering effort (seeking, ABR, audio device handling, color management, subtitle renderers…).
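If you do parse playlists yourself, the core is separating tag lines from segment URIs. A minimal, illustrative sketch in plain Rust (a real player would use `hls_m3u8` or libavformat instead, and must also handle master/variant playlists, encryption tags, and relative URL resolution):

```rust
/// Extract the segment URIs from an HLS media playlist.
/// Tag and comment lines start with '#'; everything else is a URI.
fn segment_uris(playlist: &str) -> Vec<String> {
    playlist
        .lines()
        .map(str::trim)
        .filter(|l| !l.is_empty() && !l.starts_with('#'))
        .map(str::to_owned)
        .collect()
}

fn main() {
    let playlist = "\
#EXTM3U
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
seg0.ts
#EXTINF:6.0,
seg1.ts
#EXT-X-ENDLIST";
    assert_eq!(segment_uris(playlist), ["seg0.ts", "seg1.ts"]);
}
```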

---

## What I’d ship for Notedeck

**Hybrid, minimal-risk plan:**

1. **Android:** Media3/ExoPlayer via JNI (official, HLS-native, best perf). ([Android Developers](https://developer.android.com/media/media3/exoplayer/supported-formats))
2. **macOS:** AVPlayer via AVFoundation (native HLS, essentially zero glue). ([Apple Developer](https://developer.apple.com/documentation/avfoundation/media-playback))
3. **Windows:** Media Foundation’s **IMFMediaEngine** host control. ([Microsoft Learn](https://learn.microsoft.com/en-us/windows/win32/directshow/audio-capabilities))
4. **Linux:** **libmpv** embed (least glue for robust HLS across distros). ([mpv.io](https://mpv.io/manual/stable/))

All 4 hide behind a single Rust trait, selected with `cfg(target_os)` features. If you’d rather avoid per-OS code on desktop, pick **libmpv** for macOS+Windows+Linux and keep **ExoPlayer** for Android.

---

## Sketch of the Rust API surface

```rust
pub trait VideoPlayer {
    fn new() -> Self where Self: Sized;
    fn set_source(&mut self, url: &str); // m3u8
    fn play(&mut self);
    fn pause(&mut self);
    fn seek(&mut self, seconds: f64);
    fn set_volume(&mut self, vol: f32);
    fn set_surface(&mut self, native_handle: RawWindowHandle); // or texture callback
}

#[cfg(target_os = "android")]
pub type PlatformPlayer = exoplayer::Player;

#[cfg(target_os = "macos")]
pub type PlatformPlayer = avfoundation::Player;

#[cfg(target_os = "windows")]
pub type PlatformPlayer = media_foundation::Player;

#[cfg(target_os = "linux")]
pub type PlatformPlayer = mpv_embed::Player;
```
  • Android glue: JNI (`jni` crate) to create ExoPlayer, bind to a `SurfaceView`/`TextureView`, forward callbacks. (Android Developers)
  • macOS glue: `objc`/`objc2` to wrap AVPlayer/AVPlayerLayer; host inside your NSView. (Apple Developer)
  • Windows glue: `windows` crate to activate `IMFMediaEngineClassFactory`, render to an HWND/DX surface. (Microsoft Learn)
  • Linux glue: `libmpv` crate (docs.rs/libmpv) to create a player, set `vo=gpu`, and request either direct window embedding or a GPU texture. (Docs.rs)
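As a compile-time sanity check, the trait-plus-alias pattern can be exercised with a no-op backend before any real glue exists. This is an illustrative, self-contained stub (`NullPlayer` and the method set are hypothetical, not Notedeck APIs):

```rust
/// Minimal playback interface; real backends would also take a
/// surface/texture handle and report events.
pub trait VideoPlayer {
    fn set_source(&mut self, url: &str);
    fn play(&mut self);
    fn is_playing(&self) -> bool;
}

/// No-op backend so the sketch compiles without any media library.
#[derive(Default)]
pub struct NullPlayer {
    source: Option<String>,
    playing: bool,
}

impl VideoPlayer for NullPlayer {
    fn set_source(&mut self, url: &str) {
        self.source = Some(url.to_owned());
    }
    fn play(&mut self) {
        // Only "plays" once a source has been set.
        self.playing = self.source.is_some();
    }
    fn is_playing(&self) -> bool {
        self.playing
    }
}

// In a real build each OS would alias its own backend, e.g.:
// #[cfg(target_os = "linux")] pub type PlatformPlayer = mpv_embed::Player;
pub type PlatformPlayer = NullPlayer;

fn main() {
    let mut player = PlatformPlayer::default();
    player.set_source("https://example.com/stream.m3u8");
    player.play();
    assert!(player.is_playing());
}
```

Call sites only ever name `PlatformPlayer`, so swapping a backend in or out is a cfg change, not a UI change.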

Notes & gotchas

  • Licensing:

    • libVLC is LGPL 2.1+ (dynamic linking is usually fine for non-GPL apps; verify the modules you enable). (vlc-user-documentation.readthedocs.io)
    • libmpv’s licensing has GPL heritage; ensure your distribution model is compatible (or keep it desktop-only where that’s acceptable). (GitHub)
  • DRM (Widevine/FairPlay): only feasible with the native stacks (ExoPlayer, AVFoundation). If you need DRM later, that argues against hls.js/libmpv/libVLC. (Android Developers)

  • ABR & latency: native stacks handle adaptive bitrate well; WebView + hls.js is decent but may add latency. (GitHub)

  • Headers/CORS: libVLC/mpv/FFmpeg sometimes need special header handling for HLS segment requests; be mindful if relays require referrers/tokens. (GitLab)


TL;DR pick list

  • Want “it just works” on each OS with best perf?
    ExoPlayer + AVFoundation + Media Foundation + libmpv (Linux).

  • Want one engine across desktop?
    libmpv (desktop) + ExoPlayer (Android).

  • Want minimal native code & fastest prototype?
    WebView + hls.js.

If you want, I can sketch the JNI/AVF/MF/mpv binding crates, Cargo features, and a tiny demo column in Notedeck that renders an HLS player tile for each platform.

@alltheseas (Contributor, Author)

@v0l appreciate your eyes ser 💪

@alltheseas changed the title from “Notedeck livestream app” to “Proof of concept GStreamer Notedeck livestream app” on Oct 19, 2025
@alltheseas (Contributor, Author)

I will try porting the livestream app to the embedded media player that enables native hardware acceleration, #1172 (i.e., removing GStreamer).
