
Conversation

ericholguin
Collaborator

This PR adds a Live Stream timeline.

Users will be able to see all current live streams (kind 30311). Users can also see the chats (kind 1311) associated with a live stream.

Changelog-Added: Live Stream Timeline View
Changelog-Added: Live Stream View
Changelog-Added: Live Chat View

Summary

[Please provide a summary of the changes in this PR.]

Checklist

  • I have read (or I am familiar with) the Contribution Guidelines
  • I have tested the changes in this PR
  • I have opened or referred to an existing GitHub issue related to this change.
  • My PR is either small, or I have split it into smaller logical commits that are easier to review
  • I have added the signoff line to all my commits. See Signing off your work
  • I have added appropriate changelog entries for the changes in this PR. See Adding changelog entries
    • I do not need to add a changelog entry. Reason: [Please provide a reason]
  • I have added appropriate Closes: or Fixes: tags in the commit messages wherever applicable, or made sure those are not needed. See Submitting patches

Test report

Please provide a test report for the changes in this PR. You can use the template below, but feel free to modify it as needed.

Device: [Please specify the device you used for testing]

iOS: [Please specify the iOS version you used for testing]

Damus: [Please specify the Damus version or commit hash you used for testing]

Setup: [Please provide a brief description of the setup you used for testing, if applicable]

Steps: [Please provide a list of steps you took to test the changes in this PR]

Results:

  • PASS
  • Partial PASS
    • Details: [Please provide details of the partial pass]

Other notes

[Please provide any other information that you think is relevant to this PR.]

@danieldaquino
Collaborator

danieldaquino commented Sep 27, 2025

@ericholguin, I did some troubleshooting around video coordination and made a draft patch to improve it. I identified a few improvements I could make:

  1. Created a new view layer context called "focused_layer". Previously we mostly played videos either on a timeline or in a full screen carousel. The live stream view is a bit different, as it is technically neither full screen nor a scrolling timeline. The new focused_layer context covers cases like this, and the visibility detection and main stage priority mechanisms were updated accordingly.
  2. Created an optional ID argument for the video player view and tied it to the live stream model for a more stable ID. I noticed that a video coordination issue (the video kept playing after leaving the live stream) was caused by the VideoPlayerView being reinitialized and generating a new ID for itself on each re-render, which led the video coordinator to believe multiple videos were visible to the user and to keep playing. (See the sketch after this list for an illustration of this SwiftUI pitfall.)
  3. Improved the "live" video player style, which was previously mostly unused, so it looks and feels better in this context.
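
As a side note on item 2, here is a minimal sketch of the ID stability issue. These views and the model are hypothetical, not Damus code; the point is only that a UUID stored directly on a SwiftUI view struct is regenerated on every re-render, while one held on a long-lived model is not.

import SwiftUI

// A SwiftUI view struct is recreated on every re-render, so a UUID() default
// stored on the view produces a brand new ID each time the parent re-renders it.
struct UnstableIDView: View {
    let requestor_id = UUID() // regenerated on every re-render of the parent

    var body: some View {
        Text(requestor_id.uuidString)
    }
}

// Keeping the ID on a long-lived ObservableObject (like LiveEventModel in the
// patch below) means every recreated view struct sees the same identity, so the
// coordinator can recognize it as the same video across renders.
class StableIDModel: ObservableObject {
    let requestor_id = UUID() // created once, for the lifetime of the model
}

struct StableIDView: View {
    @ObservedObject var model: StableIDModel

    var body: some View {
        Text(model.requestor_id.uuidString)
    }
}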

These changes should improve video coordination for live stream events:

diff --git a/damus/Features/Live/LiveStream/Models/LiveEventModel.swift b/damus/Features/Live/LiveStream/Models/LiveEventModel.swift
index d93b0777..9a88d0b1 100644
--- a/damus/Features/Live/LiveStream/Models/LiveEventModel.swift
+++ b/damus/Features/Live/LiveStream/Models/LiveEventModel.swift
@@ -15,6 +15,7 @@ class LiveEventModel: ObservableObject {
     let damus_state: DamusState
     let live_event_subid = UUID().description
     var seen_dtag: Set<String> = Set()
+    var id: UUID = UUID()
     
     init(damus_state: DamusState) {
         self.damus_state = damus_state
diff --git a/damus/Features/Live/LiveStream/Views/LiveStreamView.swift b/damus/Features/Live/LiveStream/Views/LiveStreamView.swift
index e823e9d3..165e77da 100644
--- a/damus/Features/Live/LiveStream/Views/LiveStreamView.swift
+++ b/damus/Features/Live/LiveStream/Views/LiveStreamView.swift
@@ -111,7 +111,8 @@ struct LiveStreamView: View {
                     DamusVideoPlayerView(
                         model: videoModel,
                         coordinator: state.video,
-                        style: .preview(on_tap: {})
+                        style: .live,
+                        mainStageRequestorId: self.model.id
                     )
                 } else {
                     LiveStreamBanner(state: state, options: EventViewOptions(), image: event.image, preview: false)
diff --git a/damus/Shared/Extensions/DamusFullScreenCover.swift b/damus/Shared/Extensions/DamusFullScreenCover.swift
index cf916e4b..f3dbbf49 100644
--- a/damus/Shared/Extensions/DamusFullScreenCover.swift
+++ b/damus/Shared/Extensions/DamusFullScreenCover.swift
@@ -84,6 +84,8 @@ extension EnvironmentValues {
 enum ViewLayerContext {
     /// This is used for items placed in a scroll view, such as on a timeline or a thread view.
     case normal_layer
+    /// This is used for items that are not full screen, but are more static (i.e. not on a timeline). For example: A live stream event page, where the live stream video is the main focus of the view.
+    case focused_layer
     /// This is used for video players being displayed full screen
     case full_screen_layer
 }
diff --git a/damus/Shared/Media/Video/DamusVideoCoordinator.swift b/damus/Shared/Media/Video/DamusVideoCoordinator.swift
index 2512cdd4..c1c0247e 100644
--- a/damus/Shared/Media/Video/DamusVideoCoordinator.swift
+++ b/damus/Shared/Media/Video/DamusVideoCoordinator.swift
@@ -77,12 +77,12 @@ final class DamusVideoCoordinator: ObservableObject {
     func request_main_stage(_ request: MainStageRequest) {
         Log.info("VIDEO_COORDINATOR: %s requested main stage", for: .video_coordination, request.requestor_id.uuidString)
         switch request.layer_context {
-            case .normal_layer:
-                if normal_layer_main_stage_requests.first(where: { $0.requestor_id == request.requestor_id }) != nil { return } // Entry exists already
-                normal_layer_main_stage_requests.append(request)
-            case .full_screen_layer:
-                if full_screen_layer_stage_requests.first(where: { $0.requestor_id == request.requestor_id }) != nil { return } // Entry exists already
-                full_screen_layer_stage_requests.append(request)
+        case .normal_layer, .focused_layer:
+            if normal_layer_main_stage_requests.first(where: { $0.requestor_id == request.requestor_id }) != nil { return } // Entry exists already
+            normal_layer_main_stage_requests.append(request)
+        case .full_screen_layer:
+            if full_screen_layer_stage_requests.first(where: { $0.requestor_id == request.requestor_id }) != nil { return } // Entry exists already
+            full_screen_layer_stage_requests.append(request)
         }
         self.select_focused_video()
     }
diff --git a/damus/Shared/Media/Video/DamusVideoPlayerView.swift b/damus/Shared/Media/Video/DamusVideoPlayerView.swift
index b2438999..3bc183af 100644
--- a/damus/Shared/Media/Video/DamusVideoPlayerView.swift
+++ b/damus/Shared/Media/Video/DamusVideoPlayerView.swift
@@ -21,7 +21,7 @@ struct DamusVideoPlayerView: View {
     let url: URL
     @ObservedObject var model: DamusVideoPlayer
     let style: Style
-    let main_state_requestor_id: UUID = UUID()
+    let main_stage_requestor_id: UUID
     
     @State var is_visible: Bool = false {
         didSet {
@@ -29,7 +29,7 @@ struct DamusVideoPlayerView: View {
                 // We are visible, request main stage
                 video_coordinator.request_main_stage(
                     DamusVideoCoordinator.MainStageRequest(
-                        requestor_id: self.main_state_requestor_id,
+                        requestor_id: self.main_stage_requestor_id,
                         layer_context: self.view_layer,
                         player: self.model,
                         main_stage_granted: self.main_stage_granted
@@ -38,7 +38,7 @@ struct DamusVideoPlayerView: View {
             }
             else {
                 // We are no longer visible, give up the main stage
-                video_coordinator.give_up_main_stage(request_id: self.main_state_requestor_id)
+                video_coordinator.give_up_main_stage(request_id: self.main_stage_requestor_id)
             }
         }
     }
@@ -52,18 +52,20 @@ struct DamusVideoPlayerView: View {
         return view_layer_context ?? .normal_layer
     }
     
-    init(url: URL, coordinator: DamusVideoCoordinator, style: Style) {
+    init(url: URL, coordinator: DamusVideoCoordinator, style: Style, mainStageRequestorId: UUID? = nil) {
         self.url = url
         self.model = coordinator.get_player(for: url, title: "Untitled", link: url.absoluteString, artist: "NA", artwork: "NA")
         self.video_coordinator = coordinator
         self.style = style
+        self.main_stage_requestor_id = mainStageRequestorId ?? UUID()
     }
     
-    init(model: DamusVideoPlayer, coordinator: DamusVideoCoordinator, style: Style) {
+    init(model: DamusVideoPlayer, coordinator: DamusVideoCoordinator, style: Style, mainStageRequestorId: UUID? = nil) {
         self.url = model.url
         self.model = model
         self.video_coordinator = coordinator
         self.style = style
+        self.main_stage_requestor_id = mainStageRequestorId ?? UUID()
     }
     
     var body: some View {
@@ -94,9 +96,9 @@ struct DamusVideoPlayerView: View {
                 if model.has_audio {
                     mute_button
                 }
-            }
-            if model.is_live {
-                live_indicator
+                if model.is_live {
+                    live_indicator
+                }
             }
         }
         .on_visibility_change(perform: { new_is_visible in
@@ -106,19 +108,19 @@ struct DamusVideoPlayerView: View {
     
     private var visibility_tracking_method: VisibilityTracker.Method {
         switch self.view_layer {
-            case .normal_layer:
-                return .standard
-            case .full_screen_layer:
-                return .no_y_scroll_detection
+        case .normal_layer:
+            return .standard
+        case .full_screen_layer, .focused_layer:
+            return .no_y_scroll_detection
         }
     }
     
     func main_stage_granted() {
         switch self.style {
-        case .full, .no_controls:
-                self.model.is_muted = false
-            case .preview, .live:
-                self.model.is_muted = true
+        case .full, .no_controls, .live:
+            self.model.is_muted = false
+        case .preview:
+            self.model.is_muted = true
         }
     }
     
diff --git a/damus/Shared/Utilities/Router.swift b/damus/Shared/Utilities/Router.swift
index 2afc9cd1..f128afef 100644
--- a/damus/Shared/Utilities/Router.swift
+++ b/damus/Shared/Utilities/Router.swift
@@ -143,6 +143,7 @@ enum Route: Hashable {
             LiveStreamHomeView(damus_state: damusState, model: model)
         case .LiveEvent(let liveEvent, let liveEventModel):
             LiveStreamView(state: damusState, ev: liveEvent, model: liveEventModel)
+                .environment(\.view_layer_context, .focused_layer)
         }
     }
 

@danieldaquino
Collaborator

@ericholguin, beyond my previous comment, I wonder if just using a standard Apple-provided video player (i.e. not DamusVideoPlayerView) would be a good solution as well.

We need such a complex video player and coordination layer because of how we normally handle videos on the home timeline: we need to play and stop videos based on scroll position, put videos inside a full screen carousel instead of using the standard Apple full screen player, pause timeline videos while one is playing in that "full-screen" carousel, and so on.

However, in this context we don't really have those complex requirements. There is only one possible video in the view (the live stream), so it may be easier and better to use a standard Apple-provided video player view.

The only caveat is that we need to ensure no other videos managed by the video coordinator can play at the same time. I think that's unlikely given the way you set up the views, so it's probably fine.
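
For reference, something along these lines could work. This is a rough sketch under assumptions: the LiveStreamPlayerView name is made up, and it bypasses DamusVideoCoordinator entirely, so the caveat above about other coordinator-managed videos still applies.

import SwiftUI
import AVKit

// Hypothetical replacement for DamusVideoPlayerView on the live stream page,
// using the stock AVKit SwiftUI player instead of the coordinator machinery.
struct LiveStreamPlayerView: View {
    let url: URL // streaming URL taken from the kind 30311 event

    @State private var player: AVPlayer? = nil

    var body: some View {
        VideoPlayer(player: player)
            .aspectRatio(16/9, contentMode: .fit)
            .onAppear {
                // Create and start playback when the live stream page appears.
                let new_player = AVPlayer(url: url)
                player = new_player
                new_player.play()
            }
            .onDisappear {
                // Tear down the player when leaving the page so nothing keeps
                // playing in the background.
                player?.pause()
                player = nil
            }
    }
}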
