building libsrt failed out of the box on macOS #1838
-
It looks like you have this build script available, but since I’m not using it to build, it’s difficult for me to give any advice:
https://github.com/Haivision/srt/blob/master/scripts/build-ios/mksrt-xcf.sh
I’ve published a build script for HaishinKit in the following repository. It does essentially the same thing as the official script, but supports more platforms:
https://github.com/HaishinKit/libsrt-xcframework
-
Thanks. This may not be necessary.
I’ve found out how to fix my app so I can stream to Castr. Until I get my YouTube feed restarted, I can’t test YouTube, but it’s likely it’s the same issue.
I don’t understand the workaround below in the slightest, and I’m hoping you can explain it to me. See below for details; it’s not very complicated.
Short version:
Your app works because you are sending (audio) data to the mixer well before you make the publish() call. This appears to be necessary. If you start calling append() immediately after publish(), things don’t work.
Longer version:
Here is my connection code (simplified, I don’t do raw try! in real code):
let connection = SRTConnection()
let stream = SRTStream(connection: connection)
try! await stream.setAudioSettings(audioSettings)
try! await stream.setVideoSettings(videoSettings)
try! await connection.connect(ingestURL)
let connected = await connection.connected
guard connected else { return }
await stream.publish()
self.srtStream = stream
I then do:
await self.srtStream?.append(sampleBuffer) // video
and
await srtStream?.append(result.buffer, when: result.when)
to send the data.
Key point: because srtStream is nil until after publish() is called, I am not really calling append() until after the call to publish().
And this doesn’t work with Castr.
So what does work?
If I do this instead:
self.srtStream = stream
Task {
    try? await Task.sleep(for: .seconds(2))
    await stream.publish()
}
then things start working.
In other words, it is necessary to call append() some number of times with a bunch of audio samples *before* calling publish(), or Castr never receives a stream. Indeed, the connection shuts off after a few seconds.
Can you explain why it is necessary to call append() with audio samples *after* calling connection.connect() but BEFORE calling stream.publish()?
That is counterintuitive and makes no sense to me.
If I regain access to YouTube for SRT, I will try this workaround and let you know whether that was the issue.
-
Perhaps a new method or extra parameters to publish() specifying whether one plans to do audio+video, audio only, etc.?
I considered that, since I saw that the expected media types are locked down at publish time, but on further inspection it seemed that if you didn’t specify an expected type, it took everything.
I didn’t consider that maybe it locked down whether it expected audio or video.
Anyway, it sounds like I would only need to append one video sample and one audio sample to get this primed, and could then publish immediately (sketched below). Right now I’m just letting the stream run for 2-3 seconds before calling publish(), but I’d prefer not to have this be timing-dependent.
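A minimal sketch of that single-sample priming idea, built from the append() calls shown earlier in this thread; videoSampleBuffer and audioResult are placeholders for buffers already captured from the camera and microphone:
try await connection.connect(ingestURL)
guard await connection.connected else { return }
// Prime the stream so the muxer has seen both media types before publish().
await stream.append(videoSampleBuffer)                          // one video sample
await stream.append(audioResult.buffer, when: audioResult.when) // one audio sample
await stream.publish()
self.srtStream = stream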
Thank you so much for your help.
On Nov 23, 2025, at 8:21 AM, shogo4405 wrote:
When constructing the internal MPEG-TS track data, HaishinKit needs to know at the time of publish whether the stream will contain audio only, video only, or both.
Since the determination was being made through the append method, the stream may have been interpreted as having no audio or video, causing the track data to be generated incorrectly and resulting in unexpected behavior.
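In other words, a sketch of the failure mode as described above (ordering only, not HaishinKit internals):
// Wrong order: the track layout is decided at publish() time, and no media
// has been appended yet, so the stream is treated as having neither audio
// nor video.
await stream.publish()            // MPEG-TS track data generated here
await stream.append(sampleBuffer) // too late to register a video track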
-
Thanks. I observed before that if expectedMedia was empty, canWriteFor returned true, but writeProgram() would exit early, and I am quite sure that was in fact the problem. I’ll try it out in a few hours and report back.
FYI, very minor bug, but it confused me for a bit: in iOS/PublishViewModel.swift,
this code:
func startPublishing(_ preference: PreferenceViewModel) {
    Task {
        guard let session else {
            return
        }
        do {
            try await session.connect {
                Task { @MainActor in
                    self.isShowError = true
I believe you meant to write:
self.isShowError = false
With “true” the app pops up an error dialog whenever you stop streaming.
On Nov 23, 2025, at 9:17 AM, shogo4405 wrote:
Yes. I thought it was necessary, so I went ahead and implemented it.
https://github.com/HaishinKit/HaishinKit.swift/blob/8044c9cc4b336be11db481df6704bf5119425b79/SRTHaishinKit/Sources/SRT/SRTStream.swift#L131-L137
Please try.
await stream.setExpectedMedias([.audio, .video])
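Putting that together with the connection code from earlier in the thread, the revised sequence would look something like this; it is a sketch, with only setExpectedMedias confirmed by the source linked above:
let connection = SRTConnection()
let stream = SRTStream(connection: connection)
try await stream.setAudioSettings(audioSettings)
try await stream.setVideoSettings(videoSettings)
// Declare up front that this stream carries both audio and video, so the
// MPEG-TS track layout is generated correctly at publish time.
await stream.setExpectedMedias([.audio, .video])
try await connection.connect(ingestURL)
guard await connection.connected else { return }
await stream.publish() // no priming appends or sleep needed
self.srtStream = stream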
-
Oops, I didn’t read carefully enough to see that the assignment is in a completion callback. Yes, my fix is wrong; it’s more subtle. Thanks.
On Nov 23, 2025, at 9:34 AM, shogo4405 wrote:
I wrote this code with the intention of showing that the stream has stopped when the network is disconnected during a broadcast.
I’ll update it later so that it behaves as intended. Thank you for the feedback.
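A hypothetical sketch of that intended behavior, assuming the view model tracks an isPublishing flag; both the flag and its use here are assumptions, not the actual fix:
try await session.connect {
    Task { @MainActor in
        // Show the error dialog only if the connection closed while we were
        // still broadcasting (an unexpected disconnect), not when the user
        // stopped the stream deliberately.
        self.isShowError = self.isPublishing
    }
}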
-
Question
I want to build libsrt by itself to debug something. I followed the instructions exactly, but the actual build of libsrt failed with this error message:
So it seems like something that should have been built as x86_64 was built as ARM instead. I’m using an Apple Silicon MacBook, of course, and not the latest OS, but just one shy.
Any solutions? Is this an error in the build script, or is there a command I need to issue to my global xcodebuild setup to make it do the right thing?
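In case it helps: if an x86_64 slice is being compiled as arm64 on Apple Silicon, forcing the architecture is one thing to try. This is a guess at the failure mode, not a verified fix for mksrt-xcf.sh:
# Run the official script under Rosetta so every tool defaults to x86_64:
arch -x86_64 ./scripts/build-ios/mksrt-xcf.sh
# Or, when invoking CMake directly, pin the target architecture:
cmake -DCMAKE_OSX_ARCHITECTURES=x86_64 .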
Background / Tried Steps
No response
Environment
No response