
max_bit_rate in RTP parameters should actually be min_bit_rate? #959

@adriancable

Right now when starting a video stream, HAP-NodeJS passes something like this to handleStreamRequest on my delegate:

{
  sessionID: 'd06ea903-4ac4-4029-97e9-2e57db58b7d6',
  type: 'start',
  video: {
    codec: 0,
    profile: 2,
    level: 2,
    packetizationMode: 0,
    cvoId: undefined,
    width: 1280,
    height: 720,
    fps: 30,
    pt: 99,
    ssrc: 1947509574,
    max_bit_rate: 299,
    rtcp_interval: 0.5,
    mtu: 1378
  },
  audio: { ... }
}
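For context, a camera plugin's delegate typically turns these values directly into encoder settings. A minimal sketch of that mapping (the ffmpeg-style flags are illustrative, not part of HAP-NodeJS itself) shows why the field name matters:

```javascript
// Illustrative only: how a camera plugin might map the start-stream
// request's video parameters onto encoder arguments. The flag names
// (ffmpeg-style) are an assumption, not HAP-NodeJS API.
function encoderArgsFromRequest(video) {
  return [
    '-vf', `scale=${video.width}:${video.height}`,
    '-r', String(video.fps),
    // Plugins treat max_bit_rate (kbit/s) as the ceiling to encode at,
    // so a value that is really the *minimum* caps quality far too low.
    '-b:v', `${video.max_bit_rate}k`,
    '-payload_type', String(video.pt),
  ];
}

const args = encoderArgsFromRequest({
  width: 1280, height: 720, fps: 30, pt: 99, max_bit_rate: 299,
});
console.log(args.join(' '));
// -vf scale=1280:720 -r 30 -b:v 299k -payload_type 99
```

If the field is really the minimum, an encoder configured this way targets 299 kbit/s when (per the homed log below) up to 1078 kbit/s would be accepted.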

Note the max_bit_rate field; indeed, at least in the very old HomeKit docs from Apple that I have, this is what it is called there, too. But I don't think it's right: I think it should actually be min_bit_rate. Looking at the console output from homed on iOS, I see this:

[Garden Camera/74/D06EA903-4AC4-4029-97E9-2E57DB58B7D6/kDefaultCameraApplicationIdentifier] Writing start stream configuration: 
    {
          sessionControl = 
        {
              tlvDatablob = (null) 
              controlCommand = HMDSessionControlCommandStart 
              sessionID = D06EA903-4AC4-4029-97E9-2E57DB58B7D6 
        } 
          videoParameters = 
        {
              tlvDatablob = (null) 
              videocodec = HMDVideoCodecTypeH264 
              codecParameters = 
            {
                  tlvDatablob = (null) 
                  h264Profile = 
                  [ 
                {
                     h264Profile = HMDH264ProfileTypeHigh
                }
                  ] 
                  levels = 
                  [ 
                {
                     h264Level = HMDH264LevelType4
                }
                  ]
                  packetizationModes = 
                  [ 
                {
                     packetizationMode = HMDPacketizationModeTypeSingleNonInterleaved
                }
                  ]
            } 
              attributes = 
            {
                  tlvDatablob = (null) 
                  imageWidth = 1280 
                  imageHeight = 720 
                  resolution = 
                {
                     resolutionType = HMDVideoResolutionType1280x720
                } 
                  framerate = 30 
            } 
              rtpParameters = 
            {
                  tlvDatablob = (null) 
                  syncSource = 1947509574 
                  payloadType = 99 
                  minimumBitrate = 299000 
                  maximumBitrate = 1078000 
                  rtcpInterval = 0.5 
                  maxMTU = 1378 
                  comfortNoisePayloadType = (null) 
            } 
        } 
          audioParameters = { ... }
    }

Note that what HAP-NodeJS returns as max_bit_rate (299 kbit/s) matches minimumBitrate here (299000 bit/s), not maximumBitrate (1078000 bit/s).

Also, I did an experiment: if I send RTP data faster than what homed logs as maximumBitrate for more than a few seconds, homed kills the stream. So this really does appear to be the enforced maximum bit rate, i.e. the iOS logs from homed are correct and HAP-NodeJS's field name is wrong.
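Since homed enforces that ceiling, a plugin that did know the real maximumBitrate could pace its RTP output to stay under it. A token-bucket sketch (the class and parameter names here are illustrative, and maxBitrateBps would have to come from the currently unexposed maximumBitrate field):

```javascript
// Sketch of a token-bucket pacer a camera plugin could use to keep its
// RTP send rate under homed's enforced ceiling. Not HAP-NodeJS API;
// maxBitrateBps is assumed to be the maximumBitrate homed logs.
class BitratePacer {
  constructor(maxBitrateBps) {
    this.capacity = maxBitrateBps / 8; // burst budget: one second's worth of bytes
    this.tokens = this.capacity;       // start with a full bucket
    this.rate = maxBitrateBps / 8;     // refill rate, bytes per second
    this.last = Date.now();
  }

  // Returns true if `bytes` may be sent now without exceeding the cap;
  // callers should queue or drop the packet when this returns false.
  trySend(bytes) {
    const now = Date.now();
    this.tokens = Math.min(
      this.capacity,
      this.tokens + ((now - this.last) / 1000) * this.rate
    );
    this.last = now;
    if (bytes > this.tokens) return false;
    this.tokens -= bytes;
    return true;
  }
}
```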

What I haven't looked at yet: how to extract maximumBitrate from the RTP stream configuration TLV data sent by homed. This would be very useful for plug-ins to know. Right now, since plug-ins interpret what is actually the minimum bit rate as the maximum bit rate, camera plug-ins are almost universally sending much lower quality video to HomeKit than they could, which is quite sad.
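For anyone who wants to dig into that TLV blob: HAP uses TLV8 framing (1-byte type, 1-byte length, value), so a generic parser is short. The type IDs below (MIN_BITRATE = 0x03, MAX_BITRATE = 0x06) are placeholders, not confirmed assignments; the real type numbers would need to be checked against the actual stream configuration TLV from homed.

```javascript
// Generic TLV8 parser (1-byte type, 1-byte length, value). Repeated
// entries with the same type are concatenated, since TLV8 splits
// values longer than 255 bytes into consecutive fragments.
function parseTLV8(buf) {
  const out = {};
  let i = 0;
  while (i + 2 <= buf.length) {
    const type = buf[i];
    const len = buf[i + 1];
    const value = buf.slice(i + 2, i + 2 + len);
    out[type] = out[type] ? Buffer.concat([out[type], value]) : value;
    i += 2 + len;
  }
  return out;
}

// HAP TLV integers are little-endian and variable-width.
function readUIntLE(buf) {
  let v = 0;
  for (let j = buf.length - 1; j >= 0; j--) v = v * 256 + buf[j];
  return v;
}

// Hypothetical type IDs -- placeholders, NOT verified against homed.
const MIN_BITRATE = 0x03;
const MAX_BITRATE = 0x06;

// Example: a 2-byte little-endian value 0x012B = 299 under type 0x03.
const tlv = parseTLV8(Buffer.from([MIN_BITRATE, 2, 0x2b, 0x01]));
console.log(readUIntLE(tlv[MIN_BITRATE])); // 299
```

With the right type IDs, the same two helpers would read maximumBitrate out of the start-stream TLV so HAP-NodeJS could pass it through to delegates alongside the minimum.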
