Measuring live encoding latency

Is there a way to programmatically measure the encoding latency in a live encoding workflow when using the Bitmovin live encoder?

So, in other words, let’s imagine that I configure then start a live encoder, wait for it to be ready, then start pushing a stream into it (RTMP, SRT, whatever…).
I’m searching for a way to determine the time it takes for an input frame at wall clock time X to be encoded, muxed into a segment, for that segment to be written into the output bucket, and then added to the manifest (HLS or DASH).

I know I could add a timecode to the input and then watch the output and compare, but that’s a lot of manual work I’d rather not have to perform for every single stream…

Hi Fabre,
we implemented something on top of our live encoding workflow that measures the end-to-end latency. With the emsg ISOBMFF box there is even a smart way to bring this information to the client over HLS/DASH, and almost all players support exposing the content of the emsg box to the application.
You can read some more details about it in a blog post: https://medium.com/zattoo_tech/how-to-go-low-latency-without-special-tricks-37c69db027bb
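For what it’s worth, the ‘emsg’ box layout is standardized (ISO/IEC 23009-1), so the content can also be extracted without relying on a particular player API. A minimal sketch of scanning a media segment for version-1 ‘emsg’ boxes (a hypothetical helper, not Zattoo’s actual implementation):

```python
import struct

def find_emsg_boxes(segment: bytes):
    """Scan the top-level ISOBMFF boxes of a media segment and yield
    (scheme_id_uri, value, timescale, presentation_time, message_data)
    for every version-1 'emsg' box (layout per ISO/IEC 23009-1)."""
    pos = 0
    while pos + 8 <= len(segment):
        size, btype = struct.unpack_from(">I4s", segment, pos)
        if size < 8:
            break  # malformed or 64-bit size; stop rather than loop forever
        if btype == b"emsg" and segment[pos + 8] == 1:  # version 1 only
            p = pos + 12  # skip box header + version/flags
            timescale, ptime, _dur, _eid = struct.unpack_from(">IQII", segment, p)
            p += 20
            end = segment.index(b"\x00", p)
            scheme = segment[p:end].decode("utf-8")
            p = end + 1
            end = segment.index(b"\x00", p)
            value = segment[p:end].decode("utf-8")
            p = end + 1
            yield scheme, value, timescale, ptime, segment[p : pos + size]
        pos += size
```

If the message payload carries the wall-clock ingest time of the fragment, the end-to-end latency at the client is simply the current wall clock minus that timestamp.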

Hi @stefan.kaiser ,

I remember reading that series of articles with interest at the time. Thanks for reminding me about it.

Are you able to provide more insight as to how you inserted the emsg in the first frame of each fragment? Sounds very useful! However, it is not an option I’ve found with run-of-the-mill encoders…

Hi @fabre.lambeau1 , glad to hear from you. Looking at your requirement, you seem to be interested in finding the latency introduced by the transcoder+packager instead of E2E latency.

To find this, you will need:

  • The timestamp when an input frame is ingested into the encoder/transcoder.
  • The timestamp when the segment corresponding to that input frame is added to the playlist and the playlist is uploaded to the output with this segment as the latest one.

One possible way to achieve this could be:

  • In the StartLiveEncoding API

    • Add the EXT-X-PROGRAM-DATE-TIME tag by setting hlsManifests.insertProgramDateTime=true
    • Configure hlsManifests.programDateTimeSettings.programDateTimeSource=EMBEDDED to use the input time of the first frame as the time written out to EXT-X-PROGRAM-DATE-TIME
  • An SRT live ingest input with PICTURE_TIMING SEI information is required for this to work. An SRT stream without picture timing information will fall back to using system time as the reference.

  • Once the above is set, the EXT-X-PROGRAM-DATE-TIME tag in the HLS playlist will contain the input picture timing information.

  • Then you can track updates of the HLS playlist in the output bucket to calculate the latency.

    • Output buckets normally provide mechanisms to track updates; trigger a Lambda/cloud function on each playlist update.
    • Based on the EXT-X-PROGRAM-DATE-TIME tag and the segment durations in the playlist, calculate the start time of the latest segment, which represents the input timing of the first frame in that segment.
  • You can then compare the time calculated in the step above with the time of the playlist update to find the latency.
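The last two steps can be sketched as follows. This is a minimal illustration (hypothetical helper name, stdlib only) that assumes the playlist carries absolute ISO 8601 timestamps and that the newest EXT-X-PROGRAM-DATE-TIME anchor applies to the segments that follow it:

```python
from datetime import datetime, timedelta, timezone

def latest_segment_start(playlist: str) -> datetime:
    """Wall-clock start time of the newest segment in an HLS media
    playlist: the last EXT-X-PROGRAM-DATE-TIME anchor plus the durations
    of every segment after it except the newest segment itself."""
    base, durations = None, []
    for line in playlist.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-PROGRAM-DATE-TIME:"):
            ts = line.split(":", 1)[1].replace("Z", "+00:00")
            base = datetime.fromisoformat(ts)
            durations = []  # a new anchor restarts the running offset
        elif line.startswith("#EXTINF:"):
            durations.append(float(line.split(":", 1)[1].split(",")[0]))
    if base is None:
        raise ValueError("no EXT-X-PROGRAM-DATE-TIME tag in playlist")
    return base + timedelta(seconds=sum(durations[:-1]))

playlist = """#EXTM3U
#EXT-X-TARGETDURATION:4
#EXT-X-PROGRAM-DATE-TIME:2024-01-01T12:00:00.000Z
#EXTINF:4.0,
seg1000.ts
#EXTINF:4.0,
seg1001.ts
"""
start = latest_segment_start(playlist)  # 12:00:04 UTC (anchor + first segment)
observed_at = datetime(2024, 1, 1, 12, 0, 12, tzinfo=timezone.utc)  # e.g. bucket event time
latency = (observed_at - start).total_seconds()
print(latency)  # 8.0
```

In a real deployment `observed_at` would come from the storage event that fired on the playlist upload, and the playlist text from the object itself.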

I hope the above information is helpful.

Hi @fabre.lambeau1 ,
sorry for the late reply. I can imagine that it’s trickier with off-the-shelf encoders.
We implemented our own packager and thus fully control the part that inserts the ‘emsg’ boxes with the proper timestamps.
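For anyone who wants to try the same without a full custom packager: the version-1 ‘emsg’ layout is fully specified in ISO/IEC 23009-1, so serializing one is only a few lines. A sketch — the scheme URI and JSON payload here are made-up placeholders, not what Zattoo actually uses:

```python
import json
import struct

def emsg_box(scheme_id_uri: str, value: str, timescale: int,
             presentation_time: int, event_duration: int,
             event_id: int, message_data: bytes) -> bytes:
    """Serialize a version-1 ISOBMFF 'emsg' box (ISO/IEC 23009-1): the
    timing fields come first and presentation_time is an absolute
    64-bit media timestamp in `timescale` units."""
    payload = (
        b"\x01\x00\x00\x00"  # version=1, flags=0
        + struct.pack(">IQII", timescale, presentation_time,
                      event_duration, event_id)
        + scheme_id_uri.encode("utf-8") + b"\x00"
        + value.encode("utf-8") + b"\x00"
        + message_data
    )
    return struct.pack(">I", 8 + len(payload)) + b"emsg" + payload

# Hypothetical payload: carry the wall-clock ingest time of the fragment's
# first frame so the player can compute end-to-end latency.
box = emsg_box(
    scheme_id_uri="urn:example:ingest-timestamp",  # placeholder scheme
    value="1",
    timescale=1000,
    presentation_time=90_000_000,  # media time of the first frame
    event_duration=0,
    event_id=1,
    message_data=json.dumps({"ingest_wallclock_ms": 1700000000000}).encode(),
)
```

The box would be written before the ‘moof’ of each fragment; players that surface emsg events then hand the payload to the application.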