12/12/2023
Wirecast streaming bad quality

Before you continue reading this article, read the Low Latency HLS (LL-HLS) article to understand low latency with live event encoding. Then, come back to this guide to understand what else may affect streaming latency. There are many factors that determine the end-to-end latency of a stream besides how the media is encoded. Here are some that you should consider:

- Delays on the contribution encoder side
- Delays in the live streaming pipeline within Azure Media Services
- Buffering algorithms of the video player and network conditions on the client side

Delays on the contribution encoder side

When customers use encoding software such as OBS Studio, Wirecast, or others to send an RTMP live stream to Media Services, the settings on this software affect the end-to-end latency of a live stream. You are in control of the source encoder settings before the RTMP stream reaches Media Services. Here are some recommendations for the settings that would give you the lowest possible latency:

- Pick the physical region closest to your contribution encoder for your Media Services account. This ensures that you have a good network connection to the Media Services account.
- Use a short keyframe interval. The default on some encoders, such as OBS, is 8 seconds.
- Use the GPU encoder if your encoding software allows it. This lets you offload CPU work to the GPU.
- Use an encoding profile that is optimized for low latency. For example, with OBS Studio, if you use the Nvidia H.264 encoder, you may see the "zero latency" preset.
- Send content that is no higher in resolution than what you plan to stream. For example, if you're using 720p standard encoding live events, send a stream that is already at 720p.
- Keep your frame rate at 30 fps or lower unless you are using pass-through live events. While we support 60 fps input for live events, our encoding live event output does not exceed 30 fps.
- For Low Latency HLS, a fixed frame rate is recommended, and the maximum frame duration should not exceed 0.5 seconds for the best experience.

Configuration of the Azure Media Services live event

Here are some configurations that will help you reduce the latency in our pipeline:

- Use the low latency stream options for live events. For the Standard encoding (up to 720p) and Premium encoding (up to 1080p) stream options, use the Low latency stream setting unless you need a DVR window longer than 6 hours or Smooth Streaming output.
- We recommend that you choose CMAF output for both HLS and DASH playback. This allows you to share the same fragments for both formats, which increases your cache hit ratio when a CDN is used.
- If you must choose TS output, use an HLS packing ratio of 1. This packs only one fragment into each HLS segment. You won't get the full benefits of LL-HLS in native Apple players.

Buffering algorithms of the video player and network conditions on the client side

When choosing and configuring a video player, make sure you use settings that are optimized for lower latency. Media Services supports different streaming protocol outputs: DASH, HLS with TS output, and HLS with CMAF fragments. When using the LowLatencyV2 stream option, be sure to find a player that supports Low Latency HLS (LL-HLS). Depending on the player's implementation, buffering decisions impact the latency a viewer observes. Poor network conditions, or default algorithms that favor quality and stability of playback, can cause players to buffer more content upfront to prevent interruptions during playback. These buffers, before and during playback sessions, add to the end-to-end latency.
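The contribution-encoder recommendations above can be sketched as an ffmpeg command line, with ffmpeg standing in for OBS Studio or Wirecast. Everything specific here is an assumption for illustration: the ingest URL is a placeholder, the 2-second keyframe target is a common choice rather than a figure from this article, and libx264 with `-tune zerolatency` is used as an analogue of OBS's "zero latency" preset.

```typescript
// Sketch: contribution-encoder settings assembled as ffmpeg arguments.
// ffmpeg stands in for OBS/Wirecast; the ingest URL is a placeholder and the
// 2-second keyframe interval is an assumed target, not from the article.
const fps = 30;            // keep output at 30 fps or lower
const keyframeSeconds = 2; // assumed short interval (OBS default is ~8 s)

const args: string[] = [
  "-re", "-i", "input.mp4",            // file input stands in for a live source
  "-s", "1280x720",                    // send no more resolution than you stream
  "-r", String(fps),
  "-g", String(fps * keyframeSeconds), // keyframe every 2 s at 30 fps -> 60
  "-c:v", "libx264",                   // with an Nvidia GPU, h264_nvenc offloads this work
  "-preset", "veryfast",
  "-tune", "zerolatency",              // low-latency encoding profile
  "-f", "flv",
  "rtmp://example-ingest.channel.media.azure.net:1935/live/stream", // placeholder
];

console.log(["ffmpeg", ...args].join(" "));
```

To actually run the command, pass `args` to `spawn("ffmpeg", args)` from `node:child_process` instead of printing it.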
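On the player side, one player with LL-HLS support is hls.js. A minimal low-latency configuration might look like the following sketch; the option names are real hls.js config fields, but the specific values, the playback URL, and the page wiring are illustrative assumptions, not settings from this article.

```typescript
// Sketch: a low-latency hls.js configuration (values are illustrative).
const lowLatencyConfig = {
  lowLatencyMode: true,         // enable LL-HLS partial-segment loading
  backBufferLength: 30,         // keep the back buffer small (seconds)
  liveSyncDuration: 2,          // aim to play ~2 s behind the live edge
  maxLiveSyncPlaybackRate: 1.5, // allow speed-up to catch back up to the edge
};

// In a page with hls.js loaded and a <video> element (not run here):
//   const hls = new Hls(lowLatencyConfig);
//   hls.loadSource(playbackUrl); // LL-HLS manifest from your streaming endpoint
//   hls.attachMedia(videoElement);

console.log(JSON.stringify(lowLatencyConfig));
```

Note that hls.js treats `liveSyncDuration` and the default `liveSyncDurationCount` as alternatives; setting the duration directly makes the edge-distance target explicit.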