FAQ - Live Broadcasting
This FAQ is about broadcasting live streams to Clevercast. See the FAQ overview if you have questions about other topics.
Are there broadcast guidelines or requirements?
See our broadcast guidelines. If you’re not able to follow them, make sure to test in advance.
- For multilingual live streams, we strongly recommend broadcasting with a framerate of 25 fps and a keyframe interval of 2 seconds (if possible). Clevercast applies these settings regardless of your broadcast specifications.
- When using Translate@Home, make sure not to broadcast a still image or animated intro with variable bitrate (VBR) while your event is set to preview or started. This may cause the translated audio to be slightly ahead of the original audio. This is a known issue we are working to fix. For more info, see our broadcast guidelines.
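As an illustration only (not an official Clevercast recommendation), a broadcast matching the recommended framerate and keyframe interval could be sent with ffmpeg. The input file, bitrate and ingest URL below are placeholders; use the RTMP URL shown for your event:

```shell
# 25 fps with a keyframe every 2 seconds (GOP = 50 frames).
# -sc_threshold 0 prevents extra keyframes on scene cuts, and the
# maxrate/bufsize pair keeps the bitrate close to constant (avoids VBR).
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -r 25 -g 50 -sc_threshold 0 \
  -b:v 4500k -maxrate 4500k -bufsize 9000k \
  -c:a aac -b:a 128k -ar 48000 \
  -f flv "rtmp://ingest.example.com/live/STREAM_KEY"
```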
Which broadcast protocol should I use (RTMP or SRT)?
Our ingest servers are located in Europe. If you are broadcasting from a (virtual) location in Europe, RTMP should be fine.
If you want to broadcast a 1080p stream from outside Europe, you should use SRT or ask us for a local ingest hub.
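For reference, an SRT broadcast from ffmpeg sends MPEG-TS over SRT instead of FLV over RTMP. The host, port and streamid below are placeholders; use the values provided for your event:

```shell
# Same encoding settings as an RTMP broadcast, but muxed as MPEG-TS
# and sent over SRT (more robust over long-distance connections).
ffmpeg -re -i input.mp4 \
  -c:v libx264 -r 25 -g 50 -sc_threshold 0 \
  -b:v 4500k -maxrate 4500k -bufsize 9000k \
  -c:a aac -b:a 128k \
  -f mpegts "srt://ingest.example.com:9998?streamid=STREAM_ID"
```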
Which bitrate should I use?
See also our broadcast guidelines. The appropriate bitrate also depends on the type of content (e.g. dynamic or static). Since Clevercast does server-side transcoding for adaptive streaming, it doesn't make sense to broadcast at very high bitrates.
Why should I stop my broadcast after the event has ended?
When there is an incoming broadcast, Clevercast does server-side transcoding for adaptive streaming. This also happens when the stream is not being delivered to viewers (event status is waiting or ended). So you will use up unnecessary live processing hours (part of your plan) if you don't turn off your encoder after an event.
Why should I observe a grace period of 2 minutes before and after the actual event?
The live stream is delivered with some latency to your viewers. Typically, the latency for HTTP Live Streaming (HLS) is approximately 16 to 30 seconds (depending on the device, connection and player configuration in Clevercast). However, iOS devices allow the latency to grow to a maximum of 2 minutes.
By broadcasting in advance and starting the event in Clevercast (at least) 2 minutes before the actual event starts, you make sure each viewer gets to see the actual start. Clevercast Player can also start buffering in advance (for smooth streaming).
By stopping the event in Clevercast (at least) 2 minutes after the actual event ends, you make sure each viewer gets to see the actual end. The player shows a (custom) image or message as soon as the event is set to ended; without a 2-minute grace period, this may happen while some viewers, due to latency, are still watching the stream.
Can I live stream pre-recorded videos without sending a broadcast?
Yes, this is also possible for multilingual live streams with closed captions and audio translations. See our pseudo-live streaming overview for more info.
Encoder support for multilingual live streams
Broadcast for Translate@Home?
Ingest from third party platforms?
Currently, Microsoft Teams doesn't support this. It does support NDI output from your Teams client, which you could use to broadcast to Clevercast via an encoder within your LAN (e.g. Teams to OBS to Clevercast). This requires technical knowledge.
Broadcast with 2 languages?
For a live stream with 2 languages, you can send a stereo broadcast over RTMP or SRT with one language panned left and the other panned right. This can be done with any software or hardware encoder.
Wirecast supports SRT, but currently doesn’t support sending multiple audio tracks or channels. So it can’t be used to send more than 2 embedded languages to Clevercast.
vMix only supports a single SRT audio track, but does allow you to add up to 8 audio channels to this single audio track. Clevercast allows you to use this feature to broadcast a multilingual live stream of up to 8 languages using vMix.
SRT broadcast with multiple audio tracks?
The number of languages that can be embedded in an SRT broadcast depends on how many audio tracks your encoder supports. For example, OBS Studio supports 12 languages, Haivision’s Makito X and Intinor Direct Link support 16 languages, and Makito X4 supports 32 languages. See this guide for more info.
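To illustrate a multi-track SRT broadcast with ffmpeg (source files, language set and ingest URL are placeholders), each language is mapped as a separate audio stream in the MPEG-TS container:

```shell
# One video input plus three mono language tracks, each kept as its own
# audio stream; language metadata labels the tracks in the transport stream.
ffmpeg -re -i video.mp4 -i en.wav -i fr.wav -i de.wav \
  -map 0:v -map 1:a -map 2:a -map 3:a \
  -c:v libx264 -r 25 -g 50 -sc_threshold 0 \
  -b:v 4500k -maxrate 4500k -bufsize 9000k \
  -c:a aac -b:a 128k \
  -metadata:s:a:0 language=eng \
  -metadata:s:a:1 language=fra \
  -metadata:s:a:2 language=deu \
  -f mpegts "srt://ingest.example.com:9998?streamid=STREAM_ID"
```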
RTMP broadcast with multiple audio channels?
This is currently only supported by OBS Studio through their surround sound feature, for up to 7 languages.
Sending a separate RTMP broadcast per language?
If you only need a couple of languages and have sufficient outgoing bandwidth, it is possible to send a separate RTMP broadcast (video+audio) for each language. This requires multiple simultaneous live streams to be included in your plan.
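One way to sketch this with ffmpeg (source file, track layout and stream keys are hypothetical): a single source with two audio tracks is encoded twice, producing one RTMP broadcast per language:

```shell
# Two independent outputs from one ffmpeg process: the same video with
# audio track 0 goes to one stream key, audio track 1 to another.
# This doubles the outgoing bandwidth, as each output carries full video.
ffmpeg -re -i source.mp4 \
  -map 0:v -map 0:a:0 \
  -c:v libx264 -r 25 -g 50 -sc_threshold 0 \
  -b:v 4500k -maxrate 4500k -bufsize 9000k -c:a aac \
  -f flv "rtmp://ingest.example.com/live/KEY_LANGUAGE_A" \
  -map 0:v -map 0:a:1 \
  -c:v libx264 -r 25 -g 50 -sc_threshold 0 \
  -b:v 4500k -maxrate 4500k -bufsize 9000k -c:a aac \
  -f flv "rtmp://ingest.example.com/live/KEY_LANGUAGE_B"
```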