While all StreamSpot live streams are indeed "live," every streaming provider (and traditional broadcaster) has a delay before the broadcast reaches your viewers. This delay is also referred to as latency.
To understand latency, you need to understand how a live stream works:
- Your audio and video are combined using your encoder and sent to StreamSpot's origin servers located all over the world using the Real-Time Messaging Protocol (RTMP).
- Once received by a StreamSpot origin server, the stream is recorded and transcoded from RTMP into small video "chunks" ranging from 2 to 10 seconds in duration. This type of delivery was developed by Apple and is referred to as HTTP Live Streaming (HLS); see https://en.wikipedia.org/wiki/HTTP_Live_Streaming for more information.
- The chunks are then fetched and cached by StreamSpot's global content delivery network (CDN) for distribution to your viewers all over the world.
- Your position in the live stream determines which "chunk" the viewer receives for playback. As each "chunk" finishes playing, the device automatically loads the next "chunk" in the playlist, and the stream continues until the end of your broadcast.
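In HLS, the playlist of chunks described above is a plain-text `.m3u8` file that the player re-fetches as the broadcast continues. A simplified example is shown below (the chunk filenames are illustrative, not StreamSpot's actual naming):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
chunk0.ts
#EXTINF:10.0,
chunk1.ts
#EXTINF:10.0,
chunk2.ts
```

Each `#EXTINF` line gives the duration of the chunk that follows it, which is how the player knows when to request the next one.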
The number of "chunks" that must be loaded before a device starts playback varies from device to device, but the industry standard is 3 chunks. This means that for a standard device, at least 30 seconds of video must be loaded from your live stream before playback begins, leading to an average of 30 seconds of latency for your live stream (assuming 10-second chunks).
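The arithmetic above can be sketched in a few lines. This is a rough model, not StreamSpot's actual player logic; the 3-chunk buffer is the industry-standard figure mentioned above:

```python
# Rough startup-latency estimate for an HLS stream: the player buffers a
# fixed number of chunks before it begins playback, so the minimum latency
# is the chunk duration multiplied by the number of buffered chunks.

def startup_latency(chunk_duration_s: float, chunks_buffered: int = 3) -> float:
    """Seconds of video a player loads before playback begins."""
    return chunk_duration_s * chunks_buffered

print(startup_latency(10.0))  # 10-second chunks -> 30.0 seconds of latency
print(startup_latency(2.0))   # 2-second chunks  -> 6.0 seconds of latency
```

This also shows why shorter chunks reduce latency: the same 3-chunk buffer fills with far less video.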
If your upload bitrate isn't sufficient to send the chunks of video in succession, your viewers will be stuck on a loading screen until the chunks are uploaded and delivered. This is called buffering. You can read more about buffering here.
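The buffering condition can be made concrete: if uploading a chunk takes longer than that chunk takes to play back, the encoder falls behind in real time. The function and figures below are an illustrative sketch, not StreamSpot-specific thresholds:

```python
# Sketch of the buffering condition. A chunk of D seconds encoded at
# stream_bitrate kilobits per second contains stream_bitrate * D kilobits,
# so uploading it takes (stream_bitrate * D) / upload_bitrate seconds.
# That exceeds D (playback falls behind real time) exactly when the
# stream bitrate is greater than the available upload bitrate.

def will_buffer(stream_bitrate_kbps: float, upload_bitrate_kbps: float) -> bool:
    """True if chunks cannot be uploaded as fast as they play back."""
    return stream_bitrate_kbps > upload_bitrate_kbps

print(will_buffer(2500, 2000))  # 2.5 Mbps stream on a 2 Mbps uplink -> True
print(will_buffer(2500, 5000))  # comfortable upload headroom -> False
```

In practice you want meaningful headroom above the stream bitrate, since upload speeds fluctuate during a broadcast.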
This is not unique to StreamSpot; almost every live streaming provider uses HLS for content delivery and faces the same latency.