Streaming to the Apple iPhone, Part 1
Nov 17, 2009 12:00 PM, By Jan Ozer
How does it work?
There's an old joke about how many [insert target group here]s does it take to change a light bulb? The answer, of course, is two: one to call the electrician, another to mix the daiquiris.
Yeah, I know, old and not funny. The point is that you don't have to pull the pieces together yourself to make it work; you can call your friendly content-delivery network (CDN) or online video platform (OVP) provider, write a check, and get started right away, which is easier, cheaper, and faster than rolling your own system. Still, it always helps to know a little bit about the plumbing, so here's a buzzword-level description. There won't be a pop quiz, but feel free to pay attention anyway.
First, HTTP Live Streaming works with both live and on-demand feeds. As mentioned, the iPhone can receive the stream via Wi-Fi or the cellular network, and can switch back and forth between the two while playing back a single video. Though many producers who distribute to the iPhone use adaptive bitrate streaming, you can also distribute single-data-rate video files.
While HTTP Live Streaming is technically codec-agnostic, the current implementation requires H.264 video encoded using the Baseline profile Level 3 and HE-AAC or AAC-LC audio up to 48kHz stereo, muxed as an MPEG-2 transport stream that has a .ts extension. You can also stream an audio-only .mp3 file. Though any encoder capable of producing files to the specification should work, Apple has tested compatibility with only two hardware encoders: Inlet Technologies' Spinnaker 7000 and Envivio's 4Caster C4.
The .M3U8 index file
After encoding, a free stream segmenter (see Figure 2) available from Apple divides the encoded file into separate files of equal duration, usually 10 seconds, all with .ts extensions. The segmenter also creates an index file that contains a list of the media file segments and their location (see Figure 4). Index files have a .M3U8 extension, and they serve as the glue between the player and the segmented files.
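To make this concrete, here is what a simple on-demand index file produced by the segmenter might look like (filenames and durations are hypothetical; the tags shown are part of the HTTP Live Streaming playlist format, and real index files may carry additional tags):

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
segment1.ts
#EXTINF:10,
segment2.ts
#EXTINF:10,
segment3.ts
#EXT-X-ENDLIST
```

The #EXTINF tags give each segment's duration, and #EXT-X-ENDLIST marks the playlist as complete, which is what distinguishes an on-demand index from a live one.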
During playback, the player downloads the index file first, then downloads and plays the media files in sequence. After playing segment1.ts, for example, the player can check the index file, see that segment2.ts is next, and determine its location.
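That playback sequence can be sketched in a few lines of Python. This is a simplified illustration, not how the iPhone's player is actually implemented: it only extracts the segment list, ignoring the playlist's tags.

```python
def parse_index(m3u8_text):
    """Return the media segment URIs listed in an .M3U8 index file.

    Lines starting with '#' are tags or comments; every other
    non-blank line names a media segment (e.g. segment1.ts).
    """
    segments = []
    for line in m3u8_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            segments.append(line)
    return segments


# A tiny hypothetical on-demand index file.
index = """#EXTM3U
#EXT-X-TARGETDURATION:10
#EXTINF:10,
segment1.ts
#EXTINF:10,
segment2.ts
#EXT-X-ENDLIST
"""

# The player would download and play these in order.
print(parse_index(index))  # ['segment1.ts', 'segment2.ts']
```

A real player would fetch each URI in turn over HTTP and begin playback as soon as the first segment arrives, which is why the segments are kept short.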
In a live scenario, the index file is continually updated and downloaded by the player, so the player always knows which segments to retrieve and from where. If you leave references to all segments in the index file, latecomers can use the file to view the stream and catch up to current playback. If desired, this index file can also be used for immediate on-demand playback after the end of the live event.
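The live-refresh logic amounts to diffing two successive downloads of the index file: anything listed in the new copy that the player hasn't already fetched is next in the queue. A minimal sketch, assuming a sliding-window index that drops old segments as new ones are appended (the real protocol also uses a media-sequence tag to track the window):

```python
def new_segments(old_index, new_index):
    """Given two successive downloads of a live index file, return
    the segment URIs that appeared since the previous download."""

    def uris(text):
        # Non-blank lines that aren't tags name media segments.
        return [line.strip() for line in text.splitlines()
                if line.strip() and not line.startswith("#")]

    seen = set(uris(old_index))
    return [u for u in uris(new_index) if u not in seen]


# Hypothetical snapshots of a live index, a few seconds apart.
first = "#EXTM3U\n#EXTINF:10,\nseg3.ts\n#EXTINF:10,\nseg4.ts\n"
later = "#EXTM3U\n#EXTINF:10,\nseg4.ts\n#EXTINF:10,\nseg5.ts\n"

print(new_segments(first, later))  # ['seg5.ts']
```

If the producer instead leaves every segment in the index, as described above, a latecomer's first download already lists the whole event, which is what makes immediate on-demand playback possible.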