Media streaming has traditionally relied on real-time protocols such as RTSP (typically carrying media over RTP/UDP) and RTMP to stream media over the internet. Flash players embedded in web sites use these real-time protocols to achieve live streaming. We are now in the age of HTML5 media players, but most browsers cannot play these live streaming protocols natively.
Apple introduced a live streaming technology called HTTP Live Streaming (HLS) that delivers media to iOS devices (iPhone, iPad, and iPod touch) over plain HTTP. Safari on Mac OS X also supports this kind of streaming.
How to set up HTTP Live Streaming?
The HTTP Streaming Architecture is described in Apple's documentation [1]. The main parts of the streamer are a media encoder and a segmenter.
1. Encoding for HTTP Live Streaming.
The media encoder encodes the raw media into a format the target device supports. For iOS devices this is H.264 video with HE-AAC audio; MP3 can also be used for the audio track. We can use ffmpeg as the media encoder. The bitrate, aspect ratio, frame rate, etc. can be specified to the encoder while encoding the media.
The ffmpeg command for encoding video at a bitrate of 600k with a 16:9 aspect ratio is shown below:
ffmpeg -er 4 -i inputVideo.mp4 -f mpegts -acodec libmp3lame -ar 32000 -ab 48k -s 640x360 -vcodec libx264 -b 600k -flags +loop+mv4 -cmp 256 -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 1 -refs 5 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 600k -maxrate 600k -bufsize 600k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 16:9 -r 30 -g 90 -async 2 outputVideo.ts
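These option names come from older ffmpeg builds; recent ffmpeg versions have renamed several of them (for example -b is now -b:v, -ab is -b:a, and -vcodec/-acodec are -c:v/-c:a). A roughly equivalent command on a newer build might look like the following sketch (the file names and rates are just the placeholders from the example above):
ffmpeg -i inputVideo.mp4 -c:v libx264 -b:v 600k -maxrate 600k -bufsize 600k -s 640x360 -aspect 16:9 -r 30 -g 90 -c:a libmp3lame -ar 32000 -b:a 48k -f mpegts outputVideo.ts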
2. Segmenting media for distribution.
The segmenter divides the encoded media into segments of a specified duration; the default duration is 10 seconds. It also creates an index file (prog_index.m3u8) that references the media segments. The command for segmentation is shown below:
cat outputVideo.ts | mediastreamsegmenter -b <segment-url> -f <segments-location>
The media segments and the index file are created at the 'segments-location'. We can access the media through 'segment-url/prog_index.m3u8'. For example:
cat outputVideo.ts | mediastreamsegmenter -b http://mystream/stream -f /myserver/stream/
The streaming media is available through http://mystream/stream/prog_index.m3u8.
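For reference, the index file is a plain-text M3U8 playlist. Its exact contents depend on the segmenter settings, but for a finished (non-live) stream it looks roughly like this (the fileSequenceN.ts names are only the segmenter's default naming and may differ):
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
fileSequence0.ts
#EXTINF:10,
fileSequence1.ts
#EXTINF:10,
fileSequence2.ts
#EXT-X-ENDLIST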
We can also use a single command for encoding and segmenting:
ffmpeg -er 4 -i inputVideo.mp4 -f mpegts -acodec libmp3lame -ar 32000 -ab 48k -s 640x360 -vcodec libx264 -b 600k -flags +loop+mv4 -cmp 256 -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 1 -refs 5 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 600k -maxrate 600k -bufsize 600k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 16:9 -r 30 -g 90 -async 2 - | mediastreamsegmenter -b http://mystream/stream -f /myserver/stream/
Note the '-' before the pipe: it tells ffmpeg to write the MPEG-TS output to stdout so the segmenter can read it from the pipe.
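Once the playlist is being served over HTTP, devices that support HLS natively (Safari on iOS and Mac OS X) can play it simply by pointing an HTML5 video element at it. A minimal sketch using the example URL above:
<video src="http://mystream/stream/prog_index.m3u8" controls></video>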
References
1. http://developer.apple.com/library/ios/#documentation/networkinginternet/conceptual/streamingmediaguide/HTTPStreamingArchitecture/HTTPStreamingArchitecture.html#//apple_ref/doc/uid/TP40008332-CH101-SW2
2. http://developer.apple.com/library/ios/#technotes/tn2010/tn2224.html#//apple_ref/doc/uid/DTS40009745
3. http://www.ffmpeg.org/