Live Streaming in Android
TRANSCRIPT
*
Ahmet Oğuz Mermerkaya
● Author of an Android application programming book, Merhaba Android, in Turkish
● Member of GDG Ankara
● Software Developer at Aselsan, a defense industry company in Turkiye
*
Outline
● How live streaming works
● Building FFmpeg (with x264 and fdk-aac)
● Using Vitamio MediaPlayer
● Implementing an RTSP Server
● Sending Previews and Audio From RTSP Server
● Receiving and Playing Audio On Client
*
How live streaming works
● RTSP (Real Time Streaming Protocol)
● RTP (Real-time Transport Protocol)
*
How live streaming works
● RTSP (Real Time Streaming Protocol)
● RTP (Real-time Transport Protocol)
This is where FFmpeg comes into play
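As a rough illustration, the control flow behind these protocols can be sketched as an assumed (not captured) session: the client drives the session over TCP with RTSP requests, and the media itself then flows over RTP/UDP. The URL and port below are placeholders.

```
C -> S: DESCRIBE rtsp://server:port/live.ts RTSP/1.0
        CSeq: 1
S -> C: RTSP/1.0 200 OK
        CSeq: 1
        Content-Type: application/sdp
        (SDP body describing the audio/video streams)
C -> S: SETUP rtsp://server:port/live.ts RTSP/1.0
        CSeq: 2
        Transport: RTP/AVP;unicast;client_port=...
C -> S: PLAY rtsp://server:port/live.ts RTSP/1.0
        CSeq: 3
S -> C: RTSP/1.0 200 OK, then media packets over RTP/UDP
```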
*
What is FFmpeg?
● Open-source, cross-platform multimedia framework
● Supports almost all codecs and formats (H264, H263, AAC, AMR, MP4, MP3, AVI)
● Streams audio and video
ffmpeg.org
*
Building FFmpeg
● Download
o FFmpeg (http://ffmpeg.org)
o fdk-aac for the AAC encoder (http://sourceforge.net/projects/opencore-amr/files/fdk-aac/)
o libx264 source code for the H264 encoder (http://www.videolan.org/developers/x264.html)
o Android NDK for cross-compiling (http://developer.android.com)
● Configure & Make
*
Building FFmpeg - Configure & Make

export PATH=$ANDROID_NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin:$PATH

# libx264
./configure --cross-prefix=arm-linux-androideabi- --enable-pic --disable-cli --enable-static --disable-shared --host=arm-linux --extra-cflags="-march=armv7-a -mfloat-abi=softfp" --sysroot=/path/to/android-ndk/platforms/android-8/arch-arm/
make

# fdk-aac
./configure --host=arm-linux-androideabi --enable-static --prefix=/path/to/fdk-aac/ --disable-shared
make

# ffmpeg
./configure --disable-everything --enable-gpl --enable-libx264 --enable-libfdk-aac --enable-nonfree --enable-pthreads --cross-prefix=arm-linux-androideabi- --arch=arm --sysroot=/path/to/android-ndk/platforms/android-8/arch-arm/ --extra-cflags="-march=armv7-a -mfloat-abi=softfp" --extra-cflags="-I./x264" --extra-cflags="-I./fdk-aac/include/" --extra-ldflags="-L./x264" --extra-ldflags="-L./fdk-aac/lib" --datadir="/data/ffmpeg/bin/ffpresets" --enable-version3 --enable-decoder=rawvideo --enable-demuxer=rawvideo --enable-decoder=aac --enable-demuxer=aac --enable-muxer=h264 --enable-encoder=pcm_s16le --enable-protocol=rtp --enable-protocol=udp --enable-muxer=rtp --enable-demuxer=image2pipe --enable-muxer=adts --enable-muxer=pcm_s16le --enable-demuxer=pcm_s16le --enable-demuxer=h264 --enable-filter=scale --enable-encoder=libx264 --enable-encoder=libfdk_aac --enable-protocol=pipe --enable-decoder=pcm_s16le --enable-filter=aresample
make
*
Using Vitamio Player
● Get it from vitamio.org and extract
VideoView videoView = (VideoView) findViewById(R.id.videoView);
videoView.setVideoPath("rtsp://IP_OF_ANDROID_STREAM_SERVER:PORT/live.ts");
● To start Vitamio playback with a partial buffer, follow the instructions at http://vitamio.org/topics/104?locale=en
Then we need an RTSP server
*
Implementing an RTSP Server
[Diagram: Client (MediaPlayer - Vitamio) <-> RTSP Server]
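A server sketch in plain Java (hypothetical class name MiniRtspServer, not from the talk) showing the request/response shape an RTSP client expects; a real server must also parse SETUP and PLAY and negotiate the RTP transport:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal RTSP-style responder: answers one request with 200 OK,
// echoing the client's CSeq header as the protocol requires.
class MiniRtspServer implements Runnable {
    private final ServerSocket serverSocket;

    MiniRtspServer(int port) throws IOException {
        serverSocket = new ServerSocket(port); // port 0 picks a free port
    }

    int getPort() {
        return serverSocket.getLocalPort();
    }

    public void run() {
        try (Socket client = serverSocket.accept()) {
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream()));
            PrintWriter out = new PrintWriter(client.getOutputStream());
            String requestLine = in.readLine(); // e.g. "OPTIONS rtsp://... RTSP/1.0"
            String cseq = "CSeq: 1";
            String line;
            while ((line = in.readLine()) != null && !line.isEmpty()) {
                if (line.startsWith("CSeq")) {
                    cseq = line; // responses must carry the request's CSeq
                }
            }
            out.print("RTSP/1.0 200 OK\r\n" + cseq + "\r\n"
                    + "Public: OPTIONS, DESCRIBE, SETUP, PLAY, TEARDOWN\r\n\r\n");
            out.flush();
        } catch (IOException ignored) {
        }
    }
}
```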
*
Sending Previews From RTSP Server

public void startVideo(String address, int port) throws IOException {
    String videoCommand = "path/to/ffmpeg -analyzeduration 0 -pix_fmt nv21"
            + " -s 480x360 -vcodec rawvideo -f image2pipe -i - -s 320x240 -crf 18"
            + " -preset ultrafast -vcodec libx264 -f rtp rtp://" + address + ":" + port;
    Process ffmpegVideoProcess = Runtime.getRuntime().exec(videoCommand);
    final OutputStream ostream = ffmpegVideoProcess.getOutputStream();

    getCamera().setPreviewCallback(new PreviewCallback() {
        public void onPreviewFrame(byte[] buffer, Camera cam) {
            try {
                ostream.write(buffer);
                ostream.flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    });
}
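The pipe carries no framing, so ffmpeg's rawvideo input relies on every preview buffer being exactly one NV21 frame of the declared size. A small helper (hypothetical name Nv21, not from the talk) makes the expected byte count explicit:

```java
// NV21 packs a full-resolution 8-bit Y plane followed by a half-resolution
// interleaved VU plane, i.e. width*height + width*height/2 bytes per frame.
class Nv21 {
    static int frameSize(int width, int height) {
        return width * height * 3 / 2;
    }
}
```

For the 480x360 preview above, that is 259200 bytes per onPreviewFrame callback.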
*
Sending Audio From RTSP Server

public void startAudio(String address, int port) throws IOException {
    String audioCommand = "path/to/ffmpeg -analyzeduration 0 -f s16le -ar 44100"
            + " -ac 1 -i - -ac 1 -acodec libfdk_aac -f adts -vbr 3"
            + " udp://" + address + ":" + port + "/";
    Process ffmpegAudioProcess = Runtime.getRuntime().exec(audioCommand);
    final OutputStream ostream = ffmpegAudioProcess.getOutputStream();
    prepareAudioRecord();
    new Thread() {
        public void run() {
            while (true) {
                try {
                    int len = audioRecord.read(audioBuffer, 0, audioBuffer.length);
                    ostream.write(audioBuffer, 0, len);
                    ostream.flush();
                } catch (IOException e) {
                    e.printStackTrace();
                    break;
                }
            }
        }
    }.start();
}
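ffmpeg's -f s16le input expects little-endian 16-bit samples. The loop above reads into a byte[] directly, but if AudioRecord were read into a short[] instead, each sample would need to be serialized low byte first before going into the pipe. A sketch (hypothetical helper, assuming short-based reads):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

class PcmCodec {
    // Serializes 16-bit samples as little-endian bytes, matching ffmpeg's s16le input.
    static byte[] toLittleEndian(short[] samples, int len) {
        ByteBuffer buf = ByteBuffer.allocate(len * 2).order(ByteOrder.LITTLE_ENDIAN);
        for (int i = 0; i < len; i++) {
            buf.putShort(samples[i]);
        }
        return buf.array();
    }
}
```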
*
Receiving Audio On Client

public void receiveAudio() throws IOException {
    String audioCommand = "path/to/ffmpeg -analyzeduration 0 -f aac -strict -2 -acodec"
            + " aac -b:a 120k -ac 1 -i - -ac 1 -acodec pcm_s16le -ar 44100 -f s16le -";
    Process ffmpegAudioProcess = Runtime.getRuntime().exec(audioCommand);
    final OutputStream ostream = ffmpegAudioProcess.getOutputStream();
    final DatagramSocket udpsocket = new DatagramSocket(PORT);
    final DatagramPacket packet = new DatagramPacket(new byte[2048], 2048);
    new Thread() {
        public void run() {
            while (true) {
                try {
                    udpsocket.receive(packet);
                    ostream.write(packet.getData(), 0, packet.getLength());
                    ostream.flush();
                } catch (IOException e) {
                    e.printStackTrace();
                    break;
                }
            }
        }
    }.start();
    playAudio(ffmpegAudioProcess.getInputStream());
}
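The packet pump above must copy exactly getLength() bytes, since the 2048-byte backing array is usually only partially filled by a datagram. A pure-Java sketch of one iteration (hypothetical helper, testable without Android):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.util.Arrays;

class UdpReceiver {
    // Receives a single datagram and returns only the bytes actually received,
    // not the whole 2048-byte buffer.
    static byte[] receiveOne(DatagramSocket socket) throws Exception {
        DatagramPacket packet = new DatagramPacket(new byte[2048], 2048);
        socket.receive(packet);
        return Arrays.copyOf(packet.getData(), packet.getLength());
    }
}
```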
*
Playing Audio On Client

public void playAudio(final InputStream istream) throws IOException {
    byte[] buffer = new byte[2048];
    int bufferSize = AudioTrack.getMinBufferSize(44100, ...);
    audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
            bufferSize, AudioTrack.MODE_STREAM);
    audioTrack.play();
    while (true) {
        int len = istream.read(buffer, 0, buffer.length);
        audioTrack.write(buffer, 0, len);
        audioTrack.flush();
    }
}
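The playback loop's key detail is honoring the length returned by read(), which can be shorter than the buffer. A pure-Java sketch of the same drain loop (hypothetical helper; the real code writes to an AudioTrack instead of an OutputStream):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

class StreamPump {
    // Copies the stream in 2048-byte chunks, writing only the bytes each read() returned.
    static int pump(InputStream istream, OutputStream sink) throws IOException {
        byte[] buffer = new byte[2048];
        int total = 0;
        int len;
        while ((len = istream.read(buffer, 0, buffer.length)) != -1) {
            sink.write(buffer, 0, len);
            total += len;
        }
        return total;
    }
}
```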
Demo
*
Android Developer Days
● ~1500 expected participants
● Partner of Droidcon.com
● 15 co-organizers from 7 countries
● Free of charge
● This year more inspiration, more networking and more fun
● ADD 2012 web site -> www.androiddeveloperdays.com/2012

Date: June 14-15, 2013
Venue: METU, Ankara, Turkiye
www.androiddeveloperdays.com