A solution for streaming H.264, H.263, AMR, AAC using RTP on Android

Overview

Introduction

What it does

libstreaming is an API that allows you, with only a few lines of code, to stream the camera and/or microphone of an Android-powered device using RTP over UDP.

  • Android 4.0 or more recent is required.
  • Supported encoders include H.264, H.263, AAC and AMR.

The first step needed to start a streaming session with a peer is called 'signaling'. During this step you contact the receiver and send a description of the incoming streams. libstreaming offers three ways to do that.

  • With the RTSP client: if you want to stream to a Wowza Media Server, this is the way to go. Example 3 illustrates this use case.
  • With the RTSP server: the phone acts as an RTSP server and waits for an RTSP client to request a stream. Example 1 illustrates this use case.
  • Without the RTSP protocol at all: you signal the session yourself by sending its SDP description over a protocol of your choice. Example 2 illustrates this use case.

The full javadoc documentation of the API is available here: http://guigui.us/libstreaming/doc

How does it work? You should really read this, it's important!

There are three ways on Android to get encoded data from the peripherals:

  • With the MediaRecorder API and a simple hack.
  • With the MediaCodec API and the buffer-to-buffer method which requires Android 4.1.
  • With the MediaCodec API and the surface-to-buffer method which requires Android 4.3.

Encoding with the MediaRecorder API

The MediaRecorder API was not intended for streaming applications but can be used to retrieve encoded data from the peripherals of the phone. The trick is to configure a MediaRecorder instance to write to a LocalSocket instead of a regular file (see MediaStream.java).

Edit: as of Android Lollipop, using a LocalSocket is no longer possible for security reasons, but using a ParcelFileDescriptor does the trick. More details in the file MediaStream.java! (Thanks to those guys for the insight)
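The trick can be sketched as follows. This is a minimal illustration, not the library's actual code: the worker thread and the container-header parsing done in MediaStream.java are only hinted at, and the class name is made up.

```java
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;
import java.io.InputStream;

// Minimal sketch of the MediaRecorder-to-pipe trick (assumes Android 5.0+).
public class PipeRecorderSketch {
    public InputStream startRecorder() throws Exception {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();

        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        // Write to the pipe instead of a regular file:
        recorder.setOutputFile(pipe[1].getFileDescriptor());
        recorder.prepare();
        recorder.start();

        // A worker thread then reads the byte stream from the read side of the
        // pipe, skips the container header and extracts the H.264 NAL units.
        return new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
    }
}
```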

This hack has some limitations:

  • Lip sync can be approximate.
  • The MediaRecorder's internal buffers can introduce significant jitter; libstreaming tries to compensate for it.

It is hard to predict how well this hack will work on a given phone, but it does work well on many devices.

Encoding with the MediaCodec API

The MediaCodec API does not have the limitations just mentioned, but it has its own issues. There are actually two ways to use the MediaCodec API: with buffers or with a surface.

The buffer-to-buffer method uses calls to dequeueInputBuffer and queueInputBuffer to feed the encoder with raw data. That seems easy, right? Well, it is not, because the video encoders you access through this API use different color formats, and you need to support all of them. A list of those color formats is available here. Moreover, many encoders claim support for color formats that they do not actually handle properly, or present small glitches.

The whole hw package is dedicated to solving those issues. See in particular the EncoderDebugger class.
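As an example of the color-format problem: Android cameras typically deliver NV21 preview frames, while many encoders advertising COLOR_FormatYUV420SemiPlanar expect NV12, which interleaves the chroma bytes in the opposite order. A minimal, illustrative conversion looks like this (EncoderDebugger deals with many more layouts and quirks):

```java
/** Converts an NV21 frame (camera default, V/U interleaved) to NV12
 *  (U/V interleaved). Illustrative only: a real implementation would
 *  also handle stride/padding and planar layouts. */
public final class Nv21ToNv12 {
    public static byte[] convert(byte[] nv21, int width, int height) {
        byte[] nv12 = new byte[nv21.length];
        int ySize = width * height;
        // The luma plane is identical in both layouts.
        System.arraycopy(nv21, 0, nv12, 0, ySize);
        // Swap every V/U pair of the chroma plane.
        for (int i = ySize; i < nv21.length - 1; i += 2) {
            nv12[i]     = nv21[i + 1]; // U
            nv12[i + 1] = nv21[i];     // V
        }
        return nv12;
    }
}
```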

If streaming with that API fails, libstreaming falls back to streaming with the MediaRecorder API.

The surface-to-buffer method uses the createInputSurface() method. This method is probably the best way to encode raw video from the camera, but it requires Android 4.3 or newer.

The gl package is dedicated to using the MediaCodec API with a surface.

It is not yet enabled by default in libstreaming but you can force it with the setStreamingMethod(byte) method.
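Under the hood, the surface-to-buffer path boils down to configuring the encoder and asking it for an input surface. The sketch below shows the general shape; the parameter values are illustrative and error handling is omitted:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Hedged sketch of the surface-to-buffer method (API 18+).
public class SurfaceEncoderSketch {
    public MediaCodec createEncoder() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Must be called between configure() and start():
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();

        // Camera frames are then rendered onto inputSurface (via OpenGL in the
        // gl package), and encoded data is drained with dequeueOutputBuffer().
        return encoder;
    }
}
```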

Packetization process

Once raw data from the peripherals has been encoded, it is encapsulated in a proper RTP stream. The packetization algorithm that must be used depends on the format of the data (H.264, H.263, AMR or AAC); each one is specified in its respective RFC:

  • RFC 3984 for H.264: H264Packetizer.java
  • RFC 4629 for H.263: H263Packetizer.java
  • RFC 3267 for AMR: AMRNBPacketizer.java
  • RFC 3640 for AAC: AACADTSPacketizer.java or AACLATMPacketizer.java
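For instance, RFC 3984's FU-A mode splits a large H.264 NAL unit across several RTP packets by replacing its one-byte header with two bytes. The helper below computes those two bytes; it is illustrative only, see H264Packetizer.java for the real implementation:

```java
/** FU-A header bytes for fragmenting a H.264 NAL unit (RFC 3984, section 5.8). */
public final class FuA {
    public static final int FU_A_TYPE = 28;

    /** First byte (FU indicator): F and NRI bits of the original NAL header + type 28. */
    public static int fuIndicator(int nalHeader) {
        return (nalHeader & 0xE0) | FU_A_TYPE;
    }

    /** Second byte (FU header): start/end flags + original NAL unit type. */
    public static int fuHeader(int nalHeader, boolean start, boolean end) {
        return (start ? 0x80 : 0) | (end ? 0x40 : 0) | (nalHeader & 0x1F);
    }
}
```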

If you are looking for a basic implementation of one of the RFCs mentioned above, check the sources of the corresponding class.

Since version 2.0 of libstreaming, RTCP packets are also sent to the receiver. Only Sender Reports are implemented; they are needed for lip sync.
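A Sender Report pairs the current RTP timestamp of a stream with a 64-bit NTP wallclock timestamp; because the audio and video reports reference the same wallclock, the receiver can align the two streams. The conversion from Unix time to the NTP timestamp format of RFC 3550 can be sketched like this (illustrative, not the library's code):

```java
/** 64-bit NTP timestamp as carried in RTCP Sender Reports (RFC 3550):
 *  32 bits of seconds since 1900-01-01, then 32 bits of fractional second. */
public final class NtpTime {
    // Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
    static final long NTP_UNIX_OFFSET = 2208988800L;

    public static long fromUnixMillis(long millis) {
        long seconds = millis / 1000 + NTP_UNIX_OFFSET;
        long fraction = ((millis % 1000) << 32) / 1000;
        return (seconds << 32) | fraction;
    }
}
```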

The rtp package handles packetization of encoded data in RTP packets.
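Concretely, each packetizer prepends the 12-byte fixed RTP header of RFC 3550 to every chunk of encoded data. A stripped-down sketch of that header (the class name is made up; the real code lives in RtpSocket.java and the packetizers):

```java
/** Builds the 12-byte fixed RTP header of RFC 3550. */
public final class RtpHeader {
    public static byte[] build(int payloadType, int seq, long timestamp,
                               int ssrc, boolean marker) {
        byte[] h = new byte[12];
        h[0] = (byte) 0x80;                                       // V=2, P=0, X=0, CC=0
        h[1] = (byte) ((marker ? 0x80 : 0) | (payloadType & 0x7F));
        h[2] = (byte) (seq >> 8);                                 // sequence number
        h[3] = (byte) seq;
        h[4] = (byte) (timestamp >> 24);                          // RTP timestamp
        h[5] = (byte) (timestamp >> 16);
        h[6] = (byte) (timestamp >> 8);
        h[7] = (byte) timestamp;
        h[8] = (byte) (ssrc >> 24);                               // stream source id
        h[9] = (byte) (ssrc >> 16);
        h[10] = (byte) (ssrc >> 8);
        h[11] = (byte) ssrc;
        return h;
    }
}
```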

Using libstreaming in your app

Required permissions

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />

How to stream H.264 and AAC

This example is extracted from this simple Android app. It could be part of an Activity, a Fragment or a Service.

    protected void onCreate(Bundle savedInstanceState) {

        ...

        mSession = SessionBuilder.getInstance()
                .setCallback(this)
                .setSurfaceView(mSurfaceView)
                .setPreviewOrientation(90)
                .setContext(getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_NONE)
                .setAudioQuality(new AudioQuality(16000, 32000))
                .setVideoEncoder(SessionBuilder.VIDEO_H264)
                .setVideoQuality(new VideoQuality(320, 240, 20, 500000))
                .build();

        mSurfaceView.getHolder().addCallback(this);

        ...

    }

    public void onPreviewStarted() {
        Log.d(TAG, "Preview started.");
    }

    @Override
    public void onSessionConfigured() {
        Log.d(TAG, "Preview configured.");
        // Once the stream is configured, you can get a SDP formatted session description
        // that you can send to the receiver of the stream.
        // For example, to receive the stream in VLC, store the session description in a .sdp file
        // and open it with VLC while streaming.
        Log.d(TAG, mSession.getSessionDescription());
        mSession.start();
    }

    @Override
    public void onSessionStarted() {
        Log.d(TAG, "Streaming session started.");
        ...
    }

    @Override
    public void onSessionStopped() {
        Log.d(TAG, "Streaming session stopped.");
        ...
    }

    @Override
    public void onBitrateUpdate(long bitrate) {
        // Informs you of the bandwidth consumption of the streams
        Log.d(TAG, "Bitrate: " + bitrate);
    }

    @Override
    public void onSessionError(int message, int streamType, Exception e) {
        // Might happen if streaming at the requested resolution is not supported
        // or if the preview surface is not ready...
        // Check the Session class for a list of the possible errors.
        Log.e(TAG, "An error occurred", e);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {

    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // Starts the preview of the Camera
        mSession.startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Stops the streaming session
        mSession.stop();
    }

The SessionBuilder simply facilitates the creation of Session objects. The call to setSurfaceView is needed for video streaming; that should not come as a surprise, since Android requires a valid surface for recording video (an annoying limitation of the MediaRecorder API). On Android 4.3, streaming with no SurfaceView is possible but not yet implemented. The call to setContext(Context) is necessary: it allows H264Stream and AACStream objects to store and recover data using SharedPreferences.

A Session object represents a streaming session to some peer. It contains one or more Stream objects that are started (resp. stopped) when the start() (resp. stop()) method is invoked.

The method getSessionDescription() returns the SDP description of the session as a String. Before calling it, you must make sure that the Session has been configured: after calling configure() or startPreview() on your Session instance, the callback onSessionConfigured() will be invoked.

In the example presented above, the Session instance is used asynchronously: calls to its methods do not block, and the callbacks tell you when each operation has completed.

You can also use a Session object synchronously, like this:

    // Blocks until all the streams are configured
    try {
         mSession.syncConfigure();
    } catch (Exception e) {
         ...
    }
    String sdp = mSession.getSessionDescription();
    ...
    // Blocks until streaming actually starts.
    try {
         mSession.syncStart();
    } catch (Exception e) {
         ...
    }
    ...
    mSession.syncStop();

How to use the RTSP client

Check out this page of the wiki and the example 3.

How to use the RTSP server

Add this to your manifest:

<service android:name="net.majorkernelpanic.streaming.rtsp.RtspServer"/>

If you decide to override RtspServer, change the line above accordingly.

You can change the port used by the RtspServer:

Editor editor = PreferenceManager.getDefaultSharedPreferences(this).edit();
editor.putString(RtspServer.KEY_PORT, String.valueOf(1234));
editor.commit();

The port is indeed stored as a String in the preferences, and there is a good reason for that: the EditTextPreference object saves its input as a String and cannot easily be configured to store it as an Integer (one would need to override it).

Configure its behavior with the SessionBuilder:

SessionBuilder.getInstance()
        .setSurfaceHolder(mSurfaceView.getHolder())
        .setContext(getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setVideoEncoder(SessionBuilder.VIDEO_H264);

Start and stop the server like this:

// Starts the RTSP server
context.startService(new Intent(this,RtspServer.class));
// Stops the RTSP server
context.stopService(new Intent(this,RtspServer.class));

Spydroid-ipcamera

Visit this GitHub page to see how this streaming stack can be used and how it performs.
