Design of an RTSP/RTMP player on the Android platform

Background

When developing an RTSP or RTMP player on the Android platform, there are many points that deserve attention. Taking the daniulive SmartPlayer SDK (official) as an example, the following briefly walks through the interface design:

Interface design

1. Open interface

The Open interface creates a player instance and, on success, returns the instance handle. For multiple concurrent playback requests, simply create multiple instances.

	/**
	 * Initialize the player (create a playback instance)
	 *
	 * @param ctx: get by this.getApplicationContext()
	 *
	 * <pre>This function must be called first.</pre>
	 *
	 * @return player handle if successful; a return value of 0 means initialization failed.
	 */

	public native long SmartPlayerOpen(Object ctx);

2. Close() interface

The Close interface is the counterpart of the Open() interface and releases the resources of the corresponding instance. After calling Close(), remember to set the instance handle to 0.

Note: a single instance may be playing, recording, and pulling (forwarding) the stream at the same time. In that case, make sure recording and stream pulling have stopped normally before calling the Close() interface.

	/**
	 * When a playback instance is finished, the Close interface must be called to release its resources
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * <pre> NOTE: the player handle must not be used after this function is called. </pre>
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerClose(long handle);
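To make the handle rule harder to get wrong, the app layer can wrap the handle in a small guard object that tracks its lifecycle. This is an illustrative sketch, not part of the SDK; the `PlayerHandle` class name is hypothetical.

```java
// Illustrative guard (not SDK code): wraps the instance handle returned by
// SmartPlayerOpen() and enforces the "close once, then zero the handle" rule.
public class PlayerHandle {
    private long handle;

    public PlayerHandle(long handle) {
        this.handle = handle;
    }

    // A handle of 0 means the instance failed to open or was already closed.
    public boolean isValid() {
        return handle != 0;
    }

    public long get() {
        return handle;
    }

    // Call this after SmartPlayerClose() returns successfully.
    public void markClosed() {
        handle = 0;
    }
}
```

With this in place, every native call can be gated on `isValid()` instead of comparing raw longs against 0 at each call site.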

3. Network status callback

A good player needs good status callbacks: network connection status, snapshot results, video status, current download speed, and other real-time feedback. These let upper-layer developers track the state of the playback side and give users a better playback experience.

	/**
	 * Set event callback
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param callbackv2: callback function
	 *
	 * @return {0} if successful
	 */
	public native int SetSmartPlayerEventCallbackV2(long handle, NTSmartEventCallbackV2 callbackv2);

A demo implementation:

    class EventHandeV2 implements NTSmartEventCallbackV2 {
        @Override
        public void onNTSmartEventCallbackV2(long handle, int id, long param1,
                                             long param2, String param3, String param4, Object param5) {

            //Log.i(TAG, "EventHandeV2: handle=" + handle + " id:" + id);

            String player_event = "";

            switch (id) {
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_STARTED:
                    player_event = "start..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTING:
                    player_event = "Connecting..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTION_FAILED:
                    player_event = "connection failed..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_CONNECTED:
                    player_event = "Connection succeeded..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_DISCONNECTED:
                    player_event = "Disconnected..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_STOP:
                    player_event = "stop playing..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_RESOLUTION_INFO:
                    player_event = "Resolution information: width: " + param1 + ", height: " + param2;
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_NO_MEDIADATA_RECEIVED:
                    player_event = "Media data not received, possibly url error..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_SWITCH_URL:
                    player_event = "Switch playback URL..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_CAPTURE_IMAGE:
                    player_event = "snapshot: " + param1 + " path:" + param3;

                    if (param1 == 0) {
                        player_event = player_event + ", snapshot saved successfully";
                    } else {
                        player_event = player_event + ", failed to save snapshot";
                    }
                    break;

                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_RECORDER_START_NEW_FILE:
                    player_event = "[record]Start a new video file : " + param3;
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_ONE_RECORDER_FILE_FINISHED:
                    player_event = "[record]A video file has been generated : " + param3;
                    break;

                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_START_BUFFERING:
                    Log.i(TAG, "Start Buffering");
                    break;

                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_BUFFERING:
                    Log.i(TAG, "Buffering:" + param1 + "%");
                    break;

                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_STOP_BUFFERING:
                    Log.i(TAG, "Stop Buffering");
                    break;

                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_DOWNLOAD_SPEED:
                    player_event = "download_speed:" + param1 + "Byte/s" + ", "
                            + (param1 * 8 / 1000) + "kbps" + ", " + (param1 / 1024)
                            + "KB/s";
                    break;

                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_RTSP_STATUS_CODE:
                    Log.e(TAG, "RTSP error code received, please make sure username/password is correct, error code:" + param1);
                    player_event = "RTSP error code:" + param1;
                    break;

                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_NEED_KEY:
                    Log.e(TAG, "RTMP encrypted stream, please set the Key required for playback..");
                    player_event = "RTMP encrypted stream, please set the Key required for playback..";
                    break;

                case NTSmartEventID.EVENT_DANIULIVE_ERC_PLAYER_KEY_ERROR:
                    Log.e(TAG, "RTMP encrypted stream, Key error, please reset..");
                    player_event = "RTMP encrypted stream, Key error, please reset..";
                    break;
            }

            if (player_event.length() > 0) {
                Log.i(TAG, player_event);
                Message message = new Message();
                message.what = PLAYER_EVENT_MSG;
                message.obj = player_event;
                handler.sendMessage(message);
            }
        }
    }
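The download-speed branch in the handler above converts the byte-per-second value in `param1` into kbps and KB/s. That conversion can be factored into a standalone helper; this is an illustrative sketch, and `DownloadSpeedFormatter` is not part of the SDK.

```java
// Illustrative helper: formats the bytes-per-second value delivered by
// EVENT_DANIULIVE_ERC_PLAYER_DOWNLOAD_SPEED the same way the demo handler does.
public class DownloadSpeedFormatter {
    public static String format(long bytesPerSecond) {
        long kbps = bytesPerSecond * 8 / 1000;   // kilobits per second
        long kBps = bytesPerSecond / 1024;       // kibibytes per second
        return bytesPerSecond + "Byte/s, " + kbps + "kbps, " + kBps + "KB/s";
    }
}
```

For example, 128000 bytes/s formats as 1024 kbps and 125 KB/s.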

4. Software or hardware decoding?

As Android has evolved, hardware decoding support in chipsets from the various vendors has become increasingly reliable. For general-purpose products, software decoding is still preferred as long as device performance allows, because its behavior is consistent across devices. For a specific device model, hardware decoding can be considered where appropriate; it is split into H.264 hardware decoding and HEVC hardware decoding, and a Surface-mode hardware render mode can be set directly. The interfaces are as follows:

	/**
	 * Set H.264 hardware decoder
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param isHWDecoder: 0: software decoder; 1: hardware decoder.
	 *
	 * @return {0} if successful
	 */
	public native int SetSmartPlayerVideoHWDecoder(long handle, int isHWDecoder);

	/**
	 * Set H.265 (HEVC) hardware decoder
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param isHevcHWDecoder: 0: software decoder; 1: hardware decoder.
	 *
	 * @return {0} if successful
	 */
	public native int SetSmartPlayerVideoHevcHWDecoder(long handle, int isHevcHWDecoder);

Set Surface-mode hardware decoding:

	/**
	 * Set MediaCodec self-drawing mode for hardware video decoding (in this mode, hardware decoding has better compatibility and efficiency, but the YUV/RGB callback and snapshot functions are unavailable)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param isHWRenderMode: 0: disabled; 1: draw onto the Surface set via SmartPlayerSetSurface
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetHWRenderMode(long handle, int isHWRenderMode);

	/**
	 * Update the Surface used for hardware-decoded rendering
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerUpdateHWRenderSurface(long handle);

5. Audio output type

For audio output, the Android platform supports AudioTrack mode and OpenSL ES mode. Generally speaking, for device compatibility, AudioTrack mode is recommended.

	/**
	 * Set audio output type
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param use_audiotrack:
	 *
	 * <pre> NOTE: if use_audiotrack is 0, the output device is auto-selected; if 1, AudioTrack mode is used. </pre>
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetAudioOutputType(long handle, int use_audiotrack);

6. Buffer time setting

Buffer time, as the name suggests, determines how much data is cached before playback starts. For example, with a buffer time of 2000 ms, live playback begins once 2 seconds of data have been received.

Increasing the buffer time increases playback latency; the benefit is better fluency when the network jitters.

	/**
	 * Set buffer time in milliseconds
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param buffer:
	 *
	 * <pre> NOTE: unit is milliseconds, range is 0-5000 ms </pre>
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetBuffer(long handle, int buffer);
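Since values outside the documented range are not meaningful, the app layer can clamp the requested buffer time to 0-5000 ms before calling SmartPlayerSetBuffer(). A minimal sketch; the `BufferPolicy` class is hypothetical, not part of the SDK.

```java
// Illustrative helper: clamps a requested buffer time into the
// 0-5000 ms range documented for SmartPlayerSetBuffer().
public class BufferPolicy {
    public static int clampBufferMs(int requestedMs) {
        if (requestedMs < 0) return 0;
        if (requestedMs > 5000) return 5000;
        return requestedMs;
    }
}
```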

7. Real-time mute and volume adjustment

As the names suggest, the player can adjust the playback volume in real time or mute it outright, which is especially useful when playing multiple streams at once.

In addition, a volume amplification function supports boosting the volume to up to twice its original level.

	/**
	 * Set mute
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_mute: 1: mute; 0: not muted
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetMute(long handle, int is_mute);

	/**
	 * Set playback volume
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param volume: The range is [0, 100], 0 is mute, 100 is the maximum volume, and the default is 100
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetAudioVolume(long handle, int volume);

8. RTSP TCP/UDP mode, timeout, and automatic mode switching

Some RTSP servers or cameras only support TCP mode or only UDP mode, so being able to set TCP or UDP explicitly matters. In addition, the design supports starting in TCP or UDP mode and, if no data is received before the timeout, automatically switching to the other transport.

	/**
	 * Set RTSP TCP/UDP mode (default UDP mode)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_using_tcp: 1: TCP mode; 0: UDP mode
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRTSPTcpMode(long handle, int is_using_tcp);

	/**
	 * Set RTSP timeout. Timeout is in seconds and must be greater than 0
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param timeout: RTSP timeout setting
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRTSPTimeout(long handle, int timeout);

	/**
	 * Set RTSP TCP/UDP auto switch
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * NOTE: some RTSP sources support RTP over UDP, others RTP over TCP.
	 * For ease of use, the automatic switch can be enabled in some scenarios: if UDP playback fails, the SDK automatically tries TCP, and if TCP playback fails, it automatically tries UDP
	 *
	 * @param is_auto_switch_tcp_udp: 1: the SDK will try switching between TCP and UDP; 0: no switching
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRTSPAutoSwitchTcpUdp(long handle, int is_auto_switch_tcp_udp);
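The auto-switch behavior can be pictured as a tiny state machine: each time the current transport times out without media data, the mode flips and the connection is retried. The sketch below is an illustrative model only; the real switching happens inside the SDK, and the `RtspTransport` class is hypothetical.

```java
// Illustrative model of the RTSP TCP/UDP auto-switch: on each timeout
// without media data, flip to the other transport mode.
public class RtspTransport {
    public static final int UDP = 0;
    public static final int TCP = 1;

    private int mode;

    public RtspTransport(int initialMode) {
        this.mode = initialMode;
    }

    public int mode() {
        return mode;
    }

    // Called when the current mode receives no data before the timeout;
    // returns the mode to retry with.
    public int onTimeout() {
        mode = (mode == UDP) ? TCP : UDP;
        return mode;
    }
}
```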

9. Quick start

Quick start is mainly used, when the server caches a GOP, to flush through to the latest data quickly while keeping the picture continuous.

	/**
	 * Set fast startup (effective when the server caches a GOP)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_fast_startup: 1: enable fast startup (instant first frame); 0: disable
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetFastStartup(long handle, int is_fast_startup);

10. Low-latency mode

In low-latency mode, the buffer time is set to 0 to reduce delay; this suits ultra-low-latency scenarios such as remote operation and control.

	/**
	 * Set low latency mode
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param mode: 1: low-latency mode; 0: normal mode
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetLowLatencyMode(long handle, int mode);

11. Video view rotation and horizontal/vertical flip

These interfaces are mainly used when the capture side cannot correct an inverted or rotated source, letting the playback side restore the image to its normal orientation.

	/**
	 * Set vertical video flip
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_flip:  0: no flip; 1: flip
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetFlipVertical(long handle, int is_flip);

	/**
	 * Set horizontal video flip
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_flip:  0: no flip; 1: flip
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetFlipHorizontal(long handle, int is_flip);

	/**
	 * Set clockwise rotation. Note that any angle other than 0 degrees consumes extra performance
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param degress:  currently 0, 90, 180 and 270 degree rotations are supported
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRotation(long handle, int degress);
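Since only four angles are accepted, the app layer may want to validate an angle, or snap an arbitrary one to the nearest supported rotation, before calling SmartPlayerSetRotation(). An illustrative helper; the `RotationPolicy` class name is hypothetical.

```java
// Illustrative helper for the 0/90/180/270 degree constraint of
// SmartPlayerSetRotation().
public class RotationPolicy {
    public static boolean isSupported(int degrees) {
        return degrees == 0 || degrees == 90 || degrees == 180 || degrees == 270;
    }

    // Snap an arbitrary angle to the nearest supported clockwise rotation.
    public static int snap(int degrees) {
        int d = ((degrees % 360) + 360) % 360;   // normalize to [0, 360)
        return ((d + 45) / 90) % 4 * 90;         // round to nearest multiple of 90
    }
}
```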

12. Real-time download speed reporting

By setting the reporting interval and whether to report the current download speed, the real-time download speed interface enables friendlier interaction between the app layer and the underlying SDK.

	/**
	 * Set real-time download speed reporting
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_report: 1: report the download speed; 0: do not report.
	 *
	 * @param report_interval: reporting interval, in seconds; must be greater than 0.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetReportDownloadSpeed(long handle, int is_report, int report_interval );

13. Real-time snapshot

In short, whether the current playback frame can be captured during playback.

	/**
	 * Set whether images may be saved during stream playback (enable the snapshot function)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_save_image: 1: the current image can be saved via SmartPlayerSaveCurImage(); 0: disabled
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSaveImageFlag(long handle, int is_save_image);

	/**
	 * Save the current image during stream playback (real-time snapshot)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param imageName: image name including the full path, e.g. "/sdcard/daniuliveimage/daniu.png".
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSaveCurImage(long handle, String imageName);
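Since SmartPlayerSaveCurImage() expects a full path, a small helper can build a timestamped file name under a chosen snapshot directory. This is an illustrative sketch; the `SnapshotPath` class and the `snapshot_` naming scheme are assumptions, not SDK conventions.

```java
// Illustrative helper: builds a full image path of the form the SDK
// expects, e.g. "/sdcard/daniuliveimage/snapshot_<timestamp>.png".
public class SnapshotPath {
    public static String build(String dir, long timestampMs) {
        String d = dir.endsWith("/") ? dir : dir + "/";
        return d + "snapshot_" + timestampMs + ".png";
    }
}
```

A typical call would pass `System.currentTimeMillis()` as the timestamp.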

14. Extended recording operations

Recording on the playback side is covered in detail: record only audio or only video, set the recording directory, set the maximum size of a single file, and, if the audio is not AAC, transcode it to AAC before recording.

	/**
	 * Create the recording directory
	 *
	 * @param path: e.g. /sdcard/daniulive/rec
	 *
	 * <pre> The interface is only used when recording stream data to the local side. </pre>
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerCreateFileDirectory(String path);

	/**
	 * Set recording directory
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param path: the directory for recorded files
	 *
	 * <pre> NOTE: make sure the path exists, otherwise the setting fails. </pre>
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRecorderDirectory(long handle, String path);

	/**
	 * Set the maximum size of a single recorded file; when a file exceeds this size, recording automatically rolls over to a new file
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param size: unit MB, range 5~500 MB; values outside this range fall back to the default of 200 MB.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRecorderFileMaxSize(long handle, int size);
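The documented fallback (sizes outside 5~500 MB become 200 MB) can be mirrored on the app side so the UI shows the size that will actually take effect. An illustrative sketch; `RecorderFileSize` is a hypothetical helper, not part of the SDK.

```java
// Illustrative helper: mirrors the documented behavior of
// SmartPlayerSetRecorderFileMaxSize(): out-of-range sizes fall back to 200 MB.
public class RecorderFileSize {
    public static int effectiveSizeMb(int requestedMb) {
        return (requestedMb >= 5 && requestedMb <= 500) ? requestedMb : 200;
    }
}
```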

	/**
	 * Enable transcoding audio to AAC while recording
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * AAC is more widely supported, so the SDK can convert other audio codecs (such as Speex, PCMU, PCMA) to AAC
	 *
	 * @param is_transcode: 1: audio that is not already AAC is transcoded to AAC (AAC input is passed through); 0: no transcoding. Default is 0
	 *
	 * Note: transcoding increases performance overhead
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRecorderAudioTranscodeAAC(long handle, int is_transcode);
	
	
	/**
	 * Set whether to record video. By default, video is recorded if the source has video and skipped if it does not; in some scenarios you may want audio only, so this switch is provided
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_record_video: 1: record video; 0: do not record video. Default is 1
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRecorderVideo(long handle, int is_record_video);

	/**
	 * Set whether to record audio. By default, audio is recorded if the source has audio and skipped if it does not; in some scenarios you may want video only, so this switch is provided
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_record_audio: 1: record audio; 0: do not record audio. Default is 1
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRecorderAudio(long handle, int is_record_audio);

	/**
	 * Start recorder stream (start recording)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerStartRecorder(long handle);

	/**
	 * Stop recorder stream (stop recording)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerStopRecorder(long handle);

15. Pull-stream callbacks of encoded data (used with the forwarding module)

Pull-stream callbacks of encoded data are mainly used together with the forwarding module, for example pulling an RTSP stream and re-publishing it as RTMP to an RTMP service.

	/**
	 * Enable transcoding audio to AAC while pulling the stream
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * AAC is more widely supported, so the SDK can convert other audio codecs (such as Speex, PCMU, PCMA) to AAC
	 *
	 * @param is_transcode: 1: audio that is not already AAC is transcoded to AAC (AAC input is passed through); 0: no transcoding. Default is 0
	 *
	 * Note: transcoding increases performance overhead
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetPullStreamAudioTranscodeAAC(long handle, int is_transcode);

	/**
	 * Start pull stream (for data forwarding: pulls the stream without playing it)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerStartPullStream(long handle);

	/**
	 * Stop pull stream
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerStopPullStream(long handle);

	/**
	 * Set audio data callback (encoded audio data)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param audio_data_callback: Audio Data Callback.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetAudioDataCallback(long handle, Object audio_data_callback);

	/**
	 * Set video data callback (encoded video data)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param video_data_callback: Video Data Callback.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetVideoDataCallback(long handle, Object video_data_callback);

16. H.264 user data callback and SEI data callback

If the sender embeds user-defined data during H.264 encoding, it can be received through the user data callback below. To receive SEI data directly, call the SEI callback interface instead.

	/**
	 * Set user data callback (SEI extended user data)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param user_data_callback: user data callback.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetUserDataCallback(long handle, Object user_data_callback);

	/**
	 * Set SEI data callback
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param sei_data_callback: sei data callback.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetSEIDataCallback(long handle, Object sei_data_callback);

17. Callbacks for decoded YUV and RGB data

If the decoded YUV or RGB data needs secondary processing, such as face recognition, the YUV/RGB callback interfaces deliver each decoded frame for further processing.

Take YUV as an example:

    class I420ExternalRender implements NTExternalRender {
        // public static final int NT_FRAME_FORMAT_RGBA = 1;
        // public static final int NT_FRAME_FORMAT_ABGR = 2;
        // public static final int NT_FRAME_FORMAT_I420 = 3;

        private int width_ = 0;
        private int height_ = 0;

        private int y_row_bytes_ = 0;
        private int u_row_bytes_ = 0;
        private int v_row_bytes_ = 0;

        private ByteBuffer y_buffer_ = null;
        private ByteBuffer u_buffer_ = null;
        private ByteBuffer v_buffer_ = null;

        @Override
        public int getNTFrameFormat() {
            Log.i(TAG, "I420ExternalRender::getNTFrameFormat return "
                    + NT_FRAME_FORMAT_I420);
            return NT_FRAME_FORMAT_I420;
        }

        @Override
        public void onNTFrameSizeChanged(int width, int height) {
            width_ = width;
            height_ = height;

            y_row_bytes_ = (width_ + 15) & (~15);
            u_row_bytes_ = ((width_ + 1) / 2 + 15) & (~15);
            v_row_bytes_ = ((width_ + 1) / 2 + 15) & (~15);

            y_buffer_ = ByteBuffer.allocateDirect(y_row_bytes_ * height_);
            u_buffer_ = ByteBuffer.allocateDirect(u_row_bytes_
                    * ((height_ + 1) / 2));
            v_buffer_ = ByteBuffer.allocateDirect(v_row_bytes_
                    * ((height_ + 1) / 2));

            Log.i(TAG, "I420ExternalRender::onNTFrameSizeChanged width_="
                    + width_ + " height_=" + height_ + " y_row_bytes_="
                    + y_row_bytes_ + " u_row_bytes_=" + u_row_bytes_
                    + " v_row_bytes_=" + v_row_bytes_);
        }

        @Override
        public ByteBuffer getNTPlaneByteBuffer(int index) {
            if (index == 0) {
                return y_buffer_;
            } else if (index == 1) {
                return u_buffer_;
            } else if (index == 2) {
                return v_buffer_;
            } else {
                Log.e(TAG, "I420ExternalRender::getNTPlaneByteBuffer index error:" + index);
                return null;
            }
        }

        @Override
        public int getNTPlanePerRowBytes(int index) {
            if (index == 0) {
                return y_row_bytes_;
            } else if (index == 1) {
                return u_row_bytes_;
            } else if (index == 2) {
                return v_row_bytes_;
            } else {
                Log.e(TAG, "I420ExternalRender::getNTPlanePerRowBytes index error:" + index);
                return 0;
            }
        }

        @Override
        public void onNTRenderFrame(int width, int height, long timestamp) {
            if (y_buffer_ == null)
                return;

            if (u_buffer_ == null)
                return;

            if (v_buffer_ == null)
                return;


            y_buffer_.rewind();

            u_buffer_.rewind();

            v_buffer_.rewind();

    		/*
    		if ( !is_saved_image )
    		{
    			is_saved_image = true;

    			int y_len = y_row_bytes_*height_;

    			int u_len = u_row_bytes_*((height_+1)/2);
    			int v_len = v_row_bytes_*((height_+1)/2);

    			int data_len = y_len + (y_row_bytes_*((height_+1)/2));

    			byte[] nv21_data = new byte[data_len];

    			byte[] u_data = new byte[u_len];
    			byte[] v_data = new byte[v_len];

    			y_buffer_.get(nv21_data, 0, y_len);
    			u_buffer_.get(u_data, 0, u_len);
    			v_buffer_.get(v_data, 0, v_len);

    			int[] strides = new int[2];
    			strides[0] = y_row_bytes_;
    			strides[1] = y_row_bytes_;


    			int loop_row_c = ((height_+1)/2);
    			int loop_c = ((width_+1)/2);

    			int dst_row = y_len;
    			int src_v_row = 0;
    			int src_u_row = 0;

    			for ( int i = 0; i < loop_row_c; ++i)
    			{
    				int dst_pos = dst_row;

    				for ( int j = 0; j <loop_c; ++j )
    				{
    					nv21_data[dst_pos++] = v_data[src_v_row + j];
    					nv21_data[dst_pos++] = u_data[src_u_row + j];
    				}

    				dst_row   += y_row_bytes_;
    				src_v_row += v_row_bytes_;
    				src_u_row += u_row_bytes_;
    			}

    			String imagePath = "/sdcard" + "/" + "testonv21" + ".jpeg";

    			Log.e(TAG, "I420ExternalRender::begin test save image++ image_path:" + imagePath);

    			try
    			{
    				File file = new File(imagePath);

        			FileOutputStream image_os = new FileOutputStream(file);

        			YuvImage image = new YuvImage(nv21_data, ImageFormat.NV21, width_, height_, strides);

        			image.compressToJpeg(new android.graphics.Rect(0, 0, width_, height_), 50, image_os);

        			image_os.flush();
        			image_os.close();
    			}
    			catch(IOException e)
    			{
    				e.printStackTrace();
    			}

    			Log.e(TAG, "I420ExternalRender::begin test save image--");
    		}

    		*/


            Log.i(TAG, "I420ExternalRender::onNTRenderFrame w=" + width + " h=" + height + " timestamp=" + timestamp);

            // copy buffer

            // test
            // byte[] test_buffer = new byte[16];
            // y_buffer_.get(test_buffer);

            // Log.i(TAG, "I420ExternalRender::onNTRenderFrame y data:" + bytesToHexString(test_buffer));

            // u_buffer_.get(test_buffer);
            // Log.i(TAG, "I420ExternalRender::onNTRenderFrame u data:" + bytesToHexString(test_buffer));

            // v_buffer_.get(test_buffer);
            // Log.i(TAG, "I420ExternalRender::onNTRenderFrame v data:" + bytesToHexString(test_buffer));
        }
    }
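The renderer above pads each plane's row stride up to a 16-byte boundary before allocating the plane buffers. The same stride computation can be checked in isolation; `PlaneStride` is a hypothetical helper extracted for illustration.

```java
// Illustrative helper: the 16-byte row alignment used by the I420
// external render example above.
public class PlaneStride {
    // Round a row width up to the next multiple of 16 bytes.
    public static int align16(int width) {
        return (width + 15) & ~15;
    }

    // U/V planes are half the luma width (rounded up), then aligned.
    public static int chromaStride(int width) {
        return align16((width + 1) / 2);
    }
}
```

For a 1918-pixel-wide frame, for example, the chroma row stride becomes 960 bytes rather than 959.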

18. Set the video fill mode

Set how the video fills the view: stretch to fill the whole view, or scale proportionally. If not set, the whole view is filled by default.

Relevant interface design is as follows:

	/**
	 * Set how the video fills the view: stretch to fill the whole view, or scale proportionally. If not set, the whole view is filled by default
	 * @param handle: return value from SmartPlayerOpen()
	 * @param render_scale_mode 0: fill the entire view; 1: fill the view proportionally. The default is 0
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRenderScaleMode(long handle, int render_scale_mode);

19. Set the render format in SurfaceView mode

format: 0: RGB565 (the default if not set); 1: ARGB8888.

	/**
	 * Set the render format in SurfaceView mode (when the second parameter of NTRenderer.CreateRenderer is false)
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param format: 0: RGB565 (the default if not set); 1: ARGB8888
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetSurfaceRenderFormat(long handle, int format);

20. Set anti-aliasing in SurfaceView mode

It should be noted that enabling anti-aliasing increases performance overhead.

	/**
	 * Set the anti-aliasing effect in SurfaceView mode (when the second parameter of NTRenderer.CreateRenderer is false). Note: enabling anti-aliasing may affect rendering performance, use with caution
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param isEnableAntiAlias: 0: anti-aliasing disabled (the default); 1: enabled
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetSurfaceAntiAlias(long handle, int isEnableAntiAlias);

Summary

The above are the main reference points for the interface design of an Android RTSP/RTMP player. Most developers do not need to implement everything; implementing roughly 40% of it, chosen according to product requirements, is enough for specific scenarios.

A good player, especially one that must deliver stable playback with millisecond-level latency, requires attention to far more than this; interested developers can refer to the other blog posts.

Added by Jebs on Thu, 18 Nov 2021 12:10:59 +0200