Rendering video frames with OpenGL ES and GLSurfaceView

01 Preface

Hello everyone. This article is the sixth in the iOS/Android audio and video series. The code for the AVPlayer project is hosted on GitHub; you can get the project address by replying in the background of the WeChat official account (GeekDev).

Following the last article, OpenGL ES for Android world, in this article we will use OpenGL ES to render the decoded video.

Contents of this issue:

  • Rendering mechanism of View and Surface
  • Relationship between SurfaceView/GLSurfaceView and Surface
  • GLSurfaceView and Renderer
  • SurfaceTexture and Surface
  • MediaCodec decodes and renders the video
  • Conclusion

02 Rendering mechanism of View and Surface

By now you know a bit about OpenGL ES, but before rendering video to the screen, we need to understand GLSurfaceView and Surface.

Most developers may have had little direct contact with Surface, but in the Android drawing system it is a very important concept: it provides a Canvas to the application and an image buffer to SurfaceFlinger for display. The Surface maintains this image buffer internally, and SurfaceFlinger eventually composites the buffer to the screen.

In Android, each Window object creates a Surface. Windows include Activities, Dialogs, the status bar, and so on. An ordinary View shares the Surface instance with its Window: it does not create a Surface of its own, but draws its content into the Window's Surface. The reason for emphasizing "ordinary" View is that SurfaceView/GLSurfaceView do not share the Window's Surface; they maintain a Surface of their own internally.

When a window is created, WindowManager creates a Surface for it. When the window needs to be redrawn, it calls lockCanvas to lock the Surface and obtain a Canvas. The Window passes the Canvas down by traversing the View hierarchy and calling each View's onDraw(Canvas) method, and each View draws its content through the Canvas. After this series of operations, the Surface is unlocked and composited to the screen by SurfaceFlinger.
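The framework performs this cycle internally, but the same lock/draw/unlock pattern is exposed through the public SurfaceHolder API. A minimal sketch (the surfaceView variable is an assumed example, not framework source):

// The lock/draw/unlock cycle, via the public SurfaceHolder API
SurfaceHolder holder = surfaceView.getHolder();
Canvas canvas = holder.lockCanvas();          // lock the Surface and obtain a Canvas
if (canvas != null) {
    try {
        canvas.drawColor(Color.BLACK);        // draw any content here
    } finally {
        holder.unlockCanvasAndPost(canvas);   // unlock; SurfaceFlinger composites the buffer
    }
}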

03 Relationship between SurfaceView/GLSurfaceView and Surface

SurfaceView is a subclass of View. What distinguishes it from an ordinary View is that it has its own dedicated Surface and does not share a Surface with the host Window. Because SurfaceView is decoupled from the host Window, its rendering can be moved to a separate thread. This design exists because rendering in games and video applications can be extremely heavy; to keep the main thread responsive to events, these rendering tasks must run independently of it.

The work of SurfaceView itself is relatively simple. Its most important tasks are creating the Surface and punching a hole in the host window so the Surface's content can show through. The following is part of the source code:

public class SurfaceView extends View implements ViewRootImpl.WindowStoppedCallback {

    private static final String TAG = "SurfaceView";
    private static final boolean DEBUG = false;

    final ArrayList<SurfaceHolder.Callback> mCallbacks
            = new ArrayList<SurfaceHolder.Callback>();

    // ------ Here's the point ------
    final Surface mSurface = new Surface(); // Current surface in use

}

04 GLSurfaceView and Renderer

We covered SurfaceView above; GLSurfaceView is our focus today. In the first article, OpenGL ES for Android world, we gave a preliminary introduction to GLSurfaceView. You may remember that we used it to draw a triangle on the screen.

From the GL prefix we can roughly guess that GLSurfaceView must be related to OpenGL. As you guessed, GLSurfaceView does encapsulate the GL-related plumbing. Strictly speaking, it uses EGL to build the GL environment, letting us render the content we want to display directly through the Renderer interface.

GLSurfaceView is an extension of SurfaceView: it not only adds EGL management, but also creates a render thread for us. The design of SurfaceView allows rendering to happen off the main thread; GLSurfaceView inherits from SurfaceView and internally creates a GLThread, and all your drawing tasks are executed on that GLThread.
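A practical consequence: anything that touches GL state must run on that GLThread. GLSurfaceView exposes queueEvent() for exactly this; a minimal sketch (glSurfaceView is an assumed variable):

// Post work onto the GLThread; GL calls are only valid on that thread
glSurfaceView.queueEvent(new Runnable() {
    @Override
    public void run() {
        GLES20.glClearColor(0f, 0f, 0f, 1f);  // safe: runs on the GLThread
    }
});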

GLSurfaceView has a setRenderer(Renderer renderer) method, which allows us to implement our own rendering logic. The definition of the Renderer interface is as follows:

public interface Renderer {

    /** Called when the Surface has been created and the GL environment is ready */
    void onSurfaceCreated(GL10 gl, EGLConfig config);

    /** Called when the Surface size changes, e.g. on screen rotation or when the GLSurfaceView is resized */
    void onSurfaceChanged(GL10 gl, int width, int height);

    /** Our GL rendering logic goes in this method */
    void onDrawFrame(GL10 gl);
}

Once decoding produces a texture for each frame, we draw that texture to the screen in the Renderer's onDrawFrame method.

05 SurfaceTexture and Surface

SurfaceView is a combination of Surface + View, and SurfaceTexture is a combination of Surface + GL texture. SurfaceTexture can update the latest image data from its Surface into a GL texture; through that texture we can obtain video frames and render them directly in a GLSurfaceView.

Use setOnFrameAvailableListener(listener) to register a listener with the SurfaceTexture. When a new image is available on the Surface, call the SurfaceTexture's updateTexImage() method to update the image content into the GL texture, and then perform the drawing.

The basic creation flow of a SurfaceTexture is as follows:

// step 1: create a texture and bind it as an external OES texture
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
int texId = textures[0];
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texId);

// step 2: bind the texture to a SurfaceTexture via its texture id
SurfaceTexture surfaceTexture = new SurfaceTexture(texId);

// step 3: register the frame-available listener
surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        // request a render pass here
    }
});
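To let a producer such as MediaCodec feed frames into this SurfaceTexture, wrap it in a Surface. This step is implied by the getSurface() call in step 4 of the next section; the sketch below is our addition, not part of the original snippet:

// step 4 (sketch): wrap the SurfaceTexture in a Surface so a producer
// such as MediaCodec can write image data into it
Surface surface = new Surface(surfaceTexture);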

06 MediaCodec decodes and renders video

Everything above is groundwork. Now we can put it together and actually decode and render a video.

The standard process of decoding and rendering a video is as follows:

  • Initialize the GLSurfaceView and set the Renderer
  • Initialize the SurfaceTexture and register the OnFrameAvailableListener
  • Initialize the extractor and select the video track
  • Initialize the decoder and configure the Surface
  • Implement the Renderer interface to render the video texture

Step 1: initialize the GLSurfaceView and set the Renderer

private void step1() {
    mSurfaceView = findViewById(R.id.surfaceView);
    // Request an OpenGL ES 2.0 context
    mSurfaceView.setEGLContextClientVersion(2);
    mSurfaceView.setRenderer(mRenderer);
    // RENDERMODE_WHEN_DIRTY renders only when requestRender() is called
    mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
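One detail this snippet leaves out: GLSurfaceView expects the host Activity to forward its lifecycle calls, otherwise the GL context is not paused and resumed properly. A minimal sketch of the usual pattern:

@Override
protected void onResume() {
    super.onResume();
    mSurfaceView.onResume();   // resume the render thread, recreating the GL context if needed
}

@Override
protected void onPause() {
    mSurfaceView.onPause();    // pause the render thread and release the GL context
    super.onPause();
}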

Step 2: initialize the SurfaceTexture and register the OnFrameAvailableListener

private void step2() {
    // AVSurfaceTexture is the project's wrapper around SurfaceTexture
    mSurfaceTexture = new AVSurfaceTexture();
    mSurfaceTexture.getSurfaceTexture().setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
        // Triggered when MediaCodec calls releaseOutputBuffer(idx, true)
        @Override
        public void onFrameAvailable(SurfaceTexture surfaceTexture) {
            // Notify the Renderer that a new frame is ready
            mSurfaceView.requestRender();
        }
    });
}

Step 3: initialize the extractor and select the video track

private void step3() {
    // Create a media extractor (demuxer)
    mMediaExtractor = new MediaExtractor();

    // Point the extractor at the media file
    Uri videoPathUri = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.demo_video);
    try {
        mMediaExtractor.setDataSource(this, videoPathUri, null);
    } catch (IOException e) {
        e.printStackTrace();
    }

    // A media file usually contains several tracks (video, audio, subtitles, ...)
    int trackCount = mMediaExtractor.getTrackCount();
    // MIME prefix of the track type we want to separate out
    String extractMimeType = "video/";
    MediaFormat trackFormat = null;
    // Track index; a track must be selected before MediaExtractor can read data
    int trackID = -1;
    for (int i = 0; i < trackCount; i++) {
        trackFormat = mMediaExtractor.getTrackFormat(i);
        if (trackFormat.getString(MediaFormat.KEY_MIME).startsWith(extractMimeType)) {
            trackID = i;
            break;
        }
    }
    // Select the video track if the file has one
    if (trackID != -1)
        mMediaExtractor.selectTrack(trackID);
}

Step 4: initialize the decoder and configure the Surface

private void step4() {
    MediaFormat trackFormat = mMediaExtractor.getTrackFormat(mMediaExtractor.getSampleTrackIndex());

    try {
        mMediaCodec = MediaCodec.createDecoderByType(trackFormat.getString(MediaFormat.KEY_MIME));
        // Pass our Surface to configure() so decoded frames are rendered into it
        mMediaCodec.configure(trackFormat, mSurfaceTexture.getSurface(), null, 0);
        mMediaCodec.start();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
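The steps shown here stop at starting the decoder; the actual feeding of compressed samples into MediaCodec happens in the project's decode loop, which this article does not show. A minimal synchronous sketch of such a loop (the decodeLoop name and the 10 ms timeouts are our assumptions, not AVPlayer's actual code):

// Feed samples from the extractor into the codec, and release decoded
// output buffers with render = true so they land on our Surface.
private void decodeLoop() {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    boolean inputDone = false;
    while (!Thread.interrupted()) {
        if (!inputDone) {
            int inIndex = mMediaCodec.dequeueInputBuffer(10_000);
            if (inIndex >= 0) {
                ByteBuffer inputBuffer = mMediaCodec.getInputBuffer(inIndex);
                int sampleSize = mMediaExtractor.readSampleData(inputBuffer, 0);
                if (sampleSize < 0) {
                    // No more samples: signal end of stream to the codec
                    mMediaCodec.queueInputBuffer(inIndex, 0, 0, 0,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    inputDone = true;
                } else {
                    mMediaCodec.queueInputBuffer(inIndex, 0, sampleSize,
                            mMediaExtractor.getSampleTime(), 0);
                    mMediaExtractor.advance();
                }
            }
        }
        int outIndex = mMediaCodec.dequeueOutputBuffer(info, 10_000);
        if (outIndex >= 0) {
            // render = true pushes the frame to the Surface and fires onFrameAvailable
            mMediaCodec.releaseOutputBuffer(outIndex, true);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
        }
    }
}

Note that this sketch plays as fast as the codec can decode; a real player would pace releaseOutputBuffer against each frame's presentation timestamp.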

Step 5: implement the Renderer interface to render the video texture

private GLSurfaceView.Renderer mRenderer = new GLSurfaceView.Renderer() {

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Create the shader program for external OES textures
        mProgram = new GPUTextureProgram(GPUTextureProgram.ProgramType.TEXTURE_EXT);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // Pull the latest video frame into the GL texture
        mSurfaceTexture.updateTexImage();

        // Draw the texture to the screen
        mProgram.draw(mSurfaceTexture.getTextureID());
    }
};
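The TEXTURE_EXT program type suggests that GPUTextureProgram samples an external OES texture, which requires a special fragment shader declaration. The following is a typical shader for this purpose, sketched as a guess at what the project compiles internally (not its actual source):

// A typical fragment shader for GL_TEXTURE_EXTERNAL_OES textures
private static final String FRAGMENT_SHADER_EXT =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTexCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        "    gl_FragColor = texture2D(sTexture, vTexCoord);\n" +
        "}\n";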

Finally, our results are as follows:

http://mpvideo.qpic.cn/tjg_3247244669_50000_82366eb02ec74473a40f5c6bd570ca49.f10002.mp4?dis_k=f7624bff408d6364ba5dfed7ffc3e1cb&dis_t=1642686634&vid=wxv_854040044913508352&format_id=10002&support_redirect=0&mmversion=false

You can find the complete code example in DemoMediaPlayer of AVPlayer.

07 Conclusion

We now have a first prototype of a player, though it still lacks pause, variable playback speed, audio output, and audio/video synchronization.
