Introduction to MediaPlayer audio and video playback

Author: Xiangyang Zhumeng

MediaPlayer is Android's multimedia playback API; we can use it to play both audio and video. The class is an important component of the Android multimedia framework: through it we can obtain, decode, and play audio and video in a minimum of steps.

It supports three kinds of media source:

  • Local resources
  • Internal URIs, for example one obtained through a ContentResolver
  • External URLs (streaming); see the list of media formats supported by Android

1. Detailed explanation of related methods

1) Obtaining a MediaPlayer instance:

You can create it directly with new or call the create method:

MediaPlayer mp = new MediaPlayer();
MediaPlayer mp = MediaPlayer.create(this, R.raw.test);  // no need to call setDataSource()

In addition, create() has another form: create(Context context, Uri uri, SurfaceHolder holder), which creates a media player from a Uri and attaches its output to the given SurfaceHolder.

2) Setting the file to play:

// ① A resource under res/raw:
MediaPlayer.create(this, R.raw.test);

// ② A local file path:
mp.setDataSource("/sdcard/test.mp3");

// ③ A network URL:
mp.setDataSource("http://www.xxx.com/music/test.mp3");

In addition, setDataSource() has several overloads, one of which takes a FileDescriptor. To use it, put the file into the assets folder (at the same level as the res folder), and then set the data source with the following code:

// both calls below throw IOException, so wrap them in a try/catch block
AssetFileDescriptor fileDescriptor = getAssets().openFd("rain.mp3");
m_mediaPlayer.setDataSource(fileDescriptor.getFileDescriptor(), fileDescriptor.getStartOffset(), fileDescriptor.getLength());

3) Other methods

  • getCurrentPosition(): Get the current playback position (in milliseconds)
  • getDuration(): Get the duration of the file (in milliseconds)
  • getVideoHeight(): Get the video height
  • getVideoWidth(): Get the video width
  • isLooping(): Whether playback is looping
  • isPlaying(): Whether it is currently playing
  • pause(): Pause playback
  • prepare(): Prepare (synchronous)
  • prepareAsync(): Prepare (asynchronous)
  • release(): Release the MediaPlayer object
  • reset(): Reset the MediaPlayer object
  • seekTo(int msec): Seek to the specified position (time in milliseconds)
  • setAudioStreamType(int streamtype): Specify the audio stream type
  • setDisplay(SurfaceHolder sh): Set the SurfaceHolder on which to display the video
  • setLooping(boolean looping): Set whether to loop playback
  • setOnBufferingUpdateListener(MediaPlayer.OnBufferingUpdateListener listener): Listen for buffering updates of network streams
  • setOnCompletionListener(MediaPlayer.OnCompletionListener listener): Listen for the end of playback
  • setOnErrorListener(MediaPlayer.OnErrorListener listener): Listen for errors
  • setOnVideoSizeChangedListener(MediaPlayer.OnVideoSizeChangedListener listener): Listen for video size changes
  • setScreenOnWhilePlaying(boolean screenOn): Set whether to keep the screen on while playing (requires a SurfaceHolder)
  • setVolume(float leftVolume, float rightVolume): Set the volume
  • start(): Start playback
  • stop(): Stop playback
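getCurrentPosition() and getDuration() both return milliseconds, so a typical use is driving a progress label. A minimal plain-Java helper for the formatting (the class and method names here are my own, not part of the MediaPlayer API):

```java
// Formats the millisecond values returned by getCurrentPosition() /
// getDuration() as an "mm:ss / mm:ss" progress label.
public class ProgressFormat {
    static String format(int positionMs, int durationMs) {
        return mmss(positionMs) + " / " + mmss(durationMs);
    }

    static String mmss(int ms) {
        int totalSeconds = ms / 1000;          // drop sub-second precision
        return String.format("%02d:%02d", totalSeconds / 60, totalSeconds % 60);
    }

    public static void main(String[] args) {
        // e.g. 75,500 ms into a 3-minute track
        System.out.println(format(75500, 180000)); // prints "01:15 / 03:00"
    }
}
```

In an Activity you would call this periodically (e.g. from a Handler) with mPlayer.getCurrentPosition() and mPlayer.getDuration().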

2. Code examples

Example 1: Use MediaPlayer to play audio:

Running effect: (screenshot not included)

Key code:

public class MainActivity extends AppCompatActivity implements View.OnClickListener{

    private Button btn_play;
    private Button btn_pause;
    private Button btn_stop;
    private MediaPlayer mPlayer = null;
    private boolean isRelease = true;   // flag: whether the MediaPlayer has been released

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        bindViews();
    }

    private void bindViews() {
        btn_play = (Button) findViewById(R.id.btn_play);
        btn_pause = (Button) findViewById(R.id.btn_pause);
        btn_stop = (Button) findViewById(R.id.btn_stop);

        btn_play.setOnClickListener(this);
        btn_pause.setOnClickListener(this);
        btn_stop.setOnClickListener(this);
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()){
            case R.id.btn_play:
                if(isRelease){
                    mPlayer = MediaPlayer.create(this,R.raw.fly);
                    isRelease = false;
                }
                mPlayer.start();   // start playback
                btn_play.setEnabled(false);
                btn_pause.setEnabled(true);
                btn_stop.setEnabled(true);
                break;
            case R.id.btn_pause:
                mPlayer.pause();     // pause playback
                btn_play.setEnabled(true);
                btn_pause.setEnabled(false);
                btn_stop.setEnabled(false);
                break;
            case R.id.btn_stop:
                mPlayer.reset();     // reset the MediaPlayer
                mPlayer.release();   // release the MediaPlayer
                isRelease = true;
                btn_play.setEnabled(true);
                btn_pause.setEnabled(false);
                btn_stop.setEnabled(false);
                break;
        }
    }
}

Notes:

This example plays an audio file from the res/raw directory. Because the MediaPlayer is created with the create() method, there is no need to call prepare() before starting playback for the first time. If the player is built with the constructor instead, prepare() must be called once before start()! For reference, here is the official documentation's sample code for the other two source types:

Local URI:

Uri myUri = ....; // initialize Uri here
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(getApplicationContext(), myUri);
mediaPlayer.prepare();
mediaPlayer.start();

External URL:

String url = "http://........"; // your URL here
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(url);
mediaPlayer.prepare(); // might take long! (for buffering, etc)
mediaPlayer.start();

Note: If you stream an online audio file through a URL, the file must support progressive download.
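The setDataSource/prepare/start sequence in both snippets follows MediaPlayer's internal state machine, which also explains why the create() path can skip prepare(). A much-simplified plain-Java sketch of those rules (my own illustration, reduced from the official state diagram; the real class has more states, such as Error and PlaybackCompleted):

```java
// Simplified sketch of the MediaPlayer state machine, for illustration only.
enum State { IDLE, INITIALIZED, PREPARED, STARTED, PAUSED, STOPPED, END }

class FakePlayer {
    State state = State.IDLE;          // new MediaPlayer() starts in Idle

    void setDataSource() {             // only legal from Idle
        require(state == State.IDLE);
        state = State.INITIALIZED;
    }
    void prepare() {                   // required on the constructor path
        require(state == State.INITIALIZED || state == State.STOPPED);
        state = State.PREPARED;
    }
    void start() {
        require(state == State.PREPARED || state == State.PAUSED);
        state = State.STARTED;
    }
    void pause()   { require(state == State.STARTED); state = State.PAUSED; }
    void stop()    { require(state == State.STARTED || state == State.PAUSED); state = State.STOPPED; }
    void release() { state = State.END; }   // legal from any state

    private void require(boolean ok) {
        if (!ok) throw new IllegalStateException("bad transition from " + state);
    }
}
```

MediaPlayer.create() hands back an object that is effectively already in the Prepared state, which is why the first example can call start() immediately.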

Example 2: Use MediaPlayer to play videos

MediaPlayer is mainly designed for playing audio and does not provide its own video output surface, so we need another component to display the frames it decodes. A SurfaceView works well for this, so let's use one to write a video playback example:

Running effect: (screenshot not included)

Layout file: activity_main.xml

<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    android:padding="5dp">

    <SurfaceView
        android:id="@+id/sfv_show"
        android:layout_width="match_parent"
        android:layout_height="300dp" />

    <Button
        android:id="@+id/btn_start"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Start" />

    <Button
        android:id="@+id/btn_pause"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Pause" />

    <Button
        android:id="@+id/btn_stop"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Stop" />
    
</LinearLayout>

MainActivity.java

public class MainActivity extends AppCompatActivity implements View.OnClickListener, SurfaceHolder.Callback {

    private MediaPlayer mPlayer = null;
    private SurfaceView sfv_show;
    private SurfaceHolder surfaceHolder;
    private Button btn_start;
    private Button btn_pause;
    private Button btn_stop;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        bindViews();
    }

    private void bindViews() {
        sfv_show = (SurfaceView) findViewById(R.id.sfv_show);
        btn_start = (Button) findViewById(R.id.btn_start);
        btn_pause = (Button) findViewById(R.id.btn_pause);
        btn_stop = (Button) findViewById(R.id.btn_stop);

        btn_start.setOnClickListener(this);
        btn_pause.setOnClickListener(this);
        btn_stop.setOnClickListener(this);

        // initialize the SurfaceHolder, the controller of the SurfaceView
        surfaceHolder = sfv_show.getHolder();
        surfaceHolder.addCallback(this);
        surfaceHolder.setFixedSize(320, 220);   // display resolution; if not set, the video's default is used

    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.btn_start:
                mPlayer.start();
                break;
            case R.id.btn_pause:
                mPlayer.pause();
                break;
            case R.id.btn_stop:
                mPlayer.stop();
                break;
        }
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mPlayer = MediaPlayer.create(MainActivity.this, R.raw.lesson);
        mPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mPlayer.setDisplay(surfaceHolder);    // display the video on the SurfaceView
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {}

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (mPlayer != null) {
            if (mPlayer.isPlaying()) {
                mPlayer.stop();
            }
            mPlayer.release();
        }
    }
}

The code is straightforward. The layout contains a SurfaceView; we call getHolder() to obtain its SurfaceHolder, and through the holder we configure the SurfaceView: set the display resolution and register a Callback whose three methods are invoked when the surface is created, changed, and destroyed. The buttons then control play, pause, and stop.
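The setFixedSize(320, 220) call above hard-codes the surface resolution; a common refinement is to size the surface from getVideoWidth()/getVideoHeight() so the picture keeps its aspect ratio. The arithmetic can be sketched as a small plain-Java helper (the class and method names are my own, for illustration):

```java
// Computes a display size that fits a videoW x videoH picture inside a
// maxW x maxH area while preserving aspect ratio (letterbox-style).
// In an Activity the result would be fed to surfaceHolder.setFixedSize().
public class FitSize {
    static int[] fit(int videoW, int videoH, int maxW, int maxH) {
        // scale by the more constraining dimension
        double scale = Math.min((double) maxW / videoW, (double) maxH / videoH);
        return new int[] { (int) Math.round(videoW * scale),
                           (int) Math.round(videoH * scale) };
    }

    public static void main(String[] args) {
        int[] size = fit(1280, 720, 320, 300);   // a 720p video in a 320x300 area
        System.out.println(size[0] + "x" + size[1]); // prints "320x180"
    }
}
```

A natural place to call such a helper is the onVideoSizeChanged callback, once the real video dimensions are known.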

Example 3: Use VideoView to play videos

In addition to using MediaPlayer + SurfaceView, we can also play videos directly with a VideoView; only small changes are needed. The running effect is the same as above, so here is just the code:

public class MainActivity extends AppCompatActivity implements View.OnClickListener {

    private VideoView videoView;
    private Button btn_start;
    private Button btn_pause;
    private Button btn_stop;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        bindViews();
    }
    
    private void bindViews() {
        videoView = (VideoView) findViewById(R.id.videoView);
        btn_start = (Button) findViewById(R.id.btn_start);
        btn_pause = (Button) findViewById(R.id.btn_pause);
        btn_stop = (Button) findViewById(R.id.btn_stop);

        btn_start.setOnClickListener(this);
        btn_pause.setOnClickListener(this);
        btn_stop.setOnClickListener(this);
        
        // play from a file path
        if (Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED)) {
            videoView.setVideoPath(Environment.getExternalStorageDirectory() + "/lesson.mp4");
        }

        // read a file placed in the raw directory
        //videoView.setVideoURI(Uri.parse("android.resource://com.jay.videoviewdemo/" + R.raw.lesson));
        videoView.setMediaController(new MediaController(this));
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.btn_start:
                videoView.start();
                break;
            case R.id.btn_pause:
                videoView.pause();
                break;
            case R.id.btn_stop:
                videoView.stopPlayback();
                break;
        }
    }
}
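The commented-out setVideoURI line uses the "android.resource://<package name>/<resource id>" scheme. The string is easy to get wrong, so here is a tiny helper for building it (my own sketch; the package name and numeric id in the example are placeholders standing in for your app's real values):

```java
// Builds an android.resource:// URI string for a raw resource.
// packageName and resId should be the app's actual values; the ones
// used in main() are illustrative placeholders.
public class RawUri {
    static String rawResourceUri(String packageName, int resId) {
        return "android.resource://" + packageName + "/" + resId;
    }

    public static void main(String[] args) {
        System.out.println(rawResourceUri("com.jay.videoviewdemo", 2131034112));
    }
}
```

In the Activity you would pass Uri.parse(rawResourceUri(getPackageName(), R.raw.lesson)) to videoView.setVideoURI().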

Here I would like to recommend a related set of notes, "Notes on Core Knowledge Points of Audio and Video Development". I believe it can be of some help; those who need it can download it and fill in any gaps at any time: https://qr18.cn/Ei3VPD

Basic introduction to audio and video

The basic introduction mainly involves getting familiar with Android's multimedia-related APIs. By enumerating and using these APIs one by one, you build a basic outline of Android audio and video processing. Although the knowledge points are scattered, points form lines and lines form surfaces; once the foundation is mastered and the core APIs connect the audio/video pipeline together, your understanding and control of audio and video are no longer limited to the outermost API, and you can deepen your understanding of Android audio and video development from a relatively low level.

  • Android audio and video development (1): draw pictures in three ways
  • Android audio and video development (2): Use AudioRecord to collect audio PCM and save it to a file
  • Android audio and video development (3): Use AudioTrack to play PCM audio
  • Android Audio and Video Development (4): Use Camera API to collect video data
  • Android audio and video development (5): Use MediaExtractor and MediaMuxer API to parse and encapsulate mp4 files
  • Android audio and video development (6): Detailed explanation of MediaCodec API
  • Android audio and video development (7): Summary of audio and video recording process...

Intermediate Advanced Audio and Video: OpenSL ES Learning:https://qr18.cn/Ei3VPD

Learn the Android platform OpenSL ES API, understand the basic process of OpenSL development, use OpenSL to play PCM data, and understand the simple use of related APIs

  • Android OpenSL ES development: Android OpenSL introduction and development process description
  • Android OpenSL ES development: Playing PCM data using OpenSL
  • Android OpenSL ES development: Android OpenSL recording PCM audio data
  • Android OpenSL ES development: OpenSL ES uses SoundTouch to realize PCM audio speed and pitch changes

This part is mainly about hands-on practice and accumulation of practical experience. You can try to set the following goals for yourself:

  • Display an image using OpenGL

  • GLSurfaceviw draws the Camera preview screen and takes pictures using OpenGL ES

  • Complete video recording and achieve video watermark effect

  • Android OpenGL ES development (1): Introduction to OpenGL ES

  • Android OpenGL ES development (2): OpenGL ES environment construction

  • Android OpenGL ES development (3): OpenGL ES defines shapes

  • Android OpenGL ES development (4): OpenGL ES drawing shapes

  • Android OpenGL ES development (5): OpenGL ES uses projection and camera view

  • Android OpenGL ES development (6): OpenGL ES adds motion effects

  • Android OpenGL ES development (7): OpenGL ES responds to touch events

  • Android OpenGL ES development (8): OpenGL ES shader language GLSL

  • Android OpenGL ES development (9): OpenGL ES texture map

  • Android OpenGL ES development (10): Interacting with shaders through GLES20

  • ……

Advanced exploration of audio and video

  • In-depth study of audio and video coding, such as H.264, AAC, and research on the use of open source codec libraries, such as x.264, JM, etc.
  • In-depth study of audio and video related network protocols, such as rtmp, hls, and packet formats, such as: flv, mp4
  • In-depth study of some open source projects in the audio and video field, such as webrtc, ffmpeg, ijkplayer, librtmp, etc.
  • Port the ffmpeg library to the Android platform, and combine the experience accumulated above to write a simple audio and video player
  • Port the x264 library to the Android platform, and combine the experience accumulated above to complete the H264 soft editing function of video data.
  • Port the librtmp library to the Android platform, and combine the experience accumulated above to complete the Android RTMP streaming function

Audio and video codec technology:https://qr18.cn/Ei3VPD

  • Audio and video codec technology (1): MPEG-4/H.264 AVC codec standard
  • Audio and video codec technology (2): AAC audio coding technology
  • ……

Streaming media protocols

  • Streaming media protocol (1): HLS protocol
  • Streaming media protocol (2): RTMP protocol
  • ……

Multimedia file formats

  • Multimedia file format (1): MP4 format
  • Multimedia file format (2): FLV format
  • Multimedia file format (3): M3U8 format
  • Multimedia file format (4): TS format
  • Multimedia file format (5): PCM/WAV format…

FFmpeg learning:https://qr18.cn/Ei3VPD

  • FFmpeg command line tool learning (1): View media file header information tool ffprobe
  • FFmpeg command line tool learning (2): ffplay, a tool for playing media files
  • FFmpeg command line tool learning (3): Media file conversion tool ffmpeg
  • FFmpeg command line tool learning (4): FFmpeg capture device
  • FFmpeg command line tool learning (5): FFmpeg adjusts audio and video playback speed
  • ……

  • FFmpeg learning (1): Introduction to FFmpeg
  • FFmpeg learning (2): Install FFmpeg on Mac
  • FFmpeg learning (3): Porting FFmpeg to the Android platform
  • FFmpeg learning (4): FFmpeg API introduction and general API analysis
  • FFmpeg learning (5): FFmpeg encoding and decoding API analysis
  • FFmpeg learning (6): Analysis of FFmpeg core modules libavformat and libavcodec
  • ……

  • FFmpeg structure learning (1): AVFormatContext analysis
  • FFmpeg structure learning (2): AVStream analysis
  • FFmpeg structure learning (3): AVPacket analysis
  • FFmpeg structure learning (4): AVFrame analysis
  • FFmpeg structure learning (5): AVCodec analysis
  • FFmpeg structure learning (6): AVCodecContext analysis
  • FFmpeg structure learning (7): AVIOContext analysis
  • FFmpeg structure learning (8): The relationship between important structures in FFMPEG
  • ……

  • Summary of the AVFilter usage process in FFmpeg development
  • FFmpeg obsolete API summary
  • ……

Origin blog.csdn.net/maniuT/article/details/132476712