Android: implementing voice-message sending and playback, with sample code

Link to this article: https://blog.csdn.net/qq_40785165/article/details/109658968

Hello everyone, I am Xiao Hei, a programmer who is not yet bald~~~

This is my first article, and I hope to share my learning experience with you as I go. I hope you like it!

Do simple things repeatedly and you become the expert; do repeated things seriously and you become the winner.

After implementing picture sending in a chat-room project, I decided to add voice messages: recording, timing, playback, and switching between headset and speaker. First, a look at the screenshots.

As you can see, the voice-sending feature lives in a dialog that pops up when you tap the voice module. Tapping Start begins recording; tapping Done releases the recorder, uploads the recording to the server, and refreshes the recording list. Tapping a list item plays the recording, and a broadcast receiver listens for headset connections.

*Note: the colors and dimensions in the code below are defined in the project; replace them with the values you want.

The permissions required for this feature are as follows:

<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />

(1) First, define a dialog layout, dialog_microphone, as follows:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_gravity="center"
    android:background="@drawable/frame_grey_white_edge"
    android:gravity="center"
    android:orientation="vertical"
    android:paddingLeft="@dimen/b20"
    android:paddingTop="@dimen/b50"
    android:paddingRight="@dimen/b20"
    android:paddingBottom="@dimen/b50">

    <ImageView
        android:layout_width="@dimen/b120"
        android:layout_height="@dimen/b120"
        android:src="@mipmap/icon_microphone" />

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginTop="@dimen/b50"
        android:text="Tap outside the dialog to cancel" />

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_marginTop="@dimen/b20">

        <com.qmuiteam.qmui.widget.roundwidget.QMUIRoundButton
            android:id="@+id/btn_start"
            android:layout_width="0dp"
            android:layout_height="wrap_content"
            android:layout_weight="1"
            android:padding="@dimen/b10"
            android:text="Start"
            android:textColor="@color/color_white"
            android:textSize="@dimen/b28"
            app:qmui_backgroundColor="@color/color_orange_main"
            app:qmui_borderColor="@color/color_white"
            app:qmui_radius="@dimen/b10" />

        <com.qmuiteam.qmui.widget.roundwidget.QMUIRoundButton
            android:id="@+id/btn_ok"
            android:layout_width="0dp"
            android:layout_height="wrap_content"
            android:layout_weight="1"
            android:padding="@dimen/b10"
            android:text="Done"
            android:textColor="@color/color_white"
            android:textSize="@dimen/b28"
            app:qmui_backgroundColor="@color/color_orange_main"
            app:qmui_borderColor="@color/color_white"
            app:qmui_radius="@dimen/b10" />

    </LinearLayout>
</LinearLayout>

The QMUIRoundButton in the layout is a button control from the QMUI framework; interested readers can look it up, or replace it with a plain Button. The frame_grey_white_edge drawable is as follows:

<?xml version="1.0" encoding="utf-8"?>
<shape xmlns:android="http://schemas.android.com/apk/res/android">
    <solid android:color="@color/color_gray_e8" />
    <corners android:radius="@dimen/b20" />
    <stroke android:color="@color/color_white" />
</shape>

(2) We can use the MediaRecorder API for recording. I collected all of the calls used into one class, MediaHelper:

public class MediaHelper {
    private MediaRecorder mMediaRecorder;
    private String mPath;     // directory
    private String mFilePath; // file

    private static MediaHelper mInstance;

    private MediaHelper(String path) {
        mPath = path;
    }

    /**
     * Callback fired once recording has started.
     * At this point a file already exists in the directory;
     * if the recorder is never released it will keep recording.
     */
    public interface MediaStateListener {
        void preparedDone();
    }

    public MediaStateListener mMediaStateListener;

    public void setMediaStateListener(MediaStateListener mediaStateListener) {
        mMediaStateListener = mediaStateListener;
    }

    /**
     * Singleton access to MediaHelper (double-checked locking).
     *
     * @param path directory in which recordings are stored
     * @return the shared instance
     */
    public static MediaHelper getInstance(String path) {
        if (mInstance == null) {
            synchronized (MediaHelper.class) {
                if (mInstance == null) {
                    mInstance = new MediaHelper(path);
                }
            }
        }
        return mInstance;
    }

    /**
     * Prepare and start recording.
     */
    public void prepare() {
        try {
            File fileDir = new File(mPath);
            if (!fileDir.exists()) {
                fileDir.mkdirs();
            }

            String fileName = System.currentTimeMillis() + ".amr"; // file name
            File file = new File(fileDir, fileName);               // output file
            mFilePath = file.getAbsolutePath();

            mMediaRecorder = new MediaRecorder();
            // MediaRecorder is order-sensitive: set the audio source first,
            // then the output format and encoder, then the output file, then prepare().
            mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);      // microphone as the audio source
            mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.AMR_NB); // AMR container format
            mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); // AMR_NB encoding
            if (Build.VERSION.SDK_INT < 26) {
                // Below API 26, only setOutputFile(String) is available
                mMediaRecorder.setOutputFile(file.getAbsolutePath());
            } else {
                // API 26+ also accepts setOutputFile(File)
                mMediaRecorder.setOutputFile(file);
            }

            mMediaRecorder.prepare();
            mMediaRecorder.start();

            if (mMediaStateListener != null) {
                mMediaStateListener.preparedDone();
            }
        } catch (IllegalStateException | IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Stop recording and release the recorder.
     */
    public void release() {
        if (mMediaRecorder != null) {
            mMediaRecorder.stop();
            mMediaRecorder.release();
            mMediaRecorder = null;
        }
    }

    /**
     * Cancel: release the recorder and delete the recording.
     */
    public void cancel() {
        release();
        if (mFilePath != null) {
            File file = new File(mFilePath);
            if (file.exists()) {
                file.delete();
            }
            mFilePath = null;
        }
    }

    // Path of the generated file
    public String getFilePath() {
        return mFilePath;
    }
}
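MediaHelper's file handling can be exercised without Android at all. Below is a minimal plain-JVM sketch of the prepare()/cancel() file logic (the class and method names are illustrative, not from the project): create a timestamped .amr file in the target directory, and delete it again on cancel.

```java
import java.io.File;
import java.io.IOException;

// JVM-only sketch of MediaHelper's file handling (no Android APIs).
public class RecordingFileHelper {
    private final File dir;
    private File current;

    public RecordingFileHelper(File dir) {
        this.dir = dir;
    }

    /** Mirrors prepare(): ensure the directory exists, then create the output file. */
    public File createRecordingFile() throws IOException {
        if (!dir.exists() && !dir.mkdirs()) {
            throw new IOException("cannot create directory " + dir);
        }
        current = new File(dir, System.currentTimeMillis() + ".amr");
        current.createNewFile(); // MediaRecorder would create this during prepare()
        return current;
    }

    /** Mirrors cancel(): delete the current recording if it exists. */
    public boolean cancelRecording() {
        if (current != null && current.exists()) {
            boolean deleted = current.delete();
            current = null;
            return deleted;
        }
        return false;
    }
}
```

This is why cancel() in MediaHelper nulls out mFilePath after deleting: a second cancel must not try to delete the same file twice.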

(3) In the dialog, each button calls the corresponding MediaHelper method. My dialog class is MicrophoneDialog:

public class MicrophoneDialog extends BaseDialog implements MediaHelper.MediaStateListener {
    public static final int EXTRA_START = 1;       // start recording
    public static final int EXTRA_UPDATE_TIME = 2; // update the duration
    private AudioListener mAudioListener;
    private long mTime; // duration in milliseconds
    private String mPath = Constants.APK_PATH;
    private boolean authDismiss; // dismissed programmatically; if so, do not fire the cancel listener

    private MediaHelper mMediaHelper;
    private boolean isRecording;
    private String TAG = "MicrophoneDialog";

    public void setAudioListener(AudioListener audioListener) {
        mAudioListener = audioListener;
    }

    @Override
    public void preparedDone() {
        mHandler.sendEmptyMessage(EXTRA_START);
    }

    public interface AudioListener {
        void finish(long time, String filePath);

        void cancel();
    }

    public void dismiss(boolean authDismiss) {
        this.authDismiss = authDismiss;
        dismiss();
    }

    public MicrophoneDialog(Context context) {
        super(context);
    }

    @Override
    public int getViewId() {
        return R.layout.dialog_microphone;
    }

    @Override
    public void initBasic(Bundle savedInstanceState) {
        mMediaHelper = MediaHelper.getInstance(mPath);

        setOnDismissListener(new OnDismissListener() {
            @Override
            public void onDismiss(DialogInterface dialog) {
                if (!authDismiss) {
                    mMediaHelper.release();
                    isRecording = false;
                    mTime = 0;
                    if (mAudioListener != null) {
                        mAudioListener.cancel();
                    }
                }
            }
        });
        mMediaHelper.setMediaStateListener(this);
    }

    @OnClick({R.id.btn_ok, R.id.btn_start})
    public void onViewClicked(View view) {
        switch (view.getId()) {
            case R.id.btn_start:
                QToast.showToast("Recording started");
                mMediaHelper.prepare();
                break;
            case R.id.btn_ok:
                // Release the recorder before uploading, otherwise the upload will fail
                mMediaHelper.release();
                QToast.showToast("Recording finished");
                isRecording = false;
                Log.e(TAG, "onViewClicked: " + mTime + "," + mMediaHelper.getFilePath());
                if (mAudioListener != null) {
                    mAudioListener.finish(mTime / 1000, mMediaHelper.getFilePath());
                    mTime = 0;
                }
                break;
        }
    }

    /**
     * A Handler is used here so the UI could be updated from the timing
     * callback. Right now it only counts time and touches no views,
     * but it may update the UI later.
     */
    @SuppressLint("HandlerLeak")
    private Handler mHandler = new Handler() {

        public void handleMessage(android.os.Message msg) {
            switch (msg.what) {
                case EXTRA_START:
                    isRecording = true;
                    // start timing
                    postDelayed(mRunnable, 1000);
                    break;
                case EXTRA_UPDATE_TIME:
                    postDelayed(mRunnable, 1000);
                    break;
            }
        }
    };

    /**
     * Counts the elapsed duration, one second per tick.
     */
    private Runnable mRunnable = new Runnable() {
        @Override
        public void run() {
            if (isRecording) {
                mTime += 1000;
                mHandler.sendEmptyMessage(EXTRA_UPDATE_TIME); // TODO notify the UI to update the duration display
            }
        }
    };
}
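The dialog accumulates mTime in milliseconds and hands mTime / 1000 seconds to the listener. A chat UI usually shows that as m:ss; here is a small formatter for it (an illustrative addition, not part of the original project):

```java
// Formats an elapsed duration in milliseconds, as accumulated by the
// dialog's timing Runnable, into an m:ss label for the chat bubble.
public class AudioDuration {
    public static String format(long millis) {
        long totalSeconds = millis / 1000;
        long minutes = totalSeconds / 60;
        long seconds = totalSeconds % 60;
        return String.format("%d:%02d", minutes, seconds);
    }
}
```

For example, `AudioDuration.format(65000)` yields "1:05".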

In the code above I practiced using a Handler. The original intent was to update the UI from the timing callback, but I left that out for now; Thread.sleep in a worker thread would also work for the delayed timing. ButterKnife is used for view binding and click handling, and BaseDialog is an encapsulated base class of mine; you can move the code in initBasic() into onCreate(). Note that the recorder must be released before uploading the file, otherwise the upload call will fail. With the dialog done, just show it from your activity or fragment. After recording finishes, call your backend API to upload the file, get back the recording's URL, show the recordings in a RecyclerView, and play one on tap. Playback needs to achieve the following:

1. Tapping the same item toggles between playing and resetting; tapping a different item releases the previous player before starting the new one.
2. Listening for wired and Bluetooth headset plug/unplug events and adjusting the playback route accordingly.

(4) The playback code is as follows:

// Field declarations
private MediaPlayer mMediaPlayer;
private HeadSetReceiver mHeadSetReceiver; // broadcast receiver for headset events
private AudioManager mAudioManager = null;
private int index; // position of the item that was last tapped
......
registerHeadsetReceiver();
mAudioManager = (AudioManager) mActivity.getSystemService(Context.AUDIO_SERVICE); // used to switch output routes (headset/speaker)
mAudioManager.setMode(AudioManager.MODE_NORMAL); // normal mode
mMediaPlayer = new MediaPlayer(); // playback
......
// Click handler: first check that the message is an audio message
if (item.getMessageType() == 3) {
    if (!mMediaPlayer.isPlaying()) { // not playing: start playback
        play(item);
    } else { // something is playing: stop and release it; if a different item was tapped, play that one
        mMediaPlayer.stop();
        mMediaPlayer.reset();
        mMediaPlayer.release();
        mMediaPlayer = new MediaPlayer();
        if (position != index) {
            play(item); // play the newly selected item
        }
    }
}
index = position;
......
// Play a recording
private void play(MessageBean item) {
    try {
        mMediaPlayer = new MediaPlayer();
        mMediaPlayer.setDataSource(HttpHelper.picDomain + item.getMessage_content()); // domain + file path
        mMediaPlayer.prepare();
        mMediaPlayer.start();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
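The click handler above mixes three outcomes: start playback, stop the current item, or switch to a different item. Extracted as a pure decision function (the names are illustrative, not from the project), the branching can be checked without a MediaPlayer:

```java
// Pure-Java version of the list-item click branching: given whether audio is
// playing, the tapped position, and the position last played, decide what to do.
public class PlaybackDecision {
    public enum Action { PLAY, STOP, SWITCH }

    public static Action decide(boolean isPlaying, int clickedPosition, int playingIndex) {
        if (!isPlaying) {
            return Action.PLAY;      // nothing playing: just play the tapped item
        }
        return clickedPosition == playingIndex
                ? Action.STOP        // same item tapped again: stop and reset
                : Action.SWITCH;     // different item: release the old player, play the new one
    }
}
```

SWITCH corresponds to the else-branch above that releases the old player and then calls play(item) because position != index.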

(5) The code for listening to headset plug/unplug events (wired and Bluetooth) and for registering the receiver is as follows:

class HeadSetReceiver extends BroadcastReceiver {

    @Override
    public void onReceive(Context context, Intent intent) {
        String action = intent.getAction();
        if (BluetoothHeadset.ACTION_CONNECTION_STATE_CHANGED.equals(action)) {
            BluetoothAdapter defaultAdapter = BluetoothAdapter.getDefaultAdapter();
            // Requires the BLUETOOTH permission. getProfileConnectionState()
            // returns BluetoothProfile connection states, so compare against those.
            if (BluetoothProfile.STATE_DISCONNECTED == defaultAdapter.getProfileConnectionState(BluetoothProfile.HEADSET)) {
                QToast.showToast("Headset disconnected");
                mAudioManager.setSpeakerphoneOn(true);
            } else {
                QToast.showToast("Headset connected");
                mAudioManager.setSpeakerphoneOn(false);
            }
        } else if (intent.hasExtra("state")) {
            if (intent.getIntExtra("state", 0) == 0) {
                QToast.showToast("Headset disconnected");
                mAudioManager.setSpeakerphoneOn(true);
            } else {
                QToast.showToast("Headset connected");
                mAudioManager.setSpeakerphoneOn(false);
            }
        }
    }
}

private void registerHeadsetReceiver() {
    mHeadSetReceiver = new HeadSetReceiver();
    IntentFilter intentFilter = new IntentFilter();
    intentFilter.addAction("android.intent.action.HEADSET_PLUG");
    registerReceiver(mHeadSetReceiver, intentFilter);
    IntentFilter bluetoothFilter = new IntentFilter(BluetoothHeadset.ACTION_CONNECTION_STATE_CHANGED);
    registerReceiver(mHeadSetReceiver, bluetoothFilter);
}
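The receiver reduces every headset event to one question: should audio come out of the speakerphone? That mapping can be written as a pair of pure functions (a sketch with illustrative names; the constants mirror the HEADSET_PLUG "state" extra, where 0 means unplugged, and BluetoothProfile.STATE_DISCONNECTED, which is 0):

```java
// Pure-Java routing decision extracted from HeadSetReceiver:
// speakerphone on when no headset is attached, off otherwise.
public class AudioRouting {
    /** Wired HEADSET_PLUG broadcast: extra "state" == 0 means unplugged. */
    public static boolean speakerOnForWired(int stateExtra) {
        return stateExtra == 0;
    }

    /** Bluetooth: profile connection state 0 (STATE_DISCONNECTED) means no headset. */
    public static boolean speakerOnForBluetooth(int profileConnectionState) {
        return profileConnectionState == 0;
    }
}
```

Keeping the decision separate from the receiver makes it easy to verify both routes behave the same way.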

(6) Release resources when the page is destroyed (in onDestroy):

if (mMediaPlayer != null) {
    mMediaPlayer.stop();
    mMediaPlayer.reset();
    mMediaPlayer.release();
    mMediaPlayer = null;
}
unregisterReceiver(mHeadSetReceiver);

That completes the voice-sending and tap-to-play features; the result is the two screenshots at the beginning. Since there is no fancy interface animation, I didn't include a GIF, but interested readers can try it themselves. I built both the front end and the back end of this project myself. If you have any questions, scan the QR code below to add me on WeChat; I'm happy to discuss Android front-end and back-end development with everyone so we can all improve together. You're also welcome to subscribe to my WeChat official account (also just getting started), where I'll keep sharing interesting things I learn. Finally, I wish you all good health and all the best; thank you for your support and for reading!
