WeChat Mini Program recording + playback component (with source code)

Demo

Long press to record and release to stop; tap to play, tap again to pause, and tap once more to resume playback.

Recording function

Press and hold to start recording; release to stop:

Use the native WeChat recording API: create a RecorderManager with wx.getRecorderManager() outside the component definition.

RecorderManager | WeChat Open Documentation

const vedioControl = wx.getRecorderManager() // recording controller

Make sure you create it in the right place: at module scope, outside the Component({...}) definition.
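To make the placement concrete, here is a minimal sketch of the file layout (it simply mirrors the full source at the end of this post):

// Module scope: created once, outside Component({...}), and shared by every handler.
const vedioControl = wx.getRecorderManager() // recording controller

Component({
  data: { /* startVedio, vedioList, ... */ },
  methods: {
    // touchstart / touchend below reference vedioControl directly
  },
})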

First, the press and release events in WeChat Mini Programs are bindtouchstart and bindtouchend respectively:

bindtouchstart="touchstart" bindtouchend="touchend"

// touchstart handler
    touchstart: function(e: any) {
      this.setData({
        touchStartTime: e.timeStamp
        // record the timestamp of the press
      })
      this.setData({
        setTimeFn: setTimeout(() => {
          vedioControl.start({})
          // start recording after 100 ms
          this.setData({ startVedio: true })
        }, 100)
      })
    },
    touchend: function(e: any) {
      // if the time between touchstart and touchend is greater than 100 ms, keep the recording
      if (e.timeStamp - this.data.touchStartTime > 100) {
        vedioControl.stop()
        // stop recording when the touch ends
        // listen for the onStop event that fires when recording stops
        vedioControl.onStop((res) => {
          const ss = Math.floor(res.duration / 1000 % 60)
          const mm = Math.floor(res.duration / 1000 / 60)
          this.setData({
            vedioTime: `${mm}:${ss >= 10 ? ss : '0' + ss}` // pad seconds below 10
          })
          clearTimeout(this.data.setTimeFn)
          this.setData({ startVedio: false })
          const vedioList = this.data.vedioList
          // store the temporary file path in vedioList; duration is the recording length in ms
          vedioList.push({
            fileName: '音频',
            url: res.tempFilePath,
            type: 'mp4',
            duration: res.duration
          })
          // add the new recording to vedioList
          this.setData({
            setTimeFn: null,
            vedioList: vedioList
          })
        })
      } else {
        // if the press lasted less than 100 ms, clear the setTimeout and do nothing
        clearTimeout(this.data.setTimeFn)
        this.setData({
          setTimeFn: null,
          startVedio: false
        })
      }
    },
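For reference, the mm:ss formatting inside onStop boils down to this standalone helper (a sketch; formatDuration is not part of the original component):

// Format a duration in milliseconds as m:ss, matching the onStop logic above.
function formatDuration(durationMs: number): string {
  const ss = Math.floor(durationMs / 1000 % 60)   // seconds within the minute
  const mm = Math.floor(durationMs / 1000 / 60)   // whole minutes
  return `${mm}:${ss >= 10 ? ss : '0' + ss}`      // pad seconds below 10 with a leading zero
}

// formatDuration(4000)  -> "0:04"
// formatDuration(65000) -> "1:05"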

This gives us the temporary file path and duration of the recording. Next, we need to play the audio.

Audio playback function

Use the WeChat audio player wx.createInnerAudioContext(); we also create it outside the component:

InnerAudioContext | WeChat Open Documentation

const innerAudioContext = wx.createInnerAudioContext({
  useWebAudioImplement: false,
}) // audio player
innerAudioContext.onPause(() => {
  console.log('暂停了') // paused
})
innerAudioContext.onPlay(() => {
  console.log('开始播放了') // playback started
})

When the play button is tapped, the playVedio function runs:

    playVedio() {
      innerAudioContext.src = this.data.vedioList[this.data.vedioList.length - 1].url
      // this url is the tempFilePath stored above
      const vedio = this.data.vedioList[this.data.vedioList.length - 1].duration
      const duration = this.data.vedioList[this.data.vedioList.length - 1].duration / 20
      // advance one small bar every 5% of the duration (the bar turns dark blue)
      // isPlayVedio is true while audio is playing, false otherwise
      if (!this.data.isPlayVedio) {
        innerAudioContext.play()
        this.setData({
          isPlayVedio: true
        })
        // circle: every `duration` ms (5% of the total) add 5 to vedioProcess
        // timeVedio tracks how many milliseconds have been played so far
        const circle = setInterval(() => {
          this.setData({
            vedioProcess: this.data.vedioProcess + 5,
            timeVedio: this.data.timeVedio + duration
          })
          if (vedio <= this.data.timeVedio) {
            // playback has reached the end of the audio: clear the interval,
            // reset the progress to 0 and mark the player as not playing
            clearInterval(circle)
            this.setData({
              vedioProcess: 0,
              timeVedio: 0,
              isPlayVedio: false
            })
          }
          // if the user tapped pause while the interval was running, isPlayVedio is
          // false and we stop advancing the progress
          if (!this.data.isPlayVedio) {
            clearInterval(circle)
          }
        }, duration)
      } else {
        // the player is currently playing, so a tap means the user wants to pause
        innerAudioContext.pause()
        this.setData({
          isPlayVedio: false
        })
      }
    },

The url here is the tempFilePath from above.

Here I set the progress bar to advance one small bar every 5% of the duration (the bar turns dark blue); a worked example follows the variable list below.

isPlayVedio: whether audio is currently playing

vedioProcess: the playback progress, out of 100

timeVedio: how many milliseconds have been played so far

circle: the interval that lights up one more bar for every 5% of the duration
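As a worked example, take the 4000 ms sample entry that vedioList is initialised with (the values below are illustrative, not part of the component):

const totalMs = 4000             // vedioList[...].duration of the sample recording
const tickMs = totalMs / 20      // 200 ms per interval tick, one bar per tick

let vedioProcess = 0             // percent, 0..100
let timeVedio = 0                // milliseconds played so far

for (let tick = 0; tick < 20; tick++) {
  vedioProcess += 5              // light up one more bar
  timeVedio += tickMs
}
// vedioProcess === 100 and timeVedio === 4000, so the interval is cleared
// and both counters reset to 0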

Displaying the small bars:

  

        <view class="vedio-file" bindtap="playVedio">
          <image src="{{isPlayVedio?'../images/vedio-start.svg':'../images/vedio-end.svg'}}"></image>
          <view class="item-list">
            <view wx:for="{{20}}" wx:for-item="item" class="item {{vedioProcess>item*5?'active':''}}" wx:key="index"></view>
          </view>
          <view class="time">{{vedioTime}}</view>
        </view>

Draw a small vertical bar and repeat it 20 times. Since the progress runs to 100 and 100/20 = 5, one more bar changes color for every 5% of progress.

vedioProcess is the current playback progress, out of 100.

If vedioProcess > index*5, the bar turns dark blue (the active class is applied).

Next we give the bars different heights.

Use the CSS :nth-child(n) selector to group the bars in sets of six, going small → large → small. (With 20 bars: 6n-3 matches bars 3, 9, 15, the tallest at 14px; 6n-4 and 6n-2 match bars 2, 4, 8, 10, 14, 16, 20 at 8px; 6n-5 and 6n-1 match bars 1, 5, 7, 11, 13, 17, 19 at 4px; the remaining bars 6, 12, 18 keep the default 12px.)

      .item{
        height: 12px;
        width: 2px;
        background-color: #3370FF33;
        margin-left: 5px;
      }
      .active{
        background-color: #3370FF;
      }
      .item:nth-child(6n-5),.item:nth-child(6n-1){
        height: 4px;
      }
      .item:nth-child(6n-4), .item:nth-child(6n-2){
        height: 8px;
      }
      .item:nth-child(6n-3){
        height: 14px;
      }

Note:

On iOS, the player produces no sound while the device's mute switch is on. We need to set obeyMuteSwitch: false globally in app.ts.

  onLaunch() {
    wx.setInnerAudioOption({
      obeyMuteSwitch:false
    })
  },

wx.setInnerAudioOption(Object object) | WeChat Open Documentation

Full component source code (uncommented):

wxml:

<van-overlay show="{{ vedioVisible }}" z-index="99">
  <view class="vedioDialog">
    <view class="close-icon" bindtap="closeVedio">
      <image src="../images/close-icon.svg"></image>
    </view>
    <view class="bigCircle" bindtouchstart="touchstart" bindtouchend="touchend">
      <view class="smallCircle {{startVedio?'onVedio':''}}">
        <text>{{startVedio?'正在录音':'长按录音'}}</text>
      </view>
    </view>
    <view>
      <view class="vedio-player" style="{{vedioTime==='0:00'?'display:none':null}}">
        <view class="vedio-file" bindtap="playVedio">
          <image src="{{isPlayVedio?'../images/vedio-start.svg':'../images/vedio-end.svg'}}"></image>
          <view class="item-list">
            <view wx:for="{{20}}" wx:for-item="item" class="item {{vedioProcess>item*5?'active':''}}" wx:key="index"></view>
          </view>
          <view class="time">{{vedioTime}}</view>
        </view>
        <view class="del-vedio" bindtap="delVedio">
          <image src="../images/bg-close-icon.svg"></image>
        </view>
      </view>
    </view>
  </view>
</van-overlay>

ts:

// components/multifunctionInput/multifunctionInput.ts
const vedioControl = wx.getRecorderManager() // recording controller
const innerAudioContext = wx.createInnerAudioContext({
  useWebAudioImplement: false,
}) // audio player
innerAudioContext.onPause(() => {
  console.log('暂停了') // paused
})
innerAudioContext.onPlay(() => {
  console.log('开始播放了') // playback started
})
Component({
  /**
   * Component property list
   */
  properties: {

  },

  /**
   * Initial component data
   */
  data: {
    vedioVisible:false,
    vedioProcess:0,
    vedioTime:'0:00',
    touchStartTime: 0,
    startVedio: false,
    setTimeFn:null as any,
    vedioList:[{
      url:'https://img.yzcdn.cn/vant/leaf.jpg',
      fileName:'音频.mp4',
      type: 'mp4',
      duration:4000
    }] ,
    isPlayVedio:false,
    timeVedio:0
  },

  /**
   * Component methods
   */
  methods: {
    openVedio(){
      this.setData({vedioVisible:true})
    },
    closeVedio(){
      this.setData({vedioVisible:false})
    },
    touchstart: function(e:any) {   
      this.setData({
        touchStartTime:e.timeStamp
      })
      this.setData({
        setTimeFn:setTimeout(()=>{
          vedioControl.start({})
          this.setData({startVedio:true})
        },100)
      })
    },  
   //touch end 
   touchend: function(e:any) {    
      if (e.timeStamp - this.data.touchStartTime >100){      
          vedioControl.stop()
          vedioControl.onStop((res) => {
            const ss = Math.floor(res.duration / 1000 % 60)
            const mm = Math.floor(res.duration / 1000 / 60) 
            this.setData({
              vedioTime: `${mm}:${ss >= 10 ? ss : '0' + ss}`
            })
            clearTimeout(this.data.setTimeFn)
            this.setData({startVedio:false})
            const vedioList = this.data.vedioList
            vedioList.push( {
              fileName: '音频',
              url: res.tempFilePath,
              type:'mp4',
              duration:res.duration
            })
            this.setData({
              setTimeFn:null,
              vedioList:vedioList
            })
          })
        }else{
          clearTimeout(this.data.setTimeFn)
          this.setData({
            setTimeFn:null,
            startVedio:false
          })
        } 
    },
    playVedio (){
      innerAudioContext.src =  this.data.vedioList[this.data.vedioList.length-1].url
      const vedio = this.data.vedioList[this.data.vedioList.length - 1].duration
      const duration = this.data.vedioList[this.data.vedioList.length - 1].duration /20
      if(!this.data.isPlayVedio){
        innerAudioContext.play() 
        this.setData({
          isPlayVedio:true
        })
        const circle = setInterval(()=>{
          this.setData({
            vedioProcess:this.data.vedioProcess+5,
            timeVedio:this.data.timeVedio + duration
          })
          if(vedio<=this.data.timeVedio){
            clearInterval(circle)
            this.setData({
              vedioProcess:0,
              timeVedio:0,
              isPlayVedio:false
            })
          }
          if(!this.data.isPlayVedio){
            clearInterval(circle)
          }
        },duration)
      }else{
        innerAudioContext.pause() 
        this.setData({
          isPlayVedio:false
        })
      }
      // innerAudioContext.pause() // pause
      // innerAudioContext.stop()  // stop
    },
    delVedio (){
      this.setData({
        vedioTime:'0:00',
        isPlayVedio:false,
        vedioProcess:0,
        timeVedio:0,
      })
    },
    removeVedio (e:any) {
      const index = e.target.dataset.index
      const {vedioList = []} = this.data
      vedioList.splice(index,1)
      this.setData({
        vedioList:vedioList
      })
    },

  }
})

less:

.vedioDialog{
  height: 350px;
  width: 90%;
  background-color: white;
  box-shadow: 0px 0px 4px 6px rgba(91, 202, 151, 0.03);
  border-radius: 7px;
  position: absolute;
  top: calc(50% - 175px);
  right: 5%;
  display: flex;
  justify-content: center;
  align-items: center;
  flex-direction: column;
  .close-icon{
    position: absolute;
    top: 20px;
    right: 20px;
    image{
      height: 30rpx;
      width: 30rpx;
    }
  }
  .bigCircle{
    height: 166px;
    width: 166px;
    border-radius: 100%;
    background-color: #3370FF2E;
    display: flex;
    align-items: center;
    justify-content: center;
    color: white;
    .smallCircle{
      width: 100px;
      height: 100px;
      background-color: #3370FF;
      border-radius: 100%;
      display: flex;
      align-items: center;
      justify-content: center;
    }
  }
  .onVedio{
    background-color: #04D7B9 !important; 
  }
  .vedio-player{
    margin-top: 20px;
    display: flex;
    justify-content: center;
    align-items: center;
    .vedio-file{
      width: 250px;
      height: 44px;
      background: #F5F5F5;
      border-radius: 34.5px;
      display: flex;
      align-items: center;
      padding:0 10px;
      box-sizing: border-box;
      justify-content:space-evenly;
      image{
        height: 33px;
        width: 33px;
      }
      .item-list{
        width: 150px;
        height: 15px;
        display: flex;
        align-items: center;
      }
      .time{
        color: #0D296E;
        font-size: 12px;
      }
      .item{
        height: 12px;
        width: 2px;
        background-color: #3370FF33;
        margin-left: 5px;
      }
      .active{
        background-color: #3370FF;
      }
      .item:nth-child(6n-5),.item:nth-child(6n-1){
        height: 4px;
      }
      .item:nth-child(6n-4), .item:nth-child(6n-2){
        height: 8px;
      }
      .item:nth-child(6n-3){
        height: 14px;
      }
  
    }
    .del-vedio{
      margin-left: 15px;
      image{
        height: 25px;
        width: 25px;
      }
    }
  }

}

json:

{
  "component": true,
  "usingComponents": {
    "van-overlay": "@vant/weapp/overlay/index"
  }
}
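To use the component from a page, a minimal sketch might look like this (the tag name, id, and page path are assumptions, not part of the original source):

// pages/demo/demo.ts — sketch only. First register the component in the page's json,
// e.g. "usingComponents": { "multifunction-input": "/components/multifunctionInput/multifunctionInput" },
// and place <multifunction-input id="recorder" /> in the page's wxml.
Page({
  onOpenRecorder() {
    // Grab the component instance and call openVedio() to show the recording overlay.
    const recorder = this.selectComponent('#recorder') as any
    recorder.openVedio()
  },
})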

Source: blog.csdn.net/weixin_44383533/article/details/130710846