Uniapp zero-based development study notes (9) - Practical use of media components, audio, video, camera, etc.

In the basic components section, only the media components, map, canvas, and the web-view browser component remain.
This time, let's look at the media components first, focusing on the first few.
Documentation link:
https://uniapp.dcloud.net.cn/component/audio.html#

1. Usage of media components

1. audio component
This is the effect of the demonstration case.
In the sample source code, src is the playback address, poster the cover-image address, name the audio name, author the author, and action the playback action; several of these do not seem to be among the documented attributes. If loop='true' is defined, the audio plays in a loop.

<!-- audio playback -->
<view class="uni-padding-wrap uni-common-mt">
	<view class="page-section page-section-gap" style="text-align: center;">
		<audio style="text-align: left" :src="current.src" :poster="current.poster" :name="current.name"
		:author="current.author" :action="audioAction" controls></audio>
	</view>
</view>
<!-- end of audio playback -->

The JS code is as follows, with variable definitions added.

export default {
	data() {
		return {
			current: {
				poster: 'https://bjetxgzv.cdn.bspapp.com/VKCEYUGU-uni-app-doc/7fbf26a0-4f4a-11eb-b680-7980c8a877b8.png',
				name: '致爱丽丝',
				author: '贝多芬',
				src: 'https://bjetxgzv.cdn.bspapp.com/VKCEYUGU-hello-uniapp/2cc220e0-c27a-11ea-9dfb-6da8e309e0d8.mp3',
			},
			audioAction: {
				method: 'pause'
			}
		}
	}
}

WeChat Mini Programs and many other mini-program platforms no longer support the audio component. Try writing it with the API instead; see uni.createInnerAudioContext.
Add a button that plays music, with a click bound to the playAudio function that starts playback:

<button type="primary"  @click="playAudio">播放音乐</button>

Function definition:

playAudio() {
	const innerAudioContext = uni.createInnerAudioContext();
	innerAudioContext.autoplay = true;
	innerAudioContext.src = this.current.src;
	innerAudioContext.obeyMuteSwitch = true; // follow the system mute switch
	innerAudioContext.volume = 0.8; // volume (0 to 1)
	innerAudioContext.onPlay(() => {
		console.info('开始播放'); // playback started
	});
	innerAudioContext.onError((res) => {
		console.log(res.errMsg);
		console.log(res.errCode);
	});
	innerAudioContext.onEnded(() => {
		console.info('结束播放'); // playback ended
	});
}
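One thing worth noting: the playAudio function above creates a brand-new inner audio context on every button tap, so a "pause" button would have nothing to act on. A minimal sketch of a reusable player follows; the helper name makePlayer is my own, not from the original post, and it assumes the `uni` global provided by the uni-app runtime.

```javascript
// Hypothetical helper (not in the original post): lazily create ONE inner
// audio context and reuse it, so play/pause act on the same native player
// instead of spawning a new context on every tap.
function makePlayer(src) {
    let ctx = null; // created on first play, then reused
    return {
        play() {
            if (!ctx) {
                ctx = uni.createInnerAudioContext();
                ctx.src = src;
            }
            ctx.play();
        },
        pause() {
            if (ctx) ctx.pause();
        },
        destroy() {
            // release the native audio resource, e.g. in onUnload
            if (ctx) {
                ctx.destroy();
                ctx = null;
            }
        }
    };
}
```

A page would keep one `makePlayer(this.current.src)` instance in data and wire play/pause buttons to its methods.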

2. camera component
Neither App nor H5 supports this component, but the WeChat Mini Program does. Since the WeChat devtools are not installed here, I read the sample code but could not run it.
Several important attributes:
mode: valid values are normal (taking pictures) and scanCode (scanning codes)
resolution: string type, default medium; the other documented values are low and high
device-position: front or rear camera, front/back
flash: flash mode, auto/on/off
Sample code:

<view>
     <camera device-position="back" flash="off" @error="error" style="width: 100%; height: 300px;"></camera>
     <button type="primary" @click="takePhoto">拍照</button>
     <view>预览</view>
     <image mode="widthFix" :src="src"></image>
 </view>

JS:

export default {
    data() {
        return {
            src: "" // temporary path of the captured photo
        }
    },
    methods: {
        takePhoto() {
            const ctx = uni.createCameraContext();
            ctx.takePhoto({
                quality: 'high',
                success: (res) => {
                    this.src = res.tempImagePath
                }
            });
        },
        error(e) {
            console.log(e.detail);
        }
    }
}
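The photo only lives at a temporary path. A natural follow-up, not shown in the original post, is writing it to the system album with uni.saveImageToPhotosAlbum; the helper name savePhoto below is my own, and the sketch assumes the uni-app `uni` global.

```javascript
// Hypothetical sketch (not in the original post): persist the temp photo
// from takePhoto to the system album via uni.saveImageToPhotosAlbum.
function savePhoto(tempPath, done) {
    if (!tempPath) {            // nothing captured yet
        done(false);
        return;
    }
    uni.saveImageToPhotosAlbum({
        filePath: tempPath,     // the res.tempImagePath from takePhoto
        success: () => done(true),
        fail: () => done(false) // e.g. the user denied album permission
    });
}
```

It would be called from takePhoto's success callback as `savePhoto(res.tempImagePath, ok => ...)`.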

Implemented with the API: uni.chooseImage(OBJECT) selects pictures from the local album or takes photos with the camera.

data() {
	return {}
},
GetImg() {
	uni.chooseImage({
		count: 6, // default is 9
		sizeType: ['original', 'compressed'], // original and/or compressed image; default both
		sourceType: ['album'], // 'album' to pick from the gallery, 'camera' to shoot; default both
		success: function (res) {
			console.log(JSON.stringify(res.tempFilePaths));
		}
	});
}
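The tempFilePaths logged above pair naturally with uni.previewImage, which opens a full-screen, swipeable image viewer. A minimal sketch follows; the helper name previewChosen is my own, and it assumes the uni-app `uni` global.

```javascript
// Hypothetical sketch (not in the original post): preview the images
// returned by chooseImage with uni.previewImage.
function previewChosen(tempFilePaths, startIndex = 0) {
    if (!Array.isArray(tempFilePaths) || tempFilePaths.length === 0) {
        return false; // previewImage requires a non-empty urls array
    }
    uni.previewImage({
        urls: tempFilePaths,                // all selected images
        current: tempFilePaths[startIndex]  // the one shown first
    });
    return true;
}
```

Inside chooseImage's success callback this would simply be `previewChosen(res.tempFilePaths)`.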

3. image component
The image component is used very frequently.
Important attributes:
src: image address; both static resources and network URLs are acceptable
mode: how the image is cropped or scaled to fit its container
lazy-load: lazy loading of images; only effective for images inside page and scroll-view. To improve the perceived page effect, the page frame renders first and the image's position is reserved until the image itself is displayed.
draggable: whether the image can be dragged
For events, @load is triggered after loading completes and @error is triggered when an error occurs.
Sample code effect:
Page code:
The image attributes defined here are mainly width/height/background color and the fill mode mode; the effects of all modes are listed below.

<view class="page">
	<view class="image-list">
		<view class="image-item" v-for="(item,index) in array" :key="index">
			<view class="image-content">
				<image style="width: 200px; height: 200px; background-color: #eeeeee;" :mode="item.mode" :src="src"
					@error="imageError"></image>
			</view>
			<view class="image-title">{{item.text}}</view>
		</view>
	</view>
</view>

JS:

src: 'https://bjetxgzv.cdn.bspapp.com/VKCEYUGU-uni-app-doc/6acec660-4f31-11eb-a16f-5b3e54966275.jpg',
array: [{
	mode: 'scaleToFill',
	text: 'scaleToFill:不保持纵横比缩放图片,使图片完全适应'
}, {
	mode: 'aspectFit',
	text: 'aspectFit:保持纵横比缩放图片,使图片的长边能完全显示出来'
}, {
	mode: 'aspectFill',
	text: 'aspectFill:保持纵横比缩放图片,只保证图片的短边能完全显示出来'
}, {
	mode: 'top',
	text: 'top:不缩放图片,只显示图片的顶部区域'
}, {
	mode: 'bottom',
	text: 'bottom:不缩放图片,只显示图片的底部区域'
}, {
	mode: 'center',
	text: 'center:不缩放图片,只显示图片的中间区域'
}, {
	mode: 'left',
	text: 'left:不缩放图片,只显示图片的左边区域'
}, {
	mode: 'right',
	text: 'right:不缩放图片,只显示图片的右边区域'
}, {
	mode: 'top left',
	text: 'top left:不缩放图片,只显示图片的左上边区域'
}, {
	mode: 'top right',
	text: 'top right:不缩放图片,只显示图片的右上边区域'
}, {
	mode: 'bottom left',
	text: 'bottom left:不缩放图片,只显示图片的左下边区域'
}, {
	mode: 'bottom right',
	text: 'bottom right:不缩放图片,只显示图片的右下边区域'
}]
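The lazy-load attribute and the @load/@error events described earlier are not exercised by the mode demo. A minimal sketch of how they might be wired up follows; the image path /static/demo.png is a placeholder, not from the original post.

```html
<!-- Minimal sketch (not from the demo): lazy-load only takes effect for
     images inside page or scroll-view; @load / @error report the result. -->
<scroll-view scroll-y style="height: 600rpx;">
	<image lazy-load mode="widthFix" src="/static/demo.png"
		@load="onImgLoad" @error="onImgError"></image>
</scroll-view>
```

The two handlers could simply log, e.g. `onImgLoad(e) { console.log('loaded', e.detail) }` and `onImgError(e) { console.log('failed', e.detail.errMsg) }`.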

4. video component
I used this once before, but playback failed; let's see what the problem was this time.
This time it finally worked. The reason: it must be run in Chrome, not the built-in browser.
Sample code:
input v-model="danmuValue" two-way binds the danmaku (bullet comment) text to be sent.
Danmaku colors can also be defined; a random-color function getRandomColor() is defined here.

<!-- video playback -->
<view class="uni-padding-wrap uni-common-mt">
	<view>
		<video id="myVideo" src="https://img.cdn.aliyun.dcloud.net.cn/guide/uniapp/%E7%AC%AC1%E8%AE%B2%EF%BC%88uni-app%E4%BA%A7%E5%93%81%E4%BB%8B%E7%BB%8D%EF%BC%89-%20DCloud%E5%AE%98%E6%96%B9%E8%A7%86%E9%A2%91%E6%95%99%E7%A8%[email protected]"
			@error="videoErrorCallback" :danmu-list="danmuList"
			show-loading='true'
			enable-danmu danmu-btn controls></video>
	</view>
	<!-- conditional compilation: every platform except the Alipay mini program -->
	<!-- #ifndef MP-ALIPAY -->
	<view class="uni-list uni-common-mt">
		<view class="uni-list-cell">
			<view>
				<view class="uni-label">弹幕内容</view>
			</view>
			<view class="uni-list-cell-db">
				<input v-model="danmuValue" class="uni-input" type="text" placeholder="在此处输入弹幕内容" />
			</view>
		</view>
	</view>
	<view class="uni-btn-v">
		<button @click="sendDanmu" class="page-body-button">发送弹幕</button>
	</view>
	<!-- #endif -->
</view>
<!-- end of video playback -->

JS:

export default {
    data() {
        return {
            src: '',
            danmuList: [{
                    text: '第 1s 出现的弹幕',
                    color: '#ff0000',
                    time: 1
                },
                {
                    text: '第 3s 出现的弹幕',
                    color: '#ff00ff',
                    time: 3
                }
            ],
            danmuValue: ''
        }
    },
    onReady: function(res) {
        // #ifndef MP-ALIPAY
        this.videoContext = uni.createVideoContext('myVideo')
        // #endif
    },
    methods: {
        sendDanmu: function() {
            this.videoContext.sendDanmu({
                text: this.danmuValue,
                color: this.getRandomColor()
            });
            this.danmuValue = '';
        },
        videoErrorCallback: function(e) {
            uni.showModal({
                content: e.target.errMsg,
                showCancel: false
            })
        },
        getRandomColor: function() {
            // build a random #rrggbb color from three random 0-255 channels
            const rgb = []
            for (let i = 0; i < 3; ++i) {
                let color = Math.floor(Math.random() * 256).toString(16)
                color = color.length == 1 ? '0' + color : color
                rgb.push(color)
            }
            return '#' + rgb.join('')
        }
    }
}

5. live-player component
Real-time audio and video playback, i.e. live-stream pulling. This is not needed very often; just look at the sample code.
On App, real-time audio and video playback does not use live-player; the video component is used directly instead.
Key attributes:
mode: live (live broadcast) or RTC (real-time call; this mode has lower latency)
sound-mode: sound output mode; valid values are speaker (loudspeaker) and ear (earpiece)
object-fit: contain (the long side of the image fills the screen and the short-side area is padded); fillCrop (the image fills the screen and the part beyond the display area is cut off)

<!-- live-stream pulling -->
<live-player
  src="https://domain/pull_stream"
  autoplay
  @statechange="statechange"
  @error="error"
  style="width: 300px; height: 225px;"
/>
<!-- end of live-stream pulling -->

JS

export default {
    methods: {
        statechange(e) {
            console.log('live-player code:', e.detail.code)
        },
        error(e) {
            console.error('live-player error:', e.detail.errMsg)
        }
    }
}

This can't be tested here either, so just take a look.
6. live-pusher component: real-time audio and video recording, also called live-stream pushing
Main attributes: see the attribute table in the documentation.
Sample code:

<view>
	<live-pusher id='livePusher' ref="livePusher" class="livePusher" url=""
		mode="SD" :muted="true" :enable-camera="true" :auto-focus="true" :beauty="1" whiteness="2"
		aspect="9:16" @statechange="statechange" @netstatus="netstatus" @error="error"
	></live-pusher>
	<button class="btn" @click="start">开始推流</button>
	<button class="btn" @click="pause">暂停推流</button>
	<button class="btn" @click="resume">恢复推流</button>
	<button class="btn" @click="stop">停止推流</button>
	<button class="btn" @click="snapshot">快照</button>
	<button class="btn" @click="startPreview">开启摄像头预览</button>
	<button class="btn" @click="stopPreview">关闭摄像头预览</button>
	<button class="btn" @click="switchCamera">切换摄像头</button>
</view>

JS

<script>
    export default {
        data() {
            return {}
        },
        onReady() {
            // Note: the context must be created in onReady, or after a delay in onLoad
            this.context = uni.createLivePusherContext("livePusher", this);
        },
        methods: {
            statechange(e) {
                console.log("statechange:" + JSON.stringify(e));
            },
            netstatus(e) {
                console.log("netstatus:" + JSON.stringify(e));
            },
            error(e) {
                console.log("error:" + JSON.stringify(e));
            },
            start: function() {
                this.context.start({
                    success: (a) => {
                        console.log("livePusher.start:" + JSON.stringify(a));
                    }
                });
            },
            close: function() {
                this.context.close({
                    success: (a) => {
                        console.log("livePusher.close:" + JSON.stringify(a));
                    }
                });
            },
            snapshot: function() {
                this.context.snapshot({
                    success: (e) => {
                        console.log(JSON.stringify(e));
                    }
                });
            },
            resume: function() {
                this.context.resume({
                    success: (a) => {
                        console.log("livePusher.resume:" + JSON.stringify(a));
                    }
                });
            },
            pause: function() {
                this.context.pause({
                    success: (a) => {
                        console.log("livePusher.pause:" + JSON.stringify(a));
                    }
                });
            },
            stop: function() {
                this.context.stop({
                    success: (a) => {
                        console.log(JSON.stringify(a));
                    }
                });
            },
            switchCamera: function() {
                this.context.switchCamera({
                    success: (a) => {
                        console.log("livePusher.switchCamera:" + JSON.stringify(a));
                    }
                });
            },
            startPreview: function() {
                this.context.startPreview({
                    success: (a) => {
                        console.log("livePusher.startPreview:" + JSON.stringify(a));
                    }
                });
            },
            stopPreview: function() {
                this.context.stopPreview({
                    success: (a) => {
                        console.log("livePusher.stopPreview:" + JSON.stringify(a));
                    }
                });
            }
        }
    }
</script>

Preview is not supported on H5, so this is just for understanding.


Origin blog.csdn.net/qq_43662503/article/details/127490126