uniapp iOS native plugin development: problem record

Component-type plugin

The following uses a code-scanning native plugin as an example. The steps for creating the plugin project are not described in detail here; the basic configuration is shown in the screenshots referenced below.

iOS code

Plug-in project configuration screenshot


Main project file configuration screenshot


native development

Create a new component class in the plugin project named TestScanComponent; the first letter must be capitalized.
The .h file imports DCUniComponent and inherits from it:

#import <UIKit/UIKit.h>
#import "DCUniComponent.h"

@interface TestScanComponent : DCUniComponent

@end

.m file

#import "TestScanComponent.h"
#import "DCUniConvert.h"
#import <AVFoundation/AVFoundation.h>
#import "QRView.h"  // 扫码边框样式 第三方  非必写。


@interface TestScanComponent ()<AVCaptureMetadataOutputObjectsDelegate,QRViewDelegate>
{
    
    
    NSString * strFlash;
}
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *videoPreviewLayer;
@property (nonatomic, strong) NSArray *supportedBarcodeTypes; // 扫码类型


@end

@implementation TestScanComponent

- (void)viewWillLoad {
    [self startScanning];
}
- (void)viewDidLoad {
    self.supportedBarcodeTypes = @[AVMetadataObjectTypeQRCode,
                                   AVMetadataObjectTypeUPCECode,
                                   AVMetadataObjectTypeCode39Code,
                                   AVMetadataObjectTypeCode39Mod43Code,
                                   AVMetadataObjectTypeCode93Code,
                                   AVMetadataObjectTypeCode128Code,
                                   AVMetadataObjectTypeEAN8Code,
                                   AVMetadataObjectTypeEAN13Code,
                                   AVMetadataObjectTypeAztecCode,
                                   AVMetadataObjectTypePDF417Code,
                                   AVMetadataObjectTypeInterleaved2of5Code,
                                   AVMetadataObjectTypeITF14Code,
                                   AVMetadataObjectTypeDataMatrixCode];

    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        NSLog(@"%@", [error localizedDescription]);
        return;
    }

    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession addInput:input];

    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [self.captureSession addOutput:captureMetadataOutput];

    dispatch_queue_t dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:self.supportedBarcodeTypes];

    self.videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    [self.videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [self.videoPreviewLayer setFrame:self.view.layer.bounds];
    [self.view.layer addSublayer:self.videoPreviewLayer];

    [self.captureSession startRunning];

    // Scan-frame overlay animation (third-party QRView)
    CGRect screenRect = self.view.bounds;
    QRView *qrRectView = [[QRView alloc] initWithFrame:screenRect];
    qrRectView.transparentArea = CGSizeMake(200, 200);
    qrRectView.backgroundColor = [UIColor clearColor];
    qrRectView.center = CGPointMake(self.view.frame.size.width / 2, self.view.frame.size.height / 2);
    qrRectView.delegate = self;
    [self.view addSubview:qrRectView];

    // Restrict scanning to the transparent window of the overlay.
    // rectOfInterest uses normalized coordinates with x/y swapped relative to
    // the preview layer, hence the y/height, x/width order below.
    CGFloat screenHeight = self.view.frame.size.height;
    CGFloat screenWidth = self.view.frame.size.width;
    CGRect cropRect = CGRectMake((screenWidth - qrRectView.transparentArea.width) / 2,
                                 (screenHeight - qrRectView.transparentArea.height) / 2,
                                 qrRectView.transparentArea.width,
                                 qrRectView.transparentArea.height);
    [captureMetadataOutput setRectOfInterest:CGRectMake(cropRect.origin.y / screenHeight,
                                                        cropRect.origin.x / screenWidth,
                                                        cropRect.size.height / screenHeight,
                                                        cropRect.size.width / screenWidth)];
}
- (void)viewDidUnload {
    [self stopScanning];
}

#pragma mark - Torch (flashlight) on/off
UNI_EXPORT_METHOD(@selector(setFlash:))
- (void)setFlash:(NSDictionary *)option {
    strFlash = [NSString stringWithFormat:@"%@", [option objectForKey:@"value"]];
    if ([strFlash isEqualToString:@"1"]) {
        [self turnOnTorch];
    } else {
        [self turnOffTorch];
    }
}
// Called from the front end: turn on the torch
UNI_EXPORT_METHOD(@selector(turnOnTorch))
- (void)turnOnTorch {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (device && [device hasTorch] && [device isTorchAvailable] && [device isTorchModeSupported:AVCaptureTorchModeOn]) {
        NSError *error;
        [device lockForConfiguration:&error];
        if (!error) {
            [device setTorchMode:AVCaptureTorchModeOn];
            [device unlockForConfiguration];
        } else {
            NSLog(@"Failed to turn on the torch: %@", error.localizedDescription);
        }
    }
}
// Called from the front end: turn off the torch
UNI_EXPORT_METHOD(@selector(turnOffTorch))
- (void)turnOffTorch {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (device && [device hasTorch] && [device isTorchAvailable] && [device isTorchModeSupported:AVCaptureTorchModeOff]) {
        NSError *error;
        [device lockForConfiguration:&error];
        if (!error) {
            [device setTorchMode:AVCaptureTorchModeOff];
            [device unlockForConfiguration];
        } else {
            NSLog(@"Failed to turn off the torch: %@", error.localizedDescription);
        }
    }
}
#pragma mark - Start / stop scanning
// Called from the front end: start scanning
UNI_EXPORT_METHOD(@selector(startScanning))
- (void)startScanning {
    if (![self.captureSession isRunning]) {
        [self.captureSession startRunning];
    }
}
// Called from the front end: stop scanning
UNI_EXPORT_METHOD(@selector(stopScanning))
- (void)stopScanning {
    if ([self.captureSession isRunning]) {
        [self.captureSession stopRunning];
    }
}
#pragma mark - AVCaptureMetadataOutputObjectsDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    NSString *stringValue;
    if (metadataObjects.count > 0) {
        AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
        stringValue = metadataObj.stringValue;
        [self stopScanning]; // stop scanning

        dispatch_async(dispatch_get_main_queue(), ^{
            // Fire an event to the front end. params is the data passed to JS;
            // note that the outermost value must be an NSDictionary keyed by "detail".
            [self fireEvent:@"scanLoaded" params:@{@"detail": @{@"data": stringValue}} domChanges:nil];
        });

        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(3.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            // Resume scanning after a short delay.
            [self startScanning];
        });
    }
}

@end

On the iOS side, methods are exposed to the JS side through the UNI_EXPORT_METHOD macro:

// Called from the front end
// Turn the flash (torch) on or off
UNI_EXPORT_METHOD(@selector(setFlash:))
- (void)setFlash:(NSDictionary *)option;
// Start scanning
UNI_EXPORT_METHOD(@selector(startScanning))
- (void)startScanning;
// Stop scanning
UNI_EXPORT_METHOD(@selector(stopScanning))
- (void)stopScanning;

Screenshot of the plist configuration. Note that the class entry must match the class (file) name, and the component name must be unique.
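For reference, the registration in the plugin's dcloud_uniplugins.plist corresponds roughly to the sketch below for this example (a reconstruction, not a copy of the screenshot; key names follow the DCloud plugin template, "name" is the tag used on the nvue side, and "class" is the native component class):

<key>plugins</key>
<array>
    <dict>
        <key>type</key>
        <string>component</string>
        <key>name</key>
        <string>dc-testmapscan</string>
        <key>class</key>
        <string>TestScanComponent</string>
    </dict>
</array>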
At this point the iOS side of the native plugin is complete. Next comes the uniapp code.

Using the plugin in uniapp code

Note that component-type plugins can only be used in nvue files in uniapp.

<template>
	<view class="page">
		<view>
			<!-- Use the plugin directly -->
			<dc-testmapscan ref='mycomponent' style="width:100%;height:400px" @scanLoaded="onMapLoaded"></dc-testmapscan>
		</view>

		<view class="font-box">
			<text :class="flash ? 'shandian icon ziconfont' : 'jingyongshandian icon ziconfont'"
				@click="toFlash">{{flash ? "&#xe61b;" : "&#xe8a7;"}}</text>
		</view>
	</view>
	
</template>

<script>
	import fontFace from '@/common/utils/iconfont.js' // iconfont for nvue files
	import permision from "@/common/utils/permission.js" // permission helpers

	export default {
		components: {},
		data() {
			return {
				// flashlight state
				flash: false,
				topHeight: 60,
			}
		},
		onShow() {
			// #ifdef APP-PLUS
			this.scanplus();
			// #endif
		},
		onHide() {
			this.$refs.mycomponent.stopScanning();
		},
		created() {
			const domModule = weex.requireModule("dom");
			domModule.addRule('fontFace', fontFace)
			// https://ask.dcloud.net.cn/question/113657
			const uniDomModule = uni.requireNativePlugin('dom')
			uniDomModule.addRule('fontFace', fontFace);
		},
		mounted() {
			this.countTopBar()
			sunmiScan.bindService()
		},
		destroyed() {
			sunmiScan.clean()
		},
		methods: {
			// Receive the scanned barcode content
			onMapLoaded: function(e) {
				// The data passed from the native side is stored in e.detail
				console.log('map loaded:', JSON.stringify(e.detail))
				var vcode = JSON.stringify(e.detail);
				// Handle the scanned barcode content here
			},

			/**
			 * Measure the top DOM element
			 */
			async countTopBar() {
				var that = this;
				const topBar = await this.$getComponentsDom('#navBar');
				this.topHeight = topBar.height
				// #ifdef APP-PLUS
				// Check the device and version here.
				if (uni.getSystemInfoSync().platform == 'android' && sunmiScan.getScannerModel().code > 100) {

				} else {
					this.scanplus();
				}
				// #endif
			},
			// #ifdef APP-PLUS
			async checkPermission(code) {
				let status = permision.isIOS ? await permision.requestIOS('camera') :
					await permision.requestAndroid('android.permission.CAMERA');
				if (status === null || status === 1) {
					status = 1;
				} else {
					uni.showModal({
						content: "Camera permission is required",
						confirmText: "Settings",
						success: function(res) {
							if (res.confirm) {
								permision.gotoAppSetting();
							}
						}
					})
				}
				return status;
			},
			// #endif
			async scanplus() {
				var that = this
				let status = await this.checkPermission();
				if (status !== 1) {
					return;
				}
				this.$refs.mycomponent.startScanning();
			},

			/**
			 * Turn the flashlight on / off
			 */
			toFlash: function() {
				this.flash = !this.flash
				this.$refs.mycomponent.setFlash({'value': this.flash});
			},
			/**
			 * Go back to the previous page
			 */
			toBack() {
				uni.navigateBack({
					delta: 1
				})
			},
		}
	}
</script>

<style>
	@font-face {
		font-family: ziconfont;
		src: url('https://at.alicdn.com/t/font_865816_17gjspmmrkti.ttf') format('truetype');
	}

	.page {
		position: relative;
	}
	.font-box {
		/* position: absolute; */
		/* left: 0;
		right: 0; */
		display: flex;
		align-items: center;
		justify-content: center;
		/* z-index: 100; */
		flex-direction: column;
		/* width: 100%; */
		margin-top: 30rpx;
	}
	.shandian {
		background-color: #FD5022;
	}
	.jingyongshandian {
		background-color: #ddd;
	}
	.icon {
		text-align: center;
		line-height: 60rpx;
		color: #fff;
		font-size: 32rpx;
		width: 160rpx;
		height: 60rpx;
		margin-bottom: 20rpx;
		border-radius: 80rpx;
		font-family: ziconfont;
	}
	.font {
		color: #999;
		margin-bottom: 20rpx;
		font-size: 24rpx
	}
	.font-red {
		color: #FD5022;
		font-size: 24rpx;
		text-decoration: underline;
	}
	.ziconfont {
		font-family: ziconfont;
	}
</style>

The JS methods used on the front end:

<dc-testmapscan ref='mycomponent' style="width:100%;height:400px" @scanLoaded="onMapLoaded"></dc-testmapscan>

this.$refs.mycomponent.startScanning();

this.$refs.mycomponent.setFlash({'value': this.flash});

this.$refs.mycomponent.stopScanning();

onMapLoaded:function(e) { // Process the obtained barcode content }
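Since the native side fires the scanLoaded event with the payload wrapped as {"detail": {"data": <scanned string>}}, the handler can read the result from e.detail.data. A minimal sketch of the handler (the uni.showToast call is only one illustrative way to surface the value):

onMapLoaded: function(e) {
    // e.detail is the dictionary passed to fireEvent on the native side
    const code = e.detail && e.detail.data;
    if (code) {
        // handle the scanned barcode content, e.g. display it
        uni.showToast({ title: code, icon: 'none' });
    }
}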

At this point the uniapp code is complete. Next, generate the local packaging (native App) resource from the uniapp project.

iOS-side joint debugging

Copy the generated resource package into the main project used for iOS native plugin development.
Connect a phone and debug on a real device; if everything works, proceed to the next step. The purpose of this joint debugging is simply to verify that the feature runs correctly.

iOS-side packaging

Once real-device debugging passes, an ipa package can be built for testing.

Note: the plugin project must be exported as a .framework or .a library file.
Otherwise, the page that uses the plugin will crash (flash back) in the generated ipa package.

Select the plugin project and set the build target to the plugin project scanDemo.
Then open Edit Scheme, switch Run -> Info -> Build Configuration to Release, and click Close to dismiss the dialog.
Set Build Active Architecture Only -> Release to No.
Valid Architectures should contain at least arm64 (generally the project defaults are fine).
These are basically the default settings and usually do not need to be changed.

Set the minimum supported iOS version in iOS Deployment Target; iOS 11.0 is recommended.
Then click the Run button, or press Command + B, to build the project.

After the build completes, choose Product > Show Build Folder in Finder from the top menu to open the build output directory. Products > Release-iphoneos > scanDemo.framework is the compiled plugin library; copy this library into the main project's SDK/Libs directory.

After that, the ipa packaged on iOS calls the plugin normally.

At first I did not export scanDemo.framework. In the test ipa built from the packaging base, the scanning page kept showing a blank screen and crashing, and the camera permission did not even appear in the phone's Settings.
After exporting the plugin as scanDemo.framework, adding it to the project, and packaging again, everything worked normally.


Origin blog.csdn.net/c1o2c3o4/article/details/132150618