iFLYTEK AIUI weather forecast (under test)

1. Cloud configuration


#Create application
AIUI is a full-link human-computer interaction voice solution launched by iFLYTEK in 2015, with natural language understanding at its core. It can quickly enable your applications and devices to listen, speak, understand, and think.
Go to the AIUI open platform, log in to your account, and choose Application Access > Enter Application > Create Application (or, in the beginner's guide below, Create an AIUI Application). Fill in your application's information and target platform, then create the application.
After creating the application, you can download the corresponding SDK package from the application's SDK download page (the downloaded package is named Windows_aiui5.5.1059.0001_60208bd0). The SDK package contains the MSC and AIUI libraries.
This chapter only covers introductory use of the AIUI library. The MSC library provides wake-up and synthesis capabilities; if you need to learn about the MSC library, please see the MSC Development Guide.

2. Android platform integration steps


#Import SDK
Open Android Studio and create a new project. Copy libaiui.so and AIUI.jar from the libs directory of the downloaded Android SDK package into the libs directory of your Android project. Then copy the cfg folder from the SDK package's assets directory, and the vad folder from its res directory, into the project.

The project structure is shown in the figure below:
Add AIUI.jar to the project dependencies.

Specify the default jniLibs directory as libs in the gradle configuration file (build.gradle) under the app module.

android {
    ...
    sourceSets {
        main {
            jniLibs.srcDirs = ['libs']
        }
    }
    ...
}

As shown below:


#Add user permissions
Add the following permissions to the project's AndroidManifest.xml file. If the app is integrated on Android 6.0 or later, please request the required permissions dynamically at runtime.

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_SETTINGS" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />


Note: If you obfuscate when packaging or generating the APK, please add the following rules to proguard.cfg:

-keep class com.iflytek.** { *; }
-keepattributes Signature


#Create layout file
You can refer to the project source code to add the layout.

#Create AIUIAgent
The AIUIAgent provided in the SDK is the bridge for interacting with AIUI. Create an AIUIAgent as in the following example:

// Create AIUIAgent
AIUIAgent mAIUIAgent = AIUIAgent.createAgent(context, getAIUIParams(), mAIUIListener);

The createAgent method takes three parameters:

The first parameter is a Context;
the second parameter is a String, the contents of the cfg/aiui_phone.cfg file read from the assets directory;
the third parameter is an AIUIListener, the listener for AIUI event callbacks.

A concrete example of getAIUIParams() is as follows:

private String getAIUIParams() {
    String params = "";
    AssetManager assetManager = getResources().getAssets();
    try {
        InputStream ins = assetManager.open("cfg/aiui_phone.cfg");
        byte[] buffer = new byte[ins.available()];

        ins.read(buffer);
        ins.close();

        params = new String(buffer);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return params;
}
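The SDK sample above sizes its buffer with ins.available() and issues a single read(), which works for small assets but in general a single read() is not guaranteed to fill the buffer. A plain-Java sketch of a loop that always reads the stream to the end (the ConfigReader class and readAll helper are illustrative names of my own, not SDK API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ConfigReader {
    // Read an InputStream to completion, regardless of how many bytes each read() returns
    static String readAll(InputStream ins) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = ins.read(chunk)) != -1) {
            out.write(chunk, 0, n);
        }
        ins.close();
        return new String(out.toByteArray(), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for assetManager.open("cfg/aiui_phone.cfg")
        InputStream demo = new ByteArrayInputStream(
                "sample_rate=16000,data_type=audio".getBytes(StandardCharsets.UTF_8));
        System.out.println(readAll(demo));  // sample_rate=16000,data_type=audio
    }
}
```

In the Android code, readAll(assetManager.open("cfg/aiui_phone.cfg")) would replace the available()-sized buffer.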

A specific example of mAIUIListener is as follows:

AIUIListener mAIUIListener = new AIUIListener() {

	@Override
	public void onEvent(AIUIEvent event) {
		switch (event.eventType) {
			// Wake-up event
			case AIUIConstant.EVENT_WAKEUP:
			{
				break;
			}
			// Result event (covers dictation, semantic, and offline grammar results)
			case AIUIConstant.EVENT_RESULT:
			{
				break;
			}
			// Sleep event
			case AIUIConstant.EVENT_SLEEP:
			{
				break;
			}
			// State event
			case AIUIConstant.EVENT_STATE: {
				mAIUIState = event.arg1;
				if (AIUIConstant.STATE_IDLE == mAIUIState) {
					// Idle state: AIUI is not started
				} else if (AIUIConstant.STATE_READY == mAIUIState) {
					// AIUI is ready, waiting to be woken up
				} else if (AIUIConstant.STATE_WORKING == mAIUIState) {
					// AIUI is working and can interact
				}
			} break;
			// Error event
			case AIUIConstant.EVENT_ERROR:
			{
				break;
			}
		}
	}
};
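The three states handled in the EVENT_STATE branch form a small state machine: starting the agent moves it from idle to ready, waking it up moves it into working, and sleeping drops it back to ready. A minimal plain-Java sketch of that idea (the enum and transition methods are illustrative only, not SDK API; real state changes arrive via EVENT_STATE callbacks):

```java
public class AiuiStateMachine {
    // Mirrors AIUIConstant.STATE_IDLE / STATE_READY / STATE_WORKING
    enum State { IDLE, READY, WORKING }

    private State state = State.IDLE;

    State current() { return state; }

    // Illustrative transitions between the three documented states
    void onStart()  { if (state == State.IDLE) state = State.READY; }
    void onWakeup() { if (state == State.READY) state = State.WORKING; }
    void onSleep()  { if (state == State.WORKING) state = State.READY; }

    public static void main(String[] args) {
        AiuiStateMachine m = new AiuiStateMachine();
        m.onStart();
        m.onWakeup();
        System.out.println(m.current());  // WORKING
    }
}
```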


#Speech semantic understanding example
Send a CMD_WAKEUP message to AIUI to put it into the awake state, then send a start-recording message so the microphone begins capturing audio; the semantic result is delivered through the AIUIListener callback. The code example is as follows:

// Send the wake-up message first to change AIUI's internal state;
// voice input is only accepted in the awake state
if (AIUIConstant.STATE_WORKING != mAIUIState) {
	AIUIMessage wakeupMsg = new AIUIMessage(AIUIConstant.CMD_WAKEUP, 0, 0, "", null);
	mAIUIAgent.sendMessage(wakeupMsg);
}

// Open AIUI's internal recorder and start recording
String params = "sample_rate=16000,data_type=audio";
AIUIMessage writeMsg = new AIUIMessage(AIUIConstant.CMD_START_RECORD, 0, 0, params, null);
mAIUIAgent.sendMessage(writeMsg);

If error 20006 occurs, check whether the application has been granted the recording permission. For the returned semantic result, refer to the semantic result description document.
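The params value passed with CMD_START_RECORD ("sample_rate=16000,data_type=audio") is a comma-separated list of key=value pairs. A small plain-Java helper for building such strings (the AiuiParams class and buildParams method are illustrative names of my own, not SDK API):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.StringJoiner;

public class AiuiParams {
    // Join key=value pairs with commas, preserving insertion order
    static String buildParams(Map<String, String> kv) {
        StringJoiner joiner = new StringJoiner(",");
        for (Map.Entry<String, String> e : kv.entrySet()) {
            joiner.add(e.getKey() + "=" + e.getValue());
        }
        return joiner.toString();
    }

    public static void main(String[] args) {
        Map<String, String> kv = new LinkedHashMap<>();
        kv.put("sample_rate", "16000");
        kv.put("data_type", "audio");
        System.out.println(buildParams(kv));  // sample_rate=16000,data_type=audio
    }
}
```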


#Result analysis
In the AIUIListener callback, you can receive various kinds of messages from AIUI. A specific example is as follows:


private AIUIListener mAIUIListener = new AIUIListener() {

	@Override
	public void onEvent(AIUIEvent event) {
		switch (event.eventType) {
			case AIUIConstant.EVENT_WAKEUP:
				// Wake-up event
				Log.i(TAG, "on event: " + event.eventType);
			break;

			case AIUIConstant.EVENT_RESULT: {
				// Result event: parse the result
				try {
					JSONObject bizParamJson = new JSONObject(event.info);
					JSONObject data = bizParamJson.getJSONArray("data").getJSONObject(0);
					JSONObject params = data.getJSONObject("params");
					JSONObject content = data.getJSONArray("content").getJSONObject(0);

					if (content.has("cnt_id")) {
						String cnt_id = content.getString("cnt_id");
						JSONObject cntJson = new JSONObject(new String(event.data.getByteArray(cnt_id), "utf-8"));
						String sub = params.optString("sub");
						if ("nlp".equals(sub)) {
							// Extract the semantic result
							String resultStr = cntJson.optString("intent");
							Log.i(TAG, resultStr);
						}
					}
				} catch (Throwable e) {
					e.printStackTrace();
				}
			} break;

			case AIUIConstant.EVENT_ERROR: {
				// Error event
				Log.i(TAG, "on event: " + event.eventType);
				Log.e(TAG, "error: " + event.arg1 + "\n" + event.info);
			} break;

			case AIUIConstant.EVENT_VAD: {
				if (AIUIConstant.VAD_BOS == event.arg1) {
					// Beginning of speech detected
				} else if (AIUIConstant.VAD_EOS == event.arg1) {
					// End of speech detected
				}
			} break;

			case AIUIConstant.EVENT_START_RECORD: {
				Log.i(TAG, "on event: " + event.eventType);
				// Recording started
			} break;

			case AIUIConstant.EVENT_STOP_RECORD: {
				Log.i(TAG, "on event: " + event.eventType);
				// Recording stopped
			} break;

			case AIUIConstant.EVENT_STATE: {
				// State event
				mAIUIState = event.arg1;
				if (AIUIConstant.STATE_IDLE == mAIUIState) {
					// Idle state: AIUI is not started
				} else if (AIUIConstant.STATE_READY == mAIUIState) {
					// AIUI is ready, waiting to be woken up
				} else if (AIUIConstant.STATE_WORKING == mAIUIState) {
					// AIUI is working and can interact
				}
			} break;

			default:
				break;
		}
	}
};
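In the "nlp" branch above, resultStr holds the raw intent JSON; for a weather query it contains fields such as the service name and the recognized text. The sample JSON below and the regex-based field lookup are simplified illustrations only (the IntentFieldDemo class and its field names are hypothetical); production code should use a real JSON parser such as org.json, as the callback does:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class IntentFieldDemo {
    // Naive lookup of a top-level string field in a flat JSON snippet.
    // Illustration only: real code should parse with org.json as in the callback above.
    static String stringField(String json, String key) {
        Matcher m = Pattern
                .compile("\"" + Pattern.quote(key) + "\"\\s*:\\s*\"([^\"]*)\"")
                .matcher(json);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        // Hypothetical shape of an intent result for a weather query
        String intent = "{\"rc\":0,\"service\":\"weather\",\"text\":\"What is the weather tomorrow\"}";
        System.out.println(stringField(intent, "service"));  // weather
    }
}
```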


Origin blog.csdn.net/weixin_47542175/article/details/113757887