EasyAR for Unity learning (1)

Trying out AR technology today; the 3D engine is Unity.

I had seen others use Unity to create AR effects before, and it looked very cool and sophisticated.

Once you see how it's done, though, it's actually very simple: with EasyAR you can achieve AR effects easily.

————————————————————————————

First, get your tools ready.

As of this writing, the latest version of Unity is 5.6.2, released on July 21, 2017.

EasyAR_SDK_2.0.0 was released on May 30, 2017.

Go to the EasyAR website http://www.easyar.cn/, open the download page, and download EasyAR_SDK_2.0.0_Basic_Samples_Unity. Note that this is the Unity-specific package.

While the download is running, register a user account on the website. After registration, click "Develop" to enter your personal application list and create an application; the name can be anything.


After that, there is a "View Key" entry in the operation column; this key will be needed shortly (you must use this key).

————————————————————————————

By now the download should be finished. Unzip it, open Unity, and import the first sample.

It is recommended to take a look at the Getting Started guide, which you can reach via "Support" on the website.

(http://www.easyar.cn/doc_sdk/cn/Getting-Started/Compile-and-Run-EasyAR-Unity-Samples.html)

Enter the scene and select the EasyAR_Startup object.


 Fill in your key here, otherwise it won't work.


Now it can be run; the program opens the computer's camera.

Now, even though we have no idea yet how this example works, we did notice a few things in the scene earlier: there are objects textured with pictures, each with a model sitting on top.

It's natural to wonder: if one of those same pictures appears in front of the camera, will the program recognize it and draw the model on top?

Let's try it: copy that picture from the project onto a mobile phone (anyone who can use Unity knows how to find where the picture file lives), then open the picture on the phone and hold it up to the camera...


The effect appears. Amazing, isn't it? Of course, I'm not surprised at all.

————————————————————————————

The sample works, so now let's dig into it ourselves.

First question: how does it recognize the picture?

The first copy of the picture you can find is in the Textures folder, so let's try swapping it out. I replaced that picture with a photo of my own campus card; when it runs, the campus card is not recognized, yet the original picture still gets recognized and shows the same effect.

So the picture used for recognition is not the one in the Textures folder. Where is it, then?

————————

In the StreamingAssets folder

I learned this from someone else; if I hadn't known, I could have searched the project folder for image files and discovered that this folder contains pictures too.

————————

Now try replacing the image file in that folder: swap in your own image under the original file name.

Run it and hold the campus card up to the camera.

OK, now our image is recognized!


So how do we add more recognizable pictures?

————————————————————————————

Besides putting the pictures to be recognized into the folder just mentioned, some configuration is also required.

The targets.json file in that folder is what gets used; for details, see "EasyAR Target Configuration" in the "User Manual" under "Support". (It is also possible to skip the configuration file, as we will see below, but that is less convenient to manage.)
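For reference, an entry in targets.json looks roughly like the sketch below. This is only a sketch pieced together from the sample and the manual page mentioned above; the image file name and target name are placeholders for whatever image you put into StreamingAssets:

    {
      "images": [
        {
          "image": "campuscard.jpg",
          "name": "campuscard"
        }
      ]
    }

The "name" value is what the scripts in the scene will refer to later, so keep track of what you write here.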

After configuration, how do we specify which model corresponds to which recognized image?

Let's look at the three instances in the Unity scene. They all carry the same script, but with different parameters.


The field names are easy enough to read: Path and Name. In the screenshot above, Path points to the json file mentioned earlier, and Name is the name given to the image inside that json.
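Put together, the inspector values of such a sample instance look roughly like this. It is only a sketch: Path, Name, Storage and Loader are the fields visible in the screenshots, but the concrete values below are placeholders.

    Path:    targets.json     (the configuration file in StreamingAssets)
    Name:    campuscard       (must match a "name" entry inside targets.json)
    Storage: Assets
    Loader:  (the tracker object in the scene)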

In addition:


You can see that in this script the path points directly to the picture rather than to the configuration file, which is why the configuration can be skipped.
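So the variant without a configuration file would look something like this instead, again only a sketch with placeholder values:

    Path:    campuscard.jpg   (points straight at the image file, no json involved)
    Storage: Assets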

————————————————————————————

So now the pictures we want to add are in place and the configuration has been updated.

Looking at the sample objects, we find that the script sits on the parent object, the child object is the model to be displayed, and the parent plane lines up with the plane of the picture once it is recognized. In fact, the parent plane does not have to be rendered: if its material is set to empty, only the model is rendered and the plane itself never shows up.
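So the hierarchy of each sample target is roughly the following; the object names here are just illustrative:

    ImageTarget (plane)   <- parent: carries the target script and lines up with the recognized picture
    └── Model             <- child: the thing that actually gets displayed on top of the picture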

Now we create a new object (an empty object is fine too; once the script is attached, it takes on the role of the parent plane), add the same script as in the sample, set the script parameters, and then add a new model as a child object.

Try it.

The effect does not appear?!

Well, I don't know why, but let's keep trying.

This time we copy one of the sample objects instead, modify the parameters in the same way, and swap out the model. The name of the parent object can also be changed freely, and I set the parent's material to null.

Try it again...

...and this one works.

Compare the newly created object with the copied object:

The one I created myself:

The copied-and-modified one:

The difference is just that one is a prefab instance and the other is not...?

Pay attention to the Storage field. I didn't look at it carefully at first and wasted a lot of time because of it. Set it to Assets and try again: success.

In addition, if you drag the prefab into the scene directly to instantiate it (how do you find the prefab from an instance? Click Select next to the word Prefab shown in the picture above), its Loader field is empty; remember to set it.
 ——————————————————————————————

Alright, now we've figured out how to add the pictures we want to recognize and the models to display.

Note that it does not recognize both images at the same time (or maybe it does, but only one model is shown).


From here you can tweak things yourself to get better effects, such as switching to a nicer model or adding model animations.

In the picture above, the image could not be recognized directly from my viewing angle; I first got it recognized at a suitable position and then slowly rotated to this angle.
 ——————————————————————————————

Today I mainly went through the official first sample, HelloAR, and made some small modifications.

I have not yet looked into the specific implementation or further applications in depth.
