An augmented reality solution based on mobile terminals (implemented)

My main research topic during my postgraduate studies was augmented reality based on mobile terminals. The system is built on Android using OpenGL, OpenCV, and Android NDK programming, with Java and C++ as the main programming languages. Here is a brief introduction to how the system was implemented, for reference only:

The main framework of the system is as follows:

[figure: system framework diagram]

The main modules of the system are as follows:

[figure: system module diagram]

The image processing algorithm, i.e. the registration module, is complex, time-consuming, and inefficient, so this module is implemented in the native layer in C++. The model rendering module is implemented in the Java layer; at present it can only parse models in .obj format, and the model to be drawn is loaded from the SD card on a separate thread. A rough sketch of this split is given below.
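As an illustration of that split (the class names, the library name "registration", and the pose-estimation signature are all assumptions made for this sketch, not the original project's code):

```java
import android.os.Environment;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

// Hypothetical JNI bridge to the native registration module (the C++ part built with the NDK).
class NativeRegistration {
    static {
        System.loadLibrary("registration"); // name of the native library is an assumption
    }

    // Estimates the camera pose (a 4x4 model-view matrix) from one preview frame;
    // returns null when the target is not detected. Implemented in C++.
    public static native float[] estimatePose(byte[] frameData, int width, int height);
}

// Parses the vertex lines of an .obj file from the SD card on a worker thread,
// so loading a large model does not block the UI or the GL thread.
class ObjLoader {
    interface Callback { void onLoaded(List<float[]> vertices); }

    void loadAsync(final String relativePath, final Callback callback) {
        new Thread(new Runnable() {
            @Override
            public void run() {
                List<float[]> vertices = new ArrayList<float[]>();
                File objFile = new File(Environment.getExternalStorageDirectory(), relativePath);
                try {
                    BufferedReader reader = new BufferedReader(new FileReader(objFile));
                    String line;
                    while ((line = reader.readLine()) != null) {
                        if (line.startsWith("v ")) { // geometric vertex line: "v x y z"
                            String[] parts = line.trim().split("\\s+");
                            vertices.add(new float[] {
                                    Float.parseFloat(parts[1]),
                                    Float.parseFloat(parts[2]),
                                    Float.parseFloat(parts[3]) });
                        }
                    }
                    reader.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                callback.onLoaded(vertices); // hand the geometry to the OpenGL renderer
            }
        }).start();
    }
}
```

Only the parsed geometry and the estimated pose cross the JNI boundary; the OpenCV-based registration work stays entirely in C++.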

In the Vuforia SDK, models are mainly rendered in the native layer, which I found inconvenient for controlling the model, and the model format it uses is a .h file. Anyone who has used Vuforia will have noticed that these .h files are usually quite large. As far as I know, such native-layer models are placed in the project's jni folder and can only be added manually; they cannot be handled as flexibly as models stored on the SD card, which can be loaded online in real time.

Finally, the interaction method. I previously wrote an article on the Vuforia SDK introducing how interaction is implemented there; that approach requires NDK programming. Since the rendering module in this system lives in the Java layer, no NDK programming is needed: ordinary Android touch events are used, and a suitable mapping translates the gesture's sliding changes into the position, size, and rotation of the model drawn with OpenGL. A minimal sketch of such a mapping is given below.
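Something along these lines, for example (a minimal sketch, not the original code: the field names and the drag-to-degrees factor are assumptions, and the renderer is expected to read angleX, angleY, and modelScale when it builds the model matrix each frame):

```java
import android.content.Context;
import android.opengl.GLSurfaceView;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;

// One-finger drag rotates the model; a two-finger pinch scales it.
class ArGlSurfaceView extends GLSurfaceView {
    public volatile float angleX = 0f;      // rotation about the x axis, in degrees
    public volatile float angleY = 0f;      // rotation about the y axis, in degrees
    public volatile float modelScale = 1f;  // uniform scale factor

    private static final float DEGREES_PER_PIXEL = 0.5f; // drag sensitivity (assumption)
    private final ScaleGestureDetector scaleDetector;
    private float lastX, lastY;

    public ArGlSurfaceView(Context context) {
        super(context);
        // The activity is expected to call setRenderer(...) with a renderer
        // that applies angleX/angleY/modelScale in onDrawFrame.
        scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector detector) {
                        modelScale *= detector.getScaleFactor(); // pinch changes model size
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        scaleDetector.onTouchEvent(event);
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                lastY = event.getY();
                break;
            case MotionEvent.ACTION_MOVE:
                if (event.getPointerCount() == 1) {
                    // Map the finger's sliding distance to rotation angles.
                    angleY += (event.getX() - lastX) * DEGREES_PER_PIXEL;
                    angleX += (event.getY() - lastY) * DEGREES_PER_PIXEL;
                    lastX = event.getX();
                    lastY = event.getY();
                }
                break;
        }
        requestRender(); // useful when the view uses RENDERMODE_WHEN_DIRTY
        return true;
    }
}
```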

The final effect is decent. However, since the algorithm has not been optimized, the system's performance and efficiency cannot keep up, and the frame rate is relatively low.

The purpose of writing this article is mainly to give ideas to those who will do this kind of research in the future. Of course, my implementation is far from the best.

Recently I came across the book "Android Application Programming with OpenCV", which introduces how to use OpenCV under Android. When I was doing this research, I fumbled around for a long time and took many detours, so I recommend this book to everyone in the hope that you can learn more easily and take fewer detours. PS: This book is open source! Chapter 5 presents an augmented reality example that does not use the NDK; when I have time to read through that example, I will write it up if there is enough material.
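For reference, this is roughly what the NDK-free setup looks like in the OpenCV 2.4 generation that book targets: the Java API is initialized through the OpenCV Manager app, so no JNI code has to be written by hand (the activity name and the version constant below are assumptions for this sketch):

```java
import android.app.Activity;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;

public class OpenCvActivity extends Activity {
    // Called back once the OpenCV Manager service has loaded the native libraries.
    private final BaseLoaderCallback loaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            if (status == LoaderCallbackInterface.SUCCESS) {
                // Safe to start the camera view and use org.opencv.* classes from here on.
            } else {
                super.onManagerConnected(status);
            }
        }
    };

    @Override
    protected void onResume() {
        super.onResume();
        // Connects to OpenCV Manager and loads the native libraries asynchronously.
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, loaderCallback);
    }
}
```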


ARVR technology exchange group: 129340649

Welcome!

^_^ This team specializes in mobile augmented reality application development and solutions. Please contact us by private message for any cooperation opportunities! ^_^

