[AR.js] Preliminary understanding and use of official examples

Up front

  • Test browsers: Microsoft Edge (PC version 97.0.1072.55) / Firefox (Android)
  • GitHub repository: AR.js
  • AR series articles: here
  • Go version: go1.17.3 windows/amd64
  • Miscellaneous: this article focuses on the official use cases.

About AR.js

  • here
  • AR.js has three tracking implementations: marker-based, image-based (feature points), and location-based; the first two work in much the same way.
  • Advantages: it runs in any browser that supports WebRTC and WebGL, whether on a PC or a mobile device, which is very convenient.

Image-based (feature points)

Marker-based

golang server

  • Looking at the example's source code, we can see that it relies on static assets such as models and textures, so to try it yourself you need to serve them from a static file server; just pick whatever tool chain you are familiar with.
  • Here it is implemented with golang:
    package main
    
    import (
    	"github.com/gin-gonic/gin"
    )
    
    func main() {
    	router := gin.Default()
    	// serve everything under ./public at the site root
    	router.Static("/", "./public")
    
    	// Listen and serve on 0.0.0.0:8080
    	router.Run(":8080")
    }
    
  • html (index.html)
    <script src='js/aframe-master.min.js'></script>
    
    <style>
      .arjs-loader {
        height: 100%;
        width: 100%;
        position: absolute;
        top: 0;
        left: 0;
        background-color: rgba(0, 0, 0, 0.8);
        z-index: 9999;
        display: flex;
        justify-content: center;
        align-items: center;
      }
    
      .arjs-loader div {
        text-align: center;
        font-size: 1.25em;
        color: white;
      }
    </style>
    
    <!-- AR.js NFT build of A-Frame, served locally from js/ -->
    <script src='js/aframe-ar-nft.js'></script>
    
    <body style='margin : 0px; overflow: hidden;'>
       <!-- minimal loader shown until image descriptors are loaded -->
      <div class="arjs-loader">
        <div>Loading, please wait...</div>
      </div>
        <a-scene
            vr-mode-ui='enabled: false;'
            renderer="logarithmicDepthBuffer: true;"
            embedded arjs='trackingMethod: best; sourceType: webcam; debugUIEnabled: false;'>
    
            <!-- the nft marker url points at the descriptor files generated from the trex image (referenced without file extension) -->
            <a-nft
                type='nft' url='./trex/trex-image/trex'
                smooth='true' smoothCount='10' smoothTolerance='0.01' smoothThreshold='5'>
                <a-entity
                    gltf-model='./trex/scene.gltf'
                    scale="5 5 5"
                    position="150 300 -100"
                    >
                </a-entity>
            </a-nft>
    		<a-entity camera></a-entity>
        </a-scene>
    </body>
    
  • directory structure
    the trex folder contents come from here
    public
        ├─index.html
        ├─js
        └─trex
            ├─textures
            └─trex-image
    

Precautions

  • Once the static file server is running, it can also be accessed from other devices on the LAN.
  • The test browser (PC Edge) can be used directly, but mobile browsers restrict webrtc to https pages, so camera access may fail on mobile over plain http.
  • In actual testing, however, Android Firefox can open and use the official online examples.


Origin blog.csdn.net/qq_33446100/article/details/122379022