Integrating real-time surveillance video with SuperMap GIS (Part 2: real-time video projection)

1. How does real-time video projection work?

The RTSP video stream is pushed to the front end over WebSocket and played in an HTML Video element. Based on the installation position of the observer (i.e., the position of the monitoring device) and related parameters, a view frustum with a fixed direction and range is formed. The part of the cached 3D model in the scene that this frustum intersects is the area onto which the video or image is projected. (As shown below.)
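The frustum's orientation follows from the azimuth and pitch parameters. As a minimal sketch of the geometry only (the actual frustum construction happens inside the WebGL client), the view direction can be expressed as a unit vector in a local east-north-up frame:

```javascript
// Convert the projector's azimuth (clockwise from north) and pitch
// (upward positive) into a unit direction vector in a local
// east-north-up (ENU) frame. Illustrative only -- not SuperMap's
// internal implementation.
function viewDirection(directionDeg, pitchDeg) {
  const toRad = Math.PI / 180;
  const az = directionDeg * toRad;
  const pitch = pitchDeg * toRad;
  return {
    east:  Math.sin(az) * Math.cos(pitch),
    north: Math.cos(az) * Math.cos(pitch),
    up:    Math.sin(pitch),
  };
}

// Example: azimuth 90 degrees (due east), pitch 0 -> points east.
const d = viewDirection(90, 0);
```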

 

The corresponding WebGL interface is ProjectionImage; its relevant parameters are as follows:

- direction: Number. Gets or sets the azimuth angle of the video projector, i.e., the clockwise angle from north. Range: 0° to 360°.

- distance: Number. Gets or sets the projection distance of the projector from the observation point.

- hintLineColor: Color. Gets or sets the color of the projection hint lines.

- hintLineVisible: Boolean. Gets or sets the visibility of the projection hint lines.

- horizontalFov: Number. Gets or sets the horizontal field-of-view angle of the projector. Unit: degrees.

- pitch: Number. Gets or sets the pitch angle of the projector, i.e., the angle between the camera direction and the horizontal plane; upward is positive. Unit: degrees.

- verticalFov: Number. Gets or sets the vertical field-of-view angle of the projector. Unit: degrees.

- viewPosition: Array. Gets or sets the position of the projector's observation point, given as an array of longitude, latitude, and elevation.
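Taken together, a projection configuration might look like the following. The field names mirror the parameter list above, but the plain object and the range check are illustrative sketches, not actual SuperMap API calls:

```javascript
// Illustrative projection parameters, mirroring the ProjectionImage
// fields listed above. This is a plain object for demonstration,
// not a constructed ProjectionImage instance.
const projectionParams = {
  direction: 45,                     // azimuth, clockwise from north, 0-360 deg
  pitch: -10,                        // pitch angle, upward positive, deg
  horizontalFov: 60,                 // horizontal field of view, deg
  verticalFov: 40,                   // vertical field of view, deg
  distance: 200,                     // projection distance, meters
  viewPosition: [116.45, 39.91, 80], // [longitude, latitude, elevation]
  hintLineVisible: true,             // show the frustum hint lines
};

// Basic sanity check of the value ranges described above.
function validateProjectionParams(p) {
  return p.direction >= 0 && p.direction <= 360 &&
         p.horizontalFov > 0 && p.horizontalFov < 180 &&
         p.verticalFov > 0 && p.verticalFov < 180 &&
         p.distance > 0 &&
         Array.isArray(p.viewPosition) && p.viewPosition.length === 3;
}
```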

 

2. How are the parameters for real-time video projection obtained?

1. Prepare the video to be projected offline (currently iDesktop only supports the AVI format).

2. Open the 3D scene of the target area in iDesktop; the CBD sample data is used as an example here:

 

3. Choose Video Projection from the 3D Analysis menu.

 

4. Navigate the scene to the area where you want the video to play, and add the observation point.

 

5. A view frustum is generated automatically, as shown below:

 

6. In the toolbar on the right, set the video file to be projected; the 3D scene updates automatically.

 

7. Use the parameter and projection-information panels in the right-hand toolbar to dynamically adjust the projected video's effect.

  

8. Once the effect is satisfactory, the current projection information can be saved, exporting the relevant parameters into a 3D point dataset for storage.

 

 

9. The saved 3D point dataset stores the parameter values corresponding to the WebGL projection parameters. Place the dataset in a datasource, save the workspace, and publish the data as a data service. The WebGL front end then obtains the relevant information by querying this data service.
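On the WebGL side, the attributes queried from the data service can be mapped back onto projection parameters. A sketch, assuming hypothetical attribute field names (`DIRECTION`, `PITCH`, etc.) and a feature record with parallel `fieldNames`/`fieldValues` arrays; the real field names depend on how the dataset was exported:

```javascript
// Map one feature returned by a data-service query onto
// ProjectionImage-style parameters. The attribute field names below
// are hypothetical -- use whatever names the exported 3D point
// dataset actually contains.
function featureToProjectionOptions(feature) {
  const names = feature.fieldNames;
  const values = feature.fieldValues; // assumed parallel to fieldNames
  const get = (name) => Number(values[names.indexOf(name)]);
  return {
    direction: get('DIRECTION'),
    pitch: get('PITCH'),
    horizontalFov: get('HFOV'),
    verticalFov: get('VFOV'),
    distance: get('DISTANCE'),
    viewPosition: [get('LON'), get('LAT'), get('ALT')],
  };
}

// Example with a mocked feature record:
const opts = featureToProjectionOptions({
  fieldNames: ['DIRECTION', 'PITCH', 'HFOV', 'VFOV', 'DISTANCE', 'LON', 'LAT', 'ALT'],
  fieldValues: ['45', '-10', '60', '40', '200', '116.45', '39.91', '80'],
});
```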

 

 

 

3. How is the projected video registered with the real scene in the 3D environment?

1. Following the workflow above, you may find that the dark-gray region in the adjusted video is not parallel to the road in the actual scene (the test video and the scene do not correspond; the point is to understand the workflow).

 

2. Check Same-Name Point Registration in the right-hand toolbar. Clicking the button opens a dialog with the projected video on the left and the 3D scene on the right.

 

3. Click the button at the upper left and pick several feature points on the video, as follows:

 

 

 

4. Similarly, pick the corresponding registration points in the 3D scene on the right.

 

 

5. Click OK, and the video in the 3D scene is registered; see the figure below (the points picked here are not very accurate; the goal is to understand the principle).
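The same-name point registration above amounts to estimating a planar homography that maps video pixel coordinates onto scene coordinates. The following is a generic direct-linear-transform estimate from exactly four point pairs (no normalization, no outlier handling), a sketch of the principle rather than iDesktop's actual algorithm:

```javascript
// Estimate the 3x3 homography H (row-major, h33 fixed to 1) that maps
// four source points onto four destination points, by solving the
// 8x8 linear system of the direct linear transform.
function estimateHomography(src, dst) {
  const A = [], b = [];
  for (let i = 0; i < 4; i++) {
    const [x, y] = src[i], [u, v] = dst[i];
    A.push([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.push(u);
    A.push([0, 0, 0, x, y, 1, -v * x, -v * y]); b.push(v);
  }
  // Gaussian elimination with partial pivoting.
  for (let c = 0; c < 8; c++) {
    let p = c;
    for (let r = c + 1; r < 8; r++) {
      if (Math.abs(A[r][c]) > Math.abs(A[p][c])) p = r;
    }
    [A[c], A[p]] = [A[p], A[c]];
    [b[c], b[p]] = [b[p], b[c]];
    for (let r = c + 1; r < 8; r++) {
      const f = A[r][c] / A[c][c];
      for (let k = c; k < 8; k++) A[r][k] -= f * A[c][k];
      b[r] -= f * b[c];
    }
  }
  // Back substitution.
  const h = new Array(8);
  for (let c = 7; c >= 0; c--) {
    let s = b[c];
    for (let k = c + 1; k < 8; k++) s -= A[c][k] * h[k];
    h[c] = s / A[c][c];
  }
  return [...h, 1];
}

// Apply the homography to a 2D point.
function applyHomography(H, [x, y]) {
  const w = H[6] * x + H[7] * y + H[8];
  return [(H[0] * x + H[1] * y + H[2]) / w,
          (H[3] * x + H[4] * y + H[5]) / w];
}
```

In practice more than four pairs with a least-squares or robust fit gives a steadier result, which is why the registration dialog lets you pick several points.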

 

 

6. Finally, output the projection result and save it as a 3D point dataset for the WebGL side to load and display.


Origin www.cnblogs.com/yaohuimo/p/12154574.html