FPGA VR camera - capture and stitch stereoscopic 360-degree video

[Image: the second-generation FPGA VR camera]

This article introduces the second version of the FPGA VR camera. The first version is shown below:

[Image: the first-generation FPGA VR camera]

First version project page:

https://hackaday.io/project/26974-vr-camera-fpga-stereoscopic-3d-360-camera

This article mainly covers the second version. The second-generation VR camera can capture 4K (3840 × 1920) stereoscopic 360-degree video at 30 fps, stitching and encoding it in real time on the camera itself. All image processing is performed on the FPGA, except for the final H.264 encoding, which is handled by the NVIDIA Jetson TX2.

Hardware composition

  • Terasic DE10-Nano

  • 8 × Aptina AR0330 camera modules with 12 mm lens mounts

  • 2 × quad camera interface PCBs

  • 1 × NVIDIA Jetson TX2

Shown below is the main data flow and hardware connection diagram of the camera.

[Figure: main data flow and hardware connections of the camera]

Here are the features that have been successfully implemented so far:

  • Camera I2C control (a register-write sketch follows this list)

  • Demosaic module (a software sketch of the interpolation follows this list)

  • Camera lens-distortion correction (remap) module

  • Camera interface PCB

  • Grayscale image to unit pixels

  • 3D image stitching/block matching/optical flow

Each camera's grayscale image is divided into three vertical parts: left, center, and right. Each third is a 45-degree slice of the full 360-degree horizontal panorama. Since we have 8 evenly spaced cameras, adjacent cameras are 360/8 = 45 degrees apart. From this, we know that if we arrange cameras A, B, and C as follows:

[Figure: arrangement of cameras A, B, and C]

Then the right third of camera A, the center third of camera B, and the left third of camera C all point in the same direction. If they were all looking at something at infinity, all three slices would be the exact same image. However, we want the output image to show depth. Therefore, we use the left third of camera C and the right third of camera A to provide the right-eye and left-eye images, respectively, when the viewer is facing that direction. When a viewer turns their head to the left, their right eye moves from looking at the left third of camera C to the left third of camera B.

Open-source repositories:

https://github.com/colinpate

https://github.com/colinpate/fpga-vr-remap

CAD model:

https://cad.onshape.com/documents/e230395963de661bfa5c14c7/w/05d75a95b60ee2972e714205/e/a708f4a0d0e872dd361e0a75

[Image: CAD rendering of the camera assembly]

Source: https://blog.csdn.net/Pieces_thinking/article/details/132551078