Some conceptual understanding

1. Sub-pixel understanding

During camera imaging, the captured image is discretized. Because of the limits of the photosensitive element, each pixel on the imaging surface represents only the color of its immediate neighborhood. For example, two sensor elements may be 4.5 µm apart: at the macroscopic scale they appear connected, but microscopically there is still space between them. The positions that lie between two actual physical pixels are called "sub-pixels". Sub-pixels do exist physically; the sensor is simply not fine enough to sample them, so they can only be approximated in software.
   Sub-pixels can be represented as in the figure below: each rectangular area enclosed by four red dots is a pixel on the actual sensor, and the black dots are sub-pixels:
[Figure: pixel grid — red dots mark physical pixels, black dots mark sub-pixels]

The precision of the sub-pixels is set by how finely the interval between two adjacent pixels is subdivided. Quarter-pixel precision, for example, treats each pixel as four positions both horizontally and vertically, i.e. three black dots between consecutive red dots in the figure above. Sub-pixel interpolation can thus map a small rectangle onto a larger one, increasing the effective resolution.
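The quarter-pixel idea above can be sketched with plain linear interpolation along one axis. This is a minimal illustration, not taken from the original post; the function name and values are hypothetical.

```python
def subpixel_values(p0, p1, precision=4):
    """Linearly interpolate the (precision - 1) sub-pixel values that
    lie between two adjacent physical pixel values p0 and p1.
    precision=4 gives quarter-pixel accuracy: three sub-pixels per gap."""
    return [p0 + (p1 - p0) * k / precision for k in range(1, precision)]

# Two adjacent pixels with intensities 100 and 140:
print(subpixel_values(100, 140))  # [110.0, 120.0, 130.0]
```

The same interpolation applied along both axes turns each physical pixel into a 4x4 block of sub-pixel positions, which is the "small rectangle to large rectangle" mapping described above.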
   Reposted from: https://blog.csdn.net/moonlightpeng/article/details/97691006
  
2. Theory of Bilinear Interpolation
Bilinear interpolation extends linear interpolation to a function of two variables: linear interpolation is carried out once along each direction (that is, two linear interpolations in the X direction to obtain R1 and R2, then one linear interpolation in the Y direction to obtain P).
As shown in the figure below, first interpolate linearly between Q11 and Q21 to obtain the value R1, find R2 the same way between Q12 and Q22, and then interpolate between R1 and R2 to obtain the new pixel value P.
[Figure: bilinear interpolation — R1 and R2 interpolated along X from Q11/Q21 and Q12/Q22, then P interpolated along Y]

Source: https://zhuanlan.zhihu.com/p/562277081
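The two-step procedure described above (R1 and R2 along X, then P along Y) can be written out directly. This is a generic sketch of the textbook formula, with hypothetical names; corner values follow the Qxy convention in the figure.

```python
def bilinear(x, y, x1, y1, x2, y2, q11, q21, q12, q22):
    """Bilinear interpolation at (x, y) inside the rectangle with
    corners (x1, y1) and (x2, y2).
    q11 = f(x1, y1), q21 = f(x2, y1), q12 = f(x1, y2), q22 = f(x2, y2)."""
    # Two linear interpolations along X:
    r1 = q11 + (q21 - q11) * (x - x1) / (x2 - x1)  # value at (x, y1)
    r2 = q12 + (q22 - q12) * (x - x1) / (x2 - x1)  # value at (x, y2)
    # One linear interpolation along Y between R1 and R2:
    return r1 + (r2 - r1) * (y - y1) / (y2 - y1)

# Center of a unit square with corner values 0, 10, 20, 30:
print(bilinear(0.5, 0.5, 0, 0, 1, 1, 0, 10, 20, 30))  # 15.0
```

At the center the result is simply the average of the four corner values, which is a quick sanity check on any bilinear implementation.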

  1. CT imaging principle
    reference: https://zhuanlan.zhihu.com/p/102262776

  2. The difference between TOF depth camera and lidar
    https://zhuanlan.zhihu.com/p/494591696

  3. Gray code decoding
    global/local gray-threshold method, multi-image threshold method
    https://blog.csdn.net/qq_29462849/article/details/118160850
    https://zhuanlan.zhihu.com/p/113664502
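Once thresholding (global/local, or multi-image) has turned each projected pattern into one bit per pixel, decoding reduces to the standard Gray-to-binary conversion. A minimal sketch of that step, with a hypothetical function name; the thresholding itself is not shown.

```python
def gray_to_binary(bits):
    """Decode a Gray-code bit sequence (MSB first) into its integer value.
    Standard rule: b[0] = g[0]; b[i] = b[i-1] XOR g[i]."""
    value = 0
    b = 0
    for g in bits:
        b ^= g                    # running XOR recovers the binary bit
        value = (value << 1) | b  # accumulate MSB-first
    return value

# Gray code 1101 decodes to binary 1001, i.e. stripe index 9:
print(gray_to_binary([1, 1, 0, 1]))  # 9
```

In structured-light decoding this integer is the stripe (column) index for the pixel, later refined to sub-pixel accuracy at the stripe edges.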

  4. Addition, subtraction, multiplication and division of the four arithmetic operations of images
    https://blog.csdn.net/weixin_42704093/article/details/123594920
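The key point of image arithmetic is saturation: library routines such as OpenCV's cv2.add clip sums above 255 instead of wrapping modulo 256 as raw 8-bit arithmetic does. A pure-Python sketch of that behavior, with hypothetical names and sample values:

```python
def saturating_add(a, b):
    """Per-pixel 8-bit image addition with saturation: sums above 255
    clip to 255 rather than wrapping modulo 256."""
    return [[min(x + y, 255) for x, y in zip(row_a, row_b)]
            for row_a, row_b in zip(a, b)]

a = [[200, 50]]
b = [[100, 100]]
print(saturating_add(a, b))  # [[255, 150]]
# Plain modular 8-bit addition would instead give (200 + 100) % 256 = 44
# for the first pixel, producing a visible wrap-around artifact.
```

Subtraction clips at 0 the same way; multiplication and division are usually done in a wider type and clipped back to the 0..255 range.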


Origin blog.csdn.net/weixin_44934373/article/details/129649852