Communication Principles in Image Processing - Gonzalez's Reading Notes (1)

The concept of information is well defined in information theory: it measures the uncertainty of events. Signals are the carriers of information, such as electrical signals, optical signals, and sound signals, while messages are its form, such as language, text, or images. Image processing therefore still belongs to the field of information and communication: an image can be regarded as encoded information, that is, an encoding of the source. This article documents some of the communication concepts that appear in image processing.

First, as far as imaging is concerned, the most important source of image energy is the electromagnetic spectrum (other major sources include sound waves, ultrasound, and electrons, in the form of the electron beams used in electron microscopes). Electromagnetic waves can be described either as sinusoidal waves propagating at various wavelengths or as a stream of massless particles, each traveling in a wavelike pattern at the speed of light. Each massless particle carries a certain amount of energy, and each bundle of energy is called a photon. Grouping the spectral bands by photon energy yields a spectrum ranging from gamma rays (the highest energy, and the most harmful to living tissue) down to radio waves (the lowest energy), with wavelengths running from short to long and no sharp boundaries between bands, only smooth transitions.

Typical applications by band: gamma-ray imaging is used mainly in nuclear medicine and astronomical observation. X-rays are used in medical diagnosis (angiography, CT, chest X-rays), and higher-energy X-rays detect manufacturing defects in circuit boards. Ultraviolet light is used in fluorescence microscopy (UV light striking certain minerals makes them fluoresce). The infrared band is commonly combined with visible-light imaging and used in remote sensing and satellite multispectral imaging. Microwaves penetrate well; the typical application is radar. Radio waves are used in medicine and astronomy: in magnetic resonance imaging (MRI), the patient is placed in a strong magnetic field while short radio pulses are passed through the body, and the patient's tissue produces radio response pulses in return.

In the electromagnetic spectrum, arranged from the longest wavelength to the shortest, the bands are: radio waves, microwaves, infrared, visible light, ultraviolet, X-rays, and gamma rays. Radio waves occupy the frequency range from 3 Hz to 3000 GHz. They are generated by the alternating current of an oscillating circuit and can be emitted and absorbed by an antenna, hence the name. Microwaves are in fact a kind of radio wave, namely radio waves with relatively short wavelengths. Radio waves are divided by wavelength into long waves, medium waves, and short waves; anything with a wavelength shorter than short waves is a microwave. Microwaves generally use line-of-sight (straight-line) transmission and can be used to carry TV signals; the transmission distance of such microwave TV links is relatively short, generally in the range of tens to one or two hundred kilometers.

The lower the frequency, the smaller the propagation loss, the longer the coverage distance, and the stronger the diffraction ability. However, low-band frequency resources are scarce and system capacity is limited, so low-band radio waves are mainly used in radio broadcasting, TV, paging, and similar systems. High-band frequency resources are abundant and system capacity is large, but the higher the frequency, the greater the propagation loss, the shorter the coverage distance, and the weaker the diffraction ability. In addition, higher frequencies bring greater technical difficulty and a corresponding increase in system cost. Coverage and capacity must therefore be weighed together when selecting the frequency band for a mobile communication system. The UHF band offers a better compromise between coverage and capacity than other bands, so it is widely used for mobile communication with phones and other terminals. Of course, as demand keeps growing and required capacity keeps increasing, mobile communication systems are bound to develop toward higher frequency bands.
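The trade-off above can be made concrete with the standard free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 32.44 (d in km, f in MHz). This is a minimal sketch, not from the book; the distances and frequencies chosen are illustrative:

```python
import math

def free_space_path_loss_db(distance_km, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Loss grows with frequency: the same 10 km link at 100 MHz (broadcast band)
# vs 2600 MHz (a UHF band used by mobile systems).
loss_low = free_space_path_loss_db(10, 100)
loss_high = free_space_path_loss_db(10, 2600)
print(f"100 MHz: {loss_low:.1f} dB, 2600 MHz: {loss_high:.1f} dB")
```

Doubling the frequency adds about 6 dB of loss, which is why higher bands cover shorter distances for the same transmit power.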

Communication signals are divided into analog and digital. Things in nature, including the image formed by the human eye, are of course continuous, and although sensors can give us continuous voltage waveforms, for the convenience of computer processing (image processing is, to a large extent, computation) we must sample and quantize, just as in communication. For a two-dimensional image, the x, y coordinates and the amplitude are all continuous: digitizing the coordinate values is called sampling, and digitizing the amplitude values is called quantization. In fact, the essential difference between a discrete digital signal and a continuous signal is whether the amplitude is continuous.

The digitization of coordinates and amplitudes directly affects the spatial and gray-level resolution of the image. I say "affects" rather than "determines" because of a remark in "Digital Image Processing": spatial resolution must be stated with respect to a unit of space, so if the spatial dimensions of the image are not specified, saying that an image has a resolution of 1024x1024 pixels is meaningless. My understanding is that the pixel count alone is too absolute; spatial resolution, in units of dpi/ppi, is what has reference value when printing or judging a display. Qualitatively, though, a larger pixel count allows printing at a larger size without visible distortion.

The first step of feature-point detection is to build an image scale pyramid. The pyramid consists of images at different resolutions, generated by downsampling or box filtering. When enlarging, shrinking, or geometrically correcting an image, pixel values must be filled in at certain positions in order to obtain the new image (still a digital image); this is interpolation, that is, estimating values at unknown positions from the known data (the original image).
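The two digitization steps can be sketched in a few lines of NumPy. This is a minimal illustration on a synthetic gradient image (hypothetical data), not code from the book:

```python
import numpy as np

# A synthetic 8-bit "image": a smooth horizontal gradient (hypothetical data).
img = np.tile(np.arange(256, dtype=np.uint8), (256, 1))

# Quantization: digitize the amplitude -- reduce 256 gray levels to 4.
levels = 4
step = 256 // levels
quantized = (img // step) * step

# Sampling: digitize the coordinates -- keep every 4th pixel in each direction,
# which is also how a coarser level of an image pyramid can be built.
downsampled = img[::4, ::4]

print(np.unique(quantized).size, downsampled.shape)  # 4 (64, 64)
```

Coarser quantization produces visible banding in smooth regions, while coarser sampling loses fine spatial detail; the two are independent choices.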
Correspondingly, the reconstruction of a signal in communication is also an interpolation process.
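As a sketch of the idea, linear interpolation estimates values at fractional positions from known samples; the same principle (extended to two dimensions as bilinear interpolation) fills in pixels when resizing an image. The sample values below are hypothetical:

```python
import numpy as np

# Known samples of a signal at integer positions (hypothetical values).
xs = np.arange(5)
ys = np.array([0.0, 1.0, 0.0, -1.0, 0.0])

# Estimate values at unknown (fractional) positions by linear interpolation,
# the same idea used to fill in pixels when an image is resized.
query = np.array([0.5, 1.5, 2.5])
estimates = np.interp(query, xs, ys)
print(estimates)  # [ 0.5  0.5 -0.5]
```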

Filtering, in communication, is an operation in the frequency domain, divided into low-pass, band-pass, and high-pass. The same is true in image processing, where the frequency components are even more intuitive: the low-frequency components of an image are the smooth, slowly varying gray parts, representing the overall regions and large-scale characteristics of the image, while the middle and high frequencies contain contours, edges, and part of the noise. You can refer to the pictures at https://blog.csdn.net/u010757264/article/details/49869145:
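A frequency-domain low-pass filter can be sketched directly with NumPy's FFT. The image below is synthetic (a gradient plus noise, my own assumption for illustration), and the cutoff radius is an arbitrary choice:

```python
import numpy as np

# Hypothetical grayscale image: a smooth gradient plus high-frequency noise.
rng = np.random.default_rng(0)
img = np.tile(np.linspace(0, 255, 128), (128, 1)) + rng.normal(0, 20, (128, 128))

# Low-pass filtering in the frequency domain: keep only frequencies near the
# (shifted) center of the spectrum, zero out the rest.
F = np.fft.fftshift(np.fft.fft2(img))
rows, cols = img.shape
u = np.arange(rows) - rows // 2
v = np.arange(cols) - cols // 2
dist = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
mask = dist <= 16  # cutoff radius (arbitrary)
smoothed = np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

# The noise (high-frequency content) is attenuated, so variance drops.
print(img.var() > smoothed.var())
```

Replacing `mask` with its complement (`dist > 16`) turns this into a high-pass filter that keeps edges and noise instead of the smooth regions.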


Filtering is easier to understand in the frequency domain after a Fourier transform. For an intuition of the Fourier transform itself, see the famous tutorial by Heinrich ("if you still don't understand the Fourier transform after reading this, come and strangle me"). For an image, the signal is discrete and two-dimensional, and its Fourier transform is separable: it can be computed as two one-dimensional Fourier transforms, a 1D transform along each row scan line followed by a 1D transform along each column scan line.
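The separability claim is easy to verify numerically. This sketch uses a random 8x8 array as a stand-in for an image:

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.normal(size=(8, 8))  # hypothetical 8x8 "image"

# 2D DFT computed directly...
F2 = np.fft.fft2(f)

# ...equals a 1D DFT along every row followed by a 1D DFT along every column.
F_sep = np.fft.fft(np.fft.fft(f, axis=1), axis=0)

print(np.allclose(F2, F_sep))  # True
```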


The 2D discrete Fourier transform is F(u,v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x,y) e^(-j2π(ux/M + vy/N)). From the formula it can be seen that F(u,v) and f(x,y) are not in one-to-one correspondence: a given F(u,v) does not correspond to a single f(x,y), but to the sum over all (x,y) of the products of f(x,y) and e^(-j2π(ux/M + vy/N)).
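A brute-force evaluation of that double sum makes the "each F(u,v) depends on every f(x,y)" point explicit, and can be checked against NumPy's FFT. A minimal sketch:

```python
import numpy as np

def dft2_naive(f):
    """Direct evaluation of F(u,v) = sum_x sum_y f(x,y) * exp(-j*2*pi*(u*x/M + v*y/N))."""
    M, N = f.shape
    x = np.arange(M)[:, None]
    y = np.arange(N)[None, :]
    F = np.zeros((M, N), dtype=complex)
    for u in range(M):
        for v in range(N):
            # Every output coefficient sums over ALL input pixels.
            F[u, v] = np.sum(f * np.exp(-2j * np.pi * (u * x / M + v * y / N)))
    return F

f = np.arange(16, dtype=float).reshape(4, 4)
print(np.allclose(dft2_naive(f), np.fft.fft2(f)))  # True
```

The naive version is O(M²N²); the FFT gets the same result in O(MN·log(MN)), which is why it is used in practice.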

The Fourier transform decomposes a time-domain signal into a sum of sine (or cosine) functions of different frequencies, where the amplitude at each frequency represents the strength of that frequency component.
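This can be seen directly on a one-dimensional example: build a signal from two sinusoids and read their amplitudes back off the magnitude spectrum. The frequencies and amplitudes below are arbitrary choices for illustration:

```python
import numpy as np

# A signal that is the sum of two sinusoids (hypothetical frequencies/amplitudes).
N = 64
n = np.arange(N)
signal = 3.0 * np.sin(2 * np.pi * 5 * n / N) + 1.0 * np.sin(2 * np.pi * 12 * n / N)

# The magnitude spectrum recovers the strength of each frequency component;
# for a bin-aligned sinusoid of amplitude A, |F[k]| = A * N/2.
mag = np.abs(np.fft.fft(signal)) / (N / 2)
print(round(mag[5], 3), round(mag[12], 3))  # 3.0 1.0
```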

The origin of the Fourier spectrum, the component at frequency 0, is the DC component, which represents the average gray value of the original image. A bright center in the spectrum indicates a high mean gray level, that is, an image that looks brighter overall.
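The DC-component claim follows from the DFT formula: at u = v = 0 the exponential is 1, so F(0,0) is the sum of all pixels, and dividing by the pixel count gives the mean gray level. A quick check on a random image:

```python
import numpy as np

rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(32, 32)).astype(float)  # hypothetical image

F = np.fft.fft2(img)
# F(0,0) sums f(x,y) over all pixels, so dividing by the pixel
# count gives the mean gray value of the image.
dc_mean = F[0, 0].real / img.size
print(np.isclose(dc_mean, img.mean()))  # True
```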


Below are illustrations of an image Fourier transform from Bole Online, and of Butterworth low-pass and high-pass filters:
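The Butterworth low-pass transfer function is H(u,v) = 1 / (1 + (D(u,v)/D0)^(2n)), where D is the distance from the center of the (shifted) spectrum, D0 the cutoff, and n the order; the high-pass version is 1 - H. A minimal sketch (shape, cutoff, and order are arbitrary choices):

```python
import numpy as np

def butterworth_lowpass(shape, cutoff, order=2):
    """Butterworth low-pass H(u,v) = 1 / (1 + (D/D0)^(2n)),
    centered for use with an fftshift-ed spectrum."""
    rows, cols = shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
    return 1.0 / (1.0 + (D / cutoff) ** (2 * order))

H_low = butterworth_lowpass((128, 128), cutoff=30)
H_high = 1.0 - H_low  # the corresponding high-pass filter

# At the spectrum center (D=0) the low-pass passes fully, the high-pass blocks;
# at D = cutoff, H drops to exactly 0.5.
print(H_low[64, 64], H_high[64, 64])  # 1.0 0.0
```

Unlike an ideal (hard-cutoff) filter, the Butterworth transfer function rolls off smoothly, which reduces the ringing artifacts an ideal filter produces in the spatial domain.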


In PyCharm I found that the cv2 library failed to install; the versions apparently do not match. Without experiments one has no right to speak, so I will discuss filtering in the next article. In Gonzalez's book, chapters three and four cover spatial filtering and frequency-domain filtering, including convolution; there is a lot of depth there, and I will write it up after some study.

