Self-made synthetic aperture radar (2): Comparison with an SDR implementation

I looked up some material today:

https://www.sohu.com/a/286298900_819029

After consulting some literature, I wanted to explore the possibility of reducing the number of expensive analog front-end components required for radar systems. The design is inspired by Gregory L. Charvat's coffee can radar, an amazing radar that can do many interesting things, such as FMCW ranging and SAR imaging. Its only drawback is the Mini-Circuits analog components required for the front end: they are fine in themselves, but they are expensive and often hard to buy, especially outside Europe and the United States. It would be great if we could use software-defined methods to reduce the number of analog devices required, thereby cutting cost and simplifying assembly and testing.

It turns out that the author of the LimeSDR Doppler radar I implemented earlier also referenced this MIT radar and converted it into an SDR-based radar. However, that SDR implementation uses a different signal source: it is not a chirp, so it only does Doppler measurement, not ranging or SAR.

Flow graph (it is too wide to fit in one screenshot, so it is split into two):

 

This flow graph is actually very simple. The two AGC blocks are automatic gain control modules and can be treated here as plain wires. Seen that way, a cosine-wave signal source is fed directly to the LimeSDR and transmitted into the air, and the same cosine wave is also fed, together with the signal the LimeSDR receives from the air, into the Multiply Conjugate block. The Multiply Conjugate block multiplies the two signals together, which is exactly what a mixer does. The result is low-pass filtered (the filter can also be treated as a wire) and then sent to the waterfall display. The waterfall sink performs an FFT internally, so the time-domain signal is displayed as an image after the FFT.
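To make the structure concrete, here is a rough GNU Radio (Python) sketch of the same chain. It is only a sketch under assumptions: the LimeSDR TX/RX blocks and the QT waterfall sink are replaced by a simulated echo (a second signal source with a made-up 500 Hz offset) and a vector sink so it runs without hardware, and all rates, frequencies, and filter settings are illustrative rather than taken from the original flow graph.

```python
# Sketch of the described flow graph, runnable without hardware.
# Assumes GNU Radio 3.8+ with standard in-tree blocks only.
from gnuradio import gr, blocks, analog, filter
from gnuradio.filter import firdes

class SdrDopplerSketch(gr.top_block):
    def __init__(self, samp_rate=1e6, tone=100e3):
        gr.top_block.__init__(self, "SDR Doppler flow graph sketch")
        # Cosine signal source ("red line"); in the real graph this also feeds
        # the LimeSDR TX block -- the software "splitter" is just a second wire.
        tx = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, tone, 1.0, 0)
        # Stand-in for the LimeSDR RX output ("blue line"): the same tone
        # shifted by a made-up 500 Hz Doppler offset.
        rx = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, tone + 500.0, 0.5, 0)
        # Multiply Conjugate block = the software mixer (out = in0 * conj(in1)).
        mixer = blocks.multiply_conjugate_cc()
        # Low-pass filter; in the original flow graph it is nearly a wire.
        lpf = filter.fir_filter_ccf(1, firdes.low_pass(1.0, samp_rate, 20e3, 5e3))
        # In GRC the output would go to a QT GUI Waterfall Sink (FFT inside);
        # here we just collect one second of samples and stop.
        head = blocks.head(gr.sizeof_gr_complex, int(samp_rate))
        sink = blocks.vector_sink_c()
        self.connect(rx, (mixer, 0))
        self.connect(tx, (mixer, 1))
        self.connect(mixer, lpf, head, sink)

if __name__ == "__main__":
    SdrDopplerSketch().run()
```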

Let's compare this flow graph with the block diagram of the radar implemented by hardware.

They are not exactly the same, but quite similar:

1. The hardware radar uses a sawtooth-wave signal source (Modulator1) and a voltage-controlled oscillator (OSC1) to generate the chirp signal, whereas the GNU Radio flow graph does not use a chirp at all; it simply adds a fixed-frequency cosine wave on top of the LimeSDR's LO. The attenuator ATT1 and power amplifier PA1 of the hardware radar are implemented inside the LimeSDR's transmit chain.

2. The splitter SPLTR1 and the mixer MXR1 of the hardware radar are implemented in software in the flow graph. The splitter is simply two wires pulled from the same output port, and the mixer is the Multiply Conjugate block mentioned above. One difference is that in the hardware radar these two parts sit after the VCO, meaning they operate on the high-frequency signal, whereas the corresponding parts of the GNU Radio flow graph operate on the baseband.

3. The LNA1 of the hardware radar is implemented inside the LimeSDR's receive chain. Video Amp1 splits into two functions: the gain part is skipped, and the filtering part corresponds to the Low Pass Filter in the flow graph.

Next, let's analyze why this flow graph can measure Doppler velocity:

If you connect an oscilloscope to the two inputs of the Multiply Conjugate block in the flow graph, you will see something like the following.

The red line is the signal from the cosine-wave signal source, which is clean. The blue line is the reflection from the object as received by the LimeSDR receiver; it still resembles the original cosine wave, but some interference is visible.

The cosine wave passes through the LimeSDR TX module, i.e. it is upconverted by the LO of the LimeSDR transmit chain and radiated towards the target. If the target is assumed to be stationary, the RF signal received after reflection should have exactly the same frequency as the transmitted RF signal (ideally; the amplitude is attenuated and the phase may change because of the path difference, but the frequency is unchanged). Both are at the original cosine source frequency plus the LimeSDR LO frequency (2.45 GHz in this example). The RF signal received from the air is then downconverted by the LO of the LimeSDR receiver, so the signal coming out of the LimeSDR RX module should again be a cosine wave at the original signal-source frequency (the added 2.45 GHz and the subtracted 2.45 GHz cancel out). This signal is the blue curve on the oscilloscope.
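The frequency bookkeeping for the stationary case is plain arithmetic; the sketch below assumes a 100 kHz baseband tone (an illustrative value, not taken from the flow graph) and the 2.45 GHz LO mentioned above.

```python
# Stationary target: the TX and RX LOs cancel, so the baseband tone survives.
f_baseband = 100e3        # assumed cosine signal source frequency (illustrative)
f_lo = 2.45e9             # LimeSDR LO frequency used in this example
f_rf_tx = f_baseband + f_lo    # frequency radiated by the TX antenna
f_rf_rx = f_rf_tx              # stationary target: frequency unchanged on reflection
f_out = f_rf_rx - f_lo         # after downconversion in the LimeSDR receiver
print(f_out == f_baseband)     # True: the RX output is the original 100 kHz tone
```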

If the target now moves relative to our equipment, the frequency of the RF signal it reflects changes (the Doppler shift), while the transmitter and receiver LO frequencies still cancel each other out, so the frequency change ends up on the received baseband signal. In other words, we only need to measure the frequency difference between the red line and the blue line to know how much Doppler shift the target produces, and from that we know its speed.
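For scale, the monostatic radar Doppler formula is f_d = 2·v·f_c/c, where v is the radial velocity and f_c the carrier frequency; at the 2.45 GHz used here that works out to roughly 16 Hz per metre per second. A quick check (the 1 m/s velocity is just an example value):

```python
# Doppler shift for a target moving at radial velocity v, carrier at 2.45 GHz.
c = 3e8                   # speed of light, m/s
f_c = 2.45e9              # carrier frequency (approximately the LimeSDR LO)
v = 1.0                   # example radial velocity, m/s
f_d = 2 * v * f_c / c     # monostatic Doppler shift
print(round(f_d, 1))      # 16.3 Hz per 1 m/s at 2.45 GHz
```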

A relatively simple, brute-force method is to take the FFT of the red and blue lines separately to get their frequencies; subtracting them gives the size of the Doppler shift.

But we can also use another method: multiply the red and blue lines together, i.e. mix them, which can be seen as downconverting the blue signal with the red signal as the reference. The new signal obtained this way is still a cosine wave, and its frequency is the frequency of the blue line minus the frequency of the red line, which is exactly the Doppler shift. So instead of taking an FFT of both signals as before, we only need to find the frequency of the mixer output: either take an FFT of the mixing result ourselves, or send the time-domain mixed signal straight to the waterfall chart, which performs the FFT internally and then displays the result.
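Here is a minimal NumPy sketch of that idea, with made-up numbers: simulate the reference and a Doppler-shifted echo at baseband, mix them with a conjugate multiply, take a single FFT, and read off the shift and the corresponding velocity.

```python
# Multiply Conjugate + single FFT: recover a simulated Doppler shift (illustrative values).
import numpy as np

fs = 100e3                           # sample rate, Hz
t = np.arange(int(fs)) / fs          # one second of samples -> 1 Hz FFT resolution
f0 = 10e3                            # "red line": reference baseband tone
f_d = 163.0                          # simulated Doppler shift, Hz
red = np.exp(2j * np.pi * f0 * t)
blue = 0.5 * np.exp(2j * np.pi * (f0 + f_d) * t)   # "blue line": received tone

mixed = blue * np.conj(red)          # the mixer: only the difference frequency remains
spectrum = np.abs(np.fft.fft(mixed))
freqs = np.fft.fftfreq(len(mixed), 1 / fs)
f_peak = abs(freqs[np.argmax(spectrum)])

c, f_c = 3e8, 2.45e9
v = f_peak * c / (2 * f_c)           # invert f_d = 2*v*f_c/c to get the velocity
print(f"Doppler shift ~{f_peak:.0f} Hz -> ~{v:.1f} m/s")
```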

You can see a relatively bright line on the waterfall chart. If the object moves faster, the line sits further out, meaning the FFT peak appears at a higher frequency, i.e. the Doppler shift is larger and the object's velocity is higher.

Video of the experiment I did before:

https://v.youku.com/v_show/id_XNDM1MzA2NzM5Ng==.html


Original post: blog.csdn.net/shukebeta008/article/details/108377653