An STM32 ADC input impedance problem

In a consumer electronics product we needed to sample the battery voltage (3.3 V to 4.2 V) while keeping the standby current in sleep mode as low as possible. The battery-voltage acquisition circuit used two 300 KΩ 1% resistors as a divider, so the standby current drawn by the divider is 4.2 V / (300 KΩ + 300 KΩ) = 7 µA. That is quite reasonable here (the product's standby current budget is 30 µA).

  The initial circuit design was as follows:

  (figure: initial circuit, battery divided by two 300K resistors into the ADC input)

  During testing we found that the voltage read by the acquisition code deviated from the actual voltage: the measured value was consistently a little smaller than the true value. We added a compensation factor in software, and the readings looked fixed.

  But when testing another board, the measured voltage was found to be unreliable again, and it became clear that software compensation was not going to work. The cause had to be found in the hardware.

  Checking the datasheet, we found that the maximum allowed ADC input impedance is only 50 KΩ.

  (figure: ADC input equivalent circuit, from the STM32 datasheet)

    In the figure:

      RAIN: external input impedance; the maximum for the STM32 is 50 KΩ;

      RADC: sampling switch resistance; maximum value 1 KΩ;

      CADC: internal sample-and-hold capacitor; maximum value 8 pF.

  Any ADC acquisition draws current, and that current produces a voltage drop across RAIN. RADC and CADC form an RC network, and the charging of the hold capacitor is governed by the resistance in series with it. As the source resistance (RAIN) increases, the charging time of the hold capacitor increases as well.

  The charging of CADC is controlled by RAIN + RADC, with a charging time constant of tc = (RADC + RAIN) × CADC. If the sampling time is too short, the converted value will come out lower than the actual value.
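  As a quick check of the arithmetic, the short C program below plugs the worst-case figures quoted above into tc = (RADC + RAIN) × CADC; it is just the calculation from the text, not datasheet-exact data:

```c
#include <stdio.h>

int main(void)
{
    const double r_ain = 300e3; /* external input impedance (ohms) */
    const double r_adc = 1e3;   /* sampling switch resistance (ohms) */
    const double c_adc = 8e-12; /* sample-and-hold capacitor (farads) */

    double tc = (r_adc + r_ain) * c_adc; /* charging time constant */
    printf("tc = %.2f us\n", tc * 1e6);  /* prints tc = 2.41 us */
    return 0;
}
```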

  From the above, the acquisition accuracy depends on both the sampling time and the input impedance. But by calculation, with an input impedance of 300 KΩ the charging time constant works out to only about 2.4 µs. Even with the software set to the maximum sampling period (ADC_SampleTime_239_5Cycles at a 12 MHz ADC clock, about 19.96 µs, roughly eight time constants), the error was still there. So the sampling period was not the main cause.
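  For reference, here is a minimal sketch of how that maximum sampling time would be set up with the STM32F10x standard peripheral library (where the ADC_SampleTime_239_5Cycles constant comes from). The channel (channel 8 / PB0) and the PCLK2/6 prescaler are assumptions, since the post does not name them, and the rest of the ADC initialization is omitted:

```c
#include "stm32f10x.h"

/* Configure the slowest available sampling window for the battery
 * channel. Assumes ADC1 is otherwise initialized elsewhere. */
void battery_adc_sample_time_config(void)
{
    /* ADC clock = PCLK2 / 6 = 12 MHz when PCLK2 runs at 72 MHz */
    RCC_ADCCLKConfig(RCC_PCLK2_Div6);

    /* 239.5 sampling cycles at 12 MHz is about 19.96 us, the
     * hardware maximum; channel 8 (PB0) is an assumed example */
    ADC_RegularChannelConfig(ADC1, ADC_Channel_8, 1,
                             ADC_SampleTime_239_5Cycles);
}
```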

  The real problem is that the input impedance of the circuit is greater than the maximum the ADC allows. The current through R1 splits into two paths: one flows to ground through R2, and the other flows through the MCU pin into the ADC (the IO pin may also leak some current IL to ground). This is equivalent to an extra resistance hanging in parallel with R2, so the voltage at the detection point is lower than the expected 1/2 Vbat.
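  To make that loading effect concrete, the short calculation below assumes a leakage current IL of 1 µA (an illustrative figure, not one from the post) and shows how it pulls the divider node below Vbat/2 by IL × (R1 ∥ R2):

```c
#include <stdio.h>

int main(void)
{
    const double vbat = 4.2;
    const double r1 = 300e3, r2 = 300e3;
    const double il = 1e-6;                 /* assumed leakage (A) */

    double r_th = (r1 * r2) / (r1 + r2);    /* R1 || R2 = 150K */
    double v_ideal = vbat * r2 / (r1 + r2); /* 2.100 V */
    double v_meas  = v_ideal - il * r_th;   /* 2.100 - 0.150 = 1.950 V */

    printf("ideal %.3f V, measured %.3f V\n", v_ideal, v_meas);
    return 0;
}
```

  The same formula also shows why smaller divider resistors help: shrinking R1 and R2 shrinks the Thevenin resistance R1 ∥ R2 and with it the error term.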

  So to detect the battery voltage more accurately, the divider resistance had to come down. But choosing two 50 KΩ resistors would raise the divider current to 4.2 V / 100 KΩ = 42 µA, so the circuit was adjusted instead:

  (figure: revised circuit, the divider's lower end connected to an IO pin instead of ground)

  The lower end of the divider, originally tied to ground, was moved to an IO pin: the pin outputs a low level while a measurement is being taken and a high level when none is needed. With two 30 KΩ divider resistors the problem was solved: the voltage detection error is under 0.02 V, and the standby current is several microamps lower than in the original design.
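  A minimal sketch of the resulting measurement sequence, again against the standard peripheral library. The choice of PA1 as the switched "ground" pin, the busy-wait settling delay, and the prior ADC/GPIO initialization are all assumptions:

```c
#include "stm32f10x.h"

/* Read the battery divider, assuming PA1 drives the divider's lower
 * end and ADC1 is already configured for the battery channel. */
uint16_t battery_read_raw(void)
{
    uint16_t raw;

    GPIO_ResetBits(GPIOA, GPIO_Pin_1);       /* pull divider end low */
    for (volatile uint32_t i = 0; i < 1000; i++)
        ;                                    /* crude settling delay */

    ADC_SoftwareStartConvCmd(ADC1, ENABLE);  /* start one conversion */
    while (ADC_GetFlagStatus(ADC1, ADC_FLAG_EOC) == RESET)
        ;                                    /* wait for end of conversion */
    raw = ADC_GetConversionValue(ADC1);

    GPIO_SetBits(GPIOA, GPIO_Pin_1);         /* back to high when idle,
                                                per the text, to cut the
                                                divider's standby drain */
    return raw;
}
```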


Cited from: https://www.cnblogs.com/zjh-x/p/6617712.html
