Using the ADC to calibrate the HV

A project log for ESP8266 Geiger counter

Simple Geiger counter using ESP8266 PWM for HV generation and network connectivity

biemster • 08/27/2016 at 20:17 • 8 Comments

It occurred to me to use the ESP's ADC to tune the PWM duty such that there is the correct 400V on the tube.

The ADC of the ESP has an input range of 0 to 1 Volt. To measure the HV without overloading the ADC pin I have to create a voltage divider that puts, say, 0.400 Volt on the ADC when there is 400V over the tube.

I know from previous measurements of the HV part of the circuit that the load needs to be around 100 megohm or larger for the output to reach the desired 400V. So I could use a 100 megohm and a 100 kOhm in the voltage divider, dividing the HV down by roughly a factor of 1000.
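The divider arithmetic above can be sketched out quickly. This is just the back-of-the-envelope calculation from the log, using the values mentioned in the text, not a verified design:

```python
# Unloaded 100M / 100k divider, as proposed in the log.
R_TOP = 100e6      # 100 megohm upper leg (also the ~100M load the HV stage needs)
R_BOTTOM = 100e3   # 100 kOhm lower leg

def divider_out(v_in, r_top, r_bottom):
    """Output of an unloaded resistive voltage divider."""
    return v_in * r_bottom / (r_top + r_bottom)

ratio = (R_TOP + R_BOTTOM) / R_BOTTOM          # 1001, i.e. roughly a factor 1000
v_adc = divider_out(400.0, R_TOP, R_BOTTOM)    # ~0.3996 V at the ADC pin
print(ratio, round(v_adc, 4))
```

So 400V on the tube lands just under the 0.400V target at the ADC, well inside the 0 to 1 Volt range.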

Now that's where the problem begins. The 100k lower leg of the divider is probably not much more than an order of magnitude smaller than the internal resistance of the ADC, so the ADC will load the divider and alter its ratio. And since the effective lower leg then depends on the internal resistance of the ADC anyway, I could just use that internal resistance as the lower leg and drop the 100k altogether.

But the important question still remains:

What is the internal resistance of the ADC?

Info on the ADC of the ESP8266 is scarce, to say the least. I've seen forum reports mentioning 1M, but it apparently also depends on how often the ADC pin is polled.

Seriously, if anybody can shed some light on the internal resistance of the ADC, or knows some other way to measure 400V with it while drawing an extremely low current, please comment below. Any hints or thoughts are welcome.


Ted Yapo wrote 08/30/2016 at 20:52 point

The other thing to consider is that common resistors may not be rated for high voltage - the 1/4W ones I typically use for through-hole are only rated at 350V; but you can easily find others rated much higher.


biemster wrote 08/30/2016 at 20:58 point

I've considered this as well, actually. I was under the impression that the extremely low current through the resistor would relax this rating a bit. I'll get some higher-rated ones when I pick this up again, after I get some positive results with the design as it is now.


Ted Yapo wrote 08/30/2016 at 21:11 point

I think it has to do with the spiral-groove construction of many film resistors. The spacing between the grooves can be pretty small, and I think the voltage rating is based on arcing breakdown across the grooves rather than on dissipation concerns. A 10M resistor would need almost 1600V across it to dissipate 1/4W, but you still find them rated at only 350V.


K.C. Lee wrote 08/30/2016 at 21:07 point

It is not the current that is the problem. The high voltage affects the linearity of the resistor. There is usually a working voltage rating in the datasheet along with the usual I^2*R wattage.

You can easily fix that by using a few resistors in series.
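K.C.'s fix works because equal resistors in series share the voltage equally, so each part stays under its individual rating. A quick sketch, assuming 350V-rated parts as in Ted's comment:

```python
# Voltage across each of n equal resistors in series, dropping 400V total.
V_TOTAL, RATING = 400.0, 350.0

for n in (1, 2, 4):
    per_resistor = V_TOTAL / n
    print(n, per_resistor, per_resistor <= RATING)
```

One 350V-rated resistor is over its rating at 400V; two in series already bring it down to 200V each, with margin to spare.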


biemster wrote 08/31/2016 at 08:08 point

Thanks K.C. and Ted, I'm learning new stuff every day over here!


Ted Yapo wrote 08/30/2016 at 18:55 point

I don't know much about the ADC on the ESP8266, but if the draw when not polling the ADC is low enough, you might be able to add a capacitor across the lower leg of your divider to store enough charge to drive the ADC periodically - this will reduce the impedance of the divider for brief samples.  Sure, you might not get the 1000x factor exactly, but you probably have to calibrate the voltage control loop anyway.
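A rough charge-sharing estimate shows why Ted's reservoir-cap idea works. The numbers here are illustrative assumptions (a 100 nF cap on the lower leg, and ~10 pF for the ADC's internal sampling cap, a common order of magnitude for SAR ADCs, not a published ESP8266 figure):

```python
# When an initially empty sampling cap connects to a charged reservoir cap,
# the voltage settles by charge redistribution: V' = V * C_ext / (C_ext + C_s).
C_EXT = 100e-9     # 100 nF reservoir across the lower divider leg (illustrative)
C_SAMPLE = 10e-12  # assumed internal sampling cap

def voltage_after_sample(v, c_ext, c_sample):
    """Voltage on the reservoir after one charge-sharing event."""
    return v * c_ext / (c_ext + c_sample)

v = voltage_after_sample(0.400, C_EXT, C_SAMPLE)
print(v)  # droop is on the order of 0.01%, negligible here
```

So even a modest cap supplies the brief sample with essentially no droop; the remaining question is only how fast the 100M upper leg can recharge it between polls.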


biemster wrote 08/30/2016 at 19:05 point

Interesting idea, thanks! I agree, the voltage control loop probably has to be calibrated as well unfortunately. Maybe it is not really worth all the trouble, since calibrating the counter on the Geiger plateau is much more fun and informative. But I'll certainly give this a try anyway!


K.C. Lee wrote 08/30/2016 at 19:42 point

Pretty much all SAR ADCs on a CMOS process use sampling caps.

The ADC charges up its internal sampling cap in a microsecond or so during the sampling period. So for a high impedance source you should put a cap at the input. The external cap supplies charge to the ADC during sampling (more accurate results) and also lowers the AC impedance (better noise immunity).
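The settling math shows why a high impedance source can't charge the sampling cap in time on its own. Both figures below are illustrative assumptions (a ~1M effective source resistance, and the same ~10 pF sampling-cap order of magnitude as above):

```python
import math

# RC settling of the sampling cap through the source resistance.
R_SOURCE = 1e6     # assumed effective source resistance (stiff divider leg)
C_SAMPLE = 10e-12  # assumed internal sampling cap

tau = R_SOURCE * C_SAMPLE             # 10 us time constant
settle = tau * math.log(2 ** 10)      # ~7 tau to settle within 10-bit accuracy
print(tau, settle)                    # ~69 us, far beyond a ~1 us sample window
```

With settling taking tens of microseconds against a sampling window of a microsecond or so, the reading would be badly starved without a local cap supplying the charge.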

I have found that the placement of the cap and the parasitics of how it is connected make enough of a difference to matter: close to the chip's input pin, with a short fat trace, ideally using SMT caps.
