I noticed while writing my last post, Digital To Analog Converter Control Using SPI, that the voltage reference I planned on using was not going to be possible. I had planned to use the internal 4.096V reference of the LTC2630, but I mistakenly powered my DAC from only a 3.3V voltage regulator. Therefore, in the short term, I set my reference to be the 3.3V supply voltage. In this post I am going to explain why I want to use the 4.096V reference.

The voltage reference for the DAC is so important because it can be a large contributor to the accuracy of the DAC's output. A DAC simply takes your reference voltage and splits it up into a number of steps, depending on the number of bits. The LTC2630 is a 12-bit DAC, so it splits the reference into 4096 levels (4095 steps between zero and full scale).
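To make the step size concrete, here is a minimal sketch. It assumes the common transfer function Vout = (code / 2^N) × Vref; check the LTC2630 datasheet for the exact formula:

```python
def dac_step_size(vref: float, bits: int = 12) -> float:
    """Voltage of one LSB step, assuming Vout = (code / 2**bits) * vref."""
    return vref / (2 ** bits)

# With a 3.3V reference, one step is about 0.8mV;
# with a 4.096V reference, one step is a clean 1mV.
print(dac_step_size(3.3))
print(dac_step_size(4.096))
```

The clean 1mV-per-count step is one of the reasons 4.096V is such a popular reference value for 12-bit converters.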

The voltage reference accuracy is important because the SW assumes the reference is perfect, but in reality the reference voltage can differ slightly for a number of reasons. This is probably best explained with some examples, so I will show the worst-case error when using the external 3.3V reference and the internal 4.096V reference. Let's say we are trying to target a DAC output voltage of 1V in both examples.

__External 3.3V Reference Worst Case Circuit Analysis__

Below is the calculation that will be done in SW to determine the correct number of counts to send to the DAC in order to generate 1V:
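A quick sketch of that calculation, assuming the transfer function Vout = (code / 4096) × Vref (check the LTC2630 datasheet for the exact form):

```python
def counts_for_voltage(v_target: float, v_ref: float, bits: int = 12) -> int:
    """Code the SW sends, assuming Vout = (code / 2**bits) * v_ref."""
    return round(v_target / v_ref * (2 ** bits))

# Targeting 1V with an assumed 3.3V reference:
print(counts_for_voltage(1.0, 3.3))  # 1241
```

Note that 1V does not land on an exact code with a 3.3V reference, so there is a small quantization error before we even consider the reference tolerance.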

The software has to assume that the reference is exactly 3.3V, but in reality it can differ from 3.3V. This difference is bounded by the worst-case accuracy of the external 3.3V reference, which in our case is a simple linear regulator from ON Semiconductor (NCP1117DT33T5G):

The output voltage range is defined in the datasheet in this section:

With this information we can calculate the worst case output voltage when the reference is at its maximum or minimum output voltage:
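Here is a rough sketch of that worst-case calculation. The ±2% tolerance is an illustrative assumption on my part; substitute the actual min/max output voltages from the NCP1117 datasheet table:

```python
TOL = 0.02  # ASSUMED +/-2% regulator tolerance, for illustration only

counts = round(1.0 / 3.3 * 4096)            # SW assumes exactly 3.3V -> 1241
v_min = counts / 4096 * (3.3 * (1 - TOL))   # reference at its minimum
v_max = counts / 4096 * (3.3 * (1 + TOL))   # reference at its maximum

print(f"worst case: {v_min:.4f} V to {v_max:.4f} V")
```

With a ±2% reference, the "1V" output can be off by roughly ±20mV, since the reference error passes straight through to the output.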

__Internal 4.096V Reference Worst Case Circuit Analysis__

Below is the calculation that will be done in SW to determine the correct number of counts to send to the DAC in order to generate 1V:
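The same sketch as before, again assuming Vout = (code / 4096) × Vref, but with the 4.096V reference. This time 1V lands on an exact code:

```python
def counts_for_voltage(v_target: float, v_ref: float, bits: int = 12) -> int:
    """Code the SW sends, assuming Vout = (code / 2**bits) * v_ref."""
    return round(v_target / v_ref * (2 ** bits))

# Targeting 1V with an assumed 4.096V reference:
print(counts_for_voltage(1.0, 4.096))  # 1000 -- no quantization error
```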

The software has to assume that the reference is exactly 4.096V, but in reality it can differ from 4.096V. This difference is bounded by the worst-case accuracy of the internal 4.096V reference, which in our case is built into the LTC2630 DAC:

"The reference is INSIDE the DAC!" (Zoolander __reference,__ pun intended)

This is where it gets hard to compare apples to apples, but the datasheet leads me to believe that the error introduced by the internal reference is specified in this section of the DAC's datasheet:

We are designing for a maximum ambient temperature of 70°C, therefore the worst-case reference error should be around 0.07%. This gives a minimum voltage of about 4.093V and a maximum of about 4.099V.

With this information we can calculate the worst case output voltage when the reference is at its maximum or minimum output voltage:
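The same worst-case sketch, now using the ~0.07% error figure read from the LTC2630 datasheet above:

```python
TOL = 0.0007  # ~0.07% worst-case internal reference error at 70C

counts = round(1.0 / 4.096 * 4096)            # SW assumes exactly 4.096V -> 1000
v_min = counts / 4096 * (4.096 * (1 - TOL))   # reference at its minimum
v_max = counts / 4096 * (4.096 * (1 + TOL))   # reference at its maximum

print(f"worst case: {v_min:.4f} V to {v_max:.4f} V")  # about 0.9993 V to 1.0007 V
```

With the internal reference, the worst-case error on a 1V target is roughly ±0.7mV, versus tens of millivolts when using the regulator as the reference.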

As you can see, the error is significantly smaller when using the internal reference of the DAC. Minimizing this error is very important because the DAC output voltage accuracy directly translates to load current accuracy. My goal is for this electronic load to be very accurate, so I want to take every option that increases the accuracy of the load current.

Please note that there are definitely other contributors to DAC output voltage error, but the premise of this log was to compare the two references and their contributions to the error. If you would like to see a more thorough overall worst-case analysis of the output error, please let me know.

I hope you enjoyed my post. Thank you for reading. Please feel free to leave a comment below.

Thanks again.
