Last night I finished the voltage calibration code for both the setpoint and the actual readings. It works by connecting a multimeter to the output, setting the raw DAC output to a known value, and then recording both the multimeter reading and the ADC reading. You do this twice (at 0 and 8000 mV), and a bit of linear math gets you the slope and offset required. You can then save these to EEPROM.
From initial testing, this looks very promising. Across the entire range the error was no more than about +/-10 mV (i.e. +/-1 on the least significant digit of my 2-decimal-place multimeter). This may not reach 'precision' measurement territory, but it is well within my desires for this supply. Also, this is done using a linear algorithm and only two measurement points. By adjusting the measurement points (if you know you need 5 V to be well calibrated, use that as a point) or by using a non-linear calibration algorithm with multiple calibration points I could probably improve things, although I would really need a more accurate multimeter before I could do much better than I have already.
Next up is the current calibration (both for current measurement and for the current-limiting setpoints). I am planning on the same approach; most of the code is ready, although it needs to be tested and fine-tuned.
Finally, I thought I would share some pictures of my test setup.
Here you can see the power supply (one channel, comprising two linear regulators + heat sinks with accompanying op amps, plus control board / encoders / display), with my constant current dummy load attached.
Here is a closeup of the two regulators and op amps comprising the single channel.
Finally, here is a closeup of the control board and UI.