
Solution to 14V Rail Problem

A project log for D-DAQ

automotive parameter & performance monitor & logger

Michael O'Brien • 06/12/2014 at 21:36

Not a moment has gone by that I haven't been considering various solutions to the 14V rail problem. The previous plan was going to involve a simple charge pump to provide the input voltage to the 14V LDO. The LDO currently in use has a max input voltage of 18V, and a 3.3V clock would easily push the charge pump's output to +20V. After a week's deliberation, it comes down to 2 solutions: a SEPIC regulator, or a charge pump and a different LDO. There are pros and cons to both for what I'd be using the 14V rail for. First, a little more depth on why I'm going to so much trouble to have a 14V rail.

The display driver, the SEPS525, requires a 3.3V input for the display logic and a 14V input for display operation. The fun is the latter bit. The SEPS525 datasheets state an 8-18V input range with an absolute max of 19.5V. All 4 versions of the datasheet state this. However, every display datasheet using this driver states a 13.5-14.5V input range and an absolute max of 16V. Furthermore, the datasheets say that the voltage range can be adjusted upon request. I'm presented with a chicken-or-egg problem because I don't know which sheet I should go from. If I go with the lowest common denominator, pick a 12V input, and it doesn't work, I have to come back to where I am now and start over.

Now, I'm adding some LEDs to the display boards for both aesthetics and pragmatism. In order to pay homage to the community I'm developing this from, I'm using some LEDs with a forward voltage of 4.9V; they happen to be indigo, the same color as the cluster lights on the MK IV VW Jetta. I *could* drive them from the 5V rail, but that's so close to the forward voltage that other concerns creep in, so I need something higher, and as such I already have a 14V rail in the specs. This also eases things up a bit, as I don't have to worry about sag when PWM-driving the LEDs at high(er) current and having my 5V or 3.3V supply lines ripple like crazy.
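To put numbers on that headroom argument, here's a quick sketch. The 4.9V forward voltage is the one above; the 20 mA target current and the resistor values it produces are assumptions for illustration, not parts I've actually picked.

```
# Rough headroom check for driving a 4.9 V forward-voltage LED (indigo)
# from either the 5 V or 14 V rail. The 20 mA target current is an assumed
# example value, not a number from this project.

V_F = 4.9        # LED forward voltage (V)
I_LED = 0.020    # assumed target current per LED (A)

for v_rail in (5.0, 14.0):
    headroom = v_rail - V_F              # voltage left for the series resistor
    r_series = headroom / I_LED          # simple Ohm's-law current-limit resistor
    p_resistor = headroom * I_LED        # power burned in that resistor
    print(f"{v_rail:4.1f} V rail: {headroom:4.1f} V headroom, "
          f"R = {r_series:6.0f} ohm, {p_resistor * 1000:5.1f} mW in the resistor")
```

On the 5V rail there's only ~0.1V of headroom, so any supply sag or forward-voltage spread swamps the current setting; on the 14V rail there's ~9V to work with.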

So I need to keep the 14V rail, and it comes back to SEPIC vs. charge pump + LDO. Here's the short list of pros and cons for both.

SEPIC Pros:

SEPIC Cons:

CP + LDO Pros:

CP + LDO Cons:

Now, current output is a big concern for me. I'll be seeing up to a 300 mA draw at 10% duty cycle due to the PWM-driven LEDs. Even though it's at 10% duty cycle, it's still a hunk of current, and it's difficult to know if an LDO will survive the strain. From what I've read, they usually have internal current limiters, so even though I have a low average current draw, the instantaneous current draw might be a problem. I won't know unless I build it, though, and I don't have 3 additional weeks to test a theory and then make a new version of the board if the theory is wrong. Even though on paper the SEPIC regulator seems a perfect fit, the inductors are large and finding the right one to fit is difficult. Just the 7.3 uH inductor for the SMPS was a tough one to find, and it's the smallest I could go with given its series resistance and power rating.
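For scale, the peak-versus-average arithmetic looks like this; the open question is simply whether the regulator's internal current limit tolerates the 300 mA peaks, which I can't confirm without building it.

```
# Peak vs. average current for the PWM-driven LEDs: 300 mA pulses at 10% duty cycle.
# The average is tiny, but whatever feeds the rail still has to source the full
# 300 mA during every on-pulse without tripping its current limit.

I_PEAK = 0.300   # instantaneous draw while the LEDs are on (A)
DUTY = 0.10      # PWM duty cycle

i_avg = I_PEAK * DUTY
print(f"Average draw: {i_avg * 1000:.0f} mA")   # 30 mA
print(f"Peak draw:    {I_PEAK * 1000:.0f} mA")  # 300 mA, every PWM cycle
```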

I'm opting for the CP + LDO combo. I'll be using 4x 2.2 uF MLCC caps, arranged so 2 are in series for the clock's load and 2 are in parallel for the output load, a TL1963A LDO from TI, and a zener to ensure that I don't get more than 20V to the LDO. Though the LDO is expensive, about $3 ea. vs. ~$0.56 for the previous one, its thermal resistance is really low for a SOT-223 package. The footprint is nearly identical to what I currently have, and it even beats out most boost/buck/SEPIC controller ICs on thermal resistance. Coincidentally, the tab is connected to ground, and on the backside of that board I have a massive contiguous section of ground plane.
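Here's a quick sketch of the numbers for that arrangement. The cap values are the ones above; the 0.5V dropout figure is a conservative placeholder rather than a TL1963A datasheet value, so treat the resulting window as approximate.

```
# Quick numbers for the charge-pump arrangement feeding the TL1963A.
# Cap values are the ones from the log (4x 2.2 uF MLCC); the 0.5 V dropout
# figure is a conservative placeholder, not a TL1963A datasheet number.

C_MLCC = 2.2e-6              # each MLCC (F)
c_fly = C_MLCC / 2           # two in series on the clock side -> 1.1 uF effective
c_out = C_MLCC * 2           # two in parallel on the output side -> 4.4 uF

V_OUT = 14.5                 # target rail for the display and LEDs (V)
V_DROPOUT = 0.5              # assumed worst-case LDO dropout (V)
V_CLAMP = 20.0               # zener clamp so the LDO input never exceeds its max (V)

print(f"Flying capacitance: {c_fly * 1e6:.1f} uF, output capacitance: {c_out * 1e6:.1f} uF")
print(f"Pumped voltage needs to land between {V_OUT + V_DROPOUT:.1f} V and {V_CLAMP:.1f} V")
```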

Due to the thermal performance, despite its inefficiency, I can pull nearly 300 mA continuous out of this LDO at a 50˚C ambient temperature. It's a bit much for now, but given the option of larger displays, which would mean greater power draw in the future, this feels like a much more robust choice.
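That claim rests on a simple junction-temperature estimate like the one below. Both the LDO input voltage and the θJA figure are placeholders chosen for illustration: the real θJA depends on how much copper the tab actually sees, and the input depends on where the charge pump output settles.

```
# Rough junction-temperature estimate for the TL1963A at 300 mA continuous
# in a 50 C car interior. V_IN and THETA_JA are placeholders: the real
# theta_JA depends on the copper pour under the SOT-223 tab, and the input
# voltage depends on where the charge pump output lands.

V_IN = 17.0        # assumed LDO input from the charge pump (V) -- placeholder
V_OUT = 14.5       # regulated output (V)
I_LOAD = 0.300     # continuous load current (A)
T_AMBIENT = 50.0   # worst-case car-interior ambient (C)
THETA_JA = 40.0    # assumed junction-to-ambient resistance (C/W) -- placeholder

p_diss = (V_IN - V_OUT) * I_LOAD          # power dissipated in the pass element
t_junction = T_AMBIENT + p_diss * THETA_JA

print(f"Dissipation: {p_diss:.2f} W, estimated junction temperature: {t_junction:.0f} C")
# Compare the result against the junction-temperature limit in the TL1963A datasheet.
```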

The charge pump will be driven by an SO-8 clock IC at 32.768 kHz with a 3.3V signal. With the equivalent of 1.1 uF for the charge pump caps, the voltage will vary greatly. However, at a 10V input the charge pump bumps the voltage above the LDO's dropout for a 14.5V output. The lower capacitance reduces its efficiency but in turn brings the voltage down below the absolute max. Ironically, thanks to the ripple from the reduced capacitance, the LDO's efficiency increases from ~67% (with 10 uF caps under light load) to ~74% under light load, and to the low 90s under a 180 mA load. I was hoping for an input of 9V for the entire device, but right now that won't happen.
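Those efficiency figures fall straight out of the usual LDO approximation, efficiency ≈ Vout/Vin (ignoring ground-pin current): the lower and ripplier the pumped voltage, the less gets burned across the pass element. The input voltages in the sketch below are back-calculated from the percentages above purely to show the relationship; they are not measurements.

```
# LDO efficiency is roughly V_out / V_in once ground-pin current is ignored.
# The input voltages below are back-calculated from the efficiency figures in
# the log, purely to show the relationship; they are not measured values.

V_OUT = 14.5

def ldo_efficiency(v_in):
    """Approximate LDO efficiency: output power over input power at equal current."""
    return V_OUT / v_in

cases = [("10 uF caps, light load",   21.6),
         ("1.1 uF caps, light load",  19.6),
         ("1.1 uF caps, 180 mA load", 15.8)]

for label, v_in in cases:
    print(f"{label:25s}: V_in ~ {v_in:4.1f} V -> efficiency ~ {ldo_efficiency(v_in) * 100:.0f} %")
```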

Side note: thermal dissipation is a big deal for me. Though at regular temperatures, 20-30˚C, most of the components I have will easily function within the desired parameters, it will be common for this device to see 50˚C (about 122˚F) or higher in a car. Right now, I'm making sure the power subsystem(s) can handle 50˚C with no airflow. The device may be able to operate beyond that, but I'm spec'ing it to a 50˚C operating temp for now. I've apportioned a generous section of PCB copper to the LDOs and the SMPS, so I should have a significantly lower thermal resistance than the max in the datasheets, and though I'm not going to run the math for irregular polygons just to find out how much better, some dirty math shows that I've doubled the amount of copper recommended in some cases.
