The goal is simple: replace my car battery with a coin cell (plus clever circuitry), and get the car to start at least once using only the energy from the cell.

Details

OK, it's not all worked out yet, but here are the back-of-the-envelope calculations. I found that Mehdi of ElectroBOOM has already started a car with supercapacitors. He connected 6x 400F capacitors in series to get 66.6 F. This was enough (charged to 14V) to start his car. How much energy is this?

Sounds like a lot. How much energy is in a CR2477 coin cell? They're rated at 1000mAh at 3V:

So, if I can create a boost converter to charge the capacitors to 14V from the cell that's at least 6527/10800 = 60% efficient, I should be able to start a car from a coin cell.
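The envelope math can be reproduced in a few lines of Python (a sketch using the figures quoted above: a 66.6 F bank charged to 14 V, and a 1000 mAh, 3 V CR2477):

```python
# Energy in the supercapacitor bank: E = 1/2 * C * V^2
C = 400 / 6            # six 400 F caps in series -> ~66.7 F
V = 14.0
e_cap = 0.5 * C * V**2
print(round(e_cap))    # ~6530 J

# Energy in the coin cell: charge * nominal voltage
q = 1.0 * 3600         # 1000 mAh -> 3600 coulombs
e_cell = q * 3.0       # at a nominal 3 V -> 10800 J
print(round(e_cell))

# Minimum boost-converter efficiency to charge the bank from the cell
print(round(e_cap / e_cell, 2))   # ~0.60
```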

The first rough plan is to make a simple inductor-based boost converter to charge a similar (possibly identical) capacitor bank over perhaps 24 hours, then start the car with the capacitors in place of the usual battery.

Of course, the devil is in the details, but at first look it seems like it could work.

I'm going to admit defeat. My last hope to get the car started was charging a NiMH battery of 4 AA cells (previously shorted) with a LiSOCl2 cell of 1.5Ah capacity.

The NiMH cells read 1.33V after charging, which doesn't mean much. Figuring that not much charge had found its way into the cells, I decided to start with the capacitor at 11V. If I could boost this to around 14V, start the engine, and then show that the capacitor voltage was still above 11V, I'd consider it a success.

Here's the log of the capacitor voltage:

The initial almost-linear part was charging with my lab power supply at 11V, current-limited to 1A. The voltage climbed to 11V and stayed there (it's not quite linear probably due to resistance in the wires). After about an hour, I disconnected the lab supply, and put the caps on the boost converter driven by the NiMH cells (having been charged from the "coin cell"). I saw that it was working OK, and left to do some other things.

When I returned, I found that the capacitor voltage had peaked at 12.4V, then started dropping again. The NiMH cells were depleted. This means that 1097J had been deposited in the capacitor by the NiMH cells. Previous measurements had shown that 1500J are required to crank the engine.

So, it's not going to happen from one cell.

I have some more NiCd cells that were charged from LiSOCl2 cells and CR2477s, so I'll see if I can get the cap charged using the energy from multiple cells...

Good news, everyone! I assembled the 67F/16.2V capacitor, complete with cables and charge balance board. Let me show you the various lengths of wire I used:

TI has an interesting application note about charging supercapacitors for energy buffering from LiSOCl2 cells. In their design, they use a microcontroller to slowly ramp up the output voltage of a DC-DC converter charging the capacitor to minimize losses. This is similar to charging with a current source instead of a voltage source. The difference is huge. Here's a simulation:

C1 and C2 are two identical 67F capacitors, charged through two identical 10-ohm resistors. On the right, C1 is charged from a fixed 14V source (for example, a voltage-feedback DC-DC converter). On the left, C2 is charged from a current source (more on how to do this later). As you'd expect, the voltage on C1 exponentially approaches 14V, while C2 shows a linear ramp (I've chosen the values so the two capacitors reach 14V at about the same time):

What you may not expect is that there's a huge difference in efficiency between the two. Charging C1 from the voltage source wastes half the energy in R1: in the simulation, 6.5 kJ end up in C1, and 6.56 kJ get dissipated in R1 (theoretically, they should be equal). This is 50% efficiency. It turns out that no matter what value you choose for R1, 50% of the energy is always wasted in the resistor.

On the other hand, C2 has 6.54kJ at the end of the simulation, while R2 has dissipated only 2.43kJ, for an efficiency of 73%. If we choose a 1-ohm resistor for R2 instead, only 243J end up wasted in R2: the efficiency is now 96%. You can decrease R2 as much as you like to further increase the efficiency.
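The difference can be checked numerically without a SPICE deck. This is a rough sketch, not the original simulation: it steps the voltage-source RC charge until the capacitor is near 14 V, then applies a constant current chosen to finish in the same time, and tallies where the energy goes:

```python
# Voltage-source charging: fixed 14 V through R into C
C, R, V = 67.0, 10.0, 14.0
dt = 0.01                      # time step, seconds
vc, e_r, t = 0.0, 0.0, 0.0
while vc < 0.99 * V:           # stop near full charge
    i = (V - vc) / R           # instantaneous current
    e_r += i * i * R * dt      # energy burned in R
    vc += i / C * dt
    t += dt
e_c = 0.5 * C * vc**2
print(round(e_c), round(e_r))  # roughly equal: ~50% efficiency

# Current-source charging through the same R, reaching V in the same time t
i = C * V / t                  # constant current needed
e_r2 = i * i * R * t           # loss is fixed by I^2 * R * t
e_c2 = 0.5 * C * V**2
print(round(e_c2), round(e_r2), round(e_c2 / (e_c2 + e_r2), 2))
```

Dropping R to 1 ohm cuts the current-source loss tenfold, matching the trend described above; the exact kJ figures depend on where you cut off the exponential charge.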

So, we should charge capacitors from current sources. In the TI app note, they emulate a current source by slowly ramping the voltage output of a DC-DC converter. If we look at the voltage at the output of the current source in the above simulation, we see a similar thing:

Here, the current source and R2 always keep the "charging voltage" a fixed amount above the capacitor voltage. Decreasing R2 decreases this voltage, and hence the power that gets wasted in the resistor.

Modifying common DC-DC converters for current output

I happened to have a few MT3608 boost converter modules around. These converters accept inputs as low as 2V, can handle switching currents of up to 4A, and can output up to 28V. The modules have a 25-turn pot for adjusting the output voltage. A quick look at the MT3608 datasheet shows that this potentiometer controls the feedback from the output. The feedback is compared to 0.6V to regulate the output voltage.

I modified one of the boards by removing the output adjustment potentiometer and including a current sense resistor in the negative output line. The feedback voltage is taken from the top of the sense resistor.

The before and after block diagrams are shown here. The rest of the circuit on the PCB isn't particularly relevant:

With the modified converter, the MT3608 tries to keep the voltage across R1 at 0.6V, no matter what the output voltage is. Now, the output is a current source, perfect for charging capacitors. By choosing R1, you control the output current according to I_out = 0.6V / R1.
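Sizing the sense resistor is then a one-line calculation, using the 0.6 V feedback reference from the MT3608 datasheet (the function names here are just for illustration):

```python
V_FB = 0.6  # MT3608 feedback reference voltage

def sense_resistor(i_out):
    """R1 (ohms) that sets a given output current (amps)."""
    return V_FB / i_out

def output_current(r1):
    """Output current (amps) for a given sense resistor R1 (ohms)."""
    return V_FB / r1

print(sense_resistor(0.010))  # 10 mA target -> 60 ohms
print(output_current(240.0))  # 240 ohm sense -> 2.5 mA
```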

It turns out that this is also a good way to control/limit the input current to the converter. I think I'm going to try charging NiCd's from another coin cell with one of these converters. It's likely to be more efficient than the homebrew converter I came up with.

So, my first experiment with charging NiCd cells is almost done. They have been charging from a CR2477 through the converter for more than a week now. The initial cell current of 2.5 mA has now dropped to 1.65 mA, while the cell voltage is at 2.0V, which would typically be considered dead.

The NiCd battery voltage is 2.685V at this point. This doesn't mean all that much, since you can't accurately infer the state-of-charge of the NiCd's from their voltage. I roughly guess that I was able to extract 70% of the energy from the coin cell (earlier modeling predicts 83%). Assuming a 60% converter efficiency (testing showed 65%), and 100% NiCd charging efficiency, I might have dropped 4000J into the NiCd's. It's taken a week to charge, so self-discharge has taken a toll, and maybe I managed to cram 3800J in there.

If I go with my top-off-the-capacitor plan, I'll need to transfer 1742 J to the capacitor to bring it from 12 to 14V. This allows me an efficiency of 45% for the second converter. It seems like it might be possible.
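The 1742 J figure follows directly from the capacitor energy formula; a quick check (67 F bank, 12 V to 14 V, against the rough ~3800 J NiCd estimate above):

```python
C = 67.0
e_topoff = 0.5 * C * (14.0**2 - 12.0**2)   # energy to go from 12 V to 14 V
print(round(e_topoff))                     # 1742 J

e_stored = 3800.0                          # rough estimate of energy in the NiCds
print(round(e_topoff / e_stored, 2))       # ~0.46 minimum converter efficiency
```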

So, today, I have to finish building my capacitor bank and figure out some kind of boost converter to top off the capacitor with the NiCds. I have to hurry a little because I'm going to start losing energy to self-discharge in the NiCd's. For the moment, I suspect even the feeble current from the not-quite-dead-yet CR2477 is holding self-discharge at bay, but that won't last long.

Besides this one shot, I still have a week to try charging other NiCd's, so I have to get them going today, too.

I made some shorting bars for AA NiCd cells from 1.5 x 20mm aluminum stock. The slight curvature of the stock created by my shears makes a nice contact for the negative battery terminal. They've been shorted like this since yesterday. This ensures the cells are at a "zero" state of charge. To prepare them for shorting, I first discharged them fully with 3.6 ohm and subsequently 1.8 ohm power resistors:

The lousy clip leads probably double the resistance, but the cells did eventually discharge.

How dead are they?

I intend to use the NiCd cells as an intermediate storage for the energy drained from a coin cell, since they self-discharge much more slowly than supercapacitors (1% per day vs 20% in the first day). The energy will then be transferred (quickly) from the NiCd's to the capacitors, since the capacitors can discharge with enough power to start the car.

Even though the NiCd's have been shorted for a long time, it's natural to wonder if there is any charge left in them. I can hear people crying foul already. But, it's a legitimate question: what's left in a NiCd after it has been stored shorted?

I took two cells out of their shorting bars and tested them. I connected the first to a high-impedance DMM to test the open-circuit voltage. Over the course of a few hours, the voltage slowly rose from 0 to 1.047V. This probably won't convince anyone that the cell is empty.

I connected the second cell directly to a DMM on the 2mA setting, then measured the voltage across the cell. The current crept up to 626uA, while the voltage rose to 65mV. This represents a power of 41 uW, which is ridiculously tiny.

To put this number in perspective, it would take 5 years to charge my 67F capacitor to 14V using a charging power of 41uW, assuming 100% efficiency and zero capacitor self-discharge. In contrast, I expect to charge the capacitor from the NiCd's in an hour or two.

My suspicion is that any minuscule residual charge that these measurements represent would eventually drain away during extended shorting. The open-circuit voltage recovery after shorting reminds me of dielectric absorption in capacitors: short a capacitor for a while, and the voltage will recover somewhat afterwards.

In any case, I'm not worried that there is any significant energy left in these cells. As an experimental control, however, I'll leave a set of identically shorted cells un-shorted while charging the others. Once the active cells have charged, I'll check how much the control set has recovered. I'd bet there won't be anything significant.

I tested a CR2032 cell at continuous 10mA discharge. The results aren't very good.

If we take 1.5V as the cutoff voltage, around 275J can be drained from the cell in a little over 3 hours. Draining down to the mV level gets you to 325J in just under 5 hours. This is roughly 12 and 14.5% of the cell's capacity, respectively.

I tried the same experiment with a CR2477 cell:

This cell manages about 850J to the 1.5V level in 10 hours. I stopped the test about 15 hours in, when the energy reached 1000J; by this time the cell voltage is below 500mV. This isn't very good either, although it is roughly 3x the energy of the CR2032. The datasheets would have you believe the CR2477 can supply 4x or so at very low discharge rates (1000 vs 225 mAh), so I guess the relative performance of the cells I am seeing roughly makes sense.

I also tried a longer test with a CR2477 at 5mA continuous drain.

In this case, the cell takes 43 hours to reach the 1.5V level. At this point, it has yielded around 1900J. If you can make use of the cell down to mV levels, you can get around 2100J at this current, but it takes 55 hours. Again, not very good at all.

Looking at the four data points I have for CR2477 cells, we can see what effect the drain current has on the yield (I also add the single data point for CR2032):

Yield is a very strong function of drain. I wondered how it would look on a log-log scale, so I plotted it, and it is roughly linear, hinting at a power-law relationship between capacity and current drain, similar to Peukert's Law for lead-acid batteries. Here are the data points along with a best-fit line in the log-log space:

The model has the power-law form:

E = k · I^m

where E is the energy yield of the cell in Joules, I is the discharge current in mA, and k and m constants. For the CR2477, a least-squares fit gives k = 35594, and m = -1.704.

Using this model, I would estimate that at 80mA, you should get 35594*80^-1.704 = 20J, while at 2.5 mA, you should get 35594*2.5^-1.704 = 7469 J.

Just like Peukert's Law, this model predicts you can get infinite energy by draining at lower and lower currents, so at some point it is no longer applicable. Just for fun, I re-arranged the equation to solve for the current you'd need to use to get 9000J (approximately full capacity) from a CR2477. The answer: 2.24 mA. This sounds like an interesting test, and if it wouldn't tie up my meters for five days, I'd start it now. Maybe after the contest :-)
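Evaluating the fitted model is straightforward; this snippet just plugs the quoted k and m back in, reproducing the 20 J and ~7470 J predictions and the 2.24 mA full-capacity current:

```python
k, m = 35594.0, -1.704     # fitted constants for the CR2477

def energy_yield(i_ma):
    """Predicted energy yield (J) at a continuous drain of i_ma (mA)."""
    return k * i_ma ** m

print(round(energy_yield(80)))           # 20 J at 80 mA
print(round(energy_yield(2.5)))          # ~7470 J at 2.5 mA
# Invert the model for the drain current that yields ~9000 J (full capacity):
print(round((9000.0 / k) ** (1.0 / m), 2))   # 2.24 mA
```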

I must have figured this out while sleeping, because when I woke this morning, I knew I could use the cell discharge tester to measure the DC-DC converter efficiency. The discharge tester is an adjustable current load. By connecting this load to the output of the DC-DC converter and manually sweeping the current, I could record the converter efficiency as a function of output voltage.

The plot shows efficiency and output current vs output voltage for the homebrew converter tuned to draw 10.7mA from a 3V source. A 1.7Ah LiSOCl2 cell can supply this current without any decrease in capacity.

The efficiency peaks at 65% between about 2.5 and 3.5V output. This is a good match to charging a series battery of two AA NiCd cells (nominal 2.4V). To ensure good NiCd efficiency, I'll want to limit the cells to 70% charged, so I will probably use two sets of NiCd's in parallel, making a 2S2P pack of 2.4V / 2Ah capacity.

I ordered some 1000mAh NiCd cells, since I haven't had any around here in over a decade.

The efficiency of the converter is not very good compared to commercial offerings, but the input current is easily adjusted to a fixed value, an odd requirement for a power supply. I am not sure how easy this is to do with commercial switcher offerings. Does anybody know about this?

I will have to re-test this converter with a higher input current, because if the efficiency curve is the same, this converter is unsuitable for charging capacitors to 14V. At only 9V output, the efficiency has dropped to below 35%. The efficiency peak around 2.5V might make it OK for charging 2.7V capacitors in parallel (they would then be connected in series once charged). Even then, the steep drop-off of efficiency at low voltages isn't very good.

I still have to investigate commercial DC-DC converters for charging the supercapacitors.

The problem with the 10.7mA current is that it would take a week to get all the energy out of an LiSOCl2 cell at this rate. The cell is rated for 10mA continuous/50mA pulse, but I'm not sure how far I'm willing to push it. Going to 30mA would drain the cell in 57 hours, which seems more reasonable.

So @EricH gave me an interesting idea. He asked if using an intermediate step with another set of capacitors could help with the energy transfer problem. If you could find large-valued capacitors with low self-discharge, you could take a long time to charge those with a coin cell, then charge the supercapacitor quickly from the intermediate caps. It sounds like it could work, but I don't think the right capacitor exists for this intermediate step.

What about using a rechargeable battery as the intermediate energy storage? This gets interesting. Everyone seems willing to allow an electrochemical capacitor as intermediate storage, so why not a rechargeable battery? (I'll refer to the coin cell as a "cell" and the rechargeable battery as a "battery" in the discussion below).

Let's forget about any technical problems for a minute and consider the contest judges and spectators. You have to convince them somehow that you're not running anything from energy pre-stored in the battery. Since state-of-charge is very difficult to measure accurately, I'm not even sure I wouldn't be cheating with most battery chemistries. The exception is NiCd, which can and should be stored with the terminals shorted and at a zero state of charge. It's how NASA stores their NiCd cells, as detailed in this technical report on NiCds. So, if I take a couple of AA NiCd's that have had their terminals shorted for a few days, then verify there is 0V across them, I think I can make a convincing argument that there's no energy hidden up my sleeve.

OK, so there's a way to verify that all the energy is coming from the coin cell. What are the properties of a NiCd battery?

Nominal voltage 1.2V with a flat discharge curve.

Can sustain very high rates of discharge (think of a cheap cordless drill).

Self discharge rates quoted as 10% per month (wikipedia) or 1% per day (NASA TR).

Tolerant of varied charging methods (C-rate and end-of-charge detection)

Overall, they sound like a good intermediate reservoir for energy storage. They have a much lower self-discharge rate than supercapacitors, so can be charged slowly from a coin cell without terrible losses (a DC-DC converter is still required). Then, once charged, they can be drained very quickly to charge the supercapacitor before supercap self-discharge becomes an issue.

What are the drawbacks? First, the energy will be going through two DC-DC converters, so losses get compounded there. Also, there's the charging efficiency of the NiCd's. Wikipedia mentions that at a C/10 charge rate, you have to apply around 1.5C of charge to fully charge a NiCd (equivalent to a 33% loss of energy). The NASA TR, however, shows that this ratio is a strong function of temperature (P. 13). With battery temperature near 0C, the ratio approaches 1, so much less energy is lost in charging.

So, can I take a 1.7Ah LiSOCl2 cell, charge some 1000mAh AA NiCd's, then use the NiCd's to charge a 67F capacitor to 14V? Here's how everything stacks up:

From the TL-5935/P datasheet, it looks like I can get the full 1.7Ah from the cell at 10mA, which is the maximum recommended continuous drain. I also estimate that the cell voltage will remain stable at around 3V for the entire discharge. Discharging at this rate will take 7.08 days. Assuming the 1% per day self-discharge rate for NiCd's, I might lose 600J during this week. Assuming a 33% loss due to NiCd charging inefficiency (which might be improved by cooling), and a 70% DC-DC converter efficiency, I end up with 8961 J in the NiCds, which almost fits in 2 AA's. I'll call it 8600...
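One way to stack up these losses in code (the 1.7 Ah at 3 V, 70% converter, 1.5C overcharge, and 600 J self-discharge figures are all from the text; the author's bookkeeping arrives at 8961 J, and this simpler stack-up lands in the same ballpark):

```python
e_cell = 1.7 * 3600 * 3.0     # 1.7 Ah LiSOCl2 at ~3 V -> 18360 J
eff_dcdc = 0.70               # DC-DC converter efficiency
eff_charge = 1.0 / 1.5        # ~1.5C of charge applied per C stored in NiCd
self_discharge = 600.0        # J lost to NiCd self-discharge over the week

e_nicd = e_cell * eff_dcdc * eff_charge - self_discharge
print(round(e_nicd))          # ~8000 J, in the ballpark of the quoted 8600
```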

The numbers are in on the self-discharge test. I charged a 400F capacitor to 2.33V with my bench supply, and let it soak at that voltage for about 2 hours. Then, I recorded the capacitor voltage over the next two days while the capacitor self-discharged.

To estimate the self-discharge current, I fit a series of lines to the local voltage-vs-time curve using least-squares regression with a sliding window 2001 points wide (about 45 minutes of elapsed time). Some of the noise in the curve is due to the quantization of the voltage steps.
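The windowed-fit idea can be sketched as follows. The self-discharge current is I = -C·dV/dt, with the slope taken from a least-squares line over each window of samples. The 1.35 s sample interval is an assumption (2001 points spanning roughly 45 minutes), and the data here is synthetic, just to show the method:

```python
C = 400.0    # farads
DT = 1.35    # assumed seconds per sample (~2001 points in 45 minutes)

def leakage_current(window, dt=DT, c=C):
    """Least-squares slope of a window of voltage samples -> -C*dV/dt (A)."""
    n = len(window)
    ts = [k * dt for k in range(n)]
    mt, mv = sum(ts) / n, sum(window) / n
    slope = sum((t - mt) * (v - mv) for t, v in zip(ts, window))
    slope /= sum((t - mt) ** 2 for t in ts)
    return -c * slope

# Synthetic trace: a constant 0.5 mA leak gives dV/dt = -I/C
leak = 0.5e-3
vs = [2.33 - leak / C * k * DT for k in range(2001)]
print(round(leakage_current(vs) * 1000, 2))   # recovers 0.5 mA
```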

The datasheet specifies a 1mA maximum self-discharge after 72 hours. The capacitor meets the specification, with the leakage current dropping below 1mA after about 12 hours. After about 16 hours, the self-discharge current levels off at around 0.5 mA. The bad news is that the self-discharge current starts at around 5 mA, so 10% of the capacitor energy is lost in the first 5 hours, extending to 20% lost in 24 hours.

I don't know how this curve would look if the capacitor had been "soaked" for a longer time at 2.33V. It may be that if held at a specific voltage for an extended period of time, the initial self-discharge would decrease.

So far, I'm also not sure exactly how to apply this data to the charging problem. For instance, what does the leakage current look like during charging? Does it increase or decrease as the capacitor charges? In any case, the initial self-discharge current doesn't look good.

...and it's likely to spark some debate. Let me introduce the idea by way of an analogy. Let's say you're tired of the city and want to visit the country. Your friend agrees to lend you her car for the weekend so you can get away and clear your head. When you pick up the car from her, it has 3/4 of a tank of gas. The first thing you do is stop at the gas station to fill the tank and buy some snacks for the trip. The trip is great, and you return on Sunday evening completely refreshed. When you return the car, it has a little more than 3/4 of a tank of gas - say 13/16. Your friend notices you haven't used any of her gasoline, and says you can borrow the car again whenever you want.

After a few trips like this you decide it makes sense for you to buy your own car. Unfortunately, you are taken in by the slick salesman and end up with a car which has trouble starting. It turns out that the real issue is not the battery but the alternator - for some unknown reason, the alternator will keep the car running once it has started, but never re-charges the battery, not even a little. Luckily, your friend has a 67F capacitor she can lend you to start your car. When you pick up the capacitor it is charged to 12V, so contains 4824J of energy. Just like you did with the gasoline, you first "fill up" the capacitor to 14V (you happen to have a little device which does this) - now the capacitor contains 6566J. You connect the capacitor in place of your car battery, and start the car. Starting the car takes 1500J, so afterwards, the capacitor contains 5066J of energy, and is charged to 12.297V. Just as you did with her car, you return the capacitor to her with a little more energy than when you borrowed it. Again, she notices that you haven't used any of the energy in her capacitor, and offers to lend it to you whenever you need.

Unfortunately, the little device you have won't charge the capacitor from 0 to 14V, only from 12 to 14. So, if your friend lends you a fully discharged capacitor, you can't start your car. If, on the other hand, she lends you the same capacitor you began with, you always leave it with a little more energy than the last time you used it, and you can continue starting your car indefinitely using just the energy from your little device.
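The bookkeeping in the analogy checks out with E = ½CV²; a minimal verification for the 67 F bank:

```python
import math

C = 67.0
def energy(v):  return 0.5 * C * v * v       # joules stored at voltage v
def voltage(e): return math.sqrt(2 * e / C)  # voltage when holding e joules

e_lent = energy(12.0)        # capacitor as borrowed: 4824 J
e_full = energy(14.0)        # after topping off: 6566 J
e_left = e_full - 1500.0     # 1500 J spent cranking the engine
print(round(e_lent), round(e_full), round(e_left))   # 4824 6566 5066
print(round(voltage(e_left), 3))                     # 12.297 V
```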

The Proposal

I don't know if I can extract 6566J from a coin cell fast enough to charge a supercapacitor, but I may be able to extract 1500J. In this case, I propose to begin with a capacitor charged to 12V, charge the capacitor to 14V, then start the car with it. By monitoring the voltage and current during starting with an oscilloscope, I can verify that the starting took less energy than was deposited in the capacitor by the coin cell. The numbers used here are just examples: the actual values will probably differ somewhat.

Ideally, I'd modify the car so that the alternator would not charge the capacitor once the engine is started (imagine a beefy ideal diode made with MOSFETs). The problem is that cars depend on the battery to filter the alternator voltage - with older cars, you could disconnect the battery once the car was running, but with modern computerized engines, this is asking for a whole lot of trouble. I am willing to believe that a 67F capacitor can stand in for the battery as a power supply filter, but don't think it is wise to have the capacitor isolated from the alternator once the engine is running. So, the next best thing is to analyze the voltage and current waveforms during starting to verify that none of the original charge of the capacitor was used in starting the engine.

Will the judges accept this argument? Will they disqualify my entry? What do you think?

Then again, maybe I can still find a way to get 6566J from a coin cell :-)

Point of interest with respect to design of staged-capacitor (SC?) charging to minimize ISR and increase energy efficiency for low-V dc-dc converter/regulator; see: http://helixsemiconductors.com/pages/products/muxcapacitor-dcdcpoe. I'm hacking this architecture in simulation for coin-cell booster problem.

I took a look at the part you linked there. The datasheet is a bit sparse, but as near as I can tell, that's a charge-pump (flying capacitor) supply that can cut the input voltage by factors of two (1/2, 1/4, etc). They're also using capacitors in the tens-of-uF range, which don't have anywhere near the leakage problem of supercapacitors.

Hey! Just wanted to let you know that your project was very inspiring - excellent work! Even though it has proved to be a tough challenge, your logs are awesome and super informative :-) Also, congratulations on winning the $100!! :-D

I think if you can prove that the charge used from the NiCd batteries is less than what came out of the coin cell, then it's OK. The baseline charge that might already be in the cells isn't being used; it's just like a bias or counterweight. If you can demonstrate a system where you can top off a NiCd pack with one coin cell and start a car, and continue to do that indefinitely without the NiCd eventually going dead, one coin cell per start, that's quite an interesting project.

I agree, although it's impossible to measure rechargeable battery state-of-charge with any accuracy, except maybe for NiCd's, where you can short them and call it zero. They really are empty at that point.

The problem is that NiCd's large enough to turn over a car would be too large to charge fully from a coin cell. So, you could use some with a pre-charge, top them off with the coin cell, and you'd be stuck trying to accurately measure the state of charge afterwards. I guess you could discharge the NiCds afterwards to determine how much energy was left in them, but it sounds like a messy measurement (mostly because, as we know, how fast you drain them determines how much you get out ;-)

But, state-of-charge measurement on capacitors is easy: just measure the voltage. You know from E=0.5CV^2 how much energy is inside. So, topping off a capacitor with energy from a coin cell, starting the car, then measuring afterwards is easier. Even if you can't fully charge the cap (from zero) with just the cell, you can show that you didn't use more than the cell deposited.

The problem with capacitors is self-discharge rate (20% in the first day vs 1% per day for NiCd's). So, I could take a week to charge some zeroed-out NiCds from a coin cell, then use the NiCds to top off a supercapacitor. As long as I can show that the energy used in starting the car was less than I dumped in from the cell via the NiCds, I think it's OK.

I've had some NiCds charging since Tuesday. I just got some 600mAh ones in the mail today - they're supposed to be better at high currents than the high-capacity 1000mAh ones I started with.

It looks feasible, theoretically! The energy given by the battery isn't 10800J, because when the manufacturer measures the overall capacity during discharge (1000 mAh), they say the voltage drops to 2V at the end. But anyway, there's far more than the 1700J your simulator gave you. Good luck.

As an alternative to charging the capacitors in series, which requires a DC/DC converter that can't be 100% efficient, may I suggest charging the capacitors one at a time, since the per-capacitor voltage is near your coin cell voltage. The capacitors are still wired in series, with suitably robust wire, since the series path must handle the engine cranking current. The coin cell is connected to each capacitor in turn, using relays (or, in a more complex circuit, MOSFETs; they don't have to be big, since the charging current is in milliamps).

If the self-discharge rate of the capacitors is low, then just one pass of charging each capacitor for 1/6 of the total charge time would work. If self-discharge is high, such that by the time you have charged capacitor 6, capacitor 1 has discharged too much, an alternative would be to charge each capacitor for less time, and go around multiple times. For example, 1/10 charge to cap 1, then cap 2 ... cap 6, then back to cap 1. Repeat the cycle 10 times for the total equivalent charging time. With only relays (or saturated MOSFETs), the losses in the charging process might be far less than with the DC/DC converter.

Charging in parallel/discharging in series is a reasonable idea, and switching the charging circuit seems easier and cheaper than switching the discharge circuit (200A battery switches are $7 each). I considered a charge-in-parallel scheme at first, but finally settled on the charge-in-series.

The sneaky problem with charge-in-parallel (all at once or individually) is limiting the charge current to something the cell can handle efficiently. Just hooking the cell across a 400F cap (which acts like a dead short) will charge it somewhat, but will introduce huge losses because of the internal resistance of the cell. Also, the cell voltage will droop significantly under this load, so the final capacitor voltage will be low.

You can think of this as an impedance matching problem. To get maximum power transfer, you'd want to match the impedance of the 400F capacitor (near zero) to the impedance of the coin cell (tens of ohms). But you actually don't even want maximum power transfer, you want maximum energy transfer (minimizing the losses in the cell's internal resistance), so you want to present an even higher impedance to the cell, and drain it more slowly.

So, in order to charge even a single cap efficiently, you still need a DC-DC converter. You might think you could get away with a buck converter, since the capacitors are 2.7V rated, and the cell is 3.0V, but because of the voltage droop, you actually need a boost converter for most of the energy. @jaromir.sukuba had to build such a converter to charge a single cap for his spot welder. So, you have to build a DC-DC supply anyway, and it seems simpler to avoid switching the capacitors. One snag may be maintaining the converter efficiency once the output voltage gets large; it's easier to design a converter that's efficient over a narrow range of output voltages. No data on how bad this will be yet.

The multiplexed "parallel" charging is an interesting idea. I keep coming back to a "total discharge time" argument thinking that you don't gain anything (self-discharge-wise) by multiplexing the charge, except that the capacitors end up more balanced at the end. And, of course, you don't have to switch the discharge path. I'll have to think about it some more.

Thanks for the switched parallel charging idea; I'll keep it in the back of my mind. Who knows what might be needed in the eleventh hour?

Somehow, intuitively, to me anyhow, this'd also result in more stored charge. I understand that e.g. two series caps will charge with half the overall capacitance, but... if charged in muxed-parallel like this, it seems like the overall stored charge would be higher... I'll have to contemplate this further.

@esot.eric there are a bunch of thoughts/facts/theories about capacitors that always make for an interesting few hours with pencil and paper (or SPICE simulators or computer algebra software). Like charging a capacitor through a resistor always wastes half the energy. Even if the "resistor" is just the series resistance (ESR) of the capacitor itself. Then, there's the capacitor paradox - two ideal capacitors: one charged with charge Q, one empty. Connect them together, and each now has charge Q/2. Sounds good until you realize that each one now has one quarter of the energy of the original one charged to Q. So half the energy is gone, now, too.

I think capacitors should be the new over-unity free-energy fake video subjects; everyone is tired of magnets and red enameled wire.

@Ted Yapo, interesting math/paradoxes I don't recall from my studies over a decade ago.

Simulation of ideal components shows exactly the same discharge curve, when series-discharged, for both series and parallel-charged. ... as would be expected from training.

Here's a thought-experiment: say two capacitors are in series, and the two capacitor plates which are connected together (along with the interconnecting wire) have somehow previously been stripped of all their "free" (valence?) electrons... what, then, is attracted by the most-positive plate when the series capacitors are connected to a battery? And the most-negative? I'm thinking SPICE doesn't handle that... otoh, maybe such a situation would destroy the capacitors. Otooh, surely there's an intermediate state which wouldn't, and would affect the devices' capacity...?

@esot.eric That's a good one. It strips through several layers of abstraction. "Capacitors" and "inductors" are idealized abstractions (the so-called lumped elements) of Maxwell's equations. Maxwell's equations are idealized abstractions of something else, I guess quantum electrodynamics. Who knows what that's a simplification of? Each layer of simplification probably introduces some apparent paradoxes (paradoxen?).

No, the coin cell can't do that (alone). But if you charge a bank of supercapacitors from the cell over a long period of time, you can then release all of the cell's energy (minus charging losses) much faster than the cell could deliver it directly.

The problem is one of power vs energy. Calculations show that there's enough energy in the cell to turn the engine over, but you can't extract it quickly enough (i.e. the cell can't provide enough power, energy/time). So, the idea is to extract the energy from the cell slowly then release it all quickly to start the engine.
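The power-vs-energy argument above can be put into numbers. A rough sketch, using the CR2477's rated capacity from the project details and assumed cranking figures (2000 W for 1 s is a guess in the right ballpark, not measured data):

```python
# Back-of-envelope: the cell has enough ENERGY but nowhere near enough POWER.
# Assumed: CR2477 at 1 Ah, 3 V; cranking takes ~2000 W for ~1 s (rough guesses).
cell_energy = 1.0 * 3600 * 3.0       # 1 Ah * 3600 s/h * 3 V = 10800 J
crank_power = 2000.0                 # watts needed while cranking
crank_time = 1.0                     # seconds
crank_energy = crank_power * crank_time  # 2000 J

# A coin cell can only sustain tens of milliamps; at a generous 30 mA:
cell_power = 0.030 * 3.0             # ~0.09 W

print(cell_energy >= crank_energy)   # True: enough energy...
print(cell_power >= crank_power)     # False: ...not nearly enough power

# How long the cell needs to trickle out the cranking energy:
hours = crank_energy / cell_power / 3600
print(round(hours, 1))               # ~6.2 hours at 90 mW
```

That ratio - watts needed vs milliwatts available - is exactly why some slow-charge/fast-discharge store (spring, flywheel, or supercapacitor) sits in the middle.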

I can imagine doing this with a mechanical device - maybe the cell runs a small DC motor that's geared way down to wind a very strong spring over a turn or two. You let the cell run the motor for a day, transferring most of the energy into the large spring. Then, you connect the wound spring to the crank-starter of your Ford Model T, and let the spring go to start the engine. The spring releases the energy much more quickly than the cell can, so produces much more power for a short time, which is what you need to start the engine.

But, alas, I lack the tools - and quite frankly, the skill - to build such a mechanical device (and more importantly, I am fresh out of Ford Model T's). I'm OK with a soldering iron, though, so I'll substitute supercapacitors for the huge spring and do the whole thing electronically.

[whispers] cheaty idea [whispers] it says a SINGLE coin but only at a time - using a magazine loading mechanism - you drop - drain - drop - drain - drop - drain - drop - drain - drop - drain - drop - dra--well you shd get the point. One cell technically at a time - draining it in a cap series - so all you need is a Mentos like stack and a Pez dispenser. [unwhispers] BUT THAT WOULD BE CHEATING [whispers] don't listen to me - I think that could work [unwhispers]

Assuming that getting enough electrical energy from a 3V cell to directly operate the electric starter in a car engine is just not going to happen, I'm wondering if it would be considered cheating to use a gravity assist to generate power and the power anticipated from a running engine to reset the gravity assist?

What if you had a weighted flywheel attached to a generating device where the 3V cell's job is to nudge that weight from a balanced at-the-top position to the point where the flywheel is being spun by the force of gravity acting on the weight?

If you use energy later generated by the engine to move the flywheel back into position then you have a device (admittedly finicky and not very portable) that can repeatedly start an engine using a supply of 3V coin cells.

Watts aren't energy, joules are. The calculations in the project's details show how much energy is in a CR2477, which should be enough energy to start a car assuming the linked video isn't fake.

The video might be fake, but the idea is plausible. I found ranges of 130-225A to crank 4- to 8-cylinder engines on automotive sites. At 14V, that's 1820-3150W. Assuming you need that power for 3 seconds (a conservative guess), that's 5460-9450J. I also found links indicating that the true cranking time is substantially less than 3s.
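Those cranking-energy figures are just current times voltage times time. A quick check of the arithmetic (the 130-225 A range and the 3 s cranking time are the comment's estimates, not measurements):

```python
# Cranking energy estimate: E = I * V * t for the low and high current figures.
V = 14.0         # volts during cranking
t = 3.0          # seconds (conservative guess)
for amps in (130, 225):
    power = amps * V           # 1820 W and 3150 W
    energy = power * t         # 5460 J and 9450 J
    print(amps, power, energy)
```

Even the high end sits near the 10800 J available in the coin cell, which is why charging efficiency ends up being the make-or-break number.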

I have a 650A oscilloscope current probe on order so I can collect my own data.

You're playing with words, but you know what I mean. I've got such a starter box for cold mornings; it contains supercapacitors (several farads) and a charging circuit, and it can only give 4 to 5 "starts". It takes more than an hour to fully charge the box! A normal starter needs 40A at 12V for a few seconds. Obviously 3W can't do that - otherwise, why would manufacturers still use batteries? They're constantly reducing their costs.

"Using words correctly" is "playing with words" now? 🙃

Honestly, I didn't really know what you meant because of the confusion between power and energy. I still don't feel confident that I know what your point is. It's true that 3 W can't run a car's starter motor, but I don't think anybody here said it could.

Yes, I wrote power instead of energy, and English isn't my mother tongue, which isn't an excuse but doesn't help either. Anyway, the goal was clearly defined: "The goal is simple: replace my car battery with a coin cell (plus clever circuitry), and get the car to start at least once using only the energy from the cell." I only gave my opinion that this is actually impossible with only one coin cell and a standard starter. Who knows what the future holds? Maybe coin cells will deliver hundreds of amps, or maybe Ted will discover something totally new and unpredictable, as DuPont's chemists did when they found nylon.

@rafununu I don't think there's anything new here - I intend to use the coin cell to deliver tens of milliamps over many hours to charge a supercapacitor, which can deliver hundreds of amps for a few seconds. The supercapacitor and charger are the "clever circuitry" in the original statement.

You're right, there's no way a coin cell can deliver hundreds of amps. But, if drained slowly enough, you can theoretically extract enough energy to start a car. The trick is storing all that energy where it can be released quickly, and supercapacitors are perfect for that.

Will it work? Maybe. You have to charge the capacitor very efficiently to end up with enough energy to start a car. This may not be possible.
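How efficient does the charging have to be? Using the project's own numbers from the details section (a 66.6 F bank charged to 14 V, and 10800 J in the CR2477), the minimum converter efficiency falls right out:

```python
# Efficiency budget for the charge-then-crank plan, using the project's figures.
C_bank = 66.6                        # farads (6x 400 F supercaps in series)
V_full = 14.0                        # volts, fully charged
E_needed = 0.5 * C_bank * V_full**2  # ~6527 J stored in the bank
E_cell = 10800.0                     # joules in the CR2477 (1 Ah at 3 V)

min_efficiency = E_needed / E_cell   # the boost converter must beat this
print(round(E_needed), round(min_efficiency, 2))  # 6527 0.6
```

A 60% floor is demanding for a boost converter working from a ~3 V source into a capacitor that starts at 0 V, which is where the "may not be possible" caveat comes from.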

3W is power. What he's intending to do is store the energy from that coin cell in capacitors, which may very well take hours to charge, and then discharge them in a few seconds, producing a huge amount of power momentarily.

Point of interest with respect to the design of staged-capacitor (SC?) charging to minimize ESR losses and increase energy efficiency in a low-voltage DC-DC converter/regulator; see: http://helixsemiconductors.com/pages/products/muxcapacitor-dcdcpoe. I'm hacking this architecture in simulation for the coin-cell booster problem.

raildoc