Besides the desired mains frequency, electronic devices create lots of high frequency noise on the power line. This device will clean it up.
I'm reading through the excellent free online book "Marks Pages", which deals with all sorts of powerline things, and I came across this:
"Taking this a little further; How many times are filters installed to rid the network of certain harmonics? It is, personally speaking, a totally ludicrous idea as all one is doing is asking a passive device to 'work against' a portion of the current curve of an imperfect load. This has two serious consequences. The first is the amount of heat generated through this (which increases should the frequency be slightly off), but the worst is the current associated with the harmonic will at least double through the network ahead of the filter (for every action there is an equal but opposite reaction!)."
And this is true, there is extra harmonic energy generated by the filter. So basically, the filter needs to be installed as close to the offending device as possible, and then the extra current is limited to a smaller bit of powerline.
Do the benefits (a "cleaner" powerline) outweigh the cons (a locally "dirtier" powerline, and possibly higher power usage)?
"Capacitors will, however, naturally shunt any harmonic content present on the electrical system. In doing so, they will intensify the high-frequency current on the electrical distribution system between the capacitor and non-linear load. This means installing power factor correction may lead to the need to increase the ratings of electrical switchboards to cope with the increased high-frequency current!"
I'm going to think about this.
This part of the log is to calculate theoretical filter dissipation for the measurements that I have done.
I've done measurements with a transformer that had a voltage ratio of roughly 1:30.
Therefore, all voltages measured in the previous log would have to be multiplied by 30 in order to relate them to the true powerline voltages.
However, we don't really need to consider the voltage ratio of the transformer: we've measured its transfer function, and established that it's essentially flat from 0 to 70 KHz. So we can just take the measurements made through the transformer, and use the relative attenuation for the powerline voltages.
For instance, the 3rd harmonic was at -25 dB. That is equal to:
x = -25 dB, corresponding to a ratio of 10^(-25/10) = 0.003162
The powerline frequency (50 Hz) has an amplitude of 230 VAC (for Europe), so the third harmonic would have a voltage of 230 * 0.003162 = 0.727324 VAC.
This is of course the RMS (Root Mean Square) voltage, which determines how much power would be dissipated in a load at a given sine frequency.
That's a small amount.
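As a sanity check, that conversion in a couple of lines of Python; this is a minimal sketch using the 10*log10 power-dB convention from the transformer log, with the -25 dB figure measured above:

```python
import math


def db_to_ratio(db):
    """Convert a dB reading (10*log10 convention, as used in these logs) to a plain ratio."""
    return 10 ** (db / 10)


V_FUNDAMENTAL = 230.0  # European mains, VAC RMS

# 3rd harmonic measured at -25 dB relative to the fundamental
ratio = db_to_ratio(-25)              # ~0.003162
v_harmonic = V_FUNDAMENTAL * ratio    # ~0.727 VAC
print(f"ratio = {ratio:.6f}, 3rd harmonic = {v_harmonic:.3f} VAC")
```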
Now, this device is supposed to dissipate (short out and turn into heat) all frequencies above 50 Hz.
There is a bit of a problem with calculating how much power would be dissipated for these frequencies.
Let's assume that the power station generates only a power frequency of 50 Hz, and that all frequencies above that are generated by devices that are connected to the power line.
Let's say that each socket can deliver up to 13 A (the normal plug fuse rating in the UK).
Then the power available at each socket at the power line frequency is 230 V * 13 A = 3 kW (approximately). We certainly don't want to dissipate that amount with this device.
Dissipating this 3 kW takes a resistance of R = P/I^2 = 3000/(13^2) = 17.75 ohms.
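The same arithmetic as a quick sketch; note that using the exact 230 V * 13 A (2990 W) instead of the rounded 3 kW gives 17.69 rather than 17.75 ohms:

```python
V = 230.0   # mains voltage, VAC
I = 13.0    # UK plug fuse rating, A

P = V * I        # maximum power at the socket, ~3 kW
R = P / I ** 2   # P = I^2 * R rearranged: resistance that dissipates P at current I
print(f"P = {P:.0f} W, R = {R:.2f} ohm")
```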
The problem is the source impedance (or output impedance):
Each source has an impedance in series, and attaching a load to that source causes the voltage to be divided between the source impedance and the load impedance, using a resistive divider calculation.
The source impedance of the powerline is (or at least should be) really low, because otherwise attaching a few high-wattage devices would make the powerline voltage drop (a so-called "weak grid"). It turns out that for the UK this is about 0.5 ohms at 50 Hz.
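A quick sketch of that resistive divider, using the ~0.5 ohm UK figure from the text and, as the load, the ~3 kW (17.75 ohm) resistance calculated earlier:

```python
V_SOURCE = 230.0   # open-circuit mains voltage, VAC
R_SOURCE = 0.5     # UK mains source impedance at 50 Hz, ohms (figure from the text)
R_LOAD = 17.75     # ~3 kW resistive load, ohms (see the calculation above)

# Resistive divider: the load only sees its share of the source voltage
v_load = V_SOURCE * R_LOAD / (R_SOURCE + R_LOAD)
drop = V_SOURCE - v_load
print(f"load voltage = {v_load:.1f} V, sag = {drop:.1f} V")
```

So even a full 3 kW load only sags this (hypothetically stiff) powerline by a handful of volts.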
But how much source impedance is there at the harmonic frequencies? The generators for these frequencies are inside your devices.
Also, what if the grid doesn't only generate 50 Hz, but also generates these high frequencies with a low output impedance? Unlikely, though.
Theory about power line issues:
And this also seems relevant:
I suppose we just need to do some simulations to look at what-ifs and filter topologies.
Using the transformer from the previous log entry, I've taken an FFT with the Rigol DS1052E scope, using a Blackman window, and saved it as a BMP to my disk. Here are the results:
On this first measurement, the horizontal scale is 250 Hz/div, so the maximum frequency shown is 2.5 KHz. At 50 Hz the voltage is obviously highest, since that is the powerline frequency. Then up to 550 Hz there are some harmonics: odd multiples of 50 Hz, at 150, 250, 350, 450, and 550 Hz. These harmonics have increasingly smaller amplitudes: -25 dB, -32.5 dB, -35 dB, -45 dB, and -45 dB.
Those peaks correspond to the 3rd, 5th, 7th, 9th, and 11th harmonics (odd integer multiples) of 50 Hz.
From approximately 550 Hz to 2250 Hz the amplitude stays below -50 dB, and then there are two rising peaks, at 2360 Hz and 2450 Hz, at -30 dB and -20 dB.
With a larger horizontal scale, the whole spectrum has moved up a bit, by approximately 5 dB. Generally speaking, up to 50 KHz the spectrum drops from -40 to -45 dB, and from there up to 250 KHz it is reasonably flat at approximately -45 dB.
We have to remember though that the transformer resonates at 200 KHz, so measurements above that frequency cannot really be trusted.
That was the baseline reading.
A bit of analysis:
Where do the harmonics come from? Well, these voltages are caused by switching behaviour. You can have a read here for theory about that.
In this case, because I'm measuring a single-phase powerline, these harmonics best fit a full bridge rectifier, which has four diodes, and hence a pulse number of 4. In that case the following harmonics are generated:
So, for single phase, and a full bridge, the pulse number is 4, so we expect to see harmonics on:
1x4-1 = 3
1x4+1 = 5
2x4-1 = 7
2x4+1 = 9
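The rule behind that list, h = n*p ± 1, as a small sketch:

```python
def characteristic_harmonics(pulse_number, n_max=2):
    """Characteristic harmonic numbers n*p - 1 and n*p + 1 for n = 1..n_max."""
    harmonics = []
    for n in range(1, n_max + 1):
        harmonics += [n * pulse_number - 1, n * pulse_number + 1]
    return harmonics


print(characteristic_harmonics(4))  # pulse number 4 -> [3, 5, 7, 9]
```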
And that is indeed what we saw.
The amplitudes are set by the currents drawn, and the amplitude of each of these harmonics should be the fundamental divided by the harmonic number:
3: 1/3 = 1/3rd of the fundamental
5: 1/5 = 1/5th of the fundamental.
What we measured was:
-25 dB for the 3rd,
-32.5 dB for the 5th,
-35 dB for the 7th,
-45 dB for the 9th
but according to calculations these should be:
10*log10(1/3) = -4.77 dB
10*log10(1/5) = -6.99 dB
10*log10(1/7) = -8.45 dB
10*log10(1/9) = -9.54 dB
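The same theoretical levels computed in one go, sticking with the 10*log10 convention used in these logs:

```python
import math

# Expected level of each harmonic relative to the fundamental: 1/n,
# expressed in the 10*log10 (power) convention used in these logs
theory_db = {n: 10 * math.log10(1 / n) for n in (3, 5, 7, 9)}
for n, db in theory_db.items():
    print(f"harmonic {n}: {db:.2f} dB")
```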
So what we see is that theory doesn't line up with practice. Not only are all harmonics far more attenuated in reality than in theory, the step from one harmonic to the next also doesn't follow the predicted pattern. However, we do have to remember that the theory applies to unfiltered bridges, and the whole point of filtering is to suppress exactly these unwanted harmonics.
In fact, there are rules and regulations about how strong the harmonics may be on the powerline, which I won't go into, but you can read about in .
Next step: calculating what the amplitude in volts would be for these measurements.
If we want to know how much we have cleaned up our power supply, first we need to know how dirty it was in the first place. I'd like to take an FFT with my oscilloscope, a Rigol DS1052E (of course). I don't want to hook my probe up directly to mains, because I rather like living.
To step down the voltage, I've taken a regular wall-wart with a transformer. This particular one was an AC-to-DC adapter, so I sawed it open and removed the rectifier and filter caps. Then I closed it back up again and measured the output voltage: 230 VAC in, 8.4 VAC out when terminated with a 10k resistor, so it has a turns ratio of roughly 230:8.4 = 27.38.
Then I opened it up again, and soldered wires to the primary of the transformer. Of course without it being connected to mains.
A transformer is a complicated device, with series resistance, inductance, and capacitances. But I expect it to behave as a low-pass filter up to a certain frequency, then resonate, and then act like a high-pass filter. Hopefully the transfer function is flat (no damping or amplification of the voltage) up to at least 150 KHz, because that's the range of powerline frequencies I'm interested in measuring.
To test the flatness of the frequency transfer function, I've connected a signal generator to the primary (using the maximum output voltage and low impedance drive output), and for each frequency measured the Vrms (root mean square voltage) on both the primary and secondary.
Then I divided the primary by the secondary voltage, to get the ratio.
I then normalised the ratio so that the voltage at 50 Hz would be '1'.
Then I took 10*log10(ratio) to get the amplification expressed as power. That leaves us with a traditional transfer function, although expressed as 10*log10(ratio) (power) instead of 20*log10(ratio) (voltage amplification). I'm doing that because that's also how the oscilloscope spits out its FFT spectrum.
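The whole processing chain can be sketched like this; the readings below are made-up placeholder values, purely to illustrate the steps, not my actual measurements:

```python
import math

# (frequency in Hz, primary Vrms, secondary Vrms) -- hypothetical example
# readings, only to demonstrate the processing, not real measured data
readings = [
    (50, 2.00, 0.073),
    (1000, 2.00, 0.073),
    (70000, 2.00, 0.060),
]

# Step 1: divide primary by secondary voltage to get the ratio per frequency
ratios = {f: vpri / vsec for f, vpri, vsec in readings}

# Step 2: normalise so the 50 Hz point is '1', then
# Step 3: express the relative gain as 10*log10 (power convention)
ref = ratios[50]
transfer_db = {f: 10 * math.log10(ref / r) for f, r in ratios.items()}

for f, db in transfer_db.items():
    print(f"{f:>6} Hz: {db:+.2f} dB")
```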
Here are the results:
Up to approximately 70 KHz, the voltage ratio is within 20% of the ratio at 50 Hz (the line frequency). It then starts dropping off towards 200 KHz, and then starts increasing again up to 500 KHz.
The power transfer function image was roughly identical, so I zoomed in on the part where it dips:
It looks like this transformer resonates at 200 KHz. Resonance is the point where the inductive behaviour (responsible for the low-pass behaviour) gives way to capacitive behaviour (resulting in the high-pass behaviour). So at resonance these two reactances are equal in magnitude.
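For illustration only: a pair of hypothetical (not measured) transformer parasitics that would resonate near 200 KHz, using f = 1/(2*pi*sqrt(L*C)):

```python
import math

# Hypothetical values -- chosen only so the numbers land near 200 KHz,
# not derived from this actual transformer
L = 10e-3    # leakage inductance, H
C = 63e-12   # winding capacitance, F

# At resonance the reactances are equal: 2*pi*f*L = 1/(2*pi*f*C),
# which rearranges to f = 1/(2*pi*sqrt(L*C))
f_res = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"resonance at {f_res / 1e3:.1f} KHz")
```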
What have we learnt?
This transformer is not perfect, but it is suitable for measuring voltages up to 200 KHz. And we now know by what ratios to multiply the FFT in order to get a correct power spectrum when measuring through this transformer.