Voltage Sensing
The voltage of the mains supply is measured by placing two resistors that form a voltage divider between the live and neutral wires, in order to scale the voltage down from the UK grid level of 240V RMS (about 340V peak) to a more manageable 0.17V peak. The voltage needs to be scaled down because the isolator chips require an input voltage within the range of -0.256 to +0.256 Volts.
The voltage divider in my circuit consists of the two resistors circled in red. The uppermost one is resistor RB, with its top end connected to the neutral terminal and its bottom end connected to resistor RA and the Vout wire. RA is in turn connected to the live terminal.
Three values determine what resistors we will need for our voltage divider: the mains supply peak voltage, the isolator chip's maximum input voltage range, and how much current we want flowing through those resistors.
In Britain the mains supply RMS voltage is 240V, so the peak voltage is 240 multiplied by the square root of two, or about 340V. The isolator chips require an input voltage range of -0.256 to +0.256 Volts.
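Written out, the peak-voltage conversion is:

\[ V_{peak} = 240\,\mathrm{V} \times \sqrt{2} \approx 339.4\,\mathrm{V} \approx 340\,\mathrm{V} \]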
We don't, however, want to scale our peak 340V all the way to 0.256 Volts; it is best to allow some leeway. The Cornell PowerBox guys went for 0.170V, so I will stick with that. We use the voltage divider equation, which can be derived from Ohm's law, to calculate what resistors we will need.
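With Vout taken at the junction of RA and RB, measured relative to the neutral end as the wiring above suggests, the divider relation is:

\[ V_{out} = V_{in} \times \frac{R_B}{R_A + R_B} \]

so the ratio we need is \( R_B / (R_A + R_B) = V_{out} / V_{in} = 0.170 / 340 = 0.0005 \).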
But first we need to decide how much current we want flowing through our resistors. The resistors have a maximum power rating, and we need to make sure we stay well clear of any current and voltage that would exceed it; since we are measuring voltage we don't want any significant current flow anyway, so we don't need high-power resistors. The current flowing through both resistors will be greatest at the peak voltage of 340V, and we can find it from Ohm's law.
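Written out, with RA and RB in series straight across live and neutral:

\[ I_{peak} = \frac{V_{peak}}{R_A + R_B} \]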
The power dissipated then follows from that current and the voltage across the pair.
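Taking the worst case at the 340V peak rather than the RMS value, this is:

\[ P_{peak} = I_{peak} \times V_{peak} = \frac{V_{peak}^2}{R_A + R_B} \]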
There are resistors that go up to the gigaohm range, but most of the standard large-value resistors I have come across are in the megaohm range. Let's take 1.0MOhm as an example and say that RA + RB = 1MOhm (10^6 Ohms).
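Plugging that into the peak-power expression above gives roughly:

\[ P_{peak} = \frac{340^2}{10^6} \approx 0.12\,\mathrm{W} \]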
So it would be fine to use 0.25W resistors for this application, since their power rating is higher than the power that will be dissipated in them, as long as the total of RA + RB is in the 1MOhm range. If RA + RB = 1MOhm, then to scale 340V to 0.17V the voltage divider equation above gives an RB value of 500 Ohms, so we could make our voltage divider with RB = 500 Ohms and RA = 999500 Ohms. An RA of that value doesn't exist as a standard resistor, so we go for a 1MOhm resistor instead; this changes our Vout slightly, but it will still be pretty much 0.17V.
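As a quick check, with RA = 1MOhm and RB = 500 Ohms the divider gives:

\[ V_{out} = 340 \times \frac{500}{1\,000\,500} \approx 0.17\,\mathrm{V} \]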
I have actually gone for an RA of 2.2MOhm (carbon film, 0.25W max power rating, 5% tolerance) and an RB of 1kOhm (carbon film, 0.25W max power rating, 5% tolerance), which reduces the voltage from 340V to about 0.15V and dissipates only 0.052W, and so is a bit safer.
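For reference, those chosen values work out to roughly:

\[ V_{out} = 340 \times \frac{1000}{2\,201\,000} \approx 0.15\,\mathrm{V}, \qquad P_{peak} = \frac{340^2}{2.201 \times 10^6} \approx 0.052\,\mathrm{W} \]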