Post by Max Demian
Post by Jethro_uk
Smart meters are shit - who'd have thunk it ?
https://www.bbc.co.uk/news/articles/cq52382zd1no
The way smart energy meters work in northern England and Scotland is
causing issues for customers, BBC Panorama has been told.
The body that represents energy companies, Energy UK, has confirmed for
the first time there is a regional divide - because of the way meters
send usage data back to suppliers.
1. They don't measure the consumption of individual appliances;
2. They don't tell you the average consumption of an individual appliance (such as a fridge);
3. They don't enable you to measure the total energy consumption for a particular process e.g. to compare cooking a jacket potato in an oven or a microwave, or cooking a stew entirely on the hob as opposed to starting on the hob and finishing in the oven;
4. They enable suppliers to switch your power off remotely, "by mistake";
5. They enable the suppliers to charge different rates at different times, not just in an easy-to-understand way like Economy 7 but also to bump up the rate just when you want to cook dinner.
To measure individual appliances, you use one of these.
This is a "miniature smart meter" which is just as accurate (1%)
as your household smart meter is. It uses two 500KHz sigma-delta
converters, to convert the voltage and current waveforms to digital
form, so the meter box can do the math equations and work out
W (watts), VA (VoltAmperes), PF (Power factor). On an electric
fire, the power factor is 1.0 (in phase, non-reactive load).
https://www.amazon.co.uk/dp/B0B2NXPGYW?th=1
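To make the arithmetic concrete, here is a rough Python sketch (not the
meter's actual firmware; the sample arrays would come from the two
converters) of how W, VA and PF fall out of simultaneous voltage and
current samples:

    import math

    def power_figures(volts, amps):
        # volts, amps: equal-length lists of simultaneous samples
        n = len(volts)
        watts = sum(v * i for v, i in zip(volts, amps)) / n  # real power, W
        v_rms = math.sqrt(sum(v * v for v in volts) / n)
        i_rms = math.sqrt(sum(i * i for i in amps) / n)
        va = v_rms * i_rms                                    # apparent power, VA
        pf = watts / va if va else 0.0                        # power factor
        return watts, va, pf

For a purely resistive load like an electric fire, the current samples
track the voltage samples exactly, so watts equals VA and PF comes out
as 1.0.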
Those are best suited to working out the daily consumption
of your refrigerator.
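As a made-up example of that, if you log the meter's average-power
readings over a day, the kWh figure is just power times time:

    # Invented fridge readings: average watts over six 4-hour intervals
    readings_w = [35, 90, 35, 35, 120, 35]   # compressor running pushes some up
    interval_h = 4
    kwh_per_day = sum(readings_w) * interval_h / 1000
    print(kwh_per_day)                        # 1.4 kWh/day for this example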
You can plug in any electric fire up to the meter's current limit.
Measure the values for ten seconds or so, then turn the fire off
and disconnect it. This prevents thermal damage to the shunt inside
the meter. While the meter has a 13 amp rating, it's best not to run
it at 12.999 amps for a year or so, because that can cause the
solder to melt underneath the shunt :-)
If you put a more capable (cooler-running) shunt in it, the
sensitivity of the current-conversion circuit would have to be
cranked up, and that is a harder design to do. So while the makers
know how to fix this, they're not changing the design. If the design
wants 100 mV full scale across the shunt, nobody is going to change
the circuit to 10 mV full scale and deliver *inaccurate* readings
to the user; the conversion process can degrade if the full-scale
voltage across the shunt is made too low.
So the design of the miniature meter is ideal for a refrigerator
or an IBM PC, and the numbers should be good to 1%. For an electric
fire, a ten-second reading should verify what you already know
about electric fires :-) Their current draw is proportional to
voltage (so their power goes with the square of it): if your line
voltage is too high, your electric fire draws too much current.
This is not a fault of the electric fire. Electric fires would be
too expensive if they were fitted with "regulation". It is the job
of your power supplier to implement "regulation" and deliver
exactly the right voltage. (I'm saying that for the benefit of my
power company, who do not know this!)
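A quick sanity check of that voltage claim, with assumed figures:
treat a fire rated 3 kW at 230 V as a fixed resistance and see what
it does at other line voltages.

    rated_w, rated_v = 3000.0, 230.0
    r = rated_v ** 2 / rated_w                # ~17.6 ohms of heating element
    for v in (220.0, 230.0, 240.0, 250.0):
        amps = v / r                          # current rises linearly with voltage
        watts = v * v / r                     # power rises with the square
        print(v, round(amps, 1), round(watts))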
Paul