Another way to approach determining the "padding factor" harkens back to Pacowy's accurate observation about the limitation of the "Rule of 2.5X" as a precise tool for determining a home's heat loss: it computes only the mean coal consumption on the coldest day of the year, not the "peak" coal consumption during the coldest temperature actually experienced.
I have actually developed two separate rules involving 2.5. One says that for every degree of extra heat (either desired, or required to sustain a target temp) 2.5% more energy is needed. The other says that you can estimate your maximum coal consumption on the coldest day of the season by taking your daily average consumption for the season and multiplying it by 2.5.
For my specific case last year, the coldest day of the year saw a high of +4 degrees and a low of -17 degrees, for a mean of -6.5 degrees. As I understand the Pacowy argument, if I burned 110 lbs. to keep my home at a steady (T-Stat set point) temp over a mean of -6.5 degrees (this coming to 4.583 lbs. per hour), the question is what hourly burn rate I would need to sustain that same setpoint during the time when it was actually -17 degrees outside.
The difference between -6.5 degrees (the mean temp) and -17 degrees (the actual coldest temp experienced) is 10.5 degrees. If I need to consume 2.5% more energy per degree, then 2.5% x 10.5 degrees of difference = 26% more energy was being burned when it was actually -17 degrees outside. For my case, then, the burn rate at -17 degrees becomes 1.26 x the mean of 4.583 lbs. per hour, or 5.77 lbs. per hour of required feed rate, and my padding factor for this case is 1.26 (which becomes 1.3 when rounded). Therefore 1.3 it is for the ideal "padding factor".
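As a quick sanity check, the arithmetic above can be reproduced in a few lines (the variable names are mine, not from the post):

```python
# Padding-factor arithmetic for the coldest day of the year.
mean_temp = (4 + (-17)) / 2               # high +4, low -17 -> mean of -6.5 degrees
hourly_mean_burn = 110 / 24               # 110 lbs over 24 hours ~= 4.583 lbs/hr
degrees_colder = mean_temp - (-17)        # -6.5 - (-17) = 10.5 degrees
padding_factor = 1 + 0.025 * degrees_colder    # 2.5% more energy per degree -> 1.2625
peak_burn = padding_factor * hourly_mean_burn  # ~5.79 lbs/hr (5.77 if the factor
                                               # is rounded to 1.26 first, as above)
print(round(padding_factor, 2), round(peak_burn, 2))
```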
Putting it all together brings yet a third rule of thumb into play. The rule for computing your home's heat loss on the cheap becomes 2.5 x the "padding factor" of 1.3 = 3.25. The "Rule of 3.25X" is born.
The three separate (though linked) rules of thumb to be extracted from all of this are thus:
1) On the coldest day of the year you will burn approximately 2.5 times the average daily fuel (coal, etc...) you burn for the entire heating season. Thus the "Rule of 2.5X"
2) For each degree of additional warmth desired (or required to sustain a T-Stat setting as the outside temperature varies downward) you will consume approximately 2.5% additional energy. Thus the "Rule of 2.5%"
3) A home's "heat loss" can be approximated by multiplying the average daily fuel (coal, etc...) used for the entire season by 3.25, multiplying that amount of fuel by its BTU content, multiplying by the efficiency of your heating appliance, and finally dividing by 24 hours. Thus the "Rule of 3.25X"
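As a minimal sketch, the three rules of thumb above could be written as little functions (the names and structure are my own, not from the original post):

```python
def coldest_day_burn(avg_daily_lbs):
    """Rule of 2.5X: approximate fuel burned on the coldest day of the year."""
    return 2.5 * avg_daily_lbs

def extra_energy_factor(degrees_colder):
    """Rule of 2.5%: multiplier for sustaining the setpoint as it gets colder."""
    return 1 + 0.025 * degrees_colder

def heat_loss_btuh(avg_daily_lbs, btu_per_lb, appliance_efficiency):
    """Rule of 3.25X: approximate home heat loss in BTU per hour."""
    return 3.25 * avg_daily_lbs * btu_per_lb * appliance_efficiency / 24
```

For example, `heat_loss_btuh(45.6, 12150, 0.80)` reproduces the roughly 60,000 BTUH figure worked out below.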
Putting it all together for my case:
Since I burned 11,086 lbs. of coal from October 1st through May 31st (my T-Stats' "on" season), for me this becomes:
Total days of my (T-stats are on) heating season = 243
Average daily usage for the entire season = 11,086 / 243 = 45.6 lbs.
"Rule of 2.5X" ballpark estimate of coal to be burned on the coldest day of the year: 2.5 x 45.6 = 114 lbs (vs. my actual of 110 lbs)
Rule of 3.25X estimate of "home heat loss":
Step 1) 3.25 x 45.6 avg lbs. burned = 148.2 lbs.
Step 2) 148.2 lbs. x 12,150 BTUs/lb. x 0.80 (my appliance's efficiency rating) / 24 (hours) = 60,021 BTUH
And finally, to match home heat loss to the required heating appliance, divide the home's computed "heat loss" by the efficiency of the appliance.
60,000 BTUH (heat loss) / 0.80 (for 80% efficiency) = 75,000 BTUH of input required for my home's ideal heating appliance, if such were rated at 80% efficiency
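The whole worked example can be strung together end to end (a sketch using the numbers from the post; note the results drift slightly from the figures above because the post rounds the daily average to 45.6 along the way):

```python
season_lbs = 11086       # coal burned October 1st through May 31st
season_days = 243
btu_per_lb = 12150
efficiency = 0.80        # appliance efficiency rating

avg_daily = season_lbs / season_days                  # ~45.6 lbs/day
coldest_day = 2.5 * avg_daily                         # ~114 lbs (actual was 110)
heat_loss = 3.25 * avg_daily * btu_per_lb * efficiency / 24   # ~60,000 BTUH
appliance_input = heat_loss / efficiency              # ~75,000 BTUH input rating
print(round(avg_daily, 1), round(coldest_day), round(heat_loss), round(appliance_input))
```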
Last edited by lsayre on Sat Jul 12, 2014 8:50 am, edited 4 times in total.