When I look at the solar generation inside Sense, I temporarily feel pretty good: at times, and for good stretches, I’m producing more than I use, so a net surplus (in that very moment, at least). But when I scroll around the app, I see things like “35% powered by solar.” At first this didn’t make sense. Then I worked out: “oh, so the 100% is instantaneous, and the 35% is over the course of the day.” But when I look at the size of the orange and green vertical bars, or the solar collected through the day in the line graphs, it still doesn’t add up. To further complicate things, these screenshots were taken while we were on vacation, meaning minimal use from the house. How on Earth, with sunny days and a new solar array, was I not able to cover more of my electricity use when I wasn’t even there? Am I just having a hard time reading this? Or is it something else?
Powered by solar means literally that: what was powered directly by the solar you produced? If you used literally 0 kWh of power during the day and produced 100 kWh of solar, your powered-by-solar would be 0%.
And the net metering “banked” power you send back to the grid doesn’t count toward the “Powered by Solar” figure later in the day when the sun is shining less. The “Powered by Solar” gauge is most useful for people on a TOU (time-of-use) tariff, where they buy electricity at retail based on “from grid” and sell back to the utility based on “to grid”.
“Powered by solar” is the percentage of your home’s electricity supplied directly by your solar for that interval (day, week, month, or year), without using the electrical grid as a net-metering battery.
OK, that (I think) helps. We are sending back to the grid what we don’t use (net metering). I guess it’d be good to see “this is what you used, this is how much was offset by the panels on your roof, and this is what you pulled from National Grid.” That’s what I thought this page/these graphs were supposed to say, and maybe they are, but even some of the $ amounts just don’t add up. Or it’s just not intuitive to me, anyway.
Thanks. Conceptually, I get it. I guess I just don’t believe it. I put my thermostats (3 zones) into away mode. We had a few LED lights cycling on and off to simulate being home. And the fridges were running, but with no one opening/closing them, I presume they were running less than usual. So seeing that we still used 11.3 kWh (in the screenshot above) is weird. I guess I thought my ‘standing’ use (non-peak, baseline, etc.) would be lower, and that my array would’ve covered more of it.
Not weird at all. Sense measures how much energy the devices in your home are using. It doesn’t care where that energy came from. If it showed that all your devices were using 0 kWh that wouldn’t be accurate because they are still using power.
Your screenshot shows that you produced 15.9 kWh and used 11.3 kWh of energy. If you have 1:1 net metering, just subtract the two numbers and you’ll know that you are at a surplus of 4.6 kWh. Below that, “to grid” says 12.0 kWh. This is the amount you sent back out to the grid because you were generating more than your house was using. “From grid,” 7.4 kWh, shows how much you pulled from the grid because the demand was higher than your panels were supplying. The difference is still 4.6 kWh. They’re two different ways of showing the same data.
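If it helps, here’s that arithmetic written out as a quick sketch, using the numbers from the screenshot discussed above (1:1 net metering assumed):

```python
# Two equivalent views of the same day, per the screenshot numbers.
produced = 15.9    # kWh generated by the solar array
used = 11.3        # kWh consumed by the home
to_grid = 12.0     # kWh exported when production exceeded demand
from_grid = 7.4    # kWh imported when demand exceeded production

surplus_from_totals = produced - used       # production minus consumption
surplus_from_flows = to_grid - from_grid    # exports minus imports

# Both views land on the same ~4.6 kWh surplus.
print(round(surplus_from_totals, 1), round(surplus_from_flows, 1))
```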
Not everyone has 1:1 net metering, so consuming your solar power directly can be more valuable than sending it to the grid. Maybe they are charged X for each kWh they pull from the grid but only credited X/2 for what they send back. They might have a goal of increasing the powered-by-solar %.
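A small sketch of that X vs. X/2 situation, with entirely made-up rates, just to show why direct self-consumption matters when net metering isn’t 1:1:

```python
# Hypothetical rates for illustration only (not anyone's actual tariff).
buy_rate = 0.20    # $ per kWh pulled from the grid
sell_rate = 0.10   # $ credited per kWh sent back (half the buy rate)

# Same 10 kWh of load, two scenarios:
# A: export 10 kWh at noon, then import 10 kWh in the evening.
cost_via_grid = 10 * buy_rate - 10 * sell_rate   # nets out to $1.00

# B: run that same load at noon, powered directly by the panels.
cost_direct = 0.0                                 # nothing bought or sold

print(cost_via_grid, cost_direct)
```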
BTW, the “powered by solar” calculation is simple:
- powered by solar = (usage - from grid) / usage
The “usage - from grid” term is the amount of used energy that was provided by sources other than the grid, i.e. the amount of used energy that was sourced by your solar. That term, divided by the total usage, is the “powered by solar” percentage. A better term would have been “powered by not grid”.
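That formula, applied to the screenshot’s numbers (11.3 kWh used, 7.4 kWh from grid), lands right around the 35% figure the app showed:

```python
def powered_by_solar(usage_kwh, from_grid_kwh):
    """Fraction of consumption served by something other than
    the grid, i.e. by your solar."""
    return (usage_kwh - from_grid_kwh) / usage_kwh

# Numbers from the screenshot discussed above.
fraction = powered_by_solar(11.3, 7.4)
print(round(fraction, 2))  # ~0.35, matching the app's "35% powered by solar"
```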
The grid (all the transmission lines, substations, local lines, etc.) requires a lot of investment to keep working. Most of your electricity charges go to support the maintenance of the grid, not the actual cost of electricity generation. The cost of generation is something like $0.03-$0.05 per kWh; the rest of your per-kWh charges are really paying for the grid (or for peaker generation in the evenings).

Those of us with solar and net metering get to use the grid as a free battery. The electric companies are not going to let us have a free battery at their expense forever (though some of us do have 20+ year grandfathered net metering plans); rather, they will change their rate plans over time to lessen the benefit of using their grid as a battery. So we should be looking into ways to change our consumption behavior to rely less on the grid battery. We should try to increase our “powered by solar” metric, e.g. shift when we run major appliances to when the sun is shining rather than in the evening.
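To put a number on that load-shifting idea: here’s a rough sketch using the screenshot’s figures, with a hypothetical 3 kWh of evening load (dishwasher, laundry) moved to midday, assuming the panels would fully cover it there:

```python
def powered_by_solar(usage_kwh, from_grid_kwh):
    return (usage_kwh - from_grid_kwh) / usage_kwh

usage = 11.3       # kWh daily usage, from the screenshot above
from_grid = 7.4    # kWh imported that day

# Hypothetical: 3 kWh of evening load shifted to midday, so it is
# served by the panels instead of imported from the grid.
shifted = 3.0
before = powered_by_solar(usage, from_grid)             # ~0.35
after = powered_by_solar(usage, from_grid - shifted)    # ~0.61

print(round(before, 2), round(after, 2))
```

Total usage stays the same; only the share served directly by solar changes.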
This might be an interesting read: https://syonyk.blogspot.com/2019/11/what-replaces-net-metering.html