Problem: It’s hard to know how device learning is progressing, even though it’s the most important up-front process for each homeowner. Device learning is the gateway to any initial value from the Sense product, yet none of the apps offer a way to observe and assess whether, and how well, it is progressing.
Device learning is a long-term, non-linear process: new devices are learned, and previously learned devices can disappear and reappear over time. Homeowners need to see the progress (and regression) of device learning over time to judge the effectiveness and value of the Sense product.
For example, a homeowner might want to understand how the Learned, Always On, and Unknown percentages of recorded daily energy use have changed over time. Using data manually transcribed from the Sense app, it’s possible to create charts that make the device learning progress for a home visible. Here is an example showing daily actuals for all metrics from my home:
And here is an example of the same metrics as 7-day rolling averages to reduce the noise and smooth the volatility:
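For anyone who wants to reproduce this kind of smoothing from their own transcribed data, here is a minimal sketch in pandas. The column names and sample values are hypothetical placeholders, not my actual readings:

```python
import pandas as pd

# Hypothetical layout: one row per day with the three daily percentages
# transcribed from the Sense app (values below are made up).
df = pd.DataFrame({
    "date": pd.date_range("2017-01-01", periods=10, freq="D"),
    "learned_pct":   [22, 25, 19, 24, 28, 21, 23, 26, 20, 24],
    "always_on_pct": [30, 29, 31, 30, 28, 32, 30, 29, 31, 30],
    "unknown_pct":   [48, 46, 50, 46, 44, 47, 47, 45, 49, 46],
}).set_index("date")

# A 7-day rolling mean smooths the day-to-day volatility in each metric.
# min_periods=1 lets the first week show a partial-window average.
smoothed = df.rolling(window=7, min_periods=1).mean()
print(smoothed.round(1).tail(3))
```

Plotting `smoothed` instead of `df` gives the 7-day-average view shown above.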
A device learning trend would be useful to understand the metrics better and set expectations realistically of how quickly device learning may proceed for each home:
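One simple way to quantify such a trend is a least-squares line fit over the daily Learned percentage. The sketch below uses numpy on a synthetic series (the numbers are made up; a real input would be the transcribed daily values):

```python
import numpy as np

# Synthetic daily "learned %" series: a 0.2 pct-point/day drift plus noise.
days = np.arange(30)
rng = np.random.default_rng(0)
learned_pct = 15 + 0.2 * days + rng.normal(0, 2, 30)

# Least-squares linear trend: the slope is percentage points gained per day.
slope, intercept = np.polyfit(days, learned_pct, 1)
print(f"trend: {slope:+.2f} pct-points/day")
```

A positive slope suggests learning is progressing; a flat or negative slope over weeks would be the kind of regression signal worth surfacing to homeowners.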
I like the way you’ve presented your data and the things you were looking at. I think the Sense team needs quite a bit of data like this to make general improvements. Each piece of information is helpful, but what they really need are trends like this so they can sort out the anomalies (perhaps like my energy misreadings) from the general errors or hurdles (like the voltage repair).
It may be worth recruiting some new owners to help with a project like this, although it may be easier for the Sense team to look at it all retrospectively.
And to prove the value of doing such reports, I just discovered that the metrics I’d previously captured manually have been revised downwards. Here are the differences between the two data capture dates:
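Spotting such revisions is just a per-day difference between two captures of the same history. A minimal sketch, with hypothetical dates and values standing in for my actual transcriptions:

```python
import pandas as pd

# Hypothetical: the same daily "learned %" history, transcribed on two
# different capture dates (values are illustrative, not my real data).
first_capture = pd.Series({"2016-12-01": 24.0, "2016-12-02": 26.5},
                          name="learned_pct")
second_capture = pd.Series({"2016-12-01": 21.0, "2016-12-02": 24.0},
                           name="learned_pct")

# The per-day difference exposes retroactive revisions to recorded history;
# negative values mean the metric was revised downward.
revision = second_capture - first_capture
print(revision)
```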
And the revised overall chart based on the updated data:
Ideally, it would be great to be able to download the data as well as have a canned report so that the community can understand how different device learning metrics can be. I’m being asked to share my Excel file, but I think a solution from Sense is a more scalable outcome.
Could the correction be due to the error identified around improper voltage readings? I remember seeing that things were off by some corrective factor. Was this data collected before or after the patch (and could the patch have retroactively affected the data)?
No, I believe the voltage reading problem was a separate issue and I observed increases in the number of kWh per day as a result:
I hypothesize that changes in the learned devices (some disappearing, some being conflated, etc.) account for the metric changes shown above. The charts I’m showing are the percentages of overall wattage that were Unknown, Learned, and Always On, so a change in total power usage raises all boats equally.
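The “raises all boats equally” point can be checked with toy numbers: scaling every bucket’s wattage by the same corrective factor leaves the percentage split unchanged. The wattages below are made up for illustration:

```python
# Toy daily wattage attributed to each bucket (not my actual data).
learned, always_on, unknown = 300.0, 450.0, 750.0

def percentages(l, a, u):
    """Return each bucket's share of total wattage, in percent."""
    t = l + a + u
    return (100 * l / t, 100 * a / t, 100 * u / t)

before = percentages(learned, always_on, unknown)

# A voltage correction that multiplies every reading by the same factor...
factor = 1.12
after = percentages(learned * factor, always_on * factor, unknown * factor)

# ...leaves the percentage split untouched.
print(before)
print(after)
```

So a uniform kWh correction can’t explain the percentage changes; only a reshuffling between buckets can.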
It seems there has been another revision of my learning metrics, and not in a happy way. In mid-January 2017 the learned device load dropped to less than 10%:
Since then, learned device load has rebounded into the low 20s. Here are the 7-day moving averages for each of the three “unknown” reporting dates:
From the latest reporting date, unknown device load is down to around 60%.
For what it’s worth, one can now chart Other over the course of a year, hopefully showing a reduction thanks to improved recognition. In my case there is a seasonal component to Other: I think Sense misses the off signature of my AC during long runs in the summer, which causes Other to zero out (the AC is actually off, but Sense still sees it as on, which subtracts its wattage from Other).