True. I was going to be explicit and post something like what you did above. Thanks for simplifying (compressing!) my thoughts.
I use Apple’s pedometer “Health Data” as my model for how Sense likely confronts the data stream and the storage/UI costs. The pedometer streams high-frequency, variable-bit-rate data to the iPhone, where it is aggregated before being uploaded to the cloud and then re-aggregated periodically. The process is inherently lossy but matched to the likely need for historical access.
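Something like the tiered averaging below is what I have in mind. This is purely illustrative; the window sizes and the choice of a simple mean are my guesses, not Apple’s or Sense’s actual scheme:

```python
from statistics import mean

def aggregate(samples, window):
    """Lossily compress a stream by averaging fixed-size windows.

    samples: list of (timestamp, value) pairs at the original resolution.
    window:  number of consecutive samples collapsed into one point.
    """
    out = []
    for i in range(0, len(samples) - len(samples) % window, window):
        chunk = samples[i:i + window]
        t0 = chunk[0][0]                       # keep the window's start time
        out.append((t0, mean(v for _, v in chunk)))
    return out

# Each tier re-aggregates the previous one, so older history survives
# only at progressively coarser resolution.
raw = [(t, 100.0 + (t % 7)) for t in range(3600)]  # fake 1 Hz stream, one hour
minute = aggregate(raw, 60)       # device -> phone: 1-minute averages
quarter = aggregate(minute, 15)   # phone -> cloud: 15-minute averages
print(len(raw), len(minute), len(quarter))          # 3600 -> 60 -> 4 points
```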
What’s most intriguing to me is the possibility of using local (on-Device) pattern matching in the DSP to create a lossy compression of the higher-resolution data.
We get hints of this when we see artificially flat plateaus in the rendering of historical data. The question is: “How confident is the plateau, and how consistent is the underlying high-res signal matching?”
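To make the plateau question concrete, here is one way on-device matching could collapse a whole high-res window into a single flat “pattern P matched” record. The signature, threshold, and confidence formula here are entirely my own invention, not anything Sense has documented:

```python
import math

def match_confidence(window, signature):
    """Crude similarity score: 1.0 for a perfect match, falling toward 0
    as the RMS error between the window and the stored signature grows."""
    rms = math.sqrt(sum((w - s) ** 2 for w, s in zip(window, signature))
                    / len(signature))
    scale = max(signature) - min(signature) or 1.0
    return max(0.0, 1.0 - rms / scale)

def compress(window, signatures, threshold=0.95):
    """Emit a compressed record if some stored pattern matches well enough,
    otherwise fall back to keeping the raw samples."""
    best_id, best_conf = None, 0.0
    for pid, sig in signatures.items():
        conf = match_confidence(window, sig)
        if conf > best_conf:
            best_id, best_conf = pid, conf
    if best_conf >= threshold:
        # The whole window collapses to one flat "plateau" record.
        return ("pattern", best_id, best_conf)
    return ("raw", window)

fridge = [120.0] * 8 + [95.0] * 8   # hypothetical stored device signature
window = [119.5, 120.3, 120.1, 119.8, 120.0, 120.2, 119.9, 120.1,
          95.2, 94.8, 95.1, 95.0, 94.9, 95.3, 95.0, 94.8]
print(compress(window, {"P-fridge": fridge}))  # ('pattern', 'P-fridge', 0.99...)
```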
Put simply, is the ultimate compression something like:

#1: “Device X was on between time A and time B and used W watts with 99.99% confidence”, or is it

#2: “Device X was on between time A and time B and matched device pattern P with 95% confidence”?
#1 is currently what a smart plug can deliver, and #2 is what Sense native detection strives to achieve. Both methods have advantages and disadvantages in terms of backend data aggregation and “compression”, and in their technical and financial costs.
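For what it’s worth, the two records differ mainly in what they let you reconstruct later. A sketch of the trade-off (field names are mine, not from any Sense schema):

```python
from dataclasses import dataclass

@dataclass
class MeteredEvent:
    """#1, smart-plug style: the measurement itself survives compression."""
    device: str
    start: float        # epoch seconds
    end: float
    watts: float        # directly measured average draw
    confidence: float   # effectively ~1.0 for a plug

@dataclass
class MatchedEvent:
    """#2, native-detection style: only a pointer to a pattern survives,
    so later reconstruction depends on the pattern library staying intact."""
    device: str
    start: float
    end: float
    pattern_id: str     # key into a (mutable!) library of device signatures
    confidence: float

plug = MeteredEvent("Fridge", 0, 600, 118.7, 0.9999)
native = MatchedEvent("Fridge", 0, 600, "P-1234", 0.95)
# #1 is self-contained but must store a number per event; #2 is smaller and
# dedupes across events, but its meaning can drift if the backend ever
# revises pattern P-1234.
```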