Allow Export at Resolution Smaller than One Hour

I’m the one who originally posted this request and still want it badly. At the same time, I have come to the conclusion that it is structurally hard for Sense to provide the collected data to us in intervals that don’t fit their internal time-steps and aggregation intervals. Right now, those intervals are:

  • 1/2 second - update interval for the Power Meter
  • 1 hour - the fundamental aggregation interval for all the other charts beyond the Power Meter
  • Days, Months, Billing Periods - used in the Trends.

Thanks for bumping and adding your like to the original request… We still need more (hint to everyone reading this - like the initial request).


I was originally going to post this in response to another topic, but my thoughts went orthogonal, so here goes…

Not so easy.

A few thoughts:

  • Rendering the historical Power Meter from the metadata would become more difficult and necessarily slower. You need to ponder the lowest common denominator.
  • There would be (significant?) data storage overhead, which means cost, with no revenue to motivate it.
  • The tagging may help some humans some of the time, but it doesn’t necessarily help the mothership.
  • For the mothership to exploit it, the Device model or otherwise notable “tag model” would need some data uploaded to the mother (AWS) at a higher resolution than what we see. That may well be possible, and may even be happening selectively, but at scale, “for all tags”, it seems like a taxing process.

Here’s something to ponder in terms of data resolution vs cost:
1 s vs 1 hr aggregation translates to pretty much 3,600 times the data. My monitor “Sense” has 20 smart plugs, 3 Hues, and 38 found devices, and is at the sensible limit of its processing capacity due to smart plug load. “Sense2” is not linked to any smart plugs and has not detected any devices because it’s currently dedicated to my hot water tank circuit.

So the question is, doing the math: could Sense cope, technically and financially, with 29 MBps, or even 9 MBps, if there is no revenue increase?
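For concreteness, here’s a rough sketch of that scaling math. Every constant below (bytes per reading, streams per home, fleet size) is an illustrative assumption, not a Sense figure; the only thing that follows directly from the interval change is the 3,600× ratio.

```python
SECONDS_PER_HOUR = 3600

# All figures below are illustrative assumptions, not Sense's actual numbers.
BYTES_PER_SAMPLE = 16     # assumed size of one reading (timestamp + watts + id)
STREAMS_PER_HOME = 60     # assumed: mains plus ~58 device/plug streams
HOMES = 100_000           # assumed fleet size

def fleet_rate(interval_s: float) -> float:
    """Aggregate upload rate in bytes/s for the whole fleet at a given interval."""
    return (1 / interval_s) * BYTES_PER_SAMPLE * STREAMS_PER_HOME * HOMES

rate_1s = fleet_rate(1)
rate_1h = fleet_rate(SECONDS_PER_HOUR)
print(f"1 s  : {rate_1s / 1e6:,.1f} MB/s fleet-wide")
print(f"1 hr : {rate_1h / 1e6:,.4f} MB/s fleet-wide")
print(f"ratio: {rate_1s / rate_1h:,.0f}x")   # exactly 3,600x, whatever the constants
```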


Thanks for jogging my memory on something @ixu. I need to reframe the data points, since we’re talking about long-term history. Our current data is available to us at 1/2 second Power Meter resolution, but if you look at the Power Meter back in time, it jumps to 1 min resolution. So Sense has some kind of storage hierarchy for what’s kept around for customer viewing, probably to help manage data storage and communications costs with AWS. I would guess that they don’t throw the “original” data away, since they do backfills when new models are introduced. But we can’t assume that all the high-res power data is visible to customers all of the time, due to economic and technical constraints.
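To make that hierarchy concrete, here’s a small sketch of a tiered retention lookup. The cutoffs and intervals are guesses inferred from what the app shows (1/2 s recently, 1 min further back, 1 hr beyond that), not documented Sense internals.

```python
from datetime import datetime, timedelta, timezone

# Guessed retention tiers: the further back you look, the coarser the stored
# resolution. Cutoffs are assumptions, not documented Sense internals.
RETENTION_TIERS = [
    (timedelta(hours=1), 0.5),     # most recent window: 1/2 s samples
    (timedelta(days=7),  60.0),    # recent history: 1 min rollups (assumed cutoff)
    (timedelta.max,      3600.0),  # deep history: 1 hr aggregates
]

def stored_resolution(ts: datetime) -> float:
    """Return the sample interval (seconds) still stored for a given timestamp."""
    age = datetime.now(timezone.utc) - ts
    for max_age, interval_s in RETENTION_TIERS:
        if age <= max_age:
            return interval_s
    return RETENTION_TIERS[-1][1]

now = datetime.now(timezone.utc)
print(stored_resolution(now - timedelta(minutes=5)))   # 0.5
print(stored_resolution(now - timedelta(days=30)))     # 3600.0
```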

“Historic data visible at one minute resolution”

True. I was going to be explicit and post something like what you did above. Thanks for simplifying (compressing!) my thoughts.

I use Apple’s pedometer “Health Data” as my model for the way Sense likely confronts the data stream and the storage/UI costs. The pedometer streams “high frequency”, variable-bit-rate data to the iPhone, which aggregates it and uploads it to the cloud, where it is re-aggregated periodically. The process is inherently lossy but matched to the likely need for historical access.
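One plausible form of that lossy re-aggregation, sketched below: each downsampling pass replaces a block of readings with its mean, so short spikes get smeared. The 1 Hz to 1 min step is just an example, not Sense’s or Apple’s actual pipeline.

```python
from statistics import mean

def downsample(samples: list[float], factor: int) -> list[float]:
    """Lossy re-aggregation: replace each block of `factor` readings with its
    mean, the way a 1 Hz stream might become 1 min averages."""
    return [mean(samples[i:i + factor]) for i in range(0, len(samples), factor)]

one_hz = [100.0] * 45 + [1500.0] * 15   # a kettle switching on mid-minute
print(downsample(one_hz, 60))           # [450.0] -- the spike is smeared away
```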

What’s most intriguing to me is the possibility of using local (Device) pattern matching in the DSP to create a lossy compression of the higher resolution data.
We get hints of this when we see artificially flat plateaus in the rendering of historical data. The question is: “How confident is the plateau and how consistent is the underlying high-res signal matching?”

Put simply, is the ultimate compression something like:

  1. “Device X was on between time A and time B and used W watts with 99.99% confidence” or is it

  2. “Device X was on between time A and time B and matched device pattern P with 95% confidence”

#1 is currently what a smart plug can deliver and #2 is what Sense native detection strives to achieve. Both methods have advantages and disadvantages in terms of backend data aggregation and “compression”, and in their technical and financial costs.
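Here’s a sketch of what those two record shapes might look like, plus the back-of-envelope compression win from storing one event instead of a raw sample stream. All names, fields, and sizes are hypothetical, not Sense’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class PlugEvent:          # option 1: ground truth from a smart plug
    device: str
    t_on: float           # epoch seconds
    t_off: float
    watts: float
    confidence: float = 0.9999

@dataclass
class MatchEvent:         # option 2: native detection via pattern matching
    device: str
    t_on: float
    t_off: float
    pattern_id: str
    confidence: float

def compression_ratio(events: int, duration_s: float, raw_hz: float = 2.0,
                      bytes_per_sample: int = 16, bytes_per_event: int = 48) -> float:
    """Raw-stream size vs event-record size (all sizes are assumptions)."""
    return (duration_s * raw_hz * bytes_per_sample) / (events * bytes_per_event)

# A 3-hour dryer cycle stored as one event instead of 2 Hz raw samples:
print(f"{compression_ratio(events=1, duration_s=3 * 3600):,.0f}x")   # 7,200x
```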


I’m using the Sense Home Assistant Integration as a solution today. It samples Sense data once per minute.
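For anyone who wants the same data outside Home Assistant: the integration is built on the unofficial sense_energy Python package, so a minimal once-per-minute polling loop might look something like this (attribute names reflect that library as I understand it and may have changed).

```python
import time

from sense_energy import Senseable   # unofficial library behind the HA integration

sense = Senseable()
sense.authenticate("you@example.com", "your-password")   # placeholder credentials

# Poll once per minute, mirroring the integration's sample interval.
while True:
    sense.update_realtime()
    print(f"whole-home: {sense.active_power:.0f} W, "
          f"solar: {sense.active_solar_power:.0f} W")
    time.sleep(60)
```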