I looked around the forum a bit and could not find an answer to this question. In essence, the Sense is an Analog-to-Digital Converter that collects data on either 2 or 4 channels (main CTs + 2 Dedicated/Solar) and then uploads that data to the cloud via Wi-Fi.
So I did some simple calculations, with the following inputs/assumptions:
- Sampling rate: 4 million samples/second (4 MHz)
- Data resolution (bits per sample of the Analog-to-Digital Converter): 14 bits
- Number of channels: 2 vs. 4 (main CTs only, or with DCMs/Solar)
The results are in the table below. Does this mean that, in the best-case scenario (using only the 2 main CTs), the Sense consumes 112 Mbps of my 200 Mbps upload bandwidth? That sounds extreme.

| Channels | Sampling Rate | Resolution | Raw Bandwidth |
|---|---|---|---|
| 2 (main CTs only) | 4 MHz | 14 bits | 112 Mbps |
| 4 (main CTs + DCMs/Solar) | 4 MHz | 14 bits | 224 Mbps |
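For the record, here is the arithmetic behind those numbers, a quick sketch using only the assumptions listed above (the sample rate and bit depth are my assumptions, not confirmed specs):

```python
# Raw upload bandwidth estimate, assuming every sample is sent
# to the cloud uncompressed.
SAMPLE_RATE_HZ = 4_000_000   # 4 million samples/second (assumed)
BITS_PER_SAMPLE = 14         # ADC resolution (assumed)

def raw_mbps(channels: int) -> float:
    """Raw bit rate in Mbps for the given number of CT channels."""
    return SAMPLE_RATE_HZ * BITS_PER_SAMPLE * channels / 1e6

print(raw_mbps(2))  # 112.0 Mbps -- main CTs only
print(raw_mbps(4))  # 224.0 Mbps -- main CTs + 2 DCM/Solar CTs
```

Of course, this assumes no compression or on-device processing at all, which is the worst-case reading of the spec.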
However, I might be reading the specs wrong. When Sense advertises that they collect 4 million samples per second, do they mean 4M samples or 4M bits per second? If it's the latter, then the upload bandwidth is fixed at 4 Mbps (megabits per second), and the data from the (2 or 4) CTs is simply multiplexed. This would also mean that (possibly):
- The data for the 2 main CTs is sampled at a higher rate (and is therefore more precise) than when 2 additional CTs are used for DCMs … or …
- All 4 CT channels are always sampled at the same fixed rate (1 MHz), and, when no extra CTs are connected, 50% of the data is either discarded before being uploaded, or uploaded anyway and ignored by the cloud (a waste of bandwidth).
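To make the first of those two possibilities concrete, here is a quick sketch of the per-channel rate under the "multiplexed" reading, where a fixed total of 4 million samples/second is shared across whichever channels are active (again, this is my speculation, not a confirmed spec):

```python
# Per-channel sample rate if a fixed 4 Msample/s budget is
# multiplexed across the active CT channels.
TOTAL_SAMPLES_PER_S = 4_000_000  # assumed fixed budget

def per_channel_hz(channels: int) -> float:
    """Samples/second available to each channel."""
    return TOTAL_SAMPLES_PER_S / channels

print(per_channel_hz(2))  # 2,000,000 samples/s each -- main CTs only
print(per_channel_hz(4))  # 1,000,000 samples/s each -- all four CTs
```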
I got some basic information from this teardown blog post: http://whatnicklife.blogspot.com/2017/12/sense-energy-monitor-teardown-sampling.html