Integration for Home Automation

Yeah, ideally, like others I want to integrate this into PVOutput.

We’re not opposed at all to an open API. We really want to provide that, but it’s actually a significant amount of development work to continually support one and we just don’t have those resources at this time. We feel those resources are better dedicated to improving the device detection experience (while primarily the realm of the data science team, detection still requires a lot of development work) and the Sense app itself. This includes data export, which should be coming down the pipe in the near future. I’m sorry that your contractor passed Sense off as having that functionality. We’ve never advertised it as such. Hopefully between @kevin1’s help and the upcoming data export functionality, that will be enough to do what you need. If it still won’t and you’d like to return Sense, just reach out directly to me.

There are also some other API-specific threads in the Product Wishlist subforum that you can add your voice to.


Looks like @danderson will soon be pushing a version of code/API that works with PVOutput directly.

Will the data export be able to get retrospective data?

So, if a capability for downloading data is coming, that’s fine. I would like it to be able to do retrospective downloads, but if it can’t, it can’t.

Having a transparent mechanism that simply allows PVOutput to poll and obtain consumption data, as it does for inverters, would also be great, but not sufficient: I want the download.
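(For what it’s worth, the PVOutput side is simple. Here is a rough sketch in R of pushing a single consumption reading to PVOutput’s Add Status service with httr; get_sense_watts() is a placeholder for however one pulls the reading out of Sense, not a real call.)

# Sketch: push one consumption reading to PVOutput's Add Status service.
# v4 is power consumption in watts; v3 would be cumulative consumption energy in Wh.
library(httr)

push_to_pvoutput <- function(api_key, system_id, watts, when = Sys.time()) {
  POST(
    "https://pvoutput.org/service/r2/addstatus.jsp",
    add_headers("X-Pvoutput-Apikey" = api_key,
                "X-Pvoutput-SystemId" = system_id),
    body = list(
      d  = format(when, "%Y%m%d"),   # date
      t  = format(when, "%H:%M"),    # time
      v4 = watts                     # power consumption (W)
    ),
    encode = "form"
  )
}

# push_to_pvoutput("YOUR_PVOUTPUT_KEY", "YOUR_SYSTEM_ID", get_sense_watts())  # placeholder reader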

As far as a couple of the other comments go:

First, I don’t expect my solar installer to know the ins-and-outs of the energy monitor they offer. I had to nudge them, originally, to get API keys for the first SolarEdge inverter we had. They do electrical installations. They don’t do networking. It’s not their fault that the specs don’t state the limitations. They are great, but this part of the business isn’t their main thing.

Second, regarding detection and data science, I imagine it does. It’s hardly an easy thing to do, but, then, that’s the product you are offering. I’m not just being some kind of wise-ass. Indeed, I am a data scientist myself (Bayesian statistician, actually, specializing these days in time series, topic modeling, and Bayesian additive regression and classification trees). I don’t do energy, though, and I have no interest at all in competing with your products or your team. I do Internet data for my main gig and work on environmental and biological time series on the side (pro bono). I work principally with R, and occasionally with Python 3.

I am going to look at the consumption and generation series myself for these reasons.

Finally, I’m not versed in IFTTT but can’t someone just write one of those for Sense that dumps consumption into a Google sheet?

I don’t think that IFTTT would add any value here. If you want to sample your data every 5 seconds and write it to a Google sheet, all you should need is the code that @kevin1 posted above and the Google Sheets API.

I’m not an expert on Python myself, but definitely post on here if you run into any trouble that you can’t Google your way out of.
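If you’d rather stay in R like elsewhere in this thread, the same idea with googlesheets4 looks roughly like this; read_sense_power() is a placeholder for whatever the posted code exposes, and the sheet URL is yours:

# Sketch: poll a reading every 5 seconds and append it to a Google Sheet.
library(googlesheets4)

gs4_auth()                       # interactive Google authentication
sheet <- "https://docs.google.com/spreadsheets/d/YOUR_SHEET_ID"

repeat {
  watts <- read_sense_power()    # placeholder: current consumption in watts
  sheet_append(sheet, data.frame(time = Sys.time(), watts = watts))
  Sys.sleep(5)
}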


Yeah, but I’m not paying for something where I need to trouble my own desktop with the problem. I need my desktop for other things, and these include, occasionally, interrupting all processes running on it. I can’t dedicate it to just doing this service.

Again, there are other devices and services out there that will do this.

I’m going to wait for Sense’s (hopefully retrospectively capable) download mechanism. Shy of that, it’s pretty clear that this community does not know how to readily expose data as a JSON feed for public consumption, even if such a feed needs authentication via an API exchange. Heck, Sense should even be able to do public-key without much trouble. After all, why couldn’t they publish data like this to a central GitHub for ALL their customers, giving an encrypted handle assignment to each, so sources were anonymized? That way, everyone could have access to everyone’s data, but no one would know whose data was whose except for their own, since they would know their own encrypted handle.
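To make the handle idea concrete, a keyed hash (rather than full public-key machinery) would be enough: the key stays with Sense and each customer is only told their own handle. A quick sketch with the digest package; the key and account ID here are obviously made up:

# Sense holds the secret key; each home's data is published under its handle.
# A customer who knows their own handle can find their data but cannot attribute anyone else's.
library(digest)

handle_for <- function(secret_key, account_id) {
  hmac(secret_key, account_id, algo = "sha256")
}

handle_for("SENSE_SECRET_KEY", "account-12345")   # hex string used as that home's folder name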

Appreciate what you are trying to do. Fortunately my utility company and solar supplier already offer full interval data downloads, so I was able to do avoided cost calculations for my solar setup without any need for Sense data. Built a multiple rate plan comparison system in R with features similar to Energy Toolbase. Think it is important that people be able to model their aggregate energy costs with and without solar, vs. many possible rate plans.
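Just as an illustration of the kind of comparison I mean (a toy sketch, not the actual system; the rates and peak window are made up), flat rate vs. a simple two-period TOU plan from hourly net-consumption data:

# Toy sketch of rate-plan comparison from hourly net kWh.
# Real plans add seasons, tiers, and demand charges.
compare_plans <- function(datetime, net_kwh) {
  hour <- as.integer(format(datetime, "%H"))
  flat_cost <- sum(net_kwh * 0.22)                           # flat $/kWh
  tou_rate  <- ifelse(hour >= 16 & hour < 21, 0.40, 0.18)    # peak 4-9 pm
  tou_cost  <- sum(net_kwh * tou_rate)
  c(flat = flat_cost, tou = tou_cost)
}

# df <- read.csv("your_hourly_export.csv")
# compare_plans(as.POSIXct(df$DateTime), df$Net)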

Agree that Sense should focus on data download access before an API. I suspect that they will be able to offer retrospective data because it’s available via the web app (and web scraping).

PS: why does every company do their time-interval CSVs just a little differently? Different intervals, different flavors of timestamps, different units (kWh vs. Wh), etc.
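A small normalization helper per vendor is about the only answer; a sketch of the idea (the column names, timestamp format, and Wh units here are assumptions you would adjust per file):

# Sketch: normalize a vendor interval CSV to hourly kWh with proper timestamps.
normalize_intervals <- function(path, ts_col = "Timestamp", val_col = "Wh",
                                ts_format = "%m/%d/%Y %H:%M") {
  raw  <- read.csv(path, stringsAsFactors = FALSE)
  ts   <- as.POSIXct(raw[[ts_col]], format = ts_format)
  kwh  <- raw[[val_col]] / 1000                            # Wh -> kWh
  hour <- as.POSIXct(trunc(ts, units = "hours"))
  aggregate(kwh, by = list(DateTime = hour), FUN = sum)    # roll 15/30-min rows up to hourly
}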

I could see a simple download interface with a few default ranges and then a custom option:

This week
Last week
This month
Last month
This quarter
Last quarter
This year
Last year
Custom: start date / end date

Pretty basic stuff; a quick sketch of the date-range math is below.
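# Sketch of the default ranges with lubridate; floor_date does most of the work.
library(lubridate)

range_for <- function(preset, today = Sys.Date()) {
  switch(preset,
    "this week"    = c(floor_date(today, "week"), today),
    "last week"    = c(floor_date(today, "week") - weeks(1), floor_date(today, "week") - days(1)),
    "this month"   = c(floor_date(today, "month"), today),
    "last month"   = c(floor_date(today, "month") - months(1), floor_date(today, "month") - days(1)),
    "this quarter" = c(floor_date(today, "quarter"), today),
    "last quarter" = c(floor_date(floor_date(today, "quarter") - days(1), "quarter"),
                       floor_date(today, "quarter") - days(1)),
    "this year"    = c(floor_date(today, "year"), today),
    "last year"    = c(floor_date(today, "year") - years(1), floor_date(today, "year") - days(1)),
    stop("use a custom start/end date")
  )
}

range_for("last month")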

X-posting from Download or API to feed PVOutput

Hey guys. It’s come to the engineering team’s attention that the unofficial APIs posted are slamming our internal API with authentication requests pretty much constantly. This is causing some performance degradation on our end. We have no issue with these APIs existing and being utilized by our userbase, but they should not be coded in such a way as to negatively affect performance for our teams and the rest of our users. We’ve contacted the owners of both Github repositories so they can make the necessary changes to their code.


Request throttling FTW!

Hi @kevin1
Can you share the latest version of your code that scrapes the web app for historical data? I have had my Sense for 6 months and want to use the historical data to help decide whether to add more solar PV.

@RyanAtSense, Twitter’s user auth uses a 15-minute window, and the tokens are monstrous in size.
Can the Sense engineers recommend how long a token should be held before re-authenticating?


If you are looking for my R/RSelenium code that scrapes the web app, here it is. A few notes:

  • As a scraper, it is still sensitive to the underlying web app’s response times and occasionally breaks. The most typical failure mode is when it accesses the changing carousel of bar graphs and sees an additional day of bars (4 days/96 hours instead of 3 days/72 hours). I haven’t put in a mechanism to trap and repair that; instead I have tried to extend the waits in the program to avoid it.
  • I have had success running it for 180 days of scraping. I haven’t fixed the DST issue, so I patched the missing hour on March 11th by hand.

To use:

  • Add your email and password where indicated
  • Set the NumDays to the number of days you want to go back in time. The code will stop short of scraping the last couple days of data to avoid going off the end of the carousel…

# Semi-automatic scraping of the Sense web app
# Run via Selenium Server
library(RSelenium)

rD <- rsDriver(port = 4445L)
remDr <- rD$client

# Log in to the web app
# (the URL was lost in the original post; this assumes the standard Sense web app address)
remDr$navigate("https://home.sense.com")
Sys.sleep(5)
webElem <- remDr$findElement(using = "xpath", '//*[@id="application__main"]/div/div[1]/form/div[1]/input')
webElem$sendKeysToElement(list("YOUR_EMAIL"))                     # <<< add your email here
webElem <- remDr$findElement(using = "xpath", '//*[@id="application__main"]/div/div[1]/form/div[2]/input')
webElem$sendKeysToElement(list("YOUR_PASSWORD", key = "enter"))   # <<< add your password here
Sys.sleep(10)

# Go to the Usage page for the current month
webElem <- remDr$findElement(using = 'xpath', value = '//*[@id="navbar"]/div/a[2]/div')
webElem$clickElement()
Sys.sleep(5)
# Navigate to Days and click
webElem <- remDr$findElement(using = 'xpath', value = '//*[@id="trends-viewer"]/div[1]/div/span[1]/label')
webElem$clickElement()
Sys.sleep(5)

# Backtrack to start of reporting period plus 1
NumDays <- 180
FirstDay <- Sys.Date() - (NumDays - 1)
# Navigate to the left arrow and click once per day
for (i in 1:(NumDays + 1)) {
  webElem <- remDr$findElement(using = 'xpath', value = '//*[@id="trends-viewer"]/div[2]/div[1]/div/i[1]')
  webElem$clickElement()
  Sys.sleep(1)
}

# Initialize data storage lists as string lists
date.list <- list()
energy.list <- list()
senergy.list <- list()

# Move forward sampling data
for (i in 1:((NumDays - 2) %/% 3)) {
  # Click forward 3 days
  webElem <- remDr$findElement(using = 'xpath', value = '//*[@id="trends-viewer"]/div[2]/div[1]/div/i[2]')
  webElem$clickElement()
  webElem <- remDr$findElement(using = 'xpath', value = '//*[@id="trends-viewer"]/div[2]/div[1]/div/i[2]')
  webElem$clickElement()
  webElem <- remDr$findElement(using = 'xpath', value = '//*[@id="trends-viewer"]/div[2]/div[1]/div/i[2]')
  webElem$clickElement()
  Sys.sleep(2)

  # Pull out the center date (the HTML patterns in the original gsub calls were
  # eaten when the code was posted; stripping all tags recovers the date text)
  webElem <- remDr$findElement(using = 'class', value = 'nav-arrows__title')
  centerdate <- gsub('<[^>]*>', '', webElem$getElementAttribute("outerHTML")[[1]])
  centerdate <- trimws(centerdate)
  print(centerdate)
  date.list <- c(date.list, rep(centerdate, 72))
  FirstDay <- as.Date(centerdate, format = "%a %b %d, %Y") - 1

  # Navigate back to Usage and click
  webElem <- remDr$findElement(using = 'xpath', value = '//*[@id="navbar"]/div/a[2]/div')
  webElem$clickElement()
  Sys.sleep(10.0)
  # Extract the kWh data from the Usage page (one bar per hour of the 3-day view)
  webElem <- remDr$findElements(using = "class", "bar-graph__bar")
  energy.list <- c(energy.list, sapply(webElem, function(x){x$getElementAttribute("outerHTML")}))

  # Navigate to Solar and click
  webElem <- remDr$findElement(using = 'xpath', value = '//*[@id="application__main"]/div/div[1]/div[2]')
  webElem$clickElement()
  Sys.sleep(1)
  # Extract the solar kWh data on that page
  webElem <- remDr$findElements(using = "class", 'bar-graph__bar')
  senergy.list <- c(senergy.list, sapply(webElem, function(x){x$getElementAttribute("outerHTML")}))
}

# Get rid of all the extra HTML gleet (smarter grep minds could probably do this in one line).
# The leading-HTML pattern was also lost in the original post; this assumes the value
# appears as ">1.23 kWh<" inside each bar element's markup.
energy.list <- gsub('kWh<.*', "", energy.list)     # drop everything from "kWh<" onward
energy.list <- gsub('.*>', "", energy.list)        # drop the leading HTML up to the value
senergy.list <- gsub('kWh<.*', "", senergy.list)
senergy.list <- gsub('.*>', "", senergy.list)

# Convert lists to a dataframe
energy.df <- data.frame(matrix(unlist(date.list), ncol = 1))
energy.df$Energy <- as.numeric(energy.list)
colnames(energy.df) <- c("CenterDate", "Energy")
# Add solar to the dataframe
energy.df$Solar <- as.numeric(senergy.list)
# Calculate hourly net energy
energy.df$Net <- energy.df$Energy - energy.df$Solar
# Synthesize datetimes for the hourly data
energy.df$DateTime <- seq(as.POSIXct(paste(FirstDay, "00:00:00")), by = "hour", length.out = length(energy.df$Solar))
energy.df <- energy.df[, c(5, 2, 3, 4, 1)]
write.csv(energy.df, paste("./", as.character(FirstDay), "Energy.csv", sep = ""))
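Once the run finishes, the CSV is easy to roll up; for example, hourly rows to daily totals (column names as written by the script above):

df <- read.csv(paste("./", as.character(FirstDay), "Energy.csv", sep = ""))
df$Day <- as.Date(df$DateTime)
daily <- aggregate(cbind(Energy, Solar, Net) ~ Day, data = df, FUN = sum)
head(daily)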

15 min to 30 min should be fine. It was happening constantly, which is not great!
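For anyone maintaining one of the unofficial wrappers, the fix is basically to cache the token and only re-authenticate when it ages out. A sketch, where sense_login() is a placeholder for whatever auth call the wrapper actually makes:

# Sketch: cache the auth token and re-authenticate at most every 20 minutes,
# instead of logging in on every request.
.token_cache <- new.env()

get_token <- function(max_age_secs = 20 * 60) {
  now <- Sys.time()
  if (is.null(.token_cache$token) ||
      as.numeric(now - .token_cache$fetched, units = "secs") > max_age_secs) {
    .token_cache$token   <- sense_login()   # placeholder auth call
    .token_cache$fetched <- now
  }
  .token_cache$token
}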

(I know there is already a request to get Nest integrated into Sense.)

This way Sense would get notified when the devices are turned on and off. I am sure it would improve the detection of these devices (it seems to work well with TVs).


Hey Jon

It would be helpful to specify which devices you’d be looking for integration with, and devote a specific Topic to each. That way, other Community members can “like” the Topic if they agree, giving us valuable insight into what our users would like to see.

Check out the sticky at the top of the Product Wishlist category: How to Submit a Feature Request

I meant generically.
It seems to work for TVs. Can’t they do the same for other devices?

Ah, I understand. Mostly — yes, but it’s a bit more complicated than that. The Philips Hue integration is working in a similar way and really all future integrations will be, but they’re quite difficult to implement alongside a machine learning framework. Lots of backend work has to happen to get these two models working alongside each other. The good news is that we’re pretty deep into work on the next one and have a bunch more lined up after that release.


I’m not having a whole lot of success with the Philips Hue integration. Sense and the Hue Hub lose their connection all the time.