Discussion: Why can't I train my Sense?

Totally agree with Mike. As a retired electronics technician who still owns plenty of test equipment capable of viewing all these different power signatures, I'm convinced there has to be a better way of identifying devices.

My Sense has been installed for over a week now. What has it detected? One lousy device: my fridge. Give me a break!

I can clearly see the signature of every device turning on and off in the Sense app. I can see when my condensing boiler turns on, when the fan ramps up before ignition, when the igniter fires and turns off, and when the fan ramps back down. Sense is too stupid to identify it. Condensing boilers are common today, and their current waveforms are unmistakable, unlike any other device in your home.

It has not identified any LED bulbs, not even a toaster or a television; nothing but the fridge. I have plenty of devices in my home (413 watts of always-on load), and Sense has not identified any of them and most likely never will.

I believe you would make more customers happy (the technical ones, at least) by giving them the ability to identify and label their own devices.

Another gripe: my net-metering year ends at the end of this month, and I see no way to zero out my consumption and solar production totals. This was the main reason I purchased Sense: to track solar generation versus usage so I do not go over my net credit.

I truly hope that you consider these gripes; many forum users have the same ones.

1 Like

@Louie, @mike_gessner,
Since you are electronics types, I'm going to challenge you to show me that you can tag a specific device's waveform from your home for training. But first, I'll impose an additional constraint that the Sense folks have to deal with: because they wanted the bubble to come on immediately when a device fires up and to disappear right when it turns off, they have to identify signatures within less than a second of a device coming on or going off. More info here:

Please show me a 1-second-long snapshot (at most; 1/2 second is even better) of a device in your home that you would use to train Sense on that device. You can't, for two reasons:

  1. Time resolution - The Power Meter only shows things from a 2 Hz (1/2-second resolution) perspective. You never see the actual AC waveforms. Bottom line: the world you are doing your visual "identifications" in is very different from the world Sense is actually seeing (a 1-microsecond-resolution view of the actual AC waveforms, with a less-than-a-second identification window). See the sketch after this list.

  2. Even if you could peer into that 1-microsecond, 1-second-long world, I think you would be stymied by the range and huge volume of spikes and ridges. You might be able to look at the macro waveforms you are showing me first, then go back to the micro world and "box" the 1 second of an edge that matters, but that would be highly tedious…
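As a concrete illustration of that resolution gap, here is a minimal toy sketch (my own example, assuming the roughly 1 MHz sampling and 2 Hz display rates described above; this is not Sense's actual pipeline) of how a million microsecond-level samples per second collapse into the two Power Meter points per second that we get to see:

```python
# Toy example: one second of simulated 60 Hz mains voltage/current,
# sampled at an assumed 1 MHz, reduced to the 2 points/second the
# Power Meter displays. Not Sense's actual code.
import numpy as np

fs = 1_000_000                                   # assumed sample rate (1 us resolution)
t = np.arange(fs) / fs                           # one second of timestamps
v = 170 * np.sin(2 * np.pi * 60 * t)             # ~120 V RMS mains voltage
i = 10 * np.sin(2 * np.pi * 60 * t - 0.3)        # current lagging by 0.3 rad

p_instant = v * i                                # 1,000,000 instantaneous power samples
p_meter = p_instant.reshape(2, fs // 2).mean(axis=1)  # average real power per 1/2 s

print(p_meter)  # just 2 numbers survive out of a million samples
```

Every edge you eyeball in the app lives in those 2 points per second; the spikes and ridges Sense actually classifies on live in the million.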

BTW - I'm not trying to pooh-pooh your desire to help. I really would like to be able to assist with identification as well, but I understand these limits, and my own. Plus, Sense has expanded recognition/identification to include post-waveform identification (where the identification is done 30 seconds to several minutes into the waveform), so there might be a place for human training there as that capability expands. Until then, using smart plugs on 120 V devices that don't seem to be discoverable is the best way for you to help with training.

1 Like


This is probably not what you're asking to see, but it will give you a specific example.

My home has a 1,000 amp-hour commercial battery bank that powers all my ham radio equipment and many 12-volt items around the home. One of the items it powers is a 440 MHz ham radio repeater. This repeater has to transmit every ten minutes to identify itself, per FCC regulations.

The 1,000 amp-hour battery bank is kept charged with a Statpower battery charger (a switching-type power supply).

The current plot shown by the Sense app is the charger ramping up to keep up with the current drawn from the batteries while the ham radio repeater transmits its ID. After the repeater stops transmitting, you can see the current taper off as the battery voltage comes back up to the float voltage.

Since I doubt anyone else in the world has this exact battery bank, charger, and ham radio repeater, my Sense will be the only one sending this power signature. As such, Sense will never be able to "crowdsource" an identification for my particular situation, because no one else has it.

1 Like

Thanks for sharing…

So given the current less-than-one-second primary paradigm, you would have to train Sense by identifying these two little boxed features.

But you captured this snapshot with everything else pretty quiet in the house. You would also have to show instances of the same exact region when there is a lot of additional activity going on in the house.

We have zero reason to lock out this feature. If it were both practical to use and feasible to implement, it would be in the app. This has been discussed ad nauseam in this forum; I recommend searching for those previous threads to see all of the prior discussion. While we'll continue to assess the feasibility of such a training/learning mode, ultimately, this is our response for the time being: Why can't you train Sense?

Since we have no issue with users discussing this, I'll leave this thread open as the primary discussion thread. I have moved it to Technical Questions and renamed it accordingly.

2 Likes

That is awesome, seeing how it tracks your solar as well. I would love to have solar on my house, but unfortunately I live in Virginia, and the HOA in my neighborhood forbids solar on houses if it can be seen from any of the neighbors' houses or the street.

1 Like

Just an aside… would your HOA allow solar roof tiles as a roofing replacement? Expensive, but designed for exactly these aesthetic situations…

Ryan,

Thanks for leaving this open. I just posted this question again to get people talking about the idea here. For those who only want to say this isn't possible, or to complain about me asking the question, my response is that it isn't possible yet, and I would ask them to simply look up the word "forum" in the dictionary. I just want to discuss this with others, and maybe that will trigger the a-ha moment for someone and make this feature available. By the way, I love my new Sense monitor; I think it's a great little piece of test equipment and worth every penny I spent on it, so thank you for this device.

1 Like

I'm pretty sure they would allow those, but they're way out of my price range. I would also prefer traditional panels that sit off the roof: my house isn't shaded, and panels would create shade on the roof and in turn keep my attic cooler in the summer. It is what it is; hopefully one day I'll have a house with solar.

1 Like

Absolutely. I totally support that. Just know that we've thought very deeply about this (and even had a training mode in some prior app versions). It's been researched and tested extensively by our engineering and data science teams. If we felt it was doable, we'd do it. That, of course, could change in the future, especially as we get more data. Energy disaggregation, in the broad way that we're doing it, is still pretty new. Facial recognition, as @kevin1 noted above, had a bit of a head start and a lot more resources thrown at it. The good news is, we're making steady progress and Sense will only get better.

1 Like

@kevin1

I accept your challenge. First off, I am saying that if they implemented a training mode in the Sense app, you could switch it on to tell Sense to start watching for devices turning on and off. Once an event occurred, Sense would ask you what just turned on and give you the option to label it (just as you label a device when Sense finds one), including the option to mark the label as a guess. Once you've labeled it, you could test it by turning the device on and off several times to make sure you labeled it correctly. When you're done with training mode, the Sense app could send this information back for crowdsourcing, to be added to the database they are already using.
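For what it's worth, here is roughly the flow I am picturing, as a hypothetical sketch (every name here is invented for illustration; none of this is the Sense API):

```python
# Hypothetical training-mode flow; all names are invented for illustration.
from dataclasses import dataclass
import time

@dataclass
class LabeledEvent:
    timestamp: float      # when the user confirmed the label
    delta_watts: float    # size of the detected on/off step
    label: str            # e.g. "aquarium heater"
    is_guess: bool        # user flagged the label as a guess

def training_session(monitor):
    """While training mode is on, ask the user to label each detected event."""
    labels = []
    while monitor.training_mode_enabled():
        event = monitor.wait_for_next_event()         # a detected on/off edge
        label, is_guess = monitor.prompt_user(event)  # "What just turned on/off?"
        labels.append(LabeledEvent(time.time(), event.delta_watts, label, is_guess))
    monitor.upload_for_crowdsourcing(labels)          # feed labels back for training
```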

Sense is already taking snapshots of my power usage and has a record of my devices turning on and off. All I am asking is that we, the end users, be given the opportunity to help speed up the device-discovery process through a learning or teaching feature.

My best guess as to why we end users aren't given the option to train Sense is that the data collected by Sense is sent to a data-collection server and analyzed there. Once the data is analyzed, they send you a notification that a new device has been detected, and that's when you confirm what it discovered or select a different device from the list. We end users are shown only a very basic representation (a much slower sample rate) of the power usage in real time in the web, phone, and tablet apps.

I'd also like to say that after three days my Sense has detected five devices, two of which I have identified. The first one I identified was my daughter's aquarium heater, and I just figured out the second: the iron my wife uses in her craft room, which I originally thought was her tea kettle. I have "Heat 3," "Dryer," and "Stove" on my list, and I have no idea what these are yet. I have turned on the stove again to see if it will trigger Sense to say it's on, but it doesn't, so that one must be something else. My wife has been sewing and watching TV in her craft room over the past few hours, and the dishwasher has been running for a while (it gets used daily), but neither has been detected yet.

@mike_gessner,

You're right with your description. A highly compressed and specially processed version of the microsecond data is sent back to the Sense mothership for analysis and training. Think about it: they get voltage and current feeds for both legs of electricity in your house, plus additional processed info like the phase angle between the voltage and current on each leg, in the full splendor of 60 Hz AC. We only get to see the finished product: a view of just the total RMS power, updated every half second, without all the complexity of the raw data.
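As one illustration of the kind of derived feature that could be computed from those raw feeds, here is a rough sketch of estimating the voltage-current phase angle on one leg (my own assumption about one plausible computation; Sense's actual processing is not public):

```python
# Estimate the angle by which current lags voltage at the mains frequency,
# by correlating each signal against a 60 Hz complex phasor. Illustrative only.
import numpy as np

def phase_angle(v: np.ndarray, i: np.ndarray, fs: float, f0: float = 60.0) -> float:
    """Angle (radians) by which current lags voltage; needs whole 60 Hz cycles."""
    t = np.arange(len(v)) / fs
    ref = np.exp(-2j * np.pi * f0 * t)     # reference phasor at f0
    return np.angle(np.dot(v, ref)) - np.angle(np.dot(i, ref))
```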

You should really dig into the type of analysis Sense is doing on their end using machine learning. If you did, you would learn that when Sense does recognition on roughly a half-second window into the data, the training (you indicating "the garbage disposal just turned on," and similarly "the garbage disposal just turned off") has to happen within about the same amount of time to be helpful. Plus, the signatures associated with flipping a breaker are different from the in-use signatures of many devices (think refrigerators or HVAC thermostats), so you can't just use the breaker as part of training for those. The training that does work for these short recognition windows comes from consistent machine-to-machine feedback from smart plugs, and possibly from things like thermostats in the future.
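In other words (my assumption of how the bookkeeping would have to work, not actual Sense code), a human label would only be usable when it lands in roughly the same half-second window as the detected edge:

```python
# A label only helps train the short-window recognizer if it falls within
# about the same half-second as a detected on/off edge. Illustrative only.
def usable_labels(labels, edge_times, window_s=0.5):
    """Keep (label_time, name) pairs that land within window_s of some edge."""
    return [(t, name) for (t, name) in labels
            if any(abs(t - e) <= window_s for e in edge_times)]
```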

And please remember that there are two phases to this process (a sketch follows the list):

  1. Identifying the patterns as some kind of device
  2. Classifying the device as an XYZ…

The toughest one, by far, to train is the first.
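A minimal skeleton of those two stages might look like the following (illustrative only, with an assumed pre-trained classifier; not Sense's implementation):

```python
# Stage 1: find candidate on/off events; Stage 2: classify each event.
import numpy as np

def detect_events(power: np.ndarray, threshold_w: float = 50.0) -> np.ndarray:
    """Stage 1: indices where total power jumps by more than threshold_w."""
    return np.flatnonzero(np.abs(np.diff(power)) > threshold_w)

def classify_event(features, model) -> str:
    """Stage 2: hand the event's feature vector to an assumed trained model."""
    return model.predict([features])[0]   # e.g. "fridge", "disposal", ...
```

Stage 1 has to fire reliably amid everything else running in the house; stage 2 only ever sees what stage 1 catches, which is why the first is the hard one to train.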

I would also like to challenge everyone who thinks they can identify their own devices, mainly because I'd like to be proven wrong.

We're merely saying that translating what the human brain sees into code is not as plain as my personality thinks it should be. I'm not sure how to explain the complexity of what is being done, as much as it frustrates me at times.

Here's the whole picture vs. up close, same device…

[Image: Pump Start]

1 Like

I'd say the human brain is fully capable of doing what Sense does, just not at the same speed. After all, Sense has only "learned" what it was programmed to learn. @Grandpa2390 explained it very well in a post.

I understand the desire to have the on/off bubbles show up within half a second of the device switching, but I think it would also be nice to see whether Sense can use the "macro" view of the waveform in their AI implementation to help with the energy disaggregation retroactively.

I guess I see two uses for Sense: the first being the immediate picture of what is currently on/off in the home, and the second being a historical understanding of the power usage of the devices in the home. I think the second problem might be easier for ML to solve, and I don't believe you need the immediate view to make decisions about how to save energy.
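As a toy sketch of what that second, retroactive use could look like (my own example, not anything Sense has said they do): with a whole day's data in hand, you could pair each on-edge with a later off-edge of similar magnitude to recover a device's run intervals after the fact.

```python
# Offline (retroactive) pairing of on/off edges of similar size to recover
# run intervals. Toy example only; real disaggregation is far messier.
def pair_edges(on_edges, off_edges, tolerance_w=25.0):
    """Greedily match each (time, watts) on-edge to the next similar off-edge."""
    runs = []
    remaining = list(off_edges)
    for t_on, w_on in on_edges:
        for k, (t_off, w_off) in enumerate(remaining):
            if t_off > t_on and abs(w_on - w_off) <= tolerance_w:
                runs.append((t_on, t_off, w_on))   # one inferred run interval
                remaining.pop(k)
                break
    return runs
```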

I'd like to hear some thoughts on this topic from users as well as @RyanAtSense. It may be that Sense just doesn't have the resources to develop a parallel ML pipeline to enable this kind of view.

Somewhere on this forum, there are indications from Sense that they are branching out from the immediate half-second bubble recognitions, mostly for EVs initially, since EVs have an extended power ramp for a reason. I think they refer to these as post-identification updates, as opposed to immediate updates.

It depends on the device, and on your house. In my house, I can identify my own devices; there are never more than two or three running at a time, and they all follow the same unique patterns that the human brain is powerful enough to readily recognize.

It's no fault of Sense that it can't do what the human brain can do. Our brains have been a work in progress for millions of years (or were designed by God, whatever you believe), and have been training 24 hours a day for decades in our cases.
The technology that powers Sense, on the other hand, has been a work in progress for a few decades and has been training 24 hours a day for only a year or two. Even if it were as capable as a human, it would still be only a two-year-old baby (and I'm just being figurative here).

Resolution makes a difference as well. Just because something looks unintelligible up close or far away doesn't mean you have to look at it from those distances. I don't know if that's the point you are trying to make. To make sense of an elephant, we stand back 15 feet; to make sense of a virus, we zoom in very, very close with an electron microscope. 🙂
But the main point is that I know I can identify the appliances in my own house. My house has a very small vocabulary, and I've been looking at it for a year now. @kevin1's house, on the other hand… I don't think I could do the same. But who knows.

@samwooly1 I agree with you, but I think it is Sense that can't keep up with the brain, unless I am not following the conversation correctly and y'all are talking about something different.

@serovner86 That would be interesting. Eventually I'm going to have to figure out how to export all the data in such a way that I can do what you described after the fact. Because even when Sense detects devices, it doesn't always capture all of the power the device used. It detects that my fridge turned on a second or two after it actually does, and that's OK. What disturbs me is when it not only detects a device a second or two late, but also starts recording the power used a second or two (or longer, in the case of some of my devices) after the device starts using power.

I don't know. I'd be afraid they'd all break during hurricane season. And solar panels are bad for the environment when it comes to disposing of them. They're also, as you said, very expensive; the only reason they are even halfway affordable is government subsidies. I can't afford them.

But these do solve one issue I have with solar panels, even if they were affordable: they look ugly. My roof is oriented in such a way that panels would have to go on the front of my house, so it's a no-go for me. Estimates say they would cost me more money than they would provide in energy down here in South Louisiana, mostly because of the shape and orientation of my roof. If my house had been built on the other side of the street, lol.

Otherwise, I wish I could get solar panels. I would be selling power back to the electric company all the time; they'd be cutting me a check every year.
I agree with Bill Gates: we need a "Manhattan Project" for a renewable, affordable energy source. A Manhattan Project for solar panels that produce more energy for less money, for example.

1 Like

Your 2-second problem may not be a real problem. As I suggested, Sense takes a half second to a second to detect the on (or off) signature, plus there could be another second or so of interaction with the Sense mothership before the bubble pops up. That doesn't mean Sense delayed the start of recording device usage. The real measure would be to look at the Device Power Meter to see what Sense has actually captured.