
Work in progress: Restructuring of decoder module and filtering of heatpump values. #379

Closed
wants to merge 16 commits

Conversation

Samureimer
Contributor

Fixed a potential memory overflow in the ReadSerial function: https://github.com/Egyras/HeishaMon/blob/master/HeishaMon/HeishaMon.ino#L309
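
For illustration, a minimal sketch of the kind of bounds check I mean (names and buffer size here are hypothetical, not the actual HeishaMon code):

```cpp
#include <Arduino.h>

// Guard the write index against the buffer size before storing each byte,
// so a long or corrupted packet cannot write past the end of the buffer.
static const size_t RX_BUFFER_SIZE = 203;   // assumed maximum packet length
static uint8_t rxBuffer[RX_BUFFER_SIZE];
static size_t rxIndex = 0;

void readSerialSketch() {
  while (Serial.available() > 0) {
    int incoming = Serial.read();
    if (rxIndex >= RX_BUFFER_SIZE) {
      rxIndex = 0;  // buffer full: drop the packet instead of overflowing
      return;
    }
    rxBuffer[rxIndex++] = (uint8_t)incoming;
  }
}
```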

Changed the decoder module to hold structs that combine the decoding logic instead of multiple arrays.

The description type is now a struct that also holds a value indicating which kind of filter the described value should be treated with.
The filter is only applied to floating point values.
An averaging filter is updated for each value every time data is decoded from the heatpump.
Once the data has been published via MQTT the filters are reset.
The values shown on the website and retrieved by the rules are not subject to this filter; they are just the latest decoded values.
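
As a rough sketch of the idea (the names and fields are illustrative, not necessarily the exact structs in this PR):

```cpp
#include <stdint.h>

enum class FilterType { None, Average };

// Running average accumulated between MQTT publishes, reset after each publish.
struct RunningAverage {
  float sum = 0.0f;
  uint32_t count = 0;
  void add(float v) { sum += v; count++; }
  float value(float fallback) const { return count ? sum / count : fallback; }
  void reset() { sum = 0.0f; count = 0; }
};

// One descriptor bundles the topic name, the decoding logic and the filter kind,
// replacing the previous parallel arrays.
struct TopicDescriptor {
  const char* name;                      // MQTT topic / description
  float (*decode)(const uint8_t* data);  // decoding logic for this value
  FilterType filter;                     // only applied to floating point values
  RunningAverage average;                // updated on every decode
};
```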

Removed the retain flag as a default for all values. Maybe I just don't understand why one would want it for values that are updated so frequently?

Added a crash helper for saving and printing stack traces. This is currently published via MQTT but under a temporary topic.

Renamed variables to be easier for newcomers to understand. At least in my opinion.

Added VSCode files for easier and more streamlined development.

Removed dead code.

Set up a formatter that was available to me and configured it to roughly match the project's current style.
This point definitely needs more work and discussion. I have no strong feelings about the style or choice of formatter but I would like a shared setup.

A lot of work is still needed but I would like some feedback at this stage.
I haven't tested the rules after my changes as I don't use them in my setup.
Maybe someone would try them out for me?

@IgorYbema
Contributor

Hi,
Thanks for the multiple ideas and changes.

The reason for the retain flag isn't clear to me anymore but we had a specific reason for it when we implemented it. Let me try to remember.

Filtering the data in heishamon is outside our main goal, which is that heishamon should be a gateway only and not change any values (like averaging the floats). It is up to the upstream home automation tools to do that.

Can you explain the potential buffer overflow in the serial read? I don't understand from your change why it would help.

@IgorYbema
Contributor

Oh, I think I see the potential buffer overflow already. Thanks for that.

@Samureimer
Contributor Author

Hi @IgorYbema

I agree that the filtering should not be on by default, at the very least.
But I would like to add it as a setting and maybe give the data its own topic like "main_averaged".
This would make it possible for the "raw" and filtered values to co-exist.

There are two main reasons for this:

The first is that my connection is a bit shaky from time to time. I don't want to lose data, but would rather have the average from whenever the values were last transmitted successfully.

The second and most important to me is that I want to calculate different metrics based on the values stored in InfluxDb for my heat-pump. To get the most accurate metrics I would have to transmit and store the values every time there's an update. This would amount to a lot of data that I'm not interested in dealing with. I would rather collect the average values and transmit those every x minutes.

@IgorYbema
Contributor

Sending each value every time there's an update is exactly what heishamon does now and what people like to have. It is up to influxdb to store each value, and the magic in influx can average it out if needed. It is a design decision that heishamon is built on to not change the data in any way, and we need to stick to that, or else there are a lot of other changes which come to mind.

But you are of course free to make your own changes in the code for your own needs. And I will certainly look at the other changes you did.

@Samureimer
Contributor Author

Sending each value every time there's an update is exactly what heishamon does now

Omg, you are correct.
I had missed that the updateAllTime setting controls when to re-transmit all values, not just when to transmit.
My bad.

My PR currently only transmits values when the updateAllTime timer runs out.
I'll revert this change so that it transmits every time there is a change and when the timer runs out.

This still leaves me with a problem during connection losses.
If I just transmit the latest value when the connection is recovered my integrals in Influx will be incorrect.

How about a setting (off by default) that enables the averaging of values only while the connection is lost?
This way HeishaMon is still only a gateway, but with the option to "recover" lost data.
I don't see any other way of solving my problem, but if you have another idea I would appreciate the input.

@IgorYbema
Contributor

I'm not a hero at mathematical problems, but doesn't an average also destroy the integral? It at least loses the point in time at which a value change occurred, so the average should then at least take the time difference into account in its calculation.

Let me think about this. I haven't got a good solution right now. Except for just fixing your wifi :)

@Samureimer
Contributor Author

As long as the average filter is updated at fixed intervals (each time we decode a value) the integral is intact.
The only problem I see is if we have lost the MQTT connection and begin to drop serial packets as well :/
Then I need an average filter that takes the sampling interval/drift into account.
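
Something along these lines is what I have in mind for that case (a hypothetical sketch, not part of this PR): a time-weighted average where each sample is weighted by the time elapsed since the previous one, so irregular sampling no longer skews the result.

```cpp
#include <stdint.h>

struct TimeWeightedAverage {
  float weightedSum = 0.0f;
  uint32_t totalMs = 0;
  uint32_t lastMs = 0;
  bool hasSample = false;

  // value: decoded float, nowMs: current timestamp (e.g. from millis())
  void add(float value, uint32_t nowMs) {
    if (hasSample) {
      uint32_t dt = nowMs - lastMs;   // interval since the previous sample
      weightedSum += value * dt;
      totalMs += dt;
    }
    lastMs = nowMs;
    hasSample = true;
  }

  float value(float fallback) const {
    return totalMs ? weightedSum / totalMs : fallback;
  }

  void reset() { weightedSum = 0.0f; totalMs = 0; hasSample = false; }
};
```

Updating it with something like `average.add(decodedValue, millis())` on every decode would keep the result consistent even when some packets are dropped.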

@IgorYbema
Contributor

And then I need to ask: is that so important? It's nice to have history graphs that look fine, but for the heatpump only the realtime values are important, right?

@Samureimer
Contributor Author

Good question.
It's important for me as I plan to use it to validate the performance of my automation.
I want to calculate how much energy/heat is produced each heat-pump run cycle.
This is the value I want to maximize as I have a buffer tank with a capacity of X kWh.
Whatever the difference is between the actual production and X must be the heat that overflowed into my house.

I want to optimize my automation to dump as much energy into my house when the heat-pump is running as sensibly possible.
This should virtually extend the capacity of my buffer tank.

So I want data with as much resolution as possible when calculating these values :)
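
For context, the metric I am after is essentially the integral of thermal power over time per run cycle, roughly like this (a hypothetical sketch of the calculation, which I do in InfluxDB rather than in HeishaMon):

```cpp
#include <stdint.h>

// Heat per run cycle approximated by summing power * time-step
// for every sample received during the cycle.
struct CycleEnergy {
  double joules = 0.0;

  // powerW: decoded thermal power in watts, dtMs: time since the previous sample
  void addSample(double powerW, uint32_t dtMs) {
    joules += powerW * (dtMs / 1000.0);
  }

  double kWh() const { return joules / 3.6e6; }  // 1 kWh = 3.6 MJ
  void reset() { joules = 0.0; }
};
```

Dropped samples mean missing power-times-dt terms, which is why I would rather recover an average than just the latest value.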

@Egyras
Owner

Egyras commented Oct 16, 2023

I'm closing this pull, please raise a PR for discussion ;)

@Egyras closed this Oct 16, 2023