Question

I'm fairly sure this question has been asked several times, but either I did not find the correct answer or I did not understand the solution. My current problem: I have a sensor that measures the time a motor has been running, and the sensor is reset after each reading. I'm not interested in the time the motor was running during the last five minutes; I want to know how long the motor has been running since the very beginning (or since the last reset). When storing the values in an RRD, different values are recorded depending on the data source type. With GAUGE, the value read every five minutes is 3000 (tenths of a second). With ABSOLUTE, the value is 10 every five minutes.

But what I would like to get is something like:

3000 after the first 5 minutes

6000 after the next 5 minutes (last value + 3000)

9000 after another 5 minutes (last value + 3000)

The accuracy of the older values (and slopes) is not so important, but the last value should reflect the time in seconds since the beginning as accurately as possible.
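In other words, the goal is a running sum of the per-interval readings rather than the raw readings themselves. A minimal sketch (illustrative values only):

```python
# Each 5-minute reading is 3000 (tenths of a second); the series that
# should end up in the RRD is the running total, not the raw reading.
readings = [3000, 3000, 3000]

totals = []
running = 0
for r in readings:
    running += r          # accumulate instead of overwriting
    totals.append(running)

print(totals)  # [3000, 6000, 9000]
```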

Is there a way to accomplish this?

Was it helpful?

Solution 2

I have now created a small SQLite database with one table containing a single column and a single row. Every time my cron job runs, I add the sensor's current reading to the value stored in that row, so the row always holds the accumulated total from my sensor. That total is then fed into the RRD. Any other (better) ideas?

OTHER TIPS

I don't know whether it fits your need, but the TREND/TRENDNAN CDEF function may be what you want; have a look here: TREND CDEF function

The way I'd tackle this (on Linux) is to write the value to a plain-text file and then use the value from that file for the RRDTool graph. Using SQLite (or any other SQL database) just to keep track of something like this seems unnecessarily heavy on the system.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow