Question

I am using this logic:

MYSQL Trendline calculation

to calculate the slope of a set of time-series data. But my data (current output measured over time) is such that the raw slope is not meaningful on its own, because it depends on the relative scale of the axes. For example, current expressed in milliamps produces a different slope than the same current expressed in amps over the same time period. What I need is to determine whether the trend over the specified time represents a certain percentage increase.

To summarize, I am using this sql query:

    $sql = "SELECT COUNT(*) AS N, SUM(UNIX_TIMESTAMP(timestamp)) AS Sum_X,
        SUM(UNIX_TIMESTAMP(timestamp) * UNIX_TIMESTAMP(timestamp)) AS Sum_X2,
        SUM(max_current) AS Sum_Y,
        SUM(max_current*max_current) AS Sum_Y2,
        SUM(UNIX_TIMESTAMP(timestamp) * max_current) AS Sum_XY
        FROM circuit_history
        WHERE circuit_filename = '".$cfn."'
        AND timestamp > date_sub(now(), interval 60 day)";

And my slope is calculated thusly:

    $slope = ($row['N'] * $row['Sum_XY'] - $row['Sum_X'] * $row['Sum_Y'])
           / ($row['N'] * $row['Sum_X2'] - $row['Sum_X'] * $row['Sum_X']);

But I am getting numbers like 5.9808374081288E-10 because of the scale of the Unix timestamps.

What is the best way to get from this to a percentage increase or decrease? Or more specifically, how do I get the Y values from the endpoints of the trendline?


Solution

  1. Want to change your scale from amps/time to milliamps/time? Multiply by 1000.
  2. Trouble with the Unix timestamps? That's because your time axis reaches back to Jan 1, 1970. Pick a timestamp as your 'zero' time and subtract it from all of the others; the sketch after this list puts both ideas together and carries them through to the percentage change you asked about.
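
A minimal sketch of how that can look in practice, assuming a mysqli connection in $db (built with mysqlnd, so get_result() is available) and the same $cfn variable as in the original query. The prepared statement replaces the original string interpolation; the aggregates follow the original query, except that the unused Sum_Y2 is dropped and MIN/MAX of the shifted time are added so the trendline can be evaluated at the endpoints:

    <?php
    // Shift the X axis: use the start of the 60-day window as the 'zero'
    // time, so x runs from about 0 instead of ~1.7 billion seconds.
    $sql = "SELECT COUNT(*) AS N,
                   SUM(x) AS Sum_X,
                   SUM(x * x) AS Sum_X2,
                   SUM(max_current) AS Sum_Y,
                   SUM(x * max_current) AS Sum_XY,
                   MIN(x) AS Min_X,
                   MAX(x) AS Max_X
            FROM (SELECT max_current,
                         UNIX_TIMESTAMP(timestamp)
                           - UNIX_TIMESTAMP(date_sub(now(), INTERVAL 60 DAY)) AS x
                  FROM circuit_history
                  WHERE circuit_filename = ?
                    AND timestamp > date_sub(now(), INTERVAL 60 DAY)) AS t";

    $stmt = $db->prepare($sql);
    $stmt->bind_param('s', $cfn);
    $stmt->execute();
    $row = $stmt->get_result()->fetch_assoc();

    // Least-squares fit in the shifted coordinates: same slope formula as
    // before, plus the intercept, which is needed to evaluate the line.
    $denom     = $row['N'] * $row['Sum_X2'] - $row['Sum_X'] * $row['Sum_X'];
    $slope     = ($row['N'] * $row['Sum_XY'] - $row['Sum_X'] * $row['Sum_Y']) / $denom;
    $intercept = ($row['Sum_Y'] - $slope * $row['Sum_X']) / $row['N'];

    // Y values of the trendline at its endpoints (first and last samples).
    $y_start = $intercept + $slope * $row['Min_X'];
    $y_end   = $intercept + $slope * $row['Max_X'];

    // Percentage change over the window. Because this is a ratio of two Y
    // values, the unit of max_current cancels: amps and milliamps give the
    // same answer.
    $pct_change = ($y_end - $y_start) / $y_start * 100;

In real use you would also want to guard against fewer than two samples ($denom of 0) and a zero baseline ($y_start of 0) before dividing.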
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow