Question

I'm appealing to your brighter minds on this one!

This is my first foray into pattern recognition/trend analysis; basically, I want to be able to differentiate between waveforms (essentially line graphs).

For instance, the original graph would represent "normal"; a second graph, almost identical but with higher values over the last 30%, would be termed the "bad graph".
What I would like to achieve is for the program to learn what "normal" and "bad" look like, and then classify any subsequent waveform against the two. In this example, if a third waveform were exactly like the original but its final 30% were raised, though not as much as in the "bad graph", it would be recognised as trending towards the "bad graph".

I'm thinking of maybe slicing the graph and then comparing the slices or generating rules, but I have no idea how to implement this.
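To make the "slicing" idea concrete, here is a minimal sketch, assuming all waveforms are sampled to the same length and that the only difference of interest lives in the last 30% of the samples. The template arrays, the 0.7 split point, and the `TrendScore` helper are illustrative placeholders, not anything from the original post.

```csharp
using System;
using System.Linq;

class SliceComparison
{
    // Mean of the last 30% of the samples, where the example difference lives.
    static double TailMean(double[] wave)
    {
        int start = (int)(wave.Length * 0.7);
        return wave.Skip(start).Average();
    }

    // Returns a value between 0 (matches "normal") and 1 (matches "bad"),
    // so an intermediate waveform shows up as "trending towards bad".
    static double TrendScore(double[] sample, double[] normal, double[] bad)
    {
        double n = TailMean(normal), b = TailMean(bad), s = TailMean(sample);
        if (Math.Abs(b - n) < 1e-9) return 0.0;
        double t = (s - n) / (b - n);
        return Math.Max(0.0, Math.Min(1.0, t));
    }

    static void Main()
    {
        // Flat "normal" template, a "bad" template raised over the last 30%,
        // and a test waveform raised by half as much.
        double[] normal = Enumerable.Repeat(1.0, 100).ToArray();
        double[] bad    = normal.Select((v, i) => i >= 70 ? v + 2.0 : v).ToArray();
        double[] test   = normal.Select((v, i) => i >= 70 ? v + 1.0 : v).ToArray();

        Console.WriteLine($"Trend score: {TrendScore(test, normal, bad):F2}"); // ~0.50
    }
}
```

This only captures the specific "raised tail" case described above; a real solution would use a feature or distance measure that reflects whatever differences actually matter.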

What techniques are available to point me in the right direction? Code examples, if possible, would be a bonus.


Solution

I'm not an expert in this area, but could you consider using a Fourier transform to make a qualitative comparison of the waveforms?

There is an SO article on Fast Fourier Transformation using C# and various libraries here:-
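As a rough illustration of what a frequency-domain comparison might look like, here is a minimal, self-contained sketch. It uses a naive O(n²) DFT instead of a library FFT so it has no dependencies; in practice you would swap in a proper FFT routine from one of those libraries. The signals, the 30% split, and the distance-to-template classification are all illustrative assumptions, not the definitive approach.

```csharp
using System;

class SpectrumComparison
{
    // Magnitude spectrum of a real-valued signal (naive DFT, first half only).
    static double[] MagnitudeSpectrum(double[] x)
    {
        int n = x.Length;
        var mags = new double[n / 2 + 1];
        for (int k = 0; k <= n / 2; k++)
        {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++)
            {
                double angle = 2 * Math.PI * k * t / n;
                re += x[t] * Math.Cos(angle);
                im -= x[t] * Math.Sin(angle);
            }
            mags[k] = Math.Sqrt(re * re + im * im);
        }
        return mags;
    }

    // Euclidean distance between two spectra of equal length.
    static double Distance(double[] a, double[] b)
    {
        double sum = 0;
        for (int i = 0; i < a.Length; i++) sum += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.Sqrt(sum);
    }

    static void Main()
    {
        int n = 128;
        var normal = new double[n];
        var bad    = new double[n];
        var test   = new double[n];
        for (int i = 0; i < n; i++)
        {
            double baseVal = Math.Sin(2 * Math.PI * i / 32.0);
            normal[i] = baseVal;
            bad[i]    = baseVal + (i >= (int)(0.7 * n) ? 2.0 : 0.0); // raised tail
            test[i]   = baseVal + (i >= (int)(0.7 * n) ? 1.0 : 0.0); // halfway there
        }

        double toNormal = Distance(MagnitudeSpectrum(test), MagnitudeSpectrum(normal));
        double toBad    = Distance(MagnitudeSpectrum(test), MagnitudeSpectrum(bad));
        Console.WriteLine($"distance to normal = {toNormal:F2}, to bad = {toBad:F2}");
        Console.WriteLine(toNormal < toBad ? "closer to normal" : "closer to bad");
    }
}
```

Comparing distances to both templates also gives you the "trending towards bad" behaviour asked for: as the test waveform's tail rises, its spectral distance to the "bad" template shrinks relative to its distance to "normal".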

Licensed under: CC-BY-SA with attribution