Question

I've got a dsPIC33F sampling two ADC channels simultaneously at 10-bit resolution. A timer triggers sampling at 64 Hz, and the ADC is set to auto-sample, manual-convert. Each time the timer interrupt fires I clear the sample bit, and the DMA buffer is filled with my ADC data. Plotting this data shows it's giving the right values, but I've noticed it's very noisy!

[Image: plot of the sampled data, red trace showing intentional peaks with heavy noise throughout]

Ignore the green line. The red line correctly plots my ADC results (the peaks are intentional), but as you can see there's an awful lot of noise throughout.

Any ideas on what can be done to reduce this? When I plot simultaneously with a DAQ (using the same power source and with the grounds linked), the DAQ trace is much, much smoother, so I know the noise isn't inherent to the signal. Decoupling capacitors on the PIC, maybe? I'm using a breadboard and through-hole components, with the analogue sensor placed as close to the PIC pin as possible. I'm under the impression this is a hardware issue, but let me know if anything can be done on the software side of things.


Solution

This could be due to the source impedance driving the ADC, i.e. your analogue sensor. If that impedance is high, the ADC's sample-and-hold capacitor won't charge fully during the sampling window, which shows up as noise and offset. A buffer amplifier would drive the ADC better; an op-amp in unity-gain configuration should help. Another way to get some improvement is a small capacitor from the ADC input to ground, which forms an RC low-pass filter with the sensor's output impedance, but you would need to choose its value carefully (cutoff ≈ 1/(2πRC)) to avoid filtering those peaks too much.

Licensed under: CC-BY-SA with attribution