Perhaps the best solution would be to use a DEFLATE library and run it on large blocks of data with high compression settings.
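For example, with Python's built-in zlib module (which implements DEFLATE; the block size and compression level here are arbitrary illustrative choices, not anything the library mandates):

    import zlib

    def compress_blocks(raw, block_size=65536, level=9):
        """Compress raw sample bytes in large blocks at the highest DEFLATE level."""
        for i in range(0, len(raw), block_size):
            yield zlib.compress(raw[i:i + block_size], level)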
If you want to roll your own stream compression, you can apply the same approach that works for sound files: send the first measurement directly, then encode the difference between each sample and the previous one (delta encoding).
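A minimal sketch in Python, assuming integer samples (the function names are just illustrative):

    def delta_encode(samples):
        """First sample as-is, then each sample minus its predecessor."""
        out = [samples[0]]
        for prev, cur in zip(samples, samples[1:]):
            out.append(cur - prev)
        return out

    def delta_decode(deltas):
        """Running sum restores the original samples."""
        out = [deltas[0]]
        for d in deltas[1:]:
            out.append(out[-1] + d)
        return out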
Now, the best encoding differs based on how fast the data is changing:
If the data is changing fast enough, use an adaptive Huffman tree. If the differences are uncorrelated (data plus noise), this will get you to within one bit per sample of the entropy.
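A true adaptive coder (FGK or Vitter's algorithm) rebuilds the tree as symbols arrive, which is too long to show here; as a sketch of the underlying idea, this builds a static Huffman code from a histogram of the deltas:

    import heapq
    from collections import Counter

    def huffman_codes(deltas):
        """Static Huffman code from delta frequencies; an adaptive
        coder would update these codes incrementally instead."""
        heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(Counter(deltas).items())]
        heapq.heapify(heap)
        tick = len(heap)  # tie-breaker so the code dicts are never compared
        while len(heap) > 1:
            n1, _, c1 = heapq.heappop(heap)
            n2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in c1.items()}
            merged.update({s: "1" + c for s, c in c2.items()})
            heapq.heappush(heap, (n1 + n2, tick, merged))
            tick += 1
        return heap[0][2]  # {delta: bit string}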
If several consecutive samples are likely to be equal (the data is not changing very fast and there is no high-frequency noise), encode each non-zero difference with one Huffman tree and the length of each run of zeroes with a second Huffman tree. This costs at most two bits of overhead per run above the entropy (one per tree).
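A sketch of the splitting step (encoding the two resulting streams with their Huffman trees is omitted; the sentinel for a trailing run is my own choice):

    def split_runs(deltas):
        """Split a delta stream into (zero_run_length, nonzero_delta)
        pairs; each of the two streams gets its own Huffman tree."""
        pairs, zeroes = [], 0
        for d in deltas:
            if d == 0:
                zeroes += 1
            else:
                pairs.append((zeroes, d))
                zeroes = 0
        if zeroes:
            pairs.append((zeroes, None))  # trailing zeroes, no value
        return pairs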
You can even encode each non-zero difference as a single bit (up or down), but then you need to be able to encode zero-length runs, since two adjacent non-zero differences have no zeroes between them.
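The same split works for that; only the value stream shrinks to one bit (sketch, assuming every non-zero delta is ±1):

    def split_runs_one_bit(deltas):
        """Like split_runs above, but each non-zero delta becomes a
        direction bit. The run length is 0 whenever two non-zero
        deltas are adjacent, so zero-length runs must be encodable."""
        pairs, zeroes = [], 0
        for d in deltas:
            if d == 0:
                zeroes += 1
            else:
                pairs.append((zeroes, 1 if d > 0 else 0))
                zeroes = 0
        return pairs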
My suggestion: delta-encode once or twice to get uncorrelated entries, then DEFLATE using a library.
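Putting it together with zlib (a sketch: the double delta pass, the int packing, and level 9 are all illustrative choices, and the packing assumes the deltas fit in signed 32-bit ints):

    import struct
    import zlib

    def compress_stream(samples, passes=2, level=9):
        """Delta-encode `passes` times, then DEFLATE the packed result."""
        data = list(samples)
        for _ in range(passes):
            data = delta_encode(data)  # from the sketch above
        packed = struct.pack("<%di" % len(data), *data)
        return zlib.compress(packed, level)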