Question

I have two sets of large arrays (here I have shortened them):

x1 = [0, 2, 4, 6, 8, 10]

y1 = [0.2, 0.19, 0.22, 0.18, 0.22, 0.21]

x2 = [0, 0.2, 0.5, 1, 1.5, 2, 2.7, 3.5, 6, 10]

y2 = [5, 4.9, 4.9, 4.9, 5.2, 4, 4.6, 4, 4.3, 3.9] 

Three things can be noticed about these arrays:

  1. x1 and y1 have 6 values, while x2 and y2 have 10 (the exact counts do not matter; what matters is that x2 and y2 have many more values than x1 and y1).

  2. x1 is spaced evenly while x2 is not.

  3. The x arrays begin and end at the same values.

How do I create a new array with the extrapolated values of y1 at all the values of x2?

It is also important to notice that some values of y1 will not be used in the new array. In the example above, y1[4] is not needed, because its x-coordinate, x1[4] = 8, does not appear in x2.

Also notice that y2 has nothing to do with this particular problem; I provided it only for overall context.

I have tried the approach that Jblasco developed for a similar problem here: interpolate python array to minimize maximum difference between elements. However, my problem is somewhat different from the one posed in that link.


Solution

This is a straightforward interpolation problem: since x1 and x2 begin and end at the same values, you can evaluate the piecewise-linear interpolant of (x1, y1) at every point of x2 with np.interp.

import numpy as np

# Evaluate the piecewise-linear interpolant of (x1, y1) at each point of x2.
# Store the result under a new name so the original y2 array is not overwritten.
y1_at_x2 = np.interp(x2, x1, y1)
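A minimal self-contained sketch using the arrays from the question (the result name y1_at_x2 is my own choice, not from the original post):

```python
import numpy as np

x1 = [0, 2, 4, 6, 8, 10]
y1 = [0.2, 0.19, 0.22, 0.18, 0.22, 0.21]
x2 = [0, 0.2, 0.5, 1, 1.5, 2, 2.7, 3.5, 6, 10]

# np.interp(query_points, known_x, known_y) performs piecewise-linear
# interpolation; known_x must be increasing, which x1 is.
y1_at_x2 = np.interp(x2, x1, y1)

print(y1_at_x2)        # one interpolated value per element of x2
print(len(y1_at_x2))   # 10
```

One caveat: np.interp does not truly extrapolate; for query points outside [x1[0], x1[-1]] it simply returns the endpoint values of y1. Because x2 here starts and ends at the same values as x1, every query point is inside the range and this never matters.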
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow