Question

I have a C# program that asks the user for four values:

MinIndex, MaxIndex, MinValue, MaxValue

I want to be able to determine the value for any given index within the [MinIndex, MaxIndex] range. The range of indexes will not always be the same, so I need to find it out first and use that value somehow.

As an example, say:

MinIndex = 250, MaxIndex = 750, so the index range is 500;
MinValue = 0.025, MaxValue = 0.254, so the value range is 0.229.

If I do valueRange / indexRange I get 0.000458.

This number enables me to take any index, say 267, multiply it by 0.000458, and get the value for that index.

However, this only works for indexes from 0 to 500. How can I use my original indexes, say [250-750], and have a single value that I can multiply to get the value for that index, i.e. [298 * ?]

The calculation is linear, and because I know the value of the max index and the value of the min index, I know there is a way to work out the rest.

Sorry if this is a stupid question, but maths is not one of my strong points.

Thank you in advance.


Solution

The formula is:

(Value - MinValue) / (MaxValue - MinValue) = (Index - MinIndex) / (MaxIndex - MinIndex)

Solving for Value:

Value = (MaxValue - MinValue) * ((Index - MinIndex) / (MaxIndex - MinIndex)) + MinValue
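A minimal C# sketch of this formula (the method name and program structure are illustrative, not from the original question):

```csharp
using System;

class LinearInterpolation
{
    // Map an index in [minIndex, maxIndex] linearly to a value in [minValue, maxValue].
    static double ValueForIndex(double index, double minIndex, double maxIndex,
                                double minValue, double maxValue)
    {
        // Fraction of the way from minIndex to maxIndex (0.0 at minIndex, 1.0 at maxIndex).
        double ratio = (index - minIndex) / (maxIndex - minIndex);

        // Scale the ratio into the value range and shift by minValue.
        return minValue + ratio * (maxValue - minValue);
    }

    static void Main()
    {
        // The question's example: indexes 250-750 map to values 0.025-0.254.
        Console.WriteLine(ValueForIndex(250, 250, 750, 0.025, 0.254)); // minIndex maps to minValue
        Console.WriteLine(ValueForIndex(750, 250, 750, 0.025, 0.254)); // maxIndex maps to maxValue
        Console.WriteLine(ValueForIndex(500, 250, 750, 0.025, 0.254)); // midpoint, ≈ 0.1395
    }
}
```

Note that the index is shifted by MinIndex before scaling, which is what the single multiplier 0.000458 on its own cannot do.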

Other tips

Subtract MinIndex from the Index and divide by the index range to get a value between 0 and 1. This represents how far you are from MinIndex on the way to MaxIndex.

Multiply the value range by this ratio and add MinValue to get the result.

This is known as linear interpolation.
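Worked through with the question's own numbers and the example index 298:

    ratio = (298 - 250) / (750 - 250) = 48 / 500 = 0.096
    value = 0.025 + 0.096 * (0.254 - 0.025) = 0.025 + 0.021984 = 0.046984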

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow