I am writing my program in Cocoa, but I think the solution should be fairly universal.

I have a set of points represented by 3D vectors. Each point has a weight assigned to it, in the range from 0 to 1; the sum of all weights isn't equal to 1.

How should the weighted mean point be calculated from such a set?

Either a programmatic or a purely mathematical solution would be helpful. Of course, if Cocoa has specific tools for solving this task, I would very much appreciate that information.

Solution

Simply sum all vectors, each scaled by its weight, and then divide by the sum of all weights. This has the same effect as first normalizing the weights so that they sum to 1.
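
Written out, for points p_1 … p_n with weights w_1 … w_n the weighted mean is

    mean = (w_1*p_1 + w_2*p_2 + ... + w_n*p_n) / (w_1 + w_2 + ... + w_n)

where each p_i is a 3D vector, the products scale each vector by its weight, and the final division by the scalar total is applied componentwise.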

Pseudo-code:

sum = [0, 0, 0]
totalWeights = 0
for each point p with associated weight w:
    sum += p * w
    totalWeights += w
mean = sum / totalWeights
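
Since you mentioned Cocoa: on Apple platforms the simd module that ships with the SDKs provides a convenient SIMD3<Double> vector type with componentwise arithmetic. Here is a minimal Swift sketch of the same idea; the function name weightedMean and the choice to return nil when the total weight is zero are my own, not anything Cocoa prescribes:

import simd

// Returns the weighted mean of `points`, or nil if the total weight is zero.
func weightedMean(points: [SIMD3<Double>], weights: [Double]) -> SIMD3<Double>? {
    precondition(points.count == weights.count, "each point needs a weight")
    var sum = SIMD3<Double>(repeating: 0)
    var totalWeight = 0.0
    for (p, w) in zip(points, weights) {
        sum += p * w          // scale each point by its weight and accumulate
        totalWeight += w
    }
    guard totalWeight > 0 else { return nil }  // avoid division by zero
    return sum / totalWeight                   // divide the accumulated vector by the total weight
}

// Example usage (hypothetical data):
// let mean = weightedMean(points: [SIMD3<Double>(1, 0, 0), SIMD3<Double>(0, 2, 0)],
//                         weights: [0.5, 0.25])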