Question

I've designed a hash algorithm based on mathematical exponentiation: each number in the list is raised to the power of the number that follows it. So that the exponentiation can be chained indefinitely, the numbers are first normalized to the 0-1 range. To preserve ordering information that normalization would otherwise destroy, the mean of the list is computed before hashing and prepended to the list.

Also:

  • Zeroes are replaced by the mean, to prevent the chain from stabilizing around zero and to avoid zero results for lists with leading zeroes
  • Inputs are converted to absolute values to avoid producing complex numbers, since the mean is changed as well
  • All-zero lists are detected and return zero
  • The resulting value is a double in the 0-1 range
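A minimal sketch of the steps above, under stated assumptions: `toy_hash` is a hypothetical name, "normalize" is assumed to mean dividing by the largest absolute value, and the chained exponentiation is assumed to be a left-to-right fold.

```python
def toy_hash(xs):
    """Hypothetical reconstruction of the scheme described above."""
    if all(x == 0 for x in xs):                         # all-zero lists return zero
        return 0.0
    mean = sum(xs) / len(xs)
    ys = [mean] + [mean if x == 0 else x for x in xs]   # prepend mean; zeroes -> mean
    ys = [abs(y) for y in ys]                           # absolutes avoid complex results
    top = max(ys)
    ys = [y / top for y in ys]                          # normalize into the 0-1 range
    h = ys[0]
    for y in ys[1:]:                                    # raise each value to the next
        h = h ** y
    return h

print(toy_hash([3.0, 1.0, 4.0]))  # a double in the 0-1 range
```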

It's been tested on lists of floats of both large and small magnitude, and it showed zero collisions in a test over 446,000 words.

Does this mean it has zero collisions?


Solution

No. By the pigeonhole principle, reducing N doubles to a single double must produce collisions: the output has at most 64 bits, so it can take at most 2^64 distinct values, while the space of possible inputs is vastly larger. You simply don't have enough bits to represent all of the combinations.
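The 446,000-word test doesn't contradict this. By the birthday bound, even a perfectly uniform 64-bit hash would be expected to show no collisions at that sample size; you'd need on the order of 2^32 inputs before a collision becomes likely. A quick back-of-the-envelope check:

```python
# Expected number of colliding pairs for n uniform random samples drawn
# from d possible values: roughly n * (n - 1) / (2 * d).
n = 446_000
d = 2.0 ** 64  # at most 2^64 distinct double bit patterns
expected_pairs = n * (n - 1) / (2 * d)
print(expected_pairs)  # on the order of 1e-9: observing zero collisions proves nothing
```

So a collision-free run over 446,000 inputs is exactly what you'd see whether the hash is excellent or merely adequate; it cannot demonstrate that collisions are impossible.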

Licensed under: CC-BY-SA with attribution