This is not a solution, just a more mathematical description of what you are trying to achieve (without judging whether it is the right thing to do):
Since you are rounding all the numbers to x decimals, we can treat them as integers (just multiply each one by 10^x).
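For example, a toy illustration of that scaling step in Python (the values and x = 2 are made up):

```python
# Turn numbers rounded to x decimals into plain integers.
x = 2
values = [1.23, 4.56, 7.80]
# round() absorbs floating-point fuzz such as 7.80 * 100 == 779.999...
scaled = [round(v * 10**x) for v in values]
print(scaled)  # [123, 456, 780]
```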
Now, you are trying to solve the following problem:
Given the matrix
A11+Adj11 A12+Adj12 ... A1n+Adj1n
A21+Adj21 A22+Adj22 ... A2n+Adj2n
A31+Adj31 A32+Adj32 ... A3n+Adj3n
... ... ... ...
Am1+Adjm1 Am2+Adjm2 ... Amn+Adjmn
Where A11..Amn are constant integers,
Find integers Adj11..Adjmn
Minimizing sum(abs(Adjij))
(or, if you prefer: minimizing sum((Adjij)^2))
Subject to:
- for each row i (1 <= i <= m): Adji1 + Adji2 + ... + Adjin = -(Ai1 + Ai2 + ... + Ain)
- for each column j (1 <= j <= n): Adj1j + Adj2j + ... + Adjmj = -(A1j + A2j + ... + Amj)
This is an integer programming problem with m*n variables and m+n constraints. The objective function is not linear; the absolute-value form can, however, be linearized with one auxiliary variable per cell, which turns the whole thing into a standard integer linear program.
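In case it helps, here is a minimal sketch of that linearization in Python with the PuLP library (my choice of solver is an assumption; any ILP solver would do, and the small matrix A is made-up example data standing in for the constants above). Each auxiliary variable t_ij is constrained to be at least Adj_ij and at least -Adj_ij, so minimizing their sum drives each t_ij down to |Adj_ij|:

```python
import pulp

# Made-up example data playing the role of the constant integers A11..Amn.
A = [[1, -2],
     [0,  1]]
m, n = len(A), len(A[0])

prob = pulp.LpProblem("matrix_balancing", pulp.LpMinimize)

# Adj_ij: the (unbounded) integer adjustments we are solving for.
adj = [[pulp.LpVariable(f"adj_{i}_{j}", cat="Integer") for j in range(n)]
       for i in range(m)]
# t_ij: auxiliary variables that equal |Adj_ij| at the optimum.
t = [[pulp.LpVariable(f"t_{i}_{j}", lowBound=0, cat="Integer") for j in range(n)]
     for i in range(m)]

# Objective: minimize sum(abs(Adj_ij)), expressed through the t_ij.
prob += pulp.lpSum(t[i][j] for i in range(m) for j in range(n))

# t_ij >= Adj_ij and t_ij >= -Adj_ij together mean t_ij >= |Adj_ij|.
for i in range(m):
    for j in range(n):
        prob += t[i][j] >= adj[i][j]
        prob += t[i][j] >= -adj[i][j]

# Row constraints: each adjusted row must sum to zero.
for i in range(m):
    prob += pulp.lpSum(adj[i][j] for j in range(n)) == -sum(A[i])

# Column constraints: each adjusted column must sum to zero.
for j in range(n):
    prob += pulp.lpSum(adj[i][j] for i in range(m)) == -sum(A[i][j] for i in range(m))

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[prob.status])
print([[int(pulp.value(adj[i][j])) for j in range(n)] for i in range(m)])
```

Note that the squared objective in the alternative formulation stays quadratic, so it would need an MIQP-capable solver instead.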
I'm afraid this problem is far from trivial; you may have better luck posting it on https://math.stackexchange.com/.