Question

Let $f(x)=a_0+a_1x+a_2x^2+\dots+a_nx^n$, where each $a_i$ is a nonnegative integer.

Given $f(1)=p$ and $f(f(1))=q$, we have to find $a_0$, $a_1$, $a_2$, $a_3$, $\dots$, $a_n$ when such an $f(x)$ exists. Otherwise we have to report either that no such $f(x)$ exists or that the polynomial is ambiguous. For example, for $p=1$ and $q=2$ no such $f(x)$ exists, but for $p=1$ and $q=1$ both $f(x)=1$ and $f(x)=x^2$ are solutions.

I have to write a program to do that. What should be my procedure?


Solution

Since $f(1) = p$, the condition $f(f(1)) = q$ is the same as $f(p) = q$. The conditions $f(1) = p$ and $f(p) = q$ are therefore equivalent to the following two equations: $$ \begin{align} &\sum_{i=0}^n a_i = p, \\ &\sum_{i=0}^n a_i p^i = q. \end{align} $$ When $p < 0$ or $p$ is not an integer, there are no solutions. When $p = 0$, the first equation forces the polynomial to be identically $0$, and so $q$ must be $0$. When $p = 1$, the first equation forces $f$ to be a single monomial, so $q = f(f(1)) = f(1) = 1 = p$ and the solutions are the polynomials $1, x, x^2, \ldots$ (the answer is ambiguous). From now on, we assume that $p \geq 2$.
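For concreteness, here is a minimal Python sketch of this case analysis (the function name, return conventions, and coefficient-list representation are my own, not part of the answer):

```python
def trivial_cases(p, q):
    """Handle p < 0, p = 0 and p = 1; return ('none',), ('unique', coeffs),
    ('ambiguous',), or None when p >= 2 and the main procedure is needed."""
    if p < 0:
        return ('none',)          # a sum of nonnegative integers cannot be negative
    if p == 0:
        # f(1) = 0 forces every coefficient to be 0, hence q must be 0 as well
        return ('unique', [0]) if q == 0 else ('none',)
    if p == 1:
        # exactly one coefficient equals 1, so f(x) = x^k for some k; then q = 1
        return ('ambiguous',) if q == 1 else ('none',)
    return None  # p >= 2: fall through to the main procedure below
```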

Intuitively, the polynomial minimizing $\sum_i a_i$ under the second constraint above is obtained by writing $q$ in base $p$ notation: take the first $n$ base-$p$ digits of $q$ as the coefficients $b_0,\ldots,b_{n-1}$ and put the remaining value $\lfloor q/p^n \rfloor$ into $b_n$. Denote this solution by $b_0,\ldots,b_n$. If $\sum_i b_i > p$ then there is no solution. If $\sum_i b_i = p$ then we have found the unique solution. From now on, suppose that $\sum_i b_i < p$; this implies that the base $p$ expansion of $q$ has length at most $n+1$, so $b_0,\ldots,b_n$ is simply the base $p$ expansion of $q$.
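A small helper computing this minimal-sum representation might look like the following sketch (Python; the name `base_p_digits` is my own):

```python
def base_p_digits(q, p):
    """Base-p digits of q, least significant first (assumes p >= 2, q >= 0)."""
    digits = []
    while q > 0:
        q, d = divmod(q, p)
        digits.append(d)
    return digits or [0]
```

With `b = base_p_digits(q, p)` (i.e. taking $n$ large enough), the tests above become `sum(b) > p` (no solution) and `sum(b) == p` (the unique solution is `b`).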

The given equations imply more constraints. First of all, $q \geq p$. Second, since $p \equiv 1 \pmod{p-1}$, we must have $q \equiv p \pmod{p-1}$. Starting with the polynomial $g(x) = \sum_{i=0}^n b_i x^i$, we can increase $g(1)$ by applying the update $(b'_{i+1},b'_i) \gets (b_{i+1} - \alpha,b_i + p\alpha)$ (when $b_{i+1} \geq \alpha$); this leaves $g(p)$ unchanged and increases $g(1)$ by $\alpha(p-1)$. We can apply these updates systematically, moving the mass all the way down to the constant polynomial $q$. Since $\sum_i b_i < p \leq q$ and $q \equiv p \pmod{p-1}$, somewhere along the way we will find a (unique) polynomial satisfying both requirements above.
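As a concrete illustration: take $p=5$ and $q=25$. The base-$5$ expansion gives $g(x) = x^2$ with $g(1) = 1 < 5$; one update at $i=1$ with $\alpha=1$ turns it into $g(x) = 5x$, which still satisfies $g(5) = 25$ but now has $g(1) = 5 = p$.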

How do we perform this process algorithmically? We go over the indices from $b_n$ down to $b_1$ in order. At each index $i$, we subtract from $b_i$ the maximal amount $\alpha$ (adding $p\alpha$ to $b_{i-1}$) which keeps the invariant $g(1) \leq p$. After processing $b_1$, we should reach the desired polynomial. (We are basically looking at the base $p-1$ expansion of $p - g(1)$.)
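Putting the pieces together, here is one possible sketch of the whole procedure for $p \ge 2$ (Python, reusing the hypothetical `base_p_digits` helper above; a sketch of the idea, not the answer author's code):

```python
def solve(p, q):
    """Find the coefficients [a_0, a_1, ...] with sum p and value q at p,
    or return None if no such polynomial exists (assumes integers p >= 2, q >= 0)."""
    if q < p or (q - p) % (p - 1) != 0:
        return None                       # q >= p and q = p (mod p-1) are necessary
    b = base_p_digits(q, p)               # minimal coefficient-sum representation of q
    s = sum(b)
    if s > p:
        return None
    # Sweep from the top coefficient down, each time applying the update
    # (b[i], b[i-1]) <- (b[i] - alpha, b[i-1] + p*alpha) with the largest alpha
    # that keeps the coefficient sum at most p; g(p) is preserved throughout.
    for i in range(len(b) - 1, 0, -1):
        alpha = min(b[i], (p - s) // (p - 1))
        b[i] -= alpha
        b[i - 1] += p * alpha
        s += alpha * (p - 1)
    while len(b) > 1 and b[-1] == 0:      # drop a trailing zero left by the sweep
        b.pop()
    return b if s == p else None
```

For instance, `solve(3, 11)` returns `[2, 0, 1]`, i.e. $f(x) = x^2 + 2$, while `solve(2, 100)` returns `None` because the base-$2$ digit sum of $100$ exceeds $2$.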

OTHER TIPS

If $p=1$, then the constraints reduce to $f(1)=1$ and $q=f(f(1))=f(1)=1$, which is easily handled: solutions exist exactly when $q = 1$. Let's now assume $p \ge 2$. Suppose that $f(x) = \sum_k a_k x^k$ is a solution. Then $q = a_0 + a_1 p + a_2 p^2 + \ldots + a_n p^n$ where $n$ is the degree of $f$ ($a_n \ne 0$).

For any $k \le n$, $a_k \le \dfrac{q}{p^k}$. This gives a bound on each coefficient.

Since $p \gt 1$, this inequality also gives a bound for the degree of $f$: $1 \le a_n \le \dfrac{q}{p^n}$, so $n \le \dfrac{\log(q)}{\log(p)}$.
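As a quick sketch (integer arithmetic only; the function name is mine), this bound can be computed as:

```python
def max_degree(p, q):
    """Largest n with p**n <= q, an upper bound on deg f (assumes p >= 2, q >= 1)."""
    n, power = 0, 1
    while power * p <= q:
        power *= p
        n += 1
    return n
```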

This shows that there is only a finite number of candidate polynomials. You can then enumerate all possible solutions. First compute the maximum degree $n$ of $f$. Then loop over the possible degrees ($n$ to $0$). For each degree value $m$, loop over the possible values of $a_m$; for each value of $a_m$, loop over the possible values of $a_{m-1}$, and so on. Maintain the partial sums $S_k = a_m p^m + \ldots + a_k p^k$ and $R_k = a_m + \ldots + a_k$; if either of these sums exceeds $q$ or $p$ respectively, abort this branch of the tree. When you get to $k=0$, the constraints on $f$ are $S_1 + a_0 = q$ and $R_1 + a_0 = p$, so check whether $S_1 - R_1 = q - p$.

I have no idea whether this approach is practical — there may be a significantly faster algorithm exploiting divisibility more.
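For what it's worth, here is one possible sketch of that enumeration in Python (recursive backtracking with the two pruning sums; it uses the hypothetical `max_degree` helper from above and is only a sketch of the idea, not the answer author's code):

```python
def enumerate_solutions(p, q):
    """All coefficient vectors [a_0, ..., a_m] with sum p and value q at p,
    found by backtracking with the pruning described above (assumes p >= 2)."""
    solutions = []

    def extend(k, S, R, chosen):
        # 'chosen' holds a_m, ..., a_{k+1}; S and R are its power sum and plain sum
        if k == 0:
            a0 = q - S                       # S_1 + a_0 = q forces a_0
            if a0 >= 0 and R + a0 == p:      # ... and R_1 + a_0 = p must also hold
                solutions.append([a0] + list(reversed(chosen)))
            return
        pk = p ** k
        lo = 1 if not chosen else 0          # leading coefficient a_m must be nonzero
        hi = min((q - S) // pk, p - R)       # keep S <= q and R <= p
        for ak in range(lo, hi + 1):
            extend(k - 1, S + ak * pk, R + ak, chosen + [ak])

    for m in range(max_degree(p, q), 0, -1):
        extend(m, 0, 0, [])
    if p == q:                               # the degree-0 candidate f(x) = p
        solutions.append([p])
    return solutions
```

For example, `enumerate_solutions(4, 22)` returns `[[2, 1, 1]]`, i.e. $f(x) = x^2 + x + 2$.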

You need a polynomial such that $f(1) = p$ and $f(p) = q$. For $p \ge 2$ this forces $q \ge p$ and $(q-p)$ to be divisible by $p-1$ (and for $p = 1$ it forces $q = 1$); these conditions are necessary but not sufficient on their own. In many cases a two-term polynomial $ax^n+b$ already works: solving $a + b = p$ and $ap^n + b = q$ gives $a = \frac{q-p}{p^n-1}$ and $b = p - a$, which is a valid solution whenever $a$ is an integer with $1 \le a \le p$. Since $a \ge 1$ requires $p^n \le q$, only finitely many values of $n$ need to be tried.
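A short sketch of this search (Python; hedged in the same way as the snippets above, and note that it only reports two-term solutions, which do not exist for every solvable pair $(p, q)$):

```python
def two_term_solution(p, q):
    """Look for f(x) = a*x^n + b with a + b = p and a*p**n + b = q (assumes p >= 2).
    Returns (a, n, b) if such nonnegative integers exist, else None."""
    if p == q:
        return (0, 0, p)                 # degenerate case: the constant polynomial p
    n, power = 1, p                      # power tracks p**n
    while power <= q:
        a, r = divmod(q - p, power - 1)  # a*(p**n - 1) = q - p determines a
        if r == 0 and 1 <= a <= p:
            return (a, n, p - a)
        power *= p
        n += 1
    return None
```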

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange