Inverse of analytic functions #
We construct the left and right inverse of a formal multilinear series with invertible linear term, prove that they coincide, and study their properties (notably convergence).
Main statements #
- p.leftInv i: the formal left inverse of the formal multilinear series p, for i : E ≃L[𝕜] F which coincides with p₁.
- p.rightInv i: the formal right inverse of the formal multilinear series p, for i : E ≃L[𝕜] F which coincides with p₁.
- p.leftInv_comp says that p.leftInv i is indeed a left inverse to p when p₁ = i.
- p.rightInv_comp says that p.rightInv i is indeed a right inverse to p when p₁ = i.
- p.leftInv_eq_rightInv: the two inverses coincide.
- p.radius_rightInv_pos_of_radius_pos: if a power series has a positive radius of convergence, then so does its inverse.
The left inverse of a formal multilinear series #
The left inverse of a formal multilinear series, where the n-th term is defined inductively
in terms of the previous ones to make sure that (leftInv p i) ∘ p = id. For this, the linear term
p₁ in p should be invertible. In the definition, i is a linear isomorphism that should
coincide with p₁, so that one can use its inverse in the construction. The definition does not
use that i = p₁, but proofs that the definition is well-behaved do.
The n-th term in q ∘ p is ∑ qₖ (p_{j₁}, ..., p_{jₖ}) over j₁ + ... + jₖ = n. In this
expression, qₙ appears only once, in qₙ (p₁, ..., p₁). We adjust the definition so that this
term compensates the rest of the sum, using i⁻¹ as an inverse to p₁.
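Concretely, writing q = leftInv p i and assuming p₁ = i, the resulting recursion reads (informally; this is a sketch of the idea rather than the exact Lean definition)
$$
q_n(v_1, \dotsc, v_n) = - \sum_{k=1}^{n-1} \; \sum_{j_1 + \dotsc + j_k = n} q_k \bigl(p_{j_1}, \dotsc, p_{j_k}\bigr)(i^{-1} v_1, \dotsc, i^{-1} v_n) \quad (n \geq 2),
$$
with q₀ = 0 and q₁ = i⁻¹ (viewed as a 1-multilinear map). Here q_k (p_{j₁}, ..., p_{jₖ}) denotes the n-multilinear map obtained by distributing consecutive blocks of the arguments to p_{j₁}, ..., p_{jₖ}, so only previously constructed terms appear on the right.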
These formulas only make sense when the constant term p₀ vanishes. The definition we give is
general, but it ignores the value of p₀.
Equations
- One or more equations did not get rendered due to their size.
- FormalMultilinearSeries.leftInv p i 0 = 0
- FormalMultilinearSeries.leftInv p i 1 = (LinearIsometryEquiv.symm (continuousMultilinearCurryFin1 𝕜 F E)) ↑(ContinuousLinearEquiv.symm i)
Instances For
The left inverse does not depend on the zeroth coefficient of a formal multilinear series.
The left inverse to a formal multilinear series is indeed a left inverse, provided its linear term is invertible.
The right inverse of a formal multilinear series #
The right inverse of a formal multilinear series, where the n-th term is defined inductively
in terms of the previous ones to make sure that p ∘ (rightInv p i) = id. For this, the linear
term p₁ in p should be invertible. In the definition, i is a linear isomorphism that should
coincide with p₁, so that one can use its inverse in the construction. The definition does not
use that i = p₁, but proofs that the definition is well-behaved do.
The n-th term in p ∘ q is ∑ pₖ (q_{j₁}, ..., q_{jₖ}) over j₁ + ... + jₖ = n. In this
expression, qₙ appears only once, in p₁ (qₙ). We adjust the definition of qₙ so that this
term compensates the rest of the sum, using i⁻¹ as an inverse to p₁.
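Concretely, writing q = rightInv p i and assuming p₁ = i, the recursion reads (again informally, as a sketch of the idea)
$$
q_n(v_1, \dotsc, v_n) = - i^{-1} \Bigl( \sum_{k=2}^{n} \; \sum_{j_1 + \dotsc + j_k = n} p_k \bigl(q_{j_1}, \dotsc, q_{j_k}\bigr)(v_1, \dotsc, v_n) \Bigr) \quad (n \geq 2),
$$
with q₀ = 0 and q₁ = i⁻¹; since every jₗ is at most n - 1, only previously constructed terms appear on the right.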
These formulas only make sense when the constant term p₀ vanishes. The definition we give is
general, but it ignores the value of p₀.
Equations
- One or more equations did not get rendered due to their size.
- FormalMultilinearSeries.rightInv p i 0 = 0
- FormalMultilinearSeries.rightInv p i 1 = (LinearIsometryEquiv.symm (continuousMultilinearCurryFin1 𝕜 F E)) ↑(ContinuousLinearEquiv.symm i)
Instances For
The right inverse does not depend on the zeroth coefficient of a formal multilinear series.
The right inverse to a formal multilinear series is indeed a right inverse, provided its linear term is invertible and its constant term vanishes.
Coincidence of the left and the right inverse #
The left inverse and the right inverse of a formal multilinear series coincide. This is not at all obvious from their definition, but it follows from uniqueness of inverses (which comes from the fact that composition is associative on formal multilinear series).
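In informal notation, writing q_L = leftInv p i and q_R = rightInv p i (with p₁ = i and p₀ = 0, so that both composition identities hold), the standard uniqueness computation is
$$
q_L = q_L \circ \mathrm{id} = q_L \circ (p \circ q_R) = (q_L \circ p) \circ q_R = \mathrm{id} \circ q_R = q_R,
$$
where the middle step is exactly associativity of composition of formal multilinear series.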
Convergence of the inverse of a power series #
Assume that p
is a convergent multilinear series, and let q
be its (left or right) inverse.
Using the left-inverse formula gives
$$
q_n = - (p_1)^{-n} \sum_{k=0}^{n-1} \sum_{i_1 + \dotsc + i_k = n} q_k (p_{i_1}, \dotsc, p_{i_k}).
$$
Assume for simplicity that we are in dimension 1 and p₁ = 1. In the formula for qₙ, the term
q_{n-1} appears with a multiplicity of n-1 (choosing the index i_j for which i_j = 2 while
all the other indices are equal to 1), which indicates that qₙ might grow like n!. This is
bad for summability properties.
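Heuristically, ignoring signs and possible cancellations, keeping only this contribution in the recursion gives
$$
Q_n \gtrsim (n-1) \, Q_{n-1}, \qquad \text{hence} \qquad Q_n \gtrsim (n-1)!,
$$
where Q_n denotes ‖qₙ‖ as below, which is why the left-inverse formula is not used for the convergence estimate.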
It turns out that the right-inverse formula is better behaved, and should instead be used for this
kind of estimate. It reads
$$
q_n = - (p_1)^{-1} \sum_{k=2}^n \sum_{i_1 + \dotsc + i_k = n} p_k (q_{i_1}, \dotsc, q_{i_k}).
$$
Here, q_{n-1}
can only appear in the term with k = 2
, and it only appears twice, so there is
hope this formula can lead to an at most geometric behavior.
Let Qₙ = ‖qₙ‖. Bounding ‖pₖ‖ with C r^k gives an inequality
$$
Q_n β€ C' \sum_{k=2}^n r^k \sum_{i_1 + \dotsc + i_k = n} Q_{i_1} \dotsm Q_{i_k}.
$$
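This follows (as a sketch, up to the exact bookkeeping of compositions; the constant used in the formal proof may be organized differently) by taking norms in the right-inverse formula, with for instance C' = C ‖i⁻¹‖:
$$
Q_n \leq \|(p_1)^{-1}\| \sum_{k=2}^n \|p_k\| \sum_{i_1 + \dotsc + i_k = n} Q_{i_1} \dotsm Q_{i_k}
\leq C \|i^{-1}\| \sum_{k=2}^n r^k \sum_{i_1 + \dotsc + i_k = n} Q_{i_1} \dotsm Q_{i_k}.
$$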
This formula is not enough to prove by naive induction on n a bound of the form Qₙ ≤ D R^n.
However, assuming that the inequality above were an equality, one could get a formula for the
generating series of the Qₙ:
$$
\begin{align}
Q(z) & := \sum Q_n z^n = Q_1 z + C' \sum_{2 \leq k \leq n} \sum_{i_1 + \dotsc + i_k = n} (r z^{i_1} Q_{i_1}) \dotsm (r z^{i_k} Q_{i_k}) \\
& = Q_1 z + C' \sum_{k = 2}^\infty (\sum_{i_1 \geq 1} r z^{i_1} Q_{i_1}) \dotsm (\sum_{i_k \geq 1} r z^{i_k} Q_{i_k}) \\
& = Q_1 z + C' \sum_{k = 2}^\infty (r Q(z))^k = Q_1 z + C' (r Q(z))^2 / (1 - r Q(z)).
\end{align}
$$
One can solve this formula explicitly. The solution is analytic in a neighborhood of 0 in ℂ,
hence its coefficients grow at most geometrically (by a contour integral argument), and therefore
the original Qₙ, which are bounded by these ones, are also at most geometric.
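For completeness, here is how the explicit resolution goes (this computation is only motivation and is not the one used below). Clearing the denominator in Q(z) = Q₁ z + C' (r Q(z))² / (1 - r Q(z)) gives the quadratic equation
$$
(C' r^2 + r) \, Q(z)^2 - (1 + Q_1 r z) \, Q(z) + Q_1 z = 0,
$$
whose branch vanishing at z = 0 is
$$
Q(z) = \frac{(1 + Q_1 r z) - \sqrt{(1 + Q_1 r z)^2 - 4 (C' r^2 + r) Q_1 z}}{2 (C' r^2 + r)},
$$
analytic near 0 since the discriminant equals 1 at z = 0.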
This classical argument is not really satisfactory, as it requires an a priori bound on a complex analytic function. Another option would be to compute explicitly its terms (with binomial coefficients) to obtain an explicit geometric bound, but this would be very painful.
Instead, we will use the above intuition, but in a slightly different form, with finite sums and an
induction. I learnt this trick in [pöschel2017siegelsternberg]. Let
$S_n = \sum_{k=1}^n Q_k a^k$ (where a
is a positive real parameter to be chosen suitably small).
The above computation but with finite sums shows that
$$ S_n \leq Q_1 a + C' \sum_{k=2}^n (r S_{n-1})^k. $$
In particular, $S_n \leq Q_1 a + C' (r S_{n-1})^2 / (1- r S_{n-1})$.
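To see this (sketch): in the bound for Qₘ every index i_j is at least 1 and at most n - 1, so multiplying by aᵐ and summing over m ≤ n gives
$$
\sum_{m=2}^{n} a^m \, C' \sum_{k=2}^{m} r^k \sum_{i_1 + \dotsc + i_k = m} Q_{i_1} \dotsm Q_{i_k}
\leq C' \sum_{k=2}^{n} \Bigl( r \sum_{i=1}^{n-1} Q_i a^i \Bigr)^k = C' \sum_{k=2}^{n} (r S_{n-1})^k,
$$
and the second form follows from the geometric bound $\sum_{k \geq 2} x^k \leq x^2 / (1 - x)$ for $0 \leq x < 1$.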
Assume that $S_{n-1} \leq K a$, where K > Q₁ is fixed and a is small enough so that
r K a ≤ 1/2 (to control the denominator). Then this equation gives a bound
$S_n \leq Q_1 a + 2 C' r^2 K^2 a^2$.
If a
is small enough, this is bounded by K a
as the second term is quadratic in a
, and
therefore negligible.
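Explicitly, one admissible choice (the constants in the formal proof may be organized differently) is to require, in addition to r K a ≤ 1/2,
$$
a \leq \frac{K - Q_1}{2 C' r^2 K^2},
\qquad \text{so that} \qquad Q_1 a + 2 C' r^2 K^2 a^2 \leq K a.
$$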
By induction, we deduce Sₙ ≤ K a for all n, which gives in particular the fact that aⁿ Qₙ
remains bounded.
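Indeed, for every n ≥ 1 the term aⁿ Qₙ is one of the (nonnegative) summands of Sₙ, so
$$
\|q_n\| = Q_n \leq K a \cdot a^{-n},
$$
and therefore the radius of convergence of the right inverse is at least a, and in particular positive, as asserted in radius_rightInv_pos_of_radius_pos.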
First technical lemma to control the growth of coefficients of the inverse. Bound the explicit
expression for ∑_{k<n+1} aᵏ Qₖ in terms of a sum of powers of the same sum one step before,
in a general abstract setup.
Second technical lemma to control the growth of coefficients of the inverse. Bound the explicit
expression for ∑_{k<n+1} aᵏ Qₖ in terms of a sum of powers of the same sum one step before,
in the specific setup we are interested in, by reducing to the general bound in
radius_rightInv_pos_of_radius_pos_aux1.
If a formal multilinear series has a positive radius of convergence, then its right inverse also has a positive radius of convergence.