Proof. Indeed, let g_{1} and g_{2} be solutions of Equation (71).
Then the product f (g_{1} − g_{2}) is a multiple of x^{ℓ}. Since f (0) = 1, the difference g_{1}(0) − g_{2}(0) must be 0.
Hence there is a constant c ∈ R and polynomials h_{1}, h_{2} with degree less than ℓ − 1 such that

g_{1}(x) = h_{1}(x) x + c and g_{2}(x) = h_{2}(x) x + c.  (72)
It follows that f (h_{1} − h_{2}) is a multiple of x^{ℓ−1}.
By repeating the same argument we show that h_{1}(0) = h_{2}(0). Then by induction on ℓ we obtain g_{1} = g_{2}.
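The uniqueness just proved means that any procedure producing an inverse of f modulo x^{ℓ} yields the same answer. In particular, the inverse can be obtained coefficient by coefficient by back-substitution, as in the following sketch (the function name is illustrative, not from the text; integer coefficients stand in for a general commutative ring R):

```python
def inverse_by_substitution(f, ell):
    """Solve f * g = 1 mod x^ell coefficient by coefficient.

    f is a coefficient list [f_0, f_1, ...] with f[0] == 1;
    integer coefficients stand in for a general commutative ring R.
    """
    g = [0] * ell
    g[0] = 1  # from f(0) * g(0) = 1 and f(0) = 1
    for k in range(1, ell):
        # The coefficient of x^k in f*g must vanish:
        #   sum_{j=0}^{k} f_j * g_{k-j} = 0,  hence
        #   g_k = -sum_{j=1}^{k} f_j * g_{k-j}
        g[k] = -sum(f[j] * g[k - j]
                    for j in range(1, min(k, len(f) - 1) + 1))
    return g
```

For instance, for f = 1 + x this recovers the alternating series 1 − x + x^2 − x^3 + ⋯ truncated modulo x^{ℓ}.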
Theorem 9
Let R be a commutative ring with identity element. Let f be a polynomial in R[x] such that f (0) = 1. Let g_{0}, g_{1}, g_{2}, ... be the sequence of polynomials defined by g_{0} = 1 and, for all i ≥ 1,

g_{i} ≡ 2 g_{i−1} − f g_{i−1}^{2} mod x^{2^{i}}.  (75)

Then for all i ≥ 0 we have

f g_{i} ≡ 1 mod x^{2^{i}}.  (76)
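The recurrence (75) translates directly into code. The following is a minimal sketch over integer coefficients (the names mul_trunc and newton_inverse are illustrative, not from the text), with a naive quadratic truncated multiplication standing in for whatever polynomial multiplication R[x] provides:

```python
def mul_trunc(a, b, n):
    """Product of coefficient lists a and b, truncated modulo x^n."""
    c = [0] * n
    for i, ai in enumerate(a[:n]):
        for j, bj in enumerate(b[:n - i]):
            c[i + j] += ai * bj
    return c

def newton_inverse(f, r):
    """Return g_r with f * g_r = 1 mod x^(2^r), following recurrence (75).

    f is a coefficient list with f[0] == 1; integer coefficients stand in
    for a general commutative ring R.
    """
    g = [1]  # g_0 = 1
    for i in range(1, r + 1):
        n = 2 ** i
        g = g + [0] * (n - len(g))  # view g_{i-1} modulo x^(2^i)
        f_g2 = mul_trunc(f, mul_trunc(g, g, n), n)
        # g_i = 2 g_{i-1} - f g_{i-1}^2  mod x^(2^i)
        g = [2 * a - b for a, b in zip(g, f_g2)]
    return g
```

For example, newton_inverse([1, 1], 3) computes the inverse of 1 + x modulo x^{8}, namely 1 − x + x^2 − ⋯ − x^7.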
Proof. Theorem 9 tells us that Algorithm 6 computes the inverse of f modulo x^{2^{r}}. Since x^{ℓ} divides x^{2^{r}}, the result is also valid modulo x^{ℓ}.
We do not give here a proof of the complexity result and we refer to Theorem 9.4 in [vzGG99]. Instead, we give a rough, informal explanation of it. To understand the complexity result, one needs to point out the following relation for i = 1, ..., r:

g_{i} ≡ g_{i−1} mod x^{2^{i−1}}.  (79)
Indeed, by virtue of Theorem 9 we have

g_{i} ≡ 2 g_{i−1} − f g_{i−1}^{2}   mod x^{2^{i}}
    ≡ 2 g_{i−1} − f g_{i−1}^{2}   mod x^{2^{i−1}}
    ≡ g_{i−1} (2 − f g_{i−1})     mod x^{2^{i−1}}
    ≡ g_{i−1} (2 − 1)             mod x^{2^{i−1}}
    ≡ g_{i−1}                     mod x^{2^{i−1}}  (80)

where the fourth line uses f g_{i−1} ≡ 1 mod x^{2^{i−1}}, which holds by Theorem 9.
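Relation (79) can also be observed numerically by running the recurrence and comparing consecutive iterates; each g_{i} extends g_{i−1} without changing its low-order coefficients. A small self-contained check (helper names are illustrative; integer coefficients stand in for R):

```python
def mul_trunc(a, b, n):
    """Product of coefficient lists a and b, truncated modulo x^n."""
    c = [0] * n
    for i, ai in enumerate(a[:n]):
        for j, bj in enumerate(b[:n - i]):
            c[i + j] += ai * bj
    return c

def newton_iterates(f, r):
    """Return the list [g_0, g_1, ..., g_r] produced by recurrence (75)."""
    gs = [[1]]  # g_0 = 1
    for i in range(1, r + 1):
        n = 2 ** i
        g = gs[-1] + [0] * (n - len(gs[-1]))
        f_g2 = mul_trunc(f, mul_trunc(g, g, n), n)
        gs.append([2 * a - b for a, b in zip(g, f_g2)])
    return gs

# Each g_i agrees with g_{i-1} on the first 2^(i-1) coefficients, as in (79).
gs = newton_iterates([1, 2, 3, 4, 5, 6, 7, 8], 3)
```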
Therefore, when computing g_{i} we only care about the powers of x in the range x^{2^{i−1}}, ..., x^{2^{i}}.
This says that
- half of the computation of g_{r} is made during the last iteration of the for loop,
- a quarter is made when computing g_{r−1}, etc.
Now recall that 1/2 + 1/4 + 1/8 + ⋯ < 1. So roughly the cost of the algorithm is in the order of magnitude of the cost of the last iteration,
which consists of
- two multiplications of polynomials with degree less than 2^{r},
- a multiplication of a polynomial (with degree less than 2^{r}) by a constant,
- truncations modulo x^{2^{r}},
- a subtraction of polynomials with degree less than 2^{r},
leading to O(M(2^{r})) operations in R, where M(n) denotes the cost of multiplying two polynomials of degree less than n.
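The halving argument can be made concrete with a toy cost model. In the sketch below, M(n) = n log2(n) is a hypothetical softly-linear multiplication time (an assumption for illustration, not the bound of [vzGG99]); summing it over the iterations shows the whole loop costs less than twice its final iteration:

```python
import math

def M(n):
    # Toy multiplication-time model M(n) = n * log2(n); a hypothetical
    # stand-in for the true cost of multiplying degree-< n polynomials.
    return n * max(1.0, math.log2(n))

r = 10
# Iteration i of the loop works modulo x^(2^i), so it costs about M(2^i).
per_iteration = [M(2 ** i) for i in range(1, r + 1)]
total = sum(per_iteration)
last = per_iteration[-1]
# The geometric decay 1/2 + 1/4 + ... < 1 bounds total by about 2 * last.
print(total / last)
```

With this model the ratio stays below 2, matching the claim that the last iteration dominates the total cost.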