Documentation

Mathlib.Analysis.Calculus.UniformLimitsDeriv

Swapping limits and derivatives via uniform convergence #

The purpose of this file is to prove that the derivative of the pointwise limit of a sequence of functions is the pointwise limit of the functions' derivatives when the derivatives converge uniformly. The formal statement appears as hasFDerivAt_of_tendstoLocallyUniformlyOn.

Main statements #

The main statements are the fderiv theorems listed below; each of them also has a variant that uses deriv instead of fderiv.

Implementation notes #

Our technique for proving the main result is the famous "ε / 3 proof." In words, you can find it explained, for instance, at this StackExchange post. The subtlety is that we want to prove that the difference quotients of the g converge to the g'. That is, we want to prove something like:

∀ ε > 0, ∃ δ > 0, ∀ y ∈ B_δ(x), |y - x|⁻¹ * |(g y - g x) - g' x (y - x)| < ε.
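
The "ε / 3" in the name refers to the triangle-inequality split that drives the argument: roughly, for a suitable auxiliary index n one bounds

|y - x|⁻¹ * |(g y - g x) - g' x (y - x)| ≤ |y - x|⁻¹ * |(g y - g x) - (f n y - f n x)| + |y - x|⁻¹ * |(f n y - f n x) - f' n x (y - x)| + |y - x|⁻¹ * |(f' n x - g' x) (y - x)|

and makes each of the three terms smaller than ε / 3: the first via the uniform convergence of the difference quotients (difference_quotients_converge_uniformly below), the second via the assumption that f' n is the derivative of f n, and the third via the convergence of f' n x to g' x.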

To do so, we will need to introduce a pair of quantifiers

∀ ε > 0, ∃ N, ∀ n ≥ N, ∃ δ > 0, ∀ y ∈ B_δ(x), |y - x|⁻¹ * |(g y - g x) - g' x (y - x)| < ε.

So how do we write this in terms of filters? Well, the initial definition of the derivative is

tendsto (|y - x|⁻¹ * |(g y - g x) - g' x (y - x)|) (𝓝 x) (𝓝 0)

There are two ways we might introduce n. We could do:

∀ᶠ (n : ℕ) in atTop, Tendsto (|y - x|⁻¹ * |(g y - g x) - g' x (y - x)|) (𝓝 x) (𝓝 0)

but this is equivalent to the quantifier order ∃ N, ∀ n ≥ N, ∀ ε > 0, ∃ δ > 0, ∀ y ∈ B_δ(x), which implies our desired ∀ ∃ ∀ ∃ ∀ but is not equivalent to it. On the other hand, we might try

Tendsto (|y - x|⁻¹ * |(g y - g x) - g' x (y - x)|) (atTop ×ˢ 𝓝 x) (𝓝 0)

but this is equivalent to the quantifier order ∀ ε > 0, ∃ N, ∃ δ > 0, ∀ n ≥ N, ∀ y ∈ B_δ(x), which again implies our desired ∀ ∃ ∀ ∃ ∀ but is not equivalent to it.

So to get the quantifier order we want, we need to introduce a new filter construction, which we call a "curried filter"

Tendsto (|y - x|⁻¹ * |(g y - g x) - g' x (y - x)|) (atTop.curry (𝓝 x)) (𝓝 0)

Then the above implications are Filter.Tendsto.curry and Filter.Tendsto.mono_left Filter.curry_le_prod. We will use both of these deductions as part of our proof.
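
To make these two deductions concrete, here is a minimal standalone Lean sketch (not taken from the file): φ is a hypothetical stand-in for the family of difference quotients, l, l', l'' are arbitrary filters, and ↿φ is Mathlib's notation for the uncurried map fun p => φ p.1 p.2.

import Mathlib

-- `Filter.Tendsto.curry`: an eventual (in `n`) family of limits in the second variable
-- yields a limit along the curried filter `l.curry l'`.
example {ι E X : Type*} (φ : ι → E → X) (l : Filter ι) (l' : Filter E) (l'' : Filter X)
    (h : ∀ᶠ n in l, Filter.Tendsto (φ n) l' l'') :
    Filter.Tendsto ↿φ (l.curry l') l'' :=
  Filter.Tendsto.curry h

-- `Filter.curry_le_prod`: `l.curry l' ≤ l ×ˢ l'`, so a limit along the product filter
-- transfers to the curried filter via `Filter.Tendsto.mono_left`.
example {ι E X : Type*} (φ : ι × E → X) (l : Filter ι) (l' : Filter E) (l'' : Filter X)
    (h : Filter.Tendsto φ (l ×ˢ l') l'') :
    Filter.Tendsto φ (l.curry l') l'' :=
  h.mono_left Filter.curry_le_prod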

We note that if you loosen the assumptions of the main theorem then the proof becomes quite a bit easier. In particular, if you assume there is a common neighborhood s where all of the three assumptions of hasFDerivAt_of_tendstoUniformlyOnFilter hold and that the f' are continuous, then you can avoid the mean value theorem and much of the work around curried filters.

Tags #

uniform convergence, limits of derivatives

theorem uniformCauchySeqOnFilter_of_fderiv {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {f' : ι → E → E →L[𝕜] G} {x : E} (hf' : UniformCauchySeqOnFilter f' l (nhds x)) (hf : ∀ᶠ (n : ι × E) in l ×ˢ nhds x, HasFDerivAt (f n.1) (f' n.1 n.2) n.2) (hfg : Cauchy (Filter.map (fun (n : ι) => f n x) l)) :
UniformCauchySeqOnFilter f l (nhds x)

If a sequence of functions between real or complex normed spaces are eventually differentiable on a neighborhood of x, they are Cauchy at x, and their derivatives are a uniform Cauchy sequence in a neighborhood of x, then the functions form a uniform Cauchy sequence in a neighborhood of x.

theorem uniformCauchySeqOn_ball_of_fderiv {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {f' : ι → E → E →L[𝕜] G} {x : E} {r : ℝ} (hf' : UniformCauchySeqOn f' l (Metric.ball x r)) (hf : ∀ (n : ι), ∀ y ∈ Metric.ball x r, HasFDerivAt (f n) (f' n y) y) (hfg : Cauchy (Filter.map (fun (n : ι) => f n x) l)) :
UniformCauchySeqOn f l (Metric.ball x r)

A variant of the second fundamental theorem of calculus (FTC-2): If a sequence of functions between real or complex normed spaces are differentiable on a ball centered at x, they form a Cauchy sequence at x, and their derivatives are Cauchy uniformly on the ball, then the functions form a uniform Cauchy sequence on the ball.

NOTE: The fact that we work on a ball is typically all that is necessary to work with power series and Dirichlet series (our primary use case). However, this can be generalized by replacing the ball with any connected, bounded, open set and replacing uniform convergence with local uniform convergence. See cauchy_map_of_uniformCauchySeqOn_fderiv.

theorem cauchy_map_of_uniformCauchySeqOn_fderiv {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {f' : ι → E → E →L[𝕜] G} {s : Set E} (hs : IsOpen s) (h's : IsPreconnected s) (hf' : UniformCauchySeqOn f' l s) (hf : ∀ (n : ι), ∀ y ∈ s, HasFDerivAt (f n) (f' n y) y) {x₀ : E} {x : E} (hx₀ : x₀ ∈ s) (hx : x ∈ s) (hfg : Cauchy (Filter.map (fun (n : ι) => f n x₀) l)) :
Cauchy (Filter.map (fun (n : ι) => f n x) l)

If a sequence of functions between real or complex normed spaces are differentiable on a preconnected open set, they form a Cauchy sequence at x₀, and their derivatives are Cauchy uniformly on the set, then the functions form a Cauchy sequence at any point in the set.

theorem difference_quotients_converge_uniformly {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {g : E → G} {f' : ι → E → E →L[𝕜] G} {g' : E → E →L[𝕜] G} {x : E} (hf' : TendstoUniformlyOnFilter f' g' l (nhds x)) (hf : ∀ᶠ (n : ι × E) in l ×ˢ nhds x, HasFDerivAt (f n.1) (f' n.1 n.2) n.2) (hfg : ∀ᶠ (y : E) in nhds x, Filter.Tendsto (fun (n : ι) => f n y) l (nhds (g y))) :
TendstoUniformlyOnFilter (fun (n : ι) (y : E) => ‖y - x‖⁻¹ • (f n y - f n x)) (fun (y : E) => ‖y - x‖⁻¹ • (g y - g x)) l (nhds x)

If f_n → g pointwise and the derivatives (f_n)' → h converge uniformly, then for a fixed y, the difference quotients ‖z - y‖⁻¹ • (f_n z - f_n y) converge uniformly to ‖z - y‖⁻¹ • (g z - g y).

theorem hasFDerivAt_of_tendstoUniformlyOnFilter {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {g : E → G} {f' : ι → E → E →L[𝕜] G} {g' : E → E →L[𝕜] G} {x : E} [Filter.NeBot l] (hf' : TendstoUniformlyOnFilter f' g' l (nhds x)) (hf : ∀ᶠ (n : ι × E) in l ×ˢ nhds x, HasFDerivAt (f n.1) (f' n.1 n.2) n.2) (hfg : ∀ᶠ (y : E) in nhds x, Filter.Tendsto (fun (n : ι) => f n y) l (nhds (g y))) :
HasFDerivAt g (g' x) x

(d/dx) lim_{n → ∞} f n x = lim_{n → ∞} f' n x when the f' n converge uniformly to their limit at x.

In words the assumptions mean the following:

  • hf': The f' converge "uniformly at" x to g'. This does not mean that the f' n even converge away from x!
  • hf: For all (y, n) with y sufficiently close to x and n sufficiently large, f' n is the derivative of f n
  • hfg: The f n converge pointwise to g on a neighborhood of x

theorem hasFDerivAt_of_tendstoLocallyUniformlyOn {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {g : E → G} {f' : ι → E → E →L[𝕜] G} {g' : E → E →L[𝕜] G} {x : E} [Filter.NeBot l] {s : Set E} (hs : IsOpen s) (hf' : TendstoLocallyUniformlyOn f' g' l s) (hf : ∀ (n : ι), ∀ x ∈ s, HasFDerivAt (f n) (f' n x) x) (hfg : ∀ x ∈ s, Filter.Tendsto (fun (n : ι) => f n x) l (nhds (g x))) (hx : x ∈ s) :
HasFDerivAt g (g' x) x
theorem hasFDerivAt_of_tendsto_locally_uniformly_on' {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {g : E → G} {g' : E → E →L[𝕜] G} {x : E} [Filter.NeBot l] {s : Set E} (hs : IsOpen s) (hf' : TendstoLocallyUniformlyOn (fderiv 𝕜 ∘ f) g' l s) (hf : ∀ (n : ι), DifferentiableOn 𝕜 (f n) s) (hfg : ∀ x ∈ s, Filter.Tendsto (fun (n : ι) => f n x) l (nhds (g x))) (hx : x ∈ s) :
HasFDerivAt g (g' x) x

A slight variant of hasFDerivAt_of_tendstoLocallyUniformlyOn with the assumption stated in terms of DifferentiableOn rather than HasFDerivAt. This makes a few proofs nicer in complex analysis where holomorphicity is assumed but the derivative is not known a priori.
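
To see why the DifferentiableOn form suffices here, recall that on an open set a DifferentiableOn hypothesis upgrades, at each point of the set, to a HasFDerivAt statement with the canonical derivative fderiv 𝕜 f x. A minimal standalone sketch (not taken from the file, and stated in the generality of NontriviallyNormedField rather than IsROrC):

import Mathlib

-- On an open set `s`, differentiability on `s` at a point `x ∈ s` upgrades to `HasFDerivAt`
-- with derivative `fderiv 𝕜 f x`, which is the form the unprimed lemma consumes.
example {𝕜 E G : Type*} [NontriviallyNormedField 𝕜] [NormedAddCommGroup E] [NormedSpace 𝕜 E]
    [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : E → G} {s : Set E} {x : E}
    (hs : IsOpen s) (hx : x ∈ s) (hf : DifferentiableOn 𝕜 f s) :
    HasFDerivAt f (fderiv 𝕜 f x) x :=
  (hf.differentiableAt (hs.mem_nhds hx)).hasFDerivAt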

theorem hasFDerivAt_of_tendstoUniformlyOn {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {g : E → G} {f' : ι → E → E →L[𝕜] G} {g' : E → E →L[𝕜] G} [Filter.NeBot l] {s : Set E} (hs : IsOpen s) (hf' : TendstoUniformlyOn f' g' l s) (hf : ∀ (n : ι), ∀ x ∈ s, HasFDerivAt (f n) (f' n x) x) (hfg : ∀ x ∈ s, Filter.Tendsto (fun (n : ι) => f n x) l (nhds (g x))) (x : E) :
x ∈ s → HasFDerivAt g (g' x) x

(d/dx) lim_{n → ∞} f n x = lim_{n → ∞} f' n x when the f' n converge uniformly to their limit on an open set containing x.

theorem hasFDerivAt_of_tendstoUniformly {ι : Type u_1} {l : Filter ι} {E : Type u_2} [NormedAddCommGroup E] {𝕜 : Type u_3} [IsROrC 𝕜] [NormedSpace 𝕜 E] {G : Type u_4} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → E → G} {g : E → G} {f' : ι → E → E →L[𝕜] G} {g' : E → E →L[𝕜] G} [Filter.NeBot l] (hf' : TendstoUniformly f' g' l) (hf : ∀ (n : ι) (x : E), HasFDerivAt (f n) (f' n x) x) (hfg : ∀ (x : E), Filter.Tendsto (fun (n : ι) => f n x) l (nhds (g x))) (x : E) :
HasFDerivAt g (g' x) x

(d/dx) lim_{n → ∞} f n x = lim_{n → ∞} f' n x when the f' n converge uniformly to their limit.
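
For context, the unrestricted hypothesis TendstoUniformly f' g' l is the whole-space case of TendstoUniformlyOn, so this statement is essentially hasFDerivAt_of_tendstoUniformlyOn specialized to s = Set.univ. A minimal standalone sketch of that relationship (not taken from the file, using the equivalence tendstoUniformlyOn_univ):

import Mathlib

-- `TendstoUniformly` is `TendstoUniformlyOn` over `Set.univ`.
example {ι α β : Type*} [UniformSpace β] (F : ι → α → β) (f : α → β) (p : Filter ι)
    (h : TendstoUniformly F f p) : TendstoUniformlyOn F f p Set.univ :=
  tendstoUniformlyOn_univ.mpr h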

deriv versions of above theorems #

In this section, we provide deriv equivalents of the fderiv lemmas in the previous section.

theorem UniformCauchySeqOnFilter.one_smulRight {ι : Type u_1} {l : Filter ι} {𝕜 : Type u_2} [IsROrC 𝕜] {G : Type u_3} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f' : ι → 𝕜 → G} {l' : Filter 𝕜} (hf' : UniformCauchySeqOnFilter f' l l') :
UniformCauchySeqOnFilter (fun (n : ι) (z : 𝕜) => ContinuousLinearMap.smulRight 1 (f' n z)) l l'

If our derivatives converge uniformly, then the Fréchet derivatives converge uniformly
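
For context, this is the standard repackaging relating the two notions: a scalar derivative d : G at x corresponds to the Fréchet derivative ContinuousLinearMap.smulRight 1 d : 𝕜 →L[𝕜] G, and HasDerivAt is definitionally HasFDerivAt of that repackaged map. A minimal standalone sketch (not taken from the file, stated over any NontriviallyNormedField):

import Mathlib

-- `HasDerivAt f d x` unfolds to `HasFDerivAt f (ContinuousLinearMap.smulRight 1 d) x`,
-- so the hypothesis `h` already is a proof of the goal.
example {𝕜 G : Type*} [NontriviallyNormedField 𝕜] [NormedAddCommGroup G] [NormedSpace 𝕜 G]
    {f : 𝕜 → G} {d : G} {x : 𝕜} (h : HasDerivAt f d x) :
    HasFDerivAt f (ContinuousLinearMap.smulRight (1 : 𝕜 →L[𝕜] 𝕜) d) x :=
  h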

theorem uniformCauchySeqOnFilter_of_deriv {ι : Type u_1} {l : Filter ι} {𝕜 : Type u_2} [IsROrC 𝕜] {G : Type u_3} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → 𝕜 → G} {f' : ι → 𝕜 → G} {x : 𝕜} (hf' : UniformCauchySeqOnFilter f' l (nhds x)) (hf : ∀ᶠ (n : ι × 𝕜) in l ×ˢ nhds x, HasDerivAt (f n.1) (f' n.1 n.2) n.2) (hfg : Cauchy (Filter.map (fun (n : ι) => f n x) l)) :
UniformCauchySeqOnFilter f l (nhds x)
theorem uniformCauchySeqOn_ball_of_deriv {ι : Type u_1} {l : Filter ι} {𝕜 : Type u_2} [IsROrC 𝕜] {G : Type u_3} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → 𝕜 → G} {f' : ι → 𝕜 → G} {x : 𝕜} {r : ℝ} (hf' : UniformCauchySeqOn f' l (Metric.ball x r)) (hf : ∀ (n : ι), ∀ y ∈ Metric.ball x r, HasDerivAt (f n) (f' n y) y) (hfg : Cauchy (Filter.map (fun (n : ι) => f n x) l)) :
UniformCauchySeqOn f l (Metric.ball x r)
theorem hasDerivAt_of_tendstoUniformlyOnFilter {ι : Type u_1} {l : Filter ι} {𝕜 : Type u_2} [IsROrC 𝕜] {G : Type u_3} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → 𝕜 → G} {g : 𝕜 → G} {f' : ι → 𝕜 → G} {g' : 𝕜 → G} {x : 𝕜} [Filter.NeBot l] (hf' : TendstoUniformlyOnFilter f' g' l (nhds x)) (hf : ∀ᶠ (n : ι × 𝕜) in l ×ˢ nhds x, HasDerivAt (f n.1) (f' n.1 n.2) n.2) (hfg : ∀ᶠ (y : 𝕜) in nhds x, Filter.Tendsto (fun (n : ι) => f n y) l (nhds (g y))) :
HasDerivAt g (g' x) x
theorem hasDerivAt_of_tendstoLocallyUniformlyOn {ι : Type u_1} {l : Filter ι} {𝕜 : Type u_2} [IsROrC 𝕜] {G : Type u_3} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → 𝕜 → G} {g : 𝕜 → G} {f' : ι → 𝕜 → G} {g' : 𝕜 → G} {x : 𝕜} [Filter.NeBot l] {s : Set 𝕜} (hs : IsOpen s) (hf' : TendstoLocallyUniformlyOn f' g' l s) (hf : ∀ᶠ (n : ι) in l, ∀ x ∈ s, HasDerivAt (f n) (f' n x) x) (hfg : ∀ x ∈ s, Filter.Tendsto (fun (n : ι) => f n x) l (nhds (g x))) (hx : x ∈ s) :
HasDerivAt g (g' x) x
theorem hasDerivAt_of_tendsto_locally_uniformly_on' {ι : Type u_1} {l : Filter ι} {𝕜 : Type u_2} [IsROrC 𝕜] {G : Type u_3} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → 𝕜 → G} {g : 𝕜 → G} {g' : 𝕜 → G} {x : 𝕜} [Filter.NeBot l] {s : Set 𝕜} (hs : IsOpen s) (hf' : TendstoLocallyUniformlyOn (deriv ∘ f) g' l s) (hf : ∀ᶠ (n : ι) in l, DifferentiableOn 𝕜 (f n) s) (hfg : ∀ x ∈ s, Filter.Tendsto (fun (n : ι) => f n x) l (nhds (g x))) (hx : x ∈ s) :
HasDerivAt g (g' x) x

A slight variant of hasDerivAt_of_tendstoLocallyUniformlyOn with the assumption stated in terms of DifferentiableOn rather than HasDerivAt. This makes a few proofs nicer in complex analysis where holomorphicity is assumed but the derivative is not known a priori.

theorem hasDerivAt_of_tendstoUniformlyOn {ι : Type u_1} {l : Filter ι} {𝕜 : Type u_2} [IsROrC 𝕜] {G : Type u_3} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → 𝕜 → G} {g : 𝕜 → G} {f' : ι → 𝕜 → G} {g' : 𝕜 → G} [Filter.NeBot l] {s : Set 𝕜} (hs : IsOpen s) (hf' : TendstoUniformlyOn f' g' l s) (hf : ∀ᶠ (n : ι) in l, ∀ x ∈ s, HasDerivAt (f n) (f' n x) x) (hfg : ∀ x ∈ s, Filter.Tendsto (fun (n : ι) => f n x) l (nhds (g x))) (x : 𝕜) :
x ∈ s → HasDerivAt g (g' x) x
theorem hasDerivAt_of_tendstoUniformly {ι : Type u_1} {l : Filter ι} {𝕜 : Type u_2} [IsROrC 𝕜] {G : Type u_3} [NormedAddCommGroup G] [NormedSpace 𝕜 G] {f : ι → 𝕜 → G} {g : 𝕜 → G} {f' : ι → 𝕜 → G} {g' : 𝕜 → G} [Filter.NeBot l] (hf' : TendstoUniformly f' g' l) (hf : ∀ᶠ (n : ι) in l, ∀ (x : 𝕜), HasDerivAt (f n) (f' n x) x) (hfg : ∀ (x : 𝕜), Filter.Tendsto (fun (n : ι) => f n x) l (nhds (g x))) (x : 𝕜) :
HasDerivAt g (g' x) x