The Fréchet derivative #
Let `E` and `F` be normed spaces, `f : E → F`, and `f' : E →L[𝕜] F` a
continuous 𝕜-linear map, where `𝕜` is a non-discrete normed field. Then

`HasFDerivWithinAt f f' s x`

says that `f` has derivative `f'` at `x`, where the domain of interest
is restricted to `s`. We also have

`HasFDerivAt f f' x := HasFDerivWithinAt f f' univ x`

Finally,

`HasStrictFDerivAt f f' x`

means that `f : E → F` has derivative `f' : E →L[𝕜] F` in the sense of strict differentiability,
i.e., `f y - f z - f' (y - z) = o (y - z)` as `y, z → x`. This notion is used in the inverse
function theorem, and is defined here only to avoid proving theorems like
`IsBoundedBilinearMap.hasFDerivAt` twice: first for `HasFDerivAt`, then for
`HasStrictFDerivAt`.
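As a minimal illustration of the first two predicates (a sketch, assuming Mathlib's `ContinuousLinearMap.hasFDerivAt` and `ContinuousLinearMap.hasFDerivWithinAt` from `FDeriv/Linear.lean`), a continuous linear map is its own Fréchet derivative at every point:

```lean
import Mathlib.Analysis.Calculus.FDeriv.Linear

-- Sketch: a continuous linear map `e` has itself as Fréchet derivative,
-- both at a point and within any set.
example (e : ℝ →L[ℝ] ℝ) (x : ℝ) : HasFDerivAt e e x :=
  e.hasFDerivAt

example (e : ℝ →L[ℝ] ℝ) (s : Set ℝ) (x : ℝ) : HasFDerivWithinAt e e s x :=
  e.hasFDerivWithinAt
```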
Main results #
In addition to the definition and basic properties of the derivative,
the folder `Analysis/Calculus/FDeriv/` contains the usual formulas
(and existence assertions) for the derivative of
- constants
- the identity
- bounded linear maps (`Linear.lean`)
- bounded bilinear maps (`Bilinear.lean`)
- sum of two functions (`Add.lean`)
- sum of finitely many functions (`Add.lean`)
- multiplication of a function by a scalar constant (`Add.lean`)
- negative of a function (`Add.lean`)
- subtraction of two functions (`Add.lean`)
- multiplication of a function by a scalar function (`Mul.lean`)
- multiplication of two scalar functions (`Mul.lean`)
- composition of functions (the chain rule) (`Comp.lean`)
- inverse function (`Mul.lean`) (assuming that it exists; the inverse function theorem is in `../Inverse.lean`)

For most binary operations we also define `const_op` and `op_const` theorems for the cases when
the first or second argument is a constant. This makes writing chains of `HasDerivAt`'s easier,
and they more frequently lead to the desired result.
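Such a chain might look as follows (a sketch; `HasDerivAt.const_mul` and `HasDerivAt.add_const` are Mathlib lemma names, but their exact forms may vary):

```lean
import Mathlib.Analysis.Calculus.Deriv.Add
import Mathlib.Analysis.Calculus.Deriv.Mul

-- Sketch: build the derivative of `fun y ↦ 3 * y + 5` from a `const_mul`
-- step and an `add_const` step; `simpa` normalizes the resulting `3 * 1`.
example (x : ℝ) : HasDerivAt (fun y : ℝ => 3 * y + 5) 3 x := by
  simpa using ((hasDerivAt_id x).const_mul 3).add_const 5
```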
One can also interpret the derivative of a function `f : 𝕜 → E` as an element of `E` (by
identifying a linear function from `𝕜` to `E` with its value at `1`). Results on the Fréchet
derivative are translated to this more elementary point of view on the derivative in the file
`Deriv.lean`. The derivative of polynomials is handled there, as it is naturally one-dimensional.
The simplifier is set up to prove automatically that some functions are differentiable, or
differentiable at a point (but not differentiable on a set or within a set at a point, as checking
automatically that the good domains are mapped one to the other when using composition is not
something the simplifier can easily do). This means that one can write
`example (x : ℝ) : Differentiable ℝ (fun x ↦ sin (exp (3 + x^2)) - 5 * cos x) := by simp`.

If there are divisions, one needs to supply to the simplifier proofs that the denominators do
not vanish, as in
`example (x : ℝ) (h : 1 + sin x ≠ 0) : DifferentiableAt ℝ (fun x ↦ exp x / (1 + sin x)) x := by simp [h]`.

Of course, these examples only work once `exp`, `cos` and `sin` have been shown to be
differentiable, in `Analysis.SpecialFunctions.Trigonometric`.

The simplifier is not set up to compute the Fréchet derivative of maps (as these are in general
complicated multidimensional linear maps), but it will compute one-dimensional derivatives,
see `Deriv.lean`.
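For the one-dimensional case, a sketch (assuming the special-function derivative files, here `Analysis.SpecialFunctions.Trigonometric.Deriv`, are imported):

```lean
import Mathlib.Analysis.SpecialFunctions.Trigonometric.Deriv

-- Sketch: one-dimensional derivatives are computed through `deriv`.
example : deriv Real.sin = Real.cos := Real.deriv_sin
```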
Implementation details #
The derivative is defined in terms of the `isLittleO` relation, but also
characterized in terms of the `Tendsto` relation.

We also introduce predicates `DifferentiableWithinAt 𝕜 f s x` (where `𝕜` is the base field,
`f` the function to be differentiated, `x` the point at which the derivative is asserted to
exist, and `s` the set along which the derivative is defined), as well as
`DifferentiableAt 𝕜 f x`, `DifferentiableOn 𝕜 f s` and `Differentiable 𝕜 f` to express the
existence of a derivative.

To be able to compute with derivatives, we write `fderivWithin 𝕜 f s x` and `fderiv 𝕜 f x`
for some choice of a derivative if it exists, and the zero function otherwise. This choice only
behaves well along sets for which the derivative is unique, i.e., those for which the tangent
directions span a dense subset of the whole space. The predicates `UniqueDiffWithinAt s x` and
`UniqueDiffOn s`, defined in `TangentCone.lean`, express this property. We prove that indeed
they imply the uniqueness of the derivative. This is satisfied for open subsets, and in
particular for `univ`. This uniqueness only holds when the field is non-discrete, which we
request at the very beginning: otherwise, a derivative can be defined, but it has no interesting
properties whatsoever.
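A sketch of how this uniqueness is used in practice: at any point (differentiation along `univ` is unique), `fderiv` recovers an asserted derivative via `HasFDerivAt.fderiv`.

```lean
import Mathlib.Analysis.Calculus.FDeriv.Basic

-- Sketch: `HasFDerivAt.fderiv` turns an asserted derivative into the chosen one.
example {f : ℝ → ℝ} {f' : ℝ →L[ℝ] ℝ} {x : ℝ} (h : HasFDerivAt f f' x) :
    fderiv ℝ f x = f' :=
  h.fderiv
```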
To make sure that the simplifier can prove automatically that functions are differentiable, we
tag many lemmas with the `simp` attribute, for instance those saying that the sum of
differentiable functions is differentiable, as well as their product, their cartesian product,
and so on. A notable exception is the chain rule: we do not mark as a simp lemma the fact that,
if `f` and `g` are differentiable, then their composition also is: `simp` would always be able
to match this lemma, by taking `f` or `g` to be the identity. Instead, for every reasonable
function (say, `exp`), we add a lemma stating that if `f` is differentiable then so is
`fun x ↦ exp (f x)`. This means adding some boilerplate lemmas, but these can also be useful in
their own right.
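One such boilerplate lemma in action (a sketch; `Differentiable.exp` lives in `Analysis.SpecialFunctions.ExpDeriv`):

```lean
import Mathlib.Analysis.SpecialFunctions.ExpDeriv

-- Sketch: the composition-with-`exp` lemma, stated for `f` rather than
-- for a bare composition, so that `simp` can match it.
example {f : ℝ → ℝ} (hf : Differentiable ℝ f) :
    Differentiable ℝ (fun x => Real.exp (f x)) :=
  hf.exp
```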
Tests for this ability of the simplifier (with more examples) are provided in
`Tests/Differentiable.lean`.
Tags #
derivative, differentiable, Fréchet, calculus
A function `f` has the continuous linear map `f'` as derivative along the filter `L` if
`f x' = f x + f' (x' - x) + o (x' - x)` when `x'` converges along the filter `L`. This
definition is designed to be specialized for `L = 𝓝 x` (in `HasFDerivAt`), giving rise to the
usual notion of Fréchet derivative, and for `L = 𝓝[s] x` (in `HasFDerivWithinAt`), giving rise
to the notion of Fréchet derivative along the set `s`.
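A sketch of the underlying `isLittleO` statement, packaged by the constructor `HasFDerivAtFilter.of_isLittleO`:

```lean
import Mathlib.Analysis.Calculus.FDeriv.Basic

open Asymptotics

-- Sketch: the definition unfolds to a little-o estimate along `L`.
example {f : ℝ → ℝ} {f' : ℝ →L[ℝ] ℝ} {x : ℝ} {L : Filter ℝ}
    (h : (fun x' => f x' - f x - f' (x' - x)) =o[L] fun x' => x' - x) :
    HasFDerivAtFilter f f' x L :=
  .of_isLittleO h
```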
A function `f` has the continuous linear map `f'` as derivative at `x` within a set `s` if
`f x' = f x + f' (x' - x) + o (x' - x)` when `x'` tends to `x` inside `s`.
Equations
- HasFDerivWithinAt f f' s x = HasFDerivAtFilter f f' x (nhdsWithin x s)
A function `f` has the continuous linear map `f'` as derivative at `x` if
`f x' = f x + f' (x' - x) + o (x' - x)` when `x'` tends to `x`.
Equations
- HasFDerivAt f f' x = HasFDerivAtFilter f f' x (nhds x)
A function `f` has derivative `f'` at `a` in the sense of strict differentiability if
`f x - f y - f' (x - y) = o (x - y)` as `x, y → a`. This form of differentiability is required,
e.g., by the inverse function theorem. Any `C^1` function on a vector space over `ℝ` is strictly
differentiable, but this definition also works, e.g., for vector spaces over `p`-adic numbers.
A function `f` is differentiable at a point `x` within a set `s` if it admits a derivative
there (possibly non-unique).
Equations
- DifferentiableWithinAt 𝕜 f s x = ∃ (f' : E →L[𝕜] F), HasFDerivWithinAt f f' s x
A function `f` is differentiable at a point `x` if it admits a derivative there (possibly
non-unique).
Equations
- DifferentiableAt 𝕜 f x = ∃ (f' : E →L[𝕜] F), HasFDerivAt f f' x
If `f` has a derivative at `x` within `s`, then `fderivWithin 𝕜 f s x` is such a derivative.
Otherwise, it is set to `0`. If `x` is isolated in `s`, we take the derivative within `s` to
be zero for convenience.
If `f` has a derivative at `x`, then `fderiv 𝕜 f x` is such a derivative. Otherwise, it is
set to `0`.
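The junk-value convention in action (a sketch, using Mathlib's `fderiv_zero_of_not_differentiableAt`):

```lean
import Mathlib.Analysis.Calculus.FDeriv.Basic

-- Sketch: with no derivative available, `fderiv` falls back to `0`.
example {f : ℝ → ℝ} {x : ℝ} (h : ¬ DifferentiableAt ℝ f x) :
    fderiv ℝ f x = 0 :=
  fderiv_zero_of_not_differentiableAt h
```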
`DifferentiableOn 𝕜 f s` means that `f` is differentiable within `s` at any point of `s`.
Equations
- DifferentiableOn 𝕜 f s = ∀ x ∈ s, DifferentiableWithinAt 𝕜 f s x
`Differentiable 𝕜 f` means that `f` is differentiable at any point.
Equations
- Differentiable 𝕜 f = ∀ (x : E), DifferentiableAt 𝕜 f x
If a function `f` has a derivative `f'` at `x`, a rescaled version of `f` around `x` converges
to `f'`, i.e., `n (f (x + (1/n) v) - f x)` converges to `f' v`. More generally, if `c n` tends
to infinity and `c n * d n` tends to `v`, then `c n * (f (x + d n) - f x)` tends to `f' v`.
This lemma expresses this fact, for functions having a derivative within a set. Its specific
formulation is useful for tangent cone related discussions.
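In conventional mathematical notation, the fact being expressed is (a sketch, writing the sequences `c n`, `d n` as $c_n$, $d_n$):

```latex
c_n \to \infty, \qquad c_n\, d_n \to v
\quad \Longrightarrow \quad
c_n \bigl( f(x + d_n) - f(x) \bigr) \to f'(v).
```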
If `f'` and `f₁'` are two derivatives of `f` within `s` at `x`, then they are equal on the
tangent cone to `s` at `x`.

`UniqueDiffWithinAt` achieves its goal: it implies the uniqueness of the derivative.
Basic properties of the derivative #
Converse to the mean value inequality: if `f` is differentiable at `x₀` and `C`-lipschitz on a
neighborhood of `x₀`, then its derivative at `x₀` has norm bounded by `C`. This version only
assumes that `‖f x - f x₀‖ ≤ C * ‖x - x₀‖` in a neighborhood of `x₀`.
Converse to the mean value inequality: if `f` is differentiable at `x₀` and `C`-lipschitz on a
neighborhood of `x₀`, then its derivative at `x₀` has norm bounded by `C`.
Converse to the mean value inequality: if `f` is differentiable at `x₀` and `C`-lipschitz, then
its derivative at `x₀` has norm bounded by `C`.
Alias of the forward direction of `hasFDerivWithinAt_univ`.

Alias of the forward direction of `hasFDerivWithinAt_insert`.

Alias of the reverse direction of `hasFDerivWithinAt_insert`.
If `f` is strictly differentiable at `x` with derivative `f'` and `K > ‖f'‖₊`, then `f` is
`K`-Lipschitz in a neighborhood of `x`.
If `f` is strictly differentiable at `x` with derivative `f'`, then `f` is Lipschitz in a
neighborhood of `x`. See also `HasStrictFDerivAt.exists_lipschitzOnWith_of_nnnorm_lt` for a
more precise statement.
Directional derivative agrees with `HasFDeriv`.
If `x` is isolated in `s`, then `f` has any derivative at `x` within `s`, since the statement
is vacuous.
If `x` is not in the closure of `s`, then `f` has any derivative at `x` within `s`, since the
statement is vacuous.
Converse to the mean value inequality: if `f` is `C`-lipschitz on a neighborhood of `x₀`, then
its derivative at `x₀` has norm bounded by `C`. This version only assumes that
`‖f x - f x₀‖ ≤ C * ‖x - x₀‖` in a neighborhood of `x₀`.
Converse to the mean value inequality: if `f` is `C`-lipschitz on a neighborhood of `x₀`, then
its derivative at `x₀` has norm bounded by `C`.

Version using `fderiv`.
Converse to the mean value inequality: if `f` is `C`-lipschitz, then its derivative at `x₀`
has norm bounded by `C`.

Version using `fderiv`.