Documentation

LeanCert.Core.DerivativeIntervals

Derivative Interval Library #

This file provides a systematic collection of derivative interval facts for basic functions. These lemmas establish that derivatives of common functions lie within specified intervals, which feeds into interval arithmetic and AD correctness proofs.

Main theorems #

Exp (derivative = exp, always positive) #

Log (derivative = 1/x on (0, ∞)) #

Arctan (derivative = 1/(1+x²) ∈ (0, 1]) #

Arsinh (derivative = 1/√(1+x²) ∈ (0, 1]) #

Sin/Cos (derivatives bounded by 1) #

Sinh/Cosh (derivative relationships) #

Design notes #

These lemmas are independent of any specific algorithm; they're just facts about functions. They're designed to be composable with the chain rule for AD correctness proofs.

Exp derivative intervals #

The derivative of exp is positive everywhere

exp is strictly increasing (follows from positive derivative)

For x ∈ [a, b], we have exp' x = exp x ∈ [exp a, exp b]
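The interval fact for exp follows directly from monotonicity. A minimal sketch against Mathlib's `Real.exp_le_exp` (an illustrative `example`, not the statement of this file's lemma):

```lean
import Mathlib

-- Sketch: for x ∈ [a, b], exp x ∈ [exp a, exp b],
-- using monotonicity of exp (`Real.exp_le_exp`).
example {a b x : ℝ} (hx : x ∈ Set.Icc a b) :
    Real.exp x ∈ Set.Icc (Real.exp a) (Real.exp b) :=
  ⟨Real.exp_le_exp.mpr hx.1, Real.exp_le_exp.mpr hx.2⟩
```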

Log derivative intervals #

The derivative of log at x > 0 is x⁻¹

For x ∈ [a, b] with 0 < a ≤ b, we have log' x = 1/x ∈ [1/b, 1/a]
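A sketch of the log derivative bound, using `one_div_le_one_div_of_le` (antitonicity of `1 / ·` on `(0, ∞)`; the exact lemma name may differ across Mathlib versions):

```lean
import Mathlib

-- Sketch: for x ∈ [a, b] with 0 < a, 1/x ∈ [1/b, 1/a].
example {a b x : ℝ} (ha : 0 < a) (hx : x ∈ Set.Icc a b) :
    1 / x ∈ Set.Icc (1 / b) (1 / a) :=
  ⟨one_div_le_one_div_of_le (ha.trans_le hx.1) hx.2,
   one_div_le_one_div_of_le ha hx.1⟩
```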

Arctan derivative intervals #

The derivative of arctan at x is 1/(1+x²)

1/(1+x²) is always positive

arctan' x ∈ [0, 1] for all x (closed interval version for interval arithmetic)

arctan is strictly increasing (follows from positive derivative)
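The two halves of the arctan derivative bound can be checked as a standalone sketch (an illustrative `example`, stated with `1 / (1 + x²)` rather than `deriv Real.arctan`):

```lean
import Mathlib

-- Sketch: 1/(1 + x²) lies in (0, 1]; the two bounds separately.
example (x : ℝ) : 0 < 1 / (1 + x ^ 2) ∧ 1 / (1 + x ^ 2) ≤ 1 := by
  have h1 : (0 : ℝ) < 1 + x ^ 2 := by positivity
  refine ⟨by positivity, ?_⟩
  rw [div_le_one h1]
  linarith [sq_nonneg x]
```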

Arsinh derivative intervals #

The derivative of arsinh at x is (√(1+x²))⁻¹

√(1+x²) ≥ 1 for all x

√(1+x²) > 0 for all x

arsinh' x ∈ [0, 1] for all x (closed interval version)
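The key step for the arsinh bound is `√(1 + x²) ≥ 1`, which gives `1/√(1 + x²) ≤ 1`. A sketch via Mathlib's `Real.sqrt_le_sqrt`:

```lean
import Mathlib

-- Sketch: √(1 + x²) ≥ 1, so arsinh' x = 1/√(1 + x²) ≤ 1.
example (x : ℝ) : 1 ≤ Real.sqrt (1 + x ^ 2) := by
  have h : Real.sqrt 1 ≤ Real.sqrt (1 + x ^ 2) :=
    Real.sqrt_le_sqrt (by linarith [sq_nonneg x])
  simpa [Real.sqrt_one] using h
```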

Sin derivative intervals #

The derivative of sin is cos

sin' x ∈ [-1, 1] for all x
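A sketch of the sin derivative bound, rewriting with Mathlib's `Real.deriv_sin` and the standard cos bounds (lemma names assumed from current Mathlib):

```lean
import Mathlib

-- Sketch: sin' = cos, and cos x ∈ [-1, 1] for all x.
example (x : ℝ) : deriv Real.sin x ∈ Set.Icc (-1 : ℝ) 1 := by
  rw [Real.deriv_sin]
  exact ⟨Real.neg_one_le_cos x, Real.cos_le_one x⟩
```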

Cos derivative intervals #

The derivative of cos is -sin

cos' x ∈ [-1, 1] for all x

Sinh derivative intervals #

The derivative of sinh is cosh

sinh' x = cosh x ≥ 1 for all x
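A sketch of the sinh derivative bound via `Real.deriv_sinh` and `Real.one_le_cosh` (names assumed from current Mathlib):

```lean
import Mathlib

-- Sketch: sinh' = cosh, and cosh x ≥ 1 everywhere.
example (x : ℝ) : 1 ≤ deriv Real.sinh x := by
  rw [Real.deriv_sinh]
  exact Real.one_le_cosh x
```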

Cosh derivative intervals #

The derivative of cosh is sinh

For x > 0, cosh' x = sinh x > 0

For x < 0, cosh' x = sinh x < 0

One-Sided Taylor Bounds #

For convex functions, the Taylor polynomial provides a one-sided bound. For exp on [0, ∞), the n-th Taylor polynomial is a lower bound. For concave functions, similar upper bounds hold.

These bounds are tighter than symmetric remainder bounds at endpoints, which is important for verified numerics.

Exp lower bounds (convexity gives Taylor poly ≤ function) #

1 ≤ exp x for x ≥ 0 (Taylor degree 0)

1 + x ≤ exp x for all x (Taylor degree 1)

x + 1 ≤ exp x for all x (alternate form)
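The degree-1 lower bound is exactly Mathlib's `Real.add_one_le_exp`, up to commuting the sum; a minimal check:

```lean
import Mathlib

-- Sketch: the degree-1 Taylor lower bound for exp,
-- from Mathlib's `Real.add_one_le_exp : x + 1 ≤ exp x`.
example (x : ℝ) : 1 + x ≤ Real.exp x := by
  linarith [Real.add_one_le_exp x]
```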

Monotonicity from derivative bounds #

These wrap the corresponding Mathlib lemmas for convenience under our naming scheme.

theorem LeanCert.Core.OneSidedTaylor.monotoneOn_of_deriv_nonneg {f : ℝ → ℝ} {a b : ℝ} (hf : ContinuousOn f (Set.Icc a b)) (hf' : DifferentiableOn ℝ f (Set.Ioo a b)) (hderiv : ∀ x ∈ Set.Ioo a b, 0 ≤ deriv f x) : MonotoneOn f (Set.Icc a b)

If f' ≥ 0 on (a, b), then f is monotone increasing on [a, b]
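How such a wrapper bottoms out in Mathlib can be sketched with the interior-based `monotoneOn_of_deriv_nonneg`; the exact hypothesis shape varies across Mathlib versions, so this is illustrative only:

```lean
import Mathlib

-- Sketch: monotonicity of exp on [0, 1] from its nonnegative
-- derivative, via Mathlib's `monotoneOn_of_deriv_nonneg`.
example : MonotoneOn Real.exp (Set.Icc (0 : ℝ) 1) := by
  apply monotoneOn_of_deriv_nonneg (convex_Icc 0 1)
    Real.continuous_exp.continuousOn
  · exact Real.differentiable_exp.differentiableOn
  · intro x _
    rw [Real.deriv_exp]
    exact (Real.exp_pos x).le
```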

theorem LeanCert.Core.OneSidedTaylor.strictMonoOn_of_deriv_pos {f : ℝ → ℝ} {a b : ℝ} (_hab : a < b) (hf : ContinuousOn f (Set.Icc a b)) (_hf' : DifferentiableOn ℝ f (Set.Ioo a b)) (hderiv : ∀ x ∈ Set.Ioo a b, 0 < deriv f x) : StrictMonoOn f (Set.Icc a b)

If f' > 0 on (a, b), then f is strictly monotone on [a, b]

theorem LeanCert.Core.OneSidedTaylor.antitoneOn_of_deriv_nonpos {f : ℝ → ℝ} {a b : ℝ} (hf : ContinuousOn f (Set.Icc a b)) (hf' : DifferentiableOn ℝ f (Set.Ioo a b)) (hderiv : ∀ x ∈ Set.Ioo a b, deriv f x ≤ 0) : AntitoneOn f (Set.Icc a b)

If f' ≤ 0 on (a, b), then f is antitone (monotone decreasing) on [a, b]

theorem LeanCert.Core.OneSidedTaylor.strictAntiOn_of_deriv_neg {f : ℝ → ℝ} {a b : ℝ} (_hab : a < b) (hf : ContinuousOn f (Set.Icc a b)) (_hf' : DifferentiableOn ℝ f (Set.Ioo a b)) (hderiv : ∀ x ∈ Set.Ioo a b, deriv f x < 0) : StrictAntiOn f (Set.Icc a b)

If f' < 0 on (a, b), then f is strictly antitone on [a, b]