An auto-differentiation library.
Currently supported features:

- Forward auto-differentiation
- Reverse auto-differentiation
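Forward mode works by propagating a derivative alongside each value through the chain rule. A minimal, self-contained sketch of that idea (illustrative only, not this library's implementation; the library's `F` type generalizes it):

```rust
// Minimal forward-mode sketch: a dual number pairs a value with a
// derivative, and each arithmetic operation applies the chain rule.
#[derive(Clone, Copy, Debug)]
struct Dual {
    val: f64, // value of the expression
    dx: f64,  // derivative with respect to the seeded variable
}

impl Dual {
    fn var(v: f64) -> Dual { Dual { val: v, dx: 1.0 } } // seed: d(x)/dx = 1
    fn cst(v: f64) -> Dual { Dual { val: v, dx: 0.0 } } // constants have zero derivative
}

impl std::ops::Mul for Dual {
    type Output = Dual;
    fn mul(self, o: Dual) -> Dual {
        // Product rule: (uv)' = u'v + uv'
        Dual {
            val: self.val * o.val,
            dx: self.dx * o.val + self.val * o.dx,
        }
    }
}

fn main() {
    // f(x) = 2 * x * x at x = 3: value 18, derivative 4x = 12.
    let x = Dual::var(3.0);
    let y = Dual::cst(2.0) * x * x;
    println!("f(3) = {}, f'(3) = {}", y.val, y.dx); // prints `f(3) = 18, f'(3) = 12`
}
```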
To compute a derivative with respect to a variable using this library:

- Create a variable of type `F`, which implements the `Float` trait from the
  `num-traits` crate.
- Compute your function using this variable as the input.
- Request the derivative from this variable using the `deriv` method.
This library is a work in progress and is not ready for production use.
The following example differentiates a 1D function defined by a closure.
```rust
// Define a function `f(x) = e^{-0.5*x^2}`.
let f = |x: FT<f64>| (-x * x / F1::cst(2.0)).exp();
// Differentiate `f` at zero.
println!("{}", diff(f, 0.0)); // prints `0`
```
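Since `f(x) = e^{-x^2/2}` has the exact derivative `f'(x) = -x e^{-x^2/2}`, which vanishes at `x = 0`, the result can be cross-checked with a plain central difference that involves no auto-differentiation at all (the helper name below is illustrative, not part of this library):

```rust
// Cross-check with a central finite difference (plain Rust, no autodiff).
// f(x) = e^{-x^2/2} has f'(x) = -x e^{-x^2/2}, so f'(0) = 0.
fn f(x: f64) -> f64 {
    (-x * x / 2.0).exp()
}

// Central-difference approximation of the derivative at `x`.
fn central_diff(f: fn(f64) -> f64, x: f64, h: f64) -> f64 {
    (f(x + h) - f(x - h)) / (2.0 * h)
}

fn main() {
    let d = central_diff(f, 0.0, 1e-6);
    println!("{}", d); // prints `0` (f is even, so the difference vanishes exactly)
}
```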
To compute the gradient of a function, use the function `grad` as follows:

```rust
// Define a function `f(x,y) = x*y^2`.
let f = |x: &[FT<f64>]| x[0] * x[1] * x[1];
// Differentiate `f` at `(1,2)`.
let g = grad(f, &vec![1.0, 2.0]);
println!("({}, {})", g[0], g[1]); // prints `(4, 4)`
```
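For `f(x, y) = x*y^2`, the analytic gradient is `(y^2, 2xy)`, which is `(4, 4)` at `(1, 2)`. A finite-difference gradient in plain Rust (helper names here are illustrative, not part of this library) confirms the value:

```rust
// Finite-difference gradient check (plain Rust, no autodiff).
// For f(x, y) = x * y^2, the gradient is (y^2, 2xy) = (4, 4) at (1, 2).
fn f(v: &[f64]) -> f64 {
    v[0] * v[1] * v[1]
}

// Central-difference gradient: perturb one coordinate at a time.
fn grad_fd(f: fn(&[f64]) -> f64, v: &[f64], h: f64) -> Vec<f64> {
    (0..v.len())
        .map(|i| {
            let mut plus = v.to_vec();
            let mut minus = v.to_vec();
            plus[i] += h;
            minus[i] -= h;
            (f(&plus) - f(&minus)) / (2.0 * h)
        })
        .collect()
}

fn main() {
    let g = grad_fd(f, &[1.0, 2.0], 1e-6);
    println!("({:.3}, {:.3})", g[0], g[1]); // prints `(4.000, 4.000)`
}
```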
Compute a specific derivative of a multi-variable function:

```rust
// Define a function `f(x,y) = x*y^2`.
let f = |v: &[FT<f64>]| v[0] * v[1] * v[1];
// Differentiate `f` at `(1,2)` with respect to `x` (the first unknown) only.
let v = vec![
    F1::var(1.0), // Create a variable.
    F1::cst(2.0), // Create a constant.
];
println!("{}", f(&v).deriv()); // prints `4`
```
Support for `approx`, `cgmath` and `nalgebra` is provided via the `approx`, `cgmath` and `na` feature flags, respectively.
This repository is licensed under either of
- Apache License, Version 2.0, (LICENSE-APACHE or https://www.apache.org/licenses/LICENSE-2.0)
- MIT License (LICENSE-MIT or https://opensource.org/licenses/MIT)
at your option.
This library started as a fork of rust-ad.