Lean-auto is an interface between Lean and automated theorem provers. To date, lean-auto has been maintained and developed primarily by Yicheng Qian (GitHub: PratherConid). It is under active development, and pull requests/issues are welcome. For more information, feel free to reach out to Yicheng Qian on Lean Zulip.
Lean-auto is based on a monomorphization procedure from dependent type theory to higher-order logic and a deep embedding of higher-order logic into dependent type theory. It is capable of handling dependently-typed and/or universe-polymorphic input terms. Currently, proof reconstruction can be handled by Duper, a higher-order superposition prover written in Lean.
Although Lean-auto is still under development, it is already able to solve nontrivial problems. For example, the first part of the "snake lemma" in category theory can be solved by a direct invocation of `auto`, and the second part can also be partly automated.
Type `auto 👍` to see whether auto is set up.
- `auto [<term>,*] u[<ident>,*] d[<ident>,*]` (a usage sketch is given below, after the list of supported solvers)
  - `u[<ident>,*]`: Unfold identifiers
  - `d[<ident>,*]`: Add definitional equality related to identifiers
- Currently, `auto` supports
  - SMT solver invocation: `set_option auto.smt true`, but without proof reconstruction. Make sure that SMT solvers are installed, and that `auto.smt.solver.name` is correctly set.
  - TPTP solver invocation: `set_option auto.tptp true`, but without proof reconstruction. Make sure that TPTP solvers (currently only zipperposition is supported) are installed, and that `auto.tptp.solver.name` and `auto.tptp.zeport.path` are correctly set.
  - Proof search by a native prover. To enable proof search by a native prover, use `set_option auto.native true`, and use `attribute [rebind Auto.Native.solverFunc] <solve_interface>` to bind `lean-auto` to the interface of the solver, which should be a Lean constant of type `Array Lemma → MetaM Expr` (a minimal binding sketch is given below, after the list of supported solvers).
- SMT solver invocation:
  - `z3`, version >= 4.12.2. Lower versions may not be able to deal with the smt-lib 2.6 string escape sequences.
  - `cvc5`
- TPTP solver invocation:
  - `zipperposition`
  - `zipperposition` portfolio mode
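The following is a minimal usage sketch of the `auto` tactic syntax above. It assumes that lean-auto is installed (imported here as `Auto`) and that at least one of the solvers listed above has been configured; the hypothesis `h` is a hypothetical fact used only for illustration.

```lean
import Auto

-- A hypothetical fact `h` is passed to `auto` explicitly; `u[...]` and
-- `d[...]` could additionally be supplied to unfold identifiers or to add
-- definitional equalities.
example (h : ∀ x : Nat, x + 0 = x) (n : Nat) : n + 0 = n := by
  auto [h]
```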
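Below is a minimal sketch of how a native prover might be bound, assuming the package's top-level module is `Auto` and that `Lemma` refers to lean-auto's lemma type; `mySolverFunc` is a hypothetical placeholder, not an actual prover.

```lean
import Auto
open Lean Auto

-- Hypothetical native prover interface of type `Array Lemma → MetaM Expr`.
-- A real implementation would inspect the collected lemmas and return a
-- proof term for the conjectured goal.
def mySolverFunc (_lemmas : Array Lemma) : MetaM Expr :=
  throwError "mySolverFunc: native prover not implemented"

-- Bind lean-auto to this interface and enable native proof search.
attribute [rebind Auto.Native.solverFunc] mySolverFunc

set_option auto.native true
```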
- Command `#getExprAndApply [ <term> | <ident> ]`: Defined in `ExprExtra.lean`. This command first elaborates the `<term>` into a Lean `Expr`, then applies the function `<ident>` to the `Expr`. The constant `<ident>` must already be declared and be of type `Expr → TermElabM Unit` (a usage sketch is given after this list).
- Command `#genMonadState <term>, #genMonadContext <term>`: Defined in `MonadUtils.lean`. Refer to the comment at the beginning of `MonadUtils.lean`.
- Command `#fromMetaTactic [<ident>]`: Calls `Tactic.liftMetaTactic` on `<ident>`. The constant `<ident>` must already be declared and be of type `MVarId → MetaM (List MVarId)`.
- Lexical Analyzer Generator: `Parser/LeanLex.lean`. The frontend is not yet implemented. The backend can be found in `NDFA.lean`.
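As a usage sketch for `#getExprAndApply` (assuming that importing `Auto` brings the command into scope), `printExpr` below is a hypothetical `Expr → TermElabM Unit` function; the command elaborates the term and hands the resulting `Expr` to it.

```lean
import Auto
open Lean Elab Term

-- Hypothetical receiver function of type `Expr → TermElabM Unit`.
def printExpr (e : Expr) : TermElabM Unit :=
  logInfo m!"elaborated expression: {e}"

-- Elaborates `(1 + 2 : Nat)` and applies `printExpr` to the resulting `Expr`.
#getExprAndApply [ (1 + 2 : Nat) | printExpr ]
```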
- Let $H : \alpha$ be an assumption. We require that
  - $(1)$ If the type $\beta$ of any subterm $t$ of $\alpha$ depends on a bound variable $x$ inside $\alpha$, and $\beta$ is not of type $Prop$, then $x$ must be instantiated. Examples: Monomorphization, section `InstExamples`
  - $(2)$ If any binder $x$ of $\alpha$ has binderinfo `instImplicit`, then the binder $x$ must be instantiated via typeclass inference.
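The two hypothetical assumption shapes below illustrate these requirements; they are illustrative only and not taken from lean-auto's own examples (for those, see section `InstExamples` mentioned above).

```lean
-- (1) The type of the subterm `x` is the bound variable `α`, which is not a
--     `Prop`, so `α` must be instantiated (e.g. by monomorphization).
example : Prop := ∀ (α : Type) (x : α), x = x

-- (2) The binder `[inst : Add β]` has binderinfo `instImplicit`, so it must be
--     instantiated via typeclass inference once `β` is fixed.
example : Prop := ∀ (β : Type) [inst : Add β] (a : β), a + a = a + a
```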
- TODO
- Collecting assumptions from the local context and user-provided facts
  - We reduce `let` binders and unfold projections when we collect assumptions. So, in the following discussion, we'll assume that the expression contains no `let` binders and no `proj`s.
  - We also $\beta$-reduce user-provided facts so that there is nothing like $(\lambda x.\ t_1)\ t_2$ (a small illustration follows at the end of this list).
- $CIC \to COC$: Collecting constructors and recursors for inductive types (effectively, not directly)
  - e.g. collecting equational theorems for `match` constructs
  - e.g. collecting constructors for inductively defined propositions
- $COC \to COC(\lambda^{c.u.})$: Monomorphization (a sketch follows at the end of this list)
  - Monomorphize all (dependently typed/universe polymorphic) facts to higher-order, universe-monomorphic facts
  - $c.u.$ stands for "constant universe level"
  - Note that at this stage, all the facts we've obtained are still valid $CIC$ expressions and have convenient $CIC$ proofs from the assumptions.
- $COC(\lambda^{c.u.}) \to COC(\lambda)$
  - We want all types $\alpha$ occurring in the signatures of constants and variables to be of sort `Type (u + 1)`, i.e., $\alpha : Type\ (u + 1)$. This is necessary because we want to write a checker (instead of directly reconstructing proofs in DTT), and the valuation function from the less expressive logic to dependent type theory requires [the elements in the range of the valuation function] to be [of the same sort].
  - To do this, we use `GLift`. For example, `Nat.add` is transformed into `Nat.addLift` (a usage sketch follows at the end of this list):
    ```lean
    structure GLift.{u, v} (α : Sort u) : Sort (max u (v + 1)) where
      /-- Lift a value into `GLift α` -/
      up ::
      /-- Extract a value from `GLift α` -/
      down : α

    def Nat.addLift.{u} (x y : GLift.{1, u} Nat) :=
      GLift.up (Nat.add (GLift.down x) (GLift.down y))
    ```
  - We only transfer these "lifted" terms to the less expressive $\lambda_2$, and $\lambda_2$ is unaware of the universe levels wrapped inside `GLift.up`.
  - Lifted constants should be introduced into the local context. Theorems corresponding to the original ones, but using only lifted constants and with uniform universe levels, should also be introduced into the local context. Later translations should only use theorems and constants with uniform universe levels.
- $\lambda \to \lambda(\text{TPTP TF0})$: Instantiating function arguments
  - $\lambda$ is the reified $COC(\lambda)$
- There should also be a process similar to ULifting that "lifts" `Bool` into `Prop` and `Nat` into `Int`
- $COC(\lambda) \to \lambda(\text{TPTP TH0})$: `Auto/Translation/LamReif.lean`
- $\lambda(\text{TPTP TF0}) \to \text{TPTP TF0}$: `Auto/Translation/LamFOL2SMT.lean`
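As a small, self-contained illustration of the preprocessing at the start of the list above (reduction of `let` binders and $\beta$-redexes during assumption collection), both forms below reduce to the same term before collection:

```lean
-- Both the β-redex and the `let` binder reduce to `2 + 1`, which is the form
-- in which such a fact would be collected.
example : (fun x => x + 1) 2 = (let y := 2; y + 1) := rfl
```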
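The next sketch illustrates the monomorphization step on a toy fact. The names `appendNilPoly` and `appendNilNat` are hypothetical, and the actual procedure works on collected assumptions rather than on declared theorems.

```lean
-- A universe-polymorphic fact ...
theorem appendNilPoly.{u} : ∀ (α : Type u) (l : List α), l ++ [] = l :=
  fun _ l => List.append_nil l

-- ... is specialized at the (monomorphic) types occurring in the problem,
-- here `α := Nat`, yielding a higher-order, universe-monomorphic fact.
theorem appendNilNat : ∀ (l : List Nat), l ++ [] = l :=
  fun l => appendNilPoly Nat l
```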
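Finally, here is a small usage sketch for the `GLift`-based lifting shown in the $COC(\lambda^{c.u.}) \to COC(\lambda)$ step above. `GLift` is repeated so that the sketch is self-contained, and `Nat.addLift` is written with fixed universe levels for simplicity; the point is that computation is unaffected by the lifting.

```lean
structure GLift.{u, v} (α : Sort u) : Sort (max u (v + 1)) where
  up ::
  down : α

-- A universe-monomorphic variant of the lifted addition from the step above.
def Nat.addLift (x y : GLift.{1, 0} Nat) : GLift.{1, 0} Nat :=
  GLift.up (Nat.add (GLift.down x) (GLift.down y))

-- Lifting and unlifting round-trip, so lifted addition computes like `Nat.add`.
example : (Nat.addLift (GLift.up 1) (GLift.up 2)).down = 3 := rfl
```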
- The checker is based on a deep embedding of simply-typed lambda calculus into dependent type theory.
- The checker is slow on large input. For example, it takes `6s` to typecheck the final example in `BinderComplexity.lean`. However, this is probably acceptable for mathlib usage, because e.g. `Mathlib/Analysis/BoxIntegral/DivergenceTheorem.lean` has two theorems that take `4s` to compile (but a large portion of the `4s` is spent on typeclass inference).
- `defeq <num> <name>`: The `<num>`-th definitional equality associated with definition `<name>`
- `hw <name>`: Lemmas hard-wired into Lean-auto
- `lctxInh`: Inhabitation fact from local context
- `lctxLem`: Lemma from local context
- `queryNative::<func_name>`: Proved by native prover
- `rec <indName>.<ctorName>`
- `rw [0, 1]`: Rewrite `0` using `1` (`1` must be an equality)
- `tyCanInh`: Inhabitation instance synthesized for canonicalized type
- `❰<term>❱`: User-provided lemma `<term>`