Once-per-file cache for tactics
This file defines cache data structures for tactics that are initialized the first time they are accessed. Since Lean 4 starts one process per file, these caches are once-per-file and can for example be used to cache information about the imported modules.
The Cache α data structure is the most generic version we define. It is created using Cache.mk f, where f : MetaM α performs the initialization of the cache:
initialize numberOfImports : Cache Nat ← Cache.mk do
  return (← getEnv).imports.size

-- (does not work in the same module where the cache is defined)
#eval show MetaM Nat from numberOfImports.get
The DeclCache α data structure computes a fold over the environment's constants: DeclCache.mk empty f constructs such a cache, where empty : α and f : Name → ConstantInfo → α → MetaM α. The result of folding over the imported constants is cached between tactic invocations, while f is evaluated again every time for constants defined in the same file. This kind of cache can be used e.g. to populate discrimination trees.
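For illustration, here is a minimal sketch in the DeclCache.mk empty f form described above. The name theoremNames is made up, and the actual constructor may take further arguments (such as a profiling name and the pre/post hooks documented below), so treat this as a sketch of the shape rather than the exact API:

import Lean
open Lean Meta Std.Tactic

-- Hypothetical example: collect the names of all theorems. The fold over the
-- imported constants runs once per file; this file's own constants are
-- re-folded on every `get`.
initialize theoremNames : DeclCache (Array Name) ← DeclCache.mk
  (#[] : Array Name)
  (fun name info arr =>
    match info with
    | .thmInfo _ => pure (arr.push name)
    | _          => pure arr)

-- Later, e.g. inside a tactic:
--   let names ← theoremNames.get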
Once-per-file cache.
Equations
- Std.Tactic.Cache α = IO.Ref (Lean.MetaM α ⊕ Task (Except Lean.Exception α))
Equations
- (_ : Nonempty (Std.Tactic.Cache α)) = (_ : Nonempty (IO.Ref (Lean.MetaM α ⊕ Task (Except Lean.Exception α))))
Creates a cache with an initialization function.
Equations
- Std.Tactic.Cache.mk init = liftM (IO.mkRef (Sum.inl init))
Access the cache. Calling this function for the first time will initialize the cache with the function provided in the constructor.
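To make the first-access behaviour concrete, here is a simplified model of the idea. SimpleCache is a hypothetical stand-in, not the actual implementation: as the equation for Cache above shows, the real cache stores a Task (Except Exception α), so the initializer runs as a task and exceptions are cached as well.

import Lean
open Lean Meta

-- Hypothetical simplified model: a mutable cell that either still holds the
-- initializer or already holds the computed value.
structure SimpleCache (α : Type) where
  ref : IO.Ref (Sum (MetaM α) α)

variable {α : Type}

def SimpleCache.new (init : MetaM α) : BaseIO (SimpleCache α) := do
  let r ← IO.mkRef (Sum.inl init)
  return { ref := r }

def SimpleCache.get (c : SimpleCache α) : MetaM α := do
  match ← c.ref.get with
  | .inr a    => pure a          -- already initialized: reuse the stored value
  | .inl init =>
    let a ← init                 -- first access: run the initializer
    c.ref.set (.inr a)           -- remember the result for later calls
    pure a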
Cached fold over the environment's declarations, where a given function is applied to α for every constant.
- mk' :: (
- cache : Std.Tactic.Cache α
The cached data.
- addDecl : Lean.Name → Lean.ConstantInfo → α → Lean.MetaM α
Function for adding a declaration from the current file to the cache.
- addLibraryDecl : Lean.Name → Lean.ConstantInfo → α → Lean.MetaM α
Function for adding a declaration from the library to the cache. Defaults to the same behaviour as adding a declaration from the current file.
- )
Equations
- (_ : Nonempty (Std.Tactic.DeclCache α)) = (_ : Nonempty (Std.Tactic.DeclCache α))
Creates a DeclCache. First, if pre is nonempty, it is run to obtain a value, and if it succeeds the cache is populated with that value. If pre is empty, or if it fails, the cached structure α is instead initialized with empty, and then addLibraryDecl is called for every imported constant. After all imported constants have been added, post is run. Finally, the result is cached. When get is called, addDecl is also called for every constant in the current file.
Access the cache. Calling this function for the first time will initialize the cache with the function provided in the constructor.
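As a rough picture of what the descriptions above amount to, here is a sketch of the two traversals. It assumes the usual split of Environment.constants, where map₁ holds the imported constants and map₂ the constants declared in the current file; the real DeclCache additionally caches the first fold in a Cache and runs the pre/post hooks described above.

import Lean
open Lean Meta

-- Simplified sketch of the two folds performed by a `DeclCache`
-- (not the actual implementation).
def foldConstants {α : Type} (empty : α)
    (addLibraryDecl addDecl : Name → ConstantInfo → α → MetaM α) : MetaM α := do
  let env ← getEnv
  -- done once per file (and then cached): fold over all imported constants
  let mut acc := empty
  for (n, c) in env.constants.map₁.toList do
    acc ← addLibraryDecl n c acc
  -- redone on every `get`: fold over the constants of the current file
  for (n, c) in env.constants.map₂.toList do
    acc ← addDecl n c acc
  return acc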
A type synonym for a DeclCache containing a pair of discrimination trees. The first will store declarations in the current file, the second will store declarations from imports (and will hopefully be "read-only" after creation).
Discrimination tree settings for the DiscrTreeCache.
Equations
- Std.Tactic.DiscrTreeCache.config = { iota := true, beta := true, proj := Lean.Meta.ProjReductionKind.yesWithDelta, zeta := true }
Build a DiscrTreeCache from a function that returns a collection of keys and values for each declaration.
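As an illustration only: a hypothetical cache that indexes every constant by the discrimination-tree keys of its type, storing the constant's name. The profiling-name argument and the exact signatures of DiscrTreeCache.mk and DiscrTree.mkPath are assumptions here and vary between versions, so this shows the shape of the call rather than the precise API:

import Lean
open Lean Meta Std.Tactic

-- Hypothetical: one (keys, value) entry per constant, keyed by its type.
initialize constsByType : DiscrTreeCache Name ←
  DiscrTreeCache.mk "constsByType: init" fun name info => do
    -- `mkPath` computes the keys for an expression; the config argument
    -- is an assumption and may not be needed in every version.
    let keys ← DiscrTree.mkPath info.type DiscrTreeCache.config
    return #[(keys, name)]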
Get matches from both the discrimination tree for declarations in the current file, and for the imports.
Note that if you are calling this multiple times with the same environment, it will rebuild the discrimination tree for the current file multiple times; it would be more efficient to call c.get once and then call DiscrTree.getMatch multiple times.
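A sketch of the more efficient pattern suggested above: call c.get once to obtain both trees, then query them repeatedly. The helper name queryMany is made up, and depending on the version DiscrTree.getMatch may or may not take the reduction configuration as an extra argument:

import Lean
open Lean Meta Std.Tactic

-- Hypothetical helper: one `c.get`, many `getMatch` queries.
def queryMany (c : DiscrTreeCache Name) (goals : Array Expr) :
    MetaM (Array (Array Name)) := do
  let (currentFileTree, importedTree) ← c.get   -- builds/fetches both trees once
  goals.mapM fun e => do
    -- the config arguments are an assumption; some versions omit them
    let fromFile    ← currentFileTree.getMatch e DiscrTreeCache.config
    let fromImports ← importedTree.getMatch e DiscrTreeCache.config
    return fromFile ++ fromImports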