independent of the multi-usage internal "letin_tac"
|
As far as I know, this plugin is untested and barely maintained. I don't
think it has real use cases any more, so let's move it out of the repo
and see if somebody wants to take over and maintain it.
We also remove the documentation, which was telling our users to look at
ring for an example of reification done using quote, when in fact ring
was not using it anymore.
|
Removing in passing two `Local` attributes which are no-ops in practice.
|
This module contains:
- the former ExtraEnv in pretyping
- a few functions to traverse binders in pretyping.ml and cases.ml
- the part of pretyping dealing with genarg interpretation
The dependency of pretyping on an interpretation of names as names of
variables or identifiers is now hidden in GlobEnv (no more explicit
"lvar" management in pretyping.ml). Similarly for the interpretation
of names as terms and for the interpretation of tactics-in-terms.
We keep empty_lvar in Glob_ops for compatibility, even though it is a
bit isolated there.
|
[Dune](https://github.com/ocaml/dune) is a compositional declarative
build system for OCaml. It provides automatic generation of
`version.ml`, `.merlin`, `META`, `opam`, API documentation; install
management; easy integration with external libraries, test runners,
and modular builds.
In particular, Dune handles components uniformly regardless of whether
they live in-tree or out of the tree. This greatly simplifies cases where a
plugin [or CoqIde] is checked out in the current working copy but then
distributed separately [and vice versa]. Dune can thus be used as a
more flexible `coq_makefile` replacement.
For now we provide experimental support for a Dune build. In order to
build Coq + the standard library with Dune, type:
```
$ make -f Makefile.dune world
```
This PR includes a preliminary, developer-only preview of Dune for
Coq. There is still ongoing work, see
https://github.com/coq/coq/issues/8052 for tracking status towards
full support.
## Technical description
Dune works out of the box with Coq, once we have fixed some modularity
issues. The main remaining challenge was to support `.vo` files.
As Dune doesn't support custom build rules yet, to properly build
`.vo` files we provide a small helper script `tools/coq_dune.ml`. The
script will scan the Coq library directories and generate the
corresponding rules for `.v -> .vo` and `.ml4 -> .ml` builds. The
script uses `coqdep` to correctly compute the dependencies of
`.v` files. `coq_dune` is akin to `coq_makefile` and should eventually be
usable for building Coq projects as well.
Due to this limitation, the build process has to proceed in three stages:
1) build `coqdep` and `coq_dune`; 2) generate `dune` files for
`theories` and `plugins`; 3) perform a regular build with all
targets in scope.
## FAQ
### Why Dune?
Coq has a moderately complex build system and it is not a secret that
many developer-hours have been spent fighting with `make`.
In particular, the current `make`-based system offers poor support
for verifying that the build rules and variables are coherent, and
requires significant manual, error-prone maintenance: many variables must
be passed by hand, duplicated, etc. Additionally, our make system
offers poor integration with now-standard OCaml ecosystem tools such
as `opam`, `ocamlfind` or `odoc`. Another critical point is build
compositionality. Coq is rich in 3rd-party contributions, and a big
shortcoming of the current make system is that it cannot be used to
build these projects, which requires us to maintain a custom tool,
`coq_makefile`, at the corresponding cost.
In the past, there have been some efforts to migrate Coq to more
specialized build systems; however, these stalled for a variety of
reasons. Dune is a declarative, OCaml-specific build tool that is on
the path to becoming the standard build system for the OCaml ecosystem.
Dune seems to be a good fit for Coq: it is well-supported, fast,
compositional, and designed for large projects.
### Does Dune replace the make-based build system?
The current make-based build system is unmodified by this PR and kept
as the default option. However, Dune has the potential to replace it in
the future.
### Is this PR complete? What does it provide?
This PR is ready for developer preview and feedback. The build system
is functional; however, more work is necessary in order to make Dune
the default for Coq.
The main TODOs are tracked at https://github.com/coq/coq/issues/8052
This PR allows developers to use most of the features of Dune today:
- Modular organization of the codebase; each component is built only
against its declared dependencies, so component boundaries are
enforced more strictly.
- Hygienic builds; Dune places all artifacts under `_build`.
- Automatic generation of `.install` files, simplified OPAM workflow.
- `utop` support, `-opaque` in developer mode, etc...
- `ml4` files are handled using `coqp5`, a native-code customized
camlp5 executable which brings much faster `ml4 -> ml` processing.
### What dependencies does Dune require?
Dune doesn't depend on any 3rd party package other than the OCaml compiler.
### Some benchmarks
```
$ /usr/bin/time make DUNEOPT="-j 1000" -f Makefile.dune states
59.50user 18.81system 0:29.83elapsed 262%CPU (0avgtext+0avgdata 302996maxresident)k
0inputs+646632outputs (0major+4893811minor)pagefaults 0swaps
$ /usr/bin/time sh -c "./configure -local -native-compiler no && make -j states"
88.21user 23.65system 0:32.96elapsed 339%CPU (0avgtext+0avgdata 304992maxresident)k
0inputs+1051680outputs (0major+5300680minor)pagefaults 0swaps
```
|
Dune will complain about these leftovers / dead files in the tree.
|
Looks like this bug was introduced when unification started raising the
UnableToUnify exception in 8ac929ea128f1f7353b3f4d532b642e769542e55.
I now turn this exception into a PretypeError that is correctly caught
and printed.
|
Thanks to Emilio and Pierre-Marie Pédrot for pointers.
|
Aliases of global references can now be used in numeral notations
|
Now we support using inductive constructors and section-local variables
as numeral notation printing and parsing functions.
I'm not sure that I got the econstr conversion right.
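A hedged sketch of what using a constructor as the parsing function looks
like; the type, constructor, and scope names below are illustrative and not
part of this commit:
```
(* Illustrative only: Wrap itself is the parsing function (Z -> wrapped_Z),
   and unwrap the printing function (wrapped_Z -> Z). *)
Require Import ZArith.

Inductive wrapped_Z := Wrap : Z -> wrapped_Z.

Definition unwrap (w : wrapped_Z) : Z :=
  match w with Wrap z => z end.

Delimit Scope wrapped_scope with wrapped.

Numeral Notation wrapped_Z Wrap unwrap : wrapped_scope.

Check 42%wrapped.  (* parsed via the constructor Wrap *)
```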
|
As per https://github.com/coq/coq/pull/8064#pullrequestreview-145971522
|
As per https://github.com/coq/coq/pull/8064#discussion_r209875616
I decided to make it a warning because it seems more flexible that way;
users who are flipping back and forth between option types and non-option
types while designing won't have to update their `abstract after`
directives to do so, and users who don't want to allow this can turn the
warning into an actual error.
|
Also make `Check S` no longer raise an anomaly
Add a couple more test cases for numeral notations
Also add another possibly-confusing error message to the doc.
Respond to Hugo's doc request with Zimmi48's suggestion
From https://github.com/coq/coq/pull/8064/files#r204191608
|
Some of this code is cargo-culted or kludged to work.
As I understand it, the situation is as follows:
There are two sorts of use-cases that need to be supported:
1. A plugin registers an OCaml function as a numeral interpreter. In
this case, the function registration must be synchronized with the
document state, but the functions should not be marshalled / stored
in the .vo.
2. A vernacular registers a Gallina function as a numeral interpreter.
In this case, the registration must be synchronized, and the function
should be marshalled / stored in the .vo.
In case (1), we can compare functions by pointer equality, and we should
be able to rely on globally unique keys, even across backtracking.
In case (2), we cannot compare functions by pointer equality (because
they must be regenerated on unmarshalling when `Require`ing a .vo file),
and we also cannot rely on any sort of unique key being both unique and
persistent across files.
The solution we use here is that we ask clients to provide "unique"
keys, and that clients tell us whether or not to overwrite existing
registered functions, i.e., to tell us whether or not we should expect
interpreter functions to be globally unique under pointer equality. For
plugins, a simple string suffices, as long as the string does not clash
between different plugins. In the case of vernacular-registered
functions, we marshal a description of all of the data used to
generate the function and use that string as a unique key which is
expected to persist across files. Because we cannot rely on
function-pointer uniqueness here, we tell the
interpretation-registration to allow overwriting.
----
Some of this code is in response to comments on the PR.
----
Some code fixes an issue that bignums revealed:
Both Int31 and bignums registered numeral notations in int31_scope. We
now prepend a globally unique identifier when registering numeral
notations from OCaml plugins. This is permissible because we don't
store the uid information for such notations in .vo files (assuming I'm
understanding the code correctly).
|
```
git grep --name-only 'should goes' | xargs sed s'/should goes/should go/g' -i
```
|
This way, we could fully bypass bigint.ml.
The previous mechanism of parsing/printing Z is kept for now.
Currently, the conversion functions accepted by Numeral Notation foo
may have the following types.
for parsing:
int -> foo
int -> option foo
uint -> foo
uint -> option foo
Z -> foo
Z -> option foo
for printing:
foo -> int
foo -> option int
foo -> uint
foo -> option uint
foo -> Z
foo -> option Z
Notes:
- The Declare ML Module is now done directly in Prelude
- When doing a Numeral Notation, having the Z datatype around
isn't mandatory anymore (but the error messages suggest that
it can still be used).
- An option (abstract after ...) makes it possible to keep large numbers
in an abstract form such as (Nat.of_uint 123456), instead of reducing them
to (S (S (S ...))) and quickly hitting a stack overflow (see the sketch
below).
- After checking with Matthieu, there is now an explicit check
and an error message in case of polymorphic inductive types.
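A minimal sketch of the resulting command, assuming `Nat.of_uint` and
`Nat.to_uint` from the prelude (matching the `uint -> foo` / `foo -> uint`
shapes above); the exact spelling and placement of the `(abstract after ...)`
clause follows the notes above and may differ in detail:
```
(* Sketch only: (re)declares a numeral notation for nat in nat_scope using
   the uint-based conversion functions, keeping literals above 5000 abstract. *)
Numeral Notation nat Nat.of_uint Nat.to_uint : nat_scope
  (abstract after 5000).

Check 3.       (* below the threshold: elaborated to S (S (S O)) *)
Check 123456.  (* kept abstract as (Nat.of_uint 123456) *)
```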
|
This is a portion of roglo's PR#156 introducing a Numeral Notation
command: we deal here with inductive types via conversion functions
from/to Z written in Coq.
For an example, see plugins/syntax/NatSyntaxViaZ.v
This commit does not include the part about printing via some ltac.
Using ltac was meant for dealing with real numbers; let's first see what
becomes of PR#415 about a compact representation for real literals.
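For a sense of what such Coq-side conversion functions look like, here is an
illustrative sketch (not the actual contents of plugins/syntax/NatSyntaxViaZ.v;
the function names are made up):
```
Require Import ZArith.

(* Parsing direction: the option return type lets the notation reject
   negative literals. *)
Definition nat_of_Z (z : Z) : option nat :=
  match z with
  | Z0 => Some O
  | Zpos p => Some (Pos.to_nat p)
  | Zneg _ => None
  end.

(* Printing direction. *)
Definition nat_to_Z (n : nat) : Z := Z.of_nat n.
```
Such a pair would then be handed to the Numeral Notation command for the
target inductive type.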
|
The first part (e.g. register_bignumeral_interpretation) deals only with
the interp/uninterp closures. It should typically be done as a side
effect during the loading of a syntax plugin. No prim notations are active
yet after this phase.
The second part (enable_prim_token_interpretation) activates the prim
notation. It is now correctly talking to Summary and to the LibStack.
To avoid "phantom" objects in libstack after a mere Require, this
second part should be done inside a Mltop.declare_cache_obj.
The link between the two parts is a prim_token_uid (a string), which
should be unique for each primitive notation. When this primitive
notation is specific to a scope, the scope_name could be used as uid.
Btw, the list of "patterns" for detecting when an uninterpreter should
be considered is now restricted to a list of global_reference
(inductive constructors, or injection functions such as IZR).
The earlier API was accepting a glob_constr list, but was actually
only working well for global_reference.
A minimal compatibility layer is provided (declare_numeral_interpreter),
but it is discouraged, since it is known to store unnecessary objects
in the libstack.
|
Without this, the library segment of every .vo file except Notations.vo starts
with two TOKEN objects (declaration of tokens "->" and "<-").
This is due to side effects creating these objects during the dynlink
of ltac_plugin.cmxs, more precisely the two Metasyntax.add_token_obj in
Extraargs. It is much cleaner to register these two side effects via
Mltop.declare_cache_obj, so that the two objects only live in
Notations.vo, and are loaded from there.