| Age | Commit message | Author |
|
|
Removing a few Global.env calls that were in the way.
|
|
After some analysis, this should be taken care of by
`Safe_typing.add_constant`.
It was added in
https://github.com/coq/coq/commit/f7338257584ba69e7e815c7ef9ac0d24f0dec36c
, so maybe @gares can provide more context as to how this is
supposed to work.
|
|
In general, `Nametab` is not a module you want to open globally, as it
exposes very generic identifiers such as `push` or `global`.
Thus, we remove all global opens and qualify `Nametab` accesses. The
patch is small and confirms the hypothesis that `Nametab` access
happens in few places, so it does not need a global open.
It is also very convenient to be able to use `grep` to find accesses to
the namespace table.
|
|
We remove section paths from kernel names. This is a cleanup, as most of the time this information was unused. It implies a change in the kernel API and small user-visible changes regarding tactic qualification. In particular, the removal of "global discharge" allows a large cleanup of code.
Additionally, some machinery in `library` and `safe_typing` must now take an `~in_section` parameter, so as to indicate whether a section is open or not.
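For illustration, a minimal OCaml sketch of what threading such a flag looks like; all names below are made-up stand-ins, not the actual kernel signatures:
```
(* Illustrative sketch only: a declaration entry point receives ~in_section
   explicitly instead of recovering a section path from the kernel name. *)
type constant_entry = { name : string }

(* The caller, e.g. the library layer, knows whether a section is open. *)
let sections_are_opened () = false

(* The safe-typing layer only needs the boolean, not a section path. *)
let add_constant ~in_section (c : constant_entry) =
  if in_section then
    Printf.printf "declaring %s inside a section\n" c.name
  else
    Printf.printf "declaring %s at top level\n" c.name

let () =
  add_constant ~in_section:(sections_are_opened ()) { name = "foo" }
```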
|
|
Declaring the universe to the kernel/section mechanism is centralized
in [Declare.declare_universe_context].
The universe name object is then really only about the user-visible
names.
|
|
Fixes #6764: Printing Notation regressed compared to 8.7
|
|
Instead of looking into the name-oriented structure we look into the
actual section structures.
Note: together with #8475 this lets us remove UnivNames.add_global_universe.
|
|
In particular, we check whether each use is really for internal
debugging purposes or to display a message to the user. In the latter
case, we replace it (when possible) by a higher-level printer
(e.g. printing foo instead of Top.foo). In the former case, we clarify
that the use is a debugging one.
Still not perfect (see a few FIXMEs).
|
|
This fixes #8401.
Supersedes / closes #8407.
Vernacular-command-registered numeral notations now live in the summary,
and the interpretation function for them is hard-coded.
Plugin-registered numeral notations are still unsynchronized, and only
the UIDs of these functions get synchronized. I am not 100% sure why
this is fine, but the fact that the test-suite file works suggests that
it is. I think it is because worker delegation correctly handles
non-synchronized state which is declared at `Declare ML Module` time.
This final commit changes the synchronization of numeral notations (and
deletes no-longer-used declarations in notation.mli that were introduced
temporarily in the previous commit). Since the interpretation can now be
done in notation.ml, we no longer need to register unique ids for
numeral notation (un)interp functions, and can instead synchronize the
underlying constants with the document state. This is the change that
actually fixes #8401.
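As a rough mental model of the synchronized vs. unsynchronized split, here is a toy OCaml sketch; this is not Coq's actual Summary interface, and every name in it is made up:
```
(* Toy model: the synchronized table can be frozen and restored when the
   document backtracks; the unsynchronized one only keeps stable string
   keys, so restoring it is a no-op. *)
module StringMap = Map.Make (String)

(* Synchronized: interpretation data lives in a ref that the document
   state machinery can snapshot and roll back. *)
let synchronized : string StringMap.t ref = ref StringMap.empty
let freeze () = !synchronized
let unfreeze saved = synchronized := saved

(* Unsynchronized: plugin closures registered once at Declare ML Module
   time, keyed by a uid; only the uid ever travels with the document. *)
let plugin_interpreters : (string, int -> int) Hashtbl.t = Hashtbl.create 17

let () =
  Hashtbl.add plugin_interpreters "int31_syntax_plugin:int31" (fun n -> n);
  let snapshot = freeze () in
  synchronized := StringMap.add "nat_scope" "Nat-based interpretation" !synchronized;
  unfreeze snapshot  (* backtracking drops the vernacular registration *)
```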
|
|
Move most of the remaining code from numeral.ml to notation.ml. Now
that we use exceptions to print error messages, all of the
interpretation code for numeral notations can be moved to notation.ml.
This is commit 3/4 in the fix for #8401.
This is a pure cut/paste commit, modulo fixing name qualifications due
to the moved code, and exposing some things in notation.mli (to be removed
in the next commit, where we finish the numeral notation reworking).
|
|
Switch to using exceptions rather than user errors. This will be
required because the machinery for printing constrs is not available in
notation.ml, so we move the error message printing to himsg.ml instead.
This is commit 2/4 in the fix for #8401.
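The pattern, sketched here with the stdlib's `Printexc` purely for illustration (Coq uses its own handler registration, and the exception name below is hypothetical): raise an exception carrying raw data where the error is detected, and register the printer in a layer that has the required printing machinery.
```
(* Sketch only: the error site raises an exception with a payload; the
   human-readable message is produced by a separately registered printer. *)
exception NotANumberNotation of string  (* hypothetical payload *)

let () =
  Printexc.register_printer (function
    | NotANumberNotation who ->
        Some (Printf.sprintf "Cannot interpret %s as a number notation." who)
    | _ -> None)

let interpret_numeral s =
  match int_of_string_opt s with
  | Some n -> n
  | None -> raise (NotANumberNotation s)

let () = Printf.printf "%d\n" (interpret_numeral "42")
```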
|
|
Move various things from numeral.ml to notation.ml and
notation.mli; this is required to allow the vernac command to continue
living in numeral.ml while preparing to move all of the numeral notation
interpretation logic to notation.ml.
This is commit 1/4 in the fix for #8401.
This is a pure cut/paste commit, modulo adding section-heading comments.
|
|
The Dischargedhypsmap table collected the segment of section variables
that constants defined in a section originally depended on. It
was useful to reconstruct the structure of sections as originally
given in a source file. In particular this was used in Sacerdoti
Coen's plugin for exporting Coq files to XML. There is no
indication that this plugin, moved out of Coq in September 2014, was
ever maintained afterwards, even as an external plugin. So it seems that
the Dischargedhypsmap table is virtually unused in the wild.
Please contact the developers if such a table turns out to be
necessary for your work.
|
|
[Dune](https://github.com/ocaml/dune) is a compositional declarative
build system for OCaml. It provides automatic generation of
`version.ml`, `.merlin`, `META`, `opam`, API documentation; install
management; easy integration with external libraries, test runners,
and modular builds.
In particular, Dune uniformly handles components regardless of whether
they live in-tree or out-of-tree. This greatly simplifies cases where a
plugin [or CoqIde] is checked out in the current working copy but then
distributed separately [and vice-versa]. Dune can thus be used as a
more flexible `coq_makefile` replacement.
For now we provide experimental support for a Dune build. In order to
build Coq plus the standard library with Dune, type:
```
$ make -f Makefile.dune world
```
This PR includes a preliminary, developer-only preview of Dune for
Coq. There is still ongoing work, see
https://github.com/coq/coq/issues/8052 for tracking status towards
full support.
## Technical description
Dune works out of the box with Coq, once we have fixed some modularity
issues. The main remaining challenge was to support `.vo` files.
As Dune doesn't support custom build rules yet, to properly build
`.vo` files we provide a small helper script, `tools/coq_dune.ml`. The
script scans the Coq library directories and generates the
corresponding rules for `.v -> .vo` and `.ml4 -> .ml` builds. The
script uses `coqdep` to correctly output the dependencies of
`.v` files. `coq_dune` is akin to `coq_makefile` and should eventually be
usable to build Coq projects as well (a sketch of the generated rules is
shown below).
Due to this limitation, the build process has to proceed in three stages:
1) build `coqdep` and `coq_dune`; 2) generate `dune` files for
`theories` and `plugins`; 3) perform a regular build with all
targets in scope.
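For illustration, a heavily simplified OCaml sketch of the kind of rule generation `coq_dune` performs; the real tool parses `coqdep` output and handles flags (`-R` paths, `.ml4` rules, etc.) that are omitted here:
```
(* Simplified sketch: emit a dune rule building foo.vo from foo.v and its
   dependencies; the dependency list would come from coqdep output. *)
let vo_rule ~vfile ~deps =
  Printf.sprintf
    "(rule\n (targets %s)\n (deps %s)\n (action (run coqc %s)))\n"
    (Filename.remove_extension vfile ^ ".vo")
    (String.concat " " (vfile :: deps))
    vfile

let () =
  print_string
    (vo_rule ~vfile:"theories/Init/Logic.v"
       ~deps:[ "theories/Init/Notations.vo" ])
```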
## FAQ
### Why Dune?
Coq has a moderately complex build system, and it is no secret that
many developer-hours have been spent fighting with `make`.
In particular, the current `make`-based system offers poor support
for verifying that the build rules and variables are coherent, and
requires significant manual, error-prone maintenance: many variables must be
passed by hand, duplicated, etc. Additionally, our make system
offers poor integration with now-standard OCaml ecosystem tools such
as `opam`, `ocamlfind` or `odoc`. Another critical point is build
compositionality. Coq is rich in 3rd-party contributions, and a big
shortcoming of the current make system is that it cannot be used to
build these projects, requiring us to maintain a custom tool,
`coq_makefile`, with the corresponding cost.
In the past, there have been some efforts to migrate Coq to more
specialized build systems; however, these stalled for a variety of
reasons. Dune is a declarative, OCaml-specific build tool that is on
the path to becoming the standard build system for the OCaml ecosystem.
Dune seems to fit Coq well: it is well-supported, fast,
compositional, and designed for large projects.
### Does Dune replace the make-based build system?
The current, make-based build system is unmodified by this PR and kept
as the default option. However, Dune has the potential to replace it in
the future.
### Is this PR complete? What does it provide?
This PR is ready for developer preview and feedback. The build system
is functional; however, more work is necessary in order to make Dune
the default for Coq.
The main TODOs are tracked at https://github.com/coq/coq/issues/8052.
This PR allows developers to use most of the features of Dune today:
- Modular organization of the codebase; each component is built only
against declared dependencies, so components are checked more strictly
for containment.
- Hygienic builds; Dune places all artifacts under `_build`.
- Automatic generation of `.install` files, simplified OPAM workflow.
- `utop` support, `-opaque` in developer mode, etc...
- `ml4` files are handled using `coqp5`, a native-code customized
camlp5 executable which brings much faster `ml4 -> ml` processing.
### What dependencies does Dune require?
Dune doesn't depend on any 3rd party package other than the OCaml compiler.
### Some benchmarks:
```
$ /usr/bin/time make DUNEOPT="-j 1000" -f Makefile.dune states
59.50user 18.81system 0:29.83elapsed 262%CPU (0avgtext+0avgdata 302996maxresident)k
0inputs+646632outputs (0major+4893811minor)pagefaults 0swaps
$ /usr/bin/time sh -c "./configure -local -native-compiler no && make -j states"
88.21user 23.65system 0:32.96elapsed 339%CPU (0avgtext+0avgdata 304992maxresident)k
0inputs+1051680outputs (0major+5300680minor)pagefaults 0swaps
```
|
|
Numeral Notations are not well-supported inside functors. We now give a
proper error message rather than an anomaly when this occurs.
|
|
Thanks to Emilio and Pierre-Marie Pédrot for pointers.
|
|
Because that's the sane thing to do.
This will inevitably cause issues for projects which do not `Import
Coq.Strings.Ascii` before trying to use ascii notations.
We also move the syntax plugin for `int31` notations from `Cyclic31` to
`Int31`, so that users (like CompCert) who merely `Require Import
Coq.Numbers.Cyclic.Int31.Int31` get the `int31` numeral syntax. Since
`Cyclic31` `Export`s `Int31`, this should not cause any additional
incompatibilities.
|
|
Some of this code is cargo-culted or kludged to work.
As I understand it, the situation is as follows:
There are two sorts of use-cases that need to be supported:
1. A plugin registers an OCaml function as a numeral interpreter. In
this case, the function registration must be synchronized with the
document state, but the functions should not be marshalled / stored
in the .vo.
2. A vernacular registers a Gallina function as a numeral interpreter.
In this case, the registration must be synchronized, and the function
should be marshalled / stored in the .vo.
In case (1), we can compare functions by pointer equality, and we should
be able to rely on globally unique keys, even across backtracking.
In case (2), we cannot compare functions by pointer equality (because
they must be regenerated on unmarshalling when `Require`ing a .vo file),
and we also cannot rely on any sort of unique key being both unique and
persistent across files.
The solution we use here is that we ask clients to provide "unique"
keys, and that clients tell us whether or not to overwrite existing
registered functions, i.e., to tell us whether or not we should expect
interpreter functions to be globally unique under pointer equality. For
plugins, a simple string suffices, as long as the string does not clash
between different plugins. In the case of vernacular-registered
functions, we marshal a description of all of the data used to
generate the function, and use that string as a unique key which is
expected to persist across files. Because we cannot rely on
function-pointer uniqueness here, we tell the
interpretation-registration to allow overwriting.
----
Some of this code is in response to comments on the PR.
----
Some code is to fix an issue that bignums revealed:
Both Int31 and bignums registered numeral notations in int31_scope. We
now prepend a globally unique identifier when registering numeral
notations from OCaml plugins. This is permissible because we don't
store the uid information for such notations in .vo files (assuming I'm
understanding the code correctly).
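To make the key idea concrete, here is an assumed (not verbatim) OCaml sketch of both kinds of keys: a persistent key obtained by hashing a marshalled description of the notation, and a plugin key built from a plugin-unique prefix. The record fields and the strings stored in them are purely illustrative.
```
(* A description of the data used to build a vernacular-registered
   interpreter; field contents are illustrative placeholders. *)
type numeral_notation_desc = {
  scope : string;
  to_fun : string;   (* names of the Gallina functions involved *)
  of_fun : string;
}

(* Two documents registering the same notation agree on this key,
   without relying on closure identity. *)
let persistent_uid (d : numeral_notation_desc) =
  Digest.to_hex (Digest.string (Marshal.to_string d []))

(* Plugin-registered interpreters instead get a prefix unique to the
   plugin, so two plugins using the same scope cannot clash. *)
let plugin_uid ~plugin ~scope = plugin ^ ":" ^ scope

let () =
  print_endline
    (persistent_uid { scope = "nat_scope"; to_fun = "of_uint"; of_fun = "to_uint" });
  print_endline (plugin_uid ~plugin:"int31_syntax_plugin" ~scope:"int31_scope")
```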
|
|
The first part (e.g. register_bignumeral_interpretation) deals only with
the interp/uninterp closures. It should typically be done as a side
effect while loading a syntax plugin. No prim notations are active yet
after this phase.
The second part (enable_prim_token_interpretation) activates the prim
notation. It now correctly talks to Summary and to the LibStack.
To avoid "phantom" objects in the libstack after a mere Require, this
second part should be done inside a Mltop.declare_cache_obj.
The link between the two parts is a prim_token_uid (a string), which
should be unique for each primitive notation. When a primitive
notation is specific to a scope, the scope_name can be used as the uid.
By the way, the list of "patterns" for detecting when an uninterpreter
should be considered is now restricted to a list of global_reference
(inductive constructors, or injection functions such as IZR).
The earlier API accepted a glob_constr list, but actually
only worked well for global_reference.
A minimal compatibility layer is provided (declare_numeral_interpreter),
but it is discouraged, since it is known to store unnecessary objects
in the libstack.
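A toy OCaml sketch of the two-phase scheme; the names and types below are simplified stand-ins for the real notation machinery, not its actual API:
```
(* Phase 1 only records the closures under a uid; phase 2 activates the
   notation and is the part that must be replayed with the document. *)
type interpreter = { interp : string -> int option; uninterp : int -> string }

(* Phase 1: side effect at plugin-loading time; nothing is active yet. *)
let registered : (string, interpreter) Hashtbl.t = Hashtbl.create 17
let register_interpretation uid i = Hashtbl.replace registered uid i

(* Phase 2: activation, keyed by the same uid; in Coq this is what talks
   to Summary and the lib stack, here it is just another table. *)
let active : (string, interpreter) Hashtbl.t = Hashtbl.create 17
let enable_interpretation uid =
  match Hashtbl.find_opt registered uid with
  | Some i -> Hashtbl.replace active uid i
  | None -> failwith ("unknown prim_token_uid: " ^ uid)

let () =
  register_interpretation "int31_scope"
    { interp = int_of_string_opt; uninterp = string_of_int };
  enable_interpretation "int31_scope"
```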
|
|
This concerns prim token notations for inductive *types*, not values.
So we're speaking of a scope where 0 is the type nat, 1 is the type bool, etc.
To my knowledge, this feature hasn't ever been used and is very unlikely
ever to be used, so let's clean up the code a bit by removing it.
|
|
This cleanup prepares for the forthcoming synchronization of prim_token_interpreter
with respect to Summary. These chains of prim_token_interpreter were never used
anyway: only one interpreter was declared per numeral scope.
After syncing with Summary, we will in any case be able to backtrack to a
previous interpreter.
|
|
- New command "Declare Custom Entry bar".
- Entries can have levels.
- Printing is done using a notion of coercion between grammar
entries. This typically corresponds to rules of the form
'Notation "[ x ]" := x (x custom myconstr).' but also
'Notation "{ x }" := x (in custom myconstr, x constr).'.
- Rules declaring idents such as 'Notation "x" := x (in custom myconstr, x ident).'
are natively recognized.
- Rules declaring globals such as 'Notation "x" := x (in custom myconstr, x global).'
are natively recognized.
Incidentally, this merges ETConstr and ETConstrAsBinder.
Noticed in passing that parsing a binder as custom was not done as for
constr.
Probably some fine-tuning still to do (priority of notations,
interactions between scopes and entries, ...). To be tested further
in practice.
|
|
The upper layers still need a mapping constant -> projection, which is
provided by Recordops.
|
|
No reason not to collapse inner applications with explicit arguments.
This is compatible with the ad hoc encoding of @f as GApp(f,[])/NApp(f,[]).
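As a toy illustration on a stand-in OCaml AST (the real change operates on glob_constr/notation_constr and has to respect the GApp(f,[]) encoding, which this deliberately glosses over):
```
(* Nested applications App (App (f, xs), ys) are flattened into
   App (f, xs @ ys); the "@f" form is modelled here as App (f, []). *)
type expr =
  | Var of string
  | App of expr * expr list

let rec collapse = function
  | Var _ as e -> e
  | App (f, args) ->
      let args = List.map collapse args in
      (match collapse f with
       | App (g, inner) -> App (g, inner @ args)  (* merge inner application *)
       | g -> App (g, args))

let () =
  match collapse (App (App (Var "f", [ Var "x" ]), [ Var "y" ])) with
  | App (Var "f", [ Var "x"; Var "y" ]) -> print_endline "collapsed"
  | _ -> print_endline "unexpected"
```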
|
|
case of missing record field
|
|
While we were adding a new field to `QuestionMark`, we
decided to go ahead and refactor the constructor to hold
an actual record. This record now holds the name, obligations, and
whether the evar represents a missing record field.
This is used to provide better error messages on missing record
fields.
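A rough OCaml sketch of the shape of the change; the field and type names below are guesses for illustration only, not the exact ones in the Coq sources:
```
(* Stand-ins for the data the constructor used to carry as a tuple. *)
type name = Anonymous | Name of string
type obligation_status = Define of bool | Expand

(* before: QuestionMark of obligation_status * name *)
(* after: a record, so a new flag can be added without growing a tuple. *)
type question_mark = {
  qm_obligation : obligation_status;
  qm_name : name;
  qm_record_field : string option;  (* Some f when the evar stands for field f *)
}

let default_question_mark =
  { qm_obligation = Define true; qm_name = Anonymous; qm_record_field = None }
```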
|
|
Fixes #8067. This is becoming the default in many developments, so it
makes sense to require it too, both for Coq and for Plugins.
|
|
constr in Constr
|
|