Diffstat (limited to 'doc')
76 files changed, 4412 insertions, 4232 deletions
diff --git a/doc/changelog/01-kernel/09867-floats.rst b/doc/changelog/01-kernel/09867-floats.rst new file mode 100644 index 0000000000..56b5fc747a --- /dev/null +++ b/doc/changelog/01-kernel/09867-floats.rst @@ -0,0 +1,13 @@ +- Built-in support for floating-point arithmetic was added, allowing + one to devise efficient reflection tactics involving numerical + computation. Primitive floats are added in the language of terms, + following the binary64 format of the IEEE 754 standard, and the + related operations are implemented for the different reduction + engines of Coq by using the corresponding processor operators in + rounding-to-nearest-even. The properties of these operators are + axiomatized in the theory :g:`Coq.Floats.FloatAxioms`, which is part + of the library :g:`Coq.Floats.Floats`. + See Section :ref:`primitive-floats` + (`#9867 <https://github.com/coq/coq/pull/9867>`_, + closes `#8276 <https://github.com/coq/coq/issues/8276>`_, + by Guillaume Bertholon, Erik Martin-Dorel, Pierre Roux). diff --git a/doc/changelog/01-kernel/10664-sections-stack-in-kernel.rst b/doc/changelog/01-kernel/10664-sections-stack-in-kernel.rst new file mode 100644 index 0000000000..bac08d12ea --- /dev/null +++ b/doc/changelog/01-kernel/10664-sections-stack-in-kernel.rst @@ -0,0 +1,6 @@ +- Section data is now part of the kernel. This solves a soundness issue + in interactive mode where global monomorphic universe constraints would be + dropped when forcing a delayed opaque proof inside a polymorphic section. It also + relaxes the nesting criterion for sections, as polymorphic sections can now + appear inside a monomorphic one + (`#10664 <https://github.com/coq/coq/pull/10664>`_, by Pierre-Marie Pédrot). 
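The primitive floats entry above can be illustrated with a short session; this is a minimal sketch (assuming a Coq 8.11 toolchain with the `Coq.Floats.Floats` library described in that entry), not part of the original changelog:

```coq
(* Minimal sketch of primitive floats (assumes Coq >= 8.11).
   Values follow IEEE 754 binary64; reduction uses the processor's
   operations in round-to-nearest-even. *)
From Coq Require Import Floats.
Open Scope float_scope.

(* 0.5 and 0.25 are exactly representable in binary64,
   so this addition reduces to an exact result. *)
Compute (0.5 + 0.25).
```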
diff --git a/doc/changelog/01-kernel/10811-sprop-default-on.rst b/doc/changelog/01-kernel/10811-sprop-default-on.rst new file mode 100644 index 0000000000..349c44c205 --- /dev/null +++ b/doc/changelog/01-kernel/10811-sprop-default-on.rst @@ -0,0 +1,3 @@ +- Using ``SProp`` is now allowed by default, without needing to pass + ``-allow-sprop`` or use :flag:`Allow StrictProp` (`#10811 + <https://github.com/coq/coq/pull/10811>`_, by Gaëtan Gilbert). diff --git a/doc/changelog/02-specification-language/10758-fix-10757.rst b/doc/changelog/02-specification-language/10758-fix-10757.rst new file mode 100644 index 0000000000..4cce26aedc --- /dev/null +++ b/doc/changelog/02-specification-language/10758-fix-10757.rst @@ -0,0 +1,5 @@ +- ``Program Fixpoint`` now uses ``ex`` and ``sig`` to make telescopes + involving ``Prop`` types (`#10758 + <https://github.com/coq/coq/pull/10758>`_, by Gaëtan Gilbert, fixing + `#10757 <https://github.com/coq/coq/issues/10757>`_ reported by + Xavier Leroy). diff --git a/doc/changelog/02-specification-language/10985-about-arguments.rst b/doc/changelog/02-specification-language/10985-about-arguments.rst new file mode 100644 index 0000000000..1e05b0b0fe --- /dev/null +++ b/doc/changelog/02-specification-language/10985-about-arguments.rst @@ -0,0 +1,5 @@ +- The output of the :cmd:`Print` and :cmd:`About` commands has + changed. Arguments meta-data is now displayed as the corresponding + :cmd:`Arguments <Arguments (implicits)>` command instead of the + human-targeted prose used in previous Coq versions. (`#10985 + <https://github.com/coq/coq/pull/10985>`_, by Gaëtan Gilbert). 
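As a small illustration of the ``SProp`` entry above, a strict proposition can now be declared without any command-line flag; a sketch assuming Coq 8.11 defaults:

```coq
(* Sketch: SProp is usable by default in Coq >= 8.11; no
   -allow-sprop flag or Allow StrictProp command is needed. *)
Inductive sUnit : SProp := stt.

(* Inhabitants of a strict proposition are definitionally
   proof-irrelevant, so any proof of sUnit converts to stt. *)
Definition discard (p : sUnit) : sUnit := stt.
```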
diff --git a/doc/changelog/02-specification-language/10996-refine-instance-returns.rst b/doc/changelog/02-specification-language/10996-refine-instance-returns.rst new file mode 100644 index 0000000000..cd1a692f54 --- /dev/null +++ b/doc/changelog/02-specification-language/10996-refine-instance-returns.rst @@ -0,0 +1,4 @@ +- Added ``#[refine]`` attribute for :cmd:`Instance`, a more + predictable version of the old ``Refine Instance Mode`` which + unconditionally opens a proof (`#10996 + <https://github.com/coq/coq/pull/10996>`_, by Gaëtan Gilbert). diff --git a/doc/changelog/02-specification-language/10997-unsupport-atts-warn.rst b/doc/changelog/02-specification-language/10997-unsupport-atts-warn.rst new file mode 100644 index 0000000000..43a748b365 --- /dev/null +++ b/doc/changelog/02-specification-language/10997-unsupport-atts-warn.rst @@ -0,0 +1,3 @@ +- The unsupported attribute error is now an error-by-default warning, + meaning it can be disabled (`#10997 + <https://github.com/coq/coq/pull/10997>`_, by Gaëtan Gilbert). diff --git a/doc/changelog/03-notations/09883-numeral-notations-sorts.rst b/doc/changelog/03-notations/09883-numeral-notations-sorts.rst new file mode 100644 index 0000000000..abc5a516ae --- /dev/null +++ b/doc/changelog/03-notations/09883-numeral-notations-sorts.rst @@ -0,0 +1,4 @@ +- Numeral Notations now support sorts in the input to printing + functions (e.g., numeral notations can be defined for terms + containing things like `@cons Set nat nil`). (`#9883 + <https://github.com/coq/coq/pull/9883>`_, by Jason Gross). 
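A hypothetical sketch of the ``#[refine]`` attribute from the entry above (the class and instance names are invented for illustration):

```coq
(* Sketch: with #[refine], fields left out of the instance body
   unconditionally become proof goals (names are hypothetical). *)
Class Default (A : Type) := { default : A }.

#[refine] Instance default_nat : Default nat := {}.
Proof.
  exact 0.  (* fill the `default` field interactively *)
Defined.
```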
diff --git a/doc/changelog/03-notations/10963-simplify-parser.rst b/doc/changelog/03-notations/10963-simplify-parser.rst new file mode 100644 index 0000000000..327a39bdb6 --- /dev/null +++ b/doc/changelog/03-notations/10963-simplify-parser.rst @@ -0,0 +1,6 @@ +- A simplification of parsing rules could cause a slight change of + parsing precedences for the very rare users who defined notations + with `constr` at level strictly between 100 and 200 and used these + notations on the right-hand side of a cast operator (`:`, `:>`, + `:>>`) (`#10963 <https://github.com/coq/coq/pull/10963>`_, by Théo + Zimmermann, simplification initially noticed by Jim Fehrle). diff --git a/doc/changelog/04-tactics/09856-zify.rst b/doc/changelog/04-tactics/09856-zify.rst new file mode 100644 index 0000000000..6b9143c77b --- /dev/null +++ b/doc/changelog/04-tactics/09856-zify.rst @@ -0,0 +1,7 @@ +- Reimplementation of the :tacn:`zify` tactic. The tactic is more efficient and copes with dependent hypotheses. + It can also be extended by redefining the tactic ``zify_post_hook``. + (`#9856 <https://github.com/coq/coq/pull/9856>`_ fixes + `#8898 <https://github.com/coq/coq/issues/8898>`_, + `#7886 <https://github.com/coq/coq/issues/7886>`_, + `#9848 <https://github.com/coq/coq/issues/9848>`_ and + `#5155 <https://github.com/coq/coq/issues/5155>`_, by Frédéric Besson). diff --git a/doc/changelog/04-tactics/10765-micromega-caches.rst b/doc/changelog/04-tactics/10765-micromega-caches.rst new file mode 100644 index 0000000000..12d8f68e63 --- /dev/null +++ b/doc/changelog/04-tactics/10765-micromega-caches.rst @@ -0,0 +1,3 @@ +- Introduction of flags :flag:`Lia Cache`, :flag:`Nia Cache` and :flag:`Nra Cache`. + (see `#10772 <https://github.com/coq/coq/issues/10772>`_ for use case) + (`#10765 <https://github.com/coq/coq/pull/10765>`_ fixes `#10772 <https://github.com/coq/coq/issues/10772>`_ , by Frédéric Besson). 
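The cache flags introduced above behave like ordinary Coq flags; a sketch of disabling them (assuming the micromega tactics are loaded via the `Lia` module):

```coq
(* Sketch: turning off the micromega proof caches so that no
   .lia.cache / .nia.cache files are written next to the sources. *)
From Coq Require Import Lia.
Unset Lia Cache.
Unset Nia Cache.
```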
diff --git a/doc/changelog/04-tactics/10774-zify-Z_to_N.rst b/doc/changelog/04-tactics/10774-zify-Z_to_N.rst new file mode 100644 index 0000000000..ed46cb101e --- /dev/null +++ b/doc/changelog/04-tactics/10774-zify-Z_to_N.rst @@ -0,0 +1,3 @@ +- The :tacn:`zify` tactic is now aware of `Z.to_N`. + (`#10774 <https://github.com/coq/coq/pull/10774>`_ fixes + `#9162 <https://github.com/coq/coq/issues/9162>`_, by Kazuhiko Sakaguchi). diff --git a/doc/changelog/04-tactics/10966-assert-succeeds-once.rst b/doc/changelog/04-tactics/10966-assert-succeeds-once.rst new file mode 100644 index 0000000000..09bef82c80 --- /dev/null +++ b/doc/changelog/04-tactics/10966-assert-succeeds-once.rst @@ -0,0 +1,11 @@ +- The :tacn:`assert_succeeds` and :tacn:`assert_fails` tactics now + only run their tactic argument once, even if it has multiple + successes. This prevents blow-up and looping from using + multisuccess tactics with :tacn:`assert_succeeds`. (`#10966 + <https://github.com/coq/coq/pull/10966>`_ fixes `#10965 + <https://github.com/coq/coq/issues/10965>`_, by Jason Gross). + +- The :tacn:`assert_succeeds` and :tacn:`assert_fails` tactics now + behave correctly when their tactic fully solves the goal. (`#10966 + <https://github.com/coq/coq/pull/10966>`_ fixes `#9114 + <https://github.com/coq/coq/issues/9114>`_, by Jason Gross). diff --git a/doc/changelog/04-tactics/10998-zify-complements.rst b/doc/changelog/04-tactics/10998-zify-complements.rst new file mode 100644 index 0000000000..3ec526f0a9 --- /dev/null +++ b/doc/changelog/04-tactics/10998-zify-complements.rst @@ -0,0 +1,7 @@ +- The :tacn:`zify` tactic is now aware of `Pos.pred_double`, `Pos.pred_N`, + `Pos.of_nat`, `Pos.add_carry`, `Pos.pow`, `Pos.square`, `Z.pow`, `Z.double`, + `Z.pred_double`, `Z.succ_double`, `Z.square`, `Z.div2`, and `Z.quot2`. 
+ Injections for internal definitions in module `ZifyBool` (`isZero` and `isLeZero`) + are also added to help users declare new :tacn:`zify` class instances using + Micromega tactics. + (`#10998 <https://github.com/coq/coq/pull/10998>`_, by Kazuhiko Sakaguchi). diff --git a/doc/changelog/05-tactic-language/10324-ltac2-ssr-ampersand.rst b/doc/changelog/05-tactic-language/10324-ltac2-ssr-ampersand.rst new file mode 100644 index 0000000000..fba09f5e87 --- /dev/null +++ b/doc/changelog/05-tactic-language/10324-ltac2-ssr-ampersand.rst @@ -0,0 +1,5 @@ +- Whitespace is now forbidden in the “&ident” syntax for Ltac2 references + described in :ref:`ltac2_built-in-quotations` + (`#10324 <https://github.com/coq/coq/pull/10324>`_, + fixes `#10088 <https://github.com/coq/coq/issues/10088>`_, + authored by Pierre-Marie Pédrot). diff --git a/doc/changelog/06-ssreflect/10022-ssr-under-setoid.rst b/doc/changelog/06-ssreflect/10022-ssr-under-setoid.rst new file mode 100644 index 0000000000..5e005742fd --- /dev/null +++ b/doc/changelog/06-ssreflect/10022-ssr-under-setoid.rst @@ -0,0 +1,28 @@ +- Generalize tactics :tacn:`under` and :tacn:`over` for any registered + relation. More precisely, assume the given context lemma has type + `forall f1 f2, .. -> (forall i, R1 (f1 i) (f2 i)) -> R2 f1 f2`. The + first step performed by :tacn:`under` (since Coq 8.10) amounts to + calling the tactic :tacn:`rewrite <rewrite (ssreflect)>`, which + itself relies on :tacn:`setoid_rewrite` if need be. So this step was + already compatible with a double implication or setoid equality for + the conclusion head symbol `R2`. But a further step consists in + tagging the generated subgoal `R1 (f1 i) (?f2 i)` to protect it from + unwanted evar instantiation, and get `Under_rel _ R1 (f1 i) (?f2 i)` + that is displayed as ``'Under[ f1 i ]``. In Coq 8.10, this second + (convenience) step was only performed when `R1` was Leibniz' `eq` or + `iff`. 
Now, it is also performed for any relation `R1` which has a + ``RewriteRelation`` instance (a `RelationClasses.Reflexive` instance + also being needed so that :tacn:`over` can discharge the ``'Under[ _ ]`` + goal by instantiating the hidden evar). Also, it is now possible to + manipulate `Under_rel _ R1 (f1 i) (?f2 i)` subgoals directly if `R1` + is, for instance, a `PreOrder` relation, thanks to extra instances proving + that `Under_rel` preserves the properties of the `R1` relation. + These two features, which generalize support for setoid-like relations, + are enabled as soon as one does both ``Require Import ssreflect.`` and + ``Require Setoid.`` Finally, a rewrite rule ``UnderE`` has been + added if one wants to "unprotect" the evar, and instantiate it + manually with a rule other than reflexivity (i.e., without using the + :tacn:`over` tactic nor the ``over`` rewrite rule). See also Section + :ref:`under_ssr` (`#10022 <https://github.com/coq/coq/pull/10022>`_, + by Erik Martin-Dorel, with suggestions and review by Enrico Tassi + and Cyril Cohen). diff --git a/doc/changelog/06-ssreflect/10932-void-type-ssr.rst b/doc/changelog/06-ssreflect/10932-void-type-ssr.rst new file mode 100644 index 0000000000..7366ef1190 --- /dev/null +++ b/doc/changelog/06-ssreflect/10932-void-type-ssr.rst @@ -0,0 +1,3 @@ +- Add a :g:`void` notation for the standard library empty type (:g:`Empty_set`) + (`#10932 <https://github.com/coq/coq/pull/10932>`_, by Arthur Azevedo de + Amorim). diff --git a/doc/changelog/07-commands-and-options/10291-typing-flags.rst b/doc/changelog/07-commands-and-options/10291-typing-flags.rst new file mode 100644 index 0000000000..ef7adde801 --- /dev/null +++ b/doc/changelog/07-commands-and-options/10291-typing-flags.rst @@ -0,0 +1,4 @@ +- Adding unsafe commands to enable/disable guard checking, positivity checking + and universes checking (providing a local `-type-in-type`). + See :ref:`controlling-typing-flags`. 
+ (`#10291 <https://github.com/coq/coq/pull/10291>`_ by Simon Boulier). diff --git a/doc/changelog/07-commands-and-options/10336-ambiguous-paths.rst b/doc/changelog/07-commands-and-options/10336-ambiguous-paths.rst deleted file mode 100644 index 151c400b2c..0000000000 --- a/doc/changelog/07-commands-and-options/10336-ambiguous-paths.rst +++ /dev/null @@ -1,5 +0,0 @@ -- Improve the ambiguous paths warning to indicate which path is ambiguous with - new one - (`#10336 <https://github.com/coq/coq/pull/10336>`_, - closes `#3219 <https://github.com/coq/coq/issues/3219>`_, - by Kazuhiko Sakaguchi). diff --git a/doc/changelog/07-commands-and-options/10476-fix-export.rst b/doc/changelog/07-commands-and-options/10476-fix-export.rst new file mode 100644 index 0000000000..ba71e1c337 --- /dev/null +++ b/doc/changelog/07-commands-and-options/10476-fix-export.rst @@ -0,0 +1,5 @@ +- Fix two bugs in `Export`. This can have an impact on the behavior of the + `Import` command on libraries. `Import A` when `A` imports `B` which exports + `C` was importing `C`, whereas `Import` is not transitive. Also, after + `Import A B`, the import of `B` was sometimes incomplete. + (`#10476 <https://github.com/coq/coq/pull/10476>`_, by Maxime Dénès). diff --git a/doc/changelog/07-commands-and-options/10489-print_dependent_evars.rst b/doc/changelog/07-commands-and-options/10489-print_dependent_evars.rst new file mode 100644 index 0000000000..580e808baa --- /dev/null +++ b/doc/changelog/07-commands-and-options/10489-print_dependent_evars.rst @@ -0,0 +1,7 @@ +- Update output generated by :flag:`Printing Dependent Evars Line` flag + used by the Prooftree tool in Proof General. + (`#10489 <https://github.com/coq/coq/pull/10489>`_, + closes `#4504 <https://github.com/coq/coq/issues/4504>`_, + `#10399 <https://github.com/coq/coq/issues/10399>`_ and + `#10400 <https://github.com/coq/coq/issues/10400>`_, + by Jim Fehrle). 
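The unsafe typing-flag commands from the `#10291` entry above can be sketched as follows (illustrative only; disabling these checks compromises soundness):

```coq
(* Sketch of the new typing-flag commands (Coq >= 8.11); the same
   Set/Unset pattern applies to Positivity Checking. *)
Unset Guard Checking.
(* Accepted only because the guard condition is not checked: *)
Fixpoint loop (n : nat) : nat := loop n.
Set Guard Checking.

Unset Universe Checking.  (* acts like a local -type-in-type *)
Set Universe Checking.
```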
diff --git a/doc/changelog/07-commands-and-options/10494-diffs-in-show-proof.rst b/doc/changelog/07-commands-and-options/10494-diffs-in-show-proof.rst new file mode 100644 index 0000000000..c1df728c5c --- /dev/null +++ b/doc/changelog/07-commands-and-options/10494-diffs-in-show-proof.rst @@ -0,0 +1,6 @@ +- Optionally highlight the differences between successive proof steps in the + :cmd:`Show Proof` command. Experimental; only available in coqtop + and Proof General for now, may be supported in other IDEs + in the future. + (`#10494 <https://github.com/coq/coq/pull/10494>`_, + by Jim Fehrle). diff --git a/doc/changelog/08-tools/08642-vos-files.rst b/doc/changelog/08-tools/08642-vos-files.rst new file mode 100644 index 0000000000..f612096880 --- /dev/null +++ b/doc/changelog/08-tools/08642-vos-files.rst @@ -0,0 +1,7 @@ +- `coqc` now provides the ability to generate compiled interfaces. + Use `coqc -vos foo.v` to skip all opaque proofs during the + compilation of `foo.v`, and output a file called `foo.vos`. + This feature is experimental. It enables working on a Coq file without the need to + first compile the proofs contained in its dependencies + (`#8642 <https://github.com/coq/coq/pull/8642>`_ by Arthur Charguéraud, review by + Maxime Dénès and Emilio Gallego). diff --git a/doc/changelog/08-tools/10947-coq-makefile-dep.rst b/doc/changelog/08-tools/10947-coq-makefile-dep.rst new file mode 100644 index 0000000000..f620b32cb8 --- /dev/null +++ b/doc/changelog/08-tools/10947-coq-makefile-dep.rst @@ -0,0 +1,5 @@ +- Renamed `VDFILE` from `.coqdeps.d` to `.<CoqMakefile>.d` in the `coq_makefile` + utility, where `<CoqMakefile>` is the name of the output file given by the + `-o` option. In this way two generated makefiles can coexist in the same + directory. + (`#10947 <https://github.com/coq/coq/pull/10947>`_, by Kazuhiko Sakaguchi). 
diff --git a/doc/changelog/08-tools/11068-coqbin-noslash.rst b/doc/changelog/08-tools/11068-coqbin-noslash.rst new file mode 100644 index 0000000000..c2c8f4df31 --- /dev/null +++ b/doc/changelog/08-tools/11068-coqbin-noslash.rst @@ -0,0 +1,3 @@ +- ``coq_makefile`` now supports environment variable ``COQBIN`` with + no ending ``/`` character (`#11068 + <https://github.com/coq/coq/pull/11068>`_, by Gaëtan Gilbert). diff --git a/doc/changelog/10-standard-library/09772-ordered_type-hint-db.rst b/doc/changelog/10-standard-library/09772-ordered_type-hint-db.rst new file mode 100644 index 0000000000..7babcdb6f1 --- /dev/null +++ b/doc/changelog/10-standard-library/09772-ordered_type-hint-db.rst @@ -0,0 +1,4 @@ +- Moved the `auto` hints of the `OrderedType` module into a new `ordered_type` + database + (`#9772 <https://github.com/coq/coq/pull/9772>`_, + by Vincent Laporte). diff --git a/doc/changelog/10-standard-library/09811-remove-zlogarithm.rst b/doc/changelog/10-standard-library/09811-remove-zlogarithm.rst new file mode 100644 index 0000000000..ab625b9e03 --- /dev/null +++ b/doc/changelog/10-standard-library/09811-remove-zlogarithm.rst @@ -0,0 +1,4 @@ +- Removes deprecated modules `Coq.ZArith.Zlogarithm` + and `Coq.ZArith.Zsqrt_compat` + (`#9811 <https://github.com/coq/coq/pull/9811>`_, + by Vincent Laporte). diff --git a/doc/changelog/10-standard-library/10445-constructive-reals.rst b/doc/changelog/10-standard-library/10445-constructive-reals.rst new file mode 100644 index 0000000000..d69056fc2f --- /dev/null +++ b/doc/changelog/10-standard-library/10445-constructive-reals.rst @@ -0,0 +1,12 @@ +- New module `Reals.ConstructiveCauchyReals` defines constructive real numbers + by Cauchy sequences of rational numbers. Classical real numbers are now defined + as a quotient of these constructive real numbers, which significantly reduces + the number of axioms needed (see `Reals.Rdefinitions` and `Reals.Raxioms`), + while preserving backward compatibility. 
+ + Furthermore, the new axioms for classical real numbers include the limited + principle of omniscience (`sig_forall_dec`), which is a logical principle + instead of an ad hoc property of the real numbers. + + See `#10445 <https://github.com/coq/coq/pull/10445>`_, by Vincent Semeria, + with the help and review of Guillaume Melquiond and Bas Spitters. diff --git a/doc/changelog/10-standard-library/10651-new-lemmas-for-lists.rst b/doc/changelog/10-standard-library/10651-new-lemmas-for-lists.rst new file mode 100644 index 0000000000..864c4e6a7e --- /dev/null +++ b/doc/changelog/10-standard-library/10651-new-lemmas-for-lists.rst @@ -0,0 +1,6 @@ +- New lemmas on :g:`combine`, :g:`filter`, :g:`nodup`, :g:`nth`, and + :g:`nth_error` functions on lists. The lemma :g:`filter_app` was moved to the + :g:`List` module. + + See `#10651 <https://github.com/coq/coq/pull/10651>`_, and + `#10731 <https://github.com/coq/coq/pull/10731>`_, by Oliver Nash. diff --git a/doc/changelog/10-standard-library/10827-dedekind-reals.rst b/doc/changelog/10-standard-library/10827-dedekind-reals.rst new file mode 100644 index 0000000000..5d8467025b --- /dev/null +++ b/doc/changelog/10-standard-library/10827-dedekind-reals.rst @@ -0,0 +1,11 @@ +- New module `Reals.ClassicalDedekindReals` defines Dedekind real numbers + as boolean-valued functions along with 3 logical axioms: the limited principle + of omniscience, excluded middle of negations, and functional extensionality. + The exposed type :g:`R` in module :g:`Reals.Rdefinitions` is these + Dedekind reals, hidden behind an opaque module. + Classical Dedekind reals are a quotient of constructive reals, which makes it + possible to transport many constructive proofs to the classical case. + + See `#10827 <https://github.com/coq/coq/pull/10827>`_, by Vincent Semeria, + based on discussions with Guillaume Melquiond, Bas Spitters and Hugo Herbelin, + code review by Hugo Herbelin. 
diff --git a/doc/changelog/10-standard-library/10895-master+weak-excluded-middle-de-morgan.rst b/doc/changelog/10-standard-library/10895-master+weak-excluded-middle-de-morgan.rst new file mode 100644 index 0000000000..6e87ff93c7 --- /dev/null +++ b/doc/changelog/10-standard-library/10895-master+weak-excluded-middle-de-morgan.rst @@ -0,0 +1 @@ +- ClassicalFacts: Adding the standard equivalence between weak excluded-middle and the classical instance of De Morgan's law (`#10895 <https://github.com/coq/coq/pull/10895>`_, by Hugo Herbelin). diff --git a/doc/changelog/README.md b/doc/changelog/README.md index 2891eb207e..3e0970a656 100644 --- a/doc/changelog/README.md +++ b/doc/changelog/README.md @@ -7,25 +7,28 @@ otherwise important infrastructure changes, and important bug fixes should get a changelog entry. Compatibility-breaking changes should always get a changelog entry, -which should explain what compatibility-breakage is to expect. +which should explain what compatibility breakage is to expect. Pull requests changing the ML API in significant ways should add an entry in [`dev/doc/changes.md`](../../dev/doc/changes.md). ## How to add an entry? ## -You should create a file in one of the sub-directories. The name of -the file should be `NNNNN-identifier.rst` where `NNNNN` is the number -of the pull request on five digits and `identifier` is whatever you -want. - -This file should use the same format as the reference manual (as it -will be copied in there). You may reference the documentation you just -added with `:ref:`, `:tacn:`, `:cmd:`, `:opt:`, `:token:`, etc. See +Run `./dev/tools/make-changelog.sh`: it will ask you for your PR +number, and to choose among the predefined categories. Afterward, +fill in the automatically generated entry with a short description of +your change (which should describe any compatibility issues in +particular). 
You may also add a reference to the relevant fixed +issue, and credit reviewers, co-authors, and anyone who helped advance +the PR. + +The format for changelog entries is the same as in the reference +manual. In particular, you may reference the documentation you just +added with `:ref:`, `:tacn:`, `:cmd:`, `:opt:`, `:token:`, etc. See the [documentation of the Sphinx format](../sphinx/README.rst) of the manual for details. -The entry should be written using the following structure: +Here is a summary of the structure of a changelog entry: ``` rst - Description of the changes, with possible link to @@ -35,7 +38,3 @@ The entry should be written using the following structure: [ and `#ISSUE2 <https://github.com/coq/coq/issues/ISSUE2>`_],] by Full Name[, with help / review of Full Name]). ``` - -The description should be kept rather short and the only additional -required meta-information are the link to the pull request and the -full name of the author. diff --git a/doc/plugin_tutorial/tuto0/src/dune b/doc/plugin_tutorial/tuto0/src/dune index 79d561061d..ab9b4dd531 100644 --- a/doc/plugin_tutorial/tuto0/src/dune +++ b/doc/plugin_tutorial/tuto0/src/dune @@ -3,7 +3,4 @@ (public_name coq.plugins.tutorial.p0) (libraries coq.plugins.ltac)) -(rule - (targets g_tuto0.ml) - (deps (:pp-file g_tuto0.mlg) ) - (action (run coqpp %{pp-file}))) +(coq.pp (modules g_tuto0)) diff --git a/doc/plugin_tutorial/tuto1/src/dune b/doc/plugin_tutorial/tuto1/src/dune index cf9c674b14..054d5ecd26 100644 --- a/doc/plugin_tutorial/tuto1/src/dune +++ b/doc/plugin_tutorial/tuto1/src/dune @@ -3,7 +3,4 @@ (public_name coq.plugins.tutorial.p1) (libraries coq.plugins.ltac)) -(rule - (targets g_tuto1.ml) - (deps (:pp-file g_tuto1.mlg) ) - (action (run coqpp %{pp-file}))) +(coq.pp (modules g_tuto1)) diff --git a/doc/plugin_tutorial/tuto1/src/simple_declare.ml b/doc/plugin_tutorial/tuto1/src/simple_declare.ml index 9dd4700db5..307214089f 100644 --- a/doc/plugin_tutorial/tuto1/src/simple_declare.ml +++ 
b/doc/plugin_tutorial/tuto1/src/simple_declare.ml @@ -9,4 +9,4 @@ let edeclare ?hook ~name ~poly ~scope ~kind ~opaque sigma udecl body tyopt imps let declare_definition ~poly name sigma body = let udecl = UState.default_univ_decl in edeclare ~name ~poly ~scope:(DeclareDef.Global Declare.ImportDefaultBehavior) - ~kind:Decls.Definition ~opaque:false sigma udecl body None [] + ~kind:Decls.(IsDefinition Definition) ~opaque:false sigma udecl body None [] diff --git a/doc/plugin_tutorial/tuto2/src/dune b/doc/plugin_tutorial/tuto2/src/dune index 68ddd13947..8c4b04b1ae 100644 --- a/doc/plugin_tutorial/tuto2/src/dune +++ b/doc/plugin_tutorial/tuto2/src/dune @@ -3,7 +3,4 @@ (public_name coq.plugins.tutorial.p2) (libraries coq.plugins.ltac)) -(rule - (targets g_tuto2.ml) - (deps (:pp-file g_tuto2.mlg) ) - (action (run coqpp %{pp-file}))) +(coq.pp (modules g_tuto2)) diff --git a/doc/plugin_tutorial/tuto3/src/dune b/doc/plugin_tutorial/tuto3/src/dune index ba6d8b288f..678dd71328 100644 --- a/doc/plugin_tutorial/tuto3/src/dune +++ b/doc/plugin_tutorial/tuto3/src/dune @@ -4,7 +4,4 @@ (flags :standard -warn-error -3) (libraries coq.plugins.ltac)) -(rule - (targets g_tuto3.ml) - (deps (:pp-file g_tuto3.mlg)) - (action (run coqpp %{pp-file}))) +(coq.pp (modules g_tuto3)) diff --git a/doc/sphinx/addendum/extended-pattern-matching.rst b/doc/sphinx/addendum/extended-pattern-matching.rst index b568160356..45b3f6f161 100644 --- a/doc/sphinx/addendum/extended-pattern-matching.rst +++ b/doc/sphinx/addendum/extended-pattern-matching.rst @@ -192,7 +192,7 @@ Disjunctive patterns -------------------- Multiple patterns that share the same right-hand-side can be -factorized using the notation :n:`{+| @mult_pattern}`. For +factorized using the notation :n:`{+| @patterns_comma}`. For instance, :g:`max` can be rewritten as follows: .. 
coqtop:: in reset diff --git a/doc/sphinx/addendum/extraction.rst b/doc/sphinx/addendum/extraction.rst index 3dc8707a34..7136cc28d1 100644 --- a/doc/sphinx/addendum/extraction.rst +++ b/doc/sphinx/addendum/extraction.rst @@ -127,20 +127,21 @@ Concerning Haskell, type-preserving optimizations are less useful because of laziness. We still make some optimizations, for example in order to produce more readable code. -The type-preserving optimizations are controlled by the following |Coq| options: +The type-preserving optimizations are controlled by the following |Coq| flags +and commands: .. flag:: Extraction Optimize Default is on. This controls all type-preserving optimizations made on the ML terms (mostly reduction of dummy beta/iota redexes, but also - simplifications on Cases, etc). Turn this option off if you want a + simplifications on Cases, etc). Turn this flag off if you want a ML term as close as possible to the Coq term. .. flag:: Extraction Conservative Types Default is off. This controls the non type-preserving optimizations made on ML terms (which try to avoid function abstraction of dummy - types). Turn this option on to make sure that ``e:t`` + types). Turn this flag on to make sure that ``e:t`` implies that ``e':t'`` where ``e'`` and ``t'`` are the extracted code of ``e`` and ``t`` respectively. @@ -150,7 +151,7 @@ The type-preserving optimizations are controlled by the following |Coq| options: produces a singleton type (i.e. a type with only one constructor, and only one argument to this constructor), the inductive structure is removed and this type is seen as an alias to the inner type. - The typical example is ``sig``. This option allows disabling this + The typical example is ``sig``. This flag allows disabling this optimization when one wishes to preserve the inductive structure of types. .. 
flag:: Extraction AutoInline @@ -159,7 +160,7 @@ The type-preserving optimizations are controlled by the following |Coq| options: some defined constants, according to some heuristics like size of bodies, uselessness of some arguments, etc. Those heuristics are not always perfect; if you want to disable - this feature, turn this option off. + this feature, turn this flag off. .. cmd:: Extraction Inline {+ @qualid } @@ -223,11 +224,11 @@ principles of extraction (logical parts and types). When an actual extraction takes place, an error is normally raised if the :cmd:`Extraction Implicit` declarations cannot be honored, that is if any of the implicit arguments still occurs in the final code. -This behavior can be relaxed via the following option: +This behavior can be relaxed via the following flag: .. flag:: Extraction SafeImplicits - Default is on. When this option is off, a warning is emitted + Default is on. When this flag is off, a warning is emitted instead of an error if some implicit arguments still occur in the final code of an extraction. This way, the extracted code may be obtained nonetheless and reviewed manually to locate the source of the issue @@ -282,7 +283,7 @@ Notice that in the case of type scheme axiom (i.e. whose type is an arity, that is a sequence of product finished by a sort), then some type variables have to be given (as quoted strings). The syntax is then: -.. cmdv:: Extract Constant @qualid @string ... @string => @string +.. cmdv:: Extract Constant @qualid {+ @string } => @string :undocumented: The number of type variables is checked by the system. For example: @@ -529,7 +530,7 @@ A detailed example: Euclidean division ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The file ``Euclid`` contains the proof of Euclidean division. -The natural numbers used here are unary, represented by the type``nat``, +The natural numbers used here are unary, represented by the type ``nat``, which is defined by two constructors ``O`` and ``S``. 
This module contains a theorem ``eucl_dev``, whose type is:: diff --git a/doc/sphinx/addendum/generalized-rewriting.rst b/doc/sphinx/addendum/generalized-rewriting.rst index 2ea0861e47..ca5b5e54a7 100644 --- a/doc/sphinx/addendum/generalized-rewriting.rst +++ b/doc/sphinx/addendum/generalized-rewriting.rst @@ -117,7 +117,7 @@ parameters is any term :math:`f \, t_1 \ldots t_n`. .. example:: Morphisms Continuing the previous example, let ``union: forall (A : Type), list A -> list A -> list A`` - perform the union of two sets by appending one list to the other. ``union` is a binary + perform the union of two sets by appending one list to the other. ``union`` is a binary morphism parametric over ``A`` that respects the relation instance ``(set_eq A)``. The latter condition is proved by showing: @@ -714,8 +714,10 @@ Definitions The generalized rewriting tactic is based on a set of strategies that can be combined to obtain custom rewriting procedures. Its set of strategies is based -on Elan’s rewriting strategies :cite:`Luttik97specificationof`. Rewriting -strategies are applied using the tactic :n:`rewrite_strat @strategy` where :token:`strategy` is a +on the programmable rewriting strategies with generic traversals by Visser et al. +:cite:`Luttik97specificationof` :cite:`Visser98`, which formed the core of +the Stratego transformation language :cite:`Visser01`. Rewriting strategies +are applied using the tactic :n:`rewrite_strat @strategy` where :token:`strategy` is a strategy expression. Strategies are defined inductively as described by the following grammar: @@ -739,7 +741,7 @@ following grammar: : topdown `strategy` (top-down) : hints `ident` (apply hints from hint database) : terms `term` ... 
`term` (any of the terms) - : eval `redexpr` (apply reduction) + : eval `red_expr` (apply reduction) : fold `term` (unify) : ( `strategy` ) diff --git a/doc/sphinx/addendum/implicit-coercions.rst b/doc/sphinx/addendum/implicit-coercions.rst index 7fee62179b..c3b197288f 100644 --- a/doc/sphinx/addendum/implicit-coercions.rst +++ b/doc/sphinx/addendum/implicit-coercions.rst @@ -274,7 +274,7 @@ Activating the Printing of Coercions .. flag:: Printing Coercions - When on, this option forces all the coercions to be printed. + When on, this flag forces all the coercions to be printed. By default, coercions are not printed. .. table:: Printing Coercion @qualid diff --git a/doc/sphinx/addendum/micromega.rst b/doc/sphinx/addendum/micromega.rst index e56b36caad..cc19c8b6a9 100644 --- a/doc/sphinx/addendum/micromega.rst +++ b/doc/sphinx/addendum/micromega.rst @@ -9,9 +9,11 @@ Short description of the tactics -------------------------------- The Psatz module (``Require Import Psatz.``) gives access to several -tactics for solving arithmetic goals over :math:`\mathbb{Z}`, :math:`\mathbb{Q}`, and :math:`\mathbb{R}` [#]_. -It also possible to get the tactics for integers by a ``Require Import Lia``, -rationals ``Require Import Lqa`` and reals ``Require Import Lra``. +tactics for solving arithmetic goals over :math:`\mathbb{Q}`, +:math:`\mathbb{R}`, and :math:`\mathbb{Z}` but also :g:`nat` and +:g:`N`. It is also possible to get the tactics for integers by a +``Require Import Lia``, rationals ``Require Import Lqa`` and reals +``Require Import Lra``. + :tacn:`lia` is a decision procedure for linear integer arithmetic; + :tacn:`nia` is an incomplete proof procedure for integer non-linear @@ -23,16 +25,28 @@ rationals ``Require Import Lqa`` and reals ``Require Import Lra``. ``n`` is an optional integer limiting the proof search depth, is an incomplete proof procedure for non-linear arithmetic. It is based on John Harrison’s HOL Light - driver to the external prover `csdp` [#]_. 
Note that the `csdp` driver is + driver to the external prover `csdp` [#csdp]_. Note that the `csdp` driver is generating a *proof cache* which makes it possible to rerun scripts even without `csdp`. .. flag:: Simplex - This option (set by default) instructs the decision procedures to + This flag (set by default) instructs the decision procedures to use the Simplex method for solving linear goals. If it is not set, the decision procedures are using Fourier elimination. +.. flag:: Lia Cache + + This flag (set by default) instructs :tacn:`lia` to cache its results in the file `.lia.cache`. + +.. flag:: Nia Cache + + This flag (set by default) instructs :tacn:`nia` to cache its results in the file `.nia.cache`. + +.. flag:: Nra Cache + + This flag (set by default) instructs :tacn:`nra` to cache its results in the file `.nra.cache`. + The tactics solve propositional formulas parameterized by atomic arithmetic expressions interpreted over a domain :math:`D \in \{\mathbb{Z},\mathbb{Q},\mathbb{R}\}`. @@ -78,7 +92,7 @@ closed under the following rules: \end{array}` The following theorem provides a proof principle for checking that a -set of polynomial inequalities does not have solutions [#]_. +set of polynomial inequalities does not have solutions [#fnpsatz]_. .. _psatz_thm: @@ -111,32 +125,21 @@ and checked to be :math:`-1`. The deductive power of :tacn:`lra` overlaps with the one of :tacn:`field` tactic *e.g.*, :math:`x = 10 * x / 10` is solved by :tacn:`lra`. - `lia`: a tactic for linear integer arithmetic --------------------------------------------- .. tacn:: lia :name: lia - This tactic offers an alternative to the :tacn:`omega` tactic. Roughly - speaking, the deductive power of lia is the combined deductive power of - :tacn:`ring_simplify` and :tacn:`omega`. However, it solves linear goals - that :tacn:`omega` does not solve, such as the following so-called *omega - nightmare* :cite:`TheOmegaPaper`.
+ This tactic solves linear goals over :g:`Z` by searching for *linear* refutations and cutting planes. + :tacn:`lia` provides support for :g:`Z`, :g:`nat`, :g:`positive` and :g:`N` by pre-processing via the :tacn:`zify` tactic. -.. coqdoc:: - - Goal forall x y, - 27 <= 11 * x + 13 * y <= 45 -> - -10 <= 7 * x - 9 * y <= 4 -> False. - -The estimation of the relative efficiency of :tacn:`lia` *vs* :tacn:`omega` is under evaluation. High level view of `lia` ~~~~~~~~~~~~~~~~~~~~~~~~ Over :math:`\mathbb{R}`, *positivstellensatz* refutations are a complete proof -principle [#]_. However, this is not the case over :math:`\mathbb{Z}`. Actually, +principle [#mayfail]_. However, this is not the case over :math:`\mathbb{Z}`. Actually, *positivstellensatz* refutations are not even sufficient to decide linear *integer* arithmetic. The canonical example is :math:`2 * x = 1 -> \mathtt{False}` which is a theorem of :math:`\mathbb{Z}` but not a theorem of :math:`{\mathbb{R}}`. To remedy this @@ -249,21 +252,55 @@ cone expression :math:`2 \times (x-1) + (\mathbf{x-1}) \times (\mathbf{x−1}) + belongs to :math:`\mathit{Cone}({−x^2,x -1})`. Moreover, by running :tacn:`ring` we obtain :math:`-1`. By Theorem :ref:`Psatz <psatz_thm>`, the goal is valid. -.. [#] Support for :g:`nat` and :g:`N` is obtained by pre-processing the goal with - the ``zify`` tactic. -.. [#] Support for :g:`Z.div` and :g:`Z.modulo` may be obtained by - pre-processing the goal with the ``Z.div_mod_to_equations`` tactic (you may - need to manually run ``zify`` first). -.. [#] Support for :g:`Z.quot` and :g:`Z.rem` may be obtained by pre-processing - the goal with the ``Z.quot_rem_to_equations`` tactic (you may need to manually - run ``zify`` first). -.. [#] Note that support for :g:`Z.div`, :g:`Z.modulo`, :g:`Z.quot`, and - :g:`Z.rem` may be simultaneously obtained by pre-processing the goal with the - ``Z.to_euclidean_division_equations`` tactic (you may need to manually run - ``zify`` first). -.. 
[#] Sources and binaries can be found at https://projects.coin-or.org/Csdp -.. [#] Variants deal with equalities and strict inequalities. -.. [#] In practice, the oracle might fail to produce such a refutation. +`zify`: pre-processing of arithmetic goals +------------------------------------------ + +.. tacn:: zify + :name: zify + + This tactic is internally called by :tacn:`lia` to support additional types, e.g., :g:`nat`, :g:`positive` and :g:`N`. + By requiring the module ``ZifyBool``, the boolean type :g:`bool` and some comparison operators are also supported. + :tacn:`zify` can also be extended by rebinding the tactic `Zify.zify_post_hook`, which is run immediately after :tacn:`zify`. + + + To support :g:`Z.div` and :g:`Z.modulo`: ``Ltac Zify.zify_post_hook ::= Z.div_mod_to_equations``. + + To support :g:`Z.quot` and :g:`Z.rem`: ``Ltac Zify.zify_post_hook ::= Z.quot_rem_to_equations``. + + To support :g:`Z.div`, :g:`Z.modulo`, :g:`Z.quot`, and :g:`Z.rem`: ``Ltac Zify.zify_post_hook ::= Z.to_euclidean_division_equations``. + + +.. cmd:: Show Zify InjTyp + :name: Show Zify InjTyp + + This command shows the list of types that can be injected into :g:`Z`. + +.. cmd:: Show Zify BinOp + :name: Show Zify BinOp + + This command shows the list of binary operators processed by :tacn:`zify`. + +.. cmd:: Show Zify BinRel + :name: Show Zify BinRel + + This command shows the list of binary relations processed by :tacn:`zify`. + + +.. cmd:: Show Zify UnOp + :name: Show Zify UnOp + + This command shows the list of unary operators processed by :tacn:`zify`. + +.. cmd:: Show Zify CstOp + :name: Show Zify CstOp + + This command shows the list of constants processed by :tacn:`zify`. + +.. cmd:: Show Zify Spec + :name: Show Zify Spec + + This command shows the list of operators over :g:`Z` that are compiled using their specification, e.g., :g:`Z.min`. + +.. [#csdp] Sources and binaries can be found at https://projects.coin-or.org/Csdp +..
[#fnpsatz] Variants deal with equalities and strict inequalities. +.. [#mayfail] In practice, the oracle might fail to produce such a refutation. .. comment in original TeX: .. %% \paragraph{The {\tt sos} tactic} -- where {\tt sos} stands for \emph{sum of squares} -- tries to prove that a diff --git a/doc/sphinx/addendum/omega.rst b/doc/sphinx/addendum/omega.rst index b008508bbc..650a444a16 100644 --- a/doc/sphinx/addendum/omega.rst +++ b/doc/sphinx/addendum/omega.rst @@ -119,21 +119,21 @@ Options .. deprecated:: 8.5 - This deprecated option (on by default) is for compatibility with Coq pre 8.5. It + This deprecated flag (on by default) is for compatibility with Coq pre 8.5. It resets internal name counters to make executions of :tacn:`omega` independent. .. flag:: Omega UseLocalDefs - This option (on by default) allows :tacn:`omega` to use the bodies of local + This flag (on by default) allows :tacn:`omega` to use the bodies of local variables. .. flag:: Omega System - This option (off by default) activate the printing of debug information + This flag (off by default) activates the printing of debug information. .. flag:: Omega Action - This option (off by default) activate the printing of debug information + This flag (off by default) activates the printing of debug information. Technical data -------------- diff --git a/doc/sphinx/addendum/parallel-proof-processing.rst b/doc/sphinx/addendum/parallel-proof-processing.rst index 903ee115c9..35729d852d 100644 --- a/doc/sphinx/addendum/parallel-proof-processing.rst +++ b/doc/sphinx/addendum/parallel-proof-processing.rst @@ -58,7 +58,7 @@ variables used. Automatic suggestion of proof annotations ````````````````````````````````````````` -The flag :flag:`Suggest Proof Using` makes |Coq| suggest, when a ``Qed`` +The :flag:`Suggest Proof Using` flag makes |Coq| suggest, when a ``Qed`` command is processed, a correct proof annotation. It is up to the user to modify the proof script accordingly.
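The kind of annotation that :flag:`Suggest Proof Using` proposes can be illustrated with a hedged sketch (the section, variable, and lemma names are made up; ``Nat.add_comm`` comes from the ``Arith`` library):

```coq
Require Import Arith.

Section Demo.
  Variables x y z : nat.

  (* "Proof using x y" records that the proof depends only on the
     section variables x and y, so z need not be part of the context
     handed to a proof worker. *)
  Lemma add_comm_xy : x + y = y + x.
  Proof using x y.
    apply Nat.add_comm.
  Qed.
End Demo.
```

The exact annotation |Coq| suggests for a given proof may of course differ.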
@@ -162,7 +162,7 @@ need to process all the proofs of the ``.v`` file. The asynchronous processing of proofs can decouple the generation of a compiled file (like the ``.vo`` one) that can be loaded by ``Require`` from the generation and checking of the proof objects. The ``-quick`` flag can be -passed to ``coqc`` or ``coqtop`` to produce, quickly, ``.vio`` files. +passed to ``coqc`` to produce, quickly, ``.vio`` files. Alternatively, when using a Makefile produced by ``coq_makefile``, the ``quick`` target can be used to compile all files using the ``-quick`` flag. @@ -182,7 +182,7 @@ running ``coqc`` as usual. Alternatively one can turn each ``.vio`` into the corresponding ``.vo``. All .vio files can be processed in parallel, hence this alternative might -be faster. The command ``coqtop -schedule-vio2vo 2 a b c`` can be used to +be faster. The command ``coqc -schedule-vio2vo 2 a b c`` can be used to obtain a good scheduling for two workers to produce ``a.vo``, ``b.vo``, and ``c.vo``. When using a Makefile produced by ``coq_makefile``, the ``vio2vo`` target can be used for that purpose. Variable ``J`` should be set to the number @@ -197,7 +197,7 @@ There is an extra, possibly even faster, alternative: just check the proof tasks stored in ``.vio`` files without producing the ``.vo`` files. This is possibly faster because all the proof tasks are independent, hence one can further partition the job to be done between workers. The -``coqtop -schedule-vio-checking 6 a b c`` command can be used to obtain a +``coqc -schedule-vio-checking 6 a b c`` command can be used to obtain a good scheduling for 6 workers to check all the proof tasks of ``a.vio``, ``b.vio``, and ``c.vio``. 
Auxiliary files are used to predict how long a proof task will take, assuming it will take the same amount of time it took diff --git a/doc/sphinx/addendum/program.rst b/doc/sphinx/addendum/program.rst index 45c74ab02a..a17dca1693 100644 --- a/doc/sphinx/addendum/program.rst +++ b/doc/sphinx/addendum/program.rst @@ -78,7 +78,7 @@ operation (see :ref:`extendedpatternmatching`). also works with the previous mechanism. -There are options to control the generation of equalities and +There are flags to control the generation of equalities and coercions. .. flag:: Program Cases @@ -86,13 +86,13 @@ coercions. This controls the special treatment of pattern matching generating equalities and disequalities when using |Program| (it is on by default). All pattern-matches and let-patterns are handled using the standard algorithm - of |Coq| (see :ref:`extendedpatternmatching`) when this option is + of |Coq| (see :ref:`extendedpatternmatching`) when this flag is deactivated. .. flag:: Program Generalized Coercion This controls the coercion of general inductive types when using |Program| - (the option is on by default). Coercion of subset types and pairs is still + (the flag is on by default). Coercion of subset types and pairs is still active in this case. .. flag:: Program Mode @@ -341,9 +341,9 @@ optional tactic is replaced by the default one if not specified. .. flag:: Shrink Obligations - *Deprecated since 8.7* + .. deprecated:: 8.7 - This option (on by default) controls whether obligations should have + This flag (on by default) controls whether obligations should have their context minimized to the set of variables used in the proof of the obligation, to avoid unnecessary dependencies. 
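As a hedged sketch of the obligation machinery these |Program| flags control (the definition is illustrative, and the default obligation tactic may already discharge the generated goals on its own):

```coq
Require Import Program Arith Lia.

(* Program elaborates the body and turns the missing proofs --
   here, that the measure decreases on the recursive call --
   into obligations. *)
Program Fixpoint div2 (n : nat) {measure n} : nat :=
  match n with
  | S (S p) => S (div2 p)
  | _ => 0
  end.
(* Any obligations left over by the default tactic can be
   discharged explicitly, e.g.: *)
Solve All Obligations with (program_simpl; lia).
```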
diff --git a/doc/sphinx/addendum/sprop.rst b/doc/sphinx/addendum/sprop.rst index 8935ba27e3..9acdd18b89 100644 --- a/doc/sphinx/addendum/sprop.rst +++ b/doc/sphinx/addendum/sprop.rst @@ -9,15 +9,18 @@ SProp (proof irrelevant propositions) This section describes the extension of |Coq| with definitionally proof irrelevant propositions (types in the sort :math:`\SProp`, also -known as strict propositions). To use :math:`\SProp` you must pass -``-allow-sprop`` to the |Coq| program or use :flag:`Allow StrictProp`. +known as strict propositions) as described in +:cite:`Gilbert:POPL2019`. + +Using :math:`\SProp` may be prevented by passing ``-disallow-sprop`` +to the |Coq| program or unsetting :flag:`Allow StrictProp`. .. flag:: Allow StrictProp :name: Allow StrictProp Allows using :math:`\SProp` when set and forbids it when unset. The initial value depends on whether you used the command line - ``-allow-sprop``. + ``-disallow-sprop`` or ``-allow-sprop``. .. exn:: SProp not allowed, you need to Set Allow StrictProp or to use the -allow-sprop command-line-flag. :undocumented: diff --git a/doc/sphinx/addendum/type-classes.rst b/doc/sphinx/addendum/type-classes.rst index db3e20a9c6..661aa88082 100644 --- a/doc/sphinx/addendum/type-classes.rst +++ b/doc/sphinx/addendum/type-classes.rst @@ -47,9 +47,22 @@ Leibniz equality on some type. An example implementation is: | tt, tt => eq_refl tt end }. -Using :cmd:`Program Instance`, if one does not give all the members in -the Instance declaration, Coq generates obligations for the remaining -fields, e.g.: +Using the attribute ``refine``, if the term is not sufficient to +finish the definition (e.g. due to a missing field or a non-inferable +hole), the definition must be finished in proof mode. If the term is sufficient, a trivial +proof mode with no open goals is started. + +.. coqtop:: in + + #[refine] Instance unit_EqDec' : EqDec unit := { eqb x y := true }. + Proof. intros [] [];reflexivity. Defined.
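On the same pattern, a hedged sketch (the class and instance names are made up) where a non-inferable hole, rather than a missing field, becomes the goal of the ensuing proof mode:

```coq
Class Default (A : Type) := { default : A }.

(* The underscore is a hole that cannot be inferred; with #[refine]
   it is turned into a goal to be filled in proof mode. *)
#[refine] Instance nat_Default : Default nat := { default := _ }.
Proof. exact 0. Defined.
```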
+ +Note that if you finish the proof with :cmd:`Qed`, the entire instance +will be opaque, including the fields given in the initial term. + +Alternatively, in :flag:`Program Mode`, if one does not give all the +members in the Instance declaration, Coq generates obligations for the +remaining fields, e.g.: .. coqtop:: in @@ -560,8 +573,16 @@ Settings Determines how much information is shown for typeclass resolution steps during search. 1 is the default level. 2 shows additional information such as tried tactics and shelving - of goals. Setting this option to 1 or 2 turns on :flag:`Typeclasses Debug`; setting this - option to 0 turns that option off. + of goals. Setting this option to 1 or 2 turns on the :flag:`Typeclasses Debug` flag; setting this + option to 0 turns that flag off. + +.. flag:: Typeclasses Axioms Are Instances + + .. deprecated:: 8.10 + + This flag (off by default since 8.8) automatically declares axioms + whose type is a typeclass at declaration time as instances of that + class. Typeclasses eauto `:=` ~~~~~~~~~~~~~~~~~~~~~~ diff --git a/doc/sphinx/addendum/universe-polymorphism.rst b/doc/sphinx/addendum/universe-polymorphism.rst index 7e698bfb66..7adb25cbd6 100644 --- a/doc/sphinx/addendum/universe-polymorphism.rst +++ b/doc/sphinx/addendum/universe-polymorphism.rst @@ -129,12 +129,12 @@ Polymorphic, Monomorphic .. flag:: Universe Polymorphism - Once enabled, this option will implicitly prepend ``Polymorphic`` to any + Once enabled, this flag will implicitly prepend ``Polymorphic`` to any definition of the user. .. cmd:: Monomorphic @definition - When the :flag:`Universe Polymorphism` option is set, to make a definition + When the :flag:`Universe Polymorphism` flag is set, to make a definition producing global universe constraints, one can use the ``Monomorphic`` prefix.
Many other commands support the ``Polymorphic`` flag, including: @@ -147,14 +147,7 @@ Many other commands support the ``Polymorphic`` flag, including: - :cmd:`Section` will locally set the polymorphism flag inside the section. - ``Variables``, ``Context``, ``Universe`` and ``Constraint`` in a section support - polymorphism. This means that the universe variables (and associated - constraints) are discharged polymorphically over definitions that use - them. In other words, two definitions in the section sharing a common - variable will both get parameterized by the universes produced by the - variable declaration. This is in contrast to a “mononorphic” variable - which introduces global universes and constraints, making the two - definitions depend on the *same* global universes associated to the - variable. + polymorphism. See :ref:`universe-polymorphism-in-sections` for more details. - :cmd:`Hint Resolve` and :cmd:`Hint Rewrite` will use the auto/rewrite hint polymorphically, not at a single instance. @@ -169,8 +162,8 @@ declared cumulative using the :g:`Cumulative` prefix. Declares the inductive as cumulative -Alternatively, there is a flag :flag:`Polymorphic Inductive -Cumulativity` which when set, makes all subsequent *polymorphic* +Alternatively, there is a :flag:`Polymorphic Inductive +Cumulativity` flag which when set, makes all subsequent *polymorphic* inductive definitions cumulative. When set, inductive types and the like can be enforced to be non-cumulative using the :g:`NonCumulative` prefix. @@ -181,7 +174,7 @@ prefix. .. flag:: Polymorphic Inductive Cumulativity - When this option is on, it sets all following polymorphic inductive + When this flag is on, it sets all following polymorphic inductive types as cumulative (it is off by default). Consider the examples below. @@ -229,8 +222,8 @@ Cumulative inductive types, coinductive types, variants and records only make sense when they are universe polymorphic. 
Therefore, an error is issued whenever the user uses the :g:`Cumulative` or :g:`NonCumulative` prefix in a monomorphic context. -Notice that this is not the case for the option :flag:`Polymorphic Inductive Cumulativity`. -That is, this option, when set, makes all subsequent *polymorphic* +Notice that this is not the case for the :flag:`Polymorphic Inductive Cumulativity` flag. +That is, this flag, when set, makes all subsequent *polymorphic* inductive declarations cumulative (unless, of course the :g:`NonCumulative` prefix is used) but has no effect on *monomorphic* inductive declarations. @@ -375,9 +368,7 @@ to universes and explicitly instantiate polymorphic definitions. as well. Global universe names live in a separate namespace. The command supports the ``Polymorphic`` flag only in sections, meaning the universe quantification will be discharged on each section definition - independently. One cannot mix polymorphic and monomorphic - declarations in the same section. - + independently. .. cmd:: Constraint @universe_constraint Polymorphic Constraint @universe_constraint @@ -448,7 +439,7 @@ underscore or by omitting the annotation to a polymorphic definition. .. flag:: Strict Universe Declaration - Turning this option off allows one to freely use + Turning this flag off allows one to freely use identifiers for universes without declaring them first, with the semantics that the first use declares it. In this mode, the universe names are not associated with the definition or proof once it has been @@ -456,7 +447,7 @@ underscore or by omitting the annotation to a polymorphic definition. .. flag:: Private Polymorphic Universes - This option, on by default, removes universes which appear only in + This flag, on by default, removes universes which appear only in the body of an opaque polymorphic definition from the definition's universe arguments. As such, no value needs to be provided for these universes when instantiating the definition. 
Universe @@ -489,7 +480,7 @@ underscore or by omitting the annotation to a polymorphic definition. About foo. To recover the same behaviour with regard to universes as - :g:`Defined`, the option :flag:`Private Polymorphic Universes` may + :g:`Defined`, the :flag:`Private Polymorphic Universes` flag may be unset: .. coqtop:: all @@ -510,3 +501,51 @@ underscore or by omitting the annotation to a polymorphic definition. Lemma baz : Type@{outer}. Proof. exact Type@{inner}. Qed. About baz. + +.. _universe-polymorphism-in-sections: + +Universe polymorphism and sections +---------------------------------- + +:cmd:`Variables`, :cmd:`Context`, :cmd:`Universe` and +:cmd:`Constraint` in a section support polymorphism. This means that +the universe variables and their associated constraints are discharged +polymorphically over definitions that use them. In other words, two +definitions in the section sharing a common variable will both get +parameterized by the universes produced by the variable declaration. +This is in contrast to a “monomorphic” variable which introduces +global universes and constraints, making the two definitions depend on +the *same* global universes associated to the variable. + +It is possible to mix universe polymorphism and monomorphism in +sections, except in the following ways: + +- no monomorphic constraint may refer to a polymorphic universe: + + .. coqtop:: all reset + + Section Foo. + + Polymorphic Universe i. + Fail Constraint i = i. + + This includes constraints implicitly declared by commands such as + :cmd:`Variable`, which may as such need to be used with universe + polymorphism activated (locally by attribute or globally by option): + + .. coqtop:: all + + Fail Variable A : (Type@{i} : Type). + Polymorphic Variable A : (Type@{i} : Type). + + (In the above example the anonymous :g:`Type` constrains the polymorphic + universe :g:`i` to be strictly smaller.)
+ +- no monomorphic constant or inductive may be declared if polymorphic + universes or universe constraints are present. + +These restrictions are required in order to produce a sensible result +when closing the section (the requirement on constants and inductives +is stricter than the one on constraints, because constants and +inductives are abstracted by *all* the section's polymorphic universes +and constraints). diff --git a/doc/sphinx/biblio.bib b/doc/sphinx/biblio.bib index 85b02013d8..3d73f9bd6e 100644 --- a/doc/sphinx/biblio.bib +++ b/doc/sphinx/biblio.bib @@ -192,7 +192,7 @@ s}, @InProceedings{Del00, author = {Delahaye, D.}, - title = {A {T}actic {L}anguage for the {S}ystem {{\sf Coq}}}, + title = {A {T}actic {L}anguage for the {S}ystem {Coq}}, booktitle = {Proceedings of Logic for Programming and Automated Reasoning (LPAR), Reunion Island}, publisher = SV, @@ -222,6 +222,25 @@ s}, year = {1890} } +@article{Gilbert:POPL2019, + author = {Gilbert, Ga\"{e}tan and Cockx, Jesper and Sozeau, Matthieu and Tabareau, Nicolas}, + title = {{Definitional Proof Irrelevance Without K}}, + journal = {Proc. ACM Program. Lang.}, + issue_date = {January 2019}, + volume = {3}, + number = {POPL}, + year = {2019}, + issn = {2475-1421}, + pages = {3:1--3:28}, + articleno = {3}, + numpages = {28}, + url = {http://doi.acm.org/10.1145/3290316}, + acmid = {3290316}, + publisher = {ACM}, + address = {New York, NY, USA}, + keywords = {proof assistants, proof irrelevance, type theory}, +} + @InProceedings{Gim94, author = {E. Gim\'enez}, booktitle = {Types'94 : Types for Proofs and Programs}, @@ -340,6 +359,27 @@ s}, year = {1997} } +@inproceedings{Visser98, + author = {Eelco Visser and + Zine{-}El{-}Abidine Benaissa and + Andrew P. 
Tolmach}, + title = {Building Program Optimizers with Rewriting Strategies}, + booktitle = {ICFP}, + pages = {13--26}, + year = {1998}, +} + +@inproceedings{Visser01, + author = {Eelco Visser}, + title = {Stratego: {A} Language for Program Transformation Based on Rewriting + Strategies}, + booktitle = {RTA}, + pages = {357--362}, + year = {2001}, + series = {LNCS}, + volume = {2051}, +} + @InProceedings{DBLP:conf/types/McBride00, author = {Conor McBride}, title = {Elimination with a Motive}, diff --git a/doc/sphinx/changes.rst b/doc/sphinx/changes.rst index 6ac55e7bf4..80a24b997c 100644 --- a/doc/sphinx/changes.rst +++ b/doc/sphinx/changes.rst @@ -198,21 +198,21 @@ Melquiond, Matthieu Sozeau, Enrico Tassi (who migrated it to opam 2) with contributions from many users. A list of packages is available at https://coq.inria.fr/opam/www/. -The 61 contributors to this version are David A. Dalrymple, Tanaka -Akira, Benjamin Barenblat, Yves Bertot, Frédéric Besson, Lasse -Blaauwbroek, Martin Bodin, Joachim Breitner, Tej Chajed, Frédéric -Chapoton, Arthur Charguéraud, Cyril Cohen, Lukasz Czajka, Christian -Doczkal, Maxime Dénès, Andres Erbsen, Jim Fehrle, Gaëtan Gilbert, Matěj -Grabovský, Simon Gregersen, Jason Gross, Samuel Gruetter, Hugo Herbelin, -Jasper Hugunin, Mirai Ikebuchi, Emilio Jesus Gallego Arias, Chantal -Keller, Matej Košík, Vincent Laporte, Olivier Laurent, Larry Darryl Lee -Jr, Pierre Letouzey, Nick Lewycky, Yao Li, Yishuai Li, Xia Li-yao, Assia -Mahboubi, Simon Marechal, Erik Martin-Dorel, Thierry Martinez, Guillaume -Melquiond, Kayla Ngan, Sam Pablo Kuper, Karl Palmskog, Clément -Pit-Claudel, Pierre-Marie Pédrot, Pierre Roux, Kazuhiko Sakaguchi, Ryan -Scott, Vincent Semeria, Gan Shen, Michael Soegtrop, Matthieu Sozeau, -Enrico Tassi, Laurent Théry, Kamil Trzciński, whitequark, Théo -Winterhalter, Beta Ziliani and Théo Zimmermann. 
+The 61 contributors to this version are Tanaka Akira, Benjamin +Barenblat, Yves Bertot, Frédéric Besson, Lasse Blaauwbroek, Martin +Bodin, Joachim Breitner, Tej Chajed, Frédéric Chapoton, Arthur +Charguéraud, Cyril Cohen, Lukasz Czajka, David A. Dalrymple, Christian +Doczkal, Maxime Dénès, Andres Erbsen, Jim Fehrle, Emilio Jesus Gallego +Arias, Gaëtan Gilbert, Matěj Grabovský, Simon Gregersen, Jason Gross, +Samuel Gruetter, Hugo Herbelin, Jasper Hugunin, Mirai Ikebuchi, +Chantal Keller, Matej Košík, Sam Pablo Kuper, Vincent Laporte, Olivier +Laurent, Larry Darryl Lee Jr, Nick Lewycky, Yao Li, Yishuai Li, Assia +Mahboubi, Simon Marechal, Erik Martin-Dorel, Thierry Martinez, +Guillaume Melquiond, Kayla Ngan, Karl Palmskog, Pierre-Marie Pédrot, +Clément Pit-Claudel, Pierre Roux, Kazuhiko Sakaguchi, Ryan Scott, +Vincent Semeria, Gan Shen, Michael Soegtrop, Matthieu Sozeau, Enrico +Tassi, Laurent Théry, Kamil Trzciński, whitequark, Théo Winterhalter, +Xia Li-yao, Beta Ziliani and Théo Zimmermann. Many power users helped to improve the design of the new features via the issue and pull request system, the |Coq| development mailing list, @@ -649,6 +649,121 @@ Many bug fixes and documentation improvements, in particular: (in Proof General) `#421 <https://github.com/ProofGeneral/PG/pull/421>`_, by Jim Fehrle). +Changes in 8.10+beta3 +~~~~~~~~~~~~~~~~~~~~~ + +**Kernel** + +- Fix soundness issue with template polymorphism (`#9294 + <https://github.com/coq/coq/issues/9294>`_). + + Declarations of template-polymorphic inductive types ignored the + provenance of the universes they were abstracting on and did not + detect if they should be greater or equal to :math:`\Set` in + general. Previous universes and universes introduced by the inductive + definition could have constraints that prevented their instantiation + with e.g. :math:`\Prop`, resulting in unsound instantiations later. 
The + implemented fix only allows abstraction over universes introduced by + the inductive declaration, and properly records all their constraints + by making them by default only :math:`>= \Prop`. It is also checked + that a template polymorphic inductive is actually polymorphic on at + least one universe. + + This prevents inductive declarations in sections from being universe + polymorphic over section parameters. For a backward compatible fix, + simply hoist the inductive definition out of the section. + An alternative is to declare the inductive as universe-polymorphic and + cumulative in a universe-polymorphic section: all universes and + constraints will be properly gathered in this case. + See :ref:`Template-polymorphism` for a detailed exposition of the + rules governing template-polymorphic types. + + To help users incrementally fix this issue, a command line option + `-no-template-check` and a global flag :flag:`Template Check` are + available to selectively disable the new check. Use at your own risk. + + (`#9918 <https://github.com/coq/coq/pull/9918>`_, by Matthieu Sozeau + and Maxime Dénès). + +**User messages** + +- Improve the ambiguous paths warning to indicate which path is ambiguous with the + new one + (`#10336 <https://github.com/coq/coq/pull/10336>`_, + closes `#3219 <https://github.com/coq/coq/issues/3219>`_, + by Kazuhiko Sakaguchi). + +**Extraction** + +- Fix extraction to OCaml of primitive machine integers; + see :ref:`primitive-integers` + (`#10430 <https://github.com/coq/coq/pull/10430>`_, + fixes `#10361 <https://github.com/coq/coq/issues/10361>`_, + by Vincent Laporte). +- Fix a printing bug of OCaml extraction on dependent record projections, which + produced improper `assert false`.
This change makes the OCaml extractor + internally inline record projections by default; thus the monolithic OCaml + extraction (:cmd:`Extraction` and :cmd:`Recursive Extraction`) does not + produce record projection constants anymore, except for record projections + explicitly instructed to be extracted, and records declared in opaque modules + (`#10577 <https://github.com/coq/coq/pull/10577>`_, + fixes `#7348 <https://github.com/coq/coq/issues/7348>`_, + by Kazuhiko Sakaguchi). + +**Standard library** + +- Added ``splitat`` function and lemmas about ``splitat`` and ``uncons`` + (`#9379 <https://github.com/coq/coq/pull/9379>`_, + by Yishuai Li, with the help of Konstantinos Kallas, + follow-up of `#8365 <https://github.com/coq/coq/pull/8365>`_, + which added ``uncons`` in 8.10+beta1). + +Changes in 8.10.0 +~~~~~~~~~~~~~~~~~ + +- Micromega tactics (:tacn:`lia`, :tacn:`nia`, etc.) are no longer confused by + primitive projections (`#10806 <https://github.com/coq/coq/pull/10806>`_, + fixes `#9512 <https://github.com/coq/coq/issues/9512>`_, + by Vincent Laporte). + +Changes in 8.10.1 +~~~~~~~~~~~~~~~~~ + +A few bug fixes and documentation improvements, in particular: + +**Kernel** + +- Fix proof of False when using |SProp| (incorrect De Bruijn handling + when inferring the relevance mark of a function) (`#10904 + <https://github.com/coq/coq/pull/10904>`_, by Pierre-Marie Pédrot). + +**Tactics** + +- Fix an anomaly when an evar is left unsolved in :cmd:`Add Ring` + (`#10891 <https://github.com/coq/coq/pull/10891>`_, + fixes `#9851 <https://github.com/coq/coq/issues/9851>`_, + by Gaëtan Gilbert). + +**Tactic language** + +- Fix Ltac regression in binding free names in uconstr + (`#10899 <https://github.com/coq/coq/pull/10899>`_, + fixes `#10894 <https://github.com/coq/coq/issues/10894>`_, + by Hugo Herbelin).
+ +**CoqIDE** + +- Fix handling of unicode input before space + (`#10852 <https://github.com/coq/coq/pull/10852>`_, + fixes `#10842 <https://github.com/coq/coq/issues/10842>`_, + by Arthur Charguéraud). + +**Extraction** + +- Fix custom extraction of inductives to JSON + (`#10897 <https://github.com/coq/coq/pull/10897>`_, + fixes `#4741 <https://github.com/coq/coq/issues/4741>`_, + by Helge Bahmann). Version 8.9 ----------- @@ -894,8 +1009,8 @@ Standard Library and other packages. They are still delimited by `%int` and `%uint`. - Syntax notations for `string`, `ascii`, `Z`, `positive`, `N`, `R`, - and `int31` are no longer available merely by `Require`ing the files - that define the inductives. You must `Import` `Coq.Strings.String.StringSyntax` + and `int31` are no longer available merely by :cmd:`Require`\ing the files + that define the inductives. You must :cmd:`Import` `Coq.Strings.String.StringSyntax` (after `Require` `Coq.Strings.String`), `Coq.Strings.Ascii.AsciiSyntax` (after `Require` `Coq.Strings.Ascii`), `Coq.ZArith.BinIntDef`, `Coq.PArith.BinPosDef`, `Coq.NArith.BinNatDef`, `Coq.Reals.Rdefinitions`, and diff --git a/doc/sphinx/conf.py b/doc/sphinx/conf.py index 867a19efe5..f1dd7479c5 100755 --- a/doc/sphinx/conf.py +++ b/doc/sphinx/conf.py @@ -183,18 +183,17 @@ todo_include_todos = False nitpicky = True nitpick_ignore = [ ('token', token) for token in [ - 'tactic', - # 142 occurrences currently sort of defined in the ltac chapter, - # but is it the right place? 
- 'module', - 'redexpr', - 'modpath', - 'dirpath', 'collection', + 'command', + 'dirpath', + 'modpath', + 'module', + 'red_expr', + 'symbol', + 'tactic', 'term_pattern', 'term_pattern_string', - 'command', - 'symbol' ]] +]] # -- Options for HTML output ---------------------------------------------- diff --git a/doc/sphinx/language/cic.rst b/doc/sphinx/language/cic.rst index ef183174d7..4beaff70f5 100644 --- a/doc/sphinx/language/cic.rst +++ b/doc/sphinx/language/cic.rst @@ -49,7 +49,8 @@ The sort :math:`\SProp` is like :math:`\Prop` but the propositions in equal). Objects of type :math:`\SProp` are called strict propositions. :math:`\SProp` is rejected except when using the compiler option ``-allow-sprop``. See :ref:`sprop` for information about using -:math:`\SProp`. +:math:`\SProp`, and :cite:`Gilbert:POPL2019` for meta theoretical +considerations. The sort :math:`\Set` intends to be the type of small sets. This includes data types such as booleans and naturals, but also products, subsets, and @@ -70,7 +71,7 @@ and function types over these sorts. Formally, we call :math:`\Sort` the set of sorts which is defined by: .. math:: - + \Sort \equiv \{\SProp,\Prop,\Set,\Type(i)\;|\; i~∈ ℕ\} Their properties, such as: :math:`\Prop:\Type(1)`, :math:`\Set:\Type(1)`, and @@ -436,7 +437,7 @@ instance the identity function over a given type :math:`T` can be written this a *reduction* (or a *conversion*) rule we call :math:`β`: .. math:: - + E[Γ] ⊢ ((λx:T.~t)~u)~\triangleright_β~\subst{t}{x}{u} We say that :math:`\subst{t}{x}{u}` is the *β-contraction* of @@ -474,14 +475,14 @@ with its value, that is to expand (or unfold) it into its value. This reduction is called δ-reduction and shows as follows. .. inference:: Delta-Local - + \WFE{\Gamma} (x:=t:T) ∈ Γ -------------- E[Γ] ⊢ x~\triangleright_Δ~t .. inference:: Delta-Global - + \WFE{\Gamma} (c:=t:T) ∈ E -------------- @@ -499,7 +500,7 @@ destroyed, this reduction differs from δ-reduction. 
It is called ζ-reduction and shows as follows. .. inference:: Zeta - + \WFE{\Gamma} \WTEG{u}{U} \WTE{\Gamma::(x:=u:U)}{t}{T} @@ -533,17 +534,17 @@ for :math:`x` an arbitrary variable name fresh in :math:`t`. .. math:: f ~:~ ∀ x:\Type(2),~\Type(1) - + then .. math:: λ x:\Type(1).~(f~x) ~:~ ∀ x:\Type(1),~\Type(1) - + We could not allow .. math:: λ x:\Type(1).~(f~x) ~\triangleright_η~ f - + because the type of the reduced term :math:`∀ x:\Type(2),~\Type(1)` would not be convertible to the type of the original term :math:`∀ x:\Type(1),~\Type(1)`. @@ -665,7 +666,7 @@ a *subtyping* relation inductively defined by: .. math:: [c_1 : ∀Γ_P' ,∀ T_{1,1}' … T_{1,n_1}' ,~t'~v_{1,1}' … v_{1,m}' ;~…;~ c_k : ∀Γ_P' ,∀ T_{k,1}' … T_{k,n_k}' ,~t'~v_{k,1}' … v_{k,m}' ] - + respectively then .. math:: @@ -695,7 +696,7 @@ a *subtyping* relation inductively defined by: The conversion rule up to subtyping is now exactly: .. inference:: Conv - + E[Γ] ⊢ U : s E[Γ] ⊢ t : T E[Γ] ⊢ T ≤_{βδιζη} U @@ -716,13 +717,13 @@ that :math:`t_0` is :math:`λ x:T.~u_0` then one step of β-head reduction of :m .. math:: λ x_1 :T_1 .~… λ x_k :T_k .~(λ x:T.~u_0~t_1 … t_n ) ~\triangleright~ λ (x_1 :T_1 )…(x_k :T_k ).~(\subst{u_0}{x}{t_1}~t_2 … t_n ) - + Iterating the process of head reduction until the head of the reduced term is no more an abstraction leads to the *β-head normal form* of :math:`t`: .. math:: t \triangleright … \triangleright λ x_1 :T_1 .~…λ x_k :T_k .~(v~u_1 … u_m ) - + where :math:`v` is not an abstraction (nor an application). Note that the head normal form must not be confused with the normal form since some :math:`u_i` can be reducible. Similar notions of head-normal forms involving δ, ι @@ -828,7 +829,7 @@ We have to give the type of constants in a global environment :math:`E` which contains an inductive definition. .. inference:: Ind - + \WFE{Γ} \ind{p}{Γ_I}{Γ_C} ∈ E (a:A)∈Γ_I @@ -836,7 +837,7 @@ contains an inductive definition. E[Γ] ⊢ a : A .. 
inference:: Constr - + \WFE{Γ} \ind{p}{Γ_I}{Γ_C} ∈ E (c:C)∈Γ_C @@ -917,7 +918,7 @@ condition* for a constant :math:`X` in the following cases: + :math:`T=(X~t_1 … t_n )` and :math:`X` does not occur free in any :math:`t_i` + :math:`T=∀ x:U,~V` and :math:`X` occurs only strictly positively in :math:`U` and the type :math:`V` satisfies the positivity condition for :math:`X`. - + Strict positivity +++++++++++++++++ @@ -931,10 +932,10 @@ cases: strictly positively in type :math:`V` + :math:`T` converts to :math:`(I~a_1 … a_m~t_1 … t_p )` where :math:`I` is the name of an inductive definition of the form - + .. math:: \ind{m}{I:A}{c_1 :∀ p_1 :P_1 ,… ∀p_m :P_m ,~C_1 ;~…;~c_n :∀ p_1 :P_1 ,… ∀p_m :P_m ,~C_n} - + (in particular, it is not mutually defined and it has :math:`m` parameters) and :math:`X` does not occur in any of the :math:`t_i`, and the (instantiated) types of constructor @@ -998,7 +999,7 @@ such that :math:`Γ_I` is :math:`[I_1 :∀ Γ_P ,A_1 ;~…;~I_k :∀ Γ_P ,A_k]` (E[Γ_I ;Γ_P ] ⊢ C_i : s_{q_i} )_{i=1… n} ------------------------------------------ \WF{E;~\ind{p}{Γ_I}{Γ_C}}{} - + provided that the following side conditions hold: @@ -1046,36 +1047,77 @@ between universes for inductive types in the Type hierarchy. exT_intro : forall X:Type, P X -> exType P. +.. example:: Negative occurrence (first example) -.. _Template-polymorphism: + The following inductive definition is rejected because it does not + satisfy the positivity condition: -Template polymorphism -+++++++++++++++++++++ + .. coqtop:: all -Inductive types can be made polymorphic over their arguments -in :math:`\Type`. + Fail Inductive I : Prop := not_I_I (not_I : I -> False) : I. -.. 
flag:: Auto Template Polymorphism + If we were to accept such a definition, we could derive a + contradiction from it (we can test this by disabling the + :flag:`Positivity Checking` flag): - This option, enabled by default, makes every inductive type declared - at level :math:`\Type` (without annotations or hiding it behind a - definition) template polymorphic. + .. coqtop:: none - This can be prevented using the ``notemplate`` attribute. + Unset Positivity Checking. + Inductive I : Prop := not_I_I (not_I : I -> False) : I. + Set Positivity Checking. - An inductive type can be forced to be template polymorphic using the - ``template`` attribute. + .. coqtop:: all - Template polymorphism and universe polymorphism (see Chapter - :ref:`polymorphicuniverses`) are incompatible, so if the later is - enabled it will prevail over automatic template polymorphism and - cause an error when using the ``template`` attribute. + Definition I_not_I : I -> ~ I := fun i => + match i with not_I_I not_I => not_I end. -.. warn:: Automatically declaring @ident as template polymorphic. + .. coqtop:: in - Warning ``auto-template`` can be used to find which types are - implicitly declared template polymorphic by :flag:`Auto Template - Polymorphism`. + Lemma contradiction : False. + Proof. + enough (I /\ ~ I) as [] by contradiction. + split. + - apply not_I_I. + intro. + now apply I_not_I. + - intro. + now apply I_not_I. + Qed. + +.. example:: Negative occurrence (second example) + + Here is another example of an inductive definition which is + rejected because it does not satisfy the positivity condition: + + .. coqtop:: all + + Fail Inductive Lam := lam (_ : Lam -> Lam). + + Again, if we were to accept it, we could derive a contradiction + (this time through a non-terminating recursive function): + + .. coqtop:: none + + Unset Positivity Checking. + Inductive Lam := lam (_ : Lam -> Lam). + Set Positivity Checking. + + ..
coqtop:: all + + Fixpoint infinite_loop l : False := + match l with lam x => infinite_loop (x l) end. + + Check infinite_loop (lam (@id Lam)) : False. + +.. _Template-polymorphism: + +Template polymorphism ++++++++++++++++++++++ + +Inductive types can be made polymorphic over the universes introduced by +their parameters in :math:`\Type`, if the minimal inferred sort of the +inductive declaration either mentions some of those parameter universes +or is computed to be :math:`\Prop` or :math:`\Set`. If :math:`A` is an arity of some sort and :math:`s` is a sort, we write :math:`A_{/s}` for the arity obtained from :math:`A` by replacing its sort with :math:`s`. @@ -1117,10 +1159,11 @@ provided that the following side conditions hold: + there are sorts :math:`s_i`, for :math:`1 ≤ i ≤ k` such that, for :math:`Γ_{I'} = [I_1 :∀ Γ_{P'} ,(A_1)_{/s_1} ;~…;~I_k :∀ Γ_{P'} ,(A_k)_{/s_k}]` we have :math:`(E[Γ_{I′} ;Γ_{P′}] ⊢ C_i : s_{q_i})_{i=1… n}` ; - + the sorts :math:`s_i` are such that all eliminations, to - :math:`\Prop`, :math:`\Set` and :math:`\Type(j)`, are allowed - (see Section :ref:`Destructors`). - + + the sorts :math:`s_i` are all introduced by the inductive + declaration and have no universe constraints besides being greater + than or equal to :math:`\Prop`, and such that all + eliminations, to :math:`\Prop`, :math:`\Set` and :math:`\Type(j)`, + are allowed (see Section :ref:`Destructors`). Notice that if :math:`I_j~q_1 … q_r` is typable using the rules **Ind-Const** and @@ -1141,6 +1184,62 @@ Conversion is preserved as any (partial) instance :math:`I_j~q_1 … q_r` or :math:`C_i~q_1 … q_r` is mapped to the names chosen in the specific instance of :math:`\ind{p}{Γ_I}{Γ_C}`. +..
warning:: + + The restriction that sorts are introduced by the inductive + declaration prevents inductive types declared in sections from being + template-polymorphic on universes introduced previously in the + section: they cannot parameterize over the universes introduced with + section variables that become parameters at section closing time, as + these may be shared with other definitions from the same section + which can impose constraints on them. + +.. flag:: Auto Template Polymorphism + + This flag, enabled by default, makes every inductive type declared + at level :math:`\Type` (without annotations or hiding it behind a + definition) template polymorphic if possible. + + This can be prevented using the ``universes(notemplate)`` + attribute. + +.. warn:: Automatically declaring @ident as template polymorphic. + + Warning ``auto-template`` can be used to find which types are + implicitly declared template polymorphic by :flag:`Auto Template + Polymorphism`. + + An inductive type can be forced to be template polymorphic using + the ``universes(template)`` attribute: it should then fulfill the + criterion to be template polymorphic, otherwise an error is raised. + +.. exn:: Inductive @ident cannot be made template polymorphic. + + This error is raised when the `#[universes(template)]` attribute is + on but the inductive cannot be made polymorphic on any universe or be + inferred to live in :math:`\Prop` or :math:`\Set`. + + Template polymorphism and universe polymorphism (see Chapter + :ref:`polymorphicuniverses`) are incompatible, so if the latter is + enabled it will prevail over automatic template polymorphism and + cause an error when using the ``universes(template)`` attribute. + +.. flag:: Template Check + + This flag is on by default. Turning it off disables the check of + locality of the sorts when abstracting the inductive over its + parameters.
This is a deprecated and *unsafe* flag that can introduce + inconsistencies; it is only meant to help users incrementally update + code from Coq versions < 8.10 which did not implement this check. + The `Coq89.v` compatibility file sets this flag globally. A global + ``-no-template-check`` command line option is also available. Use at + your own risk. Use of this flag is recorded in the typing flags + associated to a definition but is *not* supported by the |Coq| + checker (`coqchk`). It will appear in :g:`Print Assumptions` and + :g:`About @ident` output involving inductive declarations that were + (potentially unsoundly) assumed to be template polymorphic. + + In practice, the rule **Ind-Family** is used by |Coq| only when all the inductive types of the inductive definition are declared with an arity whose sort is in the Type hierarchy. Then, the polymorphism is over @@ -1154,10 +1253,10 @@ inductive type is set in :math:`\Set` (even in case :math:`\Set` is impredicativ Section The-Calculus-of-Inductive-Construction-with-impredicative-Set_), and otherwise in the Type hierarchy. -Note that the side-condition about allowed elimination sorts in the -rule **Ind-Family** is just to avoid to recompute the allowed elimination -sorts at each instance of a pattern matching (see Section :ref:`Destructors`). As -an example, let us consider the following definition: +Note that the side-condition about allowed elimination sorts in the rule +**Ind-Family** avoids recomputing the allowed elimination sorts at each +instance of a pattern matching (see Section :ref:`Destructors`). As an +example, let us consider the following definition: .. example:: @@ -1320,7 +1419,7 @@ using the syntax: \Match~m~\as~x~\In~I~\_~a~\return~P~\with~ (c_1~x_{11} ... x_{1p_1} ) ⇒ f_1 | … | (c_n~x_{n1} ...
x_{np_n} ) ⇒ f_n~\kwend - + The :math:`\as` part can be omitted if either the result type does not depend on :math:`m` (non-dependent elimination) or :math:`m` is a variable (in this case, :math:`m` can occur in :math:`P` where it is considered a bound variable). The :math:`\In` part @@ -1360,7 +1459,7 @@ There is no restriction on the sort of the predicate to be eliminated. ----------------------- [I:∀ x:A,~A′|∀ x:A,~B′] - + .. inference:: Set & Type s_1 ∈ \{\Set,\Type(j)\} @@ -1376,7 +1475,7 @@ is also of sort :math:`\Prop` or is of the morally smaller sort :math:`\SProp`. .. inference:: Prop - + s ∈ \{\SProp,\Prop\} -------------------- [I:\Prop|I→s] @@ -1404,7 +1503,7 @@ the proof of :g:`or A B` is not accepted: Fail Definition choice (A B: Prop) (x:or A B) := match x with or_introl _ _ a => true | or_intror _ _ b => false end. - + From the computational point of view, the structure of the proof of :g:`(or A B)` in this term is needed for computing the boolean value. @@ -1441,7 +1540,7 @@ this type. :math:`\Prop` for which more eliminations are allowed. .. inference:: Prop-extended - + I~\kw{is an empty or singleton definition} s ∈ \Sort ------------------------------------- @@ -1589,7 +1688,7 @@ An ι-redex is a term of the following form: .. math:: \case((c_{p_i}~q_1 … q_r~a_1 … a_m ),P,f_1 |… |f_l ) - + with :math:`c_{p_i}` the :math:`i`-th constructor of the inductive type :math:`I` with :math:`r` parameters. @@ -1636,7 +1735,7 @@ Typing rule The typing rule is the expected one for a fixpoint. .. inference:: Fix - + (E[Γ] ⊢ A_i : s_i )_{i=1… n} (E[Γ;~f_1 :A_1 ;~…;~f_n :A_n ] ⊢ t_i : A_i )_{i=1… n} ------------------------------------------------------- @@ -1749,7 +1848,7 @@ The reduction for fixpoints is: .. math:: (\Fix~f_i \{F\}~a_1 …a_{k_i}) ~\triangleright_ι~ \subst{t_i}{f_k}{\Fix~f_k \{F\}}_{k=1… n} ~a_1 … a_{k_i} - + when :math:`a_{k_i}` starts with a constructor. 
This last restriction is needed in order to keep strong normalization and corresponds to the reduction for primitive recursive operators. The following reductions are now @@ -1808,11 +1907,11 @@ and :math:`\subst{E}{|Γ|}{|Γ|c}` to mean the parallel substitution {\WF{E;~c:U;~E′;~c′:=λ x:U.~\subst{t}{c}{x}:∀x:U,~\subst{T}{c}{x};~\subst{E″}{c′}{(c′~c)}} {\subst{Γ}{c′}{(c′~c)}}} - + .. math:: \frac{\WF{E;~c:U;~E′;~c′:T;~E″}{Γ}} {\WF{E;~c:U;~E′;~c′:∀ x:U,~\subst{T}{c}{x};~\subst{E″}{c′}{(c′~c)}}{\subst{Γ}{c′}{(c′~c)}}} - + .. math:: \frac{\WF{E;~c:U;~E′;~\ind{p}{Γ_I}{Γ_C};~E″}{Γ}} {\WFTWOLINES{E;~c:U;~E′;~\ind{p+1}{∀ x:U,~\subst{Γ_I}{c}{x}}{∀ x:U,~\subst{Γ_C}{c}{x}};~ @@ -1853,7 +1952,7 @@ One can consequently derive the following property. .. _First-pruning-property: .. inference:: First pruning property: - + \WF{E;~c:U;~E′}{Γ} c~\kw{does not occur in}~E′~\kw{and}~Γ -------------------------------------- @@ -1933,5 +2032,3 @@ impredicative system for sort :math:`\Set` become: s ∈ \{\Type(i)\} ---------------- [I:\Set|I→ s] - - diff --git a/doc/sphinx/language/coq-library.rst b/doc/sphinx/language/coq-library.rst index d1b95e6203..cad5e4e67e 100644 --- a/doc/sphinx/language/coq-library.rst +++ b/doc/sphinx/language/coq-library.rst @@ -7,22 +7,20 @@ The |Coq| library single: Theories -The |Coq| library is structured into two parts: +The |Coq| library has two parts: - * **The initial library**: it contains elementary logical notions and - data-types. It constitutes the basic state of the system directly - available when running |Coq|; + * **The basic library**: definitions and theorems for + the most commonly used elementary logical notions and + data types. |Coq| normally loads these files automatically when it starts. - * **The standard library**: general-purpose libraries containing various - developments of |Coq| axiomatizations about sets, lists, sorting, - arithmetic, etc. 
This library comes with the system and its modules - are directly accessible through the ``Require`` command (see - Section :ref:`compiled-files`); + * **The standard library**: general-purpose libraries with + definitions and theorems for sets, lists, sorting, + arithmetic, etc. To use these files, users must load them explicitly + with the ``Require`` command (see :ref:`compiled-files`). -In addition, user-provided libraries or developments are provided by -|Coq| users' community. These libraries and developments are available -for download at http://coq.inria.fr (see -Section :ref:`userscontributions`). +There are also many libraries provided by the |Coq| users' community. +These libraries and developments are available +for download at http://coq.inria.fr (see :ref:`userscontributions`). This chapter briefly reviews the |Coq| libraries whose contents can also be browsed at http://coq.inria.fr/stdlib. @@ -514,8 +512,8 @@ realizability interpretation. forall (A B:Prop) (P:Type), (A -> B -> P) -> A /\ B -> P. -Basic Arithmetics -~~~~~~~~~~~~~~~~~ +Basic Arithmetic +~~~~~~~~~~~~~~~~ The basic library includes a few elementary properties of natural numbers, together with the definitions of predecessor, addition and @@ -758,6 +756,7 @@ subdirectories: * **Sets** : Sets (classical, constructive, finite, infinite, power set, etc.) * **FSets** : Specification and implementations of finite sets and finite maps (by lists and by AVL trees) * **Reals** : Axiomatization of real numbers (classical, basic functions, integer part, fractional part, limit, derivative, Cauchy series, power series and results,...)
+ * **Floats** : Machine implementation of floating-point arithmetic (for the binary64 format) * **Relations** : Relations (definitions and basic results) * **Sorting** : Sorted list (basic definitions and heapsort correctness) * **Strings** : 8-bits characters and strings @@ -770,7 +769,7 @@ are directly accessible with the command ``Require`` (see Section :ref:`compiled-files`). The different modules of the |Coq| standard library are documented -online at http://coq.inria.fr/stdlib. +online at https://coq.inria.fr/stdlib. Peano’s arithmetic (nat) ~~~~~~~~~~~~~~~~~~~~~~~~ @@ -804,8 +803,8 @@ Notation Interpretation =============== =================== -Notations for integer arithmetics -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Notations for integer arithmetic +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. index:: single: Arithmetical notations @@ -822,7 +821,7 @@ Notations for integer arithmetics The following table describes the syntax of expressions -for integer arithmetics. It is provided by requiring and opening the module ``ZArith`` and opening scope ``Z_scope``. +for integer arithmetic. It is provided by requiring and opening the module ``ZArith`` and opening scope ``Z_scope``. It specifies how notations are interpreted and, when not already reserved, the precedence and associativity. @@ -866,7 +865,7 @@ Notations for real numbers This is provided by requiring and opening the module ``Reals`` and opening scope ``R_scope``. This set of notations is very similar to -the notation for integer arithmetics. The inverse function was added. +the notation for integer arithmetic. The inverse function was added. =============== =================== Notation Interpretation @@ -990,6 +989,106 @@ Notation Interpretation Precedence Associativity ``_ :: _`` ``cons`` 60 right ========== ============== ========== ============= +.. _floats_library: + +Floats library +~~~~~~~~~~~~~~ + +The library of primitive floating-point arithmetic can be loaded by +requiring module ``Floats``: + +.. 
coqtop:: in + + Require Import Floats. + +It exports the module ``PrimFloat`` that provides a primitive type +named ``float``, defined in the kernel (see section :ref:`primitive-floats`), +as well as two variant types ``float_comparison`` and ``float_class``: + + +.. coqtop:: all + + Print float. + Print float_comparison. + Print float_class. + +It then defines the primitive operators below, using the processor +floating-point operators for binary64 in rounding-to-nearest even: + +* ``abs`` +* ``opp`` +* ``sub`` +* ``add`` +* ``mul`` +* ``div`` +* ``sqrt`` +* ``compare`` : compare two floats and return a ``float_comparison`` +* ``classify`` : analyze a float and return a ``float_class`` +* ``of_int63`` : round a primitive integer and convert it into a float +* ``normfr_mantissa`` : take a float in ``[0.5; 1.0)`` and return its mantissa +* ``frshiftexp`` : convert a float to fractional part in ``[0.5; 1.0)`` and integer part +* ``ldshiftexp`` : multiply a float by an integral power of ``2`` +* ``next_up`` : return the next float towards positive infinity +* ``next_down`` : return the next float towards negative infinity + +For special floating-point values, the following constants are also +defined: + +* ``zero`` +* ``neg_zero`` +* ``one`` +* ``two`` +* ``infinity`` +* ``neg_infinity`` +* ``nan`` : Not a Number (assumed to be unique: the "payload" of NaNs is ignored) + +The following table shows the notations available when opening scope +``float_scope``. + +=========== ============== +Notation Interpretation +=========== ============== +``- _`` ``opp`` +``_ - _`` ``sub`` +``_ + _`` ``add`` +``_ * _`` ``mul`` +``_ / _`` ``div`` +``_ == _`` ``eqb`` +``_ < _`` ``ltb`` +``_ <= _`` ``leb`` +``_ ?= _`` ``compare`` +=========== ============== + +Floating-point constants are parsed and pretty-printed as (17-digit) +decimal constants. This ensures that the composition +:math:`\text{parse} \circ \text{print}` amounts to the identity. + +.. example:: + + .. 
coqtop:: all + + Open Scope float_scope. + Eval compute in 1 + 0.5. + Eval compute in 1 / 0. + Eval compute in 1 / -0. + Eval compute in 0 / 0. + Eval compute in 0 ?= -0. + Eval compute in nan ?= nan. + Eval compute in next_down (-1). + +The primitive operators are specified with respect to their Gallina +counterpart, using the variant type ``spec_float``, and the injection +``Prim2SF``: + +.. coqtop:: all + + Print spec_float. + Check Prim2SF. + Check mul_spec. + +For more details on the available definitions and lemmas, see the +online documentation of the ``Floats`` library. + .. _userscontributions: Users’ contributions diff --git a/doc/sphinx/language/gallina-extensions.rst b/doc/sphinx/language/gallina-extensions.rst index c93984661e..ae0afc7819 100644 --- a/doc/sphinx/language/gallina-extensions.rst +++ b/doc/sphinx/language/gallina-extensions.rst @@ -79,14 +79,6 @@ To build an object of type :token:`ident`, one should provide the constructor Definition half := mkRat true 1 2 (O_S 1) one_two_irred. Check half. -.. FIXME: move this to the main grammar in the spec chapter - -.. _record-named-fields-grammar: - - .. productionlist:: - record_term : {| [`field_def` ; … ; `field_def`] |} - field_def : `ident` [`binders`] := `term` - Alternatively, the following syntax allows creating objects by using named fields, as shown in this grammar. The fields do not have to be in any particular order, nor do they have to be all present if the missing ones can be inferred or prompted for @@ -163,10 +155,11 @@ available: .. _record_projections_grammar: - .. productionlist:: terms - projection : `term` `.` ( `qualid` ) - : `term` `.` ( `qualid` `arg` … `arg` ) - : `term` `.` ( @`qualid` `term` … `term` ) + .. insertgram term_projection term_projection + + .. 
productionlist:: coq + term_projection : `term0` .( `qualid` `args_opt` ) + : `term0` .( @ `qualid` `term1_list_opt` ) Syntax of Record projections @@ -182,13 +175,13 @@ other arguments are the parameters of the inductive type. recursive (references to the record's name in the type of its field raises an error). To define recursive records, one can use the ``Inductive`` and ``CoInductive`` keywords, resulting in an inductive or co-inductive record. - Definition of mutal inductive or co-inductive records are also allowed, as long + Definition of mutually inductive or co-inductive records are also allowed, as long as all of the types in the block are records. .. note:: Induction schemes are automatically generated for inductive records. Automatic generation of induction schemes for non-recursive records defined with the ``Record`` keyword can be activated with the - ``Nonrecursive Elimination Schemes`` option (see :ref:`proofschemes-induction-principles`). + :flag:`Nonrecursive Elimination Schemes` flag (see :ref:`proofschemes-induction-principles`). .. note:: ``Structure`` is a synonym of the keyword ``Record``. @@ -243,14 +236,14 @@ Primitive Projections .. flag:: Printing Primitive Projection Parameters - This compatibility option reconstructs internally omitted parameters at + This compatibility flag reconstructs internally omitted parameters at printing time (even though they are absent in the actual AST manipulated by the kernel). Primitive Record Types ++++++++++++++++++++++ -When the :flag:`Primitive Projections` option is on, definitions of +When the :flag:`Primitive Projections` flag is on, definitions of record types change meaning. When a type is declared with primitive projections, its :g:`match` construct is disabled (see :ref:`primitive_projections` though). To eliminate the (co-)inductive type, one must use its defined primitive projections. 
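As a minimal sketch of eliminating through primitive projections (the record ``box`` and its field are hypothetical names, not taken from the manual):

```coq
(* Hypothetical example: with primitive projections enabled, match on
   the record is not available; elimination goes through the declared
   projection, e.g. using the b.(proj) syntax. *)
Set Primitive Projections.

Record box (A : Type) : Type := mkbox { unbox : A }.

Definition get (b : box nat) : nat := b.(unbox).
```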
@@ -260,10 +253,7 @@ To eliminate the (co-)inductive type, one must use its defined primitive project For compatibility, the parameters still appear to the user when printing terms even though they are absent in the actual AST manipulated by the kernel. This can be changed by unsetting the -:flag:`Printing Primitive Projection Parameters` flag. Further compatibility -printing can be deactivated thanks to the ``Printing Primitive Projection -Compatibility`` option which governs the printing of pattern matching -over primitive records. +:flag:`Printing Primitive Projection Parameters` flag. There are currently two ways to introduce primitive records types: @@ -305,7 +295,7 @@ an object of the record type as arguments, and whose body is an application of the unfolded primitive projection of the same name. These constants are used when elaborating partial applications of the projection. One can distinguish them from applications of the primitive -projection if the :flag:`Printing Primitive Projection Parameters` option +projection if the :flag:`Printing Primitive Projection Parameters` flag is off: For a primitive projection application, parameters are printed as underscores while for the compatibility projections they are printed as usual. @@ -484,7 +474,7 @@ Printing nested patterns pattern matching into a single pattern matching over a nested pattern. - When this option is on (default), |Coq|’s printer tries to do such + When this flag is on (default), |Coq|’s printer tries to do such limited re-factorization. Turning it off tells |Coq| to print only simple pattern matching problems in the same way as the |Coq| kernel handles them. @@ -497,7 +487,7 @@ Factorization of clauses with same right-hand side When several patterns share the same right-hand side, it is additionally possible to share the clauses using disjunctive patterns. 
Assuming that the - printing matching mode is on, this option (on by default) tells |Coq|'s + printing matching mode is on, this flag (on by default) tells |Coq|'s printer to try to do this kind of factorization. Use of a default clause @@ -508,7 +498,7 @@ Use of a default clause When several patterns share the same right-hand side which do not depend on the arguments of the patterns, yet an extra factorization is possible: the disjunction of patterns can be replaced with a `_` default clause. Assuming that - the printing matching mode and the factorization mode are on, this option (on by + the printing matching mode and the factorization mode are on, this flag (on by default) tells |Coq|'s printer to use a default clause when relevant. Printing of wildcard patterns @@ -517,7 +507,7 @@ Printing of wildcard patterns .. flag:: Printing Wildcard Some variables in a pattern may not occur in the right-hand side of - the pattern matching clause. When this option is on (default), the + the pattern matching clause. When this flag is on (default), the variables having no occurrences in the right-hand side of the pattern matching clause are just printed using the wildcard symbol “_”. @@ -530,7 +520,7 @@ Printing of the elimination predicate In most of the cases, the type of the result of a matched term is mechanically synthesizable. Especially, if the result type does not - depend of the matched term. When this option is on (default), + depend on the matched term. When this flag is on (default), the result type is not printed when |Coq| knows that it can re- synthesize it. @@ -565,7 +555,7 @@ which types are written this way: ``if`` … ``then`` … ``else`` …. Use the :cmd:`Add @table` and :cmd:`Remove @table` commands to update this set. -This example emphasizes what the printing options offer. +This example emphasizes what the printing settings offer. ..
example:: @@ -594,7 +584,7 @@ Advanced recursive functions The following experimental command is available when the ``FunInd`` library has been loaded via ``Require Import FunInd``: -.. cmd:: Function @ident {* @binder} { @decrease_annot } : @type := @term +.. cmd:: Function @ident {* @binder} { @fixannot } : @type := @term This command can be seen as a generalization of ``Fixpoint``. It is actually a wrapper for several ways of defining a function *and other useful related @@ -603,16 +593,11 @@ The following experimental command is available when the ``FunInd`` library has The meaning of this declaration is to define a function ident, similarly to ``Fixpoint``. Like in ``Fixpoint``, the decreasing argument must be given (unless the function is not recursive), but it might not - necessarily be *structurally* decreasing. The point of the :n:`{ @decrease_annot }` annotation + necessarily be *structurally* decreasing. The point of the :n:`{ @fixannot }` annotation is to name the decreasing argument *and* to describe which kind of decreasing criteria must be used to ensure termination of recursive calls. - .. productionlist:: - decrease_annot : struct `ident` - : measure `term` `ident` - : wf `term` `ident` - The ``Function`` construction also enjoys the ``with`` extension to define mutually recursive definitions. However, this feature does not work for non structurally recursive functions. @@ -641,7 +626,11 @@ the induction principle to easily reason about the function. than like this: - .. coqtop:: reset all + .. coqtop:: reset none + + Require Import FunInd. + + .. coqtop:: all Function plus (n m : nat) {struct n} : nat := match n with @@ -652,17 +641,22 @@ the induction principle to easily reason about the function. *Limitations* -|term_0| must be built as a *pure pattern matching tree* (:g:`match … with`) +:token:`term` must be built as a *pure pattern matching tree* (:g:`match … with`) with applications only *at the end* of each branch. 
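To sketch the accepted shape (``min'`` is a hypothetical function, not taken from the manual): a tree of :g:`match` constructs whose branches end in plain, fully applied terms:

```coq
Require Import FunInd.

(* Hypothetical example of a pure pattern matching tree: every branch
   ends in a plain application, and the recursive call is fully applied. *)
Function min' (n m : nat) {struct n} : nat :=
  match n with
  | O => O
  | S p =>
      match m with
      | O => O
      | S q => S (min' p q)
      end
  end.
```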
Function does not support partial application of the function being defined. Thus, the following example cannot be accepted due to the presence of partial application of :g:`wrong` in the body of :g:`wrong`: -.. coqtop:: all +.. coqtop:: none + + Require List. + Import List.ListNotations. - Fail Function wrong (C:nat) : nat := - List.hd 0 (List.map wrong (C::nil)). +.. coqtop:: all fail + + Function wrong (C:nat) : nat := + List.hd 0 (List.map wrong (C::nil)). For now, dependent cases are not treated for non structurally terminating functions. @@ -868,7 +862,7 @@ Sections create local contexts which can be shared across multiple definitions. :name: Let Fixpoint :undocumented: - .. cmdv:: Let CoFixpoint @ident @cofix_body {* with @cofix_body} + .. cmdv:: Let CoFixpoint @ident @fix_body {* with @fix_body} :name: Let CoFixpoint :undocumented: @@ -1305,7 +1299,7 @@ component is equal ``nat`` and hence ``M1.T`` as specified. .. flag:: Short Module Printing - This option (off by default) disables the printing of the types of fields, + This flag (off by default) disables the printing of the types of fields, leaving only their names, for the commands :cmd:`Print Module` and :cmd:`Print Module Type`. @@ -1409,7 +1403,7 @@ is needed. In this translation, names in the file system are called *physical* paths while |Coq| names are contrastingly called *logical* names. -A logical prefix Lib can be associated to a physical pathpath using +A logical prefix Lib can be associated with a physical path using the command line option ``-Q`` `path` ``Lib``. All subfolders of path are recursively associated to the logical path ``Lib`` extended with the corresponding suffix coming from the physical path. For instance, the @@ -1578,7 +1572,7 @@ says that the implicit argument is maximally inserted. Each implicit argument can be declared to have to be inserted maximally or non maximally. 
This can be governed argument per argument by the command -:cmd:`Arguments (implicits)` or globally by the :flag:`Maximal Implicit Insertion` option. +:cmd:`Arguments (implicits)` or globally by the :flag:`Maximal Implicit Insertion` flag. .. seealso:: :ref:`displaying-implicit-args`. @@ -1751,7 +1745,7 @@ Automatic declaration of implicit arguments This command tells |Coq| to automatically detect what are the implicit arguments of a defined object. - The auto-detection is governed by options telling if strict, + The auto-detection is governed by flags telling if strict, contextual, or reversible-pattern implicit arguments must be considered or not (see :ref:`controlling-strict-implicit-args`, :ref:`controlling-strict-implicit-args`, :ref:`controlling-rev-pattern-implicit-args`, and also :ref:`controlling-insertion-implicit-args`). @@ -1821,9 +1815,9 @@ Mode for automatic declaration of implicit arguments .. flag:: Implicit Arguments - This option (off by default) allows to systematically declare implicit + This flag (off by default) allows to systematically declare implicit the arguments detectable as such. Auto-detection of implicit arguments is - governed by options controlling whether strict and contextual implicit + governed by flags controlling whether strict and contextual implicit arguments have to be considered or not. .. _controlling-strict-implicit-args: @@ -1838,11 +1832,11 @@ Controlling strict implicit arguments arguments plus, for historical reasons, a small subset of the non-strict implicit arguments. To relax this constraint and to set implicit all non strict implicit arguments by default, you can turn this - option off. + flag off. .. flag:: Strongly Strict Implicit - Use this option (off by default) to capture exactly the strict implicit + Use this flag (off by default) to capture exactly the strict implicit arguments and no more than the strict implicit arguments. .. 
_controlling-contextual-implicit-args: @@ -1853,7 +1847,7 @@ Controlling contextual implicit arguments .. flag:: Contextual Implicit By default, |Coq| does not automatically set implicit the contextual - implicit arguments. You can turn this option on to tell |Coq| to also + implicit arguments. You can turn this flag on to tell |Coq| to also infer contextual implicit arguments. .. _controlling-rev-pattern-implicit-args: @@ -1864,7 +1858,7 @@ Controlling reversible-pattern implicit arguments .. flag:: Reversible Pattern Implicit By default, |Coq| does not automatically set implicit the reversible-pattern - implicit arguments. You can turn this option on to tell |Coq| to also infer + implicit arguments. You can turn this flag on to tell |Coq| to also infer reversible-pattern implicit arguments. .. _controlling-insertion-implicit-args: @@ -1874,7 +1868,7 @@ Controlling the insertion of implicit arguments not followed by explicit argumen .. flag:: Maximal Implicit Insertion - Assuming the implicit argument mode is on, this option (off by default) + Assuming the implicit argument mode is on, this flag (off by default) declares implicit arguments to be automatically inserted when a function is partially applied and the next argument of the function is an implicit one.
flag:: Printing Implicit By default, the basic pretty-printing rules hide the inferable implicit - arguments of an application. Turn this option on to force printing all + arguments of an application. Turn this flag on to force printing all implicit arguments. .. flag:: Printing Implicit Defensive @@ -1962,7 +1958,7 @@ Explicit displaying of implicit arguments for pretty-printing By default, the basic pretty-printing rules display the implicit arguments that are not detected as strict implicit arguments. This “defensive” mode can quickly make the display cumbersome so this can - be deactivated by turning this option off. + be deactivated by turning this flag off. .. seealso:: :flag:`Printing All`. @@ -1991,7 +1987,7 @@ Deactivation of implicit arguments for parsing .. flag:: Parsing Explicit - Turning this option on (it is off by default) deactivates the use of implicit arguments. + Turning this flag on (it is off by default) deactivates the use of implicit arguments. In this case, all arguments of constants, inductive types, constructors, etc, including the arguments declared as implicit, have @@ -2263,11 +2259,11 @@ Printing constructions in full Coercions, implicit arguments, the type of pattern matching, but also notations (see :ref:`syntaxextensionsandinterpretationscopes`) can obfuscate the behavior of some tactics (typically the tactics applying to occurrences of subterms are - sensitive to the implicit arguments). Turning this option on + sensitive to the implicit arguments). Turning this flag on deactivates all high-level printing features such as coercions, implicit arguments, returned type of pattern matching, notations and various syntactic sugar for pattern matching or record projections. 
- Otherwise said, :flag:`Printing All` includes the effects of the options + Otherwise said, :flag:`Printing All` includes the effects of the flags :flag:`Printing Implicit`, :flag:`Printing Coercions`, :flag:`Printing Synth`, :flag:`Printing Projections`, and :flag:`Printing Notations`. To reactivate the high-level printing features, use the command ``Unset Printing All``. @@ -2279,8 +2275,8 @@ Printing universes .. flag:: Printing Universes - Turn this option on to activate the display of the actual level of each - occurrence of :g:`Type`. See :ref:`Sorts` for details. This wizard option, in + Turn this flag on to activate the display of the actual level of each - occurrence of :g:`Type`. See :ref:`Sorts` for details. This wizard flag, in combination with :flag:`Printing All`, can help to diagnose failures to unify terms apparently identical but internally different in the Calculus of Inductive Constructions. @@ -2291,7 +2287,7 @@ Printing universes This command can be used to print the constraints on the internal level of the occurrences of :math:`\Type` (see :ref:`Sorts`). - If the optional ``Sorted`` option is given, each universe will be made + If the ``Sorted`` keyword is present, each universe will be made equivalent to a numbered label reflecting its level (with a linear ordering) in the universe hierarchy. @@ -2315,6 +2311,18 @@ Printing universes Existential variables --------------------- +.. insertgram term_evar evar_binding + +.. productionlist:: coq + term_evar : ?[ `ident` ] + : ?[ ?`ident` ] + : ?`ident` `evar_bindings_opt` + evar_bindings_opt : @{ `evar_bindings_semi` } + : `empty` + evar_bindings_semi : `evar_bindings_semi` ; `evar_binding` + : `evar_binding` + evar_binding : `ident` := `term` + |Coq| terms can include existential variables which represent unknown subterms to eventually be replaced by actual subterms.
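As an illustration of the :production:`term_evar` syntax above, a named existential variable can be introduced explicitly with the :g:`?[ident]` form inside a :tacn:`refine` (an illustrative sketch, not part of the original text; the exact display of the remaining goal, which mentions the evar :g:`?n`, may vary between Coq versions):

.. coqtop:: all

   Goal exists n : nat, n + 0 = n.
   Proof.
     refine (ex_intro _ ?[n] _).
   Abort.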
@@ -2349,7 +2357,7 @@ outside of its context of definition, its instance, written under the form :n:`{ {*; @ident := @term} }` is appended to its name, indicating how the variables of its defining context are instantiated. The variables of the context of the existential variables which are -instantiated by themselves are not written, unless the flag :flag:`Printing Existential Instances` +instantiated by themselves are not written, unless the :flag:`Printing Existential Instances` flag is on (see Section :ref:`explicit-display-existentials`), and this is why an existential variable used in the same context as its context of definition is written with no instance. @@ -2373,7 +2381,7 @@ Explicit displaying of existential instances for pretty-printing .. flag:: Printing Existential Instances - This option (off by default) activates the full display of how the + This flag (off by default) activates the full display of how the context of an existential variable is instantiated at each of the occurrences of the existential variable. @@ -2403,7 +2411,7 @@ by means of the interactive proof engine. .. _primitive-integers: Primitive Integers --------------------------------- +------------------ The language of terms features 63-bit machine integers as values. The type of such a value is *axiomatized*; it is declared through the following sentence @@ -2443,12 +2451,68 @@ The reduction machines (:tacn:`vm_compute`, :tacn:`native_compute`) implement dedicated, efficient rules to reduce the applications of these primitive operations. -These primitives, when extracted to OCaml (see :ref:`extraction`), are mapped to -types and functions of a :g:`Uint63` module. Said module is not produced by +The extraction of these primitives can be customized similarly to the extraction +of regular axioms (see :ref:`extraction`). Nonetheless, the :g:`ExtrOCamlInt63` +module can be used when extracting to OCaml: it maps the Coq primitives to types +and functions of a :g:`Uint63` module.
Said OCaml module is not produced by +extraction. Instead, it has to be provided by the user (if they want to compile +or execute the extracted code). For instance, an implementation of this module +can be taken from the kernel of Coq. + +Literal values (at type :g:`Int63.int`) are extracted to literal OCaml values +wrapped into the :g:`Uint63.of_int` (resp. :g:`Uint63.of_int64`) constructor on +64-bit (resp. 32-bit) platforms. Currently, this cannot be customized (see the +function :g:`Uint63.compile` from the kernel). + +.. _primitive-floats: + +Primitive Floats +---------------- + +The language of terms features Binary64 floating-point numbers as values. +The type of such a value is *axiomatized*; it is declared through the +following sentence (excerpt from the :g:`PrimFloat` module): + +.. coqdoc:: + + Primitive float := #float64_type. + +This type is equipped with a few operators that must be similarly declared. +For instance, the product of two primitive floats can be computed using the +:g:`PrimFloat.mul` function, declared and specified as follows: + +.. coqdoc:: + + Primitive mul := #float64_mul. + Notation "x * y" := (mul x y) : float_scope. + + Axiom mul_spec : forall x y, Prim2SF (x * y)%float = SF64mul (Prim2SF x) (Prim2SF y). + +where :g:`Prim2SF` is defined in the :g:`FloatOps` module. + +The set of such operators is described in Section :ref:`floats_library`. + +These primitive declarations are regular axioms. As such, they must be trusted, and are listed by the +:g:`Print Assumptions` command. + +The reduction machines (:tacn:`vm_compute`, :tacn:`native_compute`) implement +dedicated, efficient rules to reduce the applications of these primitive +operations, using the floating-point processor operators that are assumed +to comply with the IEEE 754 standard for floating-point arithmetic. + +The extraction of these primitives can be customized similarly to the extraction +of regular axioms (see :ref:`extraction`). 
Nonetheless, the :g:`ExtrOCamlFloats` +module can be used when extracting to OCaml: it maps the Coq primitives to types +and functions of a :g:`Float64` module. Said OCaml module is not produced by extraction. Instead, it has to be provided by the user (if they want to compile or execute the extracted code). For instance, an implementation of this module can be taken from the kernel of Coq. +Literal values (of type :g:`Float64.t`) are extracted to literal OCaml +values (of type :g:`float`) written in hexadecimal notation and +wrapped into the :g:`Float64.of_float` constructor, e.g.: +:g:`Float64.of_float (0x1p+0)`. + Bidirectionality hints ---------------------- diff --git a/doc/sphinx/language/gallina-specification-language.rst b/doc/sphinx/language/gallina-specification-language.rst index 38f6714f46..3cc101d06b 100644 --- a/doc/sphinx/language/gallina-specification-language.rst +++ b/doc/sphinx/language/gallina-specification-language.rst @@ -44,78 +44,86 @@ Lexical conventions =================== Blanks - Space, newline and horizontal tabulation are considered as blanks. + Space, newline and horizontal tab are considered blanks. Blanks are ignored but they separate tokens. Comments - Comments in Coq are enclosed between ``(*`` and ``*)``, and can be nested. - They can contain any character. However, :token:`string` literals must be + Comments are enclosed between ``(*`` and ``*)``. They can be nested. + They can contain any character. However, embedded :token:`string` literals must be correctly closed. Comments are treated as blanks. -Identifiers and access identifiers +Identifiers Identifiers, written :token:`ident`, are sequences of letters, digits, ``_`` and - ``'``, that do not start with a digit or ``'``. That is, they are - recognized by the following lexical class: + ``'``, that do not start with a digit or ``'``. That is, they are + recognized by the following grammar (except that the string ``_`` is reserved; + it is not a valid identifier): .. 
productionlist:: coq - first_letter : a..z ∣ A..Z ∣ _ ∣ unicode-letter - subsequent_letter : a..z ∣ A..Z ∣ 0..9 ∣ _ ∣ ' ∣ unicode-letter ∣ unicode-id-part ident : `first_letter`[`subsequent_letter`…`subsequent_letter`] - access_ident : .`ident` + first_letter : a..z ∣ A..Z ∣ _ ∣ `unicode_letter` + subsequent_letter : `first_letter` ∣ 0..9 ∣ ' ∣ `unicode_id_part` All characters are meaningful. In particular, identifiers are case-sensitive. - The entry ``unicode-letter`` non-exhaustively includes Latin, + :production:`unicode_letter` non-exhaustively includes Latin, Greek, Gothic, Cyrillic, Arabic, Hebrew, Georgian, Hangul, Hiragana and Katakana characters, CJK ideographs, mathematical letter-like - symbols, hyphens, non-breaking space, … The entry ``unicode-id-part`` + symbols and non-breaking space. :production:`unicode_id_part` non-exhaustively includes symbols for prime letters and subscripts. - Access identifiers, written :token:`access_ident`, are identifiers prefixed by - `.` (dot) without blank. They are used in the syntax of qualified - identifiers. - Numerals - Numerals are sequences of digits with a potential fractional part - and exponent. Integers are numerals without fractional nor exponent - part and optionally preceded by a minus sign. Underscores ``_`` can - be used as comments in numerals. + Numerals are sequences of digits with an optional fractional part + and exponent, optionally preceded by a minus sign. :token:`int` is an integer; + a numeral without fractional or exponent parts. :token:`num` is a non-negative + integer. Underscores embedded in the digits are ignored, for example + ``1_000_000`` is the same as ``1000000``. .. productionlist:: coq - digit : 0..9 + numeral : `num`[. `num`][`exp`[`sign`]`num`] + int : [-]`num` num : `digit`…`digit` - integer : [-]`num` - dot : . 
+ digit : 0..9 exp : e | E sign : + | - - numeral : `num`[`dot` `num`][`exp`[`sign`]`num`] Strings - Strings are delimited by ``"`` (double quote), and enclose a sequence of - any characters different from ``"`` or the sequence ``""`` to denote the - double quote character. In grammars, the entry for quoted strings is - :production:`string`. + Strings begin and end with ``"`` (double quote). Use ``""`` to represent + a double quote character within a string. In the grammar, strings are + identified with :production:`string`. Keywords - The following identifiers are reserved keywords, and cannot be - employed otherwise:: - - _ as at cofix else end exists exists2 fix for - forall fun if IF in let match mod return - SProp Prop Set Type then using where with - -Special tokens - The following sequences of characters are special tokens:: - - ! % & && ( () ) * + ++ , - -> . .( .. - / /\ : :: :< := :> ; < <- <-> <: <= <> = - => =_D > >-> >= ? ?= @ [ \/ ] ^ { | |- - || } ~ #[ - - Lexical ambiguities are resolved according to the “longest match” - rule: when a sequence of non alphanumerical characters can be - decomposed into several different ways, then the first token is the - longest possible one (among all tokens defined at this moment), and so - on. + The following character sequences are reserved keywords that cannot be + used as identifiers:: + + _ Axiom CoFixpoint Definition Fixpoint Hypothesis IF Parameter Prop + SProp Set Theorem Type Variable as at by cofix discriminated else + end exists exists2 fix for forall fun if in lazymatch let match + multimatch return then using where with + + Note that plugins may define additional keywords when they are loaded. + +Other tokens + The set of + tokens defined at any given time can vary because the :cmd:`Notation` + command can define new tokens. A :cmd:`Require` command may load more notation definitions, + while the end of a :cmd:`Section` may remove notations. 
Some notations + are defined in the basic library (see :ref:`thecoqlibrary`) and are normally + loaded automatically at startup time. + + Here are the character sequences that Coq directly defines as tokens + without using :cmd:`Notation` (omitting 25 specialized tokens that begin with + ``#int63_``):: + + ! #[ % & ' ( () (bfs) (dfs) ) * ** + , - -> + . .( .. ... / : ::= := :> :>> ; < <+ <- <: + <<: <= = => > >-> >= ? @ @{ [ [= ] _ _eqn + `( `{ { {| | |- || } + + When multiple tokens match the beginning of a sequence of characters, + the longest matching token is used. + Occasionally you may need to insert spaces to separate tokens. For example, + if ``~`` and ``~~`` are both defined as tokens, the inputs ``~ ~`` and + ``~~`` generate different tokens, whereas if ``~~`` is not defined, then the + two inputs are equivalent. .. _term: @@ -131,62 +139,50 @@ presentation of Cic is given in Chapter :ref:`calculusofinductiveconstructions`. are given in Chapter :ref:`extensionsofgallina`. How to customize the syntax is described in Chapter :ref:`syntaxextensionsandinterpretationscopes`. -.. 
productionlist:: coq - term : forall `binders` , `term` - : fun `binders` => `term` - : fix `fix_bodies` - : cofix `cofix_bodies` - : let `ident` [`binders`] [: `term`] := `term` in `term` - : let fix `fix_body` in `term` - : let cofix `cofix_body` in `term` - : let ( [`name` , … , `name`] ) [`dep_ret_type`] := `term` in `term` - : let ' `pattern` [in `term`] := `term` [`return_type`] in `term` - : if `term` [`dep_ret_type`] then `term` else `term` - : `term` : `term` - : `term` <: `term` - : `term` :> - : `term` -> `term` - : `term` `arg` … `arg` - : @ `qualid` [`term` … `term`] - : `term` % `ident` - : match `match_item` , … , `match_item` [`return_type`] with - : [[|] `equation` | … | `equation`] end - : `qualid` - : `sort` - : `num` - : _ - : ( `term` ) - arg : `term` - : ( `ident` := `term` ) - binders : `binder` … `binder` - binder : `name` - : ( `name` … `name` : `term` ) - : ( `name` [: `term`] := `term` ) - : ' `pattern` - name : `ident` | _ - qualid : `ident` | `qualid` `access_ident` - sort : SProp | Prop | Set | Type - fix_bodies : `fix_body` - : `fix_body` with `fix_body` with … with `fix_body` for `ident` - cofix_bodies : `cofix_body` - : `cofix_body` with `cofix_body` with … with `cofix_body` for `ident` - fix_body : `ident` `binders` [`annotation`] [: `term`] := `term` - cofix_body : `ident` [`binders`] [: `term`] := `term` - annotation : { struct `ident` } - match_item : `term` [as `name`] [in `qualid` [`pattern` … `pattern`]] - dep_ret_type : [as `name`] `return_type` - return_type : return `term` - equation : `mult_pattern` | … | `mult_pattern` => `term` - mult_pattern : `pattern` , … , `pattern` - pattern : `qualid` `pattern` … `pattern` - : @ `qualid` `pattern` … `pattern` - : `pattern` as `ident` - : `pattern` % `ident` - : `qualid` - : _ - : `num` - : ( `pattern` | … | `pattern` ) +.. insertgram term binders_opt +.. 
productionlist:: coq + term : forall `open_binders` , `term` + : fun `open_binders` => `term` + : `term_let` + : if `term` `as_return_type_opt` then `term` else `term` + : `term_fix` + : `term100` + term100 : `term_cast` + : `term10` + term10 : `term1` `args` + : @ `qualid` `universe_annot_opt` `term1_list_opt` + : `term1` + args : `args` `arg` + : `arg` + arg : ( `ident` := `term` ) + : `term1` + term1_list_opt : `term1_list_opt` `term1` + : `empty` + empty : + term1 : `term_projection` + : `term0` % `ident` + : `term0` + args_opt : `args` + : `empty` + term0 : `qualid` `universe_annot_opt` + : `sort` + : `numeral` + : `string` + : _ + : `term_evar` + : `term_match` + : ( `term` ) + : {| `fields_def` |} + : `{ `term` } + : `( `term` ) + : ltac : ( `ltac_expr` ) + fields_def : `field_def` ; `fields_def` + : `field_def` + : `empty` + field_def : `qualid` `binders_opt` := `term` + binders_opt : `binders` + : `empty` Types ----- @@ -200,6 +196,13 @@ of types inside the syntactic class :token:`term`. Qualified identifiers and simple identifiers -------------------------------------------- +.. insertgram qualid field + +.. productionlist:: coq + qualid : `qualid` `field` + : `ident` + field : .`ident` + *Qualified identifiers* (:token:`qualid`) denote *global constants* (definitions, lemmas, theorems, remarks or facts), *global variables* (parameters or axioms), *inductive types* or *constructors of inductive @@ -207,11 +210,15 @@ types*. *Simple identifiers* (or shortly :token:`ident`) are a syntactic subset of qualified identifiers. Identifiers may also denote *local variables*, while qualified identifiers do not. -Numerals --------- +Field identifiers, written :token:`field`, are identifiers prefixed by +`.` (dot) with no blank between the dot and the identifier. + + +Numerals and strings +-------------------- -Numerals have no definite semantics in the calculus. 
They are mere -notations that can be bound to objects through the notation mechanism +Numerals and strings have no predefined semantics in the calculus. They are +merely notations that can be bound to objects through the notation mechanism (see Chapter :ref:`syntaxextensionsandinterpretationscopes` for details). Initially, numerals are bound to Peano’s representation of natural numbers (see :ref:`datatypes`). @@ -230,6 +237,35 @@ numbers (see :ref:`datatypes`). Sorts ----- +.. insertgram sort universe_level + +.. productionlist:: coq + sort : Set + : Prop + : SProp + : Type + : Type @{ _ } + : Type @{ `universe` } + universe : max ( `universe_exprs_comma` ) + : `universe_expr` + universe_exprs_comma : `universe_exprs_comma` , `universe_expr` + : `universe_expr` + universe_expr : `universe_name` `universe_increment_opt` + universe_name : `qualid` + : Set + : Prop + universe_increment_opt : + `num` + : `empty` + universe_annot_opt : @{ `universe_levels_opt` } + : `empty` + universe_levels_opt : `universe_levels_opt` `universe_level` + : `empty` + universe_level : Set + : Prop + : Type + : _ + : `qualid` + There are four sorts :g:`SProp`, :g:`Prop`, :g:`Set` and :g:`Type`. - :g:`SProp` is the universe of *definitionally irrelevant @@ -253,6 +289,35 @@ More on sorts can be found in Section :ref:`sorts`. Binders ------- +.. insertgram open_binders exclam_opt + +.. 
productionlist:: coq + open_binders : `names` : `term` + : `binders` + names : `names` `name` + : `name` + name : _ + : `ident` + binders : `binders` `binder` + : `binder` + binder : `name` + : ( `names` : `term` ) + : ( `name` `colon_term_opt` := `term` ) + : { `name` } + : { `names` `colon_term_opt` } + : `( `typeclass_constraints_comma` ) + : `{ `typeclass_constraints_comma` } + : ' `pattern0` + : ( `name` : `term` | `term` ) + typeclass_constraints_comma : `typeclass_constraints_comma` , `typeclass_constraint` + : `typeclass_constraint` + typeclass_constraint : `exclam_opt` `term` + : { `name` } : `exclam_opt` `term` + : `name` : `exclam_opt` `term` + exclam_opt : ! + : `empty` + + Various constructions such as :g:`fun`, :g:`forall`, :g:`fix` and :g:`cofix` *bind* variables. A binding is represented by an identifier. If the binding variable is not used in the expression, the identifier can be replaced by the @@ -278,8 +343,8 @@ the case of a single sequence of bindings sharing the same type (e.g.: .. index:: fun ... => ... -Abstractions ------------- +Abstractions: fun +----------------- The expression :n:`fun @ident : @type => @term` defines the *abstraction* of the variable :token:`ident`, of type :token:`type`, over the term @@ -300,8 +365,8 @@ Section :ref:`let-in`). .. index:: forall -Products --------- +Products: forall +---------------- The expression :n:`forall @ident : @type, @term` denotes the *product* of the variable :token:`ident` of type :token:`type`, over the term :token:`term`. @@ -346,6 +411,14 @@ Section :ref:`explicit-applications`). Type cast --------- +.. insertgram term_cast term_cast + +.. productionlist:: coq + term_cast : `term10` <: `term` + : `term10` <<: `term` + : `term10` : `term` + : `term10` :> + The expression :n:`@term : @type` is a type cast expression. It enforces the type of :token:`term` to be :token:`type`. @@ -371,6 +444,22 @@ guess the missing piece of information. Let-in definitions ------------------ +.. 
insertgram term_let names_comma + +.. productionlist:: coq + term_let : let `name` `colon_term_opt` := `term` in `term` + : let `name` `binders` `colon_term_opt` := `term` in `term` + : let `single_fix` in `term` + : let `names_tuple` `as_return_type_opt` := `term` in `term` + : let ' `pattern` := `term` `return_type_opt` in `term` + : let ' `pattern` in `pattern` := `term` `return_type` in `term` + colon_term_opt : : `term` + : `empty` + names_tuple : ( `names_comma` ) + : () + names_comma : `names_comma` , `name` + : `name` + :n:`let @ident := @term in @term’` denotes the local binding of :token:`term` to the variable :token:`ident` in :token:`term`’. There is a syntactic sugar for let-in @@ -379,10 +468,62 @@ stands for :n:`let @ident := fun {+ @binder} => @term in @term’`. .. index:: match ... with ... -Definition by case analysis ---------------------------- +Definition by cases: match +-------------------------- + +.. insertgram term_match record_pattern -Objects of inductive types can be destructurated by a case-analysis +.. 
productionlist:: coq + term_match : match `case_items_comma` `return_type_opt` with `or_opt` `eqns_or_opt` end + case_items_comma : `case_items_comma` , `case_item` + : `case_item` + return_type_opt : `return_type` + : `empty` + as_return_type_opt : `as_name_opt` `return_type` + : `empty` + return_type : return `term100` + case_item : `term100` `as_name_opt` `in_opt` + as_name_opt : as `name` + : `empty` + in_opt : in `pattern` + : `empty` + or_opt : | + : `empty` + eqns_or_opt : `eqns_or` + : `empty` + eqns_or : `eqns_or` | `eqn` + : `eqn` + eqn : `patterns_comma_list_or` => `term` + patterns_comma_list_or : `patterns_comma_list_or` | `patterns_comma` + : `patterns_comma` + patterns_comma : `patterns_comma` , `pattern` + : `pattern` + pattern : `pattern10` : `term` + : `pattern10` + pattern10 : `pattern1` as `name` + : `pattern1_list` + : @ `qualid` `pattern1_list_opt` + : `pattern1` + pattern1_list : `pattern1_list` `pattern1` + : `pattern1` + pattern1_list_opt : `pattern1_list` + : `empty` + pattern1 : `pattern0` % `ident` + : `pattern0` + pattern0 : `qualid` + : {| `record_patterns_opt` |} + : _ + : ( `patterns_or` ) + : `numeral` + : `string` + patterns_or : `patterns_or` | `pattern` + : `pattern` + record_patterns_opt : `record_patterns_opt` ; `record_pattern` + : `record_pattern` + : `empty` + record_pattern : `qualid` := `pattern` + +Objects of inductive types can be destructured by a case-analysis construction called *pattern matching* expression. A pattern matching expression is used to analyze the structure of an inductive object and to apply specific treatments accordingly. @@ -390,11 +531,11 @@ to apply specific treatments accordingly. This paragraph describes the basic form of pattern matching. See Section :ref:`Mult-match` and Chapter :ref:`extendedpatternmatching` for the description of the general form. 
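As a concrete instance of this basic form (an illustrative sketch, not part of the original text; :g:`nat` with its constructors :g:`O` and :g:`S` comes from the prelude, and the name :g:`my_pred` is arbitrary):

.. coqtop:: all

   Definition my_pred (n : nat) : nat :=
     match n with
     | O => O
     | S p => p
     end.

   Eval compute in my_pred 5.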
The basic form of pattern matching is characterized -by a single :token:`match_item` expression, a :token:`mult_pattern` restricted to a +by a single :token:`case_item` expression, a :token:`patterns_comma` restricted to a single :token:`pattern` and :token:`pattern` restricted to the form :n:`@qualid {* @ident}`. -The expression match ":token:`term`:math:`_0` :token:`return_type` with +The expression match ":token:`term`:math:`_0` :token:`return_type_opt` with :token:`pattern`:math:`_1` => :token:`term`:math:`_1` :math:`|` … :math:`|` :token:`pattern`:math:`_n` => :token:`term`:math:`_n` end" denotes a *pattern matching* over the term :token:`term`:math:`_0` (expected to be @@ -404,10 +545,10 @@ expression. Each of :token:`pattern`:math:`_i` has a form :token:`qualid` :token:`ident` where :token:`qualid` must denote a constructor. There should be exactly one branch for every constructor of :math:`I`. -The :token:`return_type` expresses the type returned by the whole match +The :token:`return_type_opt` expresses the type returned by the whole match expression. There are several cases. In the *non dependent* case, all -branches have the same type, and the :token:`return_type` is the common type of -branches. In this case, :token:`return_type` can usually be omitted as it can be +branches have the same type, and the :token:`return_type_opt` is the common type of +branches. In this case, :token:`return_type_opt` can usually be omitted as it can be inferred from the type of the branches [2]_. In the *dependent* case, there are three subcases. In the first subcase, @@ -507,8 +648,30 @@ Sections :ref:`if-then-else` and :ref:`irrefutable-patterns`). single: fix single: cofix -Recursive functions -------------------- +Recursive and co-recursive functions: fix and cofix +--------------------------------------------------- + +.. insertgram term_fix term1_extended_opt + +.. 
productionlist:: coq + term_fix : `single_fix` + : `single_fix` with `fix_bodies` for `ident` + single_fix : fix `fix_body` + : cofix `fix_body` + fix_bodies : `fix_bodies` with `fix_body` + : `fix_body` + fix_body : `ident` `binders_opt` `fixannot_opt` `colon_term_opt` := `term` + fixannot_opt : `fixannot` + : `empty` + fixannot : { struct `ident` } + : { wf `term1_extended` `ident` } + : { measure `term1_extended` `ident_opt` `term1_extended_opt` } + term1_extended : `term1` + : @ `qualid` `universe_annot_opt` + ident_opt : `ident` + : `empty` + term1_extended_opt : `term1_extended` + : `empty` The expression “``fix`` :token:`ident`:math:`_1` :token:`binder`:math:`_1` ``:`` :token:`type`:math:`_1` ``:=`` :token:`term`:math:`_1` ``with … with`` @@ -534,6 +697,8 @@ syntax: :n:`let fix @ident @binders := @term in` stands for The Vernacular ============== +.. insertgramXX vernac ident_opt2 + .. productionlist:: coq decorated-sentence : [ `decoration` … `decoration` ] `sentence` sentence : `assumption` @@ -555,11 +720,11 @@ The Vernacular ind_body : `ident` [`binders`] : `term` := : [[|] `ident` [`binders`] [:`term`] | … | `ident` [`binders`] [:`term`]] fixpoint : Fixpoint `fix_body` with … with `fix_body` . - : CoFixpoint `cofix_body` with … with `cofix_body` . + : CoFixpoint `fix_body` with … with `fix_body` . assertion : `assertion_keyword` `ident` [`binders`] : `term` . assertion_keyword : Theorem | Lemma : Remark | Fact - : Corollary | Proposition + : Corollary | Property | Proposition : Definition | Example proof : Proof . … Qed . : Proof . … Defined . @@ -765,7 +930,8 @@ Simple inductive types The types of the constructors have to satisfy a *positivity condition* (see Section :ref:`positivity`). This condition ensures the soundness of - the inductive definition. + the inductive definition. The positivity checking can be disabled using + the :flag:`Positivity Checking` flag (see :ref:`controlling-typing-flags`). .. 
exn:: The conclusion of @type is not valid; it must be built from @ident. @@ -942,7 +1108,7 @@ Parameterized inductive types .. flag:: Uniform Inductive Parameters - When this option is set (it is off by default), + When this flag is set (it is off by default), inductive definitions are abstracted over their parameters before type checking constructors, allowing to write: @@ -977,7 +1143,7 @@ Variants The :cmd:`Variant` command is identical to the :cmd:`Inductive` command, except that it disallows recursive definition of types (for instance, lists cannot be defined using :cmd:`Variant`). No induction scheme is generated for - this variant, unless option :flag:`Nonrecursive Elimination Schemes` is on. + this variant, unless the :flag:`Nonrecursive Elimination Schemes` flag is on. .. exn:: The @num th argument of @ident must be @ident in @type. :undocumented: @@ -1379,11 +1545,11 @@ Chapter :ref:`Tactics`. The basic assertion command is: The name you provided is already defined. You have then to choose another name. - .. exn:: Nested proofs are not allowed unless you turn option Nested Proofs Allowed on. + .. exn:: Nested proofs are not allowed unless you turn the :flag:`Nested Proofs Allowed` flag on. You are asserting a new statement while already being in proof editing mode. This feature, called nested proofs, is disabled by default. - To activate it, turn option :flag:`Nested Proofs Allowed` on. + To activate it, turn the :flag:`Nested Proofs Allowed` flag on. .. cmdv:: Lemma @ident {? @binders } : @type Remark @ident {? @binders } : @type @@ -1456,8 +1622,8 @@ using the keyword :cmd:`Qed`. .. note:: - #. Several statements can be simultaneously asserted provided option - :flag:`Nested Proofs Allowed` was turned on. + #. Several statements can be simultaneously asserted provided the + :flag:`Nested Proofs Allowed` flag was turned on. #. Not only other assertions but any vernacular command can be given while in the process of proving a given assertion. 
In this case, the @@ -1489,19 +1655,20 @@ Each attribute has a name (an identifier) and may have a value. A value is either a :token:`string` (in which case it is specified with an equal ``=`` sign), or a list of attributes, enclosed within brackets.
-Currently,
-the following attributes names are recognized:
+Some attributes are specific to a command, and so are described with
+that command. Currently, the following attributes are recognized by a
+variety of commands:
-``monomorphic``, ``polymorphic``
- Take no value, analogous to the ``Monomorphic`` and ``Polymorphic`` flags
- (see :ref:`polymorphicuniverses`).
+``universes(monomorphic)``, ``universes(polymorphic)``
+ Equivalent to the ``Monomorphic`` and
+ ``Polymorphic`` flags (see :ref:`polymorphicuniverses`).
``program``
- Takes no value, analogous to the ``Program`` flag
+ Takes no value, equivalent to the ``Program`` flag
(see :ref:`programs`).
``global``, ``local``
- Take no value, analogous to the ``Global`` and ``Local`` flags
+ Take no value, equivalent to the ``Global`` and ``Local`` flags
(see :ref:`controlling-locality-of-commands`).
``deprecated`` @@ -1542,6 +1709,11 @@ the following attributes names are recognized: now foo. Abort.
+.. warn:: Unsupported attribute
+
+ This warning is an error by default. It is caused by using an
+ attribute that the command does not recognize.
+
.. [1] This is similar to the expression “*entry* :math:`\{` sep *entry* :math:`\}`” in standard BNF, or “*entry* :math:`(` sep *entry* diff --git a/doc/sphinx/practical-tools/coq-commands.rst b/doc/sphinx/practical-tools/coq-commands.rst index 48d5f4075e..514f5acc8e 100644 --- a/doc/sphinx/practical-tools/coq-commands.rst +++ b/doc/sphinx/practical-tools/coq-commands.rst @@ -36,7 +36,7 @@ toplevel with the command ``Coqloop.loop();;``. ..
flag:: Coqtop Exit On Error
- This option, off by default, causes coqtop to exit with status code
+ This flag, off by default, causes coqtop to exit with status code
``1`` if a command produces an error instead of recovering from it.
Batch compilation (coqc) @@ -184,6 +184,13 @@ and ``coqtop``, unless stated otherwise: :-verbose: Output the content of the input file as it is compiled. This option is available for ``coqc`` only; it is the counterpart of -compile-verbose.
+:-vos: Instruct |Coq| to skip the processing of opaque proofs
+ (i.e., proofs ending with ``Qed`` or ``Admitted``), to output a ``.vos`` file
+ instead of a ``.vo`` file, and to load ``.vos`` files instead of ``.vo`` files
+ when interpreting ``Require`` commands.
+:-vok: Instruct |Coq| to check a file completely, to load ``.vos`` files instead
+ of ``.vo`` files when interpreting ``Require`` commands, and to output an empty
+ ``.vok`` file upon success instead of writing a ``.vo`` file.
:-w (all|none|w₁,…,wₙ): Configure the display of warnings. This option expects all, none or a comma-separated list of warning names or categories (see Section :ref:`controlling-display`). @@ -212,7 +219,7 @@ and ``coqtop``, unless stated otherwise: .. warning:: This makes the logic inconsistent. :-mangle-names *ident*: *Experimental.* Do not depend on this option. Replace Coq's auto-generated name scheme with names of the form *ident0*, *ident1*,
- etc. Within Coq, the flag :flag:`Mangle Names` turns this behavior on,
+ etc. Within Coq, the :flag:`Mangle Names` flag turns this behavior on,
and the :opt:`Mangle Names Prefix` option sets the prefix to use. This feature is intended to be used as a linter for developments that want to be robust to changes in the auto-generated name scheme. The options are provided to @@ -245,6 +252,119 @@ and ``coqtop``, unless stated otherwise: currently associated color and exit. :-h, --help: Print a short usage and exit.
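For instance (an illustrative transcript with a hypothetical file name ``foo.v``; it assumes ``coqc`` is on the path), the two modes are invoked as:

```shell
coqc -vos foo.v   # skip opaque proofs; writes foo.vos instead of foo.vo
coqc -vok foo.v   # check foo.v completely; writes an empty foo.vok on success
```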
+ +
+Compiled interfaces (produced using ``-vos``)
+----------------------------------------------
+
+Compiled interfaces help save time while developing Coq formalizations,
+by compiling the formal statements exported by a library independently of
+the proofs that it contains.
+ .. warning::
+
+ Compiled interfaces should only be used for development purposes.
+ At the end of the day, one still needs to proof-check all files
+ by producing standard ``.vo`` files. (Technically, when using ``-vos``,
+ fewer universe constraints are collected.)
+ Moreover, this feature is still experimental; it may be subject to
+ change without prior notice.
+
+**Principle.**
+
+The compilation using ``coqc -vos foo.v`` produces a file called ``foo.vos``,
+which is similar to ``foo.vo`` except that all opaque proofs are skipped in
+the compilation process.
+
+The compilation using ``coqc -vok foo.v`` checks that the file ``foo.v``
+correctly compiles, including all its opaque proofs. If the compilation
+succeeds, then the output is a file called ``foo.vok``, with empty contents.
+This file is only a placeholder indicating that ``foo.v`` has been successfully
+compiled. (This placeholder is useful for build systems such as ``make``.)
+
+When compiling a file ``bar.v`` that depends on ``foo.v`` (for example via
+a ``Require Foo.`` command), if the compilation command is ``coqc -vos bar.v``
+or ``coqc -vok bar.v``, then the file ``foo.vos`` gets loaded (instead of
+``foo.vo``). As a special case, if the file ``foo.vos`` exists with empty
+contents and ``foo.vo`` exists, then ``foo.vo`` is loaded.
+
+Apart from the aforementioned case where ``foo.vo`` can be loaded in place
+of ``foo.vos``, in general the ``.vos`` and ``.vok`` files live entirely
+independently of the ``.vo`` files.
+
+**Dependencies generated by ``coq_makefile``.**
+
+The files ``foo.vos`` and ``foo.vok`` both depend on ``foo.v``.
+
+Furthermore, if a file ``foo.v`` requires ``bar.v``, then ``foo.vos``
+and ``foo.vok`` also depend on ``bar.vos``.
+
+Note, however, that ``foo.vok`` does not depend on ``bar.vok``.
+Hence, as detailed below, parallel compilation of proofs is possible.
+
+In addition, ``coq_makefile`` generates for a file ``foo.v`` a target
+``foo.required_vos`` which depends on the list of ``.vos`` files that
+``foo.vos`` depends upon (excluding ``foo.vos`` itself). As explained
+next, the purpose of this target is to be able to request the minimal
+working state for interactively editing the file ``foo.v``.
+
+**Typical compilation of a set of files using a build system.**
+
+Assume a file ``foo.v`` that depends on two files ``f1.v`` and ``f2.v``. The
+command ``make foo.required_vos`` will compile ``f1.v`` and ``f2.v`` using
+the option ``-vos`` to skip the proofs, producing ``f1.vos`` and ``f2.vos``.
+At this point, one is ready to work interactively on the file ``foo.v``, even
+though the proofs contained in ``f1.v`` and ``f2.v`` were never compiled.
+
+Assume a set of files ``f1.v ... fn.v`` with linear dependencies. The command
+``make vos`` enables compiling the statements (i.e. excluding the proofs) in all
+the files. Next, ``make -j vok`` enables compiling all the proofs in parallel.
+Thus, calling ``make -j vok`` directly takes advantage of a maximal
+amount of parallelism during the compilation of the set of files.
+
+Note that this comes at the cost of parsing and typechecking all definitions
+twice, once for the ``.vos`` file and once for the ``.vok`` file. However, if
+files contain nontrivial proofs, or if the files have many linear chains of
+dependencies, or if one has many cores available, compilation should be faster
+overall.
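Concretely, with the illustrative file names used in this section (``foo.v`` depending on ``f1.v`` and ``f2.v``, and a ``coq_makefile``-generated ``Makefile``), the workflow sketched above is:

```shell
make foo.required_vos   # produce f1.vos and f2.vos, skipping their proofs
# ... work interactively on foo.v ...
make vos                # compile the statements of every file
make -j vok             # then check all the proofs, in parallel
```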
+
+**Need for ``Proof using``**
+
+When a theorem is part of a section, typechecking the statement of this theorem
+might be insufficient for deducing the type of this statement as of the end
+of the section. Indeed, the proof of the theorem could make use of section
+variables or section hypotheses that are not mentioned in the statement of the
+theorem.
+
+For this reason, proofs inside sections should begin with :cmd:`Proof using`
+instead of :cmd:`Proof`, where after the ``using`` clause one should provide
+the list of the names of the section variables that are required for the proof
+but are not involved in the typechecking of the statement. Note that it is safe
+to write ``Proof using.`` instead of ``Proof.`` also for proofs that are not
+within a section.
+
+.. warn:: You should use the “Proof using [...].” syntax instead of “Proof.” to enable skipping this proof which is located inside a section. Give as argument to “Proof using” the list of section variables that are not needed to typecheck the statement but that are required by the proof.
+
+ If |Coq| is invoked using the ``-vos`` option, whenever it finds the
+ command ``Proof.`` inside a section, it will compile the proof, that is,
+ refuse to skip it, and it will raise a warning. To disable the warning, one
+ may pass the flag ``-w -proof-without-using-in-section``.
+
+**Interaction with standard compilation**
+
+When compiling a file ``foo.v`` using ``coqc`` in the standard way (i.e., without
+``-vos`` nor ``-vok``), an empty file ``foo.vos`` and an empty file ``foo.vok``
+are created in addition to the regular output file ``foo.vo``.
+If ``coqc`` is subsequently invoked on some other file ``bar.v`` using option
+``-vos`` or ``-vok``, and ``bar.v`` requires ``foo.v``, then if |Coq| finds an
+empty file ``foo.vos``, it will load ``foo.vo`` instead of ``foo.vos``.
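The ``Proof using`` discipline described above can be sketched as follows (an illustrative example with hypothetical names; ``lia`` is just one way to exploit the hypothesis):

```coq
Require Import Lia.

Section S.
  Variable n : nat.
  Hypothesis Hn : n <> 0.

  (* The statement does not mention Hn, but the proof needs it,
     so Hn is listed after "using"; -vos can then safely skip the proof. *)
  Lemma n_pos : 0 < n.
  Proof using Hn.
    lia.
  Qed.
End S.
```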
+ +The purpose of this feature is to allow users to benefit from the ``-vos`` +option even if they depend on libraries that were compiled in the traditional +manner (i.e., never compiled using the ``-vos`` option). + + Compiled libraries checker (coqchk) ---------------------------------------- diff --git a/doc/sphinx/practical-tools/coqide.rst b/doc/sphinx/practical-tools/coqide.rst index efb5df720a..b1f392c337 100644 --- a/doc/sphinx/practical-tools/coqide.rst +++ b/doc/sphinx/practical-tools/coqide.rst @@ -88,8 +88,6 @@ There are other buttons on the |CoqIDE| toolbar: a button to save the running buffer; a button to close the current buffer (an "X"); buttons to switch among buffers (left and right arrows); an "information" button; and a "gears" button. -The "information" button is described in Section :ref:`try-tactics-automatically`. - The "gears" button submits proof terms to the |Coq| kernel for type checking. When |Coq| uses asynchronous processing (see Chapter :ref:`asynchronousandparallelproofprocessing`), proofs may have been completed without kernel-checking of generated proof terms. @@ -100,27 +98,6 @@ processed color, though their preceding proofs have the processed color. Notice that for all these buttons, except for the "gears" button, their operations are also available in the menu, where their keyboard shortcuts are given. -.. _try-tactics-automatically: - -Trying tactics automatically ------------------------------- - -The menu Try Tactics provides some features for automatically trying -to solve the current goal using simple tactics. If such a tactic -succeeds in solving the goal, then its text is automatically inserted -into the script. There is finally a combination of these tactics, -called the *proof wizard* which will try each of them in turn. This -wizard is also available as a tool button (the "information" button). The set of -tactics tried by the wizard is customizable in the preferences. 
-
-These tactics are general ones, in particular they do not refer to
-particular hypotheses. You may also try specific tactics related to
-the goal or one of the hypotheses, by clicking with the right mouse
-button on the goal or the considered hypothesis. This is the
-“contextual menu on goals” feature, that may be disabled in the
-preferences if undesirable.
-
-
Proof folding ------------------ @@ -202,17 +179,13 @@ compilation, printing, web browsing. In the browser command, you may use `%s` to denote the URL to open, for example: `firefox -remote "OpenURL(%s)"`.
-The `Tactics Wizard` section allows defining the set of tactics that
-should be tried, in sequence, to solve the current goal.
-
-The last section is for miscellaneous boolean settings, such as the
-“contextual menu on goals” feature presented in the section
-:ref:`Try tactics automatically <try-tactics-automatically>`.
-
-Notice that these settings are saved in the file `.coqiderc` of your
-home directory.
+Notice that these settings are saved in the file ``coqiderc`` in the
+``coq`` subdirectory of the user configuration directory, which
+is the value of ``$XDG_CONFIG_HOME`` if this environment variable is
+set, and ``$HOME/.config/`` otherwise.
-A Gtk2 accelerator keymap is saved under the name `.coqide.keys`. It
+A GTK+ accelerator keymap is saved under the name ``coqide.keys`` in
+the same ``coq`` subdirectory of the user configuration directory. It
is not recommended to edit this file manually: to modify a given menu shortcut, go to the corresponding menu item without releasing the mouse button, press the key you want for the new shortcut, and release @@ -289,8 +262,9 @@ Adding custom bindings ~~~~~~~~~~~~~~~~~~~~~~ To extend the default set of bindings, create a file named ``coqide.bindings``
The file `coqide.bindings` should contain one +and place it in the same folder as ``coqide.keys``. This would be +the folder ``$XDG_CONFIG_HOME/coq``, defaulting to ``~/.config/coq`` +if ``XDG_CONFIG_HOME`` is unset. The file `coqide.bindings` should contain one binding per line, in the form ``\key value``, followed by an optional priority integer. (The key and value should not contain any space character.) diff --git a/doc/sphinx/practical-tools/utilities.rst b/doc/sphinx/practical-tools/utilities.rst index 554f6bf230..e5edd08995 100644 --- a/doc/sphinx/practical-tools/utilities.rst +++ b/doc/sphinx/practical-tools/utilities.rst @@ -62,7 +62,7 @@ A simple example of a ``_CoqProject`` file follows: theories/foo.v theories/bar.v -I src/ - src/baz.ml4 + src/baz.mlg src/bazaux.ml src/qux_plugin.mlpack @@ -111,7 +111,7 @@ decide how to build them. In particular: + |Coq| files must use the ``.v`` extension + |OCaml| files must use the ``.ml`` or ``.mli`` extension + |OCaml| files that require pre processing for syntax - extensions (like ``VERNAC EXTEND``) must use the ``.ml4`` extension + extensions (like ``VERNAC EXTEND``) must use the ``.mlg`` extension + In order to generate a plugin one has to list all |OCaml| modules (i.e. ``Baz`` for ``baz.ml``) in a ``.mlpack`` file (or ``.mllib`` file). @@ -359,7 +359,7 @@ line timing data: pass ``TIMING=before`` or ``TIMING=after`` rather than ``TIMING=1``. .. note:: - The sorting used here is the same as in the ``print-pretty-timed -diff`` target. + The sorting used here is the same as in the ``print-pretty-timed-diff`` target. .. note:: This target requires python to build the table. @@ -522,10 +522,7 @@ of your project. 
(flags :standard -warn-error -3-9-27-32-33-50) (libraries coq.plugins.cc coq.plugins.extraction))
- (rule
- (targets g_equations.ml)
- (deps (:pp-file g_equations.mlg))
- (action (run coqpp %{pp-file})))
+ (coq.pp (modules g_equations))
And a Coq-specific part that depends on it via the ``libraries`` field: diff --git a/doc/sphinx/proof-engine/ltac.rst b/doc/sphinx/proof-engine/ltac.rst index 46f9826e41..b2b426ada5 100644 --- a/doc/sphinx/proof-engine/ltac.rst +++ b/doc/sphinx/proof-engine/ltac.rst @@ -31,10 +31,10 @@ Syntax The syntax of the tactic language is given below. See Chapter :ref:`gallinaspecificationlanguage` for a description of the BNF metasyntax used in these grammar rules. Various already defined entries will be used in this
-chapter: entries :token:`natural`, :token:`integer`, :token:`ident`,
+chapter: entries :token:`num`, :token:`int`, :token:`ident`,
:token:`qualid`, :token:`term`, :token:`cpattern` and :token:`tactic`
-represent respectively the natural and integer numbers, the authorized
-identificators and qualified names, Coq terms and patterns and all the atomic
+represent respectively natural and integer numbers,
+identifiers, qualified names, Coq terms, patterns and the atomic
tactics described in Chapter :ref:`tactics`. The syntax of :production:`cpattern` is
: `atom` atom : `qualid` : () - : `integer` + : `int` : ( `ltac_expr` ) component : `string` | `qualid` - message_token : `string` | `ident` | `integer` + message_token : `string` | `ident` | `int` tacarg : `qualid` : () : ltac : `atom` @@ -159,11 +159,11 @@ mode but it can also be used in toplevel definitions as shown below. match_rule : `cpattern` => `ltac_expr` : context [`ident`] [ `cpattern` ] => `ltac_expr` : _ => `ltac_expr` - test : `integer` = `integer` - : `integer` (< | <= | > | >=) `integer` + test : `int` = `int` + : `int` (< | <= | > | >=) `int` selector : [`ident`] - : `integer` - : (`integer` | `integer` - `integer`), ..., (`integer` | `integer` - `integer`) + : `int` + : (`int` | `int` - `int`), ..., (`int` | `int` - `int`) toplevel_selector : `selector` : all : par @@ -368,7 +368,7 @@ We can check if a tactic made progress with: :name: progress :n:`@ltac_expr` is evaluated to v which must be a tactic value. The tactic value ``v`` - is applied to each focued subgoal independently. If the application of ``v`` + is applied to each focused subgoal independently. If the application of ``v`` to one of the focused subgoal produced subgoals equal to the initial goals (up to syntactical equality), then an error of level 0 is raised. @@ -516,7 +516,9 @@ Coq provides a derived tactic to check that a tactic *fails*: .. tacn:: assert_fails @ltac_expr :name: assert_fails - This behaves like :n:`tryif @ltac_expr then fail 0 tac "succeeds" else idtac`. + This behaves like :tacn:`idtac` if :n:`@ltac_expr` fails, and + behaves like :n:`fail 0 @ltac_expr "succeeds"` if :n:`@ltac_expr` + has at least one success. Checking the success ~~~~~~~~~~~~~~~~~~~~ @@ -528,7 +530,7 @@ success: :name: assert_succeeds This behaves like - :n:`tryif (assert_fails tac) then fail 0 tac "fails" else idtac`. + :n:`tryif (assert_fails @ltac_expr) then fail 0 @ltac_expr "fails" else idtac`. 
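For example, a minimal sketch using these two combinators as tactic-level unit tests (the goal and inner tactics are illustrative):

```coq
Goal True.
  (* assert_succeeds checks that the tactic would work,
     but leaves the goal untouched. *)
  assert_succeeds (exact I).
  (* exact 0 is ill-typed for the goal True, so it fails,
     which makes assert_fails succeed. *)
  assert_fails (exact 0).
  exact I.
Qed.
```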
Solving ~~~~~~~ @@ -858,8 +860,8 @@ We can carry out pattern matching on terms with: If the evaluation of the right-hand-side of a valid match fails, the next matching subterm is tried. If no further subterm matches, the next clause is tried. Matching subterms are considered top-bottom and from left to - right (with respect to the raw printing obtained by setting option - :flag:`Printing All`). + right (with respect to the raw printing obtained by setting the + :flag:`Printing All` flag). .. example:: @@ -984,9 +986,9 @@ Computing in a constr Evaluation of a term can be performed with: -.. tacn:: eval @redexpr in @term +.. tacn:: eval @red_expr in @term - where :n:`@redexpr` is a reduction tactic among :tacn:`red`, :tacn:`hnf`, + where :n:`@red_expr` is a reduction tactic among :tacn:`red`, :tacn:`hnf`, :tacn:`compute`, :tacn:`simpl`, :tacn:`cbv`, :tacn:`lazy`, :tacn:`unfold`, :tacn:`fold`, :tacn:`pattern`. @@ -1640,7 +1642,7 @@ Interactive debugger .. flag:: Ltac Debug - This option governs the step-by-step debugger that comes with the |Ltac| interpreter. + This flag governs the step-by-step debugger that comes with the |Ltac| interpreter. When the debugger is activated, it stops at every step of the evaluation of the current |Ltac| expression and prints information on what it is doing. @@ -1664,13 +1666,13 @@ following: .. exn:: Debug mode not available in the IDE :undocumented: -A non-interactive mode for the debugger is available via the option: +A non-interactive mode for the debugger is available via the flag: .. flag:: Ltac Batch Debug - This option has the effect of presenting a newline at every prompt, when + This flag has the effect of presenting a newline at every prompt, when the debugger is on. The debug log thus created, which does not require - user input to generate when this option is set, can then be run through + user input to generate when this flag is set, can then be run through external tools such as diff. 
Profiling |Ltac| tactics @@ -1689,7 +1691,7 @@ performance issue. .. flag:: Ltac Profiling - This option enables and disables the profiler. + This flag enables and disables the profiler. .. cmd:: Show Ltac Profile @@ -1773,7 +1775,7 @@ performance issue. benchmarking purposes. You can also pass the ``-profile-ltac`` command line option to ``coqc``, which -turns the :flag:`Ltac Profiling` option on at the beginning of each document, +turns the :flag:`Ltac Profiling` flag on at the beginning of each document, and performs a :cmd:`Show Ltac Profile` at the end. .. warning:: diff --git a/doc/sphinx/proof-engine/ltac2.rst b/doc/sphinx/proof-engine/ltac2.rst index 3036648b08..cfdc70d50e 100644 --- a/doc/sphinx/proof-engine/ltac2.rst +++ b/doc/sphinx/proof-engine/ltac2.rst @@ -17,16 +17,16 @@ Coq, yet it is at the same time its Achilles' heel. Indeed, Ltac: - is error-prone and fragile - has an intricate implementation -Following the need of users that start developing huge projects relying +Following the need of users who are developing huge projects relying critically on Ltac, we believe that we should offer a proper modern language that features at least the following: - at least informal, predictable semantics -- a typing system -- standard programming facilities (i.e. datatypes) +- a type system +- standard programming facilities (e.g., datatypes) This new language, called Ltac2, is described in this chapter. It is still -experimental but we encourage nonetheless users to start testing it, +experimental but we nonetheless encourage users to start testing it, especially wherever an advanced tactic language is needed. The previous implementation of Ltac, described in the previous chapter, will be referred to as Ltac1. @@ -36,9 +36,9 @@ as Ltac1. General design -------------- -There are various alternatives to Ltac1, such that Mtac or Rtac for instance. 
-While those alternatives can be quite distinct from Ltac1, we designed
-Ltac2 to be closest as reasonably possible to Ltac1, while fixing the
+There are various alternatives to Ltac1, such as Mtac or Rtac.
+While those alternatives can be quite different from Ltac1, we designed
+Ltac2 to be as close as reasonably possible to Ltac1, while fixing the
aforementioned defects. In particular, Ltac2 is: @@ -47,11 +47,11 @@ In particular, Ltac2 is: * a call-by-value functional language * with effects
- * together with Hindley-Milner type system
+ * together with the Hindley-Milner type system
- a language featuring meta-programming facilities for the manipulation of Coq-side terms
-- a language featuring notation facilities to help writing palatable scripts
+- a language featuring notation facilities to help write palatable scripts
We describe more in details each point in the remainder of this document. @@ -77,7 +77,7 @@ Sticking to a standard ML type system can be considered somewhat weak for a meta-language designed to manipulate Coq terms. In particular, there is no way to statically guarantee that a Coq term resulting from an Ltac2 computation will be well-typed. This is actually a design choice, motivated
-by retro-compatibility with Ltac1. Instead, well-typedness is deferred to
+by backward compatibility with Ltac1. Instead, well-typedness is deferred to
dynamic checks, allowing many primitive functions to fail whenever they are provided with an ill-typed term. @@ -92,7 +92,7 @@ Type Syntax ~~~~~~~~~~~ At the level of terms, we simply elaborate on Ltac1 syntax, which is quite
-close to e.g. the one of OCaml. Types follow the simply-typed syntax of OCaml.
+close to OCaml. Types follow the simply-typed syntax of OCaml.
The non-terminal :production:`lident` designates identifiers starting with a lowercase. @@ -122,7 +122,7 @@ Built-in types include: Type declarations ~~~~~~~~~~~~~~~~~
-One can define new types by the following commands.
+One can define new types with the following commands. .. cmd:: Ltac2 Type {? @ltac2_typeparams } @lident :name: Ltac2 Type @@ -149,7 +149,7 @@ One can define new types by the following commands. Variants are sum types defined by constructors and eliminated by pattern-matching. They can be recursive, but the `rec` flag must be - explicitly set. Pattern-maching must be exhaustive. + explicitly set. Pattern matching must be exhaustive. Records are product types with named fields and eliminated by projection. Likewise they can be recursive if the `rec` flag is set. @@ -158,15 +158,15 @@ One can define new types by the following commands. Open variants are a special kind of variant types whose constructors are not statically defined, but can instead be extended dynamically. A typical example - is the standard `exn` type. Pattern-matching must always include a catch-all - clause. They can be extended by this command. + is the standard `exn` type. Pattern matching on open variants must always include a catch-all + clause. They can be extended with this command. Term Syntax ~~~~~~~~~~~ The syntax of the functional fragment is very close to the one of Ltac1, except that it adds a true pattern-matching feature, as well as a few standard -constructions from ML. +constructs from ML. .. productionlist:: coq ltac2_var : `lident` @@ -179,7 +179,7 @@ constructions from ML. : let `ltac2_var` := `ltac2_term` in `ltac2_term` : let rec `ltac2_var` := `ltac2_term` in `ltac2_term` : match `ltac2_term` with `ltac2_branch` ... `ltac2_branch` end - : `integer` + : `int` : `string` : `ltac2_term` ; `ltac2_term` : [| `ltac2_term` ; ... ; `ltac2_term` |] @@ -202,7 +202,7 @@ constructions from ML. In practice, there is some additional syntactic sugar that allows e.g. to bind a variable and match on it at the same time, in the usual ML style. -There is a dedicated syntax for list and array literals. +There is dedicated syntax for list and array literals. .. 
note:: @@ -217,7 +217,7 @@ Ltac Definitions This command defines a new global Ltac2 value. For semantic reasons, the body of the Ltac2 definition must be a syntactical - value, i.e. a function, a constant or a pure constructor recursively applied to + value, that is, a function, a constant or a pure constructor recursively applied to values. If ``rec`` is set, the tactic is expanded into a recursive binding. @@ -247,7 +247,7 @@ if ever we implement native compilation. The expected equations are as follows:: (t any term, V values, C constructor) Note that call-by-value reduction is already a departure from Ltac1 which uses -heuristics to decide when evaluating an expression. For instance, the following +heuristics to decide when to evaluate an expression. For instance, the following expressions do not evaluate the same way in Ltac1. :n:`foo (idtac; let x := 0 in bar)` @@ -255,7 +255,7 @@ expressions do not evaluate the same way in Ltac1. :n:`foo (let x := 0 in bar)` Instead of relying on the :n:`idtac` idiom, we would now require an explicit thunk -not to compute the argument, and :n:`foo` would have e.g. type +to not compute the argument, and :n:`foo` would have e.g. type :n:`(unit -> unit) -> unit`. :n:`foo (fun () => let x := 0 in bar)` @@ -263,19 +263,19 @@ not to compute the argument, and :n:`foo` would have e.g. type Typing ~~~~~~ -Typing is strict and follows Hindley-Milner system. Unlike Ltac1, there +Typing is strict and follows the Hindley-Milner system. Unlike Ltac1, there are no type casts at runtime, and one has to resort to conversion functions. See notations though to make things more palatable. 
-In this setting, all usual argument-free tactics have type :n:`unit -> unit`, but
-one can return as well a value of type :n:`t` thanks to terms of type :n:`unit -> t`,
+In this setting, all the usual argument-free tactics have type :n:`unit -> unit`, but
+one can return a value of type :n:`t` thanks to terms of type :n:`unit -> t`,
or take additional arguments. Effects ~~~~~~~ Effects in Ltac2 are straightforward, except that instead of using the
-standard IO monad as the ambient effectful world, Ltac2 is going to use the
+standard IO monad as the ambient effectful world, Ltac2 uses a
tactic monad. Note that the order of evaluation of application is *not* specified and is @@ -288,15 +288,15 @@ Intuitively a thunk of type :n:`unit -> 'a` can do the following: - It can perform non-backtracking IO like printing and setting mutable variables - It can fail in a non-recoverable way
-- It can use first-class backtrack. The proper way to figure that is that we
- morally have the following isomorphism:
+- It can use first-class backtracking. One way to think about this is that
+ thunks are isomorphic to this type:
:n:`(unit -> 'a) ~ (unit -> exn + ('a * (exn -> 'a)))` i.e. thunks can produce a lazy list of results where each tail is waiting for a continuation exception.
-- It can access a backtracking proof state, made out amongst other things of
+- It can access a backtracking proof state, consisting among other things of
the current evar assignation and the list of goals under focus.
-We describe more thoroughly the various effects existing in Ltac2 hereafter.
+We now describe more thoroughly the various effects in Ltac2.
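As an illustrative sketch of the lazy-list reading of thunks, using the ``plus``, ``case`` and ``zero`` primitives of the ``Ltac2.Control`` module (the definitions ``two_results`` and ``head_of_two_results`` are hypothetical names, not part of the distribution):

```coq
From Ltac2 Require Import Ltac2.

(* A thunk with two successes: morally the lazy list [1; 2]. *)
Ltac2 two_results () := Control.plus (fun () => 1) (fun _ => 2).

(* Control.case peels off the head of the lazy list, if any. *)
Ltac2 head_of_two_results () :=
  match Control.case two_results with
  | Val ans => let (head, _) := ans in head
  | Err e => Control.zero e
  end.
```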
Standard IO +++++++++++ @@ -315,28 +315,28 @@ Fatal errors ++++++++++++ The Ltac2 language provides non-backtracking exceptions, also known as *panics*, -through the following primitive in module `Control`.:: +through the following primitive in module `Control`:: val throw : exn -> 'a Unlike backtracking exceptions from the next section, this kind of error is never caught by backtracking primitives, that is, throwing an exception -destroys the stack. This is materialized by the following equation, where `E` -is an evaluation context.:: +destroys the stack. This is codified by the following equation, where `E` +is an evaluation context:: E[throw e] ≡ throw e (e value) -There is currently no way to catch such an exception and it is a design choice. -There might be at some future point a way to catch it in a brutal way, -destroying all backtrack and return values. +There is currently no way to catch such an exception, which is a deliberate design choice. +Eventually there might be a way to catch it and +destroy all backtrack and return values. -Backtrack -+++++++++ +Backtracking +++++++++++++ In Ltac2, we have the following backtracking primitives, defined in the -`Control` module.:: +`Control` module:: Ltac2 Type 'a result := [ Val ('a) | Err (exn) ]. @@ -344,7 +344,7 @@ In Ltac2, we have the following backtracking primitives, defined in the val plus : (unit -> 'a) -> (exn -> 'a) -> 'a val case : (unit -> 'a) -> ('a * (exn -> 'a)) result -If one sees thunks as lazy lists, then `zero` is the empty list and `plus` is +If one views thunks as lazy lists, then `zero` is the empty list and `plus` is list concatenation, while `case` is pattern-matching. The backtracking is first-class, i.e. one can write @@ -376,8 +376,8 @@ represent several goals, including none. Thus, there is no such thing as *the current goal*. Goals are naturally ordered, though. It is natural to do the same in Ltac2, but we must provide a way to get access -to a given goal. 
This is the role of the `enter` primitive, that applies a -tactic to each currently focused goal in turn.:: +to a given goal. This is the role of the `enter` primitive, which applies a +tactic to each currently focused goal in turn:: val enter : (unit -> unit) -> unit @@ -427,6 +427,8 @@ In general, quotations can be introduced in terms using the following syntax, wh .. prodn:: ltac2_term += @ident : ( @quotentry ) +.. _ltac2_built-in-quotations: + Built-in quotations +++++++++++++++++++ @@ -439,10 +441,11 @@ The current implementation recognizes the following built-in quotations: holes at runtime (type ``Init.constr`` as well). - ``pattern``, which parses Coq patterns and produces a pattern used for term matching (type ``Init.pattern``). -- ``reference``, which parses either a :n:`@qualid` or :n:`& @ident`. Qualified names +- ``reference``, which parses either a :n:`@qualid` or :n:`&@ident`. Qualified names are globalized at internalization into the corresponding global reference, while ``&id`` is turned into ``Std.VarRef id``. This produces at runtime a - ``Std.reference``. + ``Std.reference``. There shall be no white space between the ampersand + symbol (``&``) and the identifier (:n:`@ident`). The following syntactic sugar is provided for two common cases. @@ -452,9 +455,9 @@ The following syntactic sugar is provided for two common cases. Strict vs. non-strict mode ++++++++++++++++++++++++++ -Depending on the context, quotations producing terms (i.e. ``constr`` or +Depending on the context, quotation-producing terms (i.e. ``constr`` or ``open_constr``) are not internalized in the same way. There are two possible -modes, respectively called the *strict* and the *non-strict* mode. +modes, the *strict* and the *non-strict* mode. - In strict mode, all simple identifiers appearing in a term quotation are required to be resolvable statically. That is, they must be the short name of @@ -467,7 +470,7 @@ modes, respectively called the *strict* and the *non-strict* mode. 
of the term at runtime will fail if there is no such variable in the dynamic context. -Strict mode is enforced by default, e.g. for all Ltac2 definitions. Non-strict +Strict mode is enforced by default, such as for all Ltac2 definitions. Non-strict mode is only set when evaluating Ltac2 snippets in interactive proof mode. The rationale is that it is cumbersome to explicitly add ``&`` interactively, while it is expected that global tactics enforce more invariants on their code. @@ -490,12 +493,12 @@ for their side-effects. Semantics +++++++++ -Interpretation of a quoted Coq term is done in two phases, internalization and +A quoted Coq term is interpreted in two phases, internalization and evaluation. -- Internalization is part of the static semantics, i.e. it is done at Ltac2 +- Internalization is part of the static semantics, that is, it is done at Ltac2 typing time. -- Evaluation is part of the dynamic semantics, i.e. it is done when +- Evaluation is part of the dynamic semantics, that is, it is done when a term gets effectively computed by Ltac2. Note that typing of Coq terms is a *dynamic* process occurring at Ltac2 @@ -560,6 +563,20 @@ for it. - `&x` as a Coq constr expression expands to `ltac2:(Control.refine (fun () => hyp @x))`. +In the special case where Ltac2 antiquotations appear inside a Coq term +notation, the notation variables are systematically bound in the body +of the tactic expression with type `Ltac2.Init.preterm`. Such a type represents +untyped syntactic Coq expressions, which can be typed in the +current context using the `Ltac2.Constr.pretype` function. + +.. example:: + + The following notation is essentially the identity. + + .. coqtop:: in + + Notation "[ x ]" := ltac2:(let x := Ltac2.Constr.pretype x in exact $x) (only parsing). + Dynamic semantics ***************** @@ -670,9 +687,9 @@ A scope is a name given to a grammar entry used to produce some Ltac2 expression at parsing time. Scopes are described using a form of S-expression. ..
prodn:: - ltac2_scope ::= {| @string | @integer | @lident ({+, @ltac2_scope}) } + ltac2_scope ::= {| @string | @int | @lident ({+, @ltac2_scope}) } -A few scopes contain antiquotation features. For sake of uniformity, all +A few scopes contain antiquotation features. For the sake of uniformity, all antiquotations are introduced by the syntax :n:`$@lident`. The following scopes are built-in. @@ -713,15 +730,15 @@ The following scopes are built-in. - :n:`self`: - + parses a Ltac2 expression at the current level and return it as is. + + parses a Ltac2 expression at the current level and returns it as is. - :n:`next`: - + parses a Ltac2 expression at the next level and return it as is. + + parses a Ltac2 expression at the next level and returns it as is. -- :n:`tactic(n = @integer)`: +- :n:`tactic(n = @int)`: - + parses a Ltac2 expression at the provided level :n:`n` and return it as is. + + parses a Ltac2 expression at the provided level :n:`n` and returns it as is. - :n:`thunk(@ltac2_scope)`: @@ -747,7 +764,7 @@ The following scopes are built-in. out of the parsed values in the same order. As an optimization, all subscopes of the form :n:`STRING` are left out of the returned tuple, instead of returning a useless unit value. It is forbidden for the various - subscopes to refer to the global entry using self or next. + subscopes to refer to the global entry using :n:`self` or :n:`next`. A few other specific scopes exist to handle Ltac1-like syntax, but their use is discouraged and they are thus not documented. @@ -758,9 +775,9 @@ planned. Notations ~~~~~~~~~ -The Ltac2 parser can be extended by syntactic notations. +The Ltac2 parser can be extended with syntactic notations. -.. cmd:: Ltac2 Notation {+ {| @lident (@ltac2_scope) | @string } } {? : @integer} := @ltac2_term +.. cmd:: Ltac2 Notation {+ {| @lident (@ltac2_scope) | @string } } {? 
: @int} := @ltac2_term :name: Ltac2 Notation A Ltac2 notation adds a parsing rule to the Ltac2 grammar, which is expanded @@ -793,10 +810,10 @@ Abbreviations .. cmdv:: Ltac2 Notation @lident := @ltac2_term - This command introduces a special kind of notations, called abbreviations, + This command introduces a special kind of notation, called an abbreviation, that is designed so that it does not add any parsing rules. It is similar in spirit to Coq abbreviations, insofar as its main purpose is to give an - absolute name to a piece of pure syntax, which can be transparently referred + absolute name to a piece of pure syntax, which can be transparently referred to by this name as if it were a proper definition. The abbreviation can then be manipulated just as a normal Ltac2 definition, @@ -850,8 +867,11 @@ a Ltac1 expression, and semantics of this quotation is the evaluation of the corresponding code for its side effects. In particular, it cannot return values, and the quotation has type :n:`unit`. +.. productionlist:: coq + ltac2_term : ltac1 : ( `ltac_expr` ) + Ltac1 **cannot** implicitly access variables from the Ltac2 scope, but this can -be done via an explicit annotation to the :n:`ltac1` quotation. +be done with an explicit annotation on the :n:`ltac1` quotation. .. productionlist:: coq ltac2_term : ltac1 : ( `ident` ... `ident` |- `ltac_expr` ) @@ -887,10 +907,19 @@ Ltac2 from Ltac1 Same as above by switching Ltac1 by Ltac2 and using the `ltac2` quotation instead. -Note that the tactic expression is evaluated eagerly, if one wants to use it as -an argument to a Ltac1 function, she has to resort to the good old -:n:`idtac; ltac2:(foo)` trick. For instance, the code below will fail immediately -and won't print anything. +.. productionlist:: coq + ltac_expr : ltac2 : ( `ltac2_term` ) + : ltac2 : ( `ident` ... 
`ident` |- `ltac2_term` ) + +The typing rules are dual, that is, the optional identifiers are bound +with type `Ltac2.Ltac1.t` in the Ltac2 expression, which is expected to have +type :n:`unit`. The value returned by this quotation is an Ltac1 function with the +same arity as the number of bound variables. + +Note that when no variables are bound, the inner tactic expression is evaluated +eagerly. If one wants to use it as an argument to a Ltac1 function, one has to +resort to the good old :n:`idtac; ltac2:(foo)` trick. For instance, the code +below will fail immediately and won't print anything. .. coqtop:: in @@ -899,11 +928,17 @@ and won't print anything. .. coqtop:: all - Ltac mytac tac := idtac "wow"; tac. + Ltac mytac tac := idtac "I am being evaluated"; tac. Goal True. Proof. + (* Doesn't print anything *) Fail mytac ltac2:(fail). + (* Prints and fails *) + Fail mytac ltac:(idtac; ltac2:(fail)). + +In any case, the value returned by the fully applied quotation is an +unspecified dummy Ltac1 closure and should not be further used. Transition from Ltac1 --------------------- @@ -923,8 +958,8 @@ Due to conflicts, a few syntactic rules have changed. - The dispatch tactical :n:`tac; [foo|bar]` is now written :n:`tac > [foo|bar]`. - Levels of a few operators have been revised. Some tacticals now parse as if - they were a normal function, i.e. one has to put parentheses around the - argument when it is complex, e.g an abstraction. List of affected tacticals: + they were normal functions. Parentheses are now required around complex + arguments, such as abstractions. The tacticals affected are: :n:`try`, :n:`repeat`, :n:`do`, :n:`once`, :n:`progress`, :n:`time`, :n:`abstract`. - :n:`idtac` is no more. Either use :n:`()` if you expect nothing to happen, :n:`(fun () => ())` if you want a thunk (see next section), or use printing @@ -1010,4 +1045,4 @@ Exception catching Ltac2 features a proper exception-catching mechanism.
For this reason, the Ltac1 mechanism relying on `fail` taking integers, and tacticals decreasing it, has been removed. Now exceptions are preserved by all tacticals, and it is -your duty to catch them and reraise them depending on your use. +your duty to catch them and re-raise them as needed. diff --git a/doc/sphinx/proof-engine/proof-handling.rst b/doc/sphinx/proof-engine/proof-handling.rst index 03b30d5d97..6884b6e998 100644 --- a/doc/sphinx/proof-engine/proof-handling.rst +++ b/doc/sphinx/proof-engine/proof-handling.rst @@ -535,16 +535,20 @@ Requesting information eexists ?[n]. Show n. - .. cmdv:: Show Proof + .. cmdv:: Show Proof {? Diffs {? removed } } :name: Show Proof - It displays the proof term generated by the tactics - that have been applied. If the proof is not completed, this term - contain holes, which correspond to the sub-terms which are still to be - constructed. These holes appear as a question mark indexed by an - integer, and applied to the list of variables in the context, since it - may depend on them. The types obtained by abstracting away the context - from the type of each placeholder are also printed. + Displays the proof term generated by the tactics + that have been applied so far. If the proof is incomplete, the term + will contain holes, which correspond to subterms which are still to be + constructed. Each hole is an existential variable, which appears as a + question mark followed by an identifier. + + Experimental: Specifying “Diffs” highlights the difference between the + current and previous proof step. By default, the command shows the + output once with additions highlighted. Including “removed” shows + the output twice: once showing removals and once showing additions. + It does not examine the :opt:`Diffs` option. See :ref:`showing_diffs`. .. cmdv:: Show Conjectures :name: Show Conjectures @@ -574,9 +578,8 @@ Requesting information .. 
cmdv:: Show Existentials :name: Show Existentials - It displays the set of all uninstantiated - existential variables in the current proof tree, along with the type - and the context of each variable. + Displays all open goals / existential variables in the current proof + along with the type and the context of each variable. .. cmdv:: Show Match @ident @@ -627,8 +630,11 @@ Showing differences between proof steps --------------------------------------- -Coq can automatically highlight the differences between successive proof steps and between -values in some error messages. +Coq can automatically highlight the differences between successive proof steps +and between values in some error messages. Also, as an experimental feature, +Coq can highlight differences between proof steps shown in the :cmd:`Show Proof` +command, but only, for now, when using coqtop and Proof General. + For example, the following screenshots of CoqIDE and coqtop show the application of the same :tacn:`intros` tactic. The tactic creates two new hypotheses, highlighted in green. The conclusion is entirely in pale green because although it’s changed, no tokens were added @@ -798,7 +804,7 @@ Controlling the effect of proof editing commands .. flag:: Nested Proofs Allowed - When turned on (it is off by default), this option enables support for nested + When turned on (it is off by default), this flag enables support for nested proofs: a new assertion command can be inserted before the current proof is finished, in which case Coq will temporarily switch to the proof of this *nested lemma*. When the proof of the nested lemma is finished (with :cmd:`Qed`
The pose tactic is also improved for the local definition of higher order terms. Local definitions of functions can use the same syntax as global ones. -For example, the tactic :tacn:`pose <pose (ssreflect)>` supoprts parameters: +For example, the tactic :tacn:`pose <pose (ssreflect)>` supports parameters: .. example:: @@ -684,7 +684,7 @@ conditions: + If this head is a projection of a canonical structure, then canonical structure equations are used for the matching. + If the head of term is *not* a constant, the subterm should have the - same structure (λ abstraction,let…in structure …). + same structure (λ abstraction, let…in structure …). + If the head of :token:`term` is a hole, the subterm should have at least as many arguments as :token:`term`. @@ -1151,7 +1151,7 @@ is basically equivalent to move: a H1 H2; tactic => a H1 H2. -with two differences: the in tactical will preserve the body of a ifa +with two differences: the in tactical will preserve the body of ``a`` if ``a`` is a defined constant, and if the ``*`` is omitted it will use a temporary abbreviation to hide the statement of the goal from ``tactic``. @@ -1513,7 +1513,7 @@ In a goal like the following:: ============= m < 5 + n -The tactic ``abstract: abs n`` first generalizes the goal with respect ton +The tactic :g:`abstract: abs n` first generalizes the goal with respect to :g:`n` (that is not visible to the abstract constant abs) and then assigns abs. The resulting goal is:: @@ -1706,7 +1706,7 @@ Intro patterns execution of tactic should thus generate exactly m subgoals, unless the ``[…]`` :token:`i_pattern` comes after an initial ``//`` or ``//=`` :token:`s_item` that closes some of the goals produced by ``tactic``, in - which case exactly m subgoals should remain after thes- item, or we have + which case exactly m subgoals should remain after the :token:`s_item`, or we have the trivial branching :token:`i_pattern` [], which always does nothing, regardless of the number of remaining subgoals.
``[`` :token:`i_item` * ``| … |`` :token:`i_item` * ``]`` @@ -2240,8 +2240,8 @@ then the tactic tactic ; last k [ tactic1 |…| tacticm ] || tacticn. -where natural denotes the integer k as above, applies tactic1 to the n -−k + 1-th goal, … tacticm to the n −k + 2 − m-th goal and tactic n +where natural denotes the integer :math:`k` as above, applies tactic1 to the +:math:`n−k+1`\-th goal, … tacticm to the :math:`n−k+2−m`\-th goal and tacticn to the others. .. example:: @@ -2631,7 +2631,7 @@ The :token:`i_item` and :token:`s_item` can be used to interpret the asserted hypothesis with views (see section :ref:`views_and_reflection_ssr`) or simplify the resulting goals. -The ``have`` tactic also supports a ``suff`` modifier which allows for +The :tacn:`have` tactic also supports a ``suff`` modifier which allows for asserting that a given statement implies the current goal without copying the goal itself. @@ -2651,7 +2651,7 @@ compatible with the presence of a list of binders. Generating let in context entries with have ``````````````````````````````````````````` -Since |SSR| 1.5 the ``have`` tactic supports a “transparent” modifier +Since |SSR| 1.5 the :tacn:`have` tactic supports a “transparent” modifier to generate let in context entries: the ``@`` symbol in front of the context entry name. @@ -2670,7 +2670,7 @@ context entry name. Lemma test n m (H : m + 1 < n) : True. have @i : 'I_n by apply: (Sub m); omega. -Note that the sub-term produced by ``omega`` is in general huge and +Note that the subterm produced by :tacn:`omega` is in general huge and uninteresting, and hence one may want to hide it. For this purpose the ``[: name ]`` intro pattern and the tactic ``abstract`` (see :ref:`abstract_ssr`) are provided. @@ -2764,7 +2764,7 @@ typeclass inference. .. flag:: SsrHave NoTCResolution - This option restores the behavior of |SSR| 1.4 and below (never resolve typeclasses). + This flag restores the behavior of |SSR| 1.4 and below (never resolve typeclasses).
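As a small illustration of the :tacn:`have` tactic discussed above, the
following sketch (not part of the original text; the lemma name ``have_demo``
is purely illustrative) asserts an intermediate fact and then closes the goal.
It assumes the |SSR| plugin is available:

.. coqtop:: reset in

   From Coq Require Import ssreflect.

.. coqtop:: all

   Lemma have_demo (n : nat) : True.
   Proof.
   have H : 0 + n = n by [].
   by [].
   Qed.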
Variants: the suff and wlog tactics ``````````````````````````````````` @@ -2782,7 +2782,7 @@ The ``have`` and ``suff`` tactics are equivalent and have the same syntax but: -+ the order of the generated subgoals is inversed ++ the order of the generated subgoals is inverted + the optional clear item is still performed in the *second* branch. This means that the tactic: @@ -3756,8 +3756,11 @@ involves the following steps: the corresponding intro pattern :n:`@i_pattern__i` in each goal. 4. Then :tacn:`under` checks that the first n subgoals - are (quantified) equalities or double implications between a - term and an evar (e.g. ``m - m = ?F2 m`` in the running example). + are (quantified) Leibniz equalities, double implications or + registered relations (w.r.t. Class ``RewriteRelation``) between a + term and an evar, e.g. ``m - m = ?F2 m`` in the running example. + (This support for setoid-like relations is enabled as soon as we do + both ``Require Import ssreflect.`` and ``Require Setoid.``) 5. If so :tacn:`under` protects these n goals against an accidental instantiation of the evar. @@ -3769,7 +3772,10 @@ involves the following steps: by using a regular :tacn:`rewrite` tactic. 7. Interactive editing of the first n goals has to be signalled by - using the :tacn:`over` tactic or rewrite rule (see below). + using the :tacn:`over` tactic or rewrite rule (see below), which + requires that the underlying relation is reflexive. (The running + example deals with Leibniz equality, but ``PreOrder`` relations are + also supported, for example.) 8. Finally, a post-processing step is performed in the main goal to keep the name(s) for the bound variables chosen by the user in @@ -3795,6 +3801,10 @@ displayed as ``'Under[ … ]``): This is a variant of :tacn:`over` in order to close ``'Under[ … ]`` goals, relying on the ``over`` rewrite rule. 
+Note that a rewrite rule ``UnderE`` is available as well, if one wants +to "unprotect" the evar, without closing the goal automatically (e.g., +to instantiate it manually with a rule other than reflexivity). + .. _under_one_liner: One-liner mode @@ -4061,6 +4071,7 @@ which the function is supplied: :name: congr This tactic: + + checks that the goal is a Leibniz equality; + matches both sides of this equality with “term applied to some arguments”, inferring the right number of arguments from the goal and the type of term. This may expand some definitions or fixpoints; + generates the subgoals corresponding to pairwise equalities of the arguments present in the goal. @@ -4208,7 +4219,7 @@ in the second column. ``ident`` in all the occurrences of ``term2`` * - ``term1 as ident in term2`` - ``term 1`` - - in all the subterms identified by ``ident` + - in all the subterms identified by ``ident`` in all the occurrences of ``term2[term 1 /ident]`` The rewrite tactic supports two more patterns obtained prefixing the @@ -4583,7 +4594,7 @@ The ``elim/`` tactic distinguishes two cases: passed to the eliminator as the last argument (``x`` in ``foo_ind``) and ``en−1 … e1`` are used as patterns to select in the goal the occurrences that will be bound by the predicate ``P``, thus it must be possible to unify - the sub-term of the goal matched by ``en−1`` with ``pm`` , the one matched + the subterm of the goal matched by ``en−1`` with ``pm`` , the one matched by ``en−2`` with ``pm−1`` and so on. :regular eliminator: in all the other cases. Here it must be possible to unify the term matched by ``en`` with ``pm`` , the one matched by @@ -5013,7 +5024,7 @@ mechanism: Coercion is_true (b : bool) := b = true. This allows any boolean formula ``b`` to be used in a context where |Coq| -would expect a proposition, e.g., after ``Lemma … : ``. It is then +would expect a proposition, e.g., after ``Lemma … :``. It is then interpreted as ``(is_true b)``, i.e., the proposition ``b = true``.
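The ``is_true`` coercion described above can be exercised directly; the
following minimal sketch (not part of the original text; the lemma name
``andb_left`` is illustrative) states a boolean fact as a proposition and
proves it by case analysis. It assumes ``ssrbool`` is loaded:

.. coqtop:: reset in

   From Coq Require Import ssreflect ssrbool.

.. coqtop:: all

   Lemma andb_left (a b : bool) : a && b -> a.
   Proof. by case: a. Qed.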
Coercions are elided by the pretty-printer, so they are essentially transparent to the user. @@ -5451,7 +5462,7 @@ equivalences are indeed taken into account, otherwise only single name of an open module. This command returns the list of lemmas: + whose *conclusion* contains a subterm matching the optional first - pattern. A - reverses the test, producing the list of lemmas whose + pattern. A ``-`` reverses the test, producing the list of lemmas whose conclusion does not contain any subterm matching the pattern; + whose name contains the given string. A ``-`` prefix reverses the test, producing the list of lemmas whose name does not contain the string. A diff --git a/doc/sphinx/proof-engine/tactics.rst b/doc/sphinx/proof-engine/tactics.rst index fa6d62ffa2..62d4aa704f 100644 --- a/doc/sphinx/proof-engine/tactics.rst +++ b/doc/sphinx/proof-engine/tactics.rst @@ -157,10 +157,10 @@ The :n:`eqn:` construct in various tactics uses :n:`@naming_intropattern`. Use these elementary patterns to specify a name: -* :n:`@ident` - use the specified name -* :n:`?` - let Coq choose a name -* :n:`?@ident` - generate a name that begins with :n:`@ident` -* :n:`_` - discard the matched part (unless it is required for another +* :n:`@ident` — use the specified name +* :n:`?` — let Coq choose a name +* :n:`?@ident` — generate a name that begins with :n:`@ident` +* :n:`_` — discard the matched part (unless it is required for another hypothesis) * if a disjunction pattern omits a name, such as :g:`[|H2]`, Coq will choose a name @@ -186,7 +186,7 @@ use the :tacn:`split` tactic to replace the current goal with subgoals :g:`A` an For a goal :g:`A \/ B`, use :tacn:`left` to replace the current goal with :g:`A`, or :tacn:`right` to replace the current goal with :g:`B`. -* :n:`( {+, @simple_intropattern}` ) - matches +* :n:`( {+, @simple_intropattern}` ) — matches a product over an inductive type with a :ref:`single constructor <intropattern_cons_note>`. 
If the number of patterns @@ -196,7 +196,7 @@ For a goal :g:`A \/ B`, use :tacn:`left` to replace the current goal with :g:`A` If the number of patterns equals the number of constructor arguments plus the number of :n:`let-ins`, the patterns are applied to the arguments and :n:`let-in` variables. -* :n:`( {+& @simple_intropattern} )` - matches a right-hand nested term that consists +* :n:`( {+& @simple_intropattern} )` — matches a right-hand nested term that consists of one or more nested binary inductive types such as :g:`a1 OP1 a2 OP2 ...` (where the :g:`OPn` are right-associative). (If the :g:`OPn` are left-associative, additional parentheses will be needed to make the @@ -207,15 +207,15 @@ For a goal :g:`A \/ B`, use :tacn:`left` to replace the current goal with :g:`A` :ref:`single constructor with two parameters <intropattern_cons_note>`. :ref:`Example <intropattern_ampersand_ex>` -* :n:`[ {+| @intropattern_list} ]` - splits an inductive type that has +* :n:`[ {+| @intropattern_list} ]` — splits an inductive type that has :ref:`multiple constructors <intropattern_cons_note>` such as :n:`A \/ B` into multiple subgoals. The number of :token:`intropattern_list` must be the same as the number of constructors for the matched part. -* :n:`[ {+ @intropattern} ]` - splits an inductive type that has a +* :n:`[ {+ @intropattern} ]` — splits an inductive type that has a :ref:`single constructor with multiple parameters <intropattern_cons_note>` such as :n:`A /\ B` into multiple hypotheses. Use :n:`[H1 [H2 H3]]` to match :g:`A /\ B /\ C`. -* :n:`[]` - splits an inductive type: If the inductive +* :n:`[]` — splits an inductive type: If the inductive type has multiple constructors, such as :n:`A \/ B`, create one subgoal for each constructor. If the inductive type has a single constructor with multiple parameters, such as :n:`A /\ B`, split it into multiple hypotheses. 
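The disjunction and conjunction patterns above combine naturally. A small
sketch (not part of the original text) that destructs :g:`A /\ (B \/ C)` in
one :tacn:`intros` step:

.. coqtop:: reset all

   Goal forall A B C : Prop, A /\ (B \/ C) -> (A /\ B) \/ (A /\ C).
   Proof.
     intros A B C [HA [HB | HC]].
     - left; split; [exact HA | exact HB].
     - right; split; [exact HA | exact HC].
   Qed.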
@@ -224,14 +224,14 @@ For a goal :g:`A \/ B`, use :tacn:`left` to replace the current goal with :g:`A` These patterns can be used when the hypothesis is an equality: -* :n:`->` - replaces the right-hand side of the hypothesis with the left-hand +* :n:`->` — replaces the right-hand side of the hypothesis with the left-hand side of the hypothesis in the conclusion of the goal; the hypothesis is cleared; if the left-hand side of the hypothesis is a variable, it is substituted everywhere in the context and the variable is removed. :ref:`Example <intropattern_rarrow_ex>` -* :n:`<-` - similar to :n:`->`, but replaces the left-hand side of the hypothesis +* :n:`<-` — similar to :n:`->`, but replaces the left-hand side of the hypothesis with the right-hand side of the hypothesis. -* :n:`[= {*, @intropattern} ]` - If the product is over an equality type, +* :n:`[= {*, @intropattern} ]` — If the product is over an equality type, applies either :tacn:`injection` or :tacn:`discriminate`. If :tacn:`injection` is applicable, the intropattern is used on the hypotheses generated by :tacn:`injection`. If the @@ -241,16 +241,16 @@ These patterns can be used when the hypothesis is an equality: **Other patterns** -* :n:`*` - introduces one or more quantified variables from the result +* :n:`*` — introduces one or more quantified variables from the result until there are no more quantified variables. :ref:`Example <intropattern_star_ex>` -* :n:`**` - introduces one or more quantified variables or hypotheses from the result until there are +* :n:`**` — introduces one or more quantified variables or hypotheses from the result until there are no more quantified variables or implications (:g:`->`). :g:`intros **` is equivalent to :g:`intros`. :ref:`Example <intropattern_2stars_ex>` -* :n:`@simple_intropattern_closed {* % @term}` - first applies each of the terms +* :n:`@simple_intropattern_closed {* % @term}` — first applies each of the terms with the :tacn:`apply ... 
in` tactic on the hypothesis to be introduced, then it uses :n:`@simple_intropattern_closed`. :ref:`Example <intropattern_injection_ex>` @@ -261,7 +261,7 @@ These patterns can be used when the hypothesis is an equality: conjunctive pattern that doesn't give enough simple patterns to match all the arguments in the constructor. If set (the default), |Coq| generates additional names to match the number of arguments. - Unsetting the option will put the additional hypotheses in the goal instead, behavior that is more + Unsetting the flag will put the additional hypotheses in the goal instead, behavior that is more similar to |SSR|'s intro patterns. .. deprecated:: 8.10 @@ -477,7 +477,7 @@ that occurrences have to be selected in the hypotheses named :token:`ident`. If no numbers are given for hypothesis :token:`ident`, then all the occurrences of :token:`term` in the hypothesis are selected. If numbers are given, they refer to occurrences of :token:`term` when the term is printed -using option :flag:`Printing All`, counting from left to right. In particular, +using the :flag:`Printing All` flag, counting from left to right. In particular, occurrences of :token:`term` in implicit arguments (see :ref:`ImplicitArguments`) or coercions (see :ref:`Coercions`) are counted. @@ -804,11 +804,11 @@ Applying theorems component of the tuple matches the goal, it excludes components whose statement would result in applying a universal lemma of the form ``forall A, ... -> A``. Excluding this kind of lemma can be avoided by - setting the following option: + setting the following flag: .. flag:: Universal Lemma Under Conjunction - This option, which preserves compatibility with versions of Coq prior to + This flag, which preserves compatibility with versions of Coq prior to 8.4, is also available for :n:`apply @term in @ident` (see :tacn:`apply ... in`). ..
tacn:: apply @term in @ident @@ -1409,7 +1409,7 @@ Controlling the proof flow While the different variants of :tacn:`assert` expect that no existential variables are generated by the tactic, :tacn:`eassert` removes this constraint. - This allows not to specify the asserted statement completeley before starting + This lets you avoid specifying the asserted statement completely before starting to prove it. .. tacv:: pose proof @term {? as @simple_intropattern} @@ -1527,7 +1527,7 @@ name of the variable (here :g:`n`) is chosen based on :g:`T`. This is equivalent to :n:`generalize @term` but it generalizes only over the specified occurrences of :n:`@term` (counting from left to right on the - expression printed using option :flag:`Printing All`). + expression printed using the :flag:`Printing All` flag). .. tacv:: generalize @term as @ident @@ -1555,8 +1555,8 @@ name of the variable (here :g:`n`) is chosen based on :g:`T`. :name: instantiate The instantiate tactic refines (see :tacn:`refine`) an existential variable - :n:`@ident` with the term :n:`@term`. It is equivalent to only [ident]: - :n:`refine @term` (preferred alternative). + :n:`@ident` with the term :n:`@term`. It is equivalent to + :n:`only [ident]: refine @term` (preferred alternative). .. note:: To be able to refer to an existential variable by name, the user must have given the name explicitly (see :ref:`Existential-Variables`). @@ -2008,7 +2008,7 @@ analysis on inductive or co-inductive objects (see :ref:`inductive-definitions`) .. coqtop:: reset all - Lemma le_minus : forall n:nat, n < 1 -> n = 0. + Lemma lt_1_r : forall n:nat, n < 1 -> n = 0. intros n H ; induction H. Here we did not get any information on the indexes to help fulfill @@ -2020,7 +2020,7 @@ analysis on inductive or co-inductive objects (see :ref:`inductive-definitions`) .. coqtop:: reset all Require Import Coq.Program.Equality. - Lemma le_minus : forall n:nat, n < 1 -> n = 0. + Lemma lt_1_r : forall n:nat, n < 1 -> n = 0. 
intros n H ; dependent induction H. The subgoal is cleaned up as the tactic tries to automatically @@ -2300,16 +2300,16 @@ and an explanation of the underlying technique. .. flag:: Structural Injection - This option ensure that :n:`injection @term` erases the original hypothesis + This flag ensures that :n:`injection @term` erases the original hypothesis and leaves the generated equalities in the context rather than putting them as antecedents of the current goal, as if giving :n:`injection @term as` - (with an empty list of names). This option is off by default. + (with an empty list of names). This flag is off by default. .. flag:: Keep Proof Equalities By default, :tacn:`injection` only creates new equalities between :n:`@term`\s whose type is in sort :g:`Type` or :g:`Set`, thus implementing a special - behavior for objects that are proofs of a statement in :g:`Prop`. This option + behavior for objects that are proofs of a statement in :g:`Prop`. This flag controls this behavior. .. tacn:: inversion @ident @@ -2691,7 +2691,7 @@ simply :g:`t=u` dropping the implicit type of :g:`t` and :g:`u`. This tactic applies to any goal. The type of :token:`term` must have the form - ``forall (x``:sub:`1` ``:A``:sub:`1` ``) ... (x``:sub:`n` ``:A``:sub:`n` ``). eq term``:sub:`1` ``term``:sub:`2` ``.`` + ``forall (x``:sub:`1` ``:A``:sub:`1` ``) ... (x``:sub:`n` ``:A``:sub:`n` ``), eq term``:sub:`1` ``term``:sub:`2` ``.`` where :g:`eq` is the Leibniz equality or a registered setoid equality. @@ -2862,26 +2862,26 @@ simply :g:`t=u` dropping the implicit type of :g:`t` and :g:`u`. .. flag:: Regular Subst Tactic - This option controls the behavior of :tacn:`subst`. When it is + This flag controls the behavior of :tacn:`subst`. 
When it is activated (it is by default), :tacn:`subst` also deals with the following corner cases: + A context with ordered hypotheses :n:`@ident`:sub:`1` :n:`= @ident`:sub:`2` and :n:`@ident`:sub:`1` :n:`= t`, or :n:`t′ = @ident`:sub:`1`` with `t′` not a variable, and no other hypotheses of the form :n:`@ident`:sub:`2` :n:`= u` - or :n:`u = @ident`:sub:`2`; without the option, a second call to + or :n:`u = @ident`:sub:`2`; without the flag, a second call to subst would be necessary to replace :n:`@ident`:sub:`2` by `t` or `t′` respectively. - + The presence of a recursive equation which without the option would + + The presence of a recursive equation which without the flag would be a cause of failure of :tacn:`subst`. + A context with cyclic dependencies as with hypotheses :n:`@ident`:sub:`1` :n:`= f @ident`:sub:`2` and :n:`@ident`:sub:`2` :n:`= g @ident`:sub:`1` which without the - option would be a cause of failure of :tacn:`subst`. + flag would be a cause of failure of :tacn:`subst`. Additionally, it prevents a local definition such as :n:`@ident := t` to be unfolded which otherwise it would exceptionally unfold in configurations containing hypotheses of the form :n:`@ident = u`, or :n:`u′ = @ident` with `u′` not a variable. Finally, it preserves the initial order of - hypotheses, which without the option it may break. + hypotheses, which without the flag it may break. default. @@ -3005,7 +3005,7 @@ the conversion in hypotheses :n:`{+ @ident}`. flags are either ``beta``, ``delta``, ``match``, ``fix``, ``cofix``, ``iota`` or ``zeta``. The ``iota`` flag is a shorthand for ``match``, ``fix`` and ``cofix``. 
The ``delta`` flag itself can be refined into - :n:`delta {+ @qualid}` or :n:`delta -{+ @qualid}`, restricting in the first + :n:`delta [ {+ @qualid} ]` or :n:`delta - [ {+ @qualid} ]`, restricting in the first case the constants to unfold to the constants listed, and restricting in the second case the constant to unfold to all but the ones explicitly mentioned. Notice that the ``delta`` flag does not apply to variables bound by a let-in @@ -3049,18 +3049,18 @@ the conversion in hypotheses :n:`{+ @ident}`. This is a synonym for ``lazy beta delta iota zeta``. -.. tacv:: compute {+ @qualid} - cbv {+ @qualid} +.. tacv:: compute [ {+ @qualid} ] + cbv [ {+ @qualid} ] These are synonyms of :n:`cbv beta delta {+ @qualid} iota zeta`. -.. tacv:: compute -{+ @qualid} - cbv -{+ @qualid} +.. tacv:: compute - [ {+ @qualid} ] + cbv - [ {+ @qualid} ] These are synonyms of :n:`cbv beta delta -{+ @qualid} iota zeta`. -.. tacv:: lazy {+ @qualid} - lazy -{+ @qualid} +.. tacv:: lazy [ {+ @qualid} ] + lazy - [ {+ @qualid} ] These are respectively synonyms of :n:`lazy beta delta {+ @qualid} iota zeta` and :n:`lazy beta delta -{+ @qualid} iota zeta`. @@ -3071,7 +3071,7 @@ the conversion in hypotheses :n:`{+ @ident}`. This tactic evaluates the goal using the optimized call-by-value evaluation bytecode-based virtual machine described in :cite:`CompiledStrongReduction`. This algorithm is dramatically more efficient than the algorithm used for the - ``cbv`` tactic, but it cannot be fine-tuned. It is specially interesting for + :tacn:`cbv` tactic, but it cannot be fine-tuned. It is especially interesting for full evaluation of algebraic objects. This includes the case of reflection-based tactics. @@ -3080,14 +3080,14 @@ the conversion in hypotheses :n:`{+ @ident}`. This tactic evaluates the goal by compilation to OCaml as described in :cite:`FullReduction`. If Coq is running in native code, it can be - typically two to five times faster than ``vm_compute``. 
Note however that the + typically two to five times faster than :tacn:`vm_compute`. Note however that the compilation cost is higher, so it is worth using only for intensive computations. .. flag:: NativeCompute Profiling - On Linux, if you have the ``perf`` profiler installed, this option makes - it possible to profile ``native_compute`` evaluations. + On Linux, if you have the ``perf`` profiler installed, this flag makes + it possible to profile :tacn:`native_compute` evaluations. .. opt:: NativeCompute Profile Filename @string :name: NativeCompute Profile Filename @@ -3097,13 +3097,13 @@ the conversion in hypotheses :n:`{+ @ident}`. will contain extra characters to avoid overwriting an existing file; that filename is reported to the user. That means you can individually profile multiple uses of - ``native_compute`` in a script. From the Linux command line, run ``perf report`` + :tacn:`native_compute` in a script. From the Linux command line, run ``perf report`` on the profile file to see the results. Consult the ``perf`` documentation for more details. .. flag:: Debug Cbv - This option makes :tacn:`cbv` (and its derivative :tacn:`compute`) print + This flag makes :tacn:`cbv` (and its derivative :tacn:`compute`) print information about the constants it encounters and the unfolding decisions it makes. @@ -3153,14 +3153,15 @@ the conversion in hypotheses :n:`{+ @ident}`. use the name of the constant the (co)fixpoint comes from instead of the (co)fixpoint definition in recursive calls. - The ``cbn`` tactic is claimed to be a more principled, faster and more - predictable replacement for ``simpl``. + The :tacn:`cbn` tactic is claimed to be a more principled, faster and more + predictable replacement for :tacn:`simpl`. - The ``cbn`` tactic accepts the same flags as ``cbv`` and ``lazy``. 
The - behavior of both ``simpl`` and ``cbn`` can be tuned using the - Arguments vernacular command as follows: + The :tacn:`cbn` tactic accepts the same flags as :tacn:`cbv` and + :tacn:`lazy`. The behavior of both :tacn:`simpl` and :tacn:`cbn` + can be tuned using the Arguments vernacular command as follows: - + A constant can be marked to be never unfolded by ``cbn`` or ``simpl``: + + A constant can be marked to be never unfolded by :tacn:`cbn` or + :tacn:`simpl`: .. example:: @@ -3169,7 +3170,7 @@ the conversion in hypotheses :n:`{+ @ident}`. Arguments minus n m : simpl never. After that command an expression like :g:`(minus (S x) y)` is left - untouched by the tactics ``cbn`` and ``simpl``. + untouched by the tactics :tacn:`cbn` and :tacn:`simpl`. + A constant can be marked to be unfolded only if applied to enough arguments. The number of arguments required can be specified using the @@ -3184,7 +3185,7 @@ the conversion in hypotheses :n:`{+ @ident}`. Notation "f \o g" := (fcomp f g) (at level 50). After that command the expression :g:`(f \o g)` is left untouched by - ``simpl`` while :g:`((f \o g) t)` is reduced to :g:`(f (g t))`. + :tacn:`simpl` while :g:`((f \o g) t)` is reduced to :g:`(f (g t))`. The same mechanism can be used to make a constant volatile, i.e. always unfolded. @@ -3206,7 +3207,7 @@ the conversion in hypotheses :n:`{+ @ident}`. Arguments minus !n !m. After that command, the expression :g:`(minus (S x) y)` is left untouched - by ``simpl``, while :g:`(minus (S x) (S y))` is reduced to :g:`(minus x y)`. + by :tacn:`simpl`, while :g:`(minus (S x) (S y))` is reduced to :g:`(minus x y)`. + A special heuristic to determine if a constant has to be unfolded can be activated with the following command: @@ -3222,25 +3223,25 @@ the conversion in hypotheses :n:`{+ @ident}`. :g:`(minus (S (S x)) (S y))` is simplified to :g:`(minus (S x) y)` even if an extra simplification is possible. 
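The ``simpl nomatch`` heuristic just described can be sketched in a minimal script; hedged: it uses the standard-library :g:`Nat.sub` directly rather than the ``minus`` alias used in the text, and the goal is invented for illustration:

```coq
Arguments Nat.sub n m : simpl nomatch.

Goal forall x y, Nat.sub (S (S x)) (S y) = x.
Proof.
  intros x y.
  (* with the nomatch heuristic, simplification stops at
     Nat.sub (S x) y instead of exposing a partially
     reduced match over y *)
  simpl.
Abort.
```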
- In detail, the tactic ``simpl`` first applies :math:`\beta`:math:`\iota`-reduction. Then, it - expands transparent constants and tries to reduce further using :math:`\beta`:math:`\iota`- - reduction. But, when no :math:`\iota` rule is applied after unfolding then - :math:`\delta`-reductions are not applied. For instance trying to use ``simpl`` on + In detail, the tactic :tacn:`simpl` first applies :math:`\beta`:math:`\iota`-reduction. Then, it + expands transparent constants and tries to reduce further using :math:`\beta`:math:`\iota`-reduction. + But, when no :math:`\iota` rule is applied after unfolding then + :math:`\delta`-reductions are not applied. For instance trying to use :tacn:`simpl` on :g:`(plus n O) = n` changes nothing. Notice that only transparent constants whose name can be reused in the - recursive calls are possibly unfolded by ``simpl``. For instance a + recursive calls are possibly unfolded by :tacn:`simpl`. For instance a constant defined by :g:`plus' := plus` is possibly unfolded and reused in the recursive calls, but a constant such as :g:`succ := plus (S O)` is - never unfolded. This is the main difference between ``simpl`` and ``cbn``. - The tactic ``cbn`` reduces whenever it will be able to reuse it or not: + never unfolded. This is the main difference between :tacn:`simpl` and :tacn:`cbn`. + The tactic :tacn:`cbn` unfolds a constant whether or not its name can be reused in the recursive calls: :g:`succ t` is reduced to :g:`S t`. -.. tacv:: cbn {+ @qualid} - cbn -{+ @qualid} +.. tacv:: cbn [ {+ @qualid} ] + cbn - [ {+ @qualid} ] - These are respectively synonyms of :n:`cbn beta delta {+ @qualid} iota zeta` - and :n:`cbn beta delta -{+ @qualid} iota zeta` (see :tacn:`cbn`). + These are respectively synonyms of :n:`cbn beta delta [ {+ @qualid} ] iota zeta` + and :n:`cbn beta delta - [ {+ @qualid} ] iota zeta` (see :tacn:`cbn`). .. tacv:: simpl @pattern @@ -3249,7 +3250,7 @@ the conversion in hypotheses :n:`{+ @ident}`. ..
tacv:: simpl @pattern at {+ @num} - This applies ``simpl`` only to the :n:`{+ @num}` occurrences of the subterms + This applies :tacn:`simpl` only to the :n:`{+ @num}` occurrences of the subterms matching :n:`@pattern` in the current goal. .. exn:: Too few occurrences. @@ -3265,12 +3266,12 @@ the conversion in hypotheses :n:`{+ @ident}`. .. tacv:: simpl @qualid at {+ @num} simpl @string at {+ @num} - This applies ``simpl`` only to the :n:`{+ @num}` applicative subterms whose + This applies :tacn:`simpl` only to the :n:`{+ @num}` applicative subterms whose head occurrence is :n:`@qualid` (or :n:`@string`). .. flag:: Debug RAKAM - This option makes :tacn:`cbn` print various debugging information. + This flag makes :tacn:`cbn` print various debugging information. ``RAKAM`` is the Refolding Algebraic Krivine Abstract Machine. .. tacn:: unfold @qualid @@ -3547,7 +3548,7 @@ Automation Info Trivial Debug Trivial - These options enable printing of informative or debug information for + These flags enable printing of informative or debug information for the :tacn:`auto` and :tacn:`trivial` tactics. .. tacn:: eauto @@ -3575,7 +3576,7 @@ Automation The various options for :tacn:`eauto` are the same as for :tacn:`auto`. - :tacn:`eauto` also obeys the following options: + :tacn:`eauto` also obeys the following flags: .. flag:: Info Eauto Debug Eauto @@ -3719,7 +3720,7 @@ automatically created. .. cmdv:: Local Hint @hint_definition : {+ @ident} This is used to declare hints that must not be exported to the other modules - that require and import the current module. Inside a section, the option + that require and import the current module. Inside a section, the flag Local is useless since hints do not survive anyway to the closure of sections. @@ -3960,6 +3961,9 @@ At Coq startup, only the core database is nonempty and can be used. :fset: internal database for the implementation of the ``FSets`` library. 
+:ordered_type: lemmas about ordered types (as defined in the legacy ``OrderedType`` module), + mainly used in the ``FSets`` and ``FMaps`` libraries. + You are advised not to put your own hints in the core database, but use one or several databases specific to your development. @@ -4001,8 +4005,8 @@ use one or several databases specific to your development. This vernacular command adds the terms :n:`{+ @term}` (their types must be equalities) in the rewriting bases :n:`{+ @ident}` with the default orientation - (left to right). Notice that the rewriting bases are distinct from the ``auto`` - hint bases and thatauto does not take them into account. + (left to right). Notice that the rewriting bases are distinct from the :tacn:`auto` + hint bases and that :tacn:`auto` does not take them into account. This command is synchronous with the section mechanism (see :ref:`section-mechanism`): when closing a section, all aliases created by ``Hint Rewrite`` in that @@ -4192,7 +4196,7 @@ some incompatibilities. .. flag:: Intuition Negation Unfolding Controls whether :tacn:`intuition` unfolds inner negations which do not need - to be unfolded. This option is on by default. + to be unfolded. This flag is on by default. .. tacn:: rtauto :name: rtauto @@ -4233,7 +4237,13 @@ some incompatibilities. .. tacv:: firstorder using {+ @qualid} - Adds lemmas :n:`{+ @qualid}` to the proof-search environment. If :n:`@qualid` + .. deprecated:: 8.3 + + Use the syntax below instead (with commas). + +.. tacv:: firstorder using {+, @qualid} + + Adds lemmas :n:`{+, @qualid}` to the proof-search environment. If :n:`@qualid` refers to an inductive type, it is the collection of its constructors which are added to the proof-search environment. @@ -4242,7 +4252,7 @@ some incompatibilities. Adds lemmas from :tacn:`auto` hint bases :n:`{+ @ident}` to the proof-search environment. -.. tacv:: firstorder @tactic using {+ @qualid} with {+ @ident} +.. 
tacv:: firstorder @tactic using {+, @qualid} with {+ @ident} This combines the effects of the different variants of :tacn:`firstorder`. @@ -4312,7 +4322,7 @@ some incompatibilities. .. flag:: Congruence Verbose - This option makes :tacn:`congruence` print debug information. + This flag makes :tacn:`congruence` print debug information. Checking properties of terms @@ -4549,7 +4559,7 @@ Inversion .. tacv:: functional inversion @num - This does the same thing as :n:`intros until @num` folowed by + This does the same thing as :n:`intros until @num` followed by :n:`functional inversion @ident` where :token:`ident` is the identifier for the last introduced hypothesis. @@ -4565,8 +4575,8 @@ Inversion Classical tactics ----------------- -In order to ease the proving process, when the Classical module is -loaded. A few more tactics are available. Make sure to load the module +In order to ease the proving process, when the ``Classical`` module is +loaded, a few more tactics are available. Make sure to load the module using the ``Require Import`` command. .. tacn:: classical_left @@ -4623,7 +4633,7 @@ Automating The tactic :tacn:`omega`, due to Pierre Crégut, is an automatic decision procedure for Presburger arithmetic. It solves quantifier-free - formulas built with `~`, `\/`, `/\`, `->` on top of equalities, + formulas built with `~`, `\\/`, `/\\`, `->` on top of equalities, inequalities and disequalities on both the type :g:`nat` of natural numbers and :g:`Z` of binary integers. This tactic must be loaded by the command ``Require Import Omega``. See the additional documentation about omega diff --git a/doc/sphinx/proof-engine/vernacular-commands.rst b/doc/sphinx/proof-engine/vernacular-commands.rst index 5f3e82938d..89b24ea8a3 100644 --- a/doc/sphinx/proof-engine/vernacular-commands.rst +++ b/doc/sphinx/proof-engine/vernacular-commands.rst @@ -195,7 +195,7 @@ Requests to the environment (see Section :ref:`invocation-of-tactics`). -.. cmd:: Eval @redexpr in @term +.. 
cmd:: Eval @red_expr in @term This command performs the specified reduction on :n:`@term`, and displays the resulting term with its type. The term to be reduced may depend on @@ -627,6 +627,7 @@ file is a particular case of module called *library file*. as ``Export``. .. cmdv:: From @dirpath Require @qualid + :name: From ... Require ... This command acts as :cmd:`Require`, but picks any library whose absolute name is of the form :n:`@dirpath.@dirpath’.@qualid` @@ -870,26 +871,6 @@ interactively, they cannot be part of a vernacular file loaded via have to undo some extra commands and end on a state :n:`@num′ ≤ @num` if necessary. - .. cmdv:: Backtrack @num @num @num - :name: Backtrack - - .. deprecated:: 8.4 - - :cmd:`Backtrack` is a *deprecated* form of - :cmd:`BackTo` which allows explicitly manipulating the proof environment. The - three numbers represent the following: - - + *first number* : State label to reach, as for :cmd:`BackTo`. - + *second number* : *Proof state number* to unbury once aborts have been done. - |Coq| will compute the number of :cmd:`Undo` to perform (see Chapter :ref:`proofhandling`). - + *third number* : Number of :cmd:`Abort` to perform, i.e. the number of currently - opened nested proofs that must be canceled (see Chapter :ref:`proofhandling`). - - .. exn:: Invalid backtrack. - - The destination state label is unknown. - - .. _quitting-and-debugging: Quitting and debugging @@ -981,7 +962,7 @@ Controlling display .. flag:: Silent - This option controls the normal displaying. + This flag controls the normal displaying. .. opt:: Warnings "{+, {? {| - | + } } @ident }" :name: Warnings @@ -996,7 +977,7 @@ Controlling display .. flag:: Search Output Name Only - This option restricts the output of search commands to identifier names; + This flag restricts the output of search commands to identifier names; turning it on causes invocations of :cmd:`Search`, :cmd:`SearchHead`, :cmd:`SearchPattern`, :cmd:`SearchRewrite` etc. 
to omit types from their output, printing only identifiers. @@ -1017,7 +998,7 @@ Controlling display .. flag:: Printing Compact Contexts - This option controls the compact display mode for goals contexts. When on, + This flag controls the compact display mode for goals contexts. When on, the printer tries to reduce the vertical size of goals contexts by putting several variables (even if of different types) on the same line provided it does not exceed the printing width (see :opt:`Printing Width`). At the time @@ -1025,14 +1006,15 @@ Controlling display .. flag:: Printing Unfocused - This option controls whether unfocused goals are displayed. Such goals are + This flag controls whether unfocused goals are displayed. Such goals are created by focusing other goals with bullets (see :ref:`bullets` or :ref:`curly braces <curly-braces>`). It is off by default. .. flag:: Printing Dependent Evars Line - This option controls the printing of the “(dependent evars: …)” line when - ``-emacs`` is passed. + This flag controls the printing of the “(dependent evars: …)” information + after each tactic. The information is used by the Prooftree tool in Proof + General. (https://askra.de/software/prooftree) .. _vernac-controlling-the-reduction-strategies: @@ -1164,7 +1146,7 @@ described first. Print all the currently non-transparent strategies. -.. cmd:: Declare Reduction @ident := @redexpr +.. cmd:: Declare Reduction @ident := @red_expr This command allows giving a short name to a reduction expression, for instance ``lazy beta delta [foo bar]``. This short name can then be used @@ -1176,7 +1158,7 @@ described first. functor applications will be rejected if these declarations are not local. The name :n:`@ident` cannot be used directly as an Ltac tactic, but nothing prevents the user from also performing a - :n:`Ltac @ident := @redexpr`. + :n:`Ltac @ident := @red_expr`. .. 
seealso:: :ref:`performingcomputations` @@ -1224,6 +1206,79 @@ Controlling the locality of commands occurs in a section. The :cmd:`Set` and :cmd:`Unset` commands belong to this category. +.. _controlling-typing-flags: + +Controlling Typing Flags +---------------------------- + +.. flag:: Guard Checking + + This flag can be used to enable/disable the guard checking of + fixpoints. Warning: this can break the consistency of the system, use at your + own risk. Decreasing argument can still be specified: the decrease is not checked + anymore but it still affects the reduction of the term. Unchecked fixpoints are + printed by :cmd:`Print Assumptions`. + +.. flag:: Positivity Checking + + This flag can be used to enable/disable the positivity checking of inductive + types and the productivity checking of coinductive types. Warning: this can + break the consistency of the system, use at your own risk. Unchecked + (co)inductive types are printed by :cmd:`Print Assumptions`. + +.. flag:: Universe Checking + + This flag can be used to enable/disable the checking of universes, providing a + form of "type in type". Warning: this breaks the consistency of the system, use + at your own risk. Constants relying on "type in type" are printed by + :cmd:`Print Assumptions`. It has the same effect as `-type-in-type` command line + argument (see :ref:`command-line-options`). + +.. cmd:: Print Typing Flags + + Print the status of the three typing flags: guard checking, positivity checking + and universe checking. + +.. example:: + + .. coqtop:: all reset + + Unset Guard Checking. + + Print Typing Flags. + + Fixpoint f (n : nat) : False + := f n. + + Fixpoint ackermann (m n : nat) {struct m} : nat := + match m with + | 0 => S n + | S m => + match n with + | 0 => ackermann m 1 + | S n => ackermann m (ackermann (S m) n) + end + end. + + Print Assumptions ackermann. + + Note that the proper way to define the Ackermann function is to use + an inner fixpoint: + + .. 
coqtop:: all reset + + Fixpoint ack m := + fix ackm n := + match m with + | 0 => S n + | S m' => + match n with + | 0 => ack m' 1 + | S n' => ack m' (ackm n') + end + end. + + .. _internal-registration-commands: Internal registration commands diff --git a/doc/sphinx/refman-preamble.rst b/doc/sphinx/refman-preamble.rst index c662028773..de95eda989 100644 --- a/doc/sphinx/refman-preamble.rst +++ b/doc/sphinx/refman-preamble.rst @@ -70,7 +70,11 @@ .. |p_i| replace:: `p`\ :math:`_{i}` .. |p_n| replace:: `p`\ :math:`_{n}` .. |Program| replace:: :strong:`Program` +.. |Prop| replace:: :math:`\Prop` +.. |SProp| replace:: :math:`\SProp` +.. |Set| replace:: :math:`\Set` .. |SSR| replace:: :smallcaps:`SSReflect` +.. |Type| replace:: :math:`\Type` .. |t_1| replace:: `t`\ :math:`_{1}` .. |t_i| replace:: `t`\ :math:`_{i}` .. |t_m| replace:: `t`\ :math:`_{m}` diff --git a/doc/sphinx/user-extensions/proof-schemes.rst b/doc/sphinx/user-extensions/proof-schemes.rst index 3a12ee288a..5b0b3c51b0 100644 --- a/doc/sphinx/user-extensions/proof-schemes.rst +++ b/doc/sphinx/user-extensions/proof-schemes.rst @@ -128,7 +128,7 @@ Automatic declaration of schemes .. warning:: - You have to be careful with this option since Coq may now reject well-defined + You have to be careful with these flags since Coq may now reject well-defined inductive types because it cannot compute a Boolean equality for them. .. 
flag:: Rewriting Schemes diff --git a/doc/sphinx/user-extensions/syntax-extensions.rst b/doc/sphinx/user-extensions/syntax-extensions.rst index fd315c097d..dbe714c388 100644 --- a/doc/sphinx/user-extensions/syntax-extensions.rst +++ b/doc/sphinx/user-extensions/syntax-extensions.rst @@ -267,31 +267,30 @@ The second, more powerful control on printing is by using the format A *format* is an extension of the string denoting the notation with the possible following elements delimited by single quotes: -- extra spaces are translated into simple spaces +- tokens of the form ``'/ '`` are translated into breaking points. If + a line break occurs, an indentation of the number of spaces appearing + after the “``/``” is applied (no indentation in the example) -- tokens of the form ``'/ '`` are translated into breaking point, in - case a line break occurs, an indentation of the number of spaces after - the “ ``/``” is applied (2 spaces in the given example) - -- token of the form ``'//'`` force writing on a new line +- tokens of the form ``'//'`` force writing on a new line - well-bracketed pairs of tokens of the form ``'[ '`` and ``']'`` are - translated into printing boxes; in case a line break occurs, an extra - indentation of the number of spaces given after the “ ``[``” is applied - (4 spaces in the example) + translated into printing boxes; if there is a line break, an extra + indentation of the number of spaces after the “``[``” is applied - well-bracketed pairs of tokens of the form ``'[hv '`` and ``']'`` are translated into horizontal-or-else-vertical printing boxes; if the content of the box does not fit on a single line, then every breaking - point forces a newline and an extra indentation of the number of - spaces given after the “ ``[``” is applied at the beginning of each - newline (3 spaces in the example) + point forces a new line and an extra indentation of the number of + spaces after the “``[hv``” is applied at the beginning of each new line - well-bracketed pairs of tokens of
the form ``'[v '`` and ``']'`` are translated into vertical printing boxes; every breaking point forces a - newline, even if the line is large enough to display the whole content - of the box, and an extra indentation of the number of spaces given - after the “``[``” is applied at the beginning of each newline + new line, even if the line is large enough to display the whole content + of the box, and an extra indentation of the number of spaces + after the “``[v``” is applied at the beginning of each new line (3 spaces + in the example) + +- extra spaces in other tokens are preserved in the output Notations disappear when a section is closed. No typing of the denoted expression is performed at definition time. Type checking is done only @@ -592,7 +591,7 @@ placeholder being the nesting point. In the innermost occurrence of the nested iterating pattern, the second placeholder is finally filled with the terminating expression. -In the example above, the iterator :math:`φ([~]_E , [~]_I)` is :math:`cons [~]_E [~]_I` +In the example above, the iterator :math:`φ([~]_E , [~]_I)` is :math:`cons [~]_E\, [~]_I` and the terminating expression is ``nil``. Here are other examples: .. coqtop:: in @@ -751,12 +750,12 @@ level is otherwise given explicitly by using the syntax Levels are cumulative: a notation at level ``n`` of which the left end is a term shall use rules at level less than ``n`` to parse this -sub-term. More precisely, it shall use rules at level strictly less +subterm. More precisely, it shall use rules at level strictly less than ``n`` if the rule is declared with ``right associativity`` and rules at level less or equal than ``n`` if the rule is declared with ``left associativity``. 
Similarly, a notation at level ``n`` of which the right end is a term shall use by default rules at level strictly -less than ``n`` to parse this sub-term if the rule is declared left +less than ``n`` to parse this subterm if the rule is declared left associative and rules at level less or equal than ``n`` if the rule is declared right associative. This is what happens for instance in the rule @@ -872,7 +871,7 @@ notations are given below. The optional :production:`scope` is described in : Inductive `ind_body` [`decl_notation`] with … with `ind_body` [`decl_notation`]. : CoInductive `ind_body` [`decl_notation`] with … with `ind_body` [`decl_notation`]. : Fixpoint `fix_body` [`decl_notation`] with … with `fix_body` [`decl_notation`]. - : CoFixpoint `cofix_body` [`decl_notation`] with … with `cofix_body` [`decl_notation`]. + : CoFixpoint `fix_body` [`decl_notation`] with … with `fix_body` [`decl_notation`]. : [Local] Declare Custom Entry `ident`. decl_notation : [where `string` := `term` [: `scope`] and … and `string` := `term` [: `scope`]]. modifiers : `modifier`, … , `modifier` @@ -1443,8 +1442,8 @@ Numeral notations of the resulting term will be refreshed. Note that only fully-reduced ground terms (terms containing only - function application, constructors, inductive type families, and - primitive integers) will be considered for printing. + function application, constructors, inductive type families, + sorts, and primitive integers) will be considered for printing. .. cmdv:: Numeral Notation @ident__1 @ident__2 @ident__3 : @scope (warning after @num). @@ -1593,8 +1592,8 @@ String notations of the resulting term will be refreshed. Note that only fully-reduced ground terms (terms containing only - function application, constructors, inductive type families, and - primitive integers) will be considered for printing. + function application, constructors, inductive type families, + sorts, and primitive integers) will be considered for printing. .. 
exn:: Cannot interpret this string as a value of type @type diff --git a/doc/stdlib/hidden-files b/doc/stdlib/hidden-files index b25104ddb9..a2bc90ffc0 100644 --- a/doc/stdlib/hidden-files +++ b/doc/stdlib/hidden-files @@ -12,6 +12,8 @@ plugins/extraction/ExtrHaskellZInteger.v plugins/extraction/ExtrHaskellZNum.v plugins/extraction/ExtrOcamlBasic.v plugins/extraction/ExtrOcamlBigIntConv.v +plugins/extraction/ExtrOCamlInt63.v +plugins/extraction/ExtrOCamlFloats.v plugins/extraction/ExtrOcamlIntConv.v plugins/extraction/ExtrOcamlNatBigInt.v plugins/extraction/ExtrOcamlNatInt.v @@ -41,6 +43,11 @@ plugins/micromega/Tauto.v plugins/micromega/VarMap.v plugins/micromega/ZCoeff.v plugins/micromega/ZMicromega.v +plugins/micromega/ZifyInst.v +plugins/micromega/ZifyBool.v +plugins/micromega/ZifyComparison.v +plugins/micromega/ZifyClasses.v +plugins/micromega/Zify.v plugins/nsatz/Nsatz.v plugins/omega/Omega.v plugins/omega/OmegaLemmas.v @@ -76,3 +83,5 @@ plugins/setoid_ring/Rings_Q.v plugins/setoid_ring/Rings_R.v plugins/setoid_ring/Rings_Z.v plugins/setoid_ring/ZArithRing.v +plugins/ssr/ssrunder.v +plugins/ssr/ssrsetoid.v diff --git a/doc/stdlib/index-list.html.template b/doc/stdlib/index-list.html.template index a561de1d0c..21b5678a85 100644 --- a/doc/stdlib/index-list.html.template +++ b/doc/stdlib/index-list.html.template @@ -68,6 +68,7 @@ through the <tt>Require Import</tt> command.</p> theories/Logic/WKL.v theories/Logic/FinFun.v theories/Logic/PropFacts.v + theories/Logic/HLevels.v </dd> <dt> <b>Structures</b>: @@ -181,14 +182,12 @@ through the <tt>Require Import</tt> command.</p> theories/ZArith/Zhints.v (theories/ZArith/ZArith_base.v) theories/ZArith/Zcomplements.v - theories/ZArith/Zsqrt_compat.v theories/ZArith/Zpow_def.v theories/ZArith/Zpow_alt.v theories/ZArith/Zpower.v theories/ZArith/Zdiv.v theories/ZArith/Zquot.v theories/ZArith/Zeuclid.v - theories/ZArith/Zlogarithm.v (theories/ZArith/ZArith.v) theories/ZArith/Zgcd_alt.v theories/ZArith/Zwf.v @@ -329,6 
+328,19 @@ through the <tt>Require Import</tt> command.</p> theories/Numbers/Integer/Binary/ZBinary.v theories/Numbers/Integer/NatPairs/ZNatPairs.v </dd> + + <dt> <b> Floats</b>: + Floating-point arithmetic + </dt> + <dd> + theories/Floats/FloatClass.v + theories/Floats/PrimFloat.v + theories/Floats/SpecFloat.v + theories/Floats/FloatOps.v + theories/Floats/FloatAxioms.v + theories/Floats/FloatLemmas.v + (theories/Floats/Floats.v) + </dd> </dl> </dd> @@ -516,7 +528,13 @@ through the <tt>Require Import</tt> command.</p> </dt> <dd> theories/Reals/Rdefinitions.v + theories/Reals/ConstructiveReals.v + theories/Reals/ConstructiveRealsMorphisms.v + theories/Reals/ConstructiveCauchyReals.v + theories/Reals/ConstructiveCauchyRealsMult.v + theories/Reals/ClassicalDedekindReals.v theories/Reals/Raxioms.v + theories/Reals/ConstructiveRealsLUB.v theories/Reals/RIneq.v theories/Reals/DiscrR.v theories/Reals/ROrderedType.v @@ -561,6 +579,7 @@ through the <tt>Require Import</tt> command.</p> theories/Reals/Ranalysis5.v theories/Reals/Ranalysis_reg.v theories/Reals/Rcomplete.v + theories/Reals/ConstructiveRcomplete.v theories/Reals/RiemannInt.v theories/Reals/RiemannInt_SF.v theories/Reals/Rpow_def.v @@ -601,6 +620,7 @@ through the <tt>Require Import</tt> command.</p> </dt> <dd> plugins/ssrmatching/ssrmatching.v + plugins/ssr/ssrclasses.v plugins/ssr/ssreflect.v plugins/ssr/ssrbool.v plugins/ssr/ssrfun.v @@ -619,8 +639,8 @@ through the <tt>Require Import</tt> command.</p> </dt> <dd> theories/Compat/AdmitAxiom.v - theories/Compat/Coq88.v theories/Compat/Coq89.v theories/Compat/Coq810.v + theories/Compat/Coq811.v </dd> </dl> diff --git a/doc/tools/docgram/README.md b/doc/tools/docgram/README.md index 98fdc38ca7..a0a1809133 100644 --- a/doc/tools/docgram/README.md +++ b/doc/tools/docgram/README.md @@ -186,6 +186,9 @@ that appear in the specified production: | WITH <newprod> ``` +* `PRINT` <nonterminal> - prints the nonterminal definition at that point in + applying the edits. 
Most useful when the edits get a bit complicated to follow. + * (any other nonterminal name) - adds a new production (and possibly a new nonterminal) to the grammar. diff --git a/doc/tools/docgram/common.edit_mlg b/doc/tools/docgram/common.edit_mlg index ea94e21ff3..06b49a0a18 100644 --- a/doc/tools/docgram/common.edit_mlg +++ b/doc/tools/docgram/common.edit_mlg @@ -12,41 +12,10 @@ DOC_GRAMMAR -(* additional nts to be spliced *) - -LEFTQMARK: [ -| "?" -] - -SPLICE: [ -| LEFTQMARK -] - -hyp: [ -| var -] - -tactic_then_gen: [ -| EDIT ADD_OPT tactic_expr5 "|" tactic_then_gen -| EDIT ADD_OPT tactic_expr5 ".." tactic_then_last -] - -SPLICE: [ -| hyp -| identref -| pattern_ident (* depends on previous LEFTQMARK splice todo: improve *) -| constr_eval (* splices as multiple prods *) -| tactic_then_last (* todo: dependency on c.edit_mlg edit?? really useful? *) -| Prim.name -| ltac_selector -| Constr.ident -| tactic_then_locality (* todo: cleanup *) -| attribute_list -] - +(* renames to eliminate qualified names + put other renames at the end *) RENAME: [ (* map missing names for rhs *) -| _binders binders | Constr.constr term | Constr.constr_pattern constr_pattern | Constr.global global @@ -54,58 +23,52 @@ RENAME: [ | Constr.lconstr_pattern lconstr_pattern | G_vernac.query_command query_command | G_vernac.section_subset_expr section_subset_expr -| nonsimple_intropattern intropattern | Pltac.tactic tactic -| Pltac.tactic_expr ltac_expr +| Pltac.tactic_expr tactic_expr5 | Prim.ident ident | Prim.reference reference | Pvernac.Vernac_.main_entry vernac_control | Tactic.tactic tactic -| tactic3 ltac_expr3 (* todo: can't figure out how this gets mapped by coqpp *) -| tactic1 ltac_expr1 (* todo: can't figure out how this gets mapped by coqpp *) -| tactic0 ltac_expr0 (* todo: can't figure out how this gets mapped by coqpp *) -| tactic_expr5 ltac_expr -| tactic_expr4 ltac_expr4 -| tactic_expr3 ltac_expr3 -| tactic_expr2 ltac_expr2 -| tactic_expr1 ltac_expr1 -| tactic_expr0 
ltac_expr0 - - (* elementary renaming/OCaml-defined productions *) -| clause clause_dft_concl -| in_clause' in_clause -| l_constr lconstr (* todo: should delete the production *) (* SSR *) +(* | G_vernac.def_body def_body | Pcoq.Constr.constr term | Prim.by_notation by_notation | Prim.identref ident | Prim.natural natural +*) | Vernac.rec_definition rec_definition - (* rename on lhs *) -| intropatterns intropattern_list_opt | Constr.closed_binder closed_binder +] - (* historical name *) -| constr term +(* written in OCaml *) +impl_ident_head: [ +| "{" ident ] +lpar_id_coloneq: [ +| "(" ident; ":=" +] + +(* lookahead symbols *) DELETE: [ | check_for_coloneq -| impl_ident_head | local_test_lpar_id_colon | lookup_at_as_comma | only_starredidentrefs | test_bracket_ident -| test_lpar_id_coloneq +| test_lpar_id_colon +| test_lpar_id_coloneq (* todo: grammar seems incorrect, repeats the "(" IDENT ":=" *) | test_lpar_id_rpar | test_lpar_idnum_coloneq +| test_nospace_pipe_closedcurly | test_show_goal (* SSR *) (* | ssr_null_entry *) +(* | ssrtermkind (* todo: rename as "test..." *) | term_annotation (* todo: rename as "test..." *) | test_idcomma @@ -122,48 +85,410 @@ DELETE: [ | test_ident_no_do | ssrdoarg (* todo: this and the next one should be removed from the grammar? *) | ssrseqdir +*) + +(* unused *) +| constr_comma_sequence' +| auto_using' +| constr_may_eval ] -ident: [ -| DELETE IDENT ssr_null_entry +(* ssrintrosarg: [ | DELETENT ] *) + +(* additional nts to be spliced *) + +hyp: [ +| var ] -natural: [ -| DELETE _natural +empty: [ +| ] +or_opt: [ +| "|" +| empty +] - (* added productions *) +ltac_expr_opt: [ +| tactic_expr5 +| empty +] -empty: [ (* todo: (bug) this is getting converted to empty -> empty *) -| +ltac_expr_opt_list_or: [ +| ltac_expr_opt_list_or "|" ltac_expr_opt +| ltac_expr_opt ] -lpar_id_coloneq: [ -| "(" IDENT; ":=" +tactic_then_gen: [ +| EDIT ADD_OPT tactic_expr5 "|" tactic_then_gen +| EDIT ADD_OPT tactic_expr5 ".." 
tactic_then_last +| REPLACE OPT tactic_expr5 ".." tactic_then_last +| WITH ltac_expr_opt ".." or_opt ltac_expr_opt_list_or ] -name_colon: [ -| IDENT; ":" -| "_" ":" (* todo: should "_" be a keyword or an identifier? *) +ltac_expr_opt_list_or: [ +| ltac_expr_opt_list_or "|" OPT tactic_expr5 +| OPT tactic_expr5 ] -int: [ (* todo: probably should be NUMERAL *) -| integer +reference: [ | DELETENT ] + +reference: [ +| qualid ] -command_entry: [ -| noedit_mode +fullyqualid: [ | DELETENT ] + +fullyqualid: [ +| qualid +] + + +field: [ | DELETENT ] + +field: [ +| "." ident +] + +basequalid: [ +| REPLACE ident fields +| WITH qualid field +] + +fields: [ | DELETENT ] + +dirpath: [ +| REPLACE ident LIST0 field +| WITH ident +| dirpath field ] binders: [ | DELETE Pcoq.Constr.binders (* todo: not sure why there are 2 "binders:" *) ] -(* edits to simplify *) +lconstr: [ +| DELETE l_constr +] + +let_type_cstr: [ +| DELETE OPT [ ":" lconstr ] +| rec_type_cstr +] + +as_name_opt: [ +| "as" name +| empty +] + +(* rename here because we want to use "return_type" for something else *) +RENAME: [ +| return_type as_return_type_opt +] + +as_return_type_opt: [ +| REPLACE OPT [ OPT [ "as" name ] case_type ] +| WITH as_name_opt case_type +| empty +] + +case_item: [ +| REPLACE operconstr100 OPT [ "as" name ] OPT [ "in" pattern200 ] +| WITH operconstr100 as_name_opt OPT [ "in" pattern200 ] +] + +as_dirpath: [ +| DELETE OPT [ "as" dirpath ] +| "as" dirpath +| empty +] + +binder_constr: [ +| MOVETO term_let "let" name binders let_type_cstr ":=" operconstr200 "in" operconstr200 +| MOVETO term_let "let" single_fix "in" operconstr200 +| MOVETO term_let "let" [ "(" LIST0 name SEP "," ")" | "()" ] as_return_type_opt ":=" operconstr200 "in" operconstr200 +| MOVETO term_let "let" "'" pattern200 ":=" operconstr200 "in" operconstr200 +| MOVETO term_let "let" "'" pattern200 ":=" operconstr200 case_type "in" operconstr200 +| MOVETO term_let "let" "'" pattern200 "in" pattern200 ":=" operconstr200 case_type 
"in" operconstr200 +] + +term_let: [ +| REPLACE "let" name binders let_type_cstr ":=" operconstr200 "in" operconstr200 +| WITH "let" name let_type_cstr ":=" operconstr200 "in" operconstr200 +| "let" name LIST1 binder let_type_cstr ":=" operconstr200 "in" operconstr200 +(* Don't need to document that "( )" is equivalent to "()" *) +| REPLACE "let" [ "(" LIST0 name SEP "," ")" | "()" ] as_return_type_opt ":=" operconstr200 "in" operconstr200 +| WITH "let" [ "(" LIST1 name SEP "," ")" | "()" ] as_return_type_opt ":=" operconstr200 "in" operconstr200 +| REPLACE "let" "'" pattern200 ":=" operconstr200 "in" operconstr200 +| WITH "let" "'" pattern200 ":=" operconstr200 OPT case_type "in" operconstr200 +| DELETE "let" "'" pattern200 ":=" operconstr200 case_type "in" operconstr200 +] + +atomic_constr: [ +(* @Zimmi48: "string" used only for notations, but keep to be consistent with patterns *) +(* | DELETE string *) +| REPLACE "?" "[" ident "]" +| WITH "?[" ident "]" +| MOVETO term_evar "?[" ident "]" +| REPLACE "?" "[" pattern_ident "]" +| WITH "?[" pattern_ident "]" +| MOVETO term_evar "?[" pattern_ident "]" +| MOVETO term_evar pattern_ident evar_instance +] + +tactic_expr0: [ +| REPLACE "[" ">" tactic_then_gen "]" +| WITH "[>" tactic_then_gen "]" +] + +operconstr100: [ +| MOVETO term_cast operconstr99 "<:" operconstr200 +| MOVETO term_cast operconstr99 "<<:" operconstr200 +| MOVETO term_cast operconstr99 ":" operconstr200 +| MOVETO term_cast operconstr99 ":>" +] + +operconstr10: [ +(* fixme: add in as a prodn somewhere *) +| MOVETO dangling_pattern_extension_rule "@" pattern_identref LIST1 identref +| DELETE dangling_pattern_extension_rule +] + +operconstr9: [ +(* @Zimmi48: Special token .. is for use in the Notation command. (see bug_3304.v) *) +| DELETE ".." operconstr0 ".." 
+] + +arg_list: [ +| arg_list appl_arg +| appl_arg +] + +arg_list_opt: [ +| arg_list +| empty +] + +operconstr1: [ +| REPLACE operconstr0 ".(" global LIST0 appl_arg ")" +| WITH operconstr0 ".(" global arg_list_opt ")" +| MOVETO term_projection operconstr0 ".(" global arg_list_opt ")" +| MOVETO term_projection operconstr0 ".(" "@" global LIST0 ( operconstr9 ) ")" +] + +operconstr0: [ +(* @Zimmi48: This rule is a hack, according to Hugo, and should not be shown in the manual. *) +| DELETE "{" binder_constr "}" +] + +single_fix: [ +| DELETE fix_kw fix_decl +| "fix" fix_decl +| "cofix" fix_decl +] + +fix_kw: [ | DELETENT ] + +binders_fixannot: [ +(* +| REPLACE impl_name_head impl_ident_tail binders_fixannot +| WITH impl_name_head impl_ident_tail "}" binders_fixannot +*) +(* Omit this complex detail. See https://github.com/coq/coq/pull/10614#discussion_r344118146 *) +| DELETE impl_name_head impl_ident_tail binders_fixannot + +| DELETE fixannot +| DELETE binder binders_fixannot +| DELETE (* empty *) + +| LIST0 binder OPT fixannot +] -ltac_expr1: [ +impl_ident_tail: [ +| DELETENT +(* +| REPLACE "}" +| WITH empty +| REPLACE LIST1 name ":" lconstr "}" +| WITH LIST1 name ":" lconstr +| REPLACE LIST1 name "}" +| WITH LIST1 name +| REPLACE ":" lconstr "}" +| WITH ":" lconstr +*) +] + +of_type_with_opt_coercion: [ +| DELETE ":>" ">" +| DELETE ":" ">" ">" +| DELETE ":" ">" +] + +binder: [ +| DELETE name +] + +open_binders: [ +| REPLACE name LIST0 name ":" lconstr +| WITH LIST1 name ":" lconstr +(* @Zimmi48: Special token .. is for use in the Notation command. (see bug_3304.v) *) +| DELETE name ".." 
name +| REPLACE name LIST0 name binders +| WITH LIST1 binder +| DELETE closed_binder binders +] + +closed_binder: [ +| name + +| REPLACE "(" name LIST1 name ":" lconstr ")" +| WITH "(" LIST1 name ":" lconstr ")" +| DELETE "(" name ":" lconstr ")" + +| DELETE "(" name ":=" lconstr ")" +| REPLACE "(" name ":" lconstr ":=" lconstr ")" +| WITH "(" name rec_type_cstr ":=" lconstr ")" + +| DELETE "{" name LIST1 name "}" + +| REPLACE "{" name LIST1 name ":" lconstr "}" +| WITH "{" LIST1 name rec_type_cstr "}" +| DELETE "{" name ":" lconstr "}" +] + +typeclass_constraint: [ +| EDIT ADD_OPT "!" operconstr200 +] + +(* ?? From the grammar, Prim.name seems to be only "_" but ident is also accepted "*) +Prim.name: [ +| REPLACE "_" +| WITH name +] + +oriented_rewriter: [ +| REPLACE orient_rw rewriter +| WITH orient rewriter +] + +DELETE: [ +| orient_rw +] + +pattern1_list: [ +| pattern1_list pattern1 +| pattern1 +] + +pattern1_list_opt: [ +| pattern1_list +| empty +] + +pattern10: [ +| REPLACE pattern1 LIST1 pattern1 +| WITH LIST1 pattern1 +| REPLACE "@" reference LIST0 pattern1 +| WITH "@" reference pattern1_list_opt +] + +pattern0: [ +| REPLACE "(" pattern200 ")" +| WITH "(" LIST1 pattern200 SEP "|" ")" +| DELETE "(" pattern200 "|" LIST1 pattern200 SEP "|" ")" +] + +patterns_comma: [ +| patterns_comma "," pattern100 +| pattern100 +] + +patterns_comma_list_or: [ +| patterns_comma_list_or "|" patterns_comma +| patterns_comma +] + +eqn: [ +| REPLACE LIST1 mult_pattern SEP "|" "=>" lconstr +| WITH patterns_comma_list_or "=>" lconstr +] + +record_patterns: [ +| REPLACE record_pattern ";" record_patterns +| WITH record_patterns ";" record_pattern +] + +(* todo: binders should be binders_opt *) + + +(* lexer stuff *) +bigint: [ +| DELETE NUMERAL +| num +] + +ident: [ +| DELETENT +] + +IDENT: [ +| ident +] + +integer: [ | DELETENT ] +RENAME: [ +| integer int (* todo: review uses in .mlg files, some should be "natural" *) +] + +LEFTQMARK: [ +| "?" 
+] + +natural: [ | DELETENT ] +natural: [ +| num (* todo: or should it be "nat"? *) +] + +NUMERAL: [ +| numeral +] + +(* todo: QUOTATION only used in a test suite .mlg files, is it documented/useful? *) + +string: [ | DELETENT ] +STRING: [ +| string +] + + +(* todo: is "bigint" useful?? *) +(* todo: "check_int" in g_prim.mlg should be "check_num" *) + + (* added productions *) + +name_colon: [ +| name ":" +] + +command_entry: [ +| noedit_mode +] + +tactic_expr1: [ | EDIT match_key ADD_OPT "reverse" "goal" "with" match_context_list "end" +| MOVETO ltac_match_goal match_key OPT "reverse" "goal" "with" match_context_list "end" +| MOVETO ltac_match_term match_key tactic_expr5 "with" match_list "end" +] + +DELETE: [ +| tactic_then_locality +] + +tactic_expr4: [ +| REPLACE tactic_expr3 ";" tactic_then_gen "]" +| WITH tactic_expr3 ";" "[" tactic_then_gen "]" +| tactic_expr3 ";" "[" ">" tactic_then_gen "]" ] match_context_list: [ @@ -180,35 +505,37 @@ match_list: [ | EDIT ADD_OPT "|" LIST1 match_rule SEP "|" ] +match_rule: [ +| REPLACE match_pattern "=>" tactic_expr5 +| WITH [ match_pattern | "_" ] "=>" tactic_expr5 +| DELETE "_" "=>" tactic_expr5 +] + selector_body: [ | REPLACE range_selector_or_nth (* depends on whether range_selector_or_nth is deleted first *) | WITH LIST1 range_selector SEP "," ] -range_selector_or_nth: [ -| DELETENT -] +range_selector_or_nth: [ | DELETENT ] simple_tactic: [ | DELETE "intros" | REPLACE "intros" ne_intropatterns -| WITH "intros" intropattern_list_opt +| WITH "intros" intropatterns | DELETE "eintros" | REPLACE "eintros" ne_intropatterns -| WITH "eintros" intropattern_list_opt +| WITH "eintros" intropatterns ] -intropattern_list_opt: [ +intropatterns: [ | DELETE LIST0 intropattern -| intropattern_list_opt intropattern +| intropatterns intropattern | empty ] - -ne_intropatterns: [ -| DELETENT (* todo: don't use DELETENT for this *) -] +(* todo: don't use DELETENT for this *) +ne_intropatterns: [ | DELETENT ] or_and_intropattern: [ @@ 
-216,5 +543,181 @@ or_and_intropattern: [ | DELETE "(" simple_intropattern ")" | REPLACE "(" simple_intropattern "," LIST1 simple_intropattern SEP "," ")" | WITH "(" LIST0 simple_intropattern SEP "," ")" -| EDIT "[" USE_NT intropattern_or LIST1 intropattern_list_opt SEP "|" "]" +| EDIT "[" USE_NT intropattern_or LIST1 intropatterns SEP "|" "]" ] + +bar_cbrace: [ +| REPLACE "|" "}" +| WITH "|}" +] + +(* todo: is this really correct? Search for "Pvernac.register_proof_mode" *) +(* consider tactic_command vs tac2mode *) +vernac_aux: [ +| tactic_mode "." +] + +SPLICE: [ +| noedit_mode +| command_entry +| bigint +| match_list +| match_context_list +| IDENT +| LEFTQMARK +| natural +| NUMERAL +| STRING +| hyp +| var +| identref +| pattern_ident +| constr_eval (* splices as multiple prods *) +| tactic_then_last (* todo: dependency on c.edit_mlg edit?? really useful? *) +| Prim.name +| ltac_selector +| Constr.ident +| attribute_list +| operconstr99 +| operconstr90 +| operconstr9 +| operconstr8 +| pattern200 +| pattern99 +| pattern90 +| ne_lstring +| ne_string +| lstring +| basequalid +| fullyqualid +| global +| reference +| bar_cbrace +| lconstr +| impl_name_head + +(* +| ast_closure_term +| ast_closure_lterm +| ident_no_do +| ssrterm +| ssrtacarg +| ssrtac3arg +| ssrtclarg +| ssrhyp +| ssrhoi_hyp +| ssrhoi_id +| ssrindex +| ssrhpats +| ssrhpats_nobs +| ssrfwdid +| ssrmovearg +| ssrcasearg +| ssrrwargs +| ssrviewposspc +| ssrpatternarg +| ssr_elsepat +| ssr_mpat +| ssrunlockargs +| ssrcofixfwd +| ssrfixfwd +| ssrhavefwdwbinders +| ssripats_ne +| ssrparentacarg +| ssrposefwd +*) + +| preident +| lpar_id_coloneq +| binders +| casted_constr +| check_module_types +| constr_pattern +| decl_sep +| function_rec_definition_loc (* loses funind annotation *) +| glob +| glob_constr_with_bindings +| id_or_meta +| lconstr_pattern +| lglob +| ltac_tacdef_body +| mode +| mult_pattern +| open_constr +| option_table +| record_declaration +| register_type_token +| tactic +| uconstr +| 
impl_ident_head +| argument_spec +| at_level +| branches +| check_module_type +| decorated_vernac +| ext_module_expr +| ext_module_type +| pattern_identref +| test +| binder_constr +| atomic_constr +| let_type_cstr +| name_colon +| closed_binder +| binders_fixannot +] + +RENAME: [ +| clause clause_dft_concl +| in_clause' in_clause + +| tactic3 ltac_expr3 (* todo: can't figure out how this gets mapped by coqpp *) +| tactic1 ltac_expr1 (* todo: can't figure out how this gets mapped by coqpp *) +| tactic0 ltac_expr0 (* todo: can't figure out how this gets mapped by coqpp *) +| tactic_expr5 ltac_expr +| tactic_expr4 ltac_expr4 +| tactic_expr3 ltac_expr3 +| tactic_expr2 ltac_expr2 +| tactic_expr1 ltac_expr1 +| tactic_expr0 ltac_expr0 + +(* | nonsimple_intropattern intropattern (* ltac2 *) *) +| intropatterns intropattern_list_opt + +| operconstr200 term (* historical name *) +| operconstr100 term100 +| operconstr10 term10 +| operconstr1 term1 +| operconstr0 term0 +| pattern100 pattern +| match_constr term_match +(*| impl_ident_tail impl_ident*) +| ssexpr35 ssexpr (* strange in mlg, ssexpr50 is after this *) + +| tactic_then_gen multi_goal_tactics +| selector only_selector +| selector_body selector +| input_fun fun_var +| match_hyps match_hyp + +| BULLET bullet +| nat_or_var num_or_var +| fix_decl fix_body +| instance universe_annot_opt +| rec_type_cstr colon_term_opt +| fix_constr term_fix +| constr term1_extended +| case_type return_type +| appl_arg arg +| record_patterns record_patterns_opt +| universe_increment universe_increment_opt +| rec_definition fix_definition +| corec_definition cofix_definition +| record_field_instance field_def +| record_fields_instance fields_def +| evar_instance evar_bindings_opt +| inst evar_binding +] + + +(* todo: ssrreflect*.rst ref to fix_body is incorrect *) diff --git a/doc/tools/docgram/doc_grammar.ml b/doc/tools/docgram/doc_grammar.ml index 9f0a1942f9..70976e705e 100644 --- a/doc/tools/docgram/doc_grammar.ml +++ 
b/doc/tools/docgram/doc_grammar.ml @@ -48,6 +48,9 @@ let default_args = { verify = false; } +let start_symbols = ["vernac_toplevel"] +let tokens = [ "bullet"; "ident"; "int"; "num"; "numeral"; "string" ] + (* translated symbols *) type doc_symbol = @@ -128,8 +131,8 @@ module DocGram = struct g_update_prods g nt' (oprods @ nprods) (* add a new nonterminal after "ins_after" None means insert at the beginning *) - let g_add_after g ins_after nt prods = - if NTMap.mem nt !g.map then raise Duplicate; (* don't update the nt if it's already present *) + let g_add_after g ?(update=true) ins_after nt prods = + if (not update) && NTMap.mem nt !g.map then raise Duplicate; (* don't update the nt if it's already present *) let rec insert_nt order res = match ins_after, order with | None, _ -> nt :: order @@ -143,6 +146,11 @@ module DocGram = struct g := { order = insert_nt !g.order []; map = NTMap.add nt prods !g.map } + let g_add_prod_after g ins_after nt prod = + let prods = try NTMap.find nt !g.map with Not_found -> [] in + (* todo: add check for duplicates *) + g_add_after g ~update:true ins_after nt (prods @ [prod]) + (* replace the map and order *) let g_reorder g map order = let order_nts = StringSet.of_list order in @@ -188,13 +196,13 @@ let rec output_prod plist need_semi = function | Slist0sep (sym, sep) -> sprintf "LIST0 %s SEP %s" (prod_to_str ~plist [sym]) (prod_to_str ~plist [sep]) | Sopt sym -> sprintf "OPT %s" (prod_to_str ~plist [sym]) | Sparen sym_list -> sprintf "( %s )" (prod_to_str sym_list) - | Sprod sym_list -> + | Sprod sym_list_list -> sprintf "[ %s ]" (String.concat " " (List.mapi (fun i r -> let prod = (prod_to_str r) in let sep = if i = 0 then "" else if prod <> "" then "| " else "|" in sprintf "%s%s" sep prod) - sym_list)) + sym_list_list)) | Sedit s -> sprintf "%s" s (* todo: make PLUGIN info output conditional on the set of prods? 
*) | Sedit2 ("PLUGIN", plugin) -> @@ -213,6 +221,8 @@ let rec output_prod plist need_semi = function and prod_to_str_r plist prod = match prod with + | Sterm s :: Snterm "ident" :: tl when List.mem s ["?"; "."] && plist -> + (sprintf "%s`ident`" s) :: (prod_to_str_r plist tl) | p :: tl -> let need_semi = match prod with @@ -258,6 +268,15 @@ and output_sep sep = and prod_to_prodn prod = String.concat " " (List.map output_prodn prod) +let pr_prods nt prods = (* duplicative *) + Printf.printf "%s: [\n" nt; + List.iter (fun prod -> + let str = prod_to_str ~plist:false prod in + let pfx = if str = "" then "|" else "| " in + Printf.printf "%s%s\n" pfx str) + prods; + Printf.printf "]\n\n" + type fmt = [`MLG | `PRODLIST | `PRODN ] (* print a subset of the grammar with nts in the specified order *) @@ -313,6 +332,8 @@ let cvt_ext prod = in List.map from_ext prod +let keywords = ref StringSet.empty + let rec cvt_gram_sym = function | GSymbString s -> Sterm s | GSymbQualid (s, level) -> @@ -352,6 +373,10 @@ and cvt_gram_sym_list l = (Sedit2 ("NOTE", s2)) :: cvt_gram_sym_list tl | GSymbQualid ("USE_NT", _) :: GSymbQualid (s2, l) :: tl -> (Sedit2 ("USE_NT", s2)) :: cvt_gram_sym_list tl + | GSymbString s :: tl -> + (* todo: not seeing "(bfs)" here for some reason *) + keywords := StringSet.add s !keywords; + cvt_gram_sym (GSymbString s) :: cvt_gram_sym_list tl | hd :: tl -> cvt_gram_sym hd :: cvt_gram_sym_list tl | [] -> [] @@ -453,6 +478,7 @@ let plugin_regex = Str.regexp "^plugins/\\([a-zA-Z0-9_]+\\)/" let read_mlg is_edit ast file level_renames symdef_map = let res = ref [] in + let locals = ref StringSet.empty in let add_prods nt prods = if not is_edit then add_symdef nt file symdef_map; @@ -478,6 +504,8 @@ let read_mlg is_edit ast file level_renames symdef_map = let len = List.length ent.gentry_rules in List.iteri (fun i rule -> let nt = ent.gentry_name in + if not (List.mem nt grammar_ext.gramext_globals) then + locals := StringSet.add nt !locals; let level = (get_label 
rule.grule_label) in let level = if level <> "" then level else match ent.gentry_pos with @@ -528,7 +556,7 @@ let read_mlg is_edit ast file level_renames symdef_map = in List.iter prod_loop ast; - List.rev !res + List.rev !res, !locals let dir s = "doc/tools/docgram/" ^ s @@ -536,7 +564,8 @@ let read_mlg_edit file = let fdir = dir file in let level_renames = ref StringMap.empty in (* ignored *) let symdef_map = ref StringMap.empty in (* ignored *) - read_mlg true (parse_file fdir) fdir level_renames symdef_map + let prods, _ = read_mlg true (parse_file fdir) fdir level_renames symdef_map in + prods let add_rule g nt prods file = let ent = try NTMap.find nt !g.map with Not_found -> [] in @@ -555,9 +584,12 @@ let read_mlg_files g args symdef_map = let last_autoloaded = List.hd (List.rev autoloaded_mlgs) in List.iter (fun file -> (* does nt renaming, deletion and splicing *) - let rules = read_mlg false (parse_file file) file level_renames symdef_map in + let rules, locals = read_mlg false (parse_file file) file level_renames symdef_map in let numprods = List.fold_left (fun num rule -> let nt, prods = rule in + if NTMap.mem nt !g.map && (StringSet.mem nt locals) && + StringSet.cardinal (StringSet.of_list (StringMap.find nt !symdef_map)) > 1 then + warn "%s: local nonterminal '%s' already defined\n" file nt; add_rule g nt prods file; num + List.length prods) 0 rules @@ -572,18 +604,74 @@ let read_mlg_files g args symdef_map = !level_renames + (* get the nt's in the production, preserving order, don't worry about dups *) + let nts_in_prod prod = + let rec traverse = function + | Sterm s -> [] + | Snterm s -> if List.mem s tokens then [] else [s] + | Slist1 sym + | Slist0 sym + | Sopt sym + -> traverse sym + | Slist1sep (sym, sep) + | Slist0sep (sym, sep) + -> traverse sym @ (traverse sep) + | Sparen sym_list -> List.concat (List.map traverse sym_list) + | Sprod sym_list_list -> List.concat (List.map (fun l -> List.concat (List.map traverse l)) sym_list_list) + | Sedit _ 
+ | Sedit2 _ -> [] + in + List.rev (List.concat (List.map traverse prod)) + +let get_refdef_nts g = + let rec get_nts_r refd defd bindings = + match bindings with + | [] -> refd, defd + | (nt, prods) :: tl -> + get_nts_r (List.fold_left (fun res prod -> + StringSet.union res (StringSet.of_list (nts_in_prod prod))) + refd prods) + (StringSet.add nt defd) tl + in + let toks = StringSet.of_list tokens in + get_nts_r toks toks (NTMap.bindings !g.map) + + (*** global editing ops ***) -let create_edit_map edits = +let create_edit_map g op edits = let rec aux edits map = match edits with | [] -> map | edit :: tl -> let (key, binding) = edit in + let all_nts_ref, all_nts_def = get_refdef_nts g in + (match op with + (* todo: messages should tell you which edit file causes the error *) + | "SPLICE" -> + if not (StringSet.mem key all_nts_def) then + error "Undefined nt `%s` in SPLICE\n" key + | "DELETE" -> + if not (StringSet.mem key all_nts_ref || (StringSet.mem key all_nts_def)) then + error "Unused/undefined nt `%s` in DELETE\n" key; + | "RENAME" -> + if not (StringSet.mem key all_nts_ref || (StringSet.mem key all_nts_def)) then + error "Unused/undefined nt `%s` in RENAME\n" key; +(* todo: could not get the following code to type check + (match binding with + | _ :: Snterm new_nt :: _ -> + if not (StringSet.mem new_nt all_nts_ref) then + error "nt `%s` already exists in %s\n" new_nt op + | _ -> ()) +*) + | _ -> ()); aux tl (StringMap.add key binding map) in aux edits StringMap.empty +let remove_Sedit2 p = + List.filter (fun sym -> match sym with | Sedit2 _ -> false | _ -> true) p + (* edit a production: rename nonterminals, drop nonterminals, substitute nonterminals *) let rec edit_prod g top edit_map prod = let edit_nt edit_map sym0 nt = @@ -596,8 +684,8 @@ let rec edit_prod g top edit_map prod = try let splice_prods = NTMap.find nt !g.map in match splice_prods with | [] -> assert false - | [p] -> List.rev p - | _ -> [Sprod splice_prods] + | [p] -> List.rev (remove_Sedit2
p) + | _ -> [Sprod (List.map remove_Sedit2 splice_prods)] with Not_found -> error "Missing nt '%s' for splice\n" nt; [Snterm nt] end | _ -> [Snterm binding] @@ -654,16 +742,22 @@ and edit_rule g edit_map nt rule = (*** splice: replace a reference to a nonterminal with its definition ***) -(* todo: create a better splice routine, handle recursive case *) +(* todo: create a better splice routine *) let apply_splice g splice_map = - StringMap.iter (fun nt b -> - if not (NTMap.mem nt !g.map) then - error "Unknown nt '%s' for apply_splice\n" nt) - splice_map; List.iter (fun b -> - let (nt, prods) = b in - let (nt', prods) = edit_rule g splice_map nt prods in - g_update_prods g nt' prods) + let (nt0, prods0) = b in + let rec splice_loop nt prods cnt = + let max_cnt = 10 in + let (nt', prods') = edit_rule g splice_map nt prods in + if cnt > max_cnt then + error "Splice for '%s' not done after %d iterations\n" nt0 max_cnt; + if nt' = nt && prods' = prods then + (nt', prods') + else + splice_loop nt' prods' (cnt+1) + in + let (nt', prods') = splice_loop nt0 prods0 0 in + g_update_prods g nt' prods') (NTMap.bindings !g.map); List.iter (fun b -> let (nt, op) = b in @@ -678,7 +772,7 @@ let find_first edit prods nt = let rec find_first_r edit prods nt i = match prods with | [] -> - error "Can't find '%s' in REPLACE for '%s'\n" (prod_to_str edit) nt; + error "Can't find '%s' in edit for '%s'\n" (prod_to_str edit) nt; raise Not_found | prod :: tl -> if ematch prod edit then i @@ -906,7 +1000,7 @@ let edit_all_prods g op eprods = op (prod_to_str eprod) num; aux tl res in - let map = create_edit_map (aux eprods []) in + let map = create_edit_map g op (aux eprods []) in if op = "SPLICE" then apply_splice g map else (* RENAME/DELETE *) @@ -960,6 +1054,13 @@ let edit_single_prod g edit0 prods nt = in edit_single_prod_r edit0 prods nt [] +let report_undef_nts g prod rec_nt = + let nts = nts_in_prod prod in + List.iter (fun nt -> + if not (NTMap.mem nt !g.map) && not (List.mem nt 
tokens) && nt <> rec_nt then + error "Undefined nonterminal `%s` in edit: %s\n" nt (prod_to_str prod)) + nts + let apply_edit_file g edits = List.iter (fun b -> let (nt, eprod) = b in @@ -970,11 +1071,26 @@ let apply_edit_file g edits = | (Snterm "DELETE" :: oprod) :: tl -> aux tl (remove_prod oprod prods nt) add_nt | (Snterm "DELETENT" :: _) :: tl -> (* note this doesn't remove references *) + if not (NTMap.mem nt !g.map) then + error "DELETENT for undefined nonterminal `%s`\n" nt; g_remove g nt; aux tl prods false + | (Snterm "MOVETO" :: Snterm nt2 :: oprod) :: tl -> + g_add_prod_after g (Some nt) nt2 oprod; + let prods' = (try + let posn = find_first oprod prods nt in + let prods = insert_after posn [[Snterm nt2]] prods in (* insert new prod *) + remove_prod oprod prods nt (* remove orig prod *) + with Not_found -> prods) + in + aux tl prods' add_nt + | (Snterm "PRINT" :: _) :: tl -> + pr_prods nt (NTMap.find nt !g.map); + aux tl prods add_nt | (Snterm "EDIT" :: oprod) :: tl -> aux tl (edit_single_prod g oprod prods nt) add_nt | (Snterm "REPLACE" :: oprod) :: (Snterm "WITH" :: rprod) :: tl -> + report_undef_nts g rprod ""; let prods' = (try let posn = find_first oprod prods nt in let prods = insert_after posn [rprod] prods in (* insert new prod *) @@ -985,10 +1101,12 @@ let apply_edit_file g edits = | (Snterm "REPLACE" :: _ as eprod) :: tl -> error "Missing WITH after '%s' in '%s'\n" (prod_to_str eprod) nt; aux tl prods add_nt + (* todo: check for unmatched editing keywords here *) | prod :: tl -> (* add a production *) if has_match prod prods then error "Duplicate production '%s' for %s\n" (prod_to_str prod) nt; + report_undef_nts g prod nt; aux tl (prods @ [prod]) add_nt in let prods, add_nt = @@ -1001,24 +1119,36 @@ let apply_edit_file g edits = (*** main routines ***) - (* get the nt's in the production, preserving order, don't worry about dups *) - let nts_in_prod prod = - let rec traverse = function - | Sterm s -> [] - | Snterm s -> [s] + (* get the 
special tokens in the grammar *) +let print_special_tokens g = + let rec traverse set = function + | Sterm s -> + let c = s.[0] in + if (c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z') then set + else StringSet.add s set + | Snterm s -> set | Slist1 sym | Slist0 sym | Sopt sym - -> traverse sym + -> traverse set sym | Slist1sep (sym, sep) | Slist0sep (sym, sep) - -> traverse sym @ (traverse sep) - | Sparen sym_list -> List.concat (List.map traverse sym_list) - | Sprod sym_list_list -> List.concat (List.map (fun l -> List.concat (List.map traverse l)) sym_list_list) + -> traverse (traverse set sym) sep + | Sparen sym_list -> traverse_prod set sym_list + | Sprod sym_list_list -> traverse_prods set sym_list_list | Sedit _ - | Sedit2 _ -> [] + | Sedit2 _ -> set + and traverse_prod set prod = List.fold_left traverse set prod + and traverse_prods set prods = List.fold_left traverse_prod set prods in - List.rev (List.concat (List.map traverse prod)) + let spec_toks = List.fold_left (fun set b -> + let nt, prods = b in + traverse_prods set prods) + StringSet.empty (NTMap.bindings !g.map) + in + Printf.printf "Special tokens:"; + StringSet.iter (fun t -> Printf.printf " %s" t) spec_toks; + Printf.printf "\n\n" (* get the transitive closure of a non-terminal excluding "stops" symbols. 
Preserve ordering to the extent possible *) @@ -1098,23 +1228,156 @@ let print_chunks g out fmt () = (*seen := StringSet.diff !seen (StringSet.of_list ssr_tops);*) (*print_chunk out g seen fmt "vernac_toplevel" ["vernac_toplevel"] [];*) +let index_of str list = + let rec index_of_r str list index = + match list with + | [] -> None + | hd :: list -> + if hd = str then Some index + else index_of_r str list (index+1) + in + index_of_r str list 0 -let start_symbols = ["vernac_toplevel"; "tactic_mode"] -let tokens = [ "BULLET"; "FIELD"; "IDENT"; "NUMERAL"; "STRING" ] (* don't report as undefined *) +exception IsNone -let report_bad_nts g file = - let rec get_nts refd defd bindings = - match bindings with - | [] -> refd, defd - | (nt, prods) :: tl -> - get_nts (List.fold_left (fun res prod -> - StringSet.union res (StringSet.of_list (nts_in_prod prod))) - refd prods) - (StringSet.add nt defd) tl +(* todo: raise exception for bad n? *) +let rec nthcdr n list = if n <= 0 then list else nthcdr (n-1) (List.tl list) + +let pfx n list = + let rec pfx_r n res = function + | item :: tl -> if n < 0 then res else pfx_r (n-1) (item :: res) tl + | [] -> res in - let all_nts_ref, all_nts_def = - get_nts (StringSet.of_list tokens) (StringSet.of_list tokens) (NTMap.bindings !g.map) in + List.rev (pfx_r n [] list) + +(* todo: adjust Makefile to include Option.ml/mli *) +let get_opt = function + | Some y -> y + | _ -> raise IsNone + +let get_range g start end_ = + let starti, endi = get_opt (index_of start !g.order), get_opt (index_of end_ !g.order) in + pfx (endi - starti) (nthcdr starti !g.order) + +let get_rangeset g start end_ = StringSet.of_list (get_range g start end_) + +let print_dominated g = + let info nt rangeset exclude = + let reachable = StringSet.of_list (nt_closure g nt exclude) in + let unreachable = StringSet.of_list (nt_closure g (List.hd start_symbols) (nt::exclude)) in + let dominated = StringSet.diff reachable unreachable in + Printf.printf "For %s, 'attribute' is: 
reachable = %b, unreachable = %b, dominated = %b\n" nt + (StringSet.mem "attribute" reachable) + (StringSet.mem "attribute" unreachable) + (StringSet.mem "attribute" dominated); + Printf.printf " rangeset = %b excluded = %b\n" + (StringSet.mem "attribute" rangeset) + (List.mem "attribute" exclude); + reachable, dominated + in + let pr3 nt rangeset reachable dominated = + let missing = StringSet.diff dominated rangeset in + if not (StringSet.is_empty missing) then begin + Printf.printf "\nMissing in range for '%s':\n" nt; + StringSet.iter (fun nt -> Printf.printf " %s\n" nt) missing + end; + + let unneeded = StringSet.diff rangeset reachable in + if not (StringSet.is_empty unneeded) then begin + Printf.printf "\nUnneeded in range for '%s':\n" nt; + StringSet.iter (fun nt -> Printf.printf " %s\n" nt) unneeded + end; + in + let pr2 nt rangeset exclude = + let reachable, dominated = info nt rangeset exclude in + pr3 nt rangeset reachable dominated + in + let pr nt end_ = pr2 nt (get_rangeset g nt end_) [] in + + let ssr_ltac = ["ssr_first_else"; "ssrmmod"; "ssrdotac"; "ssrortacarg"; + "ssrparentacarg"] in + let ssr_tac = ["ssrintrosarg"; "ssrhintarg"; "ssrtclarg"; "ssrseqarg"; "ssrmovearg"; + "ssrrpat"; "ssrclauses"; "ssrcasearg"; "ssrarg"; "ssrapplyarg"; "ssrexactarg"; + "ssrcongrarg"; "ssrterm"; "ssrrwargs"; "ssrunlockargs"; "ssrfixfwd"; "ssrcofixfwd"; + "ssrfwdid"; "ssrposefwd"; "ssrsetfwd"; "ssrdgens"; "ssrhavefwdwbinders"; "ssrhpats_nobs"; + "ssrhavefwd"; "ssrsufffwd"; "ssrwlogfwd"; "ssrhint"; "ssrclear"; "ssr_idcomma"; + "ssrrwarg"; "ssrintros_ne"; "ssrhint3arg" ] @ ssr_ltac in + let ssr_cmd = ["ssr_modlocs"; "ssr_search_arg"; "ssrhintref"; "ssrhintref_list"; + "ssrviewpos"; "ssrviewposspc"] in + let ltac = ["ltac_expr"; "ltac_expr0"; "ltac_expr1"; "ltac_expr2"; "ltac_expr3"] in + let term = ["term"; "term0"; "term1"; "term10"; "term100"; "term9"; + "pattern"; "pattern0"; "pattern1"; "pattern10"] in + + pr "term" "constr"; + + let ltac_rangeset = List.fold_left 
StringSet.union StringSet.empty
+      [(get_rangeset g "ltac_expr" "tactic_atom");
+      (get_rangeset g "toplevel_selector" "range_selector");
+      (get_rangeset g "ltac_match_term" "match_pattern");
+      (get_rangeset g "ltac_match_goal" "match_pattern_opt")] in
+  pr2 "ltac_expr" ltac_rangeset ("simple_tactic" :: ssr_tac);
+
+  let dec_vern_rangeset = get_rangeset g "decorated_vernac" "opt_coercion" in
+  let dec_vern_excl =
+    ["gallina_ext"; "command"; "tactic_mode"; "syntax"; "command_entry"] @ term @ ltac @ ssr_tac in
+  pr2 "decorated_vernac" dec_vern_rangeset dec_vern_excl;
+
+  let simp_tac_range = get_rangeset g "simple_tactic" "hypident_occ_list_comma" in
+  let simp_tac_excl = ltac @ ssr_tac in
+  pr2 "simple_tactic" simp_tac_range simp_tac_excl;
+
+  let cmd_range = get_rangeset g "command" "int_or_id_list_opt" in
+  let cmd_excl = ssr_tac @ ssr_cmd in
+  pr2 "command" cmd_range cmd_excl;
+
+  let syn_range = get_rangeset g "syntax" "constr_as_binder_kind" in
+  let syn_excl = ssr_tac @ ssr_cmd in
+  pr2 "syntax" syn_range syn_excl;
+
+  let gext_range = get_rangeset g "gallina_ext" "Structure_opt" in
+  let gext_excl = ssr_tac @ ssr_cmd in
+  pr2 "gallina_ext" gext_range gext_excl;
+
+  let qry_range = get_rangeset g "query_command" "searchabout_query_list" in
+  let qry_excl = ssr_tac @ ssr_cmd in
+  pr2 "query_command" qry_range qry_excl
+
+  (* todo: tactic_mode *)
+
+let check_range_consistency g start end_ =
+  let defined_list = get_range g start end_ in
+  let defined = StringSet.of_list defined_list in
+  let referenced = List.fold_left (fun set nt ->
+      let prods = NTMap.find nt !g.map in
+      let refs = List.concat (List.map nts_in_prod prods) in
+      StringSet.union set (StringSet.of_list refs))
+    StringSet.empty defined_list
+  in
+  let undef = StringSet.diff referenced defined in
+  let unused = StringSet.diff defined referenced in
+  if StringSet.cardinal unused > 0 || (StringSet.cardinal undef > 0) then begin
+    Printf.printf "\nFor range '%s' to '%s':\n  External reference:" start
end_; + StringSet.iter (fun nt -> Printf.printf " %s" nt) undef; + Printf.printf "\n"; + if StringSet.cardinal unused > 0 then begin + Printf.printf " Unreferenced:"; + StringSet.iter (fun nt -> Printf.printf " %s" nt) unused; + Printf.printf "\n" + end + end +(* print info on symbols with a single production of a single nonterminal *) +let check_singletons g = + NTMap.iter (fun nt prods -> + if List.length prods = 1 then + if List.length (remove_Sedit2 (List.hd prods)) = 1 then + warn "Singleton non-terminal, maybe SPLICE?: %s\n" nt + else + (*warn "Single production, maybe SPLICE?: %s\n" nt*) ()) + !g.map + +let report_bad_nts g file = + let all_nts_ref, all_nts_def = get_refdef_nts g in let undef = StringSet.diff all_nts_ref all_nts_def in List.iter (fun nt -> warn "%s: Undefined symbol '%s'\n" file nt) (StringSet.elements undef); @@ -1255,12 +1518,13 @@ let finish_with_file old_file verify = in let temp_file = (old_file ^ "_temp") in - if verify then - if (files_eq old_file temp_file || !exit_code <> 0) then - Sys.remove temp_file - else - error "%s is not current\n" old_file - else + if !exit_code <> 0 then + Sys.remove temp_file + else if verify then begin + if not (files_eq old_file temp_file) then + error "%s is not current\n" old_file; + Sys.remove temp_file + end else Sys.rename temp_file old_file let open_temp_bin file = @@ -1310,21 +1574,13 @@ let process_rst g file args seen tac_prods cmd_prods = let ig_args_regex = Str.regexp "^[ \t]*\\([a-zA-Z0-9_\\.]*\\)[ \t]*\\([a-zA-Z0-9_\\.]*\\)" in let blank_regex = Str.regexp "^[ \t]*$" in let end_prodlist_regex = Str.regexp "^[ \t]*$" in - let rec index_of_r str list index = - match list with - | [] -> None - | hd :: list -> - if hd = str then Some index - else index_of_r str list (index+1) - in - let index_of str list = index_of_r str list 0 in let getline () = let line = input_line old_rst in incr linenum; line in + (* todo: maybe pass end_index? 
*) let output_insertgram start_index end_ indent is_coq_group = - let rec nthcdr n list = if n = 0 then list else nthcdr (n-1) (List.tl list) in let rec copy_prods list = match list with | [] -> () @@ -1358,10 +1614,12 @@ let process_rst g file args seen tac_prods cmd_prods = error "%s line %d: '%s' is undefined\n" file !linenum start; if end_index = None then error "%s line %d: '%s' is undefined\n" file !linenum end_; + if start_index <> None && end_index <> None then + check_range_consistency g start end_; match start_index, end_index with | Some start_index, Some end_index -> if start_index > end_index then - error "%s line %d: '%s' must appear before '%s' in .../orderedGrammar\n" file !linenum start end_ + error "%s line %d: '%s' must appear before '%s' in orderedGrammar\n" file !linenum start end_ else begin try let line2 = getline() in @@ -1438,21 +1696,28 @@ let report_omitted_prods entries seen label split = end in - let first, last, n = List.fold_left (fun missing nt -> - let first, last, n = missing in + let first, last, n, total = List.fold_left (fun missing nt -> + let first, last, n, total = missing in if NTMap.mem nt seen then begin maybe_warn first last n; - "", "", 0 + "", "", 0, total end else - (if first = "" then nt else first), nt, n + 1) - ("", "", 0) entries in - maybe_warn first last n + (if first = "" then nt else first), nt, n + 1, total + 1) + ("", "", 0, 0) entries in + maybe_warn first last n; + if total <> 0 then + Printf.eprintf "TOTAL %ss not included = %d\n" label total let process_grammar args = let symdef_map = ref StringMap.empty in let g = ref { map = NTMap.empty; order = [] } in let level_renames = read_mlg_files g args symdef_map in + if args.verbose then begin + Printf.printf "Keywords:\n"; + StringSet.iter (fun kw -> Printf.printf "%s " kw) !keywords; + Printf.printf "\n\n"; + end; (* rename nts with levels *) List.iter (fun b -> let (nt, prod) = b in @@ -1468,6 +1733,8 @@ let process_grammar args = print_in_order out g `MLG 
!g.order StringSet.empty; close_out out; finish_with_file (dir "fullGrammar") args.verify; + if args.verbose then + print_special_tokens g; if not args.fullGrammar then begin (* do shared edits *) @@ -1512,6 +1779,8 @@ let process_grammar args = print_in_order out g `MLG !g.order StringSet.empty; close_out out; finish_with_file (dir "orderedGrammar") args.verify; + check_singletons g +(* print_dominated g*) end; if !exit_code = 0 then begin @@ -1524,6 +1793,8 @@ let process_grammar args = let seen = ref { nts=NTMap.empty; tacs=NTMap.empty; cmds=NTMap.empty } in List.iter (fun file -> process_rst g file args seen tac_prods cmd_prods) args.rst_files; report_omitted_prods !g.order !seen.nts "Nonterminal" ""; + let out = open_out (dir "updated_rsts") in + close_out out; if args.check_tacs then report_omitted_prods tac_list !seen.tacs "Tactic" "\n "; if args.check_cmds then diff --git a/doc/tools/docgram/fullGrammar b/doc/tools/docgram/fullGrammar index a83638dd73..ebaeb392a5 100644 --- a/doc/tools/docgram/fullGrammar +++ b/doc/tools/docgram/fullGrammar @@ -73,12 +73,9 @@ operconstr200: [ ] operconstr100: [ -| operconstr99 "<:" binder_constr -| operconstr99 "<:" operconstr100 -| operconstr99 "<<:" binder_constr -| operconstr99 "<<:" operconstr100 -| operconstr99 ":" binder_constr -| operconstr99 ":" operconstr100 +| operconstr99 "<:" operconstr200 +| operconstr99 "<<:" operconstr200 +| operconstr99 ":" operconstr200 | operconstr99 ":>" | operconstr99 ] @@ -126,26 +123,23 @@ operconstr0: [ ] record_declaration: [ -| record_fields +| record_fields_instance ] -record_fields: [ -| record_field_declaration ";" record_fields -| record_field_declaration +record_fields_instance: [ +| record_field_instance ";" record_fields_instance +| record_field_instance | -| record_field ";" record_fields -| record_field ";" -| record_field ] -record_field_declaration: [ +record_field_instance: [ | global binders ":=" lconstr ] binder_constr: [ | "forall" open_binders "," operconstr200 | 
"fun" open_binders "=>" operconstr200 -| "let" name binders type_cstr ":=" operconstr200 "in" operconstr200 +| "let" name binders let_type_cstr ":=" operconstr200 "in" operconstr200 | "let" single_fix "in" operconstr200 | "let" [ "(" LIST0 name SEP "," ")" | "()" ] return_type ":=" operconstr200 "in" operconstr200 | "let" "'" pattern200 ":=" operconstr200 "in" operconstr200 @@ -153,11 +147,6 @@ binder_constr: [ | "let" "'" pattern200 "in" pattern200 ":=" operconstr200 case_type "in" operconstr200 | "if" operconstr200 return_type "then" operconstr200 "else" operconstr200 | fix_constr -| "if" operconstr200 "is" ssr_dthen ssr_else (* ssr plugin *) -| "if" operconstr200 "isn't" ssr_dthen ssr_else (* ssr plugin *) -| "let" ":" ssr_mpat ":=" lconstr "in" lconstr (* ssr plugin *) -| "let" ":" ssr_mpat ":=" lconstr ssr_rtype "in" lconstr (* ssr plugin *) -| "let" ":" ssr_mpat "in" pattern200 ":=" lconstr ssr_rtype "in" lconstr (* ssr plugin *) ] appl_arg: [ @@ -213,7 +202,7 @@ fix_kw: [ ] fix_decl: [ -| identref binders_fixannot type_cstr ":=" operconstr200 +| identref binders_fixannot let_type_cstr ":=" operconstr200 ] match_constr: [ @@ -250,7 +239,6 @@ record_pattern: [ record_patterns: [ | record_pattern ";" record_patterns -| record_pattern ";" | record_pattern | ] @@ -260,8 +248,7 @@ pattern200: [ ] pattern100: [ -| pattern99 ":" binder_constr -| pattern99 ":" operconstr100 +| pattern99 ":" operconstr200 | pattern99 ] @@ -306,8 +293,6 @@ fixannot: [ | "{" "struct" identref "}" | "{" "wf" constr identref "}" | "{" "measure" constr OPT identref OPT constr "}" -| "{" "struct" name "}" -| ] impl_name_head: [ @@ -350,7 +335,6 @@ closed_binder: [ | "`(" LIST1 typeclass_constraint SEP "," ")" | "`{" LIST1 typeclass_constraint SEP "," "}" | "'" pattern0 -| [ "of" | "&" ] operconstr99 (* ssr plugin *) ] typeclass_constraint: [ @@ -360,10 +344,8 @@ typeclass_constraint: [ | operconstr200 ] -type_cstr: [ +let_type_cstr: [ | OPT [ ":" lconstr ] -| ":" lconstr -| ] preident: [ @@ 
-467,14 +449,15 @@ bigint: [ ] bar_cbrace: [ -| "|" "}" +| test_nospace_pipe_closedcurly "|" "}" ] vernac_toplevel: [ | "Drop" "." | "Quit" "." -| "Backtrack" natural natural natural "." +| "BackTo" natural "." | test_show_goal "Show" "Goal" natural "at" natural "." +| "Show" "Proof" "Diffs" OPT "removed" "." | Pvernac.Vernac_.main_entry ] @@ -560,7 +543,6 @@ command: [ | "Reset" identref | "Back" | "Back" natural -| "BackTo" natural | "Debug" "On" | "Debug" "Off" | "Declare" "Reduction" IDENT; ":=" red_expr @@ -669,14 +651,27 @@ command: [ | "Show" "Ltac" "Profile" | "Show" "Ltac" "Profile" "CutOff" int | "Show" "Ltac" "Profile" string +| "Add" "InjTyp" constr (* micromega plugin *) +| "Add" "BinOp" constr (* micromega plugin *) +| "Add" "UnOp" constr (* micromega plugin *) +| "Add" "CstOp" constr (* micromega plugin *) +| "Add" "BinRel" constr (* micromega plugin *) +| "Add" "PropOp" constr (* micromega plugin *) +| "Add" "PropUOp" constr (* micromega plugin *) +| "Add" "Spec" constr (* micromega plugin *) +| "Add" "BinOpSpec" constr (* micromega plugin *) +| "Add" "UnOpSpec" constr (* micromega plugin *) +| "Add" "Saturate" constr (* micromega plugin *) +| "Show" "Zify" "InjTyp" (* micromega plugin *) +| "Show" "Zify" "BinOp" (* micromega plugin *) +| "Show" "Zify" "UnOp" (* micromega plugin *) +| "Show" "Zify" "CstOp" (* micromega plugin *) +| "Show" "Zify" "BinRel" (* micromega plugin *) +| "Show" "Zify" "Spec" (* micromega plugin *) | "Add" "Ring" ident ":" constr OPT ring_mods (* setoid_ring plugin *) | "Print" "Rings" (* setoid_ring plugin *) | "Add" "Field" ident ":" constr OPT field_mods (* setoid_ring plugin *) | "Print" "Fields" (* setoid_ring plugin *) -| "Prenex" "Implicits" LIST1 global (* ssr plugin *) -| "Search" ssr_search_arg ssr_modlocs (* ssr plugin *) -| "Print" "Hint" "View" ssrviewpos (* ssr plugin *) -| "Hint" "View" ssrviewposspc LIST1 ssrhintref (* ssr plugin *) | "Numeral" "Notation" reference reference reference ":" ident numnotoption | 
"String" "Notation" reference reference reference ":" ident ] @@ -803,6 +798,7 @@ register_token: [ register_type_token: [ | "#int63_type" +| "#float64_type" ] register_prim_token: [ @@ -830,6 +826,24 @@ register_prim_token: [ | "#int63_lt" | "#int63_le" | "#int63_compare" +| "#float64_opp" +| "#float64_abs" +| "#float64_eq" +| "#float64_lt" +| "#float64_le" +| "#float64_compare" +| "#float64_classify" +| "#float64_add" +| "#float64_sub" +| "#float64_mul" +| "#float64_div" +| "#float64_sqrt" +| "#float64_of_int63" +| "#float64_normfr_mantissa" +| "#float64_frshiftexp" +| "#float64_ldshiftexp" +| "#float64_next_up" +| "#float64_next_down" ] thm_token: [ @@ -949,11 +963,16 @@ opt_coercion: [ ] rec_definition: [ -| ident_decl binders_fixannot type_cstr OPT [ ":=" lconstr ] decl_notation +| ident_decl binders_fixannot rec_type_cstr OPT [ ":=" lconstr ] decl_notation ] corec_definition: [ -| ident_decl binders type_cstr OPT [ ":=" lconstr ] decl_notation +| ident_decl binders rec_type_cstr OPT [ ":=" lconstr ] decl_notation +] + +rec_type_cstr: [ +| ":" lconstr +| ] scheme: [ @@ -973,6 +992,13 @@ record_field: [ | LIST0 quoted_attributes record_binder OPT [ "|" natural ] decl_notation ] +record_fields: [ +| record_field ";" record_fields +| record_field ";" +| record_field +| +] + record_binder_body: [ | binders of_type_with_opt_coercion lconstr | binders of_type_with_opt_coercion lconstr ":=" lconstr @@ -1048,7 +1074,6 @@ gallina_ext: [ | "Generalizable" [ "All" "Variables" | "No" "Variables" | [ "Variable" | "Variables" ] LIST1 identref ] | "Export" "Set" option_table option_setting | "Export" "Unset" option_table -| "Import" "Prenex" "Implicits" (* ssr plugin *) ] export_token: [ @@ -1175,21 +1200,21 @@ arguments_modifier: [ | "clear" "implicits" "and" "scopes" ] -scope: [ +scope_delimiter: [ | "%" IDENT ] argument_spec: [ -| OPT "!" name OPT scope +| OPT "!" 
name OPT scope_delimiter ] argument_spec_block: [ | argument_spec | "/" | "&" -| "(" LIST1 argument_spec ")" OPT scope -| "[" LIST1 argument_spec "]" OPT scope -| "{" LIST1 argument_spec "}" OPT scope +| "(" LIST1 argument_spec ")" OPT scope_delimiter +| "[" LIST1 argument_spec "]" OPT scope_delimiter +| "{" LIST1 argument_spec "}" OPT scope_delimiter ] more_implicits_block: [ @@ -1260,6 +1285,7 @@ printable: [ | "Coercions" | "Coercion" "Paths" class_rawexpr class_rawexpr | "Canonical" "Projections" +| "Typing" "Flags" | "Tables" | "Options" | "Hint" @@ -1339,7 +1365,7 @@ positive_search_mark: [ ] searchabout_query: [ -| positive_search_mark ne_string OPT scope +| positive_search_mark ne_string OPT scope_delimiter | positive_search_mark constr_pattern ] @@ -1725,6 +1751,9 @@ simple_tactic: [ | "psatz_R" tactic (* micromega plugin *) | "psatz_Q" int_or_var tactic (* micromega plugin *) | "psatz_Q" tactic (* micromega plugin *) +| "iter_specs" tactic (* micromega plugin *) +| "zify_op" (* micromega plugin *) +| "saturate" (* micromega plugin *) | "nsatz_compute" constr (* nsatz plugin *) | "omega" (* omega plugin *) | "omega" "with" LIST1 ident (* omega plugin *) @@ -1734,54 +1763,6 @@ simple_tactic: [ | "protect_fv" string (* setoid_ring plugin *) | "ring_lookup" tactic0 "[" LIST0 constr "]" LIST1 constr (* setoid_ring plugin *) | "field_lookup" tactic "[" LIST0 constr "]" LIST1 constr (* setoid_ring plugin *) -| "YouShouldNotTypeThis" ssrintrosarg (* ssr plugin *) -| "by" ssrhintarg (* ssr plugin *) -| "YouShouldNotTypeThis" "do" ssrdoarg (* ssr plugin *) -| "YouShouldNotTypeThis" ssrtclarg ssrseqdir ssrseqarg (* ssr plugin *) -| "clear" natural (* ssr plugin *) -| "move" ssrmovearg ssrrpat (* ssr plugin *) -| "move" ssrmovearg ssrclauses (* ssr plugin *) -| "move" ssrrpat (* ssr plugin *) -| "move" (* ssr plugin *) -| "case" ssrcasearg ssrclauses (* ssr plugin *) -| "case" (* ssr plugin *) -| "elim" ssrarg ssrclauses (* ssr plugin *) -| "elim" (* ssr plugin *) -| 
"apply" ssrapplyarg (* ssr plugin *) -| "apply" (* ssr plugin *) -| "exact" ssrexactarg (* ssr plugin *) -| "exact" (* ssr plugin *) -| "exact" "<:" lconstr (* ssr plugin *) -| "congr" ssrcongrarg (* ssr plugin *) -| "ssrinstancesofruleL2R" ssrterm (* ssr plugin *) -| "ssrinstancesofruleR2L" ssrterm (* ssr plugin *) -| "rewrite" ssrrwargs ssrclauses (* ssr plugin *) -| "unlock" ssrunlockargs ssrclauses (* ssr plugin *) -| "pose" ssrfixfwd (* ssr plugin *) -| "pose" ssrcofixfwd (* ssr plugin *) -| "pose" ssrfwdid ssrposefwd (* ssr plugin *) -| "set" ssrfwdid ssrsetfwd ssrclauses (* ssr plugin *) -| "abstract" ssrdgens (* ssr plugin *) -| "have" ssrhavefwdwbinders (* ssr plugin *) -| "have" "suff" ssrhpats_nobs ssrhavefwd (* ssr plugin *) -| "have" "suffices" ssrhpats_nobs ssrhavefwd (* ssr plugin *) -| "suff" "have" ssrhpats_nobs ssrhavefwd (* ssr plugin *) -| "suffices" "have" ssrhpats_nobs ssrhavefwd (* ssr plugin *) -| "suff" ssrsufffwd (* ssr plugin *) -| "suffices" ssrsufffwd (* ssr plugin *) -| "wlog" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "wlog" "suff" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "wlog" "suffices" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "without" "loss" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "without" "loss" "suff" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "without" "loss" "suffices" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "gen" "have" ssrclear ssr_idcomma ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "generally" "have" ssrclear ssr_idcomma ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "under" ssrrwarg (* ssr plugin *) -| "under" ssrrwarg ssrintros_ne (* ssr plugin *) -| "under" ssrrwarg ssrintros_ne "do" ssrhint3arg (* ssr plugin *) -| "under" ssrrwarg "do" ssrhint3arg (* ssr plugin *) -| "ssrinstancesoftpat" cpattern (* ssrmatching plugin *) ] mlname: [ @@ -1891,10 +1872,6 @@ test_lpar_id_colon: [ | local_test_lpar_id_colon ] -orient_string: [ -| orient 
preident -] - comparison: [ | "=" | "<" @@ -1977,9 +1954,6 @@ tactic_expr4: [ | tactic_expr3 ";" tactic_expr3 | tactic_expr3 ";" tactic_then_locality tactic_then_gen "]" | tactic_expr3 -| tactic_expr5 ";" "first" ssr_first_else (* ssr plugin *) -| tactic_expr5 ";" "first" ssrseqarg (* ssr plugin *) -| tactic_expr5 ";" "last" ssrseqarg (* ssr plugin *) ] tactic_expr3: [ @@ -1996,10 +1970,6 @@ tactic_expr3: [ | "abstract" tactic_expr2 "using" ident | selector tactic_expr3 | tactic_expr2 -| "do" ssrmmod ssrdotac ssrclauses (* ssr plugin *) -| "do" ssrortacarg ssrclauses (* ssr plugin *) -| "do" int_or_var ssrmmod ssrdotac ssrclauses (* ssr plugin *) -| "abstract" ssrdgens (* ssr plugin *) ] tactic_expr2: [ @@ -2023,14 +1993,12 @@ tactic_expr1: [ | tactic_arg | reference LIST0 tactic_arg_compat | tactic_expr0 -| tactic_expr5 ssrintros_ne (* ssr plugin *) ] tactic_expr0: [ | "(" tactic_expr5 ")" | "[" ">" tactic_then_gen "]" | tactic_atom -| ssrparentacarg (* ssr plugin *) ] failkw: [ @@ -2422,8 +2390,6 @@ hypident: [ | id_or_meta | "(" "type" "of" id_or_meta ")" | "(" "value" "of" id_or_meta ")" -| "(" "type" "of" Prim.identref ")" (* ssr plugin *) -| "(" "value" "of" Prim.identref ")" (* ssr plugin *) ] hypident_occ: [ @@ -2462,13 +2428,24 @@ in_hyp_as: [ | ] +orient_rw: [ +| "->" +| "<-" +| +] + simple_binder: [ | name | "(" LIST1 name ":" lconstr ")" ] fixdecl: [ -| "(" ident LIST0 simple_binder fixannot ":" lconstr ")" +| "(" ident LIST0 simple_binder struct_annot ":" lconstr ")" +] + +struct_annot: [ +| "{" "struct" name "}" +| ] cofixdecl: [ @@ -2525,7 +2502,7 @@ rewriter: [ ] oriented_rewriter: [ -| orient rewriter +| orient_rw rewriter ] induction_clause: [ @@ -2564,608 +2541,6 @@ field_mods: [ | "(" LIST1 field_mod SEP "," ")" (* setoid_ring plugin *) ] -ssrtacarg: [ -| tactic_expr5 (* ssr plugin *) -] - -ssrtac3arg: [ -| tactic_expr3 (* ssr plugin *) -] - -ssrtclarg: [ -| ssrtacarg (* ssr plugin *) -] - -ssrhyp: [ -| ident (* ssr plugin *) -] - -ssrhoi_hyp: [ 
-| ident (* ssr plugin *) -] - -ssrhoi_id: [ -| ident (* ssr plugin *) -] - -ssrsimpl_ne: [ -| "//=" (* ssr plugin *) -| "/=" (* ssr plugin *) -| test_ssrslashnum11 "/" natural "/" natural "=" (* ssr plugin *) -| test_ssrslashnum10 "/" natural "/" (* ssr plugin *) -| test_ssrslashnum10 "/" natural "=" (* ssr plugin *) -| test_ssrslashnum10 "/" natural "/=" (* ssr plugin *) -| test_ssrslashnum10 "/" natural "/" "=" (* ssr plugin *) -| test_ssrslashnum01 "//" natural "=" (* ssr plugin *) -| test_ssrslashnum00 "//" (* ssr plugin *) -] - -ssrclear_ne: [ -| "{" LIST1 ssrhyp "}" (* ssr plugin *) -] - -ssrclear: [ -| ssrclear_ne (* ssr plugin *) -| (* ssr plugin *) -] - -ssrindex: [ -| int_or_var (* ssr plugin *) -] - -ssrocc: [ -| natural LIST0 natural (* ssr plugin *) -| "-" LIST0 natural (* ssr plugin *) -| "+" LIST0 natural (* ssr plugin *) -] - -ssrmmod: [ -| "!" (* ssr plugin *) -| LEFTQMARK (* ssr plugin *) -| "?" (* ssr plugin *) -] - -ssrmult_ne: [ -| natural ssrmmod (* ssr plugin *) -| ssrmmod (* ssr plugin *) -] - -ssrmult: [ -| ssrmult_ne (* ssr plugin *) -| (* ssr plugin *) -] - -ssrdocc: [ -| "{" ssrocc "}" (* ssr plugin *) -| "{" LIST0 ssrhyp "}" (* ssr plugin *) -] - -ssrterm: [ -| "YouShouldNotTypeThis" constr (* ssr plugin *) -| ssrtermkind Pcoq.Constr.constr (* ssr plugin *) -] - -ast_closure_term: [ -| term_annotation constr (* ssr plugin *) -] - -ast_closure_lterm: [ -| term_annotation lconstr (* ssr plugin *) -] - -ssrbwdview: [ -| "YouShouldNotTypeThis" (* ssr plugin *) -| test_not_ssrslashnum "/" Pcoq.Constr.constr (* ssr plugin *) -| test_not_ssrslashnum "/" Pcoq.Constr.constr ssrbwdview (* ssr plugin *) -] - -ssrfwdview: [ -| "YouShouldNotTypeThis" (* ssr plugin *) -| test_not_ssrslashnum "/" ast_closure_term (* ssr plugin *) -| test_not_ssrslashnum "/" ast_closure_term ssrfwdview (* ssr plugin *) -] - -ident_no_do: [ -| "YouShouldNotTypeThis" ident (* ssr plugin *) -| test_ident_no_do IDENT (* ssr plugin *) -] - -ssripat: [ -| "_" (* ssr plugin 
*) -| "*" (* ssr plugin *) -| ">" (* ssr plugin *) -| ident_no_do (* ssr plugin *) -| "?" (* ssr plugin *) -| "+" (* ssr plugin *) -| "++" (* ssr plugin *) -| ssrsimpl_ne (* ssr plugin *) -| ssrdocc "->" (* ssr plugin *) -| ssrdocc "<-" (* ssr plugin *) -| ssrdocc (* ssr plugin *) -| "->" (* ssr plugin *) -| "<-" (* ssr plugin *) -| "-" (* ssr plugin *) -| "-/" "=" (* ssr plugin *) -| "-/=" (* ssr plugin *) -| "-/" "/" (* ssr plugin *) -| "-//" (* ssr plugin *) -| "-/" integer "/" (* ssr plugin *) -| "-/" "/=" (* ssr plugin *) -| "-//" "=" (* ssr plugin *) -| "-//=" (* ssr plugin *) -| "-/" integer "/=" (* ssr plugin *) -| "-/" integer "/" integer "=" (* ssr plugin *) -| ssrfwdview (* ssr plugin *) -| "[" ":" LIST0 ident "]" (* ssr plugin *) -| "[:" LIST0 ident "]" (* ssr plugin *) -| ssrcpat (* ssr plugin *) -] - -ssripats: [ -| ssripat ssripats (* ssr plugin *) -| (* ssr plugin *) -] - -ssriorpat: [ -| ssripats "|" ssriorpat (* ssr plugin *) -| ssripats "|-" ">" ssriorpat (* ssr plugin *) -| ssripats "|-" ssriorpat (* ssr plugin *) -| ssripats "|->" ssriorpat (* ssr plugin *) -| ssripats "||" ssriorpat (* ssr plugin *) -| ssripats "|||" ssriorpat (* ssr plugin *) -| ssripats "||||" ssriorpat (* ssr plugin *) -| ssripats (* ssr plugin *) -] - -ssrcpat: [ -| "YouShouldNotTypeThis" ssriorpat (* ssr plugin *) -| test_nohidden "[" hat "]" (* ssr plugin *) -| test_nohidden "[" ssriorpat "]" (* ssr plugin *) -| test_nohidden "[=" ssriorpat "]" (* ssr plugin *) -] - -hat: [ -| "^" ident (* ssr plugin *) -| "^" "~" ident (* ssr plugin *) -| "^" "~" natural (* ssr plugin *) -| "^~" ident (* ssr plugin *) -| "^~" natural (* ssr plugin *) -] - -ssripats_ne: [ -| ssripat ssripats (* ssr plugin *) -] - -ssrhpats: [ -| ssripats (* ssr plugin *) -] - -ssrhpats_wtransp: [ -| ssripats (* ssr plugin *) -| ssripats "@" ssripats (* ssr plugin *) -] - -ssrhpats_nobs: [ -| ssripats (* ssr plugin *) -] - -ssrrpat: [ -| "->" (* ssr plugin *) -| "<-" (* ssr plugin *) -] - -ssrintros_ne: [ 
-| "=>" ssripats_ne (* ssr plugin *) -] - -ssrintros: [ -| ssrintros_ne (* ssr plugin *) -| (* ssr plugin *) -] - -ssrintrosarg: [ -| "YouShouldNotTypeThis" ssrtacarg ssrintros_ne (* ssr plugin *) -] - -ssrfwdid: [ -| test_ssrfwdid Prim.ident (* ssr plugin *) -] - -ssrortacs: [ -| ssrtacarg "|" ssrortacs (* ssr plugin *) -| ssrtacarg "|" (* ssr plugin *) -| ssrtacarg (* ssr plugin *) -| "|" ssrortacs (* ssr plugin *) -| "|" (* ssr plugin *) -] - -ssrhintarg: [ -| "[" "]" (* ssr plugin *) -| "[" ssrortacs "]" (* ssr plugin *) -| ssrtacarg (* ssr plugin *) -] - -ssrhint3arg: [ -| "[" "]" (* ssr plugin *) -| "[" ssrortacs "]" (* ssr plugin *) -| ssrtac3arg (* ssr plugin *) -] - -ssrortacarg: [ -| "[" ssrortacs "]" (* ssr plugin *) -] - -ssrhint: [ -| (* ssr plugin *) -| "by" ssrhintarg (* ssr plugin *) -] - -ssrwgen: [ -| ssrclear_ne (* ssr plugin *) -| ssrhoi_hyp (* ssr plugin *) -| "@" ssrhoi_hyp (* ssr plugin *) -| "(" ssrhoi_id ":=" lcpattern ")" (* ssr plugin *) -| "(" ssrhoi_id ")" (* ssr plugin *) -| "(@" ssrhoi_id ":=" lcpattern ")" (* ssr plugin *) -| "(" "@" ssrhoi_id ":=" lcpattern ")" (* ssr plugin *) -] - -ssrclausehyps: [ -| ssrwgen "," ssrclausehyps (* ssr plugin *) -| ssrwgen ssrclausehyps (* ssr plugin *) -| ssrwgen (* ssr plugin *) -] - -ssrclauses: [ -| "in" ssrclausehyps "|-" "*" (* ssr plugin *) -| "in" ssrclausehyps "|-" (* ssr plugin *) -| "in" ssrclausehyps "*" (* ssr plugin *) -| "in" ssrclausehyps (* ssr plugin *) -| "in" "|-" "*" (* ssr plugin *) -| "in" "*" (* ssr plugin *) -| "in" "*" "|-" (* ssr plugin *) -| (* ssr plugin *) -] - -ssrfwd: [ -| ":=" ast_closure_lterm (* ssr plugin *) -| ":" ast_closure_lterm ":=" ast_closure_lterm (* ssr plugin *) -] - -ssrbvar: [ -| ident (* ssr plugin *) -| "_" (* ssr plugin *) -] - -ssrbinder: [ -| ssrbvar (* ssr plugin *) -| "(" ssrbvar ")" (* ssr plugin *) -| "(" ssrbvar ":" lconstr ")" (* ssr plugin *) -| "(" ssrbvar LIST1 ssrbvar ":" lconstr ")" (* ssr plugin *) -| "(" ssrbvar ":" lconstr ":=" 
lconstr ")" (* ssr plugin *) -| "(" ssrbvar ":=" lconstr ")" (* ssr plugin *) -| [ "of" | "&" ] operconstr99 (* ssr plugin *) -] - -ssrstruct: [ -| "{" "struct" ident "}" (* ssr plugin *) -| (* ssr plugin *) -] - -ssrposefwd: [ -| LIST0 ssrbinder ssrfwd (* ssr plugin *) -] - -ssrfixfwd: [ -| "fix" ssrbvar LIST0 ssrbinder ssrstruct ssrfwd (* ssr plugin *) -] - -ssrcofixfwd: [ -| "cofix" ssrbvar LIST0 ssrbinder ssrfwd (* ssr plugin *) -] - -ssrsetfwd: [ -| ":" ast_closure_lterm ":=" "{" ssrocc "}" cpattern (* ssr plugin *) -| ":" ast_closure_lterm ":=" lcpattern (* ssr plugin *) -| ":=" "{" ssrocc "}" cpattern (* ssr plugin *) -| ":=" lcpattern (* ssr plugin *) -] - -ssrhavefwd: [ -| ":" ast_closure_lterm ssrhint (* ssr plugin *) -| ":" ast_closure_lterm ":=" ast_closure_lterm (* ssr plugin *) -| ":" ast_closure_lterm ":=" (* ssr plugin *) -| ":=" ast_closure_lterm (* ssr plugin *) -] - -ssrhavefwdwbinders: [ -| ssrhpats_wtransp LIST0 ssrbinder ssrhavefwd (* ssr plugin *) -] - -ssrdoarg: [ -] - -ssrseqarg: [ -| ssrswap (* ssr plugin *) -| ssrseqidx ssrortacarg OPT ssrorelse (* ssr plugin *) -| ssrseqidx ssrswap (* ssr plugin *) -| tactic_expr3 (* ssr plugin *) -] - -ssrseqidx: [ -| test_ssrseqvar Prim.ident (* ssr plugin *) -| Prim.natural (* ssr plugin *) -] - -ssrswap: [ -| "first" (* ssr plugin *) -| "last" (* ssr plugin *) -] - -ssrorelse: [ -| "||" tactic_expr2 (* ssr plugin *) -] - -Prim.ident: [ -| IDENT ssr_null_entry (* ssr plugin *) -] - -ssrparentacarg: [ -| "(" tactic_expr5 ")" (* ssr plugin *) -] - -ssrdotac: [ -| tactic_expr3 (* ssr plugin *) -| ssrortacarg (* ssr plugin *) -] - -ssrseqdir: [ -] - -ssr_first: [ -| ssr_first ssrintros_ne (* ssr plugin *) -| "[" LIST0 tactic_expr5 SEP "|" "]" (* ssr plugin *) -] - -ssr_first_else: [ -| ssr_first ssrorelse (* ssr plugin *) -| ssr_first (* ssr plugin *) -] - -ssrgen: [ -| ssrdocc cpattern (* ssr plugin *) -| cpattern (* ssr plugin *) -] - -ssrdgens_tl: [ -| "{" LIST1 ssrhyp "}" cpattern ssrdgens_tl (* ssr 
plugin *) -| "{" LIST1 ssrhyp "}" (* ssr plugin *) -| "{" ssrocc "}" cpattern ssrdgens_tl (* ssr plugin *) -| "/" ssrdgens_tl (* ssr plugin *) -| cpattern ssrdgens_tl (* ssr plugin *) -| (* ssr plugin *) -] - -ssrdgens: [ -| ":" ssrgen ssrdgens_tl (* ssr plugin *) -] - -ssreqid: [ -| test_ssreqid ssreqpat (* ssr plugin *) -| test_ssreqid (* ssr plugin *) -] - -ssreqpat: [ -| Prim.ident (* ssr plugin *) -| "_" (* ssr plugin *) -| "?" (* ssr plugin *) -| "+" (* ssr plugin *) -| ssrdocc "->" (* ssr plugin *) -| ssrdocc "<-" (* ssr plugin *) -| "->" (* ssr plugin *) -| "<-" (* ssr plugin *) -] - -ssrarg: [ -| ssrfwdview ssreqid ssrdgens ssrintros (* ssr plugin *) -| ssrfwdview ssrclear ssrintros (* ssr plugin *) -| ssreqid ssrdgens ssrintros (* ssr plugin *) -| ssrclear_ne ssrintros (* ssr plugin *) -| ssrintros_ne (* ssr plugin *) -] - -ssrmovearg: [ -| ssrarg (* ssr plugin *) -] - -ssrcasearg: [ -| ssrarg (* ssr plugin *) -] - -ssragen: [ -| "{" LIST1 ssrhyp "}" ssrterm (* ssr plugin *) -| ssrterm (* ssr plugin *) -] - -ssragens: [ -| "{" LIST1 ssrhyp "}" ssrterm ssragens (* ssr plugin *) -| "{" LIST1 ssrhyp "}" (* ssr plugin *) -| ssrterm ssragens (* ssr plugin *) -| (* ssr plugin *) -] - -ssrapplyarg: [ -| ":" ssragen ssragens ssrintros (* ssr plugin *) -| ssrclear_ne ssrintros (* ssr plugin *) -| ssrintros_ne (* ssr plugin *) -| ssrbwdview ":" ssragen ssragens ssrintros (* ssr plugin *) -| ssrbwdview ssrclear ssrintros (* ssr plugin *) -] - -ssrexactarg: [ -| ":" ssragen ssragens (* ssr plugin *) -| ssrbwdview ssrclear (* ssr plugin *) -| ssrclear_ne (* ssr plugin *) -] - -ssrcongrarg: [ -| natural constr ssrdgens (* ssr plugin *) -| natural constr (* ssr plugin *) -| constr ssrdgens (* ssr plugin *) -| constr (* ssr plugin *) -] - -ssrrwocc: [ -| "{" LIST0 ssrhyp "}" (* ssr plugin *) -| "{" ssrocc "}" (* ssr plugin *) -| (* ssr plugin *) -] - -ssrrule_ne: [ -| test_not_ssrslashnum [ "/" ssrterm | ssrterm | ssrsimpl_ne ] (* ssr plugin *) -| ssrsimpl_ne (* ssr 
plugin *) -] - -ssrrule: [ -| ssrrule_ne (* ssr plugin *) -| (* ssr plugin *) -] - -ssrpattern_squarep: [ -| "[" rpattern "]" (* ssr plugin *) -| (* ssr plugin *) -] - -ssrpattern_ne_squarep: [ -| "[" rpattern "]" (* ssr plugin *) -] - -ssrrwarg: [ -| "-" ssrmult ssrrwocc ssrpattern_squarep ssrrule_ne (* ssr plugin *) -| "-/" ssrterm (* ssr plugin *) -| ssrmult_ne ssrrwocc ssrpattern_squarep ssrrule_ne (* ssr plugin *) -| "{" LIST1 ssrhyp "}" ssrpattern_ne_squarep ssrrule_ne (* ssr plugin *) -| "{" LIST1 ssrhyp "}" ssrrule (* ssr plugin *) -| "{" ssrocc "}" ssrpattern_squarep ssrrule_ne (* ssr plugin *) -| "{" "}" ssrpattern_squarep ssrrule_ne (* ssr plugin *) -| ssrpattern_ne_squarep ssrrule_ne (* ssr plugin *) -| ssrrule_ne (* ssr plugin *) -] - -ssrrwargs: [ -| test_ssr_rw_syntax LIST1 ssrrwarg (* ssr plugin *) -] - -ssrunlockarg: [ -| "{" ssrocc "}" ssrterm (* ssr plugin *) -| ssrterm (* ssr plugin *) -] - -ssrunlockargs: [ -| LIST0 ssrunlockarg (* ssr plugin *) -] - -ssrsufffwd: [ -| ssrhpats LIST0 ssrbinder ":" ast_closure_lterm ssrhint (* ssr plugin *) -] - -ssrwlogfwd: [ -| ":" LIST0 ssrwgen "/" ast_closure_lterm (* ssr plugin *) -] - -ssr_idcomma: [ -| (* ssr plugin *) -| test_idcomma [ IDENT | "_" ] "," (* ssr plugin *) -] - -ssr_rtype: [ -| "return" operconstr100 (* ssr plugin *) -] - -ssr_mpat: [ -| pattern200 (* ssr plugin *) -] - -ssr_dpat: [ -| ssr_mpat "in" pattern200 ssr_rtype (* ssr plugin *) -| ssr_mpat ssr_rtype (* ssr plugin *) -| ssr_mpat (* ssr plugin *) -] - -ssr_dthen: [ -| ssr_dpat "then" lconstr (* ssr plugin *) -] - -ssr_elsepat: [ -| "else" (* ssr plugin *) -] - -ssr_else: [ -| ssr_elsepat lconstr (* ssr plugin *) -] - -ssr_search_item: [ -| string (* ssr plugin *) -| string "%" preident (* ssr plugin *) -| constr_pattern (* ssr plugin *) -] - -ssr_search_arg: [ -| "-" ssr_search_item ssr_search_arg (* ssr plugin *) -| ssr_search_item ssr_search_arg (* ssr plugin *) -| (* ssr plugin *) -] - -ssr_modlocs: [ -| (* ssr plugin *) -| "in" 
LIST1 modloc (* ssr plugin *) -] - -modloc: [ -| "-" global (* ssr plugin *) -| global (* ssr plugin *) -] - -ssrhintref: [ -| constr (* ssr plugin *) -| constr "|" natural (* ssr plugin *) -] - -ssrviewpos: [ -| "for" "move" "/" (* ssr plugin *) -| "for" "apply" "/" (* ssr plugin *) -| "for" "apply" "/" "/" (* ssr plugin *) -| "for" "apply" "//" (* ssr plugin *) -| (* ssr plugin *) -] - -ssrviewposspc: [ -| ssrviewpos (* ssr plugin *) -] - -rpattern: [ -| lconstr (* ssrmatching plugin *) -| "in" lconstr (* ssrmatching plugin *) -| lconstr "in" lconstr (* ssrmatching plugin *) -| "in" lconstr "in" lconstr (* ssrmatching plugin *) -| lconstr "in" lconstr "in" lconstr (* ssrmatching plugin *) -| lconstr "as" lconstr "in" lconstr (* ssrmatching plugin *) -] - -cpattern: [ -| "Qed" constr (* ssrmatching plugin *) -| ssrtermkind constr (* ssrmatching plugin *) -] - -lcpattern: [ -| "Qed" lconstr (* ssrmatching plugin *) -| ssrtermkind lconstr (* ssrmatching plugin *) -] - -ssrpatternarg: [ -| rpattern (* ssrmatching plugin *) -] - numnotoption: [ | | "(" "warning" "after" bigint ")" diff --git a/doc/tools/docgram/orderedGrammar b/doc/tools/docgram/orderedGrammar index cd6e11505c..545ccde03a 100644 --- a/doc/tools/docgram/orderedGrammar +++ b/doc/tools/docgram/orderedGrammar @@ -3,603 +3,541 @@ doc_grammar will modify this file to add/remove nonterminals and productions to match editedGrammar, which will remove comments. Not compiled into Coq *) DOC_GRAMMAR -global: [ -| reference -] - -constr_pattern: [ -| term -] - -sort: [ -| "Set" -| "Prop" -| "SProp" -| "Type" -| "Type" "@{" "_" "}" -| "Type" "@{" universe "}" -] - -sort_family: [ -| "Set" -| "Prop" -| "SProp" -| "Type" +vernac_toplevel: [ +| "Drop" "." +| "Quit" "." +| "BackTo" num "." +| "Show" "Goal" num "at" num "." +| "Show" "Proof" "Diffs" removed_opt "." 
+| vernac_control
 ]
 
-universe_increment: [
-| "+" natural
+removed_opt: [
+| "removed"
 | empty
 ]
 
-universe_name: [
-| global
-| "Set"
-| "Prop"
+tactic_mode: [
+| toplevel_selector_opt query_command
+| toplevel_selector_opt "{"
+| toplevel_selector_opt ltac_info_opt ltac_expr ltac_use_default
+| "par" ":" ltac_info_opt ltac_expr ltac_use_default
 ]
 
-universe_expr: [
-| universe_name universe_increment
+toplevel_selector_opt: [
+| toplevel_selector
+| empty
 ]
 
-universe: [
-| "max" "(" universe_expr_list_comma ")"
-| universe_expr
+ltac_info_opt: [
+| "Info" num
+| empty
 ]
 
-universe_expr_list_comma: [
-| universe_expr_list_comma "," universe_expr
-| universe_expr
+ltac_use_default: [
+| "."
+| "..."
 ]
 
-lconstr: [
-| operconstr200
-| lconstr
+vernac_control: [
+| "Time" vernac_control
+| "Redirect" string vernac_control
+| "Timeout" num vernac_control
+| "Fail" vernac_control
+| quoted_attributes_list_opt vernac
 ]
 
 term: [
-| operconstr8
-| "@" global instance
+| "forall" open_binders "," term
+| "fun" open_binders "=>" term
+| term_let
+| "if" term as_return_type_opt "then" term "else" term
+| term_fix
+| term100
 ]
 
-operconstr200: [
-| binder_constr
-| operconstr100
+term100: [
+| term_cast
+| term10
 ]
 
-operconstr100: [
-| operconstr99 "<:" binder_constr
-| operconstr99 "<:" operconstr100
-| operconstr99 "<<:" binder_constr
-| operconstr99 "<<:" operconstr100
-| operconstr99 ":" binder_constr
-| operconstr99 ":" operconstr100
-| operconstr99 ":>"
-| operconstr99
+term10: [
+| term1 args
+| "@" qualid universe_annot_opt term1_list_opt
+| term1
 ]
 
-operconstr99: [
-| operconstr90
+args: [
+| args arg
+| arg
 ]
 
-operconstr90: [
-| operconstr10
+arg: [
+| "(" ident ":=" term ")"
+| term1
 ]
 
-operconstr10: [
-| operconstr9 appl_arg_list
-| "@" global instance operconstr9_list_opt
-| "@" pattern_identref ident_list
-| operconstr9
-]
-
-appl_arg_list: [
-| appl_arg_list appl_arg
-| appl_arg
-]
-
-operconstr9: [
-| ".." operconstr0 ".."
-| operconstr8
-]
-
-operconstr8: [
-| operconstr1
+term1_list_opt: [
+| term1_list_opt term1
+| empty
 ]
 
-operconstr1: [
-| operconstr0 ".(" global appl_arg_list_opt ")"
-| operconstr0 ".(" "@" global operconstr9_list_opt ")"
-| operconstr0 "%" IDENT
-| operconstr0
+empty: [
+|
 ]
 
-appl_arg_list_opt: [
-| appl_arg_list_opt appl_arg
-| empty
+term1: [
+| term_projection
+| term0 "%" ident
+| term0
 ]
 
-operconstr9_list_opt: [
-| operconstr9_list_opt operconstr9
+args_opt: [
+| args
 | empty
 ]
 
-operconstr0: [
-| atomic_constr
-| match_constr
-| "(" operconstr200 ")"
-| "{|" record_declaration bar_cbrace
-| "{" binder_constr "}"
-| "`{" operconstr200 "}"
-| "`(" operconstr200 ")"
+term0: [
+| qualid universe_annot_opt
+| sort
+| numeral
+| string
+| "_"
+| term_evar
+| term_match
+| "(" term ")"
+| "{|" fields_def "|}"
+| "`{" term "}"
+| "`(" term ")"
 | "ltac" ":" "(" ltac_expr ")"
 ]
 
-record_declaration: [
-| record_fields
-]
-
-record_fields: [
-| record_field_declaration ";" record_fields
-| record_field_declaration
+fields_def: [
+| field_def ";" fields_def
+| field_def
 | empty
-| record_field ";" record_fields
-| record_field ";"
-| record_field
-]
-
-record_field_declaration: [
-| global binders ":=" lconstr
 ]
 
-binder_constr: [
-| "forall" open_binders "," operconstr200
-| "fun" open_binders "=>" operconstr200
-| "let" name binders type_cstr ":=" operconstr200 "in" operconstr200
-| "let" single_fix "in" operconstr200
-| "let" name_alt return_type ":=" operconstr200 "in" operconstr200
-| "let" "'" pattern200 ":=" operconstr200 "in" operconstr200
-| "let" "'" pattern200 ":=" operconstr200 case_type "in" operconstr200
-| "let" "'" pattern200 "in" pattern200 ":=" operconstr200 case_type "in" operconstr200
-| "if" operconstr200 return_type "then" operconstr200 "else" operconstr200
-| fix_constr
-| "if" operconstr200 "is" ssr_dthen ssr_else (* ssr plugin *)
-| "if" operconstr200 "isn't" ssr_dthen ssr_else (* ssr plugin *)
-| "let" ":" ssr_mpat ":=" lconstr "in" lconstr (* ssr plugin *)
-| "let" ":" ssr_mpat ":=" lconstr ssr_rtype "in" lconstr (* ssr plugin *)
-| "let" ":" ssr_mpat "in" pattern200 ":=" lconstr ssr_rtype "in" lconstr (* ssr plugin *)
+field_def: [
+| qualid binders_opt ":=" term
 ]
 
-name_alt: [
-| "(" name_list_comma_opt ")"
-| "()"
+binders_opt: [
+| binders
+| empty
 ]
 
-name_list_comma_opt: [
-| name_list_comma
-| empty
+term_projection: [
+| term0 ".(" qualid args_opt ")"
+| term0 ".(" "@" qualid term1_list_opt ")"
 ]
 
-name_list_comma: [
-| name_list_comma "," name
-| name
+term_evar: [
+| "?[" ident "]"
+| "?[" "?" ident "]"
+| "?" ident evar_bindings_opt
 ]
 
-name_list_opt: [
-| name_list_opt name
+evar_bindings_opt: [
+| "@{" evar_bindings_semi "}"
 | empty
 ]
 
-name_list: [
-| name_list name
-| name
+evar_bindings_semi: [
+| evar_bindings_semi ";" evar_binding
+| evar_binding
 ]
 
-appl_arg: [
-| lpar_id_coloneq lconstr ")"
-| operconstr9
+evar_binding: [
+| ident ":=" term
 ]
 
-atomic_constr: [
-| global instance
-| sort
-| NUMERAL
-| string
-| "_"
-| "?" "[" ident "]"
-| "?" "[" "?" ident "]"
-| "?" ident evar_instance
+dangling_pattern_extension_rule: [
+| "@" "?" ident ident_list
 ]
 
-inst: [
-| ident ":=" lconstr
+ident_list: [
+| ident_list ident
+| ident
 ]
 
-evar_instance: [
-| "@{" inst_list_semi "}"
+record_fields: [
+| record_field ";" record_fields
+| record_field ";"
+| record_field
 | empty
 ]
 
-inst_list_semi: [
-| inst_list_semi ";" inst
-| inst
-]
-
-instance: [
-| "@{" universe_level_list_opt "}"
-| empty
+record_field: [
+| quoted_attributes_list_opt record_binder num_opt2 decl_notation
 ]
 
-universe_level_list_opt: [
-| universe_level_list_opt universe_level
+decl_notation: [
+| "where" one_decl_notation_list
 | empty
 ]
 
-universe_level: [
-| "Set"
-| "Prop"
-| "Type"
-| "_"
-| global
-]
-
-fix_constr: [
-| single_fix
-| single_fix "with" fix_decl_list "for" ident
-]
-
-fix_decl_list: [
-| fix_decl_list "with" fix_decl
-| fix_decl
+one_decl_notation_list: [
+| one_decl_notation_list "and" one_decl_notation
+| one_decl_notation
 ]
 
-single_fix: [
-| fix_kw fix_decl
+one_decl_notation: [
+| string ":=" term1_extended ident_opt3
 ]
 
-fix_kw: [
-| "fix"
-| "cofix"
+ident_opt3: [
+| ":" ident
+| empty
 ]
 
-fix_decl: [
-| ident binders_fixannot type_cstr ":=" operconstr200
+record_binder: [
+| name
+| name record_binder_body
 ]
 
-match_constr: [
-| "match" case_item_list_comma case_type_opt "with" branches "end"
+record_binder_body: [
+| binders_opt of_type_with_opt_coercion term
+| binders_opt of_type_with_opt_coercion term ":=" term
+| binders_opt ":=" term
 ]
 
-case_item_list_comma: [
-| case_item_list_comma "," case_item
-| case_item
+of_type_with_opt_coercion: [
+| ":>>"
+| ":>"
+| ":"
 ]
 
-case_type_opt: [
-| case_type
+num_opt2: [
+| "|" num
 | empty
 ]
 
-case_item: [
-| operconstr100 as_opt in_opt
-]
-
-as_opt2: [
-| as_opt case_type
+quoted_attributes_list_opt: [
+| quoted_attributes_list_opt "#[" attribute_list_comma_opt "]"
 | empty
 ]
 
-in_opt: [
-| "in" pattern200
+attribute_list_comma_opt: [
+| attribute_list_comma
 | empty
 ]
 
-case_type: [
-| "return" operconstr100
+attribute_list_comma: [
+| attribute_list_comma "," attribute
+| attribute
 ]
 
-return_type: [
-| as_opt2
+attribute: [
+| ident attribute_value
 ]
 
-as_opt3: [
-| "as" dirpath
+attribute_value: [
+| "=" string
+| "(" attribute_list_comma_opt ")"
 | empty
 ]
 
-branches: [
-| or_opt eqn_list_or_opt
-]
-
-mult_pattern: [
-| pattern200_list_comma
-]
-
-pattern200_list_comma: [
-| pattern200_list_comma "," pattern200
-| pattern200
+qualid: [
+| qualid field
+| ident
 ]
 
-eqn: [
-| mult_pattern_list_or "=>" lconstr
+field: [
+| "." ident
 ]
 
-mult_pattern_list_or: [
-| mult_pattern_list_or "|" mult_pattern
-| mult_pattern
+sort: [
+| "Set"
+| "Prop"
+| "SProp"
+| "Type"
+| "Type" "@{" "_" "}"
+| "Type" "@{" universe "}"
 ]
 
-record_pattern: [
-| global ":=" pattern200
+universe: [
+| "max" "(" universe_exprs_comma ")"
+| universe_expr
 ]
 
-record_patterns: [
-| record_pattern ";" record_patterns
-| record_pattern ";"
-| record_pattern
-| empty
+universe_exprs_comma: [
+| universe_exprs_comma "," universe_expr
+| universe_expr
 ]
 
-pattern200: [
-| pattern100
+universe_expr: [
+| universe_name universe_increment_opt
 ]
 
-pattern100: [
-| pattern99 ":" binder_constr
-| pattern99 ":" operconstr100
-| pattern99
+universe_name: [
+| qualid
+| "Set"
+| "Prop"
 ]
 
-pattern99: [
-| pattern90
+universe_increment_opt: [
+| "+" num
+| empty
 ]
 
-pattern90: [
-| pattern10
+universe_annot_opt: [
+| "@{" universe_levels_opt "}"
+| empty
 ]
 
-pattern10: [
-| pattern1 "as" name
-| pattern1 pattern1_list
-| "@" reference pattern1_list_opt
-| pattern1
+universe_levels_opt: [
+| universe_levels_opt universe_level
+| empty
 ]
 
-pattern1_list: [
-| pattern1_list pattern1
-| pattern1
+universe_level: [
+| "Set"
+| "Prop"
+| "Type"
+| "_"
+| qualid
 ]
 
-pattern1_list_opt: [
-| pattern1_list_opt pattern1
-| empty
+term_fix: [
+| single_fix
+| single_fix "with" fix_bodies "for" ident
 ]
 
-pattern1: [
-| pattern0 "%" IDENT
-| pattern0
+single_fix: [
+| "fix" fix_body
+| "cofix" fix_body
 ]
 
-pattern0: [
-| reference
-| "{|" record_patterns bar_cbrace
-| "_"
-| "(" pattern200 ")"
-| "(" pattern200 "|" pattern200_list_or ")"
-| NUMERAL
-| string
+fix_bodies: [
+| fix_bodies "with" fix_body
+| fix_body
 ]
 
-pattern200_list_or: [
-| pattern200_list_or "|" pattern200
-| pattern200
+fix_body: [
+| ident binders_opt fixannot_opt colon_term_opt ":=" term
 ]
 
-impl_ident_tail: [
-| "}"
-| name_list ":" lconstr "}"
-| name_list "}"
-| ":" lconstr "}"
+fixannot_opt: [
+| fixannot
+| empty
 ]
 
 fixannot: [
 | "{" "struct" ident "}"
-| "{" "wf" term ident "}"
-| "{" "measure" term ident_opt term_opt "}"
-| "{" "struct" name "}"
-| empty
+| "{" "wf" term1_extended ident "}"
+| "{" "measure" term1_extended ident_opt term1_extended_opt "}"
 ]
 
-term_opt: [
-| term
-| empty
+term1_extended: [
+| term1
+| "@" qualid universe_annot_opt
 ]
 
-impl_name_head: [
+ident_opt: [
+| ident
 | empty
 ]
 
-binders_fixannot: [
-| impl_name_head impl_ident_tail binders_fixannot
-| fixannot
-| binder binders_fixannot
+term1_extended_opt: [
+| term1_extended
 | empty
 ]
 
-open_binders: [
-| name name_list_opt ":" lconstr
-| name name_list_opt binders
-| name ".." name
-| closed_binder binders
-]
-
-binders: [
-| binder_list_opt
+term_let: [
+| "let" name colon_term_opt ":=" term "in" term
+| "let" name binders colon_term_opt ":=" term "in" term
+| "let" single_fix "in" term
+| "let" names_tuple as_return_type_opt ":=" term "in" term
+| "let" "'" pattern ":=" term return_type_opt "in" term
+| "let" "'" pattern "in" pattern ":=" term return_type "in" term
 ]
 
-binder_list_opt: [
-| binder_list_opt binder
+colon_term_opt: [
+| ":" term
 | empty
 ]
 
-binder: [
-| name
-| closed_binder
+names_tuple: [
+| "(" names_comma ")"
+| "()"
 ]
 
-typeclass_constraint: [
-| "!" operconstr200
-| "{" name "}" ":" exclam_opt operconstr200
-| name_colon exclam_opt operconstr200
-| operconstr200
+names_comma: [
+| names_comma "," name
+| name
 ]
 
-type_cstr: [
-| lconstr_opt
-| ":" lconstr
-| empty
+open_binders: [
+| names ":" term
+| binders
 ]
 
-preident: [
-| IDENT
+names: [
+| names name
+| name
 ]
 
-pattern_identref: [
-| "?" ident
+name: [
+| "_"
+| ident
 ]
 
-var: [
-| ident
+binders: [
+| binders binder
+| binder
 ]
 
-field: [
-| FIELD
+binder: [
+| name
+| "(" names ":" term ")"
+| "(" name colon_term_opt ":=" term ")"
+| "{" name "}"
+| "{" names colon_term_opt "}"
+| "`(" typeclass_constraints_comma ")"
+| "`{" typeclass_constraints_comma "}"
+| "'" pattern0
+| "(" name ":" term "|" term ")"
 ]
 
-fields: [
-| field fields
-| field
+typeclass_constraints_comma: [
+| typeclass_constraints_comma "," typeclass_constraint
+| typeclass_constraint
 ]
 
-fullyqualid: [
-| ident fields
-| ident
+typeclass_constraint: [
+| exclam_opt term
+| "{" name "}" ":" exclam_opt term
+| name ":" exclam_opt term
 ]
 
-basequalid: [
-| ident fields
-| ident
+exclam_opt: [
+| "!"
+| empty
 ]
 
-name: [
-| "_"
-| ident
+term_cast: [
+| term10 "<:" term
+| term10 "<<:" term
+| term10 ":" term
+| term10 ":>"
 ]
 
-reference: [
-| ident fields
-| ident
+term_match: [
+| "match" case_items_comma return_type_opt "with" or_opt eqns_or_opt "end"
 ]
 
-by_notation: [
-| ne_string IDENT_opt
+case_items_comma: [
+| case_items_comma "," case_item
+| case_item
 ]
 
-IDENT_opt: [
-| "%" IDENT
+return_type_opt: [
+| return_type
 | empty
 ]
 
-smart_global: [
-| reference
-| by_notation
+as_return_type_opt: [
+| as_name_opt return_type
+| empty
 ]
 
-qualid: [
-| basequalid
+return_type: [
+| "return" term100
 ]
 
-ne_string: [
-| STRING
+case_item: [
+| term100 as_name_opt in_opt
 ]
 
-ne_lstring: [
-| ne_string
+as_name_opt: [
+| "as" name
+| empty
 ]
 
-dirpath: [
-| ident field_list_opt
+in_opt: [
+| "in" pattern
+| empty
 ]
 
-field_list_opt: [
-| field_list_opt field
+or_opt: [
+| "|"
 | empty
 ]
 
-string: [
-| STRING
+eqns_or_opt: [
+| eqns_or
+| empty
 ]
 
-lstring: [
-| string
+eqns_or: [
+| eqns_or "|" eqn
+| eqn
 ]
 
-integer: [
-| NUMERAL
-| "-" NUMERAL
+eqn: [
+| patterns_comma_list_or "=>" term
 ]
 
-natural: [
-| NUMERAL
+patterns_comma_list_or: [
+| patterns_comma_list_or "|" patterns_comma
+| patterns_comma
 ]
 
-bigint: [
-| NUMERAL
+patterns_comma: [
+| patterns_comma "," pattern
+| pattern
 ]
 
-bar_cbrace: [
-| "|" "}"
+pattern: [
+| pattern10 ":" term
+| pattern10
 ]
 
-vernac_control: [
-| "Time" vernac_control
-| "Redirect" ne_string vernac_control
-| "Timeout" natural vernac_control
-| "Fail" vernac_control
-| decorated_vernac
+pattern10: [
+| pattern1 "as" name
+| pattern1_list
+| "@" qualid pattern1_list_opt
+| pattern1
 ]
 
-decorated_vernac: [
-| quoted_attributes_list_opt vernac
+pattern1_list: [
+| pattern1_list pattern1
+| pattern1
 ]
 
-quoted_attributes_list_opt: [
-| quoted_attributes_list_opt quoted_attributes
+pattern1_list_opt: [
+| pattern1_list
 | empty
 ]
 
-quoted_attributes: [
-| "#[" attribute_list_comma_opt "]"
+pattern1: [
+| pattern0 "%" ident
+| pattern0
 ]
 
-attribute_list_comma_opt: [
-| attribute_list_comma
-| empty
+pattern0: [
+| qualid
+| "{|" record_patterns_opt "|}"
+| "_"
+| "(" patterns_or ")"
+| numeral
+| string
 ]
 
-attribute_list_comma: [
-| attribute_list_comma "," attribute
-| attribute
+patterns_or: [
+| patterns_or "|" pattern
+| pattern
 ]
 
-attribute: [
-| ident attribute_value
+record_patterns_opt: [
+| record_patterns_opt ";" record_pattern
+| record_pattern
+| empty
 ]
 
-attribute_value: [
-| "=" string
-| "(" attribute_list_comma_opt ")"
-| empty
+record_pattern: [
+| qualid ":=" pattern
 ]
 
 vernac: [
@@ -620,44 +558,51 @@ vernac_aux: [
 | gallina "."
 | gallina_ext "."
 | command "."
+| tactic_mode "."
 | syntax "."
 | subprf
-| command_entry
-]
-
-noedit_mode: [
 | query_command
 ]
 
 subprf: [
-| BULLET
+| bullet
 | "{"
 | "}"
 ]
 
 gallina: [
-| thm_token ident_decl binders ":" lconstr with_list_opt
+| thm_token ident_decl binders_opt ":" term with_list_opt
 | assumption_token inline assum_list
 | assumptions_token inline assum_list
 | def_token ident_decl def_body
 | "Let" ident def_body
 | cumulativity_token_opt private_token finite_token inductive_definition_list
-| "Fixpoint" rec_definition_list
-| "Let" "Fixpoint" rec_definition_list
-| "CoFixpoint" corec_definition_list
-| "Let" "CoFixpoint" corec_definition_list
+| "Fixpoint" fix_definition_list
+| "Let" "Fixpoint" fix_definition_list
+| "CoFixpoint" cofix_definition_list
+| "Let" "CoFixpoint" cofix_definition_list
 | "Scheme" scheme_list
 | "Combined" "Scheme" ident "from" ident_list_comma
-| "Register" global "as" qualid
-| "Register" "Inline" global
-| "Primitive" ident lconstr_opt ":=" register_token
+| "Register" qualid "as" qualid
+| "Register" "Inline" qualid
+| "Primitive" ident term_opt ":=" register_token
 | "Universe" ident_list
 | "Universes" ident_list
 | "Constraint" univ_constraint_list_comma
 ]
 
+term_opt: [
+| ":" term
+| empty
+]
+
+univ_constraint_list_comma: [
+| univ_constraint_list_comma "," univ_constraint
+| univ_constraint
+]
+
 with_list_opt: [
-| with_list_opt "with" ident_decl binders ":" lconstr
+| with_list_opt "with" ident_decl binders_opt ":" term
 | empty
 ]
@@ -671,14 +616,23 @@ inductive_definition_list: [
 | inductive_definition
 ]
 
-rec_definition_list: [
-| rec_definition_list "with" rec_definition
-| rec_definition
+fix_definition_list: [
+| fix_definition_list "with" fix_definition
+| fix_definition
 ]
 
-corec_definition_list: [
-| corec_definition_list "with" corec_definition
-| corec_definition
+fix_definition: [
+| ident_decl binders_opt fixannot_opt colon_term_opt term_opt2 decl_notation
+]
+
+term_opt2: [
+| ":=" term
+| empty
+]
+
+cofix_definition_list: [
+| cofix_definition_list "with" cofix_definition
+| cofix_definition
 ]
 
 scheme_list: [
@@ -691,23 +645,10 @@ ident_list_comma: [
 | ident
 ]
 
-univ_constraint_list_comma: [
-| univ_constraint_list_comma "," univ_constraint
-| univ_constraint
-]
-
-lconstr_opt2: [
-| ":=" lconstr
-| empty
-]
-
 register_token: [
 | register_prim_token
-| register_type_token
-]
-
-register_type_token: [
 | "#int63_type"
+| "#float64_type"
 ]
 
 register_prim_token: [
@@ -735,6 +676,24 @@
 | "#int63_lt"
 | "#int63_le"
 | "#int63_compare"
+| "#float64_opp"
+| "#float64_abs"
+| "#float64_eq"
+| "#float64_lt"
+| "#float64_le"
+| "#float64_compare"
+| "#float64_classify"
+| "#float64_add"
+| "#float64_sub"
+| "#float64_mul"
+| "#float64_div"
+| "#float64_sqrt"
+| "#float64_of_int63"
+| "#float64_normfr_mantissa"
+| "#float64_frshiftexp"
+| "#float64_ldshiftexp"
+| "#float64_next_up"
+| "#float64_next_down"
 ]
 
 thm_token: [
@@ -770,7 +729,7 @@ assumptions_token: [
 ]
 
 inline: [
-| "Inline" "(" natural ")"
+| "Inline" "(" num ")"
 | "Inline"
 | empty
 ]
@@ -785,30 +744,6 @@ lt_alt: [
 | "<="
 ]
 
-univ_decl: [
-| "@{" ident_list_opt plus_opt univ_constraint_alt
-]
-
-plus_opt: [
-| "+"
-| empty
-]
-
-univ_constraint_alt: [
-| "|" univ_constraint_list_comma_opt plus_opt "}"
-| rbrace_alt
-]
-
-univ_constraint_list_comma_opt: [
-| univ_constraint_list_comma
-| empty
-]
-
-rbrace_alt: [
-| "}"
-| bar_cbrace
-]
-
 ident_decl: [
 | ident univ_decl_opt
 ]
@@ -833,9 +768,9 @@ private_token: [
 ]
 
 def_body: [
-| binders ":=" reduce lconstr
-| binders ":" lconstr ":=" reduce lconstr
-| binders ":" lconstr
+| binders_opt ":=" reduce term
+| binders_opt ":" term ":=" reduce term
+| binders_opt ":" term
 ]
 
 reduce: [
@@ -843,27 +778,70 @@ reduce: [
 | empty
 ]
 
-one_decl_notation: [
-| ne_lstring ":=" term IDENT_opt2
+red_expr: [
+| "red"
+| "hnf"
+| "simpl" delta_flag ref_or_pattern_occ_opt
+| "cbv" strategy_flag
+| "cbn" strategy_flag
+| "lazy" strategy_flag
+| "compute" delta_flag
+| "vm_compute" ref_or_pattern_occ_opt
+| "native_compute" ref_or_pattern_occ_opt
+| "unfold" unfold_occ_list_comma
+| "fold" term1_extended_list
+| "pattern" pattern_occ_list_comma
+| ident
 ]
 
-IDENT_opt2: [
-| ":" IDENT
-| empty
+strategy_flag: [
+| red_flags_list
+| delta_flag
+]
+
+red_flags_list: [
+| red_flags_list red_flags
+| red_flags
 ]
 
-decl_sep: [
-| "and"
+red_flags: [
+| "beta"
+| "iota"
+| "match"
+| "fix"
+| "cofix"
+| "zeta"
+| "delta" delta_flag
 ]
 
-decl_notation: [
-| "where" one_decl_notation_list
+delta_flag: [
+| "-" "[" smart_global_list "]"
+| "[" smart_global_list "]"
 | empty
 ]
 
-one_decl_notation_list: [
-| one_decl_notation_list decl_sep one_decl_notation
-| one_decl_notation
+ref_or_pattern_occ_opt: [
+| ref_or_pattern_occ
+| empty
+]
+
+ref_or_pattern_occ: [
+| smart_global occs
+| term1_extended occs
+]
+
+unfold_occ_list_comma: [
+| unfold_occ_list_comma "," unfold_occ
+| unfold_occ
+]
+
+unfold_occ: [
+| smart_global occs
+]
+
+pattern_occ_list_comma: [
+| pattern_occ_list_comma "," pattern_occ
+| pattern_occ
 ]
 
 opt_constructors_or_fields: [
@@ -872,7 +850,12 @@ opt_constructors_or_fields: [
 ]
 
 inductive_definition: [
-| opt_coercion ident_decl binders lconstr_opt opt_constructors_or_fields decl_notation
+| opt_coercion ident_decl binders_opt term_opt opt_constructors_or_fields decl_notation
+]
+
+opt_coercion: [
+| ">"
+| empty
 ]
 
 constructor_list_or_record_decl: [
@@ -894,52 +877,6 @@ constructor_list_or_opt: [
 | empty
 ]
 
-opt_coercion: [
-| ">"
-| empty
-]
-
-rec_definition: [
-| ident_decl binders_fixannot type_cstr lconstr_opt2 decl_notation
-]
-
-corec_definition: [
-| ident_decl binders type_cstr lconstr_opt2 decl_notation
-]
-
-lconstr_opt: [
-| ":" lconstr
-| empty
-]
-
-scheme: [
-| scheme_kind
-| ident ":=" scheme_kind
-]
-
-scheme_kind: [
-| "Induction" "for" smart_global "Sort" sort_family
-| "Minimality" "for" smart_global "Sort" sort_family
-| "Elimination" "for" smart_global "Sort" sort_family
-| "Case" "for" smart_global "Sort" sort_family
-| "Equality" "for" smart_global
-]
-
-record_field: [
-| quoted_attributes_list_opt record_binder natural_opt2 decl_notation
-]
-
-record_binder_body: [
-| binders of_type_with_opt_coercion lconstr
-| binders of_type_with_opt_coercion lconstr ":=" lconstr
-| binders ":=" lconstr
-]
-
-record_binder: [
-| name
-| name record_binder_body
-]
-
 assum_list: [
 | assum_coe_list
 | simple_assum_coe
@@ -955,7 +892,7 @@ assum_coe: [
 ]
 
 simple_assum_coe: [
-| ident_decl_list of_type_with_opt_coercion lconstr
+| ident_decl_list of_type_with_opt_coercion term
 ]
 
 ident_decl_list: [
@@ -964,11 +901,11 @@ ident_decl_list: [
 ]
 
 constructor_type: [
-| binders of_type_with_opt_coercion_opt
+| binders_opt of_type_with_opt_coercion_opt
 ]
 
 of_type_with_opt_coercion_opt: [
-| of_type_with_opt_coercion lconstr
+| of_type_with_opt_coercion term
 | empty
 ]
@@ -976,95 +913,135 @@ constructor: [
 | ident constructor_type
 ]
 
-of_type_with_opt_coercion: [
-| ":>>"
-| ":>" ">"
-| ":>"
-| ":" ">" ">"
-| ":" ">"
-| ":"
+cofix_definition: [
+| ident_decl binders_opt colon_term_opt term_opt2 decl_notation
+]
+
+scheme: [
+| scheme_kind
+| ident ":=" scheme_kind
+]
+
+scheme_kind: [
+| "Induction" "for" smart_global "Sort" sort_family
+| "Minimality" "for" smart_global "Sort" sort_family
+| "Elimination" "for" smart_global "Sort" sort_family
+| "Case" "for" smart_global "Sort" sort_family
+| "Equality" "for" smart_global
+]
+
+sort_family: [
+| "Set"
+| "Prop"
+| "SProp"
+| "Type"
+]
+
+smart_global: [
+| qualid
+| by_notation
+]
+
+by_notation: [
+| string ident_opt2
+]
+
+ident_opt2: [
+| "%" ident
+| empty
 ]
 
 gallina_ext: [
 | "Module" export_token ident module_binder_list_opt of_module_type is_module_expr
-| "Module" "Type" ident module_binder_list_opt check_module_types is_module_type
+| "Module" "Type" ident module_binder_list_opt module_type_inl_list_opt is_module_type
 | "Declare" "Module" export_token ident module_binder_list_opt ":" module_type_inl
 | "Section" ident
 | "Chapter" ident
 | "End" ident
 | "Collection" ident ":=" section_subset_expr
-| "Require" export_token global_list
-| "From" global "Require" export_token global_list
-| "Import" global_list
-| "Export" global_list
-| "Include" module_type_inl ext_module_expr_list_opt
-| "Include" "Type" module_type_inl ext_module_type_list_opt
+| "Require" export_token qualid_list
+| "From" qualid "Require" export_token qualid_list
+| "Import" qualid_list
+| "Export" qualid_list
+| "Include" module_type_inl module_expr_inl_list_opt
+| "Include" "Type" module_type_inl module_type_inl_list_opt
 | "Transparent" smart_global_list
 | "Opaque" smart_global_list
 | "Strategy" strategy_level_list
-| "Canonical" Structure_opt global univ_decl_opt2
+| "Canonical" Structure_opt qualid univ_decl_opt2
 | "Canonical" Structure_opt by_notation
-| "Coercion" global univ_decl_opt def_body
+| "Coercion" qualid univ_decl_opt def_body
 | "Identity" "Coercion" ident ":" class_rawexpr ">->" class_rawexpr
-| "Coercion" global ":" class_rawexpr ">->" class_rawexpr
+| "Coercion" qualid ":" class_rawexpr ">->" class_rawexpr
 | "Coercion" by_notation ":" class_rawexpr ">->" class_rawexpr
-| "Context" binder_list
-| "Instance" instance_name ":" operconstr200 hint_info record_declaration_opt
-| "Existing" "Instance" global hint_info
-| "Existing" "Instances" global_list natural_opt2
-| "Existing" "Class" global
+| "Context" binders
+| "Instance" instance_name ":" term hint_info fields_def_opt
+| "Existing" "Instance" qualid hint_info
+| "Existing" "Instances" qualid_list num_opt2
+| "Existing" "Class" qualid
 | "Arguments" smart_global argument_spec_block_list_opt more_implicits_block_opt arguments_modifier_opt
 | "Implicit" "Type" reserv_list
 | "Implicit" "Types" reserv_list
 | "Generalizable" All_alt
-| "Export" "Set" option_table option_setting
-| "Export" "Unset" option_table
-| "Import" "Prenex" "Implicits" (* ssr plugin *)
+| "Export" "Set" ident_list option_setting
+| "Export" "Unset" ident_list
 ]
 
-module_binder_list_opt: [
-| module_binder_list_opt module_binder
-| empty
+smart_global_list: [
+| smart_global_list smart_global
+| smart_global
 ]
 
-ext_module_expr_list_opt: [
-| ext_module_expr_list_opt ext_module_expr
+num_opt: [
+| num
 | empty
 ]
 
-ext_module_type_list_opt: [
-| ext_module_type_list_opt ext_module_type
+qualid_list: [
+| qualid_list qualid
+| qualid
+]
+
+option_setting: [
 | empty
+| int
+| string
 ]
 
-strategy_level_list: [
-| strategy_level_list strategy_level "[" smart_global_list "]"
-| strategy_level "[" smart_global_list "]"
+class_rawexpr: [
+| "Funclass"
+| "Sortclass"
+| smart_global
 ]
 
-Structure_opt: [
-| "Structure"
+hint_info: [
+| "|" num_opt term1_extended_opt
 | empty
 ]
 
-univ_decl_opt: [
-| univ_decl
+module_binder_list_opt: [
+| module_binder_list_opt "(" export_token ident_list ":" module_type_inl ")"
 | empty
 ]
 
-binder_list: [
-| binder_list binder
-| binder
+module_type_inl_list_opt: [
+| module_type_inl_list_opt module_type_inl
+| empty
 ]
 
-record_declaration_opt: [
-| ":=" "{" record_declaration "}"
-| ":=" lconstr
+module_expr_inl_list_opt: [
+| module_expr_inl_list_opt module_expr_inl
 | empty
 ]
 
-natural_opt: [
-| natural
+strategy_level_list: [
+| strategy_level_list strategy_level "[" smart_global_list "]"
+| strategy_level "[" smart_global_list "]"
+]
+
+fields_def_opt: [
+| ":=" "{" fields_def "}"
+| ":=" term
 | empty
 ]
@@ -1114,50 +1091,54 @@ univ_decl_opt2: [
 | empty
 ]
 
-export_token: [
-| "Import"
-| "Export"
+univ_decl_opt: [
+| "@{" ident_list_opt plus_opt univ_constraint_alt
 | empty
 ]
 
-ext_module_type: [
-| "<+" module_type_inl
+plus_opt: [
+| "+"
+| empty
 ]
 
-ext_module_expr: [
-| "<+" module_expr_inl
+univ_constraint_alt: [
+| "|" univ_constraint_list_comma_opt plus_opt "}"
+| rbrace_alt
 ]
 
-check_module_type: [
-| "<:" module_type_inl
+univ_constraint_list_comma_opt: [
+| univ_constraint_list_comma
+| empty
 ]
 
-check_module_types: [
-| check_module_type_list_opt
+rbrace_alt: [
+| "}"
+| "|}"
 ]
 
-check_module_type_list_opt: [
-| check_module_type_list_opt check_module_type
+export_token: [
+| "Import"
+| "Export"
 | empty
 ]
 
 of_module_type: [
 | ":" module_type_inl
-| check_module_types
+| module_type_inl_list_opt
 ]
 
 is_module_type: [
-| ":=" module_type_inl ext_module_type_list_opt
+| ":=" module_type_inl module_type_inl_list_opt
 | empty
 ]
 
 is_module_expr: [
-| ":=" module_expr_inl ext_module_expr_list_opt
+| ":=" module_expr_inl module_expr_inl_list_opt
 | empty
 ]
 
 functor_app_annot: [
-| "[" "inline" "at" "level" natural "]"
+| "[" "inline" "at" "level" num "]"
 | "[" "no" "inline" "]"
 | empty
 ]
@@ -1172,10 +1153,6 @@ module_type_inl: [
 | module_type functor_app_annot
 ]
 
-module_binder: [
-| "(" export_token ident_list ":" module_type_inl ")"
-]
-
 module_expr: [
 | module_expr_atom
 | module_expr module_expr_atom
@@ -1186,11 +1163,6 @@ module_expr_atom: [
 | "(" module_expr ")"
 ]
 
-with_declaration: [
-| "Definition" fullyqualid univ_decl_opt ":=" lconstr
-| "Module" fullyqualid ":=" qualid
-]
-
 module_type: [
 | qualid
 | "(" module_type ")"
@@ -1198,108 +1170,45 @@ module_type: [
 | module_type "with" with_declaration
 ]
 
-section_subset_expr: [
-| starredidentref_list_opt
-| ssexpr35
-]
-
-starredidentref_list_opt: [
-| starredidentref_list_opt starredidentref
-| empty
-]
-
-starredidentref: [
-| ident
-| ident "*"
-| "Type"
-| "Type" "*"
-]
-
-ssexpr35: [
-| "-" ssexpr50
-| ssexpr50
-]
-
-ssexpr50: [
-| ssexpr0 "-" ssexpr0
-| ssexpr0 "+" ssexpr0
-| ssexpr0
-]
-
-ssexpr0: [
-| starredidentref
-| "(" starredidentref_list_opt ")"
-| "(" starredidentref_list_opt ")" "*"
-| "(" ssexpr35 ")"
-| "(" ssexpr35 ")" "*"
-]
-
-arguments_modifier: [
-| "simpl" "nomatch"
-| "simpl" "never"
-| "default" "implicits"
-| "clear" "implicits"
-| "clear" "scopes"
-| "clear" "bidirectionality" "hint"
-| "rename"
-| "assert"
-| "extra" "scopes"
-| "clear" "scopes" "and" "implicits"
-| "clear" "implicits" "and" "scopes"
-]
-
-scope: [
-| "%" IDENT
-]
-
-argument_spec: [
-| exclam_opt name scope_opt
-]
-
-exclam_opt: [
-| "!"
-| empty
-]
-
-scope_opt: [
-| scope
-| empty
+with_declaration: [
+| "Definition" qualid univ_decl_opt ":=" term
+| "Module" qualid ":=" qualid
 ]
 
 argument_spec_block: [
-| argument_spec
+| exclam_opt name scope_delimiter_opt
 | "/"
 | "&"
-| "(" argument_spec_list ")" scope_opt
-| "[" argument_spec_list "]" scope_opt
-| "{" argument_spec_list "}" scope_opt
+| "(" scope_delimiter_list ")" scope_delimiter_opt
+| "[" scope_delimiter_list "]" scope_delimiter_opt
+| "{" scope_delimiter_list "}" scope_delimiter_opt
+]
+
+scope_delimiter_opt: [
+| "%" ident
+| empty
 ]
 
-argument_spec_list: [
-| argument_spec_list argument_spec
-| argument_spec
+scope_delimiter_list: [
+| scope_delimiter_list scope_delimiter_opt
+| scope_delimiter_opt
 ]
 
 more_implicits_block: [
 | name
-| "[" name_list "]"
-| "{" name_list "}"
+| "[" names "]"
+| "{" names "}"
 ]
 
 strategy_level: [
 | "expand"
 | "opaque"
-| integer
+| int
 | "transparent"
 ]
 
 instance_name: [
-| ident_decl binders
-| empty
-]
-
-hint_info: [
-| "|" natural_opt constr_pattern_opt
+| ident_decl binders_opt
 | empty
 ]
 
@@ -1318,64 +1227,83 @@ reserv_tuple: [
 ]
 
 simple_reserv: [
-| ident_list ":" lconstr
+| ident_list ":" term
+]
+
+arguments_modifier: [
+| "simpl" "nomatch"
+| "simpl" "never"
+| "default" "implicits"
+| "clear" "implicits"
+| "clear" "scopes"
+| "clear" "bidirectionality" "hint"
+| "rename"
+| "assert"
+| "extra" "scopes"
+| "clear" "scopes" "and" "implicits"
+| "clear" "implicits" "and" "scopes"
+]
+
+Structure_opt: [
+| "Structure"
+| empty
 ]
 
 command: [
+| "Goal" term
 | "Comments" comment_list_opt
-| "Declare" "Instance" ident_decl binders ":" operconstr200 hint_info
-| "Declare" "Scope" IDENT
+| "Declare" "Instance" ident_decl binders_opt ":" term hint_info
+| "Declare" "Scope" ident
 | "Pwd"
 | "Cd"
-| "Cd" ne_string
-| "Load" Verbose_opt ne_string_alt
-| "Declare" "ML" "Module" ne_string_list
+| "Cd" string
+| "Load" Verbose_opt string_alt
+| "Declare" "ML" "Module" string_list
 | "Locate" locatable
-| "Add" "LoadPath" ne_string as_dirpath
-| "Add" "Rec" "LoadPath" ne_string as_dirpath
-| "Remove" "LoadPath" ne_string
-| "AddPath" ne_string "as" as_dirpath
-| "AddRecPath" ne_string "as" as_dirpath
-| "DelPath" ne_string
-| "Type" lconstr
+| "Add" "LoadPath" string as_dirpath
+| "Add" "Rec" "LoadPath" string as_dirpath
+| "Remove" "LoadPath" string
+| "AddPath" string "as" as_dirpath
+| "AddRecPath" string "as" as_dirpath
+| "DelPath" string
+| "Type" term
 | "Print" printable
 | "Print" smart_global univ_name_list_opt
-| "Print" "Module" "Type" global
-| "Print" "Module" global
+| "Print" "Module" "Type" qualid
+| "Print" "Module" qualid
 | "Print" "Namespace" dirpath
-| "Inspect" natural
-| "Add" "ML" "Path" ne_string
-| "Add" "Rec" "ML" "Path" ne_string
-| "Set" option_table option_setting
-| "Unset" option_table
-| "Print" "Table" option_table
-| "Add" IDENT IDENT option_ref_value_list
-| "Add" IDENT option_ref_value_list
-| "Test" option_table "for" option_ref_value_list
-| "Test" option_table
-| "Remove" IDENT IDENT option_ref_value_list
-| "Remove" IDENT option_ref_value_list
-| "Write" "State" IDENT
-| "Write" "State" ne_string
-| "Restore" "State" IDENT
-| "Restore" "State" ne_string
+| "Inspect" num
+| "Add" "ML" "Path" string
+| "Add" "Rec" "ML" "Path" string
+| "Set" ident_list option_setting
+| "Unset" ident_list
+| "Print" "Table" ident_list
+| "Add" ident ident option_ref_value_list
+| "Add" ident option_ref_value_list
+| "Test" ident_list "for" option_ref_value_list
+| "Test" ident_list
+| "Remove" ident ident option_ref_value_list
+| "Remove" ident option_ref_value_list
+| "Write" "State" ident
+| "Write" "State" string
+| "Restore" "State" ident
+| "Restore" "State" string
 | "Reset" "Initial"
 | "Reset" ident
 | "Back"
-| "Back" natural
-| "BackTo" natural
+| "Back" num
 | "Debug" "On"
 | "Debug" "Off"
-| "Declare" "Reduction" IDENT; ":=" red_expr
-| "Declare" "Custom" "Entry" IDENT
-| "Goal" lconstr
+| "Declare" "Reduction" ident ":=" red_expr
+| "Declare" "Custom" "Entry" ident
+| "Derive" ident "SuchThat" term1_extended "As" ident (* derive plugin *)
 | "Proof"
 | "Proof" "Mode" string
-| "Proof" lconstr
+| "Proof" term
 | "Abort"
 | "Abort" "All"
 | "Abort" ident
-| "Existential" natural constr_body
+| "Existential" num constr_body
 | "Admitted"
 | "Qed"
 | "Save" ident
@@ -1383,14 +1311,14 @@ command: [
 | "Defined" ident
 | "Restart"
 | "Undo"
-| "Undo" natural
-| "Undo" "To" natural
+| "Undo" num
+| "Undo" "To" num
 | "Focus"
-| "Focus" natural
+| "Focus" num
 | "Unfocus"
 | "Unfocused"
 | "Show"
-| "Show" natural
+| "Show" num
 | "Show" ident
 | "Show" "Existentials"
 | "Show" "Universes"
@@ -1398,47 +1326,57 @@ command: [
 | "Show" "Proof"
 | "Show" "Intro"
 | "Show" "Intros"
-| "Show" "Match" reference
+| "Show" "Match" qualid
 | "Guarded"
-| "Create" "HintDb" IDENT discriminated_opt
-| "Remove" "Hints" global_list opt_hintbases
+| "Create" "HintDb" ident discriminated_opt
+| "Remove" "Hints" qualid_list opt_hintbases
 | "Hint" hint opt_hintbases
-| "Obligation" integer "of" ident ":" lglob withtac
-| "Obligation" integer "of" ident withtac
-| "Obligation" integer ":" lglob withtac
-| "Obligation" integer withtac
+| "Obligation" int "of" ident ":" term withtac
+| "Obligation" int "of" ident withtac
+| "Obligation" int ":" term withtac
+| "Obligation" int withtac
 | "Next" "Obligation" "of" ident withtac
 | "Next" "Obligation" withtac
-| "Solve" "Obligation" integer "of" ident "with" tactic
-| "Solve" "Obligation" integer "with" tactic
-| "Solve" "Obligations" "of" ident "with" tactic
-| "Solve" "Obligations" "with" tactic
+| "Solve" "Obligation" int "of" ident "with" ltac_expr
+| "Solve" "Obligation" int "with" ltac_expr
+| "Solve" "Obligations" "of" ident "with" ltac_expr
+| "Solve" "Obligations" "with" ltac_expr
 | "Solve" "Obligations"
-| "Solve" "All" "Obligations" "with" tactic
+| "Solve" "All" "Obligations" "with" ltac_expr
 | "Solve" "All" "Obligations"
 | "Admit" "Obligations" "of" ident
 | "Admit" "Obligations"
-| "Obligation" "Tactic"
":=" tactic +| "Obligation" "Tactic" ":=" ltac_expr | "Show" "Obligation" "Tactic" | "Obligations" "of" ident | "Obligations" | "Preterm" "of" ident | "Preterm" -| "Hint" "Rewrite" orient term_list ":" preident_list_opt -| "Hint" "Rewrite" orient term_list "using" tactic ":" preident_list_opt -| "Hint" "Rewrite" orient term_list -| "Hint" "Rewrite" orient term_list "using" tactic -| "Derive" "Inversion_clear" ident "with" term "Sort" sort_family -| "Derive" "Inversion_clear" ident "with" term -| "Derive" "Inversion" ident "with" term "Sort" sort_family -| "Derive" "Inversion" ident "with" term -| "Derive" "Dependent" "Inversion" ident "with" term "Sort" sort_family -| "Derive" "Dependent" "Inversion_clear" ident "with" term "Sort" sort_family -| "Declare" "Left" "Step" term -| "Declare" "Right" "Step" term +| "Add" "Relation" term1_extended term1_extended "reflexivity" "proved" "by" term1_extended "symmetry" "proved" "by" term1_extended "as" ident +| "Add" "Relation" term1_extended term1_extended "reflexivity" "proved" "by" term1_extended "as" ident +| "Add" "Relation" term1_extended term1_extended "as" ident +| "Add" "Relation" term1_extended term1_extended "symmetry" "proved" "by" term1_extended "as" ident +| "Add" "Relation" term1_extended term1_extended "symmetry" "proved" "by" term1_extended "transitivity" "proved" "by" term1_extended "as" ident +| "Add" "Relation" term1_extended term1_extended "reflexivity" "proved" "by" term1_extended "transitivity" "proved" "by" term1_extended "as" ident +| "Add" "Relation" term1_extended term1_extended "reflexivity" "proved" "by" term1_extended "symmetry" "proved" "by" term1_extended "transitivity" "proved" "by" term1_extended "as" ident +| "Add" "Relation" term1_extended term1_extended "transitivity" "proved" "by" term1_extended "as" ident +| "Add" "Parametric" "Relation" binders_opt ":" term1_extended term1_extended "reflexivity" "proved" "by" term1_extended "symmetry" "proved" "by" term1_extended "as" ident +| "Add" 
"Parametric" "Relation" binders_opt ":" term1_extended term1_extended "reflexivity" "proved" "by" term1_extended "as" ident +| "Add" "Parametric" "Relation" binders_opt ":" term1_extended term1_extended "as" ident +| "Add" "Parametric" "Relation" binders_opt ":" term1_extended term1_extended "symmetry" "proved" "by" term1_extended "as" ident +| "Add" "Parametric" "Relation" binders_opt ":" term1_extended term1_extended "symmetry" "proved" "by" term1_extended "transitivity" "proved" "by" term1_extended "as" ident +| "Add" "Parametric" "Relation" binders_opt ":" term1_extended term1_extended "reflexivity" "proved" "by" term1_extended "transitivity" "proved" "by" term1_extended "as" ident +| "Add" "Parametric" "Relation" binders_opt ":" term1_extended term1_extended "reflexivity" "proved" "by" term1_extended "symmetry" "proved" "by" term1_extended "transitivity" "proved" "by" term1_extended "as" ident +| "Add" "Parametric" "Relation" binders_opt ":" term1_extended term1_extended "transitivity" "proved" "by" term1_extended "as" ident +| "Add" "Setoid" term1_extended term1_extended term1_extended "as" ident +| "Add" "Parametric" "Setoid" binders_opt ":" term1_extended term1_extended term1_extended "as" ident +| "Add" "Morphism" term1_extended ":" ident +| "Declare" "Morphism" term1_extended ":" ident +| "Add" "Morphism" term1_extended "with" "signature" term "as" ident +| "Add" "Parametric" "Morphism" binders_opt ":" term1_extended "with" "signature" term "as" ident | "Grab" "Existential" "Variables" | "Unshelve" -| "Declare" "Equivalent" "Keys" term term +| "Declare" "Equivalent" "Keys" term1_extended term1_extended | "Print" "Equivalent" "Keys" | "Optimize" "Proof" | "Optimize" "Heap" @@ -1446,129 +1384,143 @@ command: [ | "Show" "Ltac" "Profile" | "Show" "Ltac" "Profile" "CutOff" int | "Show" "Ltac" "Profile" string +| "Add" "InjTyp" term1_extended (* micromega plugin *) +| "Add" "BinOp" term1_extended (* micromega plugin *) +| "Add" "UnOp" term1_extended (* 
micromega plugin *) +| "Add" "CstOp" term1_extended (* micromega plugin *) +| "Add" "BinRel" term1_extended (* micromega plugin *) +| "Add" "PropOp" term1_extended (* micromega plugin *) +| "Add" "PropUOp" term1_extended (* micromega plugin *) +| "Add" "Spec" term1_extended (* micromega plugin *) +| "Add" "BinOpSpec" term1_extended (* micromega plugin *) +| "Add" "UnOpSpec" term1_extended (* micromega plugin *) +| "Add" "Saturate" term1_extended (* micromega plugin *) +| "Show" "Zify" "InjTyp" (* micromega plugin *) +| "Show" "Zify" "BinOp" (* micromega plugin *) +| "Show" "Zify" "UnOp" (* micromega plugin *) +| "Show" "Zify" "CstOp" (* micromega plugin *) +| "Show" "Zify" "BinRel" (* micromega plugin *) +| "Show" "Zify" "Spec" (* micromega plugin *) +| "Add" "Ring" ident ":" term1_extended ring_mods_opt (* setoid_ring plugin *) | "Hint" "Cut" "[" hints_path "]" opthints -| "Typeclasses" "Transparent" reference_list_opt -| "Typeclasses" "Opaque" reference_list_opt +| "Typeclasses" "Transparent" qualid_list_opt +| "Typeclasses" "Opaque" qualid_list_opt | "Typeclasses" "eauto" ":=" debug eauto_search_strategy int_opt -| "Add" "Relation" term term "reflexivity" "proved" "by" term "symmetry" "proved" "by" term "as" ident -| "Add" "Relation" term term "reflexivity" "proved" "by" term "as" ident -| "Add" "Relation" term term "as" ident -| "Add" "Relation" term term "symmetry" "proved" "by" term "as" ident -| "Add" "Relation" term term "symmetry" "proved" "by" term "transitivity" "proved" "by" term "as" ident -| "Add" "Relation" term term "reflexivity" "proved" "by" term "transitivity" "proved" "by" term "as" ident -| "Add" "Relation" term term "reflexivity" "proved" "by" term "symmetry" "proved" "by" term "transitivity" "proved" "by" term "as" ident -| "Add" "Relation" term term "transitivity" "proved" "by" term "as" ident -| "Add" "Parametric" "Relation" binders ":" term term "reflexivity" "proved" "by" term "symmetry" "proved" "by" term "as" ident -| "Add" "Parametric" 
"Relation" binders ":" term term "reflexivity" "proved" "by" term "as" ident -| "Add" "Parametric" "Relation" binders ":" term term "as" ident -| "Add" "Parametric" "Relation" binders ":" term term "symmetry" "proved" "by" term "as" ident -| "Add" "Parametric" "Relation" binders ":" term term "symmetry" "proved" "by" term "transitivity" "proved" "by" term "as" ident -| "Add" "Parametric" "Relation" binders ":" term term "reflexivity" "proved" "by" term "transitivity" "proved" "by" term "as" ident -| "Add" "Parametric" "Relation" binders ":" term term "reflexivity" "proved" "by" term "symmetry" "proved" "by" term "transitivity" "proved" "by" term "as" ident -| "Add" "Parametric" "Relation" binders ":" term term "transitivity" "proved" "by" term "as" ident -| "Add" "Setoid" term term term "as" ident -| "Add" "Parametric" "Setoid" binders ":" term term term "as" ident -| "Add" "Morphism" term ":" ident -| "Declare" "Morphism" term ":" ident -| "Add" "Morphism" term "with" "signature" lconstr "as" ident -| "Add" "Parametric" "Morphism" binders ":" term "with" "signature" lconstr "as" ident -| "Print" "Rewrite" "HintDb" preident -| "Proof" "with" tactic using_opt +| "Print" "Rewrite" "HintDb" ident +| "Proof" "with" ltac_expr using_opt | "Proof" "using" section_subset_expr with_opt -| "Tactic" "Notation" ltac_tactic_level_opt ltac_production_item_list ":=" tactic -| "Print" "Ltac" reference -| "Locate" "Ltac" reference -| "Ltac" ltac_tacdef_body_list +| "Tactic" "Notation" ltac_tactic_level_opt ltac_production_item_list ":=" ltac_expr +| "Print" "Ltac" qualid +| "Locate" "Ltac" qualid +| "Ltac" tacdef_body_list | "Print" "Ltac" "Signatures" -| "String" "Notation" reference reference reference ":" ident -| "Set" "Firstorder" "Solver" tactic +| "Set" "Firstorder" "Solver" ltac_expr | "Print" "Firstorder" "Solver" -| "Numeral" "Notation" reference reference reference ":" ident numnotoption -| "Derive" ident "SuchThat" term "As" ident (* derive plugin *) -| "Extraction" 
global (* extraction plugin *) -| "Recursive" "Extraction" global_list (* extraction plugin *) -| "Extraction" string global_list (* extraction plugin *) -| "Extraction" "TestCompile" global_list (* extraction plugin *) -| "Separate" "Extraction" global_list (* extraction plugin *) +| "Extraction" qualid (* extraction plugin *) +| "Recursive" "Extraction" qualid_list (* extraction plugin *) +| "Extraction" string qualid_list (* extraction plugin *) +| "Extraction" "TestCompile" qualid_list (* extraction plugin *) +| "Separate" "Extraction" qualid_list (* extraction plugin *) | "Extraction" "Library" ident (* extraction plugin *) | "Recursive" "Extraction" "Library" ident (* extraction plugin *) | "Extraction" "Language" language (* extraction plugin *) -| "Extraction" "Inline" global_list (* extraction plugin *) -| "Extraction" "NoInline" global_list (* extraction plugin *) +| "Extraction" "Inline" qualid_list (* extraction plugin *) +| "Extraction" "NoInline" qualid_list (* extraction plugin *) | "Print" "Extraction" "Inline" (* extraction plugin *) | "Reset" "Extraction" "Inline" (* extraction plugin *) -| "Extraction" "Implicit" global "[" int_or_id_list_opt "]" (* extraction plugin *) +| "Extraction" "Implicit" qualid "[" int_or_id_list_opt "]" (* extraction plugin *) | "Extraction" "Blacklist" ident_list (* extraction plugin *) | "Print" "Extraction" "Blacklist" (* extraction plugin *) | "Reset" "Extraction" "Blacklist" (* extraction plugin *) -| "Extract" "Constant" global string_list_opt "=>" mlname (* extraction plugin *) -| "Extract" "Inlined" "Constant" global "=>" mlname (* extraction plugin *) -| "Extract" "Inductive" global "=>" mlname "[" mlname_list_opt "]" string_opt (* extraction plugin *) +| "Extract" "Constant" qualid string_list_opt "=>" mlname (* extraction plugin *) +| "Extract" "Inlined" "Constant" qualid "=>" mlname (* extraction plugin *) +| "Extract" "Inductive" qualid "=>" mlname "[" mlname_list_opt "]" string_opt (* extraction plugin *) 
| "Show" "Extraction" (* extraction plugin *) -| "Function" function_rec_definition_loc_list (* funind plugin *) +| "Function" fix_definition_list (* funind plugin *) | "Functional" "Scheme" fun_scheme_arg_list (* funind plugin *) | "Functional" "Case" fun_scheme_arg (* funind plugin *) -| "Generate" "graph" "for" reference (* funind plugin *) -| "Add" "Ring" ident ":" term ring_mods_opt (* setoid_ring plugin *) +| "Generate" "graph" "for" qualid (* funind plugin *) +| "Hint" "Rewrite" orient term1_extended_list ":" ident_list_opt +| "Hint" "Rewrite" orient term1_extended_list "using" ltac_expr ":" ident_list_opt +| "Hint" "Rewrite" orient term1_extended_list +| "Hint" "Rewrite" orient term1_extended_list "using" ltac_expr +| "Derive" "Inversion_clear" ident "with" term1_extended "Sort" sort_family +| "Derive" "Inversion_clear" ident "with" term1_extended +| "Derive" "Inversion" ident "with" term1_extended "Sort" sort_family +| "Derive" "Inversion" ident "with" term1_extended +| "Derive" "Dependent" "Inversion" ident "with" term1_extended "Sort" sort_family +| "Derive" "Dependent" "Inversion_clear" ident "with" term1_extended "Sort" sort_family +| "Declare" "Left" "Step" term1_extended +| "Declare" "Right" "Step" term1_extended | "Print" "Rings" (* setoid_ring plugin *) -| "Add" "Field" ident ":" term field_mods_opt (* setoid_ring plugin *) +| "Add" "Field" ident ":" term1_extended field_mods_opt (* setoid_ring plugin *) | "Print" "Fields" (* setoid_ring plugin *) -| "Prenex" "Implicits" global_list (* ssr plugin *) -| "Search" ssr_search_arg ssr_modlocs (* ssr plugin *) -| "Print" "Hint" "View" ssrviewpos (* ssr plugin *) -| "Hint" "View" ssrviewposspc ssrhintref_list (* ssr plugin *) +| "Numeral" "Notation" qualid qualid qualid ":" ident numnotoption +| "String" "Notation" qualid qualid qualid ":" ident ] -comment_list_opt: [ -| comment_list_opt comment +orient: [ +| "->" +| "<-" | empty ] -Verbose_opt: [ -| "Verbose" +string_opt: [ +| string | empty ] 
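The extraction-plugin productions above (``"Extraction" "Language" language``, ``"Recursive" "Extraction" qualid_list``) can be exercised with a minimal sketch of a Coq session; ``double`` is an illustrative definition, not part of the grammar:

```coq
(* Minimal sketch exercising the extraction-plugin productions above. *)
(* [double] is a hypothetical example definition.                     *)
Require Extraction.

Definition double (n : nat) : nat := n + n.

Extraction Language OCaml.     (* "Extraction" "Language" language       *)
Recursive Extraction double.   (* "Recursive" "Extraction" qualid_list   *)
```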
-ne_string_alt: [ -| ne_string -| IDENT +qualid_list_opt: [ +| qualid_list_opt qualid +| empty ] -ne_string_list: [ -| ne_string_list ne_string -| ne_string +univ_name_list_opt: [ +| "@{" name_list_opt "}" +| empty ] -univ_name_list_opt: [ -| univ_name_list +name_list_opt: [ +| name_list_opt name | empty ] -option_ref_value_list: [ -| option_ref_value_list option_ref_value -| option_ref_value +section_subset_expr: [ +| starredidentref_list_opt +| ssexpr ] -discriminated_opt: [ -| "discriminated" -| empty +ssexpr: [ +| "-" ssexpr50 +| ssexpr50 ] -global_list: [ -| global_list global -| global +ssexpr50: [ +| ssexpr0 "-" ssexpr0 +| ssexpr0 "+" ssexpr0 +| ssexpr0 ] -preident_list_opt: [ -| preident_list_opt preident -| empty +ssexpr0: [ +| starredidentref +| "(" starredidentref_list_opt ")" +| "(" starredidentref_list_opt ")" "*" +| "(" ssexpr ")" +| "(" ssexpr ")" "*" ] -reference_list_opt: [ -| reference_list_opt reference +starredidentref_list_opt: [ +| starredidentref_list_opt starredidentref | empty ] +starredidentref: [ +| ident +| ident "*" +| "Type" +| "Type" "*" +] + int_opt: [ | int | empty @@ -1580,12 +1532,12 @@ using_opt: [ ] with_opt: [ -| "with" tactic +| "with" ltac_expr | empty ] ltac_tactic_level_opt: [ -| ltac_tactic_level +| "(" "at" "level" num ")" | empty ] @@ -1594,85 +1546,17 @@ ltac_production_item_list: [ | ltac_production_item ] -ltac_tacdef_body_list: [ -| ltac_tacdef_body_list "with" ltac_tacdef_body -| ltac_tacdef_body -] - -int_or_id_list_opt: [ -| int_or_id_list_opt int_or_id -| empty -] - -ident_list: [ -| ident_list ident -| ident -] - -string_list_opt: [ -| string_list_opt string -| empty -] - -mlname_list_opt: [ -| mlname_list_opt mlname -| empty -] - -string_opt: [ -| string -| empty -] - -function_rec_definition_loc_list: [ -| function_rec_definition_loc_list "with" function_rec_definition_loc -| function_rec_definition_loc -] - -fun_scheme_arg_list: [ -| fun_scheme_arg_list "with" fun_scheme_arg -| fun_scheme_arg -] - 
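The ``section_subset_expr`` and ``starredidentref`` productions above describe the argument of ``Proof using``; a minimal sketch, where ``S``, ``A``, ``a``, and ``witness`` are illustrative names:

```coq
(* Minimal sketch of "Proof" "using" section_subset_expr: the proof   *)
(* declares that it depends only on the section variable [a].         *)
Section S.
  Variable A : Type.
  Variable a : A.

  Lemma witness : A.
  Proof using a.      (* section_subset_expr = starredidentref "a" *)
    exact a.
  Qed.
End S.
```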
-ring_mods_opt: [ -| ring_mods -| empty -] - -field_mods_opt: [ -| field_mods -| empty -] - -ssrhintref_list: [ -| ssrhintref_list ssrhintref -| ssrhintref -] - -query_command: [ -| "Eval" red_expr "in" lconstr "." -| "Compute" lconstr "." -| "Check" lconstr "." -| "About" smart_global univ_name_list_opt "." -| "SearchHead" constr_pattern in_or_out_modules "." -| "SearchPattern" constr_pattern in_or_out_modules "." -| "SearchRewrite" constr_pattern in_or_out_modules "." -| "Search" searchabout_query searchabout_queries "." -| "SearchAbout" searchabout_query searchabout_queries "." -| "SearchAbout" "[" searchabout_query_list "]" in_or_out_modules "." -] - -searchabout_query_list: [ -| searchabout_query_list searchabout_query -| searchabout_query +tacdef_body_list: [ +| tacdef_body_list "with" tacdef_body +| tacdef_body ] printable: [ | "Term" smart_global univ_name_list_opt | "All" -| "Section" global -| "Grammar" IDENT -| "Custom" "Grammar" IDENT +| "Section" qualid +| "Grammar" ident +| "Custom" "Grammar" ident | "LoadPath" dirpath_opt | "Modules" | "Libraries" @@ -1686,17 +1570,18 @@ printable: [ | "Coercions" | "Coercion" "Paths" class_rawexpr class_rawexpr | "Canonical" "Projections" +| "Typing" "Flags" | "Tables" | "Options" | "Hint" | "Hint" smart_global | "Hint" "*" -| "HintDb" IDENT +| "HintDb" ident | "Scopes" -| "Scope" IDENT -| "Visibility" IDENT_opt3 +| "Scope" ident +| "Visibility" ident_opt | "Implicit" smart_global -| Sorted_opt "Universes" printunivs_subgraph_opt ne_string_opt +| Sorted_opt "Universes" printunivs_subgraph_opt string_opt | "Assumptions" smart_global | "Opaque" "Dependencies" smart_global | "Transparent" "Dependencies" smart_global @@ -1711,9 +1596,9 @@ dirpath_opt: [ | empty ] -IDENT_opt3: [ -| IDENT -| empty +dirpath: [ +| ident +| dirpath field ] Sorted_opt: [ @@ -1722,384 +1607,408 @@ Sorted_opt: [ ] printunivs_subgraph_opt: [ -| printunivs_subgraph +| "Subgraph" "(" qualid_list_opt ")" | empty ] -ne_string_opt: [ -| ne_string 
+comment_list_opt: [ +| comment_list_opt comment | empty ] -printunivs_subgraph: [ -| "Subgraph" "(" reference_list_opt ")" +Verbose_opt: [ +| "Verbose" +| empty ] -class_rawexpr: [ -| "Funclass" -| "Sortclass" -| smart_global +string_alt: [ +| string +| ident ] -locatable: [ -| smart_global -| "Term" smart_global -| "File" ne_string -| "Library" global -| "Module" global +string_list: [ +| string_list string +| string ] -option_setting: [ -| empty -| integer -| STRING +option_ref_value_list: [ +| option_ref_value_list option_ref_value +| option_ref_value ] -option_ref_value: [ -| global -| STRING +discriminated_opt: [ +| "discriminated" +| empty ] -option_table: [ -| IDENT_list +string_list_opt: [ +| string_list_opt string +| empty ] -as_dirpath: [ -| as_opt3 +mlname_list_opt: [ +| mlname_list_opt mlname +| empty ] -as_opt: [ -| "as" name -| empty +fun_scheme_arg_list: [ +| fun_scheme_arg_list "with" fun_scheme_arg +| fun_scheme_arg ] -ne_in_or_out_modules: [ -| "inside" global_list -| "outside" global_list +term1_extended_list: [ +| term1_extended_list term1_extended +| term1_extended ] -in_or_out_modules: [ -| ne_in_or_out_modules +ring_mods_opt: [ +| "(" ring_mod_list_comma ")" (* setoid_ring plugin *) | empty ] -comment: [ -| term -| STRING -| natural +field_mods_opt: [ +| "(" field_mod_list_comma ")" (* setoid_ring plugin *) +| empty ] -positive_search_mark: [ -| "-" -| empty +locatable: [ +| smart_global +| "Term" smart_global +| "File" string +| "Library" qualid +| "Module" qualid ] -searchabout_query: [ -| positive_search_mark ne_string scope_opt -| positive_search_mark constr_pattern +option_ref_value: [ +| qualid +| string ] -searchabout_queries: [ -| ne_in_or_out_modules -| searchabout_query searchabout_queries +as_dirpath: [ +| "as" dirpath | empty ] -univ_name_list: [ -| "@{" name_list_opt "}" +comment: [ +| term1_extended +| string +| num ] -syntax: [ -| "Open" "Scope" IDENT -| "Close" "Scope" IDENT -| "Delimit" "Scope" IDENT; "with" IDENT -| 
"Undelimit" "Scope" IDENT -| "Bind" "Scope" IDENT; "with" class_rawexpr_list -| "Infix" ne_lstring ":=" term syntax_modifier_opt IDENT_opt2 -| "Notation" ident ident_list_opt ":=" term only_parsing -| "Notation" lstring ":=" term syntax_modifier_opt IDENT_opt2 -| "Format" "Notation" STRING STRING STRING -| "Reserved" "Infix" ne_lstring syntax_modifier_opt -| "Reserved" "Notation" ne_lstring syntax_modifier_opt +reference_or_constr: [ +| qualid +| term1_extended ] -class_rawexpr_list: [ -| class_rawexpr_list class_rawexpr -| class_rawexpr +hint: [ +| "Resolve" reference_or_constr_list hint_info +| "Resolve" "->" qualid_list num_opt +| "Resolve" "<-" qualid_list num_opt +| "Immediate" reference_or_constr_list +| "Variables" "Transparent" +| "Variables" "Opaque" +| "Constants" "Transparent" +| "Constants" "Opaque" +| "Transparent" qualid_list +| "Opaque" qualid_list +| "Mode" qualid plus_list +| "Unfold" qualid_list +| "Constructors" qualid_list +| "Extern" num term1_extended_opt "=>" ltac_expr ] -syntax_modifier_opt: [ -| "(" syntax_modifier_list_comma ")" -| empty +reference_or_constr_list: [ +| reference_or_constr_list reference_or_constr +| reference_or_constr ] -syntax_modifier_list_comma: [ -| syntax_modifier_list_comma "," syntax_modifier -| syntax_modifier +constr_body: [ +| ":=" term +| ":" term ":=" term ] -only_parsing: [ -| "(" "only" "parsing" ")" -| "(" "compat" STRING ")" -| empty +plus_list: [ +| plus_list plus_alt +| plus_alt ] -level: [ -| "level" natural -| "next" "level" +plus_alt: [ +| "+" +| "!" 
+| "-" ] -syntax_modifier: [ -| "at" "level" natural -| "in" "custom" IDENT -| "in" "custom" IDENT; "at" "level" natural -| "left" "associativity" -| "right" "associativity" -| "no" "associativity" -| "only" "printing" -| "only" "parsing" -| "compat" STRING -| "format" STRING STRING_opt -| IDENT; "," IDENT_list_comma "at" level -| IDENT; "at" level -| IDENT; "at" level constr_as_binder_kind -| IDENT constr_as_binder_kind -| IDENT syntax_extension_type +withtac: [ +| "with" ltac_expr +| empty ] -STRING_opt: [ -| STRING -| empty +ltac_def_kind: [ +| ":=" +| "::=" ] -IDENT_list_comma: [ -| IDENT_list_comma "," IDENT -| IDENT +tacdef_body: [ +| qualid fun_var_list ltac_def_kind ltac_expr +| qualid ltac_def_kind ltac_expr ] -syntax_extension_type: [ -| "ident" -| "global" -| "bigint" -| "binder" -| "constr" -| "constr" at_level_opt constr_as_binder_kind_opt -| "pattern" -| "pattern" "at" "level" natural -| "strict" "pattern" -| "strict" "pattern" "at" "level" natural -| "closed" "binder" -| "custom" IDENT at_level_opt constr_as_binder_kind_opt +ltac_production_item: [ +| string +| ident "(" ident ltac_production_sep_opt ")" +| ident ] -at_level_opt: [ -| at_level +ltac_production_sep_opt: [ +| "," string | empty ] -constr_as_binder_kind_opt: [ -| constr_as_binder_kind +numnotoption: [ | empty +| "(" "warning" "after" num ")" +| "(" "abstract" "after" num ")" ] -at_level: [ -| "at" level +mlname: [ +| ident (* extraction plugin *) +| string (* extraction plugin *) ] -constr_as_binder_kind: [ -| "as" "ident" -| "as" "pattern" -| "as" "strict" "pattern" +int_or_id: [ +| ident (* extraction plugin *) +| int (* extraction plugin *) ] -opt_hintbases: [ -| empty -| ":" IDENT_list +language: [ +| "Ocaml" (* extraction plugin *) +| "OCaml" (* extraction plugin *) +| "Haskell" (* extraction plugin *) +| "Scheme" (* extraction plugin *) +| "JSON" (* extraction plugin *) ] -IDENT_list: [ -| IDENT_list IDENT -| IDENT +fun_scheme_arg: [ +| ident ":=" "Induction" "for" qualid "Sort" 
sort_family (* funind plugin *) ] -reference_or_constr: [ -| global -| term +ring_mod: [ +| "decidable" term1_extended (* setoid_ring plugin *) +| "abstract" (* setoid_ring plugin *) +| "morphism" term1_extended (* setoid_ring plugin *) +| "constants" "[" ltac_expr "]" (* setoid_ring plugin *) +| "closed" "[" qualid_list "]" (* setoid_ring plugin *) +| "preprocess" "[" ltac_expr "]" (* setoid_ring plugin *) +| "postprocess" "[" ltac_expr "]" (* setoid_ring plugin *) +| "setoid" term1_extended term1_extended (* setoid_ring plugin *) +| "sign" term1_extended (* setoid_ring plugin *) +| "power" term1_extended "[" qualid_list "]" (* setoid_ring plugin *) +| "power_tac" term1_extended "[" ltac_expr "]" (* setoid_ring plugin *) +| "div" term1_extended (* setoid_ring plugin *) ] -hint: [ -| "Resolve" reference_or_constr_list hint_info -| "Resolve" "->" global_list natural_opt -| "Resolve" "<-" global_list natural_opt -| "Immediate" reference_or_constr_list -| "Variables" "Transparent" -| "Variables" "Opaque" -| "Constants" "Transparent" -| "Constants" "Opaque" -| "Transparent" global_list -| "Opaque" global_list -| "Mode" global mode -| "Unfold" global_list -| "Constructors" global_list -| "Extern" natural constr_pattern_opt "=>" tactic +ring_mod_list_comma: [ +| ring_mod_list_comma "," ring_mod +| ring_mod ] -reference_or_constr_list: [ -| reference_or_constr_list reference_or_constr -| reference_or_constr +field_mod: [ +| ring_mod (* setoid_ring plugin *) +| "completeness" term1_extended (* setoid_ring plugin *) ] -natural_opt2: [ -| "|" natural -| empty +field_mod_list_comma: [ +| field_mod_list_comma "," field_mod +| field_mod ] -constr_pattern_opt: [ -| constr_pattern +debug: [ +| "debug" | empty ] -constr_body: [ -| ":=" lconstr -| ":" lconstr ":=" lconstr +eauto_search_strategy: [ +| "(bfs)" +| "(dfs)" +| empty ] -mode: [ -| plus_list +hints_path_atom: [ +| qualid_list +| "_" ] -plus_list: [ -| plus_list plus_alt -| plus_alt +hints_path: [ +| "(" hints_path ")" +| 
hints_path "*" +| "emp" +| "eps" +| hints_path "|" hints_path +| hints_path_atom +| hints_path hints_path ] -plus_alt: [ -| "+" -| "!" -| "-" +opthints: [ +| ":" ident_list +| empty ] -vernac_toplevel: [ -| "Drop" "." -| "Quit" "." -| "Backtrack" natural natural natural "." -| "Show" "Goal" natural "at" natural "." -| vernac_control +opt_hintbases: [ +| empty +| ":" ident_list ] -orient: [ -| "->" -| "<-" +int_or_id_list_opt: [ +| int_or_id_list_opt int_or_id | empty ] -occurrences: [ -| integer_list -| var +query_command: [ +| "Eval" red_expr "in" term "." +| "Compute" term "." +| "Check" term "." +| "About" smart_global univ_name_list_opt "." +| "SearchHead" term1_extended in_or_out_modules "." +| "SearchPattern" term1_extended in_or_out_modules "." +| "SearchRewrite" term1_extended in_or_out_modules "." +| "Search" searchabout_query searchabout_queries "." +| "SearchAbout" searchabout_query searchabout_queries "." +| "SearchAbout" "[" searchabout_query_list "]" in_or_out_modules "." ] -integer_list: [ -| integer_list integer -| integer +ne_in_or_out_modules: [ +| "inside" qualid_list +| "outside" qualid_list ] -glob: [ -| term +in_or_out_modules: [ +| ne_in_or_out_modules +| empty ] -lglob: [ -| lconstr +positive_search_mark: [ +| "-" +| empty ] -casted_constr: [ -| term +searchabout_query: [ +| positive_search_mark string scope_delimiter_opt +| positive_search_mark term1_extended ] -hloc: [ +searchabout_queries: [ +| ne_in_or_out_modules +| searchabout_query searchabout_queries | empty -| "in" "|-" "*" -| "in" ident -| "in" "(" "Type" "of" ident ")" -| "in" "(" "Value" "of" ident ")" -| "in" "(" "type" "of" ident ")" -| "in" "(" "value" "of" ident ")" ] -rename: [ -| ident "into" ident +searchabout_query_list: [ +| searchabout_query_list searchabout_query +| searchabout_query ] -by_arg_tac: [ -| "by" ltac_expr3 -| empty +syntax: [ +| "Open" "Scope" ident +| "Close" "Scope" ident +| "Delimit" "Scope" ident "with" ident +| "Undelimit" "Scope" ident +| "Bind" 
"Scope" ident "with" class_rawexpr_list +| "Infix" string ":=" term1_extended syntax_modifier_opt ident_opt3 +| "Notation" ident ident_list_opt ":=" term1_extended only_parsing +| "Notation" string ":=" term1_extended syntax_modifier_opt ident_opt3 +| "Format" "Notation" string string string +| "Reserved" "Infix" string syntax_modifier_opt +| "Reserved" "Notation" string syntax_modifier_opt ] -in_clause: [ -| in_clause -| "*" occs -| "*" "|-" concl_occ -| hypident_occ_list_comma_opt "|-" concl_occ -| hypident_occ_list_comma_opt +class_rawexpr_list: [ +| class_rawexpr_list class_rawexpr +| class_rawexpr ] -hypident_occ_list_comma_opt: [ -| hypident_occ_list_comma +syntax_modifier_opt: [ +| "(" syntax_modifier_list_comma ")" | empty ] -hypident_occ_list_comma: [ -| hypident_occ_list_comma "," hypident_occ -| hypident_occ +syntax_modifier_list_comma: [ +| syntax_modifier_list_comma "," syntax_modifier +| syntax_modifier ] -test_lpar_id_colon: [ +only_parsing: [ +| "(" "only" "parsing" ")" +| "(" "compat" string ")" | empty ] -withtac: [ -| "with" tactic -| empty +level: [ +| "level" num +| "next" "level" ] -closed_binder: [ -| "(" name name_list ":" lconstr ")" -| "(" name ":" lconstr ")" -| "(" name ":=" lconstr ")" -| "(" name ":" lconstr ":=" lconstr ")" -| "{" name "}" -| "{" name name_list ":" lconstr "}" -| "{" name ":" lconstr "}" -| "{" name name_list "}" -| "`(" typeclass_constraint_list_comma ")" -| "`{" typeclass_constraint_list_comma "}" -| "'" pattern0 -| of_alt operconstr99 (* ssr plugin *) -| "(" "_" ":" lconstr "|" lconstr ")" +syntax_modifier: [ +| "at" "level" num +| "in" "custom" ident +| "in" "custom" ident "at" "level" num +| "left" "associativity" +| "right" "associativity" +| "no" "associativity" +| "only" "printing" +| "only" "parsing" +| "compat" string +| "format" string string_opt +| ident "," ident_list_comma "at" level +| ident "at" level +| ident "at" level constr_as_binder_kind +| ident constr_as_binder_kind +| ident 
syntax_extension_type ] -typeclass_constraint_list_comma: [ -| typeclass_constraint_list_comma "," typeclass_constraint -| typeclass_constraint +syntax_extension_type: [ +| "ident" +| "global" +| "bigint" +| "binder" +| "constr" +| "constr" level_opt constr_as_binder_kind_opt +| "pattern" +| "pattern" "at" "level" num +| "strict" "pattern" +| "strict" "pattern" "at" "level" num +| "closed" "binder" +| "custom" ident level_opt constr_as_binder_kind_opt ] -of_alt: [ -| "of" -| "&" +level_opt: [ +| level +| empty +] + +constr_as_binder_kind_opt: [ +| constr_as_binder_kind +| empty +] + +constr_as_binder_kind: [ +| "as" "ident" +| "as" "pattern" +| "as" "strict" "pattern" ] simple_tactic: [ | "reflexivity" -| "exact" casted_constr +| "exact" term1_extended | "assumption" | "etransitivity" -| "cut" term -| "exact_no_check" term -| "vm_cast_no_check" term -| "native_cast_no_check" term -| "casetype" term -| "elimtype" term -| "lapply" term -| "transitivity" term +| "cut" term1_extended +| "exact_no_check" term1_extended +| "vm_cast_no_check" term1_extended +| "native_cast_no_check" term1_extended +| "casetype" term1_extended +| "elimtype" term1_extended +| "lapply" term1_extended +| "transitivity" term1_extended | "left" | "eleft" | "left" "with" bindings @@ -2131,32 +2040,32 @@ simple_tactic: [ | "intro" ident | "intro" ident "at" "top" | "intro" ident "at" "bottom" -| "intro" ident "after" var -| "intro" ident "before" var +| "intro" ident "after" ident +| "intro" ident "before" ident | "intro" "at" "top" | "intro" "at" "bottom" -| "intro" "after" var -| "intro" "before" var -| "move" var "at" "top" -| "move" var "at" "bottom" -| "move" var "after" var -| "move" var "before" var +| "intro" "after" ident +| "intro" "before" ident +| "move" ident "at" "top" +| "move" ident "at" "bottom" +| "move" ident "after" ident +| "move" ident "before" ident | "rename" rename_list_comma -| "revert" var_list +| "revert" ident_list | "simple" "induction" quantified_hypothesis | 
"simple" "destruct" quantified_hypothesis | "double" "induction" quantified_hypothesis quantified_hypothesis | "admit" -| "fix" ident natural +| "fix" ident num | "cofix" ident -| "clear" var_list_opt -| "clear" "-" var_list -| "clearbody" var_list -| "generalize" "dependent" term -| "replace" uconstr "with" term clause_dft_concl by_arg_tac -| "replace" "->" uconstr clause_dft_concl -| "replace" "<-" uconstr clause_dft_concl -| "replace" uconstr clause_dft_concl +| "clear" ident_list_opt +| "clear" "-" ident_list +| "clearbody" ident_list +| "generalize" "dependent" term1_extended +| "replace" term1_extended "with" term1_extended clause_dft_concl by_arg_tac +| "replace" "->" term1_extended clause_dft_concl +| "replace" "<-" term1_extended clause_dft_concl +| "replace" term1_extended clause_dft_concl | "simplify_eq" | "simplify_eq" destruction_arg | "esimplify_eq" @@ -2175,64 +2084,64 @@ simple_tactic: [ | "einjection" destruction_arg "as" simple_intropattern_list_opt | "simple" "injection" | "simple" "injection" destruction_arg -| "dependent" "rewrite" orient term -| "dependent" "rewrite" orient term "in" var -| "cutrewrite" orient term -| "cutrewrite" orient term "in" var -| "decompose" "sum" term -| "decompose" "record" term -| "absurd" term +| "dependent" "rewrite" orient term1_extended +| "dependent" "rewrite" orient term1_extended "in" ident +| "cutrewrite" orient term1_extended +| "cutrewrite" orient term1_extended "in" ident +| "decompose" "sum" term1_extended +| "decompose" "record" term1_extended +| "absurd" term1_extended | "contradiction" constr_with_bindings_opt -| "autorewrite" "with" preident_list clause_dft_concl -| "autorewrite" "with" preident_list clause_dft_concl "using" tactic -| "autorewrite" "*" "with" preident_list clause_dft_concl -| "autorewrite" "*" "with" preident_list clause_dft_concl "using" tactic -| "rewrite" "*" orient uconstr "in" var "at" occurrences by_arg_tac -| "rewrite" "*" orient uconstr "at" occurrences "in" var by_arg_tac -| 
"rewrite" "*" orient uconstr "in" var by_arg_tac -| "rewrite" "*" orient uconstr "at" occurrences by_arg_tac -| "rewrite" "*" orient uconstr by_arg_tac -| "refine" uconstr -| "simple" "refine" uconstr -| "notypeclasses" "refine" uconstr -| "simple" "notypeclasses" "refine" uconstr +| "autorewrite" "with" ident_list clause_dft_concl +| "autorewrite" "with" ident_list clause_dft_concl "using" ltac_expr +| "autorewrite" "*" "with" ident_list clause_dft_concl +| "autorewrite" "*" "with" ident_list clause_dft_concl "using" ltac_expr +| "rewrite" "*" orient term1_extended "in" ident "at" occurrences by_arg_tac +| "rewrite" "*" orient term1_extended "at" occurrences "in" ident by_arg_tac +| "rewrite" "*" orient term1_extended "in" ident by_arg_tac +| "rewrite" "*" orient term1_extended "at" occurrences by_arg_tac +| "rewrite" "*" orient term1_extended by_arg_tac +| "refine" term1_extended +| "simple" "refine" term1_extended +| "notypeclasses" "refine" term1_extended +| "simple" "notypeclasses" "refine" term1_extended | "solve_constraints" -| "subst" var_list +| "subst" ident_list | "subst" | "simple" "subst" -| "evar" test_lpar_id_colon "(" ident ":" lconstr ")" -| "evar" term -| "instantiate" "(" ident ":=" lglob ")" -| "instantiate" "(" integer ":=" lglob ")" hloc +| "evar" "(" ident ":" term ")" +| "evar" term1_extended +| "instantiate" "(" ident ":=" term ")" +| "instantiate" "(" int ":=" term ")" hloc | "instantiate" -| "stepl" term "by" tactic -| "stepl" term -| "stepr" term "by" tactic -| "stepr" term -| "generalize_eqs" var -| "dependent" "generalize_eqs" var -| "generalize_eqs_vars" var -| "dependent" "generalize_eqs_vars" var -| "specialize_eqs" var -| "hresolve_core" "(" ident ":=" term ")" "at" int_or_var "in" term -| "hresolve_core" "(" ident ":=" term ")" "in" term +| "stepl" term1_extended "by" ltac_expr +| "stepl" term1_extended +| "stepr" term1_extended "by" ltac_expr +| "stepr" term1_extended +| "generalize_eqs" ident +| "dependent" "generalize_eqs" 
ident +| "generalize_eqs_vars" ident +| "dependent" "generalize_eqs_vars" ident +| "specialize_eqs" ident +| "hresolve_core" "(" ident ":=" term1_extended ")" "at" int_or_var "in" term1_extended +| "hresolve_core" "(" ident ":=" term1_extended ")" "in" term1_extended | "hget_evar" int_or_var | "destauto" -| "destauto" "in" var +| "destauto" "in" ident | "transparent_abstract" ltac_expr3 | "transparent_abstract" ltac_expr3 "using" ident -| "constr_eq" term term -| "constr_eq_strict" term term -| "constr_eq_nounivs" term term -| "is_evar" term -| "has_evar" term -| "is_var" term -| "is_fix" term -| "is_cofix" term -| "is_ind" term -| "is_constructor" term -| "is_proj" term -| "is_const" term +| "constr_eq" term1_extended term1_extended +| "constr_eq_strict" term1_extended term1_extended +| "constr_eq_nounivs" term1_extended term1_extended +| "is_evar" term1_extended +| "has_evar" term1_extended +| "is_var" term1_extended +| "is_fix" term1_extended +| "is_cofix" term1_extended +| "is_ind" term1_extended +| "is_constructor" term1_extended +| "is_proj" term1_extended +| "is_const" term1_extended | "shelve" | "shelve_unifiable" | "unshelve" ltac_expr1 @@ -2240,8 +2149,8 @@ simple_tactic: [ | "cycle" int_or_var | "swap" int_or_var int_or_var | "revgoals" -| "guard" test -| "decompose" "[" term_list "]" term +| "guard" int_or_var comparison int_or_var +| "decompose" "[" term1_extended_list "]" term1_extended | "optimize_heap" | "start" "ltac" "profiling" | "stop" "ltac" "profiling" @@ -2253,51 +2162,51 @@ simple_tactic: [ | "finish_timing" string_opt | "finish_timing" "(" string ")" string_opt | "eassumption" -| "eexact" term +| "eexact" term1_extended | "trivial" auto_using hintbases | "info_trivial" auto_using hintbases | "debug" "trivial" auto_using hintbases | "auto" int_or_var_opt auto_using hintbases | "info_auto" int_or_var_opt auto_using hintbases | "debug" "auto" int_or_var_opt auto_using hintbases -| "prolog" "[" uconstr_list_opt "]" int_or_var +| "prolog" "[" 
term1_extended_list_opt "]" int_or_var | "eauto" int_or_var_opt int_or_var_opt auto_using hintbases | "new" "auto" int_or_var_opt auto_using hintbases | "debug" "eauto" int_or_var_opt int_or_var_opt auto_using hintbases | "info_eauto" int_or_var_opt int_or_var_opt auto_using hintbases | "dfs" "eauto" int_or_var_opt auto_using hintbases | "autounfold" hintbases clause_dft_concl -| "autounfold_one" hintbases "in" var +| "autounfold_one" hintbases "in" ident | "autounfold_one" hintbases -| "unify" term term -| "unify" term term "with" preident -| "convert_concl_no_check" term -| "typeclasses" "eauto" "bfs" int_or_var_opt "with" preident_list -| "typeclasses" "eauto" int_or_var_opt "with" preident_list +| "unify" term1_extended term1_extended +| "unify" term1_extended term1_extended "with" ident +| "convert_concl_no_check" term1_extended +| "typeclasses" "eauto" "bfs" int_or_var_opt "with" ident_list +| "typeclasses" "eauto" int_or_var_opt "with" ident_list | "typeclasses" "eauto" int_or_var_opt -| "head_of_constr" ident term -| "not_evar" term -| "is_ground" term -| "autoapply" term "using" preident -| "autoapply" term "with" preident -| "progress_evars" tactic -| "rewrite_strat" rewstrategy "in" var +| "head_of_constr" ident term1_extended +| "not_evar" term1_extended +| "is_ground" term1_extended +| "autoapply" term1_extended "using" ident +| "autoapply" term1_extended "with" ident +| "progress_evars" ltac_expr | "rewrite_strat" rewstrategy -| "rewrite_db" preident "in" var -| "rewrite_db" preident -| "substitute" orient glob_constr_with_bindings -| "setoid_rewrite" orient glob_constr_with_bindings -| "setoid_rewrite" orient glob_constr_with_bindings "in" var -| "setoid_rewrite" orient glob_constr_with_bindings "at" occurrences -| "setoid_rewrite" orient glob_constr_with_bindings "at" occurrences "in" var -| "setoid_rewrite" orient glob_constr_with_bindings "in" var "at" occurrences +| "rewrite_db" ident "in" ident +| "rewrite_db" ident +| "substitute" orient 
constr_with_bindings +| "setoid_rewrite" orient constr_with_bindings +| "setoid_rewrite" orient constr_with_bindings "in" ident +| "setoid_rewrite" orient constr_with_bindings "at" occurrences +| "setoid_rewrite" orient constr_with_bindings "at" occurrences "in" ident +| "setoid_rewrite" orient constr_with_bindings "in" ident "at" occurrences | "setoid_symmetry" -| "setoid_symmetry" "in" var +| "setoid_symmetry" "in" ident | "setoid_reflexivity" -| "setoid_transitivity" term +| "setoid_transitivity" term1_extended | "setoid_etransitivity" | "decide" "equality" -| "compare" term term +| "compare" term1_extended term1_extended +| "rewrite_strat" rewstrategy "in" ident | "intros" intropattern_list_opt | "eintros" intropattern_list_opt | "apply" constr_with_bindings_arg_list_comma in_hyp_as @@ -2308,33 +2217,33 @@ simple_tactic: [ | "eelim" constr_with_bindings_arg eliminator_opt | "case" induction_clause_list | "ecase" induction_clause_list -| "fix" ident natural "with" fixdecl_list +| "fix" ident num "with" fixdecl_list | "cofix" ident "with" cofixdecl_list | "pose" bindings_with_parameters -| "pose" term as_name +| "pose" term1_extended as_name | "epose" bindings_with_parameters -| "epose" term as_name +| "epose" term1_extended as_name | "set" bindings_with_parameters clause_dft_concl -| "set" term as_name clause_dft_concl +| "set" term1_extended as_name clause_dft_concl | "eset" bindings_with_parameters clause_dft_concl -| "eset" term as_name clause_dft_concl -| "remember" term as_name eqn_ipat clause_dft_all -| "eremember" term as_name eqn_ipat clause_dft_all -| "assert" "(" ident ":=" lconstr ")" -| "eassert" "(" ident ":=" lconstr ")" -| "assert" test_lpar_id_colon "(" ident ":" lconstr ")" by_tactic -| "eassert" test_lpar_id_colon "(" ident ":" lconstr ")" by_tactic -| "enough" test_lpar_id_colon "(" ident ":" lconstr ")" by_tactic -| "eenough" test_lpar_id_colon "(" ident ":" lconstr ")" by_tactic -| "assert" term as_ipat by_tactic -| "eassert" term as_ipat 
by_tactic -| "pose" "proof" lconstr as_ipat -| "epose" "proof" lconstr as_ipat -| "enough" term as_ipat by_tactic -| "eenough" term as_ipat by_tactic -| "generalize" term -| "generalize" term term_list -| "generalize" term occs as_name pattern_occ_list_opt +| "eset" term1_extended as_name clause_dft_concl +| "remember" term1_extended as_name eqn_ipat clause_dft_all +| "eremember" term1_extended as_name eqn_ipat clause_dft_all +| "assert" "(" ident ":=" term ")" +| "eassert" "(" ident ":=" term ")" +| "assert" "(" ident ":" term ")" by_tactic +| "eassert" "(" ident ":" term ")" by_tactic +| "enough" "(" ident ":" term ")" by_tactic +| "eenough" "(" ident ":" term ")" by_tactic +| "assert" term1_extended as_ipat by_tactic +| "eassert" term1_extended as_ipat by_tactic +| "pose" "proof" term as_ipat +| "epose" "proof" term as_ipat +| "enough" term1_extended as_ipat by_tactic +| "eenough" term1_extended as_ipat by_tactic +| "generalize" term1_extended +| "generalize" term1_extended term1_extended_list +| "generalize" term1_extended occs as_name pattern_occ_list_opt | "induction" induction_clause_list | "einduction" induction_clause_list | "destruct" induction_clause_list @@ -2345,7 +2254,7 @@ simple_tactic: [ | "simple" "inversion" quantified_hypothesis as_or_and_ipat in_hyp_list | "inversion" quantified_hypothesis as_or_and_ipat in_hyp_list | "inversion_clear" quantified_hypothesis as_or_and_ipat in_hyp_list -| "inversion" quantified_hypothesis "using" term in_hyp_list +| "inversion" quantified_hypothesis "using" term1_extended in_hyp_list | "red" clause_dft_concl | "hnf" clause_dft_concl | "simpl" delta_flag ref_or_pattern_occ_opt clause_dft_concl @@ -2356,357 +2265,176 @@ simple_tactic: [ | "vm_compute" ref_or_pattern_occ_opt clause_dft_concl | "native_compute" ref_or_pattern_occ_opt clause_dft_concl | "unfold" unfold_occ_list_comma clause_dft_concl -| "fold" term_list clause_dft_concl +| "fold" term1_extended_list clause_dft_concl | "pattern" pattern_occ_list_comma 
clause_dft_concl | "change" conversion clause_dft_concl | "change_no_check" conversion clause_dft_concl | "btauto" | "rtauto" | "congruence" -| "congruence" integer -| "congruence" "with" term_list -| "congruence" integer "with" term_list +| "congruence" int +| "congruence" "with" term1_extended_list +| "congruence" int "with" term1_extended_list | "f_equal" -| "firstorder" tactic_opt firstorder_using -| "firstorder" tactic_opt "with" preident_list -| "firstorder" tactic_opt firstorder_using "with" preident_list -| "gintuition" tactic_opt -| "functional" "inversion" quantified_hypothesis reference_opt (* funind plugin *) -| "functional" "induction" term_list fun_ind_using with_names (* funind plugin *) -| "soft" "functional" "induction" term_list fun_ind_using with_names (* funind plugin *) +| "firstorder" ltac_expr_opt firstorder_using +| "firstorder" ltac_expr_opt "with" ident_list +| "firstorder" ltac_expr_opt firstorder_using "with" ident_list +| "gintuition" ltac_expr_opt +| "functional" "inversion" quantified_hypothesis qualid_opt (* funind plugin *) +| "functional" "induction" term1_extended_list fun_ind_using with_names (* funind plugin *) +| "soft" "functional" "induction" term1_extended_list fun_ind_using with_names (* funind plugin *) | "myred" (* micromega plugin *) -| "psatz_Z" int_or_var tactic (* micromega plugin *) -| "psatz_Z" tactic (* micromega plugin *) -| "xlia" tactic (* micromega plugin *) -| "xnlia" tactic (* micromega plugin *) -| "xnra" tactic (* micromega plugin *) -| "xnqa" tactic (* micromega plugin *) -| "sos_Z" tactic (* micromega plugin *) -| "sos_Q" tactic (* micromega plugin *) -| "sos_R" tactic (* micromega plugin *) -| "lra_Q" tactic (* micromega plugin *) -| "lra_R" tactic (* micromega plugin *) -| "psatz_R" int_or_var tactic (* micromega plugin *) -| "psatz_R" tactic (* micromega plugin *) -| "psatz_Q" int_or_var tactic (* micromega plugin *) -| "psatz_Q" tactic (* micromega plugin *) -| "nsatz_compute" term (* nsatz plugin *) 
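The `psatz_*`, `xlia`, and `lra_*` entries in this production are the micromega plugin's internal entry points; users normally reach them through wrapper tactics such as `lia` or `lra`. A minimal sketch of the user-facing side, assuming a standard Coq installation with the micromega plugin available:

```coq
(* Sketch: lia discharges a linear integer-arithmetic goal by
   delegating to the internal micromega entry points named in the
   grammar (xlia and friends). *)
Require Import ZArith Lia.
Open Scope Z_scope.

Goal forall x y : Z, x < y -> x + 1 <= y.
Proof.
  intros x y H.
  lia.
Qed.
```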
+| "psatz_Z" int_or_var ltac_expr (* micromega plugin *) +| "psatz_Z" ltac_expr (* micromega plugin *) +| "xlia" ltac_expr (* micromega plugin *) +| "xnlia" ltac_expr (* micromega plugin *) +| "xnra" ltac_expr (* micromega plugin *) +| "xnqa" ltac_expr (* micromega plugin *) +| "sos_Z" ltac_expr (* micromega plugin *) +| "sos_Q" ltac_expr (* micromega plugin *) +| "sos_R" ltac_expr (* micromega plugin *) +| "lra_Q" ltac_expr (* micromega plugin *) +| "lra_R" ltac_expr (* micromega plugin *) +| "psatz_R" int_or_var ltac_expr (* micromega plugin *) +| "psatz_R" ltac_expr (* micromega plugin *) +| "psatz_Q" int_or_var ltac_expr (* micromega plugin *) +| "psatz_Q" ltac_expr (* micromega plugin *) +| "iter_specs" ltac_expr (* micromega plugin *) +| "zify_op" (* micromega plugin *) +| "saturate" (* micromega plugin *) +| "nsatz_compute" term1_extended (* nsatz plugin *) | "omega" (* omega plugin *) | "omega" "with" ident_list (* omega plugin *) | "omega" "with" "*" (* omega plugin *) | "protect_fv" string "in" ident (* setoid_ring plugin *) | "protect_fv" string (* setoid_ring plugin *) -| "ring_lookup" ltac_expr0 "[" term_list_opt "]" term_list (* setoid_ring plugin *) -| "field_lookup" tactic "[" term_list_opt "]" term_list (* setoid_ring plugin *) -| "YouShouldNotTypeThis" ssrintrosarg (* ssr plugin *) -| "by" ssrhintarg (* ssr plugin *) -| "YouShouldNotTypeThis" "do" (* ssr plugin *) -| "YouShouldNotTypeThis" ssrtclarg ssrseqarg (* ssr plugin *) -| "clear" natural (* ssr plugin *) -| "move" ssrmovearg ssrrpat (* ssr plugin *) -| "move" ssrmovearg ssrclauses (* ssr plugin *) -| "move" ssrrpat (* ssr plugin *) -| "move" (* ssr plugin *) -| "case" ssrcasearg ssrclauses (* ssr plugin *) -| "case" (* ssr plugin *) -| "elim" ssrarg ssrclauses (* ssr plugin *) -| "elim" (* ssr plugin *) -| "apply" ssrapplyarg (* ssr plugin *) -| "apply" (* ssr plugin *) -| "exact" ssrexactarg (* ssr plugin *) -| "exact" (* ssr plugin *) -| "exact" "<:" lconstr (* ssr plugin *) -| "congr" 
ssrcongrarg (* ssr plugin *) -| "ssrinstancesofruleL2R" ssrterm (* ssr plugin *) -| "ssrinstancesofruleR2L" ssrterm (* ssr plugin *) -| "rewrite" ssrrwargs ssrclauses (* ssr plugin *) -| "unlock" ssrunlockargs ssrclauses (* ssr plugin *) -| "pose" ssrfixfwd (* ssr plugin *) -| "pose" ssrcofixfwd (* ssr plugin *) -| "pose" ssrfwdid ssrposefwd (* ssr plugin *) -| "set" ssrfwdid ssrsetfwd ssrclauses (* ssr plugin *) -| "abstract" ssrdgens (* ssr plugin *) -| "have" ssrhavefwdwbinders (* ssr plugin *) -| "have" "suff" ssrhpats_nobs ssrhavefwd (* ssr plugin *) -| "have" "suffices" ssrhpats_nobs ssrhavefwd (* ssr plugin *) -| "suff" "have" ssrhpats_nobs ssrhavefwd (* ssr plugin *) -| "suffices" "have" ssrhpats_nobs ssrhavefwd (* ssr plugin *) -| "suff" ssrsufffwd (* ssr plugin *) -| "suffices" ssrsufffwd (* ssr plugin *) -| "wlog" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "wlog" "suff" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "wlog" "suffices" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "without" "loss" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "without" "loss" "suff" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "without" "loss" "suffices" ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "gen" "have" ssrclear ssr_idcomma ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "generally" "have" ssrclear ssr_idcomma ssrhpats_nobs ssrwlogfwd ssrhint (* ssr plugin *) -| "under" ssrrwarg (* ssr plugin *) -| "under" ssrrwarg ssrintros_ne (* ssr plugin *) -| "under" ssrrwarg ssrintros_ne "do" ssrhint3arg (* ssr plugin *) -| "under" ssrrwarg "do" ssrhint3arg (* ssr plugin *) -| "ssrinstancesoftpat" cpattern (* ssrmatching plugin *) -] - -var_list: [ -| var_list var -| var -] - -var_list_opt: [ -| var_list_opt var -| empty +| "ring_lookup" ltac_expr0 "[" term1_extended_list_opt "]" term1_extended_list (* setoid_ring plugin *) +| "field_lookup" ltac_expr "[" term1_extended_list_opt "]" term1_extended_list (* setoid_ring plugin 
*) ] -constr_with_bindings_opt: [ -| constr_with_bindings -| empty -] - -int_or_var_opt: [ -| int_or_var -| empty +int_or_var: [ +| int +| ident ] -uconstr_list_opt: [ -| uconstr_list_opt uconstr +constr_with_bindings_opt: [ +| constr_with_bindings | empty ] -constr_with_bindings_arg_list_comma: [ -| constr_with_bindings_arg_list_comma "," constr_with_bindings_arg -| constr_with_bindings_arg -] - -fixdecl_list: [ -| fixdecl_list fixdecl -| fixdecl -] - -cofixdecl_list: [ -| cofixdecl_list cofixdecl -| cofixdecl -] - -pattern_occ_list_opt: [ -| pattern_occ_list_opt "," pattern_occ as_name +hloc: [ | empty +| "in" "|-" "*" +| "in" ident +| "in" "(" "Type" "of" ident ")" +| "in" "(" "Value" "of" ident ")" +| "in" "(" "type" "of" ident ")" +| "in" "(" "value" "of" ident ")" ] -oriented_rewriter_list_comma: [ -| oriented_rewriter_list_comma "," oriented_rewriter -| oriented_rewriter -] - -simple_alt: [ -| "simple" "inversion" -| "inversion" -| "inversion_clear" +rename: [ +| ident "into" ident ] -with_opt2: [ -| "with" term +by_arg_tac: [ +| "by" ltac_expr3 | empty ] -tactic_opt: [ -| tactic -| empty +in_clause: [ +| in_clause +| "*" occs +| "*" "|-" concl_occ +| hypident_occ_list_comma_opt "|-" concl_occ +| hypident_occ_list_comma_opt ] -reference_opt: [ -| reference +occs: [ +| "at" occs_nums | empty ] -bindings_list_comma: [ -| bindings_list_comma "," bindings -| bindings -] - -rename_list_comma: [ -| rename_list_comma "," rename -| rename -] - -orient_string: [ -| orient preident -] - -comparison: [ -| "=" -| "<" -| "<=" -| ">" -| ">=" -] - -test: [ -| int_or_var comparison int_or_var -] - -hintbases: [ -| "with" "*" -| "with" preident_list +hypident_occ_list_comma_opt: [ +| hypident_occ_list_comma | empty ] -preident_list: [ -| preident_list preident -| preident -] - -auto_using: [ -| "using" uconstr_list_comma +as_ipat: [ +| "as" simple_intropattern | empty ] -uconstr_list_comma: [ -| uconstr_list_comma "," uconstr -| uconstr -] - -hints_path_atom: [ -| 
global_list -| "_" -] - -hints_path: [ -| "(" hints_path ")" -| hints_path "*" -| "emp" -| "eps" -| hints_path "|" hints_path -| hints_path_atom -| hints_path hints_path +or_and_intropattern_loc: [ +| or_and_intropattern +| ident ] -opthints: [ -| ":" preident_list +as_or_and_ipat: [ +| "as" or_and_intropattern_loc | empty ] -debug: [ -| "debug" +eqn_ipat: [ +| "eqn" ":" naming_intropattern +| "_eqn" ":" naming_intropattern +| "_eqn" | empty ] -eauto_search_strategy: [ -| "(bfs)" -| "(dfs)" +as_name: [ +| "as" ident | empty ] -glob_constr_with_bindings: [ -| constr_with_bindings -] - -rewstrategy: [ -| glob -| "<-" term -| "subterms" rewstrategy -| "subterm" rewstrategy -| "innermost" rewstrategy -| "outermost" rewstrategy -| "bottomup" rewstrategy -| "topdown" rewstrategy -| "id" -| "fail" -| "refl" -| "progress" rewstrategy -| "try" rewstrategy -| "any" rewstrategy -| "repeat" rewstrategy -| rewstrategy ";" rewstrategy -| "(" rewstrategy ")" -| "choice" rewstrategy rewstrategy -| "old_hints" preident -| "hints" preident -| "terms" term_list_opt -| "eval" red_expr -| "fold" term -] - -term_list_opt: [ -| term_list_opt term +by_tactic: [ +| "by" ltac_expr3 | empty ] -int_or_var: [ -| integer -| ident -] - -nat_or_var: [ -| natural -| ident -] - -id_or_meta: [ -| ident -] - -open_constr: [ -| term -] - -uconstr: [ -| term -] - -destruction_arg: [ -| natural -| constr_with_bindings +rewriter: [ +| "!" constr_with_bindings_arg +| qmark_alt constr_with_bindings_arg +| num "!" constr_with_bindings_arg +| num qmark_alt constr_with_bindings_arg +| num constr_with_bindings_arg | constr_with_bindings_arg ] -constr_with_bindings_arg: [ -| ">" constr_with_bindings -| constr_with_bindings +qmark_alt: [ +| "?" +| "?" 
] -quantified_hypothesis: [ -| ident -| natural +oriented_rewriter: [ +| orient rewriter ] -conversion: [ -| term -| term "with" term -| term "at" occs_nums "with" term +induction_clause: [ +| destruction_arg as_or_and_ipat eqn_ipat opt_clause ] -occs_nums: [ -| nat_or_var_list -| "-" nat_or_var int_or_var_list_opt +induction_clause_list: [ +| induction_clause_list_comma eliminator_opt opt_clause ] -nat_or_var_list: [ -| nat_or_var_list nat_or_var -| nat_or_var +induction_clause_list_comma: [ +| induction_clause_list_comma "," induction_clause +| induction_clause ] -int_or_var_list_opt: [ -| int_or_var_list_opt int_or_var +eliminator_opt: [ +| "using" constr_with_bindings | empty ] -occs: [ -| "at" occs_nums +auto_using: [ +| "using" term1_extended_list_comma | empty ] -pattern_occ: [ -| term occs -] - -ref_or_pattern_occ: [ -| smart_global occs -| term occs -] - -unfold_occ: [ -| smart_global occs +term1_extended_list_comma: [ +| term1_extended_list_comma "," term1_extended +| term1_extended ] intropattern_list_opt: [ @@ -2764,11 +2492,11 @@ intropattern: [ ] simple_intropattern: [ -| simple_intropattern_closed operconstr0_list_opt +| simple_intropattern_closed term0_list_opt ] -operconstr0_list_opt: [ -| operconstr0_list_opt "%" operconstr0 +term0_list_opt: [ +| term0_list_opt "%" term0 | empty ] @@ -2780,13 +2508,13 @@ simple_intropattern_closed: [ ] simple_binding: [ -| "(" ident ":=" lconstr ")" -| "(" natural ":=" lconstr ")" +| "(" ident ":=" term ")" +| "(" num ":=" term ")" ] bindings: [ | simple_binding_list -| term_list +| term1_extended_list ] simple_binding_list: [ @@ -2794,88 +2522,88 @@ simple_binding_list: [ | simple_binding ] -term_list: [ -| term_list term -| term +constr_with_bindings_arg_list_comma: [ +| constr_with_bindings_arg_list_comma "," constr_with_bindings_arg +| constr_with_bindings_arg ] -constr_with_bindings: [ -| term with_bindings +fixdecl_list: [ +| fixdecl_list fixdecl +| fixdecl ] -with_bindings: [ -| "with" bindings 
+cofixdecl_list: [
+| cofixdecl_list cofixdecl
+| cofixdecl
+]
+
+pattern_occ_list_opt: [
+| pattern_occ_list_opt "," pattern_occ as_name
| empty
]

-red_flags: [
-| "beta"
-| "iota"
-| "match"
-| "fix"
-| "cofix"
-| "zeta"
-| "delta" delta_flag
+pattern_occ: [
+| term1_extended occs
]

-delta_flag: [
-| "-" "[" smart_global_list "]"
-| "[" smart_global_list "]"
+oriented_rewriter_list_comma: [
+| oriented_rewriter_list_comma "," oriented_rewriter
+| oriented_rewriter
+]
+
+simple_alt: [
+| "simple" "inversion"
+| "inversion"
+| "inversion_clear"
+]
+
+with_opt2: [
+| "with" term1_extended
| empty
]

-smart_global_list: [
-| smart_global_list smart_global
-| smart_global
+bindings_list_comma: [
+| bindings_list_comma "," bindings
+| bindings
]

-strategy_flag: [
-| red_flags_list
-| delta_flag
+rename_list_comma: [
+| rename_list_comma "," rename
+| rename
]

-red_flags_list: [
-| red_flags_list red_flags
-| red_flags
+comparison: [
+| "="
+| "<"
+| "<="
+| ">"
+| ">="
]

-red_expr: [
-| "red"
-| "hnf"
-| "simpl" delta_flag ref_or_pattern_occ_opt
-| "cbv" strategy_flag
-| "cbn" strategy_flag
-| "lazy" strategy_flag
-| "compute" delta_flag
-| "vm_compute" ref_or_pattern_occ_opt
-| "native_compute" ref_or_pattern_occ_opt
-| "unfold" unfold_occ_list_comma
-| "fold" term_list
-| "pattern" pattern_occ_list_comma
-| IDENT
+hintbases: [
+| "with" "*"
+| "with" ident_list
+| empty
]

-ref_or_pattern_occ_opt: [
-| ref_or_pattern_occ
+qualid_opt: [
+| qualid
| empty
]

-unfold_occ_list_comma: [
-| unfold_occ_list_comma "," unfold_occ
-| unfold_occ
+bindings_with_parameters: [
+| "(" ident simple_binder_list_opt ":=" term ")"
]

-pattern_occ_list_comma: [
-| pattern_occ_list_comma "," pattern_occ
-| pattern_occ
+simple_binder_list_opt: [
+| simple_binder_list_opt simple_binder
+| empty
]

hypident: [
-| id_or_meta
-| "(" "type" "of" id_or_meta ")"
-| "(" "value" "of" id_or_meta ")"
-| "(" "type" "of" ident ")" (* ssr plugin *)
-| "(" "value" "of" ident ")" (* ssr plugin *)
+| ident
+| "(" "type" "of" ident ")"
+| "(" "value" "of" ident ")"
]

hypident_occ: [
@@ -2899,118 +2627,151 @@ opt_clause: [
| empty
]

+occs_nums: [
+| num_or_var_list
+| "-" num_or_var int_or_var_list_opt
+]
+
+num_or_var: [
+| num
+| ident
+]
+
+int_or_var_list_opt: [
+| int_or_var_list_opt int_or_var
+| empty
+]
+
+num_or_var_list: [
+| num_or_var_list num_or_var
+| num_or_var
+]
+
concl_occ: [
| "*" occs
| empty
]

in_hyp_list: [
-| "in" id_or_meta_list
+| "in" ident_list
| empty
]

-id_or_meta_list: [
-| id_or_meta_list id_or_meta
-| id_or_meta
-]
-
in_hyp_as: [
-| "in" id_or_meta as_ipat
+| "in" ident as_ipat
| empty
]

simple_binder: [
| name
-| "(" name_list ":" lconstr ")"
+| "(" names ":" term ")"
]

fixdecl: [
-| "(" ident simple_binder_list_opt fixannot ":" lconstr ")"
+| "(" ident simple_binder_list_opt struct_annot ":" term ")"
]

-cofixdecl: [
-| "(" ident simple_binder_list_opt ":" lconstr ")"
-]
-
-bindings_with_parameters: [
-| "(" ident simple_binder_list_opt ":=" lconstr ")"
+struct_annot: [
+| "{" "struct" name "}"
+| empty
]

-simple_binder_list_opt: [
-| simple_binder_list_opt simple_binder
-| empty
+cofixdecl: [
+| "(" ident simple_binder_list_opt ":" term ")"
]

-eliminator: [
-| "using" constr_with_bindings
+constr_with_bindings: [
+| term1_extended with_bindings
]

-as_ipat: [
-| "as" simple_intropattern
+with_bindings: [
+| "with" bindings
| empty
]

-or_and_intropattern_loc: [
-| or_and_intropattern
-| ident
+destruction_arg: [
+| num
+| constr_with_bindings
+| constr_with_bindings_arg
]

-as_or_and_ipat: [
-| "as" or_and_intropattern_loc
-| empty
+constr_with_bindings_arg: [
+| ">" constr_with_bindings
+| constr_with_bindings
]

-eqn_ipat: [
-| "eqn" ":" naming_intropattern
-| "_eqn" ":" naming_intropattern
-| "_eqn"
-| empty
+quantified_hypothesis: [
+| ident
+| num
]

-as_name: [
-| "as" ident
-| empty
+conversion: [
+| term1_extended
+| term1_extended "with" term1_extended
+| term1_extended "at" occs_nums "with" term1_extended
]

-by_tactic: [
-| "by" ltac_expr3
+firstorder_using: [
+| "using" qualid
+| "using" qualid "," qualid_list_comma
+| "using" qualid qualid qualid_list_opt
| empty
]

-rewriter: [
-| "!" constr_with_bindings_arg
-| qmark_alt constr_with_bindings_arg
-| natural "!" constr_with_bindings_arg
-| natural qmark_alt constr_with_bindings_arg
-| natural constr_with_bindings_arg
-| constr_with_bindings_arg
+qualid_list_comma: [
+| qualid_list_comma "," qualid
+| qualid
]

-qmark_alt: [
-| "?"
-| "?"
+fun_ind_using: [
+| "using" constr_with_bindings (* funind plugin *)
+| empty (* funind plugin *)
]

-oriented_rewriter: [
-| orient rewriter
+with_names: [
+| "as" simple_intropattern (* funind plugin *)
+| empty (* funind plugin *)
]

-induction_clause: [
-| destruction_arg as_or_and_ipat eqn_ipat opt_clause
+occurrences: [
+| int_list
+| ident
]

-induction_clause_list: [
-| induction_clause_list_comma eliminator_opt opt_clause
+int_list: [
+| int_list int
+| int
]

-induction_clause_list_comma: [
-| induction_clause_list_comma "," induction_clause
-| induction_clause
+rewstrategy: [
+| term1_extended
+| "<-" term1_extended
+| "subterms" rewstrategy
+| "subterm" rewstrategy
+| "innermost" rewstrategy
+| "outermost" rewstrategy
+| "bottomup" rewstrategy
+| "topdown" rewstrategy
+| "id"
+| "fail"
+| "refl"
+| "progress" rewstrategy
+| "try" rewstrategy
+| "any" rewstrategy
+| "repeat" rewstrategy
+| rewstrategy ";" rewstrategy
+| "(" rewstrategy ")"
+| "choice" rewstrategy rewstrategy
+| "old_hints" ident
+| "hints" ident
+| "terms" term1_extended_list_opt
+| "eval" red_expr
+| "fold" term1_extended
]

-eliminator_opt: [
-| eliminator
-| empty
+hypident_occ_list_comma: [
+| hypident_occ_list_comma "," hypident_occ
+| hypident_occ
]

ltac_expr: [
@@ -3019,19 +2780,19 @@ ltac_expr: [
]

binder_tactic: [
-| "fun" input_fun_list "=>" ltac_expr
+| "fun" fun_var_list "=>" ltac_expr
| "let" rec_opt let_clause_list "in" ltac_expr
| "info" ltac_expr
]

-input_fun_list: [
-| input_fun_list input_fun
-| input_fun
+fun_var_list: [
+| fun_var_list fun_var
+| fun_var
]

-input_fun: [
-| "_"
+fun_var: [
| ident
+| "_"
]

rec_opt: [
@@ -3047,27 +2808,20 @@ let_clause_list: [

let_clause: [
| ident ":=" ltac_expr
| "_" ":=" ltac_expr
-| ident input_fun_list ":=" ltac_expr
+| ident fun_var_list ":=" ltac_expr
]

ltac_expr4: [
| ltac_expr3 ";" binder_tactic
| ltac_expr3 ";" ltac_expr3
-| ltac_expr3 ";" "[" gt_opt tactic_then_gen "]"
+| ltac_expr3 ";" "[" multi_goal_tactics "]"
+| ltac_expr3 ";" "[" ">" multi_goal_tactics "]"
| ltac_expr3
-| ltac_expr ";" "first" ssr_first_else (* ssr plugin *)
-| ltac_expr ";" "first" ssrseqarg (* ssr plugin *)
-| ltac_expr ";" "last" ssrseqarg (* ssr plugin *)
-]
-
-gt_opt: [
-| ">"
-| empty
]

-tactic_then_gen: [
-| ltac_expr_opt "|" tactic_then_gen
-| ltac_expr_opt ".." or_opt ltac_expr_list2
+multi_goal_tactics: [
+| ltac_expr_opt "|" multi_goal_tactics
+| ltac_expr_opt ".." or_opt ltac_expr_opt_list_or
| ltac_expr
| empty
]

@@ -3077,13 +2831,8 @@ ltac_expr_opt: [
| empty
]

-ltac_expr_list_or2_opt: [
-| ltac_expr_list_or2
-| empty
-]
-
-ltac_expr_list_or2: [
-| ltac_expr_list_or2 "|" ltac_expr_opt
+ltac_expr_opt_list_or: [
+| ltac_expr_opt_list_or "|" ltac_expr_opt
| ltac_expr_opt
]

@@ -3099,51 +2848,10 @@ ltac_expr3: [
| "infoH" ltac_expr3
| "abstract" ltac_expr2
| "abstract" ltac_expr2 "using" ident
-| selector ltac_expr3
-| "do" ssrmmod ssrdotac ssrclauses (* ssr plugin *)
-| "do" ssrortacarg ssrclauses (* ssr plugin *)
-| "do" int_or_var ssrmmod ssrdotac ssrclauses (* ssr plugin *)
-| "abstract" ssrdgens (* ssr plugin *)
+| only_selector ltac_expr3
| ltac_expr2
]

-tactic_mode: [
-| toplevel_selector_opt query_command
-| toplevel_selector_opt "{"
-| toplevel_selector_opt ltac_info_opt tactic ltac_use_default
-| "par" ":" ltac_info_opt tactic ltac_use_default
-]
-
-toplevel_selector_opt: [
-| toplevel_selector
-| empty
-]
-
-toplevel_selector: [
-| selector_body ":"
-| "!" ":"
-| "all" ":"
-]
-
-selector: [
-| "only" selector_body ":"
-]
-
-selector_body: [
-| range_selector_list_comma
-| "[" ident "]"
-]
-
-range_selector_list_comma: [
-| range_selector_list_comma "," range_selector
-| range_selector
-]
-
-range_selector: [
-| natural "-" natural
-| natural
-]
-
ltac_expr2: [
| ltac_expr1 "+" binder_tactic
| ltac_expr1 "+" ltac_expr2
@@ -3154,30 +2862,18 @@ ltac_expr2: [
]

ltac_expr1: [
-| match_key reverse_opt "goal" "with" match_context_list "end"
-| match_key ltac_expr "with" match_list "end"
+| ltac_match_term
+| ltac_match_goal
| "first" "[" ltac_expr_list_or_opt "]"
| "solve" "[" ltac_expr_list_or_opt "]"
| "idtac" message_token_list_opt
| failkw int_or_var_opt message_token_list_opt
| simple_tactic
| tactic_arg
-| reference tactic_arg_compat_list_opt
-| ltac_expr ssrintros_ne (* ssr plugin *)
+| qualid tactic_arg_compat_list_opt
| ltac_expr0
]

-match_key: [
-| "match"
-| "lazymatch"
-| "multimatch"
-]
-
-reverse_opt: [
-| "reverse"
-| empty
-]
-
ltac_expr_list_or_opt: [
| ltac_expr_list_or
| empty
@@ -3188,95 +2884,27 @@ ltac_expr_list_or: [
| ltac_expr
]

-match_context_list: [
-| or_opt match_context_rule_list_or
-]
-
-match_context_rule_list_or: [
-| match_context_rule_list_or "|" match_context_rule
-| match_context_rule
-]
-
-or_opt: [
-| "|"
-| empty
-]
-
-eqn_list_or_opt: [
-| eqn_list_or
-| empty
-]
-
-eqn_list_or: [
-| eqn_list_or "|" eqn
-| eqn
-]
-
-match_context_rule: [
-| match_hyps_list_comma_opt "|-" match_pattern "=>" ltac_expr
-| "[" match_hyps_list_comma_opt "|-" match_pattern "]" "=>" ltac_expr
-| "_" "=>" ltac_expr
-]
-
-match_hyps_list_comma_opt: [
-| match_hyps_list_comma
+message_token_list_opt: [
+| message_token_list_opt message_token
| empty
]

-match_hyps_list_comma: [
-| match_hyps_list_comma "," match_hyps
-| match_hyps
-]
-
-match_hyps: [
-| name ":" match_pattern
-| name ":=" match_pattern_opt match_pattern
-]
-
-match_pattern: [
-| "context" ident_opt "[" lconstr_pattern "]"
-| lconstr_pattern
-]
-
-ident_opt: [
+message_token: [
| ident
-| empty
-]
-
-lconstr_pattern: [
-| lconstr
+| string
+| int
]

-match_pattern_opt: [
-| "[" match_pattern "]" ":"
+int_or_var_opt: [
+| int_or_var
| empty
]

-match_list: [
-| or_opt match_rule_list_or
-]
-
-match_rule_list_or: [
-| match_rule_list_or "|" match_rule
-| match_rule
-]
-
-match_rule: [
-| match_pattern "=>" ltac_expr
-| "_" "=>" ltac_expr
-]
-
-message_token_list_opt: [
-| message_token_list_opt message_token
+term1_extended_list_opt: [
+| term1_extended_list_opt term1_extended
| empty
]

-message_token: [
-| ident
-| STRING
-| integer
-]
-
failkw: [
| "fail"
| "gfail"
@@ -3284,10 +2912,10 @@ failkw: [

tactic_arg: [
| "eval" red_expr "in" term
-| "context" ident "[" lconstr "]"
+| "context" ident "[" term "]"
| "type" "of" term
| "fresh" fresh_id_list_opt
-| "type_term" uconstr
+| "type_term" term1_extended
| "numgoals"
]

@@ -3297,7 +2925,7 @@ fresh_id_list_opt: [
]

fresh_id: [
-| STRING
+| string
| qualid
]

@@ -3314,857 +2942,112 @@ tactic_arg_compat: [

ltac_expr0: [
| "(" ltac_expr ")"
-| "[" ">" tactic_then_gen "]"
+| "[>" multi_goal_tactics "]"
| tactic_atom
-| ssrparentacarg (* ssr plugin *)
]

tactic_atom: [
-| integer
-| reference
+| int
+| qualid
| "()"
]

-constr_may_eval: [
-| "eval" red_expr "in" term
-| "context" ident "[" lconstr "]"
-| "type" "of" term
-| term
-]
-
-ltac_def_kind: [
-| ":="
-| "::="
-]
-
-tacdef_body: [
-| global input_fun_list ltac_def_kind ltac_expr
-| global ltac_def_kind ltac_expr
+toplevel_selector: [
+| selector ":"
+| "all" ":"
+| "!" ":"
]

-tactic: [
-| ltac_expr
+only_selector: [
+| "only" selector ":"
]

-ltac_info_opt: [
-| ltac_info
-| empty
+selector: [
+| range_selector_list_comma
+| "[" ident "]"
]

-ltac_info: [
-| "Info" natural
+range_selector_list_comma: [
+| range_selector_list_comma "," range_selector
+| range_selector
]

-ltac_use_default: [
-| "."
-| "..."
+range_selector: [
+| num "-" num
+| num
]

-ltac_tactic_level: [
-| "(" "at" "level" natural ")"
+ltac_match_term: [
+| match_key ltac_expr "with" or_opt match_rule_list_or "end"
]

-ltac_production_sep: [
-| "," string
+match_key: [
+| "match"
+| "multimatch"
+| "lazymatch"
]

-ltac_production_item: [
-| string
-| ident "(" ident ltac_production_sep_opt ")"
-| ident
+match_rule_list_or: [
+| match_rule_list_or "|" match_rule
+| match_rule
]

-ltac_production_sep_opt: [
-| ltac_production_sep
-| empty
+match_rule: [
+| match_pattern_alt "=>" ltac_expr
]

-ltac_tacdef_body: [
-| tacdef_body
+match_pattern_alt: [
+| match_pattern
+| "_"
]

-firstorder_using: [
-| "using" reference
-| "using" reference "," reference_list_comma
-| "using" reference reference reference_list_opt
-| empty
+match_pattern: [
+| "context" ident_opt "[" term "]"
+| term
]

-reference_list_comma: [
-| reference_list_comma "," reference
-| reference
+ltac_match_goal: [
+| match_key reverse_opt "goal" "with" or_opt match_context_rule_list_or "end"
]

-numnotoption: [
+reverse_opt: [
+| "reverse"
| empty
-| "(" "warning" "after" bigint ")"
-| "(" "abstract" "after" bigint ")"
-]
-
-mlname: [
-| preident (* extraction plugin *)
-| string (* extraction plugin *)
-]
-
-int_or_id: [
-| preident (* extraction plugin *)
-| integer (* extraction plugin *)
-]
-
-language: [
-| "Ocaml" (* extraction plugin *)
-| "OCaml" (* extraction plugin *)
-| "Haskell" (* extraction plugin *)
-| "Scheme" (* extraction plugin *)
-| "JSON" (* extraction plugin *)
-]
-
-fun_ind_using: [
-| "using" constr_with_bindings (* funind plugin *)
-| empty (* funind plugin *)
-]
-
-with_names: [
-| "as" simple_intropattern (* funind plugin *)
-| empty (* funind plugin *)
-]
-
-constr_comma_sequence': [
-| term "," constr_comma_sequence' (* funind plugin *)
-| term (* funind plugin *)
-]
-
-auto_using': [
-| "using" constr_comma_sequence' (* funind plugin *)
-| empty (* funind plugin *)
-]
-
-function_rec_definition_loc: [
-| rec_definition (* funind plugin *)
-]
-
-fun_scheme_arg: [
-| ident ":=" "Induction" "for" reference "Sort" sort_family (* funind plugin *)
-]
-
-ring_mod: [
-| "decidable" term (* setoid_ring plugin *)
-| "abstract" (* setoid_ring plugin *)
-| "morphism" term (* setoid_ring plugin *)
-| "constants" "[" tactic "]" (* setoid_ring plugin *)
-| "closed" "[" global_list "]" (* setoid_ring plugin *)
-| "preprocess" "[" tactic "]" (* setoid_ring plugin *)
-| "postprocess" "[" tactic "]" (* setoid_ring plugin *)
-| "setoid" term term (* setoid_ring plugin *)
-| "sign" term (* setoid_ring plugin *)
-| "power" term "[" global_list "]" (* setoid_ring plugin *)
-| "power_tac" term "[" tactic "]" (* setoid_ring plugin *)
-| "div" term (* setoid_ring plugin *)
-]
-
-ring_mods: [
-| "(" ring_mod_list_comma ")" (* setoid_ring plugin *)
-]
-
-ring_mod_list_comma: [
-| ring_mod_list_comma "," ring_mod
-| ring_mod
-]
-
-field_mod: [
-| ring_mod (* setoid_ring plugin *)
-| "completeness" term (* setoid_ring plugin *)
-]
-
-field_mods: [
-| "(" field_mod_list_comma ")" (* setoid_ring plugin *)
-]
-
-field_mod_list_comma: [
-| field_mod_list_comma "," field_mod
-| field_mod
-]
-
-ssrtacarg: [
-| ltac_expr (* ssr plugin *)
-]
-
-ssrtac3arg: [
-| ltac_expr3 (* ssr plugin *)
-]
-
-ssrtclarg: [
-| ssrtacarg (* ssr plugin *)
-]
-
-ssrhyp: [
-| ident (* ssr plugin *)
-]
-
-ssrhoi_hyp: [
-| ident (* ssr plugin *)
-]
-
-ssrhoi_id: [
-| ident (* ssr plugin *)
-]
-
-ssrsimpl_ne: [
-| "//=" (* ssr plugin *)
-| "/=" (* ssr plugin *)
-| "/" natural "/" natural "=" (* ssr plugin *)
-| "/" natural "/" (* ssr plugin *)
-| "/" natural "=" (* ssr plugin *)
-| "/" natural "/=" (* ssr plugin *)
-| "/" natural "/" "=" (* ssr plugin *)
-| "//" natural "=" (* ssr plugin *)
-| "//" (* ssr plugin *)
-]
-
-ssrclear_ne: [
-| "{" ssrhyp_list "}" (* ssr plugin *)
-]
-
-ssrclear: [
-| ssrclear_ne (* ssr plugin *)
-| empty (* ssr plugin *)
-]
-
-ssrindex: [
-| int_or_var (* ssr plugin *)
+match_context_rule_list_or: [
+| match_context_rule_list_or "|" match_context_rule +| match_context_rule ] -ssrocc: [ -| natural natural_list_opt (* ssr plugin *) -| "-" natural_list_opt (* ssr plugin *) -| "+" natural_list_opt (* ssr plugin *) +match_context_rule: [ +| match_hyp_list_comma_opt "|-" match_pattern "=>" ltac_expr +| "[" match_hyp_list_comma_opt "|-" match_pattern "]" "=>" ltac_expr +| "_" "=>" ltac_expr ] -natural_list_opt: [ -| natural_list_opt natural +match_hyp_list_comma_opt: [ +| match_hyp_list_comma | empty ] -ssrmmod: [ -| "!" (* ssr plugin *) -| "?" (* ssr plugin *) -| "?" (* ssr plugin *) -] - -ssrmult_ne: [ -| natural ssrmmod (* ssr plugin *) -| ssrmmod (* ssr plugin *) -] - -ssrmult: [ -| ssrmult_ne (* ssr plugin *) -| empty (* ssr plugin *) +match_hyp_list_comma: [ +| match_hyp_list_comma "," match_hyp +| match_hyp ] -ssrdocc: [ -| "{" ssrocc "}" (* ssr plugin *) -| "{" ssrhyp_list_opt "}" (* ssr plugin *) +match_hyp: [ +| name ":" match_pattern +| name ":=" match_pattern_opt match_pattern ] -ssrhyp_list_opt: [ -| ssrhyp_list_opt ssrhyp +match_pattern_opt: [ +| "[" match_pattern "]" ":" | empty ] -ssrterm: [ -| "YouShouldNotTypeThis" term (* ssr plugin *) -| term (* ssr plugin *) -] - -ast_closure_term: [ -| term (* ssr plugin *) -] - -ast_closure_lterm: [ -| lconstr (* ssr plugin *) -] - -ssrbwdview: [ -| "YouShouldNotTypeThis" (* ssr plugin *) -| "/" term (* ssr plugin *) -| "/" term ssrbwdview (* ssr plugin *) -] - -ssrfwdview: [ -| "YouShouldNotTypeThis" (* ssr plugin *) -| "/" ast_closure_term (* ssr plugin *) -| "/" ast_closure_term ssrfwdview (* ssr plugin *) -] - -ident_no_do: [ -| "YouShouldNotTypeThis" ident (* ssr plugin *) -| IDENT (* ssr plugin *) -] - -ssripat: [ -| "_" (* ssr plugin *) -| "*" (* ssr plugin *) -| ">" (* ssr plugin *) -| ident_no_do (* ssr plugin *) -| "?" 
(* ssr plugin *) -| "+" (* ssr plugin *) -| "++" (* ssr plugin *) -| ssrsimpl_ne (* ssr plugin *) -| ssrdocc "->" (* ssr plugin *) -| ssrdocc "<-" (* ssr plugin *) -| ssrdocc (* ssr plugin *) -| "->" (* ssr plugin *) -| "<-" (* ssr plugin *) -| "-" (* ssr plugin *) -| "-/" "=" (* ssr plugin *) -| "-/=" (* ssr plugin *) -| "-/" "/" (* ssr plugin *) -| "-//" (* ssr plugin *) -| "-/" integer "/" (* ssr plugin *) -| "-/" "/=" (* ssr plugin *) -| "-//" "=" (* ssr plugin *) -| "-//=" (* ssr plugin *) -| "-/" integer "/=" (* ssr plugin *) -| "-/" integer "/" integer "=" (* ssr plugin *) -| ssrfwdview (* ssr plugin *) -| "[" ":" ident_list_opt "]" (* ssr plugin *) -| "[:" ident_list_opt "]" (* ssr plugin *) -| ssrcpat (* ssr plugin *) -] - ident_list_opt: [ | ident_list_opt ident | empty ] -ssripats: [ -| ssripat ssripats (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssriorpat: [ -| ssripats "|" ssriorpat (* ssr plugin *) -| ssripats "|-" ">" ssriorpat (* ssr plugin *) -| ssripats "|-" ssriorpat (* ssr plugin *) -| ssripats "|->" ssriorpat (* ssr plugin *) -| ssripats "||" ssriorpat (* ssr plugin *) -| ssripats "|||" ssriorpat (* ssr plugin *) -| ssripats "||||" ssriorpat (* ssr plugin *) -| ssripats (* ssr plugin *) -] - -ssrcpat: [ -| "YouShouldNotTypeThis" ssriorpat (* ssr plugin *) -| "[" hat "]" (* ssr plugin *) -| "[" ssriorpat "]" (* ssr plugin *) -| "[=" ssriorpat "]" (* ssr plugin *) -] - -hat: [ -| "^" ident (* ssr plugin *) -| "^" "~" ident (* ssr plugin *) -| "^" "~" natural (* ssr plugin *) -| "^~" ident (* ssr plugin *) -| "^~" natural (* ssr plugin *) -] - -ssripats_ne: [ -| ssripat ssripats (* ssr plugin *) -] - -ssrhpats: [ -| ssripats (* ssr plugin *) -] - -ssrhpats_wtransp: [ -| ssripats (* ssr plugin *) -| ssripats "@" ssripats (* ssr plugin *) -] - -ssrhpats_nobs: [ -| ssripats (* ssr plugin *) -] - -ssrrpat: [ -| "->" (* ssr plugin *) -| "<-" (* ssr plugin *) -] - -ssrintros_ne: [ -| "=>" ssripats_ne (* ssr plugin *) -] - -ssrintros: [ -| 
ssrintros_ne (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrintrosarg: [ -| "YouShouldNotTypeThis" ssrtacarg ssrintros_ne (* ssr plugin *) -] - -ssrfwdid: [ -| ident (* ssr plugin *) -] - -ssrortacs: [ -| ssrtacarg "|" ssrortacs (* ssr plugin *) -| ssrtacarg "|" (* ssr plugin *) -| ssrtacarg (* ssr plugin *) -| "|" ssrortacs (* ssr plugin *) -| "|" (* ssr plugin *) -] - -ssrhintarg: [ -| "[" "]" (* ssr plugin *) -| "[" ssrortacs "]" (* ssr plugin *) -| ssrtacarg (* ssr plugin *) -] - -ssrhint3arg: [ -| "[" "]" (* ssr plugin *) -| "[" ssrortacs "]" (* ssr plugin *) -| ssrtac3arg (* ssr plugin *) -] - -ssrortacarg: [ -| "[" ssrortacs "]" (* ssr plugin *) -] - -ssrhint: [ -| empty (* ssr plugin *) -| "by" ssrhintarg (* ssr plugin *) -] - -ssrwgen: [ -| ssrclear_ne (* ssr plugin *) -| ssrhoi_hyp (* ssr plugin *) -| "@" ssrhoi_hyp (* ssr plugin *) -| "(" ssrhoi_id ":=" lcpattern ")" (* ssr plugin *) -| "(" ssrhoi_id ")" (* ssr plugin *) -| "(@" ssrhoi_id ":=" lcpattern ")" (* ssr plugin *) -| "(" "@" ssrhoi_id ":=" lcpattern ")" (* ssr plugin *) -] - -ssrclausehyps: [ -| ssrwgen "," ssrclausehyps (* ssr plugin *) -| ssrwgen ssrclausehyps (* ssr plugin *) -| ssrwgen (* ssr plugin *) -] - -ssrclauses: [ -| "in" ssrclausehyps "|-" "*" (* ssr plugin *) -| "in" ssrclausehyps "|-" (* ssr plugin *) -| "in" ssrclausehyps "*" (* ssr plugin *) -| "in" ssrclausehyps (* ssr plugin *) -| "in" "|-" "*" (* ssr plugin *) -| "in" "*" (* ssr plugin *) -| "in" "*" "|-" (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrfwd: [ -| ":=" ast_closure_lterm (* ssr plugin *) -| ":" ast_closure_lterm ":=" ast_closure_lterm (* ssr plugin *) -] - -ssrbvar: [ -| ident (* ssr plugin *) -| "_" (* ssr plugin *) -] - -ssrbinder: [ -| ssrbvar (* ssr plugin *) -| "(" ssrbvar ")" (* ssr plugin *) -| "(" ssrbvar ":" lconstr ")" (* ssr plugin *) -| "(" ssrbvar ssrbvar_list ":" lconstr ")" (* ssr plugin *) -| "(" ssrbvar ":" lconstr ":=" lconstr ")" (* ssr plugin *) -| "(" ssrbvar ":=" lconstr ")" (* 
ssr plugin *) -| of_alt operconstr99 (* ssr plugin *) -] - -ssrbvar_list: [ -| ssrbvar_list ssrbvar -| ssrbvar -] - -ssrstruct: [ -| "{" "struct" ident "}" (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrposefwd: [ -| ssrbinder_list_opt ssrfwd (* ssr plugin *) -] - -ssrfixfwd: [ -| "fix" ssrbvar ssrbinder_list_opt ssrstruct ssrfwd (* ssr plugin *) -] - -ssrcofixfwd: [ -| "cofix" ssrbvar ssrbinder_list_opt ssrfwd (* ssr plugin *) -] - -ssrbinder_list_opt: [ -| ssrbinder_list_opt ssrbinder -| empty -] - -ssrsetfwd: [ -| ":" ast_closure_lterm ":=" "{" ssrocc "}" cpattern (* ssr plugin *) -| ":" ast_closure_lterm ":=" lcpattern (* ssr plugin *) -| ":=" "{" ssrocc "}" cpattern (* ssr plugin *) -| ":=" lcpattern (* ssr plugin *) -] - -ssrhavefwd: [ -| ":" ast_closure_lterm ssrhint (* ssr plugin *) -| ":" ast_closure_lterm ":=" ast_closure_lterm (* ssr plugin *) -| ":" ast_closure_lterm ":=" (* ssr plugin *) -| ":=" ast_closure_lterm (* ssr plugin *) -] - -ssrhavefwdwbinders: [ -| ssrhpats_wtransp ssrbinder_list_opt ssrhavefwd (* ssr plugin *) -] - -ssrseqarg: [ -| ssrswap (* ssr plugin *) -| ssrseqidx ssrortacarg ssrorelse_opt (* ssr plugin *) -| ssrseqidx ssrswap (* ssr plugin *) -| ltac_expr3 (* ssr plugin *) -] - -ssrorelse_opt: [ -| ssrorelse -| empty -] - -ssrseqidx: [ -| ident (* ssr plugin *) -| natural (* ssr plugin *) -] - -ssrswap: [ -| "first" (* ssr plugin *) -| "last" (* ssr plugin *) -] - -ssrorelse: [ -| "||" ltac_expr2 (* ssr plugin *) -] - -ident: [ -| IDENT -] - -ssrparentacarg: [ -| "(" ltac_expr ")" (* ssr plugin *) -] - -ssrdotac: [ -| ltac_expr3 (* ssr plugin *) -| ssrortacarg (* ssr plugin *) -] - -ssr_first: [ -| ssr_first ssrintros_ne (* ssr plugin *) -| "[" ltac_expr_list_or_opt "]" (* ssr plugin *) -] - -ssr_first_else: [ -| ssr_first ssrorelse (* ssr plugin *) -| ssr_first (* ssr plugin *) -] - -ssrgen: [ -| ssrdocc cpattern (* ssr plugin *) -| cpattern (* ssr plugin *) -] - -ssrdgens_tl: [ -| "{" ssrhyp_list "}" cpattern ssrdgens_tl (* 
ssr plugin *) -| "{" ssrhyp_list "}" (* ssr plugin *) -| "{" ssrocc "}" cpattern ssrdgens_tl (* ssr plugin *) -| "/" ssrdgens_tl (* ssr plugin *) -| cpattern ssrdgens_tl (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrdgens: [ -| ":" ssrgen ssrdgens_tl (* ssr plugin *) -] - -ssreqid: [ -| ssreqpat (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssreqpat: [ -| ident (* ssr plugin *) -| "_" (* ssr plugin *) -| "?" (* ssr plugin *) -| "+" (* ssr plugin *) -| ssrdocc "->" (* ssr plugin *) -| ssrdocc "<-" (* ssr plugin *) -| "->" (* ssr plugin *) -| "<-" (* ssr plugin *) -] - -ssrarg: [ -| ssrfwdview ssreqid ssrdgens ssrintros (* ssr plugin *) -| ssrfwdview ssrclear ssrintros (* ssr plugin *) -| ssreqid ssrdgens ssrintros (* ssr plugin *) -| ssrclear_ne ssrintros (* ssr plugin *) -| ssrintros_ne (* ssr plugin *) -] - -ssrmovearg: [ -| ssrarg (* ssr plugin *) -] - -ssrcasearg: [ -| ssrarg (* ssr plugin *) -] - -ssragen: [ -| "{" ssrhyp_list "}" ssrterm (* ssr plugin *) -| ssrterm (* ssr plugin *) -] - -ssrhyp_list: [ -| ssrhyp_list ssrhyp -| ssrhyp -] - -ssragens: [ -| "{" ssrhyp_list "}" ssrterm ssragens (* ssr plugin *) -| "{" ssrhyp_list "}" (* ssr plugin *) -| ssrterm ssragens (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrapplyarg: [ -| ":" ssragen ssragens ssrintros (* ssr plugin *) -| ssrclear_ne ssrintros (* ssr plugin *) -| ssrintros_ne (* ssr plugin *) -| ssrbwdview ":" ssragen ssragens ssrintros (* ssr plugin *) -| ssrbwdview ssrclear ssrintros (* ssr plugin *) -] - -ssrexactarg: [ -| ":" ssragen ssragens (* ssr plugin *) -| ssrbwdview ssrclear (* ssr plugin *) -| ssrclear_ne (* ssr plugin *) -] - -ssrcongrarg: [ -| natural term ssrdgens (* ssr plugin *) -| natural term (* ssr plugin *) -| term ssrdgens (* ssr plugin *) -| term (* ssr plugin *) -] - -ssrrwocc: [ -| "{" ssrhyp_list_opt "}" (* ssr plugin *) -| "{" ssrocc "}" (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrrule_ne: [ -| ssrterm_alt (* ssr plugin *) -| ssrsimpl_ne (* ssr plugin *) -] 
- -ssrterm_alt: [ -| "/" ssrterm -| ssrterm -| ssrsimpl_ne -] - -ssrrule: [ -| ssrrule_ne (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrpattern_squarep: [ -| "[" rpattern "]" (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrpattern_ne_squarep: [ -| "[" rpattern "]" (* ssr plugin *) -] - -ssrrwarg: [ -| "-" ssrmult ssrrwocc ssrpattern_squarep ssrrule_ne (* ssr plugin *) -| "-/" ssrterm (* ssr plugin *) -| ssrmult_ne ssrrwocc ssrpattern_squarep ssrrule_ne (* ssr plugin *) -| "{" ssrhyp_list "}" ssrpattern_ne_squarep ssrrule_ne (* ssr plugin *) -| "{" ssrhyp_list "}" ssrrule (* ssr plugin *) -| "{" ssrocc "}" ssrpattern_squarep ssrrule_ne (* ssr plugin *) -| "{" "}" ssrpattern_squarep ssrrule_ne (* ssr plugin *) -| ssrpattern_ne_squarep ssrrule_ne (* ssr plugin *) -| ssrrule_ne (* ssr plugin *) -] - -ssrrwargs: [ -| ssrrwarg_list (* ssr plugin *) -] - -ssrrwarg_list: [ -| ssrrwarg_list ssrrwarg -| ssrrwarg -] - -ssrunlockarg: [ -| "{" ssrocc "}" ssrterm (* ssr plugin *) -| ssrterm (* ssr plugin *) -] - -ssrunlockargs: [ -| ssrunlockarg_list_opt (* ssr plugin *) -] - -ssrunlockarg_list_opt: [ -| ssrunlockarg_list_opt ssrunlockarg -| empty -] - -ssrsufffwd: [ -| ssrhpats ssrbinder_list_opt ":" ast_closure_lterm ssrhint (* ssr plugin *) -] - -ssrwlogfwd: [ -| ":" ssrwgen_list_opt "/" ast_closure_lterm (* ssr plugin *) -] - -ssrwgen_list_opt: [ -| ssrwgen_list_opt ssrwgen -| empty -] - -ssr_idcomma: [ -| empty (* ssr plugin *) -| IDENT_alt "," (* ssr plugin *) -] - -IDENT_alt: [ -| IDENT -| "_" -] - -ssr_rtype: [ -| "return" operconstr100 (* ssr plugin *) -] - -ssr_mpat: [ -| pattern200 (* ssr plugin *) -] - -ssr_dpat: [ -| ssr_mpat "in" pattern200 ssr_rtype (* ssr plugin *) -| ssr_mpat ssr_rtype (* ssr plugin *) -| ssr_mpat (* ssr plugin *) -] - -ssr_dthen: [ -| ssr_dpat "then" lconstr (* ssr plugin *) -] - -ssr_elsepat: [ -| "else" (* ssr plugin *) -] - -ssr_else: [ -| ssr_elsepat lconstr (* ssr plugin *) -] - -ssr_search_item: [ -| string (* ssr plugin *) -| 
string "%" preident (* ssr plugin *) -| constr_pattern (* ssr plugin *) -] - -ssr_search_arg: [ -| "-" ssr_search_item ssr_search_arg (* ssr plugin *) -| ssr_search_item ssr_search_arg (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssr_modlocs: [ -| empty (* ssr plugin *) -| "in" modloc_list (* ssr plugin *) -] - -modloc_list: [ -| modloc_list modloc -| modloc -] - -modloc: [ -| "-" global (* ssr plugin *) -| global (* ssr plugin *) -] - -ssrhintref: [ -| term (* ssr plugin *) -| term "|" natural (* ssr plugin *) -] - -ssrviewpos: [ -| "for" "move" "/" (* ssr plugin *) -| "for" "apply" "/" (* ssr plugin *) -| "for" "apply" "/" "/" (* ssr plugin *) -| "for" "apply" "//" (* ssr plugin *) -| empty (* ssr plugin *) -] - -ssrviewposspc: [ -| ssrviewpos (* ssr plugin *) -] - -rpattern: [ -| lconstr (* ssrmatching plugin *) -| "in" lconstr (* ssrmatching plugin *) -| lconstr "in" lconstr (* ssrmatching plugin *) -| "in" lconstr "in" lconstr (* ssrmatching plugin *) -| lconstr "in" lconstr "in" lconstr (* ssrmatching plugin *) -| lconstr "as" lconstr "in" lconstr (* ssrmatching plugin *) -] - -cpattern: [ -| "Qed" term (* ssrmatching plugin *) -| term (* ssrmatching plugin *) -] - -lcpattern: [ -| "Qed" lconstr (* ssrmatching plugin *) -| lconstr (* ssrmatching plugin *) -] - -ssrpatternarg: [ -| rpattern (* ssrmatching plugin *) -] - -empty: [ -| -] - -lpar_id_coloneq: [ -| "(" IDENT; ":=" -] - -name_colon: [ -| IDENT; ":" -| "_" ":" -] - -int: [ -| integer -] - -command_entry: [ -| noedit_mode -] - diff --git a/doc/tools/docgram/prodn.edit_mlg b/doc/tools/docgram/prodn.edit_mlg index a28d07636a..37197a1fec 100644 --- a/doc/tools/docgram/prodn.edit_mlg +++ b/doc/tools/docgram/prodn.edit_mlg @@ -12,3 +12,13 @@ (* Contents used to generate prodn in doc *) DOC_GRAMMAR + +(* todo: doesn't work, gives +ltac_match: @match_key @ltac_expr with {? %| } {+| @ltac_expr } end +instead of +ltac_match: @match_key @ltac_expr with {? 
%| } {+| {| @match_pattern | _ } => @ltac_expr } end + +SPLICE: [ +| match_rule +] +*) diff --git a/doc/tools/docgram/productionlist.edit_mlg b/doc/tools/docgram/productionlist.edit_mlg index 84acd07075..42d94e76bb 100644 --- a/doc/tools/docgram/productionlist.edit_mlg +++ b/doc/tools/docgram/productionlist.edit_mlg @@ -15,11 +15,42 @@ DOC_GRAMMAR EXPAND: [ | ] -(* ugh todo: try to handle before expansion *) -tactic_then_gen : [ -| REPLACE ltac_expr_opt ".." ltac_expr_opt2 -| WITH ltac_expr_opt ".." or_opt ltac_expr_list2 +RENAME: [ +| name_alt names_tuple +| binder_list binders +| binder_list_opt binders_opt +| typeclass_constraint_list_comma typeclass_constraints_comma +| universe_expr_list_comma universe_exprs_comma +| universe_level_list_opt universe_levels_opt +| name_list names +| name_list_comma names_comma +| case_item_list_comma case_items_comma +| eqn_list_or_opt eqns_or_opt +| eqn_list_or eqns_or +| pattern_list_or patterns_or +| fix_body_list fix_bodies +| arg_list args +| arg_list_opt args_opt +| evar_binding_list_semi evar_bindings_semi ] -ltac_expr_opt2 : [ | DELETENT ] -ltac_expr_list2_opt : [ | DELETENT ] +binders_opt: [ +| REPLACE binders_opt binder +| WITH binders +] + +(* this is here because they're inside _opt generated by EXPAND *) +SPLICE: [ +| ltac_info +| eliminator +| field_mods +| ltac_production_sep +| ltac_tactic_level +| module_binder +| printunivs_subgraph +| quoted_attributes +| ring_mods +| scope_delimiter +| univ_decl +| univ_name_list +] |
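
The grammar productions above (`ltac_match_goal`, `match_context_rule`, `toplevel_selector`, `range_selector`) describe the concrete syntax of `match goal` and of goal selectors. As a hand-written illustration of what these productions accept (not generated from the grammar, and not part of the patch):

```coq
(* toplevel_selector: "all" ":" applies a tactic to every focused goal *)
Goal True /\ True /\ True.
  repeat split.
  all: exact I.
Qed.

(* ltac_match_goal: match_key reverse_opt "goal" "with" ... "end",
   with match_context_rule alternatives:
     match_hyp_list "|-" match_pattern "=>" ltac_expr
   and the catch-all "_" "=>" ltac_expr *)
Goal forall n : nat, n = n.
  intros n.
  match goal with
  | [ x : nat |- _ = _ ] => reflexivity
  | _ => idtac
  end.
Qed.
```

The bracketed form `[ x : nat |- _ = _ ]` corresponds to the second `match_context_rule` alternative; the same rule without brackets is also accepted, per the first alternative.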
