
Commit 7027843

Release version 0.3.1

Committed by nikhilkhatri, Thomas Hoffmann, Dimitri Kartsaklis and Charles London.

Co-authored-by: Thomas Hoffmann <thomas.hoffmann@quantinuum.com>
Co-authored-by: Dimitri Kartsaklis <dimitri.kartsaklis@quantinuum.com>
Co-authored-by: Charles London <charles.london@quantinuum.com>

1 parent 54b5bfa; commit 7027843

24 files changed: +533 -316 lines

.github/workflows/build_test.yml

+18-1
@@ -11,6 +11,7 @@ on:
 env:
   SRC_DIR: lambeq
   TEST_DIR: tests
+  DOCS_DIR: docs

 jobs:
   lint:
@@ -76,7 +77,7 @@ jobs:
         --doctest-modules
         --durations=50
         --ignore=${{ env.TEST_DIR }}/text2diagram/test_depccg_parser.py
-        --ignore=docs/extract_code_cells.py
+        --ignore=${{ env.DOCS_DIR }}/extract_code_cells.py
     - name: Determine if depccg tests should be run
       # only test depccg if it is explicitly changed, since it is very slow
       # tests are also disabled on Python 3.11
@@ -108,6 +109,22 @@ jobs:
       if: steps.depccg-enabled.outcome == 'success'
       continue-on-error: true
       run: coverage run --append --source=${{ env.SRC_DIR }} -m pytest -k test_depccg_parser.py
+    - name: Preparation for notebook testing
+      run: pip install nbmake
+    - name: Test example notebooks
+      env:
+        TEST_NOTEBOOKS: 1
+      run: >
+        pytest --nbmake ${{ env.DOCS_DIR }}/examples/
+        --nbmake-timeout=60
+    - name: Test tutorial notebooks
+      env:
+        TEST_NOTEBOOKS: 1
+      run: >
+        pytest --nbmake ${{ env.DOCS_DIR }}/tutorials/
+        --nbmake-timeout=60
+        --ignore ${{ env.DOCS_DIR }}/tutorials/trainer_hybrid.ipynb
+        --ignore ${{ env.DOCS_DIR }}/tutorials/code
     - name: Coverage report
       run: coverage report -m
   type_check:
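
In effect, the workflow now also runs the example and tutorial notebooks through the nbmake pytest plugin, with TEST_NOTEBOOKS=1 set so that the hidden test cells added to the notebooks (see the notebook diffs below) shrink the datasets and keep CI fast. The same check can presumably be reproduced locally with `pip install nbmake` followed by `TEST_NOTEBOOKS=1 pytest --nbmake docs/examples/ --nbmake-timeout=60`, and likewise for docs/tutorials/ minus the excluded trainer_hybrid.ipynb and code directory.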

docs/clean_notebooks.py

+35
@@ -0,0 +1,35 @@
+from pathlib import Path
+from itertools import chain
+import nbformat as nbf
+
+
+print("Cleaning notebooks...")
+
+nbs_path = Path("examples")
+tut_path = Path("tutorials")
+useful_metadata = ["nbsphinx", "raw_mimetype"]
+
+for file in chain(nbs_path.iterdir(), tut_path.iterdir()):
+    if not (file.is_file() and file.suffix == ".ipynb"):
+        continue
+
+    ntbk = nbf.read(file, nbf.NO_CONVERT)
+
+    for cell in ntbk.cells:
+        # Delete cell ID if it's there
+        cell.pop("id", None)
+
+        # Keep only useful metadata
+        new_metadata = {x: cell.metadata[x]
+                        for x in useful_metadata
+                        if x in cell.metadata}
+        cell.metadata = new_metadata
+
+    ntbk.metadata = {"language_info": {"name": "python"}}
+
+    # We need the version of nbformat to be x.4, otherwise cells IDs
+    # are regenerated automatically
+    ntbk.nbformat = 4
+    ntbk.nbformat_minor = 4
+
+    nbf.write(ntbk, file, version=nbf.NO_CONVERT)
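
The script normalises notebook metadata so that committed notebooks stay diff-friendly: cell IDs are dropped, only the nbsphinx and raw_mimetype metadata keys are kept, and the nbformat version is pinned to 4.4 so that IDs are not regenerated on write. Since the examples and tutorials paths are relative, it is presumably meant to be run from inside the docs directory, e.g. `python clean_notebooks.py`.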

docs/conf.py

+1-1
@@ -46,7 +46,7 @@
 ]

 intersphinx_mapping = {
-    'discopy': ("https://discopy.readthedocs.io/en/0.5/", None),
+    'discopy': ("https://docs.discopy.org/en/0.5.1.1/", None),
     'pennylane': ("https://pennylane.readthedocs.io/en/stable/", None),
 }

docs/discopy.rst

+3-6
@@ -3,14 +3,11 @@
 DisCoPy
 =======

-While the :ref:`parser <sec-parsing>` provides ``lambeq``'s input, *DisCoPy* [#f1]_ [FTC2020]_ is ``lambeq``'s underlying engine, the component where all the low-level processing takes place. At its core, DisCoPy is a Python library that allows computation with :term:`monoidal categories <monoidal category>`. The main data structure is that of a *monoidal diagram*, or :ref:`string diagram <sec-string-diagrams>`, which is the format that ``lambeq`` uses internally to encode a sentence (:py:class:`discopy.rigid.Diagram`). DisCoPy makes this easy, by offering many language-related features, such as support for :term:`pregroup grammars <pregroup grammar>` and :term:`functors <functor>` for implementing :term:`compositional models <compositional model>` such as :term:`DisCoCat`. Furthermore, from a quantum computing perspective, DisCoPy provides abstractions for creating all standard :term:`quantum gates <quantum gate>` and building :term:`quantum circuits <quantum circuit>`, which are used by ``lambeq`` in the final stages of the :ref:`pipeline <sec-pipeline>`.
+While the :ref:`parser <sec-parsing>` provides ``lambeq``'s input, `DisCoPy <https://discopy.org>`_ [FTC2020]_ is ``lambeq``'s underlying engine, the component where all the low-level processing takes place. At its core, DisCoPy is a Python library that allows computation with :term:`monoidal categories <monoidal category>`. The main data structure is that of a *monoidal diagram*, or :ref:`string diagram <sec-string-diagrams>`, which is the format that ``lambeq`` uses internally to encode a sentence (:py:class:`discopy.rigid.Diagram`). DisCoPy makes this easy, by offering many language-related features, such as support for :term:`pregroup grammars <pregroup grammar>` and :term:`functors <functor>` for implementing :term:`compositional models <compositional model>` such as :term:`DisCoCat`. Furthermore, from a quantum computing perspective, DisCoPy provides abstractions for creating all standard :term:`quantum gates <quantum gate>` and building :term:`quantum circuits <quantum circuit>`, which are used by ``lambeq`` in the final stages of the :ref:`pipeline <sec-pipeline>`.

 Thus, it is not a surprise that the advanced use of ``lambeq``, involving extending the toolkit with new :term:`compositional models <compositional model>` and :term:`ansätze <ansatz (plural: ansätze)>`, requires some familiarity of DisCoPy. For this, you can use the following resources:

 - For a gentle introduction to basic DisCoPy concepts, start with ``lambeq``'s tutorial :ref:`sec-advanced`.
-- The `basic example notebooks <https://discopy.readthedocs.io/en/main/notebooks.basics.html>`_ in DisCoPy documentation provide another good starting point.
-- The `advanced tutorials <https://discopy.readthedocs.io/en/main/notebooks.advanced.html>`_ in DisCoPy documentation can help you to delve further into DisCoPy.
+- The `basic example notebooks <https://docs.discopy.org/en/0.5.1.1/notebooks.basics.html>`_ in DisCoPy documentation provide another good starting point.
+- The `advanced tutorials <https://docs.discopy.org/en/0.5.1.1/notebooks.advanced.html>`_ in DisCoPy documentation can help you to delve further into DisCoPy.

-.. rubric:: Footnotes
-
-.. [#f1] https://github.com/oxford-quantum-group/discopy
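
As a point of reference for the paragraph above, the kind of object lambeq hands to DisCoPy can be built by hand. The snippet below is a rough sketch against DisCoPy 0.5.x and is not part of this commit; it uses discopy.rigid.Box in place of the grammar-specific Word class and encodes "Alice loves Bob" as a pregroup string diagram:

    from discopy.rigid import Box, Cup, Id, Ty

    # Pregroup types: n for noun, s for sentence
    n, s = Ty('n'), Ty('s')

    # Words are boxes with empty input and their pregroup type as output
    alice = Box('Alice', Ty(), n)
    loves = Box('loves', Ty(), n.r @ s @ n.l)
    bob = Box('Bob', Ty(), n)

    # Compose the words with cups; the result is a discopy.rigid.Diagram,
    # the structure lambeq uses internally to encode a sentence
    diagram = (alice @ loves @ bob
               >> Cup(n, n.r) @ Id(s) @ Cup(n.l, n))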

docs/examples/classical_pipeline.ipynb

+18-1
@@ -59,6 +59,23 @@
     "test_labels, test_data = read_data('datasets/mc_test_data.txt')"
    ]
   },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "nbsphinx": "hidden"
+   },
+   "outputs": [],
+   "source": [
+    "TESTING = int(os.environ.get('TEST_NOTEBOOKS', '0'))\n",
+    "\n",
+    "if TESTING:\n",
+    "    train_labels, train_data = train_labels[:2], train_data[:2]\n",
+    "    dev_labels, dev_data = dev_labels[:2], dev_data[:2]\n",
+    "    test_labels, test_data = test_labels[:2], test_data[:2]\n",
+    "    EPOCHS = 1"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -306,5 +323,5 @@
   }
  },
 "nbformat": 4,
-"nbformat_minor": 5
+"nbformat_minor": 4
 }
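
Decoded from the JSON above, the hidden cell (the same cell is added to the quantum pipeline notebooks below) is a small piece of Python that truncates the datasets when the notebooks run under the new CI job. Sketched as a standalone snippet, with the import and the data variables that the surrounding notebook already provides noted in comments:

    import os  # presumably imported earlier in the notebook; repeated here so the snippet stands alone

    # the train/dev/test labels and data come from the read_data(...) cell just above
    TESTING = int(os.environ.get('TEST_NOTEBOOKS', '0'))

    if TESTING:
        train_labels, train_data = train_labels[:2], train_data[:2]
        dev_labels, dev_data = dev_labels[:2], dev_data[:2]
        test_labels, test_data = test_labels[:2], test_data[:2]
        EPOCHS = 1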

docs/examples/parser.ipynb

+1-1
@@ -64,5 +64,5 @@
   }
  },
 "nbformat": 4,
-"nbformat_minor": 5
+"nbformat_minor": 4
 }

docs/examples/pennylane.ipynb

+118-53
Large diffs are not rendered by default.

docs/examples/quantum_pipeline.ipynb

+18-1
@@ -61,6 +61,23 @@
     "test_labels, test_data = read_data('datasets/mc_test_data.txt')"
    ]
   },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "nbsphinx": "hidden"
+   },
+   "outputs": [],
+   "source": [
+    "TESTING = int(os.environ.get('TEST_NOTEBOOKS', '0'))\n",
+    "\n",
+    "if TESTING:\n",
+    "    train_labels, train_data = train_labels[:2], train_data[:2]\n",
+    "    dev_labels, dev_data = dev_labels[:2], dev_data[:2]\n",
+    "    test_labels, test_data = test_labels[:2], test_data[:2]\n",
+    "    EPOCHS = 1"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -356,5 +373,5 @@
   }
  },
 "nbformat": 4,
-"nbformat_minor": 2
+"nbformat_minor": 4
 }

docs/examples/quantum_pipeline_jax.ipynb

+18-1
@@ -64,6 +64,23 @@
     "test_labels, test_data = read_data('datasets/mc_test_data.txt')"
    ]
   },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "nbsphinx": "hidden"
+   },
+   "outputs": [],
+   "source": [
+    "TESTING = int(os.environ.get('TEST_NOTEBOOKS', '0'))\n",
+    "\n",
+    "if TESTING:\n",
+    "    train_labels, train_data = train_labels[:2], train_data[:2]\n",
+    "    dev_labels, dev_data = dev_labels[:2], dev_data[:2]\n",
+    "    test_labels, test_data = test_labels[:2], test_data[:2]\n",
+    "    EPOCHS = 1"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -360,5 +377,5 @@
   }
  },
 "nbformat": 4,
-"nbformat_minor": 2
+"nbformat_minor": 4
 }

docs/examples/reader.ipynb

+1-1
@@ -137,5 +137,5 @@
   }
  },
 "nbformat": 4,
-"nbformat_minor": 5
+"nbformat_minor": 4
 }

docs/examples/rewrite.ipynb

+1-1
@@ -816,5 +816,5 @@
   }
  },
 "nbformat": 4,
-"nbformat_minor": 5
+"nbformat_minor": 4
 }

docs/examples/tensor.ipynb

+1-1
@@ -154,5 +154,5 @@
   }
  },
 "nbformat": 4,
-"nbformat_minor": 5
+"nbformat_minor": 4
 }

docs/examples/tree_reader.ipynb

+1-1
@@ -101,5 +101,5 @@
   }
  },
 "nbformat": 4,
-"nbformat_minor": 2
+"nbformat_minor": 4
 }

docs/index.rst

+3-3
@@ -1,11 +1,11 @@
 lambeq
 ======

-.. image:: _static/images/CQ-logo.png
-   :width: 120px
+.. image:: _static/images/Quantinuum_logo.png
+   :width: 240px
    :align: right

-``lambeq`` is an open-source, modular, extensible high-level Python library for experimental :term:`Quantum Natural Language Processing <quantum NLP (QNLP)>` (QNLP), created by `Cambridge Quantum <https://cambridgequantum.com>`_'s QNLP team. At a high level, the library allows the conversion of any sentence to a :term:`quantum circuit`, based on a given :term:`compositional model` and certain parameterisation and choices of :term:`ansätze <ansatz (plural: ansätze)>`, and facilitates :ref:`training <sec-training>` for both quantum and classical NLP experiments. The notes for the latest release can be found :ref:`here <sec-release_notes>`.
+``lambeq`` is an open-source, modular, extensible high-level Python library for experimental :term:`Quantum Natural Language Processing <quantum NLP (QNLP)>` (QNLP), created by `Quantinuum <https://www.quantinuum.com>`_'s QNLP team. At a high level, the library allows the conversion of any sentence to a :term:`quantum circuit`, based on a given :term:`compositional model` and certain parameterisation and choices of :term:`ansätze <ansatz (plural: ansätze)>`, and facilitates :ref:`training <sec-training>` for both quantum and classical NLP experiments. The notes for the latest release can be found :ref:`here <sec-release_notes>`.

 ``lambeq`` is available for Python 3.8 and higher, on Linux, macOS and Windows. To install, type:
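
For readers skimming the diff, the sentence-to-circuit flow that this paragraph describes looks roughly as follows. This is a generic sketch of lambeq usage rather than code from this commit; the parser, ansatz and atomic-type names are taken from the library's public API:

    from lambeq import AtomicType, BobcatParser, IQPAnsatz

    # Parse a sentence into a string diagram (a pretrained model is downloaded on first use)
    parser = BobcatParser()
    diagram = parser.sentence2diagram('Alice loves Bob.')

    # Parameterise the diagram as a quantum circuit: one qubit per noun and
    # sentence wire, using a single IQP layer
    ansatz = IQPAnsatz({AtomicType.NOUN: 1, AtomicType.SENTENCE: 1}, n_layers=1)
    circuit = ansatz(diagram)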

docs/release_notes.rst

+16
@@ -4,6 +4,22 @@ Release notes
 =============


+.. _rel-0.3.1:
+
+`0.3.1 <https://github.com/CQCL/lambeq/releases/tag/0.3.1>`_
+------------------------------------------------------------
+
+Changed:
+
+- Added example and tutorial notebooks to tests.
+- Dependencies: pinned the maximum version of Jax and Jaxlib to 0.4.6 to avoid a JIT-compilation error when using the :py:class:`~lambeq.NumpyModel`.
+
+Fixed:
+
+- Documentation: fixed broken DisCoPy links.
+- Fixed PyTorch datatype errors in example and tutorial notebooks.
+- Updated custom :term:`ansätze <ansatz (plural: ansätze)>` in tutorial notebook to match new structure of :py:class:`~lambeq.CircuitAnsatz` and :py:class:`~lambeq.TensorAnsatz`.
+
 .. _rel-0.3.0:

 `0.3.0 <https://github.com/CQCL/lambeq/releases/tag/0.3.0>`_
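
For context on the Jax pin: the JIT-compilation error mentioned in the 0.3.1 notes concerns the jax.jit-compiled evaluation path of NumpyModel. A minimal sketch (not from this commit, and assuming that path is the one enabled with use_jit=True) of checking the resolved version and turning that path on:

    import jax  # with lambeq 0.3.1, jax/jaxlib are expected to resolve to 0.4.6 or lower
    from lambeq import NumpyModel

    print(jax.__version__)

    # use_jit=True enables the jax.jit-compiled evaluation; in a real experiment the
    # model would typically be built with NumpyModel.from_diagrams(circuits, use_jit=True)
    model = NumpyModel(use_jit=True)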
