Lightsout updates #227

Status: Open. Wants to merge 92 commits into base: main.

Commits (92 total; the changes shown below are from 67 commits)
3afb4a1
stim integration
lucasberent Apr 29, 2024
3c97be9
readme update
lucasberent May 14, 2024
9df96d7
merge
lucasberent Jun 3, 2024
93d6af4
🎨 pre-commit fixes
pre-commit-ci[bot] Jun 3, 2024
1bc31bf
update ignore to new folder structure, fixes
lucasberent Jun 3, 2024
d421961
merge
lucasberent Jun 3, 2024
1fa089d
🎨 pre-commit fixes
pre-commit-ci[bot] Jun 3, 2024
7d645ff
more fixes
lucasberent Jun 3, 2024
dd3297d
🎨 pre-commit fixes
pre-commit-ci[bot] Jun 3, 2024
807e495
Merge branch 'main' into lightsout-updates
lucasberent Oct 15, 2024
cbd9e74
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 15, 2024
e44cd86
try fix some linter warnings
lucasberent Oct 15, 2024
86881a1
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 15, 2024
45239a8
try fix some mypy warnings
lucasberent Oct 15, 2024
31c791b
Merge remote-tracking branch 'origin/lightsout-updates' into lightsou…
lucasberent Oct 15, 2024
13831f9
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 15, 2024
58d3678
try fix more hints
lucasberent Oct 15, 2024
3afaf6c
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 15, 2024
8f7203a
more fixes
lucasberent Oct 15, 2024
c37e89a
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 15, 2024
de44781
even more fixes
lucasberent Oct 15, 2024
c316411
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 15, 2024
e1b2d04
hopefully make precommit happy now
lucasberent Oct 15, 2024
4ead234
fix missing param
lucasberent Oct 16, 2024
2ac8366
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
f05fed4
fix warnings
lucasberent Oct 16, 2024
d3dd250
mypy fixes
lucasberent Oct 16, 2024
4711427
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
0d6925c
fix replace issues
lucasberent Oct 16, 2024
c610f14
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
e9d384e
fix replace issues
lucasberent Oct 16, 2024
a4b42d6
mypy fixes
lucasberent Oct 16, 2024
61df30f
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
1c030d5
mypy fixes
lucasberent Oct 16, 2024
397f367
Merge remote-tracking branch 'origin/lightsout-updates' into lightsou…
lucasberent Oct 16, 2024
6d9f566
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
27979c9
fix np array typing
lucasberent Oct 16, 2024
982333b
mypy fixes
lucasberent Oct 16, 2024
609ae17
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
a580b4c
mypy fixes
lucasberent Oct 16, 2024
23806ef
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
23fb1a3
try fixing import issues
lucasberent Oct 16, 2024
18f1051
Merge remote-tracking branch 'origin/lightsout-updates' into lightsou…
lucasberent Oct 16, 2024
b75df90
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
91f3457
try fix mypy excludes
lucasberent Oct 16, 2024
fb6be1b
Merge remote-tracking branch 'origin/lightsout-updates' into lightsou…
lucasberent Oct 16, 2024
bf48e84
try fix mypy excludes
lucasberent Oct 16, 2024
07be228
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 16, 2024
157f437
try fix mypy excludes
lucasberent Oct 17, 2024
3d4c44a
try fix mypy excludes ctd
lucasberent Oct 17, 2024
31859d6
add sinter to mypy modules
lucasberent Oct 17, 2024
5129ec8
mypy ignores for subclassing from stim
lucasberent Oct 17, 2024
2c5e167
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 17, 2024
77de7f1
adapt codecov ignores
lucasberent Oct 17, 2024
5d143a3
mypy byline ignores
lucasberent Oct 23, 2024
b029dae
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 23, 2024
05a96b1
add types for ignore types :P
lucasberent Oct 23, 2024
5aa9811
Merge remote-tracking branch 'origin/lightsout-updates' into lightsou…
lucasberent Oct 23, 2024
d2c36a7
🎨 pre-commit fixes
pre-commit-ci[bot] Oct 23, 2024
50e2cfb
Merge branch 'main' into lightsout-updates
lucasberent Oct 27, 2024
aae9904
Merge branch 'main' into lightsout-updates
burgholzer Nov 5, 2024
32bd204
♻️ streamline decoder changes
burgholzer Nov 8, 2024
e8fd04f
🚚 move evaluation and plotting outside of main package
burgholzer Nov 8, 2024
cd5777f
🚨 proper includes
burgholzer Nov 8, 2024
b460e8b
🔧 adjust codecov config
burgholzer Nov 8, 2024
fb2c93f
🚨 typing
burgholzer Nov 8, 2024
d01b0d5
🔒 update lockfile
burgholzer Nov 8, 2024
3f75906
remove stimbposd
lucasberent Nov 18, 2024
14bc263
fix numpy issue
lucasberent Nov 18, 2024
d4a07ba
remove superfluous arg
lucasberent Nov 18, 2024
d854adf
tests for color code stim ckt gen
lucasberent Nov 18, 2024
1439947
tests for color code stim ckt gen
lucasberent Nov 18, 2024
fa725d4
stim interface tests
lucasberent Nov 18, 2024
98a209a
Merge branch 'main' into lightsout-updates
lucasberent Nov 19, 2024
3761c14
🎨 pre-commit fixes
pre-commit-ci[bot] Nov 19, 2024
bffd7dd
ignore scripts files
lucasberent Nov 19, 2024
9f06935
add some docstrings
lucasberent Nov 19, 2024
eca979d
🎨 pre-commit fixes
pre-commit-ci[bot] Nov 19, 2024
cf87d42
fix tuple include
lucasberent Nov 20, 2024
610d2e9
🎨 pre-commit fixes
pre-commit-ci[bot] Nov 20, 2024
fb7642e
🔒 update lockfile
burgholzer Nov 25, 2024
f004891
🎨 remove uncommented code
burgholzer Nov 25, 2024
330830c
🚨 remove lint ignores and fix warnings
burgholzer Nov 25, 2024
8f87743
sinter tests
lucasberent Dec 5, 2024
b75d012
🎨 pre-commit fixes
pre-commit-ci[bot] Dec 5, 2024
8e43e08
Merge branch 'main' into lightsout-updates
lucasberent Dec 11, 2024
7e15ff3
add docstrings
lucasberent Dec 11, 2024
b7f7c4b
🎨 pre-commit fixes
pre-commit-ci[bot] Dec 11, 2024
c830bd2
rename testfile
lucasberent Dec 11, 2024
8f9017d
Merge remote-tracking branch 'origin/lightsout-updates' into lightsou…
lucasberent Dec 11, 2024
e85c2f9
try fix sinter dependency
lucasberent Dec 11, 2024
c1ec034
try fix linter warning
lucasberent Dec 11, 2024
1 change: 0 additions & 1 deletion .github/codecov.yml
@@ -1,7 +1,6 @@
 ignore:
   - "**/python"
   - "test/**/*"
-  - "src/mqt/qecc/cc_decoder/plots.py"
   - "src/mqt/qecc/analog_information_decoding/utils/data_utils.py"
   - "src/mqt/qecc/analog_information_decoding/code_construction/*"

12 changes: 9 additions & 3 deletions pyproject.toml
@@ -56,6 +56,7 @@ dependencies = [
     "numba>=0.59; python_version > '3.11'",
     "numba>=0.57; python_version <= '3.11'",
     "pymatching>=2.2.1",
+    "stimbposd"
 ]
 dynamic = ["version"]

@@ -180,13 +181,17 @@ disallow_untyped_defs = false
 explicit_package_bases = true
 warn_unreachable = true
 exclude = [
+    "^plot\\_convergence\\_rate\\.ipynb$",
+    "^plot\\_pseudothresholds\\.ipynb$",
+    "^plots\\.py$",
     "code_construction*",
-    "^data_utils\\.py$"
+    "^data_utils\\.py$",
+    "^run\\_color\\_code\\_phenomenological\\_noise\\.py$",
 ]

 [[tool.mypy.overrides]]
 module = ["qiskit.*", "qecsim.*", "qiskit_aer.*", "matplotlib.*", "scipy.*", "ldpc.*", "pytest_console_scripts.*",
-    "z3.*", "bposd.*", "numba.*", "pymatching.*", "stim.*", "multiprocess.*"]
+    "z3.*", "bposd.*", "numba.*", "pymatching.*", "stim.*", "multiprocess.*", "stimbposd.*", "sinter.*"]
 ignore_missing_imports = true

@@ -271,7 +276,8 @@ isort.required-imports = ["from __future__ import annotations"]
     "E402", # Allow imports to appear anywhere in Jupyter notebooks
     "I002", # Allow missing `from __future__ import annotations` import
 ]
-"*/cc_decoder/plots.py" = ["T201"]
+"*/cc_decoder/plotting/**" = ["T20"]
+"*/cc_decoder/plotting/*.ipynb" = ["T", "PTH", "FURB", "PERF", "D"]
 "scripts/*" = ["T201"]

 [tool.ruff.lint.pydocstyle]
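Note on the mypy override above: adding `stimbposd.*` and `sinter.*` to the per-module `ignore_missing_imports` list lets type-checked code import these packages even though they ship no type stubs. A minimal sketch of the effect, assuming `sinter` is installed (the helper below is illustrative and not part of the package):

```python
from __future__ import annotations

import sinter  # no type stubs; accepted because of the per-module override above


def load_stats(path: str) -> list[sinter.TaskStats]:
    """Illustrative helper: read sinter sampling statistics from a CSV file."""
    return sinter.read_stats_from_csv_files(path)
```

Without the override, mypy would flag the `import sinter` line as a missing-stub error in any module it checks.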
74 changes: 74 additions & 0 deletions scripts/cc_decoder/plotting/plot_convergence_rate.ipynb
@@ -0,0 +1,74 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import locale\n",
"import os\n",
"from collections import defaultdict\n",
"\n",
"import matplotlib.pyplot as plt\n",
"from matplotlib import ticker\n",
"\n",
"with open(f\"{os.getcwd()}/convergence_rate.txt\", encoding=locale.getpreferredencoding(False)) as file:\n",
" content = file.read()\n",
"\n",
"convergence_rate_dict = defaultdict(lambda: defaultdict(int))\n",
"for line in content.split(\"\\n\"):\n",
" d, per, n_converged, n_not_converged = line.split(\" \")[:]\n",
" convergence_rate_dict[d][round(float(per), 5)] = (int(n_converged), int(n_not_converged))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"for d in convergence_rate_dict:\n",
" pers = []\n",
" convergence_rates = []\n",
" for per in convergence_rate_dict[d]:\n",
" pers.append(per)\n",
" pers.sort()\n",
"\n",
" for per in pers:\n",
" n_converged, n_not_converged = convergence_rate_dict[d][per]\n",
" convergence_rates.append(n_converged / (n_converged + n_not_converged))\n",
" plt.plot(pers, convergence_rates, label=f\"d={d}\")\n",
"\n",
"ax = plt.gca()\n",
"ax.xaxis.set_major_locator(ticker.MaxNLocator(nbins=10))\n",
"ax.grid()\n",
"\n",
"plt.legend()\n",
"plt.ylabel(\"Convergence rate\")\n",
"plt.xlabel(\"Physical error rate\")\n",
"plt.savefig(\"convergence.svg\")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
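For reference, the parsing cell in the notebook above expects `convergence_rate.txt` to hold one whitespace-separated record per line: distance, physical error rate, number of converged shots, number of non-converged shots. A minimal self-contained sketch of that parsing step, using made-up sample values (the numbers are illustrative assumptions):

```python
from collections import defaultdict

# Illustrative stand-in for convergence_rate.txt:
# "<distance> <physical_error_rate> <n_converged> <n_not_converged>" per line.
sample_content = "3 0.001 9980 20\n3 0.002 9950 50\n5 0.001 9995 5"

convergence_rate_dict: dict[str, dict[float, tuple[int, int]]] = defaultdict(dict)
for line in sample_content.split("\n"):
    d, per, n_converged, n_not_converged = line.split(" ")
    convergence_rate_dict[d][round(float(per), 5)] = (int(n_converged), int(n_not_converged))

# Convergence rate for d=3 at p=0.001:
n_conv, n_not = convergence_rate_dict["3"][0.001]
print(n_conv / (n_conv + n_not))  # 0.998
```

The second cell then sorts the physical error rates per distance and plots the ratio converged / (converged + not converged).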
143 changes: 143 additions & 0 deletions scripts/cc_decoder/plotting/plot_pseudothresholds.ipynb
@@ -0,0 +1,143 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"import sinter\n",
"\n",
"samples = sinter.read_stats_from_csv_files(f\"{os.getcwd()}/pseudothreshold_plot.csv\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Render a matplotlib plot of the data.\n",
"fig, ax = plt.subplots(1, 1)\n",
"sinter.plot_error_rate(\n",
" ax=ax,\n",
" stats=samples,\n",
" group_func=lambda stat: f\"d={stat.json_metadata['d']}, {stat.decoder}\",\n",
" x_func=lambda stat: stat.json_metadata[\"p\"],\n",
" failure_units_per_shot_func=lambda stats: stats.json_metadata[\"rounds\"],\n",
" filter_func=lambda stat: stat.json_metadata[\"d\"] < 5,\n",
")\n",
"x_s = np.linspace(0.001, 0.029, 1000000)\n",
"y_s = np.linspace(0.001, 0.029, 1000000)\n",
"ax.set_yscale(\"log\")\n",
"ax.plot(x_s, y_s, \"k-\", alpha=0.75, zorder=0, label=\"x=y\")\n",
"ax.grid()\n",
"ax.set_title(\"Phenomenological Noise\")\n",
"ax.set_ylabel(\"Logical Error Probability (per shot)\")\n",
"ax.set_xlabel(\"Physical Error Rate\")\n",
"ax.legend(loc=\"lower right\")\n",
"fig.savefig(\"pseudoth.svg\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Render a matplotlib plot of the data.\n",
"fig, ax = plt.subplots(1, 1)\n",
"sinter.plot_error_rate(\n",
" ax=ax,\n",
" stats=samples,\n",
" group_func=lambda stat: f\"d={stat.json_metadata['d']}, {stat.decoder}\",\n",
" x_func=lambda stat: stat.json_metadata[\"p\"],\n",
" failure_units_per_shot_func=lambda stats: stats.json_metadata[\"rounds\"],\n",
" filter_func=lambda stat: stat.json_metadata[\"d\"] > 4,\n",
")\n",
"x_s = np.linspace(0.001, 0.029, 1000000)\n",
"y_s = np.linspace(0.001, 0.029, 1000000)\n",
"ax.set_yscale(\"log\")\n",
"ax.plot(x_s, y_s, \"k-\", alpha=0.75, zorder=0, label=\"x=y\")\n",
"ax.grid()\n",
"ax.set_title(\"Phenomenological Noise\")\n",
"ax.set_ylabel(\"Logical Error Probability (per shot)\")\n",
"ax.set_xlabel(\"Physical Error Rate\")\n",
"ax.set_ylim(0.00001, 1)\n",
"ax.legend(loc=\"lower right\")\n",
"fig.savefig(\"no-pseudoth.svg\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Render a matplotlib plot of the data.\n",
"fig, ax = plt.subplots(1, 2, sharey=True, figsize=(12, 5))\n",
"sinter.plot_error_rate(\n",
" ax=ax[0],\n",
" stats=samples,\n",
" group_func=lambda stat: f\"d={stat.json_metadata['d']}, {stat.decoder}\",\n",
" x_func=lambda stat: stat.json_metadata[\"p\"],\n",
" failure_units_per_shot_func=lambda stats: stats.json_metadata[\"rounds\"],\n",
" filter_func=lambda stat: stat.json_metadata[\"d\"] < 5,\n",
")\n",
"x_s = np.linspace(0.001, 0.029, 1000000)\n",
"y_s = np.linspace(0.001, 0.029, 1000000)\n",
"ax[0].set_yscale(\"log\")\n",
"ax[0].plot(x_s, y_s, \"k-\", alpha=0.75, zorder=0, label=\"x=y\")\n",
"ax[0].grid()\n",
"ax[0].set_ylabel(\"Logical Error Probability (per shot)\")\n",
"ax[0].set_xlabel(\"Physical Error Rate\")\n",
"ax[0].legend(loc=\"lower right\")\n",
"\n",
"# Render a matplotlib plot of the data.\n",
"sinter.plot_error_rate(\n",
" ax=ax[1],\n",
" stats=samples,\n",
" group_func=lambda stat: f\"d={stat.json_metadata['d']}, {stat.decoder}\",\n",
" x_func=lambda stat: stat.json_metadata[\"p\"],\n",
" failure_units_per_shot_func=lambda stats: stats.json_metadata[\"rounds\"],\n",
" filter_func=lambda stat: stat.json_metadata[\"d\"] > 4,\n",
")\n",
"x_s = np.linspace(0.001, 0.029, 1000000)\n",
"y_s = np.linspace(0.001, 0.029, 1000000)\n",
"ax[1].set_yscale(\"log\")\n",
"ax[1].plot(x_s, y_s, \"k-\", alpha=0.75, zorder=0, label=\"x=y\")\n",
"ax[1].grid()\n",
"# ax[1].set_ylabel('Logical Error Probability (per shot)')\n",
"ax[1].set_xlabel(\"Physical Error Rate\")\n",
"ax[1].set_ylim(0.00001, 1)\n",
"ax[1].legend(loc=\"lower right\")\n",
"\n",
"fig.savefig(\"pseudoth.svg\")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
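The x = y line plotted above marks the pseudothreshold: the physical error rate at which the logical error probability per shot equals the physical error rate, so the crossing of a curve with this line gives its pseudothreshold. A minimal sketch of estimating such a crossing by linear interpolation (the data points below are illustrative assumptions, not results from this PR):

```python
import numpy as np

# Illustrative per-shot logical error probabilities for a single distance (assumed values).
p_phys = np.array([0.001, 0.005, 0.010, 0.015, 0.020, 0.025])
p_log = np.array([0.0002, 0.003, 0.009, 0.018, 0.030, 0.045])

# The pseudothreshold is where p_log - p_phys changes sign; interpolate linearly
# between the two bracketing points.
diff = p_log - p_phys
idx = int(np.argmax(diff > 0))  # first index where p_log exceeds p_phys
x0, x1 = p_phys[idx - 1], p_phys[idx]
y0, y1 = diff[idx - 1], diff[idx]
pseudothreshold = x0 - y0 * (x1 - x0) / (y1 - y0)
print(f"estimated pseudothreshold ~ {pseudothreshold:.4f}")  # ~0.0113 for these values
```

The notebook instead reads the crossing off the sinter plots visually, split into d < 5 and d > 4 panels.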
@@ -82,7 +82,6 @@ def calculate_threshold(
     popt, _ = curve_fit(threshold_fit, (per_data, distance_data), ler_data, maxfev=10000)
     if ax is not None:
         ax.axvline(x=popt[-1], color="black", linestyle="dashed")
-    print("threshold: ", popt[-1])

     distance_array = [int(distance) for distance in code_dict]
     distance_array.sort()

@@ -198,23 +197,23 @@ def generate_plots(results_dir: Path, results_file: Path) -> None:


 def generate_plots_tn(results_dir: Path, results_file: Path) -> None:
-    """Generate the plots for the tensor network decoder."""
+    """Generate plots for TN decoder."""
     # read in all generated data
     data = []
     for file in results_dir.glob("*.json"):
         with file.open() as f:
             data.append(json.loads(f.read()))

     # prepare code to per,ler map and print
-    code_to_xys: dict[float, Any] = {}
+    code_to_xys = {}  # type: ignore[var-annotated]
     for run in data:
         xys = code_to_xys.setdefault(run["n_k_d"][-1], [])
         xys.append((run["physical_error_rate"], run["logical_failure_rate"]))

     for xys in code_to_xys.values():
         xys.sort(key=operator.itemgetter(0))

-    _, ax = plt.subplots(2, 2, figsize=(12, 10))
+    _fig, ax = plt.subplots(2, 2, figsize=(12, 10))
     # add data
     for code, xys in sorted(code_to_xys.items()):
         ax[0][0].plot(*zip(*xys), "x-", label=f"d={code}")

@@ -235,31 +234,55 @@ def generate_plots_tn(results_dir: Path, results_file: Path) -> None:
     for code, xys in sorted(code_to_xys.items()):
         ax[1][0].plot(*zip(*xys), "x-", label=f"d={code}")
     ax[1][0].set_xlabel("Physical error rate")
-    ax[1][0].set_ylabel("Average time per run (microseconds)")
+    ax[1][0].set_ylabel("Average time per run (µs)")  # noqa: RUF001
     ax[1][0].legend()
     ax[1][0].set_ylim(0, 300000)

     ds = []
-    p_data: dict[float, Any] = {}
-    pers = [0.001, 0.021, 0.051, 0.081, 0.111]
-    for d, cdata in sorted(code_to_xys.items()):
+    p_data = {}  # type: ignore[var-annotated]
+    pers = [0.051, 0.081, 0.111]  # 0.001, 0.021,
+    for d, data in sorted(code_to_xys.items()):
         ds.append(d)
-        for p, t in cdata:
+        for p, t in data:
             if p in pers:
                 if p not in p_data:
                     p_data[p] = {"d": [], "t": []}
                 p_data[p]["d"].append(d)
                 p_data[p]["t"].append(t)
-    for p, pdata in sorted(p_data.items()):
-        ax[1][1].plot(ds, pdata["t"], label="p=" + str(p))
+
+    for p, data in sorted(p_data.items()):
+        ax[1][1].plot(ds, data["t"], label="p=" + str(p))  # type: ignore[call-overload]

     ax[1][1].set_xlabel("Distance")
-    ax[1][1].set_ylabel("Average time per run (microseconds)")
+    ax[1][1].set_ylabel("Average time per run (µs)")  # noqa: RUF001
     ax[1][1].legend()
-    # ax[1][1].set_yscale("log")
+    ax[1][1].set_yscale("log")
+    ax[1][1].set_xticks(ds)
     ax[1][1].set_ylim(0, 300000)

+    data = []
+    for file in results_dir.glob("*.json"):
+        with file.open() as f:
+            data.append(json.loads(f.read()))
+    metrics = {}  # type: ignore[var-annotated]
+    per_metrics = {}  # type: ignore[var-annotated]
+
     # save plot as vector graphic
+    for result in data:
+        d = result["n_k_d"][2]
+        p = result["physical_error_rate"]
+
+        if d not in metrics:
+            metrics[d] = {
+                "p": [],
+                "logical_error_rate": [],
+            }
+        if p not in per_metrics:
+            per_metrics[p] = {}
+
+        metrics[d]["p"].append(p)
+        metrics[d]["logical_error_rate"].append(result["logical_failure_rate"])
+    calculate_threshold(code_dict=metrics, ax=ax[0][1], title="Threshold")
     plt.savefig(results_file, bbox_inches="tight")
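For background on the `calculate_threshold` call above: the logical error rates collected into `metrics` are fitted across distances with `scipy.optimize.curve_fit`, and the threshold is read off the last fitted parameter (`popt[-1]` in the first hunk). The actual `threshold_fit` ansatz is not shown in this diff; the quadratic finite-size-scaling form, parameter ordering, and data below are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import curve_fit


def threshold_fit(variables: tuple[np.ndarray, np.ndarray], a: float, b: float, c: float, p_th: float) -> np.ndarray:
    """Hypothetical ansatz: quadratic in the rescaled variable x = (p - p_th) * d."""
    p, d = variables
    x = (p - p_th) * d
    return a + b * x + c * x**2


# Made-up logical error rates for distances 3 and 5 (illustrative only).
per_data = np.array([0.01, 0.02, 0.03, 0.04, 0.01, 0.02, 0.03, 0.04])
distance_data = np.array([3, 3, 3, 3, 5, 5, 5, 5])
ler_data = np.array([0.02, 0.06, 0.15, 0.30, 0.01, 0.05, 0.20, 0.45])

popt, _ = curve_fit(threshold_fit, (per_data, distance_data), ler_data, p0=(0.1, 1.0, 1.0, 0.02), maxfev=10000)
print("estimated threshold:", popt[-1])  # mirrors the popt[-1] read-out above
```

The threshold is the physical error rate at which the curves for different distances intersect; the fitted `p_th` estimates that crossing point.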
File renamed without changes.