
ansible_mitogen: Fix usage of connection_loader__get. #1215

Open · wants to merge 6 commits into master
Conversation

@Nihlus (Contributor) commented Jan 11, 2025

This PR revives #976 and updates the proposed fix in both the SSH and kubectl connection methods.

As a short recap, get_with_context returns a named tuple as of Ansible 2.10, which necessitates a small change in how it's accessed in Mitogen.

Furthermore, certain connections use both a local and a remote connection. This would cause issues in the same code path due to how Mitogen transmutes action types at runtime - this has also been fixed.

This fixes #766, #662, #342, and #1094.
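
The named-tuple change can be illustrated with a standalone sketch. The tuple fields below mirror Ansible's `get_with_context_result`, but the loader is a hypothetical stand-in, not Ansible's actual code:

```python
import collections

# Mirrors the shape of ansible.plugins.loader.get_with_context_result
# (fields: object, plugin_load_context); illustrative stand-in only.
get_with_context_result = collections.namedtuple(
    'get_with_context_result', ['object', 'plugin_load_context']
)


class FakeConnectionLoader:
    """Stand-in for Ansible's connection_loader (hypothetical)."""

    def get_with_context(self, name, class_only=False):
        # As of Ansible 2.10, the plugin is wrapped in a named tuple
        # rather than returned directly.
        return get_with_context_result(
            object='<%s connection class>' % name,
            plugin_load_context=None,
        )


loader = FakeConnectionLoader()
result = loader.get_with_context('ssh', class_only=True)
# Pre-2.10 callers treated the return value as the plugin itself;
# post-2.10 they must unpack the `object` field.
plugin = result.object
```

Code that still uses the result as if it were the plugin class fails with errors like `'get_with_context_result' object has no attribute ...`, which is exactly the symptom in the linked issues.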

@Nihlus Nihlus force-pushed the fix-connection-loader branch from 9dfacc3 to 476754b Compare January 11, 2025 20:22
@moreati moreati changed the title Fix usage of connection_loader__get. ansible_mitogen: Fix usage of connection_loader__get. Jan 12, 2025
@moreati (Member) commented Jan 12, 2025

Thank you for looking at this. The PR needs work to fix one or more failing tests and to include a changelog entry.

PLAY [integration/stub_connections/kubectl.yml] ********************************

TASK [include_tasks _raw_params=../_mitogen_only.yml] **************************
Saturday 11 January 2025  20:28:10 +0000 (0:00:03.327)       0:03:11.667 ****** 
included: /home/runner/work/mitogen/mitogen/tests/ansible/integration/_mitogen_only.yml for target-debian11-1, target-ubuntu2004-2

TASK [meta _raw_params=end_play] ***********************************************
Saturday 11 January 2025  20:28:10 +0000 (0:00:00.037)       0:03:11.704 ****** 
skipping: [target-debian11-1]

TASK [meta _raw_params=end_play] ***********************************************
Saturday 11 January 2025  20:28:10 +0000 (0:00:00.008)       0:03:11.712 ****** 
skipping: [target-debian11-1]

TASK [custom_python_detect_environment ] ***************************************
Saturday 11 January 2025  20:28:10 +0000 (0:00:00.007)       0:03:11.720 ****** 
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: NameError: name 'vanilla_class' is not defined. Did you mean: 'self.vanilla_class'?
fatal: [target-debian11-1]: FAILED! => 
  msg: 'Unexpected failure during module execution: name ''vanilla_class'' is not defined'
  stdout: ''

@moreati (Member) commented Jan 12, 2025

I have an alternative approach that may eliminate the need for ansible_mitogen.loaders.connection_loader__get. I haven't had a chance to try or test it yet.

@@ -33,6 +33,7 @@ import os.path
 import sys
 
 from ansible.plugins.connection.ssh import (
+    Connection as _ansible_ssh_Connection,
     DOCUMENTATION as _ansible_ssh_DOCUMENTATION,
 )
 
@@ -55,18 +56,13 @@ except ImportError:
     del base_dir
 
 import ansible_mitogen.connection
-import ansible_mitogen.loaders
 
 
 class Connection(ansible_mitogen.connection.Connection):
     transport = 'ssh'
-    vanilla_class = ansible_mitogen.loaders.connection_loader__get(
-        'ssh',
-        class_only=True,
-    )
 
     @staticmethod
     def _create_control_path(*args, **kwargs):
         """Forward _create_control_path() to the implementation in ssh.py."""
         # https://github.com/dw/mitogen/issues/342
-        return Connection.vanilla_class._create_control_path(*args, **kwargs)
+        return _ansible_ssh_Connection._create_control_path(*args, **kwargs)

Does it work for you?

@Nihlus (Contributor Author) commented Jan 12, 2025

That works, though I don't think it'd work as well for the kubectl connection plugin. That one checks for a valid return from connection_loader__get as a way to detect kubectl support, and I'm not sure if pushing the error up to the import level would be the right choice.
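
For context, the detection pattern at stake can be sketched roughly like this (the `detect_support` helper and the loader stubs are hypothetical names, not Mitogen's actual code). A failed or empty lookup is treated as "support absent", which is why pushing the failure up to a module-level import would change behaviour:

```python
# Hypothetical sketch: probe a loader for plugin availability instead of
# importing the plugin directly and letting ImportError propagate.
def detect_support(loader_get, name):
    """Return True if the loader can resolve `name`, False otherwise."""
    try:
        plugin = loader_get(name, class_only=True)
    except Exception:
        # A module-level `from ansible.plugins.connection... import ...`
        # would instead blow up here at import time, for every user --
        # not just those who actually use this connection type.
        return False
    return plugin is not None


# A loader that knows the plugin vs. one that does not:
assert detect_support(lambda n, **kw: object(), 'kubectl') is True
assert detect_support(lambda n, **kw: None, 'kubectl') is False
```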

@Nihlus Nihlus force-pushed the fix-connection-loader branch from 476754b to 9a01aef Compare January 12, 2025 19:25
@Nihlus (Contributor Author) commented Jan 12, 2025

I noticed another issue with how get_with_context is handled: Ansible's get function on the loader is actually a wrapper over get_with_context these days, so if something were to access an action via get_with_context directly, Mitogen's mixin would not be applied. I added a fix for that as well.

@Nihlus Nihlus force-pushed the fix-connection-loader branch from b03d18b to 0f8575d Compare January 12, 2025 21:16
@Nihlus (Contributor Author) commented Jan 12, 2025

I'm currently working on tracking down the "Could not recover task_vars" issue. From what I've gathered so far, it happens when the network_cli connection (which is not Mitogen) drops to the local connection (which is Mitogen) to run some raw commands. At that point, the action on the stack (gather_facts) has determined that it is not running under Mitogen and transmuted itself back to the non-mixin variant. However, we've not discovered the task vars yet, and we don't have a fallback to handle this like we do with _execute_meta.

I've tried forcing it through in various ways without any luck. It all boils down to either the action on the stack not being a Mitogen action with a mixin while we're operating on the local Mitogen connection, or the Python interpreter not having been discovered. Any input on that would be appreciated :)

@moreati (Member) commented Jan 13, 2025

Notes to self

  • Only ansible_mitogen.plugins.connections.mitogen_ssh has the `from ... import DOCUMENTATION as ...` kludge. Others probably need it, or something better.
  • `ansible_mitogen.plugins.connections.mitogen_kubectl.Connection.get_extra_args()` does task var shenanigans. Probably related to the above.
  • Mitogen's test coverage of connection plugins is very sparse outside ssh and local.
  • There's Ansible < 2.10 code in `ansible_mitogen.plugins.connection.mitogen_kubectl.Connection.get_extra_args()`. It should probably be deleted.
  • mitogen_kubectl and mitogen_ssh are the only connection plugins that import ansible_mitogen.loaders.
  • Importing ansible_mitogen.loaders runs assert_supported_release() as a side effect. This is arguably also a kludge; ansible_mitogen.loaders was chosen for this purpose because it is very early in the order of imports when Ansible is used with Mitogen. Removing ansible_mitogen.loaders from connection plugins is probably fine, because it is also imported by the strategy plugins. Otherwise the check could be moved elsewhere, or eliminated (RFC: Cost/benefit of Ansible version constraint "Your Ansible version (%s) is too recent ...", #1218).
$ ag kubectl tests/ansible                  
tests/ansible/integration/transport/all.yml
2:- include_playbook: kubectl.yml
4:    - kubectl

tests/ansible/integration/transport/kubectl.yml
45:        ansible_connection: "kubectl"
50:- name: "Test kubectl connection (default strategy)"
73:            ansible_kubectl_container: python3
94:- name: "Test kubectl connection (mitogen strategy)"
117:            ansible_kubectl_container: python3

tests/ansible/integration/stub_connections/all.yml
1:- import_playbook: kubectl.yml

tests/ansible/integration/stub_connections/kubectl.yml
2:- name: integration/stub_connections/kubectl.yml
14:      ansible_connection: kubectl
15:      mitogen_kubectl_path: stub-kubectl.py
20:      - out.env.THIS_IS_STUB_KUBECTL == '1'
24:    - kubectl

tests/ansible/integration/stub_connections/README.md
5:tools (kubectl etc.) to verify arguments passed by Ansible to Mitogen and

@moreati (Member) commented Jan 13, 2025

That works, though I don't think it'd work as well for the kubectl connection plugin

Agreed. I like from ansible.plugins.connection... import Connection as ... for SSH, but kubectl will need something else.

if something were to access an action via get_with_context directly, Mitogen's mixin would not be applied. I added a fix to that as well

Good spot.

Any input on that would be appreciated

Looking, but no promises. We're in deep, at the intersection of many tight/accidental couplings. I've only recently got my head around this bit of the code.

Comment on lines +102 to +103
action_loader__get_with_context = action_loader.get_with_context
connection_loader__get_with_context = connection_loader.get_with_context
Member:

+1 to changing the names. `...__get = ...get_with_context` has long bugged me.

Contributor Author:

Yeah, I saw that and thought "like hell I'm leaving it like that", hah.

@@ -48,6 +48,8 @@
import ansible.template
import ansible.utils.sentinel

from ansible.plugins.loader import get_with_context_result
@moreati (Member), Jan 13, 2025:

Personal preference, use absolute imports when possible. It makes it easier to distinguish what belongs to ansible.* and what belongs to ansible_mitogen.*, or mitogen.*

Contributor Author:

Gotcha. I had some issues getting an absolute import working in this file due to other imports masking things. I tried a few more ways just now but didn't get anywhere.

Member:

I tried some variations in moreati@25da0f7. We could eliminate this import entirely by using

result = ansible_mitogen.loaders.action_loader__get_with_context(...)
...
return result._replace(object=adorned_klass)

It loses a little clarity. I'm in favour of the tradeoff, but ready to be convinced otherwise.

Contributor Author:

I'm not very keen on this alternative, for a few reasons: clarity, as you mentioned, but also the fact that we'd be relying on _replace, which is a protected attribute. On top of that, we'd have leaked the name of the tuple type into our code, meaning that if Ansible ever changes it, our code breaks and needs an update.

Member:

I'm not very keen on this alternative

On reflection I agree. I'd still like to use an absolute import. I'll investigate.

Member:

Absolute import seems to have worked in 8273eac

@moreati (Member) commented Jan 15, 2025

I'm currently working on tracking down the "Could not recover task_vars" issue

Which issue is this? Do you have a playbook or other way to reproduce it?

@moreati (Member) commented Jan 15, 2025

I'm currently working on tracking down the "Could not recover task_vars" issue

Which issue is this? Do you have a playbook or other way to reproduce it?

Probably a reference to one or more of the issues (#766, #662, #342) that this PR (derived from #976 by @markafarrell) may solve. There are repro instructions in https://github.com/markafarrell/mitogen-issue-766-repro, mentioned in #976 (comment).

@Nihlus (Contributor Author) commented Jan 15, 2025

That's correct. Mitogen is currently unable to handle network_cli connections properly and fails during task var recovery when executing low-level local commands.

@moreati (Member) commented Jan 16, 2025

@Nihlus I've updated Mark's reproduction of #766 and added your branch: https://github.com/moreati/mitogen-issue-766-repro.

  • linear strategy succeeds
  • mitogen_linear strategy using Mitogen 0.3.20 fails with AttributeError: 'get_with_context_result' object has no attribute '_create_control_path'
  • mitogen_linear strategy using your branch rev 0f8575d succeeds
Full tox run
alex@uv2404:~/src/mitogen-issue-766-repro$ tox
prep: commands[0]> ansible-galaxy collection install --force --collections-path collections -r requirements.yml
Starting galaxy collection install process
Process install dependency map
Starting collection install process
Downloading https://galaxy.ansible.com/api/v3/plugin/ansible/content/published/collections/artifacts/ansible-netcommon-7.1.0.tar.gz to /home/alex/.ansible/tmp/ansible-local-20614fhi9yvpp/tmpnrmskcyv/ansible-netcommon-7.1.0-q2fs3m7e
Installing 'ansible.netcommon:7.1.0' to '/home/alex/src/mitogen-issue-766-repro/collections/ansible_collections/ansible/netcommon'
ansible.netcommon:7.1.0 was installed successfully
'ansible.utils:5.1.2' is already installed, skipping.
prep: OK ✔ in 2.73 seconds
py3.13-ansible8-linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible8-linear: OK ✔ in 6.84 seconds
py3.13-ansible9-linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible9-linear: OK ✔ in 6.84 seconds
py3.13-ansible10-linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible10-linear: OK ✔ in 6.86 seconds
py3.13-ansible11-linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible11-linear: OK ✔ in 6.81 seconds
py3.13-ansible8-mitogen-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'get_with_context_result' object has no attribute '_create_control_path'
fatal: [netconf]: FAILED! => {"msg": "Unexpected failure during module execution: 'get_with_context_result' object has no attribute '_create_control_path'", "stdout": ""}

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=3    changed=2    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0   

py3.13-ansible8-mitogen-mitogen_linear: exit 2 (7.55 seconds) /home/alex/src/mitogen-issue-766-repro> ansible-playbook playbook.yml pid=22122
py3.13-ansible8-mitogen-mitogen_linear: FAIL ✖ in 7.56 seconds
py3.13-ansible8-mitogen_pr1215-mitogen_linear: install_deps> python -I -m pip install 'ansible~=8.0' ncclient git+https://github.com/Nihlus/mitogen.git@fix-connection-loader
py3.13-ansible8-mitogen_pr1215-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible8-mitogen_pr1215-mitogen_linear: OK ✔ in 9.57 seconds
py3.13-ansible9-mitogen-mitogen_linear: install_deps> python -I -m pip install 'ansible~=9.0' mitogen ncclient
py3.13-ansible9-mitogen-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'get_with_context_result' object has no attribute '_create_control_path'
fatal: [netconf]: FAILED! => {"msg": "Unexpected failure during module execution: 'get_with_context_result' object has no attribute '_create_control_path'", "stdout": ""}

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=3    changed=2    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0   

py3.13-ansible9-mitogen-mitogen_linear: exit 2 (7.45 seconds) /home/alex/src/mitogen-issue-766-repro> ansible-playbook playbook.yml pid=22864
py3.13-ansible9-mitogen-mitogen_linear: FAIL ✖ in 18.84 seconds
py3.13-ansible9-mitogen_pr1215-mitogen_linear: install_deps> python -I -m pip install 'ansible~=9.0' ncclient git+https://github.com/Nihlus/mitogen.git@fix-connection-loader
py3.13-ansible9-mitogen_pr1215-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible9-mitogen_pr1215-mitogen_linear: OK ✔ in 21.32 seconds
py3.13-ansible10-mitogen-mitogen_linear: install_deps> python -I -m pip install 'ansible~=10.0' mitogen ncclient
py3.13-ansible10-mitogen-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'get_with_context_result' object has no attribute '_create_control_path'
fatal: [netconf]: FAILED! => {"msg": "Unexpected failure during module execution: 'get_with_context_result' object has no attribute '_create_control_path'", "stdout": ""}

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=3    changed=2    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0   

py3.13-ansible10-mitogen-mitogen_linear: exit 2 (7.43 seconds) /home/alex/src/mitogen-issue-766-repro> ansible-playbook playbook.yml pid=23593
py3.13-ansible10-mitogen-mitogen_linear: FAIL ✖ in 19.21 seconds
py3.13-ansible10-mitogen_pr1215-mitogen_linear: install_deps> python -I -m pip install 'ansible~=10.0' ncclient git+https://github.com/Nihlus/mitogen.git@fix-connection-loader
py3.13-ansible10-mitogen_pr1215-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible10-mitogen_pr1215-mitogen_linear: OK ✔ in 22.6 seconds
py3.13-ansible11-mitogen-mitogen_linear: install_deps> python -I -m pip install 'ansible~=11.0' mitogen ncclient
Collecting ansible~=11.0
  Using cached ansible-11.1.0-py3-none-any.whl.metadata (8.0 kB)
Collecting mitogen
  Using cached mitogen-0.3.20-py2.py3-none-any.whl.metadata (2.0 kB)
Collecting ncclient
  Using cached ncclient-0.6.16-py2.py3-none-any.whl
Collecting ansible-core~=2.18.1 (from ansible~=11.0)
  Using cached ansible_core-2.18.1-py3-none-any.whl.metadata (7.7 kB)
Collecting setuptools>0.6 (from ncclient)
  Using cached setuptools-75.8.0-py3-none-any.whl.metadata (6.7 kB)
Collecting paramiko>=1.15.0 (from ncclient)
  Using cached paramiko-3.5.0-py3-none-any.whl.metadata (4.4 kB)
Collecting lxml>=3.3.0 (from ncclient)
  Using cached lxml-5.3.0-cp313-cp313-manylinux_2_28_x86_64.whl.metadata (3.8 kB)
Collecting six (from ncclient)
  Using cached six-1.17.0-py2.py3-none-any.whl.metadata (1.7 kB)
Collecting jinja2>=3.0.0 (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached jinja2-3.1.5-py3-none-any.whl.metadata (2.6 kB)
Collecting PyYAML>=5.1 (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting cryptography (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached cryptography-44.0.0-cp39-abi3-manylinux_2_28_x86_64.whl.metadata (5.7 kB)
Collecting packaging (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached packaging-24.2-py3-none-any.whl.metadata (3.2 kB)
Collecting resolvelib<1.1.0,>=0.5.3 (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached resolvelib-1.0.1-py2.py3-none-any.whl.metadata (4.0 kB)
Collecting bcrypt>=3.2 (from paramiko>=1.15.0->ncclient)
  Using cached bcrypt-4.2.1-cp39-abi3-manylinux_2_28_x86_64.whl.metadata (9.8 kB)
Collecting pynacl>=1.5 (from paramiko>=1.15.0->ncclient)
  Using cached PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl.metadata (8.6 kB)
Collecting cffi>=1.12 (from cryptography->ansible-core~=2.18.1->ansible~=11.0)
  Using cached cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting MarkupSafe>=2.0 (from jinja2>=3.0.0->ansible-core~=2.18.1->ansible~=11.0)
  Using cached MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.0 kB)
Collecting pycparser (from cffi>=1.12->cryptography->ansible-core~=2.18.1->ansible~=11.0)
  Using cached pycparser-2.22-py3-none-any.whl.metadata (943 bytes)
Using cached ansible-11.1.0-py3-none-any.whl (51.4 MB)
Using cached mitogen-0.3.20-py2.py3-none-any.whl (285 kB)
Using cached ansible_core-2.18.1-py3-none-any.whl (2.2 MB)
Using cached lxml-5.3.0-cp313-cp313-manylinux_2_28_x86_64.whl (4.9 MB)
Using cached paramiko-3.5.0-py3-none-any.whl (227 kB)
Using cached setuptools-75.8.0-py3-none-any.whl (1.2 MB)
Using cached six-1.17.0-py2.py3-none-any.whl (11 kB)
Using cached bcrypt-4.2.1-cp39-abi3-manylinux_2_28_x86_64.whl (278 kB)
Using cached cryptography-44.0.0-cp39-abi3-manylinux_2_28_x86_64.whl (4.2 MB)
Using cached jinja2-3.1.5-py3-none-any.whl (134 kB)
Using cached PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (856 kB)
Using cached PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (759 kB)
Using cached resolvelib-1.0.1-py2.py3-none-any.whl (17 kB)
Using cached packaging-24.2-py3-none-any.whl (65 kB)
Using cached cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (479 kB)
Using cached MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (23 kB)
Using cached pycparser-2.22-py3-none-any.whl (117 kB)
Installing collected packages: resolvelib, six, setuptools, PyYAML, pycparser, packaging, mitogen, MarkupSafe, lxml, bcrypt, jinja2, cffi, pynacl, cryptography, paramiko, ansible-core, ncclient, ansible
ERROR: Could not install packages due to an OSError: [Errno 28] No space left on device


py3.13-ansible11-mitogen-mitogen_linear: exit 1 (2.99 seconds) /home/alex/src/mitogen-issue-766-repro> python -I -m pip install 'ansible~=11.0' mitogen ncclient pid=24302
py3.13-ansible11-mitogen-mitogen_linear: FAIL ✖ in 3 seconds
py3.13-ansible11-mitogen_pr1215-mitogen_linear: install_deps> python -I -m pip install 'ansible~=11.0' ncclient git+https://github.com/Nihlus/mitogen.git@fix-connection-loader
Collecting git+https://github.com/Nihlus/mitogen.git@fix-connection-loader
  Cloning https://github.com/Nihlus/mitogen.git (to revision fix-connection-loader) to /tmp/pip-req-build-i7djncds
  Resolved https://github.com/Nihlus/mitogen.git to commit 0f8575d4233614ae94e7e96cb30145aef48bf937
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting ansible~=11.0
  Using cached ansible-11.1.0-py3-none-any.whl.metadata (8.0 kB)
Collecting ncclient
  Using cached ncclient-0.6.16-py2.py3-none-any.whl
Collecting ansible-core~=2.18.1 (from ansible~=11.0)
  Using cached ansible_core-2.18.1-py3-none-any.whl.metadata (7.7 kB)
Collecting setuptools>0.6 (from ncclient)
  Using cached setuptools-75.8.0-py3-none-any.whl.metadata (6.7 kB)
Collecting paramiko>=1.15.0 (from ncclient)
  Using cached paramiko-3.5.0-py3-none-any.whl.metadata (4.4 kB)
Collecting lxml>=3.3.0 (from ncclient)
  Using cached lxml-5.3.0-cp313-cp313-manylinux_2_28_x86_64.whl.metadata (3.8 kB)
Collecting six (from ncclient)
  Using cached six-1.17.0-py2.py3-none-any.whl.metadata (1.7 kB)
Collecting jinja2>=3.0.0 (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached jinja2-3.1.5-py3-none-any.whl.metadata (2.6 kB)
Collecting PyYAML>=5.1 (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting cryptography (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached cryptography-44.0.0-cp39-abi3-manylinux_2_28_x86_64.whl.metadata (5.7 kB)
Collecting packaging (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached packaging-24.2-py3-none-any.whl.metadata (3.2 kB)
Collecting resolvelib<1.1.0,>=0.5.3 (from ansible-core~=2.18.1->ansible~=11.0)
  Using cached resolvelib-1.0.1-py2.py3-none-any.whl.metadata (4.0 kB)
Collecting bcrypt>=3.2 (from paramiko>=1.15.0->ncclient)
  Using cached bcrypt-4.2.1-cp39-abi3-manylinux_2_28_x86_64.whl.metadata (9.8 kB)
Collecting pynacl>=1.5 (from paramiko>=1.15.0->ncclient)
  Using cached PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl.metadata (8.6 kB)
Collecting cffi>=1.12 (from cryptography->ansible-core~=2.18.1->ansible~=11.0)
  Using cached cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting MarkupSafe>=2.0 (from jinja2>=3.0.0->ansible-core~=2.18.1->ansible~=11.0)
  Using cached MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.0 kB)
Collecting pycparser (from cffi>=1.12->cryptography->ansible-core~=2.18.1->ansible~=11.0)
  Using cached pycparser-2.22-py3-none-any.whl.metadata (943 bytes)
Using cached ansible-11.1.0-py3-none-any.whl (51.4 MB)
  Running command git clone --filter=blob:none --quiet https://github.com/Nihlus/mitogen.git /tmp/pip-req-build-i7djncds
  Running command git checkout -b fix-connection-loader --track origin/fix-connection-loader
  Switched to a new branch 'fix-connection-loader'
  branch 'fix-connection-loader' set up to track 'origin/fix-connection-loader'.
ERROR: Could not install packages due to an OSError: [Errno 28] No space left on device


py3.13-ansible11-mitogen_pr1215-mitogen_linear: exit 1 (2.75 seconds) /home/alex/src/mitogen-issue-766-repro> python -I -m pip install 'ansible~=11.0' ncclient git+https://github.com/Nihlus/mitogen.git@fix-connection-loader pid=24321
  prep: OK (2.73=setup[0.02]+cmd[2.71] seconds)
  py3.13-ansible8-linear: OK (6.84=setup[0.01]+cmd[6.83] seconds)
  py3.13-ansible9-linear: OK (6.84=setup[0.00]+cmd[6.83] seconds)
  py3.13-ansible10-linear: OK (6.86=setup[0.00]+cmd[6.86] seconds)
  py3.13-ansible11-linear: OK (6.81=setup[0.00]+cmd[6.81] seconds)
  py3.13-ansible8-mitogen-mitogen_linear: FAIL code 2 (7.56=setup[0.00]+cmd[7.55] seconds)
  py3.13-ansible8-mitogen_pr1215-mitogen_linear: OK (9.57=setup[2.10]+cmd[7.47] seconds)
  py3.13-ansible9-mitogen-mitogen_linear: FAIL code 2 (18.84=setup[11.39]+cmd[7.45] seconds)
  py3.13-ansible9-mitogen_pr1215-mitogen_linear: OK (21.32=setup[13.84]+cmd[7.48] seconds)
  py3.13-ansible10-mitogen-mitogen_linear: FAIL code 2 (19.21=setup[11.78]+cmd[7.43] seconds)
  py3.13-ansible10-mitogen_pr1215-mitogen_linear: OK (22.60=setup[15.16]+cmd[7.44] seconds)
  py3.13-ansible11-mitogen-mitogen_linear: FAIL code 1 (3.00 seconds)
  py3.13-ansible11-mitogen_pr1215-mitogen_linear: FAIL code 1 (2.76 seconds)
  evaluation failed :( (134.99 seconds)
alex@uv2404:~/src/mitogen-issue-766-repro$ ncdu ~
Command 'ncdu' not found, but can be installed with:
sudo apt install ncdu
alex@uv2404:~/src/mitogen-issue-766-repro$ rm -rf .tox/py3.13-ansible11-
py3.13-ansible11-linear/                        py3.13-ansible11-mitogen_pr1215-mitogen_linear/
py3.13-ansible11-mitogen-mitogen_linear/        
alex@uv2404:~/src/mitogen-issue-766-repro$ rm -rf .tox/py3.13-ansible11-mitogen*
alex@uv2404:~/src/mitogen-issue-766-repro$ sudo apt install ncdu
[sudo] password for alex: 
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following NEW packages will be installed:
  ncdu
0 upgraded, 1 newly installed, 0 to remove and 76 not upgraded.
Need to get 50.7 kB of archives.
After this operation, 117 kB of additional disk space will be used.
Get:1 http://gb.archive.ubuntu.com/ubuntu noble/universe amd64 ncdu amd64 1.19-0.1 [50.7 kB]
Fetched 50.7 kB in 0s (2,100 kB/s)
Selecting previously unselected package ncdu.
(Reading database ... 135634 files and directories currently installed.)
Preparing to unpack .../ncdu_1.19-0.1_amd64.deb ...
Unpacking ncdu (1.19-0.1) ...
Setting up ncdu (1.19-0.1) ...
Processing triggers for man-db (2.12.0-4build2) ...
Disabling Ubuntu mode, explicit restart mode configured
Scanning processes...

No services need to be restarted.

No containers need to be restarted.

No user sessions are running outdated binaries.

No VM guests are running outdated hypervisor (qemu) binaries on this host.
alex@uv2404:~/src/mitogen-issue-766-repro$ ncdu ~/
alex@uv2404:~/src/mitogen-issue-766-repro$ tox
prep: commands[0]> ansible-galaxy collection install --force --collections-path collections -r requirements.yml
Starting galaxy collection install process
Process install dependency map
Starting collection install process
Downloading https://galaxy.ansible.com/api/v3/plugin/ansible/content/published/collections/artifacts/ansible-netcommon-7.1.0.tar.gz to /home/alex/.ansible/tmp/ansible-local-24617oj9mygf7/tmpmq_xv6xz/ansible-netcommon-7.1.0-ypryup5n
Installing 'ansible.netcommon:7.1.0' to '/home/alex/src/mitogen-issue-766-repro/collections/ansible_collections/ansible/netcommon'
ansible.netcommon:7.1.0 was installed successfully
'ansible.utils:5.1.2' is already installed, skipping.
prep: OK ✔ in 2.04 seconds
py3.13-ansible8-linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible8-linear: OK ✔ in 6.98 seconds
py3.13-ansible9-linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible9-linear: OK ✔ in 6.86 seconds
py3.13-ansible10-linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible10-linear: OK ✔ in 6.82 seconds
py3.13-ansible11-linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible11-linear: OK ✔ in 6.86 seconds
py3.13-ansible8-mitogen-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'get_with_context_result' object has no attribute '_create_control_path'
fatal: [netconf]: FAILED! => {"msg": "Unexpected failure during module execution: 'get_with_context_result' object has no attribute '_create_control_path'", "stdout": ""}

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=3    changed=2    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0   

py3.13-ansible8-mitogen-mitogen_linear: exit 2 (7.49 seconds) /home/alex/src/mitogen-issue-766-repro> ansible-playbook playbook.yml pid=26148
py3.13-ansible8-mitogen-mitogen_linear: FAIL ✖ in 7.49 seconds
py3.13-ansible8-mitogen_pr1215-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible8-mitogen_pr1215-mitogen_linear: OK ✔ in 7.57 seconds
py3.13-ansible9-mitogen-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'get_with_context_result' object has no attribute '_create_control_path'
fatal: [netconf]: FAILED! => {"msg": "Unexpected failure during module execution: 'get_with_context_result' object has no attribute '_create_control_path'", "stdout": ""}

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=3    changed=2    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0   

py3.13-ansible9-mitogen-mitogen_linear: exit 2 (7.49 seconds) /home/alex/src/mitogen-issue-766-repro> ansible-playbook playbook.yml pid=26799
py3.13-ansible9-mitogen-mitogen_linear: FAIL ✖ in 7.5 seconds
py3.13-ansible9-mitogen_pr1215-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible9-mitogen_pr1215-mitogen_linear: OK ✔ in 7.53 seconds
py3.13-ansible10-mitogen-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'get_with_context_result' object has no attribute '_create_control_path'
fatal: [netconf]: FAILED! => {"msg": "Unexpected failure during module execution: 'get_with_context_result' object has no attribute '_create_control_path'", "stdout": ""}

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=3    changed=2    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0   

py3.13-ansible10-mitogen-mitogen_linear: exit 2 (7.49 seconds) /home/alex/src/mitogen-issue-766-repro> ansible-playbook playbook.yml pid=27425
py3.13-ansible10-mitogen-mitogen_linear: FAIL ✖ in 7.49 seconds
py3.13-ansible10-mitogen_pr1215-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

py3.13-ansible10-mitogen_pr1215-mitogen_linear: OK ✔ in 7.6 seconds
py3.13-ansible11-mitogen-mitogen_linear: pip-24.3.1-py3-none-any.whl already present in /home/alex/.local/share/virtualenv/wheel/3.13/embed/3/pip.json
py3.13-ansible11-mitogen-mitogen_linear: install_deps> python -I -m pip install 'ansible~=11.0' mitogen ncclient
py3.13-ansible11-mitogen-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AttributeError: 'get_with_context_result' object has no attribute '_create_control_path'
fatal: [netconf]: FAILED! => {"msg": "Unexpected failure during module execution: 'get_with_context_result' object has no attribute '_create_control_path'", "stdout": ""}

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=3    changed=2    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0   

py3.13-ansible11-mitogen-mitogen_linear: exit 2 (7.33 seconds) /home/alex/src/mitogen-issue-766-repro> ansible-playbook playbook.yml pid=28105
py3.13-ansible11-mitogen-mitogen_linear: FAIL ✖ in 20.57 seconds
py3.13-ansible11-mitogen_pr1215-mitogen_linear: pip-24.3.1-py3-none-any.whl already present in /home/alex/.local/share/virtualenv/wheel/3.13/embed/3/pip.json
py3.13-ansible11-mitogen_pr1215-mitogen_linear: install_deps> python -I -m pip install 'ansible~=11.0' ncclient git+https://github.com/Nihlus/mitogen.git@fix-connection-loader
py3.13-ansible11-mitogen_pr1215-mitogen_linear: commands[0]> ansible-playbook playbook.yml

PLAY [Get running configuration and state data] **********************************************************

TASK [Start container] ***********************************************************************************
changed: [netconf -> localhost]

TASK [Wait for container] ********************************************************************************
ok: [netconf -> localhost]

TASK [Get running configuration and state data] **********************************************************
ok: [netconf]

TASK [Cleanup container] *********************************************************************************
changed: [netconf -> localhost]

PLAY RECAP ***********************************************************************************************
netconf                    : ok=4    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

  prep: OK (2.04=setup[0.03]+cmd[2.01] seconds)
  py3.13-ansible8-linear: OK (6.98=setup[0.00]+cmd[6.97] seconds)
  py3.13-ansible9-linear: OK (6.86=setup[0.00]+cmd[6.86] seconds)
  py3.13-ansible10-linear: OK (6.82=setup[0.00]+cmd[6.82] seconds)
  py3.13-ansible11-linear: OK (6.86=setup[0.00]+cmd[6.85] seconds)
  py3.13-ansible8-mitogen-mitogen_linear: FAIL code 2 (7.49=setup[0.00]+cmd[7.49] seconds)
  py3.13-ansible8-mitogen_pr1215-mitogen_linear: OK (7.57=setup[0.01]+cmd[7.56] seconds)
  py3.13-ansible9-mitogen-mitogen_linear: FAIL code 2 (7.50=setup[0.01]+cmd[7.49] seconds)
  py3.13-ansible9-mitogen_pr1215-mitogen_linear: OK (7.53=setup[0.01]+cmd[7.53] seconds)
  py3.13-ansible10-mitogen-mitogen_linear: FAIL code 2 (7.49=setup[0.01]+cmd[7.49] seconds)
  py3.13-ansible10-mitogen_pr1215-mitogen_linear: OK (7.60=setup[0.00]+cmd[7.60] seconds)
  py3.13-ansible11-mitogen-mitogen_linear: FAIL code 2 (20.57=setup[13.23]+cmd[7.33] seconds)
  py3.13-ansible11-mitogen_pr1215-mitogen_linear: OK (22.42=setup[15.02]+cmd[7.40] seconds)
  evaluation failed :( (117.81 seconds)
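The AttributeError repeated in the failing runs above comes from Ansible >= 2.10, where connection_loader.get_with_context returns a get_with_context_result named tuple rather than the plugin instance itself, so attribute lookups like _create_control_path hit the tuple wrapper. A minimal sketch of the unwrapping (the namedtuple here is a stand-in for ansible.plugins.loader.get_with_context_result; field names are assumed from the traceback, and this is not the PR's literal diff):

```python
from collections import namedtuple

# Stand-in for ansible.plugins.loader.get_with_context_result
# (Ansible >= 2.10); field names assumed from the traceback above.
get_with_context_result = namedtuple(
    'get_with_context_result', ['object', 'plugin_load_context'])


def unwrap_connection(result):
    """Return the connection plugin whether the loader handed back the
    plugin directly (pre-2.10) or wrapped it in a named tuple (2.10+)."""
    if isinstance(result, tuple) and hasattr(result, 'object'):
        return result.object
    return result
```

With unwrapping in place, subsequent attribute access lands on the connection plugin itself instead of on the tuple wrapper.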

@moreati
Member

moreati commented Jan 16, 2025

That's correct. Mitogen is currently unable to handle network_cli connections properly and fails during task var recovery when executing low-level local commands.

Can you post a playbook or other reproduction of the task_vars error? I'm not seeing it.

@Nihlus
Contributor Author

Nihlus commented Jan 16, 2025

I pulled that reproduction repo - notably, it uses a netconf connection and not a network_cli connection as is the case in #766, #662, and #342. network_cli is a more targeted connection than netconf and is specifically intended for appliances with a terminal-like user interface accessible over SSH (managed switches, mostly).

I noticed that you can reproduce the error with the linked repo if you enable fact gathering - then, it also fails with the "could not recover task_vars" message.

@Nihlus
Contributor Author

Nihlus commented Jan 16, 2025

I tested some more with my real playbook and found that I can actually execute network_cli modules just fine with mitogen using my fixes - it's just the fact gathering that fails for "pure" network playbooks. I also have a failure case with the same error output in a zabbix module - will dig further.

@Nihlus
Contributor Author

Nihlus commented Jan 16, 2025

Found it - it's a module running on a httpapi connection, not network_cli. Same presentation of the error and the same failure mode - the (non-mitogen) connection is trying to run _low_level_execute_command via _make_tmp_path on the local (mitogen) connection, but the action on the stack has been transmuted back to a non-mitogen action and thus it fails.

@Nihlus
Contributor Author

Nihlus commented Jan 16, 2025

Further digging has me thinking this is all due to the messed-up way interpreter discovery is handled... going to see if I can figure out a better way of doing it.

…onnection in the play is not a Mitogen connection.
@Nihlus
Contributor Author

Nihlus commented Jan 16, 2025

Got it. The core issue was that while network_cli was not a Mitogen connection, the local connection it created (and then used) was. We can't allow a mixed stack of connections due to the way type transmutation works with actions, since actions instantiated against the top-level connection would then be unusable with the Mitogen-enabled connection lower in the stack.

The latest change adds a check to not redirect connections if the play context's connection is not also a Mitogen connection. I'm now able to run both network_cli and httpapi connections successfully.
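As a rough illustration of that check (all names here are hypothetical; the actual change lives in ansible_mitogen's connection handling and may use different predicates):

```python
def should_use_mitogen_local(play_context):
    """Only substitute Mitogen's local connection when the play's
    top-level connection is itself Mitogen-enabled, keeping the
    connection/action stack homogeneous.

    `play_context` and its `transport` attribute mirror Ansible's
    PlayContext; the 'mitogen_' prefix test is an illustrative
    stand-in for the real predicate.
    """
    transport = getattr(play_context, 'transport', '')
    return transport.startswith('mitogen_')
```

Under this rule a network_cli or httpapi play keeps a plain local connection for its low-level commands, so actions instantiated against the top-level connection stay usable throughout.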

moreati added a commit to moreati/mitogen-issue-766-repro that referenced this pull request Jan 17, 2025
@moreati
Member

moreati commented Jan 17, 2025

I pulled that reproduction repo - notably, it uses a netconf connection and not a network_cli connection as is the case in #766

My repo (https://github.com/moreati/mitogen-issue-766-repro) follows the lead set by the author of #766 in https://github.com/markafarrell/mitogen-issue-766-repro.

I noticed that you can reproduce the error with the linked repo if you enable fact gathering - then, it also fails with the "could not recover task_vars" message.

Confirmed using 0f8575d and moreati/mitogen-issue-766-repro@d4b1531; confirmed fixed by c1ac3ab.

  prep: OK (15.06=setup[12.90]+cmd[2.16] seconds)
  py3.13-ansible8-linear: OK (22.91=setup[15.16]+cmd[7.75] seconds)
  py3.13-ansible9-linear: OK (21.19=setup[13.59]+cmd[7.60] seconds)
  py3.13-ansible10-linear: OK (21.10=setup[13.53]+cmd[7.57] seconds)
  py3.13-ansible11-linear: OK (21.37=setup[13.75]+cmd[7.61] seconds)
  py3.13-ansible8-mitogen-mitogen_linear: FAIL code 2 (16.58=setup[16.22]+cmd[0.36] seconds)
  py3.13-ansible8-mitogen_pr1215-mitogen_linear: OK (26.46=setup[18.11]+cmd[8.35] seconds)
  py3.13-ansible9-mitogen-mitogen_linear: FAIL code 2 (14.23=setup[13.91]+cmd[0.32] seconds)
  py3.13-ansible9-mitogen_pr1215-mitogen_linear: OK (24.64=setup[16.29]+cmd[8.35] seconds)
  py3.13-ansible10-mitogen-mitogen_linear: FAIL code 2 (14.56=setup[14.22]+cmd[0.34] seconds)
  py3.13-ansible10-mitogen_pr1215-mitogen_linear: OK (24.19=setup[15.82]+cmd[8.37] seconds)
  py3.13-ansible11-mitogen-mitogen_linear: FAIL code 2 (13.93=setup[13.64]+cmd[0.29] seconds)
  py3.13-ansible11-mitogen_pr1215-mitogen_linear: OK (24.07=setup[15.92]+cmd[8.14] seconds)
  evaluation failed :( (260.34 seconds)

Found it - it's a module running on a httpapi connection, not network_cli

May have bearing on #1086 or #1094.

Further digging has me thinking this is all due to the messed-up way interpreter discovery is handled... going to see if I can figure out a better way of doing it.

That is a Pandora's box I'm still working up to.

@moreati
Member

moreati commented Jan 17, 2025

I've rerun a CI job that failed due to #1185. All existing tests are now passing. I would like to add new tests to guard against regressions. https://github.com/moreati/mitogen-issue-766-repro may serve as a template.

@moreati
Member

moreati commented Jan 17, 2025

Added sysrepo/sysrepo-netopeer2 to GitHub Container Registry

mitogen git:(pr1215) ✗ skopeo  --insecure-policy copy docker://docker.io/sysrepo/sysrepo-netopeer2:latest docker://ghcr.io/mitogen-hq/sysrepo-netopeer2:latest
Getting image source signatures
Copying blob 69c7818ed7f4 done   | 
Copying blob fee5db0ff82f done   | 
Copying blob 4639459be202 done   | 
Copying blob fc878cd0a91c done   | 
Copying blob d51af753c3d3 done   | 
Copying blob 6154df8ff988 done   | 
Copying blob 95d253b40401 done   | 
Copying blob 8eba48fcdc7f done   | 
Copying blob 3d97b7f99a05 done   | 
Copying blob 9be2789ffa20 done   | 
Copying blob 9b0152ba3415 done   | 
Copying blob 456e970a28f8 done   | 
Copying blob 344aedc9d9b9 done   | 
Copying blob c2b2cc7242a0 done   | 
Copying blob a9fcf187b886 done   | 
Copying blob 43bd915ab2e3 done   | 
Copying blob 4fa0252bc3d9 done   | 
Copying blob 80587ec04044 done   | 
Copying blob 198a6f97edd0 done   | 
Copying blob 399fb1102f3f done   | 
Copying config 976d1dcf6c done   | 
Writing manifest to image destination

@Nihlus
Contributor Author

Nihlus commented Jan 17, 2025

Yeah, #1086 would also be closed by this fix, making #1094 obsolete.

@Nihlus Nihlus force-pushed the fix-connection-loader branch from 721be87 to 9982198 on January 17, 2025 19:42
@Nihlus
Contributor Author

Nihlus commented Jan 17, 2025

I'm not super familiar with how the tests are set up and what the requirements are, so I added a sketch for a regression test that you can review. I'm going to need a little bit of handholding here, I think 😅

@moreati
Member

moreati commented Jan 18, 2025

TASK [Start container detach=True, name=sysprep, recreate=True, published_ports=[u'{{ ansible_port }}:830'], auto_remove=True, image=ghcr.io/mitogen-hq/sysrepo-netopeer2:latest] ***
Friday 17 January 2025  19:45:57 +0000 (0:00:00.427)       0:00:51.828 ******** 
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SystemError: Parent module 'ansible_collections.containers.podman.plugins' not loaded, cannot perform relative import
fatal: [localhost -> localhost]: FAILED! => changed=false 
  module_stderr: |-
    Traceback (most recent call last):
      File "master:/home/runner/work/mitogen/mitogen/ansible_mitogen/runner.py", line 1039, in _run
        self._run_code(code, mod)
      File "master:/home/runner/work/mitogen/mitogen/ansible_mitogen/runner.py", line 1005, in _run_code
        exec('exec code in vars(mod)')
      File "<string>", line 1, in <module>
      File "master:/home/runner/work/mitogen/mitogen/.tox/py27-mode_ansible-ansible2.10/lib/python2.7/site-packages/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 892, in <module>
    SystemError: Parent module 'ansible_collections.containers.podman.plugins' not loaded, cannot perform relative import
  module_stdout: ''
  msg: |-
    MODULE FAILURE
    See stdout/stderr for the exact error
  rc: 1

-- https://github.com/mitogen-hq/mitogen/actions/runs/12835765981/job/35795929770?pr=1215

Looks similar to #826. I'll try some variations (e.g. https://github.com/mitogen-hq/mitogen/blob/master/tests/ansible/regression/issue_655__wait_for_connection_error.yml) or with the Docker module(s) instead. The main reason I used podman in https://github.com/moreati/mitogen-issue-766-repro was an annoying behaviour of the Docker Snap on Ubuntu https://stackoverflow.com/questions/71477749/error-response-from-daemon-cannot-kill-container-permission-denied-how-to-kil.

Successfully merging this pull request may close these issues.

network_cli connections failing with "could not recover task_vars"