Labels: C: Ansible, P: default, affects-4.3, diagnosed, pr submitted
Description
Qubes OS release
4.3
Brief summary
The current qubes_proxy strategy assumes localhost is the host name used to apply a play to dom0. But one can also specify dom0, and without the qubes_proxy strategy that works.
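What seems to happen, as a hedged reading of the traceback below (the names app and host_name here are mine, not taken from qubes_proxy.py): the strategy applies the play directly only for localhost, and treats every other host as a managed qube whose management_dispvm property it reads, a property dom0 does not have.

```python
# Hedged sketch of the code path suggested by the traceback below;
# app and host_name are illustrative names, not from qubes_proxy.py.
import qubesadmin

app = qubesadmin.Qubes()
host_name = "dom0"  # the play's host

if host_name == "localhost":
    pass  # qubes_proxy applies the play directly in dom0 (this case works)
else:
    # Any other host is treated as a managed qube: qubes_proxy starts a
    # management DispVM from the qube's management_dispvm property...
    vm = app.domains[host_name]
    template = vm.management_dispvm  # ...but dom0 has no such property
```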
Steps to reproduce
```yaml
- hosts: dom0
  gather_facts: no
  tasks:
    - name: Create testvm
      qubesos:
        guest: testvm
        label: red
        state: present
        template: fedora-42-xfce
        properties:
          netvm: sys-firewall
```
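For completeness, a hedged sketch of how the strategy is typically wired up; the plugin path is the one shown in the traceback below, the rest is standard Ansible configuration and may differ on your install:

```ini
; ansible.cfg (sketch): enable the qubes_proxy strategy plugin.
; The plugin path matches the traceback; adjust for your system.
[defaults]
strategy_plugins = /usr/share/ansible/plugins/strategy
strategy = qubes_proxy
```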
Expected behavior
testvm is created.
Or, failing that, a clear error telling me to use a different host name (see the additional information section).
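For reference, the same play targeted at localhost, which the summary above says the strategy does accept:

```yaml
# Workaround implied by the summary: target localhost instead of dom0.
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Create testvm
      qubesos:
        guest: testvm
        label: red
        state: present
        template: fedora-42-xfce
        properties:
          netvm: sys-firewall
```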
Actual behavior
It complains about dom0 not having a management_dispvm property:
```text
PLAY [dom0] **********************************************************************************************************************************************************************************
[ERROR]: Invalid property 'management_dispvm' of dom0
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/lib64/python3.13/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
                    ~~~~^^^^^^^^^^^^^^^
  File "/usr/share/ansible/plugins/strategy/qubes_proxy.py", line 57, in run_play_executor
    return QubesPlayExecutor(iterator, play_context).run()
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/usr/share/ansible/plugins/strategy/qubes_proxy.py", line 411, in run
    dispvm = self._start_mgmt_disp_vm()
  File "/usr/share/ansible/plugins/strategy/qubes_proxy.py", line 390, in _start_mgmt_disp_vm
    template=self.vm.management_dispvm,
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/qubesadmin/base.py", line 232, in __getattr__
    property_str = self.qubesd_call(
        self._method_dest,
        self._method_prefix + 'Get',
        item,
        None)
  File "/usr/lib/python3.13/site-packages/qubesadmin/base.py", line 76, in qubesd_call
    return self.app.qubesd_call(dest, method, arg, payload,
           ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                                payload_stream)
                                ^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/qubesadmin/app.py", line 874, in qubesd_call
    return self._parse_qubesd_response(return_data)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/qubesadmin/base.py", line 111, in _parse_qubesd_response
    raise exc_class(format_string, *args)
qubesadmin.exc.QubesNoSuchPropertyError: Invalid property 'management_dispvm' of dom0
"""

The above exception was the direct cause of the following exception:

qubesadmin.exc.QubesNoSuchPropertyError: Invalid property 'management_dispvm' of dom0
```
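The underlying error can be reproduced outside Ansible; a minimal sketch, run in dom0, using the qubesadmin client the traceback goes through:

```python
# Minimal reproduction of the underlying error, outside Ansible (run in dom0).
import qubesadmin

app = qubesadmin.Qubes()
dom0 = app.domains["dom0"]
# dom0 does not expose management_dispvm, so this attribute access raises
# qubesadmin.exc.QubesNoSuchPropertyError, as in the traceback above.
print(dom0.management_dispvm)
```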
Additional information
To be honest, I'm not sure the expected behavior above is correct. Maybe only localhost should be accepted? While intuitively both names should behave the same inside dom0, that stops being obvious if Ansible is called from a management VM (should some future version of the qubes_proxy strategy support that case too). And in that case, I'm not sure which name is the correct one...
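One possible direction, purely as a hypothetical sketch (none of these names exist in qubes_proxy.py): decide locality by comparing the play's host against a set of aliases for the domain Ansible runs in, rather than hard-coding localhost.

```python
# Hypothetical sketch only; not the actual qubes_proxy implementation.
# When running in dom0, both aliases below are safe. When running from a
# management qube (the future case mentioned above), "dom0" would no longer
# be a local alias, which is exactly the ambiguity this report raises.
LOCAL_ALIASES = {"localhost", "dom0"}

def is_local(host_name: str) -> bool:
    """Apply the play directly instead of via a management DispVM?"""
    return host_name in LOCAL_ALIASES
```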