Mirror of https://github.com/ansible-collections/community.general.git (synced 2026-05-01 10:53:20 +00:00)

Compare commits — 46 commits
| SHA1 |
|---|
| c512b789cb |
| ec23171586 |
| 1704f947e3 |
| 80c7fc2d12 |
| 5b1bb61b9e |
| 56e00efcba |
| ef42314714 |
| 9a6eb4e028 |
| cf4b814c2d |
| f47fced4ca |
| 370fa9814a |
| 7f60b1f2dd |
| bbbc98a751 |
| 233743f2fe |
| 8539c534e3 |
| eadf1320df |
| 1242dff77f |
| 2ce82ce1fa |
| f3aab7a5b8 |
| b3d79d728e |
| ecd6bca049 |
| 84c883e854 |
| 91c8d6badc |
| 3cf6a67f74 |
| 3593d9c17c |
| b0910d6a47 |
| 5c6053bf79 |
| d789351195 |
| a4739d8a36 |
| 986c0ab03a |
| 67279e7ca1 |
| f34cd9ddb9 |
| cbab5e887d |
| c0cd4827da |
| 833c21a2bc |
| cd333e6575 |
| a70de88577 |
| 3ce83dcf6a |
| 6448372c04 |
| be09373815 |
| 9a6f7c5c3f |
| 1f94bd4a17 |
| b15a2c52e3 |
| f764685c53 |
| 1d6552e005 |
| 770ae38aff |
CI test matrix:

```diff
@@ -173,8 +173,8 @@ stages:
       targets:
         - name: Alpine 3.18
           test: alpine/3.18
         # - name: Fedora 38
         #   test: fedora/38
         # - name: Fedora 39
         #   test: fedora/39
         - name: Ubuntu 22.04
           test: ubuntu/22.04
       groups:
@@ -189,8 +189,8 @@ stages:
       targets:
         - name: macOS 13.2
           test: macos/13.2
         - name: RHEL 9.2
           test: rhel/9.2
         - name: RHEL 9.3
           test: rhel/9.3
         - name: FreeBSD 13.2
           test: freebsd/13.2
       groups:
@@ -207,6 +207,8 @@ stages:
       targets:
         #- name: macOS 13.2
         #  test: macos/13.2
         - name: RHEL 9.2
           test: rhel/9.2
         - name: RHEL 8.8
           test: rhel/8.8
         #- name: FreeBSD 13.2
@@ -229,10 +231,10 @@ stages:
           test: rhel/8.7
         - name: RHEL 7.9
           test: rhel/7.9
         - name: FreeBSD 13.1
           test: freebsd/13.1
         - name: FreeBSD 12.4
           test: freebsd/12.4
         # - name: FreeBSD 13.1
         #   test: freebsd/13.1
         # - name: FreeBSD 12.4
         #   test: freebsd/12.4
       groups:
         - 1
         - 2
@@ -265,8 +267,8 @@ stages:
     parameters:
       testFormat: devel/linux/{0}
       targets:
         - name: Fedora 38
           test: fedora38
         - name: Fedora 39
           test: fedora39
         - name: Ubuntu 20.04
           test: ubuntu2004
         - name: Ubuntu 22.04
@@ -285,6 +287,8 @@ stages:
     parameters:
       testFormat: 2.16/linux/{0}
       targets:
         - name: Fedora 38
           test: fedora38
         - name: openSUSE 15
           test: opensuse15
       groups:
@@ -315,8 +319,8 @@ stages:
     parameters:
       testFormat: 2.14/linux/{0}
       targets:
         - name: Fedora 36
           test: fedora36
         - name: Alpine 3
           test: alpine3
       groups:
         - 1
         - 2
```
.github/BOTMETA.yml (vendored) — 3 changed lines:

```diff
@@ -642,6 +642,7 @@ files:
     maintainers: bregman-arie
   $modules/ipa_:
     maintainers: $team_ipa
+    ignore: fxfitz
   $modules/ipbase_info.py:
     maintainers: dominikkukacka
   $modules/ipa_pwpolicy.py:
@@ -1433,7 +1434,7 @@ macros:
   team_gitlab: Lunik Shaps marwatk waheedi zanssa scodeman metanovii sh0shin nejch lgatellier suukit
   team_hpux: bcoca davx8342
   team_huawei: QijunPan TommyLike edisonxiang freesky-edward hwDCN niuzhenguo xuxiaowei0512 yanzhangi zengchen1024 zhongjun2
-  team_ipa: Akasurde Nosmoht fxfitz justchris1
+  team_ipa: Akasurde Nosmoht justchris1
   team_jboss: Wolfant jairojunior wbrefvem
   team_keycloak: eikef ndclt mattock
   team_linode: InTheCloudDan decentral1se displague rmcintosh Charliekenney23 LBGarber
```
.github/workflows/codeql-analysis.yml (vendored) — 4 changed lines:

```diff
@@ -28,9 +28,9 @@ jobs:

       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@v2
+        uses: github/codeql-action/init@v3
         with:
           languages: python

       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@v2
+        uses: github/codeql-action/analyze@v3
```
Release notes — entries added for v7.5.2 and v7.5.3:

```diff
@@ -6,6 +6,54 @@ Community General Release Notes

 This changelog describes changes after version 6.0.0.

+v7.5.3
+======
+
+Release Summary
+---------------
+
+Regular bugfix release.
+
+Bugfixes
+--------
+
+- keycloak_identity_provider - ``mappers`` processing was not idempotent if the mappers configuration list had not been sorted by name (in ascending order). Fix resolves the issue by sorting mappers in the desired state using the same key which is used for obtaining existing state (https://github.com/ansible-collections/community.general/pull/7418).
+- keycloak_identity_provider - it was not possible to reconfigure (add, remove) ``mappers`` once they were created initially. Removal was ignored, adding new ones resulted in dropping the pre-existing unmodified mappers. Fix resolves the issue by supplying correct input to the internal update call (https://github.com/ansible-collections/community.general/pull/7418).
+- keycloak_user - when ``force`` is set, but user does not exist, do not try to delete it (https://github.com/ansible-collections/community.general/pull/7696).
+- statusio_maintenance - fix error caused by incorrectly formed API data payload. Was raising "Failed to create maintenance HTTP Error 400 Bad Request" caused by bad data type for date/time and deprecated dict keys (https://github.com/ansible-collections/community.general/pull/7754).
+
+v7.5.2
+======
+
+Release Summary
+---------------
+
+Regular bugfix release.
+
+Minor Changes
+-------------
+
+- elastic callback plugin - close elastic client to not leak resources (https://github.com/ansible-collections/community.general/pull/7517).
+
+Bugfixes
+--------
+
+- cloudflare_dns - fix Cloudflare lookup of SHFP records (https://github.com/ansible-collections/community.general/issues/7652).
+- interface_files - also consider ``address_family`` when changing ``option=method`` (https://github.com/ansible-collections/community.general/issues/7610, https://github.com/ansible-collections/community.general/pull/7612).
+- irc - replace ``ssl.wrap_socket`` that was removed from Python 3.12 with code for creating a proper SSL context (https://github.com/ansible-collections/community.general/pull/7542).
+- keycloak_* - fix Keycloak API client to quote ``/`` properly (https://github.com/ansible-collections/community.general/pull/7641).
+- keycloak_authz_permission - resource payload variable for scope-based permission was constructed as a string, when it needs to be a list, even for a single item (https://github.com/ansible-collections/community.general/issues/7151).
+- log_entries callback plugin - replace ``ssl.wrap_socket`` that was removed from Python 3.12 with code for creating a proper SSL context (https://github.com/ansible-collections/community.general/pull/7542).
+- lvol - test for output messages in both ``stdout`` and ``stderr`` (https://github.com/ansible-collections/community.general/pull/7601, https://github.com/ansible-collections/community.general/issues/7182).
+- ocapi_utils, oci_utils, redfish_utils module utils - replace ``type()`` calls with ``isinstance()`` calls (https://github.com/ansible-collections/community.general/pull/7501).
+- onepassword lookup plugin - field and section titles are now case insensitive when using op CLI version two or later. This matches the behavior of version one (https://github.com/ansible-collections/community.general/pull/7564).
+- pipx module utils - change the CLI argument formatter for the ``pip_args`` parameter (https://github.com/ansible-collections/community.general/issues/7497, https://github.com/ansible-collections/community.general/pull/7506).
+- redhat_subscription - use the D-Bus registration on RHEL 7 only on 7.4 and greater; older versions of RHEL 7 do not have it (https://github.com/ansible-collections/community.general/issues/7622, https://github.com/ansible-collections/community.general/pull/7624).
+- terraform - fix multiline string handling in complex variables (https://github.com/ansible-collections/community.general/pull/7535).
+
 v7.5.1
 ======
```
Changelog metadata — release entries added:

```diff
@@ -1610,3 +1610,78 @@ releases:
       - 7465-redfish-firmware-update-message-id-hardening.yml
       - 7467-fix-gitlab-constants-calls.yml
     release_date: '2023-11-06'
+  7.5.2:
+    changes:
+      bugfixes:
+      - cloudflare_dns - fix Cloudflare lookup of SHFP records (https://github.com/ansible-collections/community.general/issues/7652).
+      - interface_files - also consider ``address_family`` when changing ``option=method``
+        (https://github.com/ansible-collections/community.general/issues/7610, https://github.com/ansible-collections/community.general/pull/7612).
+      - irc - replace ``ssl.wrap_socket`` that was removed from Python 3.12 with code
+        for creating a proper SSL context (https://github.com/ansible-collections/community.general/pull/7542).
+      - keycloak_* - fix Keycloak API client to quote ``/`` properly (https://github.com/ansible-collections/community.general/pull/7641).
+      - keycloak_authz_permission - resource payload variable for scope-based permission
+        was constructed as a string, when it needs to be a list, even for a single
+        item (https://github.com/ansible-collections/community.general/issues/7151).
+      - log_entries callback plugin - replace ``ssl.wrap_socket`` that was removed
+        from Python 3.12 with code for creating a proper SSL context (https://github.com/ansible-collections/community.general/pull/7542).
+      - lvol - test for output messages in both ``stdout`` and ``stderr`` (https://github.com/ansible-collections/community.general/pull/7601,
+        https://github.com/ansible-collections/community.general/issues/7182).
+      - ocapi_utils, oci_utils, redfish_utils module utils - replace ``type()`` calls
+        with ``isinstance()`` calls (https://github.com/ansible-collections/community.general/pull/7501).
+      - onepassword lookup plugin - field and section titles are now case insensitive
+        when using op CLI version two or later. This matches the behavior of version
+        one (https://github.com/ansible-collections/community.general/pull/7564).
+      - pipx module utils - change the CLI argument formatter for the ``pip_args``
+        parameter (https://github.com/ansible-collections/community.general/issues/7497,
+        https://github.com/ansible-collections/community.general/pull/7506).
+      - 'redhat_subscription - use the D-Bus registration on RHEL 7 only on 7.4 and
+
+        greater; older versions of RHEL 7 do not have it
+
+        (https://github.com/ansible-collections/community.general/issues/7622,
+
+        https://github.com/ansible-collections/community.general/pull/7624).
+
+        '
+      - terraform - fix multiline string handling in complex variables (https://github.com/ansible-collections/community.general/pull/7535).
+      minor_changes:
+      - elastic callback plugin - close elastic client to not leak resources (https://github.com/ansible-collections/community.general/pull/7517).
+      release_summary: Regular bugfix release.
+    fragments:
+    - 000-redhat_subscription-dbus-on-7.4-plus.yaml
+    - 7.5.2.yml
+    - 7151-fix-keycloak_authz_permission-incorrect-resource-payload.yml
+    - 7501-type.yml
+    - 7506-pipx-pipargs.yml
+    - 7517-elastic-close-client.yaml
+    - 7535-terraform-fix-multiline-string-handling-in-complex-variables.yml
+    - 7542-irc-logentries-ssl.yml
+    - 7564-onepassword-lookup-case-insensitive.yaml
+    - 7601-lvol-fix.yml
+    - 7612-interface_file-method.yml
+    - 7641-fix-keycloak-api-client-to-quote-properly.yml
+    - 7653-fix-cloudflare-lookup.yml
+    release_date: '2023-12-04'
+  7.5.3:
+    changes:
+      bugfixes:
+      - keycloak_identity_provider - ``mappers`` processing was not idempotent if
+        the mappers configuration list had not been sorted by name (in ascending order).
+        Fix resolves the issue by sorting mappers in the desired state using the same
+        key which is used for obtaining existing state (https://github.com/ansible-collections/community.general/pull/7418).
+      - keycloak_identity_provider - it was not possible to reconfigure (add, remove)
+        ``mappers`` once they were created initially. Removal was ignored, adding
+        new ones resulted in dropping the pre-existing unmodified mappers. Fix resolves
+        the issue by supplying correct input to the internal update call (https://github.com/ansible-collections/community.general/pull/7418).
+      - keycloak_user - when ``force`` is set, but user does not exist, do not try
+        to delete it (https://github.com/ansible-collections/community.general/pull/7696).
+      - statusio_maintenance - fix error caused by incorrectly formed API data payload.
+        Was raising "Failed to create maintenance HTTP Error 400 Bad Request" caused
+        by bad data type for date/time and deprecated dict keys (https://github.com/ansible-collections/community.general/pull/7754).
+      release_summary: Regular bugfix release.
+    fragments:
+    - 7.5.3.yml
+    - 7418-kc_identity_provider-mapper-reconfiguration-fixes.yml
+    - 7696-avoid-attempt-to-delete-non-existing-user.yml
+    - 7754-fixed-payload-format.yml
+    release_date: '2024-01-01'
```
galaxy.yml — version bump:

```diff
@@ -5,7 +5,7 @@

 namespace: community
 name: general
-version: 7.5.1
+version: 7.5.3
 readme: README.md
 authors:
   - Ansible (https://github.com/ansible)
```
elastic callback plugin:

```diff
@@ -84,6 +84,7 @@ import time
 import uuid

 from collections import OrderedDict
+from contextlib import closing
 from os.path import basename

 from ansible.errors import AnsibleError, AnsibleRuntimeError
@@ -201,24 +202,25 @@ class ElasticSource(object):

         apm_cli = self.init_apm_client(apm_server_url, apm_service_name, apm_verify_server_cert, apm_secret_token, apm_api_key)
         if apm_cli:
-            instrument()  # Only call this once, as early as possible.
-            if traceparent:
-                parent = trace_parent_from_string(traceparent)
-                apm_cli.begin_transaction("Session", trace_parent=parent, start=parent_start_time)
-            else:
-                apm_cli.begin_transaction("Session", start=parent_start_time)
-            # Populate trace metadata attributes
-            if self.ansible_version is not None:
-                label(ansible_version=self.ansible_version)
-            label(ansible_session=self.session, ansible_host_name=self.host, ansible_host_user=self.user)
-            if self.ip_address is not None:
-                label(ansible_host_ip=self.ip_address)
+            with closing(apm_cli):
+                instrument()  # Only call this once, as early as possible.
+                if traceparent:
+                    parent = trace_parent_from_string(traceparent)
+                    apm_cli.begin_transaction("Session", trace_parent=parent, start=parent_start_time)
+                else:
+                    apm_cli.begin_transaction("Session", start=parent_start_time)
+                # Populate trace metadata attributes
+                if self.ansible_version is not None:
+                    label(ansible_version=self.ansible_version)
+                label(ansible_session=self.session, ansible_host_name=self.host, ansible_host_user=self.user)
+                if self.ip_address is not None:
+                    label(ansible_host_ip=self.ip_address)

-            for task_data in tasks:
-                for host_uuid, host_data in task_data.host_data.items():
-                    self.create_span_data(apm_cli, task_data, host_data)
+                for task_data in tasks:
+                    for host_uuid, host_data in task_data.host_data.items():
+                        self.create_span_data(apm_cli, task_data, host_data)

-            apm_cli.end_transaction(name=__name__, result=status, duration=end_time - parent_start_time)
+                apm_cli.end_transaction(name=__name__, result=status, duration=end_time - parent_start_time)

     def create_span_data(self, apm_cli, task_data, host_data):
         """ create the span with the given TaskData and HostData """
```
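The elastic callback fix wraps the whole transaction in `contextlib.closing`, so the APM client's `close()` runs even when span creation raises. A minimal sketch of the pattern, with a stand-in client class (not the plugin's real client):

```python
from contextlib import closing


class FakeClient:
    """Stand-in for any object with a close() method, like the APM client."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


client = FakeClient()
try:
    with closing(client):
        raise RuntimeError("error while creating spans")
except RuntimeError:
    pass

print(client.closed)  # True: close() ran despite the exception
```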
log_entries callback plugin:

```diff
@@ -18,7 +18,7 @@ DOCUMENTATION = '''
   requirements:
     - whitelisting in configuration
     - certifi (Python library)
-    - flatdict (Python library), if you want to use the 'flatten' option
+    - flatdict (Python library), if you want to use the O(flatten) option
   options:
     api:
       description: URI to the Logentries API.
@@ -90,9 +90,9 @@ examples: >
     api = data.logentries.com
     port = 10000
     tls_port = 20000
-    use_tls = no
+    use_tls = true
     token = dd21fc88-f00a-43ff-b977-e3a4233c53af
-    flatten = False
+    flatten = false
 '''

 import os
@@ -196,15 +196,11 @@ else:
     class TLSSocketAppender(PlainTextSocketAppender):
         def open_connection(self):
            sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
-            sock = ssl.wrap_socket(
+            context = ssl.create_default_context(
+                purpose=ssl.Purpose.SERVER_AUTH,
+                cafile=certifi.where(), )
+            sock = context.wrap_socket(
                 sock=sock,
-                keyfile=None,
-                certfile=None,
                 server_side=False,
-                cert_reqs=ssl.CERT_REQUIRED,
-                ssl_version=getattr(
-                    ssl, 'PROTOCOL_TLSv1_2', ssl.PROTOCOL_TLSv1),
-                ca_certs=certifi.where(),
                 do_handshake_on_connect=True,
                 suppress_ragged_eofs=True, )
             sock.connect((self.LE_API, self.LE_TLS_PORT))
```
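`ssl.wrap_socket` was deprecated in Python 3.7 and removed in 3.12; the replacement builds an `SSLContext` and wraps the socket through it. A minimal sketch of the new pattern (using the default CA store here rather than certifi; the hostname is the one from the plugin's example configuration):

```python
import socket
import ssl

# A client-side context from create_default_context() verifies server
# certificates by default, matching the old cert_reqs=ssl.CERT_REQUIRED.
context = ssl.create_default_context(purpose=ssl.Purpose.SERVER_AUTH)
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True

# Wrapping happens through the context; no connection is made here,
# the TLS handshake would run on connect.
raw = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tls = context.wrap_socket(raw, server_hostname="data.logentries.com")
tls.close()
```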
bitwarden lookup plugin:

```diff
@@ -25,7 +25,10 @@ DOCUMENTATION = """
       type: list
       elements: str
     search:
-      description: Field to retrieve, for example V(name) or V(id).
+      description:
+        - Field to retrieve, for example V(name) or V(id).
+        - If set to V(id), only zero or one element can be returned.
+          Use the Jinja C(first) filter to get the only list element.
       type: str
       default: name
       version_added: 5.7.0
@@ -39,27 +42,27 @@ DOCUMENTATION = """
 """

 EXAMPLES = """
-- name: "Get 'password' from Bitwarden record named 'a_test'"
+- name: "Get 'password' from all Bitwarden records named 'a_test'"
   ansible.builtin.debug:
     msg: >-
       {{ lookup('community.general.bitwarden', 'a_test', field='password') }}

-- name: "Get 'password' from Bitwarden record with id 'bafba515-af11-47e6-abe3-af1200cd18b2'"
+- name: "Get 'password' from Bitwarden record with ID 'bafba515-af11-47e6-abe3-af1200cd18b2'"
   ansible.builtin.debug:
     msg: >-
-      {{ lookup('community.general.bitwarden', 'bafba515-af11-47e6-abe3-af1200cd18b2', search='id', field='password') }}
+      {{ lookup('community.general.bitwarden', 'bafba515-af11-47e6-abe3-af1200cd18b2', search='id', field='password') | first }}

-- name: "Get 'password' from Bitwarden record named 'a_test' from collection"
+- name: "Get 'password' from all Bitwarden records named 'a_test' from collection"
   ansible.builtin.debug:
     msg: >-
       {{ lookup('community.general.bitwarden', 'a_test', field='password', collection_id='bafba515-af11-47e6-abe3-af1200cd18b2') }}

-- name: "Get full Bitwarden record named 'a_test'"
+- name: "Get list of all full Bitwarden records named 'a_test'"
   ansible.builtin.debug:
     msg: >-
       {{ lookup('community.general.bitwarden', 'a_test') }}

-- name: "Get custom field 'api_key' from Bitwarden record named 'a_test'"
+- name: "Get custom field 'api_key' from all Bitwarden records named 'a_test'"
   ansible.builtin.debug:
     msg: >-
       {{ lookup('community.general.bitwarden', 'a_test', field='api_key') }}
@@ -67,9 +70,12 @@ EXAMPLES = """

 RETURN = """
   _raw:
-    description: List of requested field or JSON object of list of matches.
+    description:
+      - A one-element list that contains a list of requested fields or JSON objects of matches.
+      - If you use C(query), you get a list of lists. If you use C(lookup) without C(wantlist=true),
+        this always gets reduced to a list of field values or JSON objects.
     type: list
-    elements: raw
+    elements: list
 """

 from subprocess import Popen, PIPE
```
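The documentation change reflects the lookup's actual shape: a name search can match several records, an id search matches at most one, and Jinja's `first` filter takes element 0 of the returned list. A plain-Python model of that behavior (the records below are made up for illustration):

```python
records = [
    {"id": "bafba515", "name": "a_test", "fields": {"password": "s3cret"}},
    {"id": "deadbeef", "name": "a_test", "fields": {"password": "hunter2"}},
]


def search_passwords(term, search="name"):
    """Toy stand-in for the lookup: always returns a list of matches."""
    return [r["fields"]["password"] for r in records if r[search] == term]


print(search_passwords("a_test"))          # ['s3cret', 'hunter2']
print(search_passwords("bafba515", "id"))  # ['s3cret']

# Jinja's `first` filter corresponds to taking the first list element:
only = search_passwords("bafba515", "id")[0]
print(only)                                # s3cret
```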
onepassword lookup plugin:

```diff
@@ -127,6 +127,14 @@ from ansible.module_utils.six import with_metaclass
 from ansible_collections.community.general.plugins.module_utils.onepassword import OnePasswordConfig


+def _lower_if_possible(value):
+    """Return the lower case version value, otherwise return the value"""
+    try:
+        return value.lower()
+    except AttributeError:
+        return value
+
+
 class OnePassCLIBase(with_metaclass(abc.ABCMeta, object)):
     bin = "op"
@@ -480,6 +488,7 @@ class OnePassCLIv2(OnePassCLIBase):
         }
         """
         data = json.loads(data_json)
+        field_name = _lower_if_possible(field_name)
         for field in data.get("fields", []):
             if section_title is None:
                 # If the field name exists in the section, return that value
@@ -488,17 +497,19 @@ class OnePassCLIv2(OnePassCLIBase):

             # If the field name doesn't exist in the section, match on the value of "label"
             # then "id" and return "value"
-            if field.get("label") == field_name:
+            if field.get("label", "").lower() == field_name:
                 return field.get("value", "")

-            if field.get("id") == field_name:
+            if field.get("id", "").lower() == field_name:
                 return field.get("value", "")

         # Look at the section data and get an identifier. The value of 'id' is either a unique ID
         # or a human-readable string. If a 'label' field exists, prefer that since
         # it is the value visible in the 1Password UI when both 'id' and 'label' exist.
         section = field.get("section", {})
-        current_section_title = section.get("label", section.get("id"))
+        section_title = _lower_if_possible(section_title)
+        current_section_title = section.get("label", section.get("id", "")).lower()
         if section_title == current_section_title:
             # In the correct section. Check "label" then "id" for the desired field_name
             if field.get("label") == field_name:
```
keycloak module utils — quote role names and flow aliases with `safe=''` so `/` is percent-encoded:

```diff
@@ -1679,7 +1679,7 @@ class KeycloakAPI(object):
         :param name: Name of the role to fetch.
         :param realm: Realm in which the role resides; default 'master'.
         """
-        role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=quote(name))
+        role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=quote(name, safe=''))
         try:
             return json.loads(to_native(open_url(role_url, method="GET", http_agent=self.http_agent, headers=self.restheaders, timeout=self.connection_timeout,
                                                  validate_certs=self.validate_certs).read()))
@@ -1716,7 +1716,7 @@ class KeycloakAPI(object):
         :param rolerep: A RoleRepresentation of the updated role.
         :return HTTPResponse object on success
         """
-        role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=quote(rolerep['name']))
+        role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=quote(rolerep['name']), safe='')
         try:
             composites = None
             if "composites" in rolerep:
@@ -1737,9 +1737,9 @@ class KeycloakAPI(object):
         if clientid is not None:
             client = self.get_client_by_clientid(client_id=clientid, realm=realm)
             cid = client['id']
-            composite_url = URL_CLIENT_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, id=cid, name=quote(rolerep["name"]))
+            composite_url = URL_CLIENT_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, id=cid, name=quote(rolerep["name"], safe=''))
         else:
-            composite_url = URL_REALM_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, name=quote(rolerep["name"]))
+            composite_url = URL_REALM_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, name=quote(rolerep["name"], safe=''))
         # Get existing composites
         return json.loads(to_native(open_url(
             composite_url,
@@ -1758,9 +1758,9 @@ class KeycloakAPI(object):
         if clientid is not None:
             client = self.get_client_by_clientid(client_id=clientid, realm=realm)
             cid = client['id']
-            composite_url = URL_CLIENT_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, id=cid, name=quote(rolerep["name"]))
+            composite_url = URL_CLIENT_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, id=cid, name=quote(rolerep["name"], safe=''))
         else:
-            composite_url = URL_REALM_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, name=quote(rolerep["name"]))
+            composite_url = URL_REALM_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, name=quote(rolerep["name"], safe=''))
         # Get existing composites
         # create new composites
         return open_url(composite_url, method='POST', http_agent=self.http_agent, headers=self.restheaders, timeout=self.connection_timeout,
@@ -1775,9 +1775,9 @@ class KeycloakAPI(object):
         if clientid is not None:
             client = self.get_client_by_clientid(client_id=clientid, realm=realm)
             cid = client['id']
-            composite_url = URL_CLIENT_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, id=cid, name=quote(rolerep["name"]))
+            composite_url = URL_CLIENT_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, id=cid, name=quote(rolerep["name"], safe=''))
         else:
-            composite_url = URL_REALM_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, name=quote(rolerep["name"]))
+            composite_url = URL_REALM_ROLE_COMPOSITES.format(url=self.baseurl, realm=realm, name=quote(rolerep["name"], safe=''))
         # Get existing composites
         # create new composites
         return open_url(composite_url, method='DELETE', http_agent=self.http_agent, headers=self.restheaders, timeout=self.connection_timeout,
@@ -1842,7 +1842,7 @@ class KeycloakAPI(object):
         :param name: The name of the role.
         :param realm: The realm in which this role resides, default "master".
         """
-        role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=quote(name))
+        role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=quote(name, safe=''))
         try:
             return open_url(role_url, method='DELETE', http_agent=self.http_agent, headers=self.restheaders, timeout=self.connection_timeout,
                             validate_certs=self.validate_certs)
@@ -1886,7 +1886,7 @@ class KeycloakAPI(object):
         if cid is None:
             self.module.fail_json(msg='Could not find client %s in realm %s'
                                   % (clientid, realm))
-        role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=quote(name))
+        role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=quote(name, safe=''))
         try:
             return json.loads(to_native(open_url(role_url, method="GET", http_agent=self.http_agent, headers=self.restheaders, timeout=self.connection_timeout,
                                                  validate_certs=self.validate_certs).read()))
@@ -1950,7 +1950,7 @@ class KeycloakAPI(object):
         if cid is None:
             self.module.fail_json(msg='Could not find client %s in realm %s'
                                   % (clientid, realm))
-        role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=quote(rolerep['name']))
+        role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=quote(rolerep['name'], safe=''))
         try:
             composites = None
             if "composites" in rolerep:
@@ -1976,7 +1976,7 @@ class KeycloakAPI(object):
         if cid is None:
             self.module.fail_json(msg='Could not find client %s in realm %s'
                                   % (clientid, realm))
-        role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=quote(name))
+        role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=quote(name, safe=''))
         try:
             return open_url(role_url, method='DELETE', http_agent=self.http_agent, headers=self.restheaders, timeout=self.connection_timeout,
                             validate_certs=self.validate_certs)
@@ -2036,7 +2036,7 @@ class KeycloakAPI(object):
             URL_AUTHENTICATION_FLOW_COPY.format(
                 url=self.baseurl,
                 realm=realm,
-                copyfrom=quote(config["copyFrom"])),
+                copyfrom=quote(config["copyFrom"], safe='')),
             method='POST',
             http_agent=self.http_agent, headers=self.restheaders,
             data=json.dumps(new_name),
@@ -2110,7 +2110,7 @@ class KeycloakAPI(object):
             URL_AUTHENTICATION_FLOW_EXECUTIONS.format(
                 url=self.baseurl,
                 realm=realm,
-                flowalias=quote(flowAlias)),
+                flowalias=quote(flowAlias, safe='')),
             method='PUT',
             http_agent=self.http_agent, headers=self.restheaders,
             data=json.dumps(updatedExec),
@@ -2159,7 +2159,7 @@ class KeycloakAPI(object):
             URL_AUTHENTICATION_FLOW_EXECUTIONS_FLOW.format(
                 url=self.baseurl,
                 realm=realm,
-                flowalias=quote(flowAlias)),
+                flowalias=quote(flowAlias, safe='')),
             method='POST',
             http_agent=self.http_agent, headers=self.restheaders,
             data=json.dumps(newSubFlow),
@@ -2183,7 +2183,7 @@ class KeycloakAPI(object):
             URL_AUTHENTICATION_FLOW_EXECUTIONS_EXECUTION.format(
                 url=self.baseurl,
                 realm=realm,
-                flowalias=quote(flowAlias)),
+                flowalias=quote(flowAlias, safe='')),
             method='POST',
             http_agent=self.http_agent, headers=self.restheaders,
             data=json.dumps(newExec),
@@ -2243,7 +2243,7 @@ class KeycloakAPI(object):
             URL_AUTHENTICATION_FLOW_EXECUTIONS.format(
                 url=self.baseurl,
                 realm=realm,
-                flowalias=quote(config["alias"])),
+                flowalias=quote(config["alias"], safe='')),
             method='GET',
             http_agent=self.http_agent, headers=self.restheaders,
             timeout=self.connection_timeout,
@@ -2336,7 +2336,7 @@ class KeycloakAPI(object):
         return open_url(
             URL_AUTHENTICATION_REQUIRED_ACTIONS_ALIAS.format(
                 url=self.baseurl,
-                alias=quote(alias),
+                alias=quote(alias, safe=''),
                 realm=realm
             ),
             method='PUT',
@@ -2363,7 +2363,7 @@ class KeycloakAPI(object):
         return open_url(
             URL_AUTHENTICATION_REQUIRED_ACTIONS_ALIAS.format(
                 url=self.baseurl,
-                alias=quote(alias),
+                alias=quote(alias, safe=''),
                 realm=realm
             ),
             method='DELETE',
@@ -2630,7 +2630,7 @@ class KeycloakAPI(object):

     def get_authz_authorization_scope_by_name(self, name, client_id, realm):
         url = URL_AUTHZ_AUTHORIZATION_SCOPES.format(url=self.baseurl, client_id=client_id, realm=realm)
-        search_url = "%s/search?name=%s" % (url, quote(name))
+        search_url = "%s/search?name=%s" % (url, quote(name, safe=''))

         try:
             return json.loads(to_native(open_url(search_url, method='GET', http_agent=self.http_agent, headers=self.restheaders,
```
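Every hunk makes the same one-line change. `urllib.parse.quote` keeps `/` unescaped by default (its `safe` parameter defaults to `'/'`), so a role name or flow alias containing a slash would be spliced into the URL path unencoded; `safe=''` forces it to be percent-encoded. The role name below is hypothetical:

```python
from urllib.parse import quote

name = "parent/child"  # hypothetical role name containing a slash

print(quote(name))           # parent/child  ('/' is in the default safe set)
print(quote(name, safe=''))  # parent%2Fchild
```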
@@ -432,7 +432,7 @@ class OcapiUtils(object):
else:
return response
details = response["data"]["Status"].get("Details")
if type(details) is str:
if isinstance(details, str):
details = [details]
health_list = response["data"]["Status"]["Health"]
return_value = {

@@ -1529,7 +1529,7 @@ def delete_and_wait(
result[resource_type] = resource
return result
# oci.wait_until() returns an instance of oci.util.Sentinel in case the resource is not found.
if type(wait_response) is not Sentinel:
if not isinstance(wait_response, Sentinel):
resource = to_dict(wait_response.data)
else:
resource["lifecycle_state"] = "DELETED"

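The `type(...) is` to `isinstance(...)` changes in these hunks are not purely cosmetic: an exact `type()` comparison rejects subclasses, while `isinstance()` accepts them. A minimal illustration:

```python
# A value that subclasses str fails an exact type() check but passes isinstance().
class Subclassed(str):
    pass

details = Subclassed("degraded")
print(type(details) is str)      # False: exact type comparison
print(isinstance(details, str))  # True: accepts subclasses too
```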
@@ -42,7 +42,7 @@ def pipx_runner(module, command, **kwargs):
system_site_packages=fmt.as_bool("--system-site-packages"),
_list=fmt.as_fixed(['list', '--include-injected', '--json']),
editable=fmt.as_bool("--editable"),
pip_args=fmt.as_opt_val('--pip-args'),
pip_args=fmt.as_opt_eq_val('--pip-args'),
),
environ_update={'USE_EMOJI': '0'},
check_rc=True,

@@ -3708,7 +3708,7 @@ class RedfishUtils(object):
# WORKAROUND
# HPE systems with iLO 4 will have BIOS Attribute Registries location URI as a dictionary with key 'extref'
# Hence adding condition to fetch the Uri
if type(loc['Uri']) is dict and "extref" in loc['Uri'].keys():
if isinstance(loc['Uri'], dict) and "extref" in loc['Uri'].keys():
rsp_uri = loc['Uri']['extref']
if not rsp_uri:
msg = "Language 'en' not found in BIOS Attribute Registries location, URI: %s, response: %s"

@@ -138,6 +138,7 @@ options:
description:
- The type of DNS record to create. Required if O(state=present).
- O(type=DS), O(type=SSHFP), and O(type=TLSA) were added in Ansible 2.7.
- Note that V(SPF) is no longer supported by CloudFlare. Support for it will be removed from community.general 9.0.0.
type: str
choices: [ A, AAAA, CNAME, DS, MX, NS, SPF, SRV, SSHFP, TLSA, TXT ]
value:
@@ -613,7 +614,7 @@ class CloudflareAPI(object):
content = str(params['key_tag']) + '\t' + str(params['algorithm']) + '\t' + str(params['hash_type']) + '\t' + params['value']
elif params['type'] == 'SSHFP':
if not (params['value'] is None or params['value'] == ''):
content = str(params['algorithm']) + '\t' + str(params['hash_type']) + '\t' + params['value']
content = str(params['algorithm']) + ' ' + str(params['hash_type']) + ' ' + params['value'].upper()
elif params['type'] == 'TLSA':
if not (params['value'] is None or params['value'] == ''):
content = str(params['cert_usage']) + '\t' + str(params['selector']) + '\t' + str(params['hash_type']) + '\t' + params['value']
@@ -726,7 +727,7 @@ class CloudflareAPI(object):
if (attr is None) or (attr == ''):
self.module.fail_json(msg="You must provide algorithm, hash_type and a value to create this record type")
sshfp_data = {
"fingerprint": params['value'],
"fingerprint": params['value'].upper(),
"type": params['hash_type'],
"algorithm": params['algorithm'],
}
@@ -736,7 +737,7 @@ class CloudflareAPI(object):
'data': sshfp_data,
"ttl": params['ttl'],
}
search_value = str(params['algorithm']) + '\t' + str(params['hash_type']) + '\t' + params['value']
search_value = str(params['algorithm']) + ' ' + str(params['hash_type']) + ' ' + params['value']

if params['type'] == 'TLSA':
for attr in [params['port'], params['proto'], params['cert_usage'], params['selector'], params['hash_type'], params['value']]:

@@ -43,8 +43,8 @@ options:
description:
- Section name in INI file. This is added if O(state=present) automatically when
a single value is being set.
- If left empty, being omitted, or being set to V(null), the O(option) will be placed before the first O(section).
- Using V(null) is also required if the config format does not support sections.
- If being omitted, the O(option) will be placed before the first O(section).
- Omitting O(section) is also required if the config format does not support sections.
type: str
option:
description:
@@ -171,6 +171,13 @@ EXAMPLES = r'''
- pepsi
mode: '0600'
state: present

- name: Add "beverage=lemon juice" outside a section in specified file
  community.general.ini_file:
path: /etc/conf
option: beverage
value: lemon juice
state: present
'''

import io

@@ -12,14 +12,14 @@ __metaclass__ = type
DOCUMENTATION = '''
---
module: interfaces_file
short_description: Tweak settings in /etc/network/interfaces files
short_description: Tweak settings in C(/etc/network/interfaces) files
extends_documentation_fragment:
- ansible.builtin.files
- community.general.attributes
description:
- Manage (add, remove, change) individual interface options in an interfaces-style file without having
to manage the file as a whole with, say, M(ansible.builtin.template) or M(ansible.builtin.assemble). Interface has to be presented in a file.
- Read information about interfaces from interfaces-styled files
- Read information about interfaces from interfaces-styled files.
attributes:
check_mode:
support: full
@@ -29,27 +29,27 @@ options:
dest:
type: path
description:
- Path to the interfaces file
- Path to the interfaces file.
default: /etc/network/interfaces
iface:
type: str
description:
- Name of the interface, required for value changes or option remove
- Name of the interface, required for value changes or option remove.
address_family:
type: str
description:
- Address family of the interface, useful if same interface name is used for both inet and inet6
- Address family of the interface, useful if same interface name is used for both V(inet) and V(inet6).
option:
type: str
description:
- Name of the option, required for value changes or option remove
- Name of the option, required for value changes or option remove.
value:
type: str
description:
- If O(option) is not presented for the O(iface) and O(state) is V(present) option will be added.
If O(option) already exists and is not V(pre-up), V(up), V(post-up) or V(down), it's value will be updated.
V(pre-up), V(up), V(post-up) and V(down) options cannot be updated, only adding new options, removing existing
ones or cleaning the whole option set are supported
ones or cleaning the whole option set are supported.
backup:
description:
- Create a backup file including the timestamp information so you can get
@@ -64,72 +64,76 @@ options:
choices: [ "present", "absent" ]

notes:
- If option is defined multiple times last one will be updated but all will be deleted in case of an absent state
- If option is defined multiple times last one will be updated but all will be deleted in case of an absent state.
requirements: []
author: "Roman Belyakovsky (@hryamzik)"
'''

RETURN = '''
dest:
description: destination file/path
description: Destination file/path.
returned: success
type: str
sample: "/etc/network/interfaces"
ifaces:
description: interfaces dictionary
description: Interfaces dictionary.
returned: success
type: complex
type: dict
contains:
ifaces:
description: interface dictionary
description: Interface dictionary.
returned: success
type: dict
contains:
eth0:
description: Name of the interface
description: Name of the interface.
returned: success
type: dict
contains:
address_family:
description: interface address family
description: Interface address family.
returned: success
type: str
sample: "inet"
method:
description: interface method
description: Interface method.
returned: success
type: str
sample: "manual"
mtu:
description: other options, all values returned as strings
description: Other options, all values returned as strings.
returned: success
type: str
sample: "1500"
pre-up:
description: list of C(pre-up) scripts
description: List of C(pre-up) scripts.
returned: success
type: list
elements: str
sample:
- "route add -net 10.10.10.0/24 gw 10.10.10.1 dev eth1"
- "route add -net 10.10.11.0/24 gw 10.10.11.1 dev eth2"
up:
description: list of C(up) scripts
description: List of C(up) scripts.
returned: success
type: list
elements: str
sample:
- "route add -net 10.10.10.0/24 gw 10.10.10.1 dev eth1"
- "route add -net 10.10.11.0/24 gw 10.10.11.1 dev eth2"
post-up:
description: list of C(post-up) scripts
description: List of C(post-up) scripts.
returned: success
type: list
elements: str
sample:
- "route add -net 10.10.10.0/24 gw 10.10.10.1 dev eth1"
- "route add -net 10.10.11.0/24 gw 10.10.11.1 dev eth2"
down:
description: list of C(down) scripts
description: List of C(down) scripts.
returned: success
type: list
elements: str
sample:
- "route del -net 10.10.10.0/24 gw 10.10.10.1 dev eth1"
- "route del -net 10.10.11.0/24 gw 10.10.11.1 dev eth2"
@@ -336,6 +340,8 @@ def addOptionAfterLine(option, value, iface, lines, last_line_dict, iface_option
changed = False
for ln in lines:
if ln.get('line_type', '') == 'iface' and ln.get('iface', '') == iface and value != ln.get('params', {}).get('method', ''):
if address_family is not None and ln.get('address_family') != address_family:
continue
changed = True
ln['line'] = re.sub(ln.get('params', {}).get('method', '') + '$', value, ln.get('line'))
ln['params']['method'] = value

@@ -195,7 +195,14 @@ def send_msg(msg, server='localhost', port='6667', channel=None, nick_to=None, k

irc = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if use_ssl:
irc = ssl.wrap_socket(irc)
if getattr(ssl, 'PROTOCOL_TLS', None) is not None:
# Supported since Python 2.7.13
context = ssl.SSLContext(ssl.PROTOCOL_TLS)
else:
context = ssl.SSLContext()
context.verify_mode = ssl.CERT_NONE
# TODO: create a secure context with `context = ssl.create_default_context()` instead!
irc = context.wrap_socket(irc)
irc.connect((server, int(port)))

if passwd:

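The irc hunk replaces the deprecated `ssl.wrap_socket()` (removed in Python 3.12) with an explicit `SSLContext`. A minimal standalone sketch of the same pattern; like the original it disables certificate verification (`CERT_NONE`), which the hunk's own TODO flags as something `ssl.create_default_context()` should eventually replace:

```python
import socket
import ssl

# Build an explicit TLS context instead of calling the removed ssl.wrap_socket().
if getattr(ssl, 'PROTOCOL_TLS', None) is not None:
    context = ssl.SSLContext(ssl.PROTOCOL_TLS)
else:
    context = ssl.SSLContext()
context.verify_mode = ssl.CERT_NONE  # insecure, mirrors the module's behavior

raw = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tls = context.wrap_socket(raw)  # the TLS handshake happens later, on connect()
tls.close()
```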
@@ -330,7 +330,7 @@ def main():
if not r:
module.fail_json(msg='Unable to find authorization resource with name %s for client %s in realm %s' % (resources[0], cid, realm))
else:
payload['resources'] = r['_id']
payload['resources'].append(r['_id'])

for rs in r['scopes']:
resource_scopes.append(rs['id'])

@@ -542,10 +542,14 @@ def main():
old_mapper = dict()
new_mapper = old_mapper.copy()
new_mapper.update(change)
if new_mapper != old_mapper:
if changeset.get('mappers') is None:
changeset['mappers'] = list()
changeset['mappers'].append(new_mapper)

if changeset.get('mappers') is None:
changeset['mappers'] = list()
# eventually this holds all desired mappers, unchanged, modified and newly added
changeset['mappers'].append(new_mapper)

# ensure idempotency in case module.params.mappers is not sorted by name
changeset['mappers'] = sorted(changeset['mappers'], key=lambda x: x.get('id') if x.get('name') is None else x['name'])

# Prepare the desired values using the existing values (non-existence results in a dict that is save to use as a basis)
desired_idp = before_idp.copy()
@@ -612,10 +616,17 @@ def main():
# do the update
desired_idp = desired_idp.copy()
updated_mappers = desired_idp.pop('mappers', [])
original_mappers = list(before_idp.get('mappers', []))

kc.update_identity_provider(desired_idp, realm)
for mapper in updated_mappers:
if mapper.get('id') is not None:
kc.update_identity_provider_mapper(mapper, alias, realm)
# only update existing if there is a change
for i, orig in enumerate(original_mappers):
if mapper['id'] == orig['id']:
del original_mappers[i]
if mapper != orig:
kc.update_identity_provider_mapper(mapper, alias, realm)
else:
if mapper.get('identityProviderAlias') is None:
mapper['identityProviderAlias'] = alias

@@ -481,7 +481,7 @@ def main():

else:
after_user = {}
if force: # If the force option is set to true
if force and before_user: # If the force option is set to true
# Delete the existing user
kc.delete_user(user_id=before_user["id"], realm=realm)


@@ -552,9 +552,9 @@ def main():
elif rc == 0:
changed = True
msg = "Volume %s resized to %s%s" % (this_lv['name'], size_requested, unit)
elif "matches existing size" in err:
elif "matches existing size" in err or "matches existing size" in out:
module.exit_json(changed=False, vg=vg, lv=this_lv['name'], size=this_lv['size'])
elif "not larger than existing size" in err:
elif "not larger than existing size" in err or "not larger than existing size" in out:
module.exit_json(changed=False, vg=vg, lv=this_lv['name'], size=this_lv['size'], msg="Original size is larger than requested size", err=err)
else:
module.fail_json(msg="Unable to resize %s to %s%s" % (lv, size, size_unit), rc=rc, err=err)
@@ -585,9 +585,9 @@ def main():
module.fail_json(msg="Unable to resize %s to %s%s" % (lv, size, size_unit), rc=rc, err=err, out=out)
elif rc == 0:
changed = True
elif "matches existing size" in err:
elif "matches existing size" in err or "matches existing size" in out:
module.exit_json(changed=False, vg=vg, lv=this_lv['name'], size=this_lv['size'])
elif "not larger than existing size" in err:
elif "not larger than existing size" in err or "not larger than existing size" in out:
module.exit_json(changed=False, vg=vg, lv=this_lv['name'], size=this_lv['size'], msg="Original size is larger than requested size", err=err)
else:
module.fail_json(msg="Unable to resize %s to %s%s" % (lv, size, size_unit), rc=rc, err=err)

@@ -205,10 +205,11 @@ EXAMPLES = r'''
body: System {{ ansible_hostname }} has been successfully provisioned.
secure: starttls

- name: Sending an e-mail using StartTLS, remote server, custom EHLO
- name: Sending an e-mail using StartTLS, remote server, custom EHLO, and timeout of 10 seconds
  community.general.mail:
host: some.smtp.host.tld
port: 25
timeout: 10
ehlohost: my-resolvable-hostname.tld
to: John Smith <john.smith@example.com>
subject: Ansible-report

@@ -48,6 +48,7 @@ options:
[,replicate=<1|0>] [,ro=<1|0>] [,shared=<1|0>] [,size=<DiskSize>])."
- See U(https://pve.proxmox.com/wiki/Linux_Container) for a full description.
- This option has no default unless O(proxmox_default_behavior) is set to V(compatibility); then the default is V(3).
- Should not be used in conjunction with O(storage).
type: str
cores:
description:
@@ -96,6 +97,7 @@ options:
storage:
description:
- target storage
- Should not be used in conjunction with O(disk).
type: str
default: 'local'
cpuunits:
@@ -233,6 +235,18 @@ EXAMPLES = r'''
hostname: example.org
ostemplate: 'local:vztmpl/ubuntu-14.04-x86_64.tar.gz'

- name: Create new container with minimal options specifying disk storage location and size
  community.general.proxmox:
vmid: 100
node: uk-mc02
api_user: root@pam
api_password: 1q2w3e
api_host: node1
password: 123456
hostname: example.org
ostemplate: 'local:vztmpl/ubuntu-14.04-x86_64.tar.gz'
disk: 'local-lvm:20'

- name: Create new container with hookscript and description
  community.general.proxmox:
vmid: 100

@@ -37,7 +37,7 @@ extends_documentation_fragment:

EXAMPLES = '''
- name: List tasks on node01
  community.general.proxmox_task_info:
  community.general.proxmox_tasks_info:
api_host: proxmoxhost
api_user: root@pam
api_password: '{{ password | default(omit) }}'
@@ -47,7 +47,7 @@ EXAMPLES = '''
register: result

- name: Retrieve information about specific tasks on node01
  community.general.proxmox_task_info:
  community.general.proxmox_tasks_info:
api_host: proxmoxhost
api_user: root@pam
api_password: '{{ password | default(omit) }}'

@@ -28,7 +28,7 @@ notes:
process listing on the system. Due to limitations of the D-Bus interface of C(rhsm),
the module will I(not) use D-Bus for registration when trying either to register
using O(token), or when specifying O(environment), or when the system is old
(typically RHEL 6 and older).
(typically RHEL 7 older than 7.4, RHEL 6, and older).
- In order to register a system, subscription-manager requires either a username and password, or an activationkey and an Organization ID.
- Since 2.5 values for O(server_hostname), O(server_insecure), O(rhsm_baseurl),
O(server_proxy_hostname), O(server_proxy_port), O(server_proxy_user) and
@@ -415,6 +415,30 @@ class Rhsm(object):
else:
return False

def _has_dbus_interface(self):
"""
Checks whether subscription-manager has a D-Bus interface.

:returns: bool -- whether subscription-manager has a D-Bus interface.
"""

def str2int(s, default=0):
try:
return int(s)
except ValueError:
return default

distro_id = distro.id()
distro_version = tuple(str2int(p) for p in distro.version_parts())

# subscription-manager in any supported Fedora version has the interface.
if distro_id == 'fedora':
return True
# Any other distro: assume it is EL;
# the D-Bus interface was added to subscription-manager in RHEL 7.4.
return (distro_version[0] == 7 and distro_version[1] >= 4) or \
distro_version[0] >= 8

def _can_connect_to_dbus(self):
"""
Checks whether it is possible to connect to the system D-Bus bus.
@@ -458,7 +482,8 @@ class Rhsm(object):
# of rhsm, so always use the CLI in that case;
# also, since the specified environments are names, and the D-Bus APIs
# require IDs for the environments, use the CLI also in that case
if not token and not environment and self._can_connect_to_dbus():
if (not token and not environment and self._has_dbus_interface() and
self._can_connect_to_dbus()):
self._register_using_dbus(was_registered, username, password, auto_attach,
activationkey, org_id, consumer_type,
consumer_name, consumer_id,

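The version gate added in `_has_dbus_interface()` can be exercised standalone. The `str2int` helper is copied from the hunk; the sample tuple `('7', '4', '')` merely stands in for `distro.version_parts()` output and is only an illustration:

```python
def str2int(s, default=0):
    # distro.version_parts() can contain non-numeric parts, so coerce defensively.
    try:
        return int(s)
    except ValueError:
        return default

distro_version = tuple(str2int(p) for p in ('7', '4', ''))
# The D-Bus interface was added to subscription-manager in RHEL 7.4.
has_dbus = (distro_version[0] == 7 and distro_version[1] >= 4) or distro_version[0] >= 8
print(has_dbus)  # True
```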
@@ -286,25 +286,24 @@ def create_maintenance(auth_headers, url, statuspage, host_ids,
returned_date, maintenance_notify_now,
maintenance_notify_72_hr, maintenance_notify_24_hr,
maintenance_notify_1_hr):
returned_dates = [[x] for x in returned_date]
component_id = []
container_id = []
for val in host_ids:
component_id.append(val['component_id'])
container_id.append(val['container_id'])
infrastructure_id = [i + '-' + j for i, j in zip(component_id, container_id)]
try:
values = json.dumps({
"statuspage_id": statuspage,
"components": component_id,
"containers": container_id,
"all_infrastructure_affected": str(int(all_infrastructure_affected)),
"infrastructure_affected": infrastructure_id,
"automation": str(int(automation)),
"maintenance_name": title,
"maintenance_details": desc,
"date_planned_start": returned_dates[0],
"time_planned_start": returned_dates[1],
"date_planned_end": returned_dates[2],
"time_planned_end": returned_dates[3],
"date_planned_start": returned_date[0],
"time_planned_start": returned_date[1],
"date_planned_end": returned_date[2],
"time_planned_end": returned_date[3],
"maintenance_notify_now": str(int(maintenance_notify_now)),
"maintenance_notify_72_hr": str(int(maintenance_notify_72_hr)),
"maintenance_notify_24_hr": str(int(maintenance_notify_24_hr)),

@@ -514,7 +514,7 @@ def main():

def format_args(vars):
if isinstance(vars, str):
return '"{string}"'.format(string=vars.replace('\\', '\\\\').replace('"', '\\"'))
return '"{string}"'.format(string=vars.replace('\\', '\\\\').replace('"', '\\"')).replace('\n', '\\n')
elif isinstance(vars, bool):
if vars:
return 'true'

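The `format_args` change above appends `.replace('\n', '\\n')` so that embedded newlines survive as literal `\n` escapes inside the quoted argument. A standalone sketch of just the string branch (the trailing `str(value)` fallback is a simplification, not the module's full function):

```python
def format_args(value):
    # Escape backslashes and double quotes, then encode real newlines as a
    # literal backslash-n so the quoted argument stays on one line.
    if isinstance(value, str):
        return '"{string}"'.format(
            string=value.replace('\\', '\\\\').replace('"', '\\"')
        ).replace('\n', '\\n')
    return str(value)  # simplified fallback for this sketch

print(format_args('say "hi"\nbye'))  # "say \"hi\"\nbye"
```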
@@ -22,7 +22,7 @@
assert:
that:
- 'alternative is changed'
- 'cmd.stdout == "dummy{{ item }}"'
- 'cmd.stdout == "dummy" ~ item'

- name: check that alternative has been updated
  command: "grep -Pzq '/bin/dummy{{ item }}\\n{{ 60 + item|int }}' '{{ alternatives_dir }}/dummy'"

@@ -29,7 +29,7 @@
that:
- archive_no_options is changed
- "archive_no_options.dest_state == 'archive'"
- "{{ archive_no_options.archived | length }} == 3"
- "archive_no_options.archived | length == 3"

- name: Remove the archive - no options ({{ format }})
  file:
@@ -54,7 +54,7 @@
that:
- archive_file_options_stat is not changed
- "archive_file_options.mode == '0600'"
- "{{ archive_file_options.archived | length }} == 3"
- "archive_file_options.archived | length == 3"

- name: Remove the archive - file options ({{ format }})
  file:
@@ -146,7 +146,7 @@
assert:
that:
- archive_path_list is changed
- "{{ archive_path_list.archived | length }} == 3"
- "archive_path_list.archived | length == 3"

- name: Remove archive - path list ({{ format }})
  file:
@@ -168,8 +168,8 @@
that:
- archive_missing_paths is changed
- "archive_missing_paths.dest_state == 'incomplete'"
- "'{{ remote_tmp_dir }}/dne.txt' in archive_missing_paths.missing"
- "'{{ remote_tmp_dir }}/foo.txt' not in archive_missing_paths.missing"
- "(remote_tmp_dir ~ '/dne.txt') in archive_missing_paths.missing"
- "(remote_tmp_dir ~ '/foo.txt') not in archive_missing_paths.missing"

- name: Remove archive - missing paths ({{ format }})
  file:

@@ -20,21 +20,28 @@
assert:
that:
- archive_remove_source_files is changed
- "{{ archive_remove_source_files.archived | length }} == 3"
- "archive_remove_source_files.archived | length == 3"

- name: Remove Archive - remove source files ({{ format }})
  file:
path: "{{ remote_tmp_dir }}/archive_remove_source_files.{{ format }}"
state: absent

- name: Assert that source files were removed - remove source files ({{ format }})
  assert:
that:
- "'{{ remote_tmp_dir }}/{{ item }}' is not exists"
- name: Remove source files in check mode ({{ format }})
  file:
path: "{{ remote_tmp_dir }}/{{ item }}"
state: absent
check_mode: true
with_items:
- foo.txt
- bar.txt
- empty.txt
register: remove_files

- name: Assert that source files were removed - remove source files ({{ format }})
  assert:
that:
- remove_files is not changed

- name: Copy source files - remove source directory ({{ format }})
  copy:
@@ -76,17 +83,24 @@
assert:
that:
- archive_remove_source_directory is changed
- "{{ archive_remove_source_directory.archived | length }} == 3"
- "archive_remove_source_directory.archived | length == 3"

- name: Remove archive - remove source directory ({{ format }})
  file:
path: "{{ remote_tmp_dir }}/archive_remove_source_directory.{{ format }}"
state: absent

- name: Remove source source directory in check mode ({{ format }})
  file:
path: "{{ remote_tmp_dir }}/tmpdir"
state: absent
check_mode: true
register: remove_dir

- name: Verify source directory was removed - remove source directory ({{ format }})
  assert:
that:
- "'{{ remote_tmp_dir }}/tmpdir' is not exists"
- remove_dir is not changed

- name: Create temporary directory - remove source excluding path ({{ format }})
  file:
@@ -120,7 +134,7 @@
assert:
that:
- archive_remove_source_excluding_path is changed
- "{{ archive_remove_source_excluding_path.archived | length }} == 2"
- "archive_remove_source_excluding_path.archived | length == 2"

- name: Remove archive - remove source excluding path ({{ format }})
  file:

@@ -0,0 +1,56 @@
# Copyright 2012, Dag Wieers <dag@wieers.com>
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible.errors import AnsibleError
from ansible.playbook.conditional import Conditional
from ansible.plugins.action import ActionBase


class ActionModule(ActionBase):
''' Fail with custom message '''

_requires_connection = False

_VALID_ARGS = frozenset(('msg', 'that'))

def _make_safe(self, text):
# A simple str(text) won't do it since AnsibleUnsafeText is clever :-)
return ''.join(chr(ord(x)) for x in text)

def run(self, tmp=None, task_vars=None):
if task_vars is None:
task_vars = dict()

result = super(ActionModule, self).run(tmp, task_vars)
del tmp  # tmp no longer has any effect

if 'that' not in self._task.args:
raise AnsibleError('conditional required in "that" string')

fail_msg = 'Assertion failed'
success_msg = 'All assertions passed'

thats = self._task.args['that']

cond = Conditional(loader=self._loader)
result['_ansible_verbose_always'] = True

for that in thats:
cond.when = [str(self._make_safe(that))]
test_result = cond.evaluate_conditional(templar=self._templar, all_vars=task_vars)
if not test_result:
result['failed'] = True
result['evaluated_to'] = test_result
result['assertion'] = that

result['msg'] = fail_msg

return result

result['changed'] = False
result['msg'] = success_msg
return result
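The `_make_safe` helper in the new action plugin rebuilds the string codepoint by codepoint so the copy is a plain `str` even when the input is a clever `str` subclass (such as Ansible's `AnsibleUnsafeText`). A standalone sketch; `Tainted` here is only a stand-in for that subclass:

```python
def make_safe(text):
    # Rebuild the string character by character; the result is always a plain str.
    return ''.join(chr(ord(x)) for x in text)

class Tainted(str):  # hypothetical stand-in for AnsibleUnsafeText
    pass

plain = make_safe(Tainted("item > 1"))
print(type(plain) is str)  # True
```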
@@ -24,5 +24,5 @@
ignore_errors: "{{ item.expect_error | default(omit) }}"

- name: check results ({{ item.name }})
  assert:
  _unsafe_assert:
that: "{{ item.assertions }}"

@@ -17,25 +17,25 @@
assert:
that:
- "'project_path' in deploy_helper"
- "deploy_helper.current_path == '{{ deploy_helper.project_path }}/current'"
- "deploy_helper.releases_path == '{{ deploy_helper.project_path }}/releases'"
- "deploy_helper.shared_path == '{{ deploy_helper.project_path }}/shared'"
- "deploy_helper.current_path == deploy_helper.project_path ~ '/current'"
- "deploy_helper.releases_path == deploy_helper.project_path ~ '/releases'"
- "deploy_helper.shared_path == deploy_helper.project_path ~ '/shared'"
- "deploy_helper.unfinished_filename == 'DEPLOY_UNFINISHED'"
- "'previous_release' in deploy_helper"
- "'previous_release_path' in deploy_helper"
- "'new_release' in deploy_helper"
- "'new_release_path' in deploy_helper"
- "deploy_helper.new_release_path == '{{ deploy_helper.releases_path }}/{{ deploy_helper.new_release }}'"
- "deploy_helper.new_release_path == deploy_helper.releases_path ~ '/' ~ deploy_helper.new_release"

- name: State=query with relative overridden paths
  deploy_helper: path={{ deploy_helper_test_root }} current_path=CURRENT_PATH releases_path=RELEASES_PATH shared_path=SHARED_PATH state=query
- name: Assert State=query with relative overridden paths
  assert:
that:
- "deploy_helper.current_path == '{{ deploy_helper.project_path }}/CURRENT_PATH'"
- "deploy_helper.releases_path == '{{ deploy_helper.project_path }}/RELEASES_PATH'"
- "deploy_helper.shared_path == '{{ deploy_helper.project_path }}/SHARED_PATH'"
- "deploy_helper.new_release_path == '{{ deploy_helper.releases_path }}/{{ deploy_helper.new_release}}'"
- "deploy_helper.current_path == deploy_helper.project_path ~ '/CURRENT_PATH'"
- "deploy_helper.releases_path == deploy_helper.project_path ~ '/RELEASES_PATH'"
- "deploy_helper.shared_path == deploy_helper.project_path ~ '/SHARED_PATH'"
- "deploy_helper.new_release_path == deploy_helper.releases_path ~ '/' ~ deploy_helper.new_release"

- name: State=query with absolute overridden paths
  deploy_helper: path={{ deploy_helper_test_root }} current_path=/CURRENT_PATH releases_path=/RELEASES_PATH shared_path=/SHARED_PATH state=query
@@ -45,7 +45,7 @@
- "deploy_helper.current_path == '/CURRENT_PATH'"
- "deploy_helper.releases_path == '/RELEASES_PATH'"
- "deploy_helper.shared_path == '/SHARED_PATH'"
- "deploy_helper.new_release_path == '{{ deploy_helper.releases_path }}/{{ deploy_helper.new_release}}'"
- "deploy_helper.new_release_path == deploy_helper.releases_path ~ '/' ~ deploy_helper.new_release"

- name: State=query with overridden unfinished_filename
  deploy_helper: path={{ deploy_helper_test_root }} unfinished_filename=UNFINISHED_DEPLOY state=query

@@ -17,3 +17,4 @@ skip/rhel8.8
skip/rhel9.0
skip/rhel9.1
skip/rhel9.2
skip/rhel9.3

@@ -33,6 +33,22 @@
state: present
notify: Remove ejabberd

- name: Make runnable on Arch
  community.general.ini_file:
path: /usr/lib/systemd/system/ejabberd.service
section: Service
option: "{{ item }}"
state: absent
loop:
- PrivateDevices
- AmbientCapabilities
when: ansible_distribution == 'Archlinux'

- name: Make installable on Arch
  systemd:
daemon_reload: true
when: ansible_distribution == 'Archlinux'

- ansible.builtin.service:
name: ejabberd
state: started

@@ -10,3 +10,4 @@ skip/macos
skip/rhel9.0 # See https://www.reddit.com/r/Fedora/comments/si7nzk/homectl/
skip/rhel9.1 # See https://www.reddit.com/r/Fedora/comments/si7nzk/homectl/
skip/rhel9.2 # See https://www.reddit.com/r/Fedora/comments/si7nzk/homectl/
skip/rhel9.3 # See https://www.reddit.com/r/Fedora/comments/si7nzk/homectl/
@@ -39,4 +39,4 @@
    that:
      - result_basic_2 is not changed
      - result_basic_2 is failed
      - result_basic_2.msg == "Destination {{ non_existing_file }} does not exist!"
      - result_basic_2.msg == "Destination " ~ non_existing_file ~ " does not exist!"
@@ -7,6 +7,7 @@
  set_fact:
    interfaces_testfile: '{{ remote_tmp_dir }}/interfaces'
    interfaces_testfile_3841: '{{ remote_tmp_dir }}/interfaces_3841'
    interfaces_testfile_7610: '{{ remote_tmp_dir }}/interfaces_7610'

- name: Copy interfaces file
  copy:
@@ -65,3 +66,60 @@
    that:
      - ifile_3841_a is changed
      - ifile_3841_b is not changed

- name: 7610 - create file
  copy:
    dest: '{{ interfaces_testfile_7610 }}'
    content: |
      iface ens3 inet dhcp
      iface ens3 inet6 auto

- name: 7610 - modify file
  interfaces_file:
    dest: '{{ interfaces_testfile_7610 }}'
    iface: ens3
    address_family: "inet6"
    option: "{{ item.option }}"
    value: "{{ item.value }}"
  loop:
    - option: "method"
      value: "static"
    - option: "address"
      value: "1:2::3/48"

- name: 7610 - read file
  slurp:
    src: '{{ interfaces_testfile_7610 }}'
  register: content_7610

- name: 7610 - check assertions
  assert:
    that:
      - content_7610.content | b64decode == expected_content
  vars:
    expected_content: |
      iface ens3 inet dhcp
      iface ens3 inet6 static
      address 1:2::3/48

- name: 7610 - modify file again
  interfaces_file:
    dest: '{{ interfaces_testfile_7610 }}'
    iface: ens3
    option: method
    value: foobar

- name: 7610 - read file
  slurp:
    src: '{{ interfaces_testfile_7610 }}'
  register: content_7610

- name: 7610 - check assertions
  assert:
    that:
      - content_7610.content | b64decode == expected_content
  vars:
    expected_content: |
      iface ens3 inet foobar
      iface ens3 inet6 foobar
      address 1:2::3/48
@@ -10,5 +10,6 @@ skip/osx # FIXME
skip/rhel9.0 # FIXME
skip/rhel9.1 # FIXME
skip/rhel9.2 # FIXME
skip/rhel9.3 # FIXME
skip/freebsd12.4 # FIXME
skip/freebsd13.2 # FIXME
@@ -18,7 +18,14 @@ except ModuleNotFoundError:
    from http.server import HTTPServer, SimpleHTTPRequestHandler

httpd = HTTPServer(('localhost', port), SimpleHTTPRequestHandler)
httpd.socket = ssl.wrap_socket(httpd.socket, server_side=True,
                               certfile=os.path.join(root_dir, 'cert.pem'),
                               keyfile=os.path.join(root_dir, 'key.pem'))
try:
    httpd.socket = ssl.wrap_socket(httpd.socket, server_side=True,
                                   certfile=os.path.join(root_dir, 'cert.pem'),
                                   keyfile=os.path.join(root_dir, 'key.pem'))
except AttributeError:
    # Python 3.12 or newer:
    context = ssl.create_default_context(purpose=ssl.Purpose.CLIENT_AUTH)
    context.load_cert_chain(certfile=os.path.join(root_dir, 'cert.pem'),
                            keyfile=os.path.join(root_dir, 'key.pem'))
    httpd.socket = context.wrap_socket(httpd.socket)
httpd.handle_request()
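The hunk above works around the removal of `ssl.wrap_socket()` in Python 3.12: it tries the old call first and falls back to an `SSLContext` when the attribute is gone. The same compatibility shim, lifted into a standalone helper (the function name is illustrative, not part of the test script):

```python
import ssl

def wrap_server_socket(sock, certfile, keyfile):
    """Wrap a listening socket for server-side TLS on old and new Pythons."""
    try:
        # Works up to Python 3.11 (deprecated since 3.7).
        return ssl.wrap_socket(sock, server_side=True,
                               certfile=certfile, keyfile=keyfile)
    except AttributeError:
        # Python 3.12+: ssl.wrap_socket() no longer exists, so build a
        # server-side context and wrap the socket through it instead.
        context = ssl.create_default_context(purpose=ssl.Purpose.CLIENT_AUTH)
        context.load_cert_chain(certfile=certfile, keyfile=keyfile)
        return context.wrap_socket(sock, server_side=True)
```

`Purpose.CLIENT_AUTH` yields a context configured for a server that authenticates itself to clients, which matches what the test HTTPS server needs.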
@@ -35,14 +35,14 @@
    syncMode: FORCE
    mappers:
      - name: "first_name"
        identityProviderAlias: "oidc-idp"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "first_name"
          user.attribute: "first_name"
          syncMode: "INHERIT"
      - name: "last_name"
        identityProviderAlias: "oidc-idp"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "last_name"
@@ -84,14 +84,14 @@
    syncMode: FORCE
    mappers:
      - name: "first_name"
        identityProviderAlias: "oidc-idp"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "first_name"
          user.attribute: "first_name"
          syncMode: "INHERIT"
      - name: "last_name"
        identityProviderAlias: "oidc-idp"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "last_name"
@@ -109,7 +109,7 @@
    that:
      - result is not changed

- name: Update existing identity provider (with change)
- name: Update existing identity provider (with change, no mapper change)
  community.general.keycloak_identity_provider:
    auth_keycloak_url: "{{ url }}"
    auth_realm: "{{ admin_realm }}"
@@ -132,6 +132,109 @@
      - result.existing.enabled == true
      - result.end_state.enabled == false

- name: Update existing identity provider (delete mapper)
  community.general.keycloak_identity_provider:
    auth_keycloak_url: "{{ url }}"
    auth_realm: "{{ admin_realm }}"
    auth_username: "{{ admin_user }}"
    auth_password: "{{ admin_password }}"
    realm: "{{ realm }}"
    alias: "{{ idp }}"
    state: present
    mappers:
      - name: "first_name"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "first_name"
          user.attribute: "first_name"
          syncMode: "INHERIT"
  register: result

- name: Debug
  debug:
    var: result

- name: Assert identity provider updated
  assert:
    that:
      - result is changed
      - result.existing.mappers | length == 2
      - result.end_state.mappers | length == 1
      - result.end_state.mappers[0].name == "first_name"

- name: Update existing identity provider (add mapper)
  community.general.keycloak_identity_provider:
    auth_keycloak_url: "{{ url }}"
    auth_realm: "{{ admin_realm }}"
    auth_username: "{{ admin_user }}"
    auth_password: "{{ admin_password }}"
    realm: "{{ realm }}"
    alias: "{{ idp }}"
    state: present
    mappers:
      - name: "last_name"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "last_name"
          user.attribute: "last_name"
          syncMode: "INHERIT"
      - name: "first_name"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "first_name"
          user.attribute: "first_name"
          syncMode: "INHERIT"
  register: result

- name: Debug
  debug:
    var: result

- name: Assert identity provider updated
  assert:
    that:
      - result is changed
      - result.existing.mappers | length == 1
      - result.end_state.mappers | length == 2

- name: Update existing identity provider (no change, test mapper idempotency)
  community.general.keycloak_identity_provider:
    auth_keycloak_url: "{{ url }}"
    auth_realm: "{{ admin_realm }}"
    auth_username: "{{ admin_user }}"
    auth_password: "{{ admin_password }}"
    realm: "{{ realm }}"
    alias: "{{ idp }}"
    state: present
    mappers:
      - name: "last_name"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "last_name"
          user.attribute: "last_name"
          syncMode: "INHERIT"
      - name: "first_name"
        identityProviderAlias: "{{ idp }}"
        identityProviderMapper: "oidc-user-attribute-idp-mapper"
        config:
          claim: "first_name"
          user.attribute: "first_name"
          syncMode: "INHERIT"
  register: result

- name: Debug
  debug:
    var: result

- name: Assert identity provider updated
  assert:
    that:
      - result is not changed

- name: Delete existing identity provider
  community.general.keycloak_identity_provider:
    auth_keycloak_url: "{{ url }}"
@@ -35,7 +35,7 @@
  loop: "{{ initial_lv_status_result.stdout_lines }}"
  assert:
    that:
      - "'active' == '{{ item }}'"
      - "'active' == item"

- name: Deactivate volume group
  lvg:
@@ -100,7 +100,7 @@
  loop: "{{ activate_lv_status_result.stdout_lines }}"
  assert:
    that:
      - "'active' == '{{ item }}'"
      - "'active' == item"

- name: Activate volume group again to verify idempontency
  lvg:
@@ -138,7 +138,7 @@
  loop: "{{ activate_partial_lv_status_result.stdout_lines }}"
  assert:
    that:
      - "'active' == '{{ item }}'"
      - "'active' == item"

- name: Deactivate volume group in check mode
  lvg:
@@ -160,4 +160,4 @@
  loop: "{{ check_mode_deactivate_lv_status_result.stdout_lines }}"
  assert:
    that:
      - "'active' == '{{ item }}'"
      - "'active' == item"
@@ -10,96 +10,101 @@

# TODO: Our current implementation does not handle SMTP authentication

# NOTE: If the system does not support smtpd-tls (python 2.6 and older) we do basic tests
- name: Attempt to install smtpd-tls
  pip:
    name: smtpd-tls
    state: present
  ignore_errors: true
  register: smtpd_tls
- when:
    # TODO: https://github.com/ansible-collections/community.general/issues/4656
    - ansible_python.version.major != 3 or ansible_python.version.minor < 12
  block:

- name: Install test smtpserver
  copy:
    src: '{{ item }}'
    dest: '{{ remote_tmp_dir }}/{{ item }}'
  loop:
    - smtpserver.py
    - smtpserver.crt
    - smtpserver.key
    # NOTE: If the system does not support smtpd-tls (python 2.6 and older) we do basic tests
    - name: Attempt to install smtpd-tls
      pip:
        name: smtpd-tls
        state: present
      ignore_errors: true
      register: smtpd_tls

# FIXME: Verify the mail after it was send would be nice
# This would require either dumping the content, or registering async task output
- name: Start test smtpserver
  shell: '{{ ansible_python.executable }} {{ remote_tmp_dir }}/smtpserver.py 10025:10465'
  async: 45
  poll: 0
  register: smtpserver
    - name: Install test smtpserver
      copy:
        src: '{{ item }}'
        dest: '{{ remote_tmp_dir }}/{{ item }}'
      loop:
        - smtpserver.py
        - smtpserver.crt
        - smtpserver.key

- name: Send a basic test-mail
  mail:
    port: 10025
    subject: Test mail 1 (smtp)
    secure: never
    # FIXME: Verify the mail after it was send would be nice
    # This would require either dumping the content, or registering async task output
    - name: Start test smtpserver
      shell: '{{ ansible_python.executable }} {{ remote_tmp_dir }}/smtpserver.py 10025:10465'
      async: 45
      poll: 0
      register: smtpserver

- name: Send a test-mail with body and specific recipient
  mail:
    port: 10025
    from: ansible@localhost
    to: root@localhost
    subject: Test mail 2 (smtp + body)
    body: Test body 2
    secure: never
    - name: Send a basic test-mail
      mail:
        port: 10025
        subject: Test mail 1 (smtp)
        secure: never

- name: Send a test-mail with attachment
  mail:
    port: 10025
    from: ansible@localhost
    to: root@localhost
    subject: Test mail 3 (smtp + body + attachment)
    body: Test body 3
    attach: /etc/group
    secure: never
    - name: Send a test-mail with body and specific recipient
      mail:
        port: 10025
        from: ansible@localhost
        to: root@localhost
        subject: Test mail 2 (smtp + body)
        body: Test body 2
        secure: never

# NOTE: This might fail if smtpd-tls is missing or python 2.7.8 or older is used
- name: Send a test-mail using starttls
  mail:
    port: 10025
    from: ansible@localhost
    to: root@localhost
    subject: Test mail 4 (smtp + starttls + body + attachment)
    body: Test body 4
    attach: /etc/group
    secure: starttls
  ignore_errors: true
  register: starttls_support
    - name: Send a test-mail with attachment
      mail:
        port: 10025
        from: ansible@localhost
        to: root@localhost
        subject: Test mail 3 (smtp + body + attachment)
        body: Test body 3
        attach: /etc/group
        secure: never

# NOTE: This might fail if smtpd-tls is missing or python 2.7.8 or older is used
- name: Send a test-mail using TLS
  mail:
    port: 10465
    from: ansible@localhost
    to: root@localhost
    subject: Test mail 5 (smtp + tls + body + attachment)
    body: Test body 5
    attach: /etc/group
    secure: always
  ignore_errors: true
  register: tls_support
    # NOTE: This might fail if smtpd-tls is missing or python 2.7.8 or older is used
    - name: Send a test-mail using starttls
      mail:
        port: 10025
        from: ansible@localhost
        to: root@localhost
        subject: Test mail 4 (smtp + starttls + body + attachment)
        body: Test body 4
        attach: /etc/group
        secure: starttls
      ignore_errors: true
      register: starttls_support

- fail:
    msg: Sending mail using starttls failed.
  when: smtpd_tls is succeeded and starttls_support is failed and tls_support is succeeded
    # NOTE: This might fail if smtpd-tls is missing or python 2.7.8 or older is used
    - name: Send a test-mail using TLS
      mail:
        port: 10465
        from: ansible@localhost
        to: root@localhost
        subject: Test mail 5 (smtp + tls + body + attachment)
        body: Test body 5
        attach: /etc/group
        secure: always
      ignore_errors: true
      register: tls_support

- fail:
    msg: Send mail using TLS failed.
  when: smtpd_tls is succeeded and tls_support is failed and starttls_support is succeeded
    - fail:
        msg: Sending mail using starttls failed.
      when: smtpd_tls is succeeded and starttls_support is failed and tls_support is succeeded

- name: Send a test-mail with body, specific recipient and specific ehlohost
  mail:
    port: 10025
    ehlohost: some.domain.tld
    from: ansible@localhost
    to: root@localhost
    subject: Test mail 6 (smtp + body + ehlohost)
    body: Test body 6
    secure: never
    - fail:
        msg: Send mail using TLS failed.
      when: smtpd_tls is succeeded and tls_support is failed and starttls_support is succeeded

    - name: Send a test-mail with body, specific recipient and specific ehlohost
      mail:
        port: 10025
        ehlohost: some.domain.tld
        from: ansible@localhost
        to: root@localhost
        subject: Test mail 6 (smtp + body + ehlohost)
        body: Test body 6
        secure: never
@@ -10,5 +10,6 @@ skip/rhel8.0
skip/rhel9.0
skip/rhel9.1
skip/rhel9.2
skip/rhel9.3
skip/freebsd
skip/python2.6
@@ -21,7 +21,7 @@
- name: Test if state and value are required together
  assert:
    that:
      - "'following are missing: value' in '{{ missing_value['msg'] }}'"
      - "'following are missing: value' in missing_value['msg']"

- name: Change value of AppleMeasurementUnits to centimeter in check_mode
  osx_defaults:
@@ -194,7 +194,7 @@
  register: test_data_types

- assert:
    that: "{{ item.changed }}"
    that: "item is changed"
  with_items: "{{ test_data_types.results }}"

- name: Use different data types and delete them
@@ -208,7 +208,7 @@
  register: test_data_types

- assert:
    that: "{{ item.changed }}"
    that: "item is changed"
  with_items: "{{ test_data_types.results }}"

@@ -314,3 +314,28 @@
    that:
      - install_tox_sitewide is changed
      - usrlocaltox.stat.exists

##############################################################################
# Test for issue 7497
- name: ensure application pyinstaller is uninstalled
  community.general.pipx:
    name: pyinstaller
    state: absent

- name: Install Python Package pyinstaller
  community.general.pipx:
    name: pyinstaller
    state: present
    system_site_packages: true
    pip_args: "--no-cache-dir"
  register: install_pyinstaller

- name: cleanup pyinstaller
  community.general.pipx:
    name: pyinstaller
    state: absent

- name: check assertions
  assert:
    that:
      - install_pyinstaller is changed
@@ -16,11 +16,19 @@
  }}

- name: Include OS-specific variables
  include_vars: '{{ ansible_os_family }}.yml'
  include_vars: '{{ lookup("first_found", params) }}'
  vars:
    params:
      files:
        - '{{ ansible_distribution }}-{{ ansible_distribution_version }}.yml'
        - '{{ ansible_distribution }}-{{ ansible_distribution_major_version }}.yml'
        - '{{ ansible_os_family }}.yml'
      paths:
        - '{{ role_path }}/vars'
  when: has_java_keytool

- name: Install keytool
  package:
    name: '{{ keytool_package_name }}'
    name: '{{ keytool_package_names }}'
  become: true
  when: has_java_keytool
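The `first_found` lookup introduced above resolves the most specific vars file that actually exists, trying the candidate file names in order. A rough Python paraphrase of that resolution order (the helper name is illustrative, and this is a sketch, not the lookup plugin's actual implementation):

```python
import os

def first_found(files, paths):
    """Return the first existing candidate, preferring earlier file names."""
    # File-name order takes priority: a more specific name anywhere in the
    # search paths wins over a less specific one.
    for name in files:
        for path in paths:
            candidate = os.path.join(path, name)
            if os.path.isfile(candidate):
                return candidate
    raise LookupError('no candidate file found')
```

So `Ubuntu-22.04.yml` beats `Ubuntu-22.yml`, which beats the generic `Debian.yml` fallback, mirroring the `files:` list in the task.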
@@ -3,4 +3,5 @@
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

keytool_package_name: openjdk11-jre-headless
keytool_package_names:
  - openjdk11-jre-headless

@@ -3,4 +3,5 @@
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

keytool_package_name: jre11-openjdk-headless
keytool_package_names:
  - jre11-openjdk-headless
@@ -0,0 +1,8 @@
---
# Copyright (c) Ansible Project
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

keytool_package_names:
  - ca-certificates-java
  - openjdk-17-jre-headless
@@ -3,4 +3,5 @@
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

keytool_package_name: ca-certificates-java
keytool_package_names:
  - ca-certificates-java

@@ -3,4 +3,5 @@
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

keytool_package_name: java-11-openjdk-headless
keytool_package_names:
  - java-11-openjdk-headless

@@ -3,4 +3,5 @@
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

keytool_package_name: java-11-openjdk-headless
keytool_package_names:
  - java-11-openjdk-headless
@@ -127,6 +127,32 @@
    seconds: 5
  when: ansible_os_family == 'Suse'

- name: Make installable on Arch
  community.general.ini_file:
    path: /usr/lib/systemd/system/postgresql.service
    section: Service
    option: "{{ item }}"
    state: absent
  loop:
    - PrivateTmp
    - ProtectHome
    - ProtectSystem
    - NoNewPrivileges
    - ProtectControlGroups
    - ProtectKernelModules
    - ProtectKernelTunables
    - PrivateDevices
    - RestrictAddressFamilies
    - RestrictNamespaces
    - RestrictRealtime
    - SystemCallArchitectures
  when: ansible_distribution == 'Archlinux'

- name: Make installable on Arch
  systemd:
    daemon_reload: true
  when: ansible_distribution == 'Archlinux'

- name: Initialize postgres (Suse)
  service: name=postgresql state=started
  when: ansible_os_family == 'Suse'
@@ -0,0 +1,6 @@
---
# Copyright (c) Ansible Project
# GNU General Public License v3.0+ (see LICENSES/GPL-3.0-or-later.txt or https://www.gnu.org/licenses/gpl-3.0.txt)
# SPDX-License-Identifier: GPL-3.0-or-later

# Do nothing
@@ -140,11 +140,11 @@
- name: Test within jail
  #
  # NOTE: currently fails with FreeBSD 12 with minor version less than 4
  # NOTE: currently fails with FreeBSD 13 with minor version less than 1
  # NOTE: currently fails with FreeBSD 13 with minor version less than 2
  #
  when: >-
    ansible_distribution_version is version('12.4', '>=') and ansible_distribution_version is version('13', '<')
    or ansible_distribution_version is version('13.1', '>=')
    or ansible_distribution_version is version('13.2', '>=')
  block:
    - name: Setup testjail
      include_tasks: setup-testjail.yml
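After this change the `when:` expression reads: run on FreeBSD 12.4 up to (but not including) 13, or on 13.2 and later. A small Python paraphrase of that gating condition (the function name is illustrative):

```python
def jail_test_supported(release):
    """Mirror the jail-test version gate: >=12.4 within 12.x, or >=13.2."""
    # Compare on (major, minor) tuples parsed from a 'major.minor' string.
    major, minor = (int(part) for part in release.split('.')[:2])
    return ((major, minor) >= (12, 4) and major < 13) or (major, minor) >= (13, 2)
```

With the threshold bumped from 13.1 to 13.2, a 13.1 host now falls outside the gate while 13.2 and the whole 14.x line remain inside it.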
@@ -12,6 +12,7 @@ skip/rhel8.0 # FIXME
skip/rhel9.0 # FIXME
skip/rhel9.1 # FIXME
skip/rhel9.2 # FIXME
skip/rhel9.3 # FIXME
skip/docker
needs/root
needs/target/setup_epel
@@ -107,7 +107,7 @@ MOCK_ENTRIES = {
        "queries": ["Omitted values"],
        "kwargs": {
            "field": "section-label-without-value",
            "section": "section-without-values"
            "section": "Section-Without-Values"
        },
        "expected": [""],
        "output": load_file("v2_out_04.json")
@@ -13,10 +13,10 @@
      "additional_information": "Jan 18, 2015, 08:13:38",
      "fields": [
        {
          "id": "password",
          "id": "Password",
          "type": "CONCEALED",
          "purpose": "PASSWORD",
          "label": "password",
          "label": "Password",
          "value": "OctoberPoppyNuttyDraperySabbath",
          "reference": "op://Test Vault/Authy Backup/password",
          "password_details": {
@@ -29,6 +29,8 @@ def patch_redhat_subscription(mocker):
                 return_value='/testbin/subscription-manager')
    mocker.patch('ansible_collections.community.general.plugins.modules.redhat_subscription.Rhsm._can_connect_to_dbus',
                 return_value=False)
    mocker.patch('ansible_collections.community.general.plugins.modules.redhat_subscription.Rhsm._has_dbus_interface',
                 return_value=False)
    mocker.patch('ansible_collections.community.general.plugins.modules.redhat_subscription.getuid',
                 return_value=0)
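The new `_has_dbus_interface` patch follows the same fixture pattern as the existing `_can_connect_to_dbus` one: both probes are forced to `False` so the module under test always takes the non-D-Bus code path. A self-contained sketch of how such method patches behave (the class and methods here are stand-ins, not the real module):

```python
from unittest import mock

class Rhsm:  # stand-in for the real class being patched
    def _can_connect_to_dbus(self):
        return True

    def _has_dbus_interface(self):
        return True

# While both patches are active, every instance reports no D-Bus support,
# regardless of what the host actually provides.
with mock.patch.object(Rhsm, '_can_connect_to_dbus', return_value=False), \
     mock.patch.object(Rhsm, '_has_dbus_interface', return_value=False):
    rhsm = Rhsm()
    dbus_available = rhsm._can_connect_to_dbus() or rhsm._has_dbus_interface()
# dbus_available is False inside the with-block; the originals are
# restored automatically once it exits.
```

Using a context manager (or pytest-mock's `mocker`, as the fixture does) guarantees the patches are undone even when a test fails.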
@@ -6,7 +6,8 @@ unittest2 ; python_version < '2.7'
importlib ; python_version < '2.7'

# requirement for the memcached cache plugin
python-memcached
python-memcached < 1.60 ; python_version < '3.6'
python-memcached ; python_version >= '3.6'

# requirement for the redis cache plugin
redis
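The split pin above relies on PEP 508 environment markers: on interpreters older than 3.6 pip keeps the last python-memcached release that still supports them, while newer interpreters track the latest. The marker evaluates roughly like this (a paraphrase of the selection logic, not how pip actually parses requirement files):

```python
import sys

# "python-memcached < 1.60 ; python_version < '3.6'" applies only on old
# interpreters; on 3.6+ the unpinned requirement is selected instead.
if sys.version_info < (3, 6):
    requirement = 'python-memcached < 1.60'
else:
    requirement = 'python-memcached'
```

Installers evaluate the marker per environment, so a single requirements file stays correct across the whole tested interpreter matrix.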