Compare commits

...

29 Commits
3.3.2 ... 3.4.0

Author SHA1 Message Date
Felix Fontein
5a71909770 Release 3.4.0. 2021-07-19 23:59:45 +02:00
patchback[bot]
9d0af30702 Keycloak: add clientscope management (#2905) (#3037)
* Add new keycloak_clientscope module

* Add description and protocol parameter + Indentation Fix

* Add protocolMappers parameter

* Add documentation and fix updating of protocolMappers

* Update plugins/modules/identity/keycloak/keycloak_clientscope.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/identity/keycloak/keycloak_clientscope.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/identity/keycloak/keycloak_clientscope.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/identity/keycloak/keycloak_clientscope.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/identity/keycloak/keycloak_clientscope.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/identity/keycloak/keycloak_clientscope.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Add sanitize_cr(clientscoperep) function to sanitize the clientscope representation

* Add unit tests for clientscope Keycloak module

* Apply suggestions from code review

Co-authored-by: Felix Fontein <felix@fontein.de>
(cherry picked from commit 4a392372a8)

Co-authored-by: Gaetan2907 <48204380+Gaetan2907@users.noreply.github.com>
2021-07-19 23:58:26 +02:00
patchback[bot]
9dc21447cc Add Keycloak roles module (#2930) (#3035)
* implement simple realm and client role

* fix documentation

* code cleanup

* separate realm and client roles functions

* remove blank lines

* add tests

* fix linefeeds

* fix indentation

* fix error message

* fix documentation

* fix documentation

* keycloak_role integration tests

* keycloak_role integration tests

* remove extra blank line

* add version_added tag

Co-authored-by: Felix Fontein <felix@fontein.de>

Co-authored-by: Felix Fontein <felix@fontein.de>
(cherry picked from commit d7c6ba89f8)

Co-authored-by: Laurent Paumier <30328363+laurpaum@users.noreply.github.com>
2021-07-19 23:38:51 +02:00
patchback[bot]
940130c959 Feature/gitlab project configuration (#3002) (#3032)
* added
- allow_merge_on_skipped_pipeline
- only_allow_merge_if_all_discussions_are_resolved
- only_allow_merge_if_pipeline_succeeds
- packages_enabled
- remove_source_branch_after_merge
- squash_option

* minor fix

* added changelog

* Fixed linter findings

* changed version_added to 3.4 -> check requires to do so

* Update changelogs/fragments/3001-enhance_gitlab_module.yml

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/source_control/gitlab/gitlab_project.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/source_control/gitlab/gitlab_project.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/source_control/gitlab/gitlab_project.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/source_control/gitlab/gitlab_project.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/source_control/gitlab/gitlab_project.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* Update plugins/modules/source_control/gitlab/gitlab_project.py

Co-authored-by: Felix Fontein <felix@fontein.de>

* rework due to review of felixfontein:
- changed option description to full sentences
- change default behaviour of new properties

* Requested changes

Co-authored-by: Max Bidlingmaier <Max-Florian.Bidlingmaier@sap.com>
Co-authored-by: Felix Fontein <felix@fontein.de>
(cherry picked from commit a3607a745e)

Co-authored-by: suukit <maks@konsolan.de>
2021-07-19 12:04:06 +02:00
patchback[bot]
0b239199e7 archive - staging idempotency fix (#2987) (#3030)
* Initial Commit

* Fixing PY26 filter

* Adding changelog fragment

* Removing checksum related code

* Removing list comparisons due to Jinja errors

* Applying review suggestions

* Applying review suggestions - typos

(cherry picked from commit 9fd2ba60df)

Co-authored-by: Ajpantuso <ajpantuso@gmail.com>
2021-07-19 09:40:26 +02:00
patchback[bot]
f0d6fcb3fa Fix snap's channel option. (#3028) (#3029)
(cherry picked from commit 7b9687f758)

Co-authored-by: Felix Fontein <felix@fontein.de>
2021-07-19 09:40:11 +02:00
Felix Fontein
e1aad0db30 Prepare 3.4.0 release. 2021-07-17 15:33:34 +02:00
patchback[bot]
7701ea0293 Added module for creating protected branches (#2781) (#3024)
* Added module for creating protected branches

* Applied some changes due to comments and added a test that currently fails

* Changing no_access to nobody due to comment on PR

* Changing the description to clarify it a bit more

* Added working tests for module 'gitlab_protected_branch'

* Fixing lint issues

* Added doc that minimum of v2.3.0 is needed to work correctly

* Fixed the requirements notation

* Check the version of the module

* Hopefully fixed the tests by skipping it when lower version of 2.3.0 is installed

* Fix lint issues

* Applying changes due to comments in PR

* Remove commented code

* Removing the trailing dot ...

Co-authored-by: jenkins-x-bot <jenkins-x@googlegroups.com>
Co-authored-by: Werner Dijkerman <iam@werner-dijkerman.nl>
(cherry picked from commit 7734430f23)

Co-authored-by: Werner Dijkerman <werner@dj-wasabi.nl>
2021-07-17 10:46:42 +02:00
patchback[bot]
9afb84c8f3 Check targets (#3019) (#3022)
* Add extra sanity test to check aliases files.

* Remove invalid target name.

(cherry picked from commit 27ba98a68e)

Co-authored-by: Felix Fontein <felix@fontein.de>
2021-07-16 20:13:19 +02:00
patchback[bot]
1746d11749 Enable tests (#3015) (#3018)
* Enable tests.

* Fix error message check.

* Fix boolean tests.

* Adjust to Python version.

(cherry picked from commit 9b1c6f0743)

Co-authored-by: Felix Fontein <felix@fontein.de>
2021-07-16 20:06:35 +02:00
patchback[bot]
ea3b8eeee7 Redfish Bootoverride Disable behaves incorrectly (#3006) (#3017)
* https://github.com/ansible-collections/community.general/issues/3005

Bypass the boot device argument check when the command is: DisableBootOverride
as it isn't needed to perform this operation.

* Add changelog fragment

(cherry picked from commit ea822c7bdd)

Co-authored-by: Scott Seekamp <sylgeist@users.noreply.github.com>
2021-07-16 19:18:46 +02:00
patchback[bot]
8c9add3d15 rax_mon_notification_plan - fixed validation check (#2955) (#2977)
* fixed validation-modules for plugins/modules/cloud/rackspace/rax_mon_notification_plan.py

* fixed sanity check

* added changelog fragment

(cherry picked from commit 0e90ff48b5)

Co-authored-by: Alexei Znamensky <103110+russoz@users.noreply.github.com>
2021-07-16 14:17:06 +02:00
patchback[bot]
9244d0ae47 pamd - fixed single line issue (#2989) (#3013)
* fixed pamd single line issue

* added changelog fragment

* supported case for 0 lines, improved test

(cherry picked from commit a3a40f6de3)

Co-authored-by: Alexei Znamensky <103110+russoz@users.noreply.github.com>
2021-07-14 13:21:57 +02:00
patchback[bot]
22591fb6e1 Update README.md (#3003) (#3010)
(cherry picked from commit 28193b699b)

Co-authored-by: Andrew Klychkov <aklychko@redhat.com>
2021-07-14 08:46:23 +02:00
patchback[bot]
166fa1a7fa [nmcli] add runner and runner-hwaddr-policy for network teaming (#2901) (#3008)
* [nmcli] add runner and runner-hwaddr-policy for network teaming

* [nmcli] delete extra space

* Update plugins/modules/net_tools/nmcli.py

* Update plugins/modules/net_tools/nmcli.py

* [nmcli] add changelog fragment

* Update plugins/modules/net_tools/nmcli.py

Co-authored-by: Amin Vakil <info@aminvakil.com>

Co-authored-by: Oriol MULA VALLS <oriol.mula@lxp.lu>
Co-authored-by: Alexei Znamensky <103110+russoz@users.noreply.github.com>
Co-authored-by: Amin Vakil <info@aminvakil.com>
(cherry picked from commit 9ffc1ef393)

Co-authored-by: omula <joriol.mula@gmail.com>
2021-07-14 08:46:13 +02:00
patchback[bot]
9e541a6f11 Keycloak: Improve diff mode on keycloak_authentication module (#2963) (#3000)
* Fix diff mode when updating authentication flow with keycloak_authentication module

* Add changelog fragment

* Fix unit test

* Update plugins/modules/identity/keycloak/keycloak_authentication.py

Co-authored-by: Ajpantuso <ajpantuso@gmail.com>

* Update changelogs/fragments/2963-improve-diff-mode-on-keycloak_authentication.yml

Co-authored-by: Ajpantuso <ajpantuso@gmail.com>

* Update documentation of create_or_update_executions function (return tuple instead of dict)

* Update plugins/modules/identity/keycloak/keycloak_authentication.py

Co-authored-by: Ajpantuso <ajpantuso@gmail.com>

* Update plugins/modules/identity/keycloak/keycloak_authentication.py

Co-authored-by: Ajpantuso <ajpantuso@gmail.com>

Co-authored-by: Ajpantuso <ajpantuso@gmail.com>
(cherry picked from commit 3fc97bf80a)

Co-authored-by: Gaetan2907 <48204380+Gaetan2907@users.noreply.github.com>
2021-07-13 12:13:25 +02:00
patchback[bot]
dbb37194d4 added missing copyright notes to MH integration tests (#2990) (#2993)
(cherry picked from commit d56d34bce6)

Co-authored-by: Alexei Znamensky <103110+russoz@users.noreply.github.com>
2021-07-11 16:52:55 +02:00
patchback[bot]
3cd7b0ec25 module_helper cmd - added feature flag to control whether CmdMixin adds rc, out and err t… (#2922) (#2988)
* added feature flag to control whether CmdMixin adds rc, out and err to the result of the module

* added changelog fragment

* changed from a global flag to parameters in run_command

* updated changelog

* fixed brainless copy-paste of yours truly

(cherry picked from commit c5cbe2943b)

Co-authored-by: Alexei Znamensky <103110+russoz@users.noreply.github.com>
2021-07-11 16:25:32 +02:00
patchback[bot]
1c84389f50 feat: support datadog_monitor composite type (#2958) (#2986)
* feat: support datadog_monitor composite type

* docs: note support for composite types

* lint

* lint: line lengths

* doc: changelog frag

(cherry picked from commit 7a41833e59)

Co-authored-by: Tyler Schwend <tyler.schwend@placeexchange.com>
2021-07-10 20:40:43 +02:00
patchback[bot]
61de9ce51c filesystem: extend support for FreeBSD (#2902) (#2983)
* extend support for FreeBSD

* Check if FS exists with `fstyp` if `blkid` fails to find FS signature
  (fix a potential data loss)
* Add support for FreeBSD special devices (character devices).
* Add support for FreeBSD native fstype (UFS).
* Update DOCUMENTATION accordingly.

* add/update integration tests

* Add tests for `fstype=ufs` on FreeBSD.
* Run `remove_fs` tests (`state=absent`) on FreeBSD.
* Run `overwrite_another_fs` tests on FreeBSD.

* add a changelog fragment

* fix indentation

* restrict new tests to regular files

* fix typo

* fix searching of providersize (block count)

* add '-y' option to growfs command

* remove references to versions older than the collection itself

* bump version adding new feats to 3.4.0

* reformat *collection* and *version added* for better DOCUMENTATION parsing

* skip tests for FreeBSD < 12.2

* run tests for FreeBSD >= 12.2

* re-enable tests for FreeBSD < 12.2 and give it a try with group1

* util-linux not available on FreeBSD < 12.2

(cherry picked from commit 9023d4dba1)

Co-authored-by: quidame <quidame@poivron.org>
2021-07-10 16:56:09 +02:00
patchback[bot]
7ccd5c9116 proxmox inventory - fix parsing for offline nodes (#2967) (#2985)
* Initial commit

* Adding changelog fragment

* Applying initial review suggestions

(cherry picked from commit 111c5de550)

Co-authored-by: Ajpantuso <ajpantuso@gmail.com>
2021-07-10 16:55:57 +02:00
patchback[bot]
e3cea35f2c Temporarily disable passwordstore lookup tests on macOS and OSX. (#2979) (#2982)
(cherry picked from commit 4ae392e5de)

Co-authored-by: Felix Fontein <felix@fontein.de>
2021-07-10 13:49:12 +02:00
patchback[bot]
94f58d1920 launchd - fixed validation check (#2960) (#2976)
* replaced use of expanduser() with value from HOME var

* fixed sanity check

* added changelog fragment

(cherry picked from commit 1990f79d8a)

Co-authored-by: Alexei Znamensky <103110+russoz@users.noreply.github.com>
2021-07-10 13:22:41 +02:00
patchback[bot]
0f884bbadc added comments to the ignore files (#2972) (#2974)
(cherry picked from commit ad8c4e4de6)

Co-authored-by: Alexei Znamensky <103110+russoz@users.noreply.github.com>
2021-07-10 13:21:03 +02:00
patchback[bot]
6ca3e78d11 archive - adding dest_state return value and enhancing integration tests. (#2913) (#2973)
* Initial commit

* Adding changelog fragment

* fixing changelog fragment

* Updating documentation

* Applying review suggestions

(cherry picked from commit 288fe1cfc6)

Co-authored-by: Ajpantuso <ajpantuso@gmail.com>
2021-07-10 13:19:53 +02:00
John R Barker
a09d70daa0 Update commit-rights.md (#2964)
aminvakil is no longer involved with the Ansible Community due to United
States export controls and economic sanctions laws, which apply to U.S.
persons, entities, and controlled software and technology that is of
U.S. origin or that enters the U.S., including open source software.

(cherry picked from commit 518ace2562)
2021-07-09 13:00:11 +01:00
patchback[bot]
c2a3cf35c7 jenkins_job_info: Remove necessities of password or token. (#2948) (#2961)
* Remove necessities on password or token.

* Upper case letter -> Lower case letter

Co-authored-by: Amin Vakil <info@aminvakil.com>

* Documentation update.

* C -> I

Co-authored-by: Amin Vakil <info@aminvakil.com>
(cherry picked from commit d97a9b5961)

Co-authored-by: Tong He <68936428+unnecessary-username@users.noreply.github.com>
2021-07-09 08:52:10 +02:00
patchback[bot]
ee5ff3b31b Add option to the keycloak_client module (#2949) (#2962)
* Add authentication_flow_binding_overrides option to the keycloak_client module

* Add changelog fragment

* Update changelogs/fragments/2949-add_authentication-flow-binding_keycloak-client.yml

Co-authored-by: Amin Vakil <info@aminvakil.com>

* Update plugins/modules/identity/keycloak/keycloak_client.py

Co-authored-by: Amin Vakil <info@aminvakil.com>

* Update plugins/modules/identity/keycloak/keycloak_client.py

Co-authored-by: Amin Vakil <info@aminvakil.com>

* Add unit test authentication_flow_binding_overrides feature on keycloak_client module

Co-authored-by: Amin Vakil <info@aminvakil.com>
(cherry picked from commit 1b80a9c587)

Co-authored-by: Gaetan2907 <48204380+Gaetan2907@users.noreply.github.com>
2021-07-09 08:51:55 +02:00
Felix Fontein
18b7333f93 Next expected release is 3.4.0. 2021-07-08 11:02:27 +02:00
71 changed files with 4359 additions and 773 deletions

@@ -6,6 +6,58 @@ Community General Release Notes
This changelog describes changes after version 2.0.0.
v3.4.0
======
Release Summary
---------------
Regular bugfix and feature release.
Minor Changes
-------------
- archive - added ``dest_state`` return value to describe final state of ``dest`` after successful task execution (https://github.com/ansible-collections/community.general/pull/2913).
- archive - refactoring prior to fix for idempotency checks. The fix will be a breaking change and only appear in community.general 4.0.0 (https://github.com/ansible-collections/community.general/pull/2987).
- datadog_monitor - allow creation of composite datadog monitors (https://github.com/ansible-collections/community.general/issues/2956).
- filesystem - extend support for FreeBSD. Avoid potential data loss by checking existence of a filesystem with ``fstyp`` (native command) if ``blkid`` (foreign command) doesn't find one. Add support for character devices and ``ufs`` filesystem type (https://github.com/ansible-collections/community.general/pull/2902).
- gitlab_project - add new options ``allow_merge_on_skipped_pipeline``, ``only_allow_merge_if_all_discussions_are_resolved``, ``only_allow_merge_if_pipeline_succeeds``, ``packages_enabled``, ``remove_source_branch_after_merge``, ``squash_option`` (https://github.com/ansible-collections/community.general/pull/3002).
- jenkins_job_info - the ``password`` and ``token`` parameters can also be omitted to retrieve only public information (https://github.com/ansible-collections/community.general/pull/2948).
- keycloak_authentication - enhanced diff mode to also return before and after state when the authentication flow is updated (https://github.com/ansible-collections/community.general/pull/2963).
- keycloak_client - add ``authentication_flow_binding_overrides`` option (https://github.com/ansible-collections/community.general/pull/2949).
- module_helper module utils - added feature flag parameters to ``CmdMixin`` to control whether ``rc``, ``out`` and ``err`` are automatically added to the module output (https://github.com/ansible-collections/community.general/pull/2922).
- nmcli - add ``runner`` and ``runner_hwaddr_policy`` options (https://github.com/ansible-collections/community.general/issues/2901).
- rax_mon_notification_plan - fixed validation checks by specifying type ``str`` as the ``elements`` of parameters ``ok_state``, ``warning_state`` and ``critical_state`` (https://github.com/ansible-collections/community.general/pull/2955).
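The filesystem entry above describes a two-step probe: a device is only treated as empty when ``blkid`` finds no signature *and* FreeBSD's native ``fstyp`` confirms it. A minimal sketch of that fallback, with hypothetical probe callables standing in for running the real commands:

```python
# Sketch of the blkid -> fstyp fallback that guards against data loss:
# a device is only considered "empty" when *both* probes find nothing.
# The two probe arguments are hypothetical stand-ins for invoking the
# real commands and returning the detected filesystem type (or None).

def detect_existing_fs(blkid_probe, fstyp_probe):
    """Return the detected filesystem type, or None if the device is clean."""
    fstype = blkid_probe()      # foreign (util-linux) tool, may miss UFS
    if fstype is None:
        fstype = fstyp_probe()  # native FreeBSD tool as a second opinion
    return fstype

# A signature that blkid misses but fstyp recognizes is NOT overwritten:
assert detect_existing_fs(lambda: None, lambda: "ufs") == "ufs"
# Only when both probes agree the device is empty may mkfs proceed:
assert detect_existing_fs(lambda: None, lambda: None) is None
```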
Bugfixes
--------
- launchd - fixed sanity check in the module's code (https://github.com/ansible-collections/community.general/pull/2960).
- pamd - fixed problem with files containing only one or two lines (https://github.com/ansible-collections/community.general/issues/2925).
- proxmox inventory plugin - fixed parsing failures when some cluster nodes are offline (https://github.com/ansible-collections/community.general/issues/2931).
- redfish_command - fix extraneous error caused by missing ``bootdevice`` argument when using the ``DisableBootOverride`` sub-command (https://github.com/ansible-collections/community.general/issues/3005).
- snap - fix formatting of ``--channel`` argument when the ``channel`` option is used (https://github.com/ansible-collections/community.general/pull/3028).
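The redfish_command fix above boils down to gating an argument check on the sub-command: ``DisableBootOverride`` needs no boot device, so requiring one was spurious. A sketch of the idea (function name and message text are illustrative, not the module's actual code):

```python
# Illustrative version of the argument check: DisableBootOverride needs no
# boot device, so the validation is bypassed for that sub-command only.

def validate_boot_override(command, bootdevice):
    """Return an error string, or None if the arguments are acceptable."""
    if command == "DisableBootOverride":
        return None                  # nothing to point the boot at
    if not bootdevice:
        return "bootdevice required for %s" % command
    return None

assert validate_boot_override("DisableBootOverride", None) is None
assert validate_boot_override("EnableBootOverride", None) is not None
assert validate_boot_override("EnableBootOverride", "Pxe") is None
```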
New Modules
-----------
Identity
~~~~~~~~
keycloak
^^^^^^^^
- keycloak_clientscope - Allows administration of Keycloak client_scopes via Keycloak API
- keycloak_role - Allows administration of Keycloak roles via Keycloak API
Source Control
~~~~~~~~~~~~~~
gitlab
^^^^^^
- gitlab_protected_branch - (un)Marking existing branches for protection
v3.3.2
======

@@ -58,7 +58,9 @@ See [Ansible Using collections](https://docs.ansible.com/ansible/latest/user_gui
## Contributing to this collection
-The content of this collection is made by good people like you, a community of individuals collaborating on making the world better through developing automation software.
+The content of this collection is made by good people just like you, a community of individuals collaborating on making the world better through developing automation software.
We are actively accepting new contributors.
All types of contributions are very welcome.

@@ -1431,3 +1431,69 @@ releases:
- 2951-mh-vars-deepcopy.yml
- 3.3.2.yml
release_date: '2021-07-08'
3.4.0:
changes:
bugfixes:
- launchd - fixed sanity check in the module's code (https://github.com/ansible-collections/community.general/pull/2960).
- pamd - fixed problem with files containing only one or two lines (https://github.com/ansible-collections/community.general/issues/2925).
- proxmox inventory plugin - fixed parsing failures when some cluster nodes
are offline (https://github.com/ansible-collections/community.general/issues/2931).
- redfish_command - fix extraneous error caused by missing ``bootdevice`` argument
when using the ``DisableBootOverride`` sub-command (https://github.com/ansible-collections/community.general/issues/3005).
- snap - fix formatting of ``--channel`` argument when the ``channel`` option
is used (https://github.com/ansible-collections/community.general/pull/3028).
minor_changes:
- archive - added ``dest_state`` return value to describe final state of ``dest``
after successful task execution (https://github.com/ansible-collections/community.general/pull/2913).
- archive - refactoring prior to fix for idempotency checks. The fix will be
a breaking change and only appear in community.general 4.0.0 (https://github.com/ansible-collections/community.general/pull/2987).
- datadog_monitor - allow creation of composite datadog monitors (https://github.com/ansible-collections/community.general/issues/2956).
- filesystem - extend support for FreeBSD. Avoid potential data loss by checking
existence of a filesystem with ``fstyp`` (native command) if ``blkid`` (foreign
command) doesn't find one. Add support for character devices and ``ufs`` filesystem
type (https://github.com/ansible-collections/community.general/pull/2902).
- gitlab_project - add new options ``allow_merge_on_skipped_pipeline``, ``only_allow_merge_if_all_discussions_are_resolved``,
``only_allow_merge_if_pipeline_succeeds``, ``packages_enabled``, ``remove_source_branch_after_merge``,
``squash_option`` (https://github.com/ansible-collections/community.general/pull/3002).
- jenkins_job_info - the ``password`` and ``token`` parameters can also be omitted
to retrieve only public information (https://github.com/ansible-collections/community.general/pull/2948).
- keycloak_authentication - enhanced diff mode to also return before and after
state when the authentication flow is updated (https://github.com/ansible-collections/community.general/pull/2963).
- keycloak_client - add ``authentication_flow_binding_overrides`` option (https://github.com/ansible-collections/community.general/pull/2949).
- module_helper module utils - added feature flag parameters to ``CmdMixin``
to control whether ``rc``, ``out`` and ``err`` are automatically added to
the module output (https://github.com/ansible-collections/community.general/pull/2922).
- nmcli - add ``runner`` and ``runner_hwaddr_policy`` options (https://github.com/ansible-collections/community.general/issues/2901).
- rax_mon_notification_plan - fixed validation checks by specifying type ``str``
as the ``elements`` of parameters ``ok_state``, ``warning_state`` and ``critical_state``
(https://github.com/ansible-collections/community.general/pull/2955).
release_summary: Regular bugfix and feature release.
fragments:
- 2901-nmcli_teaming.yml
- 2902-filesystem_extend_freebsd_support.yml
- 2913-archive-dest_state.yml
- 2922-mh-cmd-output-feature-flag.yml
- 2948-jenkins_job_info-remove_necessities_on_password_or_token.yml
- 2949-add_authentication-flow-binding_keycloak-client.yml
- 2955-rax_mon_notification_plan-added-elements-to-list-params.yaml
- 2958-datadog_monitor_support_composites.yml
- 2960-launchd-validation-check.yaml
- 2963-improve-diff-mode-on-keycloak_authentication.yml
- 2967-proxmox_inventory-offline-node-fix.yml
- 2987-archive-stage-idempotency-fix.yml
- 2989-pamd-single-line.yaml
- 3.4.0.yml
- 3001-enhance_gitlab_module.yml
- 3006-redfish_command-bootoverride-argument-check.yaml
- 3028-snap-channel.yml
modules:
- description: (un)Marking existing branches for protection
name: gitlab_protected_branch
namespace: source_control.gitlab
- description: Allows administration of Keycloak client_scopes via Keycloak API
name: keycloak_clientscope
namespace: identity.keycloak
- description: Allows administration of Keycloak roles via Keycloak API
name: keycloak_role
namespace: identity.keycloak
release_date: '2021-07-20'

@@ -68,7 +68,6 @@ Individuals who have been asked to become a part of this group have generally be
| Name | GitHub ID | IRC Nick | Other |
| ------------------- | -------------------- | ------------------ | -------------------- |
| Alexei Znamensky | russoz | russoz | |
-| Amin Vakil | aminvakil | aminvakil | |
| Andrew Klychkov | andersson007 | andersson007_ | |
| Felix Fontein | felixfontein | felixfontein | |
| John R Barker | gundalow | gundalow | |

@@ -1,6 +1,6 @@
namespace: community
name: general
-version: 3.3.2
+version: 3.4.0
readme: README.md
authors:
- Ansible (https://github.com/ansible)

@@ -369,6 +369,9 @@ class InventoryModule(BaseInventoryPlugin, Constructable, Cacheable):
if node['type'] == 'node':
self.inventory.add_child(nodes_group, node['node'])
if node['status'] == 'offline':
continue
# get node IP address
if self.get_option("want_proxmox_nodes_ansible_host"):
ip = self._get_node_ip(node['node'])
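The hunk above adds an early ``continue`` for offline nodes: they are still added as children of the nodes group, but the per-node lookups that fail for offline nodes are skipped. A standalone sketch of the same control flow (the node dicts mirror the ``node``/``status`` keys used above; ``resolve_ip`` is a hypothetical stand-in for ``_get_node_ip``):

```python
# Standalone sketch of the offline-node guard: offline nodes stay in the
# inventory grouping, but no IP lookup (which fails for them) is attempted.

def hosts_with_ips(nodes, resolve_ip):
    registered, ips = [], {}
    for node in nodes:
        registered.append(node["node"])   # grouping happens regardless
        if node["status"] == "offline":
            continue                      # the fix: skip the per-node API calls
        ips[node["node"]] = resolve_ip(node["node"])
    return registered, ips

nodes = [
    {"node": "pve1", "status": "online"},
    {"node": "pve2", "status": "offline"},
]
registered, ips = hosts_with_ips(nodes, lambda name: "10.0.0.1")
assert registered == ["pve1", "pve2"]   # offline node still registered
assert "pve2" not in ips                # but never queried for an address
```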

@@ -43,14 +43,25 @@ URL_REALM = "{url}/admin/realms/{realm}"
URL_TOKEN = "{url}/realms/{realm}/protocol/openid-connect/token"
URL_CLIENT = "{url}/admin/realms/{realm}/clients/{id}"
URL_CLIENTS = "{url}/admin/realms/{realm}/clients"
URL_CLIENT_ROLES = "{url}/admin/realms/{realm}/clients/{id}/roles"
URL_CLIENT_ROLE = "{url}/admin/realms/{realm}/clients/{id}/roles/{name}"
URL_CLIENT_ROLE_COMPOSITES = "{url}/admin/realms/{realm}/clients/{id}/roles/{name}/composites"
URL_REALM_ROLES = "{url}/admin/realms/{realm}/roles"
URL_REALM_ROLE = "{url}/admin/realms/{realm}/roles/{name}"
URL_REALM_ROLE_COMPOSITES = "{url}/admin/realms/{realm}/roles/{name}/composites"
URL_CLIENTTEMPLATE = "{url}/admin/realms/{realm}/client-templates/{id}"
URL_CLIENTTEMPLATES = "{url}/admin/realms/{realm}/client-templates"
URL_GROUPS = "{url}/admin/realms/{realm}/groups"
URL_GROUP = "{url}/admin/realms/{realm}/groups/{groupid}"
URL_CLIENTSCOPES = "{url}/admin/realms/{realm}/client-scopes"
URL_CLIENTSCOPE = "{url}/admin/realms/{realm}/client-scopes/{id}"
URL_CLIENTSCOPE_PROTOCOLMAPPERS = "{url}/admin/realms/{realm}/client-scopes/{id}/protocol-mappers/models"
URL_CLIENTSCOPE_PROTOCOLMAPPER = "{url}/admin/realms/{realm}/client-scopes/{id}/protocol-mappers/models/{mapper_id}"
URL_AUTHENTICATION_FLOWS = "{url}/admin/realms/{realm}/authentication/flows"
URL_AUTHENTICATION_FLOW = "{url}/admin/realms/{realm}/authentication/flows/{id}"
URL_AUTHENTICATION_FLOW_COPY = "{url}/admin/realms/{realm}/authentication/flows/{copyfrom}/copy"
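The endpoint constants above are ordinary Python format strings with named placeholders, filled by keyword at each call site. For example, using the URL_CLIENTSCOPE template exactly as defined above (the base URL and ID are made-up values):

```python
# The endpoint templates are plain str.format strings; each call site
# fills the named placeholders by keyword, in any order.
URL_CLIENTSCOPE = "{url}/admin/realms/{realm}/client-scopes/{id}"

endpoint = URL_CLIENTSCOPE.format(
    url="https://keycloak.example.com/auth",   # example base URL
    realm="master",
    id="1234-abcd",
)
assert endpoint == "https://keycloak.example.com/auth/admin/realms/master/client-scopes/1234-abcd"
```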
@@ -505,6 +516,239 @@ class KeycloakAPI(object):
self.module.fail_json(msg='Could not delete client template %s in realm %s: %s'
% (id, realm, str(e)))
def get_clientscopes(self, realm="master"):
""" Fetch the name and ID of all clientscopes on the Keycloak server.
To fetch the full data of the clientscope, make a subsequent call to
get_clientscope_by_clientscopeid, passing in the ID of the clientscope you wish to return.
:param realm: Realm in which the clientscope resides; default 'master'.
:return The clientscopes of this realm (default "master")
"""
clientscopes_url = URL_CLIENTSCOPES.format(url=self.baseurl, realm=realm)
try:
return json.loads(to_native(open_url(clientscopes_url, method="GET", headers=self.restheaders,
validate_certs=self.validate_certs).read()))
except Exception as e:
self.module.fail_json(msg="Could not fetch list of clientscopes in realm %s: %s"
% (realm, str(e)))
def get_clientscope_by_clientscopeid(self, cid, realm="master"):
""" Fetch a keycloak clientscope from the provided realm using the clientscope's unique ID.
If the clientscope does not exist, None is returned.
cid is a UUID provided by the Keycloak API
:param cid: UUID of the clientscope to be returned
:param realm: Realm in which the clientscope resides; default 'master'.
"""
clientscope_url = URL_CLIENTSCOPE.format(url=self.baseurl, realm=realm, id=cid)
try:
return json.loads(to_native(open_url(clientscope_url, method="GET", headers=self.restheaders,
validate_certs=self.validate_certs).read()))
except HTTPError as e:
if e.code == 404:
return None
else:
self.module.fail_json(msg="Could not fetch clientscope %s in realm %s: %s"
% (cid, realm, str(e)))
except Exception as e:
self.module.fail_json(msg="Could not fetch clientscope %s in realm %s: %s"
% (cid, realm, str(e)))
def get_clientscope_by_name(self, name, realm="master"):
""" Fetch a keycloak clientscope within a realm based on its name.
The Keycloak API does not allow filtering of the clientscopes resource by name.
As a result, this method first retrieves the entire list of clientscopes - name and ID -
then performs a second query to fetch the clientscope.
If the clientscope does not exist, None is returned.
:param name: Name of the clientscope to fetch.
:param realm: Realm in which the clientscope resides; default 'master'
"""
try:
all_clientscopes = self.get_clientscopes(realm=realm)
for clientscope in all_clientscopes:
if clientscope['name'] == name:
return self.get_clientscope_by_clientscopeid(clientscope['id'], realm=realm)
return None
except Exception as e:
self.module.fail_json(msg="Could not fetch clientscope %s in realm %s: %s"
% (name, realm, str(e)))
def create_clientscope(self, clientscoperep, realm="master"):
""" Create a Keycloak clientscope.
:param clientscoperep: a ClientScopeRepresentation of the clientscope to be created. Must contain at minimum the field name.
:return: HTTPResponse object on success
"""
clientscopes_url = URL_CLIENTSCOPES.format(url=self.baseurl, realm=realm)
try:
return open_url(clientscopes_url, method='POST', headers=self.restheaders,
data=json.dumps(clientscoperep), validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg="Could not create clientscope %s in realm %s: %s"
% (clientscoperep['name'], realm, str(e)))
def update_clientscope(self, clientscoperep, realm="master"):
""" Update an existing clientscope.
:param clientscoperep: A ClientScopeRepresentation of the updated clientscope.
:return HTTPResponse object on success
"""
clientscope_url = URL_CLIENTSCOPE.format(url=self.baseurl, realm=realm, id=clientscoperep['id'])
try:
return open_url(clientscope_url, method='PUT', headers=self.restheaders,
data=json.dumps(clientscoperep), validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg='Could not update clientscope %s in realm %s: %s'
% (clientscoperep['name'], realm, str(e)))
def delete_clientscope(self, name=None, cid=None, realm="master"):
""" Delete a clientscope. One of name or cid must be provided.
Providing the clientscope ID is preferred as it avoids a second lookup to
convert a clientscope name to an ID.
:param name: The name of the clientscope. A lookup will be performed to retrieve the clientscope ID.
:param cid: The ID of the clientscope (preferred to name).
:param realm: The realm in which this clientscope resides, default "master".
"""
if cid is None and name is None:
# prefer an exception since this is almost certainly a programming error in the module itself.
raise Exception("Unable to delete clientscope - one of clientscope ID or name must be provided.")
# only lookup the name if cid isn't provided.
# in the case that both are provided, prefer the ID, since it's one
# less lookup.
if cid is None and name is not None:
for clientscope in self.get_clientscopes(realm=realm):
if clientscope['name'] == name:
cid = clientscope['id']
break
# if the clientscope doesn't exist - no problem, nothing to delete.
if cid is None:
return None
# should have a good cid by here.
clientscope_url = URL_CLIENTSCOPE.format(realm=realm, id=cid, url=self.baseurl)
try:
return open_url(clientscope_url, method='DELETE', headers=self.restheaders,
validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg="Unable to delete clientscope %s: %s" % (cid, str(e)))
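delete_clientscope above follows a small reusable pattern: accept a name or an ID, resolve the name to an ID only when the ID was not given, and treat a missing object as a successful no-op. Stripped of the HTTP plumbing, the control flow can be sketched like this (`list_scopes` and `delete_by_id` are hypothetical stand-ins for the API calls):

```python
# Sketch of delete_clientscope's control flow without the HTTP layer:
# prefer the ID, fall back to a name lookup, no-op if nothing matches.

def delete_scope(list_scopes, delete_by_id, name=None, cid=None):
    """Resolve name -> id if necessary, then delete; a missing scope is a no-op."""
    if cid is None and name is None:
        raise ValueError("one of name or cid must be provided")
    if cid is None:
        for scope in list_scopes():        # name lookup only when needed
            if scope["name"] == name:
                cid = scope["id"]
                break
    if cid is None:
        return None                        # nothing to delete: not an error
    return delete_by_id(cid)

scopes = [{"name": "email", "id": "42"}]
deleted = []
delete_scope(lambda: scopes, deleted.append, name="email")
assert deleted == ["42"]
# unknown name: no lookup hit, nothing deleted, no error raised
delete_scope(lambda: scopes, deleted.append, name="missing")
assert deleted == ["42"]
```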
def get_clientscope_protocolmappers(self, cid, realm="master"):
""" Fetch the name and ID of all protocolmappers of a clientscope on the Keycloak server.
To fetch the full data of a protocolmapper, make a subsequent call to
get_clientscope_protocolmapper_by_protocolmapperid, passing in the ID of the protocolmapper you wish to return.
:param cid: id of clientscope (not name).
:param realm: Realm in which the clientscope resides; default 'master'.
:return The protocolmappers of this clientscope
"""
protocolmappers_url = URL_CLIENTSCOPE_PROTOCOLMAPPERS.format(id=cid, url=self.baseurl, realm=realm)
try:
return json.loads(to_native(open_url(protocolmappers_url, method="GET", headers=self.restheaders,
validate_certs=self.validate_certs).read()))
except Exception as e:
self.module.fail_json(msg="Could not fetch list of protocolmappers in realm %s: %s"
% (realm, str(e)))
def get_clientscope_protocolmapper_by_protocolmapperid(self, pid, cid, realm="master"):
""" Fetch a keycloak clientscope protocolmapper from the provided realm using the protocolmapper's unique ID.
If the protocolmapper does not exist, None is returned.
pid is a UUID provided by the Keycloak API
:param pid: UUID of the protocolmapper to be returned
:param cid: UUID of the clientscope the protocolmapper belongs to
:param realm: Realm in which the clientscope resides; default 'master'.
"""
protocolmapper_url = URL_CLIENTSCOPE_PROTOCOLMAPPER.format(url=self.baseurl, realm=realm, id=cid, mapper_id=pid)
try:
return json.loads(to_native(open_url(protocolmapper_url, method="GET", headers=self.restheaders,
validate_certs=self.validate_certs).read()))
except HTTPError as e:
if e.code == 404:
return None
else:
self.module.fail_json(msg="Could not fetch protocolmapper %s in realm %s: %s"
% (pid, realm, str(e)))
except Exception as e:
self.module.fail_json(msg="Could not fetch protocolmapper %s in realm %s: %s"
% (pid, realm, str(e)))
def get_clientscope_protocolmapper_by_name(self, cid, name, realm="master"):
""" Fetch a clientscope's protocolmapper within a realm based on its name.
The Keycloak API does not allow filtering of the protocolmappers resource by name.
As a result, this method first retrieves the entire list of protocolmappers - name and ID -
then performs a second query to fetch the protocolmapper.
If the protocolmapper does not exist, None is returned.
:param cid: ID of the clientscope (not name).
:param name: Name of the protocolmapper to fetch.
:param realm: Realm in which the clientscope resides; default 'master'.
"""
try:
all_protocolmappers = self.get_clientscope_protocolmappers(cid, realm=realm)
for protocolmapper in all_protocolmappers:
if protocolmapper['name'] == name:
return self.get_clientscope_protocolmapper_by_protocolmapperid(protocolmapper['id'], cid, realm=realm)
return None
except Exception as e:
self.module.fail_json(msg="Could not fetch protocolmapper %s in realm %s: %s"
% (name, realm, str(e)))
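Since the API offers no name filter, the helper above lists every protocolmapper of the clientscope, matches on name, and then fetches the full representation by ID. The lookup pattern in isolation (data here is illustrative):

```python
def find_id_by_name(items, name):
    # Return the 'id' of the first item whose 'name' matches, else None -
    # the same list-then-match step used by get_clientscope_protocolmapper_by_name.
    for item in items:
        if item.get('name') == name:
            return item['id']
    return None

mappers = [
    {'id': 'a1', 'name': 'family name'},
    {'id': 'b2', 'name': 'role list'},
]
role_list_id = find_id_by_name(mappers, 'role list')
```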
def create_clientscope_protocolmapper(self, cid, mapper_rep, realm="master"):
""" Create a Keycloak clientscope protocolmapper.
:param cid: Id of the clientscope.
:param mapper_rep: a ProtocolMapperRepresentation of the protocolmapper to be created. Must contain at minimum the field name.
:param realm: Realm in which the clientscope resides; default 'master'.
:return: HTTPResponse object on success
"""
protocolmappers_url = URL_CLIENTSCOPE_PROTOCOLMAPPERS.format(url=self.baseurl, id=cid, realm=realm)
try:
return open_url(protocolmappers_url, method='POST', headers=self.restheaders,
data=json.dumps(mapper_rep), validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg="Could not create protocolmapper %s in realm %s: %s"
% (mapper_rep['name'], realm, str(e)))
def update_clientscope_protocolmappers(self, cid, mapper_rep, realm="master"):
""" Update an existing clientscope protocolmapper.
:param cid: Id of the clientscope.
:param mapper_rep: A ProtocolMapperRepresentation of the updated protocolmapper; must contain the field id.
:param realm: Realm in which the clientscope resides; default 'master'.
:return HTTPResponse object on success
"""
protocolmapper_url = URL_CLIENTSCOPE_PROTOCOLMAPPER.format(url=self.baseurl, realm=realm, id=cid, mapper_id=mapper_rep['id'])
try:
return open_url(protocolmapper_url, method='PUT', headers=self.restheaders,
data=json.dumps(mapper_rep), validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg='Could not update protocolmappers for clientscope %s in realm %s: %s'
% (mapper_rep, realm, str(e)))
def get_groups(self, realm="master"):
""" Fetch the name and ID of all groups on the Keycloak server.
@@ -632,10 +876,197 @@ class KeycloakAPI(object):
try:
return open_url(group_url, method='DELETE', headers=self.restheaders,
validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg="Unable to delete group %s: %s" % (groupid, str(e)))
def get_realm_roles(self, realm='master'):
""" Obtains role representations for roles in a realm
:param realm: realm to be queried
:return: list of dicts of role representations
"""
rolelist_url = URL_REALM_ROLES.format(url=self.baseurl, realm=realm)
try:
return json.loads(to_native(open_url(rolelist_url, method='GET', headers=self.restheaders,
validate_certs=self.validate_certs).read()))
except ValueError as e:
self.module.fail_json(msg='API returned incorrect JSON when trying to obtain list of roles for realm %s: %s'
% (realm, str(e)))
except Exception as e:
self.module.fail_json(msg='Could not obtain list of roles for realm %s: %s'
% (realm, str(e)))
def get_realm_role(self, name, realm='master'):
""" Fetch a keycloak role from the provided realm using the role's name.
If the role does not exist, None is returned.
:param name: Name of the role to fetch.
:param realm: Realm in which the role resides; default 'master'.
"""
role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=name)
try:
return json.loads(to_native(open_url(role_url, method="GET", headers=self.restheaders,
validate_certs=self.validate_certs).read()))
except HTTPError as e:
if e.code == 404:
return None
else:
self.module.fail_json(msg='Could not fetch role %s in realm %s: %s'
% (name, realm, str(e)))
except Exception as e:
self.module.fail_json(msg='Could not fetch role %s in realm %s: %s'
% (name, realm, str(e)))
def create_realm_role(self, rolerep, realm='master'):
""" Create a Keycloak realm role.
:param rolerep: a RoleRepresentation of the role to be created. Must contain at minimum the field name.
:return: HTTPResponse object on success
"""
roles_url = URL_REALM_ROLES.format(url=self.baseurl, realm=realm)
try:
return open_url(roles_url, method='POST', headers=self.restheaders,
data=json.dumps(rolerep), validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg='Could not create role %s in realm %s: %s'
% (rolerep['name'], realm, str(e)))
def update_realm_role(self, rolerep, realm='master'):
""" Update an existing realm role.
:param rolerep: A RoleRepresentation of the updated role.
:return HTTPResponse object on success
"""
role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=rolerep['name'])
try:
return open_url(role_url, method='PUT', headers=self.restheaders,
data=json.dumps(rolerep), validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg='Could not update role %s in realm %s: %s'
% (rolerep['name'], realm, str(e)))
def delete_realm_role(self, name, realm='master'):
""" Delete a realm role.
:param name: The name of the role.
:param realm: The realm in which this role resides, default "master".
"""
role_url = URL_REALM_ROLE.format(url=self.baseurl, realm=realm, name=name)
try:
return open_url(role_url, method='DELETE', headers=self.restheaders,
validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg='Unable to delete role %s in realm %s: %s'
% (name, realm, str(e)))
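The realm-role helpers above all build their endpoints from format-string templates such as URL_REALM_ROLE, defined elsewhere in this module_utils file. A sketch of how those templates expand (the path strings here are assumptions matching the Keycloak admin REST layout, not copied from the module):

```python
# Assumed to mirror the module's URL constants (Keycloak admin REST API paths).
URL_REALM_ROLES = "{url}/admin/realms/{realm}/roles"
URL_REALM_ROLE = "{url}/admin/realms/{realm}/roles/{name}"

# Expanding the template the way get_realm_role/delete_realm_role do.
role_url = URL_REALM_ROLE.format(
    url="https://auth.example.com/auth", realm="master", name="myrole")
```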
def get_client_roles(self, clientid, realm='master'):
""" Obtains role representations for client roles in a specific client
:param clientid: Client id to be queried
:param realm: Realm to be queried
:return: List of dicts of role representations
"""
cid = self.get_client_id(clientid, realm=realm)
if cid is None:
self.module.fail_json(msg='Could not find client %s in realm %s'
% (clientid, realm))
rolelist_url = URL_CLIENT_ROLES.format(url=self.baseurl, realm=realm, id=cid)
try:
return json.loads(to_native(open_url(rolelist_url, method='GET', headers=self.restheaders,
validate_certs=self.validate_certs).read()))
except ValueError as e:
self.module.fail_json(msg='API returned incorrect JSON when trying to obtain list of roles for client %s in realm %s: %s'
% (clientid, realm, str(e)))
except Exception as e:
self.module.fail_json(msg='Could not obtain list of roles for client %s in realm %s: %s'
% (clientid, realm, str(e)))
def get_client_role(self, name, clientid, realm='master'):
""" Fetch a keycloak client role from the provided realm using the role's name.
:param name: Name of the role to fetch.
:param clientid: Client id for the client role
:param realm: Realm in which the role resides
:return: Dict of role representation
If the role does not exist, None is returned.
"""
cid = self.get_client_id(clientid, realm=realm)
if cid is None:
self.module.fail_json(msg='Could not find client %s in realm %s'
% (clientid, realm))
role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=name)
try:
return json.loads(to_native(open_url(role_url, method="GET", headers=self.restheaders,
validate_certs=self.validate_certs).read()))
except HTTPError as e:
if e.code == 404:
return None
else:
self.module.fail_json(msg='Could not fetch role %s in client %s of realm %s: %s'
% (name, clientid, realm, str(e)))
except Exception as e:
self.module.fail_json(msg='Could not fetch role %s for client %s in realm %s: %s'
% (name, clientid, realm, str(e)))
def create_client_role(self, rolerep, clientid, realm='master'):
""" Create a Keycloak client role.
:param rolerep: a RoleRepresentation of the role to be created. Must contain at minimum the field name.
:param clientid: Client id for the client role
:param realm: Realm in which the role resides
:return: HTTPResponse object on success
"""
cid = self.get_client_id(clientid, realm=realm)
if cid is None:
self.module.fail_json(msg='Could not find client %s in realm %s'
% (clientid, realm))
roles_url = URL_CLIENT_ROLES.format(url=self.baseurl, realm=realm, id=cid)
try:
return open_url(roles_url, method='POST', headers=self.restheaders,
data=json.dumps(rolerep), validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg='Could not create role %s for client %s in realm %s: %s'
% (rolerep['name'], clientid, realm, str(e)))
def update_client_role(self, rolerep, clientid, realm="master"):
""" Update an existing client role.
:param rolerep: A RoleRepresentation of the updated role.
:param clientid: Client id for the client role
:param realm: Realm in which the role resides
:return HTTPResponse object on success
"""
cid = self.get_client_id(clientid, realm=realm)
if cid is None:
self.module.fail_json(msg='Could not find client %s in realm %s'
% (clientid, realm))
role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=rolerep['name'])
try:
return open_url(role_url, method='PUT', headers=self.restheaders,
data=json.dumps(rolerep), validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg='Could not update role %s for client %s in realm %s: %s'
% (rolerep['name'], clientid, realm, str(e)))
def delete_client_role(self, name, clientid, realm="master"):
""" Delete a client role.
:param name: The name of the role.
:param clientid: Client id for the client role
:param realm: Realm in which the role resides
"""
cid = self.get_client_id(clientid, realm=realm)
if cid is None:
self.module.fail_json(msg='Could not find client %s in realm %s'
% (clientid, realm))
role_url = URL_CLIENT_ROLE.format(url=self.baseurl, realm=realm, id=cid, name=name)
try:
return open_url(role_url, method='DELETE', headers=self.restheaders,
validate_certs=self.validate_certs)
except Exception as e:
self.module.fail_json(msg='Unable to delete role %s for client %s in realm %s: %s'
% (name, clientid, realm, str(e)))
def get_authentication_flow_by_alias(self, alias, realm='master'):
"""
Get an authentication flow by its alias


@@ -152,7 +152,14 @@ class CmdMixin(object):
def process_command_output(self, rc, out, err):
return rc, out, err
def run_command(self, extra_params=None, params=None, process_output=None, *args, **kwargs):
def run_command(self,
extra_params=None,
params=None,
process_output=None,
publish_rc=True,
publish_out=True,
publish_err=True,
*args, **kwargs):
self.vars.cmd_args = self._calculate_args(extra_params, params)
options = dict(self.run_command_fixed_options)
options['check_rc'] = options.get('check_rc', self.check_rc)
@@ -166,7 +173,12 @@ class CmdMixin(object):
self.update_output(force_lang=self.force_lang)
options['environ_update'] = env_update
rc, out, err = self.module.run_command(self.vars.cmd_args, *args, **options)
self.update_output(rc=rc, stdout=out, stderr=err)
if publish_rc:
self.update_output(rc=rc)
if publish_out:
self.update_output(stdout=out)
if publish_err:
self.update_output(stderr=err)
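The new publish_rc/publish_out/publish_err flags let a caller keep individual command streams out of the module's published output. Stripped of the mixin machinery, the logic reduces to (a standalone sketch; the dict stands in for update_output):

```python
def publish(output, rc, out, err, publish_rc=True, publish_out=True, publish_err=True):
    # Selectively record each stream, mirroring the run_command change above.
    if publish_rc:
        output['rc'] = rc
    if publish_out:
        output['stdout'] = out
    if publish_err:
        output['stderr'] = err
    return output

# Suppress stderr from the published result.
result = publish({}, 0, 'ok', 'noise', publish_err=False)
```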
if process_output is None:
_process = self.process_command_output
else:


@@ -1582,13 +1582,14 @@ class RedfishUtils(object):
boot = data[key]
annotation = 'BootSourceOverrideTarget@Redfish.AllowableValues'
if annotation in boot:
allowable_values = boot[annotation]
if isinstance(allowable_values, list) and bootdevice not in allowable_values:
return {'ret': False,
'msg': "Boot device %s not in list of allowable values (%s)" %
(bootdevice, allowable_values)}
if override_enabled != 'Disabled':
annotation = 'BootSourceOverrideTarget@Redfish.AllowableValues'
if annotation in boot:
allowable_values = boot[annotation]
if isinstance(allowable_values, list) and bootdevice not in allowable_values:
return {'ret': False,
'msg': "Boot device %s not in list of allowable values (%s)" %
(bootdevice, allowable_values)}
# read existing values
cur_enabled = boot.get('BootSourceOverrideEnabled')
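The Redfish change above makes the AllowableValues check conditional: the boot device is only validated when the override is not being disabled. The guard reduces to this illustrative sketch:

```python
def validate_bootdevice(boot, bootdevice, override_enabled):
    # Mirror the guard above: skip validation entirely when disabling the override.
    annotation = 'BootSourceOverrideTarget@Redfish.AllowableValues'
    if override_enabled != 'Disabled' and annotation in boot:
        allowable_values = boot[annotation]
        if isinstance(allowable_values, list) and bootdevice not in allowable_values:
            return {'ret': False,
                    'msg': "Boot device %s not in list of allowable values (%s)" %
                           (bootdevice, allowable_values)}
    return {'ret': True}

boot = {'BootSourceOverrideTarget@Redfish.AllowableValues': ['Pxe', 'Hdd']}
```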


@@ -32,16 +32,19 @@ options:
required: true
critical_state:
type: list
elements: str
description:
- Notification list to use when the alarm state is CRITICAL. Must be an
array of valid rax_mon_notification ids.
warning_state:
type: list
elements: str
description:
- Notification list to use when the alarm state is WARNING. Must be an array
of valid rax_mon_notification ids.
ok_state:
type: list
elements: str
description:
- Notification list to use when the alarm state is OK. Must be an array of
valid rax_mon_notification ids.
@@ -150,9 +153,9 @@ def main():
dict(
state=dict(default='present', choices=['present', 'absent']),
label=dict(required=True),
critical_state=dict(type='list'),
warning_state=dict(type='list'),
ok_state=dict(type='list')
critical_state=dict(type='list', elements='str'),
warning_state=dict(type='list', elements='str'),
ok_state=dict(type='list', elements='str'),
)
)


@@ -137,6 +137,16 @@ state:
The state of the input C(path).
type: str
returned: always
dest_state:
description:
- The state of the I(dest) file.
- C(absent) when the file does not exist.
- C(archive) when the file is an archive.
- C(compress) when the file is compressed, but not an archive.
- C(incomplete) when the file is an archive, but some files under I(path) were not found.
type: str
returned: success
version_added: 3.4.0
missing:
description: Any files that were missing from the source.
type: list
@@ -288,6 +298,8 @@ class Archive(object):
msg='Error, must specify "dest" when archiving multiple files or trees'
)
self.original_size = self.destination_size()
def add(self, path, archive_name):
try:
self._add(_to_native_ascii(path), _to_native(archive_name))
@@ -305,7 +317,7 @@ class Archive(object):
self.destination_state = STATE_ARCHIVED
else:
try:
f_out = self._open_compressed_file(_to_native_ascii(self.destination))
f_out = self._open_compressed_file(_to_native_ascii(self.destination), 'wb')
with open(path, 'rb') as f_in:
shutil.copyfileobj(f_in, f_out)
f_out.close()
@@ -358,9 +370,15 @@ class Archive(object):
msg='Errors when writing archive at %s: %s' % (_to_native(self.destination), '; '.join(self.errors))
)
def compare_with_original(self):
self.changed |= self.original_size != self.destination_size()
def destination_exists(self):
return self.destination and os.path.exists(self.destination)
def destination_readable(self):
return self.destination and os.access(self.destination, os.R_OK)
def destination_size(self):
return os.path.getsize(self.destination) if self.destination_exists() else 0
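compare_with_original detects a change purely by comparing the destination's size at construction time against its size now. The same mechanism, restated standalone (class name is illustrative):

```python
import os
import tempfile

class SizeTracker(object):
    # Minimal sketch of the size-based change detection used by Archive above.
    def __init__(self, destination):
        self.destination = destination
        self.changed = False
        self.original_size = self.destination_size()

    def destination_exists(self):
        return bool(self.destination) and os.path.exists(self.destination)

    def destination_size(self):
        return os.path.getsize(self.destination) if self.destination_exists() else 0

    def compare_with_original(self):
        self.changed |= self.original_size != self.destination_size()

# Demo: record the size of an empty file, then grow it.
_tmp = tempfile.NamedTemporaryFile(delete=False)
_tmp.close()
tracker = SizeTracker(_tmp.name)
with open(_tmp.name, 'wb') as f:
    f.write(b'new archive bytes')
tracker.compare_with_original()
os.remove(_tmp.name)
```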
@@ -397,6 +415,15 @@ class Archive(object):
def has_unfound_targets(self):
return bool(self.not_found)
def remove_single_target(self, path):
try:
os.remove(path)
except OSError as e:
self.module.fail_json(
path=_to_native(path),
msg='Unable to remove source file: %s' % _to_native(e), exception=format_exc()
)
def remove_targets(self):
for path in self.successes:
if os.path.exists(path):
@@ -435,6 +462,7 @@ class Archive(object):
return {
'archived': [_to_native(p) for p in self.successes],
'dest': _to_native(self.destination),
'dest_state': self.destination_state,
'changed': self.changed,
'arcroot': _to_native(self.root),
'missing': [_to_native(p) for p in self.not_found],
@@ -442,14 +470,14 @@ class Archive(object):
'expanded_exclude_paths': [_to_native(p) for p in self.expanded_exclude_paths],
}
def _open_compressed_file(self, path):
def _open_compressed_file(self, path, mode):
f = None
if self.format == 'gz':
f = gzip.open(path, 'wb')
f = gzip.open(path, mode)
elif self.format == 'bz2':
f = bz2.BZ2File(path, 'wb')
f = bz2.BZ2File(path, mode)
elif self.format == 'xz':
f = lzma.LZMAFile(path, 'wb')
f = lzma.LZMAFile(path, mode)
else:
self.module.fail_json(msg="%s is not a valid format" % self.format)
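Threading an explicit mode through _open_compressed_file lets the same helper open the destination for either reading or writing. A reduced sketch covering only the gz branch (function name is illustrative):

```python
import gzip
import os
import tempfile

def open_compressed(path, mode, fmt='gz'):
    # Simplified _open_compressed_file with an explicit mode; gz only here.
    if fmt == 'gz':
        return gzip.open(path, mode)
    raise ValueError('%s is not a valid format' % fmt)

path = os.path.join(tempfile.mkdtemp(), 'demo.gz')
f_out = open_compressed(path, 'wb')
f_out.write(b'payload')
f_out.close()
f_in = open_compressed(path, 'rb')
data = f_in.read()
f_in.close()
```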
@@ -531,7 +559,7 @@ class TarArchive(Archive):
return None if matches_exclusion_patterns(tarinfo.name, self.exclusion_patterns) else tarinfo
def py26_filter(path):
return matches_exclusion_patterns(path, self.exclusion_patterns)
return legacy_filter(path, self.exclusion_patterns)
if PY27:
self.file.add(path, archive_name, recursive=False, filter=py27_filter)
@@ -569,7 +597,6 @@ def main():
check_mode = module.check_mode
archive = get_archive(module)
size = archive.destination_size()
archive.find_targets()
if not archive.has_targets():
@@ -581,10 +608,9 @@ def main():
else:
archive.add_targets()
archive.destination_state = STATE_INCOMPLETE if archive.has_unfound_targets() else STATE_ARCHIVED
archive.compare_with_original()
if archive.remove:
archive.remove_targets()
if archive.destination_size() != size:
archive.changed = True
else:
if check_mode:
if not archive.destination_exists():
@@ -592,16 +618,9 @@ def main():
else:
path = archive.paths[0]
archive.add_single_target(path)
if archive.destination_size() != size:
archive.changed = True
archive.compare_with_original()
if archive.remove:
try:
os.remove(path)
except OSError as e:
module.fail_json(
path=_to_native(path),
msg='Unable to remove source file: %s' % _to_native(e), exception=format_exc()
)
archive.remove_single_target(path)
if archive.destination_exists():
archive.update_permissions()


@@ -0,0 +1 @@
source_control/gitlab/gitlab_protected_branch.py


@@ -196,9 +196,15 @@ def create_or_update_executions(kc, config, realm='master'):
:param config: Representation of the authentication flow including its executions.
:param realm: Realm
:return: True if executions have been modified. False otherwise.
:return: tuple (changed, dict(before, after))
WHERE
bool changed indicates if changes have been made
dict(str, str) shows state before and after creation/update
"""
try:
changed = False
after = ""
before = ""
if "authenticationExecutions" in config:
# Get existing executions on the Keycloak server for this alias
existing_executions = kc.get_executions_representation(config, realm=realm)
@@ -221,17 +227,21 @@ def create_or_update_executions(kc, config, realm='master'):
exclude_key.append(key)
# Compare the executions to see if it need changes
if not is_struct_included(new_exec, existing_executions[exec_index], exclude_key) or exec_index != new_exec_index:
changed = True
exec_found = True
before += str(existing_executions[exec_index]) + '\n'
id_to_update = existing_executions[exec_index]["id"]
# Remove exec from list in case 2 exec with same name
existing_executions[exec_index].clear()
elif new_exec["providerId"] is not None:
kc.create_execution(new_exec, flowAlias=flow_alias_parent, realm=realm)
changed = True
exec_found = True
after += str(new_exec) + '\n'
elif new_exec["displayName"] is not None:
kc.create_subflow(new_exec["displayName"], flow_alias_parent, realm=realm)
exec_found = True
after += str(new_exec) + '\n'
if exec_found:
changed = True
if changed:
if exec_index != -1:
# Update the existing execution
updated_exec = {
@@ -248,7 +258,8 @@ def create_or_update_executions(kc, config, realm='master'):
kc.update_authentication_executions(flow_alias_parent, updated_exec, realm=realm)
diff = exec_index - new_exec_index
kc.change_execution_priority(updated_exec["id"], diff, realm=realm)
return changed
after += str(kc.get_executions_representation(config, realm=realm)[new_exec_index]) + '\n'
return changed, dict(before=before, after=after)
except Exception as e:
kc.module.fail_json(msg='Could not create or update executions for authentication flow %s in realm %s: %s'
% (config["alias"], realm, str(e)))
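create_or_update_executions now returns a (changed, diff) tuple so the module can feed Ansible's --diff mode: textual before/after entries are accumulated as executions are created or updated. The accumulation pattern, reduced to a standalone sketch (the matching key is simplified to displayName):

```python
def build_diff(existing, desired):
    # Accumulate textual before/after entries, as the execution loop above does.
    changed = False
    before = ""
    after = ""
    for new_exec in desired:
        match = next((e for e in existing
                      if e.get('displayName') == new_exec.get('displayName')), None)
        if match is None:
            changed = True
            after += str(new_exec) + '\n'
        elif match != new_exec:
            changed = True
            before += str(match) + '\n'
            after += str(new_exec) + '\n'
    return changed, dict(before=before, after=after)

changed, diff = build_diff(
    [{'displayName': 'auth-cookie', 'requirement': 'ALTERNATIVE'}],
    [{'displayName': 'auth-cookie', 'requirement': 'REQUIRED'}],
)
```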
@@ -358,8 +369,10 @@ def main():
# Configure the executions for the flow
if module.check_mode:
module.exit_json(**result)
if create_or_update_executions(kc=kc, config=new_auth_repr, realm=realm):
result['changed'] = True
changed, diff = create_or_update_executions(kc=kc, config=new_auth_repr, realm=realm)
result['changed'] |= changed
if module._diff:
result['diff'] = diff
# Get executions created
exec_repr = kc.get_executions_representation(config=new_auth_repr, realm=realm)
if exec_repr is not None:


@@ -318,6 +318,14 @@ options:
aliases:
- authorizationSettings
authentication_flow_binding_overrides:
description:
- Override realm authentication flow bindings.
type: dict
aliases:
- authenticationFlowBindingOverrides
version_added: 3.4.0
protocol_mappers:
description:
- a list of dicts defining protocol mappers for this client.
@@ -593,6 +601,8 @@ EXAMPLES = '''
default_roles:
- test01
- test02
authentication_flow_binding_overrides:
browser: 4c90336b-bf1d-4b87-916d-3677ba4e5fbb
protocol_mappers:
- config:
access.token.claim: True
@@ -745,6 +755,7 @@ def main():
use_template_config=dict(type='bool', aliases=['useTemplateConfig']),
use_template_scope=dict(type='bool', aliases=['useTemplateScope']),
use_template_mappers=dict(type='bool', aliases=['useTemplateMappers']),
authentication_flow_binding_overrides=dict(type='dict', aliases=['authenticationFlowBindingOverrides']),
protocol_mappers=dict(type='list', elements='dict', options=protmapper_spec, aliases=['protocolMappers']),
authorization_settings=dict(type='dict', aliases=['authorizationSettings']),
)


@@ -0,0 +1,492 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
DOCUMENTATION = '''
---
module: keycloak_clientscope
short_description: Allows administration of Keycloak client_scopes via Keycloak API
version_added: 3.4.0
description:
- This module allows you to add, remove or modify Keycloak client_scopes via the Keycloak REST API.
It requires access to the REST API via OpenID Connect; the user connecting and the client being
used must have the requisite access rights. In a default Keycloak installation, admin-cli
and an admin user would work, as would a separate client definition with the scope tailored
to your needs and a user having the expected roles.
- The names of module options are snake_cased versions of the camelCase ones found in the
Keycloak API and its documentation at U(https://www.keycloak.org/docs-api/8.0/rest-api/index.html).
- Attributes are multi-valued in the Keycloak API. All attributes are lists of individual values and will
be returned that way by this module. You may pass single values for attributes when calling the module,
and this will be translated into a list suitable for the API.
- When updating a client_scope, where possible provide the client_scope ID to the module. This removes a lookup
to the API to translate the name into the client_scope ID.
options:
state:
description:
- State of the client_scope.
- On C(present), the client_scope will be created if it does not yet exist, or updated with the parameters you provide.
- On C(absent), the client_scope will be removed if it exists.
default: 'present'
type: str
choices:
- present
- absent
name:
type: str
description:
- Name of the client_scope.
- This parameter is required only when creating or updating the client_scope.
realm:
type: str
description:
- The Keycloak realm under which this client_scope resides.
default: 'master'
id:
type: str
description:
- The unique identifier for this client_scope.
- This parameter is not required for updating or deleting a client_scope but
providing it will reduce the number of API calls required.
description:
type: str
description:
- Description for this client_scope.
- This parameter is not required for updating or deleting a client_scope.
protocol:
description:
- Type of client.
choices: ['openid-connect', 'saml', 'wsfed']
type: str
protocol_mappers:
description:
- A list of dicts defining protocol mappers for this client.
- This is 'protocolMappers' in the Keycloak REST API.
aliases:
- protocolMappers
type: list
elements: dict
suboptions:
protocol:
description:
- This specifies for which protocol this protocol mapper is active.
choices: ['openid-connect', 'saml', 'wsfed']
type: str
protocolMapper:
description:
- "The Keycloak-internal name of the type of this protocol-mapper. While an exhaustive list is
impossible to provide since this may be extended through SPIs by the user of Keycloak,
by default Keycloak as of 3.4 ships with at least:"
- C(docker-v2-allow-all-mapper)
- C(oidc-address-mapper)
- C(oidc-full-name-mapper)
- C(oidc-group-membership-mapper)
- C(oidc-hardcoded-claim-mapper)
- C(oidc-hardcoded-role-mapper)
- C(oidc-role-name-mapper)
- C(oidc-script-based-protocol-mapper)
- C(oidc-sha256-pairwise-sub-mapper)
- C(oidc-usermodel-attribute-mapper)
- C(oidc-usermodel-client-role-mapper)
- C(oidc-usermodel-property-mapper)
- C(oidc-usermodel-realm-role-mapper)
- C(oidc-usersessionmodel-note-mapper)
- C(saml-group-membership-mapper)
- C(saml-hardcode-attribute-mapper)
- C(saml-hardcode-role-mapper)
- C(saml-role-list-mapper)
- C(saml-role-name-mapper)
- C(saml-user-attribute-mapper)
- C(saml-user-property-mapper)
- C(saml-user-session-note-mapper)
- An exhaustive list of available mappers on your installation can be obtained on
the admin console by going to Server Info -> Providers and looking under
'protocol-mapper'.
type: str
name:
description:
- The name of this protocol mapper.
type: str
id:
description:
- Usually a UUID specifying the internal ID of this protocol mapper instance.
type: str
config:
description:
- Dict specifying the configuration options for the protocol mapper; the
contents differ depending on the value of I(protocolMapper) and are not documented
other than by the source of the mappers and its parent class(es). An example is given
below. It is easiest to obtain valid config values by dumping an already-existing
protocol mapper configuration through check-mode in the C(existing) return value.
type: dict
attributes:
type: dict
description:
- A dict of key/value pairs to set as custom attributes for the client_scope.
- Values may be single values (for example a string) or a list of strings.
extends_documentation_fragment:
- community.general.keycloak
author:
- Gaëtan Daubresse (@Gaetan2907)
'''
EXAMPLES = '''
- name: Create a Keycloak client_scope, authentication with credentials
community.general.keycloak_clientscope:
name: my-new-kc-clientscope
realm: MyCustomRealm
state: present
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
delegate_to: localhost
- name: Create a Keycloak client_scope, authentication with token
community.general.keycloak_clientscope:
name: my-new-kc-clientscope
realm: MyCustomRealm
state: present
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
token: TOKEN
delegate_to: localhost
- name: Delete a Keycloak client_scope
community.general.keycloak_clientscope:
id: '9d59aa76-2755-48c6-b1af-beb70a82c3cd'
state: absent
realm: MyCustomRealm
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
delegate_to: localhost
- name: Delete a Keycloak client_scope based on name
community.general.keycloak_clientscope:
name: my-clientscope-for-deletion
state: absent
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
delegate_to: localhost
- name: Update the name of a Keycloak client_scope
community.general.keycloak_clientscope:
id: '9d59aa76-2755-48c6-b1af-beb70a82c3cd'
name: an-updated-kc-clientscope-name
state: present
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
delegate_to: localhost
- name: Create a Keycloak client_scope with some custom attributes
community.general.keycloak_clientscope:
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
name: my-new_clientscope
description: description-of-clientscope
protocol: openid-connect
protocol_mappers:
- config:
access.token.claim: True
claim.name: "family_name"
id.token.claim: True
jsonType.label: String
user.attribute: lastName
userinfo.token.claim: True
name: family name
protocol: openid-connect
protocolMapper: oidc-usermodel-property-mapper
- config:
attribute.name: Role
attribute.nameformat: Basic
single: false
name: role list
protocol: saml
protocolMapper: saml-role-list-mapper
attributes:
attrib1: value1
attrib2: value2
attrib3:
- with
- numerous
- individual
- list
- items
delegate_to: localhost
'''
RETURN = '''
msg:
description: Message as to what action was taken
returned: always
type: str
sample: "Client_scope testclientscope has been updated"
proposed:
description: client_scope representation of proposed changes to client_scope
returned: always
type: dict
sample: {
clientId: "test"
}
existing:
description: client_scope representation of existing client_scope (sample is truncated)
returned: always
type: dict
sample: {
"adminUrl": "http://www.example.com/admin_url",
"attributes": {
"request.object.signature.alg": "RS256",
}
}
end_state:
description: client_scope representation of client_scope after module execution (sample is truncated)
returned: always
type: dict
sample: {
"adminUrl": "http://www.example.com/admin_url",
"attributes": {
"request.object.signature.alg": "RS256",
}
}
'''
from ansible_collections.community.general.plugins.module_utils.identity.keycloak.keycloak import KeycloakAPI, camel, \
keycloak_argument_spec, get_token, KeycloakError, is_struct_included
from ansible.module_utils.basic import AnsibleModule
def sanitize_cr(clientscoperep):
""" Removes probably sensitive details from a clientscoperep representation
:param clientscoperep: the clientscoperep dict to be sanitized
:return: sanitized clientrep dict
"""
result = clientscoperep.copy()
if 'secret' in result:
result['secret'] = 'no_log'
if 'attributes' in result:
if 'saml.signing.private.key' in result['attributes']:
result['attributes']['saml.signing.private.key'] = 'no_log'
return result
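sanitize_cr masks the client secret and the SAML signing key before the representation can end up in logs or module output. Its effect, restated standalone for illustration:

```python
def sanitize_cr(clientscoperep):
    # Same masking behaviour as the module's sanitize_cr above.
    result = clientscoperep.copy()
    if 'secret' in result:
        result['secret'] = 'no_log'
    if 'attributes' in result:
        if 'saml.signing.private.key' in result['attributes']:
            result['attributes']['saml.signing.private.key'] = 'no_log'
    return result

rep = {'name': 'scope1', 'secret': 'hunter2',
       'attributes': {'saml.signing.private.key': 'PEM...'}}
clean = sanitize_cr(rep)
```

Note the shallow copy: the nested 'attributes' dict is shared with the input, matching the original helper's behaviour.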
def main():
"""
Module execution
:return:
"""
argument_spec = keycloak_argument_spec()
protmapper_spec = dict(
id=dict(type='str'),
name=dict(type='str'),
protocol=dict(type='str', choices=['openid-connect', 'saml', 'wsfed']),
protocolMapper=dict(type='str'),
config=dict(type='dict'),
)
meta_args = dict(
state=dict(default='present', choices=['present', 'absent']),
realm=dict(default='master'),
id=dict(type='str'),
name=dict(type='str'),
description=dict(type='str'),
protocol=dict(type='str', choices=['openid-connect', 'saml', 'wsfed']),
attributes=dict(type='dict'),
protocol_mappers=dict(type='list', elements='dict', options=protmapper_spec, aliases=['protocolMappers']),
)
argument_spec.update(meta_args)
module = AnsibleModule(argument_spec=argument_spec,
supports_check_mode=True,
required_one_of=([['id', 'name'],
['token', 'auth_realm', 'auth_username', 'auth_password']]),
required_together=([['auth_realm', 'auth_username', 'auth_password']]))
result = dict(changed=False, msg='', diff={}, proposed={}, existing={}, end_state={})
# Obtain access token, initialize API
try:
connection_header = get_token(module.params)
except KeycloakError as e:
module.fail_json(msg=str(e))
kc = KeycloakAPI(module, connection_header)
realm = module.params.get('realm')
state = module.params.get('state')
cid = module.params.get('id')
name = module.params.get('name')
protocol_mappers = module.params.get('protocol_mappers')
before_clientscope = None # current state of the clientscope, for merging.
# does the clientscope already exist?
if cid is None:
before_clientscope = kc.get_clientscope_by_name(name, realm=realm)
else:
before_clientscope = kc.get_clientscope_by_clientscopeid(cid, realm=realm)
before_clientscope = {} if before_clientscope is None else before_clientscope
clientscope_params = [x for x in module.params
if x not in list(keycloak_argument_spec().keys()) + ['state', 'realm'] and
module.params.get(x) is not None]
# Build a proposed changeset from parameters given to this module
changeset = dict()
for clientscope_param in clientscope_params:
new_param_value = module.params.get(clientscope_param)
# some lists in the Keycloak API are sorted, some are not.
if isinstance(new_param_value, list):
if clientscope_param in ['attributes']:
try:
new_param_value = sorted(new_param_value)
except TypeError:
pass
# Unfortunately, the Ansible argument spec checker introduces variables with null values when
# they are not specified
if clientscope_param == 'protocol_mappers':
new_param_value = [dict((k, v) for k, v in x.items() if x[k] is not None) for x in new_param_value]
changeset[camel(clientscope_param)] = new_param_value
# prepare the new clientscope
updated_clientscope = before_clientscope.copy()
updated_clientscope.update(changeset)
# if before_clientscope is empty, the clientscope doesn't exist.
if before_clientscope == {}:
if state == 'absent':
# nothing to do.
if module._diff:
result['diff'] = dict(before='', after='')
result['msg'] = 'Clientscope does not exist; doing nothing.'
result['end_state'] = dict()
module.exit_json(**result)
# for 'present', create a new clientscope.
result['changed'] = True
if name is None:
module.fail_json(msg='name must be specified when creating a new clientscope')
if module._diff:
result['diff'] = dict(before='', after=sanitize_cr(updated_clientscope))
if module.check_mode:
module.exit_json(**result)
# do it for real!
kc.create_clientscope(updated_clientscope, realm=realm)
after_clientscope = kc.get_clientscope_by_name(name, realm)
result['end_state'] = sanitize_cr(after_clientscope)
result['msg'] = 'Clientscope {name} has been created with ID {id}'.format(name=after_clientscope['name'],
id=after_clientscope['id'])
else:
if state == 'present':
# no changes
if updated_clientscope == before_clientscope:
result['changed'] = False
result['end_state'] = sanitize_cr(updated_clientscope)
result['msg'] = "No changes required to clientscope {name}.".format(name=before_clientscope['name'])
module.exit_json(**result)
# update the existing clientscope
result['changed'] = True
if module._diff:
result['diff'] = dict(before=sanitize_cr(before_clientscope), after=sanitize_cr(updated_clientscope))
if module.check_mode:
module.exit_json(**result)
# do the clientscope update
kc.update_clientscope(updated_clientscope, realm=realm)
# do the protocolmappers update
if protocol_mappers is not None:
for protocol_mapper in protocol_mappers:
# update if protocolmapper exist
current_protocolmapper = kc.get_clientscope_protocolmapper_by_name(updated_clientscope['id'], protocol_mapper['name'], realm=realm)
if current_protocolmapper is not None:
protocol_mapper['id'] = current_protocolmapper['id']
kc.update_clientscope_protocolmappers(updated_clientscope['id'], protocol_mapper, realm=realm)
# create otherwise
else:
kc.create_clientscope_protocolmapper(updated_clientscope['id'], protocol_mapper, realm=realm)
after_clientscope = kc.get_clientscope_by_clientscopeid(updated_clientscope['id'], realm=realm)
result['end_state'] = sanitize_cr(after_clientscope)
result['msg'] = "Clientscope {id} has been updated".format(id=after_clientscope['id'])
module.exit_json(**result)
elif state == 'absent':
result['end_state'] = dict()
if module._diff:
result['diff'] = dict(before=sanitize_cr(before_clientscope), after='')
if module.check_mode:
module.exit_json(**result)
# delete for real
cid = before_clientscope['id']
kc.delete_clientscope(cid=cid, realm=realm)
result['changed'] = True
result['msg'] = "Clientscope {name} has been deleted".format(name=before_clientscope['name'])
module.exit_json(**result)
module.exit_json(**result)
if __name__ == '__main__':
main()
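One subtlety in the `sanitize_cr` helper near the top of this file: `result = clientscoperep.copy()` is a shallow copy, so assigning into `result['attributes']` also mutates the caller's nested dict. The sketch below shows the same masking pattern with the nested dict copied as well; it is an illustration, not the module's actual code:

```python
def sanitize_cr(clientscoperep):
    """Mask potentially sensitive fields of a clientscope representation."""
    result = clientscoperep.copy()
    if 'secret' in result:
        result['secret'] = 'no_log'
    if 'saml.signing.private.key' in result.get('attributes', {}):
        # copy the nested dict too, so the caller's input stays untouched
        result['attributes'] = dict(result['attributes'])
        result['attributes']['saml.signing.private.key'] = 'no_log'
    return result

rep = {'name': 'scope1', 'secret': 's3cr3t',
       'attributes': {'saml.signing.private.key': 'PEM...'}}
sanitized = sanitize_cr(rep)
print(sanitized['secret'])  # masked in the copy
print(rep['secret'])        # original left intact
```

The deep-copy of `attributes` matters when the sanitized dict is only used for diff output while the unsanitized one is still sent to the API.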

View File

@@ -0,0 +1,363 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Adam Goossens <adam.goossens@gmail.com>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
DOCUMENTATION = '''
---
module: keycloak_role
short_description: Allows administration of Keycloak roles via Keycloak API
version_added: 3.4.0
description:
- This module allows you to add, remove or modify Keycloak roles via the Keycloak REST API.
It requires access to the REST API via OpenID Connect; the user connecting and the client being
used must have the requisite access rights. In a default Keycloak installation, admin-cli
and an admin user would work, as would a separate client definition with the scope tailored
to your needs and a user having the expected roles.
- The names of module options are snake_cased versions of the camelCase ones found in the
Keycloak API and its documentation at U(https://www.keycloak.org/docs-api/8.0/rest-api/index.html).
- Attributes are multi-valued in the Keycloak API. All attributes are lists of individual values and will
be returned that way by this module. You may pass single values for attributes when calling the module,
and this will be translated into a list suitable for the API.
options:
state:
description:
- State of the role.
- On C(present), the role will be created if it does not yet exist, or updated with the parameters you provide.
- On C(absent), the role will be removed if it exists.
default: 'present'
type: str
choices:
- present
- absent
name:
type: str
required: true
description:
- Name of the role.
- This parameter is required.
description:
type: str
description:
- The role description.
realm:
type: str
description:
- The Keycloak realm under which this role resides.
default: 'master'
client_id:
type: str
description:
- If the role is a client role, the client id under which it resides.
- If this parameter is absent, the role is considered a realm role.
attributes:
type: dict
description:
- A dict of key/value pairs to set as custom attributes for the role.
- Values may be single values (e.g. a string) or a list of strings.
extends_documentation_fragment:
- community.general.keycloak
author:
- Laurent Paumier (@laurpaum)
'''
EXAMPLES = '''
- name: Create a Keycloak realm role, authentication with credentials
community.general.keycloak_role:
name: my-new-kc-role
realm: MyCustomRealm
state: present
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
delegate_to: localhost
- name: Create a Keycloak realm role, authentication with token
community.general.keycloak_role:
name: my-new-kc-role
realm: MyCustomRealm
state: present
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
token: TOKEN
delegate_to: localhost
- name: Create a Keycloak client role
community.general.keycloak_role:
name: my-new-kc-role
realm: MyCustomRealm
client_id: MyClient
state: present
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
delegate_to: localhost
- name: Delete a Keycloak role
community.general.keycloak_role:
name: my-role-for-deletion
state: absent
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
delegate_to: localhost
- name: Create a keycloak role with some custom attributes
community.general.keycloak_role:
auth_client_id: admin-cli
auth_keycloak_url: https://auth.example.com/auth
auth_realm: master
auth_username: USERNAME
auth_password: PASSWORD
name: my-new-role
attributes:
attrib1: value1
attrib2: value2
attrib3:
- with
- numerous
- individual
- list
- items
delegate_to: localhost
'''
RETURN = '''
msg:
description: Message as to what action was taken
returned: always
type: str
sample: "Role myrole has been updated"
proposed:
description: Role representation of proposed changes to role
returned: always
type: dict
sample: {
"description": "My updated test description"
}
existing:
description: Role representation of existing role
returned: always
type: dict
sample: {
"attributes": {},
"clientRole": true,
"composite": false,
"containerId": "9f03eb61-a826-4771-a9fd-930e06d2d36a",
"description": "My client test role",
"id": "561703dd-0f38-45ff-9a5a-0c978f794547",
"name": "myrole"
}
end_state:
description: Role representation of role after module execution (sample is truncated)
returned: always
type: dict
sample: {
"attributes": {},
"clientRole": true,
"composite": false,
"containerId": "9f03eb61-a826-4771-a9fd-930e06d2d36a",
"description": "My updated client test role",
"id": "561703dd-0f38-45ff-9a5a-0c978f794547",
"name": "myrole"
}
'''
from ansible_collections.community.general.plugins.module_utils.identity.keycloak.keycloak import KeycloakAPI, camel, \
keycloak_argument_spec, get_token, KeycloakError
from ansible.module_utils.basic import AnsibleModule
def main():
"""
Module execution
:return:
"""
argument_spec = keycloak_argument_spec()
meta_args = dict(
state=dict(type='str', default='present', choices=['present', 'absent']),
name=dict(type='str', required=True),
description=dict(type='str'),
realm=dict(type='str', default='master'),
client_id=dict(type='str'),
attributes=dict(type='dict'),
)
argument_spec.update(meta_args)
module = AnsibleModule(argument_spec=argument_spec,
supports_check_mode=True,
required_one_of=([['token', 'auth_realm', 'auth_username', 'auth_password']]),
required_together=([['auth_realm', 'auth_username', 'auth_password']]))
result = dict(changed=False, msg='', diff={}, proposed={}, existing={}, end_state={})
# Obtain access token, initialize API
try:
connection_header = get_token(module.params)
except KeycloakError as e:
module.fail_json(msg=str(e))
kc = KeycloakAPI(module, connection_header)
realm = module.params.get('realm')
clientid = module.params.get('client_id')
name = module.params.get('name')
state = module.params.get('state')
# attributes in Keycloak have their values returned as lists
# via the API. attributes is a dict, so we'll transparently convert
# the values to lists.
if module.params.get('attributes') is not None:
for key, val in module.params['attributes'].items():
module.params['attributes'][key] = [val] if not isinstance(val, list) else val
# convert module parameters to client representation parameters (if they belong in there)
role_params = [x for x in module.params
if x not in list(keycloak_argument_spec().keys()) + ['state', 'realm', 'client_id', 'composites'] and
module.params.get(x) is not None]
# does the role already exist?
if clientid is None:
before_role = kc.get_realm_role(name, realm)
else:
before_role = kc.get_client_role(name, clientid, realm)
if before_role is None:
before_role = dict()
# build a changeset
changeset = dict()
for param in role_params:
new_param_value = module.params.get(param)
old_value = before_role[param] if param in before_role else None
if new_param_value != old_value:
changeset[camel(param)] = new_param_value
# prepare the new role
updated_role = before_role.copy()
updated_role.update(changeset)
result['proposed'] = changeset
result['existing'] = before_role
# if before_role is empty, the role doesn't exist.
if before_role == dict():
if state == 'absent':
# nothing to do.
if module._diff:
result['diff'] = dict(before='', after='')
result['changed'] = False
result['end_state'] = dict()
result['msg'] = 'Role does not exist; doing nothing.'
module.exit_json(**result)
# for 'present', create a new role.
result['changed'] = True
if name is None:
module.fail_json(msg='name must be specified when creating a new role')
if module._diff:
result['diff'] = dict(before='', after=updated_role)
if module.check_mode:
module.exit_json(**result)
# do it for real!
if clientid is None:
kc.create_realm_role(updated_role, realm)
after_role = kc.get_realm_role(name, realm)
else:
kc.create_client_role(updated_role, clientid, realm)
after_role = kc.get_client_role(name, clientid, realm)
result['end_state'] = after_role
result['msg'] = 'Role {name} has been created'.format(name=name)
module.exit_json(**result)
else:
if state == 'present':
# no changes
if updated_role == before_role:
result['changed'] = False
result['end_state'] = updated_role
result['msg'] = "No changes required to role {name}.".format(name=name)
module.exit_json(**result)
# update the existing role
result['changed'] = True
if module._diff:
result['diff'] = dict(before=before_role, after=updated_role)
if module.check_mode:
module.exit_json(**result)
# do the update
if clientid is None:
kc.update_realm_role(updated_role, realm)
after_role = kc.get_realm_role(name, realm)
else:
kc.update_client_role(updated_role, clientid, realm)
after_role = kc.get_client_role(name, clientid, realm)
result['end_state'] = after_role
result['msg'] = "Role {name} has been updated".format(name=name)
module.exit_json(**result)
elif state == 'absent':
result['changed'] = True
if module._diff:
result['diff'] = dict(before=before_role, after='')
if module.check_mode:
module.exit_json(**result)
# delete for real
if clientid is None:
kc.delete_realm_role(name, realm)
else:
kc.delete_client_role(name, clientid, realm)
result['end_state'] = dict()
result['msg'] = "Role {name} has been deleted".format(name=name)
module.exit_json(**result)
module.exit_json(**result)
if __name__ == '__main__':
main()
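The parameter handling in `keycloak_role` above does three things: it wraps scalar attribute values into lists (Keycloak stores attributes as lists of values), keeps only parameters that differ from the existing role, and camelCases the keys for the API. A simplified, dependency-free sketch of that flow; `camel` is re-implemented locally for the example, while the module imports it from its Keycloak module_utils:

```python
def camel(words):
    # Local stand-in for the module_utils helper:
    # snake_case option name -> camelCase API key.
    parts = words.split('_')
    return parts[0] + ''.join(p.capitalize() for p in parts[1:])

def build_changeset(params, before):
    # Keycloak returns attribute values as lists, so wrap scalars first.
    if params.get('attributes') is not None:
        params['attributes'] = {k: v if isinstance(v, list) else [v]
                                for k, v in params['attributes'].items()}
    # Keep only parameters that were set and differ from the current role.
    return {camel(p): v for p, v in params.items()
            if v is not None and before.get(p) != v}

before = {'name': 'myrole', 'description': 'old'}
params = {'name': 'myrole', 'description': 'new', 'attributes': {'team': 'ops'}}
print(build_changeset(params, before))
```

An empty changeset is what lets the module report `changed=False` without calling the update endpoint.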

View File

@@ -0,0 +1 @@
identity/keycloak/keycloak_clientscope.py

View File

@@ -0,0 +1 @@
./identity/keycloak/keycloak_role.py

View File

@@ -51,7 +51,17 @@ options:
description:
- The type of the monitor.
- The types C(query alert), C(trace-analytics alert) and C(rum alert) were added in community.general 2.1.0.
- choices: ['metric alert', 'service check', 'event alert', 'process alert', 'log alert', 'query alert', 'trace-analytics alert', 'rum alert']
+ - The type C(composite) was added in community.general 3.4.0.
+ choices:
+ - metric alert
+ - service check
+ - event alert
+ - process alert
+ - log alert
+ - query alert
+ - trace-analytics alert
+ - rum alert
+ - composite
type: str
query:
description:
@@ -209,7 +219,8 @@ def main():
app_key=dict(required=True, no_log=True),
state=dict(required=True, choices=['present', 'absent', 'mute', 'unmute']),
type=dict(choices=['metric alert', 'service check', 'event alert', 'process alert',
- 'log alert', 'query alert', 'trace-analytics alert', 'rum alert']),
+ 'log alert', 'query alert', 'trace-analytics alert',
+ 'rum alert', 'composite']),
name=dict(required=True),
query=dict(),
notification_message=dict(no_log=True),

View File

@@ -57,7 +57,7 @@ options:
choices: [ bond, bond-slave, bridge, bridge-slave, ethernet, generic, infiniband, ipip, sit, team, team-slave, vlan, vxlan, wifi ]
mode:
description:
- - This is the type of device or network connection that you wish to create for a bond, team or bridge.
+ - This is the type of device or network connection that you wish to create for a bond or bridge.
type: str
choices: [ 802.3ad, active-backup, balance-alb, balance-rr, balance-tlb, balance-xor, broadcast ]
default: balance-rr
@@ -265,6 +265,20 @@ options:
frame was received on.
type: bool
default: yes
runner:
description:
- This is the type of device or network connection that you wish to create for a team.
type: str
choices: [ broadcast, roundrobin, activebackup, loadbalance, lacp ]
default: roundrobin
version_added: 3.4.0
runner_hwaddr_policy:
description:
- This defines the policy of how hardware addresses of team device and port devices
should be set during the team lifetime.
type: str
choices: [ same_all, by_active, only_active ]
version_added: 3.4.0
vlanid:
description:
- This is only used with VLAN - VLAN ID in range <0-4095>.
@@ -719,6 +733,8 @@ class Nmcli(object):
self.hairpin = module.params['hairpin']
self.path_cost = module.params['path_cost']
self.mac = module.params['mac']
self.runner = module.params['runner']
self.runner_hwaddr_policy = module.params['runner_hwaddr_policy']
self.vlanid = module.params['vlanid']
self.vlandev = module.params['vlandev']
self.flags = module.params['flags']
@@ -826,6 +842,11 @@ class Nmcli(object):
'bridge.priority': self.priority,
'bridge.stp': self.stp,
})
elif self.type == 'team':
options.update({
'team.runner': self.runner,
'team.runner-hwaddr-policy': self.runner_hwaddr_policy,
})
elif self.type == 'bridge-slave':
options.update({
'connection.slave-type': 'bridge',
@@ -1214,6 +1235,11 @@ def main():
ageingtime=dict(type='int', default=300),
hairpin=dict(type='bool', default=True),
path_cost=dict(type='int', default=100),
# team specific vars
runner=dict(type='str', default='roundrobin',
choices=['broadcast', 'roundrobin', 'activebackup', 'loadbalance', 'lacp']),
# team active-backup runner specific options
runner_hwaddr_policy=dict(type='str', choices=['same_all', 'by_active', 'only_active']),
# vlan specific vars
vlanid=dict(type='int'),
vlandev=dict(type='str'),
@@ -1245,6 +1271,10 @@ def main():
# check for issues
if nmcli.conn_name is None:
nmcli.module.fail_json(msg="Please specify a name for the connection")
# team checks
if nmcli.type == "team":
if nmcli.runner_hwaddr_policy and not nmcli.runner == "activebackup":
nmcli.module.fail_json(msg="Runner-hwaddr-policy is only allowed for runner activebackup")
# team-slave checks
if nmcli.type == 'team-slave':
if nmcli.master is None:
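The team checks added to nmcli above reject `runner_hwaddr_policy` unless the runner is `activebackup`, since that policy is only meaningful for the active-backup runner. The guard in isolation (function name and structure are illustrative, not nmcli's actual API):

```python
def validate_team_options(conn_type, runner, runner_hwaddr_policy):
    # runner-hwaddr-policy is an option of the activebackup runner only,
    # so any other runner/policy combination is rejected up front.
    if conn_type == 'team' and runner_hwaddr_policy and runner != 'activebackup':
        raise ValueError('Runner-hwaddr-policy is only allowed for runner activebackup')

validate_team_options('team', 'activebackup', 'by_active')  # accepted
try:
    validate_team_options('team', 'roundrobin', 'same_all')
except ValueError as exc:
    print(exc)
```

Failing early in `main()` keeps the invalid combination from ever reaching the `nmcli` command line.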

View File

@@ -145,7 +145,7 @@ class Snap(CmdStateModuleHelper):
actionable_snaps=dict(fmt=lambda v: v),
state=dict(fmt=_state_map),
classic=dict(fmt="--classic", style=ArgFormat.BOOLEAN),
- channel=dict(fmt=lambda v: [] if v == 'stable' else ['--channel', '{0}']),
+ channel=dict(fmt=lambda v: [] if v == 'stable' else ['--channel', '{0}'.format(v)]),
)
check_rc = False
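The snap change above is a one-line bug fix: the original lambda returned the literal string `'{0}'` as the channel argument instead of interpolating the value. Side by side, with the surrounding ModuleHelper `fmt` machinery omitted:

```python
# Before: the placeholder is never formatted, so `snap` would receive
# the literal text '{0}' as the channel name.
def broken_channel_fmt(v):
    return [] if v == 'stable' else ['--channel', '{0}']

# After: the channel value is interpolated into the argument list.
def fixed_channel_fmt(v):
    return [] if v == 'stable' else ['--channel', '{0}'.format(v)]

print(broken_channel_fmt('edge'))   # ['--channel', '{0}']
print(fixed_channel_fmt('edge'))    # ['--channel', 'edge']
print(fixed_channel_fmt('stable'))  # []
```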

View File

@@ -114,6 +114,38 @@ options:
- Used to create a personal project under a user's name.
type: str
version_added: "3.3.0"
allow_merge_on_skipped_pipeline:
description:
- Allow merge when skipped pipelines exist.
type: bool
version_added: "3.4.0"
only_allow_merge_if_all_discussions_are_resolved:
description:
- All discussions on a merge request (MR) have to be resolved.
type: bool
version_added: "3.4.0"
only_allow_merge_if_pipeline_succeeds:
description:
- Only allow merges if pipeline succeeded.
type: bool
version_added: "3.4.0"
packages_enabled:
description:
- Enable GitLab package repository.
type: bool
version_added: "3.4.0"
remove_source_branch_after_merge:
description:
- Remove the source branch after merge.
type: bool
version_added: "3.4.0"
squash_option:
description:
- Squash commits when merging.
type: str
choices: ["never", "always", "default_off", "default_on"]
version_added: "3.4.0"
'''
EXAMPLES = r'''
@@ -214,6 +246,12 @@ class GitLabProject(object):
'snippets_enabled': options['snippets_enabled'],
'visibility': options['visibility'],
'lfs_enabled': options['lfs_enabled'],
'allow_merge_on_skipped_pipeline': options['allow_merge_on_skipped_pipeline'],
'only_allow_merge_if_all_discussions_are_resolved': options['only_allow_merge_if_all_discussions_are_resolved'],
'only_allow_merge_if_pipeline_succeeds': options['only_allow_merge_if_pipeline_succeeds'],
'packages_enabled': options['packages_enabled'],
'remove_source_branch_after_merge': options['remove_source_branch_after_merge'],
'squash_option': options['squash_option'],
}
# Because we have already called userExists in main()
if self.projectObject is None:
@@ -221,6 +259,7 @@ class GitLabProject(object):
'path': options['path'],
'import_url': options['import_url'],
})
project_options = self.getOptionsWithValue(project_options)
project = self.createProject(namespace, project_options)
changed = True
else:
@@ -254,6 +293,17 @@ class GitLabProject(object):
return project
'''
@param arguments Attributes of the project
'''
def getOptionsWithValue(self, arguments):
ret_arguments = dict()
for arg_key, arg_value in arguments.items():
if arguments[arg_key] is not None:
ret_arguments[arg_key] = arg_value
return ret_arguments
'''
@param project Project Object
@param arguments Attributes of the project
@@ -308,6 +358,12 @@ def main():
state=dict(type='str', default="present", choices=["absent", "present"]),
lfs_enabled=dict(default=False, type='bool'),
username=dict(type='str'),
allow_merge_on_skipped_pipeline=dict(type='bool'),
only_allow_merge_if_all_discussions_are_resolved=dict(type='bool'),
only_allow_merge_if_pipeline_succeeds=dict(type='bool'),
packages_enabled=dict(type='bool'),
remove_source_branch_after_merge=dict(type='bool'),
squash_option=dict(type='str', choices=['never', 'always', 'default_off', 'default_on']),
))
module = AnsibleModule(
@@ -340,6 +396,12 @@ def main():
state = module.params['state']
lfs_enabled = module.params['lfs_enabled']
username = module.params['username']
allow_merge_on_skipped_pipeline = module.params['allow_merge_on_skipped_pipeline']
only_allow_merge_if_all_discussions_are_resolved = module.params['only_allow_merge_if_all_discussions_are_resolved']
only_allow_merge_if_pipeline_succeeds = module.params['only_allow_merge_if_pipeline_succeeds']
packages_enabled = module.params['packages_enabled']
remove_source_branch_after_merge = module.params['remove_source_branch_after_merge']
squash_option = module.params['squash_option']
if not HAS_GITLAB_PACKAGE:
module.fail_json(msg=missing_required_lib("python-gitlab"), exception=GITLAB_IMP_ERR)
@@ -386,6 +448,7 @@ def main():
module.exit_json(changed=False, msg="Project deleted or does not exists")
if state == 'present':
if gitlab_project.createOrUpdateProject(project_name, namespace, {
"path": project_path,
"description": project_description,
@@ -396,7 +459,14 @@ def main():
"snippets_enabled": snippets_enabled,
"visibility": visibility,
"import_url": import_url,
"lfs_enabled": lfs_enabled}):
"lfs_enabled": lfs_enabled,
"allow_merge_on_skipped_pipeline": allow_merge_on_skipped_pipeline,
"only_allow_merge_if_all_discussions_are_resolved": only_allow_merge_if_all_discussions_are_resolved,
"only_allow_merge_if_pipeline_succeeds": only_allow_merge_if_pipeline_succeeds,
"packages_enabled": packages_enabled,
"remove_source_branch_after_merge": remove_source_branch_after_merge,
"squash_option": squash_option,
}):
module.exit_json(changed=True, msg="Successfully created or updated the project %s" % project_name, project=gitlab_project.projectObject._attrs)
module.exit_json(changed=False, msg="No need to update the project %s" % project_name, project=gitlab_project.projectObject._attrs)

View File

@@ -0,0 +1,201 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2021, Werner Dijkerman (ikben@werner-dijkerman.nl)
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
DOCUMENTATION = '''
module: gitlab_protected_branch
short_description: Protect or unprotect existing branches
version_added: 3.4.0
description:
- Protect or unprotect existing branches.
author:
- "Werner Dijkerman (@dj-wasabi)"
requirements:
- python >= 2.7
- python-gitlab >= 2.3.0
extends_documentation_fragment:
- community.general.auth_basic
options:
state:
description:
- Create or delete protected branch.
default: present
type: str
choices: ["present", "absent"]
api_token:
description:
- GitLab access token with API permissions.
required: true
type: str
project:
description:
- The path and name of the project.
required: true
type: str
name:
description:
- The name of the branch that needs to be protected.
- Can use a wildcard character, like C(production/*), or an exact branch name such as C(main) or C(develop).
required: true
type: str
merge_access_levels:
description:
- Access levels allowed to merge.
default: maintainer
type: str
choices: ["maintainer", "developer", "nobody"]
push_access_level:
description:
- Access levels allowed to push.
default: maintainer
type: str
choices: ["maintainer", "developer", "nobody"]
'''
EXAMPLES = '''
- name: Create protected branch on main
community.general.gitlab_protected_branch:
api_url: https://gitlab.com
api_token: secret_access_token
project: "dj-wasabi/collection.general"
name: main
merge_access_levels: maintainer
push_access_level: nobody
'''
RETURN = '''
'''
import traceback
from ansible.module_utils.basic import AnsibleModule, missing_required_lib
from ansible.module_utils.api import basic_auth_argument_spec
from distutils.version import LooseVersion
GITLAB_IMP_ERR = None
try:
import gitlab
HAS_GITLAB_PACKAGE = True
except Exception:
GITLAB_IMP_ERR = traceback.format_exc()
HAS_GITLAB_PACKAGE = False
from ansible_collections.community.general.plugins.module_utils.gitlab import gitlabAuthentication
class GitlabProtectedBranch(object):
def __init__(self, module, project, gitlab_instance):
self.repo = gitlab_instance
self._module = module
self.project = self.get_project(project)
self.ACCESS_LEVEL = {
'nobody': gitlab.NO_ACCESS,
'developer': gitlab.DEVELOPER_ACCESS,
'maintainer': gitlab.MAINTAINER_ACCESS
}
def get_project(self, project_name):
return self.repo.projects.get(project_name)
def protected_branch_exist(self, name):
try:
return self.project.protectedbranches.get(name)
except Exception:
return False
def create_protected_branch(self, name, merge_access_levels, push_access_level):
if self._module.check_mode:
return True
merge = self.ACCESS_LEVEL[merge_access_levels]
push = self.ACCESS_LEVEL[push_access_level]
self.project.protectedbranches.create({
'name': name,
'merge_access_level': merge,
'push_access_level': push
})
def compare_protected_branch(self, name, merge_access_levels, push_access_level):
configured_merge = self.ACCESS_LEVEL[merge_access_levels]
configured_push = self.ACCESS_LEVEL[push_access_level]
current = self.protected_branch_exist(name=name)
# check existence first: reading access levels from a missing branch would fail
if not current:
return False
current_merge = current.merge_access_levels[0]['access_level']
current_push = current.push_access_levels[0]['access_level']
return current.name == name and current_merge == configured_merge and current_push == configured_push
def delete_protected_branch(self, name):
if self._module.check_mode:
return True
return self.project.protectedbranches.delete(name)
def main():
argument_spec = basic_auth_argument_spec()
argument_spec.update(
api_token=dict(type='str', required=True, no_log=True),
project=dict(type='str', required=True),
name=dict(type='str', required=True),
merge_access_levels=dict(type='str', default="maintainer", choices=["maintainer", "developer", "nobody"]),
push_access_level=dict(type='str', default="maintainer", choices=["maintainer", "developer", "nobody"]),
state=dict(type='str', default="present", choices=["absent", "present"]),
)
module = AnsibleModule(
argument_spec=argument_spec,
mutually_exclusive=[
['api_username', 'api_token'],
['api_password', 'api_token'],
],
required_together=[
['api_username', 'api_password'],
],
required_one_of=[
['api_username', 'api_token']
],
supports_check_mode=True
)
project = module.params['project']
name = module.params['name']
merge_access_levels = module.params['merge_access_levels']
push_access_level = module.params['push_access_level']
state = module.params['state']
if not HAS_GITLAB_PACKAGE:
module.fail_json(msg=missing_required_lib("python-gitlab"), exception=GITLAB_IMP_ERR)
gitlab_version = gitlab.__version__
if LooseVersion(gitlab_version) < LooseVersion('2.3.0'):
module.fail_json(msg="community.general.gitlab_proteched_branch requires python-gitlab Python module >= 2.3.0 (installed version: [%s])."
" Please upgrade python-gitlab to version 2.3.0 or above." % gitlab_version)
gitlab_instance = gitlabAuthentication(module)
this_gitlab = GitlabProtectedBranch(module=module, project=project, gitlab_instance=gitlab_instance)
p_branch = this_gitlab.protected_branch_exist(name=name)
if not p_branch and state == "present":
this_gitlab.create_protected_branch(name=name, merge_access_levels=merge_access_levels, push_access_level=push_access_level)
module.exit_json(changed=True, msg="Created the proteched branch.")
elif p_branch and state == "present":
if not this_gitlab.compare_protected_branch(name, merge_access_levels, push_access_level):
this_gitlab.delete_protected_branch(name=name)
this_gitlab.create_protected_branch(name=name, merge_access_levels=merge_access_levels, push_access_level=push_access_level)
module.exit_json(changed=True, msg="Recreated the proteched branch.")
elif p_branch and state == "absent":
this_gitlab.delete_protected_branch(name=name)
module.exit_json(changed=True, msg="Deleted the proteched branch.")
module.exit_json(changed=False, msg="No changes are needed.")
if __name__ == '__main__':
main()
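`GitlabProtectedBranch` compares the configured symbolic access levels against the numeric levels the API reports. A dependency-free sketch with the GitLab access-level numbers inlined (0, 30 and 40 are the documented values for no access, developer and maintainer; the module reads them from the `gitlab` package as `gitlab.NO_ACCESS`, `gitlab.DEVELOPER_ACCESS` and `gitlab.MAINTAINER_ACCESS`):

```python
ACCESS_LEVEL = {'nobody': 0, 'developer': 30, 'maintainer': 40}

def branch_matches(current, merge_access_levels, push_access_level):
    # `current` is falsy when the branch is not protected yet; guard before
    # reading its access levels.
    if not current:
        return False
    return (current['merge_access_levels'][0]['access_level'] == ACCESS_LEVEL[merge_access_levels]
            and current['push_access_levels'][0]['access_level'] == ACCESS_LEVEL[push_access_level])

current = {'merge_access_levels': [{'access_level': 40}],
           'push_access_levels': [{'access_level': 0}]}
print(branch_matches(current, 'maintainer', 'nobody'))     # True
print(branch_matches(current, 'maintainer', 'developer'))  # False
```

When the levels do not match, the module deletes and recreates the protected branch, since the protection API offers no in-place update.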

View File

@@ -1,6 +1,7 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2021, quidame <quidame@poivron.org>
# Copyright: (c) 2013, Alexander Bulimov <lazywolf0@gmail.com>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
@@ -12,6 +13,7 @@ DOCUMENTATION = '''
---
author:
- Alexander Bulimov (@abulimov)
- quidame (@quidame)
module: filesystem
short_description: Makes a filesystem
description:
@@ -30,25 +32,22 @@ options:
default: present
version_added: 1.3.0
fstype:
- choices: [ btrfs, ext2, ext3, ext4, ext4dev, f2fs, lvm, ocfs2, reiserfs, xfs, vfat, swap ]
+ choices: [ btrfs, ext2, ext3, ext4, ext4dev, f2fs, lvm, ocfs2, reiserfs, xfs, vfat, swap, ufs ]
description:
- Filesystem type to be created. This option is required with
C(state=present) (or if I(state) is omitted).
- reiserfs support was added in 2.2.
- lvm support was added in 2.5.
- since 2.5, I(dev) can be an image file.
- vfat support was added in 2.5
- ocfs2 support was added in 2.6
- f2fs support was added in 2.7
- swap support was added in 2.8
- ufs support has been added in community.general 3.4.0.
type: str
aliases: [type]
dev:
description:
- - Target path to block device or regular file.
- - On systems not using block devices but character devices instead (as
- FreeBSD), this module only works when applying to regular files, aka
- disk images.
+ - Target path to block device (Linux) or character device (FreeBSD) or
+ regular file (both).
+ - When setting Linux-specific filesystem types on FreeBSD, this module
+ only works when applying to regular files, aka disk images.
+ - Currently C(lvm) (Linux-only) and C(ufs) (FreeBSD-only) don't support
+ a regular file as their target I(dev).
+ - Support for character devices on FreeBSD has been added in community.general 3.4.0.
type: path
required: yes
aliases: [device]
@@ -60,7 +59,7 @@ options:
resizefs:
description:
- If C(yes), if the block device and filesystem size differ, grow the filesystem into the space.
- - Supported for C(ext2), C(ext3), C(ext4), C(ext4dev), C(f2fs), C(lvm), C(xfs) and C(vfat) filesystems.
+ - Supported for C(ext2), C(ext3), C(ext4), C(ext4dev), C(f2fs), C(lvm), C(xfs), C(ufs) and C(vfat) filesystems.
Attempts to resize other filesystem types will fail.
- XFS will only grow if mounted. Currently, the module is based on commands
from C(util-linux) package to perform operations, so resizing of XFS is
@@ -73,16 +72,24 @@ options:
- List of options to be passed to mkfs command.
type: str
requirements:
- - Uses tools related to the I(fstype) (C(mkfs)) and the C(blkid) command.
- - When I(resizefs) is enabled, C(blockdev) command is required too.
+ - Uses specific tools related to the I(fstype) for creating or resizing a
+ filesystem (from packages e2fsprogs, xfsprogs, dosfstools, and so on).
+ - Uses generic tools mostly related to the Operating System (Linux or
+ FreeBSD) or available on both, as C(blkid).
+ - On FreeBSD, either C(util-linux) or C(e2fsprogs) package is required.
notes:
-- Potential filesystem on I(dev) are checked using C(blkid). In case C(blkid)
-  isn't able to detect an existing filesystem, this filesystem is overwritten
-  even if I(force) is C(no).
-- On FreeBSD systems, either C(e2fsprogs) or C(util-linux) packages provide
-  a C(blkid) command that is compatible with this module, when applied to
-  regular files.
+- Potential filesystems on I(dev) are checked using C(blkid). In case C(blkid)
+  is unable to detect a filesystem (and in case C(fstyp) on FreeBSD is also
+  unable to detect a filesystem), this filesystem is overwritten even if
+  I(force) is C(no).
+- On FreeBSD systems, both C(e2fsprogs) and C(util-linux) packages provide
+  a C(blkid) command that is compatible with this module. However, these
+  packages conflict with each other, and only the C(util-linux) package
+  provides the command required to not fail when I(state=absent).
- This module supports I(check_mode).
seealso:
- module: community.general.filesize
- module: ansible.posix.mount
'''
EXAMPLES = '''
@@ -101,6 +108,11 @@ EXAMPLES = '''
community.general.filesystem:
dev: /dev/sdb1
state: absent
+- name: Create a filesystem on top of a regular file
+  community.general.filesystem:
+    dev: /path/to/disk.img
+    fstype: vfat
'''
from distutils.version import LooseVersion
@@ -125,6 +137,10 @@ class Device(object):
blockdev_cmd = self.module.get_bin_path("blockdev", required=True)
dummy, out, dummy = self.module.run_command([blockdev_cmd, "--getsize64", self.path], check_rc=True)
devsize_in_bytes = int(out)
+        elif stat.S_ISCHR(statinfo.st_mode) and platform.system() == 'FreeBSD':
+            diskinfo_cmd = self.module.get_bin_path("diskinfo", required=True)
+            dummy, out, dummy = self.module.run_command([diskinfo_cmd, self.path], check_rc=True)
+            devsize_in_bytes = int(out.split()[2])
elif os.path.isfile(self.path):
devsize_in_bytes = os.path.getsize(self.path)
else:
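The new FreeBSD branch reads the device size from the third whitespace-separated field of `diskinfo` output (the media size in bytes). A standalone sketch of that parsing; the sample line is a fabricated illustration of the output shape, not captured from a real device:

```python
def parse_diskinfo_size(out):
    """Return the media size in bytes from `diskinfo <dev>` output.

    diskinfo prints device name, sector size, media size in bytes,
    media size in sectors, ... -- the branch above takes field index 2.
    """
    fields = out.split()
    if len(fields) < 3:
        raise ValueError("unexpected diskinfo output: %r" % out)
    return int(fields[2])

# Fabricated sample of the output shape:
sample = "ada0\t512\t10737418240\t20971520"
print(parse_diskinfo_size(sample))  # -> 10737418240
```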
@@ -423,6 +439,31 @@ class Swap(Filesystem):
MKFS_FORCE_FLAGS = ['-f']
+class UFS(Filesystem):
+    MKFS = 'newfs'
+    INFO = 'dumpfs'
+    GROW = 'growfs'
+    GROW_MAX_SPACE_FLAGS = ['-y']
+
+    def get_fs_size(self, dev):
+        """Get providersize and fragment size and return their product."""
+        cmd = self.module.get_bin_path(self.INFO, required=True)
+        dummy, out, dummy = self.module.run_command([cmd, str(dev)], check_rc=True, environ_update=self.LANG_ENV)
+        fragmentsize = providersize = None
+        for line in out.splitlines():
+            if line.startswith('fsize'):
+                fragmentsize = int(line.split()[1])
+            elif 'providersize' in line:
+                providersize = int(line.split()[-1])
+            if None not in (fragmentsize, providersize):
+                break
+        else:
+            raise ValueError(out)
+        return fragmentsize * providersize
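`UFS.get_fs_size()` multiplies the fragment size (`fsize`) by the provider size scraped from `dumpfs` output. The same scan isolated as a pure function, testable without a UFS device; the two-line excerpt below is a fabricated stand-in for real `dumpfs` output:

```python
def ufs_size_from_dumpfs(out):
    """Product of fragment size (fsize) and provider size, mirroring
    the parsing loop in UFS.get_fs_size() above."""
    fragmentsize = providersize = None
    for line in out.splitlines():
        if line.startswith('fsize'):
            fragmentsize = int(line.split()[1])
        elif 'providersize' in line:
            providersize = int(line.split()[-1])
        if None not in (fragmentsize, providersize):
            return fragmentsize * providersize
    raise ValueError(out)

# Fabricated excerpt -- real dumpfs output has many more fields:
sample = "fsize\t4096\nproviderblocksize 512 providersize 2097152"
print(ufs_size_from_dumpfs(sample))  # -> 8589934592
```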
FILESYSTEMS = {
'ext2': Ext2,
'ext3': Ext3,
@@ -436,6 +477,7 @@ FILESYSTEMS = {
'ocfs2': Ocfs2,
'LVM2_member': LVM,
'swap': Swap,
+    'ufs': UFS,
}
@@ -484,11 +526,16 @@ def main():
dev = Device(module, dev)
+    # In case blkid/fstyp isn't able to identify an existing filesystem, device
+    # is considered as empty, then this existing filesystem would be overwritten
+    # even if force isn't enabled.
    cmd = module.get_bin_path('blkid', required=True)
    rc, raw_fs, err = module.run_command([cmd, '-c', os.devnull, '-o', 'value', '-s', 'TYPE', str(dev)])
-    # In case blkid isn't able to identify an existing filesystem, device is considered as empty,
-    # then this existing filesystem would be overwritten even if force isn't enabled.
    fs = raw_fs.strip()
+    if not fs and platform.system() == 'FreeBSD':
+        cmd = module.get_bin_path('fstyp', required=True)
+        rc, raw_fs, err = module.run_command([cmd, str(dev)])
+        fs = raw_fs.strip()
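The detection logic tries `blkid` first and, when it reports nothing on FreeBSD, falls back to `fstyp`. The same first-non-empty-answer pattern as a pure helper; the probe callables here are stand-ins for the real command invocations:

```python
def detect_fs(probes):
    """Return the first non-empty, stripped result from a sequence of
    probe callables, or '' if none of them identifies a filesystem."""
    for probe in probes:
        fs = probe().strip()
        if fs:
            return fs
    return ''

# Stand-ins for `blkid -o value -s TYPE <dev>` and FreeBSD's `fstyp <dev>`:
blkid_probe = lambda: "\n"     # blkid found nothing
fstyp_probe = lambda: "ufs\n"  # fstyp identifies UFS
print(detect_fs([blkid_probe, fstyp_probe]))  # -> ufs
```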
if state == "present":
if fstype in friendly_names:


@@ -159,7 +159,7 @@ class Plist:
"""Finds the plist file associated with a service"""
launchd_paths = [
-            os.path.expanduser('~/Library/LaunchAgents'),
+            os.path.join(os.getenv('HOME'), 'Library/LaunchAgents'),
'/Library/LaunchAgents',
'/Library/LaunchDaemons',
'/System/Library/LaunchAgents',


@@ -733,14 +733,19 @@ class PamdService(object):
        lines = []
        current_line = self._head
+        mark = "# Updated by Ansible - %s" % datetime.now().isoformat()
        while current_line is not None:
            lines.append(str(current_line))
            current_line = current_line.next
-        if lines[1].startswith("# Updated by Ansible"):
-            lines.pop(1)
-        lines.insert(1, "# Updated by Ansible - " + datetime.now().isoformat())
+        if len(lines) <= 1:
+            lines.insert(0, "")
+            lines.insert(1, mark)
+        else:
+            if lines[1].startswith("# Updated by Ansible"):
+                lines[1] = mark
+            else:
+                lines.insert(1, mark)
return '\n'.join(lines) + '\n'
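The fix makes the "Updated by Ansible" mark safe for rules files shorter than two lines: short files are padded, an existing mark is replaced in place, and otherwise the mark is inserted as line two. The same logic isolated as a list transform (the function name is ours, for illustration):

```python
from datetime import datetime

def place_mark(lines, mark=None):
    """Insert or refresh an '# Updated by Ansible - <timestamp>' mark as
    the second line, padding files that are empty or one line long."""
    if mark is None:
        mark = "# Updated by Ansible - %s" % datetime.now().isoformat()
    lines = list(lines)
    if len(lines) <= 1:
        lines.insert(0, "")
        lines.insert(1, mark)
    elif lines[1].startswith("# Updated by Ansible"):
        lines[1] = mark  # refresh the timestamp in place
    else:
        lines.insert(1, mark)
    return lines

print(place_mark(["#%PAM-1.0", "auth required pam_unix.so"], mark="# Updated by Ansible - T"))
# -> ['#%PAM-1.0', '# Updated by Ansible - T', 'auth required pam_unix.so']
```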


@@ -33,12 +33,12 @@ options:
type: str
description:
- Password to authenticate with the Jenkins server.
-      - This is a required parameter, if C(token) is not provided.
+      - This is mutually exclusive with I(token).
token:
type: str
description:
- API token used to authenticate with the Jenkins server.
-      - This is a required parameter, if C(password) is not provided.
+      - This is mutually exclusive with I(password).
url:
type: str
description:
@@ -59,6 +59,11 @@ author:
'''
EXAMPLES = '''
# Get all Jenkins jobs anonymously
- community.general.jenkins_job_info:
user: admin
register: my_jenkins_job_info
# Get all Jenkins jobs using basic auth
- community.general.jenkins_job_info:
user: admin
@@ -232,9 +237,6 @@ def main():
['password', 'token'],
['name', 'glob'],
],
-        required_one_of=[
-            ['password', 'token'],
-        ],
supports_check_mode=True,
)
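With authentication now optional, the module keeps `['password', 'token']` under `mutually_exclusive` but drops `required_one_of`: supplying neither (anonymous access) is accepted, supplying both still fails. A minimal sketch of that check outside `AnsibleModule` (the function name is ours):

```python
def check_auth_params(params):
    """Mimic mutually_exclusive=[['password', 'token']] without
    required_one_of: zero or one of the two may be set, never both."""
    supplied = [name for name in ('password', 'token') if params.get(name) is not None]
    if len(supplied) > 1:
        raise ValueError("parameters are mutually exclusive: password|token")
    return supplied  # [] means anonymous access

print(check_auth_params({}))                   # -> [] (anonymous is fine)
print(check_auth_params({'token': 'abc123'}))  # -> ['token']
```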


@@ -1,22 +0,0 @@
---
- name: Create broken link
file:
src: /nowhere
dest: "{{ output_dir }}/nowhere.txt"
state: link
force: yes
- name: Archive broken link (tar.gz)
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_broken_link.tar.gz"
- name: Archive broken link (tar.bz2)
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_broken_link.tar.bz2"
- name: Archive broken link (zip)
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_broken_link.zip"


@@ -22,6 +22,7 @@
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make sure we start fresh
# Test setup
- name: Ensure zip is present to create test archive (yum)
yum: name=zip state=latest
when: ansible_facts.pkg_mgr == 'yum'
@@ -82,400 +83,52 @@
- sub
- sub/subfile.txt
- name: archive using gz
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_01.gz"
format: gz
register: archive_gz_result_01
- debug: msg="{{ archive_gz_result_01 }}"
- name: verify that the files archived
file: path={{output_dir}}/archive_01.gz state=file
- name: check if gz file exists and includes all text files
assert:
that:
- "{{ archive_gz_result_01.changed }}"
- "{{ 'archived' in archive_gz_result_01 }}"
- "{{ archive_gz_result_01['archived'] | length }} == 3"
- name: archive using zip
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_01.zip"
format: zip
register: archive_zip_result_01
- debug: msg="{{ archive_zip_result_01 }}"
- name: verify that the files archived
file: path={{output_dir}}/archive_01.zip state=file
- name: check if zip file exists
assert:
that:
- "{{ archive_zip_result_01.changed }}"
- "{{ 'archived' in archive_zip_result_01 }}"
- "{{ archive_zip_result_01['archived'] | length }} == 3"
- name: archive using bz2
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_01.bz2"
format: bz2
register: archive_bz2_result_01
- debug: msg="{{ archive_bz2_result_01 }}"
- name: verify that the files archived
file: path={{output_dir}}/archive_01.bz2 state=file
- name: check if bzip file exists
assert:
that:
- "{{ archive_bz2_result_01.changed }}"
- "{{ 'archived' in archive_bz2_result_01 }}"
- "{{ archive_bz2_result_01['archived'] | length }} == 3"
- name: archive using xz
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_01.xz"
format: xz
register: archive_xz_result_01
- debug: msg="{{ archive_xz_result_01 }}"
- name: verify that the files archived
file: path={{output_dir}}/archive_01.xz state=file
- name: check if xz file exists
assert:
that:
- "{{ archive_xz_result_01.changed }}"
- "{{ 'archived' in archive_xz_result_01 }}"
- "{{ archive_xz_result_01['archived'] | length }} == 3"
- name: archive and set mode to 0600
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_02.gz"
format: gz
mode: "u+rwX,g-rwx,o-rwx"
register: archive_bz2_result_02
- name: Test that the file modes were changed
stat:
path: "{{ output_dir }}/archive_02.gz"
register: archive_02_gz_stat
- debug: msg="{{ archive_02_gz_stat}}"
- name: Test that the file modes were changed
assert:
that:
- archive_02_gz_stat is not changed
- "archive_02_gz_stat.stat.mode == '0600'"
- "'archived' in archive_bz2_result_02"
- "{{ archive_bz2_result_02['archived']| length}} == 3"
- name: remove our gz
file: path="{{ output_dir }}/archive_02.gz" state=absent
- name: archive and set mode to 0600
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_02.zip"
format: zip
mode: "u+rwX,g-rwx,o-rwx"
register: archive_zip_result_02
- name: Test that the file modes were changed
stat:
path: "{{ output_dir }}/archive_02.zip"
register: archive_02_zip_stat
- name: Test that the file modes were changed
assert:
that:
- archive_02_zip_stat is not changed
- "archive_02_zip_stat.stat.mode == '0600'"
- "'archived' in archive_zip_result_02"
- "{{ archive_zip_result_02['archived']| length}} == 3"
- name: remove our zip
file: path="{{ output_dir }}/archive_02.zip" state=absent
- name: archive and set mode to 0600
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_02.bz2"
format: bz2
mode: "u+rwX,g-rwx,o-rwx"
register: archive_bz2_result_02
- name: Test that the file modes were changed
stat:
path: "{{ output_dir }}/archive_02.bz2"
register: archive_02_bz2_stat
- name: Test that the file modes were changed
assert:
that:
- archive_02_bz2_stat is not changed
- "archive_02_bz2_stat.stat.mode == '0600'"
- "'archived' in archive_bz2_result_02"
- "{{ archive_bz2_result_02['archived']| length}} == 3"
- name: remove our bz2
file: path="{{ output_dir }}/archive_02.bz2" state=absent
- name: archive and set mode to 0600
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_02.xz"
format: xz
mode: "u+rwX,g-rwx,o-rwx"
register: archive_xz_result_02
- name: Test that the file modes were changed
stat:
path: "{{ output_dir }}/archive_02.xz"
register: archive_02_xz_stat
- name: Test that the file modes were changed
assert:
that:
- archive_02_xz_stat is not changed
- "archive_02_xz_stat.stat.mode == '0600'"
- "'archived' in archive_xz_result_02"
- "{{ archive_xz_result_02['archived']| length}} == 3"
- name: remove our xz
file: path="{{ output_dir }}/archive_02.xz" state=absent
- name: archive multiple files as list
archive:
path:
- "{{ output_dir }}/empty.txt"
- "{{ output_dir }}/foo.txt"
- "{{ output_dir }}/bar.txt"
dest: "{{ output_dir }}/archive_list.gz"
format: gz
register: archive_gz_list_result
- name: verify that the files archived
file: path={{output_dir}}/archive_list.gz state=file
- name: check if gz file exists and includes all text files
assert:
that:
- "{{ archive_gz_list_result.changed }}"
- "{{ 'archived' in archive_gz_list_result }}"
- "{{ archive_gz_list_result['archived'] | length }} == 3"
- name: remove our gz
file: path="{{ output_dir }}/archive_list.gz" state=absent
- name: test that gz archive that contains non-ascii filenames
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/test-archive-nonascii-くらとみ.tar.gz"
format: gz
register: nonascii_result_0
- name: Check that file is really there
stat:
path: "{{ output_dir }}/test-archive-nonascii-くらとみ.tar.gz"
register: nonascii_stat0
- name: Assert that nonascii tests succeeded
assert:
that:
- nonascii_result_0 is changed
- "nonascii_stat0.stat.exists == true"
- name: remove nonascii test
file: path="{{ output_dir }}/test-archive-nonascii-くらとみ.tar.gz" state=absent
- name: test that bz2 archive that contains non-ascii filenames
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/test-archive-nonascii-くらとみ.bz2"
format: bz2
register: nonascii_result_1
- name: Check that file is really there
stat:
path: "{{ output_dir }}/test-archive-nonascii-くらとみ.bz2"
register: nonascii_stat_1
- name: Assert that nonascii tests succeeded
assert:
that:
- nonascii_result_1 is changed
- "nonascii_stat_1.stat.exists == true"
- name: remove nonascii test
file: path="{{ output_dir }}/test-archive-nonascii-くらとみ.bz2" state=absent
- name: test that xz archive that contains non-ascii filenames
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/test-archive-nonascii-くらとみ.xz"
format: xz
register: nonascii_result_1
- name: Check that file is really there
stat:
path: "{{ output_dir }}/test-archive-nonascii-くらとみ.xz"
register: nonascii_stat_1
- name: Assert that nonascii tests succeeded
assert:
that:
- nonascii_result_1 is changed
- "nonascii_stat_1.stat.exists == true"
- name: remove nonascii test
file: path="{{ output_dir }}/test-archive-nonascii-くらとみ.xz" state=absent
- name: test that zip archive that contains non-ascii filenames
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/test-archive-nonascii-くらとみ.zip"
format: zip
register: nonascii_result_2
- name: Check that file is really there
stat:
path: "{{ output_dir }}/test-archive-nonascii-くらとみ.zip"
register: nonascii_stat_2
- name: Assert that nonascii tests succeeded
assert:
that:
- nonascii_result_2 is changed
- "nonascii_stat_2.stat.exists == true"
- name: remove nonascii test
file: path="{{ output_dir }}/test-archive-nonascii-くらとみ.zip" state=absent
- name: Test exclusion_patterns option
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/test-archive-exclusion-patterns.tgz"
exclusion_patterns: b?r.*
register: exclusion_patterns_result
- name: Assert that exclusion_patterns only archives included files
assert:
that:
- exclusion_patterns_result is changed
- "'bar.txt' not in exclusion_patterns_result.archived"
- name: Test that excluded paths do not influence archive root
archive:
path:
- "{{ output_dir }}/sub/subfile.txt"
- "{{ output_dir }}"
exclude_path:
- "{{ output_dir }}"
dest: "{{ output_dir }}/test-archive-root.tgz"
register: archive_root_result
- name: Assert that excluded paths do not influence archive root
assert:
that:
- archive_root_result.arcroot != output_dir
- name: Remove archive root test
file:
path: "{{ output_dir }}/test-archive-root.tgz"
state: absent
- name: Test Single Target with format={{ item }}
archive:
path: "{{ output_dir }}/foo.txt"
dest: "{{ output_dir }}/test-single-target.{{ item }}"
format: "{{ item }}"
register: "single_target_test"
loop:
- zip
- tar
- gz
- bz2
- xz
# Dummy tests until ``dest_state`` result value can be implemented
- name: Assert that single target tests are effective
assert:
that:
- single_target_test.results[0] is changed
- single_target_test.results[1] is changed
- single_target_test.results[2] is changed
- single_target_test.results[3] is changed
- single_target_test.results[4] is changed
- name: Retrieve contents of single target archives
ansible.builtin.unarchive:
src: "{{ output_dir }}/test-single-target.zip"
dest: .
list_files: true
check_mode: true
ignore_errors: true
register: single_target_test_contents
- name: Assert that file names in single-file zip archives are preserved
assert:
that:
- "'oo.txt' not in single_target_test_contents.files"
- "'foo.txt' in single_target_test_contents.files"
# ``unarchive`` fails for RHEL and FreeBSD on ansible 2.x
when: single_target_test_contents is success and single_target_test_contents is not skipped
- name: Remove single target test with format={{ item }}
file:
path: "{{ output_dir }}/test-single-target.{{ item }}"
state: absent
loop:
- zip
- tar
- gz
- bz2
- xz
- name: Test that missing files result in incomplete state
archive:
path:
- "{{ output_dir }}/*.txt"
- "{{ output_dir }}/dne.txt"
exclude_path: "{{ output_dir }}/foo.txt"
dest: "{{ output_dir }}/test-incomplete-archive.tgz"
register: incomplete_archive_result
- name: Assert that incomplete archive has incomplete state
assert:
that:
- incomplete_archive_result is changed
- "'{{ output_dir }}/dne.txt' in incomplete_archive_result.missing"
- "'{{ output_dir }}/foo.txt' not in incomplete_archive_result.missing"
- name: Remove incomplete archive
file:
path: "{{ output_dir }}/test-incomplete-archive.tgz"
state: absent
- name: Define formats to test
set_fact:
formats:
- tar
- zip
- gz
- bz2
- xz
# Run tests
- name: Run core tests
include_tasks:
file: ../tests/core.yml
loop: "{{ formats }}"
loop_control:
loop_var: format
- name: Run exclusions tests
include_tasks:
file: ../tests/exclusions.yml
loop: "{{ formats }}"
loop_control:
loop_var: format
- name: Run remove tests
include_tasks:
file: ../tests/remove.yml
loop: "{{ formats }}"
loop_control:
loop_var: format
- name: Run broken link tests
include_tasks:
file: ../tests/broken-link.yml
loop: "{{ formats }}"
loop_control:
loop_var: format
- name: Run Idempotency tests
include_tasks:
file: ../tests/idempotency.yml
loop: "{{ formats }}"
loop_control:
loop_var: format
# Test cleanup
- name: Remove backports.lzma if previously installed (pip)
pip: name=backports.lzma state=absent
when: backports_lzma_pip is changed
- name: import remove tests
import_tasks: remove.yml
- name: import broken-link tests
import_tasks: broken-link.yml


@@ -1,186 +0,0 @@
---
- name: archive using gz and remove src files
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_remove_01.gz"
format: gz
remove: yes
register: archive_remove_result_01
- debug: msg="{{ archive_remove_result_01 }}"
- name: verify that the files archived
file: path={{ output_dir }}/archive_remove_01.gz state=file
- name: check if gz file exists and includes all text files and src files has been removed
assert:
that:
- "{{ archive_remove_result_01.changed }}"
- "{{ 'archived' in archive_remove_result_01 }}"
- "{{ archive_remove_result_01['archived'] | length }} == 3"
- name: remove our gz
file: path="{{ output_dir }}/archive_remove_01.gz" state=absent
- name: check if src files has been removed
assert:
that:
- "'{{ output_dir }}/{{ item }}' is not exists"
with_items:
- foo.txt
- bar.txt
- empty.txt
- name: prep our files again
copy: src={{ item }} dest={{ output_dir }}/{{ item }}
with_items:
- foo.txt
- bar.txt
- empty.txt
- name: create a temporary directory to be check if it will be removed
file:
path: "{{ output_dir }}/tmpdir"
state: directory
- name: prep our files in tmpdir
copy: src={{ item }} dest={{ output_dir }}/tmpdir/{{ item }}
with_items:
- foo.txt
- bar.txt
- empty.txt
- name: archive using gz and remove src directory
archive:
path: "{{ output_dir }}/tmpdir"
dest: "{{ output_dir }}/archive_remove_02.gz"
format: gz
remove: yes
register: archive_remove_result_02
- debug: msg="{{ archive_remove_result_02 }}"
- name: verify that the files archived
file: path={{ output_dir }}/archive_remove_02.gz state=file
- name: check if gz file exists and includes all text files
assert:
that:
- "{{ archive_remove_result_02.changed }}"
- "{{ 'archived' in archive_remove_result_02 }}"
- "{{ archive_remove_result_02['archived'] | length }} == 3"
- name: remove our gz
file: path="{{ output_dir }}/archive_remove_02.gz" state=absent
- name: check if src folder has been removed
assert:
that:
- "'{{ output_dir }}/tmpdir' is not exists"
- name: create temporary directory again
file:
path: "{{ output_dir }}/tmpdir"
state: directory
- name: prep our files in tmpdir again
copy: src={{ item }} dest={{ output_dir }}/tmpdir/{{ item }}
with_items:
- foo.txt
- bar.txt
- empty.txt
- name: archive using gz and remove src directory excluding one file
archive:
path: "{{ output_dir }}/tmpdir/*"
dest: "{{ output_dir }}/archive_remove_03.gz"
format: gz
remove: yes
exclude_path: "{{ output_dir }}/tmpdir/empty.txt"
register: archive_remove_result_03
- debug: msg="{{ archive_remove_result_03 }}"
- name: verify that the files archived
file: path={{ output_dir }}/archive_remove_03.gz state=file
- name: check if gz file exists and includes all text files
assert:
that:
- "{{ archive_remove_result_03.changed }}"
- "{{ 'archived' in archive_remove_result_03 }}"
- "{{ archive_remove_result_03['archived'] | length }} == 2"
- name: remove our gz
file: path="{{ output_dir }}/archive_remove_03.gz" state=absent
- name: verify that excluded file is still present
file: path={{ output_dir }}/tmpdir/empty.txt state=file
- name: prep our files in tmpdir again
copy: src={{ item }} dest={{ output_dir }}/tmpdir/{{ item }}
with_items:
- foo.txt
- bar.txt
- empty.txt
- sub
- sub/subfile.txt
- name: archive using gz and remove src directory
archive:
path:
- "{{ output_dir }}/tmpdir/*.txt"
- "{{ output_dir }}/tmpdir/sub/*"
dest: "{{ output_dir }}/archive_remove_04.gz"
format: gz
remove: yes
exclude_path: "{{ output_dir }}/tmpdir/sub/subfile.txt"
register: archive_remove_result_04
- debug: msg="{{ archive_remove_result_04 }}"
- name: verify that the files archived
file: path={{ output_dir }}/archive_remove_04.gz state=file
- name: remove our gz
file: path="{{ output_dir }}/archive_remove_04.gz" state=absent
- name: verify that excluded sub file is still present
file: path={{ output_dir }}/tmpdir/sub/subfile.txt state=file
- name: prep our files in tmpdir again
copy: src={{ item }} dest={{ output_dir }}/tmpdir/{{ item }}
with_items:
- foo.txt
- bar.txt
- empty.txt
- sub
- sub/subfile.txt
- name: archive using gz and remove src directory
archive:
path:
- "{{ output_dir }}/tmpdir/"
dest: "{{ output_dir }}/archive_remove_05.gz"
format: gz
remove: yes
exclude_path: "{{ output_dir }}/tmpdir/sub/subfile.txt"
register: archive_remove_result_05
- name: verify that the files archived
file: path={{ output_dir }}/archive_remove_05.gz state=file
- name: Verify source files were removed
file:
path: "{{ output_dir }}/tmpdir"
state: absent
register: archive_source_file_removal_05
- name: Verify that task status is success
assert:
that:
- archive_remove_result_05 is success
- archive_source_file_removal_05 is not changed
- name: remove our gz
file: path="{{ output_dir }}/archive_remove_05.gz" state=absent


@@ -0,0 +1,31 @@
---
- block:
- name: Create link - broken link ({{ format }})
file:
src: /nowhere
dest: "{{ output_dir }}/nowhere.txt"
state: link
force: yes
- name: Archive - broken link ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_broken_link.{{ format }}"
format: "{{ format }}"
- name: Verify archive exists - broken link ({{ format }})
file:
path: "{{ output_dir }}/archive_broken_link.{{ format }}"
state: file
- name: Remove archive - broken link ({{ format }})
file:
path: "{{ output_dir }}/archive_broken_link.{{ format }}"
state: absent
- name: Remove link - broken link ({{ format }})
file:
path: "{{ output_dir }}/nowhere.txt"
state: absent
  # 'zip' does not support symlinks
when: format != 'zip'


@@ -0,0 +1,188 @@
####################################################################
# WARNING: These are designed specifically for Ansible tests #
# and should not be used as examples of how to write Ansible roles #
####################################################################
# Test code for the archive module.
# (c) 2017, Abhijeet Kasurde <akasurde@redhat.com>
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make sure we start fresh
# Core functionality tests
- name: Archive - no options ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_no_opts.{{ format }}"
format: "{{ format }}"
register: archive_no_options
- name: Verify that archive exists - no options ({{ format }})
file:
path: "{{output_dir}}/archive_no_opts.{{ format }}"
state: file
- name: Verify that archive result is changed and includes all files - no options ({{ format }})
assert:
that:
- archive_no_options is changed
- "archive_no_options.dest_state == 'archive'"
- "{{ archive_no_options.archived | length }} == 3"
- name: Remove the archive - no options ({{ format }})
file:
path: "{{ output_dir }}/archive_no_options.{{ format }}"
state: absent
- name: Archive - file options ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_file_options.{{ format }}"
format: "{{ format }}"
mode: "u+rwX,g-rwx,o-rwx"
register: archive_file_options
- name: Retrieve archive file information - file options ({{ format }})
stat:
path: "{{ output_dir }}/archive_file_options.{{ format }}"
register: archive_file_options_stat
- name: Test that the file modes were changed
assert:
that:
- archive_file_options_stat is not changed
- "archive_file_options.mode == '0600'"
- "{{ archive_file_options.archived | length }} == 3"
- name: Remove the archive - file options ({{ format }})
file:
path: "{{ output_dir }}/archive_file_options.{{ format }}"
state: absent
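The `0600` expectation follows from symbolic-mode arithmetic: on a freshly created, non-executable archive, the capital `X` in `u+rwX,g-rwx,o-rwx` grants execute only where a directory or an existing execute bit is involved, so the owner ends up with read and write and nothing else. A quick stdlib check of that claim:

```python
import stat

# u+rw -> read+write for the owner; X adds nothing on a plain file
mode = stat.S_IRUSR | stat.S_IWUSR
# g-rwx,o-rwx -> clear group/other bits (already clear here)
mode &= ~(stat.S_IRWXG | stat.S_IRWXO)
print(oct(mode))  # -> 0o600
```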
- name: Archive - non-ascii ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_nonascii_くらとみ.{{ format }}"
format: "{{ format }}"
register: archive_nonascii
- name: Retrieve archive file information - non-ascii ({{ format }})
stat:
path: "{{ output_dir }}/archive_nonascii_くらとみ.{{ format }}"
register: archive_nonascii_stat
- name: Test that archive exists - non-ascii ({{ format }})
assert:
that:
- archive_nonascii is changed
- archive_nonascii_stat.stat.exists == true
- name: Remove the archive - non-ascii ({{ format }})
file:
path: "{{ output_dir }}/archive_nonascii_くらとみ.{{ format }}"
state: absent
- name: Archive - single target ({{ format }})
archive:
path: "{{ output_dir }}/foo.txt"
dest: "{{ output_dir }}/archive_single_target.{{ format }}"
format: "{{ format }}"
register: archive_single_target
- name: Assert archive has correct state - single target ({{ format }})
assert:
that:
- archive_single_target.dest_state == state_map[format]
vars:
state_map:
tar: archive
zip: archive
gz: compress
bz2: compress
xz: compress
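The `state_map` encodes why single targets differ by format: `tar` and `zip` are container formats, so the destination ends up in the `archive` state, while `gz`, `bz2` and `xz` are bare compressors wrapping a single file, hence `compress`. A throwaway mirror of that mapping (the helper name is ours, not the module's):

```python
# Single-target dest_state by format, mirroring the state_map above:
# container formats report 'archive', bare compressors report 'compress'.
DEST_STATE = {
    'tar': 'archive',
    'zip': 'archive',
    'gz': 'compress',
    'bz2': 'compress',
    'xz': 'compress',
}

def expected_dest_state(fmt):
    return DEST_STATE[fmt]

print(expected_dest_state('gz'))  # -> compress
```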
- block:
- name: Retrieve contents of archive - single target ({{ format }})
ansible.builtin.unarchive:
src: "{{ output_dir }}/archive_single_target.{{ format }}"
dest: .
list_files: true
check_mode: true
ignore_errors: true
register: archive_single_target_contents
- name: Assert that file names are preserved - single target ({{ format }})
assert:
that:
- "'oo.txt' not in archive_single_target_contents.files"
- "'foo.txt' in archive_single_target_contents.files"
# ``unarchive`` fails for RHEL and FreeBSD on ansible 2.x
when: archive_single_target_contents is success and archive_single_target_contents is not skipped
when: "format == 'zip'"
- name: Remove archive - single target ({{ format }})
file:
path: "{{ output_dir }}/archive_single_target.{{ format }}"
state: absent
- name: Archive - path list ({{ format }})
archive:
path:
- "{{ output_dir }}/empty.txt"
- "{{ output_dir }}/foo.txt"
- "{{ output_dir }}/bar.txt"
dest: "{{ output_dir }}/archive_path_list.{{ format }}"
format: "{{ format }}"
register: archive_path_list
- name: Verify that archive exists - path list ({{ format }})
file:
path: "{{output_dir}}/archive_path_list.{{ format }}"
state: file
- name: Assert that archive contains all files - path list ({{ format }})
assert:
that:
- archive_path_list is changed
- "{{ archive_path_list.archived | length }} == 3"
- name: Remove archive - path list ({{ format }})
file:
path: "{{ output_dir }}/archive_path_list.{{ format }}"
state: absent
- name: Archive - missing paths ({{ format }})
archive:
path:
- "{{ output_dir }}/*.txt"
- "{{ output_dir }}/dne.txt"
exclude_path: "{{ output_dir }}/foo.txt"
dest: "{{ output_dir }}/archive_missing_paths.{{ format }}"
format: "{{ format }}"
register: archive_missing_paths
- name: Assert that incomplete archive has incomplete state - missing paths ({{ format }})
assert:
that:
- archive_missing_paths is changed
- "archive_missing_paths.dest_state == 'incomplete'"
- "'{{ output_dir }}/dne.txt' in archive_missing_paths.missing"
- "'{{ output_dir }}/foo.txt' not in archive_missing_paths.missing"
- name: Remove archive - missing paths ({{ format }})
file:
path: "{{ output_dir }}/archive_missing_paths.{{ format }}"
state: absent


@@ -0,0 +1,40 @@
---
- name: Archive - exclusion patterns ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_exclusion_patterns.{{ format }}"
format: "{{ format }}"
exclusion_patterns: b?r.*
register: archive_exclusion_patterns
- name: Assert that only included files are archived - exclusion patterns ({{ format }})
assert:
that:
- archive_exclusion_patterns is changed
- "'bar.txt' not in archive_exclusion_patterns.archived"
- name: Remove archive - exclusion patterns ({{ format }})
file:
path: "{{ output_dir }}/archive_exclusion_patterns.{{ format }}"
state: absent
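`exclusion_patterns: b?r.*` is a shell-style glob, so `bar.txt` is filtered out while `foo.txt` survives. Python's stdlib `fnmatch` applies the same matching rules and can be used to preview what a pattern will exclude:

```python
import fnmatch

files = ['foo.txt', 'bar.txt', 'empty.txt']
pattern = 'b?r.*'  # ? matches exactly one character, * matches any run

# Keep only the files the pattern does NOT match:
kept = [f for f in files if not fnmatch.fnmatch(f, pattern)]
print(kept)  # -> ['foo.txt', 'empty.txt']
```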
- name: Archive - exclude path ({{ format }})
archive:
path:
- "{{ output_dir }}/sub/subfile.txt"
- "{{ output_dir }}"
exclude_path:
- "{{ output_dir }}"
dest: "{{ output_dir }}/archive_exclude_paths.{{ format }}"
format: "{{ format }}"
register: archive_excluded_paths
- name: Assert that excluded paths do not influence archive root - exclude path ({{ format }})
assert:
that:
- archive_excluded_paths.arcroot != output_dir
- name: Remove archive - exclude path ({{ format }})
file:
path: "{{ output_dir }}/archive_exclude_paths.{{ format }}"
state: absent


@@ -0,0 +1,141 @@
---
- name: Archive - file content idempotency ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_file_content_idempotency.{{ format }}"
format: "{{ format }}"
register: file_content_idempotency_before
- name: Modify file - file content idempotency ({{ format }})
lineinfile:
line: bar.txt
regexp: "^foo.txt$"
path: "{{ output_dir }}/foo.txt"
- name: Archive second time - file content idempotency ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_file_content_idempotency.{{ format }}"
format: "{{ format }}"
register: file_content_idempotency_after
# After idempotency fix result will be reliably changed for all formats
- name: Assert task status is changed - file content idempotency ({{ format }})
assert:
that:
- file_content_idempotency_after is not changed
when: "format in ('tar', 'zip')"
- name: Remove archive - file content idempotency ({{ format }})
file:
path: "{{ output_dir }}/archive_file_content_idempotency.{{ format }}"
state: absent
- name: Modify file back - file content idempotency ({{ format }})
lineinfile:
line: foo.txt
regexp: "^bar.txt$"
path: "{{ output_dir }}/foo.txt"
- name: Archive - file name idempotency ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_file_name_idempotency.{{ format }}"
format: "{{ format }}"
register: file_name_idempotency_before
- name: Rename file - file name idempotency ({{ format }})
command: "mv {{ output_dir}}/foo.txt {{ output_dir }}/fii.txt"
- name: Archive again - file name idempotency ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_file_name_idempotency.{{ format }}"
format: "{{ format }}"
register: file_name_idempotency_after
# After idempotency fix result will be reliably changed for all formats
- name: Check task status - file name idempotency ({{ format }})
assert:
that:
- file_name_idempotency_after is not changed
when: "format in ('tar', 'zip')"
- name: Remove archive - file name idempotency ({{ format }})
file:
path: "{{ output_dir }}/archive_file_name_idempotency.{{ format }}"
state: absent
- name: Rename file back - file name idempotency ({{ format }})
command: "mv {{ output_dir }}/fii.txt {{ output_dir }}/foo.txt"
- name: Archive - single file content idempotency ({{ format }})
archive:
path: "{{ output_dir }}/foo.txt"
dest: "{{ output_dir }}/archive_single_file_content_idempotency.{{ format }}"
format: "{{ format }}"
register: single_file_content_idempotency_before
- name: Modify file - single file content idempotency ({{ format }})
lineinfile:
line: bar.txt
regexp: "^foo.txt$"
path: "{{ output_dir }}/foo.txt"
- name: Archive second time - single file content idempotency ({{ format }})
archive:
path: "{{ output_dir }}/foo.txt"
dest: "{{ output_dir }}/archive_single_file_content_idempotency.{{ format }}"
format: "{{ format }}"
register: single_file_content_idempotency_after
# After idempotency fix result will be reliably changed for all formats
- name: Assert task status is changed - single file content idempotency ({{ format }})
assert:
that:
- single_file_content_idempotency_after is changed
when: "format in ('tar', 'zip')"
- name: Remove archive - single file content idempotency ({{ format }})
file:
path: "{{ output_dir }}/archive_single_file_content_idempotency.{{ format }}"
state: absent
- name: Modify file back - single file content idempotency ({{ format }})
lineinfile:
line: foo.txt
regexp: "^bar.txt$"
path: "{{ output_dir }}/foo.txt"
- name: Archive - single file name idempotency ({{ format }})
archive:
path: "{{ output_dir }}/foo.txt"
dest: "{{ output_dir }}/archive_single_file_name_idempotency.{{ format }}"
format: "{{ format }}"
register: single_file_name_idempotency_before
- name: Rename file - single file name idempotency ({{ format }})
command: "mv {{ output_dir}}/foo.txt {{ output_dir }}/fii.txt"
- name: Archive again - single file name idempotency ({{ format }})
archive:
path: "{{ output_dir }}/fii.txt"
dest: "{{ output_dir }}/archive_single_file_name_idempotency.{{ format }}"
format: "{{ format }}"
register: single_file_name_idempotency_after
# After idempotency fix result will be reliably changed for all formats
- name: Check task status - single file name idempotency ({{ format }})
assert:
that:
- single_file_name_idempotency_after is changed
when: "format in ('tar', 'zip')"
- name: Remove archive - single file name idempotency ({{ format }})
file:
path: "{{ output_dir }}/archive_single_file_name_idempotency.{{ format }}"
state: absent
- name: Rename file back - single file name idempotency ({{ format }})
command: "mv {{ output_dir }}/fii.txt {{ output_dir }}/foo.txt"

View File

@@ -0,0 +1,207 @@
---
- name: Archive - remove source files ({{ format }})
archive:
path: "{{ output_dir }}/*.txt"
dest: "{{ output_dir }}/archive_remove_source_files.{{ format }}"
format: "{{ format }}"
remove: yes
register: archive_remove_source_files
- name: Verify archive exists - remove source files ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_files.{{ format }}"
state: file
- name: Verify all files were archived - remove source files ({{ format }})
assert:
that:
- archive_remove_source_files is changed
- "{{ archive_remove_source_files.archived | length }} == 3"
- name: Remove Archive - remove source files ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_files.{{ format }}"
state: absent
- name: Assert that source files were removed - remove source files ({{ format }})
assert:
that:
- "'{{ output_dir }}/{{ item }}' is not exists"
with_items:
- foo.txt
- bar.txt
- empty.txt
- name: Copy source files - remove source directory ({{ format }})
copy:
src: "{{ item }}"
dest: "{{ output_dir }}/{{ item }}"
with_items:
- foo.txt
- bar.txt
- empty.txt
- name: Create temporary directory - remove source directory ({{ format }})
file:
path: "{{ output_dir }}/tmpdir"
state: directory
- name: Copy source files to temporary directory - remove source directory ({{ format }})
copy:
src: "{{ item }}"
dest: "{{ output_dir }}/tmpdir/{{ item }}"
with_items:
- foo.txt
- bar.txt
- empty.txt
- name: Archive - remove source directory ({{ format }})
archive:
path: "{{ output_dir }}/tmpdir"
dest: "{{ output_dir }}/archive_remove_source_directory.{{ format }}"
format: "{{ format }}"
remove: yes
register: archive_remove_source_directory
- name: Verify archive exists - remove source directory ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_directory.{{ format }}"
state: file
- name: Verify archive contains all files - remove source directory ({{ format }})
assert:
that:
- archive_remove_source_directory is changed
- "{{ archive_remove_source_directory.archived | length }} == 3"
- name: Remove archive - remove source directory ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_directory.{{ format }}"
state: absent
- name: Verify source directory was removed - remove source directory ({{ format }})
assert:
that:
- "'{{ output_dir }}/tmpdir' is not exists"
- name: Create temporary directory - remove source excluding path ({{ format }})
file:
path: "{{ output_dir }}/tmpdir"
state: directory
- name: Copy source files to temporary directory - remove source excluding path ({{ format }})
copy:
src: "{{ item }}"
dest: "{{ output_dir }}/tmpdir/{{ item }}"
with_items:
- foo.txt
- bar.txt
- empty.txt
- name: Archive - remove source excluding path ({{ format }})
archive:
path: "{{ output_dir }}/tmpdir/*"
dest: "{{ output_dir }}/archive_remove_source_excluding_path.{{ format }}"
format: "{{ format }}"
remove: yes
exclude_path: "{{ output_dir }}/tmpdir/empty.txt"
register: archive_remove_source_excluding_path
- name: Verify archive exists - remove source excluding path ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_excluding_path.{{ format }}"
state: file
- name: Verify all files except excluded are archived - remove source excluding path ({{ format }})
assert:
that:
- archive_remove_source_excluding_path is changed
- "{{ archive_remove_source_excluding_path.archived | length }} == 2"
- name: Remove archive - remove source excluding path ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_excluding_path.{{ format }}"
state: absent
- name: Verify that excluded file still exists - remove source excluding path ({{ format }})
file:
path: "{{ output_dir }}/tmpdir/empty.txt"
state: file
- name: Copy source files to temporary directory - remove source excluding sub path ({{ format }})
copy:
src: "{{ item }}"
dest: "{{ output_dir }}/tmpdir/{{ item }}"
with_items:
- foo.txt
- bar.txt
- empty.txt
- sub
- sub/subfile.txt
- name: Archive - remove source excluding sub path ({{ format }})
archive:
path:
- "{{ output_dir }}/tmpdir/*.txt"
- "{{ output_dir }}/tmpdir/sub/*"
dest: "{{ output_dir }}/archive_remove_source_excluding_sub_path.{{ format }}"
format: "{{ format }}"
remove: yes
exclude_path: "{{ output_dir }}/tmpdir/sub/subfile.txt"
register: archive_remove_source_excluding_sub_path
- name: Verify archive exists - remove source excluding sub path ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_excluding_sub_path.{{ format }}"
state: file
- name: Remove archive - remove source excluding sub path ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_excluding_sub_path.{{ format }}"
state: absent
- name: Verify that sub path still exists - remove source excluding sub path ({{ format }})
file:
path: "{{ output_dir }}/tmpdir/sub/subfile.txt"
state: file
- name: Copy source files to temporary directory - remove source with nested paths ({{ format }})
copy:
src: "{{ item }}"
dest: "{{ output_dir }}/tmpdir/{{ item }}"
with_items:
- foo.txt
- bar.txt
- empty.txt
- sub
- sub/subfile.txt
- name: Archive - remove source with nested paths ({{ format }})
archive:
path: "{{ output_dir }}/tmpdir/"
dest: "{{ output_dir }}/archive_remove_source_nested_paths.{{ format }}"
format: "{{ format }}"
remove: yes
register: archive_remove_nested_paths
- name: Verify archive exists - remove source with nested paths ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_nested_paths.{{ format }}"
state: file
- name: Verify source files were removed - remove source with nested paths ({{ format }})
file:
path: "{{ output_dir }}/tmpdir"
state: absent
register: archive_remove_nested_paths_status
- name: Assert tasks status - remove source with nested paths ({{ format }})
assert:
that:
- archive_remove_nested_paths is success
- archive_remove_nested_paths_status is not changed
- name: Remove archive - remove source with nested paths ({{ format }})
file:
path: "{{ output_dir }}/archive_remove_source_nested_paths.{{ format }}"
state: absent

View File

@@ -1,5 +1,5 @@
destructive
shippable/posix/group3
shippable/posix/group1
skip/aix
skip/osx
skip/macos

View File

@@ -23,3 +23,9 @@ tested_filesystems:
f2fs: {fssize: '{{ f2fs_fssize|default(60) }}', grow: 'f2fs_version is version("1.10.0", ">=")'}
lvm: {fssize: 20, grow: True}
swap: {fssize: 10, grow: False} # grow not implemented
ufs: {fssize: 10, grow: True}
get_uuid_any: "blkid -c /dev/null -o value -s UUID {{ dev }}"
get_uuid_ufs: "dumpfs {{ dev }} | awk -v sb=superblock -v id=id '$1 == sb && $4 == id {print $6$7}'"
get_uuid_cmd: "{{ get_uuid_ufs if fstype == 'ufs' else get_uuid_any }}"

View File

@@ -19,6 +19,17 @@
ansible.builtin.set_fact:
dev: "{{ loop_device_cmd.stdout }}"
- when: fstype == 'ufs'
block:
- name: 'Create a memory disk for UFS'
ansible.builtin.command:
cmd: 'mdconfig -a -f {{ dev }}'
register: memory_disk_cmd
- name: 'Switch to memory disk target for further tasks'
ansible.builtin.set_fact:
dev: "/dev/{{ memory_disk_cmd.stdout }}"
- include_tasks: '{{ action }}.yml'
always:
@@ -28,10 +39,16 @@
removes: '{{ dev }}'
when: fstype == 'lvm'
- name: 'Clean correct device for LVM'
- name: 'Detach memory disk used for UFS'
ansible.builtin.command:
cmd: 'mdconfig -d -u {{ dev }}'
removes: '{{ dev }}'
when: fstype == 'ufs'
- name: 'Clean correct device for LVM and UFS'
ansible.builtin.set_fact:
dev: '{{ image_file }}'
when: fstype == 'lvm'
when: fstype in ['lvm', 'ufs']
- name: 'Remove disk image file'
ansible.builtin.file:

View File

@@ -12,8 +12,8 @@
- 'fs_result is success'
- name: "Get UUID of created filesystem"
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: uuid
@@ -24,8 +24,8 @@
register: fs2_result
- name: "Get UUID of the filesystem"
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: uuid2
@@ -44,8 +44,8 @@
register: fs3_result
- name: "Get UUID of the new filesystem"
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: uuid3
@@ -71,6 +71,11 @@
cmd: 'losetup -c {{ dev }}'
when: fstype == 'lvm'
- name: "Resize memory disk for UFS"
ansible.builtin.command:
cmd: 'mdconfig -r -u {{ dev }} -s {{ fssize | int + 1 }}M'
when: fstype == 'ufs'
- name: "Expand filesystem"
community.general.filesystem:
dev: '{{ dev }}'
@@ -79,8 +84,8 @@
register: fs4_result
- name: "Get UUID of the filesystem"
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: uuid4

View File

@@ -0,0 +1,10 @@
---
- name: "Uninstall e2fsprogs"
ansible.builtin.package:
name: e2fsprogs
state: absent
- name: "Install util-linux"
ansible.builtin.package:
name: util-linux
state: present

View File

@@ -35,6 +35,10 @@
# Available on FreeBSD but not on testbed (util-linux conflicts with e2fsprogs): wipefs, mkfs.minix
- 'not (ansible_system == "FreeBSD" and item.1 in ["overwrite_another_fs", "remove_fs"])'
# Linux limited support
# Not available: ufs (this is FreeBSD's native fs)
- 'not (ansible_system == "Linux" and item.0.key == "ufs")'
# Other limitations and corner cases
# f2fs-tools and reiserfs-utils packages not available with RHEL/CentOS on CI
@@ -59,3 +63,24 @@
item.0.key == "xfs" and ansible_python.version.major == 2)'
loop: "{{ query('dict', tested_filesystems)|product(['create_fs', 'overwrite_another_fs', 'remove_fs'])|list }}"
# With FreeBSD extended support (util-linux is not available before 12.2)
- include_tasks: freebsd_setup.yml
when:
- 'ansible_system == "FreeBSD"'
- 'ansible_distribution_version is version("12.2", ">=")'
- include_tasks: create_device.yml
vars:
image_file: '{{ remote_tmp_dir }}/img'
fstype: '{{ item.0.key }}'
fssize: '{{ item.0.value.fssize }}'
grow: '{{ item.0.value.grow }}'
action: '{{ item.1 }}'
when:
- 'ansible_system == "FreeBSD"'
- 'ansible_distribution_version is version("12.2", ">=")'
- 'item.0.key in ["xfs", "vfat"]'
loop: "{{ query('dict', tested_filesystems)|product(['create_fs', 'overwrite_another_fs', 'remove_fs'])|list }}"

View File

@@ -10,8 +10,8 @@
cmd: 'mkfs.minix {{ dev }}'
- name: 'Get UUID of the new filesystem'
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: uuid
@@ -23,8 +23,8 @@
ignore_errors: True
- name: 'Get UUID of the filesystem'
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: uuid2
@@ -42,8 +42,8 @@
register: fs_result2
- name: 'Get UUID of the new filesystem'
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: uuid3

View File

@@ -7,8 +7,8 @@
fstype: '{{ fstype }}'
- name: "Get filesystem UUID with 'blkid'"
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: blkid_ref
@@ -27,8 +27,8 @@
check_mode: yes
- name: "Get filesystem UUID with 'blkid' (should remain the same)"
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
register: blkid
@@ -46,8 +46,8 @@
register: wipefs
- name: "Get filesystem UUID with 'blkid' (should be empty)"
ansible.builtin.command:
cmd: 'blkid -c /dev/null -o value -s UUID {{ dev }}'
ansible.builtin.shell:
cmd: "{{ get_uuid_cmd }}"
changed_when: false
failed_when: false
register: blkid

View File

@@ -1,2 +1,2 @@
shippable/posix/group4
shippable/posix/group3
skip/python2.6 # filters are controller only, and we no longer support Python 2.6 on the controller

View File

@@ -42,4 +42,4 @@
- assert:
that:
- result.msg == "Multiple sequence entries have attribute value 'a'"
- result.msg == "Multiple sequence entries have attribute value 'a'" or result.msg == "Multiple sequence entries have attribute value u'a'"

View File

@@ -0,0 +1 @@
unsupported

View File

@@ -0,0 +1,246 @@
---
- name: Create realm
community.general.keycloak_realm:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
id: "{{ realm }}"
realm: "{{ realm }}"
state: present
- name: Create client
community.general.keycloak_client:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
client_id: "{{ client_id }}"
state: present
register: client
- name: Create new realm role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
name: "{{ role }}"
description: "{{ description_1 }}"
state: present
register: result
- name: Debug
debug:
var: result
- name: Assert realm role created
assert:
that:
- result is changed
- result.existing == {}
- result.end_state.name == "{{ role }}"
- result.end_state.containerId == "{{ realm }}"
- name: Create existing realm role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
name: "{{ role }}"
description: "{{ description_1 }}"
state: present
register: result
- name: Debug
debug:
var: result
- name: Assert realm role unchanged
assert:
that:
- result is not changed
- name: Update realm role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
name: "{{ role }}"
description: "{{ description_2 }}"
state: present
register: result
- name: Debug
debug:
var: result
- name: Assert realm role updated
assert:
that:
- result is changed
- result.existing.description == "{{ description_1 }}"
- result.end_state.description == "{{ description_2 }}"
- name: Delete existing realm role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
name: "{{ role }}"
state: absent
register: result
- name: Debug
debug:
var: result
- name: Assert realm role deleted
assert:
that:
- result is changed
- result.end_state == {}
- name: Delete absent realm role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
name: "{{ role }}"
state: absent
register: result
- name: Debug
debug:
var: result
- name: Assert realm role unchanged
assert:
that:
- result is not changed
- result.end_state == {}
- name: Create new client role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
client_id: "{{ client_id }}"
name: "{{ role }}"
description: "{{ description_1 }}"
state: present
register: result
- name: Debug
debug:
var: result
- name: Assert client role created
assert:
that:
- result is changed
- result.existing == {}
- result.end_state.name == "{{ role }}"
- result.end_state.containerId == "{{ client.end_state.id }}"
- name: Create existing client role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
client_id: "{{ client_id }}"
name: "{{ role }}"
description: "{{ description_1 }}"
state: present
register: result
- name: Debug
debug:
var: result
- name: Assert client role unchanged
assert:
that:
- result is not changed
- name: Update client role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
client_id: "{{ client_id }}"
name: "{{ role }}"
description: "{{ description_2 }}"
state: present
register: result
- name: Debug
debug:
var: result
- name: Assert client role updated
assert:
that:
- result is changed
- result.existing.description == "{{ description_1 }}"
- result.end_state.description == "{{ description_2 }}"
- name: Delete existing client role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
client_id: "{{ client_id }}"
name: "{{ role }}"
state: absent
register: result
- name: Debug
debug:
var: result
- name: Assert client role deleted
assert:
that:
- result is changed
- result.end_state == {}
- name: Delete absent client role
community.general.keycloak_role:
auth_keycloak_url: "{{ url }}"
auth_realm: "{{ admin_realm }}"
auth_username: "{{ admin_user }}"
auth_password: "{{ admin_password }}"
realm: "{{ realm }}"
client_id: "{{ client_id }}"
name: "{{ role }}"
state: absent
register: result
- name: Debug
debug:
var: result
- name: Assert client role unchanged
assert:
that:
- result is not changed
- result.end_state == {}

View File

@@ -0,0 +1,10 @@
---
url: http://localhost:8080/auth
admin_realm: master
admin_user: admin
admin_password: password
realm: myrealm
client_id: myclient
role: myrole
description_1: desc 1
description_2: desc 2

View File

@@ -3,3 +3,5 @@ destructive
skip/aix
skip/rhel
skip/python2.6 # lookups are controller only, and we no longer support Python 2.6 on the controller
skip/osx # FIXME https://github.com/ansible-collections/community.general/issues/2978
skip/macos # FIXME https://github.com/ansible-collections/community.general/issues/2978

View File

@@ -1 +1 @@
shippable/posix/group4
shippable/posix/group2

View File

@@ -1,3 +1,6 @@
# (c) 2021, Alexei Znamensky
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
- include_tasks: msimple.yml
- include_tasks: mdepfail.yml
- include_tasks: mstate.yml

View File

@@ -1,3 +1,6 @@
# (c) 2021, Alexei Znamensky
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
- name: test failing dependency
mdepfail:
a: 123
@@ -7,8 +10,8 @@
- name: assert failing dependency
assert:
that:
- result.failed is true
- result is failed
- '"Failed to import" in result.msg'
- '"nopackagewiththisname" in result.msg'
- '"ModuleNotFoundError:" in result.exception'
- '"ModuleNotFoundError:" in result.exception or "ImportError:" in result.exception'
- '"nopackagewiththisname" in result.exception'

View File

@@ -1,3 +1,6 @@
# (c) 2021, Alexei Znamensky
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
- name: test msimple 1
msimple:
a: 80
@@ -8,7 +11,7 @@
that:
- simple1.a == 80
- simple1.abc == "abc"
- simple1.changed is false
- simple1 is not changed
- simple1.value is none
- name: test msimple 2
@@ -23,8 +26,8 @@
- simple2.a == 101
- 'simple2.msg == "Module failed with exception: a >= 100"'
- simple2.abc == "abc"
- simple2.failed is true
- simple2.changed is false
- simple2 is failed
- simple2 is not changed
- simple2.value is none
- name: test msimple 3
@@ -39,7 +42,7 @@
- simple3.a == 2
- simple3.b == "potatoespotatoes"
- simple3.c == "NoneNone"
- simple3.changed is false
- simple3 is not changed
- name: test msimple 4
msimple:
@@ -51,4 +54,4 @@
that:
- simple4.c == "abc change"
- simple4.abc == "changed abc"
- simple4.changed is true
- simple4 is changed

View File

@@ -1,3 +1,6 @@
# (c) 2021, Alexei Znamensky
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
- name: test mstate 1
mstate:
a: 80
@@ -13,7 +16,7 @@
- state1.b == "banana"
- state1.c == "cashew"
- state1.result == "abc"
- state1.changed is false
- state1 is not changed
- name: test mstate 2
mstate:
@@ -29,7 +32,7 @@
- state2.b == "banana"
- state2.c == "cashew"
- state2.result == "80bananacashew"
- state2.changed is true
- state2 is changed
- name: test mstate 3
mstate:
@@ -44,7 +47,7 @@
- state3.a == 3
- state3.b == "banana"
- state3.result == "bananabananabanana"
- state3.changed is true
- state3 is changed
- name: test mstate 4
mstate:
@@ -59,7 +62,7 @@
- state4.a == 4
- state4.c == "cashew"
- state4.result == "cashewcashewcashewcashew"
- state4.changed is true
- state4 is changed
- name: test mstate 5
mstate:
@@ -76,4 +79,4 @@
- state5.b == "foo"
- state5.c == "bar"
- state5.result == "foobarfoobarfoobarfoobarfoobar"
- state5.changed is true
- state5 is changed

View File

@@ -1,4 +1,3 @@
notification/mqtt
shippable/posix/group1
skip/aix
skip/osx

View File

@@ -0,0 +1,5 @@
shippable/posix/group1
skip/aix
skip/freebsd
skip/osx
skip/macos

View File

@@ -0,0 +1,56 @@
# (c) 2021, Alexei Znamensky
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
- name: Set value for temp limit configuration
set_fact:
test_pamd_file: "/tmp/pamd_file"
- name: Copy temporary pam.d file
copy:
content: "session required pam_lastlog.so silent showfailed"
dest: "{{ test_pamd_file }}"
- name: Test working on a single-line file works (2925)
community.general.pamd:
path: /tmp
name: pamd_file
type: session
control: required
module_path: pam_lastlog.so
module_arguments: silent
state: args_absent
register: pamd_file_output
- name: Check if changes made
assert:
that:
- pamd_file_output is changed
- name: Copy temporary pam.d file
copy:
content: ""
dest: "{{ test_pamd_file }}"
# This test merely demonstrates that, as-is, the module will not perform any changes on an empty file.
# Every existing value of "state" first searches for a rule matching type, control and module_path,
# and performs no change whatsoever if no existing rule matches.
- name: Test working on an empty file works (2925)
community.general.pamd:
path: /tmp
name: pamd_file
type: session
control: required
module_path: pam_lastlog.so
module_arguments: silent
register: pamd_file_output_empty
- name: Read back the file
slurp:
src: "{{ test_pamd_file }}"
register: pamd_file_slurp
- name: Check if changes made
assert:
that:
- pamd_file_output_empty is not changed
- pamd_file_slurp.content|b64decode == ''
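The comment above describes match-first behaviour: every `state` value of `community.general.pamd` only acts on rules whose type, control and module path already match, so an empty file yields no change. A minimal sketch of that lookup (the helper below is hypothetical, not the module's actual code):

```python
# Hypothetical sketch of the match-first lookup described above; the real
# community.general.pamd module handles far more pam.d syntax.
def matching_rules(lines, rule_type, control, module_path):
    """Return indices of pam.d lines matching type, control and module path."""
    hits = []
    for i, line in enumerate(lines):
        fields = line.split()
        # A pam.d rule needs at least type, control and module path fields.
        if (len(fields) >= 3 and fields[0] == rule_type
                and fields[1] == control and fields[2] == module_path):
            hits.append(i)
    return hits
```

Run against the single-line file from the first copy task, the rule is found; run against the emptied file, no indices are returned, so there is nothing for any `state` to modify, which is why `pamd_file_output_empty` is not changed.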

View File

@@ -0,0 +1,11 @@
{
"include_symlinks": false,
"prefixes": [
".azure-pipelines/azure-pipelines.yml",
"tests/integration/targets/"
],
"output": "path-message",
"requirements": [
"PyYAML"
]
}

tests/sanity/extra/aliases.py Executable file
View File

@@ -0,0 +1,63 @@
#!/usr/bin/env python
# Copyright (c) Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
"""Check extra collection docs with antsibull-lint."""
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import sys
import yaml
def main():
"""Main entry point."""
paths = sys.argv[1:] or sys.stdin.read().splitlines()
paths = [path for path in paths if path.endswith('/aliases')]
with open('.azure-pipelines/azure-pipelines.yml', 'rb') as f:
azp = yaml.safe_load(f)
allowed_targets = set(['shippable/cloud/group1'])
for stage in azp['stages']:
if stage['stage'].startswith(('Sanity', 'Unit', 'Cloud', 'Summary')):
continue
for job in stage['jobs']:
for group in job['parameters']['groups']:
allowed_targets.add('shippable/posix/group{0}'.format(group))
for path in paths:
targets = []
skip = False
with open(path, 'r') as f:
for line in f:
if '#' in line:
line = line[:line.find('#')]
line = line.strip()
if line.startswith('needs/'):
continue
if line.startswith('skip/'):
continue
if line.startswith('cloud/'):
continue
if line in ('unsupported', 'disabled', 'hidden'):
skip = True
if line in ('destructive', ):
continue
if '/' not in line:
continue
targets.append(line)
if skip:
continue
if not targets:
if 'targets/setup_' in path:
continue
print('%s: %s' % (path, 'found no targets'))
for target in targets:
if target not in allowed_targets:
print('%s: %s' % (path, 'found invalid target "{0}"'.format(target)))
if __name__ == '__main__':
main()
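The filtering loop in the script above can be restated as a standalone function, which makes the rules easier to see: comments are stripped, `needs/`, `skip/` and `cloud/` prefixes are ignored, `unsupported`/`disabled`/`hidden` mark the file as skipped, and only slash-containing lines count as CI targets. The helper name below is ours, not part of the script:

```python
# Standalone restatement of the line-filtering rules in aliases.py above;
# the function name is ours, not part of the script.
def parse_aliases(text):
    """Return (targets, skipped) for the contents of an aliases file."""
    targets, skipped = [], False
    for line in text.splitlines():
        # Strip trailing comments, then surrounding whitespace.
        if '#' in line:
            line = line[:line.find('#')]
        line = line.strip()
        # Prefixes that never name a CI group.
        if line.startswith(('needs/', 'skip/', 'cloud/')):
            continue
        # These flags exempt the whole file from the target check.
        if line in ('unsupported', 'disabled', 'hidden'):
            skipped = True
        # Only slash-containing lines can be CI targets.
        if line in ('destructive',) or '/' not in line:
            continue
        targets.append(line)
    return targets, skipped
```

With this in hand, the script's check is simply: unless `skipped` is set, every parsed target must appear in the `shippable/posix/group{N}` set derived from `.azure-pipelines/azure-pipelines.yml`.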

View File

@@ -7,7 +7,6 @@ plugins/modules/cloud/misc/rhevm.py validate-modules:parameter-state-invalid-cho
plugins/modules/cloud/rackspace/rax.py use-argspec-type-path # fix needed
plugins/modules/cloud/rackspace/rax_files.py validate-modules:parameter-state-invalid-choice
plugins/modules/cloud/rackspace/rax_files_objects.py use-argspec-type-path
plugins/modules/cloud/rackspace/rax_mon_notification_plan.py validate-modules:parameter-list-no-elements
plugins/modules/cloud/rackspace/rax_scaling_group.py use-argspec-type-path # fix needed, expanduser() applied to dict values
plugins/modules/cloud/scaleway/scaleway_organization_info.py validate-modules:return-syntax-error
plugins/modules/cloud/smartos/vmadm.py validate-modules:parameter-type-not-in-doc
@@ -20,17 +19,17 @@ plugins/modules/cloud/univention/udm_user.py validate-modules:parameter-list-no-
plugins/modules/clustering/consul/consul.py validate-modules:doc-missing-type
plugins/modules/clustering/consul/consul.py validate-modules:undocumented-parameter
plugins/modules/clustering/consul/consul_session.py validate-modules:parameter-state-invalid-choice
plugins/modules/notification/grove.py validate-modules:invalid-argument-name
plugins/modules/notification/grove.py validate-modules:invalid-argument-name # invalid alias - removed in 4.0.0
plugins/modules/packaging/language/composer.py validate-modules:parameter-invalid
plugins/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid
plugins/modules/packaging/os/homebrew.py validate-modules:parameter-invalid
plugins/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid
plugins/modules/packaging/os/opkg.py validate-modules:parameter-invalid
plugins/modules/packaging/os/pacman.py validate-modules:parameter-invalid
plugins/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/homebrew.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/opkg.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/pacman.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/redhat_subscription.py validate-modules:return-syntax-error
plugins/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid
plugins/modules/packaging/os/urpmi.py validate-modules:parameter-invalid
plugins/modules/packaging/os/xbps.py validate-modules:parameter-invalid
plugins/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/urpmi.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/xbps.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/remote_management/hpilo/hpilo_boot.py validate-modules:parameter-type-not-in-doc
plugins/modules/remote_management/hpilo/hpilo_info.py validate-modules:parameter-type-not-in-doc
plugins/modules/remote_management/hpilo/hponcfg.py validate-modules:parameter-type-not-in-doc
@@ -43,13 +42,12 @@ plugins/modules/remote_management/manageiq/manageiq_tags.py validate-modules:par
plugins/modules/source_control/github/github_deploy_key.py validate-modules:parameter-invalid
plugins/modules/system/gconftool2.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/iptables_state.py validate-modules:undocumented-parameter
plugins/modules/system/launchd.py use-argspec-type-path # False positive
plugins/modules/system/osx_defaults.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/parted.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/puppet.py use-argspec-type-path
plugins/modules/system/puppet.py validate-modules:doc-default-does-not-match-spec # show_diff is not documented
plugins/modules/system/puppet.py validate-modules:parameter-type-not-in-doc
plugins/modules/system/runit.py validate-modules:parameter-type-not-in-doc
plugins/modules/system/runit.py validate-modules:parameter-type-not-in-doc # param removed in 4.0.0
plugins/modules/system/ssh_config.py use-argspec-type-path # Required since module uses other methods to specify path
plugins/modules/system/xfconf.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/xfconf.py validate-modules:return-syntax-error

View File

@@ -6,7 +6,6 @@ plugins/modules/cloud/misc/rhevm.py validate-modules:parameter-state-invalid-cho
plugins/modules/cloud/rackspace/rax.py use-argspec-type-path # fix needed
plugins/modules/cloud/rackspace/rax_files.py validate-modules:parameter-state-invalid-choice
plugins/modules/cloud/rackspace/rax_files_objects.py use-argspec-type-path
plugins/modules/cloud/rackspace/rax_mon_notification_plan.py validate-modules:parameter-list-no-elements
plugins/modules/cloud/rackspace/rax_scaling_group.py use-argspec-type-path # fix needed, expanduser() applied to dict values
plugins/modules/cloud/scaleway/scaleway_organization_info.py validate-modules:return-syntax-error
plugins/modules/cloud/smartos/vmadm.py validate-modules:parameter-type-not-in-doc
@@ -19,17 +18,17 @@ plugins/modules/cloud/univention/udm_user.py validate-modules:parameter-list-no-
plugins/modules/clustering/consul/consul.py validate-modules:doc-missing-type
plugins/modules/clustering/consul/consul.py validate-modules:undocumented-parameter
plugins/modules/clustering/consul/consul_session.py validate-modules:parameter-state-invalid-choice
plugins/modules/notification/grove.py validate-modules:invalid-argument-name
plugins/modules/notification/grove.py validate-modules:invalid-argument-name # invalid alias - removed in 4.0.0
plugins/modules/packaging/language/composer.py validate-modules:parameter-invalid
plugins/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid
plugins/modules/packaging/os/homebrew.py validate-modules:parameter-invalid
plugins/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid
plugins/modules/packaging/os/opkg.py validate-modules:parameter-invalid
plugins/modules/packaging/os/pacman.py validate-modules:parameter-invalid
plugins/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/homebrew.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/opkg.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/pacman.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/redhat_subscription.py validate-modules:return-syntax-error
plugins/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid
plugins/modules/packaging/os/urpmi.py validate-modules:parameter-invalid
plugins/modules/packaging/os/xbps.py validate-modules:parameter-invalid
plugins/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/urpmi.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/xbps.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/remote_management/hpilo/hpilo_boot.py validate-modules:parameter-type-not-in-doc
plugins/modules/remote_management/hpilo/hpilo_info.py validate-modules:parameter-type-not-in-doc
plugins/modules/remote_management/hpilo/hponcfg.py validate-modules:parameter-type-not-in-doc
@@ -42,13 +41,12 @@ plugins/modules/remote_management/manageiq/manageiq_tags.py validate-modules:par
plugins/modules/source_control/github/github_deploy_key.py validate-modules:parameter-invalid
plugins/modules/system/gconftool2.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/iptables_state.py validate-modules:undocumented-parameter
plugins/modules/system/launchd.py use-argspec-type-path # False positive
plugins/modules/system/osx_defaults.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/parted.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/puppet.py use-argspec-type-path
plugins/modules/system/puppet.py validate-modules:doc-default-does-not-match-spec # show_diff is not documented
plugins/modules/system/puppet.py validate-modules:parameter-type-not-in-doc
plugins/modules/system/runit.py validate-modules:parameter-type-not-in-doc
plugins/modules/system/runit.py validate-modules:parameter-type-not-in-doc # param removed in 4.0.0
plugins/modules/system/ssh_config.py use-argspec-type-path # Required since module uses other methods to specify path
plugins/modules/system/xfconf.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/xfconf.py validate-modules:return-syntax-error


@@ -6,7 +6,6 @@ plugins/modules/cloud/misc/rhevm.py validate-modules:parameter-state-invalid-cho
plugins/modules/cloud/rackspace/rax.py use-argspec-type-path # fix needed
plugins/modules/cloud/rackspace/rax_files.py validate-modules:parameter-state-invalid-choice
plugins/modules/cloud/rackspace/rax_files_objects.py use-argspec-type-path
plugins/modules/cloud/rackspace/rax_mon_notification_plan.py validate-modules:parameter-list-no-elements
plugins/modules/cloud/rackspace/rax_scaling_group.py use-argspec-type-path # fix needed, expanduser() applied to dict values
plugins/modules/cloud/scaleway/scaleway_organization_info.py validate-modules:return-syntax-error
plugins/modules/cloud/smartos/vmadm.py validate-modules:parameter-type-not-in-doc
@@ -19,17 +18,17 @@ plugins/modules/cloud/univention/udm_user.py validate-modules:parameter-list-no-
plugins/modules/clustering/consul/consul.py validate-modules:doc-missing-type
plugins/modules/clustering/consul/consul.py validate-modules:undocumented-parameter
plugins/modules/clustering/consul/consul_session.py validate-modules:parameter-state-invalid-choice
plugins/modules/notification/grove.py validate-modules:invalid-argument-name
plugins/modules/notification/grove.py validate-modules:invalid-argument-name # invalid alias - removed in 4.0.0
plugins/modules/packaging/language/composer.py validate-modules:parameter-invalid
plugins/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid
plugins/modules/packaging/os/homebrew.py validate-modules:parameter-invalid
plugins/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid
plugins/modules/packaging/os/opkg.py validate-modules:parameter-invalid
plugins/modules/packaging/os/pacman.py validate-modules:parameter-invalid
plugins/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/homebrew.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/opkg.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/pacman.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/redhat_subscription.py validate-modules:return-syntax-error
plugins/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid
plugins/modules/packaging/os/urpmi.py validate-modules:parameter-invalid
plugins/modules/packaging/os/xbps.py validate-modules:parameter-invalid
plugins/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/urpmi.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/xbps.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/remote_management/hpilo/hpilo_boot.py validate-modules:parameter-type-not-in-doc
plugins/modules/remote_management/hpilo/hpilo_info.py validate-modules:parameter-type-not-in-doc
plugins/modules/remote_management/hpilo/hponcfg.py validate-modules:parameter-type-not-in-doc
@@ -42,13 +41,12 @@ plugins/modules/remote_management/manageiq/manageiq_tags.py validate-modules:par
plugins/modules/source_control/github/github_deploy_key.py validate-modules:parameter-invalid
plugins/modules/system/gconftool2.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/iptables_state.py validate-modules:undocumented-parameter
plugins/modules/system/launchd.py use-argspec-type-path # False positive
plugins/modules/system/osx_defaults.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/parted.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/puppet.py use-argspec-type-path
plugins/modules/system/puppet.py validate-modules:doc-default-does-not-match-spec # show_diff is not documented
plugins/modules/system/puppet.py validate-modules:parameter-type-not-in-doc
plugins/modules/system/runit.py validate-modules:parameter-type-not-in-doc
plugins/modules/system/runit.py validate-modules:parameter-type-not-in-doc # param removed in 4.0.0
plugins/modules/system/ssh_config.py use-argspec-type-path # Required since module uses other methods to specify path
plugins/modules/system/xfconf.py validate-modules:parameter-state-invalid-choice
plugins/modules/system/xfconf.py validate-modules:return-syntax-error


@@ -14,15 +14,15 @@ plugins/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:undo
plugins/modules/clustering/consul/consul.py validate-modules:doc-missing-type
plugins/modules/clustering/consul/consul.py validate-modules:undocumented-parameter
plugins/modules/packaging/language/composer.py validate-modules:parameter-invalid
plugins/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid
plugins/modules/packaging/os/homebrew.py validate-modules:parameter-invalid
plugins/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid
plugins/modules/packaging/os/opkg.py validate-modules:parameter-invalid
plugins/modules/packaging/os/pacman.py validate-modules:parameter-invalid
plugins/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/homebrew.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/opkg.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/pacman.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/redhat_subscription.py validate-modules:return-syntax-error
plugins/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid
plugins/modules/packaging/os/urpmi.py validate-modules:parameter-invalid
plugins/modules/packaging/os/xbps.py validate-modules:parameter-invalid
plugins/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/urpmi.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/packaging/os/xbps.py validate-modules:parameter-invalid # invalid alias - removed in 5.0.0
plugins/modules/remote_management/hpilo/hpilo_boot.py validate-modules:parameter-type-not-in-doc
plugins/modules/remote_management/hpilo/hpilo_info.py validate-modules:parameter-type-not-in-doc
plugins/modules/remote_management/hpilo/hponcfg.py validate-modules:parameter-type-not-in-doc
@@ -64,10 +64,9 @@ plugins/modules/net_tools/nios/nios_zone.py validate-modules:deprecation-mismatc
plugins/modules/net_tools/nios/nios_zone.py validate-modules:invalid-documentation
plugins/modules/source_control/github/github_deploy_key.py validate-modules:parameter-invalid
plugins/modules/system/iptables_state.py validate-modules:undocumented-parameter
plugins/modules/system/launchd.py use-argspec-type-path # False positive
plugins/modules/system/puppet.py use-argspec-type-path
plugins/modules/system/puppet.py validate-modules:parameter-type-not-in-doc
plugins/modules/system/runit.py validate-modules:parameter-type-not-in-doc
plugins/modules/system/runit.py validate-modules:parameter-type-not-in-doc # deprecated param - removed in 4.0.0
plugins/modules/system/ssh_config.py use-argspec-type-path # Required since module uses other methods to specify path
plugins/modules/system/xfconf.py validate-modules:return-syntax-error
plugins/modules/web_infrastructure/jenkins_plugin.py use-argspec-type-path


@@ -9,7 +9,6 @@ __metaclass__ = type
import pytest
from ansible.errors import AnsibleError, AnsibleParserError
from ansible.inventory.data import InventoryData
from ansible_collections.community.general.plugins.inventory.proxmox import InventoryModule
@@ -52,7 +51,12 @@ def get_json(url):
"disk": 1000,
"maxmem": 1000,
"uptime": 10000,
"level": ""}]
"level": ""},
{"type": "node",
"node": "testnode2",
"id": "node/testnode2",
"status": "offline",
"ssl_fingerprint": "yy"}]
elif url == "https://localhost:8006/api2/json/pools":
# _get_pools
return [{"poolid": "test"}]
@@ -554,7 +558,6 @@ def test_populate(inventory, mocker):
host_qemu_multi_nic = inventory.inventory.get_host('test-qemu-multi-nic')
host_qemu_template = inventory.inventory.get_host('test-qemu-template')
host_lxc = inventory.inventory.get_host('test-lxc')
host_node = inventory.inventory.get_host('testnode')
# check if qemu-test is in the proxmox_pool_test group
assert 'proxmox_pool_test' in inventory.inventory.groups
@@ -584,3 +587,6 @@ def test_populate(inventory, mocker):
# check if qemu template is not present
assert host_qemu_template is None
# check that offline node is in inventory
assert inventory.inventory.get_host('testnode2')


@@ -343,7 +343,7 @@ class TestKeycloakAuthentication(ModuleTestCase):
self.assertEqual(len(mock_get_authentication_flow_by_alias.mock_calls), 1)
self.assertEqual(len(mock_copy_auth_flow.mock_calls), 0)
self.assertEqual(len(mock_create_empty_auth_flow.mock_calls), 1)
self.assertEqual(len(mock_get_executions_representation.mock_calls), 2)
self.assertEqual(len(mock_get_executions_representation.mock_calls), 3)
self.assertEqual(len(mock_delete_authentication_flow_by_id.mock_calls), 0)
# Verify that the module's changed status matches what is expected
@@ -434,7 +434,7 @@ class TestKeycloakAuthentication(ModuleTestCase):
self.assertEqual(len(mock_get_authentication_flow_by_alias.mock_calls), 1)
self.assertEqual(len(mock_copy_auth_flow.mock_calls), 0)
self.assertEqual(len(mock_create_empty_auth_flow.mock_calls), 0)
self.assertEqual(len(mock_get_executions_representation.mock_calls), 2)
self.assertEqual(len(mock_get_executions_representation.mock_calls), 3)
self.assertEqual(len(mock_delete_authentication_flow_by_id.mock_calls), 0)
# Verify that the module's changed status matches what is expected
@@ -611,7 +611,7 @@ class TestKeycloakAuthentication(ModuleTestCase):
self.assertEqual(len(mock_get_authentication_flow_by_alias.mock_calls), 1)
self.assertEqual(len(mock_copy_auth_flow.mock_calls), 0)
self.assertEqual(len(mock_create_empty_auth_flow.mock_calls), 1)
self.assertEqual(len(mock_get_executions_representation.mock_calls), 2)
self.assertEqual(len(mock_get_executions_representation.mock_calls), 3)
self.assertEqual(len(mock_delete_authentication_flow_by_id.mock_calls), 1)
# Verify that the module's changed status matches what is expected


@@ -0,0 +1,150 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2021, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
from contextlib import contextmanager
from ansible_collections.community.general.tests.unit.compat import unittest
from ansible_collections.community.general.tests.unit.compat.mock import call, patch
from ansible_collections.community.general.tests.unit.plugins.modules.utils import AnsibleExitJson, AnsibleFailJson, \
ModuleTestCase, set_module_args
from ansible_collections.community.general.plugins.modules.identity.keycloak import keycloak_client
from itertools import count
from ansible.module_utils.six import StringIO
@contextmanager
def patch_keycloak_api(get_client_by_clientid=None, get_client_by_id=None, update_client=None, create_client=None,
delete_client=None):
"""Mock context manager for patching the methods in KeycloakAPI that contact the Keycloak server
Keyword arguments are passed as side effects to the mock objects that patch the corresponding KeycloakAPI methods
Example::
    with patch_keycloak_api(get_client_by_clientid=[None]) as (mock_get_client_by_clientid,
            mock_get_client_by_id, mock_create_client, mock_update_client, mock_delete_client):
        ...
"""
obj = keycloak_client.KeycloakAPI
with patch.object(obj, 'get_client_by_clientid', side_effect=get_client_by_clientid) as mock_get_client_by_clientid:
with patch.object(obj, 'get_client_by_id', side_effect=get_client_by_id) as mock_get_client_by_id:
with patch.object(obj, 'create_client', side_effect=create_client) as mock_create_client:
with patch.object(obj, 'update_client', side_effect=update_client) as mock_update_client:
with patch.object(obj, 'delete_client', side_effect=delete_client) as mock_delete_client:
yield mock_get_client_by_clientid, mock_get_client_by_id, mock_create_client, mock_update_client, mock_delete_client
def get_response(object_with_future_response, method, get_id_call_count):
if callable(object_with_future_response):
return object_with_future_response()
if isinstance(object_with_future_response, dict):
return get_response(
object_with_future_response[method], method, get_id_call_count)
if isinstance(object_with_future_response, list):
call_number = next(get_id_call_count)
return get_response(
object_with_future_response[call_number], method, get_id_call_count)
return object_with_future_response
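The `get_response` helper above dispatches on the shape of the canned response: callables are invoked, dicts are keyed by HTTP method, and lists yield one element per call, with a shared counter tracking call order across the whole mocked session. A minimal standalone sketch of that dispatch (the canned payloads are illustrative, not taken from the test suite):

```python
from itertools import count

# Standalone mirror of the get_response dispatch logic shown above.
def get_response(obj, method, call_counter):
    if callable(obj):
        return obj()  # lazy responses are produced on demand
    if isinstance(obj, dict):
        # per-HTTP-method responses, e.g. {'GET': ..., 'POST': ...}
        return get_response(obj[method], method, call_counter)
    if isinstance(obj, list):
        # one element per call, in order -- lets a test script a sequence
        # such as "absent on the first lookup, present after creation"
        return get_response(obj[next(call_counter)], method, call_counter)
    return obj

counter = count()
canned = {'GET': ['absent', 'present']}
print(get_response(canned, 'GET', counter))  # -> absent
print(get_response(canned, 'GET', counter))  # -> present
```

This is what lets the idempotency tests above hand a list like `[None, {...}]` to a single mocked method and observe different results before and after the create call.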
def build_mocked_request(get_id_user_count, response_dict):
def _mocked_requests(*args, **kwargs):
url = args[0]
method = kwargs['method']
future_response = response_dict.get(url, None)
return get_response(future_response, method, get_id_user_count)
return _mocked_requests
def create_wrapper(text_as_string):
"""Allow a call to one address to be mocked multiple times.
Without this function, the StringIO would be empty on the second call.
"""
def _create_wrapper():
return StringIO(text_as_string)
return _create_wrapper
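`create_wrapper` returns a factory rather than a `StringIO` instance because a `StringIO` can only be read to exhaustion once; handing the mock a factory gives every simulated HTTP call a fresh, rewound buffer. A quick standalone check of that behavior (using the stdlib `io.StringIO` in place of the `six` shim):

```python
from io import StringIO

def create_wrapper(text_as_string):
    # Each call to the inner function builds a fresh buffer, so repeated
    # mocked requests to the same URL all see the full response body.
    def _create_wrapper():
        return StringIO(text_as_string)
    return _create_wrapper

factory = create_wrapper('{"access_token": "alongtoken"}')
print(factory().read())  # full body
print(factory().read())  # full body again -- a shared StringIO would be empty
```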
def mock_good_connection():
token_response = {
'http://keycloak.url/auth/realms/master/protocol/openid-connect/token': create_wrapper(
'{"access_token": "alongtoken"}'), }
return patch(
'ansible_collections.community.general.plugins.module_utils.identity.keycloak.keycloak.open_url',
side_effect=build_mocked_request(count(), token_response),
autospec=True
)
class TestKeycloakRealm(ModuleTestCase):
def setUp(self):
super(TestKeycloakRealm, self).setUp()
self.module = keycloak_client
def test_authentication_flow_binding_overrides_feature(self):
"""Add a new client with an authentication flow binding override"""
module_args = {
'auth_keycloak_url': 'https://auth.example.com/auth',
'token': '{{ access_token }}',
'state': 'present',
'realm': 'master',
'client_id': 'test',
'authentication_flow_binding_overrides': {
'browser': '4c90336b-bf1d-4b87-916d-3677ba4e5fbb'
}
}
return_value_get_client_by_clientid = [
None,
{
"authenticationFlowBindingOverrides": {
"browser": "f9502b6d-d76a-4efe-8331-2ddd853c9f9c"
},
"clientId": "onboardingid",
"enabled": "true",
"protocol": "openid-connect",
"redirectUris": [
"*"
]
}
]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_client_by_clientid=return_value_get_client_by_clientid) \
as (mock_get_client_by_clientid, mock_get_client_by_id, mock_create_client, mock_update_client, mock_delete_client):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
self.assertEqual(mock_get_client_by_clientid.call_count, 2)
self.assertEqual(mock_get_client_by_id.call_count, 0)
self.assertEqual(mock_create_client.call_count, 1)
self.assertEqual(mock_update_client.call_count, 0)
self.assertEqual(mock_delete_client.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
if __name__ == '__main__':
unittest.main()


@@ -0,0 +1,614 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2021, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
from contextlib import contextmanager
from ansible_collections.community.general.tests.unit.compat import unittest
from ansible_collections.community.general.tests.unit.compat.mock import call, patch
from ansible_collections.community.general.tests.unit.plugins.modules.utils import AnsibleExitJson, AnsibleFailJson, \
ModuleTestCase, set_module_args
from ansible_collections.community.general.plugins.modules.identity.keycloak import keycloak_clientscope
from itertools import count
from ansible.module_utils.six import StringIO
@contextmanager
def patch_keycloak_api(get_clientscope_by_name=None, get_clientscope_by_clientscopeid=None, create_clientscope=None,
update_clientscope=None, get_clientscope_protocolmapper_by_name=None,
update_clientscope_protocolmappers=None, create_clientscope_protocolmapper=None,
delete_clientscope=None):
"""Mock context manager for patching the methods in KeycloakAPI that contact the Keycloak server
Keyword arguments are passed as side effects to the mock objects that patch the corresponding KeycloakAPI methods
Example::
    with patch_keycloak_api(get_clientscope_by_name=[None]) as mocks:
        ...
"""
obj = keycloak_clientscope.KeycloakAPI
with patch.object(obj, 'get_clientscope_by_name', side_effect=get_clientscope_by_name) \
as mock_get_clientscope_by_name:
with patch.object(obj, 'get_clientscope_by_clientscopeid', side_effect=get_clientscope_by_clientscopeid) \
as mock_get_clientscope_by_clientscopeid:
with patch.object(obj, 'create_clientscope', side_effect=create_clientscope) \
as mock_create_clientscope:
with patch.object(obj, 'update_clientscope', return_value=update_clientscope) \
as mock_update_clientscope:
with patch.object(obj, 'get_clientscope_protocolmapper_by_name',
side_effect=get_clientscope_protocolmapper_by_name) \
as mock_get_clientscope_protocolmapper_by_name:
with patch.object(obj, 'update_clientscope_protocolmappers',
side_effect=update_clientscope_protocolmappers) \
as mock_update_clientscope_protocolmappers:
with patch.object(obj, 'create_clientscope_protocolmapper',
side_effect=create_clientscope_protocolmapper) \
as mock_create_clientscope_protocolmapper:
with patch.object(obj, 'delete_clientscope', side_effect=delete_clientscope) \
as mock_delete_clientscope:
yield mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope, \
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name, mock_update_clientscope_protocolmappers, \
mock_create_clientscope_protocolmapper, mock_delete_clientscope
def get_response(object_with_future_response, method, get_id_call_count):
if callable(object_with_future_response):
return object_with_future_response()
if isinstance(object_with_future_response, dict):
return get_response(
object_with_future_response[method], method, get_id_call_count)
if isinstance(object_with_future_response, list):
call_number = next(get_id_call_count)
return get_response(
object_with_future_response[call_number], method, get_id_call_count)
return object_with_future_response
def build_mocked_request(get_id_user_count, response_dict):
def _mocked_requests(*args, **kwargs):
url = args[0]
method = kwargs['method']
future_response = response_dict.get(url, None)
return get_response(future_response, method, get_id_user_count)
return _mocked_requests
def create_wrapper(text_as_string):
"""Allow a call to one address to be mocked multiple times.
Without this function, the StringIO would be empty on the second call.
"""
def _create_wrapper():
return StringIO(text_as_string)
return _create_wrapper
def mock_good_connection():
token_response = {
'http://keycloak.url/auth/realms/master/protocol/openid-connect/token': create_wrapper(
'{"access_token": "alongtoken"}'), }
return patch(
'ansible_collections.community.general.plugins.module_utils.identity.keycloak.keycloak.open_url',
side_effect=build_mocked_request(count(), token_response),
autospec=True
)
class TestKeycloakAuthentication(ModuleTestCase):
def setUp(self):
super(TestKeycloakAuthentication, self).setUp()
self.module = keycloak_clientscope
def test_create_clientscope(self):
"""Add a new client scope"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'present',
'name': 'my-new-kc-clientscope'
}
return_value_get_clientscope_by_name = [
None,
{
"attributes": {},
"id": "73fec1d2-f032-410c-8177-583104d01305",
"name": "my-new-kc-clientscope"
}]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
# Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 2)
self.assertEqual(mock_create_clientscope.call_count, 1)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_create_clientscope_idempotency(self):
"""Add a client scope that already exists (idempotency check)"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'present',
'name': 'my-new-kc-clientscope'
}
return_value_get_clientscope_by_name = [{
"attributes": {},
"id": "73fec1d2-f032-410c-8177-583104d01305",
"name": "my-new-kc-clientscope"
}]
changed = False
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
# Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 1)
self.assertEqual(mock_create_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_delete_clientscope(self):
"""Delete an existing client scope"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'absent',
'name': 'my-new-kc-clientscope'
}
return_value_get_clientscope_by_name = [{
"attributes": {},
"id": "73fec1d2-f032-410c-8177-583104d01305",
"name": "my-new-kc-clientscope"
}]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
# Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 1)
self.assertEqual(mock_create_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 1)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_delete_clientscope_idempotency(self):
"""Delete a client scope that does not exist (idempotency check)"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'absent',
'name': 'my-new-kc-clientscope'
}
return_value_get_clientscope_by_name = [None]
changed = False
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
# Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 1)
self.assertEqual(mock_create_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_create_clientscope_with_protocolmappers(self):
"""Add a new client scope with protocol mappers"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'present',
'name': 'my-new-kc-clientscope',
'protocolMappers': [
{
'protocol': 'openid-connect',
'config': {
'full.path': 'true',
'id.token.claim': 'true',
'access.token.claim': 'true',
'userinfo.token.claim': 'true',
'claim.name': 'protocol1',
},
'name': 'protocol1',
'protocolMapper': 'oidc-group-membership-mapper',
},
{
'protocol': 'openid-connect',
'config': {
'full.path': 'false',
'id.token.claim': 'false',
'access.token.claim': 'false',
'userinfo.token.claim': 'false',
'claim.name': 'protocol2',
},
'name': 'protocol2',
'protocolMapper': 'oidc-group-membership-mapper',
},
{
'protocol': 'openid-connect',
'config': {
'full.path': 'true',
'id.token.claim': 'false',
'access.token.claim': 'true',
'userinfo.token.claim': 'false',
'claim.name': 'protocol3',
},
'name': 'protocol3',
'protocolMapper': 'oidc-group-membership-mapper',
},
]
}
return_value_get_clientscope_by_name = [
None,
{
"attributes": {},
"id": "890ec72e-fe1d-4308-9f27-485ef7eaa182",
"name": "my-new-kc-clientscope",
"protocolMappers": [
{
"config": {
"access.token.claim": "false",
"claim.name": "protocol2",
"full.path": "false",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "a7f19adb-cc58-41b1-94ce-782dc255139b",
"name": "protocol2",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "protocol3",
"full.path": "true",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "2103a559-185a-40f4-84ae-9ab311d5b812",
"name": "protocol3",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "protocol1",
"full.path": "true",
"id.token.claim": "true",
"userinfo.token.claim": "true"
},
"consentRequired": "false",
"id": "bbf6390f-e95f-4c20-882b-9dad328363b9",
"name": "protocol1",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
}]
}]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
# Verify number of call on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 2)
self.assertEqual(mock_create_clientscope.call_count, 1)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_update_clientscope_with_protocolmappers(self):
"""Add a new authentication flow from copy of an other flow"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'present',
'name': 'my-new-kc-clientscope',
'protocolMappers': [
{
'protocol': 'openid-connect',
'config': {
'full.path': 'false',
'id.token.claim': 'false',
'access.token.claim': 'false',
'userinfo.token.claim': 'false',
'claim.name': 'protocol1_updated',
},
'name': 'protocol1',
'protocolMapper': 'oidc-group-membership-mapper',
},
{
'protocol': 'openid-connect',
'config': {
'full.path': 'true',
'id.token.claim': 'false',
'access.token.claim': 'false',
'userinfo.token.claim': 'false',
'claim.name': 'protocol2_updated',
},
'name': 'protocol2',
'protocolMapper': 'oidc-group-membership-mapper',
},
{
'protocol': 'openid-connect',
'config': {
'full.path': 'true',
'id.token.claim': 'true',
'access.token.claim': 'true',
'userinfo.token.claim': 'true',
'claim.name': 'protocol3_updated',
},
'name': 'protocol3',
'protocolMapper': 'oidc-group-membership-mapper',
},
]
}
return_value_get_clientscope_by_name = [{
"attributes": {},
"id": "890ec72e-fe1d-4308-9f27-485ef7eaa182",
"name": "my-new-kc-clientscope",
"protocolMappers": [
{
"config": {
"access.token.claim": "true",
"claim.name": "groups",
"full.path": "true",
"id.token.claim": "true",
"userinfo.token.claim": "true"
},
"consentRequired": "false",
"id": "e077007a-367a-444f-91ef-70277a1d868d",
"name": "groups",
"protocol": "saml",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "groups",
"full.path": "true",
"id.token.claim": "true",
"userinfo.token.claim": "true"
},
"consentRequired": "false",
"id": "06c518aa-c627-43cc-9a82-d8467b508d34",
"name": "groups",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "groups",
"full.path": "true",
"id.token.claim": "true",
"userinfo.token.claim": "true"
},
"consentRequired": "false",
"id": "1d03c557-d97e-40f4-ac35-6cecd74ea70d",
"name": "groups",
"protocol": "wsfed",
"protocolMapper": "oidc-group-membership-mapper"
}
]
}]
return_value_get_clientscope_by_clientscopeid = [{
"attributes": {},
"id": "2286032f-451e-44d5-8be6-e45aac7983a1",
"name": "my-new-kc-clientscope",
"protocolMappers": [
{
"config": {
"access.token.claim": "true",
"claim.name": "protocol1_updated",
"full.path": "true",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "a7f19adb-cc58-41b1-94ce-782dc255139b",
"name": "protocol2",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "protocol1_updated",
"full.path": "true",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "2103a559-185a-40f4-84ae-9ab311d5b812",
"name": "protocol3",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "false",
"claim.name": "protocol1_updated",
"full.path": "false",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "bbf6390f-e95f-4c20-882b-9dad328363b9",
"name": "protocol1",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
}
]
}]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name,
get_clientscope_by_clientscopeid=return_value_get_clientscope_by_clientscopeid) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
# Verify number of call on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 1)
self.assertEqual(mock_create_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 1)
self.assertEqual(mock_update_clientscope.call_count, 1)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 3)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 3)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
if __name__ == '__main__':
unittest.main()


@@ -0,0 +1,326 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2021, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
from contextlib import contextmanager
from ansible_collections.community.general.tests.unit.compat import unittest
from ansible_collections.community.general.tests.unit.compat.mock import call, patch
from ansible_collections.community.general.tests.unit.plugins.modules.utils import AnsibleExitJson, AnsibleFailJson, ModuleTestCase, set_module_args
from ansible_collections.community.general.plugins.modules.identity.keycloak import keycloak_role
from itertools import count
from ansible.module_utils.six import StringIO
@contextmanager
def patch_keycloak_api(get_realm_role, create_realm_role=None, update_realm_role=None, delete_realm_role=None):
"""Mock context manager for patching the methods in PwPolicyIPAClient that contact the IPA server
Patches the `login` and `_post_json` methods
Keyword arguments are passed to the mock object that patches `_post_json`
No arguments are passed to the mock object that patches `login` because no tests require it
Example::
with patch_ipa(return_value={}) as (mock_login, mock_post):
...
"""
obj = keycloak_role.KeycloakAPI
with patch.object(obj, 'get_realm_role', side_effect=get_realm_role) as mock_get_realm_role:
with patch.object(obj, 'create_realm_role', side_effect=create_realm_role) as mock_create_realm_role:
with patch.object(obj, 'update_realm_role', side_effect=update_realm_role) as mock_update_realm_role:
with patch.object(obj, 'delete_realm_role', side_effect=delete_realm_role) as mock_delete_realm_role:
yield mock_get_realm_role, mock_create_realm_role, mock_update_realm_role, mock_delete_realm_role
def get_response(object_with_future_response, method, get_id_call_count):
if callable(object_with_future_response):
return object_with_future_response()
if isinstance(object_with_future_response, dict):
return get_response(
object_with_future_response[method], method, get_id_call_count)
if isinstance(object_with_future_response, list):
call_number = next(get_id_call_count)
return get_response(
object_with_future_response[call_number], method, get_id_call_count)
return object_with_future_response
def build_mocked_request(get_id_user_count, response_dict):
def _mocked_requests(*args, **kwargs):
url = args[0]
method = kwargs['method']
future_response = response_dict.get(url, None)
return get_response(future_response, method, get_id_user_count)
return _mocked_requests
def create_wrapper(text_as_string):
"""Allow to mock many times a call to one address.
Without this function, the StringIO is empty for the second call.
"""
def _create_wrapper():
return StringIO(text_as_string)
return _create_wrapper
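The factory is needed because a StringIO buffer can only be read once. A short sketch of the exhaustion the docstring refers to, using the standard-library `io.StringIO` rather than the `six` shim:

```python
from io import StringIO

# A single shared buffer is exhausted after the first read.
shared = StringIO('{"access_token": "alongtoken"}')
first = shared.read()
second = shared.read()   # empty: the read position is already at EOF

# A factory hands out a fresh buffer per call, as create_wrapper above does.
def make_wrapper(text):
    def _wrapper():
        return StringIO(text)
    return _wrapper

wrapper = make_wrapper('{"access_token": "alongtoken"}')
third = wrapper().read()
fourth = wrapper().read()   # a new buffer, so the content is intact again
```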
def mock_good_connection():
token_response = {
'http://keycloak.url/auth/realms/master/protocol/openid-connect/token': create_wrapper('{"access_token": "alongtoken"}'), }
return patch(
'ansible_collections.community.general.plugins.module_utils.identity.keycloak.keycloak.open_url',
side_effect=build_mocked_request(count(), token_response),
autospec=True
)
class TestKeycloakRealmRole(ModuleTestCase):
def setUp(self):
super(TestKeycloakRealmRole, self).setUp()
self.module = keycloak_role
def test_create_when_absent(self):
"""Add a new realm role"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_password': 'admin',
'auth_realm': 'master',
'auth_username': 'admin',
'auth_client_id': 'admin-cli',
'validate_certs': True,
'realm': 'realm-name',
'name': 'role-name',
'description': 'role-description',
}
return_value_absent = [
None,
{
"attributes": {},
"clientRole": False,
"composite": False,
"containerId": "realm-name",
"description": "role-description",
"id": "90f1cdb6-be88-496e-89c6-da1fb6bc6966",
"name": "role-name",
}
]
return_value_created = [None]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_realm_role=return_value_absent, create_realm_role=return_value_created) \
as (mock_get_realm_role, mock_create_realm_role, mock_update_realm_role, mock_delete_realm_role):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
self.assertEqual(len(mock_get_realm_role.mock_calls), 2)
self.assertEqual(len(mock_create_realm_role.mock_calls), 1)
self.assertEqual(len(mock_update_realm_role.mock_calls), 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_create_when_present_with_change(self):
"""Update with change a realm role"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_password': 'admin',
'auth_realm': 'master',
'auth_username': 'admin',
'auth_client_id': 'admin-cli',
'validate_certs': True,
'realm': 'realm-name',
'name': 'role-name',
'description': 'new-role-description',
}
return_value_present = [
{
"attributes": {},
"clientRole": False,
"composite": False,
"containerId": "realm-name",
"description": "role-description",
"id": "90f1cdb6-be88-496e-89c6-da1fb6bc6966",
"name": "role-name",
},
{
"attributes": {},
"clientRole": False,
"composite": False,
"containerId": "realm-name",
"description": "new-role-description",
"id": "90f1cdb6-be88-496e-89c6-da1fb6bc6966",
"name": "role-name",
}
]
return_value_updated = [None]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_realm_role=return_value_present, update_realm_role=return_value_updated) \
as (mock_get_realm_role, mock_create_realm_role, mock_update_realm_role, mock_delete_realm_role):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
self.assertEqual(len(mock_get_realm_role.mock_calls), 2)
self.assertEqual(len(mock_create_realm_role.mock_calls), 0)
self.assertEqual(len(mock_update_realm_role.mock_calls), 1)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_create_when_present_no_change(self):
"""Update without change a realm role"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_password': 'admin',
'auth_realm': 'master',
'auth_username': 'admin',
'auth_client_id': 'admin-cli',
'validate_certs': True,
'realm': 'realm-name',
'name': 'role-name',
'description': 'role-description',
}
return_value_present = [
{
"attributes": {},
"clientRole": False,
"composite": False,
"containerId": "realm-name",
"description": "role-description",
"id": "90f1cdb6-be88-496e-89c6-da1fb6bc6966",
"name": "role-name",
},
{
"attributes": {},
"clientRole": False,
"composite": False,
"containerId": "realm-name",
"description": "role-description",
"id": "90f1cdb6-be88-496e-89c6-da1fb6bc6966",
"name": "role-name",
}
]
return_value_updated = [None]
changed = False
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_realm_role=return_value_present, update_realm_role=return_value_updated) \
as (mock_get_realm_role, mock_create_realm_role, mock_update_realm_role, mock_delete_realm_role):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
self.assertEqual(len(mock_get_realm_role.mock_calls), 1)
self.assertEqual(len(mock_create_realm_role.mock_calls), 0)
self.assertEqual(len(mock_update_realm_role.mock_calls), 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_delete_when_absent(self):
"""Remove an absent realm role"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_password': 'admin',
'auth_realm': 'master',
'auth_username': 'admin',
'auth_client_id': 'admin-cli',
'validate_certs': True,
'realm': 'realm-name',
'name': 'role-name',
'state': 'absent'
}
return_value_absent = [None]
return_value_deleted = [None]
changed = False
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_realm_role=return_value_absent, delete_realm_role=return_value_deleted) \
as (mock_get_realm_role, mock_create_realm_role, mock_update_realm_role, mock_delete_realm_role):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
self.assertEqual(len(mock_get_realm_role.mock_calls), 1)
self.assertEqual(len(mock_delete_realm_role.mock_calls), 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_delete_when_present(self):
"""Remove a present realm role"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_password': 'admin',
'auth_realm': 'master',
'auth_username': 'admin',
'auth_client_id': 'admin-cli',
'validate_certs': True,
'realm': 'realm-name',
'name': 'role-name',
'state': 'absent'
}
return_value_absent = [
{
"attributes": {},
"clientRole": False,
"composite": False,
"containerId": "realm-name",
"description": "role-description",
"id": "90f1cdb6-be88-496e-89c6-da1fb6bc6966",
"name": "role-name",
}
]
return_value_deleted = [None]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_realm_role=return_value_absent, delete_realm_role=return_value_deleted) \
as (mock_get_realm_role, mock_create_realm_role, mock_update_realm_role, mock_delete_realm_role):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
self.assertEqual(len(mock_get_realm_role.mock_calls), 1)
self.assertEqual(len(mock_delete_realm_role.mock_calls), 1)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
if __name__ == '__main__':
unittest.main()


@@ -279,8 +279,20 @@ ipv4.may-fail: yes
ipv6.method: auto
ipv6.ignore-auto-dns: no
ipv6.ignore-auto-routes: no
team.runner: roundrobin
"""
TESTCASE_TEAM_HWADDR_POLICY_FAILS = [
{
'type': 'team',
'conn_name': 'non_existent_nw_device',
'ifname': 'team0_non_existant',
'runner_hwaddr_policy': 'by_active',
'state': 'present',
'_ansible_check_mode': False,
}
]
TESTCASE_TEAM_SLAVE = [
{
'type': 'team-slave',
@@ -1053,6 +1065,20 @@ def test_team_connection_unchanged(mocked_team_connection_unchanged, capfd):
assert not results['changed']
@pytest.mark.parametrize('patch_ansible_module', TESTCASE_TEAM_HWADDR_POLICY_FAILS, indirect=['patch_ansible_module'])
def test_team_connection_create_hwaddr_policy_fails(mocked_generic_connection_create, capfd):
"""
    Test : Team connection creation fails when hwaddr_policy is set for a non-activebackup runner
"""
with pytest.raises(SystemExit):
nmcli.main()
out, err = capfd.readouterr()
results = json.loads(out)
assert results.get('failed')
assert results['msg'] == "Runner-hwaddr-policy is only allowed for runner activebackup"
@pytest.mark.parametrize('patch_ansible_module', TESTCASE_TEAM_SLAVE, indirect=['patch_ansible_module'])
def test_create_team_slave(mocked_generic_connection_create, capfd):
"""


@@ -13,7 +13,7 @@ from httmock import urlmatch # noqa
from ansible_collections.community.general.tests.unit.compat import unittest
-from gitlab import Gitlab
+import gitlab
class FakeAnsibleModule(object):
@@ -33,7 +33,7 @@ class GitlabModuleTestCase(unittest.TestCase):
self.mock_module = FakeAnsibleModule()
-        self.gitlab_instance = Gitlab("http://localhost", private_token="private_token", api_version=4)
+        self.gitlab_instance = gitlab.Gitlab("http://localhost", private_token="private_token", api_version=4)
# Python 2.7+ is needed for python-gitlab
@@ -45,6 +45,14 @@ def python_version_match_requirement():
return sys.version_info >= GITLAB_MINIMUM_PYTHON_VERSION
def python_gitlab_module_version():
return gitlab.__version__
def python_gitlab_version_match_requirement():
return "2.3.0"
# Skip unittest test case if python version doesn't match requirement
def unitest_python_version_check_requirement(unittest_testcase):
if not python_version_match_requirement():
@@ -467,6 +475,32 @@ def resp_delete_project(url, request):
return response(204, content, headers, None, 5, request)
@urlmatch(scheme="http", netloc="localhost", path="/api/v4/projects/1/protected_branches/master", method="get")
def resp_get_protected_branch(url, request):
headers = {'content-type': 'application/json'}
content = ('{"id": 1, "name": "master", "push_access_levels": [{"access_level": 40, "access_level_description": "Maintainers"}],'
'"merge_access_levels": [{"access_level": 40, "access_level_description": "Maintainers"}],'
'"allow_force_push":false, "code_owner_approval_required": false}')
content = content.encode("utf-8")
return response(200, content, headers, None, 5, request)
@urlmatch(scheme="http", netloc="localhost", path="/api/v4/projects/1/protected_branches/master", method="get")
def resp_get_protected_branch_not_exist(url, request):
headers = {'content-type': 'application/json'}
content = ('')
content = content.encode("utf-8")
return response(404, content, headers, None, 5, request)
@urlmatch(scheme="http", netloc="localhost", path="/api/v4/projects/1/protected_branches/master", method="delete")
def resp_delete_protected_branch(url, request):
headers = {'content-type': 'application/json'}
content = ('')
content = content.encode("utf-8")
return response(204, content, headers, None, 5, request)
'''
HOOK API
'''


@@ -0,0 +1,81 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2019, Guillaume Martinez (lunik@tiwabbit.fr)
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import pytest
from distutils.version import LooseVersion
from ansible_collections.community.general.plugins.modules.source_control.gitlab.gitlab_protected_branch import GitlabProtectedBranch
def _dummy(x):
"""Dummy function. Only used as a placeholder for toplevel definitions when the test is going
to be skipped anyway"""
return x
pytestmark = []
try:
from .gitlab import (GitlabModuleTestCase,
python_version_match_requirement, python_gitlab_module_version,
python_gitlab_version_match_requirement,
resp_get_protected_branch, resp_get_project_by_name,
resp_get_protected_branch_not_exist,
resp_delete_protected_branch, resp_get_user)
# GitLab module requirements
if python_version_match_requirement():
from gitlab.v4.objects import Project
gitlab_req_version = python_gitlab_version_match_requirement()
gitlab_module_version = python_gitlab_module_version()
if LooseVersion(gitlab_module_version) < LooseVersion(gitlab_req_version):
pytestmark.append(pytest.mark.skip("Could not load gitlab module required for testing (Wrong version)"))
except ImportError:
pytestmark.append(pytest.mark.skip("Could not load gitlab module required for testing"))
# Unit tests requirements
try:
from httmock import with_httmock # noqa
except ImportError:
pytestmark.append(pytest.mark.skip("Could not load httmock module required for testing"))
with_httmock = _dummy
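The version gate above compares with `LooseVersion` rather than plain strings, because lexicographic comparison mis-orders multi-digit components. A simplified numeric-tuple comparison (a stand-in for `LooseVersion`, not the real class) illustrates the pitfall; the `2.3.0` minimum mirrors the requirement above:

```python
def vtuple(v):
    """Parse 'X.Y.Z' into a tuple of ints -- a simplified stand-in for LooseVersion."""
    return tuple(int(p) for p in v.split('.'))

string_says_older = "2.10.0" < "2.3.0"                    # True: lexicographic, wrong order
numeric_says_newer = vtuple("2.10.0") > vtuple("2.3.0")   # True: numeric, correct order
meets_minimum = vtuple("2.3.0") >= vtuple("2.3.0")        # the boundary case passes the gate
```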
class TestGitlabProtectedBranch(GitlabModuleTestCase):
@with_httmock(resp_get_project_by_name)
@with_httmock(resp_get_user)
def setUp(self):
super(TestGitlabProtectedBranch, self).setUp()
self.gitlab_instance.user = self.gitlab_instance.users.get(1)
self.moduleUtil = GitlabProtectedBranch(module=self.mock_module, project="foo-bar/diaspora-client", gitlab_instance=self.gitlab_instance)
@with_httmock(resp_get_protected_branch)
def test_protected_branch_exist(self):
rvalue = self.moduleUtil.protected_branch_exist(name="master")
self.assertEqual(rvalue.name, "master")
@with_httmock(resp_get_protected_branch_not_exist)
def test_protected_branch_exist_not_exist(self):
rvalue = self.moduleUtil.protected_branch_exist(name="master")
self.assertEqual(rvalue, False)
@with_httmock(resp_get_protected_branch)
def test_compare_protected_branch(self):
rvalue = self.moduleUtil.compare_protected_branch(name="master", merge_access_levels="maintainer", push_access_level="maintainer")
self.assertEqual(rvalue, True)
@with_httmock(resp_get_protected_branch)
def test_compare_protected_branch_different_settings(self):
rvalue = self.moduleUtil.compare_protected_branch(name="master", merge_access_levels="developer", push_access_level="maintainer")
self.assertEqual(rvalue, False)
@with_httmock(resp_get_protected_branch)
@with_httmock(resp_delete_protected_branch)
def test_delete_protected_branch(self):
rvalue = self.moduleUtil.delete_protected_branch(name="master")
self.assertEqual(rvalue, None)