ansible 2.6.2: playbook works when executed with ansible-playbook command but not in AWX 1.0.7.2


I've seen many posts claiming that playbooks work properly when executed with the Ansible CLI but not in AWX. However, I didn't find any solution to my issue. To keep it simple, here is the role in question:

---
- name: Append Public key in authorized_keys file
  authorized_key:
    user: "{{ username }}"
    state: present
    key: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"

It is called as follows:

- name: copy root public key to nodes
  become: yes
  become_user: root
  hosts: jenkins-nodes
  roles:
    - role: copy-keys
      username: root

Running it with the CLI, as shown below:

ansible-playbook -i inventory.ini -u root <my-playbook> -vvv

works as expected and displays the following:

TASK [copy-keys : Append Public key in authorized_keys file] 
***************************************************************
task path: /opt/jenkins-cluster/roles/copy-keys/tasks/main.yml:2
...
ok: [jenkins-agent-1] => {
"changed": false,
"comment": null,
"exclusive": false,
"invocation": {
    "module_args": {
        "comment": null,
        "exclusive": false,
        "key": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCuF9U2HvzUubuYYZxJaEu/1nls7RLAZO+qcJF37RIepTSLOgoPsluq7uVRhEnadqnB0yVWccZYHs6WEp5Fo2QIRDRho4+TuACB26EE4GTYGnozyMwOwVcTzRo0CiUXfo3IZKWwQ+v8WwBMae3EpYrbrEZy6lLS8K85uYseyjg1myRhEsltdSiNnHun7p09/v/HMq2KsZcmx6nTg66QvkbbnFvv9UpGQ1J6gvimp11r5r1hwXaB7ejTwrxMICvaE2Flq3WGeaB35I4dYFsrWNK1CalP7jPF+MRgqHUrjoOy5hxp3zSXunfGWeRJCaJY5hYDLp3hTGrt8BwcdD+8Gy7r root@inf-inone01-prd",
        "key_options": null,
        "keyfile": "/root/.ssh/authorized_keys",
        "manage_dir": true,
        "path": null,
        "state": "present",
        "unique": false,
        "user": "root",
        "validate_certs": true
    }
},
"key": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCuF9U2HvzUubuYYZxJaEu/1nls7RLAZO+qcJF37RIepTSLOgoPsluq7uVRhEnadqnB0yVWccZYHs6WEp5Fo2QIRDRho4+TuACB26EE4GTYGnozyMwOwVcTzRo0CiUXfo3IZKWwQ+v8WwBMae3EpYrbrEZy6lLS8K85uYseyjg1myRhEsltdSiNnHun7p09/v/HMq2KsZcmx6nTg66QvkbbnFvv9UpGQ1J6gvimp11r5r1hwXaB7ejTwrxMICvaE2Flq3WGeaB35I4dYFsrWNK1CalP7jPF+MRgqHUrjoOy5hxp3zSXunfGWeRJCaJY5hYDLp3hTGrt8BwcdD+8Gy7r root@inf-inone01-prd",
"key_options": null,
"keyfile": "/root/.ssh/authorized_keys",
"manage_dir": true,
"path": null,
"state": "present",
"unique": false,
"user": "root",
"validate_certs": true
}
...
META: ran handlers
META: ran handlers

When I execute exactly the same playbook in AWX, I get:

TASK [copy-keys : Append Public key in authorized_keys file] 
*******************
task path: /var/lib/awx/projects/_39__jenkins_cluster/roles/copy-keys/tasks/main.yml:2
 [WARNING]: Unable to find '~/.ssh/id_rsa.pub' in expected paths (use -vvvvv to
see paths)
 [WARNING]: Unable to find '~/.ssh/id_rsa.pub' in expected paths (use -vvvvv to
see paths)
fatal: [jenkins-agent-1]: FAILED! => {
    "msg": "An unhandled exception occurred while running the lookup plugin 'file'. Error was a 
<class 'ansible.errors.AnsibleError'>, original message: could not locate file in lookup: ~/.ssh/id_rsa.pub"
}

The exception says that the file ~/.ssh/id_rsa.pub (here /root/.ssh/id_rsa.pub, since the user is root) cannot be located, as if it didn't exist. My understanding is that the authorized_key module adds the content of /root/.ssh/id_rsa.pub on the Ansible controller to the authorized_keys file on the target host. And this file does exist:

PROD root@inf-inone01-prd jenkins-cluster $ cat /root/.ssh/id_rsa.pub
ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCuF9U2HvzUubuYYZxJaEu/1nls7RLAZO+qcJF37RIepTSLOgoPsluq7uVRhEnadqnB0yVWccZYHs6WEp5Fo2QIRDRho4+TuACB26EE4GTYGnozyMwOwVcTzRo0CiUXfo3IZKWwQ+v8WwBMae3EpYrbrEZy6lLS8K85uYseyjg1myRhEsltdSiNnHun7p09/v/HMq2KsZcmx6nTg66QvkbbnFvv9UpGQ1J6gvimp11r5r1hwXaB7ejTwrxMICvaE2Flq3WGeaB35I4dYFsrWNK1CalP7jPF+MRgqHUrjoOy5hxp3zSXunfGWeRJCaJY5hYDLp3hTGrt8BwcdD+8Gy7r root@inf-inone01-prd
PROD root@inf-inone01-prd jenkins-cluster $

Obviously the file lookup is not able to resolve ~/.ssh/id_rsa.pub, but how come it manages to when run with the CLI?
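For reference, lookup plugins are always evaluated on the machine executing the playbook, not on the target hosts. A quick diagnostic (a sketch; the group name matches the playbook above, output will vary per environment) shows which user and home directory the lookup actually sees in each context:

```yaml
---
# Diagnostic sketch: prints the controller-side environment that
# lookup plugins (such as 'file') are evaluated in.
- name: Show where file lookups are evaluated
  hosts: jenkins-nodes
  gather_facts: false
  tasks:
    - name: Print the controller-side user and home directory
      debug:
        msg: "USER={{ lookup('env', 'USER') }}, HOME={{ lookup('env', 'HOME') }}"
```

Run from the CLI as root this reports /root; run from AWX it reports the environment of the AWX task runner, which is where ~/.ssh/id_rsa.pub has to exist for the lookup to succeed.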

Any suggestion would be highly appreciated. After spending time testing the whole setup with the CLI to cover all the cases, I thought that moving everything into AWX would be a matter of minutes. Unfortunately, it is not.

Kind regards,

Nicolas

Answer from purplemouse:

I had the same need and the best solution I found involved using a custom credential type.

Examples of how to set up custom credential types are explained very nicely here and here.

In my case, I created a custom credential type called "SSH Keypair Credential" as seen here.

The input configuration:

fields:
  - id: my_ssh_private_key
    type: string
    label: ssh_private_key
    secret: true
    multiline: true
  - id: my_ssh_public_key
    type: string
    label: ssh_public_key
    secret: true

The injected configuration:

extra_vars:
  ssh_private_key: '{{ tower.filename.my_ssh_private_key }}'
  ssh_public_key: '{{ tower.filename.my_ssh_public_key }}'
file:
  template.my_ssh_private_key: '{{ my_ssh_private_key }}'
  template.my_ssh_public_key: '{{ my_ssh_public_key }}'
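At job runtime, the file section makes AWX write each secret to a temporary file on the node running the job, and tower.filename.<id> resolves to that file's path. The injected extra_vars therefore end up looking something like this (illustrative values only; AWX generates random temporary paths per run):

```yaml
# Illustrative only: the actual temp paths are generated by AWX
# for each job and cleaned up afterwards.
ssh_private_key: /tmp/awx_credential_abc123/tmp_private
ssh_public_key: /tmp/awx_credential_abc123/tmp_public
```

This is what lets the playbook below read the key with a file lookup without depending on ~/.ssh existing in the AWX environment.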

After creating the custom credential type, create the custom credential as seen here.

Then add the custom credential to the respective template to be used as seen here.

The following playbook variables were used:

admin_username: "admin"
admin_public_sshkey: "{{ '~/.ssh/id_rsa.pub' | expanduser }}"
admin_private_sshkey: "{{ '~/.ssh/id_rsa' | expanduser }}"

admin_ssh_private_key: "{{ ssh_private_key | d(admin_private_sshkey) }}"
admin_ssh_public_key: "{{ ssh_public_key | d(admin_public_sshkey) }}"
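With these defaults, the same playbook works in both contexts: from the CLI, ssh_public_key is undefined, so d() falls back to the expanded ~/.ssh/id_rsa.pub on the control machine; in AWX, the injected extra_var takes precedence. A small check (a sketch, reusing the variable names above) makes the resolution visible:

```yaml
- name: Show which public key file will be used
  debug:
    msg: "Reading public key from {{ admin_ssh_public_key }}"
```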

The playbook task that sets the authorized_key:

- name: Add admin user SSH authorized keys
  when: admin_ssh_public_key is defined
  authorized_key:
    user: "{{ admin_username }}"
    key: "{{ lookup('file', admin_ssh_public_key) }}"