Getting no debug output on why a role is not triggered from the Ansible playbook

I made a very simple Ansible playbook, playbook_standalone_add_users.yml (it just adds 2 Linux users). It runs very well on a Vagrant test infrastructure with the libvirt provider and Ansible provisioning; see the Vagrantfile below.

When I try to transform playbook_standalone_add_users.yml into:

  • a single file role: ./ansible_roles/myinit/main.yml
  • and a playbook to call this role: ./tests/playbook_to_test_myinit.yml

my provisioning does nothing, there is no error, and, more annoyingly, there is no way to get any kind of useful debug output -- e.g. to know whether the role code is executed or not ;(
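
For context: the role layout documented by Ansible keeps a role's tasks in roles/<name>/tasks/main.yml, and that file contains only the task list, without a play header. The single-file main.yml I use below is my flattened variant of that documented layout, which for my role would look like:

ansible_roles/
└── myinit/
    └── tasks/
        └── main.yml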

Let's demonstrate step by step.

Working playbook_standalone_add_users.yml

The Vagrant file:

$ cat Vagrantfile 
ENV['VAGRANT_DEFAULT_PROVIDER'] = 'libvirt'

Vagrant.configure("2") do |config|
  config.vm.box = "debian/bullseye64"
  config.ssh.insert_key = true
  config.vm.provider :libvirt do |libvirt|
    libvirt.memory = 2048
    libvirt.cpus = 2
  end
  config.vm.provision "ansible" do |ansible|
    ansible.verbose = "v"
    ansible.playbook = "playbook_standalone_add_users.yml"
  end
end

The organization of files:

$ tree
.
├── playbook_standalone_add_users.yml
└── Vagrantfile

The playbook_standalone_add_users.yml playbook:

$ cat playbook_standalone_add_users.yml 
---
- hosts: all
  become: yes
  become_method: sudo
  vars:
    users:
    - username: luis
      groups: sudo
    - username: paul
      groups: sudo
  tasks:
    - name: Add Users
      user:
        name: "{{ user_item.username }}"
        createhome: yes
        shell: /bin/bash
        groups: "{{ user_item.groups }}"
        append: yes
      loop: "{{ users }}"
      loop_control: { loop_var: user_item }
$

Let's setup the Vagrant VM:

$ vagrant up --no-provision
...
$
$ vagrant ssh -c "ls /home"
vagrant
$

Now let's provision:

$ vagrant provision 
==> default: [vagrant-hostsupdater] Checking for host entries
==> default: Running provisioner: ansible...
    default: Running ansible-playbook...
PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o IdentitiesOnly=yes -o ControlMaster=auto -o ControlPersist=60s' ansible-playbook --connection=ssh --timeout=30 --limit="default" --inventory-file=/home/luis/proj/g/infra/mz_devops/vagrant/deb/.vagrant/provisioners/ansible/inventory -v playbook_standalone_add_users.yml
No config file found; using defaults

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
ok: [default]

TASK [Add Users] ***************************************************************
ok: [default] => (item={'username': 'luis', 'groups': 'sudo'}) => {"ansible_loop_var": "user_item", "append": true, "changed": false, "comment": "", "group": 1001, "groups": "sudo", "home": "/home/luis", "move_home": false, "name": "luis", "shell": "/bin/bash", "state": "present", "uid": 1001, "user_item": {"groups": "sudo", "username": "luis"}}
ok: [default] => (item={'username': 'paul', 'groups': 'sudo'}) => {"ansible_loop_var": "user_item", "append": true, "changed": false, "comment": "", "group": 1002, "groups": "sudo", "home": "/home/paul", "move_home": false, "name": "paul", "shell": "/bin/bash", "state": "present", "uid": 1002, "user_item": {"groups": "sudo", "username": "paul"}}

PLAY RECAP *********************************************************************
default                    : ok=2    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   
$

and test it... it works!

$ vagrant ssh -c "ls /home"
luis  paul  vagrant
$

Not working: single-file Ansible role called from a playbook

So with the following structure:

$ tree
.
├── ansible_roles
│   └── myinit
│       └── main.yml
├── tests
│   └── playbook_to_test_myinit.yml
└── Vagrantfile

We replace playbook_standalone_add_users.yml (unchanged from the listing above):
  • with ./ansible_roles/myinit/main.yml:
$ cat ./ansible_roles/myinit/main.yml
---
- hosts: all
  become: yes
  become_method: sudo
  vars:
    users:
    - username: luis
      groups: sudo
    - username: paul
      groups: sudo
  tasks:
    - name: Add users Role
      user:
        name: "{{ user_item.username }}"
        createhome: yes
        shell: /bin/bash
        groups: "{{ user_item.groups }}"
        append: yes
      loop: "{{ users }}"
      loop_control: { loop_var: user_item }
$

Note: apart from the task name, there is no diff between the standalone playbook and the role file:

$ diff playbook_standalone_add_users.yml ansible_roles/myinit/main.yml 
12c12
<     - name: Add Users
---
>     - name: Add users Role
$
  • and we call it from ./tests/playbook_to_test_myinit.yml:
$ cat ./tests/playbook_to_test_myinit.yml
---
- name: Test myinit Ansible Role (create 2 linux users)
  hosts: all
  become: true
  roles:
    - ../ansible_roles/myinit
$
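
Side note: the dict form of a roles: entry would also let this test playbook override the role's variables in place; this is standard role syntax, with a hypothetical testuser value just for illustration:

---
- name: Test myinit Ansible Role (create 2 linux users)
  hosts: all
  become: true
  roles:
    - role: ../ansible_roles/myinit
      vars:
        users:
          - username: testuser   # hypothetical, for illustration only
            groups: sudo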

Obviously, we change the ansible.playbook value in the Vagrantfile to provision with it:

$ cat Vagrantfile 
ENV['VAGRANT_DEFAULT_PROVIDER'] = 'libvirt'

Vagrant.configure("2") do |config|
  config.vm.box = "debian/bullseye64"
  config.ssh.insert_key = true
  config.vm.provider :libvirt do |libvirt|
    libvirt.memory = 2048
    libvirt.cpus = 2
  end
  config.vm.provision "ansible" do |ansible|
    ansible.verbose = "v"
    ansible.playbook = "tests/playbook_to_test_myinit.yml"
  end
end
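
Note: the provisioner's verbosity can be raised further if needed; the Vagrant Ansible provisioner accepts higher levels such as:

    ansible.verbose = "vvv"

which is equivalent to passing -vvv to ansible-playbook.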

We start Vagrant from scratch:

$ vagrant destroy 
==> default: [vagrant-hostsupdater] Removing hosts
    default: Are you sure you want to destroy the 'default' VM? [y/N] y
==> default: Pruning invalid NFS exports. Administrator privileges will be required...
==> default: Removing domain...
==> default: Deleting the machine folder
$

$ virsh list 
 Id   Name   State
--------------------


Start a fresh VM:

$ vagrant up --no-provision
...
$

$ vagrant ssh -c "ls /home"
vagrant
$
$

and now provision -- it does not work ;(

$ vagrant provision 
==> default: [vagrant-hostsupdater] Checking for host entries
==> default: Running provisioner: ansible...
    default: Running ansible-playbook...
PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o IdentitiesOnly=yes -o ControlMaster=auto -o ControlPersist=60s' ansible-playbook --connection=ssh --timeout=30 --limit="default" --inventory-file=/home/luis/proj/g/infra/mz_devops/vagrant/deb/.vagrant/provisioners/ansible/inventory -v tests/playbook_to_test_myinit.yml
No config file found; using defaults

PLAY [Test myinit Ansible Role (create 2 linux users)] *************************

TASK [Gathering Facts] *********************************************************
ok: [default]

PLAY RECAP *********************************************************************
default                    : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   
$

$ vagrant ssh -c "ls /home"
vagrant
$

There is no error, and we do see the line

PLAY [Test myinit Ansible Role (create 2 linux users)] *************************

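As a sanity check, the same playbook can also be run by hand with --list-tasks (a standard ansible-playbook flag that prints the tasks that would run, without executing them), reusing the inventory generated by Vagrant:

$ ansible-playbook --list-tasks \
    --inventory-file=.vagrant/provisioners/ansible/inventory \
    tests/playbook_to_test_myinit.yml

If the role were picked up, its "Add users Role" task should show up in that listing.
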
With more debug (vagrant provision --debug) we get:

$ vagrant provision --debug
...
...
==> default: [vagrant-hostsupdater] Checking for host entries
DEBUG ssh: Checking key permissions: /home/luis/proj/g/infra/mz_devops/vagrant/deb/.vagrant/machines/default/libvirt/private_key
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::Provision:0x00005646c44c0840>
 INFO provision: Ignoring sentinel check, forcing provision
 INFO provision: Checking provisioner sentinel file...

 INFO interface: detail: Running ansible-playbook...
 INFO interface: detail:     default: Running ansible-playbook...
    default: Running ansible-playbook...
 INFO interface: detail: PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o IdentitiesOnly=yes -o ControlMaster=auto -o ControlPersist=60s' ansible-playbook --connection=ssh --timeout=30 --limit="default" --inventory-file=/home/luis/proj/g/infra/mz_devops/vagrant/deb/.vagrant/provisioners/ansible/inventory -v tests/playbook_to_test_myinit.yml
PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o IdentitiesOnly=yes -o ControlMaster=auto -o ControlPersist=60s' ansible-playbook --connection=ssh --timeout=30 --limit="default" --inventory-file=/home/luis/proj/g/infra/mz_devops/vagrant/deb/.vagrant/provisioners/ansible/inventory -v tests/playbook_to_test_myinit.yml
 INFO subprocess: Starting process: ["/home/luis/.pyenv/shims/ansible-playbook", "--connection=ssh", "--timeout=30", "--limit=default", "--inventory-file=/home/luis/proj/g/infra/mz_devops/vagrant/deb/.vagrant/provisioners/ansible/inventory", "-v", "tests/playbook_to_test_myinit.yml"]
 INFO subprocess: Vagrant not running in installer, restoring original environment...
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stdout: No config file found; using defaults
 INFO interface: detail: No config file found; using defaults

No config file found; using defaults
DEBUG subprocess: stdout: 
PLAY [Test myinit Ansible Role (create 2 linux users)] *************************
 INFO interface: detail: 
PLAY [Test myinit Ansible Role (create 2 linux users)] *************************


PLAY [Test myinit Ansible Role (create 2 linux users)] *************************
DEBUG subprocess: stdout: 
TASK [Gathering Facts] *********************************************************
 INFO interface: detail: 
TASK [Gathering Facts] *********************************************************


TASK [Gathering Facts] *********************************************************
DEBUG subprocess: stdout: ok: [default]
 INFO interface: detail: ok: [default]

ok: [default]
DEBUG subprocess: stdout: 
PLAY RECAP *********************************************************************
default                    : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

 INFO interface: detail: 
PLAY RECAP *********************************************************************
default                    : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   



PLAY RECAP *********************************************************************
default                    : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

DEBUG subprocess: Waiting for process to exit. Remaining to timeout: 31999
DEBUG subprocess: Exit status: 0
 INFO warden: Calling OUT action: #<Proc:0x000055bb9d8e65d0 /usr/share/rubygems-integration/all/gems/vagrant-2.2.19/lib/vagrant/action/warden.rb:126 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::Provision:0x000055bb9efbd808>
 INFO warden: Calling OUT action: #<VagrantPlugins::HostsUpdater::Action::UpdateHosts:0x000055bb9efbd880>
 INFO warden: Calling OUT action: #<Proc:0x000055bb9f05c160 /usr/share/rubygems-integration/all/gems/vagrant-2.2.19/lib/vagrant/action/warden.rb:126 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::Call:0x000055bb9e856678>
 INFO warden: Calling OUT action: #<Proc:0x000055bb9ed9b368 /usr/share/rubygems-integration/all/gems/vagrant-2.2.19/lib/vagrant/action/warden.rb:126 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::Call:0x000055bb9cd33d00>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::ConfigValidate:0x000055bb9cd33ee0>
 INFO interface: Machine: action ["provision", "end", {:target=>:default}]
 INFO environment: Released process lock: machine-action-bae584283f96bd7eb3b32166ae454319
DEBUG environment: Attempting to acquire process-lock: dotlock
 INFO environment: Acquired process lock: dotlock
 INFO environment: Released process lock: dotlock
 INFO environment: Running hook: environment_unload
 INFO runner: Running action: environment_unload #<Vagrant::Action::Builder:0x000055bb9f62e070>
$ 
$ vagrant ssh -c "ls /home"
vagrant
$

Trying to investigate the logs, I found this:

$ vagrant provision --debug > /tmp/log.txt 2>&1
$ grep -i sent /tmp/log.txt 
 INFO provision: Ignoring sentinel check, forcing provision
 INFO provision: Checking provisioner sentinel file...
 INFO provision: Sentinel found! Not provisioning.
$
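
If that sentinel were the culprit, it could presumably be cleared by hand -- the path below is my guess, based on where Vagrant keeps the machine state:

$ rm .vagrant/machines/default/libvirt/action_provision
$ vagrant provision

but the log also says "Ignoring sentinel check, forcing provision", and the play banner does appear, so the sentinel does not look like the blocker.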

But I still have no clue why the role's tasks never run.
