Ansible SSH prompt / known_hosts issue


I'm running an Ansible playbook, and it works fine on one machine.

On a new machine, when I try it for the first time, I get the following error.

17:04:34 PLAY [appservers] *************************************************************
17:04:34
17:04:34 GATHERING FACTS ***************************************************************
17:04:34 fatal: [server02.cit.product-ref.dev] => {'msg': "FAILED: (22, 'Invalid argument')", 'failed': True}
17:04:34 fatal: [server01.cit.product-ref.dev] => {'msg': "FAILED: (22, 'Invalid argument')", 'failed': True}
17:04:34
17:04:34 TASK: [common | remove old ansible-tmp-*] *************************************
17:04:34 FATAL: no hosts matched or all hosts have failed -- aborting
17:04:34
17:04:34 PLAY RECAP ********************************************************************
17:04:34            to retry, use: --limit @/var/lib/jenkins/site.retry
17:04:34
17:04:34 server01.cit.product-ref.dev      : ok=0    changed=0    unreachable=1    failed=0
17:04:34 server02.cit.product-ref.dev      : ok=0    changed=0    unreachable=1    failed=0
17:04:34
17:04:34 Build step 'Execute shell' marked build as failure
17:04:34 Finished: FAILURE

This error can be resolved if I first go to the source machine (from which I'm running the Ansible playbook), manually ssh to the target machine (as the given user), and answer "yes" so that the known_hosts file entry gets created.

Now, if I run the same Ansible playbook a second time, it works without any error.

So, how can I suppress the prompt that SSH gives while making the known_hosts entry for the first time for a given user (in the ~/.ssh folder, file known_hosts)?

I found that I can do this if I use the following config entries in the ~/.ssh/config file.

~/.ssh/config

# vApp virtual machines
Host *
  StrictHostKeyChecking no
  UserKnownHostsFile=/dev/null
  User kobaloki
  LogLevel ERROR

i.e. if I place the above code in the user's ~/.ssh/config file on the source machine and run the Ansible playbook for the first time, I am not prompted to enter "yes" and the playbook runs (without requiring the user to manually create a known_hosts entry on the source machine for the target/remote machine).
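For a one-off connection test, the same settings can also be passed per-invocation with `-o` instead of editing ~/.ssh/config. A sketch, reusing the user and host name from the question as placeholders:

```shell
# Equivalent of the ~/.ssh/config entries above, applied to a single
# ssh invocation only. "kobaloki" and the host name are placeholders.
ssh -o StrictHostKeyChecking=no \
    -o UserKnownHostsFile=/dev/null \
    -o LogLevel=ERROR \
    kobaloki@server01.cit.product-ref.dev
```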

My questions:
1. What security issues should I take care of if I go the ~/.ssh/config way?
2. How can I pass these settings (what's in the config file) as parameters/options to Ansible at the command line, so that it runs the first time on a new machine (without prompting, and without depending on a known_hosts file entry on the source machine for the target machine)?
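On the security question: `StrictHostKeyChecking no` combined with `UserKnownHostsFile=/dev/null` disables SSH's protection against man-in-the-middle attacks entirely, since a changed or spoofed host key is never detected. A middle-ground alternative is to pre-populate known_hosts with `ssh-keyscan` once, before the first run. This is a sketch, assuming the host names from the question and that the keys presented at scan time can be trusted:

```shell
# Fetch the hosts' public keys once and append them (hashed, -H) to
# known_hosts, so the first ssh/ansible run is non-interactive while
# host key checking stays enabled.
# NOTE: this trusts whatever key each host presents at scan time.
ssh-keyscan -H server01.cit.product-ref.dev \
            server02.cit.product-ref.dev >> ~/.ssh/known_hosts
```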

The Ansible docs have a section on this. Quoting:

Ansible 1.2.1 and later have host key checking enabled by default.

If a host is reinstalled and has a different key in 'known_hosts', this will result in an error message until corrected. If a host is not initially in 'known_hosts', this will result in prompting for confirmation of the key, which gives an interactive experience if you are using Ansible from, say, cron. You might not want this.

If you understand the implications and wish to disable this behavior, you can do so by editing /etc/ansible/ansible.cfg or ~/.ansible.cfg:

[defaults]
host_key_checking = False

Alternatively, this can be set with an environment variable:

$ export ANSIBLE_HOST_KEY_CHECKING=False
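The variable can also be set for a single invocation only, which is convenient in a Jenkins "Execute shell" build step like the one in the log above. The playbook and inventory file names here are assumptions:

```shell
# Disable host key checking for this one ansible-playbook run only;
# "hosts" and "site.yml" are placeholder names.
ANSIBLE_HOST_KEY_CHECKING=False ansible-playbook -i hosts site.yml
```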

Also note that host key checking in paramiko mode is reasonably slow; therefore switching to 'ssh' is recommended when using this feature.

