IPv4 salt minion connecting to IPv6 master is seen as not_alived
Description of Issue / Setup / Steps to reproduce the issue
When a salt master is configured as follows:
interface: 0::0
ipv6: True
it listens on port 4505 as follows:
LISTEN 0 128 *:4505 *:*
However, when a minion connects to this master using its IPv4 address, ss -ant | grep 4505 shows the following:
ESTAB 0 0 [::ffff:10.1.2.10]:4505 [::ffff:10.1.2.11]:42220
Here 10.1.2.10 is the master's IPv4 address and 10.1.2.11 is the minion's.
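For context, this mapped form is standard behaviour for a dual-stack listener: when a socket bound to :: (with IPV6_V6ONLY disabled) accepts an IPv4 client, the kernel reports the peer as an IPv4-mapped IPv6 address (::ffff:a.b.c.d), which is exactly what the ss output above shows. A minimal, standalone Python sketch (not Salt code) reproducing this on Linux:

    import socket

    # Dual-stack listener, analogous to the master binding to "interface: 0::0"
    srv = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    srv.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)
    srv.bind(("::", 0))          # ephemeral port instead of 4505 for the demo
    srv.listen(1)
    port = srv.getsockname()[1]

    # IPv4 client, analogous to the minion connecting over IPv4
    cli = socket.create_connection(("127.0.0.1", port))
    conn, addr = srv.accept()
    print(addr[0])               # e.g. '::ffff:127.0.0.1' -- the mapped form seen in ss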
When salt-run manage.alived is run, the minion’s id is not returned.
Should remote_host.strip("[]") at https://github.com/saltstack/salt/blob/b95213ec903402f25c1e0aeb3990fe8452ab63ce/salt/utils/network.py#L1629 also be checked for an ipv4_mapped address (via salt._compat.ipaddress) when it is an IPv6 address?
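As a sketch of what that check could look like, using the stdlib ipaddress module (which salt._compat.ipaddress mirrors); normalize_remote is a hypothetical helper for illustration, not the actual code in network.py:

    import ipaddress

    def normalize_remote(remote_host):
        """If the remote address is an IPv4-mapped IPv6 address
        (e.g. '::ffff:10.1.2.11'), return the embedded IPv4 address so it
        can match the minion's configured IPv4 address; otherwise return
        the address unchanged."""
        host = remote_host.strip("[]")
        try:
            addr = ipaddress.ip_address(host)
        except ValueError:
            return host
        if addr.version == 6 and addr.ipv4_mapped is not None:
            return str(addr.ipv4_mapped)
        return host

    # '[::ffff:10.1.2.11]' -> '10.1.2.11'
    print(normalize_remote("[::ffff:10.1.2.11]"))

With a normalization along these lines, the remote ::ffff:10.1.2.11 would compare equal to the minion's 10.1.2.11 and manage.alived could report the minion as alive.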
Versions Report
salt --versions-report

Salt Version:
  Salt: 3000.3

Dependency Versions:
  cffi: 1.11.5
  cherrypy: unknown
  dateutil: 2.6.1
  docker-py: Not Installed
  gitdb: Not Installed
  gitpython: Not Installed
  Jinja2: 2.10.1
  libgit2: Not Installed
  M2Crypto: 0.33.0
  Mako: 1.0.6
  msgpack-pure: Not Installed
  msgpack-python: 0.6.1
  mysql-python: Not Installed
  pycparser: 2.14
  pycrypto: Not Installed
  pycryptodome: 3.9.7
  pygit2: Not Installed
  Python: 3.6.8 (default, Nov 21 2019, 19:31:34)
  python-gnupg: Not Installed
  PyYAML: 3.12
  PyZMQ: 17.0.0
  smmap: Not Installed
  timelib: Not Installed
  Tornado: 4.5.3
  ZMQ: 4.3.2

System Versions:
  dist: centos 8.1.1911 Core
  locale: UTF-8
  machine: x86_64
  release: 4.18.0-147.el8.x86_64
  system: Linux
  version: CentOS Linux 8.1.1911 Core
Issue Analytics
- Created 3 years ago
- Comments: 6 (6 by maintainers)
@golmaal He is on the Core Team, but a conversation was in progress, so it was difficult for him to respond right away; I am helping here. This is correct and we will need to fix it, so I am putting it into the current release cycle for now and will attempt to get it addressed. I cannot commit to the fix, but I welcome community involvement and will see if I can get this assigned this week or early next week. Thank you!
Looks like other work took priority and we didn't get to this, so moving it to the Silicon release cycle.