Using an IPv6 iSCSI target portal causes mount to fail
Using democratic-csi from helm chart v0.13.1.
If I configure:
```yaml
driver:
  config:
    driver: freenas-iscsi
    iscsi:
      targetPortal: '[2001:123:456::1]:3260'
```
and then create a k8s deployment with a PVC using the appropriate storage class, the pod fails to start with:
```
MountVolume.MountDevice failed for volume "pvc-5168eaae-77c4-4361-91e9-6994283d36c3" : rpc error: code = Internal desc = {"code":1,"stdout":"","stderr":"iscsiadm: false is an invalid session ID or path\n","timeout":false}
```
The volume is provisioned and shared properly on FreeNAS, and an iSCSI session is opened to the correct portal on the k8s node. However, in the csi-driver log, I can see:
```
csi-driver executing iscsi command: iscsiadm -m node -T iqn.2005-10.org.freenas.ctl:default-aptcacher-iscsi-claim -p [2001:123:456::1]:3260 -o new
csi-driver executing iscsi command: iscsiadm -m node -T iqn.2005-10.org.freenas.ctl:default-aptcacher-iscsi-claim -p [2001:123:456::1]:3260 -o update --name node.startup --value manual
csi-driver executing iscsi command: iscsiadm -m node -T iqn.2005-10.org.freenas.ctl:default-aptcacher-iscsi-claim -p [2001:123:456::1]:3260 -l
csi-driver executing iscsi command: iscsiadm -m session
csi-driver executing iscsi command: iscsiadm -m session -r false --rescan
```
The fault is in the last line: it tries to rescan a session but passes a session id of `false`, which isn't valid.
It seems this comes from the logic in [src/utils/iscsi.js#L181](https://github.com/democratic-csi/democratic-csi/blob/v1.7.2/src/utils/iscsi.js#L181), which attempts to discover the session id. For whatever reason it doesn't find one, so it returns `false` rather than a session id, which then gets passed through to `rescanSession` in [src/utils/iscsi.js#L491](https://github.com/democratic-csi/democratic-csi/blob/v1.7.2/src/utils/iscsi.js#L491).
I suspect the `portal == i_session.portal` comparison is at fault somehow, but I'm not really sure how to debug any further.
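To illustrate the failure mode, here is a minimal sketch (not the actual democratic-csi source; the parsing details and `normalizePortal` helper are assumptions for illustration) of how a session lookup over `iscsiadm -m session` output can return `false` when the configured portal string and the reported portal string don't compare equal byte-for-byte, and one way to normalize IPv6 bracket notation before comparing:

```javascript
// A typical `iscsiadm -m session` line looks like:
//   tcp: [3] [2001:123:456::1]:3260,1 iqn.2005-10.org.freenas.ctl:default-aptcacher-iscsi-claim (non-flash)
function parseSessions(stdout) {
  return stdout
    .trim()
    .split("\n")
    .filter((line) => line.length > 0)
    .map((line) => {
      const parts = line.split(" ");
      return {
        id: parts[1].replace("[", "").replace("]", ""), // "3"
        portal: parts[2].split(",")[0],                 // "[2001:123:456::1]:3260"
        iqn: parts[3],
      };
    });
}

// Strip the IPv6 brackets and lowercase, so "[2001:123:456::1]:3260"
// and "2001:123:456::1:3260" compare equal regardless of how the
// caller formatted the configured targetPortal.
function normalizePortal(portal) {
  return portal.replace("[", "").replace("]", "").toLowerCase();
}

function findSessionId(sessions, targetPortal, iqn) {
  for (const s of sessions) {
    if (normalizePortal(s.portal) === normalizePortal(targetPortal) && s.iqn === iqn) {
      return s.id;
    }
  }
  // Without normalization, an IPv6 targetPortal that differs only in
  // bracketing falls through to here, and the `false` ends up being
  // interpolated into `iscsiadm -m session -r false --rescan`.
  return false;
}
```

The key point is that callers must treat `false` as "no session found" rather than passing it straight into an `iscsiadm -r <id>` invocation.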
Issue Analytics
- Created a year ago
- Comments: 12 (8 by maintainers)
Success! Many thanks.
And yes, I was discussing that very bug earlier today with the csi team, which is what made me realize you would have issues. That specific code is not directly relevant in the case of this driver, but indeed my logic suffers from the exact same issue (and has been resolved in the exact same fashion). I should have v1.7.5 snapped shortly.