Warp causing IO failures in NooBaa endpoint logs
### Environment info
- NooBaa Version: v4.9.0-188.ci
- Platform: OCP 4.8.9
### Actual behavior
- IO failures when running warp
### Expected behavior
- IO failures shouldn't happen
### Steps to reproduce
- Started IO using warp; all 3 runs produced failures in the NooBaa endpoint logs (a minimal multipart-upload check outside warp is sketched after the log excerpts below):
For bucket-2:
Warp run:
[root@hpo-11 akshat]# warp mixed --access-key=MDJzrV8UnqMwsK8cgPig --secret-key=bfNt3V0Pl4SOSjvvCAdm5pMdFLanLuOmRZxjxlxl --obj.size=1G --host=s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com --duration=20m --bucket=bucket-2 --debug --tls --insecure
warp: <ERROR> upload error: Put "https://s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com/bucket-2/Z7HyQxNj/2.quPJR2EKh6CszLfe.rnd?partNumber=2&uploadId=0d8c0328-c533-463a-9d1f-838b9a3fa989": write tcp 9.30.26.199:51626->9.30.38.152:443: write: broken pipe
warp: <ERROR> upload error: Put "https://s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com/bucket-2/roeWN10H/2.4oNfmf1gYj%299Mi%28w.rnd?partNumber=2&uploadId=30654bff-8d89-4e50-a0da-d1c2fd899bc2": write tcp 9.30.26.199:51652->9.30.38.152:443: use of closed network connection
warp: <ERROR> upload error: Put "https://s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com/bucket-2/y3ZWYpuv/2.D%28UZfeKJwmVQOOEV.rnd?partNumber=2&uploadId=49b391c1-6432-41d6-ac93-aa9986ce58d6": write tcp 9.30.26.199:51630->9.30.38.152:443: use of closed network connection
Oct-18 6:21:09.957 [Endpoint/13] [ERROR] core.endpoint.s3.s3_rest:: S3 ERROR <?xml version="1.0" encoding="UTF-8"?><Error><Code>NoSuchKey</Code><Message>The specified key does not exist.</Message><Resource>/bucket-2/AQnpkLPO/2.PSXt9dlvI2HUc%28y6.rnd?partNumber=3&uploadId=25a7cc15-ab3a-4b59-9491-148d74e8e8af</Resource><RequestId>kuw9qup0-52t8vu-3u7</RequestId></Error> PUT /bucket-2/AQnpkLPO/2.PSXt9dlvI2HUc%28y6.rnd?partNumber=3&uploadId=25a7cc15-ab3a-4b59-9491-148d74e8e8af {"user-agent":"MinIO (linux; amd64) minio-go/v7.0.11 warp/0.3.43","content-length":"134217728","authorization":"AWS4-HMAC-SHA256 Credential=MDJzrV8UnqMwsK8cgPig/20211018/us-east-1/s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=926e329c030e678b033779ed39c6b4908e2a325d1d6bb3e84a0f865399430aca","x-amz-content-sha256":"UNSIGNED-PAYLOAD","x-amz-date":"20211018T062039Z","host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-port":"443","x-forwarded-proto":"https","forwarded":"for=10.17.66.193;host=s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com;proto=https","x-forwarded-for":"10.17.66.193"} Error: No such file or directory
Oct-18 6:21:52.923 [Endpoint/13] [ERROR] core.endpoint.s3.s3_rest:: S3 ERROR <?xml version="1.0" encoding="UTF-8"?><Error><Code>InternalError</Code><Message>We encountered an internal error. Please try again.</Message><Resource>/bucket-2/mCC7U35O/2.D2HLB0fKBhxw0Esv.rnd?partNumber=4&uploadId=1296b846-159f-4ef5-88c6-cf7c716e7ffc</Resource><RequestId>kuw9rscp-fcgf1h-17k0</RequestId></Error> PUT /bucket-2/mCC7U35O/2.D2HLB0fKBhxw0Esv.rnd?partNumber=4&uploadId=1296b846-159f-4ef5-88c6-cf7c716e7ffc {"user-agent":"MinIO (linux; amd64) minio-go/v7.0.11 warp/0.3.43","content-length":"134217728","authorization":"AWS4-HMAC-SHA256 Credential=MDJzrV8UnqMwsK8cgPig/20211018/us-east-1/s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=ec7d455b713b9867f7be45bc651a3750e0437e6d09f42c9ae62362f414ad18df","x-amz-content-sha256":"UNSIGNED-PAYLOAD","x-amz-date":"20211018T062122Z","host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-port":"443","x-forwarded-proto":"https","forwarded":"for=10.17.66.193;host=s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com;proto=https","x-forwarded-for":"10.17.66.193"} Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
For bucket-50:
Oct-18 6:24:26.708 [Endpoint/13] [ERROR] core.endpoint.s3.s3_rest:: S3 ERROR <?xml version="1.0" encoding="UTF-8"?><Error><Code>InternalError</Code><Message>We encountered an internal error. Please try again.</Message><Resource>/bucket-50/w%28%28VwG%28y/29.RlZE%28Frsowm32k8P.rnd</Resource><RequestId>kuw9v2am-cwwqix-yy5</RequestId></Error> PUT /bucket-50/w%28%28VwG%28y/29.RlZE%28Frsowm32k8P.rnd {"user-agent":"MinIO (linux; amd64) minio-go/v7.0.11 warp/0.3.43","content-length":"50000000","authorization":"AWS4-HMAC-SHA256 Credential=eUEqexB7B0zYWROHCBYB/20211018/us-east-1/s3/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date, Signature=fd504e846dc09e68facc831b4e2847de4d5a1e4c8e89adeae7fd1351cfaa300a","content-type":"application/octet-stream","x-amz-content-sha256":"UNSIGNED-PAYLOAD","x-amz-date":"20211018T062355Z","host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-port":"443","x-forwarded-proto":"https","forwarded":"for=10.17.66.193;host=s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com;proto=https","x-forwarded-for":"10.17.66.193"} Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
Oct-18 6:24:26.872 [Endpoint/13] [ERROR] core.endpoint.s3.s3_rest:: S3 ERROR <?xml version="1.0" encoding="UTF-8"?><Error><Code>InternalError</Code><Message>We encountered an internal error. Please try again.</Message><Resource>/bucket-50/FAT7jTZB/30.R%29lM40Cr6iBeg38n.rnd</Resource><RequestId>kuw9v2dx-ew9g81-11uy</RequestId></Error> PUT /bucket-50/FAT7jTZB/30.R%29lM40Cr6iBeg38n.rnd {"user-agent":"MinIO (linux; amd64) minio-go/v7.0.11 warp/0.3.43","content-length":"50000000","authorization":"AWS4-HMAC-SHA256 Credential=eUEqexB7B0zYWROHCBYB/20211018/us-east-1/s3/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date, Signature=73b9955ac071df178d23d7dadf848ab342440087950d013327bac029c4a11818","content-type":"application/octet-stream","x-amz-content-sha256":"UNSIGNED-PAYLOAD","x-amz-date":"20211018T062355Z","host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-port":"443","x-forwarded-proto":"https","forwarded":"for=10.17.66.193;host=s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com;proto=https","x-forwarded-for":"10.17.66.193"} Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
For bucket-23:
Oct-18 6:23:33.734 [Endpoint/13] [ERROR] core.endpoint.s3.s3_rest:: S3 ERROR <?xml version="1.0" encoding="UTF-8"?><Error><Code>InternalError</Code><Message>We encountered an internal error. Please try again.</Message><Resource>/bucket-23/1l%28%29UI8M/21.ThscTUeluQnrQYFD.rnd</Resource><RequestId>kuw9ty6z-de8snj-xmc</RequestId></Error> PUT /bucket-23/1l%28%29UI8M/21.ThscTUeluQnrQYFD.rnd {"user-agent":"MinIO (linux; amd64) minio-go/v7.0.11 warp/0.3.43","content-length":"100000000","authorization":"AWS4-HMAC-SHA256 Credential=8VqyriCUouPXkfYWfOPS/20211018/us-east-1/s3/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date, Signature=130fb11badfe12862c3d7078a68189f2f0877f5cfc8ce245131dba3f7230d774","content-type":"application/octet-stream","x-amz-content-sha256":"UNSIGNED-PAYLOAD","x-amz-date":"20211018T062303Z","host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-port":"443","x-forwarded-proto":"https","forwarded":"for=10.17.66.193;host=s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com;proto=https","x-forwarded-for":"10.17.66.193"} Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
Oct-18 6:24:27.513 [Endpoint/13] [ERROR] core.endpoint.s3.s3_rest:: S3 ERROR <?xml version="1.0" encoding="UTF-8"?><Error><Code>InternalError</Code><Message>We encountered an internal error. Please try again.</Message><Resource>/bucket-23/CiF0Ujy6/22.%29hlYb%28Qe1gnQa7ha.rnd</Resource><RequestId>kuw9v32i-czd1vk-4uv</RequestId></Error> PUT /bucket-23/CiF0Ujy6/22.%29hlYb%28Qe1gnQa7ha.rnd {"user-agent":"MinIO (linux; amd64) minio-go/v7.0.11 warp/0.3.43","content-length":"100000000","authorization":"AWS4-HMAC-SHA256 Credential=8VqyriCUouPXkfYWfOPS/20211018/us-east-1/s3/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date, Signature=e15f9e4e99f7b9061ed35984931510106492782fdff6dfdcc528c90b8c1c64d3","content-type":"application/octet-stream","x-amz-content-sha256":"UNSIGNED-PAYLOAD","x-amz-date":"20211018T062356Z","host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-host":"s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com","x-forwarded-port":"443","x-forwarded-proto":"https","forwarded":"for=10.17.66.193;host=s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com;proto=https","x-forwarded-for":"10.17.66.193"} Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
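To check whether the failing `PUT ...?partNumber=N` path misbehaves only under warp, the same multipart-upload flow can be exercised with the AWS CLI. This is a sketch only, reusing the bucket, endpoint, and credentials from the warp run above; the object name and local file are placeholders.

```sh
# Sketch only: exercise the same multipart PUT path outside warp.
# Bucket, endpoint, and credentials mirror the warp run above; file/object names are placeholders.
export AWS_ACCESS_KEY_ID=MDJzrV8UnqMwsK8cgPig
export AWS_SECRET_ACCESS_KEY=bfNt3V0Pl4SOSjvvCAdm5pMdFLanLuOmRZxjxlxl
ENDPOINT=https://s3-openshift-storage.apps.ocp-akshat-1.cp.fyre.ibm.com

# Create a 1 GiB object, matching --obj.size=1G from the warp run.
dd if=/dev/urandom of=bigobj.bin bs=1M count=1024

# For an object this large the AWS CLI switches to multipart upload automatically,
# so this issues the same PUT ...?partNumber=N&uploadId=... requests seen in the logs.
aws s3 cp bigobj.bin s3://bucket-2/mpu-check.bin \
  --endpoint-url "$ENDPOINT" --no-verify-ssl
```

If this upload shows the same broken pipe / premature close behaviour, the failures are unlikely to be specific to warp.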
### More information - Screenshots / Logs / Other output
Attaching logs with the complete output.
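For reference, the endpoint log excerpts above can be pulled straight from the cluster. This is a minimal sketch, assuming the default openshift-storage namespace; exact pod names vary per deployment.

```sh
# Sketch, assuming the default openshift-storage namespace.
# List the NooBaa endpoint pods (names vary per deployment).
oc get pods -n openshift-storage | grep noobaa-endpoint

# Dump the log of one endpoint pod; substitute a pod name from the list above.
oc logs -n openshift-storage noobaa-endpoint-<suffix>
```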

Attachments: must-gather-0202_1.tar.gz.zip, panic_endpoint_2feb.zip
Closing this defect as it is not reproducible with build 5.9_nsfsfixes-20220215.