"can only concatenate str (not \"NoneType\") to str" error when using revoke_ingress method of ec2 client
Describe the bug
Attempting to use the revoke_ingress method and getting the error in the title: 'can only concatenate str (not "NoneType") to str', even though all the values being passed to the method are strings or integers.
Steps to reproduce
def remediate_sg(vuln_dict: dict, ec2_client: boto3.client) -> None:
    print(vuln_dict)
    ec2_client.revoke_security_group_ingress(CidrIp=vuln_dict['Cidr'], FromPort=vuln_dict['ToPort'], GroupId=vuln_dict['SG_ID'],
                                             IpProtocol=vuln_dict['Protocol'], ToPort=vuln_dict['ToPort'])
{'SG_ID': 'sg-0e96de665e60647c0', 'Cidr': '0.0.0.0/0', 'FromPort': 22, 'ToPort': 22, 'Protocol': 'tcp', 'Date/Time': '04/02/2020 20:29:15.583307', 'Region': 'us-east-1'}
I’ve posted the code above, along with an example of the dictionary being passed to the function. The dictionary has all the expected values, yet this error is still raised.
Expected behavior
It should revoke the security group ingress rule without raising that error. None of the values being passed is of type None, even though the error states that one is.
Debug logs
{
  "errorMessage": "can only concatenate str (not \"NoneType\") to str",
  "errorType": "TypeError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 22, in lambda_handler\n remediate_sg(vulnerability_dict, ec2_client)\n",
    "  File \"/var/task/lambda_function.py\", line 6, in remediate_sg\n ec2_client.revoke_security_group_ingress(CidrIp=vuln_dict['Cidr'], FromPort=vuln_dict['ToPort'], GroupId=vuln_dict['SG_ID'],\n",
    "  File \"/var/runtime/botocore/client.py\", line 316, in _api_call\n return self._make_api_call(operation_name, kwargs)\n",
    "  File \"/var/runtime/botocore/client.py\", line 612, in _make_api_call\n http, parsed_response = self._make_request(\n",
    "  File \"/var/runtime/botocore/client.py\", line 632, in _make_request\n return self._endpoint.make_request(operation_model, request_dict)\n",
    "  File \"/var/runtime/botocore/endpoint.py\", line 102, in make_request\n return self._send_request(request_dict, operation_model)\n",
    "  File \"/var/runtime/botocore/endpoint.py\", line 132, in _send_request\n request = self.create_request(request_dict, operation_model)\n",
    "  File \"/var/runtime/botocore/endpoint.py\", line 115, in create_request\n self._event_emitter.emit(event_name, request=request,\n",
    "  File \"/var/runtime/botocore/hooks.py\", line 356, in emit\n return self._emitter.emit(aliased_event_name, **kwargs)\n",
    "  File \"/var/runtime/botocore/hooks.py\", line 228, in emit\n return self._emit(event_name, kwargs)\n",
    "  File \"/var/runtime/botocore/hooks.py\", line 211, in _emit\n response = handler(**kwargs)\n",
    "  File \"/var/runtime/botocore/signers.py\", line 90, in handler\n return self.sign(operation_name, request)\n",
    "  File \"/var/runtime/botocore/signers.py\", line 160, in sign\n auth.add_auth(request)\n",
    "  File \"/var/runtime/botocore/auth.py\", line 368, in add_auth\n signature = self.signature(string_to_sign, request)\n",
    "  File \"/var/runtime/botocore/auth.py\", line 348, in signature\n k_date = self._sign(('AWS4' + key).encode('utf-8'),\n"
  ]
}
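The final frame of the trace shows botocore's SigV4 signer concatenating the literal 'AWS4' with the secret access key. A minimal sketch (independent of boto3) of why a missing key produces exactly this message:

```python
# Minimal reproduction of the failure inside botocore's SigV4 signing step.
# botocore builds the signing key by prepending 'AWS4' to the secret key;
# if no secret key was resolved, the value is None, and str + None raises
# the exact TypeError seen in the debug log above.
key = None  # stand-in for an unresolved secret access key

try:
    signing_key = ('AWS4' + key).encode('utf-8')
except TypeError as exc:
    print(exc)  # can only concatenate str (not "NoneType") to str
```

So the error is about the credentials inside the signer, not about the values passed to revoke_security_group_ingress.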
Issue Analytics
- Created: 3 years ago
- Comments: 17 (6 by maintainers)
Top GitHub Comments
@mr26 - Thank you for your post and for providing all the details. How have you configured the credentials used by your Lambda function? Since you are getting the error on this line:

File "/var/runtime/botocore/auth.py", line 348, in signature
    k_date = self._sign(('AWS4' + key).encode('utf-8'),

it makes me think that your secret key value is set to None. Can you please check the secret key to make sure it has been set correctly? You can check your secret key using a boto3 session.
If your secret key is set correctly but you are still getting the error, please provide the debug log. You can enable logging by adding

boto3.set_stream_logger('')

to your code.

I am glad you figured out the issue. Closing this issue as it has been resolved.