AWS: IAM iam.py errors when processing multiple filters if no action taken
Describe the bug
When running an IAM user key filter policy, the iam.py script performs the filtering and runs the disable action, but then drops out with an error before it can run the second action:
[ERROR] 2021-01-29T09:48:39.544Z 9911f614-77a7-465b-8544-743b132ff816 Error while executing policy
Traceback (most recent call last):
  File "/var/task/c7n/policy.py", line 316, in run
    results = a.process(resources)
  File "/var/task/c7n/resources/iam.py", line 2211, in process
    assert m_keys, "shouldn't have gotten this far without keys"
AssertionError: shouldn't have gotten this far without keys
This can happen when we still have objects which match. The run above had 4 matched keys which were disabled as expected, but then the script threw the above error.
To Reproduce
Running the following policy against a diverse data set with approximately 5 matches triggers the error. Removing the UserName not-in filter doesn't seem to make a difference. Running against a smaller data set of fewer than 10 users in a private account worked fine in testing.
The error can also be reproduced by running the policy normally on macOS, with no Lambda creation.
Expected behavior
The filters should combine to match only keys that are both active and past the unused-age cutoff, then run all of the expected actions.
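Based on the traceback, one plausible reading of the failure is that each `credential` filter can match a user through a *different* key, so the per-user set of keys matching *all* filters can be empty by the time `remove-keys` runs with `matched: true`. The sketch below is illustrative only; the data and helper names are hypothetical, not Custodian internals:

```python
from datetime import datetime, timedelta, timezone

NOW = datetime.now(timezone.utc)

def unused_longer_than(key, days):
    # True when the key was last used more than `days` ago.
    return (NOW - key["last_used_date"]).days > days

def is_active(key):
    return key["active"]

# Hypothetical user resembling the failing data set: one stale but already
# disabled key, and one fresh but active key.
user_keys = [
    {"id": "stale-inactive-key", "active": False,
     "last_used_date": NOW - timedelta(days=900)},
    {"id": "fresh-active-key", "active": True,
     "last_used_date": NOW - timedelta(days=30)},
]

# Each filter matches the user on its own...
age_matches = [k for k in user_keys if unused_longer_than(k, 730)]
active_matches = [k for k in user_keys if is_active(k)]
assert age_matches and active_matches

# ...but no single key satisfies both filters, so the per-user
# matched-key intersection is empty -- the state the assert in
# iam.py's process() rejects.
m_keys = [k for k in age_matches if k in active_matches]
print(len(m_keys))  # 0
```

Under this reading, the user passes the policy's filters (each filter matched *something*) yet carries no key matching all filters at once, which would explain the AssertionError.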
Background (please complete the following information):
- OS: AWS Lambda
- Python Version: 3.8
- Custodian Version: 0.9.10
- Cloud Provider: aws
- Policy:
  - name: iam-active-key-age
    region: eu-west-2
    resource: iam-user
    mode:
      type: periodic
      schedule: "cron(0 07 ? * MON-FRI *)"
      role: custodian-role
    filters:
      - type: value
        key: UserName
        op: not-in
        value_from:
          url: s3://redacted/redacted.txt
          format: txt
      - type: credential
        key: access_keys.last_used_date
        value_type: age
        value: 730
        op: greater-than
      - type: credential
        key: access_keys.active
        value: true
        op: equal
    actions:
      - type: remove-keys
        disable: true
        matched: true
      - type: notify
        to:
          - splunkhec://redacted
        transport:
          type: sqs
          queue: https://redacted
- Traceback:
START RequestId: 2a0d1d12-66d8-4c45-8532-142bacdbdcef Version: $LATEST
[INFO] 2021-01-29T10:23:33.750Z 2a0d1d12-66d8-4c45-8532-142bacdbdcef Processing event
{
"debug": true
}
[WARNING] 2021-01-29T10:23:33.751Z 2a0d1d12-66d8-4c45-8532-142bacdbdcef Custodian reserves policy lambda tags starting with custodian - policy specifies custodian-info
[DEBUG] 2021-01-29T10:23:33.751Z 2a0d1d12-66d8-4c45-8532-142bacdbdcef Storing output with <LogFile file:///tmp/iam-active-keys-90-days/custodian-run.log>
[DEBUG] 2021-01-29T10:23:33.751Z 2a0d1d12-66d8-4c45-8532-142bacdbdcef Running policy:iam-active-keys-90-days resource:iam-user region:eu-west-2 c7n:0.9.10
[DEBUG] 2021-01-29T10:24:35.687Z 2a0d1d12-66d8-4c45-8532-142bacdbdcef Filtered from 1227 to 5 user
[INFO] 2021-01-29T10:24:35.689Z 2a0d1d12-66d8-4c45-8532-142bacdbdcef policy:iam-active-keys-90-days resource:iam-user region:eu-west-2 count:5 time:61.94
[ERROR] 2021-01-29T10:24:36.357Z 2a0d1d12-66d8-4c45-8532-142bacdbdcef Error while executing policy
Traceback (most recent call last):
File "/var/task/c7n/policy.py", line 316, in run
results = a.process(resources)
File "/var/task/c7n/resources/iam.py", line 2211, in process
assert m_keys, "shouldn't have gotten this far without keys"
AssertionError: shouldn't have gotten this far without keys
[ERROR] 2021-01-29T10:24:36.358Z 2a0d1d12-66d8-4c45-8532-142bacdbdcef error during policy execution
Traceback (most recent call last):
File "/var/task/c7n/handler.py", line 165, in dispatch_event
p.push(event, context)
File "/var/task/c7n/policy.py", line 1140, in push
return mode.run(event, lambda_ctx)
File "/var/task/c7n/policy.py", line 525, in run
return PullMode.run(self)
File "/var/task/c7n/policy.py", line 316, in run
results = a.process(resources)
File "/var/task/c7n/resources/iam.py", line 2211, in process
assert m_keys, "shouldn't have gotten this far without keys"
AssertionError: shouldn't have gotten this far without keys
[ERROR] AssertionError: shouldn't have gotten this far without keys
Traceback (most recent call last):
File "/var/task/custodian_policy.py", line 4, in run
return handler.dispatch_event(event, context)
File "/var/task/c7n/handler.py", line 165, in dispatch_event
p.push(event, context)
File "/var/task/c7n/policy.py", line 1140, in push
return mode.run(event, lambda_ctx)
File "/var/task/c7n/policy.py", line 525, in run
return PullMode.run(self)
File "/var/task/c7n/policy.py", line 316, in run
results = a.process(resources)
File "/var/task/c7n/resources/iam.py", line 2211, in process
assert m_keys, "shouldn't have gotten this far without keys"
END RequestId: 2a0d1d12-66d8-4c45-8532-142bacdbdcef
REPORT RequestId: 2a0d1d12-66d8-4c45-8532-142bacdbdcef Duration: 62609.07 ms Billed Duration: 62610 ms Memory Size: 512 MB Max Memory Used: 127 MB
custodian version --debug output:
Please copy/paste the following info along with any bug reports:
Custodian: 0.9.10
Python: 3.7.4 (default, Sep 7 2019, 18:27:02) [Clang 10.0.1 (clang-1001.0.46.4)]
Platform: posix.uname_result(sysname='Darwin', nodename='x', release='19.6.0', version='Darwin Kernel Version 19.6.0: Sun Jul 5 00:43:10 PDT 2020; root:xnu-6153.141.1~9/RELEASE_X86_64', machine='x86_64')
Using venv: True
Docker: False
Installed:
argcomplete==1.12.2 attrs==20.3.0 boto3==1.16.42 botocore==1.19.42 importlib-metadata==3.3.0 jmespath==0.10.0 jsonpickle==1.3 jsonschema==3.2.0 pyrsistent==0.17.3 python-dateutil==2.8.1 pyyaml==5.3.1 s3transfer==0.3.3 setuptools==40.8.0 six==1.15.0 tabulate==0.8.7 typing-extensions==3.7.4.3 urllib3==1.26.2 zipp==3.4.0

Additional context
This policy appears to perform the correct actions when it does not filter on 'active' keys; however, that generates a lot of output in the Splunk logging and in CloudTrail when we're auditing the results.
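For comparison, the selection the policy intends (a key must be both active and unused for more than 730 days before it is disabled) can be written out directly. This is a hypothetical cross-check of the filter logic, not Custodian code, and the key IDs are invented:

```python
from datetime import datetime, timedelta, timezone

def keys_to_disable(keys, max_unused_days=730, now=None):
    """Return IDs of keys that are active AND unused beyond the cutoff --
    the conjunction the policy's two credential filters describe."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_unused_days)
    return [
        k["id"] for k in keys
        if k["active"] and k["last_used_date"] < cutoff
    ]

now = datetime.now(timezone.utc)
sample = [
    {"id": "key-stale-active", "active": True,
     "last_used_date": now - timedelta(days=800)},
    {"id": "key-stale-inactive", "active": False,
     "last_used_date": now - timedelta(days=800)},
    {"id": "key-fresh-active", "active": True,
     "last_used_date": now - timedelta(days=10)},
]
print(keys_to_disable(sample))  # ['key-stale-active']
```

Only the stale-and-still-active key is selected; keys that are merely stale, or merely active, are left alone, which is the reduced log volume the filter on 'active' was meant to achieve.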
Issue Analytics
- State:
- Created 3 years ago
- Comments: 7 (2 by maintainers)
Top GitHub Comments
I’ve successfully run initial tests against a large account which was failing before and which now filters down to 4 users.
2021-02-12 12:13:52,000: custodian.resources.user:DEBUG Filtered from 1229 to 4 user
Those 4 shouldn’t match as the newer keys are active, but the keys that were active were still enabled after the tests and we didn’t get the AssertionError, so I think this is closed for now. I’ll have to wait a few weeks before I get a slot to do more testing, but can raise a new issue if there are still problems. Thanks very much for the fix.
Thank you very much, it's much appreciated. I'll pull down 6437 and run some tests against it before the weekend, and should be able to put some more test cases together if I can clear the time.