[Bug] Not every cloudwatch message is passed
Describe what happened:
We have noticed that for very long messages stored in CloudWatch, the log forwarder is not passing all of them.
I have done some tests and found that CloudWatch automatically splits longer lines at 262,119 characters, while the maximum the log forwarder accepts is 256,000 characters, which causes some logs to be dropped.
The difference is not big; changing the DatadogBatcher max_log_size_bytes parameter to 265000 did the trick, and all logs from CloudWatch were passed through.
I’m not sure if there are any other consequences of this change.
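The drop window described above can be sketched in a few lines. This is an illustrative sketch, not the forwarder's actual source: `filter_logs` and its constant names are assumptions, standing in for the size check that the real `DatadogBatcher` applies via its `max_log_size_bytes` parameter.

```python
CLOUDWATCH_MAX_EVENT_CHARS = 262_119  # observed CloudWatch split point (from this report)
FORWARDER_MAX_LOG_BYTES = 256_000     # forwarder default before any change


def filter_logs(logs, max_log_size_bytes=FORWARDER_MAX_LOG_BYTES):
    """Keep only logs within the size limit; oversized logs are silently dropped."""
    return [log for log in logs if len(log.encode("utf-8")) <= max_log_size_bytes]


# A CloudWatch-split event of 262,119 ASCII characters falls in the drop window:
event = "x" * CLOUDWATCH_MAX_EVENT_CHARS
assert filter_logs([event]) == []                                # dropped at the 256,000 limit
assert filter_logs([event], max_log_size_bytes=265_000) == [event]  # kept after the bump
```

Any log between 256,000 and 262,119 bytes hits the first branch and vanishes without an error, which matches the reproduction steps below.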
Describe what you expected:
All entries stored in Cloudwatch should be passed to Datadog.
Steps to reproduce the issue:
Create a log entry in Cloudwatch that is between 256000 and 262119 characters.
Issue Analytics
- Created 2 years ago
- Comments: 5 (4 by maintainers)
Top GitHub Comments
I can reproduce it. CloudWatch Logs’ documented event size limit is 256K, which in reality is 262,144 bytes (256 × 1024, as can be confirmed by attempting to create a large event in the CloudWatch Logs console). After discussion with our logs intake team, it is safe to bump the event size limit. Allowing for metadata and other overhead, I will bump the limit to 512K (the backend limit is 1M).
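The arithmetic behind the chosen limit can be sanity-checked directly. The 512K and 1M figures are taken from the comment above; treating 512K as 512,000 bytes here is an assumption, since the comment does not say whether it means decimal or binary units.

```python
CW_EVENT_LIMIT = 256 * 1024      # CloudWatch's "256K" is 262,144 bytes
NEW_FORWARDER_LIMIT = 512_000    # bumped forwarder limit ("512K", decimal assumed)
BACKEND_LIMIT = 1_000_000        # logs intake backend limit ("1M", decimal assumed)

# No CloudWatch event can exceed the new per-log limit...
assert CW_EVENT_LIMIT < NEW_FORWARDER_LIMIT
# ...and the new limit leaves room under the backend cap for metadata overhead.
assert NEW_FORWARDER_LIMIT < BACKEND_LIMIT
```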
@radekl Thanks for letting us know, I’m going to do some experiments too. Really appreciate your feedback!