
batch_execute_statement parameter mismatches

See original GitHub issue

Describe the bug

Datetime parameters for INSERT statements sent to Aurora MySQL Serverless by batch_execute_statement will sometimes have incorrect values.

Steps to reproduce

  1. Create a new Aurora MySQL 5.7 Serverless instance
  2. Execute the CREATE TABLE statement from the comments of badDBvalues2.py (attached as badDBvalues2.py.zip)
  3. Start two instances of badDBvalues2.py in separate terminal windows

Expected behavior

The test Python code creates the same set of rows using batch_execute_statement, then reads them back repeatedly. It should never report a mismatch, but it has reported an error every time we have run it. It may take 10 iterations or 1,000, but it eventually fails.
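For context, the shape of the failing call is sketched below. The ARNs, database name, table, and the helper function are placeholders for illustration, not code from the attached script; only the parameter-entry structure (name, typeHint, stringValue) matches what the report shows being sent.

```python
from datetime import datetime

def timestamp_param(name, dt):
    # Build one Data API parameter entry with the TIMESTAMP type hint,
    # formatting the datetime with six fractional digits as in the report.
    return {
        "name": name,
        "typeHint": "TIMESTAMP",
        "value": {"stringValue": dt.strftime("%Y-%m-%d %H:%M:%S.%f")},
    }

# One parameter set per row; batch_execute_statement takes a list of these.
parameter_sets = [
    [
        timestamp_param("startedAt", datetime(2021, 3, 29, 8, 6, 1)),
        timestamp_param("endedAt", datetime(2021, 3, 29, 8, 6, 7)),
    ],
]

request = {
    # Placeholder ARNs and names -- substitute your own cluster and secret.
    "resourceArn": "arn:aws:rds:us-east-1:123456789012:cluster:example",
    "secretArn": "arn:aws:secretsmanager:us-east-1:123456789012:secret:example",
    "database": "exampledb",
    "sql": "INSERT INTO events (startedAt, endedAt) VALUES (:startedAt, :endedAt)",
    "parameterSets": parameter_sets,
}

# The actual call (requires AWS credentials and a live cluster):
# import boto3
# boto3.client("rds-data").batch_execute_statement(**request)
```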

Debug logs

The following log files from our production environment are attached. They were captured from live applications (not the test script) running against an Aurora MySQL 5.7 Serverless database:

  1. dbBadValues2ParameterSets.txt in dbBadValues2ParameterSets.txt.zip - application logs showing the parameter sets sent to batch_execute_statement. These are the same parameter values used in the test Python code.
  2. dbBadValues2.general.txt in dbBadValues2.general.txt.zip - Aurora MySQL general logs showing values actually inserted (includes the CloudWatch log entry as well as the INSERT columns and values parsed into JSON)

The problem occurred at 2021-03-29T12:07:25.561662Z according to the MySQL general log. The values the application passed to batch_execute_statement were:

                {
                    "name": "startedAt",
                    "typeHint": "TIMESTAMP",
                    "value": {
                        "stringValue": "2021-03-29 08:06:01.000000"
                    }
                },
                {
                    "name": "endedAt",
                    "typeHint": "TIMESTAMP",
                    "value": {
                        "stringValue": "2021-03-29 08:06:07.000000"
                    }
                },

And the values shown in the MySQL general logs are:

                "startedAt": "2021-03-29 08:06:56.0",
                "endedAt": "2021-03-29 08:06:07.0",

As you can see, the value for startedAt that was inserted does not match what was passed to batch_execute_statement. This incorrect value does not appear anywhere else in the parameter sets this time, which is different from the first trace, so we have no idea where it came from.
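A mismatch like this can be confirmed programmatically by parsing both representations back to datetime and comparing, since the general log truncates trailing zeros (".0") while the API request carries six digits. This is a minimal illustration using the values above, not part of the attached test script.

```python
from datetime import datetime

def parse_ts(value):
    # %f accepts 1 to 6 fractional digits, so this handles both the
    # six-digit form sent to the API ("...08:06:01.000000") and the
    # truncated form from the MySQL general log ("...08:06:56.0").
    return datetime.strptime(value, "%Y-%m-%d %H:%M:%S.%f")

sent = {"startedAt": "2021-03-29 08:06:01.000000",
        "endedAt": "2021-03-29 08:06:07.000000"}
logged = {"startedAt": "2021-03-29 08:06:56.0",
          "endedAt": "2021-03-29 08:06:07.0"}

# Collect columns whose logged value differs from what was sent.
mismatches = {name: (sent[name], logged[name])
              for name in sent
              if parse_ts(sent[name]) != parse_ts(logged[name])}
```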

We can find no pattern in this. It occurs rarely and at random. As far as we can tell it has only affected datetime columns, but we can't be sure, which calls other data types into question as well. The problem appears to lie somewhere between boto3 and Aurora, although we can't be certain of that.

We have a support case open with AWS on this problem which they are researching, but they suggested also opening an issue here.

badDBvalues2.parameterSets.txt.zip badDBvalues2.py.zip badDBvalues2.general.txt.zip

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
jamie-burks commented, Apr 12, 2021

Omitting the TIMESTAMP type hint does seem to work around the issue with the six-decimal-place values we are using. Previously we could not insert more than about a thousand rows without a mismatch; after omitting the hint, we did not see a failure in over 8 million rows inserted.
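The workaround described in this comment amounts to sending the same stringValue but leaving the typeHint field out of the parameter entry entirely; MySQL still parses the string into the datetime column on its own. A sketch, with a hypothetical helper name:

```python
from datetime import datetime

def timestamp_param_no_hint(name, dt):
    # Same Data API parameter entry as before, but with no
    # "typeHint": "TIMESTAMP" key -- the reported workaround.
    return {
        "name": name,
        "value": {"stringValue": dt.strftime("%Y-%m-%d %H:%M:%S.%f")},
    }

param = timestamp_param_no_hint("startedAt", datetime(2021, 3, 29, 8, 6, 1))
```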

0 reactions
github-actions[bot] commented, Apr 12, 2021

⚠️COMMENT VISIBILITY WARNING⚠️

Comments on closed issues are hard for our team to see. If you need more assistance, please either tag a team member or open a new issue that references this one. If you wish to keep having a conversation with other community members under this issue feel free to do so.
