Katib MySQL reaching max_prepared_stmt_count limit
/kind bug
What steps did you take and what happened:
I have been running a lot of trials recently, and it looks like the Katib MySQL instance has now hit the max_prepared_stmt_count limit. The same error shows up in the Katib UI pod as well. I have temporarily increased max_prepared_stmt_count, but it probably won't be long before we hit the new limit too. A few more details are attached below:
Logs of the katib-ui pod contain the following statement:
2021/08/26 11:40:09 GetObservationLog from HP job failed: rpc error: code = Unknown desc = Failed to get ObservationLogs Error 1461: Can't create more than max_prepared_stmt_count statements (current value: 16382)
Logs of the katib-sql pod contain the following statement:
Error 1461: Can't create more than max_prepared_stmt_count statements (current value: 16382)
A few observations from the Katib DB:
+-----------------+------------------+------------+-------------+
| user            | host             | count_star | command     |
+-----------------+------------------+------------+-------------+
| NULL            | NULL             |          0 | COM_QUERY   |
| NULL            | NULL             |          0 | COM_PREPARE |
| event_scheduler | localhost        |          0 | COM_QUERY   |
| event_scheduler | localhost        |          0 | COM_PREPARE |
| root            | localhost        |          0 | COM_QUERY   |
| root            | localhost        |          0 | COM_PREPARE |
| root            | katib-db-manager |          0 | COM_QUERY   |
| root            | katib-db-manager |          0 | COM_PREPARE |
| root            | katib-db-manager |          0 | COM_QUERY   |
| root            | katib-db-manager |          0 | COM_PREPARE |
| root            | katib-db-manager |          0 | COM_QUERY   |
| root            | katib-db-manager |     111952 | COM_PREPARE |
+-----------------+------------------+------------+-------------+

Prepared_stmt_count: 17120
mysql> show global status like 'com_stmt%';
+-------------------------+--------+
| Variable_name           | Value  |
+-------------------------+--------+
| Com_stmt_execute        | 103062 |
| Com_stmt_close          | 94210  |
| Com_stmt_fetch          | 0      |
| Com_stmt_prepare        | 111318 |
| Com_stmt_reset          | 0      |
| Com_stmt_send_long_data | 0      |
| Com_stmt_reprepare      | 0      |
+-------------------------+--------+
7 rows in set (0.00 sec)
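For reference, the two key numbers above (the global ceiling and the number of statements currently open) can also be read programmatically. A minimal monitoring sketch in Go, not part of Katib, assuming the go-sql-driver/mysql driver and a reachable MySQL endpoint; the DSN and database name are placeholders:

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql" // driver registration only
)

func main() {
	// Placeholder DSN: adjust user, password, host, and database.
	db, err := sql.Open("mysql", "root:password@tcp(katib-mysql:3306)/katib")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Server-wide ceiling on open prepared statements (default 16382).
	var limit int
	if err := db.QueryRow("SELECT @@global.max_prepared_stmt_count").Scan(&limit); err != nil {
		log.Fatal(err)
	}

	// Number of prepared statements currently open across all connections.
	var name string
	var current int
	if err := db.QueryRow("SHOW GLOBAL STATUS LIKE 'Prepared_stmt_count'").Scan(&name, &current); err != nil {
		log.Fatal(err)
	}

	fmt.Printf("prepared statements in use: %d / %d\n", current, limit)
}
```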
What did you expect to happen:
- Expected the metric containers to push observations to Katib MySQL.
- Expected the Katib UI to continue functioning normally.
Anything else you would like to add:
- It seems this issue needs to be fixed at the code level rather than by raising the limit: Com_stmt_prepare (111318) exceeds Com_stmt_close (94210) by roughly the current Prepared_stmt_count, which points to statements being prepared but never closed.
- Suspecting this line of code. Shouldn't there be a defer stmt.Close() line too? A sketch of the suspected pattern follows this list.
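A minimal sketch of the suspected pattern, assuming a database/sql-based helper along the lines of the katib-db-manager code; the function, table, and column names here are illustrative, not the actual Katib source. Without the deferred Close, every call leaks one server-side prepared statement until max_prepared_stmt_count is exhausted:

```go
package sketch

import (
	"database/sql"
	"fmt"

	_ "github.com/go-sql-driver/mysql" // driver registration only
)

// getObservationLog is a hypothetical stand-in for the DB-manager method that
// fails in the logs above. It prepares a statement on every call, so the
// deferred Close is what keeps Prepared_stmt_count from growing forever.
func getObservationLog(db *sql.DB, trialName string) error {
	stmt, err := db.Prepare(
		"SELECT time, metric_name, value FROM observation_logs WHERE trial_name = ?")
	if err != nil {
		return fmt.Errorf("failed to prepare statement: %v", err)
	}
	// The missing line suggested in this issue: release the server-side
	// prepared statement when this call returns.
	defer stmt.Close()

	rows, err := stmt.Query(trialName)
	if err != nil {
		return fmt.Errorf("failed to get ObservationLogs: %v", err)
	}
	// Rows hold a connection (and driver-side resources) until closed.
	defer rows.Close()

	for rows.Next() {
		var timestamp, metricName, value string
		if err := rows.Scan(&timestamp, &metricName, &value); err != nil {
			return err
		}
		fmt.Println(timestamp, metricName, value)
	}
	return rows.Err()
}
```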
Environment:
- Kubeflow version (kfctl version): 1.3.0
- Kubernetes version (kubectl version): 1.18
- Katib controller version: v0.11.0
- Katib MySQL image: mysql:8
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I verified that this issue happens due to a DB resource leak. It will be fixed by #1720.
Should be fixed by this PR: https://github.com/kubeflow/katib/pull/1720
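For context, one idiomatic way this class of leak is usually closed off in database/sql code (a general sketch, not necessarily the exact change in #1720): skip the explicit Prepare for one-off queries and pass the arguments straight to db.Query, so the statement's lifetime is tied to the rows and there is no separate *sql.Stmt to forget to close. Names below are illustrative, as in the earlier sketch:

```go
package sketch

import (
	"database/sql"
	"fmt"

	_ "github.com/go-sql-driver/mysql" // driver registration only
)

// getObservationLogOneShot is an illustrative alternative to an explicit
// Prepare/Close pair: database/sql prepares and releases the statement as
// part of the query/rows lifecycle.
func getObservationLogOneShot(db *sql.DB, trialName string) error {
	rows, err := db.Query(
		"SELECT time, metric_name, value FROM observation_logs WHERE trial_name = ?",
		trialName,
	)
	if err != nil {
		return fmt.Errorf("failed to get ObservationLogs: %v", err)
	}
	defer rows.Close()

	for rows.Next() {
		var timestamp, metricName, value string
		if err := rows.Scan(&timestamp, &metricName, &value); err != nil {
			return err
		}
		fmt.Println(timestamp, metricName, value)
	}
	return rows.Err()
}
```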