Parametrized query hits max_prepared_stmt_count
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
oursql | New | Undecided | Unassigned |
Bug Description
Hi,
I have a script that executes the same parametrized query a large number of times. The script eventually raises ProgrammingError with the following message: (1461, "Can't create more than max_prepared_stmt_count statements (current value: 16382)").
The attached code should reproduce the bug (it will connect to MySQL based on the parameters in ~/my.cnf).
If you monitor the server's status (mysqladmin extended-status) you will notice that Prepared_stmt_count grows until it hits max_prepared_stmt_count.
The way I am getting around this is to wrap the loop body in a try/except. When the exception is caught, I close the cursor and create a new one. It works: Prepared_stmt_count goes back to 0, but it feels a bit hackish.
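A minimal sketch of that workaround pattern. The stdlib sqlite3 module stands in for oursql here (both follow the DB-API 2.0 shape, so the exception branch never fires against sqlite3); the table name and loop count are made up for illustration:

```python
import sqlite3  # stand-in for oursql; same DB-API 2.0 cursor interface

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO test VALUES (1, 'a')")

cursor = conn.cursor()
for _ in range(1000):
    try:
        cursor.execute("SELECT * FROM test WHERE id = ? LIMIT 1", (1,))
        row = cursor.fetchone()
    except Exception:
        # With oursql this would catch ProgrammingError (1461); closing
        # the cursor frees its prepared statements server-side, and a
        # fresh cursor starts the count over.
        cursor.close()
        cursor = conn.cursor()

print(row)  # → (1, 'a')
```

With oursql the except branch fires roughly every 16382 iterations (the default max_prepared_stmt_count), which is why it works but feels like a workaround rather than a fix.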
I searched various forums and it seems that this might be due to a memory leak:
http://
If it helps, I had the same issue and the problem was with fetchone(): it leaves the result set open, because you have not emptied it until after the last item (a null item ends the result set).
I was able to solve this by using fetchall(), even when I knew it would only return one result, e.g. 'select * from test where id = ? limit 1'.
fetchall() then closes the open result set, i.e. the prepared statement that oursql uses, and fixes the (1461, "Can't create more than max_prepared_stmt_count statements (current value: 16382)") error. fetchone() does not close the prepared statement, and eventually you hit the MySQL limit of 16382 open prepared statements.
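The difference in draining behavior can be seen with any DB-API driver: fetchone() reads a single row and leaves the rest of the result set (and, in oursql's case, the server-side prepared statement) open, while fetchall() consumes everything. A small sketch, again with stdlib sqlite3 as a stand-in and a made-up table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO test VALUES (?)", [(1,), (2,), (3,)])

cur = conn.cursor()
cur.execute("SELECT id FROM test ORDER BY id")
first = cur.fetchone()     # reads one row; result set stays open
leftover = cur.fetchall()  # drains the remaining rows, closing the set

print(first, leftover)  # → (1,) [(2,), (3,)]
```

With a LIMIT 1 query there is nothing left for fetchall() to return, but the drain is still what releases oursql's prepared statement.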