myconnpy appears to be roughly an order of magnitude slower than MySQLdb
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
MySQL Connector/Python | Opinion | Medium | Geert JM Vanderkelen |
Bug Description
from mysql import connector
import MySQLdb
import time

baseq = "SELECT * FROM test"

def measure(func, *args, **kw):
    start = time.time()
    result = func(*args, **kw)
    end = time.time()
    return end - start, result

def scansequential(connect, iters):
    # 'connect' is the driver's connect function; the connection
    # arguments were truncated in the original report, so the one
    # below is a placeholder
    for i in xrange(iters):
        conn = connect(db="test")
        for i in xrange(5):
            curs = conn.cursor()
            retval = curs.execute(baseq)
            rows = curs.fetchall()

iters = 100
t, _ = measure(scansequential, connector.connect, iters)
print "myconnpy", t
t, _ = measure(scansequential, MySQLdb.connect, iters)
print "MySQLdb", t
When run on my box, with 20 rows in the test database, I get:
myconnpy 0.712414979935
MySQLdb 0.139988899231
With 40 rows, I get:
myconnpy 1.10226392746
MySQLdb 0.16171002388
This means that for twice the data, MySQLdb takes about 15% longer, while myconnpy takes about 55% longer. This suggests there's a bottleneck in the result-parsing code. I guess figuring out stuff like that is what the profile module is for. :-)
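As a sketch of that approach, the standard cProfile and pstats modules can break down where the time goes. Since a live server isn't assumed here, the snippet below (in modern Python syntax) profiles a stand-in row-decoding function; the names parse_row and fetch_all are illustrative, not functions from myconnpy:

import cProfile
import pstats
import io

def parse_row(raw):
    # stand-in for the driver's per-row decoding work
    return [field.decode("utf-8") for field in raw.split(b"\x00")]

def fetch_all(rows):
    return [parse_row(r) for r in rows]

raw_rows = [b"alpha\x00beta\x00gamma"] * 10000

profiler = cProfile.Profile()
profiler.enable()
fetch_all(raw_rows)
profiler.disable()

# sort by cumulative time and show the top five entries
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())

Sorting by cumulative time makes a hot per-row function like parse_row float to the top, which is the kind of evidence a profiler gives that raw timings alone don't.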
Ah, this is true, and I think it holds for all 'pure' Python implementations.
I'm using the profile module to reduce function calls and find these bottlenecks, and it has already gotten faster!
But this is not a bug per se; a Python app using MySQL's C client library (libmysqlclient) will always be faster than a pure Python driver.
However, if one can point us to a better way to do it in the code, that would be great. (And I'm sure there are lots of places which can be improved!)
Please open new bugs which point to specific performance problems.
Thanks for the benchmarking, though! That's another thing we need to put in place.
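For benchmarks like the one above, the standard timeit module is one reasonable way to make the numbers more repeatable, since it runs the workload several times and lets you take the best run to reduce noise. A minimal sketch (in modern Python syntax; work() is a hypothetical stand-in for one connect/execute/fetchall round, since no server is assumed here):

import timeit

def work():
    # stand-in for one round of connect/execute/fetchall
    return sum(i * i for i in range(1000))

# three repeats of 100 iterations each; the minimum is the least noisy
best = min(timeit.repeat(work, repeat=3, number=100))
print("best of 3: %.6f seconds" % best)

Taking the minimum rather than the mean is a common convention with timeit, on the theory that slower runs reflect interference from the rest of the system rather than the code being measured.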