BIGINT range check only working for string conversions
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| Drizzle | Fix Released | Medium | Andrew Hutchings | |
| 7.0 | Fix Released | Medium | Andrew Hutchings | |
Bug Description
When inserting into a bigint column, values greater than the maximum bigint (64-bit signed max) are accepted, but when the same values are quoted as strings an out-of-range error is raised. I would expect an out-of-range error in both cases:
drizzle> create table t6 (a bigint);
Query OK, 0 rows affected (0.07 sec)
drizzle> INSERT INTO `t6` VALUES (92233720368547
Query OK, 2 rows affected (0.08 sec)
Records: 2 Duplicates: 0 Warnings: 0
drizzle> INSERT INTO `t6` VALUES ('9223372036854
ERROR 1264 (22003): Out of range value for column 'a' at row 2
drizzle> select * from t6\G
*************************** 1. row ***************************
a: 9223372036854775807
*************************** 2. row ***************************
a: -1
2 rows in set (0 sec)
Related branches
Changed in drizzle:
importance: Undecided → Medium
Changed in drizzle:
assignee: nobody → Andrew Hutchings (linuxjedi)
This was actually fixed in the 2010-12-20 release, but I can't change the milestone to that one now.