Incorrect exprDataType value to select for varchar partitioning key
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Stado | Fix Committed | Undecided | Unassigned |
Bug Description
Stado is built with revision 107 from lp:~sgdg/stado/stado.
# table:
create table dnsinfo (mid varchar(16), sip bigint, dip bigint);
# data
file dns.dat:
1376983262,0,0
# load data
./gs-loader.sh -u admin -p admin -d xtest -t dnsinfo -f ',' -i dns.dat
# insert data
insert into dnsinfo values('1376983263', 0, 0);
# query
Stado -> select * from dnsinfo;
+------------+-----+-----+
| mid        | sip | dip |
+------------+-----+-----+
| 1376983262 | 0   | 0   |
| 1376983263 | 0   | 0   |
+------------+-----+-----+
2 row(s).
The query above works correctly. But:
Stado -> select * from dnsinfo where mid = '1376983263';
no rows to display
With a "where" condition on the partitioning key, no row is returned even though the row exists.
In SqlExpression.java, public String getNormalizedValue() contains:
{code}
...
case Types.VARCHAR:
    if (getExprDataType() != null
            && normalized.length() > getExprDataType().length) {
        normalized = normalized.substring(0, getExprDataType().length);
    }
    return normalized;
...
{code}
For load and insert, the exprDataType returned by getExprDataType() has length 16, which is correct.
For a select query with a where condition, the exprDataType has length 5, so the key '1376983263' is normalized to "13769".
The partition map lookup is then performed with this truncated value, so the query is routed to the wrong node.
Thus, the query result is incorrect.
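A minimal sketch (not Stado code; the hash function and node count here are made-up assumptions for illustration) of why a truncated key sends the lookup to a different node than the one the row was loaded onto:

```java
// Illustrative partition-map sketch: the target node is derived from the
// normalized key, so truncating the key can change which node is queried.
public class PartitionSketch {

    // Assumed toy hash: sum of character codes modulo the node count.
    static int nodeFor(String key, int nodeCount) {
        int h = 0;
        for (char c : key.toCharArray()) {
            h += c;
        }
        return h % nodeCount;
    }

    public static void main(String[] args) {
        String full = "1376983263";              // value hashed at load/insert time
        String truncated = full.substring(0, 5); // "13769", length 5 at select time
        System.out.println(nodeFor(full, 4));      // node holding the row -> 0
        System.out.println(nodeFor(truncated, 4)); // node the select is sent to -> 2
    }
}
```

With this toy hash the row lives on node 0 but the where-clause lookup goes to node 2, which matches the observed "no rows to display".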
In SysColumn.java:
/**
* Returns the column length in bytes
*/
public int getColumnLength() {
    int colLength;
    switch (this.coltype) {
    case java.sql.Types.BIT:
        colLength = 1;
        break;
    case java.sql.Types.CHAR:
        colLength = this.collength;
        break;
    case java.sql.Types.VARCHAR:
        colLength = this.collength / 3;
        break;
{code}...
for a VARCHAR column, the declared length is divided by 3. For varchar(16) this gives 16 / 3 = 5 with integer division, which matches the length-5 exprDataType seen in the select path.
This is likely the root cause.
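The suspected interaction between the two snippets can be sketched as follows (a simplified model, not the actual Stado call chain; the variable names are made up):

```java
// Sketch: getColumnLength() divides a VARCHAR's declared length by 3,
// and the normalizer then truncates the key to that reported length
// before the partition lookup.
public class LengthBugSketch {
    public static void main(String[] args) {
        int declaredLength = 16;                 // varchar(16) in the table DDL
        int reportedLength = declaredLength / 3; // integer division -> 5

        String key = "1376983263";
        String normalized = key.length() > reportedLength
                ? key.substring(0, reportedLength)
                : key;

        System.out.println(reportedLength); // prints 5
        System.out.println(normalized);     // prints 13769
    }
}
```

The truncated value "13769" is exactly the normalized value reported above, supporting the division by 3 as the root cause.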