The column length returned is wrong when the column charset is UCS2
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Trafodion | New | High | Unassigned |
Bug Description
Defect Description:
When the column charset is UCS2, the column length returned is wrong: the expected value is 200K, but the actual value is 866.
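For reference, a minimal sketch of the arithmetic behind the expected value, assuming "200K" refers to the UCS2 byte length of the 100000-character string built in step 2 (UCS2 uses 2 bytes per code unit; utf-16-le stands in for UCS2 here, which is equivalent for BMP characters):

    value = u"a" * 99999 + u"E"               # same 100000-character value as step 2
    print len(value)                          # 100000 characters
    print len(value.encode("utf-16-le"))      # 200000 bytes, i.e. ~200K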
Test Environment:
DB cluster: onyx13 (build: trafodion-
Linux ODBC driver: clients-
Test Steps:
Step 1. create a table
571 sql = "create table tblcolumnsize20
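The CREATE TABLE statement on line 571 is cut off. A minimal hypothetical sketch of what it presumably looks like, assuming two UCS2 VARCHAR columns wide enough for the test values (only the table name comes from the source; the column names, count, and width are assumptions):

    # hypothetical reconstruction of the truncated line 571
    sql = ("create table tblcolumnsize20 "
           "(c1 varchar(100000) character set ucs2, "
           "c2 varchar(100000) character set ucs2)")
    cursor.execute(sql)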
Step 2. insert data into the table
573 a = u""
574 b = u""
575 for i in range(1, 100000, 1):
576 a = a + unicode('a') # print sys.maxunicode, if 65535 then ucs2, if 1114111 then ucs4, manually check
577 b = b + unicode('b')
578 a = a + unicode('E')
579 b = b + unicode('E')
580 sql = "insert into tblcolumnsize20
581 cursor.execute(sql)
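The INSERT statement on line 580 is also cut off. A minimal hypothetical sketch of lines 580-581, assuming the two values a and b are bound as parameters to a two-column table (the marker style and column count are assumptions); the sys.maxunicode check referenced in the comment on line 576 is included for completeness:

    import sys
    # narrow Python builds (sys.maxunicode == 65535) store unicode as UCS2,
    # wide builds (1114111) as UCS4 -- the manual check from line 576's comment
    print sys.maxunicode

    # hypothetical reconstruction of the truncated lines 580-581
    sql = "insert into tblcolumnsize20 values (?, ?)"
    cursor.execute(sql, (a, b))

As an aside, the loop above builds each value one character at a time; u"a" * 99999 + u"E" yields the same 100000-character string directly.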
Step 3. compare the expected value with the actual value
585 #print "---", rows
586 print "length of the result set ", len(rows)
587 self.assertEqua
588 len_a = len(rows[0][0]) # error: length=866
589 print "length of the first column value ", len_a
590 self.assertEqua
591 self.assertEqua
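The assertions on lines 587-591 are truncated. A hypothetical reconstruction based on the prose below and the traceback message (the argument values are inferred, not taken from the source):

    self.assertEqual(len(rows), 1)                   # line 587: one row expected
    len_a = len(rows[0][0])                          # line 588: observed 866
    self.assertEqual(len_a, 100000,
                     "get length of the first column wrong.")  # line 590
    # line 591 presumably applies the same check to the second column (b)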
In step 3, when executing line 590, the expected value is 100000 but the actual value is 866. Here is the console output:
-bash-4.1$ python -m tests.ODBC.
sslength of the result set 1
length of the first column value 866
Fssss
=======
FAIL: testBigColumnSi
-------
Traceback (most recent call last):
File "/designs/
self.
AssertionError: get length of the first column wrong.
-------
Ran 7 tests in 13.882s
FAILED (failures=1, skipped=6)
Changed in trafodion:
importance: Undecided → High