However, we encounter problems with CHAR datatype fields, caused by Oracle's conversion from a single-byte character set (version 8) to a multi-byte character set (version 10). Oracle seems to have changed the metadata description of the table definitions.
In Oracle 8:
TABLE COL_NAME DATATYPE DATA_LENGTH
---------- ---------- ---------- -----------
BASPR_WGAP GESLACHT CHAR 1
And in Oracle 10:
TABLE COL_NAME DATATYPE DATA_LENGTH CHAR_LENGTH
---------- ---------- ---------- ----------- -----------
BASPR_WGAP GESLACHT CHAR 4 1
So the length of a column is no longer given by DATA_LENGTH but by CHAR_LENGTH; Oracle now uses DATA_LENGTH to report the number of bytes reserved for the column in the database.
We have been debugging the code and believe we have located the problem in AbtOracle8DatabaseConnection>>#getColLengthForDescriptor:ifError:
getColLengthForDescriptor: parmdpp ifError: anErrorBlock
	| colLength array rc |
	colLength := ByteArray new: 2.
	(array := Array new: 6)
		at: 1 put: parmdpp;
		at: 2 put: OCI_DTYPE_PARAM;
		at: 3 put: colLength;
		at: 4 put: 0;
		at: 5 put: OCI_ATTR_DATA_SIZE;
		at: 6 put: self errhp.
	((rc := self callPlatformFunction: OCIAttrGet
			withArray: array
			useThreadPreference: true
			threadKey: self defaultThreadKey) = OCI_SUCCESS)
		ifFalse: [^self verifyReturnCode: rc ifError: anErrorBlock].
	^colLength abtAsInteger
AbtOracle10DatabaseConnection inherits this code from the version 8 interface to get the column length of a CHAR field via OCI_ATTR_DATA_SIZE. This, however, delivers the number of bytes reserved for the column in the database, not the character length of the field itself.
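One direction we are considering is an override in AbtOracle10DatabaseConnection that asks OCI for the character-semantics length instead. This is only an untested sketch: it assumes OCI_ATTR_CHAR_SIZE (which we believe is 286 in oci.h, please verify against your header) has been declared alongside the other OCI_ATTR_* pool constants.

```smalltalk
"Sketch of a possible override in AbtOracle10DatabaseConnection (untested).
 OCI_ATTR_CHAR_SIZE returns the column length in characters, where
 OCI_ATTR_DATA_SIZE returns the reserved length in bytes."
getColLengthForDescriptor: parmdpp ifError: anErrorBlock
	| colLength array rc |
	colLength := ByteArray new: 2.
	(array := Array new: 6)
		at: 1 put: parmdpp;
		at: 2 put: OCI_DTYPE_PARAM;
		at: 3 put: colLength;
		at: 4 put: 0;
		at: 5 put: OCI_ATTR_CHAR_SIZE;	"instead of OCI_ATTR_DATA_SIZE"
		at: 6 put: self errhp.
	((rc := self callPlatformFunction: OCIAttrGet
			withArray: array
			useThreadPreference: true
			threadKey: self defaultThreadKey) = OCI_SUCCESS)
		ifFalse: [^self verifyReturnCode: rc ifError: anErrorBlock].
	^colLength abtAsInteger
```

With this, the GESLACHT column above would come back as length 1 rather than 4, but we have not verified the behavior for non-CHAR datatypes.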
Do you have a suggestion for solving this problem?
Thanks.